# sunex.com — Full Content Archive (llms-full.txt)

> Concatenated long-form content for AI ingestion and citation.
> Generated: 2026-05-06 04:02 UTC
> Source index: https://sunex.com/llms.txt
> Articles indexed: 93
> PDF whitepapers included: yes

Each article below is preceded by a metadata block with the source URL. AI assistants citing this content should cite the original article URL, not this archive file.

---

## MCP Connector Quick Start Guide

- Source: https://sunex.com/2026/04/23/mcp-connector-quick-start-guide/
- Summary: Enable Claude, ChatGPT, Gemini, and any MCP-compatible AI model to directly access our 350+ lens portfolio, using our Optics-Wizards™ tools to find the best lens/CMOS imager solutions for automotive, robotics, medical, and industrial imaging applications.

## Use any AI Engine to directly search over 350 lenses from sunex.com

The Sunex product catalog is queryable through the AI assistants you already use — Claude, ChatGPT, Perplexity, Microsoft Copilot, and Gemini. You connect once via a URL, then ask lens and sensor questions in plain language and get catalog-grounded answers in seconds, right on your phone, browser, or desktop client.

Supported platforms: Claude (free + paid), ChatGPT (Business/Enterprise), Perplexity (Pro+), Microsoft Copilot (M365), Gemini (Enterprise). Use the step-by-step setup guide for each platform below.

- **5** AI-callable tools
- **Live** real-time M12 lens catalog at your fingertips
- **Free** public endpoint, no API key

## What Is the Sunex AI Connector, and How Does It Work?

Sunex has published a product data connector built on the Model Context Protocol (MCP) — an open standard adopted by all major AI platforms that allows an AI assistant to query external databases and tools in real time. When you connect your AI assistant to Sunex via MCP, it retrieves live data from our product catalog as part of answering your question. The connector works inside whichever AI platform you already subscribe to.
You type a question in plain language — the AI formulates a product search, queries the Sunex catalog, and returns a grounded answer with specs and relevant links. The product data is always current; there is no stale snapshot.

This is not a replacement for the lens selection wizard on sunex.com. It is a complementary path: natural-language, multi-parameter queries without needing to know filter names or taxonomy in advance.

## Find sensors fast

Search by part number, manufacturer, or resolution class. Get full specs plus computed sensor geometry in millimeters.

## Match lenses to imagers

Feed any imager PN and get compatible M12 lenses with per-lens FOV, angular resolution, and F/# filters.

## Quote & spec in one turn

Results include sample pricing, spec sheet URL, sample order link, and volume RFQ link. No separate lookup.

## Which AI Platforms Support the Sunex Connector, and Do I Need a Paid Plan?

The connector is compatible with the five most widely used AI platforms. Support varies by subscription tier. The table below reflects availability as of April 2026.

| Platform | Availability | Plans with connector support |
| --- | --- | --- |
| Claude (Anthropic) | Limited on free plan (1 connector) | Pro / Max / Team / Enterprise |
| ChatGPT (OpenAI) | Paid workspaces only | Business / Enterprise |
| Perplexity | ✓ Mac; partial on mobile | Pro / Max / Enterprise |
| Microsoft Copilot | Admin/maker setup | M365 / Copilot Studio |
| Gemini (Google) | Supported | Enterprise only |

Sunex MCP server URL — paste this into your platform’s connector settings: `https://mcp.sunex-ai.com/mcp`. This is the canonical MCP endpoint.

## What Kinds of Lens and Sensor Questions Can I Ask?

The connector handles natural-language queries across the full Sunex catalog. It performs best on questions that combine multiple parameters — field of view, aperture, sensor size, mount type, application — in a single prompt.
Examples of well-handled queries:

| Sample query | Typical results/answers |
| --- | --- |
| “Recommend a wide-angle lens for the Sony IMX577 sensor, F/2.0 or faster.” | Resolves IMX577 geometry (4056×3040, 1.55µm), then filters the lens catalog. |
| “I need fisheye lenses under $100. What do you have?” | Scans the full catalog; returns PN, description, and sample price. |
| “What’s the diagonal of the IMX477 in mm? And what’s its Nyquist?” | Returns full sensor specs. |
| “Compare lenses for a 1920×1080 sensor with 3µm pixels, 100-180° HFOV.” | Applies FOV, F/#, and image-circle filters. |
| “Please recommend a lens for the AR0220.” | Returns matching lens recommendations. |
| “Show me lenses for a medical endoscope.” | Provides concrete options, including key specs, datasheet, and sample pricing based on application type. |

## How Do I Connect My AI Assistant to the Sunex Product Catalog?

Select your platform below. In most cases, setup takes under five minutes. You only need to connect once — the connector remains available in your account settings.

## Claude (Anthropic) — claude.ai

*Free plan: 1 custom connector. Pro, Max, Team, Enterprise: multiple connectors. Enterprise admins can share the connector org-wide.*

- Go to claude.ai and sign in.
- Click your profile icon → Settings → Connectors.
- Click + then Add custom connector.
- Enter name: Sunex Optics. Paste the MCP URL above.
- Click Add. Follow any authorisation prompt.
- In a new chat: click + → Connectors → toggle Sunex Optics on.
- Try: “Find wide-angle M12 lenses with diagonal FOV over 120°”

Desktop app: same steps from the app’s Settings menu. Mobile: use the browser version for full connector access.

## ChatGPT (OpenAI) — chatgpt.com

*Requires Business or Enterprise plan. Connectors have been called Apps since December 2025. Free, Plus, and Pro plans do not support custom MCP connectors.*

- Workspace admin: go to Workspace Settings → Permissions & Roles → Connected Data → enable Developer Mode.
- Sign in with your Business or Enterprise account.
- Profile → Settings → Apps → Advanced settings → confirm Developer Mode is on.
- Click Create app. Name: Sunex Optics. Paste the MCP URL.
- Admin reviews and publishes the app to the workspace.
- New chat → Add sources → select Sunex Optics.
- Try: “What lenses are compatible with the Sony IMX577?”

## Perplexity — perplexity.ai

*Requires Pro, Max, or Enterprise plan. Custom remote connectors launched March 2026.*

- Sign in on your Pro or Max account.
- Profile → Account Settings → Connectors.
- Click + Custom connector → Remote.
- Name: Sunex Optics · URL: paste MCP URL · Authentication: None.
- Check acknowledgement box → Add.
- Click the Sunex Optics card to enable it.
- Try: “Show me fisheye lenses with F/2 or faster aperture.”

## Microsoft Copilot — via Copilot Studio

*Requires M365 Business or Enterprise with a Copilot Studio licence. This is an admin/maker task.*

- Go to copilotstudio.microsoft.com and sign in.
- Open or create an agent.
- Tools → Add a tool → New tool → Model Context Protocol.
- Server name: Sunex Optics · Server URL: paste MCP URL · Auth: None.
- Click Create connection → Add → Save.
- Publish the agent to Teams, SharePoint, or your chosen channel.

Generative orchestration must be enabled under Agent settings → AI capabilities for MCP tools to function.

## Gemini (Google)

For the Gemini CLI, add the following to the settings.json file (located at C:\Users\YourUserName\.gemini\): `"mcpServers": {"Sunex": {"httpUrl": "https://mcp.sunex-ai.com/mcp"}}`

*Gemini Enterprise (Standard, Plus, or Frontline). Allowlist approval required — contact your Google account team before starting.*

- Confirm allowlist access with your Google account team.
- Google Cloud Console → Gemini Enterprise → Data stores → Create data store.
- Search: Custom MCP Server → Select.
- Enter MCP Server URL and complete the OAuth configuration fields.
- The connector appears in Gemini Enterprise data stores.
- Users activate it from Connectors in their workspace.
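Under the hood, every platform above is doing the same thing: MCP is JSON-RPC 2.0 over an HTTP/SSE transport. As a rough orientation sketch, here are the three messages a client sends — the method names (`initialize`, `tools/list`, `tools/call`) come from the MCP specification, but the tool name `search_lenses` and its arguments are hypothetical placeholders, not documented Sunex tool names; read the real names from the `tools/list` response.

```python
import json

SUNEX_MCP_URL = "https://mcp.sunex-ai.com/mcp"  # canonical endpoint from this guide

def jsonrpc(method: str, params: dict, id_: int) -> str:
    """Build one JSON-RPC 2.0 request as the MCP transport expects."""
    return json.dumps({"jsonrpc": "2.0", "id": id_, "method": method, "params": params})

# 1. Handshake: declare protocol version and client identity.
init = jsonrpc("initialize", {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
}, id_=1)

# 2. Discover the server's tools (this guide mentions five AI-callable tools).
list_tools = jsonrpc("tools/list", {}, id_=2)

# 3. Invoke a tool. "search_lenses" and its arguments are hypothetical —
#    discover the actual tool names from the tools/list response.
call = jsonrpc("tools/call", {
    "name": "search_lenses",
    "arguments": {"min_dfov_deg": 120, "max_fno": 2.0},
}, id_=3)

print(call)
```

In practice an MCP client SDK (see the "Your own code" section below) manages this exchange for you; the sketch only shows the wire format the platforms negotiate on your behalf.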
## Your own code

*Python / TypeScript / any language.*

- Install the MCP client SDK for your language.
- Connect via SSE to the Sunex MCP Server URL.
- Call tools by name with the documented params.
- See the manifest for the schema.

## Frequently Asked Questions About the Sunex AI Connector

**Does the Sunex AI connector store my queries or share them with other users?**

No. Each query is a stateless, isolated call with no session memory and no connection to other users’ queries. Nothing you ask is visible to other users or retained after your session ends.

**Is the Sunex AI connector the same as the product search on sunex.com?**

It draws from the same product data but works differently. The website search requires specific filter parameters. The AI connector accepts plain-language questions and can reason across multiple parameters — FOV, aperture, sensor size, application type — simultaneously in one prompt.

**Can the Sunex AI connector provide quotations or process orders?**

No. The connector is read-only. It returns product information, spec sheet links, and sample pricing references from the catalog. For formal quotations, use the RFQ link in the answer or contact sales@sunex.com.

**How current is the Sunex product data returned by the connector?**

The connector queries the live Sunex product database. Catalog additions, specification updates, and pricing changes are reflected in real time — there is no stale snapshot or periodic cache.

**Do I need a paid AI subscription to use the Sunex connector?**

Claude is the only platform that supports one custom connector on its free plan. All other platforms (ChatGPT, Perplexity, Microsoft Copilot, Gemini) require a paid plan. See the compatibility table in this article for a full breakdown.

**What should I do if the AI returns incorrect product specifications?**

Always verify specifications against the official Sunex datasheet before making a design decision.
The connector includes links to spec sheets for each product it returns. If you encounter a data error or are unsure, contact us at sunex.com/contact — this helps us improve the catalog data.

**Can I use the Sunex connector on mobile AI apps?**

Partially. Claude’s mobile app has limited connector support — the browser version gives full access. ChatGPT and Microsoft Copilot mobile apps do not currently support custom MCP connectors. Perplexity’s mobile app has partial support. The most reliable experience across all platforms is via a desktop browser.

**Which AI platform is easiest to set up with the Sunex connector?**

Claude (claude.ai) offers the simplest setup: no admin access required, available on the free plan (one connector), and ready in under three minutes. It is the recommended starting point for individual engineers.

---

## The Cost of Making the Wrong Lens Choice

- Source: https://sunex.com/2026/04/03/the-cost-of-making-the-wrong-lens-choice/
- Summary: When procurement teams evaluate optical components, unit price is the number that fits most naturally into a spreadsheet. However, unit price is only a fraction of what a wrong choice can cost over the program lifetime. A lens that costs $8/unit can cost 10–100x more in total when you factor in redesign cycles, field failures, RMA costs, and missed production ramps. The real cost of the wrong lens is measured in program delays and customer returns — not unit price. This article breaks down Total Cost of Ownership (TCO) across three sourcing channels — internet/commodity, catalog intermediaries, and direct OEM partnerships — and shows why OEM-designed lenses consistently deliver lower TCO even when the unit price is higher.

When procurement teams evaluate optical components, unit price is the number that fits most naturally into a spreadsheet cell. It is comparable, auditable, and easy to defend.
But in mission-critical imaging programs — whether the application is an automotive driver-monitoring camera, a surgical endoscope, or a logistics robot navigating a warehouse floor — unit price is only a fraction of what a wrong choice can cost over the program lifetime.

In this article, we take a deeper look at the following topics:

- Why is unit price a misleading metric when evaluating camera lenses?
- What is the total cost of ownership (TCO) of an imaging lens over a product lifetime?
- How does sourcing from internet platforms vs. OEM partners affect lens reliability?
- Automotive: Where Optics Meet Functional Safety
- Robotics: Algorithm Performance Depends on Optical Consistency
- Medical: Regulatory Cliffs Are Real
- What are the hidden costs of using commodity M12 lenses in production systems?
- How do you calculate the real cost of a lens redesign mid-program?
- Technical Depth & Design Capability
- Manufacturing Process Control
- Lifecycle & Supply Commitment
- Commercial Competitiveness
- Procurement Checklist: Questions to Ask Before Committing

## 1. Why is unit price a misleading metric when evaluating camera lenses?

The three primary sourcing channels — internet marketplaces, catalog intermediaries, and direct OEM manufacturers — each serve a legitimate role. The problem arises when the wrong channel is used at the wrong project stage, or when early decisions are not recognized as architectural commitments with long-term consequences.

A $2 lens purchased on an internet marketplace and a $40–$80 lens from a specialized OEM may appear superficially similar — same mount, similar field of view, comparable F/# — but they are fundamentally different products with fundamentally different risk profiles.

**RISK SIGNAL** Purchasing decisions made during PoC or early sampling are rarely renegotiated before production ramp.
Teams that anchor to a low-cost catalog part during exploration often find themselves locked in — contractually or practically — when the costs of switching are highest.

Graphic 1 – *TCO Risk Profile by Sourcing Channel: bar length represents relative lifecycle cost exposure, not unit price.*

## 2. What is the total cost of ownership (TCO) of an imaging lens over a product lifetime?

TCO is not a single event — it is a pattern of costs that accumulate across the product development lifecycle. Understanding when different cost categories emerge is as important as understanding how large they might be.

Graphic 2 – *Hidden Cost Exposure by Development Phase: risk exposure grows non-linearly as the project advances through each gate.*

**THE HIDDEN COST CATEGORIES**

Each phase introduces distinct costs that do not appear on the lens purchase order:

- Engineering re-work hours — when a lens fails a qualification gate, the team reassigns design resources. In programs with 5–15 optical/system engineers, even two months of re-work represents significant sunk cost.
- Re-qualification and re-certification — automotive programs (PPAP, FMEA updates), medical devices (FDA 510(k) submissions, IEC 60601 re-testing), and CE/FCC-class products face regulatory timelines measured in months, not weeks.
- Yield loss at scale — a 2% yield reduction on a 100,000 unit/year production run, with a camera module value of even $150, represents $300,000 in annual write-offs.
- Field RMAs and warranty — for deployed systems, recall costs, field service labor, and reputational damage vastly exceed the value of the defective optic.
- Supply discontinuity — an unplanned end-of-life event in volume production can force emergency redesign at exactly the moment the organization has no capacity to absorb it.

**KEY INSIGHT** The cost of switching optical suppliers grows non-linearly as a project advances. Switching during PoC might cost a few weeks.
Switching during production ramp can cost $500K–$2M in re-qualification, re-tooling, and delay — for a component with a unit price of $15.

## 3. How does sourcing from internet platforms vs. OEM partners affect lens reliability?

The magnitude of TCO exposure is not uniform across industries. Regulatory frameworks, safety criticality, deployment environment, and volume scale all determine how severely a bad sourcing decision compounds over time.

Graphic 3 – *TCO Risk by Industry Vertical: automotive, robotics, medical, and consumer/industrial IoT each carry distinct failure modes and regulatory barriers.*

**AUTOMOTIVE: WHERE OPTICS MEET FUNCTIONAL SAFETY**

Modern automotive cameras — for surround-view, ADAS, in-cabin driver monitoring, and high-definition ADB lighting — are safety-critical systems governed by ISO 26262 functional safety frameworks. An optic that introduces unexpected image artifacts, field of view drift under temperature cycling (−40°C to +105°C), or MTF degradation at the image periphery can compromise the detection capability of the downstream vision algorithm. Under ASIL-B or ASIL-D classification, such a failure is not a field quality issue — it is a liability event.

PPAP documentation for automotive programs requires a complete optical and mechanical process capability study. Switching lenses after PPAP sign-off forces a PPAP re-submission — a process that typically adds 3–6 months to any ramp schedule.

**ROBOTICS: ALGORITHM PERFORMANCE DEPENDS ON OPTICAL CONSISTENCY**

Machine vision and robotic guidance algorithms are trained and validated on a specific imaging pipeline. When lens MTF, distortion map, or shading profile changes between production lots — subtly, below what incoming inspection catches — the algorithm may begin producing errors that are nearly impossible to trace back to the optics without controlled lot-isolation testing.
This is one of the most insidious TCO risks in robotics: the failure mode is invisible until it manifests as a production quality escape downstream.

**MEDICAL: REGULATORY CLIFFS ARE REAL**

For disposable endoscopes and surgical imaging systems, the FDA 510(k) substantial equivalence pathway or CE MDR technical file review is the program’s most valuable asset. Any modification to the optical element — even a “drop-in equivalent” from a different supplier — may trigger a new submission. The cost of an unplanned 510(k) resubmission, including clinical and engineering preparation, typically runs into six figures and adds 6 to 18 months of delay.

## 4. What are the hidden costs of using commodity M12 lenses in production systems?

A practical TCO model for lens sourcing does not require actuarial precision — it requires the intellectual honesty to put plausible numbers on categories that typically go unstated in procurement reviews.

| | Internet / commodity | Catalog intermediary | Direct OEM |
| --- | --- | --- | --- |
| Unit price | Lowest | Moderate | Moderate–Higher |
| Qualification failure risk | Very High ▲▲▲ | Moderate ▲▲ | Low ▲ |
| Manufacturing process control | Uncontrolled | Partially controlled | Process-controlled |
| Unnotified change risk | High ▲▲▲ | Moderate ▲▲ | Minimal (change control) |
| EOL / supply discontinuity risk | Very High ▲▲▲ | High (upstream-dependent) | Low (lifecycle committed) |
| Lifecycle TCO | High ▲▲▲ | Moderate ▲▲ | Lowest ▲ |
| Engineering support | None | Limited | Full (design + process) |
| Lifecycle commitment | None | Partial | Contractual / roadmap-aligned |

*Risk ratings are qualitative assessments based on documented program outcomes. Source: Sunex Inc. internal analysis; Sunex M12 Sourcing Strategy whitepaper, 2025.*

**A THOUGHT EXPERIMENT FOR PROCUREMENT TEAMS**

Before approving a sourcing decision based primarily on unit price, apply this multiplier test:

- What is the engineering re-work cost if this lens fails qualification at the pilot stage? (Estimate in engineering-weeks × fully-loaded labor rate.)
- What is the regulatory re-certification cost if the lens must be changed after design freeze?
- What is one month of production delay worth in deferred revenue?
- What is the field RMA and warranty cost per unit at a projected 1–3% field failure rate on volume shipments?
- What is the cost of an unplanned EOL event during peak production?

**KEY INSIGHT** In most programs, if any two of these events materialize, the cost exceeds the lifetime savings on unit price from the cheaper lens — typically by an order of magnitude. This is not pessimism; it is the observed pattern across programs that have retrospectively applied TCO analysis.

## 5. How do you calculate the real cost of a lens redesign mid-program?

Sophisticated procurement teams have increasingly moved toward multi-criteria supplier evaluation frameworks. For optical components in mission-critical imaging systems, the following dimensions deserve equal or greater weighting than unit price:

**TECHNICAL DEPTH & DESIGN CAPABILITY**

- Does the supplier own their optical designs, or are they reselling third-party designs?
- Can they provide MTF data, distortion maps, and relative illumination curves — traceable to their own test equipment?
- Can they co-develop custom specifications to match your sensor’s CRA profile, active area, and operating environment?
- Do they have experience with athermalization, RGBIR co-registration, active alignment, or other advanced optomechanical capabilities relevant to your application?

**MANUFACTURING PROCESS CONTROL**

- What statistical process control (SPC) methods are applied to key optical parameters?
- Is binning available for yield-critical programs?
- What change-control procedures govern glass substitution, coating recipe updates, or assembly process modifications?
- Is traceability to production lot documented and retrievable?

**LIFECYCLE & SUPPLY COMMITMENT**

- What is the supplier’s published EOL policy, and how much advance notice do they commit to?
- Will the supplier contractually align their product lifecycle to your program roadmap?
- Are alternative or second-source options identified in advance, rather than reactively?
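The multiplier test from section 4 reduces to simple arithmetic. A back-of-envelope sketch in Python — every input below is an illustrative placeholder, not Sunex data or a documented program outcome; substitute your own program's figures:

```python
# Back-of-envelope TCO multiplier test. All inputs are illustrative placeholders.
ENG_WEEK = 4_000            # fully-loaded engineering labor, $/engineer-week
rework = 6 * 8 * ENG_WEEK   # 6 engineers reassigned for 8 weeks of re-work
recert = 250_000            # regulatory re-certification after design freeze
delay = 400_000             # one month of production delay in deferred revenue
units = 100_000             # projected annual volume
rma = (units * 2 // 100) * 150  # 2% field failure rate x $150 module value

exposure = rework + recert + delay + rma
unit_saving = (15 - 8) * units  # $7/unit saved by choosing the cheaper lens

print(f"risk exposure ${exposure:,} vs lifetime unit saving ${unit_saving:,}")
```

With these placeholder inputs the combined exposure already exceeds the lifetime unit-price saving — the pattern the key insight above describes — and the gap widens as volume or failure rate grows.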
**COMMERCIAL COMPETITIVENESS**

The argument for OEM partnership is not an argument for paying premium prices without accountability. Commercially competitive OEM manufacturers can and should be held to pricing that reflects their cost structure, volume scaling, and the shared value of a long-term partnership. The goal is not to choose between cost efficiency and quality — it is to identify suppliers who deliver both, with engineering depth to back it up.

**KEY INSIGHT** The right question is not “which lens is cheapest?” but “which supplier will help us ship on time, at quality, and keep us in supply for the life of our product?” Those are different questions that often have different answers.

## 6. Procurement Checklist: Questions to Ask Before Committing

- ☐ Optical performance (MTF, distortion, shading) validated across the full operating temperature and vibration range — not just ambient.
- ☐ Lot-to-lot traceability and change-control procedures documented and reviewed with the supplier.
- ☐ Supplier owns the optical design (not a reseller); design modification is possible without third-party dependency.
- ☐ EOL policy and advance-notice commitment reviewed and matched to the product lifecycle.
- ☐ Yield, binning, and active alignment options evaluated and costed for the projected production volume.
- ☐ Regulatory certification plan (automotive PPAP, FDA 510(k), CE MDR) reviewed for optics change-control implications.
- ☐ Direct engineering support available for qualification, failure analysis, and field investigation.
- ☐ Supply continuity scenario (forecast variability, allocation risk, safety stock) modeled for peak production.
- ☐ TCO model — including re-qualification, yield loss, and RMA exposure — presented alongside unit price in sourcing approval.

## Conclusion: The Right Partner, Not Just the Cheapest Part

The lens is rarely the most expensive component in an imaging system.
But in program after program — in automotive, robotics, medical, and industrial vision applications — the lens has proven to be the component whose sourcing decision carries the greatest TCO leverage.

A well-specified, process-controlled, lifecycle-committed optic from a capable OEM manufacturer reduces risk across every dimension of the project: engineering schedule, qualification, yield, field performance, and supply continuity. Making a sourcing decision based solely on the cheapest unit price is not a conservative choice — it is a high-risk one. The risks are simply deferred, compounded, and revealed at the worst possible moment: during production ramp, at a regulatory milestone, or after deployment.

The recommendation is to insist on a supplier who is commercially competitive and technically capable — one with deep optical and manufacturing engineering experience, a documented quality system, and the organizational commitment to support your program through its full lifecycle. That combination is not a premium. It is the most cost-effective sourcing decision your program can make.

**Sources & Further Reading**

- Sunex Inc. *Choosing the Right Sourcing Strategy for M12 Lenses.* Sunex Technology & Resource Hub, September 2025. sunex.com/2025/09/22/choosing-the-right-sourcing-strategy-for-m12-lenses/
- ISO 26262:2018 — *Road vehicles: Functional safety.* International Organization for Standardization. iso.org/standard/68383.html
- U.S. FDA. *Deciding When to Submit a 510(k) for a Change to an Existing Device.* FDA Guidance, October 2017. fda.gov
- Automotive Industry Action Group (AIAG). *Production Part Approval Process (PPAP)*, 4th Edition. AIAG, 2006.
- IEC 60601-1:2005+AMD1:2012 — *Medical electrical equipment: General requirements for basic safety and essential performance.* IEC.
- Embedded Vision Alliance. *Embedded Vision Market Study*, 2024. embedded-vision.com
- Sunex Inc. *RGBIR Lens Technology for Automotive In-Cabin Monitoring.*
Sunex Technology & Resource Hub, 2025. sunex.com/products/rgbir/

---

## Image Circle and Sensor Format

- Source: https://sunex.com/2026/03/26/image-circle-and-sensor-format/
- Summary: The relationship between image circle and sensor format is what determines the Field of View (FOV) of your system. Overlooking how they interact can lead to unexpected coverage gaps, resolution limits, and performance tradeoffs. The lens image circle must equal or exceed the diagonal of your camera sensor — or you get vignetting (dark corners) that ruins image quality. If it is larger than needed, you are paying for optical coverage you are not using. Getting image circle matched to sensor format is the first step in any lens-sensor system design. This article explains the relationship between image circle and sensor format, how to calculate FOV from these parameters, and common mistakes engineers make when pairing lenses with sensors.

# Understanding the Relationship That Determines Your System's Field of View

In this white paper:

- What is a lens image circle, and why does it matter?
- How do I match a lens image circle to my camera sensor format?
- What happens if the image circle is smaller than the sensor?
- How does sensor format affect the field of view of my camera system?
- What sensor formats are commonly paired with M12 lenses?
- Conclusion

When engineers set out to build an imaging system, the conversation usually starts with field of view: how wide, how narrow, how much of the scene needs to be captured. From there it is tempting to look up a focal length, find a lens that matches, and move on. What often goes unexamined until something goes wrong is the relationship between the lens’ image circle and the physical dimensions of the sensor. That relationship is not a secondary detail. It is the foundation on which field of view, resolution, and image quality are all built.
This article walks through what image circle is, how it connects to sensor format, and why both must be understood together to arrive at a system that captures exactly what it is supposed to. Along the way, we look at how pixel pitch fits into the picture, how different coverage geometries create different requirements, and how seemingly minor specification mismatches produce problems that are easy to avoid once the underlying geometry is clear.

## What is a lens image circle and why does it matter?

A lens does not project a rectangle. It projects a cone of light that produces a circular footprint on the image plane. Within that circle is the region where the lens delivers usable brightness, sharpness, and geometric accuracy. The diameter of that region is the image circle.

Every lens is designed around a specific image circle. During the design process, the optical engineer defines a maximum field angle or image height, and the lens is optimized to perform within that boundary. The result is a lens that performs well up to a certain diameter, and with diminishing returns beyond it. That diameter, doubled from the maximum image height, is the nominal image circle.

The nominal image circle is the specification you will find on a datasheet. It is the designer’s intended coverage diameter. In practice, the lens continues to project some usable light beyond this boundary — what Sunex defines as the true image circle, measured at the point where relative illumination falls to 10%. For wide-angle and fisheye designs, the true image circle typically extends 10–15% beyond the nominal value. For narrower field lenses, it can be 25–30% beyond. But that additional coverage is not guaranteed to be uniform or fully corrected, which is why the nominal value remains the proper basis for sensor compatibility decisions.

The image circle must fully cover the sensor diagonal, or the corners and edges of the captured image will fall into vignetting or hard clipping.
This is a geometric constraint — it has nothing to do with whether the lens is functioning correctly. When a lens is described as a “1/3-inch format lens,” that description is shorthand for the image circle it was designed to cover — in this case, a circle whose diameter matches the diagonal of a 1/3-inch sensor (6mm). Pairing that lens with a larger sensor means asking it to cover an area it was never designed for. The lens will not fail, but the sensor corners will.

## How do I match a lens image circle to my camera sensor format?

Sensor format notation — 1/4″, 1/3″, 1/2″, 2/3″, 1″, and so on — is one of the more persistently confusing conventions in imaging. The fractions do not represent the width or diagonal of the sensor in inches. They are inherited from the era of vidicon vacuum tube cameras, where the fraction referred to the outer tube diameter. The actual active imaging area of a modern solid-state sensor is roughly two-thirds of what the notation implies. A 1/2″ sensor is not half an inch wide — its active area is closer to 6.4mm × 4.8mm.

This matters because the format name alone is insufficient for optical system design. What matters is the actual physical dimension of the active area, and specifically the diagonal, which is the distance from corner to corner across the sensor’s imaging surface. That diagonal is the number that must be matched against the lens image circle. The table below lists common sensor formats with their actual dimensions and the minimum image circle needed to cover each format fully.
| Format | Width (mm) | Height (mm) | Diagonal (mm) | Minimum image circle |
| --- | --- | --- | --- | --- |
| 1/4″ | 3.2 | 2.4 | 4.0 | 4.0 mm |
| 1/3″ | 4.8 | 3.6 | 6.0 | 6.0 mm |
| 1/2.5″ | 5.8 | 4.3 | 7.2 | 7.2 mm |
| 1/2″ | 6.4 | 4.8 | 8.0 | 8.0 mm |
| 1/1.8″ | 7.2 | 5.4 | 9.0 | 9.0 mm |
| 2/3″ | 8.8 | 6.6 | 11.0 | 11.0 mm |
| 1″ | 13.2 | 8.8 | 15.9 | 15.9 mm |
| APS-C | 23.5 | 15.6 | 28.2 | 28.2 mm |
| Full Frame (35mm) | 36.0 | 24.0 | 43.3 | 43.3 mm |

*Table 1: Common sensor formats, their physical active area dimensions, and the minimum image circle required for full corner-to-corner coverage.*

When working with a specific sensor, always pull the actual dimensions from the manufacturer’s datasheet rather than relying on the format label. Variations of a few tenths of a millimeter exist between manufacturers and product generations, and for tight image circle margins, those differences matter.

The minimum image circle column above represents the sensor diagonal — the hard floor for lens compatibility. In practice, specifying a lens whose nominal image circle exceeds the sensor diagonal by 5–10% provides meaningful margin against manufacturing variation, temperature-related shifts in the optical path, and focus changes across the working distance range.

**Pixel Pitch: Resolving Power Beyond the Image Circle**

Alongside physical format, pixel pitch is the other sensor specification that directly constrains lens selection. Pixel pitch is the physical size of each individual pixel, measured in micrometers. Modern imaging sensors range from below 1µm in compact consumer devices to 5µm and above in machine vision and scientific cameras.

A lens has a finite resolving power, and that limit is often expressed as a minimum pixel pitch, which means nothing more than the smallest pixel the lens can usefully resolve. If a lens is rated for a 1.67µm pixel pitch, it can resolve detail down to that level.
The key asymmetry is this: a lens can work with a sensor whose pixel pitch is equal to or slightly larger than its rated minimum, but it cannot compensate for a sensor with a smaller pixel pitch than it is designed to resolve.

| Scenario | Lens Pixel Pitch Spec | Sensor Pixel Pitch | Result |
|---|---|---|---|
| Well-matched system | 1.67 µm | 1.67 µm | Full resolution delivered |
| Acceptable — sensor pixel slightly larger | 1.67 µm | 2.0 µm | Works well |
| Mismatch — lens cannot resolve sensor pixels | 3.45 µm | 1.67 µm | Image appears soft |

*Table 2: Pixel pitch compatibility between lens and sensor determines whether the full resolving capability of the sensor can be utilized.*

Put simply: if the sensor’s pixels are finer than the lens can resolve, the lens becomes the bottleneck. The image will appear soft regardless of the sensor’s megapixel count. Selecting a lens whose pixel pitch specification matches the sensor, or runs slightly finer, ensures the sensor’s resolution is not wasted. The applications engineers at Sunex can provide the minimum pixel pitch a lens can resolve upon request, making direct comparison to sensor specifications straightforward during the selection process.

## How does sensor format affect my camera system’s field of view?

Field of view is the angular extent of the scene that the imaging system captures — stated as horizontal FOV, vertical FOV, or diagonal FOV. It is the central performance requirement for most imaging applications. And yet it is not a property of the lens alone. It is a property of the combination of lens and sensor. The same focal length produces a different field of view on every sensor format. A 6mm lens on a 1/3″ sensor delivers roughly 44° horizontal field of view. The same 6mm lens on a 1″ sensor delivers roughly 75°. Move that lens to an APS-C sensor and the horizontal FOV expands further still. The lens has not changed; what has changed is the size of the rectangular window being cut from the image circle. Field of view is a system specification, not a lens specification.
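Under a simple rectilinear (pinhole) model, an approximation that ignores lens distortion and therefore deviates for real wide-angle designs, the FOV along each axis follows directly from the sensor dimension and focal length:

```python
import math

def fov_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Field of view along one axis for a rectilinear (pinhole) lens:
    FOV = 2 * atan(d / (2 * f)). Real lenses, especially wide-angle
    designs, deviate due to distortion; treat this as an estimate."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# A 6 mm lens on a 1/3" sensor (4.8 mm x 3.6 mm active area):
print(round(fov_deg(4.8, 6.0), 1))  # horizontal, ~43.6 deg ("roughly 44" above)
print(round(fov_deg(3.6, 6.0), 1))  # vertical, ~33.4 deg
```

Moving the same 6 mm lens to a wider sensor increases both results, which is exactly why a focal length quoted without a sensor format leaves the FOV undefined.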
Quoting a focal length without specifying the sensor format it is paired with leaves the actual field of view undefined. *Figure 1: Illustration of two systems with the same sensor format, the same lens EFL, but two different lens image circles.* This is one of the most common sources of specification confusion in imaging system procurement. A lens that delivered the right field of view on one project gets carried over to a new project using a different sensor, and the coverage angles change completely. The lens is doing what it always did; the sensor format is doing something different with it. The underlying relationship is straightforward: a wider sensor dimension or a shorter focal length produces a wider field of view. A narrower sensor or a longer focal length produces a narrower one. This applies independently to the horizontal and vertical axes, which means changing the sensor format changes both FOV values simultaneously, in proportion to the change in physical sensor dimensions.

**Coverage Mode: What Part of the Image Circle Is the Sensor Using?**

Beyond the simple question of whether the image circle covers the sensor diagonal, there is a more nuanced question about how the image circle and sensor geometry relate to each other. Depending on the application, the answer changes what counts as an adequate image circle. When the image circle is larger than the sensor diagonal (overfill), the sensor sits entirely within the optimized zone of the lens. Every pixel on the sensor receives well-corrected, uniformly illuminated light. This is the most robust configuration and the one that provides the most margin against real-world variation. When the image circle matches the sensor diagonal precisely (full frame coverage), the sensor corners land right at the edge of the image circle. The lens is performing at its design limit at those corners, which demands a well-controlled design and careful manufacturing.
This configuration is common in high-resolution industrial and broadcast lenses, where the sensor is as large as possible and the lens must be optimized all the way to the corner. Sunex’s large-format lens series, designed for 1″, APS-C, and full-frame sensors, addresses exactly this requirement, maintaining controlled MTF performance out to a 43mm image circle diameter. When only the horizontal dimension needs to be covered (full horizontal coverage), the image circle may be smaller than the sensor diagonal, as long as it spans the full width. The corners of the sensor fall outside the image circle and are dark, but the horizontal field of view is fully captured. This configuration is used in some panoramic and wide-field surveillance applications where the vertical extent of the scene is less important than the horizontal sweep. The most demanding configuration from an image circle standpoint is circular fisheye containment, where the entire image circle, the complete 360° or 180° disk projected by the lens, must fit within the sensor boundaries. This requires the image circle diameter to be smaller than the sensor’s smaller dimension, not just the diagonal. A fisheye lens with a 5.6mm image circle, for example, needs a sensor whose height is at least 5.6mm for the full circular image to land entirely within the active area.

*Figure 2: Image Circle vs. Sensor Format (Full Frame Overfill · Full Frame · Full Horizontal · Partial Frame · Circular Fisheye). The same lens image circle produces different coverage geometries — and different effective fields of view — depending on sensor format and application coverage requirements.*

## What happens when the image circle is smaller than the sensor?

The coverage mode question becomes most consequential when it is not addressed during system design, and the mismatch shows up during integration.
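The coverage geometries described above reduce to comparisons between the image circle diameter and the sensor's dimensions. A minimal sketch (the labels and the 1% "full frame" tolerance are illustrative, not a Sunex specification):

```python
import math

def coverage_mode(circle_mm: float, width_mm: float, height_mm: float) -> str:
    """Classify how a lens image circle relates to a sensor's active area,
    using the coverage modes discussed above (labels are illustrative)."""
    diagonal = math.hypot(width_mm, height_mm)
    if circle_mm <= min(width_mm, height_mm):
        return "circular containment"  # whole fisheye disk fits on the sensor
    if math.isclose(circle_mm, diagonal, rel_tol=0.01):
        return "full frame"            # corners land at the circle's edge
    if circle_mm > diagonal:
        return "overfill"              # corners fully inside the optimized zone
    if circle_mm >= width_mm:
        return "full horizontal"       # full width covered, corners dark
    return "partial frame"             # not even the full width is covered

# The fisheye scenario below: a 5.6 mm image circle on an
# 8.06 mm x 4.54 mm sensor overflows vertically (5.6 > 4.54)
# without spanning the full width (5.6 < 8.06).
print(coverage_mode(5.6, 8.06, 4.54))  # -> partial frame
print(coverage_mode(5.6, 6.77, 5.66))  # -> circular containment
```

The ordering of the checks matters: containment is tested against the shorter dimension first, because a disk that fits inside the frame trivially fails every diagonal-based test.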
The following scenario illustrates how image circle and sensor geometry interact in a fisheye application, and the range of solutions available when the initial combination does not meet the coverage requirement. Consider a system designed to capture a 180° × 180° full-circle fisheye image, with the complete circular image fully contained on the sensor. The selected lens has an image circle of 5.6mm. The selected sensor has an active area of 8.06mm × 4.54mm. The lens itself produces an acceptable FOV across the horizontal and the diagonal (173°), which for most applications would be sufficient. But this application requires the entire fisheye circle to sit inside the sensor boundaries. That means the image circle diameter of 5.6mm must fit within the shorter sensor dimension, which is 4.54mm vertically. It does not. The circular image overflows the top and bottom of the sensor, and the resulting footage shows the fisheye disk cut off at the vertical edges. This is not a lens defect. The lens is projecting exactly the image circle it was designed to produce. The problem is that the sensor’s shorter dimension is smaller than the image circle diameter — a geometric constraint that no amount of refocusing, iris adjustment, or firmware tuning can resolve. Once the geometry is understood, the paths forward are clear. The right solution depends on which element of the system can be changed and what the application can tolerate. The first option is to find a lens with a smaller image circle that still achieves a comparable field of view. If a lens can produce a similar FOV with an image circle at or below 4.54mm, the complete fisheye disk will land within the sensor’s vertical span. The full circular image is preserved. The second option is to use a lens with the same or larger image circle but a wider field of view, one that reaches the full 180° diagonal.
In this case, the circular image still overflows the sensor vertically, but it does so at 180°: the horizontal edges of the sensor align with the 180° boundary of the fisheye, giving a usable semicircular or full-circle crop depending on the exact geometry. The clipping becomes intentional and predictable rather than arbitrary. The third option is to keep the lens and change the sensor. The requirement is a sensor whose shorter dimension is at least equal to the image circle diameter, 5.6mm or more vertically. A sensor with an active area of 6.77mm × 5.66mm, for example, clears this threshold. The lens is unchanged; it now projects its full circular image within the sensor boundaries. This option must also account for the lens’s new FOV on the new sensor, since changing the sensor format changes the field of view (as covered in Part 3 of this article). The fourth option is mechanical: rotate the sensor 90°. With the sensor’s longer dimension (8.06mm) now running vertically, the 5.6mm image circle easily fits within the frame. The circular image is fully contained vertically; horizontally, the image circle extends beyond the left and right edges of the sensor, but for applications that only need vertical containment, this may be entirely acceptable. Each of these options changes a different element of the system: the lens, the sensor, or the orientation. All of them, however, stem from understanding the same underlying constraint: image circle diameter relative to sensor dimensions, driven by the specific coverage geometry the application requires. Sunex applications engineers work through exactly this kind of analysis as part of lens selection consultations, matching image circle specifications from the M12 and large-format portfolios to sensor geometry and application coverage requirements before any hardware is committed.

## What sensor formats are commonly paired with M12 lenses?
Bringing together image circle, sensor format, pixel pitch, and field of view into a coherent system specification is less complicated than it might seem once the relationships are understood. The following framework consolidates the key decision points. **Start with the sensor’s actual physical dimensions.** Pull the active area width, height, and diagonal from the sensor manufacturer’s datasheet; do not estimate them from the format label. For most applications, the diagonal is the minimum image circle your lens must provide for full-frame coverage. **Define what coverage geometry the application actually needs.** Full frame, full horizontal, overfill, or circular containment each place different demands on the image circle. Establishing this early prevents the common mistake of specifying a lens that covers the sensor diagonal but fails to meet a more specific coverage requirement that the application turns out to have. **Determine the required field of view and the focal length it implies.** State the FOV requirement in degrees (horizontal, vertical, or both) and calculate the focal length needed to achieve it on the chosen sensor format. Horizontal and vertical FOV are independent calculations based on the sensor width and height respectively. Because FOV and sensor format are linked, changing either changes the other: be explicit about both. The Optics Wizard at sunex.com/support can help calculate the effective focal length needed for a given FOV and sensor size. **Verify image circle coverage with margin.** Identify lens candidates that meet the focal length requirement and confirm their nominal image circle exceeds the required sensor dimension by at least 5–10%. The nominal image circle is a design target, and real lenses will fall within a tolerance band around it. Margin ensures the system stays within specification across that variation.
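The focal-length step above inverts the pinhole FOV relation: f = (d / 2) / tan(FOV / 2), where d is the sensor dimension along the axis of interest. A minimal sketch under the rectilinear approximation (real lens designs, and tools like the Optics Wizard, also account for distortion):

```python
import math

def efl_for_fov(sensor_dim_mm: float, fov_deg: float) -> float:
    """Effective focal length needed to achieve `fov_deg` along an axis
    of length `sensor_dim_mm`, under a rectilinear (pinhole) model."""
    return (sensor_dim_mm / 2) / math.tan(math.radians(fov_deg) / 2)

# ~44 deg horizontal FOV on a 1/3" sensor (4.8 mm wide) implies ~6 mm EFL,
# consistent with the 6 mm / 1/3" example earlier in this article.
print(round(efl_for_fov(4.8, 44.0), 2))  # -> 5.94
```

Running the calculation for width and height separately makes explicit that horizontal and vertical FOV requirements can imply different focal lengths, in which case the tighter requirement governs.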
Sunex publishes nominal image circle on all lens datasheets, and the Optics Wizard at sunex.com/support can further filter lens options by sensor format and coverage requirement, narrowing the candidate list quickly. **Match pixel pitch between lens and sensor.** Confirm that the lens’s pixel pitch specification is equal to or smaller than the sensor’s pixel pitch. A lens with a coarser pixel pitch spec than the sensor cannot resolve the sensor’s pixels; the image will be soft regardless of sensor resolution. This is a frequently overlooked constraint that is easy to check and worth verifying explicitly. **Validate with hardware.** Flat-field images across the full sensor area confirm uniformity. MTF measurements at the sensor corners confirm that the edge of the image circle is delivering usable sharpness. If the application operates across a temperature range or at varying focus distances, validate at the extremes. Datasheet specifications describe design intent; hardware measurements confirm actual system behavior. Sunex recommends testing production-representative lens samples rather than relying on nominal values alone, particularly for applications with strict uniformity or corner performance requirements.

## Conclusion

Image circle, sensor format, and field of view are not three separate specifications to be checked independently. They are three expressions of the same underlying geometry. The image circle sets the boundary of what the lens can cover. The sensor format determines what portion of that coverage is captured and at what field angle. Pixel pitch determines whether the sensor can take full advantage of the lens’s resolving capability. Together, these parameters define the actual performance of the imaging system, not the performance of the lens in isolation.
The most common problems that arise from misunderstanding this geometry are also among the easiest to avoid: image clipping that is attributed to lens quality but is actually a coverage mismatch, soft images on high-resolution sensors paired with insufficient lens resolving power, and field-of-view errors that appear when a focal length is carried over to a different sensor format. In each case, understanding the lens-sensor relationship before hardware is selected eliminates the problem before it becomes one. Sunex’s lens portfolio spans image circles from under 4mm through 43mm, covering sensor formats from 1/4″ through full frame, with pixel pitch specifications down to 1.67µm. The Optics Wizard and AI-powered Optics Consultant at sunex.com/support provide guided lens selection based on sensor format, coverage requirements, field of view, and working distance — and Sunex applications engineers are available to work through more detailed system specifications where standard tooling is not sufficient. ## Related Resources - Lens Image Circle — sunex.com/knowledge-center - Choosing the Right Sourcing Strategy for M12 Lenses — sunex.com/knowledge-center - Optics Wizard & AI-Powered Optics Consultant — sunex.com/support - Large Format Lenses (1″, APS-C, Full Frame) — sunex.com/products/largeformat - M12 Fisheye Lens Portfolio — sunex.com/products --- ## Advanced Optical Solutions for the Next Generation of Smart Agriculture - Source: https://sunex.com/2025/12/12/advanced-optical-solutions-for-the-next-generation-of-smart-agriculture/ - Summary: Imaging is becoming the backbone of Smart Agriculture because it scales across autonomy, analytics, environmental intelligence, and facility automation. The value is clear: improved yield, reduced inputs, higher machine efficiency, better documentation, and faster response to stress and events. 
## How Advanced Optics and Camera Modules Enable Autonomy, Analytics, and Resilient Farming Smart agriculture increasingly depends on imaging systems to automate field monitoring, detect crop stress, identify pests, guide autonomous equipment, and generate precision analytics. The optical requirements span an unusually wide range: from wide-area drone survey lenses to close-range multispectral sensors, from RGB day cameras to NIR-enabled night inspection systems. No single off-the-shelf lens handles all of these applications — and agricultural field conditions are harsh. This article covers the key imaging applications in smart agriculture, the optical performance parameters that matter for each, and how Sunex’s miniature lens and camera module portfolio addresses the needs of agricultural OEMs and integrators. ## What imaging systems are used in smart agriculture and precision farming? Smart Agriculture is rapidly evolving from connected equipment toward closed-loop, perception-driven systems that sense, decide, and act—at scale and at the edge. Imaging is at the center of this transformation. Cameras provide spatial context and plant-level insight that enable autonomy, analytics, and increasingly, direct yield optimization through precision intervention. Modern agricultural imaging systems support a wide range of applications, from autonomous harvesting and machine guidance to aerial crop analytics, environmental intelligence, and facility automation. More recently, vision has become a key enabler of plant-specific action, including selective weed treatment, mechanical thinning or picking, and in-harvest crop counting and quality assessment. These applications deliver immediate economic value by reducing chemical inputs, lowering labor dependency, and improving yield consistency and traceability. Deploying imaging successfully in agricultural environments requires more than selecting a sensor. 
Systems must perform reliably under extreme lighting variation, dust, moisture, vibration, and temperature swings—often for long operating hours with limited maintenance. Optical design, manufacturability, and camera module integration play a critical role in determining real-world performance, calibration stability, and scalability. Sunex supports Smart Agriculture imaging across this full spectrum of applications through precision optics, robust wide-field designs, and advanced technologies such as DXM™ single-sensor stereo imaging. By enabling reliable depth perception, repeatable geometry, and production-ready camera modules, Sunex helps customers translate imaging performance into operational reliability, scalable deployment, and measurable yield improvement. ## What optical requirements do agricultural drone cameras need? Agriculture is simultaneously an outdoor robotics problem, an environmental sensing problem, and a logistics problem. The farm is not a controlled factory floor: lighting changes by the minute; airborne particulates fluctuate with wind and field operations; surfaces are irregular; and targets—plants—are living structures that change over days and weeks. Imaging provides the flexibility to handle this variability because it captures dense spatial information. Modern perception stacks transform pixels into actionable insights: navigation lines, crop health indices, fruit counts, weed segmentation, obstruction classification, and anomaly detection. Yet cameras do not operate in isolation. 
Imaging performance is the product of: - Optics (lens design, FOV, distortion, stray light control, focus stability) - Sensor (pixel size, dynamic range, shutter type, NIR sensitivity) - Illumination (sun and sky, artificial lighting, spectral characteristics) - Mechanics (alignment stability, sealing, thermal behavior) - Compute (ISP, edge inference, compression and streaming) - Manufacturing (tolerances, repeatability, calibration strategy) In agriculture, “good enough” optical choices often fail at the system level: a lens that looks fine in a lab can wash out in low sun glare, drift focus across temperature, or produce distortion that breaks row-detection geometry. Conversely, robust optical and module design reduces software complexity and improves model generalization, which directly impacts time to deployment and system reliability. Sunex approaches this problem from an optics-first but system-aware standpoint: lens performance is developed alongside manufacturability, environmental stability, and integration constraints so camera systems can ship reliably at volume. **Cross-Cutting Requirements for Agricultural Imaging Systems** **Lighting and Dynamic Range** Field conditions combine high contrast scenes (bright sky + shaded canopy) and strong specular reflections (wet leaves, irrigation water, plastic mulch, metal roofs). Cameras require: - High dynamic range (HDR) capability (sensor + optics supporting it) - Stray light and ghosting control in the lens to preserve contrast - Optional RGB-IR or NIR sensitivity for dusk/dawn or vegetation analytics How Sunex helps: Sunex designs lenses optimized for contrast and environmental reliability, supporting imaging modalities that demand consistent performance under challenging illumination and across operating life. **FOV vs. 
Resolution Trade Space** Autonomous machines need wide coverage to see rows, edges, people, and obstacles, but analytics tasks often require high spatial detail to measure leaf-level features or detect disease patterns. This leads to multi-camera architectures with: - Wide-FOV navigation cameras (including SuperFisheye™) - Narrower-FOV inspection cameras (higher magnification / detail) - DXM™ where stereo or dual-FOV is enabled on a single sensor How Sunex helps: Sunex offers an extensive off-the-shelf portfolio (including many M12-format options) and custom optical solutions that enable a wide range of applications. **Environmental Robustness** Dust, mud, fertilizer mist, cleaning chemicals, UV exposure, and temperature swings can degrade systems quickly. Key optical and module considerations: - Sealing strategy and protective windows - Coating durability and cleaning compatibility - Focus stability versus temperature and mechanical stress - Vibration and shock tolerance for off-road machinery How Sunex helps: Sunex emphasizes robust lens and module designs: fully athermalized systems with attention to environmental stability and repeatable assembly processes that yield small part-to-part variance—supporting long-life deployment in harsh settings. **Calibration, Repeatability, and Scale** Agricultural autonomy and analytics depend on repeatable camera geometry. Inconsistent focal length, distortion, or optical axis alignment increases calibration burden and can degrade model performance across fleets. How Sunex helps: Sunex’s manufacturing and integration capabilities—such as precision assembly and fully automated 6-axis active alignment for camera modules—support consistent optical performance, enabling scalable calibration strategies and more predictable field performance.
**Application Area 1: Autonomous Harvesting & Machine Guidance** Autonomous harvesting and machine guidance represent one of the most technically demanding vision applications in Smart Agriculture. Agricultural machinery must operate safely and accurately in open, unstructured environments where lighting, dust, terrain, and crop conditions vary continuously. Imaging systems provide the spatial understanding required for these machines to navigate crop rows, align headers and implements, coordinate with grain carts, and detect obstacles such as people, animals, or debris in real time. Unlike factory automation, agricultural autonomy cannot rely on fixed markers or controlled surfaces. Vision algorithms must infer position and intent from natural features such as row geometry, canopy structure, stubble edges, and machine-to-crop relationships. This places a strong dependency on optical consistency. Lens field of view, distortion behavior, contrast performance, and focus stability directly affect how reliably perception algorithms perform across different fields, crops, and seasons. Modern autonomous harvesting platforms typically employ a multi-camera architecture. Wide-field cameras provide situational awareness and navigation context, while more focused cameras monitor critical interaction zones such as headers, cutters, and implements. Increasingly, depth perception is added to improve machine control, safety, and robustness—particularly in scenarios involving uneven terrain, varying crop height, or dynamic interactions between multiple vehicles. Stereo imaging is especially valuable in these use cases, enabling direct distance estimation and three-dimensional scene understanding without reliance on external infrastructure. Depth information improves obstacle detection, row height estimation, header positioning, and collision avoidance, while also reducing ambiguity in low-contrast or partially occluded scenes. 
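The depth benefit of stereo comes from the standard pinhole stereo relation Z = f·B / d, where f is the focal length (in pixels), B the baseline between the two views, and d the disparity. A minimal sketch with illustrative numbers (not Sunex DXM™ specifications):

```python
def stereo_depth_mm(focal_px: float, baseline_mm: float,
                    disparity_px: float) -> float:
    """Depth from the standard pinhole stereo relation Z = f * B / d.
    focal_px: focal length in pixels; baseline_mm: separation between
    the two optical channels; disparity_px: pixel shift between views.
    Values below are illustrative, not DXM(TM) parameters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# With a 1400 px focal length and a 30 mm baseline, an obstacle
# showing 20 px of disparity sits at about 2.1 m.
print(round(stereo_depth_mm(1400, 30.0, 20.0)))  # -> 2100 (mm)
```

Because depth error grows as disparity shrinks, the repeatable geometry and stable baseline the surrounding text emphasizes translate directly into range accuracy at working distance.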
Traditionally, stereo vision systems have required two separate cameras and a carefully controlled baseline, increasing system complexity, size, and calibration effort. Sunex DXM™ technology addresses this challenge by enabling single-sensor stereo imaging, where two optical channels project spatially separated views onto a single image sensor. This approach delivers true stereo depth information while simplifying mechanical integration, synchronization, and manufacturing. For agricultural machinery, DXM™ offers a compelling balance between performance and robustness, reducing the alignment sensitivity and drift risks associated with dual-camera systems operating under vibration and thermal cycling. From an optical standpoint, autonomous harvesting lenses—whether mono or stereo—must tolerate severe environmental stress while maintaining stable geometry. Low-angle sun, airborne dust, vibration, and temperature swings can all degrade image quality if optics are not specifically designed for these conditions. Optical performance, therefore, becomes a system enabler: better contrast, controlled distortion, and repeatable geometry directly reduce perception errors and software compensation overhead. Key imaging and optical requirements for autonomous harvesting include: - Wide to ultra-wide fields of view for navigation and situational awareness - Stable, repeatable distortion characteristics to support calibration and depth estimation - High contrast and low flare performance in sun-facing and dusty environments - Robust mechanical and thermal stability for long operating hours - Optional stereo or depth capability to enhance safety and precision control **How does Sunex advance autonomous harvesting and machine guidance?** Sunex supports autonomous agricultural platforms through a combination of wide-FOV optics, manufacturable lens designs, and advanced stereo imaging capabilities.
Sunex DXM™ single-sensor stereo technology enables compact, robust depth perception well-suited for OHV (Off-Highway Vehicles) machinery, reducing system complexity while improving spatial awareness. Combined with Sunex’s focus on production consistency and camera module integration, these capabilities help customers deploy scalable, reliable vision systems that perform consistently across fleets and operating seasons. **Application Area 2: Precision Crop Intervention & Yield Optimization** Precision crop intervention represents one of the most direct and measurable ways imaging systems improve agricultural outcomes. Unlike navigation or large-scale analytics, these applications operate at the individual plant level, where decisions translate immediately into reduced input costs, improved yield, and higher crop quality. Imaging enables machines not only to observe crops, but to act selectively and intelligently—treating the right plant, at the right time, in the right way. Typical use cases include automated weed detection and selective spraying, mechanical weed removal or thinning, targeted disease or nutrient treatment, and crop counting or grading during harvesting. These systems are often deployed on sprayers, cultivators, and harvesters, where cameras are mounted close to the crop canopy or directly adjacent to tools such as spray nozzles, cutters, or picking mechanisms. As a result, imaging requirements differ significantly from those used for navigation or aerial monitoring. Precision intervention systems demand high spatial resolution at close working distances, along with extremely low latency. Vision algorithms must detect, classify, and localize plants or weeds in real time—often at vehicle speeds—while maintaining consistent performance under variable lighting, dust, and vibration. 
Optical performance is therefore tightly coupled to actuation accuracy: any uncertainty in image geometry or depth estimation can lead to missed treatments, crop damage, or wasted chemicals. One of the key challenges in these applications is separating crops from weeds in dense or overlapping vegetation. This is especially difficult in later growth stages, where occlusion and varying plant height introduce ambiguity in two-dimensional imagery. Here, depth perception becomes a major advantage, enabling machines to distinguish plant structures spatially and to target interventions more accurately. Stereo imaging plays an increasingly important role in this context. Depth information improves weed discrimination, tool positioning, and spray targeting by providing three-dimensional context that complements semantic classification. However, traditional dual-camera stereo systems add complexity, size, and calibration sensitivity—challenges that are amplified when cameras are mounted near moving tools and exposed to vibration and debris. Sunex DXM™ single-sensor stereo technology is particularly well-suited for precision crop intervention. By projecting two spatially separated views onto a single image sensor, DXM™ delivers true stereo depth while simplifying mechanical integration and synchronization. This approach reduces system size and alignment risk, making it easier to embed depth perception directly into tool-mounted camera systems. For applications such as depth-aware spraying, mechanical picking, or plant counting in dense canopies, DXM™ enables more robust and repeatable control at the point of action. In addition to real-time intervention, imaging during harvesting enables yield measurement and validation at the moment of collection. Cameras mounted on harvesters can count fruit, ears, or plants, estimate size and quality, and correlate yield data with location and conditions in the field. 
This information closes the loop between treatment decisions earlier in the season and actual harvest outcomes, supporting continuous optimization across planting, treatment, and harvesting cycles. Key imaging and optical requirements for precision crop intervention include: - High-resolution imaging at short working distances - Tight and repeatable distortion characteristics for accurate localization - Low-latency image capture and processing for real-time actuation - Robust performance under dust, vibration, and changing illumination - Optional stereo or depth capability to resolve overlapping plants and control tool distance **How does Sunex advance precision intervention and yield optimization?** Sunex supports plant-level imaging applications through precision optics designed for controlled working distances, compact form factors, and consistent geometric performance. Combined with Sunex DXM™ single-sensor stereo technology, these solutions enable depth-aware targeting and counting while minimizing system complexity. Sunex’s focus on manufacturable designs and camera module integration helps customers scale precision intervention systems reliably across high camera counts and demanding agricultural environments, translating imaging performance directly into yield improvement and input efficiency. **Application Area 3: Aerial Crop Analytics & Drone Monitoring** Aerial imaging has become a cornerstone of precision agriculture, offering rapid, flexible insight across large areas that would be impractical to survey from the ground. Drones equipped with cameras are used to assess crop emergence, monitor plant health, identify stress patterns, evaluate irrigation effectiveness, and document storm or pest damage. Imaging allows growers and agronomists to detect issues early and respond with targeted interventions rather than broad, inefficient treatments. For aerial analytics, image consistency and geometric accuracy are critical. 
Many use cases rely on orthomosaic generation, time-series comparisons, and quantitative measurements rather than simple visual inspection. As a result, optical performance must remain stable across flights, drones, and deployed fleets. Even small variations in focal length, distortion, or edge sharpness can introduce errors in stitching and analysis. Drone imaging platforms also operate under strict constraints. Payload weight directly affects flight time, while vibration and rapid motion place additional stress on optics and sensors. Lighting conditions can shift dramatically within a single flight, from high noon sun to haze, cloud cover, or low-angle illumination near sunrise and sunset. Optics must deliver uniform sharpness and contrast across the image while minimizing flare and vignetting. Most agricultural drone systems combine multiple imaging modalities, such as RGB-IR. High-resolution RGB cameras support visual interpretation and mapping, while NIR or multispectral systems enable vegetation indices and crop health analysis. In all cases, lens performance plays a central role in determining data quality and downstream analytics reliability. Key imaging and optical requirements include: - Lightweight lens designs to preserve flight endurance - High and uniform sharpness across the full image field - Controlled distortion for accurate mapping and orthomosaic generation - Vibration tolerance and mechanical stability during flight - Compatibility with RGB-IR or multispectral sensing, as required **How does Sunex support aerial analytics?** Sunex provides compact, lightweight optical solutions optimized for embedded imaging platforms where mass and power efficiency matter. Its experience with wide-FOV and precision RGB-IR optics enables drone developers to balance coverage and resolution without compromising geometric reliability.
Sunex’s emphasis on production repeatability further supports fleet-level deployment, ensuring that data captured by different drones remains comparable over time. **Application Area 4: Environmental Monitoring & Weather Intelligence** Environmental monitoring systems form the sensing backbone of modern farms, providing localized intelligence that complements regional weather forecasts and point-based sensors. Cameras integrated into weather stations, field masts, or mobile nodes add valuable visual context to measurements such as temperature, humidity, wind, and precipitation. Imaging can confirm cloud cover, visibility, fog formation, dust events, and storm conditions, enabling more informed operational decisions. Unlike mobile platforms, environmental imaging systems are often expected to operate continuously for years with minimal maintenance. This places stringent requirements on optical durability and long-term stability. Lenses must resist UV exposure, temperature cycling, moisture ingress, and contamination while maintaining consistent focus and image quality. Any drift in optical performance can undermine the value of long-term data sets and trend analysis. Visual environmental monitoring is increasingly used to validate and enrich sensor data. For example, imaging can confirm rainfall intensity, identify localized fog pockets, or visually document erosion and runoff after heavy storms. In some deployments, day/night imaging or RGB-IR configurations extend monitoring capabilities beyond daylight hours, supporting around-the-clock situational awareness. 
Key imaging and optical requirements include: - Long-term focus and alignment stability - Resistance to UV exposure, moisture, and airborne contaminants - Wide operating temperature range - Optional support for low-light or day/night imaging modes **How does Sunex support environmental intelligence?** Sunex designs optics with environmental stability and durability in mind, supporting fixed outdoor installations that must perform reliably over long service lives. Through consistent optical performance and integration support for compact camera modules, Sunex helps enable scalable deployment of visual monitoring nodes across geographically distributed agricultural operations. **Application Area 5: Infrastructure Monitoring & Facility Automation** Smart Agriculture extends beyond fields and crops to include a wide range of supporting infrastructure, such as barns, grain storage facilities, processing areas, equipment yards, and perimeter zones. Imaging systems are increasingly deployed in these environments to improve operational efficiency, safety, and remote visibility. Cameras enable automated inspection, inventory monitoring, safety enforcement, and facility-level analytics, reducing manual labor and improving response times. Facility environments introduce their own imaging challenges. Dust, humidity, low or uneven lighting, and the presence of moving machinery can degrade image quality if optics are not properly designed. In enclosed spaces, wide-field coverage is often needed to minimize camera count, while certain tasks—such as level monitoring in silos or belt inspection—require predictable geometry and sufficient detail. In livestock or processing facilities, imaging can support automation and welfare monitoring while operating discreetly in constrained spaces. These applications often demand compact optical assemblies that can integrate cleanly into existing structures without interfering with daily operations. 
Key imaging and optical requirements include: - Reliable performance in low-light or mixed-lighting environments - Wide-FOV lenses for situational awareness in large interiors - Compact form factors for unobtrusive installation - Resistance to dust, moisture, and cleaning agents **How does Sunex support facility automation?** Sunex offers compact, high-performance lens solutions suited for embedded monitoring systems used throughout agricultural infrastructure. By combining optical performance with manufacturable designs and module-level integration support, Sunex enables customers to deploy imaging systems that remain reliable in demanding indoor and semi-outdoor environments while scaling efficiently across multiple facilities. **Implementation Roadmap: From Concept to Deployable Imaging Systems** **Define the Imaging Job-to-be-Done** For each application area, clarify: - What is the decision/action driven by vision? - What are the false-positive/false-negative costs? - What is the required detection distance and accuracy? - What operating hours and weather conditions must be supported? These answers translate into quantitative optical requirements: FOV, resolution, sensitivity, distortion tolerance, and environmental constraints. **Choose the Right Optical Approach** A successful strategy often uses a mix: - Wide-FOV lenses for navigation and coverage - Narrower-FOV lenses for detail tasks - Multi-camera arrays to reduce compromise - Calibration strategy aligned with manufacturing repeatability Sunex can support either selection from existing lens families or the development of custom optics that better match the application’s true constraints. 
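The first roadmap step above — translating a required detection distance and coverage into a field of view and focal length — can be sketched with the thin-lens (rectilinear) FOV relation. This is an illustrative sketch, not a Sunex tool: the function names and the example sensor width are our own assumptions, and wide-FOV or fisheye agricultural lenses deviate substantially from this pinhole model.

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Rectilinear (pinhole) approximation of horizontal field of view."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def focal_length_for_fov(sensor_width_mm: float, fov_deg: float) -> float:
    """Focal length needed to cover a target horizontal FOV on a given sensor."""
    return sensor_width_mm / (2 * math.tan(math.radians(fov_deg) / 2))

# Illustrative values: a sensor with ~5.37 mm active width
print(horizontal_fov_deg(5.37, 4.0))      # ~68 degrees with a 4 mm lens
print(focal_length_for_fov(5.37, 60.0))   # ~4.65 mm lens for a 60 degree HFOV
```

In practice, the distortion profile, pixel-level resolution targets, and environmental constraints narrow the choice further, which is where selection from existing lens families or a custom design comes in.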
**Integration and Scale Considerations** Moving from prototype to production requires: - Optical performance that is achievable with real manufacturing tolerances - Repeatable assembly and predictable alignment methods - Mechanical design aligned with sealing and thermal stability - A test strategy that verifies performance efficiently at volume Sunex’s strengths in manufacturable optics and camera module integration are directly relevant here: reducing risk as products transition from “works on the bench” to “works in the field, across fleets.” ## How do Sunex lenses support AI-based agricultural analytics systems? Imaging is becoming the backbone of Smart Agriculture because it scales across autonomy, analytics, environmental intelligence, and facility automation. The value is clear: improved yield, reduced inputs, higher machine efficiency, better documentation, and faster response to stress and events. But achieving these outcomes at scale requires more than selecting a camera—it requires optics and integration engineered for the realities of agriculture: extreme lighting, harsh environments, vibration, long life, and wide deployment. Sunex advances Smart Agriculture imaging by enabling robust optical performance and manufacturable designs that keep camera geometry consistent and reliable. Whether the goal is wide-FOV perception for autonomous harvesting, lightweight optics for drone mapping, rugged lenses for weather intelligence nodes, or compact cameras for facility automation, the same principle holds: better optics reduces system risk, shortens development cycles, and improves real-world performance. ## What lens specifications matter for autonomous agricultural vehicle vision? Turn imaging ideas into deployable systems—faster. The Sunex Smart Agriculture Imaging Discovery Worksheet is a practical, fill-in framework designed to align product, engineering, and manufacturing teams early in the development process. 
It helps structure the right technical conversations before critical architecture decisions are made—reducing risk, rework, and time to deployment across autonomous machines, precision intervention systems, drones, environmental monitoring, and facility automation. The worksheet is organized into focused tabs that guide discovery step by step: - Quick Discovery – A one-page overview for early conversations, trade shows, or first technical calls. - Application Overview – Defines the use case, platform, deployment scale, and timeline. - Operating Environment – Captures real-world conditions such as dust, moisture, temperature, vibration, and ingress protection. - Imaging Performance – Clarifies detection goals, resolution, latency, lighting, and spectral requirements. - Optical Requirements – Translates system needs into field of view, distortion, working distance, and packaging constraints. - Sensor & System – Aligns optics with sensor choice, compute platform, power, and synchronization needs. - Calibration & Manufacturing – Addresses scalability, tolerances, cost sensitivity, and production readiness. - Program Summary – Automatically consolidates inputs into a concise brief for internal alignment and next-step planning. Whether you’re evaluating a new concept or preparing for production, the discovery worksheet helps teams move forward with clarity—grounding imaging decisions in real application needs and setting the foundation for robust, scalable solutions. --- ## Why Resolution & Contrast Matter: A Practical Guide for Better Imaging - Source: https://sunex.com/2025/12/10/why-resolution-contrast-matter-a-practical-guide-for-better-imaging/ - Summary: Designing and implementing an imaging system often begins with a fundamental question: how much detail needs to be discerned in this image? While this may seem straightforward, the answer is not always obvious. 
Resolution and contrast are the two foundational metrics that determine whether an imaging system can distinguish the detail your application requires. Resolution defines how many distinct features a system can separate; contrast defines whether those features have enough brightness difference to be detected reliably. A high-resolution lens with poor contrast produces blurry, washed-out images — and a high-contrast lens with insufficient resolution misses fine detail entirely. Both are required simultaneously. This practical guide explains how resolution and contrast interact in real imaging systems, how MTF ties the two together, and how to specify both correctly for machine vision, medical imaging, and ADAS camera applications. ## What is the difference between resolution and contrast in imaging systems? Designing and implementing an imaging system often begins with a fundamental question: how much detail needs to be discerned in this image? While this may seem straightforward, the answer is not always obvious. Two key metrics help guide this decision: resolution, which defines how fine a level of detail can be resolved, and contrast, which determines how clearly adjacent features or differences in brightness can be distinguished. A clear understanding of both parameters is essential for achieving reliable, real-world performance in applications such as automotive, medical, robotics, geospatial, and immersive imaging. At Sunex, we bring the expertise needed to engineer and optimize imaging systems to achieve the best resolution and contrast required by the application, ensuring consistent and reliable performance. ## How do resolution and contrast interact in a camera lens? In simple terms, resolution refers to the smallest detail that can be distinguished in an image. Several factors influence system resolution, including the lens design, sensor pixel size, optical aberrations, and overall system geometry.
A higher-resolution lens-imager combination enables finer features to be captured, such as sharp object edges, subtle textures in terrain mapping, or small defects in inspection and medical imaging. Resolution can be measured using spatial frequency, which represents how frequently image features repeat over a given distance. Typically, this is illustrated with a pattern of alternating black and white lines and is measured in line pairs per millimeter (lp/mm). Higher spatial frequency corresponds to finer details and requires a higher resolution to capture them accurately. Understanding spatial frequency is important because it provides the basis for comprehending a system’s Modulation Transfer Function (MTF), which describes a system’s ability to reproduce contrast at different spatial frequencies. In other words, spatial frequency tells you the level of detail you are trying to resolve, and MTF tells you how well your system can reproduce it. One way to visualize the relationship between spatial frequency and MTF is with our MTF Impact Simulator, which shows how varying the MTF value affects image quality across different spatial frequencies. ## How do I specify resolution and contrast requirements for a machine vision lens? - Make sure your sensor’s pixel size matches the lens resolution capability (no sense in pairing a 12 MP sensor with a lens that can only resolve ~2 MP worth of detail). - Consider the effective focal length (EFL) and field of view (FOV): for a given imager, a longer focal length (narrower FOV) often yields higher detail. - Optical aberrations (e.g., astigmatism, spherical, coma) degrade resolution: good lens design matters. ## What is Contrast? Contrast refers to the difference in brightness (or signal) between adjacent features in your image. Practically speaking, it describes how well a dark object stands out from a light background, or how clearly two adjacent features can be distinguished when their brightness levels are similar.
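Two of the quantities above can be pinned down numerically: the sensor's Nyquist limit in line pairs per millimeter (set by pixel pitch, since one line pair needs at least two pixels) and the brightness modulation between the dark and light lines of a test pattern (the Michelson definition commonly used in MTF measurement). A minimal sketch, with illustrative values and our own function names:

```python
def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Sensor Nyquist limit in lp/mm: one line pair spans at least two pixels."""
    return 1000.0 / (2.0 * pixel_pitch_um)

def michelson_contrast(i_max: float, i_min: float) -> float:
    """Brightness modulation between adjacent light and dark features (0..1)."""
    return (i_max - i_min) / (i_max + i_min)

# Illustrative example: a 3 um pixel resolves at most ~166.7 lp/mm, and a
# pattern with peaks at 200 counts and troughs at 50 counts has contrast 0.6.
print(nyquist_lp_per_mm(3.0))
print(michelson_contrast(200.0, 50.0))
```

MTF at a given spatial frequency is essentially this modulation measured in the image divided by the modulation in the target, which is why resolution (the frequency axis) and contrast (the modulation axis) are inseparable.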
If we think back to spatial frequency and line pairs, contrast would be the amount of difference in brightness between the black and white lines. This can best be seen using our MTF Impact Simulator: at a lower MTF value, the lines begin to blur together since the contrast is reduced. Even if resolution is high, low contrast can cause fine features to be lost in fog, scattering, glare, or lens flare. This is why contrast is just as critical as resolution. Without sufficient contrast, the details a system is theoretically capable of resolving may not appear in the final image. **Key points to consider:** - Relative illumination: At wide FOVs, edge illumination often falls off, reducing brightness and contrast towards the edge of the image. - Scattering, ghosting and flare: These effects degrade the contrast and can be a result of lens coatings, internal baffling, and/or the lens element design. - High dynamic range (HDR) environments (e.g., sports arena lighting, aerial drone scenes with mixed shadow and sun): Ensuring sufficient contrast may require optical and/or electronic compensation (e.g., HDR lens/sensor, variable aperture). ### Resolution × Contrast Resolution and contrast go hand in hand. High resolution by itself is not enough: if contrast is too low, the system may have plenty of pixels, but the fine details won’t appear clearly since the brightness differences that define them would be too small. Conversely, high contrast with low resolution might show strong overall shapes or silhouettes, but the system won’t be able to resolve the fine features and textures that make up the image. Both metrics must work together for an imaging system to deliver meaningful detail. **Key points to consider:** - Sensor and lens match: A lens can only resolve high spatial frequencies (details) if it maintains its contrast at those frequencies.
Optical metrics, like MTF, describe how contrast falls off at increasing spatial frequencies, helping quantify how well a lens and sensor work together. - Field of view and working distance: For example, in drone inspection over terrain, a single wide FOV might cover the entire area of interest, but it would reduce effective resolution per meter. Additionally, wider FOVs are more susceptible to edge contrast fall-off. - Environmental factors: Haze, motion blur, or low-light environments (common in immersive imaging settings) reduce contrast and therefore limit the usable resolution. ## How does Sunex help you optimize an imaging system? - At Sunex, we leverage over 25 years of optical design experience and a portfolio of 300+ off-the-shelf lenses to match the right lens to your sensor and application, achieving the optimal resolution and contrast. - Through the Optics Wizards at optics-online.com, you can simulate resolution and contrast tradeoffs online before committing to hardware, allowing faster evaluation and design confidence earlier in the process. - Whether you’re developing systems for sports/immersive imaging, geospatial mapping, drones, robotics, or medical devices, our team of optical and application engineers can help you select the right combination of lens, sensor, and system geometry to achieve your target resolution (e.g., line pairs per mm, pixels per foot) and contrast (e.g., minimum detectable contrast in your scene). ### Conclusion In a world where imaging demands continue to rise, the ability to resolve fine detail with clear contrast is what separates “good enough” from “mission-ready”. At Sunex, we partner with you from the start, leveraging our design expertise, simulation tools, and extensive lens portfolio to ensure resolution and contrast aren’t just afterthoughts, but built-in strengths.
Let’s start a conversation about your next system: whether it’s a drone-based geospatial survey, an immersive 360° VR capture rig, or a high-precision medical imaging lens, we can help optimize for resolution and contrast, so your imaging system delivers real-world value. The Sunex U.S. Team --- ## Publish Your Success Story - Source: https://sunex.com/2025/12/02/publish-your-success-story/ - Summary: We’re excited to feature our customers' success stories, showcasing how they leveraged Sunex’s optical expertise to overcome a complex imaging challenge and bring a breakthrough product to life. **Your Innovation Deserves to Be Celebrated — Share Your Sunex Success Story** We’re excited to feature our customers’ success stories, showcasing how they leveraged Sunex’s optical expertise to overcome a complex imaging challenge and bring a breakthrough product to life. From early concept refinement to custom lens design, precision manufacturing, and seamless module integration, these projects highlight what’s possible when engineering teams and Sunex collaborate closely toward a common goal: reliable, high-performance imaging and projection systems that scale. As we publish these milestone stories, we know they represent just some of the many remarkable solutions our customers have built with Sunex optics, modules, and engineering support. Behind every lens or camera module shipped is an idea that became real — and often a challenge that needed the right partner to solve. If Sunex has supported you in developing a new medical device, advancing your robotic vision system, improving performance in an automotive application, or simplifying your production process, we’d love to hear from you. Your story can inspire others facing similar hurdles and help showcase how thoughtful optical design makes a difference.
**If you’re interested in being featured, reach out — we’re ready to help you tell your story.** --- ## Customer Success Story: ART SpA Parking Camera - Source: https://sunex.com/2025/11/20/customer-success-story-art-spa-parking-camera/ - Summary: Camera systems for high-end sports cars must excel under extreme environmental conditions. Read how ART SpA from Italy is mastering this challenge with Sunex as its optics partner. ART SpA, an Italian automotive electronics manufacturer specialising in high-performance sports car systems, needed a parking camera lens capable of delivering exceptional image quality under extreme conditions: high vibration, wide temperature swings, direct sunlight exposure, and tight packaging constraints demanded by exotic vehicle platforms. This case study details how Sunex and ART SpA collaborated on the optical design, qualification, and production ramp of a custom lens solution — including the specific engineering challenges resolved during the development program. # PUSHING THE BOUNDARIES OF AUTOMOTIVE CAMERA PERFORMANCE When engineering camera systems for high-end sports cars, precision optics alone are not enough. These systems must excel under extreme environmental conditions—heat from nearby exhausts, cold water shocks, vibration from rigid chassis, and exposure to dust and moisture, all while maintaining flawless digital image quality. In addressing this complex challenge, ART SpA, a technology leader in automotive electronics, found a trusted optical partner in Sunex. **The Challenge: Performance Beyond the Image** ART set out to design a new generation of parking cameras for the luxury automotive segment. The requirements were steep: the cameras had to deliver ultra-wide viewing angles for precise parking and surround vision, survive near-exhaust placement where high temperatures are constant, and pass rigorous OEM-level durability testing. 
These tests included thermal cycling, electromagnetic compatibility (EMC), and ingress protection against high-pressure water jets. Additionally, the cameras had to be compact, easy to integrate into tight vehicle spaces, and built to withstand both mechanical stress and the demands of premium vehicle aesthetics. Beyond hardware resilience, any drop in image quality remained non-negotiable. Drivers of supercars expect top-tier visuals as part of their overall in-car infotainment experience. **The Partnership: Engineering a Solution** To meet these demands, ART chose to partner with Sunex, a globally recognized leader in high-performance optics for automotive applications. This collaboration began in 2012, when ART was developing its first automotive telemetry camera. The success of that initial project laid the groundwork for a long-term partnership that has since evolved to cover 2D parking systems, 3D surround vision cameras, and most recently, a 4K telemetry platform. For the latest parking system project, ART designed a camera entirely in aluminum for superior thermal conductivity and structural rigidity. The housing achieved an IP6K9K rating, the gold standard for ingress protection, ensuring resistance to both dust and powerful water jets. However, the optical component was the key to success. ART integrated Sunex’s all-glass/metal wide-angle lenses, which provided a 190° field of view (FOV) and were engineered to resist the warping or degradation often seen in plastic optics under thermal load. **Testing the Limits: Validation of Worst-Case Scenarios** The validation process for these systems was as demanding as the environments in which they are built to operate. ART’s test protocols simulated worst-case scenarios: high-heat exposure from nearby exhaust systems followed by rapid cooling from pressurized cold water, resulting in intense thermal shock. Cameras also underwent long-duration vibration tests, leak assessments, and full EMC qualification.
These Sunex lenses not only survived the most challenging stress tests but also consistently delivered high-resolution and low-distortion imagery, critical for parking assistance and real-time driver feedback. According to ART’s product management team, the combination of aluminum body design and Sunex optics proved to be a winning formula. The robust mechanical design protected internal components while the all-glass/metal lens prevented common failure modes such as humidity ingress, lens fogging, or optical warping. “The use of Sunex lenses has been fundamental to achieving success, thanks to their high-quality materials, construction, and image performance,” says Mirco Paggi, Product Manager at ART. **Why It Works: Optical Excellence in Harsh Conditions** In the context of performance vehicles, where camera housings are often integrated into aerodynamic designs and exposed to elevated stress levels, reliability is paramount. Sunex optics are designed and engineered with a philosophy that prioritizes durability and optical precision. Their lenses offer high-resolution output, maintain performance across wide temperature ranges, and allow for tight integration with ART’s mechanical and electronic systems. This synergy of expertise—Sunex in optics and ART in automotive system integration—enables faster development cycles and greater product confidence in the field. **Looking Ahead: 4K and Beyond** ART and Sunex are now co-developing a new 4K telemetry camera for next-generation automotive platforms. This product will push image resolution and sensor integration to new heights, enabling applications ranging from automated parking to ADAS data capture in extreme environments. What began as a search for a capable optical supplier has evolved into a collaborative innovation model. ART and Sunex continue to redefine what’s possible at the intersection of optics, electronics, and automotive design—one high-performing camera at a time.
**About ART SpA** ART was founded in the early 2000s and is based in the evocative Villa del Pischiello in Passignano sul Trasimeno, Italy. Thanks to its know-how and achievements, ART is now positioned internationally as a leading manufacturer and supplier of automotive innovations and high-tech infotainment, dashboard, and entertainment systems for the super sport and luxury markets. In more recent times, ART has extended its experience and expertise to the global automotive market as well as to light and heavy commercial vehicles, industrial vehicles, and agricultural machinery. This fast-growing company now employs about 300 people who work at the headquarters in Passignano sul Trasimeno and the offices in Modena and Turin, as well as in Berlin, Germany. The company stands as an example of Made-in-Italy excellence, capable of competing with the large multinational groups active in the sector. --- ## SXM™ Technology: Redefining Lens Interchangeability - Source: https://sunex.com/2025/11/04/sxm-technology-redefining-lens-interchangeability/ - Summary: Refocusing the lens and recalibrating each time a lens is changed requires both time and money. SXM mounting technology overcomes these limitations, as lenses can be magnetically swapped in seconds, already pre-aligned and pre-focused, creating an incredibly adaptable imaging system. SXM™ is Sunex’s magnetic lens interchange system that allows lenses to be swapped in seconds — already pre-focused and pre-aligned — without any re-calibration. In multi-FOV inspection systems, robotics deployments, and research environments where lens changes are frequent, SXM™ eliminates the downtime and calibration overhead that traditional threaded mounts impose. This article explains how SXM™ achieves repeatable pre-aligned lens interchange, the mechanical and optical design principles behind it, and the applications where rapid lens swapping delivers the most operational value.
In many situations, the ability to quickly adapt a camera system can provide huge advantages. Whether it’s for fast prototyping or an application with constantly changing conditions, being able to switch from one lens to another easily can save valuable time and create flexibility in what can be achieved with the imaging system. Traditionally, however, lens swapping comes with challenges. Re-focusing the lens and recalibrating each time a lens is changed requires both time and money. To overcome these limitations, we developed our SXM technology. With SXM, lenses can be magnetically swapped in seconds, already pre-aligned and pre-focused, creating an incredibly adaptable imaging system. **The Problem with Conventional Lens Switching** Switching between lenses typically introduces several obstacles. Each time a new lens is swapped into the system, it needs to be refocused and realigned. This process can require recalibration as well as mechanical adjustments and fine-tuning, leading to significant downtime. Because of this downtime, systems lack the flexibility to switch quickly between lenses. For example, moving from RGB to IR lighting or comparing different fields of view requires significant effort. The time and labor required for frequent refocusing and recalibration translate into higher operational costs, making traditional lens swapping both slow and expensive. ## What is the Sunex SXM lens interchange system? To introduce easy adaptability to camera systems, we have developed our SXM technology: interchangeable M12 lenses that are: - Magnetically mounted to enable lens swaps in seconds. - Pre-focused to eliminate the need for manual adjustments. - Pre-aligned for precise optical performance. - Hot-swappable if the camera system allows it. Check out how it works here: Sunex SXM ## What applications benefit most from rapid lens interchange?
SXM™ technology unlocks new possibilities across industries: - Technology demonstrations: Allows teams to showcase their systems under different conditions by swapping lenses. - Fast prototyping: Test multiple lens options quickly to determine the best fit, without hours of manual alignment. - Medical and robotics: Adapt to varied procedures and environments where different lens types may be required. - Surveillance and security: Rapidly respond to changing conditions by switching lenses on the fly. These are just a few examples; the versatility of SXM™ makes it suitable for countless other applications. ## How do Sunex lenses with SXM™ support my product development? SXM™ technology redefines lens interchangeability by delivering speed, precision, and flexibility. It is an ideal solution for projects that demand adaptability in their imaging systems, whether for prototyping, demonstrations, or real-world applications in dynamic environments. Ask one of our optical engineers about how SXM™ could add value to your project here: https://sunex.com/support/ --- ## Choosing the Right Sourcing Strategy for M12 Lenses - Source: https://sunex.com/2025/09/22/choosing-the-right-sourcing-strategy-for-m12-lenses/ - Summary: Selecting the right lens sourcing strategy has direct, long-term consequences on image performance, supply continuity, and program economics. The market currently offers three distinct channels: internet platforms, catalog-style intermediaries, and direct OEM partnerships. There are three M12 lens sourcing channels — internet/commodity platforms, catalog intermediaries, and direct OEM partnerships — each with different tradeoffs in cost, quality, supply security, and customisation capability. For prototyping, internet lenses are fast and adequate. For production programmes, OEM-direct partnerships consistently deliver lower total cost of ownership and fewer programme-threatening surprises.
This case study maps each sourcing channel to the development stage where it delivers best value, with real-world examples of what goes wrong when engineers lock in the wrong channel too early. **Balancing Cost, Risk, and Performance in Robotics, Industrial Automation, Embedded Vision, and Drone Imaging** Selecting the right lens sourcing strategy has direct, long-term consequences for image performance, supply continuity, and program economics. The market currently offers three distinct channels: internet platforms, catalog-style intermediaries, and direct OEM partnerships. Each offers benefits at different phases of development, but each also carries distinct risks that grow or shrink as projects move from concept to fielded products. This whitepaper provides a practical framework to evaluate the trade-offs among the three channels. It integrates real-world scenarios across robotics, industrial automation, embedded vision, and drone imaging, and it attempts to quantify lifecycle impacts using a Total Cost of Ownership (TCO) approach to lens sourcing. ## Watch: 3 Essential Questions to Ask Your M12 Lens Supplier The conclusion is straightforward: internet platforms and intermediaries are potentially valuable options for speed and flexibility in early phases, but mission-critical systems and volume production benefit most from an OEM partnership that aligns optical design, quality, and supply with the product roadmap. Fostering these relationships from the very beginning of a project can pay dividends in terms of Total Cost of Ownership. Figure 1. Comparison of sourcing channels across key success factors. ## What are the three main sourcing strategies for M12 lenses? M12 board lenses are the workhorses of compact imaging, enabling a wide range of fields of view (FOV) and F-numbers (F/#) in small packages and integrating with modern CMOS sensors across a diverse range of devices.
As sensor performance improves and mechanical envelopes shrink, optics must carry a greater burden for contrast, distortion control, relative illumination, and environmental stability.

- **Robotics** → Object detection, navigation, bin picking
- **Industrial automation** → Inspection, defect detection, process optimization
- **Embedded vision** → Compact consumer and enterprise devices
- **Drone imaging** → Aerial mapping, agriculture analytics, surveillance

At the same time, the supply landscape has broadened. Low-cost marketplaces put thousands of lens SKUs within a click. Intermediaries curate selections, maintain regional inventory, and reduce friction for small orders. OEM lens manufacturers design, produce, and support lenses at scale with guarantees on performance, process control, and lifecycle. Understanding where each channel fits means separating what matters in the lab from what matters in the field across years of production.

**Internet Platforms**

Marketplaces such as Amazon and Alibaba offer unmatched convenience and breadth. They are ideal for quickly assembling a bench of candidate lenses to sample fields of view, mechanical clearances, and basic image quality. However, listings may draw from anonymous, mixed, or end-of-life lots; coating recipes and glass sets may vary over time; and there is rarely a roadmap commitment or any traceability. For these reasons, internet lenses are effective tools for exploration but are risky foundations for any product that requires repeatability, certification, or long-term serviceability.

**Intermediaries and Catalog Resellers**

Intermediaries create value by pre-screening suppliers, carrying inventory, and simplifying procurement for small runs. They are particularly helpful between proof-of-concept and pilot, when teams need a consistent part number without committing to an OEM minimum order or a custom design. Yet intermediaries are constrained by their upstream sources.
They typically do not control most aspects of the design, including coating, glass sourcing, or process, and they cannot guarantee that a given SKU will remain in production for the lifetime of your product. When volumes increase or performance margins tighten, such constraints can force an unplanned redesign.

**OEM Lens Manufacturers**

OEMs design and manufacture lenses, manage material supply chains, and validate performance against application-specific or even customer-specific requirements. A mature OEM partnership extends beyond the PN; it includes engineering collaboration (field of view and distortion trade-offs, stray light, spectral response), process control (custom parameters, binning, yield management), and lifecycle planning (EOL policies, alternatives, second-source strategy). Although the unit price may be higher at the outset and lead times require planning, the risk profile and total program cost are significantly lower in mission-critical, multi-year, and high-volume scenarios. It is also the best option for building long-term, win-win relationships in which both the customer and the supplier can bring their full strengths to bear.

## Which sourcing channel is best for the product development cycle?

Product development is often a series of changing constraints. Early on, speed dominates: teams need to consider multiple performance envelopes, mounting options, and ISP pipelines. As prototypes evolve into pilots, repeatability and early supply assurances take priority. At design freeze and launch, quality and reliability take precedence, and lifecycle commitments become non-negotiable. To some extent, these shifting constraints map naturally to the strengths of each sourcing channel. The trick is not to get locked into a path that is not scalable to your ultimate goal. During concept and POC phases, internet platforms can supply breadth and immediacy, even if they do not exactly meet the spec.
Engineers can sample a dozen lenses very quickly to validate basics such as the field of view, F/#, and first-order mechanical parameters. The goal is to learn quickly, not to lock architecture on a commodity part. In Pilot and Beta, intermediaries add value and can support small, ongoing projects going forward. They reduce friction for “sub-MOQ” builds, provide a single catalog with multiple options, and can maintain buffer stock while customers complete qualification testing. The risk is that the upstream lens may change subtly between lots or disappear altogether (EOL), through no fault of the intermediary itself. At Design Freeze and Production Ramp, OEMs become essential. The discipline of a controlled design, a documented process flow, and optionally active alignment to the sensor removes variability that would otherwise manifest as yield loss, RMAs, or artifacts in the image. In small quantities this variability may be tolerable, since you can hand-sort, but in production it is unacceptable. Reliable OEMs also lock product lifecycles to the customer roadmap, preventing surprise discontinuities during scale-up and mass production and ensuring aftermarket support. If an “internet lens” has somehow made it this far in the design cycle, this is where TCO becomes a major issue for these so-called inexpensive lenses. The cost and schedule stress of redesigning and implementing new optics at this stage typically ripples far beyond the lens itself.

Figure 2. Conceptual suitability of each channel across the major development stages.

## When does direct OEM sourcing become the right choice for M12 lenses?

**Robotics and Warehouse Automation**

A robotics integrator building a bin-picking camera used inexpensive internet-sourced lenses to evaluate several fields of view. The prototypes worked until thermal cycling on the factory floor revealed focus drift and increased distortion at temperature extremes.
Transitioning to an OEM design with thermally balanced materials and tighter assembly tolerances stabilized focus and cut field failures by more than half. A redesign was required, but it was done early, and the cost was more than offset by avoiding RMAs and line downtime.

**Industrial Automation and Semiconductor Inspection**

In defect inspection, modulation transfer function (MTF) consistency directly affects false positives. A machine builder using standard catalog lenses encountered lot-to-lot variation that pushed MTF just below the acceptance window for some lots. The machine builder consulted an OEM lens manufacturer, which suggested binned (sorted) elements, specially controlled assembly torque, and case-specific OQC testing. Qualification passed on the first attempt, and the program recovered three months of schedule with a significant improvement in false positives (yield rate).

**Embedded Vision Devices**

A compact enterprise device ramped from 200 to 30,000 units per year. Its catalog lens was discontinued midway through ramp, triggering an unexpected optical redesign and FCC re-test, resulting in sudden costs and delays. A subsequent OEM engagement delivered a mechanically drop-in lens replacement optimized for the same sensor with consistent shading and improved relative illumination, locked to a five-year supply plan.

**Drone Imaging and Multispectral Analytics**

An agriculture drone platform needed RGB and near-IR imagery while meeting strict mass and vibration constraints. Early experiments with off-the-shelf lenses exposed coating degradation and decenter sensitivity under the required vibration profiles. An OEM solution combined a dual-channel design with IR-optimized coatings, ruggedization, and active alignment to the sensor, enabling repeatable NDVI computation and faster regulatory approvals.

## What are the supply chain risks of internet-sourced lenses in production?
Total Cost of Ownership (TCO) aggregates all costs required to deliver and sustain a product: engineering hours, yield losses, RMAs, replacements, qualification delays, and the risk-weighted cost of supply disruption. Internet platforms often minimize unit price but externalize many of these costs; intermediaries reduce some variability but do not eliminate upstream risk; OEMs reduce lifecycle costs through design control, process discipline, and roadmap alignment.

| Factor | Internet Platforms | Intermediaries | OEM Manufacturers |
| --- | --- | --- | --- |
| Redesign Costs | Very high | Moderate | Minimal |
| RMA / Field Failures | Frequent, expensive | Lower | Lowest |
| Qualification Delays | Likely | Less common | Minimal |
| Yield Optimization | None | Limited | Fully controlled |
| Engineering Support | None | Limited | Full optical/system support |

A simple way to visualize this is to model cumulative lifecycle cost over time. Internet-sourced parts start low but accelerate as failures and redesigns accumulate. Intermediary-sourced parts fare better, but may still increase due to limited control over process drift or EOL. OEM parts often – not always – start at a higher price but remain relatively stable over the product’s lifetime.

Figure 3. Conceptual TCO curves. Internet platforms minimize upfront price but often maximize lifecycle cost; OEM curves are higher initially but flatter over time.

## How does Sunex’s OEM partnership model reduce M12 lens program risk?

The key is to **start fast, but not anchor your architecture to commodity parts**. Use internet platforms to accelerate learning, but treat those lenses as disposable tools for discovery. Once the optical envelope is understood, move to controlled sources. When a pilot demands a few dozen to a few hundred units, intermediaries can be a pragmatic bridge. Validate batches aggressively: check MTF, distortion, shading, and environmental stability across multiple lots.
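The conceptual TCO curves above can be sketched with a toy cumulative-cost model. All unit prices, risk costs, and redesign charges below are hypothetical assumptions chosen only to illustrate the crossover, not Sunex or market data:

```python
def cumulative_tco(unit_price, annual_volume, years, risk_cost_per_year,
                   redesign_year=None, redesign_cost=0.0):
    """Cumulative cost at the end of each year; risk costs grow linearly
    over time as field failures and RMAs accumulate."""
    totals, running = [], 0.0
    for year in range(1, years + 1):
        running += unit_price * annual_volume
        running += risk_cost_per_year * year          # escalating RMA/yield cost
        if year == redesign_year:
            running += redesign_cost                  # forced redesign (e.g. EOL)
        totals.append(running)
    return totals

YEARS, VOLUME = 5, 10_000
internet     = cumulative_tco(3.0, VOLUME, YEARS, 15_000,
                              redesign_year=3, redesign_cost=250_000)
intermediary = cumulative_tco(5.0, VOLUME, YEARS, 5_000,
                              redesign_year=4, redesign_cost=120_000)
oem          = cumulative_tco(8.0, VOLUME, YEARS, 1_000)

for name, curve in [("internet", internet), ("intermediary", intermediary),
                    ("OEM", oem)]:
    print(f"{name:>12}: year 1 ${curve[0]:,.0f} -> year {YEARS} ${curve[-1]:,.0f}")
```

Under these assumed numbers, the internet channel is cheapest in year one and most expensive by year five, which is the qualitative shape of the curves in Figure 3; the crossover point depends entirely on the risk and redesign costs a program actually incurs.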
Confirm the reseller’s view of upstream continuity before committing to field trials. Even at low quantities, keep one eye on the future. Could this product ramp to significant volumes? Will your initial choices scale seamlessly? Will this company and product be here to support me in five years? For ramp-up and production, or for projects that will invariably ramp to high volumes, choose an OEM partnership from the outset that is aligned to your sensor, packaging, and lifecycle plan. Define performance windows and test methods jointly; consider active alignment to stabilize focus and tilt; document change-control and EOL procedures; and synchronize forecasts so material supply and capacity scale with demand. Finally, incorporate TCO into milestone reviews. A lens that saves a few dollars in the BOM can cost hundreds of thousands of dollars in redesigns and field interventions later. Use TCO models to make these hidden costs visible before they materialize.

## What should my decision checklist for M12 lenses look like?

- Have we validated optical performance across temperature and vibration to production limits?
- Is there documented lot traceability and change control for the lens and key materials?
- Do we have an agreed roadmap and EOL policy matched to our product lifecycle?
- Are yield, binning, and active alignment options defined to protect margins at scale?
- Does the supplier offer direct engineering and QC support?
- Have we stress-tested supply continuity with realistic forecast scenarios?

Intermediaries should be acknowledged as important participants in the ecosystem. Many provide tangible value: local inventory, simplified procurement, and pragmatic assistance for early deployments. The argument presented here is not that intermediaries lack merit, but that their role is structurally different from that of a design-and-manufacture partner.
This article’s recommendation is therefore not a criticism; it is a risk-managed allocation of roles that aligns channel strengths with project characteristics. When intermediaries source from OEMs, the collaboration can be positive, provided that plan-of-record parts, documentation, and lifecycle commitments remain robust.

## Conclusions

Sourcing choices determine more than unit price: they influence image quality, yield, schedule, and customer experience for years to come. Internet platforms and intermediaries accelerate learning and simplify early builds; OEM partnerships stabilize products, reduce lifecycle cost, and protect brand equity in the field. For mission-critical systems in robotics, industrial automation, embedded vision, and drone imaging, the data and experience converge on a simple rule: prototype fast, then productize with an OEM. While internet platforms and intermediaries can play roles early in development, OEM partnerships offer unmatched advantages:

- Custom design integration
- Guaranteed lifecycle continuity
- Optimized yields and reduced RMAs
- Engineering collaboration and value-added services, such as active alignment

---

## Lens Hybridization for µLED Headlamps

- Source: https://sunex.com/2025/09/10/lens-hybridization-for-hd-headlamps/
- Summary: This paper examines how lens hybridization, combining glass and plastic optical elements, can deliver optimized solutions for automotive µLED HD Lighting systems.

Lens hybridization — combining glass and plastic optical elements in a single assembly — is the key enabling technology for next-generation automotive µLED high-definition headlamp systems. Glass elements deliver the thermal stability and chromatic correction that LED wavelength management requires; plastic elements enable complex aspheric surfaces at production-compatible cost.
Together, they achieve what neither material alone can: high-resolution pixel projection, thermal stability from -40°C to +105°C, and automotive production economics. This paper examines the optical design principles, material selection criteria, and manufacturing considerations for hybrid lens designs in µLED HD lighting applications. The automotive lighting industry is undergoing a technological transformation. Traditional Matrix LED systems are giving way to microLED (µLED) projector-based headlamps capable of pixel-level control, adaptive beam shaping, and dynamic road projection. These systems are not just lighting the road — they’re becoming integral to ADAS safety features and OEM brand differentiation. Yet this shift introduces new challenges:

- µLED optics demand higher resolution, tighter tolerances, and compact form factors.
- The thermal loads and environmental stresses in automotive applications require systems engineered for reliability.
- To be successful, suppliers must balance performance, cost, and manufacturability — without compromising quality.

Watch the related talk from the 2025 DVN Workshop in Shanghai

## What is lens hybridization, and why does it matter for automotive headlamps?

Lens hybridization, combining glass and plastic optical elements, can deliver optimized solutions for automotive µLED HD Lighting systems. Sunex’s three decades of engineering expertise, manufacturing capabilities, and proven reliability enable next-generation hybrid automotive lighting applications that balance performance, size, and cost requirements.

## Why are hybrid glass-plastic lenses needed for high-definition headlamp projectors?

**From Illumination to Information**

In the past, headlights served a singular purpose: **lighting the road**. Today, they are becoming **intelligent projection systems** that deliver safety, comfort, and branding.
Key milestones in this evolution:

- **Halogen Era** → Simple reflectors with broad, uncontrolled beams.
- **HID & Early LED** → Increased brightness, but limited control.
- **Matrix LED** → Segmented control, enabling partial adaptive driving beams.
- **µLED Projectors** → Thousands of independently controlled pixels for **high-resolution beam shaping** and **road-projected information**.

**Use Cases Driving Adoption**

- **Adaptive Driving Beams (ADB):** Dynamic control to avoid dazzling oncoming drivers while maximizing road illumination.
- **Augmented Navigation:** Projecting turn-by-turn directions onto the road surface.
- **Hazard Warnings:** Highlighting pedestrians, cyclists, or obstacles in low visibility.
- **OEM Differentiation:** Unique, programmable light signatures for brand identity.

High-resolution µLED projectors don’t operate in isolation. They form part of a **converging ecosystem**:

- **ADAS Integration:** Projection-based driver alerts complement camera-based sensing.
- **Sensor Fusion:** Combining µLED illumination with LiDAR or radar systems.
- **Software-Defined Lighting:** Customizable light patterns updated **over-the-air**.

This evolution demands optical systems that are **reliable** across multiple operating modes, **predictable** under tight feedback loops with vehicle sensors, and **scalable** across global vehicle platforms. The convergence of **lighting, sensing, and communication** is positioning µLED-based optics as a **strategic differentiator** for automotive OEMs. Whether µLEDs see rapid proliferation or only gradual adoption will largely depend on the industry’s ability to balance performance, size, and cost. The weighting of these three factors varies from program to program, but they can never be viewed or optimized independently.
Lens hybridization (the combination of glass and plastic optical elements) can strike the right balance, but it requires extensive design, engineering, process, and manufacturing experience to meet performance and form-factor targets at the target price **without sacrificing reliability** or **increasing supply chain risk** for the customer.

## Why µLEDs require more complex optics

Transitioning from **Matrix LED** to **µLED projectors** is not just an incremental step — it’s a **paradigm shift** in optical engineering.

## Matrix LED Optics

- Optical element count: 1-3
- Material: Glass and/or plastics
- Shapes: Freeform, (a)spherical, complex mounting
- Assembly: click/screw into frames and carriers
- Required Z-axis alignment: ~50µm

## µLED Optics

- Optical element count: 4-5 (typically)
- Material: Glass or plastic
- Shapes: spherical and aspherical, simple flange
- Assembly: pre-aligned and tested in a barrel
- Required Z-axis alignment: ~5µm

The need for a more complex optical system stems from two factors: OEM expectations that a more highly pixelated source will deliver near imaging-quality projection, and the fundamental concept and characteristics of the µLED source itself. Designing and manufacturing µLED-based optical systems presents a unique set of challenges that require precision, innovation, and careful consideration at every stage. Achieving optimal performance demands tighter tolerance control across both individual components and the overall system, while the need for pixel-level accuracy pushes alignment requirements into the micron range. As element counts increase, packaging constraints become more critical, and the substantial thermal load of µLED chips necessitates advanced material selection and thermal management strategies to ensure reliability and long-term performance.

**Implications for Designers & Manufacturers**

- µLED-based optics require **tighter tolerance control** at the component and system level.
- **Pixel-level accuracy** requires **micron-level alignment accuracy**.
- Increased element counts lead to **tighter packaging constraints**.
- The high thermal load of µLED chips demands **innovative design and material strategies**.

## What are the benefits of lens hybridization?

Lens hybridization leverages the complementary strengths of glass and plastic optical elements to deliver an optimal balance of performance, reliability, and manufacturability. By strategically combining materials, designers can achieve superior optical performance—minimizing chromatic aberrations, maximizing MTF, and controlling distortion—while maintaining thermal stability under aggressive automotive temperature cycles. At the same time, hybrid designs enable scalable volume manufacturing and cost-efficient production without compromising on the stringent quality standards required for automotive applications. Lens hybridization strategically combines **glass** and **plastic** optical elements to balance:

- **Optical performance** (MTF, chromatic aberrations, distortion).
- **Thermal stability** under aggressive automotive temperature cycles.
- **Volume manufacturing feasibility** and **cost efficiency**.

## What are the thermal challenges of µLED HD lighting lens design?

µLED sources pack **extremely high luminance** into **tiny footprints**. The result: **steep internal temperature gradients** across the lens stack. Add the operating temperatures, and sometimes even higher storage temperatures, required by Tier 1s and OEMs, and one can see why designing a fully athermalized system with consistent performance over a 15-year lifetime is challenging and requires years, if not decades, of design, process, and manufacturing experience in automotive applications.
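The scale of the athermalization problem can be estimated with a simple thin-lens model. The material constants below are typical published values for PMMA and a BK7-like crown glass, and the 30 mm focal length is a hypothetical projection-lens figure; this is an order-of-magnitude sketch, not a specific Sunex design:

```python
def focal_shift_um(f_mm, n, dn_dT, cte, delta_T):
    """Thin-lens focal shift in micrometres over a temperature swing delta_T:
    df = f * (alpha - (dn/dT) / (n - 1)) * dT."""
    return f_mm * (cte - dn_dT / (n - 1.0)) * delta_T * 1_000.0

F_MM, DELTA_T = 30.0, 145.0   # hypothetical 30 mm lens, -40 C to +105 C swing

# PMMA: dn/dT ~ -1.1e-4 /K, CTE ~ 70 ppm/K; BK7-like glass: +2e-6 /K, 7 ppm/K
pmma  = focal_shift_um(F_MM, 1.49, -1.1e-4, 70e-6, DELTA_T)
glass = focal_shift_um(F_MM, 1.52,  2.0e-6,  7e-6, DELTA_T)

print(f"PMMA singlet : {pmma:+.0f} um focal shift")
print(f"glass singlet: {glass:+.0f} um focal shift")
```

Under these assumptions the all-plastic singlet drifts by roughly a millimetre while the glass singlet stays within tens of micrometres; this two-orders-of-magnitude gap against the ~5 µm alignment budget of µLED optics is precisely why hybrid stacks place glass where the thermal burden is greatest.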
**Challenges:**

- **Focal point shift** of the system due to the higher CTE (coefficient of linear thermal expansion) values of plastics
- **Permanent deformation risk** when certain plastic types reach their Vicat Softening Temperature (VST)
- **Permanent optical index change** experienced by plastic materials under repeated temperature cycling
- **Optical index and transmissivity change** due to moisture absorption of plastics
- **Yellowing** caused by prolonged UV exposure, impacting transmissivity and cosmetics
- **Coating crazing** of AR (anti-reflective) coatings on large-format plastic elements due to expansion and contraction during thermal cycling

There are alternatives to PMMA and PC that lessen some of these challenges. High-performance automotive-grade optical polymers are widely used in automotive backup, surround-view, and in-cabin camera lenses, where individual optical elements are comparably small; the high cost of these advanced polymers prohibits their wide use in HD lighting applications. While we cannot change the laws of physics or material properties, we must acknowledge them and define design constraints accordingly, without restricting the solution space in a way that would prevent us from finding a manufacturable solution. Understanding where to position different materials along the z-axis, applying best-in-class athermalization strategies, applying advanced simulations, and correlating these to real-world test data are strategies we apply at Sunex. Paramount to success is close collaboration with the customer to design a solution that is optimized at the system level.

## How does lens hybridization meet automotive qualification requirements?

**Lifetime Stability Under Automotive REL Conditions**

Automotive headlamps operate in harsh environments — from **Arctic winters** to **desert summers**. Reliability (REL) and environmental test plans are designed to replicate a 15-year vehicle lifecycle.
All components of an optical system undergo:

- **High-temperature endurance testing**
- **High-temperature, high-humidity cycling**
- **Prolonged UV exposure**

There are many more tests, including shock and vibration, but the ones above are typically the most challenging for a hybrid lens system. While individual test parameters and durations vary across programs, it is not uncommon for some of these tests to run for a month or more.

**Alignment & Assembly Tolerances**

Unlike traditional Matrix LED systems, µLED optics require tighter tolerance control of every single optical element as well as the optomechanical components, and these components must be assembled, pre-aligned to each other, in a barrel. This shift in manufacturing paradigm demands **tight integration** between **optical design**, **mechanical packaging**, and **assembly processes**. Sunex brings over 25 years of expertise in the design, development, and manufacturing of high-performance automotive optics, delivering solutions engineered for reliability in the most demanding applications. Our experience spans a wide range of automotive imaging and lighting systems, including ADAS, in-cabin monitoring, surround- and rear-view cameras, and high-definition projection systems. By combining precision lens design, proprietary technologies, and rigorous manufacturing and qualification processes, Sunex ensures consistent optical performance, thermal stability, and durability. This enables automotive OEMs and Tier 1 suppliers to meet strict safety, regulatory, and performance requirements while accelerating time-to-market.

## Why are hybrid glass-plastic lenses preferred for high-definition headlamp projectors?

- **µLED-based projectors** represent the **next frontier** in automotive lighting.
- Achieving **pixel-level resolution** requires **innovative optical architectures**.
- **Lens hybridization** offers an **elegant solution** to balance **performance, size, and cost**.
- **Sunex** brings **decades of imaging optics experience** and **automotive reliability engineering** to the HD Lighting market, which OEMs and Tier 1s leverage to accelerate innovation and development.

---

## Sunex DXM™ – Stereo Vision in a Smaller Package

- Source: https://sunex.com/2025/09/02/sunex-dxm-stereo-vision-in-a-smaller-package/
- Summary: As robotics and automation systems grow increasingly compact, intelligent, and power-efficient, the supporting vision technologies must evolve in parallel.

The Sunex DXM™ delivers two simultaneous images from different angles on a single CMOS sensor — enabling stereo depth perception in a form factor comparable to a standard single-camera module. This eliminates the need for separate stereo camera baselines, simplifying mechanical integration and reducing system BOM cost. This article explains how DXM™ works, the optical design principles behind two-FOV imaging on a single sensor, and the robotics, inspection, and automation applications where compact stereo vision creates the most value.

## What is the Sunex DXM and how does stereo vision on a single sensor work?

As robotics and automation systems grow increasingly compact, intelligent, and power-efficient, the supporting vision technologies must evolve in parallel. One area undergoing rapid growth and innovation is stereo imaging, where depth perception is derived from capturing two slightly offset views of the same scene. While traditional stereo systems use two CMOS sensors and two lens assemblies, a more compact alternative has emerged: **single-sensor stereo imaging**, where two optical channels converge onto a single CMOS sensor. This architectural shift offers a powerful blend of reduced physical footprint, lower power consumption, improved synchronization, color matching, and overall cost efficiency.
Originally explored for space-constrained applications, the concept is now gaining momentum across a diverse set of platforms, including **Autonomous Mobile Robots (AMRs)**, **Automated Guided Vehicles (AGVs)**, **humanoid robots**, **manufacturing automation**, and even **multi-modal vision systems**. This article examines the advantages, trade-offs, and emerging applications of single-sensor stereo imaging systems, particularly in the context of Sunex Inc.’s advancements in optical design, manufacturing, and DXM technology, which enables dual-channel imaging on a single sensor.

## How does the DXM system achieve depth sensing with one CMOS sensor?

A single-sensor stereo imaging system consists of two independent optical channels, based either on a relay architecture, which offers a wide baseline, or on a direct imaging architecture using closely positioned lenses that project two images onto different portions of a single image sensor. The chosen architecture depends largely on the use case, and can be implemented using:

- **Relay-prism or mirror systems**, which allow a longer baseline (distance between the optical channels), enabling better depth perception at mid-to-long ranges.
- **Direct-imaging optics**, where two small lenses with a shorter baseline directly image adjacent scenes onto the same CMOS sensor.

The result in either case is a stereo image pair captured simultaneously, pixel-aligned and temporally consistent, without the need for a second sensor.

## What are the advantages of single-sensor stereo vision over traditional dual-camera stereo?

The compactness of single-sensor stereo systems is obviously one compelling feature. Traditional stereo cameras must allocate physical space for two image sensors and their supporting electronics, while also maintaining rigid mechanical alignment and consistent calibration. This is a particular challenge in mobile robotics, where every cubic centimeter counts.
By contrast, a single-sensor design drastically reduces system footprint. The image processing electronics remain the same as in a standard monocular camera module, and the optical elements can often be embedded into a compact housing. This opens the door to new designs for **low-profile AGVs**, **slim robotic arms**, or **humanoid head units**, where stereo vision must be integrated without adding bulk or weight. Sunex, with decades of experience in designing miniaturized optics for automotive, medical, and industrial systems, brings deep expertise in **custom optical design** and **manufacturing**, including **all-glass and hybrid designs**, **active optical axis matching**, and **camera module development** for compact imaging systems. These capabilities now enable DXM™ direct imaging solutions with tighter baselines without sacrificing image quality or manufacturability.

## Power Efficiency in Battery-Operated Systems

In battery-powered robots, energy is often the most limited resource. A conventional two-sensor stereo setup not only doubles sensor power draw but also adds thermal and processing load for synchronizing and handling dual video streams. With a single-sensor system, all of this duplicate overhead is eliminated. Sunex’s design approach includes low-distortion, HDR, and stray-light optimization that help customers maximize image throughput without overtaxing the system-on-chip (SoC). Additionally, thermal management improves due to the consolidation of the imaging pipeline into a single, tightly integrated unit.

## Perfect Synchronization and Simplified Calibration

Another major advantage of single-sensor stereo imaging is **inherent synchronization**. Both images are captured on the same sensor die in the same exposure cycle.
This eliminates the need for complex software-level synchronization, dual-sensor calibration routines, or even sensor-to-sensor alignment. In dual-sensor setups, even slight mismatches in gain, exposure time, or readout timing can introduce depth errors and visual artifacts. Color matching of different sensors can be particularly difficult. These issues are especially problematic in fast-moving robotic systems or dynamic environments. By contrast, single-sensor systems eliminate this risk by design. **Sunex’s DXM technology** takes it a step further by pre-mapping and correcting the sensor’s imaging zones, ensuring linear and geometrically stable image capture across both optical channels. This not only improves stereo accuracy but also significantly enhances long-term field reliability, which is critical for automotive, industrial, and commercial deployments.

## Cost Efficiency: Fewer Components, Lower BOM

Reducing component count directly translates to lower costs, not just in materials, but also in assembly, calibration, and quality control. A single-sensor stereo system uses:

- One sensor (instead of two)
- A shared image processing pipeline
- Fewer connectors, cables, and serializers
- Simplified housing and optical alignment

For product designers working under tight bill-of-materials (BOM) constraints, this is an attractive value proposition. When paired with Sunex’s ability to deliver **high-performance custom lenses and precision-molded optical components (glass or plastic) at scale**, the result is a hybrid stereo vision module that is not only cost-effective but also production-ready.

## Performance Trade-Offs and Technical Limitations

While compelling, single-sensor stereo systems are not without trade-offs.

**Baseline Constraints**

In direct imaging configurations, the baseline is inherently limited by the physical size of the optics and sensor.
This constrains the depth resolution and range, making such systems better suited for near-field applications (e.g., 0.2 – 2 meters). Relay optics can increase the baseline distance, but at the cost of added optical complexity and potential alignment drift if not properly designed. Still, for systems with short object distances, or where depth measurement is not the primary goal (see below), this direct imaging approach is very attractive. **Reduced Per-View Resolution** Because the sensor area is split between two optical channels, each stereo view occupies only half (or less) of the total pixel array. For example, a 1920×1080 sensor would provide only 960×1080 resolution per channel in a side-by-side stereo layout. While sufficient for many tasks like obstacle detection or object segmentation, this may be inadequate for high-precision metrology or long-distance depth mapping. Luckily, we live in an era with a vast selection of sensor options, and increasing the resolution of each channel may be as simple as changing sensors. For example, two 4K sensors can be replaced with a single 20 MP sensor, and you will still get two 4K channels (image circles) on one sensor, with the corresponding cost and overhead savings. Since the DXM is highly configurable, the solution can be tailored to each use case. Sunex’s design experience can help compensate for these limitations through enhanced field correction and distortion balancing across the image zones. In some applications, custom sensor formats or aspect ratios can also be employed to optimize the stereo layout. In short, the configurability of the DXM system allows you to put the pixels (i.e., the region of interest) where you really need them. ## Which robotics and automation applications benefit most from DXM stereo vision? **AGVs and AMRs** Warehouse robots and last-mile delivery bots require compact, cost-effective depth perception for obstacle avoidance and autonomous navigation.
Since the operating environment is structured and typically well-lit, the reduced baseline and resolution of a single-sensor system are acceptable trade-offs for gains in size, weight, and battery life. **Humanoid and Consumer Robots** For robots that interact with people or operate in tight spaces—such as service robots, assistants, or educational bots—single-sensor stereo vision provides reliable depth awareness for facial tracking, gesture detection, and object manipulation. The compact form factor enables the embedding of vision systems in aesthetically pleasing designs. **Manufacturing Automation** In high-speed production lines, stereo vision is used for bin picking, height profiling, presence detection, and assembly inspection. Single-sensor stereo cameras provide an efficient way to deliver these functions in a durable, factory-ready package. Their simplified calibration and reduced cabling also translate to easier deployment and less downtime. Sunex’s ability to co-design the lens, optical alignment mechanism, and even the supporting PCB for integration into robotic tooling arms or conveyor systems offers end-to-end value for industrial customers. ## What other Use Cases Beyond Stereo Imaging would benefit from the DXM technology? The same architecture used for stereo vision can also be adapted for multi-modal or dual-purpose imaging by varying the optical paths or filters on each channel. This unlocks several compelling new applications: **Dual Field of View (FOV) Imaging** One optical channel can be designed for wide-angle situational awareness (e.g., 120° FOV), while the other is optimized for narrow-angle detail (e.g., 30° FOV). Both views are captured simultaneously, providing a context + detail pipeline in one camera. 
This is particularly useful in: - Security robots: Wide FOV for surveillance, narrow FOV for facial identification - Agricultural drones: Overview of crop rows + detailed view of plant health - Logistics: Box detection + barcode reading **Simultaneous Visible and Infrared (RGB/IR) Imaging** Another configuration utilizes one lens and an optical filter stack optimized for RGB, while the other is tuned for near-IR or thermal infrared. This enables applications that require day/night (RGB-IR) vision, material identification, or contaminant detection. Examples include: - Medical robotics: Visual navigation + vein mapping - Food processing: Surface color + sub-surface bruising or spoilage - Smart agriculture: Visible plant monitoring + chlorophyll/NIR reflection analysis Sunex’s design team is uniquely positioned to deliver these systems using custom multi-channel optics, efficient single- and dual-bandpass coatings, and proprietary dual optical channel alignment, along with optomechanical tolerancing for series production to ensure alignment and performance across modalities. **Extended Exposure HDR** Imagine using two otherwise identical lenses, but one optimized for a low F/# while the other is optimized for a high F/#. This not only provides the ability to capture a wider dynamic range in the same exposure time, but it also allows more deterministic control over depth of field. Examples include: - Robotic and machine vision - Security - Autonomy **Stereo Content Capture** There are uses for stereo vision beyond machine depth measurement. A dual-channel layout on a single sensor would enable stereo content capture without the need to calibrate two different sensors. The human eye is very sensitive to differences in color and relative illumination when presented with two images side-by-side. The DXM effectively eliminates such discrepancies.
Examples include: - AR/VR - Content capture and display (Broadcast/Cinema) - Videoconferencing ## Guidelines for System Designers

| Requirement | Recommended architecture |
|---|---|
| Compact form factor | Single-sensor (DXM™) |
| Low power consumption | Single-sensor (DXM™) |
| Simplified calibration & synchronization | Single-sensor (DXM™) |
| Depth perception at long range | Dual-sensor |
| High per-channel resolution | Dual-sensor |
| Dual modality (RGB+IR or wide+narrow FOV) | Single-sensor (DXM™) |
| Cost-sensitive volume deployment | Single-sensor (DXM™) |

## Conclusions As robotic and machine vision applications demand smaller, smarter, and more integrated systems, **single-sensor stereo imaging** emerges as a viable and even preferable alternative to traditional dual-sensor architectures. Thanks to improvements in optics, sensor design, and calibration algorithms, these systems are no longer niche solutions; they are becoming a key differentiator. Sunex continues to lead in this space by **offering optical design services**, **custom lens manufacturing**, and **advanced alignment** and **integration solutions** tailored to the needs of robotics OEMs, module makers, and system integrators. Whether enabling stereo imaging, dual-FOV pipelines, or RGB-IR fusion, **Sunex’s DXM™ technology** provides the optical precision and design flexibility needed to deliver next-generation vision systems. As the boundary between form factor and functionality continues to shrink, vision systems like these will be key to enabling the next wave of intelligent automation. Download PDF brochure --- ## Off-Highway-Vehicle (OHV) Solutions - Source: https://sunex.com/2025/08/27/off-highway-vehicle-ohv-solutions/ - Summary: We understand that your application requires reliable optical performance despite being subjected to environmental extremes.
Off-highway vehicles — agricultural machinery, construction equipment, mining vehicles, and utility platforms — operate in some of the harshest imaging environments: extreme dust, mud, water immersion, shock, vibration, and wide temperature ranges from arctic cold to desert heat. Standard commercial camera lenses fail in these conditions within months. OHV applications require optics designed from the ground up for environmental robustness, not adapted from other markets. This article covers the key optical and mechanical requirements for off-highway vehicle imaging systems, the environmental qualification standards that apply, and Sunex’s approach to designing lenses that survive and perform in the field. Our Off-Highway-Vehicle (OHV) solutions benefit greatly from our experience and success as a leading global supplier to the automotive industry, and from the process and manufacturing know-how that supports the ruggedization of our products for the harshest and most demanding environments. This ranges from IP sealing and improved impact resistance of the first element to the choice of the right materials and boresight stabilization. Sunex works closely with our customers and partners in the OHV segment and truly understands their application needs. ## What environmental conditions do off-highway vehicle cameras need to survive? - Enhanced survivability over shock, vibration, moisture/humidity, and wide temperature ranges - Advanced coatings, including impact-resistant (ToughLens™), hydrophobic (HP3), and high-temperature coatings - A complete product portfolio for multispectral applications including VIS, RGB-IR, SWIR, and UV Whether it is on the pathway to the fully autonomous machine, support for the InCabin operator, or improving the actual task (e.g., the crop yield), Sunex has a complete product portfolio and decades of experience in custom lens and camera module design and manufacturing.
We understand that your application requires predictable, stable, and repeatable optical performance despite being subjected to environmental extremes, and our team is there to help you make the proper selections and explain the ruggedization options based on your needs. ## What optical specifications are required for OHV and Smart Agricultural vehicle imaging? Imaging and camera technology have revolutionized smart agriculture by introducing autonomy into mobile farm equipment and giving farmers real-time insights into their crops, livestock, and overall farm conditions. Drones equipped with high-resolution cameras or multispectral sensors are used to capture aerial images of large farming areas, and ground-based camera systems, including those mounted on autonomous vehicles, play an important role in precision farming: analyzing soil conditions, monitoring crop growth, and even identifying the health of individual plants. These technologies have started to become ubiquitous, helping farmers manage time, monitor crop health, detect pests, and identify areas that require irrigation or fertilization. Powerful imaging hardware is often paired with innovative AI software to allow farmers to quickly assess and respond to issues before they become widespread, thus improving crop yields and reducing costs associated with chemical treatments and water usage. All graphs are for illustration purposes only. The individual lens performance can be different. ## How does Sunex design lenses for off-highway and heavy machinery applications? Sunex partners directly with the companies developing the technology and equipment driving the smart agriculture revolution. Sunex’s optical design and manufacturing know-how is a key factor in achieving the goals of these customers, from large global corporations to young start-up companies with novel approaches.
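One way to quantify the resolution requirement for aerial crop imaging is ground sample distance (GSD), the ground footprint of a single pixel. The sketch below is illustrative only: the pixel pitch, focal length, and altitude are hypothetical values, not from any Sunex datasheet.

```python
# Ground sample distance (GSD) sketch for a nadir-looking aerial crop camera.
# All numbers are hypothetical illustration values, not Sunex specifications.
def gsd_mm_per_px(pixel_pitch_um, focal_length_mm, altitude_m):
    """Ground footprint of one pixel, in mm, using the pinhole-camera ratio."""
    return pixel_pitch_um * 1e-3 * (altitude_m * 1e3) / focal_length_mm

# Hypothetical drone camera: 2.0 um pixels behind an 8 mm lens at 30 m altitude.
print(f"GSD: {gsd_mm_per_px(2.0, 8.0, 30.0):.1f} mm/pixel")  # 7.5 mm/pixel
```

A GSD of several millimeters per pixel may be adequate for crop-row overviews, while resolving individual pests or leaf damage calls for a longer focal length, lower altitude, or finer pixels.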
--- ## Medical Solutions - Source: https://sunex.com/2025/08/27/medical-solutions/ - Summary: Deep optical design experience and advanced process and manufacturing know-how will drive single-use as the future of endoscopy. ## Endoscopes Endoscope optics are at the core of who we are and how we came to be in the optics industry. The first device created by Sunex, nearly thirty years ago, was a laparoscope. Since then, endoscope technology has advanced greatly, and our optics technology has progressed alongside it. A few of the recent endoscopy projects we have had the fortune of working on include both single-use and reusable colonoscopes, laparoscopes, and duodenoscopes. We believe specialty and single-use endoscopes are the future of the industry. Traditionally, endoscopes were expensive and required meticulous sterilization after each use. However, the advent of single-use endoscopes has revolutionized the medical landscape, offering advantages for patients and providers alike. With streamlined service, increased patient safety, reduced costs, and reduced environmental impact, single-use is the future of endoscopy. While there are numerous advantages, designing and manufacturing these endoscopes comes with plenty of challenges as well. Sunex is proud to have brought multiple single-use endoscopes to market, navigating the balance between cost effectiveness and high performance. A lens that can combine a large FOV and low distortion in a small package is an ideal candidate for the future of endoscopy. All graphs are for illustration purposes only. The individual lens performance can be different. ## Feature Products ## Diagnostic Imaging Imaging has been a cornerstone of diagnostics since the discovery of X-rays in the late 1800s. By enabling providers to non-invasively observe their patients, countless lives have been saved. At the heart of this field are premium optics. High-quality optics ensure superior image clarity and resolution.
This is crucial as it allows healthcare professionals to see fine details, such as minute lesions, microcalcifications, or subtle changes in tissue composition. Increased accuracy in diagnostics can improve treatment plans as well, helping to precisely identify cancerous tissues that need to be removed so that healthy tissues remain unharmed. Sunex is proud to have had the opportunity to work on the development of a number of diagnostic devices, from portable X-ray machines to point-of-care disease detection devices. Across all applications, precision and consistency are at a premium, and these opportunities have led to the development of some of the most accurate, low-distortion lenses in our catalog. By delivering athermalized, low-distortion, and high-resolution lenses, Sunex has been grateful to play a role in the diagnosis and treatment of many medical maladies, and we look forward to continuing to innovate. All graphs are for illustration purposes only. The individual lens performance can be different. ## Robotic Surgery The future of robotic surgery is incredibly promising. As technology continues to advance, it is expected that the application of robots in surgery across nearly all disciplines will greatly increase. Some of the advancements that are primed to enable this increased adoption include enhanced vision capabilities and ever-shrinking devices to make surgery as minimally invasive as possible. With a large catalog of high-quality, miniature lenses, Sunex is excited to offer assistance to innovators creating the next generation of surgical robots. We have worked with leading companies to create ultra-small, high-resolution, and wide-FOV lenses that currently reside in surgical systems. These range from the endoscopes that capture images to the immersive vision systems that relay these images to surgeons, and a range of visual applications in between.
By giving providers an increased FOV and depth of field over other lenses, Sunex lenses ensure the entire area of operation can be viewed clearly. All graphs are for illustration purposes only. The individual lens performance can be different. ## Feature Products

| PN | Format | MP Class | HFOV | F/# | Feature |
|---|---|---|---|---|---|
| Dental Camera | 1/4" | 3MP | 67° | F/14.8 | Hybrid, extreme F/# |
| Ophthalmoscope | 1/2.5" | 5MP | 55° | F/5.6 | All-glass, narrow FOV |
| DSL944 | 1/2.5" | 5MP | 55° | F/2.8 | All-glass, short TTL, fully athermalized |
| DSL949 | 1/3" | 5MP | 82° | F/2.0 | Hybrid design, compact size, low distortion |

Don’t find what you are looking for here? Then visit our Medical Imaging Solution page. We also recommend searching our entire **Off-The-Shelf Portfolio**, or using the **Imaging System Builder** to get started on a custom solution. --- ## Medical Brochure - Source: https://sunex.com/2025/08/20/medical-brochure/ - Summary: We work with you to find the best balance between cost and performance to meet the often unique application requirements. Ingo Foldvari is a Sales Engineer and Director of Business Development at Sunex Inc., a leading US-based OEM manufacturer of custom imaging optics and camera modules. With over 25 years of industry experience, Ingo advises engineering teams on lens specifications and camera module selection for ADAS, in-cabin monitoring, HD Lighting, medical optics including disposable endoscopy, and industrial robotics applications. He holds a degree in Electrical Engineering and speaks regularly at industry conferences including DVN Lighting Workshops, AutoSens, and Technology Days. --- ## Understanding MTF in Digital Imaging - Source: https://sunex.com/2025/08/12/understanding-mtf-in-digital-imaging-why-it-matters-and-how-to-interpret-it/ - Summary: When evaluating or designing an imaging system, one of the most important (and often misunderstood) specs is the Modulation Transfer Function (MTF).
## What is MTF, and why does it matter for lens selection? MTF (Modulation Transfer Function) is the single most important metric for predicting how sharp and detailed your lens images will be. Higher MTF means better contrast at fine detail, directly impacting detection accuracy in machine vision, ADAS, and medical imaging. A good rule of thumb: target MTF >40% at the Nyquist frequency of your sensor. MTF describes how well a lens transfers contrast across a range of spatial frequencies. A perfect lens retains 100% contrast (MTF = 1.0) at all frequencies; real lenses lose contrast as detail gets finer. This article explains how to read MTF graphs, what values to target, and how to use Sunex’s interactive MTF Simulator. When evaluating or designing an imaging system, one of the most important (and often misunderstood) specs is the Modulation Transfer Function (MTF). MTF describes how well a lens can transfer contrast at varying levels of image detail (spatial frequencies). In other words, it shows how sharp the image will be. But just reading an MTF graph isn’t always enough. That’s why we created the MTF Simulator—a visual tool that helps you see how different MTF values affect image clarity. ## How do you read an MTF graph? An MTF graph plots contrast (0–1) on the Y-axis versus spatial frequency on the X-axis (in cycles/mm). Some key features to look for: - A higher MTF curve means better contrast at that level of detail - Sagittal (S) vs. tangential (T) curves indicate performance symmetry - Variation in performance across the field of view (FOV) MTF curves provide a quantitative idea of how a lens will perform at certain spatial frequencies. Our MTF Simulator connects the quantitative to the qualitative, making it easy to understand the impact a given MTF value will have on your image. ## How can I use the Sunex MTF Simulator to evaluate a lens?
Here’s how it works: - Choose your target type: sine wave or square wave. For square waves, the simulator assumes harmonic MTF values drop off as 1/frequency; this is generally accurate at mid to high frequencies but may not reflect low-frequency, high-MTF edge cases. - Select a spatial frequency: input the required spatial frequency to explore coarse vs. fine detail. - Set an MTF value between 0 and 1: simulate how much contrast your lens retains at that level of detail. The simulator shows a perfect input target and a corresponding image impacted by your selected MTF value, helping you visualize how contrast loss affects clarity. Visualize the impact of MTF using our interactive MTF Simulator. ## What MTF value should I target for my imaging application? MTF is more than a theoretical metric—it’s the key to understanding how your lens and sensor work together to produce sharp, high-contrast images. By interpreting MTF graphs, matching sensor specs, and accounting for application-specific trade-offs, you can make smarter imaging decisions. Ask our Sunex support team to get more information on MTF, and try our MTF Simulator to get a visual feel for how optical performance translates to image quality. --- ## Custom Optics vs. Chip-level Camera Modules - Source: https://sunex.com/2025/07/19/custom-optics-vs-chip-level-camera-modules/ - Summary: Custom Optics for Medical Devices offer a Flexible Alternative to Chip-level Cameras for single-use endoscopes. Chip-level camera modules offer fast time-to-market for single-use endoscopes — but custom optical systems deliver superior image quality, autofocus capability, and clinical differentiation that chip-on-tip modules cannot match. The right choice depends on your resolution requirements, sterilization constraints, and whether you are building a commodity device or a premium clinical platform.
This article compares both approaches across optical performance, DFM readiness, regulatory compatibility, and total cost of ownership for medical device OEMs. # Custom Optics for Medical Devices offer a Flexible Alternative to Chip-level Cameras As the market for disposable endoscopes, catheter-based imaging, and minimally invasive diagnostic devices grows, medical OEMs face a key architectural decision: use a fully integrated sensor module like Omnivision’s CameraCubeChip®, or pursue a **custom camera system** where image quality, flexibility, and system integration are optimized for the end application. While integrated modules offer simplicity and ultra-compact size, they are not always the best fit, especially when device differentiation, superior image performance, or tight system integration is required. In these scenarios, **Sunex’s custom optical and sensor module solutions present a powerful alternative**. ## What is the difference between a chip-level camera module and a custom optical system for endoscopes? Chip-level integrated technology delivers a compact, ready-to-use module by embedding a CMOS sensor, fixed lens, and packaging into a single unit. This approach is well-suited to basic visualization tasks, especially where cost and simplicity dominate. However, this integration comes at the cost of flexibility: - **Fixed optics** limit field of view, depth of field, and image plane tuning. - **No autofocus** capability for applications with variable working distances. - **Limited sensor options**, especially for newer sensors with specialized features (e.g., global shutter, large pixel size, or spectral sensitivity). For advanced medical procedures—such as precision-guided catheter navigation, robotic-assisted surgery, or high-resolution disposable endoscopes—these trade-offs may outweigh the convenience of a one-size-fits-all module.
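The fixed-focus limitation above can be made concrete with a thin-lens depth-of-field estimate. The numbers in the sketch below (focal length, F-number, circle of confusion, working distance) are hypothetical illustration values, not the specs of any particular module.

```python
# Thin-lens depth-of-field sketch illustrating why a fixed-focus chip-on-tip
# module covers only a limited band of working distances.
# All numbers below are hypothetical illustration values, not module specs.
def dof_limits_mm(f_mm, f_number, coc_mm, focus_mm):
    """Near/far limits of acceptable focus via the hyperfocal approximation."""
    h = f_mm ** 2 / (f_number * coc_mm) + f_mm      # hyperfocal distance
    near = h * focus_mm / (h + (focus_mm - f_mm))
    far = h * focus_mm / (h - (focus_mm - f_mm)) if focus_mm < h else float("inf")
    return near, far

# Hypothetical miniature endoscope lens: f = 1.2 mm at F/4, 3 um circle of
# confusion, factory-focused at a 20 mm working distance.
near, far = dof_limits_mm(1.2, 4.0, 0.003, 20.0)
print(f"sharp from {near:.1f} mm to {far:.1f} mm")  # roughly 17.3 mm to 23.7 mm
```

In this example, tissue closer than about 17 mm or farther than about 24 mm falls outside acceptable focus, which is why variable working distances favor autofocus or tunable-lens designs.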
## Sunex: Tailored Imaging Systems Designed for Performance, Size, and Cost Sunex specializes in designing and manufacturing custom optical systems, miniature lenses, and complete camera modules for medical imaging applications, including disposable (aka single-use) endoscopes. Our expertise enables medical device companies to develop systems tailored to specific clinical tasks and imaging environments. Key advantages of Sunex’s custom approach: - **Application-Specific Optics:** Custom-designed lenses to achieve your desired FOV, working distance, MTF, distortion characteristics, and mechanical envelope. - **Tunable Autofocus Options:** Possible integration of tunable lens elements (e.g., electrically tunable liquid lenses) enables autofocus capability in miniature imaging systems—ideal for multi-depth procedures or variable tissue distances. - **Sensor & PCB Flexibility:** Support for any CMOS sensor of your choice—no lock-in to predefined imaging specs. Sunex also offers custom PCB design and manufacturing, including image sensor integration for bare die and packaged CMOS sensors, power delivery, and connectivity. - **Advanced Optical Alignment:** Active alignment between the lens and sensor ensures optimal focus, centering, angular performance, and the smallest part-to-part variance, which is particularly critical for small-pixel sensors, high-resolution systems, and large-volume applications. - **Feasibility Studies & System Optimization:** Early-stage design feasibility services, optical simulations, and system-level analysis help de-risk development and accelerate product timelines. - **Sterilization-Ready Materials:** Lens and housing materials are selected in accordance with the customer’s requirements for sterilization and approval processes commonly used for single-use medical devices. - **Scalable Manufacturing:** Whether you need prototypes for clinical trials or full-scale production for disposable scope lines, Sunex’s vertically integrated capabilities can scale with your business from early prototypes to high-volume series production. ## When does a disposable endoscope benefit from custom optics vs. a chip-level module? There are many different indications, devices, and procedures in the medical field. Even though they all have their specific requirements, they all benefit from a custom approach to the optical system if one or more of the following is required: - **High-resolution** and **wide-angle** disposable endoscopes requiring low distortion and superior edge-to-edge sharpness. - **Large depth-of-field (DOF)** imaging where autofocus is needed to maintain clarity at variable object distances. - **Dual-imager or stereo vision** systems, where precise calibration and alignment between channels are mandatory. - **Low-light or IR-capable** applications requiring optimized optics and sensor pairing. ## How does Sunex scale from prototype to high-volume production for medical camera modules? Fully integrated chip-level camera modules provide a valuable solution for many basic disposable imaging tasks, especially where space and cost constraints are paramount. However, when performance matters, whether it’s higher image quality, autofocus capability, robustness, or system-specific sensor selection, a **custom camera module built by Sunex** offers differentiation by meeting the performance, size, and commercial objectives required to make the end-customer successful in their domain. From lens design and simulation to sensor integration, active alignment, tunable focus implementation, and custom PCB layout, Sunex provides a comprehensive pathway from concept to volume production, delivering imaging systems that meet the demands of next-generation medical devices. Disclaimer: CameraCubeChip® is a registered trademark of Omnivision Technologies, Inc. All other trademarks are the property of their respective owners. Sunex makes no claim of endorsement or affiliation.
Author Information: --- ## Medical Camera Modules - Source: https://sunex.com/2025/07/02/medical-camera-modules/ - Summary: Our expertise addresses the need for advanced medical camera modules, designed specifically for demanding medical applications. Medical camera modules must meet requirements that no consumer or industrial module can satisfy: biocompatible materials for patient-contact applications, sterilization compatibility (EtO, autoclave, or radiation), miniature form factors for minimally invasive procedures, and image quality demanding enough to support clinical diagnostic decisions. Sunex designs and manufactures medical camera modules from optical design through sensor integration, PCB layout, active alignment, and full module assembly — supporting medical device OEMs from early prototype through regulatory submission and volume production. # Helping to save lives with Medical Camera Modules Medical imaging technology continues to push the boundaries of precision, efficiency, and safety in clinical environments. As demand increases for more affordable and hygienic solutions, especially in minimally invasive surgery (MIS), the importance of scalable and high-performance imaging systems has never been greater. Sunex, a long-standing innovator in optical design, is helping to address these needs through its advanced medical camera modules, designed specifically for demanding applications like disposable endoscopes, robotic surgery, and dental imaging. ## What makes Sunex camera modules suitable for medical device applications? Sunex’s legacy in medical optics began in the mid-1990s when the company’s founder and CEO contributed to the optical design of one of the first disposable laparoscopes. This early involvement laid the foundation for decades of innovation in the field.
Since then, Sunex has designed and manufactured numerous lens and camera systems for medical applications—ranging from high-resolution dental cameras to stereoscopic vision systems used in robotic-assisted surgery. Each product is built around the same core principle: deliver uncompromising image quality while meeting stringent regulatory and manufacturing requirements. ## Addressing the Rise of Single-Use Endoscopes The move toward single-use endoscopes has accelerated due to the combined pressures of cost control, infection prevention, and procedural efficiency. Reusable endoscopes, though cost-effective per use, can pose significant risks related to sterilization and reprocessing. Disposable alternatives, however, must meet the same imaging and performance expectations in a more constrained economic and mechanical envelope. This is where Sunex’s deep domain knowledge plays a vital role. Sunex works closely with OEMs to design optical systems that enable exceptional image quality within tight space, cost, and materials constraints. Whether it’s developing miniature lens assemblies or integrating complete camera modules with cabling and PCBAs, Sunex’s vertically integrated capabilities simplify the path to commercialization. ## How does Sunex support medical device OEMs from prototype to high-volume production? A common misconception is that optical systems proven in prototyping can be readily scaled. However, many early designs lack the manufacturing readiness required for volume production. Sunex’s engineering approach ensures that design for manufacturability (DFM) is addressed early in the development cycle. Prototypes are rapidly produced using cutting-edge processes for glass and plastic optics, allowing fast iteration and real-world testing. 
What distinguishes Sunex is not just speed but precision: automated 6-axis active alignment, cleanroom assembly, and rigorous QA processes ensure that each component meets exacting standards, whether for a 10-piece pilot run or a million-unit production ramp. ## Engineering Imaging Solutions for the Future of Healthcare Sunex combines innovation, scale, and reliability to deliver camera modules that power life-saving technologies. From concept to commercial product, our systems are designed to meet the practical needs of medical professionals while complying with regulatory and market pressures. Whether you’re exploring new product ideas or preparing for mass production, Sunex has the expertise and infrastructure to help you succeed. With over two decades of experience and a trusted track record in medical optics, Sunex is more than a component supplier—we’re your development partner. --- ## High CRA vs Low CRA CMOS Sensors: Impact on Lens Design Performance - Source: https://sunex.com/2025/06/19/high-cra-vs-low-cra-cmos-sensors-impact-on-lens-design-performance/ - Summary: There are significant drawbacks of utilizing high CRA sensors for applications where a short z-height and compactness are not as important. The Chief Ray Angle (CRA) of a CMOS sensor is one of the most underappreciated variables in lens-sensor system design. A mismatch between the sensor CRA specification and the lens CRA output causes colour shading, reduced signal-to-noise ratio (SNR), vignetting, and image uniformity problems that cannot be fully corrected in software. Low-CRA sensors give optical designers significantly more freedom and typically enable better image quality at lower system cost. This white paper examines how sensor CRA affects lens design complexity, physical dimensions, optical performance, and system cost — with practical guidance for engineers selecting sensors and lenses for embedded vision systems. ### 1.** Introduction** **1.1. 
Background** ## How does sensor CRA affect lens design complexity and image quality? High CRA sensors (CRA greater than 25 degrees) were developed to enable mobile phone cameras. The driving factor for high CRA sensors is the requirement for a low z-height lens stack in the mobile phone form factor. As this paper shows, there are significant drawbacks to utilizing high CRA sensors in applications where a short z-height and compactness are not as important as they are in a mobile phone use case. It is advantageous to use **lower Chief Ray Angle (CRA) sensors** for high-performance imaging applications for the following benefits:

- **Compatibility with Higher Performance Lens Designs:** Lower CRA sensors improve compatibility with higher performance lens designs, where chief rays arrive at the image plane close to parallel to the optical axis. By aligning the sensor’s CRA with the lens’s exit pupil position, the system can achieve more uniform illumination across the image plane and reduce distortion variation.
- **Improved Quantum Efficiency (QE) and Light Collection:** Each pixel on a CMOS sensor typically has a microlens array to focus incoming light onto the photodiode. When light strikes at a high angle (high CRA), a significant portion can be reflected, refracted inefficiently, or even directed into an adjacent pixel. Lowering the CRA ensures light strikes the microlenses and photodiodes more perpendicularly, allowing for more efficient light capture, increasing overall QE (especially at the edges of the sensor), and improving low-light performance.
- **Reduced Pixel Crosstalk:** High CRA can lead to light intended for one pixel being absorbed by an adjacent pixel, causing reduced image sharpness.
A lower CRA inherently reduces this likelihood by ensuring light enters the pixel more directly, minimizing “spillover” and resulting in cleaner images with higher signal-to-noise ratio (SNR).
- **Improved Image Uniformity and Color Shading Correction:** High CRA can result in significant microlens array vignetting and color shading (radial color shifts) as off-axis pixels receive less light or color filters become less effective at oblique angles. Reducing the CRA improves the inherent uniformity of illumination and color response across the sensor, simplifying post-processing and leading to a more consistent, higher-quality image.

**1.2. Problem Statement** ## What happens when the lens CRA does not match the sensor CRA? While designers strive for optimal image quality, the choice of sensor CRA can fundamentally alter the demands placed on the accompanying optics. This paper seeks to answer an important question in optical system design: **How does the Chief Ray Angle (CRA) of a CMOS sensor influence the optical design performance and physical dimensions of a lens system designed for the same spec?** A significant mismatch between the sensor’s CRA acceptance and the lens’s chief ray delivery can lead to critical image quality degradation, including vignetting, severe color shading, reduced overall quantum efficiency, and increased pixel crosstalk, particularly at the image periphery. While some of these artifacts can be mitigated through digital image processing, such solutions often come at the expense of computational overhead, increased noise, or compromises in real-time performance. Furthermore, in an era demanding ever-smaller and higher-performing imaging modules for applications ranging from consumer electronics to advanced industrial vision systems, understanding the relationship between CRA and physical lens dimensions is crucial for achieving optimal miniaturization, cost-effectiveness, and manufacturability without sacrificing needed optical quality.
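The illumination penalty behind the vignetting and shading artifacts described here can be quantified with the classical cosine-fourth law (treated in Section 4.3 of this paper). A minimal first-order sketch, using the two CRA values examined in Case 1 and ignoring additional microlens and filter-stack losses:

```python
import math

def cos4_relative_illumination(cra_deg: float) -> float:
    """First-order relative illumination at the image edge per the
    cosine-fourth law: RI = cos^4(theta), where theta is the chief ray angle."""
    return math.cos(math.radians(cra_deg)) ** 4

# Case 1 CRA values from this paper: low CRA of 7 deg, high CRA of 40 deg
for cra in (7, 40):
    print(f"CRA {cra:2d} deg -> cos^4 RI = {cos4_relative_illumination(cra):.3f}")
# A 7 deg chief ray retains roughly 97% edge illumination, while 40 deg
# falls to roughly 34% before any lens vignetting is even considered.
```

This is only the geometric term; real sensors add angle-dependent microlens and color-filter losses on top of it, which is why high-CRA mismatches cannot be fully recovered in software.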
This paper aims to provide a quantitative comparison to guide more informed sensor and lens selection decisions. **1.3. Method** This study considers CMOS sensors across two common diagonal formats: **1-inch (approximately 16mm diagonal)** and **1/2.5-inch (approximately 7mm diagonal)**. For each format, two distinct Chief Ray Angle (CRA) requirements are examined:

- For the **1/2.5-inch diagonal sensor (Case 1)**, cases with a high CRA of 40° and a low CRA of 7° are investigated.
- For the **1-inch diagonal sensor (Case 2)**, cases with a high CRA of 26° and a low CRA of 8° are analyzed.

For each case, we created two lens designs: a high CRA one and a low CRA one. These lens designs will be analyzed and compared based on their optical performance metrics, including **Modulation Transfer Function (MTF), distortion, relative illumination (RI), lateral color over field, and longitudinal color over aperture**. Furthermore, key physical dimensions, such as the **total track length (TTL)** and **front lens diameter**, will be compared between the two lens designs corresponding to each sensor format. To ensure a meaningful comparison, all lens designs for a given sensor format will be constrained to have the same number of lens elements, the same F-number (F/#), and to cover an identical field of view (FOV). Since all practical designs have residual uncorrected aberrations, it is useful to review the theoretical maximum performance achievable for each CRA configuration assuming all aberrations are absent. We construct an ideal paraxial lens model in Zemax. The ideal lens model allows us to determine the upper limit of achievable MTF and RI for each case. The high CRA version of each case shows that, even if all aberrations could be perfectly corrected, the maximum achievable performance is lower than in the low CRA version. **2.
Comparison for Case 1: 1/2.5-inch Diagonal Sensor** Case 1 of this study investigates the impact of CRA on optical design performance for a 1/2.5-inch diagonal CMOS sensor. The common specifications for the two lens designs considered within this case are as follows:

- **Field of View (FOV):** ±60°
- **F-number (F/#):** 2.5
- **Wavelength Spectrum:** 435 nm to 656 nm
- **Chief Ray Angle (CRA) Conditions:** High CRA: 40°; Low CRA: 7°
- **Number of Lens Elements:** 6 (fixed for both designs in this comparison)

The primary design goals for both lenses in Case 1 include maximizing the Modulation Transfer Function (MTF) performance, with comparable optical distortion, and minimizing variation of illumination across the entire field.

*Table 1A. Two CRA Design Examples Comparison for Case 1*

| | Case 1, CRA 7° | Case 1, CRA 40° |
|---|---|---|
| TTL | 40mm | 7.4mm |
| Front diameter | 18mm | 4mm |
| Max CRA | 7° | 40° |

(For each design, the original table also presents plots of RI over field, field curvature over field, distortion over field, lateral color between 435-650 nm over field, longitudinal color over aperture, and MTF at 120 LP/mm over field.)

*Table 1B. Two CRA Paraxial Lens Model Comparison for Case 1*

(The original table presents, for the paraxial models at CRA 7° and CRA 40°, plots of MTF at 120 LP/mm over field and RI over field.)

**3. Comparison for Case 2: 1-inch Diagonal Sensor** Case 2 of this study investigates the impact of CRA on optical design performance for a 1-inch diagonal CMOS sensor.
The common specifications for the two lens designs considered within this case are as follows:

- **Field of View (FOV):** ±40°
- **F-number (F/#):** 2.8
- **Wavelength Spectrum:** 435 nm to 656 nm
- **Chief Ray Angle (CRA) Conditions:** High CRA: 26°; Low CRA: 8°
- **Number of Lens Elements:** 6 (fixed for both designs in this comparison)

The primary design goals for both lenses in Case 2 include maximizing the Modulation Transfer Function (MTF) performance, with comparable optical distortion, and minimizing variation of illumination across the entire field.

*Table 2A. Two CRA Design Examples Comparison for Case 2*

| | Case 2, CRA 8° | Case 2, CRA 26° |
|---|---|---|
| TTL | 60mm | 14.7mm |
| Front diameter | 22mm | 5mm |

(For each design, the original table also presents plots of CRA over field, RI over field, and MTF at 120 LP/mm over field.)

*Table 2B. Two CRA Paraxial Model Comparison for Case 2*

(The original table presents, for the paraxial models at CRA 8° and CRA 26°, plots of MTF at 120 LP/mm over field and RI over field.)

**4. Results and Analysis** ## How do I choose a lens that matches my sensor’s CRA specification? **4.1. Physical Dimensions: Total Track Length (TTL) and Maximum Lens Diameter** The main advantage of high CRA designs can be clearly seen from the physical dimension differences. **Both the total track length (TTL) and the front diameter of the lens are significantly smaller for the high CRA design compared to the low CRA design.** The larger sizes for low CRA designs are primarily driven by the requirement for **near-telecentric imaging conditions**. For such a design, the exit pupil effectively resides at or near infinity, necessitating that chief rays strike the sensor close to perpendicularly. This condition mandates that the physical aperture stop be located approximately at the focal point of the rear lens group.
To minimize pupil aberrations induced by this rear group and to maintain a close-to-linear variation of CRA with image height, the focal length of the rear lens group must be sufficiently large. These factors prevent the miniaturization of the rear lens group, contributing to a longer overall lens. The front lens group in the low CRA design plays a crucial role in adapting the wide field of view from the object space to a smaller, more controlled angular range for the rear lens group. This function, coupled with the constraint of maintaining a small chief ray angle at the physical stop for aberration correction, results in increased complexity and length for the front lens group. Consequently, the combined requirements for the front and rear groups lead to the low CRA lens being much longer and wider than its high CRA counterpart. Conversely, in a high CRA design, the lens group following the aperture stop (the rear lens group) has less stringent requirements for bending the chief ray of the maximum field. The physical stop can be positioned much closer to the rear lens group. Furthermore, if the overall field of view (FOV) of the lens is high and comparable to the sensor’s maximum CRA, the front lens group needs to perform less angular adaptation for the rear group, allowing for a simpler and more compact front-end structure. **4.2. Chief Ray Angle Distribution** The **variation of CRA versus image height is more linear in the low CRA design**, whereas the **high CRA design exhibits more significant non-linearity.** The aggressive correction of field curvature and astigmatism within the limited space of the rear lens group in high CRA designs, often employing highly aspheric elements, inherently leads to greater non-linearity in the chief ray angle as a function of image height. **4.3. 
Relative Illumination (RI)** The **relative illumination (RI) of the low CRA design rolls off much more gently than that of the high CRA design, and the RI at the maximum image height is considerably higher in the low CRA case.** This observation is a direct consequence of the **cosine fourth law**, which dictates that RI decreases as the chief ray angle increases. The lower CRA of the design inherently minimizes these illumination losses. Additionally, entrance pupil expansion in the low CRA design, where the more complex front lens group increases the off-axis entrance pupil, contributes to a more uniform illumination profile across the field. **4.4. Modulation Transfer Function (MTF)** The **MTF across the field of view for the low CRA design exhibits a slower roll-off and significantly smaller astigmatism than for the high CRA design.** The variation of astigmatism over the FOV is particularly pronounced for the high CRA designs (e.g., 40-degree CRA). This directly correlates with the reasons detailed in the analysis of field curvature, astigmatism (Section 4.4), and spherochromatism (Section 4.7). Fundamentally, the improved MTF performance in low CRA designs is a consequence of the more favorable chief ray angles entering the rear lens group, which allows for better correction of off-axis aberrations and chromatic effects. The higher the chief ray angle incident on the rear group, the more challenging it becomes to achieve high and uniform MTF across the field. **5. Conclusion** This study demonstrates a clear trade-off between optical system compactness and imaging performance when selecting CMOS sensors with varying Chief Ray Angles (CRA). 
While **higher CRA designs offer the advantage of achieving a smaller package size**, enabling more compact camera modules, **lower CRA designs consistently yield superior optical aberration correction and better align with inherent sensor characteristics**, thereby ensuring significantly improved overall imaging performance. A notable practical consequence of the aggressive optical designs often required for high CRA sensors is the **difficulty in achieving effective athermalization across the field of view**. This limitation can lead to significant performance degradation under varying temperature conditions. Therefore, for applications where **space constraints are not paramount and optimal image quality is the primary objective, a low CRA sensor is unequivocally the preferred choice.** The benefits in aberration control, light collection efficiency, and image uniformity offered by lower CRA designs outweigh the packaging advantages of high CRA solutions in such scenarios. --- ## M12, S-Mount, C-Mount…What does it all mean? - Source: https://sunex.com/2025/06/05/m12-s-mount-c-mount-what-does-it-all-mean/ - Summary: M12, S-Mount, C-Mount…What does it all mean if you are searching for a small-camera lens? M12, S-Mount, Board-Mount, C-Mount, and CS-Mount are all names for lens mounting formats used in cameras — and they are not interchangeable. M12 (S-Mount) lenses are the dominant standard for miniature embedded cameras in ADAS, robotics, medical, and security applications. C-Mount and CS-Mount lenses are older industrial standards still used in machine vision and laboratory environments. This article traces the history of each format, explains the technical differences in thread, back focal length, and mounting philosophy, and gives clear guidance on which format is right for your application. ## What is the difference between M12, S-Mount, and C-Mount lenses? 
If you’ve been searching for a small-camera lens, you’ve likely encountered terms like M12, S-Mount, Board-Mount, Miniature Lens, C-Mount, CS-Mount, and more. With so many names floating around, it’s easy to get confused. So, why does it seem like there are so many options? And how do you know which one is right for your project? In an effort to answer these questions, we thought we’d explore what these terms mean, where they come from, and how they relate to your lens selection. ## Why did S-Mount (M12) replace C-Mount for miniature embedded cameras? Going back decades, the standard for interchangeable industrial lenses (as opposed to most consumer photographic cameras) was the **C-mount**. The C-Mount lens, still a staple in industrial machine vision, some security camera circles, and university labs, was one of the first solutions to standardize the lens mount format. It paved the way even before the days of CCD and early CMOS cameras. The thread of a C-Mount is 1-32; more specifically it’s 1” in diameter with 32 TPI (threads per inch), or: M25.4 x 0.794mm. The C-mount has a standardized back focal length (BFL) of 0.69” (17.526mm), meaning that all these lenses were designed with the same flange BFL. In practice, the lens was screwed all the way down to the mount until tight and then a focusing mechanism allowed fine focus depending on the object distance. This of course made for some long TTL lenses, especially those with a long EFL. To help address the length issue, the **CS-Mount** lens format was introduced. While it uses the same thread and mounting strategy as a C-mount, it has a shorter fixed BFL of 0.4931” (12.526mm). Although the C/CS-Mount makes for very straight-forward interchangeability, there are several drawbacks to their formats. First, the standardization of the FBFL is somewhat arbitrary in terms of optimizing optical performance and actually imposes a design constraint. 
Second, since the FBFL is fixed, the lens must have a secondary focus mechanism, and since the standard 32 TPI thread is not fine enough to focus within tens of microns of DOF (depth of focus), the lens must incorporate a relatively complex mechanical means of achieving fine focus. Third, the fixed FBFL alone means the TTL of the lens will be at least 12.5mm long (more for C-Mount) before even considering the physical length of the lens. Fourth, the C/CS mount is typically (but not always) integrated into the housing or chassis of the camera with the sensor board mounted separately. This means there is no direct mechanical reference or interface between lens and sensor, which you’ll know could be a potential source of error from our AA article (Sunex Knowledge Center: What Is Active Alignment?). Lastly, but admittedly not exclusive to C/CS lenses, there is a tendency to add more features since they are interchangeable. While these features may be ideal in applications where flexibility is needed, they are less desirable for fixed, high-volume circumstances. Often, these features also come at the cost of, well, cost, in addition to reliability, design, and performance tradeoffs. Despite these drawbacks and the fact that C/CS-Mounts aren’t technically Board-Mounts, they still have plenty of utility. It’s important to recognize how these formats helped establish standards for lens mounting and continue to serve many applications today. ## What does board-mount mean in lens design? “Board-Mount” or “S-Mount” lenses address the C/CS-Mount issues in a few ways. Board-mount lenses have no dependency on a fixed FBFL/BFL and no need for a separate focus mechanism. They are designed to thread into a threaded mount directly attached to the sensor PCB. The thread doubles as the focusing mechanism because it typically has a 0.5mm or 0.35mm pitch, making it fine enough to focus a lens (see our article Sunex Knowledge Center: Basic Thread Considerations).
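The focusing arithmetic behind this is simple: one full turn of the barrel moves the lens axially by exactly one thread pitch. A minimal sketch (the 10-degree rotation is a hypothetical tweak; the pitches are the ones quoted in this article):

```python
def axial_travel_um(pitch_mm: float, rotation_deg: float) -> float:
    """Axial lens travel in microns for a given barrel rotation:
    one full 360-degree turn advances the lens by one thread pitch."""
    return pitch_mm * 1000.0 * rotation_deg / 360.0

# Coarse 32 TPI C/CS thread (25.4/32 ≈ 0.794 mm pitch) vs. the fine
# M12x0.5 and "fine-focus" M12x0.35 board-mount threads:
for name, pitch in (('C/CS 1"-32', 25.4 / 32), ("M12x0.5", 0.5), ("M12x0.35", 0.35)):
    print(f"{name:12s}: {axial_travel_um(pitch, 10):5.1f} um per 10 deg of rotation")
# About 22 um for the 32 TPI thread, ~14 um for M12x0.5, and ~10 um for
# M12x0.35 -- fine enough to walk through a depth of focus of tens of microns.
```

This is why the board-mount thread itself can serve as the focus mechanism, while the coarse C/CS thread needs a separate fine-focus stage.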
It also eliminates many (but not all) sources of alignment error between lens and sensor by placing the lens directly on the sensor board. Of course, this means that the BFL, FBFL, and MBFL are coupled to the focal position of the lens. As a result, the focal position changes slightly from camera to camera, but the differences are on the order of tens of microns, so it is generally not a problem. A natural result of this board-mount approach is the proliferation of optimized, design-for-purpose lenses. “Board-Mount” is simply a general, all-encompassing term for lenses that are mounted and focused in this way. Within this broad category, M12 lenses, also referred to as S-mount, are the most common. Both terms refer to an M12x0.5mm lens, that is, a 12mm diameter lens with 0.5mm thread pitch. In fact, “M12” has become almost synonymous with Board-Mount, but in truth, while all M12s are Board-Mounts, not all Board-Mount lenses are M12 lenses. Other popular sizes of board-mount lens include M14, M10, M8, M7, and even smaller. Thread pitch tends to scale roughly with diameter, and M8x0.35mm threads are fairly common, but in theory any thread pitch can be used with any diameter lens. For example, M12 “fine-focus” (M12x0.35) or even larger diameters may be specified in critical higher-megapixel applications to gain a bit more focus control. M12 and other Board-Mount lenses are also ideally suited to active alignment because there is no fixed BFL and therefore no secondary focus requirement. In Active Alignment, an M12 lens can have its thread removed and can be focused and fixed directly over the sensor in one step without impacting the rest of the design. For example, you could prototype with a threaded M12 lens and mount and then go straight to mass-production with a threadless version of the same lens and mount. The other C/CS-Mount issues are addressed by M12 and other Board-Mount lenses as well.
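For reference, the mount geometries discussed in this article can be captured in a few lines, and the inch figures quoted above convert cleanly to the metric values given. A sketch (the dictionary layout is my own, values are from the text):

```python
MM_PER_INCH = 25.4

# Mount formats as described in this article:
# C/CS share a 1"-32 TPI thread; board-mount M12 has no fixed flange BFL.
mounts = {
    "C-mount":       {"pitch_mm": MM_PER_INCH / 32, "fbfl_mm": 17.526},
    "CS-mount":      {"pitch_mm": MM_PER_INCH / 32, "fbfl_mm": 12.526},
    "M12 (S-mount)": {"pitch_mm": 0.5,              "fbfl_mm": None},  # set by focus thread
}

# Cross-check the article's inch/mm figures:
assert round(MM_PER_INCH / 32, 3) == 0.794      # 32 TPI pitch -> 0.794 mm
assert round(0.69 * MM_PER_INCH, 3) == 17.526   # C-mount FBFL, 0.69 inch
# C and CS differ by exactly 5 mm of flange back focal length:
assert round(mounts["C-mount"]["fbfl_mm"] - mounts["CS-mount"]["fbfl_mm"], 3) == 5.0
```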
Since FBFL is not fixed, the lens design can converge on the best performance, independent of BFL. This generally leads to a much shorter overall solution. It also eliminates the need for a complex focusing mechanism internal to the lens. And since such lenses tend to be built-to-purpose, gone is the need for costly and complex varifocal, aperture, and locking mechanics. There are also typically commensurate gains in performance, consistency, and reliability for M12 lenses compared to their C/CS counterparts because there are fewer trade-offs. While Sunex does offer C/CS lenses, we have also pioneered large-format Board-Mount lenses, such as M20x0.5 and larger. These lenses bring the old C/CS standard into the modern age by allowing them to be mounted directly over the sensor with short BFLs, with the possibility of Active Alignment. But in the world of miniature cameras, the M12 “Board-Mount” still reigns supreme, no matter what you call it. --- ## Commercial Drones - Source: https://sunex.com/2024/07/31/commercial-drones/ - Summary: Drones and similar aerial applications present unique challenges during flight, necessitating specific considerations for optical solutions. Drones and similar aerial applications present unique challenges during flight, necessitating specific considerations for optical solutions. Over the years, drone applications have expanded to include inspection, mapping, delivery, security, agriculture, and warehouse management. A common requirement across these flight applications, whether a rotor- or wing-based drone, is the need for lightweight lenses, as every additional ounce on a drone impacts its overall performance and flight time. Often, lenses must be specially designed to meet the unique demands of each application. Many drones undertake extensive flight durations and operate in rigorous environments, imposing demanding requirements on each system component.
As a result, lenses for these applications are typically all-glass, high-resolution fisheye lenses that are lightweight and have low F/#s. Many of these products require 8 megapixels or more of resolution, particularly in the M12 format, and commonly feature lenses weighing less than 12 grams. In some instances, high dynamic range (HDR), Day/Night operation with or without additional IR illumination (RGBIR), and lens heaters are integrated into the optical system. This is especially important for drones, where superior performance is crucial. Sunex is proud to have developed projects with leading companies across all drone applications, further advancing the technology driving this industry. Here are some of our off-the-shelf products customers have used in commercial drones and adjacent application areas. All graphs are for illustration purposes only. Individual lens performance may differ. ## Feature Products

| PN | Format | MP Class | HFOV | F/# | Key Features |
|---|---|---|---|---|---|
| DSL217 | 1/4" | 5MP | 185° | F/1.8 | Hybrid lens, ultra lightweight, ultralow F/# |
| DSL218 | 1/3" | 5MP | 180° | F/2 | All-glass, ultra lightweight, compact |
| DSL219 | 1/2" | 10MP | 163° | F/2 | All-glass, high resolution |
| DSL166 | 1/1.7" | 8MP | 120° | F/1.6 | All-glass, HDR, ultralow F/# |
| DSL618 | 1/2.3" | 12MP | 185° | F/1.8 | Hybrid lens, high resolution, large FOV, ultralow F/# |
| DSL619 | 1/3" | 12MP | 185° | F/1.8 | Hybrid lens, high resolution, large FOV, ultralow F/# |
| DSL255 | 1/2.3" | 14MP | 185° | F/2 | All-glass lens, very high resolution, lightweight |
| DSL239 | 1/2" | 14MP | 180° | F/2.4 | All-glass lens, very high resolution, lightweight |
| DSL491 | 1/2.3" | 16MP | 117° | F/2.8 | All-glass lens, very high resolution, HDR |

Don’t find what you are looking for here? Try searching our entire **Off-The-Shelf Portfolio**, or use the **Imaging System Builder** to get started on a custom solution.
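As a quick illustration of narrowing the feature-product table down programmatically, the rows above can be treated as records and filtered on spec (a sketch using only data from the table; the field names are my own):

```python
# Feature products from the table above (subset of columns)
lenses = [
    {"pn": "DSL217", "format": '1/4"',   "mp": 5,  "hfov_deg": 185, "fnum": 1.8},
    {"pn": "DSL218", "format": '1/3"',   "mp": 5,  "hfov_deg": 180, "fnum": 2.0},
    {"pn": "DSL219", "format": '1/2"',   "mp": 10, "hfov_deg": 163, "fnum": 2.0},
    {"pn": "DSL166", "format": '1/1.7"', "mp": 8,  "hfov_deg": 120, "fnum": 1.6},
    {"pn": "DSL618", "format": '1/2.3"', "mp": 12, "hfov_deg": 185, "fnum": 1.8},
    {"pn": "DSL619", "format": '1/3"',   "mp": 12, "hfov_deg": 185, "fnum": 1.8},
    {"pn": "DSL255", "format": '1/2.3"', "mp": 14, "hfov_deg": 185, "fnum": 2.0},
    {"pn": "DSL239", "format": '1/2"',   "mp": 14, "hfov_deg": 180, "fnum": 2.4},
    {"pn": "DSL491", "format": '1/2.3"', "mp": 16, "hfov_deg": 117, "fnum": 2.8},
]

# Example: a surround-view drone camera wanting a fisheye field of view
# (>= 180 deg) and at least 12 MP of resolution:
picks = [l["pn"] for l in lenses if l["hfov_deg"] >= 180 and l["mp"] >= 12]
print(picks)  # ['DSL618', 'DSL619', 'DSL255', 'DSL239']
```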
--- ## Geospatial Imaging - Source: https://sunex.com/2024/02/24/geospatial-imaging/ - Summary: Geospatial imaging refers to the process of acquiring, analyzing, and interpreting data related to the Earth’s surface and its features. Geospatial imaging refers to the process of acquiring, analyzing, and interpreting data related to the Earth’s surface and its features using various imaging technologies to capture this data. While many applications require the use of Depth Sensing technologies to create a point map of the surrounding, there are use cases which benefit from visible or Near infrared imaging. Geospatial technologies enable the visualization and understanding of spatial patterns, environmental changes, and geographical phenomena with precision. Geospatial imaging finds applications in diverse fields including urban planning, environmental monitoring, agriculture, disaster management, and navigation. By providing detailed insights into the Earth’s surface, geospatial imaging plays a crucial role in decision-making processes, resource management, and scientific research, ultimately contributing to a better understanding and management of our planet’s complex systems. Some common use cases within Geospatial imaging include Total Stations, Mobile Mapping, Monitoring, Odometry, and Augmented Reality. ## Total Stations Geospatial Total Stations are instruments designed for accurate measurements on Earth’s surface. Beyond merely assessing distances and angles, they enable the ability to capture images and collect comprehensive environmental data. 
| Lens P/N | Image circle (mm) | EFL (mm) | Imager resolution | F/# | Full FOV (°) | Distortion (%) | Optical TTL (mm) |
|---|---|---|---|---|---|---|---|
| | 6 | 15.8 | 1.3MP | 2.4 | 22 | 2 | 21.3 |
| | 6 | 3 | 1.3-2MP | 2 | 170 | -33 f-θ | 20 |
| | 4.7 | 1.55 | 3MP | 2 | 185 | -6 f-θ | 20.7 |
| | 8 | 8.50 | 3-5MP | 3.0 | 52 | -3 | 16.50 |
| | 7.2 | 7.50 | 5MP | 2.8 | 55 | -2 | 11.30 |
| | 6.0 | 3.4 | 5MP | 2.0 | 82 | 1 f-tan | 20.5 |
| | 7.8 | 7.50 | 12MP | 2.1 | 55 | -1 | 22.00 |

## Mobile Mapping Mobile mapping involves using vehicles or other mobile platforms equipped with various sensors and imaging systems to collect geospatial data as they move through an area. Mobile mapping applications typically require high-resolution lenses used to capture a surround-view 360-degree FOV using 4-6 lenses. These applications require high-resolution sensor and lens combinations to ensure the optical quality needed for large terrains. --- ## Automotive InCabin – Optics matter in the future. - Source: https://sunex.com/2023/03/31/automotive-incabin-optics-matter-in-the-future/ - Summary: One thing seems to be agreed on across the industry: In-Cabin vision applications will expand rapidly. In-cabin automotive cameras are evolving beyond basic driver monitoring to simultaneously cover occupant sensing, gesture control, gaze tracking, and child presence detection. The optical challenge is designing a single lens system that handles near-IR illumination, wide FOV, low distortion, and high MTF across a cramped cabin space — while meeting AEC-Q automotive qualification and EU GSR/NCAP regulatory requirements. This article covers optical requirements for in-cabin monitoring (ICM) systems, RGBIR lens design considerations, and how Sunex’s in-cabin lens portfolio addresses emerging automotive regulations. ## What optical requirements do automotive in-cabin monitoring systems need?
One thing seems to be agreed on across the industry: InCabin vision applications will expand rapidly, similar to the evolution of cameras pointing outwards. Exterior cameras in automotive advanced from a single low-resolution backup camera to advanced, high-resolution, and AI-powered ADAS systems. One can easily imagine that the InCabin revolution will expand from the initial focus on DMS to many vision-based applications. The industry and regulatory bodies will need and require additional vision applications supporting the progression from L2+ to L5 autonomy, and OEMs want to satisfy the demand of consumers to make their cars an essential part of their interconnected world. ## What is an RGBIR lens and why is it used for in-cabin cameras? The lens has always been an integral part of every camera system. Even though it is often seen as a commodity nowadays, the lens is a crucial driver, differentiator, and innovator for performance. No sensor or AI algorithm alone can deliver if the input signal is inadequate for the intended application. Optical design and manufacturing technologies for achieving high enough resolution for biometric recognition, enabling radial symmetric tailored distortion for super wide-angle views, and optimizing coatings for RGBIR, to list a few, require specific domain expertise that often can only be built over time and are hallmarks for experienced suppliers. Adding into the mix the progression to 5G, we will undoubtedly see InCabin video streaming and conferencing as a sought-after option for the future car as well. ## How does Sunex design lenses for NCAP and GSR in-cabin camera requirements? Sunex has been working with leading companies on the aforementioned technologies for many years. Initially, they have been regarded as groundbreaking and key differentiators in their respective industries and now have become recognized standards. The challenge is now how to adapt these for the automotive industry. 
Some of the approaches will borrow from previous solutions, e.g., RGBIR in automotive has been called Day/Night in Security for a long time, Occupant Monitoring (OMS) could potentially leverage Surround View Camera (SVC) experiences, and design expertise for finite imaging in Machine Vision (MV) drives some of the Driver Monitoring (DMS) innovations. However, balancing expected performance, size constraints, and cost targets remains one of the most crucial success drivers in the automotive industry, and this talk will discuss some of the required optics technologies, their advantages, and cost challenges for In-Cabin applications that meet regulatory requirements. This article is the abstract to the presentation given at the InCabin show in Phoenix in March 2023. --- ## Lens Requirements for AI Vision - Source: https://sunex.com/2022/10/07/lens-requirements-for-ai-vision/ - Summary: Developing machine vision or embedded AI systems and scaling them for deployment are already challenging tasks. For embedded AI vision, the lens is not a commodity — it is a direct algorithmic input. The five lens parameters that most impact AI performance are MTF (sharpness), F/# (light throughput), distortion profile, relative illumination, and dynamic range. The wrong lens adds computational overhead, increases power consumption, and can cause systems that work in the lab to fail in the field. This article covers each parameter with guidance for automotive, robotics, medical, and security applications. Developing machine vision or embedded AI systems and scaling them for deployment are already challenging tasks. Converging these into an integrated Embedded AI Vision System can keep scores of developers and engineers busy for quite some time. The selection of relevant sub-components has a significant impact on the engineering challenge and the final solution’s complexity, performance, physical size, and power consumption.
Regarding the optical path of an Embedded AI Vision System, making the right lens selection is a critical milestone and should not be seen as an afterthought, since the lens’ performance and characteristics impact almost every aspect of the downstream vision chain. The right lens choice can positively impact the required AI algorithm complexity and final system performance, whereas the wrong lens choice can hamstring subsequent development. If applied AI for embedded vision aims to replicate human sensing and understanding, then the optical stack plays a significant role in achieving human-like vision for any camera-based application. Carefully selecting the right lens will pay huge dividends when transitioning from the lab to the real world, whether the intended application is in automotive, security, medical, robotics, or any other possible implementation. For systems that will be deployed in an environment with low or changing light or are exposed to a wide temperature range, the challenges to delivering consistent optical performance increase dramatically. To support real-time processing or post-processing with AI algorithms, a lens must excel along multiple vectors. ## What lens parameters matter most for AI and machine vision systems? Choosing the right lens is a critical step in system-level optimization and setting a roadmap to achieving desired outcomes. From our experiences across industries, applications, and clients, we found the following first-order lens parameters to be critically important:

- MTF – Modulation Transfer Function is a standardized way to describe an optical system regarding sharpness and contrast and is a key performance indicator when comparing lenses. While commonly used to judge “how good” a lens is, it is not the only, and sometimes not even the best, metric for determining what will work best in a given use-case.
- F/# – Essentially, the amount of light the lens lets in for a given focal length.
The physical aperture or “Iris” of an optical system defines the total light throughput and directly impacts the lens’ ability to produce the required contrast and Depth of Field. - Distortion – The concept of distortion describes how a lens maps a shape in object space to the image plane. Distortion must be referenced to something, which is sometimes called the “projection” and may be referred to as Rectilinear, F-tan, or F-theta distortion. - Relative Illumination (RI) – This is a normalized %-value that represents the illumination of any field point relative to the point of maximum illumination, which is typically on-axis. A high RI value means “flat” illumination across the image plane, whereas a low RI may introduce dark corners or edges. - Dynamic Range – Quantifies a system’s ability to adequately image highlights and dark shadows in a scene, i.e., a wide range of lighting values or conditions in the same image, without being saturated or too dark. ## How does MTF affect AI object detection accuracy? As listed previously, MTF, or the Modulation Transfer Function, is a standardized way of comparing the optical performance of different imaging systems. For AI algorithm-based systems, it is generally advisable to select lenses that have a fairly consistent MTF across the FOV (Field of View) and show stable MTF behavior over temperature. The intended wavelength spectrum, which, depending on the application, typically includes visible (VIS) and/or near-infrared (IR), also has to be taken into consideration when comparing MTF performance. A fully athermalized hyperspectral lens (also called an RGB-IR or Day/Night lens) is a lens that checks all these boxes. Such a lens requires not only deep design experience but also an in-depth understanding of material properties, advanced coating know-how, and manufacturing process expertise that has been optimized and advanced over decades. 
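The modulation idea behind MTF can be made concrete with a few lines of code. As a minimal sketch (function names are illustrative, not from any Sunex tool), MTF at a single spatial frequency is the contrast of a sinusoidal pattern in the image divided by the contrast of the same pattern in the object:

```python
def michelson_contrast(i_max, i_min):
    """Michelson contrast of a sinusoidal intensity pattern."""
    return (i_max - i_min) / (i_max + i_min)

def mtf(object_contrast, image_contrast):
    """MTF at one spatial frequency: image contrast / object contrast."""
    return image_contrast / object_contrast

# A full-contrast target (1.0) imaged with reduced contrast:
c_obj = michelson_contrast(1.0, 0.0)
c_img = michelson_contrast(0.8, 0.2)
print(round(mtf(c_obj, c_img), 3))  # 0.6
```

A value of 0.6 here means the lens preserves 60% of the target’s contrast at that spatial frequency; datasheets typically quote MTF at several frequencies (in line pairs per millimeter) and at several field positions, which is why “consistent MTF across the FOV” is a multi-point specification.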
Not all applications require such an advanced optical system, and it will be for the application domain expert to decide which components to optimize and how far to optimize the system. However, everything that can be optimized at the lens level “at the speed of light” reduces processing time and power later, possibly resulting in smaller and more energy-efficient computing systems. It is also much easier to start with good lens performance than to recreate “missing” data to compensate for poor optical performance, since it is almost impossible to process enough to make up for data lost at the lens level (the often-cited concept of garbage in, garbage out applies here as well). ## Why does F/# matter for low-light embedded vision applications? Other optical performance parameters, such as the F/# and Relative Illumination (RI), also contribute to the consistency of imaging quality that any AI algorithm has to deal with. Unfortunately, there is no “one size fits all.” For example, the system architect has to decide whether to optimize a system for low-light performance with a low F/# or to improve the Depth of Field (DoF), which defines the range from the near to the far object distance that is determined to be in focus, with a higher F/#. In this example, a high RI may contribute to a well-balanced system where the edge illumination would tolerate some “stopping down” (increasing the F/#) to increase DoF. Lens selection can quickly become a multi-dimensional optimization with various trade-offs. Distortion is one lens spec that software developers would often like to simply make go away, since a high and consistent number of pixels per degree (px/deg) across the entire FOV would be preferred. But since distortion is unavoidable in many cases, Sunex has found ways to manipulate a lens’ distortion profile to align best with and support the algorithm-specific requirements. 
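The difference between the projections mentioned above can be illustrated numerically. In a minimal sketch (the focal length and field angles below are arbitrary example values), a rectilinear (f-tan) lens maps a field angle θ to image height f·tan(θ), while an f-theta (equidistant) lens maps it to f·θ, which keeps px/deg roughly constant across the field:

```python
import math

def image_height_rectilinear(f_mm, theta_rad):
    """Rectilinear (f-tan) projection: h = f * tan(theta)."""
    return f_mm * math.tan(theta_rad)

def image_height_ftheta(f_mm, theta_rad):
    """Equidistant (f-theta) projection: h = f * theta."""
    return f_mm * theta_rad

# Compare image heights (mm) for a 2 mm lens at 10 deg and 60 deg field angles:
for deg in (10, 60):
    th = math.radians(deg)
    print(deg,
          round(image_height_rectilinear(2.0, th), 3),
          round(image_height_ftheta(2.0, th), 3))
```

At 10° the two projections are nearly identical, but at 60° the rectilinear image height is roughly 65% larger. This is why very wide FOV rectilinear designs run out of image circle, and why fisheye-style projections (and tailored variants of them) redistribute px/deg instead.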
Our Tailored Distortion™ expertise has often been applied to SuperFisheye™ lenses to correct the barrel distortion of large FOV lenses, providing more px/deg at the field edges. Conversely, the initial goal of achieving human-like vision has led to the development of FOVEA distortion lenses that mimic the human eye by increasing the px/deg density in the center while maintaining a wide field of view for peripheral vision. Not just the 1st order lens design parameters but also the combination of lens design considerations, coating choices, and surface treatments contributes to the dynamic range of a lens, which can significantly affect performance in extreme light situations. The dynamic range of a lens or system is defined as the ratio of the largest non-saturating input signal to the smallest detectable input signal. An AI system running on a mobile autonomous platform, such as a delivery vehicle driving from broad daylight into a tunnel (or out of one), is an authentic example of an application that benefits from a lens with excellent dynamic range to support consistent results. An agricultural harvesting machine that, on the return leg, drives into the setting sun, or stationary systems such as exterior smart infrastructure or security cameras that deal with passing light sources such as vehicles or the sun, are further applications in need of a lens with high dynamic range. Many customers ask what dynamic range really means for the lens, since it is the imager (CMOS, CCD, etc.) detecting the light, not the lens. The answer is that the lens must be very good regarding stray light, glare, and ghosting. These image artifacts can be ignored for a low dynamic range sensor because the sensor will not pick them up. However, with an HDR/WDR sensor, at best, stray light reduces the image’s signal-to-noise level (contrast). At worst, the artifacts become apparent and may be interpreted as another light source, such as an oncoming vehicle, or could obscure real images. 
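The ratio definition above is often expressed in decibels. As a minimal sketch, assuming the 20·log10 convention commonly used for image-sensor dynamic range figures:

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Dynamic range of a signal span, expressed in decibels (20*log10 convention)."""
    return 20 * math.log10(max_signal / min_signal)

# e.g., a scene spanning a 10,000:1 intensity ratio:
print(round(dynamic_range_db(10_000, 1)))  # 80
```

A scene like the tunnel example can easily span 100 dB or more, which is exactly the regime where stray light and ghosting from the lens start eating into the sensor’s usable range.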
## How do I select a lens for dynamic range in high-contrast scenes? Who knows where the capabilities of embedded AI-vision systems will lead us in the future; can they ignore dust or dirt on the camera, overcome insufficient lighting, or potentially compensate for MTF changes due to thermal shift? While AI may eventually allow for more relaxed lens requirements, as the human brain does, there is still no substitute for good image quality. We can clearly see the benefits and potential that an “AI-optimized” lens can deliver to overall system reliability and consistent image quality, particularly if the AI-based vision system operates in varying environmental conditions. Needless to say, size constraints and piece price also factor in when designing the right system, and there is no need to “over-engineer” the final solution. However, the previously mentioned factors that drive image quality, when done right, have the potential to simplify the AI system complexity, optimize algorithms, and reduce system latencies and power consumption while enhancing imaging performance. Sunex already has many lens designs that combine low F/#, high Relative Illumination (RI), high dynamic range (HDR), high MTF across the field, and a broad wavelength spectrum for consistent performance across many industries and applications. Often, the initial engagement with our clients is to select possible options from our existing portfolio of lenses to get real-world feedback on what optical performance is required in the real use case. Based on this feedback, we review opportunities to adapt and optimize an existing lens or may jointly decide to pursue a purpose-built, custom lens design that matches the application and requirements. For many of our clients, we also offer greater vertical integration by designing and manufacturing the entire sensor board, including fully automated active lens/sensor alignment. 
With a 25-year track record as a global optics company, we are leveraging best-in-class design expertise and constantly driving innovation and process improvements, leading to millions of lenses delivered worldwide on time. The goal for Sunex, our clients, and all parties involved is to collaborate to develop a system that can meet the expected performance, for an acceptable price, in the time frame needed. This article was also published in a similar form by Wiley Industries in their *Inspect international 02 2022* issue. --- ## Embedded Vision Modules - Source: https://sunex.com/2022/04/22/embedded-vision-modules/ - Summary: If applied AI for embedded vision aims to replicate human understanding, then the optical stack plays a significant role. Embedded vision modules combine a lens, image sensor, and processing electronics into a compact, integrated assembly ready for deployment in robotics, ADAS, industrial inspection, and medical systems. The lens is the most underappreciated component — it determines the quality of data available to every downstream algorithm. Choosing the wrong lens at design time cannot be compensated by better processing hardware later. This article covers what embedded vision modules are, how their optical and mechanical design differs from consumer cameras, and the key lens specifications that determine AI vision system performance in production deployment. ## What is an embedded vision module and what are its key components? If applied AI for embedded vision aims to replicate human understanding, then the optical stack plays a significant role in achieving human-like vision for any camera-based application. Choosing the right lens is a crucial step in system-level optimization and setting a roadmap to achieving desired outcomes. With Sunex as a lens and technology partner, our clients can access specific lens technologies to optimize algorithms, reduce system latencies and power consumption, and enhance imaging performance. 
Sunex’s expertise and experience in manipulating distortion profiles to support algorithm-specific requirements have been valued by customers for many years. Our Tailored Distortion™ expertise has often been applied to SuperFisheye™ lenses to correct the barrel distortion of large FOV lenses. Sunex’s FOVEA distortion lenses are designed to mimic human vision. The distortion profile results in a higher pixel density in the center while maintaining a wide field of view, thus optimizing the performance of machine vision algorithms. Environments with low or changing light are a challenge for any algorithm. Sunex has lens designs that combine very low F/#, high Relative Illumination (RI), high dynamic range (HDR), high MTF across the field, and a broad wavelength spectrum for consistent performance across a variety of scenarios. All graphs are for illustration purposes only; individual lens performance can differ. ## How does Sunex support embedded vision OEMs from lens design through volume production? Sunex has developed design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications. **Fast Prototyping** We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components. **Sensor Module Capabilities** Depending on the needs and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. 
At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module. **Active Alignment Capabilities** To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance. For a list of current lenses related to this topic and to download our latest brochure, please visit sunex.com/aivision --- ## Automotive Headlamp Optics - Source: https://sunex.com/2021/08/20/automotive_lighting/ - Summary: The automotive industry is moving toward adaptive, high-resolution µLED lighting that requires advanced lens design expertise. Automotive headlamp technology has shifted from simple reflector systems to high-definition µLED projection capable of pixel-accurate beam shaping with over one million addressable pixels. This transformation places new demands on headlamp lens design: tight MTF at the projection plane, precise chromatic correction for LED wavelength combinations, thermal stability from -40°C to +105°C, and manufacturing tolerances compatible with automotive production volumes. This article covers the evolution of automotive headlamp optics from halogen to µLED HD, the key optical design challenges, and Sunex’s lens hybridization approach for next-generation HD lighting. ## How has automotive headlamp optics evolved from halogen to µLED? Lighting in automotive has come a long way. The earliest versions used candles, followed by fuel-based approaches burning carbide or oil from the late 1880s. The first electric lights were introduced in 1908, but were not at all common until the 1920s. 
However, once introduced, they quickly became the most noticeable form of “electrification” in automobiles, perhaps second only to today’s full EVs and hybrid vehicles. Optics played a role from the very beginning, in its most simplistic form. Carbide lamps used a reflective mirror inside the headlamp to increase the total light output; others used a glass cover to protect the environment from the flame or, once light bulbs were used, to keep out dirt and water. Some of the first optical headlamps were introduced in 1917 to reduce glare, increase efficiency and range, and make driving safer by illuminating the roadside and enabling driving – up to 25 miles an hour – under bad weather conditions. It is interesting that even though the technology has advanced far beyond what the early trailblazers of automotive headlamps could have envisioned, we still try to optimize for the same goals: - reduce glare - increase efficiency and range - make driving safer One key difference is that the industry has advanced from a single source and simple reflectors and light guides to arrays of thousands of individually controllable light sources. To meet the OEMs’ requirements, wow the consumer, and satisfy legislators, this new generation of light sources has to be paired with innovative projection optics. ## Why are advanced optics needed for the new HD Lighting functions? Lighting functions in and around the car play an ever-increasing role in an OEM’s brand recognition and are a signature piece for corporate design consideration. From a purely technical perspective, modern high definition (HD) headlamps are designed for two main applications: Advanced Lighting Functions and Road Projections. - Advanced Lighting Functions include Adaptive Driving Beam (ADB), glare-free HighBeam, dynamic curve light, illumination of traffic signs, and adjustments for different global regions. The overall focus is to increase safety by increasing visibility and reducing distraction. 
These systems typically cover a larger horizontal field of view (HFOV) and are optimized to bring as much light as possible onto the street. - Road Projection headlamps focus, as the name implies, on projections of symbols or information on the street to assist or warn the driver. The possibilities are almost endless and include convenience features where the car welcomes the driver, projects the OEM brand, or even plays tic-tac-toe on the garage door. More important, however, are the warning and guidance possibilities. Emergency braking, frost warning, navigation, lane markings in construction zones, or communication with pedestrians will all become commonplace on our roads soon. Tier 1s and OEMs also hope that a high-resolution headlamp system and a software-centric approach can reduce overall system complexity while simultaneously increasing flexibility; it is the promise of a single module that can adapt to different global market needs, regulations, and use cases. A software-enabled module could switch the low-beam cutoff-line profile to account for left-hand vs. right-hand driving or provide headlamp leveling capabilities. There are different technical solutions to highly pixelated advanced lighting and road projection headlamps, all with their distinct pros and cons.

| | DMD | Matrix LED | µLED | Laser | LCD |
| --- | --- | --- | --- | --- | --- |
| Resolution | 1M | 8-100 | >26K | 64K | 30K-50K |
| Image Mode | Subtractive (reflective) | Additive | Additive | Additive | Subtractive (absorptive) |
| Light Source | LED or Laser | LED | LED | Laser | LED |
| Optical Elements | 5 | 1-2 | 4-6 | 1 | 1 |

Laser and LCD systems will possibly stay niche and be used in complementary roles. DMD-based solutions were the first to enable HD lighting for a high-end segment in the automotive market. 
It remains to be seen whether the technology can transition to the mass market and how it will address the higher price tag, larger size, and a reflective imaging mode that somewhat works against optimizing the overall system for power consumption. The industry is putting high hopes into the now emerging µLED technology. It shows all the promising signs of emerging as the ideal bridge between the ultra-high-resolution systems based on DMDs and the well-established Matrix LEDs. ## How does Sunex design lenses for high-definition automotive lighting systems? The industry is moving full steam ahead with highly pixelated HD headlamp systems. The market will eventually decide whether one particular technology will become dominant or whether they will coexist. However, one thing is for sure: these systems need to be paired with automotive-grade multi-element optics to bring their performance to the road. Sunex’s design, process, and manufacturing know-how have been well known in the automotive industry for over two decades. Our experience has made us a preferred automotive supplier of multi-element optical systems for many high-volume series production programs. Building on that history and reputation, we are now successfully addressing this emerging market of HD automotive headlamps with our clients and partners. Please get in touch with us if you want to learn more. --- ## SWIR Imaging Optics - Source: https://sunex.com/2021/02/17/swir/ - Summary: Short-Wave Infrared or SWIR is an imaging technology that traditionally has been used in biological, defense, and some industrial-related applications. Short-Wave Infrared or SWIR is an imaging technology that traditionally has been used in biological, defense, and some industrial-related applications. With the ever-growing need for camera systems that can see more than the human eye, engineers turned to the wavelength spectrum beyond the visible (VIS) space. 
The infrared spectrum from 750nm to 12,000nm is broken down into two distinct sectors. The “reflected infrared” is represented by the Near-Infrared (NIR) and SWIR bands, and the “thermal infrared” is represented by Mid-Wavelength Infrared (MWIR) and Long-Wavelength Infrared (LWIR). Where the “thermal infrared” sensors detect light emitted by the object itself, NIR and SWIR sensors detect photons reflected by the object. The 900 – 2500nm band is of particular interest, as the ambient and background radiance emitted by stars during the night is a natural SWIR source, allowing for high-contrast imaging. Also, water particles become transparent in the SWIR spectrum, hence eliminating a great shortcoming of systems relying on the visible spectrum only. With technologies, materials, and manufacturing processes continuously advancing, we now see SWIR cameras in many more application areas: from agriculture, food processing, and medical imaging to quality control in automated production lines, security/surveillance, automotive, and even arts and archeology. Some of these applications leverage the release of new sensor technologies that simultaneously image the visible and SWIR wavelength bands (e.g., 400nm – 1700nm) and that offer increasing pixel count (SXGA, VGA) and resolution (pixel pitch of 5µm). The overall system cost needs to be reduced to foster wider adoption of SWIR imaging. Relying on the ability of the new sensors to cover the visible and SWIR spectrum in one camera is not enough if the rest of the system is not innovating as well. The lens needs to support, if not lead, this approach to eventually allow such broad coverage to expand the type of applications that can be addressed and the industries that can benefit from it. #### So how is SWIR imaging impacting lens design and manufacturing considerations? The material choices and design considerations must be informed by and analyzed using the appropriate wavelength spectrum and weighting. 
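The band boundaries described above can be captured in a small lookup. This is an illustrative sketch only; exact boundaries vary between sources, and the figures here follow the ranges quoted in this article:

```python
def ir_band(wavelength_nm):
    """Classify a wavelength into the spectral bands described in the text.

    Boundaries are approximate and follow the article's ranges:
    reflected IR = NIR + SWIR, thermal IR = MWIR + LWIR.
    """
    if 400 <= wavelength_nm < 750:
        return "VIS"
    if 750 <= wavelength_nm < 900:
        return "NIR (reflected)"
    if 900 <= wavelength_nm <= 2500:
        return "SWIR (reflected)"
    if 2500 < wavelength_nm <= 12000:
        return "MWIR/LWIR (thermal)"
    return "outside 400-12000 nm"

print(ir_band(1550))  # SWIR (reflected)
```

A sensor covering 400nm – 1700nm, as in the example above, therefore spans VIS, NIR, and part of the SWIR band in a single camera, which is what drives the broadband coating and design requirements discussed next.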
The design and optimization approach will also be different. Sunex has long-standing experience in hyperspectral designs that deliver consistent high performance over a vast spectrum. No matter the application requirements (VIS-only, VIS+NIR, VIS+SWIR, or SWIR-only), special considerations have to be made regarding the AR (Anti-Reflection) or BBAR (Broadband Anti-Reflection) coating to maximize the transmission of the complete lens assembly. The R&D and manufacturing process often requires additional equipment (e.g., upgrading MTF and transmissivity testers to support the wavelength range), extra know-how for stray light testing and mitigation, and advanced coating experience. High MTF, small F/#, and low aberrations are typical for a high-performing SWIR lens. The exact performance parameters are typically driven by the application’s needs and include the sensor choice and specific customer and end-user requirements. Please contact Sunex to discuss your application-specific requirements, our available products & designs, and prototyping & manufacturing capabilities. Credits: Portrait Images (VIS, NIR, SWIR): NickSpiker, CC BY-SA 4.0, via Wikimedia Commons --- ## HOW ARE LENS ELEMENTS MADE? - Source: https://sunex.com/2020/12/14/how-are-lens-elements-made/ - Summary: Digital imaging lenses are often made of multiple lens elements to achieve the required specifications. How are those lens elements made in mass production? Digital imaging lenses are often made of multiple lens elements to achieve the required specifications. How are those lens elements made in mass production? The answer to this question depends on the type of element chosen by the lens design engineer. Three types of lens elements are commonly used today. They are polished glass elements, molded plastic elements, and molded glass elements. Each type of element has its own unique process steps and capabilities. 
A good understanding of the manufacturing process for each type can help to ensure that the lens design can be successfully realized in practice. - Polished glass: This type of element is the best known and most common. The elements used by Galileo in his refractive telescope were made by this method. Though the basic idea of how to make a polished glass element hasn’t changed for centuries, a lot of progress has been made in the speed and precision of the manufacturing process. The steps in making a polished glass element involve first making a blank from the chosen glass material type. Today there are hundreds of different types of glass materials available from several major optical glass manufacturers. Each glass type has different optical, mechanical, chemical, and thermal properties. It is the task of the lens designer to choose the appropriate glass material based on the end application requirements. Once the glass type is chosen, there are two ways to make a blank. For small quantities, one can CNC machine a rough, oversized shape resembling the final element. For higher-volume production, the glass manufacturer can provide a molded blank having the approximate shape. In either case, the glass blank is then mounted on a grinding machine, where a grinding wheel forms the two surfaces of the element to the required radii. After grinding, where the surface radii are approximately formed, a final polishing step makes each surface smooth and shiny. An optional optical coating is then applied to each surface using an optical coater. The final step is to center the element on a centering machine, where a precise diameter is achieved by grinding down the excess material. This process has been perfected over many centuries, and its capabilities are amazing: element sizes ranging from small endoscopic lenses to large telescope mirrors are made using this basic process. 
However, because of the grinding and polishing processes, only flat or spherical shapes can be made. If a lens design calls for an aspherical element in either glass or plastic, we need to use the following processes. - Molded plastic: Optical plastic materials offer very attractive properties for certain applications such as mobile imaging cameras. The predominant manufacturing process for plastic is injection molding. The basic process of injection molding is to melt the plastic resin so it can flow freely. This fluid is injected under high pressure into a precision mold cavity. The shape of the mold cavity determines the shape of the lens element. Once the mold cavity is filled, it is cooled to allow the material to harden. Once the material is solid again, it is removed from the mold. Depending on the size of the element, the cycle time can be as short as a minute. Multi-cavity molds are often used to achieve high-volume production. These characteristics of injection molding allow for the economic production of lens elements with a high degree of precision. A major technical benefit of injection molding is that almost any shape can be molded, as long as the mold cavity can be fabricated in the first place. This freedom allows lens designers to use aspherical or freeform profiles in the lens design, opening up additional performance advantages such as compact size and high image quality. Today, almost all mobile camera lenses are made by the injection molding method. A major disadvantage of plastic materials is that their index of refraction depends on the environmental temperature. This sensitivity to temperature makes plastic elements unsuitable for some applications where an extreme temperature environment is expected. There are design techniques one can apply to “de-sensitize” lens designs with plastic elements. Sunex has successfully commercialized many hybrid lenses using a combination of plastic and glass elements with such a design strategy. 
- Molded glass: The glass molding process overcomes the limitation of the glass polishing process outlined above (only flat or spherical surfaces can be polished). Glass material can be heated to a point where the material becomes soft. This softening point varies from glass to glass; a typical glass softening temperature is 500°C to 600°C. One starts with a glass preform made using a machining or polishing process. The preform is then heated to the softening point and compressed between the two cavity surfaces of a mold. This forces the glass preform to take on the shape of the mold cavity surfaces. Once the preform is sufficiently cooled, the molded element is removed from the glass mold. The key technical advantage is that the surface shape can be aspherical or freeform while the material can be glass instead of plastic. Glass materials have much better temperature and environmental stability, making them suitable for a wide variety of applications where extreme temperatures or harsh conditions may be encountered. The major disadvantage is that the molding process for glass can have a longer cycle time compared with that of injection molding because of the higher softening temperature required for glass. There are also limitations to the shapes and materials that can be processed. Thus, molded glass elements are only used in applications where this increased cost can be justified. If a design must use molded glass, it is preferable to minimize the number of molded glass elements to lower the manufacturing risk and cost. For a lens design to be successfully produced in mass production, it is important to choose the best design strategy based on manufacturing process considerations. Should the design be all polished glass elements? Should it contain molded plastic or glass elements? If yes, how many and where? Are the material choices and shapes manufacturable by their respective processes? Can the required tolerances be achieved? How to optimize the overall cost and yield? 
These questions require in-depth process knowledge not captured by today’s lens design software such as OpticStudio or CODE V. Talk to our engineers about your application requirements. Let us help you to make your design more manufacturable. --- ## Basic Thread Considerations for Lens/Holder Pairing on Board-Mount Cameras - Source: https://sunex.com/2020/09/03/basic-thread-considerations-for-lens-holder-pairing-on-board-mount-cameras/ - Summary: The seemingly humble thread is much more complex than it appears; at least as it applies to lenses. At Sunex, we get many questions about lens and mount (holder) threads when it comes to board-mount lenses. In this post, we will explore some of the basic considerations of threaded lens and mount combinations, as opposed to some other common methods, which we will touch on. Although this post will not be a detailed tutorial on the mechanical engineering theory of thread specifications, a basic understanding of thread specs and tolerances helps to understand the considerations involved. Likewise, the advantages or disadvantages of choosing one mounting method over another will not be discussed here but will be covered in a future post. **Basics of Thread Tolerance:** ISO threads are defined by the nominal diameter of the thread, such as M8, M10, or M12. These specific examples were intentionally chosen as they are the most common board-mount lens types. They represent lenses whose nominal diameter at the thread section is 8mm, 10mm, and 12mm, as shown below in **Figure 1**. M12 in particular is an extremely common thread size for general-use lenses in applications such as Security, Automotive, Video-Conferencing, Drones, and Computer Vision. **Figure 1 –** *Lens and Mount Thread Specifications* After defining thread diameter, it is necessary to define the thread *pitch*. Pitch, as you probably surmised, is the distance between the peaks (or valleys) of the thread teeth; see **Figure 2**. 
To avoid binding, it is imperative that the pitch of the lens thread match the pitch of the holder. Most commonly, M8 lenses are 0.35mm pitch and M12 lenses are 0.5mm pitch, although any combination is possible and may be customizable, depending on the application. Once thread size and pitch are defined, this is enough to *nominally* define the thread spec, but it is not enough to produce actual parts or to ensure a good fit. This is because, as always, mechanical tolerances dictate how loosely or tightly the parts fit. In the case of thread fit, however, it is unfortunately not as simple as a smooth inner vs. outer bore fitment, because there are effectively many mating surfaces involved. There are also issues unique to the thread, such as backlash or “wobble”. More on this later, but for now, we must understand that in addition to thread diameter and pitch, there are also ISO thread tolerance “bands” which contribute to fit and are illustrated below in **Figure 3**. Note that internal threads are always capital letters and external threads are lower-case. Also take note that although the threads do not (and cannot) overlap, there are line-to-line combinations possible in the spec. Taken together, we can see the full thread spec as it applies to lenses and mounts in **Figure 4**. **Figure 2 –** *Thread Terminology* **Figure 3 –** *ISO Thread Tolerances* **Figure 4 –** *Thread Spec Format* **Considerations for Lens Mounting** Now that we have a basic understanding and common terminology to specify our lenses and mounts, fitting lenses and mounts should be straightforward, right?! Well, let’s dig deeper into how the thread spec applies to lenses and the potential challenges that can arise. The threaded board-mount method of mounting lenses is common, convenient, and lends itself well to the universal fitment of off-the-shelf lenses. First and foremost, however, we need to remember that by definition threads must have some play in order to move with respect to each other. 
This “play” in the thread is the root of almost all of the potential fit challenges faced by customers using a board-mount configuration. Even if the internal and external threads are specified properly, and the parts are within spec, the tolerance bands leave some room for variation. If the fit is too loose, it will result in excessive wobble and backlash, which can contribute to difficulty holding focal position, either in Z or in the tip/tilt axes. If the thread fit is too tight, it will lead to difficulty in reaching focus at the nominal back focal distance or may create debris due to excessive friction or mechanical interference in the thread. Second, there is a bit of a disparity between mechanical specifications and optical specifications to consider. While typical thread tolerances (and lens mechanics, for that matter) are often measured in tenths or hundredths of millimeters, optical focal length tolerances are typically measured in *thousandths* (microns). This means that the tolerance bands (**Figure 3**) for specifying threads may not be as precise as seemingly necessary to maintain optical tolerances for a given use-case. Luckily, however, threads are continuously variable, and as long as there is no excessive wobble or backlash between the threads, a thread is still a very precise way to focus a lens. Let us take a simple example: if the depth of focus of a lens is +/-10um and the pitch is 0.5mm, each 360° revolution moves the lens 500um. So the +/-10um window (20um total) is 1/25th of a turn, or 14.4°, which is easily resolvable in practice. Yet, if there is excessive space between the threads, one can clearly see that the amount of play in the threads could easily overshadow this number and exceed the depth of focus of the lens, making fine focus difficult. The third consideration is the materials themselves. While any common material may work well for either the mount or the lens with proper fitting, there are some considerations here as well.
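The pitch-to-rotation arithmetic in the depth-of-focus example above is easy to sanity-check in a few lines. This is a generic sketch (the function name is ours, not a Sunex tool); the inputs match the ±10µm / 0.5mm example in the text.

```python
# Convert a focus tolerance into the equivalent rotation of a threaded lens barrel.
# Generic arithmetic sketch; pitch and depth of focus match the example above.

def focus_window_to_rotation_deg(depth_of_focus_um, pitch_mm):
    """Degrees of barrel rotation spanning a +/- depth-of-focus window."""
    travel_per_turn_um = pitch_mm * 1000.0   # one full turn moves the lens by the pitch
    window_um = 2.0 * depth_of_focus_um      # total window, e.g. +/-10um -> 20um
    return 360.0 * window_um / travel_per_turn_um

angle = focus_window_to_rotation_deg(depth_of_focus_um=10, pitch_mm=0.5)
print(f"{angle:.1f} degrees of rotation")   # prints 14.4 degrees, i.e. 1/25 of a turn
```

Any thread play larger than this rotational window will dominate the depth of focus, which is exactly the "excessive wobble" problem described above.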
Thermal performance requirements aside, metals are relatively incompressible and metal-to-metal fitment can be tricky. Tightly matched metal parts tend to create debris or galling, and slightly looser tolerances may be necessary to achieve the proper fit; something to keep in mind! Metal-on-plastic, on the other hand, works well in most instances and can generally be a bit more snug, but can also create excess debris or “wedging” if fit too tightly. Finally, plastic-on-plastic is a relatively forgiving configuration in general but may have second-order thermal-stability limitations to consider. For all of the above reasons, you will often see fitment pairings of **h** and **G** for off-the-shelf lenses. This ensures at least some gap exists (worst case) in order to avoid the possibility of a line-to-line fit while maintaining a reasonably tight match. These are not the *only* considerations depending on the application, but these three are common to a majority of use-cases. **So What is the Answer and are There Other Options?** We can now see that the seemingly humble thread is much more complex than it appears; at least as it applies to lenses. So, what can be done to mitigate the risks associated with assembling and focusing board-mount lenses? - Engineering and support. We’ve hopefully demonstrated that mounts are not trivial, so let Sunex help you match the proper mount to the lens you selected. As alluded to above, there are a number of other case-specific considerations that Sunex can help you work through. Sunex has a variety of mounts for exactly this reason, and can help you pick the right one! By the way, it’s always a good idea to use a mount from the same supplier you purchase the lens from. There are also a number of “best practice” techniques for focusing that Sunex can help walk you through. - Sunex can build a sub-assembly for you. For mass-production applications, let Sunex do the hard part!
Sunex can assemble the lens to the mount and check for fit at your nominal focal position. Not only does this ensure a good fit, but because Sunex assembles, inspects, and packages everything in a cleanroom, it will typically reduce debris substantially at final assembly on your sensor board. - Precision Mount Concept. Do you need to up your alignment game in addition to just getting a good fit and focus? As we discussed above, threads must have some play in order to avoid excessive friction and the problems that come along with it, but too much play can ruin your image quality. Sunex’s Precision Mount Concept can reduce tip and tilt by up to 50% while simultaneously reducing reliance on thread fit, thus avoiding the need for tight tolerances and individual fitting. The key is in the matching of lens design to mount design. While several off-the-shelf Sunex lenses already have this feature, Sunex can customize any lens in our portfolio to be compatible with our Precision Mount designs. - Finally, as resolution goes up and pixel pitch goes down, precision alignment becomes paramount. In rough terms, above 5MP resolution and/or below 2µm pixel pitch, threads and their associated tolerances begin to reach the limits of the precision required for best optical performance. In this regime, Active Alignment (AA) may be the answer. Sunex has a range of technologies to actively align our lens to your chosen sensor, taking the thread completely out of the equation. This process is typically most suitable for mid-to-high volumes and high resolutions, but it may add value for a multitude of other use-cases as well. More information is included in one of our previous blog posts on AA. Please reach out to Sunex using the Contacts link above if you need assistance with properly fitting, aligning, and focusing a board-mount lens and mount.
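The rule of thumb in the last bullet (roughly 5MP resolution and 2µm pixel pitch) can be written down as a trivial check. This is purely illustrative: the thresholds are the rough figures quoted above, not a formal Sunex specification, and the function name is our own.

```python
# Rule-of-thumb check from the post: above ~5MP and/or below ~2um pixel pitch,
# thread-based focusing starts to run out of precision and Active Alignment (AA)
# is worth evaluating. Thresholds are illustrative, not a formal Sunex spec.

def consider_active_alignment(resolution_mp, pixel_pitch_um):
    return resolution_mp > 5.0 or pixel_pitch_um < 2.0

print(consider_active_alignment(8.3, 2.1))   # True: resolution pushes past the threshold
print(consider_active_alignment(2.1, 3.0))   # False: threads are typically sufficient
```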
--- ## An Optics Platform for Automotive Vision - Source: https://sunex.com/2020/08/08/an-optics-level-platform-approach-to-automotive-vision/ - Summary: Taking a platform approach at the lens component level can deliver quality optics in support of faster innovation cycles. New technologies, new application requirements, and new market segments are challenging every automotive vision program to consider and balance performance vs. cost, glass vs. hybrid, and off-the-shelf vs. custom design. Taking a platform approach at the lens component level can deliver quality optics in support of faster innovation cycles, increased manufacturability, and optimized lifetime cost and reliability at the system level. Before we answer how Sunex arrived at the concept of an Automotive Digital Imaging Optics Platform, we need to go back in time. During the early stages of camera integration in consumer vehicles, the only questions were if and how it could work. Oh, yes, and whether it could operate reliably over the lifetime of the car! Fast forward a decade or so: the questions were answered to satisfaction, and a typical project discussion often focused on balancing required performance with the expected target piece price. Finding an optimum in this “2D-space” was an iterative process, but typically a custom design solution was found. However, optimizing for cost or a specific performance spec is often realized through tradeoffs in other areas, moving the optimum solution from the somewhat “simpler” 2D-space into a 3D- or multi-dimensional challenge. Any decision made has to be balanced, and the associated risk or impact has to be evaluated. A suggested change in Relative Illumination (RI) or MTF is probably evaluated differently depending on whether the application is purely viewing (e.g., a backup camera system) or a safety-critical ADAS system based on computer-vision algorithms.
Similarly, if you optimize for size and consider the use of an injection-molded aspherical element to reduce the TTL of the lens, that potentially increases cost and lead-time for tooling. Today, the challenges are still very much the same. However, the vector space has increased, and everyone tries to increase performance, lower cost, and reduce risk, all at the same time. Adding to the mix is the fact that every OEM, every Tier1, and every program has specific requirements. The most common denominator in some cases is the CMOS sensor, since every now and then a specific type of sensor (resolution and pixel pitch), or a specific vendor PN, becomes a standard. But that is often where the similarities end. Multiple RFQs for the same program, though similar, have enough differences, due to existing or already chosen system components such as the ISP or camera HW platform, that they require individual digital imaging optics solutions. Over the years Sunex has developed an extensive library of proprietary lens designs that optimize the performance of digital imaging systems. We leverage our deep design and engineering experience and capabilities, and the collaborative dialogue with our clients’ design teams, to develop the best overall balance between performance, size, cost, and manufacturability. Our expertise in all-glass and hybrid lens designs and our manufacturing excellence have enabled us to deliver high-quality digital imaging optics in high volume to the automotive industry. So it all seems fine, right? Kind of; we see that cost pressure and the request for faster time-to-market continue to put pressure on the entire supply chain, and the traditional development cycles and collaboration models are potentially limiting the needed innovation. Why do we get, for the same program, an RFQ requesting an HFOV of 183° from Tier1 ‘A’ and 185° from Tier1 ‘B’?
Why create two completely different designs for a Surround View Camera (SVC) and an Occupant Monitoring System (OMS) for the same OEM? The reason, in our opinion, lies in the industry’s long-established collaboration model (aka hardly any collaboration at all). Its origin can probably be traced back to the time when OEMs completely outsourced the design and manufacturing of every single component that went into a car and merely defined system-level requirements to match the individual components as well as possible. A lot of the design and decision power went to the Tier1s, and, paired with fierce competition throughout the entire supply chain, this led to a sequential innovation model limited to the requirements of a given program. Over the last couple of years, we have all witnessed huge growth in the number of automotive camera systems across market segments and applications. There was also a brief moment when everyone thought (or wished) that level 4 and 5 autonomy was just around the corner, and we would all soon binge-watch our favorite shows while commuting hands-free back from the office. It didn’t happen, at least not yet, but it forced the industry to rethink how to accelerate innovation. The model that emerged is somewhat collaborative, in the sense that the Tier1 is still at the nexus, but Tier2s and OEMs have started to talk to each other again. Roadmap discussions, the aligning of future technologies, and the general intent from everyone to better understand where innovations are heading are all positive. Though a great improvement over the sequential model, it still lacks the coordinated and timed effort of true collaboration. The evolution, in Sunex’s view, is to move towards an agile collaboration model, where experts from OEM, Tier1, and Tier2s come together for the appropriate amount of time to accelerate the iteration towards the best-balanced solution.
Once the program boundaries and requirements are established, every party can revert back to the established collaboration model to execute their individual tasks within their teams, and to allow for efficient project management and execution. As they say, “the proof is in the pudding,” and we recently got a first glimpse of how it could work when two large German OEMs collaborated on defining an entire automotive camera platform. It eventually failed, but the collaborative approach was the right one, and we hope to see more of it in the future. Now there it was: the industry tried to create a platform. That wasn’t necessarily the beginning for Sunex, since years earlier we had already developed and promoted a lens family for specific sensor classes (e.g., the 4k, 2.1um pixel pitch automotive sensors). The basic idea is that for a given sensor class there is an available set of lenses that have very similar lens performance parameters but differ in one key spec such as FOV. This allows for potential standardization on a specific sensor at the Tier1 level and creates confidence in expected lens performance, since all members of a lens family perform similarly. This leads to faster POCs, faster time-to-market, lower overall risk, and lower overall system cost through higher solution efficiency. Being fully aware that not every market segment has the budget for a high-end solution, we used our design and manufacturing expertise to add a “good-better-best” choice to the lens family approach. This approach speaks to intentional design choices that find different solutions in the 3D-space of Performance, Cost, and Risk that was described earlier. A practical example is our offering of FOVEA lenses for forward-looking ADAS applications that are available in different all-glass versions, but also as an automotive-grade hybrid lens solution.
If we now combine the lens family and the “good-better-best” approach with the observation that many RFQs for the same program only differ slightly, and that some applications are very similar at their fundamental core, one can understand that the earlier-discussed agile collaboration model is the last missing piece needed to allow for true innovation based on an Automotive Digital Imaging Optics Platform. So how does the platform compare with a custom design or an off-the-shelf solution? An individual custom design has the obvious disadvantage that every new RFQ incurs the same initial investment, lead-time, and risk. The off-the-shelf solution is often an initial request from our clients due to its perceived advantages in cost, availability, and reduced risk. However, for high-volume automotive applications, we often see that an off-the-shelf solution simply has too little alignment with the complete set of requirements, and a decision in favor of a custom solution is made. But how can a platform compare in performance with a custom solution? The answer is strikingly simple. A complete platform has enough commonality across the platform to deliver on lower cost, faster time-to-market, and reduced risk of implementation, but enough flexibility to create a technical advantage for different applications. The right platform informs and guides future RFQs, and the question of an HFOV of 183° vs. 185° for the same program simply doesn’t exist anymore. In the long run, the platform approach will always strike a better balance compared with an individual solution.
*This article is based on the talk “Optimizing for All – An Optics Level Platform Approach to Automotive Vision” given by Ingo Foldvari, Sunex Director of Business Development, at AutoSens 2019 in Brussels.* --- ## Relative magnification and aspect ratio - Source: https://sunex.com/2020/07/31/imaging-lens-relative-magnification-and-aspect-ratio/ - Summary: The magnification of an imaging lens changes with the object distance as well as with the field angle (i.e., where in the field the object is placed). It is well-known that the magnification of an imaging lens changes with the object distance (the distance from the object to the lens). However, it is less known that the magnification changes with the field angle as well (i.e., where in the field the object is placed). Additionally, the off-axis magnification can be asymmetric. In other words, the tangential and sagittal magnifications can be different. This causes shape deformation for off-axis objects. For example, a golf ball off-axis may have an oval-shaped image. In this article, we discuss the concepts of “relative magnification” and “aspect ratio,” useful for characterizing the shape deformation of off-axis objects. An understanding of this topic will help to optimize lens designs for machine learning applications. For an on-axis object at infinity, the image size is the product of the lens effective focal length (EFL) and the angle (in radians) subtended by the object from the entrance pupil position. When this object is moved off-axis, the image size depends on the lens mapping function h(*θ*), where h is the image height on the image plane and *θ* is the field angle. The derivative dh/d*θ* provides the tangential magnification (or meridional magnification). The definitions of tangential and sagittal orientation are shown in this figure. We can define the relative tangential magnification as the tangential magnification divided by the on-axis magnification.
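As a numerical illustration of this definition (a generic sketch, not the Sunex wizard; the function names and the 2mm EFL are arbitrary choices of ours), one can approximate dh/dθ with a central difference and divide by the on-axis magnification, which for an object at infinity is the EFL:

```python
import math

# Relative tangential magnification: (dh/dtheta) / EFL, evaluated numerically.
# Illustrative sketch only; the mapping functions are idealized textbook forms.

def rel_tangential_mag(h, theta, efl, d=1e-6):
    """Central-difference dh/dtheta divided by the on-axis magnification (EFL)."""
    return (h(theta + d) - h(theta - d)) / (2 * d) / efl

EFL = 2.0  # mm, arbitrary example value

f_theta     = lambda t: EFL * t            # f-theta mapping: h = EFL * theta
rectilinear = lambda t: EFL * math.tan(t)  # pinhole mapping: h = EFL * tan(theta)

print(rel_tangential_mag(f_theta, math.radians(60), EFL))      # ~1.0 at any field angle
print(rel_tangential_mag(rectilinear, math.radians(60), EFL))  # ~4.0 (= 1/cos^2 at 60 deg)
```

The contrast between the two mappings (constant vs. rapidly growing tangential magnification) is exactly the distinction drawn in the discussion of mapping functions below.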
In the special case where the lens mapping function is simply h(*θ*) = EFL·*θ*, the tangential magnification is a constant, independent of the field angle (but the sagittal magnification is not). Such a lens is known as an f-*θ* lens. Other well-known forms of the lens mapping function include: rectilinear, stereographic, equisolid angle, and orthographic (see this Wikipedia article). It is worth noting that most real-world lenses do not follow these equations precisely. At Sunex we have developed a concept called “rectilinearity” as a generalized parameter to characterize the entire set of lens mapping functions. Please contact us if you are interested in learning more about this. Most lenses on the market today have axial symmetry. Based on this symmetry, it can be shown that the sagittal magnification is h(*θ*)/sin(*θ*). The relative sagittal magnification is defined as the sagittal magnification divided by the on-axis magnification for axially symmetric systems. The ratio of relative tangential magnification to relative sagittal magnification is then defined as the “aspect ratio.” If the aspect ratio is 1 at all field angles, the shape of the object is invariant across the field of view of the lens. Such a lens may help to improve the accuracy of machine learning algorithms. Let us study a few special cases: - The most common type of lens mapping function is known as rectilinear. The mapping function is based on a pinhole model where the image height is proportional to the tangent of the field angle. Examples of rectilinear lenses include most photographic lenses, including smartphone lenses. Both relative tangential and sagittal magnifications increase with field angle, but not at the same rate. The tangential increase is faster, resulting in an aspect ratio that also increases with the field angle. The practical effect is that a person near the horizontal edge is stretched widthwise.
- A stereographic lens: both tangential and sagittal magnifications increase with the field angle, but at the same rate. The aspect ratio is invariant across the field of view of a stereographic lens. - An f-*θ* lens has constant tangential magnification. However, the sagittal magnification increases with field angle. As a result, the aspect ratio decreases with increasing field angle. Off-axis objects become “elongated” sagittally. - An equisolid angle lens has decreasing tangential magnification but an increasing sagittal one, resulting in the product of the two being constant. Though off-axis objects are deformed, they occupy approximately the same number of pixels on the image plane. If the machine learning algorithm requires a minimum number of pixels across the field of view to function, an equisolid angle lens may be useful. We have developed an online wizard to calculate the relative magnifications and aspect ratio at the HFOV, VFOV, and DFOV points for all lenses with known rectilinearity. Try this wizard out! --- ## Hydrophobic and Oleophobic Coatings - Source: https://sunex.com/2020/07/22/hydrophobic-and-oleophobic-coatings/ - Summary: Hydrophobic coatings repel water; oleophobic coatings repel oil and fingerprint contamination. For camera lenses deployed in outdoor, automotive, medical, and consumer environments, these coatings are the difference between an image that stays clear through rain or a surgical procedure and one that degrades from first contact with contamination. This article explains how hydrophobic and oleophobic coatings work, how they are applied and measured, which Sunex lens applications benefit most, and the tradeoffs between coating durability, optical transmission, and cost. ## What is a hydrophobic coating on a camera lens, and how does it work? Oleophobic refers to the physical property of a molecule that repels oil.
Fluorocarbons are deposited onto the substrate to create a monolayer that repels oil and water. An additional benefit of evaporative oleophobic treatments is hydrophobicity, or an inherent water-shedding capability. This benefit is true of all evaporative oleophobic treatments, but not necessarily inherent in all hydrophobic treatments. The key differentiator between oleophobic and hydrophobic treatments is measured through contact angle and surface energy. Oleophobic treatments typically have a contact angle of 105-120° as measured with a goniometer, whereas hydrophobic coatings typically have a contact angle of ≤95° (contact Sunex for information on hydrophobic coatings with improved contact angles). Currently, hydrophobic and oleophobic coatings are widely used. Among the most prevalent users of this coating technology is the consumer device industry, in touch screen displays. The automotive industry has adopted hydrophobic coatings as a de facto standard for exterior vehicle lenses, such as in Surround View Cameras. The medical imaging field is beginning to utilize hydrophobic and oleophobic coatings; an example would be the use of these coatings on endoscope lens ends. Endoscopic lenses obviously see a great deal of liquids and oils during their use, and having an oleophobic coating on the ends of these lenses has been proven to help physicians. ## What is a hydrophobic coating, and why is it needed for imaging optics? (Figure: droplets collide, coalesce, become heavy, and roll off; the self-cleaning feature of an oleophobic coating.) A coating is described as hydrophobic when the surface tension of the applied liquid is greater than that of the substrate; conversely, hydrophilic behavior occurs when the surface tension of the applied liquid is less than that of the substrate. Hydrophilic coatings are often applied on the inside (L1S2, the backside of the first lens element) to ensure any moisture spreads out into a thin film and does not bead up.
The result is a layer of condensed water that remains clear, with no (or less) image disturbance. In comparison, a hydrophobic coating repels water droplets from the surface, allowing rain to run off the lens outer (L1S1) surface. Fluorinated materials repel both water and oil. ## How are hydrophobic and oleophobic coatings applied to lens elements? Currently, oleophobic and hydrophobic coatings are applied through vacuum and/or vapor deposition processes. Vacuum deposition of oleophobic and hydrophobic materials occurs in vacuum-pumped chambers where an energy source (typically an electron beam) vaporizes a specific dielectric material that deposits onto the substrate’s outer layer. Once the appropriate thickness of dielectric material has been deposited on the substrate, the dielectric source is closed off, and the hydrophobic material rotates into place and is vaporized. While airborne, the evaporated material creates a permanent chemical bond with the already deposited dielectric. Due to the thin nature of the coating, there is a negligible effect on the optical properties of the substrate. ## How do you measure the effectiveness of a hydrophobic or oleophobic coating? The simplest way to measure the efficiency of an oleophobic or hydrophobic coating is through the use of a goniometer, an instrument that measures the contact angle of a drop of liquid. The contact angle of a drop of liquid can be measured by producing a drop of liquid on a solid. The angle formed between the solid/liquid interface and the liquid/vapor interface is referred to as the contact angle. The most widely accepted method for measurement involves looking at the profile of the drop and measuring two-dimensionally the angle at the three-phase line, as shown in the graphic. **Hydrophobic Coating Characterization: Roll-off Volume** This test is used to determine the droplet volume that will cause three droplets (of a particular mixed solution) to roll off the surface.
A sample test procedure is outlined below: - Mixed solution: Reagent A : Reagent B = 1:2 - Reagent A: mixture of 5g A1 ultrafine test dust + deionized water - Reagent B: mixture of NaCl + CaCl2 + NaHCO3 + H2O - Lens is mounted on a 45° incline - Droplets of the mixed solution are placed on the lens surface - Test criteria: what is the droplet volume when 3 droplets roll off the surface? - Test result: - 5ml droplet rolled off the surface - 4ml droplet did not roll off the surface --- ## Sunex Part Number Tutorial - Source: https://sunex.com/2020/06/18/sunex-part-number-tutorial/ - Summary: Sunex has many online tools designed to help you find a lens according to your specifications that is best suited to your application. Sunex has many online tools designed to help you find a lens according to your specifications that is best suited to your application. Once a lens is identified based on first-order parameters, there can be many options or variants of that part number (PN). It is therefore useful to know how to interpret Sunex part numbers, and the PN itself can provide insight into the lens’ performance or intended use. The intention of this article is to explain how our part numbers are structured and what the different variations indicate in order to help you fine-tune your choice of lens, options, and configuration. Sunex part numbers are structured in the following way: DSLXXX[version letter]-[filter cut off]-F/#-[options, if any] For example purposes, the lens DSL947A-700-F1.6 will be used for reference. **DSLXXX**[version letter]-[filter cut off]-F/# **DSL947**A-700-F1.6 - We will start with DSLXXX as highlighted above. This portion of the full part number refers to the lens assembly and the optical design itself. All part numbers with the same “DSLXXX” will have the same optical stack and thus the same first-order parameters. Using our example for reference, all lenses starting with DSL947 share the same optical design. This is called the base lens PN.
All other alphanumeric identifiers after DSLXXX are variants of this same optical design. DSLXXX**[version letter]**-[filter cut off]-F/# DSL947**A**-700-F1.6 - The letter after the DSLXXX indicates a *mechanical* “variation” of the lens assembly, and represents a modification that is independent of the lens optical design. Typically, this modification indicates a minor mechanical difference. For example, DSL947A-700-F1.6 has an M8x0.5 threaded barrel, while DSL947B-700-F1.6 has an M12x0.5 threaded barrel. The difference in letter versions in this case is due to the thread of the barrel. Please take note that a specific letter itself is not unique to a particular change (e.g., not all “A” PNs are M8x0.5 thread), and the difference has to be noted by evaluating the drawing. It’s also important to note that these different “letters” are **not** *revisions*. A *revision*, as in “rev B,” is an engineering change to a PN which renders the previous revision obsolete and is not part of the product PN itself. - Possible differences can be inferred from the description and/or drawing. Some possible mechanical modifications that would require a different letter version: - Barrel thread - Flange/threaded version - Specific barrel differences (length of barrel, unique features, DMC code, etc.) DSLXXX[version letter]-**[filter cut off]**-F/# DSL947A-**700**-F1.6 - The filter cut off indicates whether the lens assembly includes an IR cut-off filter and at what wavelength. The filter can be coated on one of the lens elements of the assembly, or could be included as a separate coated flat glass (if relevant for your application, contact Sunex to determine where the filter is located within an assembly). Examples of some of our filters can be found online.
- Some possible filter designations include: - -NIR: no IR-cut filter (absence of a cut-off filter) - -650: 50%T IR cut-off at 650nm (visible transmits) - -700: 50%T IR cut-off at 700nm (visible transmits) - -BP850: bandpass with T50% @850nm (visible light blocked) - -BP940: bandpass with T50% @940nm (visible light blocked, IR transmission peak) - -IRC40/41: dual bandpass filter (visible transmits, IR transmission peak); the width of the IR transmission peak varies with different IRC designations - For detailed specifications, please refer to the specific part number drawing. Please contact Sunex if you are interested in custom filter options. DSLXXX[version letter]-[filter cut off]-**F/#** DSL947A-700-**F1.6** - The final standard designation in the part number is the f-number (f/#). By definition, the f-number of an optical system is the ratio of the effective focal length to the diameter of the entrance pupil. A low f/# indicates a larger aperture, and a high f/# indicates a small aperture (f/1.6 has a much greater aperture and lets in more light than f/8). All optical designs are designed for a particular f/# depending on the initially intended application of the lens. Lenses can be modified to increase the f/# relatively easily; you may notice that Sunex already offers some lenses with different apertures for the same optical design. However, it is much more difficult, and sometimes impossible, to decrease the f/# (open the aperture) beyond the nominal design f/#. DSLXXX[version letter]-[filter cut off]-F/#-**[Options]** DSL947A-700-F1.6-**HP3** - Finally, there is one more optional appendix. If any alphanumeric characters appear after the standard PN, this typically represents a special add-on, or possibly a customization option which is outside of our standard offering. There are several possibilities for this, but the most common are HP3 hydrophobic coatings, special identifiers, or custom binning options, to name a few.
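Because the scheme above is regular, a part number can be split into its fields mechanically. The sketch below is an unofficial illustration based only on the structure described in this article; it is not a Sunex tool, and catalog PNs with unusual designations (e.g., IRC40/41) would need extra handling.

```python
import re

# Parse a Sunex-style part number into its fields, following the structure
# described above: DSLXXX[version]-[filter]-F/#[-options].
# Unofficial sketch; real catalog PNs may have variations not handled here.

PN_RE = re.compile(
    r"^(?P<base>DSL\d+)"          # base lens PN (the optical design)
    r"(?P<version>[A-Z]?)"        # mechanical version letter
    r"-(?P<filter>[A-Z]*\d+|NIR)" # filter designation, e.g. 700, BP850, NIR
    r"-F(?P<fnum>[\d.]+)"         # f-number
    r"(?:-(?P<options>\w+))?$"    # optional add-on, e.g. HP3
)

def parse_pn(pn):
    m = PN_RE.match(pn)
    return m.groupdict() if m else None

print(parse_pn("DSL947A-700-F1.6"))
# {'base': 'DSL947', 'version': 'A', 'filter': '700', 'fnum': '1.6', 'options': None}
```

A helper like this makes it easy to, for example, group catalog entries by base PN (same optical stack) or filter designation.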
Many of the lens part numbers listed online are considered “standard,” but as can be seen throughout Sunex’s lens catalog, many modifications can be made to a particular lens assembly, such as modifying the barrel mechanics, changing the filter, or changing the f/#. Changing the base PN (DSLXXX) means changing the first-order optical parameters and represents an entirely new lens assembly and thus a new lens part number. For more information about the various lenses offered and modification options, please refer to this article, or contact Sunex to discuss whether it would be beneficial to modify an existing lens or proceed with a custom design altogether. --- ## The Art of Making an HDR Lens - Source: https://sunex.com/2020/05/15/hdr_optimized/ - Summary: HDR (High Dynamic Range, also WDR) – dynamic range quantifies the ability of a system to adequately image both highlights and dark shadows in a scene. New sensors can capture light intensity variations of up to six or more orders of magnitude within the same image frame (+120dB HDR, 20·log10(1e6)), and together with the growing number of applications using computer vision, they are putting very demanding requirements on lens performance. Designing and manufacturing a High Dynamic Range (HDR) lens is probably as much art as science and engineering. - Sunex HDR Optimized™ lenses are the first stepping stone to building a system that performs reliably in daylight, nighttime, or low-light scenarios. - Sunex NoGhost™ technologies eliminate or minimize optical noise and support SW algorithms in classifying objects reliably and consequently performing the right actions. To align around a common set of terminology, we should first define the key terms of this article. HDR (High Dynamic Range, also WDR) – Dynamic range quantifies the ability of a system to adequately image both highlights and dark shadows in a scene.
It is defined as the ratio of the largest non-saturating input signal to the smallest detectable input signal. Stray Light – Stray light is light in an optical system, from a known or unknown source, that follows a path other than intended, creating unwanted noise. This light will often set a working limit on the dynamic range of the system. Design Expertise, Process Know-How, and Manufacturing Capabilities are the three factors that must come together when building lenses with increased performance requirements for HDR and low stray light. The design expertise cannot truly be described in definitive terms, as no two designs are the same. However, understanding the fundamental impact of certain design considerations on HDR performance should guide the design at all stages. Fixing it later is almost impossible, as this relates to changing first-order optical parameters; changing these is, for all practical purposes, the same as designing a new lens. Reducing the number of elements is generally a good rule of thumb, since every extra surface increases the number of parasitic images that an optical design has to be optimized for and, if not handled well, can negatively impact the HDR performance of the overall system. For a lens with *n* elements, the total number of images that sum up on the sensor is given by *N = (2n)² – n*. The image formed on the sensor is therefore the sum of the expected image and many parasitic images, each of which is a modified, scaled, shifted, and attenuated version of the expected image. The formula does not include the effects of the barrel, reflection off the sensor, etc. - Example 8G: 1 relevant image, 247 parasitic images - Example 9G: 1 relevant image, 314 parasitic images Optimizing just one (1) image path while suppressing hundreds or even thousands of others is not trivial and requires deep design and manufacturing process experience.
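The image-count relation can be checked directly against the 8G and 9G examples above. This is a minimal sketch of the stated formula N = (2n)² − n:

```python
def total_images(n_elements: int) -> int:
    """Total images summing on the sensor for a lens with n elements,
    per N = (2n)^2 - n (barrel and sensor-reflection effects excluded)."""
    return (2 * n_elements) ** 2 - n_elements

def parasitic_images(n_elements: int) -> int:
    """All images except the single intended (relevant) image."""
    return total_images(n_elements) - 1

print(parasitic_images(8))  # 247 (8G example: 16 surfaces)
print(parasitic_images(9))  # 314 (9G example: 18 surfaces)
```

Note how quickly the count grows: a single extra element (8G to 9G) adds 67 more parasitic image paths that the design must suppress.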
Should the application require consistent performance over a wider wavelength range, as it does for a dual-band (e.g., VIS + 940nm) application, or even across the entire VIS-to-NIR band in hyperspectral applications, fundamental design decisions have an even greater impact. It is not difficult to imagine that the optomechanical design has a large impact on the overall HDR performance as well. Rays get reflected and scattered on practically any surface, and the design team has to consider every surface and every edge of every item! The obvious goal is to reduce, or at least minimize, as much as possible, all reflections within the boundaries of a project’s cost, timeline considerations, and manufacturability! Choosing the right optical and optomechanical materials, considering optimized coating performance, having the simulation tools to help guide early decisions, and designing with the manufacturing process in mind are all mandatory from the very start when taking on an HDR lens design. Customer feedback, in the form of design reviews, simulations, and tests at the lens and camera levels, is equally important in guiding design decisions and converging on a design that can meet the performance, commercial, and project timeline targets. This is especially true when optimizing for stray light: other than the VGI (Veiling Glare Index) and GSF (Glare Spread Function), which are defined in ISO 9358, stray light effects such as ghosts, flares, starbursts, and spurious images are not standardized. Evaluating stray light performance is therefore somewhat subjective, depending on: - The use case - The observer - The algorithm - The test setup - The linearity (or non-linearity, for that matter) of the sensor Customer feedback is key to understanding how good stray light performance has to be to serve a specific application.
To learn more about Sunex’s NoGhost™ technology, please view the Knowledge Center article “NOGHOST™ LENSES REDUCING OPTICAL NOISE” or view the following recorded webinar: Sunex provides many options that should come close to any given need. If, however, you cannot find a part that matches your exact requirements, please feel free to contact us: Contact Sunex At Sunex, we are always interested to learn about new applications and project requirements. If we can’t find something that works in our current portfolio, we may be able to discuss our custom design services! --- ## Finite Imaging; When Focusing to Infinity Doesn’t Cut it. - Source: https://sunex.com/2020/03/09/finite-imaging-when-focusing-to-infinity-doesnt-cut-it/ - Summary: Many applications require that imaging lenses be optimized for a finite object, that is, for an object distance closer than “infinity.” Most off-the-shelf board-mount lenses are designed for infinite conjugate imaging — objects far enough away that light arrives as parallel rays. When your object is closer than approximately 250mm, an infinity-corrected lens delivers degraded image quality: lower MTF, increased distortion, and reduced depth of field compared to a lens specifically designed for that working distance. This article explains the optical difference between infinite and finite conjugate imaging, identifies the applications where finite lenses are required, and describes Sunex’s approach to designing and supplying finite conjugate lenses for medical, biometric, and machine vision applications. ## What is finite imaging, and how does it differ from infinite conjugate lens design? Many applications require that CMOS or CCD imaging lenses be optimized for a finite object, that is, for an object distance closer than “infinity.” The practical definition of infinity varies based on individual use-case requirements and the specific lens being used. It can be defined in different ways, such as spot size, through-focus MTF, etc.
As a general rule of thumb, and depending on F/#, infinity is generally 100-200x the focal length (EFL) of the lens for short-EFL board-mount lenses. You can calculate the hyperfocal distance for your specific use case using our DOF Wizard here: http://www.optics-online.com/DepthofFocus.asp ## When does my application require a finite imaging lens instead of a standard lens? To better understand how finite imaging lenses work, it is important to also understand some basic terms, and how a typical infinite conjugate lens works. **Hyperfocal Distance:** The distance at which, when a lens is focused there, depth of field extends from half this distance to infinity. **Infinite Conjugate Lens**: A lens optimized to perform best when focused at or beyond the lens’s hyperfocal distance, for imaging objects at infinity. **Finite Conjugate Lens**: A lens optimized for focusing on a discrete object closer than the hyperfocal distance. Most off-the-shelf, compact CMOS and CCD lenses on the market today are infinite conjugate lenses. This is because there is essentially an infinite number of possible finite conjugate distances. Although infinite conjugate lenses will work reasonably well at moderate finite distances, multi-megapixel and other demanding applications often require better image quality than an infinity-corrected lens can deliver at such short object distances. Such applications include Document Imaging, Biometric Scanning and Imaging (facial recognition, fingerprint scanners, and iris scanners), Machine Vision, and diagnostic or clinical Medical Devices and Instrumentation. A bespoke or reoptimized finite-conjugate lens offers significant advantages in this regime because it can be corrected specifically for aberrations that arise from short object distances, such as field curvature. Although you may use an infinity-corrected lens at a finite distance, it may not offer its best design performance.
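The hyperfocal distance defined above can be estimated with the standard approximation H ≈ EFL²/(F/# · c) + EFL, where c is the circle of confusion. This is a rough sketch with hypothetical values, not data for a specific Sunex lens; for real cases, use the DOF Wizard linked above:

```python
def hyperfocal_mm(efl_mm: float, f_number: float, coc_mm: float) -> float:
    """Standard hyperfocal distance approximation:
    H = EFL^2 / (F/# * circle of confusion) + EFL."""
    return efl_mm ** 2 / (f_number * coc_mm) + efl_mm

# Hypothetical board-mount lens: 4mm EFL, f/2.0, 10um (0.010mm) circle of confusion
h = hyperfocal_mm(4.0, 2.0, 0.010)
print(round(h))  # 804 (mm), i.e. ~200x the EFL -- consistent with the rule of thumb above
```

Focused at this distance, depth of field extends from roughly H/2 (about 0.4m here) to infinity; a tighter circle of confusion (higher-resolution sensor) pushes the hyperfocal distance further out.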
You may notice, for example, that while the center is in focus, the edges are not – or vice versa. Conversely, you can also use a finite lens at infinity, but to get the optimum performance, the lens should be matched to the design focus point and object distance. ## How does Sunex design lenses optimized for short object distances? Sunex can design a custom lens for your finite imaging application, or we can re-optimize many standard off-the-shelf infinite-conjugate lenses for finite imaging. With a large selection of off-the-shelf lenses available to choose from *(Reference Figure 1 below)*, this re-optimization approach can often reduce the development time and cost compared with a pure custom solution. Please contact our Engineering Sales Team for more information and let us help you with your finite imaging challenge! *Figure 1:* The above is only a sampling of Sunex’s most popular finite-imaging lenses. Sunex offers a complete line of lenses for a range of applications and industries. --- ## Cleaning Procedure for Optics - Source: https://sunex.com/2020/01/31/cleaning-procedure-for-optics/ - Summary: The general guideline for cleaning optics is “if it's not dirty, don't clean it”. But if you have to, this is for you. The general guideline for cleaning optics is “if it’s not dirty, don’t clean it”. Handling optics increases their chances of getting dirty or damaged, so you should clean optics only when necessary. Both the proper cleaning products and the proper methods are equally important when cleaning an optic. There are different methods for cleaning, and certain specialized optics require particular attention and changes in procedure. Below are general guidelines to be used as a reference only when cleaning optics. *HANDLING NOTE:* Optics should be handled in a clean, low-dust environment while wearing powder-free, acetone-impenetrable gloves or finger cots.
Gloves are typically preferable as they cover the whole hand. Since oil and debris from your hands or from used lens tissue can stain or damage optical coatings, you should not touch any transmissive or reflective surfaces of your optic, and never reuse a lens tissue. In case of contact, fingerprints on a coated surface should be cleaned as soon as possible to minimize the risk of staining or damaging the optic. Consider that lens tissues are inexpensive compared to the price of an optic. Inspect an optic for dust and stains by holding it near a bright visible-light source. Viewing the optic at different angles allows you to see scattering from dust and stains. **STEP 1: USE A CLEAN-AIR DUSTER** Dusting your Optic: Wiping a dusty optic with lens tissue is like cleaning your windshield with sandpaper. Always blow the dust off your optic before cleaning it, using an optic bulb blower or compressed air suitable for optics. Dust is the most common contaminant, and the first step to cleaning your optics should be to use an optic bulb blower or compressed air. The compressed air or nitrogen must be filtered and oil-free, and de-ionized gas is recommended. Commercially available “chemical dusters,” such as those designed for electronics or keyboards, are typically not recommended, as the propellant can spray and potentially damage the optic. If the optic has no visible stains after dusting, then remember: “If it’s not dirty, don’t clean it.” If it’s still not clean, proper use of solvents and lens tissue can often do the trick. **STEP 2: USE SOLVENT AND LENS TISSUE** Always use lint-free tissue with a solvent. Dry lens tissue can scratch optical surfaces. A good solvent to use is a mix of 60% acetone with 40% methanol. Acetone alone dries too quickly to dissolve all of the debris. The methanol slows the evaporation time and also dissolves debris that acetone alone would not clean.
Always use acetone-impenetrable gloves when using acetone. Acetone should __never__ be used to clean plastic optics or optics in plastic housings, as this will damage the plastic. Compressed air, isopropyl alcohol, or de-ionized water are safe alternatives. As a note, isopropyl alcohol is an acceptable and effective option, but its relatively slow evaporation can leave drying marks on the optic. De-ionized water (with mild soap) can also be used with plastics. Clean the edge of the optic before cleaning the face (central area) to prevent dirt from being drawn up onto the face. Wipe slowly to allow the solvent to evaporate without streaking. Remember, slow and steady cleans the optic. **For small, mounted optics:** The “Brush” Technique: Wipe slowly straight across from one edge of the optic to the other. Use the “brush” technique for small or mounted optics. Make a lens-tissue brush by folding the lens tissue so that the fold is nearly as wide as the optic to be cleaned. Do not touch any part of the tissue that will touch the optic. With a hemostat or tweezers, grip the folded tissue parallel to and near the fold. Wet the “brush” with solvent and shake off any excess liquid. Start by blowing off the dust. Place the brush on the optic surface, apply slight pressure with the hemostat, and slowly wipe straight across, from one edge of the optic surface to the other. For mounted optics, use a smaller “brush” held by hemostats, or an optic tissue wrapped around a low-lint swab. Clean the edges of the exposed optic first, tracing the inside edge of the mount in a slow circle. Be careful to move slowly to allow the solvent to evaporate and prevent leaving streaks or spots. Do not double back over your path. As you approach your starting point, trace a decreasing-radius circle until you reach the center of the optic. Lift the “brush” slowly to prevent solvent from accumulating at the center.
**For large, unmounted optics:** The “Drop and Drag” Technique: Drop solvent onto your lens tissue and drag the soaked tissue slowly across the lens surface. Remember to clean the edges of your optic before you clean the face. The “drop and drag” technique is ideal for light cleaning and large, unmounted optics. Place your optic on a clean, non-abrasive surface, such as a clean-room wiper. After blowing off the dust using compressed air or nitrogen, lay a piece of unfolded lens tissue over the optic, saturate the lens tissue with solvent, and slowly drag the soaked tissue across the lens surface. As with the other methods, moving the tissue slowly allows the solvent to evaporate uniformly without leaving any drying marks. Remember to clean the edges of your optic before you clean the face. **For stubborn stains on durable coatings:** The “Wipe” Technique: Only use this technique for stubborn stains on more durable coatings. This method should be used rarely, for intense cleaning of stubborn stains. It is important to note that excessive use of this cleaning technique can damage transmissive or reflective coatings. Fold the lens tissue as described in the “brush” technique above, and grip it with your fingers instead of the hemostat. Saturate the lens tissue with solvent. Applying uniform pressure on the optic edge, slowly wipe across the lens surface. **STEP 3: STORE THE CLEANED OPTICS** Once you’ve cleaned your optic, place the optic in the mount it will be used in, or wrap it in lint-free lens tissue and place it in its container right away. Be mindful to wrap each optic in its own lens tissue and/or store it individually, as unwrapped optics that are in contact will damage each other. Keep the optics in a low-humidity environment.
--- ## Lens Customization - What You Need to Know - Source: https://sunex.com/2020/01/17/lens-customization-what-you-need-to-know/ - Summary: One of the questions we frequently get at Sunex is some variation of, “Can I modify a standard lens, and what does this entail?” Yes — Sunex can modify standard lenses or design fully custom optical systems. Common customizations include changing the focal length, adjusting the F/#, modifying the image circle, adding IR coatings, and changing the mechanical housing or mount. Full custom designs start from a clean optical design and are optimized for your specific application, sensor, and environmental requirements. This article covers the most frequently requested lens customizations, when to modify vs. design from scratch, and how the Sunex custom lens development process works. ## Can I modify a standard Sunex lens for my application? One of the questions we frequently get at Sunex is some variation of, “Can I modify a standard lens, and what does this entail?” The answer to this question is, “Yes!”, depending on what you would like to modify, of course! In general, there are four categories of what can *potentially* be modified on a standard lens: - External Mechanical Features - Coatings/Filters - F/# - Specifications ## What lens parameters can be customized without a full new design? While first-order lens parameters, which are inherent to the optical design, typically *cannot* be modified (including EFL, FOV, TTL, CRA, and Image Circle), there is still a broad range of customization that is possible. Let’s look at the above options for the customization of a Standard Sunex lens in more detail. **External mechanical features** include thread spec/size, mechanical features such as focusing interface features, cap size/shape, markings, etc. These changes are straightforward with a nominal change fee, and the modified lens will be given a unique PN for differentiation.
Such changes generally require 4-6 weeks of lead time, assuming long-lead materials (glass) are already in stock. There may be an MOQ (minimum order quantity) after the initial prototypes. **Coatings (AR) and filters** are also customizable. Sunex has a broad capability to create custom IR-cut filters, bandpass filters, and various AR coatings. This capability allows the customer to optimize Standard lenses for specific wavebands, to minimize stray light, or to re-optimize a Visible lens for NIR applications, for example. Such changes typically do not require any change to the optical Rx (prescription) or to the mechanical design of the lens. Again, depending on the change desired, there may be an NRE and/or a lot charge for the special build. A lead time of 6-12 weeks should be expected, as glass elements may need to be fabricated. **F/#** is a straightforward change that can dramatically optimize certain use cases with minimal cost and effort. It is typically much easier to stop down a lens (increase F/#), but lowering the F/# of a particular lens may also be feasible. There are typically no optical or mechanical changes involved, so increasing the F/# is usually just a matter of a lot charge and 5-6 weeks of lead time to build a small lot of prototypes (assuming glass materials are in stock). Again, the lens will receive a unique PN so you can order the lens by name (MOQ may apply). **Lens Specifications** are another change request that Sunex frequently encounters. Although every Standard lens has a set of default, standard specifications, some of these may be tailorable for specific OEM applications. Some commonly modified specs include decentration (boresight), MTF, EFL tolerance, environmental/reliability testing, or additional/more frequent sampling of critical parameters. Since Sunex has a comprehensive in-house test capability, there is some flexibility in the test, QC, and acceptance protocol of Standard lenses in OEM applications.
Of course, the extent and level of testing desired may impart some additional cost, but for those demanding applications where more stringent testing is required, Sunex has you covered. ## When should I choose a modified standard lens vs. a fully custom lens design? As we have seen, there are many ways to modify a standard Sunex lens if desired. Standard lenses offer a ready-to-go option with a mature, manufacturing-proven design. If a Standard, baseline design can be modified to meet your exact requirements, it can save months of design time and significant cost over designing a custom lens. For this reason, Sunex has hundreds of designs to choose from, so do not hesitate to contact us to find out whether a Standard lens can meet your exact requirements with a bit of modification! We also encourage you to watch the video below from our webinar series, which is dedicated to all questions about lens customization options. If you want to get the ball rolling on a customized COTS lens, please contact us or start using our Imaging System Builder. --- ## Boresight Stabilization™ - Source: https://sunex.com/2020/01/04/boresight-stabilization/ - Summary: Lenses designed with Boresight Stabilization™ can enhance survivability, reduce pointing error, and improve MTF stability. Boresight Stabilization™ is Sunex’s proprietary lens design technique that keeps the optical axis aligned under mechanical shock, vibration, and thermal stress. Without it, lens axis shift under real-world conditions causes calibration drift in AI vision systems — leading to object detection errors in automotive ADAS, aerial surveillance, and industrial inspection deployments. This article explains what boresight is, why it matters for harsh-environment cameras, and how Sunex engineers Boresight Stabilization™ into OEM lens designs. ## What is boresight stabilization in optical lens design? A typical lens assembly consists of several lens elements inside a cylindrical barrel.
Due to the mechanical clearance requirement, the internal element diameters must be smaller than the barrel’s internal diameter. This creates potential risks that could result in boresight errors or changes: - Lens-to-lens variance of the optical axis due to manufacturing tolerance - A possible lateral shift of individual elements over time due to strenuous conditions ## How does Sunex Boresight Stabilization™ reduce pointing error? The first potential risk (lens-to-lens variance) can be reduced, and even completely mitigated, through the use of Active Alignment of the sensor and lens assembly during the manufacturing process. Sunex typically recommends active alignment for imaging requirements of >3-5MP or for sub-2um pixel pitch. Pairing a high-quality, precision optic with an imprecise alignment method may not fully realize the performance, especially edge MTF, that the lens itself is capable of delivering on a consistent basis. The second risk is less predictable and, depending on the severity, could have a devastating impact on certain applications. To eliminate this potential for boresight error, Sunex has created an optomechanical design process that allows for risk evaluation on a per-element basis, and has developed technologies and processes to “glue” the identified lens elements laterally into the barrel. Boresight Stabilization™ designs are fully compatible with Sunex’s general manufacturing process and deliver on: - Best-practice kinematic design and mounting of elements - Superior tolerancing of the optical design and between lens elements and mechanics - Superior assembly and test techniques ## How do vibration and temperature affect lens optical axis alignment? To show an example of the positive impact of Boresight Stabilization™ on overall lens performance in harsh environments, Sunex created a test under the following conditions: - Two lenses with the same PN, built with the same process, except one lens is boresight stabilized and the other is not.
- Both lenses initially meet all relevant specifications for optical performance, including MTF. - Both lenses are subjected to the same Temp/Vibe/Temp/Vibe cycle. - Both lenses are retested for MTF after the test cycle is completed. The graph below clearly shows that the standard lens had more than twice* the MTF degradation of the Boresight Stabilized™ lens after only one test cycle. *results might differ for other lens designs or different test procedures ## Which Sunex lenses include Boresight Stabilization™ technology? Theoretically, every lens can be designed from the start with Boresight Stabilization™, and existing off-the-shelf lenses can be retrofitted; however, the application needs determine whether it is required or not. There is no need to “throw the kitchen sink at it”. For example, we have shipped over 100 million lenses to automotive customers, and all designs must pass a rigorous reliability and environmental test protocol that simulates 15 years of vehicle lifetime. 99% don’t have any Boresight Stabilization™. But when a particular lens is designed with Boresight Stabilization™, it achieves the following goals: - Enhanced survivability - Stable boresight (pointing) over shock, vibration, and temperature - Enhanced MTF stability and repeatability over mechanical shock, vibration, and temperature (the lens performs better for longer and returns to a predictable performance after being subjected to environmental extremes) --- ## Lens mapping function and distortion - Source: https://sunex.com/2019/12/17/lens-mapping-function-and-distortion/ - Summary: The concept of distortion describes how a lens maps a shape on the object plane to the image plane while assuming other aberrations are negligible. Lens distortion is a commonly used term in the general specification of lenses. The classic, textbook definition of distortion includes barrel and pincushion type distortions.
These concepts are useful for photographic optics, where the field of view of the lens is not extreme. Lenses designed for machine or computer vision applications can have an extreme field of view. Fields of view greater than 100 degrees are common specifications for machine vision lenses. We need more suitable concepts to describe the distortion characteristics of these lenses. The concept of distortion describes how a lens maps a shape on the object plane to the image plane, assuming other aberrations are negligible. Once the lens mapping function is known, the distortion characteristics are fully determined. In a rotationally symmetric lens with an infinite object distance, the lens mapping function relates the image height (h) on the image plane to the field angle (theta) in the object space. Some of the commonly seen mapping functions are shown in the Figure below: - The top f-tan curve represents the type of distortion commonly seen with most photographic lenses, including smartphone lenses. This type of distortion is known as rectilinear distortion, where a straight edge on the object plane is mapped to a straight edge in the image plane. Because the tangent function diverges at 90 degrees, it is not possible to create fisheye lenses (field of view near 180 degrees) with this type of distortion. - A very useful one is the f-theta mapping function, seen as the second curve from the top. An f-theta lens forms an image where the image height is proportional to the field angle of the incident light. This type of mapping function can be used for lenses of any field of view. Most wide-angle and fisheye lenses follow this type of mapping function. - The very bottom curve represents a type of distortion known as “Fovea” distortion. Lenses with this kind of mapping function “exaggerate” the central details while trading off the off-axis details, similar to the function of human vision.
Machine vision applications that can benefit from this type of lens include forward-looking ADAS and autonomous driving cameras, where the car must detect objects at a far distance in the central vision while still having a wide field of view of peripheral objects. - The orange curve represents a type of distortion known as Tailored Distortion. Tailored Distortion is very useful for wide-angle or fisheye lenses where the peripheral details are more important than the central vision; it is the reverse of the Fovea distortion. Typical applications include ceiling-mounted security camera lenses and surround-view lenses for automotive applications. Like a traditional fisheye lens, Tailored Distortion® is ideal for automotive, security, and medical applications where an ultra-wide field of view is required. Please view our webinar to learn more about Sunex Tailored Distortion®. There is a wide variety of lens mapping functions and distortion types available for machine/computer vision lenses. Discussing your application needs with our engineers helps you understand the trade-offs before finalizing the lens distortion specification. --- ## The Advantages and Disadvantages of Stopping Down a Lens - Source: https://sunex.com/2019/11/27/the-advantages-and-disadvantages-of-stopping-down-a-lens/ - Summary: One modification that is commonly requested by our Customers is changing the F/# of a standard lens. One modification that is commonly requested by our Customers is changing the F/# of a standard lens. The most common reason to stop down a lens is to resolve more detail and gain greater depth of field in an image. However, the term itself can sometimes be confusing, and there are limits to how much you can stop down a lens before encountering diminishing returns. “Stopping down a lens” refers to *increasing* the numerical f-stop value.
This decreases the diameter of the aperture of the lens, which usually has the effect of sharpening detail resolution and increasing depth of field (more on this later!). Sounds great! Why not do this all the time? In the figure below, you’ll notice a shallow depth of field at the widest aperture, with more detail and depth of field gradually coming into view as you stop the lens down further. **Depth of Field from an imaging point of view.** Now, astute readers may ask, “Hold on a second; from a physics standpoint, doesn’t increasing the F/# lower the diffraction limit, and therefore resolution?” This is very true. However, this only applies to diffraction-limited optical systems. In most commercial optical systems, the overriding factor in as-built lens performance is actually aberration. Since many aberrations are aperture-dependent, stopping down the lens has the effect of reducing aberration, and therefore increasing MTF. As the lens is stopped down further and the actual performance of the lens approaches the diffraction limit, physics takes over and lens performance will again start to decrease. Hence, there is a limit to how much stopping down a lens can help MTF, and there are also trade-offs to consider for doing so. **Resolution: Example of stopping down a Sunex Standard lens in Zemax.** The tradeoff for the benefit gained by stopping down is less light and more diffraction. The smaller aperture will allow less light to reach the image sensor. To compensate for the reduction in light, some options include increasing illumination, increasing the signal gain (ISO value) of the sensor, or increasing the exposure time. In addition, as noted above, diffraction increases as the aperture gets smaller, until you reach a point of diminishing returns when the lens performance approaches the diffraction limit. In the Zemax diagram above, the red horizontal line represents 80% MTF.
As the lens is stopped down from F/3.3 to F/8.0, overall MTF performance across the field increases, before ultimately succumbing to the diffraction limit and dropping to 70% MTF at its peak. Also evident is the increase in depth of field (trough to trough) as the f/# increases. From this analysis, it would appear that stopping down to F/5.6 would be a reasonable trade-off between light and MTF. Finally, it’s important to note that stopping down a lens does increase DOF independent of MTF. This can be especially beneficial in a well-lit use case where DOF is paramount, but stopping down a lens in a less-than-ideal lighting environment may result in a dark or noisy image due to sensor effects, which may negate any gains in MTF or DOF. The key is to find the optimum balance for your application and use case. For more information on finding the ideal lens and aperture value for your imaging application, contact sales@sunex.com! --- ## Tailored Distortion®: Distortion Correction at the Speed of Light - Source: https://sunex.com/2019/11/22/tailored-distortion-distortion-correction-at-speed-of-light/ - Summary: Tailored Distortion® is an innovation from Sunex to manipulate distortion to achieve the best image quality in accordance with a client’s requirements. Tailored Distortion® is Sunex’s patented lens technology that engineers a specific, application-optimized distortion profile directly into the optics — eliminating software-based distortion correction. The result is lower processing latency, reduced computational load, and improved edge-of-field resolution in AI vision systems. Traditional fisheye lenses require software dewarping; Tailored Distortion® performs that correction optically, at the speed of light. This article explains how Tailored Distortion® works, why it matters for ADAS surround view and AI vision systems, and which Sunex products implement it. ## What are the advantages of correcting distortion in the lens vs. in software?
Super wide-angle lenses and fisheye lenses have pronounced barrel distortion. When viewing a flat object, off-axis features are “squeezed” significantly relative to the on-axis ones. This reduces the effective off-axis resolution. For many applications, it is an undesirable effect. It is possible to post-process the image to reduce this effect. However, post-processing in software reduces the image quality because the missing off-axis information must now be interpolated. We created a new class of super wide-angle and fisheye designs using high-precision aspherical elements to reduce the off-axis “squeezing” optically. Compared with standard f-theta designs, the distortion reduction is on the order of 50%. This allows off-axis objects to be imaged with more pixels. This technology has brought significant commercial benefits to security and automotive backup cameras.

## Can a fisheye lens be designed with a custom distortion profile for a specific application?

Tailored Distortion® is an innovation from Sunex to manipulate the distortion to achieve the best image quality in accordance with a client’s requirements:

- Edge Enhanced for fisheye lenses
- Fovea profile for forward ADAS
- Custom profile

Tailored Distortion® is used to achieve the following:

- Increase the angular resolution in a specific region of interest
- Minimize or eliminate the use of electronic distortion correction
- Improve the image quality when electronic distortion correction is employed
- Manipulate the HFOV with respect to the VFOV

The graphic below compares a fisheye lens with standard distortion (on the left) with a Sunex Tailored Distortion® lens (on the right).
One can observe three key differences:

- Tailored Distortion® offers more vertical pixels (here 8 compared to 5 squares) while maintaining the same horizontal resolution
- Pixels in the center (blue color) are less blown up for a Tailored Distortion® lens
- Pixels at the edge (yellow color) are less squeezed for a Tailored Distortion® lens

A further advantage of Tailored Distortion® is that it provides a more rectilinear image. The result is an image that maintains a more realistic perspective and is much less disorienting than traditional fisheye images. This means that many applications, such as automotive surround view, doorbell phones, or 360º surveillance, may not require any software dewarping at all. And since Tailored Distortion® is all done optically, there is no software manipulation of the actual image, thus avoiding many of the issues sometimes associated with post-processing.

## Which Sunex products use Tailored Distortion® technology?

Like a traditional fisheye lens, Tailored Distortion® is ideal for automotive, security, and medical applications where an ultra-wide field of view is required. Please view our webinar to learn more about Sunex Tailored Distortion®. Please view this link for a list of Sunex Super Fisheye and Tailored Distortion® lenses: http://www.optics-online.com/dsl_fisheye.asp

---

## When Does a Hyperspectral, or “Day-Night” Lens Make Sense?

- Source: https://sunex.com/2019/10/04/when-does-a-hyperspectral-or-day-night-lens-make-sense/
- Summary: A hyperspectral lens (also “Day-Night” or RGBIR) refers to a lens that has been optimized to maintain performance throughout the VIS and NIR bands. Certain applications require a broad spectral operating range and customers oftentimes inquire whether a lens can perform “well” out to their wavelength of interest. These requests typically involve both visible (VIS) and near-infrared (NIR) wavelengths.
For purposes of this discussion, a wavelength range roughly from 470nm (or lower) out to about 850-950nm is assumed. Fundamentally, it is important to consider the use-case of the lens and determine how the VIS and NIR wavelengths apply in a particular system:

- whether those wavelengths will be used simultaneously,
- whether the bands will be used at distinct times (for example, VIS is only used during the daytime and NIR is only used at night) and there is *no possibility* to refocus the lens, or
- whether the bands will be used in two different instances, but there is *a possibility* to refocus the lens and/or adjust the image plane location

Due to the law of refraction, different wavelengths focus at different points along the optical axis (z-axis). Therefore, if a lens is not purposely optimized for NIR and VIS wavelengths simultaneously, the ideal focus position of the lens for NIR light will occur beyond the VIS image plane. In applications where the broadband region will be used simultaneously, or where the wavelength bands are used at two distinct times and there is *no possibility to refocus* the lens, such as examples (1) or (2) above, it is important to consider a hyperspectral lens. A hyperspectral lens, sometimes referred to as a “Day-Night” corrected lens or an RGBIR lens, refers to a lens that has been optimized to maintain performance throughout the VIS *and* NIR bands at a single focus position (common image plane location). The main advantage of a hyperspectral lens in terms of performance is that a single focus position will yield optimized image quality across that entire spectrum without the need to be refocused, which can be a complex process depending on the application and may negate the ability to use a fixed-focus lens. Since it is significantly more difficult to optimize a lens over a broad wavelength range, below are some of the common challenges and trade-offs that should be considered:

- MTF trade-off.
Overall MTF across a broader wavelength region may need to be sacrificed.
- Performance at shorter wavelengths is typically sacrificed for increased performance at longer wavelengths, or vice versa.
- The design typically prefers an increased F/#, a longer TTL, and may tend toward lower relative illumination (depending on other trade-offs).
- Typically requires more elements to correct aberrations.
- May require aspheres to correct aberrations, especially if TTL is a constraint, which in turn can have an impact on cost.
- AR coatings on lens elements need to be optimized for a broadband spectrum, so average reflectance may increase as the broadband wavelength range increases.
- With the above-mentioned complexities, the design typically becomes more sensitive to tolerances.

*Please take note that the points listed above assume some degree of constraints within an optical system, and are meant to provide overarching ideas. In practice, each design is unique as it pertains to a particular project.*

Even though there are many factors that impact the MTF performance of a lens, the above picture clearly shows the superior MTF performance across the field for an IR-corrected (NIR) lens, compared to a lens that has been designed and optimized for the visible (VIS) spectrum only.

*The data show MTF vs. Field at 940nm and for different spatial frequencies.*

Depending on the use-case, an alternative to hyperspectral lenses is to use a lens that has been primarily optimized for the VIS spectrum, but which can be used in the NIR range when refocused. The benefits of choosing such a lens are that typically there are many more standard, existing options available. Designing a lens for a narrower range of wavelengths is typically a simpler process (relatively speaking), and provides a design solution with overall higher performance.
At the same time, AR coatings can achieve superior performance over a narrower wavelength range with minimal reflectance, as compared to broadband performance. The challenges most commonly associated with such lenses occur when integrating the lens into the camera on a system level due to the need to adjust the image plane. There are two main paths of setting up the image plane to yield a functional lens in both wavebands: - Refocus the lens; for example, the lens would be at an optimal focus position for VIS light when used during daylight when the primary source is VIS, and the image plane location would be adjusted during the night time when NIR wavelengths are more dominant, perhaps as a result of IR-LED illumination. A manual refocus is manageable in prototype scenarios, but typically is not practical for mass-produced products. Automatic focus can present more mechanical challenges, particularly in rugged environments or where weight/size is a key consideration, but otherwise will allow the lens to function optimally. Devices such as piezo modules, voice coils (VCM), and stepper motors are available with standard M12 thread. - Choosing a singular focus position (image plane location) that is at an intermediary focus between the VIS and NIR wavelengths of interest. This will result in a decrease in MTF performance in both wavelength regions, but is often adequate for both, or can be biased toward the preferred band. The extent of the MTF drop in each wavelength region will depend on the specific lens design and other factors such as the sensor resolution. Hyperspectral lenses should be considered in systems that require the use of both VIS and NIR light and which don’t provide the option to refocus. However, as we see, there are many trade-offs associated with designing such a lens and the customer should consider their priorities and their criteria for judging performance. 
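The “intermediary focus” strategy above can be sanity-checked with a quick depth-of-focus estimate. This is an illustrative sketch: the focal-shift, wavelength, and F/# values are assumed numbers, not data for any particular Sunex lens, and ±2λN² is only a common diffraction-based rule of thumb for depth of focus.

```python
# Sketch: can one intermediate image-plane position serve both bands?
# All numeric inputs below are illustrative assumptions.

def depth_of_focus_um(wavelength_um: float, f_number: float) -> float:
    """Approximate +/- diffraction depth of focus, 2*lambda*N^2, in um."""
    return 2.0 * wavelength_um * f_number**2

def compromise_focus(z_vis_um: float, z_nir_um: float, bias: float = 0.5) -> float:
    """Single image-plane position between the VIS and NIR best-focus
    positions. bias=0.5 is the midpoint; bias > 0.5 favors NIR."""
    return (1.0 - bias) * z_vis_um + bias * z_nir_um

# Assumed example: NIR best focus 15 um beyond the VIS image plane, F/2.8.
z_vis, z_nir, n = 0.0, 15.0, 2.8
z = compromise_focus(z_vis, z_nir)      # midpoint, 7.5 um
dof_vis = depth_of_focus_um(0.55, n)    # half-depth at 550 nm
dof_nir = depth_of_focus_um(0.85, n)    # half-depth at 850 nm

# Check that both bands stay within their diffraction depth of focus:
print(abs(z - z_vis) <= dof_vis and abs(z - z_nir) <= dof_nir)  # True
```

With a larger band-to-band focal shift or a faster lens, the check fails and the focus must be biased toward the preferred band, exactly the trade-off the article describes.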
On the other hand, lenses optimized for narrower wavelength ranges typically offer superior performance, but may pose mechanical challenges if the lens needs to be refocused to accommodate a second wavelength region. If you are interested in lenses from Sunex that perform over a broad wavelength range, please send us an inquiry. We can help you with one of our existing hyperspectral lenses, evaluate how a standard lens will perform over your wavelengths of interest, or explore custom options.

---

## Automotive Camera Trends Driving Changes in Optical Designs

- Source: https://sunex.com/2019/09/27/automotive-camera-trends-driving-changes-in-optical-designs/
- Summary: The imaging applications in the automotive industry go through rapid changes.

The imaging applications in the automotive industry go through rapid changes. Where just a couple of years ago a lens/camera merely needed to survive, the expectations have moved to performance that is consistent and reliable over a wide temperature range. Below is a video of the talk Sunex presented at the AutoSens 2017 show, sharing experiences and observations on how requirement changes at the camera level impact considerations and design approaches at the lens level. Some of the topics addressed:

- Computer vision and algorithm demands represent a fundamental shift from viewing applications
- HDR sensors and low-light camera performance requirements are driving the state of the art
- Growing camera operating temperature ranges require shifts in material selections and design forms

Click to start the video

---

## “Scaling” as a lens design tool

- Source: https://sunex.com/2019/09/12/scaling-as-a-lens-design-tool/
- Summary: “Scaling” refers to a method where a known lens design is proportionally enlarged or reduced geometrically to meet a different image size requirement.
“Scaling” refers to a method where a known lens design is proportionally enlarged or reduced geometrically to meet a different image size requirement from what the lens was originally designed for. For example, say we have a lens that was originally designed for a 1/3” format sensor with an image circle of 6mm; we can “scale” up that design to make the image circle 8mm. This new scaled-up design is now capable of supporting a 1/2″ format sensor with an image circle of 8mm. All linear dimensions such as effective focal length, back focal length, total track length, or lens diameter scale linearly. Other properties such as field of view, F/#, distortion, and relative illumination do not change with scaling. Generally speaking, ratios and angular quantities are scaling-invariant. Because the wavelength of interest does not scale, the diffraction MTF must be re-evaluated after scaling; it cannot be deduced from simple scaling. For example, we have a new camera application requiring a lens that will provide a 121-degree HFOV on a 1/2″ imager with an F/2.9 aperture. A standard 1/2″ imager has an effective width of 6.4mm. Our off-the-shelf DSL388 lens provides an HFOV of 121 degrees on a 6.2mm effective width. Using scaling, we can “enlarge” the DSL388 by 6.4/6.2 = 1.03x, or 3%. This new lens will provide a 121-degree HFOV on the 1/2″ imager. The F/# of the new lens is unchanged at 2.9. However, the total track length will now be 3% longer. To realize the new design in hardware, all elements inside the lens must be re-made with 3% greater thickness, diameter, and radii values. Attention must be paid when the scaling ratio is too large or too small, as the newly created design may present manufacturability difficulties or challenges. For example, if a lens is scaled down, the center or edge thickness of some lens elements may become too thin to be manufacturable. If the lens is scaled up too much, the weight or size of the lens elements could become too large or heavy to be economical.
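The scaling rules above reduce to a simple transform. In this sketch, only the 6.2mm effective width and the invariants (HFOV, F/#) come from the worked example; the EFL and TTL values are hypothetical placeholders, not published DSL388 specs.

```python
# Minimal sketch of lens scaling: linear dimensions (mm) scale by the
# ratio, while angular quantities and ratios (HFOV, F/#) are invariant.

def scale_lens(design: dict, ratio: float) -> dict:
    linear_mm = {"efl_mm", "bfl_mm", "ttl_mm", "effective_width_mm"}
    return {k: (v * ratio if k in linear_mm else v) for k, v in design.items()}

dsl388 = {
    "efl_mm": 2.2,               # assumed placeholder value
    "ttl_mm": 16.0,              # assumed placeholder value
    "effective_width_mm": 6.2,   # from the worked example above
    "hfov_deg": 121.0,
    "f_number": 2.9,
}

scaled = scale_lens(dsl388, 6.4 / 6.2)  # ~1.03x, i.e. 3% larger

print(round(scaled["effective_width_mm"], 2))   # 6.4
print(scaled["hfov_deg"], scaled["f_number"])   # 121.0 2.9 (unchanged)
```

Note the model deliberately leaves diffraction MTF out: as the article says, it must be re-evaluated after scaling, not scaled.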
So before any prototype of the new design is made, the optical design needs to be refined or adjusted to account for those issues.

---

## Why is lens design so hard?

- Source: https://sunex.com/2019/09/08/why-is-lens-design-so-hard/
- Summary: The difficulty arises primarily because of all the trade-offs involved in a lens design, namely space constraints, performance targets, and cost.

With the advent of computers and more processing power, the art of lens design has matured. But computers cannot solve the whole problem. A computer can select a configuration, and the designer optimizes that configuration. Can a machine learning algorithm find the absolute best lens design? Yes, if you try an infinite number of different design forms. But..”I still want to be young when we get there!” Hence, the journey of developing an elegant design will never be a simple endeavor. The difficulty arises primarily because of all the trade-offs involved in a lens design, namely space constraints, performance targets, and cost. One must not only design a great lens, but must also take into account the dimensions of the housing, its properties, wavelength, temperature, cost, transmission, and availability of the glass to be used, all while avoiding configurations whose tolerances are so tight that nobody can build them. Before even looking into these considerations, one has to find a lens construction that will work. That is a hard job to begin with. Why is it hard? Because lens designers have to deal with a design space of many dimensions and trade-offs. The different variables and the image quality targets are related to each other in a nonlinear manner. Also, clumsy boundary conditions apply to most of these variables. Hence, lens design is unlike most real-world engineering problem-solving. Traditional methods have long relied on having a good starting point, a design not too far from the desired goals, and then working to improve it.
If the starting point was indeed a good one and your skills were sharp enough, you could in that way arrive at a great design. However, one rarely has such a starting point, and only a select few have the required skills. Thus the job is hard for most, and difficult even for the experts, almost all the time. Lens design is more art than science, and has no closed-form solutions. In other words, there is no equation that can be solved to arrive at an elegant design. One has to think, try different things, learn from experience, and iterate.

**The lens design landscape**

The lens design landscape can be compared to a mountain range with peaks and valleys all over the place. The main job is to find the lowest valley, which corresponds to the lowest merit function (MF). The MF is usually defined as the sum of the squares of a set of quantities that represent how far the design is from its ultimate goals; the MF would be zero if all the targets were met exactly—which almost never happens. The lowest valley overall is the best, or ‘optimum’ design, because it has the lowest MF. How can one find it? When trying to find the lowest MF, most of the time lens design can be stuck in a local minimum of the merit function. Hard to escape! (Figure 1)

Another way to visualize the task is to imagine climbing a tree to get to the top.

*Figure 2. Graphical illustration of the lens design tree.*

Here, one can start at the bottom and climb up—but which branches to take? There are usually many solutions to a given task, of roughly equal quality, and when you run a lens optimization program, you are climbing up whatever branch you happen to be on. A different starting point will go up a different branch. When you reach the end of the branch you are on, you are at a local minimum. Just running the optimization program yet again will not move you from that branch to a better one. You need other tools. How can you get to a different branch? How many branches are there?
Those are perplexing questions that lead to more questions than answers.

#### Simulated annealing

Most lens design programs today offer what is termed simulated annealing, a process that involves making small random changes to each of the design variables and then optimizing, over and over. That technique can jump sideways from one branch to another, although usually not very far. Nonetheless, it is surprisingly effective and is one of the most important tools of the trade today.

#### Global optimization

Most lens design programs today also offer a form of ‘global optimization’, which can find a variety of solutions—but most of those programs are not practical because of the very long time required to return their results, often measured in hours or days. We now enjoy a new paradigm. It used to be the case that an expert would spend days, weeks, sometimes years making small improvements to a classic design form, always guided by experience, insight, theory, and a large dose of dogged labor. If he succeeded, he was rightly proud of the achievement. Today we do things differently. Instead of inching up a single promising branch of the tree, day after day, hoping the result will justify the effort, we use software that examines hundreds or thousands of branches in a matter of minutes or seconds and returns a set of candidate design forms the user can then evaluate and try to adapt to his current requirements.

---

## What is Active Alignment?

- Source: https://sunex.com/2019/09/04/what-is-active-alignment/
- Summary: The value of Active Alignment (AA) lies in compensating for sensor-plane to image-plane variance in up to six axes. Active alignment (AA) is the process of precisely positioning a lens relative to a camera sensor in up to six axes — compensating for tilt, shift, and focus variation between individual components.
For camera modules with sensors of 3MP resolution or higher, active alignment is the difference between a module that meets full image quality specifications at the corners of the frame and one that does not. This article explains what active alignment is, why it is essential for high-resolution camera modules, how it compares to threaded (passive) focusing, and what the tooling investment delivers in yield improvement and quality consistency in mass production. ## Why is active alignment necessary for high-resolution imaging systems? One of the most common questions Sunex receives is regarding the need for *Active Alignment* of lens modules. Initially, customers designing or prototyping an imaging system may find it hard to see the value of actively aligned modules. After all, active alignment can require a significant investment. In addition, during the prototyping phase, customers can typically achieve great results in small numbers by manually focusing the lens and gluing it into an off-the-shelf camera module. This blog post will attempt to explain the value of Active Alignment, and the use of AA to align lens modules in a production environment. The value of Active Alignment (AA) lies in compensation for tilt. Active alignment is used to compensate for boresight error within the accuracy of the testing machine (down to ~.05°). Sunex typically recommends active alignment for imaging requirements of >3 – 5MP or for sub 2um pixel pitch. Pairing a high-quality, precision optic with an imprecise alignment method may not fully realize the performance, especially edge MTF, that the lens itself is capable of delivering on a consistent basis. In theory, a threaded version of a module can produce a module with performance equal to an AA version of that same module, but the statistical population of such a result in mass production will be very low. The results are not consistent because of **tilt**. 
This matters most in high-resolution applications and in larger quantities, where yield is a major contributor to cost and cycle time. The lens design and sensor selection are not necessarily the root issue; other external factors are. Factors contributing to tilt:

- Barrel design: mechanical axis to optical axis (bore)
- Holder design: mechanical axis (bore) to base flatness and perpendicularity
- Individual thread pitch (wobble, tilt, backlash)
- PCB board design (flatness, thermal behavior)
- PCB + SMT (BGA or backplane flatness)
- Sensor glass height and tilt
- Operating temperature effects
- Other factors and tolerances present during the final assembly of the module

## How does active alignment compare to threaded lens focusing?

Threaded modules are typically focused manually by an operator using an optical SFR score or a resolution target. The manual nature of the focusing process, combined with the use of threads of alternating height on a barrel, will limit a module’s full capabilities at higher resolution. Threaded modules are typically capable of alignment to about 1° tilt on average. Depending on the size of the sensor and the pixel pitch, this may translate to a depth-of-focus issue at the edges or corners of the image. By contrast, active alignment **actively** compensates for tilt in **each individual module** to within a fraction of a degree (<20’ / 0.3° typical).

## At what resolution does a camera module need active alignment?

Bearing in mind that Sunex usually recommends AA for systems of 3 – 5 MP or greater resolution, we can now look at a more extreme example of a 20MP module to better demonstrate tilt compensation on an AA module vs a threaded module. Below is an MTF simulation of a recent 20MP module design shown with varying degrees of tilt in alignment. A typical AA module would be represented by 0’ – 20’, while a typical threaded alignment is represented by 60’ or 1°.
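To make those tilt figures concrete, a one-line geometry estimate shows how far the sensor corner departs from best focus as tilt grows: dz ≈ r·tan(θ). The sensor half-diagonal below is an assumed example value; the tilt angles echo the typical AA and threaded figures quoted above.

```python
import math

# Sketch of why tilt matters: a tilted image plane displaces the sensor
# corner from best focus by roughly dz = r * tan(theta).
# The 3.0 mm half-diagonal is an assumed example sensor size.

def edge_defocus_um(half_diagonal_mm: float, tilt_deg: float) -> float:
    """Focus displacement at the sensor corner for a given tilt, in um."""
    return half_diagonal_mm * 1000.0 * math.tan(math.radians(tilt_deg))

half_diag = 3.0  # mm, assumed
for tilt in (0.05, 0.3, 1.0):  # AA machine accuracy, typical AA, typical threaded
    print(f"{tilt:>4} deg tilt -> {edge_defocus_um(half_diag, tilt):6.1f} um at the corner")
```

A 1° threaded tilt displaces the corner by tens of microns, many depths of focus for a sub-2 µm pixel, while AA-level tilt keeps the displacement in the low single-digit microns.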
While it can be difficult to simulate real-world results that take into account all factors, the typical resulting “net” tilt with AA is approximately an order of magnitude better than what we can typically expect from a manually aligned module. --- ## FOV Cone for Housing Design- Why the “Edge Ray Entrance Pupil” is Important - Source: https://sunex.com/2019/08/26/fov-cone-for-housing-design-why-the-edge-ray-entrance-pupil-is-important/ - Summary: By definition, the entrance pupil of a system is the image of the aperture stop as seen from the object side of a lens system. When designing the mechanical housing for a camera module, the most common mistake is defining the lens aperture cutout based on nominal lens diameter rather than the actual FOV cone of edge rays at maximum field angle. If the housing cuts into the FOV cone, rays from the image corners are blocked — producing vignetting that no optical redesign can correct without changing the housing. This article explains the entrance pupil edge ray concept, how to calculate the FOV cone for your lens-sensor combination, and how to correctly dimension a housing cutout that clears the full optical path at all field angles. Many customers use Sunex lenses in a custom housing unit, which is unique to their particular application. In many wide-angle lenses, especially fisheye lenses, it is important to understand where the rays of the maximum field of view for a specific sensor are located with respect to a reference datum on the lens in order to design a housing or hood that will maximize the field and not clip the rays. Understanding the concept of the entrance pupil is critical for this task. ## What is the entrance pupil, and why does it matter for camera housing design? By definition, the entrance pupil of a system is the image of the aperture stop as seen from the object side of a lens system. 
From a practical standpoint, the entrance pupil defines the ray bundle of light (typically approximated as a cone) that passes through the aperture stop to create an image. It is important to note that this position changes (along the optical axis in “Z”) based on the FOV, and hence the sensor dimension(s) in question. The location along the optical axis (z-axis) of the entrance pupil is defined using the chief ray of a system. From an application standpoint, it is best to find the entrance pupil position using the *edge ray* at the maximum usable field angle in order to ensure that the extreme rays of a ray bundle will be imaged and not clipped.

## How do I calculate the FOV cone for my lens at maximum field angle?

Below is an example method to characterize where this “maximum bundle” originates and thus locate the potential housing boundaries:

- **Calculate maximum field of view.** The FOV of the lens for the particular sensor that is being used can be calculated using Sunex’s online Optics Wizards.
  - Depending on the exact application, the “maximum” field of view typically corresponds to the diagonal of the sensor. In case the image circle corresponding to the diagonal of the sensor is larger than the nominal image circle of the lens, assume the maximum field angle of the lens, and account for some overage.
  - Please refer to the Sunex post on *lens image circles* for more information. The maximum image circle can be about 10-30% greater than the design/nominal image circle depending on the type of lens and design.
- **Contact Sunex** to locate the entrance pupil position (ENPP) **or** the clear aperture (CA) of the ray intercept at L1S1, per your preference.
  - A Sunex team member will find the entrance pupil location along the z-axis using the edge ray of the calculated maximum field angle.
By specifying the location of this “edge ray entrance pupil” at a particular field angle, an effective cone can be drawn from that point along the optical axis, out towards the first lens element. This will give a direct visualization of where the housing needs to stay, relative to the first element.
- Alternatively, the intersection point of the edge ray with the first surface of the first element (L1S1) can be provided. This point provides the equivalent data from which the cone can be inferred, and is sometimes preferred if you want to ensure the housing does not encroach on this point.
- In both cases, be sure to allow for tolerances, both for the housing and lens assembly, as well as optical tolerances. EFLs, and therefore FOV, typically vary by a few percent in manufacturing.
- Answers will be in the format below (or similar).

Example:
- Sensor: 1/4”, diagonal 4.5mm
- Lens: DSL215
- DFOV of DSL215 on 1/4": 175°

Answer:
- Edge ray entrance pupil position, with respect to (behind) L1S1, at FOV 175° (±87.5°): 2.54mm
- Clear aperture of lens, at L1S1, corresponding to edge ray of FOV 175° (±87.5°): 13mm

## What happens if the camera housing blocks the edge rays of the lens?

When a camera housing blocks the edge rays of a lens, a phenomenon called **mechanical vignetting** occurs. **Mechanical vignetting** occurs when the lens barrel, housing mount, or any aperture stop physically intercepts rays traveling at oblique angles (from off-axis scene points). The effects cascade:

**1. Light falloff at field edges.** On-axis rays pass through the full aperture area. Off-axis rays see a “crescent-shaped” effective aperture — part of the cone is simply cut off by the housing wall. Illuminance at the image edge drops as the fourth power of the cosine of the chief ray angle (the cos⁴θ law), but mechanical vignetting adds *additional* loss on top of the optical cos⁴θ that would exist even in a perfect barrel. *(Smith, W.J., “Modern Optical Engineering,” 4th ed., McGraw-Hill, 2008)* **2.
Effective f-number increases at field edges.** Because only part of the aperture cone reaches the sensor edge, the image there behaves as if shot with a slower lens — deeper depth of field and a longer effective exposure requirement, inconsistent across the frame.

**3. Resolution degradation at field edges.** The MTF (modulation transfer function) at the image periphery drops not just from aberrations but from the reduced number of contributing rays. Fewer rays mean less spatial-frequency information reaching the detector. *(Fischer, R.E. et al., “Optical System Design,” 2nd ed., SPIE Press, 2008)*

**4. Color shading in some designs.** If the vignetting is wavelength-dependent (e.g., the housing clips rays that have already passed through color-correcting elements), you get lateral color shifts toward the image periphery — a color cast that varies with field position.

**5. In fisheye/ultra-wide lenses, the field of view is effectively narrowed.** If the housing cuts edge rays before they ever reach the first optical element, the actual imaged scene is smaller than the lens’s designed FOV.

---

## What is the operating temperature spec for a glass/plastic/hybrid lens?

- Source: https://sunex.com/2019/08/16/what-is-the-operating-temperature-spec-for-a-glass-plastic-hybrid-lens/
- Summary: The index of refraction is a function of temperature, and nearly all optical systems experience some performance degradation over temperature.

At Sunex, we get the above question frequently, but the answer is not straightforward. Specifying an operating temperature range for a lens can be tricky because, generally speaking, there is no point within a lens’ physically survivable temperature range at which it will suddenly “fail.” Since the index of refraction is a function of temperature, nearly all lenses experience some performance degradation over temperature, yet could still reasonably be considered “operational” well beyond their specified nominal temperature range.
Therefore, in order to specify an operating range that has any practical application, one must first define *failure criteria*. This is often a minimum MTF [over the temperature range], but can also be a physical value, such as a focal shift in µm. One can also specify the operational temperature range as a percentage of the nominal performance, for example, when the lens loses >10% of its nominal (ambient) MTF. All of these approaches are related to the focal-plane shift of the lens over temperature, and hence the “best focus” location, and by extension its impact on MTF. However, operational temperature can also be defined by other first-order specifications altogether, such as FOV or magnification, since these are tied to the index of refraction over temperature and therefore the resulting power of the lens. For general purposes, however, it is typically safe to assume that glass lenses will perform within specification from -20°C to 60°C, and hybrid or plastic lenses from 0°C to 40°C, when focused at a nominal 20°C. But this is a conservative rule of thumb, and the lenses will likely perform well, by most measures, well beyond these limits. To throw another wrench into the original question of “what is the operating temperature range,” and to take the above one step further, one needs to remember that the lens is not acting by itself. The operating temperature of the lens is oftentimes not even the most relevant factor in the thermal performance of the imaging chain. The way the lens is mounted and its relationship to the sensor’s image plane is often more critical than the thermal performance of the lens itself. Some lenses can be very stable, yet perform poorly in a system design with a high effective CTE. Other lenses can have a high [effective] CTE and perform extremely well over extreme temperature ranges when proper system design is utilized.
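The “effective CTE” point can be illustrated with a back-of-envelope estimate: the lens-to-sensor mount grows with temperature, shifting the image plane relative to best focus. All material and geometry values below are illustrative assumptions, and ±2λN² is only a rough diffraction-based focus budget.

```python
# Sketch: thermal growth of the lens-to-sensor mount vs. a simple
# depth-of-focus budget. All numeric inputs are assumed examples.

def mount_growth_um(cte_per_k: float, length_mm: float, delta_t_k: float) -> float:
    """Change in lens-to-sensor spacing from thermal expansion, in um."""
    return cte_per_k * length_mm * 1000.0 * delta_t_k

def depth_of_focus_um(wavelength_um: float, f_number: float) -> float:
    """Approximate +/- diffraction depth of focus, 2*lambda*N^2, in um."""
    return 2.0 * wavelength_um * f_number**2

# Assumed: aluminum-like mount (23e-6/K), 10 mm spacing, 20C -> 60C swing.
shift = mount_growth_um(23e-6, 10.0, 40.0)   # ~9.2 um
budget = depth_of_focus_um(0.55, 2.0)        # ~4.4 um at F/2, 550 nm

print(shift > budget)  # True: mount CTE alone can exceed the focus budget
```

This is why the mounting design, not just the lens, dominates thermal image quality: athermalization works by making these two numbers cancel or shrink together.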
While this post does not go into the details and strategies of athermalized lens and system design, it is important to understand that the thermal performance of the lens can only be properly viewed in the context of the overall system design, especially the lens and sensor mount design of the camera. So when searching for a lens, think about what criteria are important in your application and what temperatures your use case will actually encounter. With this information, Sunex will be able to help you not only pick the best lens option but also provide some insight on what type of mounting and focusing strategies may work best. Please view the following webinar for more details on how to achieve stable image quality over a wide temperature range:

---

## NoGhost™ lenses reducing optical noise

- Source: https://sunex.com/2019/08/07/noghost/
- Summary: The following comparison demonstrates the advantages of a NoGhost™ lens that is optimized to eliminate or minimize optical noise.

Sunex Inc. has developed design expertise, process know-how, and manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications. Lenses designed and manufactured with this process are known as NoGhost™ lenses. The following comparison demonstrates the advantages of a NoGhost™ lens:

Reducing ghosts and stray light is important in any application. However, with the advent of new HDR (high dynamic range) imagers, the reduction of ghosts became an important factor in improving image quality and enabling computer vision algorithms to perform as expected. Please read our Knowledge Center article about HDR Lens Design https://sunex.com/hdr_optimized/ or view the related webinar recording:

Sunex provides lots of options which should be close to any given need.
If, however, you cannot find a part which matches your exact requirements, please feel free to contact us: Contact Sunex

At Sunex, we are always interested to learn about new applications and project requirements. If we can’t find something which works in our current portfolio, we may be able to discuss our custom design services!

---

## Optical Tolerances – Part 2

- Source: https://sunex.com/2019/08/07/optical-tolerances-part-2/
- Summary: The modulation transfer function (MTF) is a quantitative measure of image quality describing the ability of a lens to transfer object contrast to the image.

Following up on the article Optical Tolerances – Part 1, Part 2 discusses the quantitative measure of image quality. The modulation transfer function (MTF) is a quantitative measure of image quality. MTF describes the ability of a lens or system to transfer object contrast to the image. Consider a sine-wave chart in the form of a positive transparency in which transmittance varies in one dimension. Assume that the transparency is viewed against a uniformly illuminated background. The maximum and minimum transmittances are Tmax and Tmin, respectively. A lens system under test forms a real image of the sine-wave chart, and the spatial frequency (u) of the image is measured in cycles per millimeter. Corresponding to the transmittances Tmax and Tmin are the image irradiances Imax and Imin. The contrast or modulation of the chart and image are defined, respectively, as

Mc = (Tmax − Tmin) / (Tmax + Tmin) and Mi = (Imax − Imin) / (Imax + Imin),

where Mc is the modulation of the chart and Mi is the modulation of the image. The modulation transfer function of the optical system at spatial frequency u is then defined to be

MTF(u) = Mi / Mc.

MTF curves can be either polychromatic or monochromatic. Polychromatic curves show the effect of any chromatic aberration that may be present. For a well-corrected achromatic system, polychromatic MTF can be computed by weighted averaging of monochromatic MTFs at a single image surface.
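The modulation and MTF definitions above translate directly into code. The following is a minimal sketch; the transmittance and irradiance values are made up for illustration, not measurements of any particular lens.

```python
# Sketch of the modulation/MTF definitions above. Input numbers are
# illustrative, not measurements of any particular lens or chart.

def modulation(peak: float, trough: float) -> float:
    """Contrast (max - min) / (max + min) -- applies to chart or image."""
    return (peak - trough) / (peak + trough)

Tmax, Tmin = 0.9, 0.1      # chart transmittances
Imax, Imin = 0.7, 0.3      # image irradiances at spatial frequency u

Mc = modulation(Tmax, Tmin)   # chart modulation
Mi = modulation(Imax, Imin)   # image modulation
mtf_u = Mi / Mc               # MTF at spatial frequency u
print(f"MTF(u) = {mtf_u:.2f}")
```

Repeating this ratio over a sweep of spatial frequencies traces out the MTF curve discussed in the rest of the article.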
MTF can also be measured by a variety of commercially available instruments. The MTF curve for a perfect imaging lens is limited only by the laws of diffraction (diffraction-limited performance). For such a system (incoherent illumination, circular aperture), the theoretical MTF is calculated as follows:

MTF(u) = (2/π) · [arccos(u/u0) − (u/u0) · √(1 − (u/u0)²)], where u0 = 1 / (λ · f/#) is the cutoff frequency.

To achieve this level of performance, the optical design must be free of any aberrations, and the manufacturing process must maintain very tight tolerances. This requires a large number of lens elements with compensating aberrations. For most commercial applications, the lens MTF is far from the diffraction limit. The following diagram shows the design MTF of a practical lens vs. the diffraction limit (the black line is the diffraction limit, the blue line is the MTF on-axis, and the green lines are off-axis at 60 deg for tangential and sagittal target orientations):

Since a lens MTF varies with the field angle, it is often more useful to examine the MTF at specific spatial frequencies vs. field positions. This plot shows the consistency of lens performance across the entire image plane. It is useful to select two (high and low) spatial frequencies. The low-frequency MTF value represents the overall contrast of the image, and the high-frequency MTF represents the capability of the lens to produce details. An example of such a plot is as follows (green lines are MTF values at 20 cycles/mm and blue lines at 60 cycles/mm; solid lines are for sagittal target orientation, dashed lines for tangential):

Read part one of the article: Optical Tolerances – Part 1

---

## Optical Tolerances – Part 1

- Source: https://sunex.com/2019/08/07/optical-tolerances-part-1/
- Summary: Tolerance is a critical factor impacting the performance and cost of an optical system, hence optical components usually require much tighter tolerances.

Tolerance is a critical factor impacting the performance and cost of an optical system.
Optical components usually require much tighter tolerances than those commonly associated with mechanical components. As a result, special equipment and techniques are used in the manufacturing and measuring of optical tolerances.

**SURFACE ACCURACY**

When attempting to specify how closely an optical surface conforms to its intended shape, a measure of surface accuracy is needed. Surface accuracy can be determined by interferometric techniques. Traditional techniques involve comparing the actual surface to a test plate gauge. In this approach, surface accuracy is measured by counting the number of rings or fringes and examining the regularity of the fringes. The accuracy of the fit between the lens and the test gauge is described by the number of fringes seen when the gauge is in contact with the lens. Test plates are made flat or spherical to within small fractions of a fringe. Modern techniques for measuring surface accuracy utilize phase-measuring interferometry with advanced computer data-analysis software. During manufacturing, a precision component is frequently compared with a test plate that has an accurately polished surface that is the inverse of the surface under test. When the two surfaces are brought together and viewed in nearly monochromatic light, Newton’s rings (interference fringes caused by the small gap between the nearly matching surfaces) appear. The number of rings indicates the difference in radius between the surfaces. This is known as power, or sometimes as figure. It is measured in rings that are equivalent to half wavelengths. Beyond their number, the rings may exhibit distortion that indicates non-uniform shape differences. The distortion may be local to one small area, or it may be in the form of noncircular fringes over the whole aperture. All such non-uniformities are known collectively as irregularity.

**SURFACE FLATNESS**

Surface flatness is simply surface accuracy with respect to a plane reference surface.
It is used extensively in mirror and optical flat specifications.

**CENTRATION**

The mechanical axis and optical axis exactly coincide in a perfectly centered lens. For a simple lens, the optical axis is defined as the straight line that joins the centers of lens curvature. For a plano-convex or plano-concave lens, the optical axis is the line through the center of curvature and perpendicular to the plano surface. The mechanical axis is determined by the way in which the lens will be mounted during use. There are typically two types of mounting configurations: edge mounting and surface mounting. With edge mounting, the mechanical axis is the centerline of the lens’s mechanical edge. Surface mounting uses one surface of the lens as the primary stability for lens tip and then encompasses the lens diameter for centering. The mechanical axis for this type of mounting is a line perpendicular to the mounting surface and centered on the entrapment diameter. Ideally, the optical and mechanical axes coincide. The tolerance on centration is the allowable amount of radial separation of these two axes, measured at the focal point of the lens. The centration angle is equal to the inverse tangent of the allowable radial separation divided by the focal length. Centration error is measured by rotating the lens on its mechanical axis and observing the orbit of the focal point. To determine the centration error, the radius of this orbit is divided by the lens focal length and then converted to an angle.

**SURFACE QUALITY**

Cosmetic surface quality describes the level of defects that can be visually noted on the surface of an optical component. Specifically, it defines the state of polish, freedom from scratches and digs, and edge treatment of components. These factors are important not only because they affect the appearance of the component, but also because they scatter light, which adversely affects performance.
Scattering can be particularly important in laser applications because of the intensity of the incident illumination. Unwanted diffraction patterns caused by scratches can lead to degraded system performance, and scattering of high-energy laser radiation can cause component damage. Overspecifying cosmetic surface quality, on the other hand, can be costly. The most common and widely accepted convention for specifying surface quality is the U.S. Military Surface Quality Specification, MIL-O-13830A, Amendment 3. IMPORTANT: Surface quality can be impacted if improper cleaning is used. Sunex optics are generally referenced to MIL-PRF-13830B standards. These standards cover scratches, digs, grayness, edge chips, and cemented interfaces. It is important to note that inspection of polished optical surfaces for scratches is accomplished by visual comparison to scratch standards. Thus, it is not the actual width of the scratch that is ascertained, but the appearance of the scratch as compared to these standards. A part is rejected if any scratches exceed the maximum size allowed. Digs, on the other hand, are specified by actual defect size and can be measured quantitatively. Because of the subjective nature of this examination, it is critical to use trained inspectors who operate under standardized conditions in order to achieve consistent results. The scratch-and-dig designation for a component or assembly is specified by two numbers separated by a hyphen. The first defines the allowable maximum scratch visibility, and the second refers to the allowable maximum dig diameter; for example, 80–50 represents a commonly acceptable cosmetic standard, 60–40 represents an acceptable standard for most scientific research and commercial applications, and 10–5 represents a precise standard for very demanding laser applications.

**SCRATCHES**

A scratch is defined as any marking or tearing of a polished optical surface.
In principle, scratch numbers refer to the width of the reference scratch in ten-thousandths of a millimeter. For example, an 80 scratch is equivalent to an 8-µm standard scratch. However, this equivalence is determined strictly by visual comparison, and the appearance of a scratch can depend upon the component material and the presence of any coatings. Therefore, a scratch on the test optic that appears equivalent to the 80 standard scratch is not necessarily 8 µm wide. If maximum-visibility scratches are present (e.g., several 60 scratches on a 60–40 lens), their combined lengths cannot exceed half of the part diameter. Even with some maximum-visibility scratches present, MIL-O-13830A still allows many combinations of smaller scratch sizes and lengths on the polished surface.

**DIGS**

A dig is a pit or small crater on the polished optical surface. Digs are defined by their diameters, which are the actual sizes of the digs in hundredths of a millimeter. The diameter of an irregularly shaped dig is 1/2 × (length plus width):

- 50 dig = 0.5 mm in diameter
- 40 dig = 0.4 mm in diameter
- 30 dig = 0.3 mm in diameter
- 20 dig = 0.2 mm in diameter
- 10 dig = 0.1 mm in diameter

The permissible number of maximum-size digs shall be one per each 20 mm of diameter (or fraction thereof) on any single surface. The sum of the diameters of all digs, as estimated by the inspector, shall not exceed twice the diameter of the maximum size specified per any 20-mm diameter. Digs smaller than 25 micrometers are ignored.

**EDGE CHIPS**

Lens edge chips are allowed only outside the clear aperture of the lens. The clear aperture is 90% of the lens diameter unless otherwise specified. Chips smaller than 0.5 mm are ignored, and those larger than 0.5 mm are ground so that there is no shine to the chip. The sum of the widths of chips larger than 0.5 mm cannot exceed 30% of the lens perimeter. Prism edge chips outside the clear aperture are allowed.
If the prism leg dimension is 25.4 mm or less, chips may extend inward 1.0 mm from the edge. If the leg dimension is larger than 25.4 mm, chips may extend inward 2.0 mm from the edge. Chips smaller than 0.5 mm are ignored, and those larger than 0.5 mm must be stoned or ground, leaving no shine to the chip. The sum of the widths of chips larger than 0.5 mm cannot exceed 30% of the length of the edge on which they occur.

**CEMENTED INTERFACES**

Because a cemented interface is considered a lens surface, specified surface quality standards apply. Edge separation at a cemented interface cannot extend into the element more than half the distance to the element clear aperture, up to a maximum of 1.0 mm. The sum of edge separations deeper than 0.5 mm cannot exceed 10% of the element perimeter.

**BEVELS**

Although bevels are not specified in MIL-O-13830A, our standard shop practice specifies that element edges are beveled to a face width of 0.25 to 0.5 mm at an angle of 45°±15°. Edges meeting at angles of 135° or larger are not beveled.

**COATING DEFECTS**

Defects caused by an optical element coating, such as scratches, voids, pinholes, dust, or stains, are considered with the scratch-and-dig specification for that element. Coating defects are allowed if their size is within the stated scratch-and-dig tolerance. Coating defects are counted separately from substrate defects.

Read part two of the article: Optical Tolerances – Part 2

---

## Plastic vs. Glass Optics

- Source: https://sunex.com/2019/08/05/plastic-vs-glass-optics/
- Summary: Glass and plastic optics each has its own unique advantages.
The properties of glass materials are very different from those of plastic materials.

At Sunex we are using our deep design expertise, understanding of material properties, and state-of-the-art high-volume manufacturing process to deliver Digital Imaging Systems based on plastic, glass, and glass/plastic hybrid elements, depending on the application, performance, and environmental requirements of our clients from the Automotive, Security, VR/AR, Robotics, and Medical industries around the globe. For the 1998 SPIE Conference, Sunex Founder and CEO Alex Ning wrote a whitepaper discussing the key factors to consider when choosing between a plastic and a full glass optic. The following is an excerpt from the whitepaper, whose content still holds up today.

## Why Plastic Optics?

Glass and plastic optics each has its own unique advantages. The properties of glass materials are very different from those of plastic materials. There are literally hundreds of different glass materials available from well-known suppliers such as Schott, Hoya, and Ohara for making glass optics. The choice of plastic materials is limited to only about half a dozen. The attached table lists the currently available plastic materials and their key properties. Generally speaking, glass materials are harder and more durable than plastic materials. Glass materials are also more stable over a wider temperature range and humidity environment than plastic. Glass is much heavier than plastic (by a factor of 2.5x to 4x). The large selection of glass materials allows the designer to choose materials with desirable optical properties to gain better optical performance. This kind of freedom is limited with plastic materials. However, plastic optics offers other design freedoms that are not achievable or economical with glass optics. The manufacturing processes for glass and plastic optics are entirely different.
Glass lenses are made by a grinding and polishing process, whereas precision plastic lenses are made by injection molding. The differences in manufacturing process provide plastic optics some unique advantages, as follows:

**High-volume production capability and low manufacturing cost:** The injection-molding process allows very high-volume production, and the unit cost can be very low. Though it is possible to achieve moderately high-volume production with glass optics also, it is virtually impossible to realize the same cost reduction because the grinding and polishing process is inherently time-consuming and labor-intensive.

**Design sophistication:** The grinding and polishing process makes it difficult and very uneconomical to produce surface shapes other than spheres or flats in glass materials. However, the injection-molding process makes it feasible and economical to produce more sophisticated optical shapes, such as aspheric and diffractive surfaces, in plastic, provided a mold is properly made. From the design point of view, the more sophisticated surface shapes provide much better performance for many applications.

**Unique designs possible:** Many useful designs that cannot be realized with glass optics can be achieved with plastic optics, such as lens arrays and Fresnel lenses, which are useful for a range of light dispersion and collection applications.

**Lightweight and shatter-resistant:** Plastic materials are lighter weight and more shatter-resistant. This feature is very important for head-worn optics such as head-mounted displays.

**Integral mounting:** For most optical applications, the individual optical components must be mounted in a system structure. With glass optics, this is done with separate mechanical mounting hardware. However, with plastic optics, it is possible to include the mounting features with the optical component.
This not only reduces the overall system cost but also improves the reproducibility of the assembly.

**Consistent Quality:** Plastic optics can be made with very consistent quality, since all the lenses are derived from the same mold cavity (or cavities). Modern statistical control techniques are also used to monitor the molding process to ensure a good yield is achieved.

The major drawbacks of plastic optics are mostly material-related. For example, plastic material is more sensitive to environmental changes such as temperature and humidity. In addition, the material flow pattern and shrinkage during molding also limit the surface accuracy that is achievable with plastic optics. The index distribution within a molded component may be inhomogeneous and may vary with polarization (birefringence). The chemical properties of available plastic materials also limit the performance of the optical coatings that can be deposited on them. It is important for the optical designer to understand the advantages as well as the limitations of plastic optics before a decision is made to use plastic optics. We strongly suggest that you discuss with us before finalizing your designs.

---

## Lens Image Circle

- Source: https://sunex.com/2019/08/01/lens-image-circle/
- Summary: One of the most important specifications for an imaging lens is the image circle. So what is the image circle of a lens?

One of the most important specifications for an imaging lens is the image circle. So what is the image circle of a lens? This question is often asked without a clear definition of the image circle. This article will attempt to clarify the concept of the image circle and discuss what is to be expected from real-world lenses in terms of image circle. When a lens is first designed, the designer must choose a maximum field of view or image height in the setup. The lens performance is optimized within this maximum field angle or image height. The relative illumination at this max.
image height is in the range of 50%-80% for most lenses. Very little performance consideration is given to any field points beyond this image height. We can call the max. image height x 2 the “nominal” image circle. However, this does not mean that beyond the max. image height, the image performance of the lens drops to “zero.” Specifically, the relative illumination can still be significant beyond the nominal image circle, and the MTF of the lens will still be meaningful beyond the nominal image circle. Let us define the “true” image circle as the diameter of the image plane of the lens at which the relative illumination reaches 10%. From this definition, it is clear that the true image circle is greater than the nominal image circle. The question is, by how much? The answer depends on the specific lens design. We can approach this problem in two ways:

- We calculate the relative illumination curve beyond the max. image height once the actual clear apertures of all lens elements are finalized after opto-mechanical design. From this data, we can calculate the image height where the relative illumination falls to 10%.
- After the real lenses are built, we can measure the true image circle by using an oversized sensor and determining the 10%-intensity pixel positions. This requires a sensor with a linear response.

Based on real-world testing, we can make the following observations:

- For lenses having large field angles, such as fisheye lenses, the true image circle is about 10-15% greater than the nominal image circle.
- For telephoto or narrow field-of-view lenses, the true image circle is considerably greater. A value of 25-30% is not uncommon.
- For mid-field-of-view lenses, the true image circle overfill depends on the design details. Compact designs with short total track lengths tend to have less overfill. An overfill of 15-25% is not unusual.
- We recommend that you test a sample of real lenses to determine whether the true image circle is large enough for your application, instead of relying on the spec-sheet nominal image circle to make the lens selection.

---

## Pick the right M12 (S-mount) lens for your project

- Source: https://sunex.com/2019/08/01/pick-the-right-m12-lens-for-your-project/
- Summary: Talking to our clients, we noticed that selecting the right M12 lens (also called S mount lens) for a specific project is not trivial.

Selecting the right M12 (S-mount) lens comes down to four variables: sensor image circle format, required field of view (FOV), working distance, and minimum aperture (F/#) for your lighting conditions. Get these right, and the Sunex lens selection tools will narrow thousands of options to a shortlist in under 2 minutes. M12 lenses — also called S-mount or board-mount lenses — are the dominant format for ADAS, robotics, medical, and security cameras. This guide walks through the selection process step by step.

## What is an M12 (S-mount) lens and where is it used?

Talking to our clients, we noticed that selecting the right M12 lens (also called S-mount lens) for a specific project, or simply limiting the options to a range of applicable lenses, is not trivial.

## How do I choose the right M12 lens for my sensor format?

Sunex’s Optical Wizards are free online tools (registration required) designed to assist you in selecting the proper M12 lens or any other CMOS lens for your application. If wizards, tools, and configurators are not your cup of tea (or even if they are), we are always here to talk to you in person! Contact Sunex

## What F/# should I choose for my lighting environment?

There are different approaches when it comes to selecting the right lens, depending on whether you have already selected a specific imager or have hard requirements for FOV and EFL.
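When weighing F/# against lighting conditions, a useful rule is that the light gathered scales roughly as 1/(F/#)². A minimal sketch of that comparison, with illustrative F-numbers:

```python
# Sketch: relative light-gathering of two lenses from their F-numbers.
# Sensor illuminance scales roughly as 1/(F#)^2 for the same scene.
# The F/1.6 vs F/2.8 comparison is illustrative only.

def relative_light(f_num_a: float, f_num_b: float) -> float:
    """How many times more light lens A collects than lens B."""
    return (f_num_b / f_num_a) ** 2

print(f"F/1.6 vs F/2.8: {relative_light(1.6, 2.8):.2f}x")
```

So roughly one and a half stops separate these two apertures, which can decide whether a low-light application is feasible with a given sensor.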
No matter your starting point, Sunex’s Optical Wizards will help you in the selection process:

- Search imager database: We have built a database of popular CMOS imagers from major suppliers. You can search for an imager based on the manufacturer’s name, PN, and imager resolution. Once an imager is identified, you can then go on to search for a list of compatible lenses.
- Find a lens by imager specification: Given the imager resolution and pixel pitch, this tool will compute a list of key imager characteristics and search our database for all matching lenses.
- Field of view and EFL calculator: This tool will calculate the required lens effective focal length to achieve a desired field of view in degrees, or vice versa. It works for all lenses, including wide-angle and fisheye lenses with a significant amount of distortion.
- Depth of field calculator: This tool will calculate the depth of field and hyperfocal distance for a given lens focal length and f/#. It requires the user to enter the maximum blur size in µm. It works for both infinite- and finite-conjugate systems.
- Imaging optics solver: For a given object and image size requirement, this tool calculates the required focal length of the lens based on first-order optics. It then recommends a suitable lens structure and focal length based on the object field size. It is a good starting point for solving finite-conjugate problems.
- Search lens by optical parameters: This is a collection of advanced search tools.

## How do I use Sunex's free M12 lens selection wizard?

The following webinar explains the different functionality and use cases of Sunex’s Optical Wizards:

Every wizard will lead you to a list of our best lens options for your requirements, sorted by field of view (descending). If you aren’t looking for a fisheye lens, just keep moving along to the next page of narrower-angle lenses using the page navigation arrows provided!
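The depth-of-field calculation behind such a wizard can be approximated with the standard thin-lens formulas. This is a minimal sketch under that assumption, not the wizard's internals; the focal length, f/#, circle of confusion, and subject distance are example inputs.

```python
import math

# Sketch of a standard thin-lens depth-of-field calculation.
# f_mm, f_num, coc_mm (circle of confusion / max blur), and subject_mm
# are illustrative example inputs, not wizard internals.

def hyperfocal_mm(f_mm: float, f_num: float, coc_mm: float) -> float:
    return f_mm ** 2 / (f_num * coc_mm) + f_mm

def dof_limits_mm(f_mm: float, f_num: float, coc_mm: float, subject_mm: float):
    H = hyperfocal_mm(f_mm, f_num, coc_mm)
    near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
    # Beyond the hyperfocal distance, the far limit extends to infinity.
    far = (subject_mm * (H - f_mm) / (H - subject_mm)
           if subject_mm < H else math.inf)
    return near, far

near, far = dof_limits_mm(f_mm=8.0, f_num=2.8, coc_mm=0.005, subject_mm=2000)
print(f"near {near:.0f} mm, far {far:.0f} mm")
```

Focusing at the hyperfocal distance itself maximizes the in-focus range, which is why the wizard reports that distance alongside the near/far limits.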
From this screen, you can also *Order Samples* (and check stock availability), calculate the *Depth of Field*, and *Request a Volume Quote*. Once you have selected the proper optics, the linked PDF will give you access to a datasheet and dimensional drawings.

## When should I use a standard M12 lens vs. a custom-designed lens?

Sunex provides lots of options that should be close to any given need. If, however, you cannot find a part that matches your exact requirements, please feel free to contact us to learn about new designs: Contact Sunex

At Sunex, we are always interested in learning about new applications and project requirements. If we can’t find something that works in our current portfolio, we should discuss custom lens design options. Our Imaging System Builder is a great way to start that process.

---

## Custom Lens Development Process

- Source: https://sunex.com/2019/07/31/custom-lens-development-process/
- Summary: If an off-the-shelf solution can't meet your critical needs we can work with you to create a customized solution optimized for your application.

If an off-the-shelf solution can’t meet your critical needs, we can work with you to create a customized solution optimized for your application. We have an extensive library of mature designs and lenses in high-volume mass production that we can fine-tune to meet your unique requirements in terms of performance, size, cost, schedule, etc. We employ state-of-the-art computer-aided design tools and have a long list of optical design successes in applications such as:

- Miniature fisheye lens (200° FOV)
- Athermalized lenses (focus stable across a wide temperature range)
- Fisheye lens with tailored distortion to enhance edge resolution
- Lenses optimized for High Dynamic Range (Day/Night) applications
- 4K lenses
- 360° panoramic optics with multiple lenses
- Automotive rear-view, surround-view, ADAS, DMS/OMS, eMirror, and others
- 22x zoom lens for security cameras
- Barcode and passport reader lenses
- Endoscope lenses with 140° field of view
- Low-profile lens for mobile imaging applications
- Compact video-conferencing lenses

Sunex is a pioneer in developing innovative optics for digital imaging applications. Over the years, we have developed an extensive library of proprietary lens designs that optimize the performance of digital imaging systems. To deliver the best value to our customers, we strive to achieve the best overall balance between performance, size, cost, and manufacturability. We will work with our clients to explore all leading-edge optical technologies to create designs that meet or exceed given performance and/or cost expectations. Sunex has industry-leading experience and deep competency in key technologies; examples of our innovative designs include:

- Miniature Fisheye Lenses
- NoGhost™ Lenses
- Tailored Distortion® for ultra wide-angle and fisheye lenses
- Boresight Stabilization
- Day/Night Lenses
- Dewarping Algorithms for Fisheye Lenses
- Auto-Focus (AF) technologies in collaboration with partners

Our unique background in optical manufacturing ensures that the selected design has excellent manufacturability and can be produced within your target price in our facility in China. Our custom lens designs follow an established development process in close collaboration with our clients.

---

## M12 (S-mount) lens selection guide

- Source: https://sunex.com/2019/07/25/m12-lens-selection-guide/
- Summary: Once the imager is chosen, the process for selecting an M12 lens (also called S mount lens) consists of the following steps...
Selecting an M12 lens follows a defined five-step process: determine your required FOV, calculate the effective focal length (EFL), select the appropriate F/# for your lighting conditions, specify optical performance requirements (MTF, distortion, relative illumination), and define mechanical and reliability constraints. Sunex’s free online Optical Wizards automate the first two steps in seconds. This guide provides a structured framework for M12 lens selection — whether you are starting from a known sensor or a required field of view — covering both standard off-the-shelf options and the path to a custom OEM design.

## What are the steps for selecting an M12 (S-mount) lens?

Once the imager is chosen, the process for selecting an M12 lens (also called S-mount lens) does not differ from that of selecting other CMOS lenses and consists of the following steps:

- Determine the desired field of view (in angles if the object is at infinity, and in actual sizes if the object is at a finite distance).
- Calculate the required focal length of the lens and the image circle size. We have created a wizard to perform this calculation.
- Choose an appropriate lens f/# based on the lighting environment and depth-of-field requirement. We have created a wizard to calculate the depth of field.
- Determine the appropriate lens performance requirements, such as modulation transfer function (MTF), chromatic aberration, distortion, and relative illumination.
- Specify the mechanical size constraints and reliability requirements.

#### Imager format and resolution

The starting point is the format size, which is linked to the effective area of the imager. The format size definition comes from the pre-electronic imaging era; it does not directly represent the diagonal size of the effective area. Commonly seen imager formats and their actual physical sizes are listed below. The imager resolution is the number of effective pixels in the horizontal and vertical directions.
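Resolution and pixel pitch together fix the imager's physical geometry. A quick sketch of that arithmetic; the 1920×1080 resolution and 3.0 µm pitch are illustrative values, not a specific part.

```python
import math

# Sketch: compute imager active-area geometry from resolution and pixel
# pitch. The 1920x1080 @ 3.0 um values are illustrative, not a real part.

h_px, v_px = 1920, 1080
pitch_um = 3.0

width_mm = h_px * pitch_um / 1000.0
height_mm = v_px * pitch_um / 1000.0
diag_mm = math.hypot(width_mm, height_mm)   # the size the lens image circle must cover

print(f"{width_mm:.2f} x {height_mm:.2f} mm, diagonal {diag_mm:.2f} mm")
```

This example works out to a diagonal of about 6.6 mm, i.e. roughly a 1/2.7″ format; the diagonal is the figure to compare against a lens's image circle.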
The total number of pixels is often used to represent the nominal resolution of an imager.

| Imager format | Approx. horizontal size (mm) | Approx. vertical size (mm) | Approx. diagonal size (mm) |
|---|---|---|---|
| 35mm full frame | 36 | 24 | 43.3 |
| APS-C | 23.6 | 15.6 | 28.3 |
| 1.5″ | 18.7 | 14.0 | 23.4 |
| Micro 4/3 | 17.3 | 13 | 21.7 |
| 1″ | 12.8 | 9.6 | 16.0 |
| 1/1.2″ | 10.67 | 8 | 13.4 |
| 2/3″ | 8.8 | 6.6 | 11.0 |
| 1/1.7″ | 7.6 | 5.7 | 9.5 |
| 1/2″ | 6.4 | 4.8 | 8.0 |
| 1/2.3″ | 6.17 | 4.55 | 7.7 |
| 1/2.5″ | 5.7 | 4.32 | 7.2 |
| 1/2.7″ | 5.3 | 4 | 6.6 |
| 1/3″ | 4.8 | 3.6 | 6.0 |
| 1/3.2″ | 4.54 | 3.42 | 5.7 |
| 1/4″ | 3.6 | 2.7 | 4.5 |
| 1/5″ | 2.56 | 1.92 | 3.2 |
| 1/6″ | 2.16 | 1.62 | 2.7 |

#### Lens image circle vs. imager size

The maximum image circle of a lens is the area over which the lens will provide acceptable performance. For standard applications, only lenses with an image circle greater than the imager diagonal size should be selected (see the graphic below). If the image circle is smaller than the imager diagonal, black or darker corners will result. However, for ultra wide-angle systems, it is common to have the fisheye lens image circle smaller than the diagonal of the imager. If the entire image circle is contained within the effective area of the imager, a circular image is formed. If the image circle is less than the horizontal width of the imager but greater than the vertical height, a horizontal frame is formed.

## How do I calculate the effective focal length (EFL) for my required FOV?

#### Effective focal length and field of view

Once the lens image circle is determined, the next step is to determine the appropriate lens effective focal length (EFL) required to achieve the desired field of view. The lens EFL is an intrinsic property of the lens, independent of the imager used. The maximum lens field of view (FOV) is specified at the image circle size. However, the field of view of a CCD/CMOS camera depends on both the lens EFL and the size of the imager area.
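The image-circle rule above can be sketched numerically. This is a minimal illustration — the helper names are ours, not a Sunex tool:

```python
import math

def sensor_diagonal_mm(width_mm: float, height_mm: float) -> float:
    """Diagonal of the imager's effective area from its width and height."""
    return math.hypot(width_mm, height_mm)

def image_circle_ok(image_circle_mm: float, width_mm: float, height_mm: float) -> bool:
    """Standard rule: the lens image circle must cover the sensor diagonal.
    (Fisheye / circular-image setups deliberately violate this.)"""
    return image_circle_mm >= sensor_diagonal_mm(width_mm, height_mm)

# A 1/2.7" format sensor is roughly 5.3 mm x 4.0 mm (diagonal ~6.6 mm):
print(round(sensor_diagonal_mm(5.3, 4.0), 1))  # 6.6
print(image_circle_ok(6.0, 5.3, 4.0))          # False -> dark corners
```

The same check works for any row of the format table above.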
If the lens distortion is small (such lenses are known as rectilinear lenses), the following formula can be used to calculate the camera FOV:

FOV = 2 · arctan(x / (2f))

where x represents the width, height, or diagonal size of the imager, and f is the lens EFL. We have created an online wizard to perform various FOV/EFL calculations. When there is a significant amount of distortion in the lens, such as in the case of very wide-angle lenses and fisheye lenses, the calculation of the FOV is much more involved. We have developed a new concept called “rectilinearity” to characterize the distortion properties of ultra-wide-angle and fisheye lenses. When used in conjunction with the effective focal length, the field of view and distortion property of a lens can be fully analyzed without having to know the detailed lens prescription.

#### Relative aperture or f/#

The f/# of the lens has two impacts: (1) the amount of light that the lens collects, and (2) the depth of field (DOF). For a low-light environment, it is often necessary to choose a lens with a low f/#. However, the depth of field of a low f/# lens is limited. Low f/# lenses are also more complex and thus more expensive to produce. Therefore, the optimal f/# selection is based on the tradeoffs between various performance parameters and lens cost. It is usually possible to increase the f/# (stopping down the aperture) of an existing lens design without a detrimental impact on the image quality. However, lowering the f/# (increasing the aperture size) is usually not possible without causing a significant compromise in the image quality/relative illumination.

#### Nyquist frequency and image quality

In a digital imaging system, the pixel array of the imager samples the continuous spatial image formed by the optical system. The Nyquist Frequency (NF) represents the highest spatial frequency that the imager is capable of detecting.
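The rectilinear FOV formula given earlier, and its inverse for finding the required EFL, can be sketched as follows. These are illustrative helpers valid only for low-distortion lenses, not the Sunex wizard itself:

```python
import math

def fov_deg(x_mm: float, efl_mm_value: float) -> float:
    """Camera FOV in degrees across sensor dimension x, rectilinear model:
    FOV = 2 * arctan(x / (2f))."""
    return math.degrees(2 * math.atan(x_mm / (2 * efl_mm_value)))

def efl_mm(x_mm: float, fov_degrees: float) -> float:
    """Inverse: EFL needed to achieve a target FOV across dimension x."""
    return x_mm / (2 * math.tan(math.radians(fov_degrees) / 2))

# 1/3" sensor (4.8 mm horizontal) behind a 4 mm EFL lens:
print(round(fov_deg(4.8, 4.0), 1))   # ~61.9 deg horizontal
print(round(efl_mm(4.8, 61.9), 2))   # ~4.0 mm back again
```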
The NF depends on the pixel pitch, color filter array (CFA) design, and the processing algorithms of the entire image processing chain. Lens image quality can be the gating factor in the overall image quality of a digital imaging system. To realize the full resolution of the imager, the lens resolution should be greater than the NF: the lens should provide sufficient spatial detail to the imager if each pixel is to be fully utilized. Lens image quality is characterized by its modulation transfer function (MTF). The MTF of a lens varies with spatial frequency as well as angle of incidence. A good lens should have MTF >40% up to the sensor Nyquist frequency. It should also provide a consistent MTF across the entire field of view of the lens.

#### Relative illumination and telecentricity

The light collection ability of all lenses falls off with an increasing field of view. Relative illumination of a lens is defined as the ratio of light intensity at the maximum angle of view to that on-axis. For electronic image sensors (CCD and CMOS), the off-axis brightness is further reduced by the collection efficiency of the imager pixel structure. Many modern imagers use a micro-lens over each pixel to increase the fill factor. The micro-lens will limit the field of view of the pixel. To be maximally compatible with the micro-lens field of view, the rays emerging from the lens must be within the acceptance angle of the micro-lens for all off-axis rays. This typically requires that the primary lens be telecentric in image space. Non-telecentric lenses can also cause color and resolution cross-talk between adjacent pixels, which further impairs the off-axis performance of the imaging system. Download a white paper on chief ray angle.

#### Chromatic aberrations

Optical materials have different indices of refraction at different wavelengths, a property known as dispersion.
The material dispersion causes light at different wavelengths to focus at different focal planes (axial color) and at different image heights (lateral color). Lateral color can be seen as color fringes at high-contrast edges of off-axis objects. Chromatic aberrations can be minimized or eliminated by using a combination of lens elements with different dispersion properties. Download a white paper on lateral color.

#### Distortion

Lens optical distortion describes how the image is deformed with respect to the object. Distortion (%) is defined as follows:

Distortion (%) = 100 × (y_chief − y_ref) / y_ref

where y_chief is the image height for an off-axis chief ray, and y_ref is a reference image height for the off-axis field angle θ. For normal field of view lenses, the reference image height is defined as

y_ref = f · tan(θ)

where f is the effective focal length, and θ is the field angle. The resulting distortion is known as “rectilinear” or “f-tan” distortion. Most standard photographic lenses have low rectilinear distortion. For wide-angle and fisheye lenses, the reference image height is typically chosen as the product of focal length and field angle (in radians):

y_ref = f · θ

The resulting distortion is known as “f-theta” distortion. Please note that a zero f-theta distortion lens can still look very “distorted” visually. It is possible to “tailor” distortion in such a way that the off-axis resolution is enhanced relative to the standard “f-theta” type. Sunex has developed unique designs and manufacturing know-how to provide wide-angle lenses with tailored distortion.

Visual impact of various lens distortions (value is calculated for the corners)

#### Depth of field or focus

The depth of field (DOF) of a lens is determined by several factors: the relative aperture or f/#, the lens EFL, the maximum acceptable blur, and the lens MTF. Generally speaking, higher f/# lenses will have more DOF. Shorter EFL lenses will also have more DOF. We provide a wizard to calculate the depth of field for a given lens.
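As a rough sketch of the DOF factors just listed, assuming the standard thin-lens / circle-of-confusion model (a simplification — the wizard’s actual method may differ, and the blur-circle value is application dependent):

```python
import math

def hyperfocal_mm(efl: float, fnum: float, coc: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f.
    Focusing at H gives acceptable sharpness from H/2 to infinity."""
    return efl ** 2 / (fnum * coc) + efl

def dof_limits_mm(efl: float, fnum: float, coc: float, focus: float):
    """Near and far limits of acceptable sharpness (thin-lens approximation)."""
    h = hyperfocal_mm(efl, fnum, coc)
    near = focus * (h - efl) / (h + focus - 2 * efl)
    far = math.inf if focus >= h else focus * (h - efl) / (h - focus)
    return near, far

# 4 mm lens at f/2.8 with a 0.01 mm blur circle, focused at 1 m:
near, far = dof_limits_mm(4.0, 2.8, 0.01, 1000.0)
```

With these numbers the focus distance exceeds the hyperfocal distance, so everything from roughly 0.36 m to infinity is acceptably sharp — consistent with the rule of thumb that short-EFL, high-f/# lenses have generous DOF.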
If the MTF of the lens is higher, the perceived DOF will also be greater. Because the maximum allowed blur size is somewhat subjective and application dependent, it is strongly recommended that experimental verification of the DOF be performed.

#### Flare, scattering, and ghost images

Flare is caused by improper engineering of the lens’s internal structure such that light rays outside the field of view are “leaked” into the normal field of view. Scattering is caused by the surface roughness of the lens elements, which causes an overall reduction in the contrast of the image. Ghost images are formed when light rays bounce multiple times inside the lens/sensor structure, causing additional “weak” images to be formed near the primary image. These are all optical “noises” that can degrade the overall image quality. Careful consideration must be given in the design and manufacturing processes to minimize these undesired optical noises.

#### IR cut-off filter

IR cut-off filtering in the optical chain is required to form accurate color images. It can be accomplished by inserting an IR cut-off filter in the lens system. Another option is to apply the IR cut-off coating onto the lens elements directly.

#### Optical low-pass filter (OLPF)

The image formed by a lens is continuous in space. This image is “sampled” by a CCD/CMOS sensor with a sampling frequency equal to the inverse of the pixel pitch. If the image contains objects at spatial frequencies higher than half the sampling frequency (the Nyquist frequency), the resulting image will have aliasing artifacts. This phenomenon is often observed as colorful fringes (Moiré fringes) in the final images. In high-quality imaging systems, optical low-pass filters (OLPF) can be used to eliminate the Moiré fringes. An OLPF cuts off the lens MTF above the Nyquist frequency of the imager, resulting in an overall MTF that approximates a step function in the spatial-frequency domain.
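The aliasing behavior described above can be sketched: an ideal sampler folds detail above the Nyquist limit back to a lower, false frequency. This is illustrative only — real CFA sampling and OLPF responses are more complex:

```python
def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Nyquist frequency in line pairs per mm for a given pixel pitch."""
    return 1000.0 / (2.0 * pixel_pitch_um)

def folded_freq(f: float, f_sampling: float) -> float:
    """Frequency an ideal point sampler reports for an input frequency f
    (frequencies above Nyquist alias, i.e. fold back below it)."""
    f = f % f_sampling
    return min(f, f_sampling - f)

# 3 um pixels: Nyquist ~166.7 lp/mm, sampling ~333.3 cycles/mm.
print(round(nyquist_lp_per_mm(3.0), 1))            # 166.7
print(round(folded_freq(250.0, 1000.0 / 3.0), 1))  # 83.3 -> Moire fringes
```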
Download an application note on OLPF. An OLPF is made of 1 to 3 layers of an optical birefringent material such as quartz. Each birefringent layer splits a light ray by polarization, as shown below:

#### Auto-focus (AF) lens

Auto-focus lenses “track” the object continuously so that the image is always in focus regardless of the object’s movement. This is done by adjusting the lens-to-imager distance (typically with a stepper motor) based on the measured real-time object distance.

#### Zoom lens

A zoom lens is a lens that has a variable effective focal length (EFL). Since the field of view of a lens is determined by its EFL, a zoom lens will have a variable field of view. When the field of view is decreased, a “zoom-in” effect is observed. When the field of view of the lens is increased, a “zoom-out” effect is observed. In the “zoom-in” position, the object detail is magnified, but a smaller area of the object is seen. In the “zoom-out” position, more of the object area is observed, but the detail of the object is compromised.

## When should I move from an off-the-shelf M12 lens to a custom OEM design?

View our webinar to learn more about the free tools Sunex provides to help pick the right lens for any project.

---

## Material Guide

- Source: https://sunex.com/2019/07/25/material-guide/
- Summary: Optical components are fabricated using a variety of materials including optical glasses, engineered plastics, and crystalline materials.

Optical components are fabricated using a variety of materials, including optical glasses, engineered plastics, and crystalline materials. Glass is the most common type because of its excellent optical properties, such as high light transmission and environmental stability. There are many types of glass materials suitable for making optical components. The specific selection depends on the optical performance requirements, environmental suitability, ease of fabrication, and cost.
The main characteristics of an optical material are represented by its refractive index (Nd) and its dispersion. Suitable plastic materials can also be used in fabricating low-precision optical components. There are only a handful of plastic materials, with a limited range of refractive index and Abbe values to choose from; the highest available index is about 1.6. The main benefit of plastic is that it can be shaped by various molding techniques, thus reducing the cost of manufacturing. However, the precision achievable with plastic is limited by the material properties and the molding process. Plastic optics are typically used as eyeglass lenses, magnifier lenses, educational optics, etc. Plastic optics can also be combined with glass optics to form hybrid optical systems. Download a white paper on “plastic vs. glass optics”. Special materials such as quartz, sapphire, and fused silica are also used to fabricate optical components. Optical quartz is a birefringent material with non-isotropic optical properties; it is used in optical low-pass filters and specialized fiber-optic devices. Sapphire is a special material with a very durable surface; it is often used to fabricate optical windows that protect internal optical components against harsh environments. Fused silica has excellent UV transmission and is very stable thermally; it is used for fabricating UV optics such as lenses and prisms. The following table lists the most commonly used materials along with their key properties. Definitions of the terms used in the table are as follows:

- Nd: Refractive index at the d-line wavelength (587nm).
- Abbe: (Nd-1)/(NF-NC), where the F-line is at 486nm and the C-line is at 656nm.
- TCE: Thermal coefficient of expansion in 10E-7/C.
- Density: Material density in g/cm3.
- Lmin: The material transmission band begins at this wavelength, in µm.
- Lmax: The material transmission band stops at this wavelength, in µm.
- MILcode: A six-digit code developed by the US military to represent a glass type. The first three digits are the fractional part of Nd rounded to three digits; the other three digits are the Abbe number rounded to three significant digits with the decimal point dropped. For example, BK7 (Nd = 1.5168, Abbe = 64.17) has MIL code 517642.

| Name (click for more info) | Description | Nd | Abbe | TCE | Density | Lmin | Lmax |
|---|---|---|---|---|---|---|---|
| FusedSilica | UV-IR optical material with low thermal expansion, expensive | 1.458464 | 67.82 | 0.54 | 2.2 | 0.21 | 3.71 |
| FK5 | Low index, low dispersion general purpose optical glass | 1.48749 | 70.41 | 10 | 2.45 | 0.334 | 2.5 |
| BK7 | General purpose optical glass, the most common type | 1.5168 | 64.17 | 7.1 | 2.51 | 0.334 | 2.325 |
| LLF6 | Low-index optical flint glass | 1.53172 | 48.76 | 8.5 | 2.81 | 0.334 | 2.325 |
| BAK1 | General purpose optical glass, higher index than BK7 | 1.5725 | 57.55 | 7.6 | 3.19 | 0.29 | 2.5 |
| F2 | General purpose optical flint glass | 1.62004 | 36.37 | 8.2 | 3.61 | 0.334 | 2.5 |
| SK16 | General purpose optical glass, higher index than BK7 | 1.62041 | 60.32 | 7.3 | 3.58 | 0.334 | 2.5 |
| SF2 | High index optical flint glass | 1.64769 | 33.85 | 8.4 | 3.86 | 0.334 | 2.325 |
| BAFN10 | General purpose optical glass | 1.67003 | 47.11 | 6.8 | 3.61 | 0.334 | 2.325 |
| SF5 | High index optical flint glass | 1.6727 | 32.21 | 8.2 | 4.07 | 0.334 | 2.325 |
| SF8 | High index optical flint glass | 1.68893 | 31.18 | 8.2 | 4.22 | 0.334 | 2.5 |
| SF1 | High index optical flint glass | 1.71736 | 29.51 | 8.8 | 4.46 | 0.334 | 2.5 |
| SF10 | High index optical flint glass | 1.72825 | 28.41 | 7.5 | 4.28 | 0.334 | 2.325 |
| LAF2 | High index, lower dispersion glass, expensive | 1.744 | 44.72 | 9 | 4.34 | 0.334 | 2.325 |
| SF4 | High index optical flint glass | 1.7552 | 27.58 | 8.9 | 4.79 | 0.334 | 2.5 |
| Sapphire | Durable material, excellent for optical windows | 1.768234 | 72.24 | 6.65 | 3.987 | 0.2 | 5.5 |
| ZNSE | IR transmitting glass, useful for IR optics | 2.62411 | 8.28 | 7.8 | 5.264 | 0.55 | 18 |

---

## Team

- Source: https://sunex.com/about/team/
- Summary: From our US headquarters, a team of experts is available to discuss your specific program needs. From off-the-shelf lens selection to custom lens design and camera module development, we are here for you from Day 1.

# US Technical Team

From our US headquarters, a team of experts is available to discuss your specific program needs. From off-the-shelf lens selection to custom lens design and camera module development, we are here for you from Day 1.

**Dr. Alex Ning**, Founder and CEO

He holds a Ph.D. in Physics from the University of Chicago and a B.S. in Physics from the University of Science and Technology of China, and is a registered member of SPIE — the international society for optics and photonics. Dr. Ning has authored over 100 US patents in miniature lens design spanning fisheye optics, wide-angle imaging systems, thermal stabilization, and ghost suppression — the foundational IP behind Sunex’s proprietary technologies, including Tailored Distortion, NoGhost, and SuperFisheye. He published his first technical whitepaper at the SPIE Conference in 1998 and has led Sunex’s optical innovation program for over 30 years.

**Ben Roberts**, VP of Sales & Marketing

He holds an Optics degree from the University of Rochester and brings over 20 years of optical industry experience — including building and growing Sunex’s Security & Surveillance business from the ground up. Ben currently leads Sunex’s partner programs and robotics vision initiatives, working with system integrators, robotics OEMs, and channel partners to match imaging requirements with the right optical solution. He is a regular presenter in the Sunex webinar and YouTube series, where he translates complex optical system tradeoffs into practical guidance for engineering and procurement teams.

**Dr. Jingbo Cai**, Director of Design

He holds a Ph.D.
in Laser and Optical Engineering from the University of Alabama in Huntsville and brings deep optical design expertise from prior roles spanning display optics, wearable imaging systems, and precision optical instrumentation. At Sunex, Jingbo leads the Global Optical Design Team and is directly involved in designing custom lens systems for automotive ADAS, µLED headlamp projection, single-use medical endoscopy, and industrial robotics applications. **Ingo Foldvari**, Director of Business Development With over 25 years of industry experience, he advises his customers across leadership, engineering, and procurement functions on optical system architectures, sourcing strategy, and highly manufacturable optical solutions at volume. His primary focus areas include automotive µLED headlamp projection optics, automotive ADAS and in-cabin monitoring, single-use medical endoscopy, and robotics embedded vision systems. Ingo speaks at industry events, including the DVN Workshop and AutoSens, and publishes industry articles and content for the Sunex Knowledge Center. **Alexander Gavrilovich**, Optical Solutions Engineer He holds a B.S. in Optical Engineering from Rose-Hulman Institute of Technology — one of the leading undergraduate optical engineering programs in the United States — and brings prior experience in optical systems from the Naval Surface Warfare Center (Crane Division) and Naval Air Systems Command (NAVAIR). At Sunex, Alexander supports engineering teams on lens specification, system integration, and optical performance evaluation across automotive, robotics, medical, and industrial vision applications. **Joely Caise**, Optical Sales and Applications Engineer She holds an M.S. in Applied Physics with a specialization in optics from the University of Oregon and a B.A. in Physics from Whitman College, and brings prior experience in semiconductor process engineering from Intel Corporation’s Micro Defect Inspection team. 
Joely bridges technical optical knowledge and practical customer outcomes — supporting engineering teams on lens specification, optical performance evaluation, and imaging system integration across automotive, medical, and industrial vision applications. **Daniel Rueda**, Technical Sales Support Engineer He holds a B.S. in Physics from the University of California, Riverside, and supports engineering teams on lens specification, optical performance evaluation, and system integration for warehouse automation, autonomous mobile robots, and industrial machine vision systems. Daniel uses Zemax optical simulation daily to evaluate lens performance against customer imaging requirements, ensuring technically accurate, application-specific guidance throughout the customer engagement process. --- ## Technology Hub - Source: https://sunex.com/technology-hub/ - Summary: Sunex has a wide range of COTS optical products across multiple applications and markets. # Technology Hub ## PRODUCTS Sunex has a wide range of COTS optical products across multiple applications and markets. ## CASE STUDIES A collection of case studies and customer success stories to help guide your decision path. ## ARTICLES Introduction to a wide range of topics written by our engineers for a technically interested audience. ## WHITE PAPERS In-depth technical papers written by our team of experts and based on decades of experience. --- ## Geo Imaging - Source: https://sunex.com/products/geoimaging/ - Summary: Large format sensors (1", APS, Full Frame, etc.) tend to have slightly larger pixels and seemingly go against the trend of pushing for smaller and smaller pixels. However, this is intentional since the goal of getting “more pixels on the same target area” doesn’t necessarily mean that the image quality is also improving. 
The larger pixels of large format sensors often have lower noise and better low-light performance and tend to align better with the boundaries of imaging physics and manufacturing tolerances for CMOS lenses. ## Large Format Lens Portfolio Large format sensors (1″, APS, Full Frame, etc.) tend to have slightly larger pixels and seemingly go against the trend of pushing for smaller and smaller pixels. However, this is intentional since the goal of getting “more pixels on the same target area” doesn’t necessarily mean that the image quality is also improving. The larger pixels of large format sensors often have lower noise and better low-light performance and tend to align better with the boundaries of imaging physics and manufacturing tolerances for CMOS lenses. High-resolution (up to 200MP) lenses provide superior image quality by delivering exceptional clarity and detail, even in high-speed, dynamic environments. These lenses stand out for their ability to produce vivid colors and minimize distortion, ensuring sharp, true-to-life images. Suitable image sensors contain a high pixel count, which enables precise capture of fine textures and intricate elements, enhancing the overall visual experience. Their advanced optical designs reduce aberrations and enhance contrast, providing consistent edge-to-edge performance across various lighting conditions. Sunex Large Format lenses have a profound impact on total system performance. Their high resolution delivers lifelike imaging, making them essential for cutting-edge sports coverage, dynamic live broadcasts, immersive content capture, cinematic filmmaking and photography, geospatial mapping, teleconferencing, security, and Robotics applications where imaging quality is paramount. These lenses set a new benchmark for high-end professional imaging by delivering unparalleled clarity and detail. 
| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… |

Don’t find what you are looking for? Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**High Dynamic Range (HDR)** — HDR sensors can capture light intensity variations of six or more orders of magnitude (~120 dB) within the same image frame. This puts a very demanding requirement on lens performance. Sunex has developed the design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications. All graphs are for illustration purposes only; individual lens performance can differ.

## Supporting Services

**Sensor Module Capabilities** — Depending on the needs and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

**Active Alignment Capabilities** — To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

---

## M12 Lens Catalog: Specifications, Samples & Prices

- Source: https://sunex.com/products/otsportfolio/
- Summary: We provide free Optical Wizards to calculate the right lens parameters based on your application requirements.
You can also read our extensive literature on M12 lenses.

# M12 (S-mount) and other lenses: browse, search, download specifications and order samples

## Select from a vast array of M12 lenses or other miniature lenses, offering a wide range of mechanical and optical configurations — test samples and build prototypes in a matter of days.

By utilizing Sunex’s streamlined online ordering system, customers can typically expect fulfillment within two business days from our US-based inventory. In the event that a product is out of stock, our team of optical engineers is available to collaborate with you on suitable alternatives. Every application has different requirements, and sometimes even our extensive off-the-shelf lens portfolio does not have a suitable option. In such cases, we provide customized or fully custom optical solutions, offering services that span from design and prototyping to mass production. We provide free Optical Wizards to calculate the right lens parameters based on your application requirements. You can also read our extensive literature on M12 lenses.

| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… |

---

## Geospatial

- Source: https://sunex.com/solutions/geospatial/
- Summary: Designing and manufacturing precision optics and camera modules for geospatial mapping — seeing is believing.

## Visible or near-infrared imaging helps 3D point clouds paint a clear picture.

## Total Stations

## Mobile Mapping

## LiDAR Optics

## Camera Modules

## A closer look at what it takes

Sunex Inc. provides advanced optical and imaging solutions for geospatial mapping and remote sensing applications, with decades of experience in precision lens and camera module design. Our portfolio includes high-resolution, wide-angle, and low-distortion lenses optimized for aerial, terrestrial, and mobile mapping platforms.
Sunex 2D vision solutions complement 3D point clouds, enabling accurate terrain modeling, volumetric analysis, and precise geospatial intelligence. Proprietary active alignment processes ensure perfect integration between optics and sensor modules, while DXM™ and SXM™ technologies allow modularity and flexible deployment. With ISO-certified global manufacturing, rigorous quality control, and full lifecycle support—from concept to series production—Sunex empowers geospatial OEMs to deliver reliable, high-resolution imaging systems that support mapping, surveying, and GIS applications. --- ## Medical Imaging - Source: https://sunex.com/products/medicalimaging/ - Summary: We design and manufacture high-resolution imaging solutions according to customer requirements for a wide range of medical devices, including disposable (single-use) endoscopes. These lenses are designed with bio-compatible materials and are assembled in a dedicated medical device cleanroom environment. ## Medical Imaging Portfolio We design and manufacture high-resolution imaging solutions according to customer requirements for a wide range of medical devices, including disposable (single-use) endoscopes. These lenses are designed with bio-compatible materials and are assembled in a dedicated medical device cleanroom environment. We provide high-resolution custom optics suitable for limited-use or disposable (single-use) medical devices. These lenses are designed with bio-compatible materials and are assembled in a dedicated medical device cleanroom environment. 
- **True HD and 4K optical performance** — Lens systems engineered to resolve what 4K sensors can actually capture, matched to pixel pitch and chief ray angle requirements.
- **Custom Camera Module Design** — Full custom design from optical concept to qualified module — adapting diameter, working distance, FOV, and spectral requirements.
- **Enabling Single-Use Economics** — Over 25 years of design and manufacturing experience deliver per-unit cost at disposable-viable levels without sacrificing optical precision.
- **AI Diagnostic Pipeline Ready** — Image chains optimized for AI tools with active alignment, consistent MTF, controlled distortion, and repeatable spectral response across every production lot.

Many of our miniature lenses designed for finite imaging have broad applications in the medical market. In addition, we have extensive experience in developing and manufacturing custom medical imaging optics and camera modules. Examples of our past successes are:

- A Disposable Laparoscope with 26 lens elements, including both glass and plastic elements.
- A Single-Use Colonoscope objective lens with a hybrid structure maximizing the advantages of each material technology.
- A high-quality dual-channel objective lens assembly for Stereo Vision Robotic Surgery.
- A high-quality aspherical Dental Scope objective.
- Multiple portable, point-of-care Diagnostic Devices.

## Endoscopes

Endoscope optics are at the core of who we are and how we came to be in the optics industry. The first device created by Sunex was a laparoscope, nearly thirty years ago. Since then, endoscope technology has advanced greatly, and our optics technology has progressed alongside it. A few of the recent endoscopy projects we have had the fortune of working on include both single-use and reusable colonoscopes, laparoscopes, and duodenoscopes. We believe specialty and single-use endoscopes are the future of the industry.
Traditionally, endoscopes were expensive and required meticulous sterilization after each use. However, the advent of single-use endoscopes has revolutionized the medical landscape, offering advantages for patients and providers alike. With streamlined service, increased patient safety, reduced costs, and reduced environmental impact, single-use is the future of endoscopy. While there are numerous advantages, designing and manufacturing these endoscopes comes with plenty of challenges as well. Sunex is proud to have brought multiple single-use endoscopes to market, navigating the balance between cost effectiveness and high performance. If a lens can combine a large FOV and low distortion in a small package, it is an ideal candidate for the future of endoscopy.

## Diagnostics Imaging

Imaging has been a cornerstone of diagnostics since the discovery of X-rays in the late 1800s. By enabling providers to non-invasively observe their patients, countless lives have been saved. At the heart of this field are premium optics. High-quality optics ensure superior image clarity and resolution. This is crucial, as it allows healthcare professionals to see fine details such as minute lesions, microcalcifications, or subtle changes in tissue composition. Increased accuracy in diagnostics can improve treatment plans as well, helping to precisely identify cancerous tissues that need to be removed so that healthy tissues remain unharmed. Sunex is proud to have had the opportunity to work on the development of a number of diagnostic devices, from portable X-ray machines to point-of-care disease detection devices. Across all applications, precision and consistency are at a premium, and these opportunities have led to the development of some of the most accurate, low-distortion lenses in our catalog.
By delivering athermalized, low-distortion, high-resolution lenses, Sunex has been grateful to play a role in the diagnosis and treatment of many medical maladies, and we look forward to continuing to innovate.

## Robotic Surgery

The future of robotic surgery is incredibly promising. As technology continues to advance, the application of robots in surgery is expected to increase greatly across nearly all disciplines. Some of the advancements primed to enable this increased adoption include enhanced vision capabilities and ever-shrinking devices that make surgery as minimally invasive as possible. With a large catalog of high-quality, miniature lenses, Sunex is excited to offer assistance to innovators creating the next generation of surgical robots. We have worked with leading companies to create ultra-small, high-resolution, wide-FOV lenses that currently reside in surgical systems. These range from the endoscopes that capture images to the immersive vision systems that relay these images to surgeons, and a range of visual applications in between. By giving providers an increased FOV and depth of field over other lenses, Sunex lenses ensure the entire area of operation can be viewed clearly. Can't find what you are looking for? Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**Narrow to Large FOVs** We are leveraging best-in-class design expertise and are constantly driving process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design To Cost (DTC), Design for Manufacturing (DFM), and Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind.
**FOVEA and Tailored Distortion Control** Expanding beyond our Optical Design Services, we can create an even further vertically integrated solution for you. From optomechanical design, PCB design, and auxiliary component integration to manufacturing tolerancing, End-of-Line (EOL) test requirements, and Quality Control (QC) procedures, we can design your camera module and prepare your optical system for prototyping and ramp into manufacturing.

**Finite Imaging with Large Depth of Field** Sunex’s renowned excellence in Optical Design is complemented by its extensive experience and leading position in the manufacturing of optical systems. If your solution benefits from early prototyping and samples to optimize performance, explore mechanical variants, or solicit early end-user input, we would be honored to be your partner of choice. Our state-of-the-art prototyping equipment and accredited in-house test facilities support in-depth prototype evaluation before transitioning further on the path to mass production.

## Supporting Services

**Camera Module Capabilities** We work with our customers and partner network to find the best balance between cost and performance to meet the often unique application requirements. Sunex offers a wide range of services, including designing and manufacturing the lens and the optomechanical components, the PCBA, and the right cabling solution. Sunex has the expertise and capabilities for high-volume manufacturing in state-of-the-art cleanroom facilities, automated 6-axis active alignment, and the test and quality control processes required to support the most demanding medical imaging applications.

**Fast Prototyping** In specific cases, prototyping is used as the first stepping stone toward mass production. Besides the lens performance, exploring various mechanical design solutions is often part of these early efforts.
Staying within the required dimensions while accommodating additional working channels and features is crucial for a successful product. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

**Designed for Mass Production** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production, meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has over two decades of design and manufacturing experience, and all our lenses and modules are designed for high-volume manufacturability.

**Reliability and Environmental Testing** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from servicing some of the most demanding markets and applications, and combined with consistent quality and global on-time delivery, Sunex is the preferred partner for many.

Many applications require that imaging lenses be optimized for a finite object, that is, for an object distance closer than “infinity.” There are significant drawbacks to utilizing high-CRA sensors for applications where a short z-height and compactness are not as important.

Interested in additional articles, white papers, and more? Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## Services

- Source: https://sunex.com/solutions/services/
- Summary: Our teams across sales, support, customer service, engineering, and manufacturing are working for you — literally around the clock.

## From the first contact to on-time delivery in mass production.
## Lens Design

## Camera Modules

## Fast Prototyping

## Mass Production

## Always design with mass production in mind.

With a 25+ year track record as a leading optics company and a US-based headquarters and Design Center, we have consistently demonstrated success in taking customer concepts from design through mass production across many industries and applications.

**Lens Design Services** We are leveraging best-in-class design expertise and are constantly driving process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design To Cost (DTC), Design for Manufacturing (DFM), and Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind.

**Camera Module Design Services** Expanding beyond our Optical Design Services, we can create an even further vertically integrated solution for you. From optomechanical design, PCB design, and auxiliary component integration to manufacturing tolerancing, End-of-Line (EOL) test requirements, and Quality Control (QC) procedures, we can design your camera module and prepare your optical system for prototyping and ramp into manufacturing.

**Fast Prototyping** Sunex’s renowned excellence in Optical Design is complemented by its extensive experience and leading position in the manufacturing of optical systems. If your solution benefits from early prototyping and samples to optimize performance, explore mechanical variants, or solicit early end-user input, we would be honored to be your partner of choice. Our state-of-the-art prototyping equipment and accredited in-house test facilities support in-depth prototype evaluation before transitioning further on the path to mass production.
**Designed for Mass Production** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from servicing some of the most demanding markets and applications, and combined with consistent quality and global on-time delivery, Sunex is the preferred partner for many.

---

## OEM manufacturer of M12 lens | Sunex Inc. — 25+ Years Design & Production

- Source: https://sunex.com/
- Summary: Once the imager is chosen, the process for selecting an M12 lens (also called S mount lens) does not differ from that of selecting other CMOS lenses and consists of the following steps:

Once the imager is chosen, the process for selecting an M12 lens (also called S mount lens) does not differ from that of selecting other CMOS lenses and consists of the following steps:

- Determine the desired field of view (in angles if the object is at infinity, and in actual sizes if the object is at a finite distance).
- Calculate the required focal length of the lens and the image circle size. We have created a wizard to perform this calculation.
- Choose an appropriate lens f/# based on the lighting environment and depth of field requirements. We have created a wizard to calculate the depth of field.
- Determine the appropriate lens performance requirements, such as modulation transfer function (MTF), chromatic aberration, distortion, and relative illumination.
- Specify the mechanical size constraints and reliability requirements.

#### Imager format and resolution

The starting point is the format size, which is linked to the effective area of the imager. The format size definition comes from the pre-electronic imaging era. It does not directly represent the diagonal size of the effective area. Commonly seen imager formats and their actual physical sizes are listed below.
The imager resolution is the number of effective pixels in the horizontal and vertical directions. The total number of pixels is often used to represent the nominal resolution of an imager.

| Imager Format | Approximate horizontal size (mm) | Approximate vertical size (mm) | Approximate diagonal size (mm) |
|---|---|---|---|
| 35mm full frame | 36 | 24 | 43.3 |
| APS-C | 23.6 | 15.6 | 28.3 |
| 1.5″ | 18.7 | 14.0 | 23.4 |
| Micro 4/3rd | 17.3 | 13 | 21.7 |
| 1″ | 12.8 | 9.6 | 16.0 |
| 1/1.2″ | 10.67 | 8 | 13.4 |
| 2/3″ | 8.8 | 6.6 | 12.0 |
| 1/1.7″ | 7.6 | 5.7 | 9.5 |
| 1/2″ | 6.4 | 4.8 | 8.0 |
| 1/2.3″ | 6.17 | 4.55 | 7.8 |
| 1/2.5″ | 5.7 | 4.32 | 7.2 |
| 1/2.7″ | 5.3 | 4 | 6.6 |
| 1/3″ | 4.8 | 3.6 | 6.0 |
| 1/3.2″ | 4.54 | 3.42 | 5.7 |
| 1/4″ | 3.6 | 2.7 | 4.5 |
| 1/5″ | 2.56 | 1.92 | 3.2 |
| 1/6″ | 2.16 | 1.62 | 2.7 |

#### Lens image circle vs. imager size

The max. image circle of a lens is the area over which the lens will provide acceptable performance. For standard applications, only lenses with an image circle greater than the imager diagonal size should be selected (see graphic below). If the image circle is smaller than the imager diagonal, black or darker corners will result. However, for ultra wide-angle systems, it is common to have the fisheye lens image circle smaller than the diagonal of the imager. If the entire image circle is contained within the effective area of the imager, a circular image is formed. If the image circle is less than the horizontal width of the imager but greater than the vertical height, a horizontal frame is formed.

#### Effective focal length and field of view

Once the lens image circle is determined, the next step is to determine the appropriate lens effective focal length (EFL) required to achieve the desired field of view. The lens EFL is an intrinsic property of the lens, independent of the imager used. The max. lens field of view (FOV) is specified for the image circle size. However, the field of view of a CCD/CMOS camera depends on both the lens EFL and the size of the imager area.
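As a concrete illustration of the EFL/FOV relationship, the standard rectilinear relation FOV = 2 · arctan(x / 2f) can be sketched in a few lines of Python. This is an illustrative sketch, not the sunex.com wizard itself; the example imager dimensions are taken from the format table above.

```python
import math

def rectilinear_fov_deg(imager_dim_mm: float, efl_mm: float) -> float:
    """Camera FOV in degrees for a low-distortion (rectilinear) lens.

    imager_dim_mm: width, height, or diagonal of the imager's effective area.
    efl_mm: effective focal length (EFL) of the lens.
    """
    return 2.0 * math.degrees(math.atan(imager_dim_mm / (2.0 * efl_mm)))

# Example: 1/3" imager (4.8 x 3.6 mm, 6.0 mm diagonal) with a 4 mm EFL lens.
h_fov = rectilinear_fov_deg(4.8, 4.0)  # horizontal FOV, ~61.9 deg
v_fov = rectilinear_fov_deg(3.6, 4.0)  # vertical FOV, ~48.5 deg
d_fov = rectilinear_fov_deg(6.0, 4.0)  # diagonal FOV, ~73.7 deg
```

Note that this relation does not hold for very wide-angle and fisheye lenses, where distortion makes the FOV calculation more involved.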
If the lens distortion is small (known as rectilinear lenses), the following formula can be used to calculate the camera FOV: FOV = 2 · arctan(x / (2f)), where x represents the width, height, or diagonal size of the imager, and f is the lens EFL. We have created an online wizard to perform various FOV/EFL calculations. When there is a significant amount of distortion in the lens, such as in the case of very wide-angle lenses and fisheye lenses, the calculation of the FOV is much more involved. We have developed a new concept called “rectilinearity” to characterize the distortion properties of ultra wide-angle and fisheye lenses. When used in conjunction with the effective focal length, the field of view and distortion properties of a lens can be fully analyzed without having to know the detailed lens prescription.

#### Relative aperture or f/#

The f/# of the lens has two impacts: (1) the amount of light that the lens collects, and (2) the depth of field (DOF). For low-light environments, it is often necessary to choose a lens with a low f/#. However, the depth of field of a low-f/# lens is limited. Low-f/# lenses are also more complex and thus more expensive to produce. Therefore, the optimal f/# selection is based on the tradeoffs between various performance parameters and lens cost. It is usually possible to increase the f/# (stopping down the aperture) of an existing lens design without a detrimental impact on the image quality. However, lowering the f/# (increasing the aperture size) is usually not possible without causing a significant compromise in the image quality/relative illumination.

#### Nyquist frequency and image quality

In a digital imaging system, the pixel array of the imager samples the continuous spatial image formed by the optical system. The Nyquist Frequency (NF) represents the highest spatial frequency that the imager is capable of detecting.
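To put numbers on the Nyquist limit, a minimal sketch follows directly from the pixel pitch (this ignores the CFA and processing effects that also influence the effective NF; the example pitch value is illustrative):

```python
def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Nyquist frequency in line pairs per mm for a given pixel pitch.

    Resolving one line pair requires at least two pixels,
    so NF = 1 / (2 * pitch).
    """
    pitch_mm = pixel_pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

# Example: a 2.2 um pixel samples up to ~227 lp/mm; a matching lens
# should hold usable MTF out to roughly that spatial frequency.
nf = nyquist_lp_per_mm(2.2)
```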
The NF depends on the pixel pitch, color filter array (CFA) design, and the processing algorithms of the entire image processing chain. Lens image quality can be the gating factor in the overall image quality of a digital imaging system. To realize the full resolution of the imager, the lens resolution should be greater than the NF. The lens should provide sufficient spatial detail to the imager sensor if each pixel of the imager is to be fully utilized. Lens image quality is characterized by its modulation transfer function (MTF). The MTF of a lens varies with spatial frequency as well as angle of incidence. A good lens should have MTF >40% up to the sensor Nyquist frequency. It should also provide a consistent MTF across the entire field of view of the lens.

#### Relative illumination and telecentricity

The light collection ability of all lenses falls off with an increasing field of view. Relative illumination of a lens is defined as the ratio of light intensity at the maximum angle of view to that on-axis. For electronic image sensors (CCD and CMOS), the off-axis brightness is further reduced by the collection efficiency of the imager pixel structure. Many modern imagers use a micro-lens over each pixel to increase the fill factor. The micro-lens will limit the field of view of the pixel. To be maximally compatible with the micro-lens field of view, the rays emerging from the lens must be within the acceptance angle of the micro-lens for all off-axis rays. This typically requires that the primary lens be telecentric in image space. Non-telecentric lenses can also cause color and resolution cross-talk between adjacent pixels. This will further impair the off-axis performance of the imaging system. Download a white paper on chief ray angle.

#### Chromatic aberrations

Optical materials have different indices of refraction at different wavelengths, known as dispersion.
The material dispersion causes light at different wavelengths to focus at different focal planes (axial color) and different image heights (lateral color). Lateral color can be seen as color fringes at high-contrast edges of off-axis objects. Chromatic aberrations can be minimized or eliminated by using a combination of lens elements with different dispersion properties. Download a whitepaper on lateral color.

#### Distortion

Lens optical distortion describes how the image is deformed with respect to the object. Distortion (%) is defined as follows: Distortion (%) = 100 × (*y_chief* − *y_ref*) / *y_ref*, where *y_chief* is the image height for an off-axis chief ray, and *y_ref* is a reference image height for the off-axis field angle. For normal field of view lenses, the reference image height is defined as *y_ref* = f · tan(θ), where f is the effective focal length and θ is the field angle. The resulting distortion is known as “rectilinear” or “f-tan” distortion. Most standard photographic lenses have low rectilinear distortion. For wide-angle and fisheye lenses, the reference image height is typically chosen as the product of focal length and field angle (in radians): *y_ref* = f · θ. The resulting distortion is known as “f-theta” distortion. Please note that a zero f-theta distortion lens can still look very “distorted” visually. It is possible to “tailor” distortion in such a way that the off-axis resolution is enhanced relative to the standard “f-theta” type. Sunex has developed unique designs and manufacturing know-how to provide wide-angle lenses with tailored distortion. We also provide a Photoshop-compatible plug-in to “de-warp” images taken with tailored distortion lenses. Visual impact of various lens distortions (values are calculated for the corners).

#### Depth of field or focus

The depth of field (DOF) of a lens is determined by several factors: the relative aperture or f/#, the lens EFL, the maximum acceptable blur, and the lens MTF. Generally speaking, higher-f/# lenses will have more DOF. Shorter-EFL lenses will also have more DOF.
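The DOF factors above can be tied together with the standard thin-lens hyperfocal approximation. This is an illustrative model rather than the Sunex DOF wizard, and the circle-of-confusion (maximum acceptable blur) value is an assumed, application-dependent input:

```python
def hyperfocal_mm(efl_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance (thin-lens approximation)."""
    return efl_mm ** 2 / (f_number * coc_mm) + efl_mm

def dof_limits_mm(efl_mm: float, f_number: float, coc_mm: float,
                  subject_mm: float) -> tuple[float, float]:
    """Near and far limits of acceptable focus for a subject distance."""
    h = hyperfocal_mm(efl_mm, f_number, coc_mm)
    near = subject_mm * (h - efl_mm) / (h + subject_mm - 2.0 * efl_mm)
    if subject_mm >= h:
        # Focused at or beyond the hyperfocal distance: far limit is infinite.
        return near, float("inf")
    far = subject_mm * (h - efl_mm) / (h - subject_mm)
    return near, far

# Example: 4 mm EFL at F/2.8 with a 0.005 mm blur circle, subject at 500 mm:
# acceptable focus extends from roughly 349 mm to 883 mm.
near, far = dof_limits_mm(4.0, 2.8, 0.005, 500.0)
```

Consistent with the text, raising the f/# or shortening the EFL widens the interval; as the text recommends, experimental verification should still be performed.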
We provide a wizard to calculate the depth of field for a given lens. If the MTF of the lens is higher, the perceived DOF will also be greater. Because the maximum allowed blur size is somewhat subjective and application dependent, it is strongly recommended that experimental verification of the DOF be performed.

#### Flare, scattering and ghost images

Flare is caused by improper engineering of the lens internal structure such that light rays outside the field of view are “leaked” into the normal field of view. Scattering is caused by the surface roughness of the lens elements, which causes an overall reduction in the contrast of the image. Ghost images are formed when light rays bounce multiple times inside the lens/sensor structure, causing additional “weak” images to be formed near the primary image. These are all optical “noises” which can degrade the overall image quality. Careful consideration must be taken in the design and manufacturing processes to minimize these undesired optical noises.

#### IR cut-off filter

IR cut-off filtering in the optical chain is required to form accurate color images. IR cut-off filtering can be accomplished by inserting an IR cut-off filter in the lens system. Another option is to apply the IR cut-off coating onto the lens elements directly.

#### Optical low-pass filter (OLPF)

The image formed by a lens is continuous in space. This image is “sampled” by a CCD/CMOS sensor at a sampling frequency equal to the inverse of the pixel pitch. If the image contains objects at spatial frequencies higher than the Nyquist frequency (half the sampling frequency), the resulting image will have aliasing artifacts. This phenomenon is often observed as colorful fringes (Moiré fringes) on the final images. In high-quality imaging systems, optical low-pass filters (OLPF) can be used to eliminate the Moiré fringes.
An OLPF cuts off the lens MTF above the Nyquist frequency of the imager, resulting in an overall MTF that approximates a step function (in the spatial frequency domain). Download an application note on OLPF. An OLPF is made of 1 to 3 layers of optically birefringent material such as quartz. Each birefringent layer splits a light ray by polarization as shown below:

---

## Compliance

- Source: https://sunex.com/support/compliance/
- Summary: Sunex does not support or use metals derived from armed conflicts or illegal mining, which are known as "conflict minerals."

## Statements

Sunex does not support or use metals derived from armed conflicts or illegal mining, which are known as “conflict minerals.” Sunex complies with national and international foreign trade regulations and adheres to all applicable export controls and trade sanction laws and regulations.

## Certificates

## IATF 16949

An international standard for automotive quality management systems.

## ISO 9001

An international standard for quality management systems (QMS) to improve processes.

## ISO 14001

An international standard for designing and implementing an environmental management system (EMS).

## ISO 45001

An international standard for occupational health and safety (OH&S) management systems.

Please contact us if you need a copy of the current version of the certificates.

---

## supportchat

- Source: https://sunex.com/support/supportchat/
- Summary: Most of our customers prefer direct and personal support through our engineering team. But if you already know what you need or want to take things at your own pace, we support you the way you want.

# We support you the way you want.

Most of our customers prefer direct and personal support through our engineering team. But if you already know what you need or want to take things at your own pace, we support you the way you want. Go directly to our online store and sort/select a lens by Part Number, Format, F/#, or Focal Length.
Our AI-powered Optical Consultant can help you sort through hundreds of technical documents and product descriptions to narrow down possible lens options.

---

## AI

- Source: https://sunex.com/support/ai/
- Summary: We want to bring you support 24/7 using the latest technologies. Please consider AI-based results as an initial suggestion and discuss your findings with one of our engineers.

## AI-powered tools right at your fingertips.

We want to bring you support 24/7 using the latest technologies. Please consider AI-based results as an initial suggestion and discuss your findings with one of our engineers.

## Use our free MCP

Enable Claude, ChatGPT, Gemini, and any MCP-compatible AI models to directly access our 350+ lens portfolio utilizing our powerful Optics-Wizards™ tools to find the best lens/CMOS imager solutions for automotive, robotics, medical, and industrial imaging applications.

## Coming soon

We are working on additional ways to support your workflow with our support tools. We will update this page once available.

---

## Sensor Modules

- Source: https://sunex.com/products/sensormodules/
- Summary: Depending on the requirements, we can provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, high-volume manufacturing, automated active alignment, and testing to support the most demanding vision applications. For additional services, we can bring in partners from our Technology and Service Network that allow us to process bare die and packaged sensors, including cabling options, e.g., for medical applications, and hand over the complete tested sensor module to the next entity in the value chain to integrate the full camera.
## Custom Camera Modules

Depending on the requirements, we can provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, high-volume manufacturing, automated active alignment, and testing to support the most demanding vision applications. For additional services, we can bring in partners from our Technology and Service Network that allow us to process bare die and packaged sensors, including cabling options, e.g., for medical applications, and hand over the complete tested sensor module to the next entity in the value chain to integrate the full camera.

## Always designed with mass production in mind.

Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production, meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has over two decades of design and manufacturing experience, and all our lenses and modules are designed for high-volume manufacturability.

“Technical and commercial requirements and mutual understanding and alignment of milestones, deliverables, and timelines are all imperative for taking initial prototypes successfully through to mass production, and we recommend contacting us as early as possible in the product development cycle.” *Ben Roberts, Sunex VP of Sales and Marketing*

Can't find what you are looking for? Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.
## Key Technologies

**System-level Athermalization** The shift of a lens’s focal point over a wide temperature range is a physical phenomenon based on the material-specific expansion and contraction with temperature. Image quality can degrade if the focal point of the lens shifts too much relative to the sensor’s image plane. A fully athermalized system requires selecting appropriate optical and mechanical materials, employing the right design strategy, and close collaboration with the customer to optimize thermal performance at the system level.

**Active Alignment** We recommend that our customers consider an active alignment process to achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor. Our active alignment offering can grow with the ramp of the program and enables the use of an alignment process from the beginning. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

**Fast Prototyping** In specific cases, prototyping is used as the first stepping stone toward mass production. Besides the lens performance, exploring various mechanical design solutions is often part of these early efforts. Staying within the required dimensions while accommodating additional working channels and features is crucial for a successful product. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

**Reliability and Environmental Testing** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR).
Our deep design and manufacturing experience comes from servicing some of the most demanding markets and applications, and combined with consistent quality and global on-time delivery, Sunex is the preferred partner for many.

## Supporting Services

**Partner Network** We work with our customers and partner network to find the best balance between cost and performance to meet the often unique application requirements. Sunex offers a wide range of services, including designing and manufacturing the lens and the optomechanical components, the PCBA, and the right cabling solution. Sunex has the expertise and capabilities for high-volume manufacturing in state-of-the-art cleanroom facilities, automated 6-axis active alignment, and the test and quality control processes required to support the most demanding medical imaging applications.

The relationship between image circle and sensor format is what determines the Field of View (FOV) of your system. Overlooking how they interact can lead to unexpected coverage gaps, resolution limits, and performance tradeoffs. As robotics and automation systems grow increasingly compact, intelligent, and power-efficient, the supporting vision technologies must evolve in parallel.

Interested in additional articles, white papers, and more? Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## Depth Sensing

- Source: https://sunex.com/products/depth_sensing/
- Summary: LiDAR and ToF (Time-of-Flight) refer to a measurement principle based on a signal leaving a source and a detector measuring the time it takes to receive the same signal back. The distance to any given object can be determined by factoring in the speed of the signal itself. Optical systems play a critical role when the signal is based on light, and the most common systems are referred to as LiDARs and ToF cameras.
ToF cameras illuminate a scene with a modulated signal, and the phase shift between the transmitted and received signals determines the depth ranging.

## Depth Sensing Lens Portfolio

LiDAR and ToF (Time-of-Flight) refer to a measurement principle based on a signal leaving a source and a detector measuring the time it takes to receive the same signal back. The distance to any given object can be determined by factoring in the speed of the signal itself. Optical systems play a critical role when the signal is based on light, and the most common systems are referred to as LiDARs and ToF cameras. ToF cameras illuminate a scene with a modulated signal, and the phase shift between the transmitted and received signals determines the depth ranging. Many industries and applications have leveraged these technologies for decades, from topography and meteorology to medical and industrial robotics applications. In recent years, ToF cameras have also entered high-volume consumer and automotive markets. Many established and new companies are pushing the boundaries to reduce costs and advance performance for long- and short-range systems.

Additional design examples.
| Type | Format | EFL | FOV | F/# | TTL | Features |
|---|---|---|---|---|---|---|
| ToF Lens | 1/3" | 4.5mm | 68° | F/1.5 | 27mm | Hybrid Design, various Filter options |
| ToF Lens | 1/2.8" | 3.3mm | 123° | F/1.4 | 28mm | Wide FOV, various Filter options |
| ToF Lens | 1/3" | 6.1mm | 59° | F/1.6 | 12mm | Short TTL, various Filter options |
| ToF Lens | 1/3" | 4.6mm | 76° | F/1.7 | 21mm | Short TTL, various Filter options |
| Receiver | 1.5" | 41mm | 35° | F/1.4 | 57mm | Long range, narrow FOV, low straylight |
| Receiver | 1" | 8mm | ≥120° | F/1.3 | 50mm | Hybrid Design, Short range, wide FOV |
| Receiver | 2/3" | 25mm | 25° | F/1.0 | 52mm | Long range, narrow FOV |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**Miniaturized SuperFisheye™** With the recent advancements expanding LiDAR technology to non-spinning short-range or near-field LiDARs, the requirements shifted to an increase in horizontal (HFOV) and vertical (VFOV) fields of view, a smaller F/#, and a decrease in the overall form factor. Sunex pioneered and coined the term Miniaturized SuperFisheye™ lenses in the automotive industry. We are now applying the same design concepts and experience to support our customers in advancing their LiDAR product range. All graphs are for illustration purposes only. The individual lens performance can be different.

## Supporting Services

**Designed for Mass Production** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production, meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has decades of design and manufacturing expertise, and all of our ToF and LiDAR lenses are designed for high-volume manufacturability.
**Automotive Qualified** With almost two decades as a qualified automotive supplier to our global customer base, we know what is required to design and manufacture a lens that has stable performance over a wide temperature range and passes automotive reliability and environmental testing. Whether we improve existing work through Design for Manufacturing (DFM) and Design to Cost (DTC) cycles or start with a blank-sheet design to meet all requirements, the end goal is always to deliver on time with consistent quality.

Interested in additional articles, white papers, and more? Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## Products

- Source: https://sunex.com/products/
- Summary: With over 100 million units shipped, our factory-direct M12 lens expertise offers an extensive off-the-shelf portfolio, modifications, rapid prototyping, and full custom designs for lenses and actively aligned camera modules.

# M12 Lens (S-mount Lens) and Products

With over 100 million units shipped, our factory-direct M12 lens expertise offers an extensive off-the-shelf portfolio, modifications, rapid prototyping, and full custom designs for lenses and actively aligned camera modules.

## Search by Imager Resolution

## Search by Sunex Part Number

## Don't know where to start?

### M12 (S-mount) lens selection guide

Once the imager is chosen, the process for selecting an M12 lens (also called S mount lens) does not differ from that of selecting other CMOS lenses and consists of the following steps:

- Determine the desired field of view (in angles if the object is at infinity, and in actual sizes if the object is at a finite distance).
- Calculate the required focal length of the lens and the image circle size. We have created a wizard to perform this calculation.
- Choose an appropriate lens f/# based on the lighting environment and depth of field requirements. We have created a wizard to calculate the depth of field.
- Determine the appropriate lens performance requirements such as modulation transfer function (MTF), chromatic aberration, distortion, and relative illumination.
- Specify the mechanical size constraints and reliability requirements.

#### Imager format and resolution

The starting point is the format size, which is linked to the effective area of the imager. The format size definition comes from the pre-electronic imaging era and does not directly represent the diagonal size of the effective area. Commonly seen imager formats and their actual physical sizes are listed below. The imager resolution is the number of effective pixels in the horizontal and vertical directions. The total number of pixels is often used to represent the nominal resolution of an imager.

| Imager Format | Approx. horizontal size (mm) | Approx. vertical size (mm) | Approx. diagonal size (mm) |
|---|---|---|---|
| 35mm full frame | 36 | 24 | 43.3 |
| APS-C | 23.6 | 15.6 | 28.3 |
| 1.5″ | 18.7 | 14.0 | 23.4 |
| Micro 4/3rd | 17.3 | 13 | 21.7 |
| 1″ | 12.8 | 9.6 | 16.0 |
| 1/1.2″ | 10.67 | 8 | 13.4 |
| 2/3″ | 8.8 | 6.6 | 11.0 |
| 1/1.7″ | 7.6 | 5.7 | 9.5 |
| 1/2″ | 6.4 | 4.8 | 8.0 |
| 1/2.3″ | 6.17 | 4.55 | 7.7 |
| 1/2.5″ | 5.7 | 4.32 | 7.2 |
| 1/2.7″ | 5.3 | 4 | 6.6 |
| 1/3″ | 4.8 | 3.6 | 6.0 |
| 1/3.2″ | 4.54 | 3.42 | 5.7 |
| 1/4″ | 3.6 | 2.7 | 4.5 |
| 1/5″ | 2.56 | 1.92 | 3.2 |
| 1/6″ | 2.16 | 1.62 | 2.7 |

#### Lens image circle vs. imager size

The max. image circle of a lens is the area over which the lens will provide acceptable performance. For standard applications, only lenses with an image circle greater than the imager diagonal size should be selected (see graphic below); if the image circle is smaller than the imager diagonal, black or darker corners will result. However, for ultra wide-angle systems it is common to have the fisheye lens image circle smaller than the diagonal of the imager. If the entire image circle is contained within the effective area of the imager, a circular image is formed. If the image circle is less than the horizontal width of the imager but greater than the vertical height, a horizontal frame is formed.
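The image-circle rules above reduce to a simple comparison against the imager's dimensions. The following is a minimal sketch (the function name is illustrative, not part of the Sunex wizards), assuming a landscape-oriented imager (width ≥ height):

```python
import math

def coverage(image_circle_mm: float, width_mm: float, height_mm: float) -> str:
    """Classify how a lens image circle covers an imager's effective area.

    Assumes landscape orientation (width_mm >= height_mm).
    """
    diagonal = math.hypot(width_mm, height_mm)
    if image_circle_mm >= diagonal:
        return "full coverage"      # standard case: no dark corners
    if image_circle_mm <= height_mm:
        return "circular image"     # whole circle fits inside the active area
    if image_circle_mm < width_mm:
        return "horizontal frame"   # exceeds the height but not the width
    return "dark corners"           # covers the sides but misses the corners

# Example: a 1/3" imager (4.8 x 3.6 mm active area, 6.0 mm diagonal)
print(coverage(6.1, 4.8, 3.6))  # full coverage
print(coverage(3.0, 4.8, 3.6))  # circular image
print(coverage(5.0, 4.8, 3.6))  # dark corners
```

The 6.0 mm diagonal in the example matches the 1/3″ row of the table above; any other format can be checked by substituting its width and height.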
#### Effective focal length and field of view

Once the lens image circle is determined, the next step is to determine the lens focal length (EFL) required to achieve the desired field of view. The lens EFL is an intrinsic property of the lens, independent of the imager used. The max. lens field of view (FOV) is specified for the image circle size. However, the field of view of a CCD/CMOS camera depends on both the lens EFL and the size of the imager area. If the lens distortion is small (such lenses are known as rectilinear lenses), the following formula can be used to calculate the camera FOV:

FOV = 2 · arctan( x / (2f) )

where x represents the width, height, or diagonal size of the imager, and f is the lens EFL. We have created an online wizard to perform various FOV/EFL calculations. When there is a significant amount of distortion in the lens, as in the case of very wide-angle lenses and fisheye lenses, the calculation of the FOV is much more involved. We have developed a new concept called "rectilinearity" to characterize the distortion properties of ultra wide-angle and fisheye lenses. When used in conjunction with the effective focal length, the field of view and distortion property of a lens can be fully analyzed without having to know the detailed lens prescription.

#### Relative aperture or f/#

The f/# of the lens has two impacts: (1) the amount of light that the lens collects, and (2) the depth of field (DOF). For low-light environments, it is often necessary to choose a lens with a low f/#. However, the depth of field of a low-f/# lens is limited. Low-f/# lenses are also more complex and thus more expensive to produce. Therefore, the optimal f/# selection is based on the tradeoffs between various performance parameters and lens cost. It is usually possible to increase the f/# (stopping down the aperture) of an existing lens design without a detrimental impact on the image quality.
However, lowering the f/# (increasing the aperture size) is usually not possible without causing a significant compromise in image quality and relative illumination.

#### Nyquist frequency and image quality

In a digital imaging system, the pixel array of the imager samples the continuous spatial image formed by the optical system. The Nyquist frequency (NF) represents the highest spatial frequency that the imager is capable of detecting; for a pixel pitch p, NF = 1/(2p) (e.g., a 2 µm pitch corresponds to 250 lp/mm). The usable resolution also depends on the color filter array (CFA) design and the processing algorithms of the entire image processing chain. Lens image quality can be the gating factor in the overall image quality of a digital imaging system. To realize the full resolution of the imager, the lens resolution should be greater than the NF: the lens must deliver sufficient spatial detail to the sensor if each pixel of the imager is to be fully utilized. Lens image quality is characterized by its modulation transfer function (MTF). The MTF of a lens varies with spatial frequency as well as angle of incidence. A good lens should have MTF > 40% up to the sensor Nyquist frequency, and it should provide a consistent MTF across the entire field of view of the lens.

#### Relative illumination and telecentricity

The light-collection ability of all lenses falls off with increasing field angle. Relative illumination of a lens is defined as the ratio of light intensity at the maximum angle of view to that on-axis. For electronic image sensors (CCD and CMOS), the off-axis brightness is further reduced by the collection efficiency of the imager pixel structure. Many modern imagers use a micro-lens over each pixel to increase the fill factor. The micro-lens limits the field of view of the pixel. To be maximally compatible with the micro-lens field of view, the rays emerging from the lens must be within the acceptance angle of the micro-lens for all off-axis rays.
This typically requires that the primary lens be telecentric in image space. Non-telecentric lenses can also cause color and resolution cross-talk between adjacent pixels, which further impairs the off-axis performance of the imaging system. Download a white paper on chief ray angle.

#### Chromatic aberrations

Optical materials have different indices of refraction at different wavelengths, a property known as dispersion. Material dispersion causes light at different wavelengths to focus at different focal planes (axial color) and different image heights (lateral color). Lateral color can be seen as color fringes at high-contrast edges of off-axis objects. Chromatic aberrations can be minimized or eliminated by using a combination of lens elements with different dispersion properties. Download a whitepaper on lateral color.

#### Distortion

Lens optical distortion describes how the image is deformed with respect to the object. Distortion (%) is defined as follows:

Distortion (%) = 100 · (y_chief − y_ref) / y_ref

where *y_chief* is the image height for an off-axis chief ray, and *y_ref* is a reference image height for the off-axis field angle. For normal field-of-view lenses, the reference image height is defined as:

y_ref = f · tan(θ)

where f is the effective focal length and θ is the field angle. The resulting distortion is known as "rectilinear" or "f-tan" distortion. Most standard photographic lenses have low rectilinear distortion. For wide-angle and fisheye lenses, the reference image height is typically chosen as the product of focal length and field angle (in radians):

y_ref = f · θ

The resulting distortion is known as "f-theta" distortion. Please note that a zero-f-theta-distortion lens can still look very "distorted" visually. It is possible to "tailor" distortion in such a way that the off-axis resolution is enhanced relative to the standard "f-theta" type. Sunex has developed unique designs and manufacturing know-how to provide wide-angle lenses with tailored distortion.
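The two reference-height conventions can be combined into one small helper. A minimal sketch (the function name and signature are illustrative, not part of any Sunex tool):

```python
import math

def distortion_pct(y_chief_mm: float, f_mm: float, theta_deg: float,
                   model: str = "f-tan") -> float:
    """Percent distortion of a measured chief-ray image height against an
    f-tan (rectilinear) or f-theta reference height."""
    theta = math.radians(theta_deg)
    y_ref = f_mm * math.tan(theta) if model == "f-tan" else f_mm * theta
    return 100.0 * (y_chief_mm - y_ref) / y_ref

# A perfect f-theta lens (y = f * theta) measured against the f-tan reference
# shows strong negative (barrel) distortion at wide field angles:
f, theta = 2.0, 60.0
y = f * math.radians(theta)                              # ~2.094 mm
print(round(distortion_pct(y, f, theta, "f-theta"), 2))  # 0.0
print(round(distortion_pct(y, f, theta, "f-tan"), 2))    # -39.54
```

This illustrates why a distortion figure is meaningless without stating the reference convention: the same lens reads as zero distortion in f-theta terms and roughly −40% in f-tan terms at a 60° field angle.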
We also provide a Photoshop-compatible plug-in to "de-warp" images taken with tailored-distortion lenses.

Visual impact of various lens distortions (values are calculated for the corners).

#### Depth of field or focus

The depth of field (DOF) of a lens is determined by several factors: the relative aperture or f/#, the lens EFL, the maximum acceptable blur, and the lens MTF. Generally speaking, higher-f/# lenses have more DOF, and shorter-EFL lenses also have more DOF. We provide a wizard to calculate the depth of field for a given lens. If the MTF of the lens is higher, the perceived DOF will also be greater. Because the maximum allowed blur size is somewhat subjective and application dependent, it is strongly recommended that experimental verification of the DOF be performed.

#### Flare, scattering and ghost images

Flare is caused by improper engineering of the lens internal structure, such that light rays outside the field of view are "leaked" into the normal field of view. Scattering is caused by surface roughness of the lens elements and produces an overall reduction in the contrast of the image. Ghost images are formed when light rays bounce multiple times inside the lens/sensor structure, forming additional "weak" images near the primary image. These are all forms of optical "noise" that degrade the overall image quality, and careful consideration must be given in the design and manufacturing processes to minimize them.

#### IR cut-off filter

IR cut-off filtering in the optical chain is required to form accurate color images. It can be accomplished by inserting an IR cut-off filter in the lens system; another option is to apply the IR cut-off coating onto the lens elements directly.

#### Optical low-pass filter (OLPF)

The image formed by a lens is continuous in space. This image is "sampled" by a CCD/CMOS sensor at a spatial sampling frequency equal to the inverse of the pixel pitch; the corresponding Nyquist limit is the inverse of 2× the pixel pitch.
If the image contains detail at spatial frequencies higher than the Nyquist limit of the imager, the resulting image will have aliasing artifacts. This phenomenon is often observed as colorful fringes (Moiré fringes) in the final images. In high-quality imaging systems, optical low-pass filters (OLPF) can be used to eliminate Moiré fringes. An OLPF cuts off the lens MTF above the Nyquist limit of the imager, resulting in an overall MTF that approximates a step function in the spatial-frequency domain. Download an application note on OLPF. An OLPF is made of 1 to 3 layers of an optically birefringent material such as quartz. Each birefringent layer splits a light ray by polarization as shown below:

### Pick the right M12 (S-mount) lens for your project

Talking to our clients, we noticed that selecting the right M12 lens (also called an S-mount lens) for a specific project, or simply narrowing the options to a range of applicable lenses, is not trivial. Sunex's Optical Wizards are free online tools (registration required) designed to assist you in selecting the proper M12 lens, or any other CMOS lens, for your application. If wizards, tools, and configurators are not your cup of coffee (or even if they are), we are always here to talk to you in person! Contact Sunex

There are different approaches to selecting the right lens, depending on whether you have already selected a specific imager or have hard requirements for FOV and EFL. No matter your starting point, Sunex's Optical Wizards will help you in the selection process:

- Search imager database: We have built a database of popular CMOS imagers from major suppliers. You can search for an imager based on the manufacturer's name, PN, or imager resolution. Once an imager is identified, you can then go on to search for a list of compatible lenses.
- Find a lens by imager specification: Given the imager resolution and pixel pitch, this tool will compute a list of key imager characteristics and search our database for all matching lenses.
- Field of view and EFL calculator: This tool will calculate the lens effective focal length required to achieve a desired field of view in degrees, or vice versa. It works for all lenses, including wide-angle and fisheye lenses with a significant amount of distortion.
- Depth of field calculator: This tool will calculate the depth of field and hyperfocal distance for a given lens focal length and f/#. It requires the user to enter the maximum blur size in µm. It works for both infinite- and finite-conjugate systems.
- Imaging optics solver: For a given object and image size requirement, this tool calculates the required focal length of the lens based on first-order optics. It then recommends a suitable lens structure and focal length based on the object field size. It is a good starting point for solving finite-conjugate problems.
- Search lenses by optical parameters: This is a collection of advanced search tools. From the results you can *Order Samples* (and check stock availability), calculate the *Depth of Field*, and *Request a Volume Quote*. Once you have selected the proper optics, the linked PDF gives you access to a datasheet and dimensional drawings.

### M12, S-Mount, C-Mount…What does it all mean?

If you've been searching for a small-camera lens, you've likely encountered terms like M12, S-Mount, Board-Mount, Miniature Lens, C-Mount, CS-Mount, and more. With so many names floating around, it's easy to get confused. So, why does it seem like there are so many options? And how do you know which one is right for your project? In an effort to answer these questions, we thought we'd explore what these terms mean, where they come from, and how they relate to your lens selection.
**In the beginning (at least in this story), there was C-mount.**

Going back decades, the standard for interchangeable industrial lenses (as opposed to most consumer photographic cameras) was the **C-mount**. The C-Mount lens, still a staple in industrial machine vision, some security camera circles, and university labs, was one of the first solutions to standardize the lens mount format. It paved the way even before the days of CCD and early CMOS cameras. The thread of a C-Mount is 1-32; more specifically, it's 1" in diameter with 32 TPI (threads per inch), or M25.4 x 0.794mm in metric terms. The C-mount has a standardized back focal length (BFL) of 0.69" (17.526mm), meaning that all these lenses were designed with the same flange back focal length (FBFL). In practice, the lens was screwed all the way down to the mount until tight, and then a focusing mechanism allowed fine focus depending on the object distance. This of course made for some long-TTL lenses, especially those with a long EFL. To help address the length issue, the **CS-Mount** lens format was introduced. While it uses the same thread and mounting strategy as a C-mount, it has a shorter fixed BFL of 0.4931" (12.526mm).

Although the C/CS-Mount makes for very straightforward interchangeability, there are several drawbacks to the format. First, the standardized FBFL is somewhat arbitrary in terms of optimizing optical performance and actually imposes a design constraint. Second, since the FBFL is fixed, the lens must have a secondary focus mechanism, and since the standard 32 TPI thread is not fine enough to focus within tens of microns of DOF (depth of focus), the lens must incorporate a relatively complex mechanical means of achieving fine focus. Third, the fixed FBFL alone means the TTL of the lens will be at least 12.5mm (more for C-Mount) before even considering the physical length of the lens.
Fourth, the C/CS mount is typically (but not always) integrated into the housing or chassis of the camera, with the sensor board mounted separately. This means there is no direct mechanical reference or interface between lens and sensor, which, as you'll know from our AA article (Sunex Knowledge Center: What Is Active Alignment?), can be a potential source of error. Lastly, but admittedly not exclusive to C/CS lenses, there is a tendency to add more features since the lenses are interchangeable. While these features may be ideal in applications where flexibility is needed, they are less desirable in fixed, high-volume circumstances. Often, these features also come at the cost of, well, cost, in addition to reliability, design, and performance tradeoffs. Despite these drawbacks, and the fact that C/CS-Mounts aren't technically Board-Mounts, they still have plenty of utility. It's important to recognize how these formats helped establish standards for lens mounting and continue to serve many applications today.

**Now, the "Board-Mount"**

"Board-Mount" or "S-Mount" lenses address the C/CS-Mount issues in a few ways. Board-mount lenses have no dependency on a fixed FBFL/BFL and no need for a separate focus mechanism. They are designed to thread into a threaded mount directly attached to the sensor PCB. The thread doubles as the focusing mechanism because it typically has a 0.5mm or 0.35mm pitch, making it fine enough to focus a lens (see our article Sunex Knowledge Center: Basic Thread Considerations). This approach also eliminates many (but not all) sources of alignment error between lens and sensor by placing the lens directly on the sensor board. Of course, it means that the BFL, FBFL, and MBFL are coupled to the focal position of the lens, so the focal position changes slightly from camera to camera; the differences are on the order of tens of microns, so this is generally not a problem.
A natural result of this board-mount approach is the proliferation of optimized, design-for-purpose lenses. "Board-Mount" is simply a general, all-encompassing term for lenses that are mounted and focused in this way. Within this broad category, M12 lenses, also referred to as S-mount, are the most common. Both terms refer to an M12x0.5mm lens, that is, a 12mm-diameter lens with a 0.5mm thread pitch. In fact, "M12" has become almost synonymous with Board-Mount, but in truth, while all M12s are Board-Mounts, not all Board-Mount lenses are M12 lenses. Other popular sizes of board-mount lens include M14, M10, M8, M7, and even smaller. Thread pitch tends to scale roughly with diameter, and M8x0.35mm threads are fairly common, but in theory any thread pitch can be used with any diameter of lens. For example, M12 "fine-focus" threads (M12x0.35) or even larger diameters may be specified in critical, higher-megapixel applications to gain a bit more focus control.

M12 and other Board-Mount lenses are also ideally suited to active alignment because there is no fixed BFL and therefore no secondary focus requirement. In Active Alignment, an M12 lens can have its thread removed and can be focused and fixed directly over the sensor in one step without impacting the rest of the design. For example, you could prototype with a threaded M12 lens and mount and then go straight to mass production with a threadless version of the same lens and mount.

The other C/CS-Mount issues are addressed by M12 and other Board-Mount lenses as well. Since the FBFL is not fixed, the lens design can converge on the best performance, independent of BFL. This generally leads to a much shorter overall solution. It also eliminates the need for a complex focusing mechanism internal to the lens. And since such lenses tend to be built to purpose, gone is the need for costly and complex varifocal, aperture, and locking mechanics.
There are also typically commensurate gains in performance, consistency, and reliability for M12 lenses compared to their C/CS counterparts because there are fewer trade-offs. While Sunex does offer C/CS lenses, we have also pioneered large-format Board-Mount lenses, such as M20x0.5 and larger. These lenses bring the old C/CS standard into the modern age by allowing them to be mounted directly over the sensor with short BFLs, with the possibility of Active Alignment. But in the world of miniature cameras, the M12 "Board-Mount" still reigns supreme, no matter what you call it.

### Choosing the Right Sourcing Strategy for M12 Lenses

Selecting the right lens sourcing strategy has direct, long-term consequences on image performance, supply continuity, and program economics. The market currently offers three distinct channels: internet platforms, catalog-style intermediaries, and direct OEM partnerships. Each offers benefits at different phases of development, but each also carries distinct risks that grow or shrink as projects move from concept to fielded products. This whitepaper provides a practical framework to evaluate the trade-offs among the three channels. It integrates real-world scenarios across robotics, industrial automation, embedded vision, and drone imaging, and it attempts to quantify lifecycle impacts using a Total Cost of Ownership (TCO) approach to lens sourcing. The conclusion is straightforward: internet platforms and intermediaries are potentially valuable options for speed and flexibility in early phases, but mission-critical systems and volume production benefit most from an OEM partnership that aligns optical design, quality, and supply with the product roadmap. Fostering these relationships from the very beginning of a project can pay dividends in terms of Total Cost of Ownership.

Figure 1. Comparison of sourcing channels across key success factors.

### 1. The Landscape of M12 Lens Sourcing

M12 board lenses are the workhorses of compact imaging, enabling a wide range of FOVs (fields of view) and F/#s in small packages and integrating with modern CMOS sensors across a diverse range of devices. As sensor performance improves and mechanical envelopes shrink, optics must carry a greater burden for contrast, distortion control, relative illumination, and environmental stability.

- **Robotics** → Object detection, navigation, bin picking
- **Industrial automation** → Inspection, defect detection, process optimization
- **Embedded vision** → Compact consumer and enterprise devices
- **Drone imaging** → Aerial mapping, agriculture analytics, surveillance

At the same time, the supply landscape has broadened. Low-cost marketplaces put thousands of lens SKUs within a click. Intermediaries curate selections, maintain regional inventory, and reduce friction for small orders. OEM lens manufacturers design, produce, and support lenses at scale with guarantees on performance, process control, and lifecycle. Understanding where each channel fits means separating what matters in the lab from what matters in the field across years of production.

#### Internet Platforms

Marketplaces such as Amazon and Alibaba offer unmatched convenience and breadth. They are ideal for quickly assembling a bench of candidate lenses to sample fields of view, mechanical clearances, and basic image quality. However, listings may draw from anonymous, mixed, or end-of-life lots; coating recipes and glass sets may vary over time; and there is rarely a roadmap commitment or any traceability. For these reasons, internet lenses are effective tools for exploration but risky foundations for any product that requires repeatability, certification, or long-term serviceability.

#### Intermediaries and Catalog Resellers

Intermediaries create value by pre-screening suppliers, carrying inventory, and simplifying procurement for small runs.
They are particularly helpful between proof-of-concept and pilot, when teams need a consistent part number without committing to an OEM minimum order or a custom design. Yet intermediaries are constrained by their upstream sources. They typically do not control most aspects of the design, including coating, glass sourcing, or process, and they cannot guarantee that a given SKU will remain in production for the lifetime of your product. When volumes increase or performance margins tighten, such constraints can force an unplanned redesign.

#### OEM Lens Manufacturers

OEMs design and manufacture lenses, manage material supply chains, and validate performance against application-specific or even customer-specific requirements. A mature OEM partnership extends beyond the PN; it includes engineering collaboration (field-of-view and distortion trade-offs, stray light, spectral response), process control (custom parameters, binning, yield management), and lifecycle planning (EOL policies, alternatives, second-source strategy). Although the unit price may be higher at the outset, and lead times require planning, the risk profile and total program cost are significantly lower in mission-critical, multi-year, and high-volume scenarios. For building long-term, win-win relationships where both the customer and the supplier can bring their full strengths to bear, this is the best option.

### 2. How the Sourcing Channels Fit into the Product Development Cycle

Product development is often a series of changing constraints. Early on, speed dominates: teams need to consider multiple performance envelopes, mounting options, and ISP pipelines. As prototypes evolve into pilots, repeatability and early supply assurances take priority. At design freeze and launch, quality and reliability take precedence, and lifecycle commitments become non-negotiable. To some extent, these shifting constraints map naturally to the strengths of each sourcing channel.
The trick is not to get locked into a path that will not scale to your ultimate goal. During concept and POC phases, internet platforms can supply breadth and immediacy, even if not exactly meeting the spec. Engineers can sample a dozen lenses very quickly to validate basics such as the field of view, F/#, and first-order mechanical parameters. The goal is to learn quickly, not to lock architecture on a commodity part.

In Pilot and Beta, intermediaries can add value, with the ability to support small, ongoing projects going forward. They reduce friction for "sub-MOQ" builds, provide a single catalog with multiple options, and can maintain buffer stock while customers complete qualification testing. The risk is that the upstream lens may change subtly between lots or disappear altogether (EOL), through no fault of the reseller.

At Design Freeze and Production Ramp, OEMs become essential. The discipline of a controlled design, a documented process flow, and optionally active alignment to the sensor removes variability that would otherwise manifest as yield loss, RMAs, or artifacts in the image. In small quantities this may be tolerable, as you can hand-sort, but in production it is unacceptable. Reliable OEMs also lock product lifecycles to the customer roadmap, preventing surprise discontinuities during scale-up, mass production, and aftermarket support. If the customer started out with an "internet lens" that somehow made it this far in the design cycle, this is where TCO starts to become a major issue for so-called inexpensive lenses. The cost and schedule stress of redesigning and implementing new optics at this stage typically ripples far beyond the lens itself.

Figure 2. Conceptual suitability of each channel across the major development stages.

### 3. Real-World Industry Examples

#### Robotics and Warehouse Automation

A robotics integrator building a bin-picking camera used inexpensive internet-sourced lenses to evaluate several fields of view. The prototypes worked until thermal cycling on the factory floor revealed focus drift and increased distortion at temperature extremes. Transitioning to an OEM design with thermally balanced materials and tighter assembly tolerances stabilized focus and cut field failures by more than half. A redesign was required, but it was done early on, and the cost was more than offset by avoiding RMAs and line downtime.

#### Industrial Automation and Semiconductor Inspection

In defect inspection, modulation transfer function (MTF) consistency directly affects false positives. A machine builder using standard catalog lenses encountered lot-to-lot variation that pushed MTF just below the acceptance window for some lots. After the machine builder consulted an OEM lens manufacturer, the OEM suggested using binned (sorted) elements, specially controlled assembly torque, and case-specific OQC testing. Qualification passed on the first attempt, and the program recovered three months of schedule with a significant improvement in false positives (yield rate).

#### Embedded Vision Devices

A compact enterprise device ramped from 200 to 30,000 units per year. Its catalog lens was discontinued midway through the ramp, triggering an unexpected optical redesign and FCC re-test, resulting in sudden costs and delays. A subsequent OEM engagement delivered a mechanically drop-in lens replacement optimized for the same sensor, with consistent shading and improved relative illumination, locked to a five-year supply plan.

#### Drone Imaging and Multispectral Analytics

An agriculture drone platform needed RGB and near-IR imagery while meeting strict mass and vibration constraints. Early experiments with off-the-shelf lenses exposed coating degradation and decenter sensitivity under vibration profiles as key risks.
An OEM solution combined a dual-channel design with IR-optimized coatings, ruggedization, and active alignment to the sensor, enabling repeatable NDVI computation and faster regulatory approvals.

### 4. Total Cost of Ownership (TCO): Why Upfront Price Is Not Total Price

TCO aggregates all costs required to deliver and sustain a product: engineering hours, yield losses, RMAs, replacements, qualification delays, and the risk-weighted cost of supply disruption. Internet platforms often minimize unit price but externalize many of these costs; intermediaries reduce some variability but do not eliminate upstream risk; OEMs reduce lifecycle costs through design control, process discipline, and roadmap alignment.

| Factor | Internet Platforms | Intermediaries | OEM Manufacturers |
|---|---|---|---|
| Redesign Costs | Very high | Moderate | Minimal |
| RMA / Field Failures | Frequent, expensive | Lower | Lowest |
| Qualification Delays | Likely | Less common | Minimal |
| Yield Optimization | None | Limited | Fully controlled |
| Engineering Support | None | Limited | Full optical/system support |

A simple way to visualize this is to model cumulative lifecycle cost over time. Internet-sourced parts start low but accelerate as failures and redesigns accumulate. Intermediary-sourced parts fare better but may still increase due to limited control over process drift or EOL. OEM parts often, though not always, start at a higher price but remain relatively stable over the product's lifetime.

Figure 3. Conceptual TCO curves. Internet platforms minimize upfront price but often maximize lifecycle cost; OEM curves are higher initially but flatter over time.

### 5. Strategic Recommendations and Decision Framework

**Start fast, but do not anchor architecture to commodity parts** is the key. Use internet platforms to accelerate learning, but treat those lenses as disposable tools for discovery. Once the optical envelope is understood, move to controlled sources.
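The conceptual TCO curves of Figure 3 can be sketched numerically. The following toy model is illustrative only; every parameter value is invented for demonstration and is not Sunex data:

```python
def cumulative_tco(unit_price: float, annual_units: int, years: int,
                   risk_growth: float) -> list[float]:
    """Toy cumulative-cost model: yearly BOM spend plus a 'hidden' risk cost
    (redesigns, RMAs, qualification delays) that compounds each year.

    All numbers are hypothetical, for illustration only.
    """
    totals: list[float] = []
    running, risk = 0.0, 1_000.0  # arbitrary first-year risk cost
    for _ in range(years):
        running += unit_price * annual_units + risk
        totals.append(round(running, 2))
        risk *= risk_growth
    return totals

# Hypothetical channels, 10,000 units/year over 5 years:
internet = cumulative_tco(2.0, 10_000, 5, risk_growth=4.0)  # cheap lens, fast-growing risk
oem = cumulative_tco(5.0, 10_000, 5, risk_growth=1.1)       # pricier lens, nearly flat risk
print(internet[0] < oem[0])    # True: internet starts cheaper...
print(internet[-1] > oem[-1])  # True: ...but costs more over the lifecycle
```

The crossover point depends entirely on the assumed risk growth; the sketch only reproduces the qualitative shape of the curves, not any measured program data.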
When a pilot demands a few dozen to a few hundred units, intermediaries can be a pragmatic bridge. Validate batches aggressively: check MTF, distortion, shading, and environmental stability across multiple lots. Confirm the reseller's view of upstream continuity before committing to field trials. Even at low quantities, keep one eye on the future. Could this product ramp to significant volumes? Will your initial choices scale seamlessly? Will this company and product be here to support you in five years?

For ramp-up and production, or for those projects that will invariably ramp to high volumes, choose from the outset an OEM partnership that is aligned to your sensor, packaging, and lifecycle plan. Define performance windows and test methods jointly; consider active alignment to stabilize focus and tilt; document change-control and EOL procedures; and synchronize forecasts so material supply and capacity scale with demand.

Finally, incorporate TCO into milestone reviews. A lens that saves a few dollars in the BOM can cost hundreds of thousands of dollars in redesigns and field interventions later. Use TCO models to make these hidden costs visible before they materialize.

Decision Checklist

- Have we validated optical performance across temperature and vibration to production limits?
- Is there documented lot traceability and change control for the lens and key materials?
- Do we have an agreed roadmap and EOL policy matched to our product lifecycle?
- Are yield, binning, and active alignment options defined to protect margins at scale?
- Does the supplier offer direct engineering and QC support?
- Have we stress-tested supply continuity with realistic forecast scenarios?

### 6. Professional Positioning of Intermediaries

Intermediaries should be acknowledged as important participants in the ecosystem. Many provide tangible value: local inventory, simplified procurement, and pragmatic assistance for early deployments.
The argument presented here is not that intermediaries lack merit, but that their role is structurally different from that of a design-and-manufacture partner. This article’s recommendation is therefore not a criticism; it is a risk-managed allocation of roles that aligns channel strengths with project characteristics. When intermediaries source from OEMs, the collaboration can be positive, provided that plan-of-record parts, documentation, and lifecycle commitments remain robust.

### 7. Conclusion

Sourcing choices determine more than unit price: they influence image quality, yield, schedule, and customer experience for years to come. Internet platforms and intermediaries accelerate learning and simplify early builds; OEM partnerships stabilize products, reduce lifecycle cost, and protect brand equity in the field. For mission-critical systems in robotics, industrial automation, embedded vision, and drone imaging, the data and experience converge on a simple rule: prototype fast, then productize with an OEM. While internet platforms and intermediaries can play roles early in development, OEM partnerships offer unmatched advantages:

- Custom design integration
- Guaranteed lifecycle continuity
- Optimized yields and reduced RMAs
- Engineering collaboration and value-added services, such as active alignment

---

## AI Vision Module

- Source: https://sunex.com/products/aivision/
- Summary: If applied AI for embedded vision aims to replicate human understanding, then the optical stack plays a significant role in achieving human-like vision for any camera-based application. Choosing the right lens is a crucial step in system-level optimization and setting a roadmap to achieving desired outcomes. With Sunex as a lens and technology partner, our clients can access specific lens technologies to optimize algorithms, reduce system latencies and power consumption, and enhance imaging performance.
If applied AI for embedded vision aims to replicate human understanding, then the optical stack plays a significant role in achieving human-like vision for any camera-based application. Choosing the right lens is a crucial step in system-level optimization and setting a roadmap to achieving desired outcomes. With Sunex as a lens and technology partner, our clients can access specific lens technologies to optimize algorithms, reduce system latencies and power consumption, and enhance imaging performance.

Sunex’s expertise and experience in manipulating distortion profiles to support algorithm-specific requirements have been valued by customers for many years. Our Tailored Distortion™ expertise has often been applied to SuperFisheye™ lenses to correct the barrel distortion of large-FOV lenses. Sunex’s FOVEA distortion lenses are designed to mimic human vision. The distortion profile results in a higher pixel density in the center while maintaining a wide field of view, thus optimizing the performance of machine vision algorithms.

Environments with low or changing light are a challenge for any algorithm. Sunex has lens designs that combine a very low F/#, high Relative Illumination (RI), high dynamic range (HDR), high MTF across the field, and a broad wavelength spectrum for consistent performance across a variety of scenarios. Sunex has developed design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.

All graphs are for illustration purposes only; individual lens performance may differ.

Sensor Module Capabilities

Depending on the need and expertise of our customers, we provide design and manufacturing services for a complete sensor module.
We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

Active Alignment Capabilities

To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

## Feature Products

| PN | Format | MP Class | HFOV | F/# | Feature |
|---|---|---|---|---|---|
| DSL144 | 1/1.8" | 1.7MP | 100° | F/1.6 | FOVEA lens, Hybrid, HDR |
| DSL392 | 1/1.27" | 2MP | 201° | F/2.0 | SuperFisheye™, RGBIR, HDR |
| DSL936 | 1/1.2" | 5MP | 52° | F/3.2 | RGBIR lens, All glass, Short TTL |
| DSL374 | 1/1.8" | 8.3MP | 133° | F/1.6 | FOVEA, All glass, 4K, Wide FOV |
| DSL350 | 1/1.8" | 8.3MP | 122° | F/1.44 | FOVEA lens, 4K, High RI, very low F/# |
| DSL186 | 1/1.7" | 8MP | 140° | F/1.8 | RGBIR lens, 4K, Hybrid, HDR |
| DSL387 | 1/1.7" | 4.1MP | 120° | F/1.8 | FOVEA lens, All glass, High RI |

## Additional Topics related to AI VISION MODULE™

**Fast Prototyping**

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.
---

## Lighting

- Source: https://sunex.com/products/lighting/
- Summary: Lighting functions in and around the car play an ever-increasing role in an OEM’s brand recognition and are a signature piece for corporate design consideration. From a purely technical perspective, modern high definition (HD) headlamps are designed for two main applications: Advanced Lighting Functions and Road Projections.

## HD Lighting Lens Portfolio

Lighting functions in and around the car play an ever-increasing role in an OEM’s brand recognition and are a signature piece for corporate design consideration. From a purely technical perspective, modern high definition (HD) headlamps are designed for two main applications: Advanced Lighting Functions and Road Projections. Even with the underlying technologies advancing far beyond what the early trailblazers of automotive headlamps could have envisioned, we still try to optimize for the same goals:

- reduce glare
- increase efficiency and range
- make driving safer

Sunex’s design, engineering, and manufacturing know-how are well known in the automotive industry. Our consistent quality and on-time delivery have made us a preferred supplier of imaging optics for leading Tier 1s and OEMs for over a decade. Building on that history and reputation, we are successfully advancing the new high-resolution automotive headlamp segment with our customers and partners.
| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Design Examples

| PN | EFL | HFOV | VFOV | F/# | Features |
|---|---|---|---|---|---|
| A | 19mm | 40° | 10° | F/0.7 | All glass, wide HFOV, high efficiency |
| B | 37mm | 20° | 5° | F/0.75 | All glass, narrow HFOV |
| C | 24mm | 30° | 8° | F/0.6 | All glass, very high efficiency |
| D | 31mm | 24° | 6° | F/0.7 | Hybrid, low color aberration |
| E | 30mm | 24° | 6° | F/0.68 | Hybrid, high MTF, low distortion |
| F | 30mm | 24° | 6° | F/0.7 | All glass, high MTF, low distortion |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

The automotive lighting industry is undergoing a technological transformation. Traditional Matrix LED systems are giving way to microLED (µLED) projector-based headlamps capable of pixel-level control, adaptive beam shaping, and dynamic road projection. These systems are not just lighting the road; they are becoming integral to ADAS safety features and OEM brand differentiation. Yet this shift introduces new challenges:

- µLED optics demand higher resolution, tighter tolerances, and compact form factors.
- The thermal loads and environmental stresses in automotive applications require systems engineered for reliability.
- To be successful, suppliers must balance performance, cost, and manufacturability without compromising quality.

**Optimization for MTF, Color, and Efficiency**

The requirements for high-resolution automotive projector lenses are evolving as the adoption of this technology rapidly increases. Besides the mechanical boundaries and the need to satisfy OEM styling guidelines, we typically see MTF, efficiency, and color aberrations as crucial performance differentiators. Sunex has developed extensive design experience, engineering capabilities, and manufacturing process know-how that address the expanding solution space.
All graphs are for illustration purposes only; individual lens performance may differ.

**Athermalization**

The shift of a lens’s focal point over a wide temperature range is a physical phenomenon based on the material-specific expansion and contraction with temperature. A decrease in image quality can be the outcome if the focal point of the lens shifts too much relative to the sensor’s image plane. A fully athermalized system requires selecting appropriate optical and mechanical materials, the right design strategy, and close collaboration with the customer to optimize thermal performance on a system level.

## Supporting Services

**Fast Prototyping**

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

**Test & Measurement Capabilities**

Sunex has expanded the existing design, test, and measurement capabilities to account for the specific needs of projection optics and our automotive headlamp customers. In-house equipment includes test systems for the characterization of large-format projection optics, a goniophotometer lab with industry-standard analysis software, and VDA 19.1 test equipment.

This paper examines how lens hybridization, combining glass and plastic optical elements, can deliver optimized solutions for automotive µLED HD Lighting systems. The index of refraction is a function of temperature, and nearly all optical systems experience some performance degradation over temperature.

Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## RGBIR

- Source: https://sunex.com/products/rgbir/
- Summary: RGBIR is a popular term for lenses optimized for operation in both daylight and low-light conditions.
Such products require specialized design considerations, broadband AR coatings (BBAR) with low reflectivity (R%), a dual-bandpass filter with high transmissivity (T%) in the VIS and IR bands, and specific manufacturing techniques to provide the best possible image quality.

## RGBIR (Day/Night) Lens Portfolio

RGBIR is a popular term for lenses optimized for operation in both daylight and low-light conditions. Such products require specialized design considerations, broadband AR coatings (BBAR) with low reflectivity (R%), a dual-bandpass filter with high transmissivity (T%) in the VIS and IR bands, and specific manufacturing techniques to provide the best possible image quality. If done right, an RGBIR lens can significantly improve focus, brightness, and resolution over conventional lenses.

The enhanced infrared sensitivity of RGBIR lenses is often paired with a dedicated infrared illumination source, enabling night vision applications and use cases such as biometric authentication, gaze tracking, and gesture recognition. IR-corrected lenses are used in many applications across different industries and can be referred to as RGBIR (e.g., automotive), Day/Night (e.g., security and surveillance), or hyperspectral (e.g., medical).

| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**Dual-Bandpass Filters**

For many camera applications, infrared (IR) is an unwanted component of the light spectrum and is often blocked using IR cut-off filters that block the transmission of the infrared while passing the visible (VIS). The Sunex IRC4x family of dual-bandpass filters, on the other hand, is specifically designed to allow the visible and a narrow IR band to pass through simultaneously. Typical configurations are VIS+850 and VIS+940.
Sunex also offers single-band IR filters and custom filter designs.

All graphs are for illustration purposes only; individual lens performance may differ.

## Supporting Services

**Sensor Module Capabilities**

Depending on the need and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

**Active Alignment Capabilities**

To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

A hyperspectral lens (also “Day-Night” or RGBIR) refers to a lens that has been optimized to maintain performance throughout the VIS and NIR bands.

Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## Fovea

- Source: https://sunex.com/products/fovea/
- Summary: The term "Fovea Distortion" is derived from the fovea centralis, which is located in the retina's center and is responsible for high-acuity human vision. Sunex lenses with Fovea distortion map this type of behavior and "exaggerate" the central details while trading off the off-axis details. Practically speaking, this results in a higher number of pixels per degree in the center, allowing machine vision algorithms to benefit from a higher resolution in the center field compared to standard f-theta distortion lenses.
## FOVEA Lens Portfolio

The term “Fovea Distortion” is derived from the *fovea centralis*, which is located in the retina’s center and is responsible for high-acuity human vision. Sunex lenses with Fovea distortion map this type of behavior and “exaggerate” the central details while trading off the off-axis details. Practically speaking, this results in a higher number of pixels per degree in the center, allowing machine vision algorithms to benefit from a higher resolution in the center field compared to standard f-theta distortion lenses.

Sunex’s expertise and experience in manipulating distortion profiles to align with application-specific requirements have been valued by customers for many years. Our Tailored Distortion™ expertise has often been applied to SuperFisheye™ lenses to correct large-FOV lenses’ barrel distortion, and we now also offer the aforementioned FOVEA™ lenses.

Applications that benefit from a Fovea distortion profile include forward-looking ADAS and autonomous driving (AD) cameras, where the vehicle must detect objects at a far distance in the central FOV range while still having a wider FOV capability to maintain peripheral vision.

| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**High Dynamic Range (HDR)**

HDR sensors can capture light intensity variations of up to six or more orders of magnitude within the same image frame (~120 dB). This puts a very demanding requirement on lens performance. Sunex has developed design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.
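The pixels-per-degree advantage that defines this product family can be made concrete with a toy mapping comparison. The "fovea-like" profile below is an orthographic mapping chosen only because it concentrates image height toward the center; it is a generic illustration, not a Sunex design, and the pixel pitch, field, and image height are assumed values:

```python
import math

# Compare angular pixel density (pixels per degree) for an f-theta mapping
# (r = f*theta, uniform density) versus a fovea-like orthographic mapping
# (r = f*sin(theta), density f*cos(theta), highest on-axis).
# All parameters below are illustrative assumptions.

PIXEL_PITCH_MM  = 0.003     # 3 µm pixels (assumed)
HALF_FOV_DEG    = 60.0      # half field of view (assumed)
IMAGE_HEIGHT_MM = 3.0       # image-circle radius shared by both mappings

theta_max = math.radians(HALF_FOV_DEG)

def density_px_per_deg(dr_dtheta_mm_per_rad):
    """Convert local mapping slope dr/dθ into pixels per degree."""
    return dr_dtheta_mm_per_rad * math.radians(1.0) / PIXEL_PITCH_MM

# f-theta: r = f*θ  →  dr/dθ = f everywhere
f_theta_f = IMAGE_HEIGHT_MM / theta_max
uniform = density_px_per_deg(f_theta_f)

# fovea-like (orthographic): r = f*sin θ  →  dr/dθ = f*cos θ
fovea_f = IMAGE_HEIGHT_MM / math.sin(theta_max)
center = density_px_per_deg(fovea_f * math.cos(0.0))
edge   = density_px_per_deg(fovea_f * math.cos(theta_max))

print(f"f-theta  : {uniform:.1f} px/deg everywhere")
print(f"fovea-ish: {center:.1f} px/deg on-axis, {edge:.1f} px/deg at the field edge")
```

Both mappings place the same 60° half field at the same 3 mm image height, yet the fovea-like profile delivers roughly 20 % more pixels per degree on-axis at the cost of edge resolution, which is exactly the trade described above.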
**Athermalization**

The shift of a lens’s focal point over a wide temperature range is a physical phenomenon based on the material-specific expansion and contraction with temperature. A decrease in image quality can be the outcome if the focal point of the lens shifts too much relative to the sensor’s image plane. A fully athermalized system requires selecting appropriate optical and mechanical materials, the right design strategy, and close collaboration with the customer to optimize thermal performance on a system level.

All graphs are for illustration purposes only; individual lens performance may differ.

## Supporting Services

**Fast Prototyping**

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

Camera systems for high-end sports cars must excel under extreme environmental conditions. Read how ART SpA from Italy is mastering this challenge with Sunex as its optics partner.

There are significant drawbacks to utilizing high-CRA sensors in applications where a short z-height and compactness are not as important.

Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## Large Format

- Source: https://sunex.com/products/largeformat/
- Summary: Large format sensors (1", APS, Full Frame, etc.) tend to have slightly larger pixels and seemingly go against the trend of pushing for smaller and smaller pixels. However, this is intentional, since the goal of getting “more pixels on the same target area” doesn’t necessarily mean that the image quality is also improving.
The larger pixels of large format sensors often have lower noise and better low-light performance, and tend to align better with the boundaries of imaging physics and manufacturing tolerances for CMOS lenses.

## Large Format Lens Portfolio

Large format sensors (1″, APS, Full Frame, etc.) tend to have slightly larger pixels and seemingly go against the trend of pushing for smaller and smaller pixels. However, this is intentional, since the goal of getting “more pixels on the same target area” doesn’t necessarily mean that the image quality is also improving. The larger pixels of large format sensors often have lower noise and better low-light performance, and tend to align better with the boundaries of imaging physics and manufacturing tolerances for CMOS lenses.

High-resolution (up to 200MP) lenses provide superior image quality by delivering exceptional clarity and detail, even in high-speed, dynamic environments. These lenses stand out for their ability to produce vivid colors and minimize distortion, ensuring sharp, true-to-life images. Suitable image sensors contain a high pixel count, which enables precise capture of fine textures and intricate elements, enhancing the overall visual experience. Their advanced optical designs reduce aberrations and enhance contrast, providing consistent edge-to-edge performance across various lighting conditions.

Sunex Large Format lenses have a profound impact on total system performance. Their high resolution delivers lifelike imaging, making them essential for cutting-edge sports coverage, dynamic live broadcasts, immersive content capture, cinematic filmmaking and photography, geospatial mapping, teleconferencing, security, and robotics applications where imaging quality is paramount. These lenses set a new benchmark for high-end professional imaging by delivering unparalleled clarity and detail.
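Optical format names such as 1″ are legacy vidicon-tube designations and only loosely indicate physical sensor size. A widely used rule of thumb (an approximation, not a Sunex specification; always confirm against the sensor datasheet) maps one inch of nominal format to roughly 16 mm of sensor diagonal:

```python
# Rule-of-thumb mapping from optical format name to approximate sensor diagonal.
# The ~16 mm-per-inch factor is a historical convention from vidicon tubes,
# not a physical inch (25.4 mm); actual diagonals vary by sensor.

MM_PER_FORMAT_INCH = 16.0

def approx_diagonal_mm(format_inches: float) -> float:
    """Approximate sensor diagonal in mm for a nominal optical format."""
    return format_inches * MM_PER_FORMAT_INCH

for name, fmt in [('1"', 1.0), ('1/1.8"', 1 / 1.8), ('1/2"', 1 / 2), ('1/2.8"', 1 / 2.8)]:
    print(f'{name:>7} format ≈ {approx_diagonal_mm(fmt):.1f} mm diagonal')
```

This is why "large format" in this portfolio starts around 1″ (roughly 16 mm diagonal) and up: at a given pixel count, the physically larger sensor area yields the larger, lower-noise pixels discussed above.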
| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**High Dynamic Range (HDR)**

HDR sensors can capture light intensity variations of up to six or more orders of magnitude within the same image frame (~120 dB). This puts a very demanding requirement on lens performance. Sunex has developed design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.

All graphs are for illustration purposes only; individual lens performance may differ.

## Supporting Services

**Sensor Module Capabilities**

Depending on the need and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

**Active Alignment Capabilities**

To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

Visit our **Technology & Resource Hub** and get free access to high-quality information.
---

## VideoConferencing

- Source: https://sunex.com/products/videoconferencing/
- Summary: Typical video-conferencing cameras require lenses with high image quality and a large field of view while maintaining very low distortion at the same time. High-index materials and low F/# support high frame rates in low-light situations, and combined with our deep experience in aspherical optical elements, this has positioned us as a leading supplier in this segment.

## Video Conferencing Solutions

Typical video-conferencing cameras require lenses with high image quality and a large field of view while maintaining very low distortion at the same time. High-index materials and low F/# support high frame rates in low-light situations, and combined with our deep experience in aspherical optical elements, this has positioned us as a leading supplier in this segment. Sunex has significant experience in providing lenses to major video-conferencing equipment suppliers, and we are a critical part of their strategy to standardize their offerings for small, medium, and large conference room configurations.

#### Filters

Sunex also provides a wide range of standard IR cut-off filters and optical low-pass filters as part of the complete optical solution package. We work in close collaboration with our customers to design, prototype, and mass-produce custom filter solutions, should our off-the-shelf filter portfolio not align with the requirements.
## Feature Products

| PN | Format | Resolution | FOV | F/# | Feature |
|---|---|---|---|---|---|
| DSL147 | 1/2.8” | 8MP | 156° | F/1.4 | Glass, 4K, very low F/# |
| DSL255 | 1/2” | 10MP | 190° | F/2.0 | Glass, good low-light performance, HDR, compact form factor |
| DSL388 | 1/2” | 14MP | 145° | F/2.9 | Glass, Tailored Distortion™, 4K, compact form factor |
| DSL491 | 1/2” | 16MP | 150° | F/2.8 | Glass, HDR, compact form factor |
| DSL592 | 1” | 20MP | 150° | F/2.9 | Glass, large format sensor, high resolution, very low distortion, HDR |

---

## Surveillance

- Source: https://sunex.com/products/surveillance/
- Summary: Sunex Surveillance Lenses are designed to meet the unique requirements of surveillance applications. Whether your requirement is for a plastic lens for high-volume production or a multi-megapixel, environmentally stable glass/metal lens, Sunex has the solution. Lenses are available from narrow “Telephoto” fields of view to Wide-Angle and Fisheye lenses with up to 190° FOV.

## Security Surveillance Lens Portfolio

Sunex Surveillance Lenses are designed to meet the unique requirements of surveillance applications. Whether your requirement is for a plastic lens for high-volume production or a multi-megapixel, environmentally stable glass/metal lens, Sunex has the solution. Lenses are available from narrow “Telephoto” fields of view to Wide-Angle and Fisheye lenses with up to 190° FOV. Sunex specializes in wide field of view and high-resolution imaging, and with their compact size, these lenses are much more discreet than typical surveillance lenses on the market.

| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.
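When browsing portfolio tables like the ones above, the listed EFL, sensor format, and field of view can be sanity-checked with standard first-order mapping formulas. These are generic textbook relations, not Sunex calibration data, and the example lens parameters are hypothetical:

```python
import math

# Sanity-check field of view from effective focal length and sensor width.
# First-order models: a rectilinear (low-distortion) lens maps r = f*tan(theta);
# an ideal f-theta (equidistant fisheye) lens maps r = f*theta.

def hfov_rectilinear_deg(efl_mm: float, sensor_width_mm: float) -> float:
    """Horizontal FOV of a distortion-free (rectilinear) lens."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * efl_mm)))

def hfov_ftheta_deg(efl_mm: float, sensor_width_mm: float) -> float:
    """Horizontal FOV of an ideal f-theta (equidistant fisheye) lens."""
    return 2 * math.degrees(sensor_width_mm / (2 * efl_mm))

# Hypothetical example: a 2.2 mm EFL lens on a sensor with 6.4 mm active width.
print(f"rectilinear: {hfov_rectilinear_deg(2.2, 6.4):.0f} deg")
print(f"f-theta    : {hfov_ftheta_deg(2.2, 6.4):.0f} deg")
```

For the same sensor and EFL, the f-theta mapping yields a much wider field, which is why the fisheye entries in these portfolios pair short focal lengths with 150° to 190° fields of view.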
## Key Technologies

**Dual-Bandpass Filters**

For many camera applications, infrared (IR) is an unwanted component of the light spectrum and is often blocked using IR cut-off filters that block the transmission of the infrared while passing the visible (VIS). The Sunex IRC4x family of dual-bandpass filters, on the other hand, is specifically designed to allow the visible and a narrow IR band to pass through simultaneously. Typical configurations are VIS+850 and VIS+940. Sunex also offers single-band IR filters and custom filter designs.

We also offer an IR cut-off filter exchanger with an integrated lens mount that allows true day/night (TDN) operation. For daytime operation, an absorptive color filter provides true color rendition. At night, an AR-coated window allows maximum light transmission.

All graphs are for illustration purposes only; individual lens performance may differ.

## Supporting Services

**Sensor Module Capabilities**

Depending on the need and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

**Active Alignment Capabilities**

To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying a fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

Selecting the right lens sourcing strategy has direct, long-term consequences on image performance, supply continuity, and program economics.
The market currently offers three distinct channels: internet platforms, catalog-style intermediaries, and direct OEM partnerships.

As robotics and automation systems grow increasingly compact, intelligent, and power-efficient, the supporting vision technologies must evolve in parallel.

Tailored Distortion® is an innovation from Sunex to manipulate the distortion to achieve the best image quality in accordance with a client’s requirements.

Visit our **Technology & Resource Hub** and get free access to high-quality information.

---

## In-Cabin

- Source: https://sunex.com/products/incabin/
- Summary: This segment's most prominent camera applications are Driver Monitoring (DMS) and Occupant Monitoring Systems (OMS). DMS is already delivering on face tracking and driver authentication, and with health and vital sign monitoring, the span of possible use cases keeps increasing; in some cases these functions are now made mandatory by regulators for all new car models. OMS, in turn, will enable many convenience features but also promises breakthroughs in child safety and general passenger health monitoring.

## Automotive In-Cabin Lens Portfolio

This segment’s most prominent camera applications are Driver Monitoring (DMS) and Occupant Monitoring Systems (OMS). DMS is already delivering on face tracking and driver authentication, and with health and vital sign monitoring, the span of possible use cases keeps increasing; in some cases these functions are now made mandatory by regulators for all new car models. OMS, in turn, will enable many convenience features but also promises breakthroughs in child safety and general passenger health monitoring. In the attempt to consolidate the number of cameras in the cabin while at the same time delivering on functionality mandated by regulators and expanding additional functionality for all onboard occupants, the class of DOMS cameras has been introduced.
There are multiple approaches to selecting the ideal lens for a DOMS camera, and the solution space includes high-resolution SuperFisheye lenses as well as lenses with a FOVEA distortion profile that is aligned off-center. No matter which distortion characteristic is selected, all these solutions require extensive design, material, and manufacturing know-how and expertise to deliver RGBIR-optimized lenses that offer high performance over a broad wavelength spectrum.

| PN | Description | EFL | F/# | IMC | Unit Price | Files |
|---|---|---|---|---|---|---|
| Loading… | | | | | | |

Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution.

## Key Technologies

**Industry-leading 5MP RGBIR Designs**

We are leveraging best-in-class design expertise and are constantly driving process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design to Cost (DTC), Design for Manufacturing (DFM), and Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind.

RGBIR lenses are optimized for operation in both daylight and low-light conditions. Such products require specialized design considerations, broadband AR coatings (BBAR) with low reflectivity (R%), a dual-bandpass filter with high transmissivity (T%) in the VIS and IR bands, and specific manufacturing techniques to provide the best possible image quality. If done right, an RGBIR lens can significantly improve focus, brightness, and resolution over conventional lenses.
The enhanced infrared sensitivity of RGBIR lenses is often paired with a dedicated infrared illumination source, enabling combined Driver Monitoring (DMS) and Occupant Monitoring (OMS) in a single system (DOMS) for simultaneous gaze tracking, gesture recognition, and VIS streaming.

**Multiple HFOVs and Distortion Profiles for DMS, OMS, and DOMS**

In automotive in-cabin applications for DMS, OMS, and combined DOMS, having the right distortion profile plays a crucial role in enhancing safety and precision. These specialized lenses are engineered to minimize image distortion and optimize visual clarity across varying fields of view, from very narrow to wide-angle cameras.

**Advanced Mechanical Designs**

Sunex’s renowned excellence in optical design is complemented by its extensive experience in optomechanical design. Solutions range from threaded aluminum barrels, to barrels with a glue flange for active alignment, to more advanced designs featuring aluminum or plastic unibodies. We always see the optical path as one component of a larger system that, together, needs to be optimized.

## Supporting Services

**Reliability and Environmental Testing**

Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from servicing some of the most demanding markets and applications, and combined with consistent quality and global on-time delivery, Sunex is the preferred partner for many.

Camera systems for high-end sports cars must excel under extreme environmental conditions. Read how ART SpA from Italy is mastering this challenge with Sunex as its optics partner.

There are significant drawbacks to utilizing high-CRA sensors in applications where a short z-height and compactness are not as important.

Visit our **Technology & Resource Hub** and get free access to high-quality information.
--- ## ADAS - Source: https://sunex.com/products/adas/ - Summary: The forward-facing automotive camera is a rapidly developing application. The first systems were relegated to simple functions such as intelligent headlamp control (IHC), lane departure warning (LDW), or rain-sensing. Succeeding generations of forward-facing cameras incorporate multiple advanced functions such as adaptive cruise control (ACC), forward collision warning (FCW), traffic sign recognition (TSR), pedestrian detection, obstacle detection, and night vision. ## Automotive ADAS Lens Portfolio The forward-facing automotive camera is a rapidly developing application. The first systems were relegated to simple functions such as intelligent headlamp control (IHC), lane departure warning (LDW), or rain-sensing. Succeeding generations of forward-facing cameras incorporate multiple advanced functions such as adaptive cruise control (ACC), forward collision warning (FCW), traffic sign recognition (TSR), pedestrian detection, obstacle detection, and night vision. These demanding applications require spiraling improvements in image processing, sensors, and lenses. Sunex is at the forefront of lens technology for forward-facing camera systems. Typical optical performance requirements include high resolution, a low F-number, low distortion, and a minimal package size. In addition, with the use of high- or wide-dynamic-range (HDR or WDR) sensors, special attention must be paid to veiling glare and stray light effects. | PN | Description | EFL | F/# | IMC | Unit Price | Files | |---|---|---|---|---|---|---| | Loading… | Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution. ## Key Technologies **Industry-leading 8MP Designs** We are leveraging best-in-class design expertise and are constantly driving process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. 
Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design To Cost (DTC), Design for Manufacturing (DFM), and Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind. **Multiple HFOVs and Fovea Distortion** In automotive Advanced Driver Assistance Systems (ADAS), FOVEA distortion lens designs play a crucial role in enhancing safety and precision. These specialized lenses are engineered to minimize image distortion and optimize visual clarity across varying fields of view, essential for applications like lane departure warning, adaptive cruise control, and pedestrian detection. By maintaining accurate imaging and minimizing aberrations, FOVEA lenses ensure that the critical data captured by ADAS cameras remains accurate and reliable. **Advanced Mechanical Designs** Sunex’s renowned excellence in Optical Design is complemented by its extensive experience in optomechanical design. From M12 threaded aluminum barrels, to barrels with a glue flange for active alignment, to more advanced solutions featuring aluminum or plastic unibodies. We always see the optical path as one component of a larger system that, together, needs to be optimized. ## Supporting Services **Reliability and Environmental Testing** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from serving some of the most demanding markets and applications; combined with consistent quality and global on-time delivery, it makes Sunex the preferred partner for many. Camera systems for high-end sports cars must excel under extreme environmental conditions. Read how ART SpA from Italy is mastering this challenge with Sunex as its optics partner. 
Visit our **Technology & Resource Hub** and get free access to high-quality information. --- ## Solutions - Source: https://sunex.com/solutions/ - Summary: Our innovative lens and module solutions are engineered for reliability for mission-critical applications - we are successful if you are. # Industries & Applications Our innovative lens and module solutions are engineered for reliability for mission-critical applications – we are successful if you are. ## Our decades of design, engineering, and manufacturing experience shaping our customers’ success. ## The only thing that exceeds our solutions is our commitment to service. From feasibility studies and design services for lenses and camera modules, to prototyping, sample qualification, and ramp into mass production, our teams across sales, support, customer service, engineering, and manufacturing are working for you literally around the clock. Learn more → --- ## Video Library - Source: https://sunex.com/support/video-library/ - Summary: We are trying to meet our customers "where they are." For us at Sunex, that also includes the way people prefer to consume content. For people preferring video content over Knowledge Center articles, we have created a collection of short technical videos, recorded webinars, and some of our conference talks. All these videos can also be accessed through the Sunex YouTube Channel. ## Video Library We are trying to meet our customers “where they are.” For us at Sunex, that also includes the way people prefer to consume content. For people preferring video content over Knowledge Center articles, we have created a collection of short technical videos, recorded webinars, and some of our conference talks. All these videos can also be accessed through the Sunex YouTube Channel. --- ## Media - Source: https://sunex.com/solutions/media/ - Summary: High-performance imaging solutions for immersive media, 3D capture, and video conferencing — helping to connect the world. 
## Actively-aligned camera modules enable lifelike 3D content and NextGen conferencing experiences. ## Immersive Imaging ## Video Conferencing ## Live Streaming & Broadcasting ## Camera Modules ## Best in class lenses for large sensor formats, including 1", 1/1.2”, 4/3”, APS-C, and Full Frame. Sunex Inc. delivers precision optical and imaging solutions for the video conferencing and immersive media markets, with decades of experience in lens and camera module design. Our portfolio includes miniature, wide-angle, and low-distortion lenses, as well as large-format and ultra-high-resolution lenses, all optimized for high-quality media applications including immersive imaging, 3D content generation, and video capture. Using proprietary active alignment processes, Sunex ensures precise integration between optics and sensor, resulting in the highest performance, smallest part-to-part variance, and reliable operation across a wide range of use cases. Sunex supports OEMs with full lifecycle services—from concept and prototyping to validation and series production—leveraging ISO-certified facilities, rigorous quality standards, and global manufacturing capabilities. Our expertise enables next-generation conferencing systems, VR/AR experiences, and 3D content platforms to deliver sharp, immersive, and lifelike imaging. --- ## Robotics - Source: https://sunex.com/solutions/robotics/ - Summary: The breadth of possible applications in the robotics industry matches the vast experience we have — Off-The-Shelf or custom; let's get started. # Robotics & Industrial Systems The breadth of possible applications in the robotics industry matches the vast experience we have — Off-The-Shelf or custom; let’s get started. ## We enable novel solutions that are critical to our customers’ success across different applications. 
## Automated & Guided Vehicles ## Drones & Last-Mile-Delivery ## Industrial Machine Vision ## Smart Agriculture ## Precision lenses, DXM™ stereo-vision and SXM™ mounts for next-generation robotics systems. Sunex Inc. provides advanced optical solutions for robotics, industrial machine vision, and smart agriculture applications, combining over 25 years of expertise in precision lens design and manufacturing. Our innovations include miniature and wide-angle SuperFisheye lenses, Tailored Distortion® optics, and DXM™ single-sensor stereo vision systems that enable compact, high-performance depth sensing. Leveraging proprietary technology for precise alignment, the SXM™ interchangeable lens mounts deliver flexible, OEM-ready imaging modules for inspection, automation, and robotic applications. With global manufacturing, rigorous ISO-certified quality standards, and full support across the product lifecycle, Sunex empowers robotics and industrial OEMs to deploy reliable, high-resolution imaging solutions for demanding environments. --- ## Medical - Source: https://sunex.com/solutions/medical/ - Summary: Sunex designs, manufactures, and tests custom lenses and camera module solutions — helping make life better, one image at a time. # Medical Imaging Systems Sunex designs, manufactures, and tests custom lenses and camera module solutions — helping make life better, one image at a time. ## We deliver Optical Solutions for Medical Imaging that are critical to our customers’ success. 
## Endoscopy (single-use and reusable) Gastroscopy, Colonoscopy, Duodenoscopy, Laparoscopy, Bronchoscopy, ENT, Arthroscopy ## Ophthalmology & Eye Care Fundus cameras, Ophthalmoscopes, specialty solutions ## Point-of-Care Diagnostics Microfluidics, strip readers, dermatoscopy, lab-on-chip ## Robotic Surgery Single-use cameras, Stereo cameras, Viewers/Goggles ## Dental Imaging Intraoral cameras, Extraoral cameras, Microscope optics ## Specialty & Monitoring Systems Telemedicine, 3D scanning, Exoscopy, Video monitoring systems, Hyperspectral Gastroscopy, Colonoscopy, Laparoscopy, Duodenoscopy, Bronchoscopy, Urology, ENT, Arthroscopy ## A closer look at what it takes We design and manufacture high-resolution imaging solutions according to customer requirements for a wide range of medical devices, including disposable (single-use) endoscopes. These lenses are designed with bio-compatible materials and are assembled in a dedicated medical device cleanroom environment. Many of our miniature lenses designed for finite imaging (with and without autofocus) have broad applications in the medical market. In addition, we provide camera module design and manufacturing services, including PCB design (bare die and packaged sensors), active alignment, and optional assembly services. 
## What Sunex Uniquely Delivers **True HD and 4K optical performance** — Lens systems engineered to resolve what 4K sensors can actually capture, matched to pixel pitch and chief ray angle requirements. **Custom Camera Module Design** — Full custom design from optical concept to qualified module — adapting diameter, working distance, FOV, and spectral requirements. **Enabling Single-Use economics** — Over 25 years of design and manufacturing experience deliver per-unit cost to disposable-viable levels without sacrificing optical precision. **AI Diagnostic Pipeline Ready** — Image chains optimized for AI tools with active alignment, consistent MTF, controlled distortion, and repeatable spectral response across every production lot. *“When we began developing our disposable medical endoscope, Sunex was our go-to supplier for a custom lens design and manufacturing. Over the years, their role grew far beyond optics — they became a true partner, applying active alignment of lens and sensor, and integrating electronics and mechanical components into the assembly. From our first feasibility study through prototyping, FDA approval, and into mass production, Sunex’s expertise, responsiveness, and precision have been essential to our success.”* VP of Product Development, Medical Device Company --- ## DayNight - Source: https://sunex.com/products/daynight/ - Summary: Day/Night is a popular term for lenses and other products which are optimized for operation in both daylight and low-light conditions using an infrared illumination source. Such products require specialized design and manufacturing techniques to provide good image quality under this broad range of spectral conditions. Using a Day/Night lens under these circumstances can improve focus, brightness, and resolution over conventional lenses. Sunex also offers specialized Day/Night filter products which can further optimize your application. 
## Day/Night (RGBIR) Lens Portfolio Day/Night is a popular term for lenses and other products which are optimized for operation in both daylight and low-light conditions using an infrared illumination source. Such products require specialized design and manufacturing techniques to provide good image quality under this broad range of spectral conditions. Using a Day/Night lens under these circumstances can improve focus, brightness, and resolution over conventional lenses. Sunex also offers specialized Day/Night filter products which can further optimize your application. IR-corrected lenses are also very popular in industries outside of security and surveillance. They are sometimes also called RGBIR (e.g., in automotive) or hyperspectral (e.g., in medical). | PN | Description | EFL | F/# | IMC | Unit Price | Files | |---|---|---|---|---|---|---| | Loading… | Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution. ## Key Technologies **Dual-Bandpass Filters** For many camera applications, infrared (IR) is an unwanted component of the light spectrum and is often blocked using IR cut-off filters that block the transmission of the infrared while passing the visible (VIS). The Sunex IRC4x family of dual-bandpass filters, on the other hand, is specifically designed to allow the visible and a narrow IR band to pass through simultaneously. Typical configurations are VIS+850 and VIS+940. Sunex also offers single-band IR filters and custom filter designs. We also offer an IR cut-off filter exchanger with an integrated lens mount that allows true day/night (TDN) operation. For daytime operation, an absorptive color filter provides true color rendition. At night, an AR-coated window allows maximum light transmission. All graphs are for illustration purposes only. Individual lens performance may differ. 
## Supporting Services **Sensor Module Capabilities** Depending on the need and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components, to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module. **Active Alignment Capabilities** To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance. A hyperspectral lens (also “Day-Night” or RGBIR) refers to a lens that has been optimized to maintain performance throughout the VIS and NIR bands. Visit our **Technology & Resource Hub** and get free access to high-quality information. --- ## Security - Source: https://sunex.com/solutions/security/ - Summary: Sunex delivers precision lenses and camera module solutions for security and surveillance applications — high performance engineered for reliability. # Security Imaging Solutions Sunex delivers precision lenses and camera module solutions for security and surveillance applications — high performance engineered for reliability. ## High-performance lenses and modules for demanding security environments Sunex Inc. is a trusted provider of high-performance imaging solutions for the security and surveillance market. With over 25 years of experience in precision lens design and manufacturing, Sunex delivers lenses and camera modules tailored for demanding environments, including low-light and wide-angle applications. 
Our expertise includes miniature fisheye lenses, a large portfolio of lenses for 1″ sensor format, telephoto optics, and advanced distortion correction (Tailored Distortion®) for accurate image capture. Sunex supports full product lifecycle needs—from design and prototyping to validation, series production, and end-of-life compliance—leveraging proprietary materials, active alignment processes, and rigorous quality standards (ISO 9001:2008 and ISO/TS 16949:2009). Our global footprint and technical support enable security and surveillance OEMs to deploy reliable, high-performance imaging solutions in commercial, public safety, and critical infrastructure applications. --- ## Surround View - Source: https://sunex.com/products/svc/ - Summary: Sunex is a leading provider of lenses for automotive OEM rearview and surround-view cameras. Although this application segment is one of the most mature among all automotive vision applications, OEMs continue to demand technical and commercial improvements with every succeeding camera generation. ## Automotive Surround View Lens Portfolio Sunex is a leading provider of lenses for automotive OEM rearview and surround-view cameras. Although this application segment is one of the most mature among all automotive vision applications, OEMs continue to demand technical and commercial improvements with every succeeding camera generation. Sunex product offerings continue to grow, supporting new sensor innovations and more cost-effective solutions. We have a comprehensive selection of wide-angle lenses explicitly designed for automotive-qualified image sensors from leading sensor suppliers. | PN | Description | EFL | F/# | IMC | Unit Price | Files | |---|---|---|---|---|---|---| | Loading… | Try searching our entire **Off-The-Shelf Portfolio** or use our **Imaging System Builder** to get started on a custom solution. 
## Key Technologies **Lifetime-stable Hybrid Designs** We are leveraging best-in-class design expertise and are constantly driving process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design To Cost (DTC), Design for Manufacturing (DFM), and Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind. **SuperFisheye™ and Tailored Distortion** With the market trend toward increasing FOV for single-camera and multiple-camera systems, Sunex offers unmatched miniature fisheye and SuperFisheye (>185°) lenses with distortion control and low F/#. Sunex off-the-shelf solutions listed below are often a good starting point for Proof of Concepts (PoC) or to develop a custom OEM solution. **Advanced Mechanical Designs** Sunex’s renowned excellence in Optical Design is complemented by its extensive experience in optomechanical design. From M12 threaded aluminum barrels, to barrels with a glue flange for active alignment, to more advanced solutions featuring aluminum or plastic unibodies. We always see the optical path as one component of a larger system that, together, needs to be optimized. ## Supporting Services **Reliability and Environmental Testing** Often, the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from serving some of the most demanding markets and applications; combined with consistent quality and global on-time delivery, it makes Sunex the preferred partner for many. Visit our **Technology & Resource Hub** and get free access to high-quality information. 
--- ## Automotive - Source: https://sunex.com/solutions/automotive/ - Summary: Sunex has delivered over 120M lenses to our global automotive customer base — a record of exceptional reliability and consistent quality. ## Reliability, Performance, and Quality Sunex is a global leader in high-performance lenses for OEM passenger and commercial vehicles, specializing in advanced optics for HDR applications, 8MP ADAS, miniature fisheye lenses up to 200° FOV, and Tailored Distortion® lenses. Automotive lens design is complex, spanning concept, design, manufacturing, and end-of-life. Sunex meets these challenges with proprietary materials and processes, and customers collaborate directly with experienced engineers and technical sales who understand both performance and commercial needs. “Initial evaluation indicates a contrast of 40-50% with our object. This is better than the 30% we require for each of the three images at 520, 550, and 630 nm without refocusing. This is excellent performance.” Image Quality Manager, Automotive --- ## Knowledge Center - Source: https://sunex.com/support/knowledge-center/ - Summary: Introduction to a wide range of topics written by our engineers for a technically interested audience. # Technology & Resource Hub ## ARTICLES Introduction to a wide range of topics written by our engineers for a technically interested audience. ## CASE STUDIES A collection of case studies and market observations to help guide your decision path. ## WHITE PAPERS In-depth technical papers written by our team of experts and based on decades of experience. ## CUSTOMER SUCCESS STORIES Hear directly about challenges and solutions from the ones that matter the most: our customers! 
- Article (Products) ## MCP Connector Quick Start Guide - Case Study ## The Cost of Making the Wrong Lens Choice - White Paper ## Image Circle and Sensor Format - White Paper ## Advanced Optical Solutions for the Next Generation of Smart Agriculture - White Paper ## Why Resolution & Contrast Matter: A Practical Guide for Better Imaging - Customer Success Story ## Publish Your Success Story - Customer Success Story ## Customer Success Story: ART SpA Parking Camera - Article (Products) ## SXM™ Technology: Redefining Lens Interchangeability - Case Study ## Choosing the Right Sourcing Strategy for M12 Lenses - Case Study ## Lens Hybridization for µLED Headlamps - Case Study ## Sunex DXM™ – Stereo Vision in a Smaller Package - White Paper --- ## About - Source: https://sunex.com/about/ - Summary: Sunex delivers optical solutions that perform flawlessly in the world’s most demanding environments — where failure is not an option. ## Why Sunex Reliability Goes Beyond the Spec Sheet ## Demanding Applications Designed to meet application needs and challenging environments without compromising performance. ## Precision Manufacturing Tightly controlled production processes ensure consistent quality from prototype to volume production. ## Proven Quality Record Extensive dynamic and lifecycle testing ensures application-specific quality, giving customers total confidence. ## Trusted Worldwide Global clients in automotive, medical, robotics, and AR/VR sectors rely on Sunex lenses and imaging solutions daily. ## Customer support is as important as Reliability - you need to be able to trust them both 24/7. ## Services that transform a Product into a Solution. Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). 
Our deep design and manufacturing experience comes from serving some of the most demanding markets and applications; combined with consistent quality and global on-time delivery, it makes Sunex the preferred partner for many. ## Lens Design Optical & Mech. Design, Feasibility Studies, Design to Cost (DTC) ## Camera Modules PCB, Packaged/BareDie, Active Alignment, Cabling, Test ## Fast Prototyping Optical, Mechanical, PCB – always designed for mass production ## Mass Production Design for Manufacturing (DFM), Design for Reliability (DFR) ## Global Supplier for Imaging and Projection Optics Solutions. With a 25+ year track record and our US-based headquarters, we have been successful in taking customer concepts from design all the way to mass production. We are leveraging best-in-class design expertise and are constantly driving innovation and process improvements, which have enabled us to deliver more than 200 million lenses worldwide. For over two decades, we have enabled our global client base to succeed in their domains by delivering US-based engineer-to-engineer consulting and support with high-volume manufacturing in China. Our expertise, experience, technologies, and processes allow us to create lenses that often achieve far superior quality across many applications and markets than traditional lens designs can offer. Patented technologies, best-in-class design expertise, and uncompromising quality have made Sunex a premier supplier for Automotive, Security/Surveillance, Medical, Robotics/Computer Vision, Media/Video Conferencing, and OHV/Fleets. “We are getting great feedback from the field on the camera image quality – people are saying things like spectacular and magnificent. This, of course, starts with the excellent lens you are delivering.” Our engineering expertise and experience, as well as our understanding of manufacturing and process capabilities, inform and guide every step of the design phase. 
Material selection, with regard to optical performance, environmental stability, and cost, is another important input derived from the project requirements. Together with our clients, we discuss and agree on the best design that balances all factors and interests. The result is a manufacturable design from day one. “Initial evaluation indicates a contrast of 40-50% with our object. This is better than our requirement of 30%. This is for each of the three images at 520, 550, and 630 nm without re-focusing. This is excellent performance.” No matter the industry or application, today’s optical solutions are expected to deliver on the latest innovations and technologies. At Sunex, we see technologies that were table stakes in one industry now become the latest trend in another. Our global customer base has always been diverse in their needs and requirements, allowing us to collaborate with some of the best domain experts across the globe. Over time, this led to many innovations and positioned Sunex as a leading technology supplier for digital imaging and projection optics. “Thanks guys, I am very satisfied!!! This lens is a big improvement in our digital product.” All of our customers expect quality based on competency and unique value, driven by innovative technologies. We meet these expectations on a daily basis at a mass-production scale, and with global on-time delivery. Please contact us to start the dialogue of selecting or designing the optimal optical solutions to meet your needs in the most cost-effective manner. “Sunex products are used to explore the deep sea and outer space, and we are sure that if your application is somewhere in between, we have a solution for you as well!” ## Our Capabilities are the Foundation of your Success. With a 25+ year track record and our US-based headquarters, we have been successful in taking customer concepts from design all the way to mass production. 
We are leveraging best-in-class design expertise and are constantly driving innovation and process improvements, which have enabled us to deliver more than 200 million lenses worldwide. If an off-the-shelf solution (OTS) can’t meet your critical needs, we will discuss options for a brand-new design or review the possibility of customizing an existing one. In the latter case, we build off our extensive library of mature designs and, in many instances, automotive-qualified lenses that have already proven their quality and performance in high-volume mass production. Fundamental laws of physics and our deep design experience merge to continuously innovate and advance the design space for imaging and projection optics. No matter what approach we take, we will always consider your unique requirements to meet your expectations regarding performance, size, cost, schedule, and other important factors. Our design team comprises Optical Engineers, Mechanical Engineers, and additional Domain Experts located in our Design and Innovation Center in the US, as well as at our manufacturing sites in China. Most stages of the Design Phase will be managed from the US. Once we transition the nominal design to the Prototyping Phase, the engineering team at our manufacturing site will take the lead, performing the Design-for-Manufacturing (DFM) review and optimization. All along, project management will be provided through the US, and domain experts will be included and introduced into the conversation as needed. We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Using state-of-the-art fabrication processes, we can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production. Evaluating the project stage, needs, and timelines, we discuss with our clients the appropriate strategy to proceed with prototypes. 
We leverage available in-house resources depending on the above-mentioned boundaries to deliver the products and quantities needed. **Glass optics** – Glass optics are typically prototyped using the traditional glass manufacturing approach of cutting, grinding, and polishing. If required, we can also turn to our Diamond Turning Process for aspherical or freeform elements. **Plastic optics** – Aspherical plastic elements are typically prototyped using our Diamond Turning Machines. In this case, we place special emphasis on material selection during the design stage. The goal is to select a material that can be processed using Diamond Turning as well as injection molding, to avoid the need to switch material and reoptimize the design at a later point in time. For cases where larger prototype quantities or pre-series quantities are requested, we consider fabricating high-precision, low-volume tooling first to economically and quickly verify the performance and demonstrate the manufacturability of the design. Once the prototypes and design are confirmed, the low-volume tooling can be scaled up for mass production by adding additional cavities. **Mechanical components** – Mechanical components are typically first produced in aluminum using state-of-the-art CNC-machining equipment. The components are then finished matte black to minimize specular reflection. This process allows for quick validation of the lens assembly design before committing to production fixtures or tooling. **Prototype assembly** – Once the components are fabricated and qualified, the assembly process takes place in our cleanroom assembly area. In many applications, aluminum mechanical components are suitable for production. However, if your application demands plastic mechanical components, Sunex has the capability to provide this as well. Once we have the prototype assemblies, we conduct extensive tests to verify that they meet the agreed design requirements. 
Our engineering team designs and builds custom fixtures that enable us to test the new lens assembly on our in-house commercial test equipment. A typical set of tests includes optical performance, mechanical dimensions, cosmetics, and other functional requirements. Every lens and project has its unique requirements, often leading to very specific manufacturing process needs. We evaluate each situation and decide which equipment to implement. We select from operator-based to fully automated production lines; manual, hybrid, and full 6-axis active alignment machines; and custom-developed automated gluing machines and other specialty equipment, to meet capacity and cost targets. #### 3-Level Clean Room Environment We operate our manufacturing sites on a nested cleanroom concept. The entire end-to-end production process is located in a **Class 10k** cleanroom area. Within it, individual products have dedicated **Class 1k** Clean Cells with individual working tables for each assembly station rated **Class 100** (number of particles of size 0.5 µm or larger permitted per cubic foot of air). In addition, we also feature a dedicated medical device assembly facility. After incoming quality control (IQC), materials enter the Medical Device Assembly and leave only as fully assembled, tested, cleaned, and packaged product. Everything is tailored to mass production over many years, with tight tolerancing and small part-to-part variance. This approach not only makes for a better lens but also positively impacts camera-level yield. For some programs and customers, we only design, manufacture, and deliver the lens. In other cases, we discuss with all parties involved the best value-add Sunex can deliver to the success of the program. In these cases, we often also design and manufacture the sensor board in addition to the lens, with the key advantage of focusing the lens on the sensor using our experience in active alignment. 
Our facilities have been audited and certified by numerous Supplier Quality Engineers, representing some of the most successful and best-established companies in their industries and regions. We follow industry-leading compliance (RoHS, REACH, IMDS, etc.) and process (Kaizen, APQP, PPAP, CP/CPK, SPC, etc.) standards and regulations. Please contact us to request proof of our current certifications.

Testing (and simulating during the design phase) is an integral part of the lens development process. Sunex uses a wide range of state-of-the-art and industry-leading instruments and software tools, often complemented with custom instruments, fixtures, and test methods to solve challenges that the industry hasn’t addressed yet.

- **Incoming Quality Control (IQC)** is where the first sets of intensive tests and measurements are performed to ensure the quality and compliance of every single sub-component.
- **Design Validation** ensures that early R&D prototypes are tested to confirm the first-order optical parameters of the nominal design.
- **First Article Inspection (FAI)** is an intensive and wide-ranging test protocol of optical and mechanical parameters for customer prototypes, A-samples, and after design or process optimizations.
- **Reliability (REL) & Environmental Testing** often depends on the application and use case of the final camera. The automotive industry is known for extensive testing requirements, including extended high-temperature, humidity, UV-cycling, and many mechanical tests that challenge the product quality over a lifetime.
- **End-Of-Line (EOL) Testing** in mass production is tailored to ensure consistent quality of 100% of the manufactured and shipped parts. MTF, straylight, TTL, and pressure tests for IP-rated lenses are the baseline for any lens we build.
- **Outgoing Quality Control (OQC)**, including cosmetic inspection, is the final step to ensure maximum quality and consistency before the parts are shipped out to the customer site.
It cannot be overstated how important collaboration with the customer’s engineering and quality teams is. Receiving feedback from the customer, based on correlated measurements at the lens and camera level, is critically important throughout the entire development cycle. The dialogue not only confirms the performance but also documents progress, and is the foundation for any optimization decisions. In mass production, we discuss the exact testing requirements with our customers, who ultimately approve and sign off on the Manufacturing (MFG) and Quality Control (QC) documents. These documents define the *What*, the *How*, the *When*, and the *What-If* for every agreed lens specification.

Traceability through Data Matrix Codes (DMC) is the typical way product- and customer-defined information is stored. We use the DMC throughout the lens assembly process from IQC to OQC, including real-time upload from every stage and station. If required, customers can get online access to the database for their particular products and part numbers to extend traceability through their own process and the full life-cycle of the product.

---

## Optical Low Pass Filters: Theory and Practice [PDF]

- Source: https://sunex.com/wp-content/uploads/2022/04/Optical-Low-Pass-Filters-Theory-and-Practice.pdf
- Type: PDF whitepaper

Application Note, Sunex Inc. Telephone: +001 760.602.0988. Order samples online at www.optics-online.com

Summary

In a high-quality digital imaging system that uses CCD or CMOS sensors, an optical low pass filter (OLPF) is used to eliminate color Moiré fringes. It is important to note that Moiré fringes must be removed passively in the optical system and cannot be removed by post-processing the image. See the difference in Figure 1: the left side shows an image from an optical system without an OLPF; the right side shows an image from the same optical system with an OLPF.

Figure 1.
Left, without the OLPF; right, with the OLPF.

Theory

Since CCD and CMOS sensors sample image information at regularly spaced, discrete points called pixels, each sensor has a frequency limit, called the Nyquist frequency, that is defined by the geometry of its pixels. This frequency is equal to the inverse of two times the pixel pitch. If the lens passes spatial frequencies greater than the Nyquist frequency of the sensor, they cannot be resolved by the sensor. Worse, spatial frequencies greater than the Nyquist frequency will cause aliasing artifacts. These phenomena are often observed as colorful fringes, called Moiré fringes, in the image.

An OLPF placed between the lens and the image sensor stops the optical system from passing spatial frequencies greater than the Nyquist frequency of the sensor. The filter cuts the high-frequency information and passes only the low-frequency information, removing the Moiré fringes from the image. OLPFs are made of several layers of birefringent optical crystals cemented together. The number of layers and the thickness of each layer are defined by the pixel spacing of the sensor and the application. It follows that each OLPF design must be tuned to a particular sensor and application. For color imaging, an IR cut-off function is often integrated into the OLPF as well. A reflective IR cut-off coating can be applied to an external surface, or an absorptive IR cut-off filter layer can be added to the quartz layers.

Practice

• When installing the OLPF into the digital imaging system, it must be placed between the lens and the sensor. The performance is dictated by the layer thickness and any optical coatings on the external surfaces. The exact location along the z-axis does not affect the performance of the filter significantly.
• We do not recommend affixing the low pass filter to the sensor cover glass or using it as a sensor cover glass! Due to surface quality imperfections, it is recommended that the filter be placed more than 1 mm (>1 mm) away from the sensor plane. No matter how tight the surface quality specification, there are always scratches and digs on the order of the pixel size that will show up in the image as blobs or dust if the filter is too close to the sensor plane.

• The x-axis and y-axis (length and width) orientation of the OLPF with respect to the sensor is important. For 4:3 and 16:9 aspect ratio sensors, ensure that the long edge of the filter is square with the long edge of the sensor.

• The filter will function whether the IR cut coating faces the sensor or faces away from the sensor. The optical performance is the same.

For more information and a selection of standard Sunex Optical Low Pass Filters, please visit http://www.optics-online.com/lpf.asp or call 760.602.0988.

---

## Lateral Color in Mobile Imaging Lenses [PDF]

- Source: https://sunex.com/wp-content/uploads/2022/04/lateral-color.pdf
- Type: PDF whitepaper

By Alex Ning, PhD. January 21, 2005

1. Background

Chromatic aberration (CA) is one of several aberrations that degrade lens performance. Other common aberrations include coma, astigmatism, and curvature of field. Chromatic aberration occurs because the index of refraction of the lens material varies with the wavelength of light, i.e., it bends different colors by different amounts, as shown in Figure 1. This phenomenon is called dispersion. Minimizing chromatic aberration is one of the goals of lens design, and it is accomplished by combining glass elements with different dispersion properties. Three-element lenses (3P or 1G2P) are popular for mobile imaging applications. However, the optical performance of all 3-element lenses is limited by lateral chromatic aberration, also known as lateral color.
This aberration can only be eliminated using a 4-element design with a 2G2P configuration. This paper compares the lateral color of a 1G2P lens with a 2G2P lens.

Figure 1. Chromatic aberration in lenses

2. How to detect lateral color

The two types of chromatic aberration are illustrated in Figure 1.

• Longitudinal chromatic aberration causes different wavelengths to focus on different image planes. It causes degradation of the MTF response, with different amounts for different colors.

• Lateral chromatic aberration is the color fringing that occurs because the magnification of the image differs with wavelength. It tends to be far more visible than longitudinal CA.

For a given amount of lens CA, the smaller the pixel size, the more visible the lateral CA in the captured image. The lateral CA can be measured in terms of the number of pixels. In a good imaging system, the lateral CA should be less than 1 pixel. The lateral color is most visible if one examines a black/white edge at an off-axis viewing angle. The black/white edge should be oriented almost perpendicular to the radius. For example, the following is the left side of a picture taken with a 1G2P lens (Sunex PN DSL746). All images and analysis were done using a Micron 2MP (1600x1200) demo board with 2.8 µm pixel pitch. With a 2G2P design (Sunex PN DSL871/872), the lateral color is eliminated. As a result, the optical resolution is increased.

[Image annotations: with the 1G2P lens, a “rainbow” effect appears at the black/white edge (outside edge blue, inside edge brown) and vertical lines are not resolvable; with the 2G2P lens, the “rainbow” is eliminated and vertical lines are clearly resolvable.]

3. Measurement of Lateral Color

The amount of lateral color can be measured using commercially available software (ImaTest, http://www.imatest.com/). This program examines the transition from black to white at an off-axis edge for each primary color, and then calculates the amount of chromatic aberration at that edge.
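The percentage such edge-based tools report is simply the channel-to-channel edge offset divided by the distance from the image center to the measured edge. A minimal sketch of that arithmetic (the function names are illustrative, not part of ImaTest's API; the 449 px distance is an assumption chosen to be consistent with the figures quoted in this paper, and the 1000 px corner distance is the half-diagonal of a 1600x1200 frame):

```python
def lateral_ca_percent(offset_px: float, center_distance_px: float) -> float:
    """Lateral CA as a percentage of the distance from the image center to the
    measured edge (equivalently, the red/blue magnification difference)."""
    return 100.0 * offset_px / center_distance_px

def lateral_ca_pixels(ca_percent: float, center_distance_px: float) -> float:
    """Invert: predicted color-fringe width in pixels at a given field position,
    assuming the magnification difference is constant across the field."""
    return ca_percent / 100.0 * center_distance_px

# Example: a 2.1-pixel red/blue edge offset measured ~449 px from center
ca = lateral_ca_percent(2.1, 449.0)      # ~0.47% magnification difference
fringe = lateral_ca_pixels(ca, 1000.0)   # ~4.7 px extrapolated to the corner
acceptable = fringe < 1.0                # "good system" rule of thumb: <1 pixel
```

The constant-magnification-difference extrapolation is a simplification; real lateral color varies with field angle, which is why tools measure it at the edge region of interest.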
**1G2P lens:** The blue channel transition (the blue curve) occurs before the red channel (the red curve). The distance between the two curves at 50% edge response is 2.1 pixels. This represents 0.468% of the distance to the center of the picture, which implies a 0.468% difference in magnification between the blue and red channels in this lens.

**2G2P lens:** The blue channel transition occurs at nearly the same location as the red and green channels. This results in very little chromatic aberration. The separation between the blue and red channels is insignificant (0.171 pixel) in comparison to the size of the pixel.

4. Conclusion

With the industry trend towards higher pixel count image sensors with smaller pixel sizes, the lateral chromatic aberration of a 3-element lens will become a major problem. For the Micron 2M imager (MI2010) with a 3-element lens, the lateral color can be 2 or more pixels. Aberration of this order significantly reduces the optical resolution and MTF at off-axis viewing angles, resulting in an apparent decrease in image detail. For next-generation mobile imagers with higher resolution and smaller pixel spacing, more sophisticated lens designs, such as the Sunex DSL871 and DSL872 with a 2G2P structure, are required to eliminate the lateral color. These 4-element lens designs allow the end-user to take full advantage of the increased imager resolution.

---

## Sunex AI Vision Brochure 2022 [PDF]

- Source: https://sunex.com/wp-content/uploads/2023/01/Sunex_AIVision_Brochure_2022_online.pdf
- Type: PDF whitepaper

©2021 Sunex Inc. All Rights Reserved. sunex.com. 20+ year track record of success in taking customer concepts from design through mass production. SUNEX INC.
3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA. Tel: +1 760-597-2966. Email: susales@sunex.com. Consulting – Design – Manufacturing – Support

AI VISION MODULE™ Lenses and Sensor Modules for Machine Learning

If applied AI aims to replicate human understanding, then the optical stack plays a significant role in achieving human-like vision for any camera-based application. Choosing the right lens is a crucial step in system-level optimization and in setting a roadmap to achieving desired outcomes. With Sunex as a lens and technology partner, our clients can access specific lens technologies to optimize algorithms, reduce system latencies and power consumption, and enhance imaging performance.

Distortion and Field of View

Sunex's expertise and experience in manipulating distortion profiles to support algorithm-specific requirements have been valued by customers for many years. Our Tailored Distortion™ expertise has often been applied to SuperFisheye™ lenses to correct the barrel distortion of large-FOV lenses. Sunex’s FOVEA distortion lenses are designed to mimic human vision. The distortion profile results in a higher pixel density in the center while maintaining a wide field of view, thus optimizing the performance of machine vision algorithms.

Low-light Performance

Environments with low or changing light are a challenge for any algorithm. Sunex has lens designs that combine very low F/#, high Relative Illumination (RI), high dynamic range (HDR), high MTF across the field, and a broad wavelength spectrum for consistent performance across a variety of scenarios. Sunex has developed design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.

All graphs are for illustration purposes only. Individual lens performance can differ.
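The pixel-density effect of a distortion profile can be illustrated by comparing how fast image height grows with field angle under different mappings. A minimal sketch, using the two textbook mappings the brochure's graphs reference (f-tan and f-theta); these idealizations and the 4 mm EFL are illustrative assumptions, not Sunex's proprietary FOVEA profile:

```python
import math

def ih_ftan(efl_mm: float, theta_rad: float) -> float:
    """Rectilinear (f-tan) mapping: image height = f * tan(theta)."""
    return efl_mm * math.tan(theta_rad)

def ih_ftheta(efl_mm: float, theta_rad: float) -> float:
    """Equidistant fisheye (f-theta) mapping: image height = f * theta."""
    return efl_mm * theta_rad

def mm_per_degree(mapping, efl_mm: float, theta_deg: float,
                  step_deg: float = 0.01) -> float:
    """Local image-plane resolution: how many mm (hence pixels) one degree
    of field angle occupies at a given field position (central difference)."""
    t0 = math.radians(theta_deg - step_deg / 2)
    t1 = math.radians(theta_deg + step_deg / 2)
    return (mapping(efl_mm, t1) - mapping(efl_mm, t0)) / step_deg

# With a 4 mm EFL: f-tan packs ever more mm/degree toward the field edge,
# while f-theta is uniform across the field.
center_ftan = mm_per_degree(ih_ftan, 4.0, 0.0)   # ~0.070 mm/deg
edge_ftan = mm_per_degree(ih_ftan, 4.0, 60.0)    # ~0.279 mm/deg (4x the center)
uniform = mm_per_degree(ih_ftheta, 4.0, 30.0)    # ~0.070 mm/deg everywhere
```

A center-weighted profile like the FOVEA concept described above does the opposite of f-tan: it allocates more mm/degree near the optical axis while still covering a wide total field.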
Sunex AI VISION MODULE™

| PN | Format | MP Class | HFOV | F/# | TTL (mm) | Features |
|--------|---------|----------|------|------|----------|---------------------------------------|
| DSL144 | 1/1.8"  | 1.7MP    | 100° | 1.6  | 24       | FOVEA lens, Hybrid, HDR               |
| DSL392 | 1/1.27" | 2MP      | 201° | 2.0  | 23       | SuperFisheye™, RGBIR, HDR             |
| DSL936 | 1/1.2"  | 5MP      | 52°  | 3.2  | 16.5     | RGBIR lens, All glass, Short TTL      |
| DSL374 | 1/1.8"  | 8.3MP    | 133° | 1.6  | 28.5     | FOVEA, All glass, 4K, Wide FOV        |
| DSL350 | 1/1.8"  | 8.3MP    | 122° | 1.44 | 30       | FOVEA lens, 4K, High RI, very low F/# |
| DSL186 | 1/1.7"  | 8MP      | 140° | 1.8  | 25       | RGBIR lens, 4K, Hybrid, HDR           |
| DSL387 | 1/1.7"  | 4.1MP    | 120° | 1.8  | 30       | FOVEA lens, All glass, High RI        |

Table only shows a selection. Additional lens and module options are available. Empower your algorithms with AI VISION MODULE™ (sunex.com/solutions).

Fast Prototyping

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

Sensor Module Capabilities

Depending on the requirements, we can provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components, to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, high-volume manufacturing, automated active alignment, and testing to support the most demanding vision applications.

[Graphs: FOV vs. HDR (dB); IH/EFL vs. FOV for f-tan, f-theta, and Fovea mappings; Field MTF.]

---

## Sunex Medical Brochure 2023 [PDF]

- Source: https://sunex.com/wp-content/uploads/2023/01/Sunex_Medical_Brochure_2023_online.pdf
- Type: PDF whitepaper

©2023 Sunex Inc. All Rights Reserved. sunex.com. 20+ year track record of success in taking customer concepts from design through mass production. SUNEX INC.
3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA. Tel: +1 760-597-2966. sunex.com/contact. Consulting – Design – Manufacturing – Support

Medical Camera Modules: Helping to save lives with medical camera modules (sunex.com/solutions)

Sunex Medical Camera Modules

Sunex's expertise in endoscope lenses goes back to the mid-90s, when our founder and CEO designed the optical stack for one of the first disposable laparoscopes. Since then, we have been designing and manufacturing many optical systems for medical devices that were introduced into the market, including dental cameras, various endoscopes, and stereographic vision systems for robotic surgery.

Single-use Endoscopes

With the increase in minimally invasive surgery, the pressure to reduce cost per procedure, and the call for reduction of cross-contamination, single-use or disposable endoscopes have entered the market. Our experience and engineering know-how allow our customers to create new products and novel solutions, especially when the goal is to realize the benefits of single-use for medical devices requiring high resolution and best-in-class imaging quality.

Camera Module Capabilities

We work with our customers and partner network to find the best balance between cost and performance to meet the often unique application requirements. Sunex offers a wide range of services, including designing and manufacturing the lens and the optomechanical components, the PCBA, and the right cabling solution. Sunex has the expertise and capabilities for high-volume manufacturing in state-of-the-art cleanroom facilities, automated 6-axis active alignment, and the test and quality control processes required to support the most demanding medical imaging applications.

Fast Prototyping

In specific cases, prototyping is used as the first stepping stone toward mass production.
Besides the lens performance, exploring various mechanical design solutions is often part of these early efforts. Staying within the required dimensions while accommodating additional working channels and features is crucial for a successful product. Sunex can produce prototypes with short lead times to verify the design before transitioning further on the path to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

Designed for Mass Production

Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production, meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has over two decades of design and manufacturing experience, and all our lenses and modules are designed for high-volume manufacturability.

| Type | Format | Class | FOV | F/# | TTL (mm) | Features |
|----------------|--------|------|------|------|----------|-------------------------------|
| Dental Camera  | 1/4"   | 3MP  | 67°  | 14.8 | 6.25     | Hybrid design, very large F/# |
| Ophthalmoscope | 1/2.5" | 5MP  | 55°  | 5.6  | 11.3     | All glass, narrow FOV         |
| Endoscope      | 1/4"   | 2MP  | 140° | 8.2  | 7.6      | Hybrid design, wide FOV       |
| Endoscope      | 1/3"   | VGA  | 85°  | 1.7  | 6.0      | Short TTL, unibody design     |
| Endoscope      | 1/5"   | 5MP  | 90°  | 6.0  | 4.9      | High resolution, short TTL    |
| Endoscope      | 1/2.5" | 8MP  | 140° | 6.0  | 11.2     | Hybrid design, 4K, wide FOV   |
| Endoscope      | 1/3"   | 8MP  | 195° | 3.2  | 12.5     | Hybrid, 4K, SuperFisheye FOV  |

Table only shows a selection. Additional lens and module options are available.

---

## Sunex Design Services Brochure 2025 [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/01/Sunex_DesignService_Brochure_2025_online.pdf
- Type: PDF whitepaper

Consulting – Design – Manufacturing – Support. ©2025 Sunex Inc. All Rights Reserved. sunex.com. 25+ year track record of success in taking customer concepts from design through mass production. SUNEX INC. 3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA. Tel: +1 760-597-2966. sunex.com/contact

DESIGN SERVICES: Optical and Camera Module Design & Prototyping (sunex.com/solutions)

25+ Years of Design Expertise Will Drive Your Success

With a 25+ year track record as a leading optics company and a US-based headquarters and Design Center, we have consistently demonstrated success in taking customer concepts from design through mass production across many industries and applications.
Our products enable technologies that make driving and transportation safer, protect buildings, assets, and critical infrastructure, advance medical procedures, connect the world through AR/VR technologies, enable next-generation robotics, and advance the smart agriculture that puts food on the table.

Optical Design Services

We leverage best-in-class design expertise and constantly drive process improvements and manufacturing innovation for our global customer base to push the boundaries of what is possible. Our optical and mechanical engineering teams apply their proven engineering know-how to create new products and novel solutions to support your project goals. From Design To Cost (DTC) and Design for Manufacturing (DFM) to Design for Reliability (DFR), we always have the full product development cycle and your specific business case in mind.

Camera Module Design Services

Expanding beyond our Optical Design Services, we can create an even more vertically integrated solution for you. From optomechanical design, PCB design, and auxiliary component integration to manufacturing tolerancing, End-of-Line (EOL) test requirements, and Quality Control (QC) procedures, we can design your camera module and prepare your optical system for prototyping and the ramp into manufacturing.

Designed for Mass Production

Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meets required price targets (DTC), and delivers on performance and quality (DFR). Our deep design and manufacturing experience comes from servicing some of the most demanding markets and applications; combined with consistent quality and global on-time delivery, this makes Sunex the preferred partner for many.

Sunex Design Services

All graphs are for illustration purposes only. Individual lens performance can differ.
[Graphs and product-cycle diagram: FOV vs. HDR (dB); Focus Shift MTF; IH/EFL vs. FOV for f-tan, f-theta, and Fovea mappings; product maturity stages from Feasibility Study and Optical Design (DTC) through Prototype/Samples (DFM) to Mass Production (DFR).]

Fast Prototyping

Sunex’s renowned excellence in optical design is complemented by its extensive experience and leading position in the manufacturing of optical systems. If your solution benefits from early prototyping and samples to optimize performance, explore mechanical variants, or solicit early end-user input, we would be honored to be your partner of choice. Our state-of-the-art prototyping equipment and accredited in-house test facilities support in-depth prototype evaluation before transitioning further on the path to mass production.
3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA
Tel: +1 760-597-2966
sunex.com/contact

DESIGN SERVICES: Optical and Camera Module Design & Prototyping

---

## Sunex RGBIR Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/01/Sunex_RGBIR_Brochure_2025_online.pdf
- Type: PDF whitepaper

RGB☰IR™ Lenses

IR Corrected

RGBIR is a popular term for lenses optimized for operation in both daylight and low-light conditions. Such products require specialized design considerations and specific manufacturing techniques to provide the best possible image quality. Broadband AR coatings (BBAR) with low reflectivity (R%) and a dual-bandpass filter with high transmissivity (T%) in the visible (VIS) and near-infrared (NIR) bands are also required to achieve the best performance. Done right, an RGBIR lens can significantly improve focus, brightness, and resolution over conventional lenses.

Applications

The enhanced infrared sensitivity of RGBIR lenses is often paired with a dedicated infrared illumination source, enabling night-vision applications and use cases such as biometric authentication, gaze tracking, and gesture recognition. IR-corrected lenses are used in many applications across different industries and can be referred to as Day/Night (e.g., security and surveillance), RGBIR (e.g., automotive in-cabin), and hyperspectral (e.g., medical imaging).

Dual-Bandpass Filters

Infrared (IR) is an unwanted component of the light spectrum for many camera applications. It is often blocked using IR cut-off filters, which block infrared transmission while passing the visible (VIS) band.
The Sunex IRC4x family of dual-bandpass filters is specifically designed to let the visible spectrum and a narrow IR band pass through simultaneously. Typical configurations are VIS+850 and VIS+940. Sunex also offers single-band IR filters and custom filter designs.

All graphs are for illustration purposes only; individual lens performance can differ.

Sunex RGB☰IR lenses: visibility around the clock. (sunex.com/products)

The table below shows only a selection; additional RGB☰IR lens options are available.

Sensor Module Capabilities

Depending on the needs and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components, to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

Active Alignment Capabilities

To achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor, we recommend that our customers consider an active alignment process. Applying fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.
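The dual-bandpass behavior described above (pass the visible band plus one narrow NIR band, block everything else) can be sketched as a toy transmission mask. Band edges here are illustrative assumptions for a VIS+940 configuration, not Sunex IRC4x specifications:

```python
# Toy model of a dual-bandpass (VIS + 940 nm) filter response.
# Band edges are illustrative assumptions, not IRC4x filter data.
VIS_BAND = (400.0, 700.0)   # nm, visible band
NIR_BAND = (920.0, 960.0)   # nm, narrow band around 940 nm

def transmits(wavelength_nm: float) -> bool:
    """True if the ideal filter passes this wavelength."""
    return any(lo <= wavelength_nm <= hi for lo, hi in (VIS_BAND, NIR_BAND))

for wl in (550, 800, 940):
    print(wl, transmits(wl))   # VIS passes, 800 nm is blocked, 940 nm passes
```

A real filter curve has finite slopes and ripple, which is why the brochures plot T% against wavelength rather than an ideal step.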
[Graphs: filter transmission (T%) vs. wavelength, coating reflectivity (R%) vs. wavelength, and MTF vs. field.]

| PN | Format | MP Class | HFOV | F/# | TTL (mm) | Features |
|---|---|---|---|---|---|---|
| DSL208 | 1/1.3” | 1.3MP | 22° | F/2.0 | 21.3 | All glass, long EFL, RGBIR |
| DSL240 | 1/1.3” | 1.3MP | 71° | F/2.1 | 25.2 | All glass, compact format, RGBIR |
| DSL392 | 1/1.27” | 2MP | 201° | F/2.0 | 23 | All glass, SuperFisheye™, RGBIR, HDR |
| DSL936 | 1/1.2” | 5MP | 52° | F/3.2 | 16.5 | All glass, compact design, RGBIR |
| DSL973 | 1/2.5” | 5MP | 120° | F/2.2 | 14.3 | Hybrid, OMS, compact design, RGBIR |
| DSL974 | 1/2.5” | 5MP | 140° | F/2.2 | 14.7 | Hybrid, OMS, compact design, RGBIR |
| DSL186 | 1/1.7” | 8MP | 140° | F/1.8 | 25 | Hybrid, RGBIR, HDR |
---

## Sunex Robotics Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/01/Sunex_Robotics_Brochure_2025_online.pdf
- Type: PDF whitepaper

Vision for Robotics

Technologies

Across decades, we have achieved many “firsts” with our customers. Today, we have a broad range of expertise and products that enable innovative customer-specific solutions that advance human-like vision, depth perception, and full 360° surround view.

[Product Cycle diagram: Feasibility Study → Optical Design → Prototype/Samples → Mass Production, with DTC, DFM, and DFR milestones along the Product Maturity axis.]

Designed for Mass Production

Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production (DFM), meet required price targets (DTC), and deliver on performance and quality (DFR). Our deep design and manufacturing experience comes from serving some of the most demanding markets and applications, and combined with consistent quality and global on-time delivery, Sunex is the preferred partner for many.

Fast Prototyping

Sunex’s renowned excellence in optical design is complemented by its extensive experience and leading position in the manufacturing of optical systems. If your solution benefits from early prototyping and samples to optimize performance, explore mechanical variants, or solicit early end-user input, we would be honored to be your partner of choice.
Our state-of-the-art prototyping equipment and accredited in-house test facilities support in-depth prototype evaluation before transitioning further on the path to mass production.

All graphs are for illustration purposes only; individual lens performance can differ.

[Graphs: MTF vs. field, HDR (dB) vs. field, and transmission (T%) vs. wavelength.]

Unique robotics challenges require unique solutions. (sunex.com/solutions)

Applications

Cameras play a pivotal role in robotics across a diverse range of applications. From industrial assembly-line robots to the latest state-of-the-art humanoid robots, they serve as indispensable sensory organs that enable machines to perceive and interact with their environments effectively. Any mobile robotics platform, such as drones and autonomous delivery vehicles, relies on a vision system to provide real-time visual feedback, which is crucial for navigation, obstacle avoidance, and precise maneuvering. Enhanced object recognition and spatial awareness raise the requirements for a high-performing optical system to enable interaction with humans and objects in complex environments, such as humanoid co-worker scenarios and warehouses with Automated Guided Vehicles (AGVs). Looking ahead, the proliferation of robotics into everyday life is expected to expand significantly, driven by advancements in optical design, camera technology, and artificial intelligence. Choosing the right lens is crucial for any system-level optimization, and Sunex, as a lens and technology partner, sets a roadmap for achieving the desired outcomes.

• Miniaturized SuperFisheye
• High-resolution (20–100MP)
• Large-format (≥1”)
• High Dynamic Range (HDR)
• NoGhost (≥120dB)
• RGBIR (Day/Night)
• Time of Flight (ToF)
• LiDAR Receiver
• Fovea Distortion
• Catadioptric Systems
• Advanced OptoMech
• Broadband AR Coatings
---

## Sunex SensorModule Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/01/Sunex_SensorModule_Brochure_2025_online.pdf
- Type: PDF whitepaper
SENSOR MODULES

Sensor Board and Active Alignment Services

Depending on the requirements, we can provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components, to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, high-volume manufacturing, automated active alignment, and testing to support the most demanding vision applications. For additional services, we can bring in partners from our Technology and Service Network, allowing us to process bare-die and packaged sensors, including cabling options (e.g., for medical applications), and hand the complete tested sensor module over to the next entity in the value chain to integrate the full camera.

Active Alignment Capabilities

We recommend that our customers consider an active alignment process to achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor. Our active alignment offering can grow with the ramp of the program and enables use of an alignment process from the beginning. Applying fully automated 6-axis active alignment in mass production increases yield, shortens cycle times, improves system performance, and lowers part-to-part variance.

All graphs are for illustration purposes only; individual lens performance can differ.

Empower your application with Sunex Sensor Modules. (sunex.com/solutions)

[Graph: FOV vs. HDR (dB).]

Fast Prototyping

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design.
Sunex can produce prototypes with short lead times, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components, to verify the design before transitioning further on the path to mass production.

[Graph: MTF vs. field.]

Designed for Mass Production

Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production while meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has over two decades of design and manufacturing experience, and all our lenses and modules are designed for high-volume manufacturability.

Sensor module building blocks:
• OptoMech: Holder & Unibody; Metal & Plastic; Athermalized
• PCBA: Designed by Sunex; DFM of 3rd-party layout; Consigned with IQC
• CMOS: Bare Die and Packaged; Procured by Sunex; Consigned by 3rd party
• System: Housing; Cabling; Sub-Assembly
• REL: Shock & Vibration; Environmental; Lifetime
• Active Alignment: 5-axis Manual AA; 6-axis Automated AA; Hybrid Mode

[Graph: focus-shift MTF.]
---

## Sunex LargeFormat Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/04/Sunex_LargeFormat_Brochure_2025_online.pdf
- Type: PDF whitepaper

Large Format Lenses

Image Quality

High-resolution (up to 100MP) lenses provide superior image quality by delivering exceptional clarity and detail, even in high-speed, dynamic environments. These lenses stand out for their ability to produce vivid colors and minimize distortion, ensuring sharp, true-to-life images. Suitable image sensors have a high pixel count, which enables precise capture of fine textures and intricate elements, enhancing the overall visual experience. Their advanced optical designs reduce aberrations and enhance contrast, providing consistent edge-to-edge performance across various lighting conditions.

Applications

Sunex Large Format lenses have a profound impact on total system performance. Their high resolution delivers lifelike imaging, making them essential for cutting-edge sports coverage, dynamic live broadcasts, immersive content capture, cinematic filmmaking and photography, geospatial mapping, teleconferencing, security, and robotics applications where imaging quality is paramount. These lenses set a new benchmark for high-end professional imaging by delivering unparalleled clarity and detail.
High Dynamic Range (HDR)

HDR (high dynamic range) sensors can capture light-intensity variations spanning six or more orders of magnitude (~120 dB) within the same image frame. This places very demanding requirements on lens performance. Sunex has developed the design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.

All graphs are for illustration purposes only; individual lens performance can differ.

[Graphs, Sunex Large Format lenses: F/# vs. full FOV, MTF vs. field, and HDR (dB) vs. field.]

The table below shows only a selection; additional Large Format lens options are available.

Sensor Module Capabilities

Depending on the needs and expertise of our customers, we provide design and manufacturing services for a complete sensor module. We strive to find the best solution for your needs, from designing the schematic, creating the PCB layout, and sourcing all components, to building according to your PCB design and parts consignment. At Sunex, we have the in-house expertise and capabilities for lens and sensor board design, manufacturing, and testing to deliver a fully tested sensor module.

| PN | Format | MP | HFOV | F/# | TTL (mm) | Feature |
|---|---|---|---|---|---|---|
| tbc | Full Frame | 200MP | 180° | 2.8 | 105 | All-glass, wide-angle FOV |
| DSL005 | Full Frame | 200MP | 200° | 4.0 | 80 | All-glass, wide-angle FOV, high RI |
| DSL592 | 1.2” | 25MP | 130° | 2.9 | 44 | All-glass, wide-angle FOV, compact |
| DSL428 | 1” | 20MP | 80° | 1.8 | 81 | All-glass, high RI, low F/# |
| DSL427 | 1” | 20MP | 42° | 1.8 | 85 | All-glass, narrow FOV, low F/# |
| DSL415 | 1” | 20MP | 190° | 2.4 | 50 | All-glass, SuperFisheye™, 0% f-theta |

Out-of-this-league image quality with up to 100MP lenses.

Active Alignment Capabilities

We recommend that our customers consider an active alignment process to achieve the highest system performance when pairing a high-quality lens with a high-resolution sensor.
Applying fully automated 6-axis active alignment in mass production shortens cycle times, improves system performance, and lowers part-to-part variance. (sunex.com/products)
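The ~120 dB figure quoted for HDR sensors above follows the 20·log10 intensity-ratio convention commonly used for image-sensor dynamic range; a quick check of that arithmetic (a generic formula, not a Sunex specification):

```python
import math

def dynamic_range_db(intensity_ratio: float) -> float:
    """Optical dynamic range in dB using the 20*log10 convention,
    under which six orders of magnitude of intensity equal 120 dB."""
    return 20.0 * math.log10(intensity_ratio)

print(dynamic_range_db(1e6))  # six orders of magnitude
```

Every additional order of magnitude of scene contrast adds 20 dB, which is why "≥120 dB" stray-light performance is the bar cited for NoGhost-class lenses.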
---

## Sunex Medical Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2025/08/Sunex_Medical_Brochure_2025_online.pdf
- Type: PDF whitepaper

Medical Camera Modules

Helping to save lives with medical camera modules. (sunex.com/solutions)

[Graph: MTF vs. field.]

Sunex Medical Camera Modules

Sunex's expertise in endoscope lenses goes back to the mid-90s, when our founder and CEO designed the optical stack for one of the first disposable laparoscopes. Since then, we have designed and manufactured many optical systems for medical devices introduced into the market, including dental cameras, various endoscopes, and stereographic vision systems for robotic surgery.

Single-use Endoscopes

With the increase in minimally invasive surgery, the pressure to reduce cost per procedure, and the call to reduce cross-contamination, single-use (disposable) endoscopes have entered the market. Our experience and engineering know-how allow our customers to create new products and novel solutions, especially when the goal is to realize the benefits of single-use for medical devices requiring high resolution and best-in-class imaging quality.

Camera Module Capabilities

We work with our customers and partner network to find the best balance between cost and performance to meet often-unique application requirements. Sunex offers a wide range of services, including designing and manufacturing the lens, the optomechanical components, the PCBA, and the right cabling solution.
Sunex has the expertise and capabilities for high-volume manufacturing in state-of-the-art cleanroom facilities, automated 6-axis active alignment, and the test and quality-control processes required to support the most demanding medical imaging applications.

Fast Prototyping

In specific cases, prototyping is used as the first stepping stone toward mass production. Besides lens performance, exploring various mechanical design solutions is often part of these early efforts. Staying within the required dimensions while accommodating additional working channels and features is crucial for a successful product. Sunex can produce prototypes with short lead times, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components, to verify the design before transitioning further on the path to mass production.

Designed for Mass Production

Often the challenge is not to create a design “that works” but to find a solution that can scale to mass production while meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has over two decades of design and manufacturing experience, and all our lenses and modules are designed for high-volume manufacturability.

The table below shows only a selection; additional lens and module options are available.
| Type | Format | Class | FOV (°) | F/# | TTL (mm) | Features |
|---|---|---|---|---|---|---|
| Dental Camera | 1/4" | 3MP | 67 | 14.8 | 6.25 | Hybrid design, very large F/# |
| Ophthalmoscope | 1/2.5" | 5MP | 55 | 5.6 | 11.3 | All glass, narrow FOV |
| Endoscope | 1/4" | 2MP | 140 | 8.2 | 7.6 | Hybrid design, wide FOV |
| Endoscope | 1/3" | VGA | 85 | 1.7 | 6.0 | Short TTL, unibody design |
| Endoscope | 1/5" | 5MP | 90 | 6.0 | 4.9 | High resolution, short TTL |
| Endoscope | 1/2.5" | 8MP | 140 | 6.0 | 11.2 | Hybrid design, 4K, wide FOV |
| Endoscope | 1/3" | 8MP | 195 | 3.2 | 12.5 | Hybrid, 4K, SuperFisheye FOV |
Consulting – Design – Manufacturing – Support – sunex.com

25+ year track record of success in taking customer concepts from design through mass production.

©2025 Sunex Inc. All Rights Reserved. SUNEX INC.
3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA – Tel: +1 760-597-2966 – sunex.com/contact

---

## Sunex Dxm Sxm Show Edition Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_DXM_SXM_Show_Edition_online.pdf
- Type: PDF whitepaper

Why You Need to Consider Sunex DXM and SXM in Your Next HUMANOID ROBOT SYSTEM (Sunex Optics, Industrial Robotics 2025: 2 Key Tools)

### Application-Specific Opportunities

**AGVs and AMRs** - Warehouse robots and last-mile delivery bots require compact, cost-effective depth perception for obstacle avoidance and autonomous navigation. Since the operating environment is structured and typically well lit, the reduced baseline of direct imaging and the resolution of a single-sensor dual-channel system are acceptable trade-offs for gains in size, weight, and battery life.

**Humanoid and Consumer Robots** - For robots that interact with people or operate in tight spaces, such as service robots, assistants, or educational bots, single-sensor stereo vision provides reliable depth awareness for facial tracking, gesture detection, and object manipulation. The compact form factor enables embedding vision systems in aesthetically pleasing designs.

**Manufacturing Automation** - In high-speed production lines, stereo vision is used for bin picking, height profiling, presence detection, and assembly inspection. Single-sensor stereo cameras provide an efficient way to deliver these functions in a durable, factory-ready package. Their simplified calibration and reduced cabling also translate to easier deployment and less downtime.

### Expanded Use Cases Beyond Traditional Stereo Imaging

The same architecture used for stereo vision can also be adapted for multi-modal or dual-purpose imaging by varying the optical paths or filters on each channel.
This unlocks several compelling new applications:

**Dual Field of View (FOV) Imaging** - One optical channel can be designed for wide-angle situational awareness (e.g., 120° FOV), while the other is optimized for narrow-angle detail (e.g., 30° FOV). This is particularly useful in security robots (surveillance + facial identification), agricultural drones (whole-field + individual plant monitoring), and logistics (box detection + barcode reading).

**Simultaneous Visible and Infrared (RGB/IR) Imaging** - Another configuration uses one lens and optical filter stack optimized for RGB, while the other is tuned for near-IR or thermal infrared. This enables day/night (RGB-IR) vision in medical robotics (visual navigation + vein mapping), food processing (surface color + sub-surface bruising or spoilage), and smart agriculture (visible plant monitoring + chlorophyll/NIR reflection analysis).

**Extended Exposure HDR** - Imagine two otherwise identical lenses, one optimized for a low F/# and the other for a high F/#, capturing a wider dynamic range in the same exposure time while allowing more deterministic control over depth of field in robotics, machine vision, security, and autonomy.

**Stereo Content Capture** - The human eye is very sensitive to differences in color and relative illumination when presented with two images side by side. Single-sensor stereo content capture eliminates these discrepancies and the need to calibrate two different sensors, for use in AR/VR, content capture, and video conferencing.

Scan the QR code to download the in-depth case study (incl. a Guideline for System Designers).

SUNEX Industrial Robotics 2025, p. 4/4 – sunex.com

### Sunex DXM - Stereo Vision in a Smaller Package: Single-Sensor Stereo Imaging in Robotics and Beyond

As robotics and automation systems grow increasingly compact, intelligent, and power-efficient, the supporting vision technologies must evolve in parallel.
One area undergoing rapid growth and innovation is single-sensor stereo imaging, where two optical channels converge onto a single CMOS sensor. This architectural shift offers a powerful blend of reduced physical footprint, lower power consumption, improved synchronization and color matching, and overall cost efficiency. Originally explored for space-constrained applications, the concept is now gaining momentum across a diverse set of platforms, including Autonomous Mobile Robots (AMRs), Automated Guided Vehicles (AGVs), humanoid robots, manufacturing automation, and even multi-modal vision systems.

### The Architecture: Two Optical Channels, One CMOS Sensor

A single-sensor stereo imaging system consists of two independent optical channels, based on one of two base architectures:

- Relay-prism or mirror systems, which allow a longer baseline (distance between the optical channels), enabling better depth perception at mid-to-long ranges.
- Direct-imaging optics, where two small lenses with a shorter baseline directly image adjacent scenes onto the same CMOS sensor.

The result in either case is a stereo image pair captured simultaneously, pixel-aligned, and temporally consistent, without the need for a second sensor.

[Image: Sunex DXM™ direct-imaging camera leveraging SXM™ pre-aligned and interchangeable dual-optics heads]

### Compact Design and Space Efficiency

The compactness of single-sensor stereo systems is one of their most compelling features. It opens the door to new designs for low-profile AGVs, slim robotic arms, or humanoid head units, where stereo vision must be integrated without adding bulk or weight. Building on experience in designing and manufacturing miniaturized optics for automotive and medical systems, the DXM direct-imaging solution enables tighter baselines without sacrificing image quality or manufacturability.

### Power Efficiency in Battery-Operated Systems

In battery-powered robots, energy is often the most limited resource.
A conventional two-sensor stereo setup not only doubles sensor power draw but also adds thermal and processing load for synchronizing and handling dual video streams. With a single-sensor system, all of that duplicate overhead is eliminated.

### Perfect Synchronization and Simplified Calibration

Another major advantage of single-sensor stereo imaging is zero-latency synchronization, paramount for fast-moving robotic systems and dynamic environments. Both images are captured on the same sensor die in the same exposure cycle. This eliminates the need for complex software-level synchronization, color matching, dual-sensor calibration routines, or even sensor-to-sensor alignment.

### Cost Efficiency: Fewer Components, Lower BOM

Reducing component count directly translates to lower costs, not just in materials but also in assembly, calibration, and quality control. A single-sensor stereo system uses:

- One sensor (instead of two)
- A shared image processing pipeline
- Fewer connectors, cables, and serializers
- Simplified housing and optical alignment

### Performance Considerations

While compelling, single-sensor stereo systems are not without trade-offs.

- **Baseline Constraints** - In direct-imaging configurations, the baseline is inherently limited by the physical size of the optics and sensor. This constrains depth resolution and range, making such systems better suited for near-field applications (e.g., 0.2–2 meters). Relay optics can increase the baseline, but at the cost of added optical complexity and potential alignment drift if not properly designed.
- **Reduced Per-Channel Resolution** - Because the sensor area is split between two optical channels, each stereo view occupies only half (or less) of the total pixel array. While sufficient for many tasks, such as obstacle detection or object segmentation, this may be inadequate for high-precision metrology or long-distance depth mapping and would likely require a switch to a higher-resolution sensor.
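The baseline trade-off can be made concrete with the standard pinhole stereo model. Below is a minimal sketch using assumed values (focal length in pixels and two example baselines, not specifications of any Sunex DXM product) to show why short-baseline direct imaging favors near-field ranges:

```python
# Pinhole stereo model: depth Z = f * B / d, and first-order depth
# uncertainty dZ ~ Z^2 * dd / (f * B). All values are illustrative
# assumptions, not specifications of any Sunex product.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z from disparity d: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float, z_m: float,
                disparity_err_px: float = 1.0) -> float:
    """Depth uncertainty for a given disparity error: dZ = Z^2 * dd / (f * B)."""
    return z_m ** 2 * disparity_err_px / (f_px * baseline_m)

f_px = 800.0  # focal length in pixels (assumed)
for baseline in (0.010, 0.060):  # 10 mm direct imaging vs 60 mm relay optics
    for z in (0.5, 2.0, 5.0):
        err_cm = depth_error(f_px, baseline, z) * 100
        print(f"B={baseline * 1000:.0f} mm  Z={z:.1f} m  depth error ~ {err_cm:.1f} cm")
```

With the assumed 10 mm baseline, a 1-pixel disparity error yields roughly 3 cm of depth error at 0.5 m but over 3 m of error at 5 m, which is consistent with the near-field range (0.2–2 m) recommended above for direct-imaging configurations.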
[Figures: Sunex DXM for wide baseline requirements; comparison of stereo imaging architectures]
---

## Sunex Fovea Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_Fovea_Brochure_2025_online.pdf
- Type: PDF whitepaper

FOVEA™ Lenses

### Fovea Distortion

[Graph: pixels per degree vs. FOV]

The fovea centralis is located in the center of the retina and is responsible for high-acuity human vision. Sunex lenses with Fovea distortion map this behavior and "exaggerate" central details while trading off off-axis detail. Practically speaking, this results in a higher number of pixels per degree in the center, allowing machine vision algorithms to benefit from higher resolution in the central field compared to standard f-theta distortion lenses.

### Applications

Sunex's expertise and experience in manipulating distortion profiles to align with application-specific requirements have been valued by customers for many years. Our Tailored Distortion expertise has often been applied to SuperFisheye lenses to correct large-FOV lenses' barrel distortion. Applications that benefit from a Fovea distortion profile include forward-looking ADAS and autonomous-driving cameras, where the vehicle must detect objects at a far distance in the central FOV range while still maintaining a wider FOV for peripheral vision.

### High Dynamic Range (HDR)

HDR (high dynamic range) sensors can capture light intensity variations of up to six or more orders of magnitude within the same image frame (~120 dB). This puts very demanding requirements on lens performance.
Sunex has developed the design expertise, process know-how, and nested cleanroom manufacturing facilities to eliminate or minimize optical noise (such as ghosts, flare, starbursts, and spurious images) in lenses for high-performance applications.

All graphs are for illustration purposes only; individual lens performance can differ.

Sunex FOVEA lenses – sunex.com/products – Enable human-like vision with FOVEA lenses

[Graphs: FOV; HDR (dB)]

The table shows only a selection; additional FOVEA lens options are available.

### Athermalization

The shift of a lens's focal point over a wide temperature range is a physical phenomenon caused by material-specific expansion and contraction with temperature. If the focal point of the lens shifts too much relative to the sensor's image plane, image quality decreases. A fully athermalized system requires selecting appropriate optical and mechanical materials, the right design strategy, and close collaboration with the customer to optimize thermal performance at the system level.

### Fast Prototyping

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. Sunex can produce prototypes with short lead times to verify the design before transitioning to mass production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.
[Graph: IH/EFL vs. FOV for f-tan, f-theta, and Fovea distortion profiles (wide-FOV lens; Fovea full FOV and Fovea center FOV)]

| PN | Format | MP Class | HFOV | F/# | TTL (mm) | Features |
|---|---|---|---|---|---|---|
| DSL144 | 1/1.8" | 1.7MP | 100° | 1.6 | 24 | Hybrid, Short TTL |
| DSL345 | 1/1.9" | 1.7MP | 100° | 1.6 | 24 | All Glass, Short TTL |
| DSL364 | 1/1.8" | 2.1MP | 133° | 1.6 | 28.5 | All Glass, Wide FOV |
| DSL248 | 1/2.5" | 2.6MP | 100° | 2.0 | 25.4 | All Glass, Short TTL |
| DSL450 | 1/1.8" | 8.3MP | 122° | 1.44 | 30 | All Glass, 4K, High RI, very low F/# |
| DSL457 | 1/1.8" | 8.3MP | 120° | 1.8 | 26.3 | All Glass, 4K, large image circle |
| DSL452 | 1/1.43" | 12MP | 120° | 1.6 | 38 | All Glass, 12MP High Resolution, low Ghost |
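The distortion profiles in the graph above can be compared numerically. Below is a sketch of local pixel density (pixels per degree) for standard f-tan and f-theta mappings, plus a made-up fovea-like polynomial used purely for illustration; it is not Sunex's actual distortion profile, and the focal length and pixel pitch are assumed values.

```python
import math

# Local pixel density d(r)/d(theta), converted to pixels per degree.
# The "fovea" polynomial is a toy profile for illustration only.

def r_ftan(f_mm, theta):    # rectilinear mapping: r = f * tan(theta)
    return f_mm * math.tan(theta)

def r_ftheta(f_mm, theta):  # equidistant (f-theta) mapping: r = f * theta
    return f_mm * theta

def r_fovea(f_mm, theta):   # toy fovea-like profile: steeper on-axis
    return f_mm * (1.5 * theta - 0.5 * theta ** 3)

def pixels_per_degree(r_func, f_mm, theta_deg, pitch_um=3.0, step_deg=0.01):
    """Numerical derivative of image height, converted to pixels/degree."""
    t0, t1 = math.radians(theta_deg), math.radians(theta_deg + step_deg)
    dr_um = (r_func(f_mm, t1) - r_func(f_mm, t0)) * 1000.0
    return dr_um / pitch_um / step_deg

f_mm = 4.0  # assumed EFL in mm
for th in (0, 20, 40):
    print(th, "deg:",
          round(pixels_per_degree(r_ftheta, f_mm, th), 1), "px/deg (f-theta)",
          round(pixels_per_degree(r_fovea, f_mm, th), 1), "px/deg (fovea)")
```

The f-theta density is constant across the field, while the fovea-like profile concentrates noticeably more pixels per degree on-axis at the cost of the periphery, which is exactly the trade described in the Fovea Distortion section.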
---

## Sunex Lidar Tof Brochure 2025 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_LiDAR_ToF_Brochure_2025_online.pdf
- Type: PDF whitepaper

LiDAR and ToF Lenses™

### Time-of-Flight

ToF, or Time-of-Flight, refers to a measurement principle in which a signal leaves a source and a detector measures the time it takes for that same signal to return. The distance to any given object can be determined by factoring in the speed of the signal itself. Optical systems play a critical role when the signal is based on light; the most common such systems are ToF cameras and LiDARs. ToF cameras illuminate a scene with a modulated signal, and the phase shift between the transmitted and received signal determines the depth ranging.
LiDAR stands for Light Detection and Ranging and uses the delay between transmission and reception of a single laser pulse to determine depth. Sunex offers solutions for both types.

### Applications

Many industries and applications have leveraged these technologies for decades, from topology and meteorology to medical and industrial robotics. In recent years, ToF cameras and LiDARs have also entered high-volume consumer and automotive markets. Especially in LiDAR, many established and new companies are pushing the boundaries to reduce cost and advance performance for long- and short-range LiDAR systems.

### Designed for Mass Production

Often the challenge is not to create a design "that works" but to find a solution that can scale to mass production while meeting price targets, optical performance, mechanical constraints, and quality requirements. Sunex has decades of design and manufacturing expertise, and all of our ToF and LiDAR lenses are designed for high-volume manufacturability.

All graphs are for illustration purposes only; individual lens performance can differ.

Sunex LiDAR and ToF lenses – sunex.com/products – Create 3D depth perception with LiDAR lenses

The table shows only a selection; additional ToF and LiDAR lens options are available.

### Miniaturized SuperFisheye

With recent advancements expanding LiDAR technology to non-spinning short-range or near-field LiDARs, the requirements have shifted toward larger horizontal (HFOV) and vertical (VFOV) fields of view, smaller F/#, and a smaller overall form factor. Sunex pioneered and coined the term Miniaturized SuperFisheye lenses in the automotive industry. We are now applying the same design concepts and experience to support our customers in advancing their LiDAR product range.
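The two ranging principles described above reduce to short formulas: pulsed LiDAR uses d = c·t/2, while a continuous-wave ToF camera recovers distance from the phase shift, d = c·Δφ/(4π·f_mod), with an unambiguous range of c/(2·f_mod). A minimal sketch with illustrative values (not tied to any Sunex product):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Pulsed LiDAR: d = c * t / 2, where t is the round-trip time."""
    return C * round_trip_s / 2.0

def tof_range_m(phase_shift_rad: float, f_mod_hz: float) -> float:
    """CW ToF camera: d = c * dphi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

def tof_unambiguous_range_m(f_mod_hz: float) -> float:
    """Phase wraps at 2*pi, so the max unambiguous range is c / (2 * f_mod)."""
    return C / (2.0 * f_mod_hz)

print(lidar_range_m(66.7e-9))         # a ~67 ns round trip is ~10 m
print(tof_unambiguous_range_m(20e6))  # ~7.5 m at 20 MHz modulation
```

The unambiguous-range limit is one reason CW ToF cameras are typically short-range devices, while pulsed LiDAR scales to long range.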
### Automotive Qualified

With almost two decades as a qualified automotive supplier to our global customer base, we know what is required to design and manufacture a lens that has stable performance over a wide temperature range and passes automotive reliability and environmental testing. Whether we improve existing work through Design for Manufacturing (DFM) and Design to Cost (DTC) cycles or start with a blank-sheet design to meet all requirements, the end goal is always to deliver on time with consistent quality.

[Graphs: field MTF; transmission (T%) vs. wavelength for ToF camera, short-range LiDAR, and long-range LiDAR; relative illumination (RI%) vs. field]

| Type | Format | EFL (mm) | FOV (°) | F/# | TTL (mm) | Features |
|---|---|---|---|---|---|---|
| DSL146 | 1/2.8" | 3.3 | 123 | 1.4 | 28 | All Glass, Wide FOV, 4K High Resolution |
| DSL147 | 1/2.8" | 2.5 | 156 | 1.4 | 28 | All Glass, Wide FOV, 4K High Resolution |
| DSL148 | 1/3" | 2.2 | 122 | 1.4 | 20 | Hybrid Design, Wide FOV, Short TTL |
| DSL115 | 1/3" | 4.5 | 68 | 1.5 | 27 | Hybrid Design, Short TTL |
| DSL947 | 1/3" | 6.1 | 56 | 1.6 | 14 | All Glass, Small Form Factor |
| LiDAR Receiver | 1.5" | 41 | 35 | 1.4 | 57 | Long range, narrow FOV, low straylight |
| LiDAR Receiver | 1" | 8 | ≥120 | 1.3 | 50 | Hybrid Design, short range, wide FOV |
---

## Sunex Sxmdxm Brochure 2026 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_SXMDXM_Brochure_2026_online.pdf
- Type: PDF whitepaper

SXM and DXM™ – Sunex Patented Technologies. ©2026 Sunex Inc. All Rights Reserved.

### DXM with SXM Technology Dev Kits

The possibilities are endless when the SXM and DXM technologies are combined: from rapid field-of-view (FOV) changes and technology demonstrators to fast prototyping and flexible SKU configurations. All graphs are for illustration purposes only; individual lens performance can differ.
[Charts: Field vs. MTF and Field vs. HDR (dB)]

Unique Imaging Challenges require Unique Solutions. sunex.com/solutions

SXM – Sunex Interchangeable M12 Lenses

Introducing the world's first patented, interchangeable board-mount lens system—designed for unmatched versatility and speed. Switch from telephoto to fisheye in just three seconds with a system that supports any sensor and any lens. Over 300 off-the-shelf lenses from Sunex's extensive lineup can be seamlessly adapted to the SXM system, or you can design your own custom module with your preferred sensor and lens combination. With no need for focusing or alignment, simply drop in your components and start capturing—perfect even for high-megapixel cameras.

Similar to the DSLR system architecture, the SXM offers:

• Dozens of lens options
• A customizable user experience
• Field-replaceable lenses

Integrated into your business model, SXM can enable a continuing revenue stream for consumer/commercial products in the form of additional lenses and upgrades after the initial sale. More than just hardware, Sunex's proprietary manufacturing process allows Sunex to install the precision "SXM" system on your sensor board, which will work with ANY compatible SXM lens.

DXM – Sunex Multi-Channel Technology

Sunex's patent-pending DXM system offers a groundbreaking solution by enabling multiple image circles on a single sensor, effectively eliminating multi-sensor synchronization and latency issues. It supports multiple effective focal lengths (EFLs), F-numbers (F/#s), and spectral options, including the ability to capture separate RGB and IR channels on the same sensor. Unlike other dual-channel solutions, DXM is highly configurable and scalable, accommodating nearly any lens and sensor combination.
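The single-sensor multi-channel idea can be illustrated in a few lines of plain Python. This is a hypothetical sketch, not Sunex's implementation: it assumes two image circles placed side by side on one sensor, so a single exposure and readout yields two inherently synchronized views.

```python
def split_dxm_frame(frame):
    """Split one sensor frame (list of pixel rows) into left/right views.

    In a DXM-style multi-channel layout, two lenses project separate image
    circles side by side onto a single sensor, so both channels come from
    one exposure and one readout — which is why synchronization is inherent
    rather than engineered. The 50/50 split is a hypothetical geometry
    chosen for illustration.
    """
    width = len(frame[0])
    left = [row[: width // 2] for row in frame]
    right = [row[width // 2 :] for row in frame]
    return left, right

# A toy 4x8 "frame": row-major list of pixel values.
frame = [[x + 8 * y for x in range(8)] for y in range(4)]
left, right = split_dxm_frame(frame)
print(len(left[0]), len(right[0]))  # → 4 4
```

Because both halves come from the same readout, there is no cross-camera timestamping, trigger wiring, or exposure matching to manage downstream.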
• Perfect Synchronization
• Reduced Noise
• Perfect Color Matching
• Simplified Processing
• Simplified Architecture
• Interchangeable Lenses

DXM – Single-Sensor Stereo Vision

---

## Sunex Smartag Brochure 2026 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_SmartAG_Brochure_2026_online.pdf
- Type: PDF whitepaper

Smart Agriculture. Consulting – Design – Manufacturing – Support. ©2026 Sunex Inc. All Rights Reserved. sunex.com. SUNEX INC., 3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA. Tel: +1 760-597-2966. sunex.com/contact. 25+ year track record of success in taking customer concepts from design through mass production.

Aerial Crop Analytics & Drone Monitoring

Drones equipped with Sunex SuperFisheye or multispectral-ready lenses enable crop surveys across large fields, and high-resolution aerial imaging lenses support early detection of nutrient deficiencies, water stress, and pest pressure. Lightweight Sunex optics improve flight time, support accurate stereo perception, deliver actionable insights that enhance fertilization and irrigation planning, and improve yield prediction.

Our Vision for Smart Agriculture

Infrastructure Monitoring & Automation

Livestock facilities, grain storage systems, and irrigation equipment increasingly require remote visibility.
Sunex wide field-of-view (FOV) lenses with TailoredDistortion and actively aligned camera modules enable structural inspection, livestock monitoring, feed management, and environmental compliance, delivering the imaging quality needed for analytics to provide alerts on movement, contamination, leaks, or equipment performance. From autonomous vehicles to real-time crop analytics, Sunex products, technology, precision volume manufacturing, and global support enable every smart agriculture system, reducing manual supervision and strengthening facility automation.

Environmental & Weather Intelligence

Local micro-weather directly impacts irrigation needs, crop stress, and harvest timing. Weather stations enhanced with Sunex NoGhost high-dynamic-range optics enable continuous monitoring of sky conditions, cloud cover, soil indicators, and visibility. Sunex outdoor-grade camera modules are optimized for temperature stability and long-term exposure, improving short-term forecasting and supporting automated farm responses such as irrigation control, frost mitigation, and harvest scheduling.

All graphs are for illustration purposes only. Individual lens performance may differ.

[Charts: Field vs. MTF, Field vs. HDR (dB), and Wavelength vs. T%]

Smart Agriculture Systems require a Smart Vision Partner. sunex.com/solutions

Modern farming increasingly depends on real-time data, automation, and intelligent decision-making. High-performance optical and imaging systems from Sunex, including rugged lens assemblies, RGB-IR modules, and the single-sensor stereo technology DXM, are transforming agricultural operations by enhancing efficiency, improving yield quality, and reducing labor requirements.

Autonomous Harvesting & Machine Guidance

Advanced field machinery—such as combines, planters, and automated tractors—now relies on multi-sensor stereo and RGB-IR imaging to navigate fields, detect crop rows, identify obstacles, and optimize harvesting paths.
Sunex provides athermalized, wide-FOV optics and compact DXM stereo modules engineered for harsh outdoor environments, minimizing operator workload and improving consistency under dust, vibration, or lighting variation. Machine-integrated imaging also supports precision unloading, yield mapping, and automated route planning.
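The stereo perception these modules support reduces to triangulation: depth equals focal length times baseline divided by disparity. A minimal sketch with hypothetical camera parameters (not Sunex specifications):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo triangulation: Z = f * B / d.

    focal_px     -- focal length expressed in pixels (hypothetical value)
    baseline_m   -- distance between the two optical centers, in meters
    disparity_px -- horizontal pixel shift of a feature between the views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 12 cm baseline, 25 px disparity.
print(stereo_depth_m(1000.0, 0.12, 25.0))  # → 4.8 (meters)
```

The formula also shows why baseline matters for range accuracy: halving the baseline halves the disparity for the same depth, doubling the depth error from a given pixel-matching uncertainty.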
---

## Sunex Uled Lens Design Edition 2026 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_uLED_Lens_Design_Edition_2026_online.pdf
- Type: PDF whitepaper

SUNEX AUTO NEWS 2026 – INSIGHTS and Future PERSPECTIVES 2026. sunex.com

Hybrid Lens Designs in HD µLED Headlamps

Lighting at a Turning Point

The evolution of automotive lighting has entered a critical phase. With adaptive driving beam (ADB) systems and intelligent exterior lighting moving into mainstream production, high-resolution micro-LED (µLED) technology is emerging as a central enabler. µLEDs provide pixel-level control of light output, allowing real-time adaptive responses to road conditions and opening opportunities for new OEM brand signatures. Yet the promise of µLEDs comes with engineering complexity. Optical designs must deliver unprecedented resolution and efficiency while also surviving demanding automotive environments. Thermal management, manufacturing scalability, and cost competitiveness all play equally important roles. Within this environment, lens hybridization—the strategic use of glass and plastic elements within a single optical system—offers a balanced pathway forward.

µLEDs and the Rise of Optical Complexity

Conventional matrix LED systems typically consist of one to three optical elements, with tolerances around 50 µm. These systems are relatively simple to assemble, relying on well-understood glass or plastic optics. By contrast, µLED-based systems often demand four or more elements, assembled with sub-10 µm alignment precision. This leap in complexity is driven by the compact size, small individual pixels, and high intensity of µLED chips. Achieving their full potential requires far tighter control of aberrations, resolution, and uniformity across the field, all within compact, thermally stable optical stacks.

(Picture source: Porsche Newsroom)

The Value of Hybrid Lens Design

Hybridization blends the complementary strengths of glass and plastic. Together, these materials enable optical stacks that are often smaller, more efficient, and cost-balanced while meeting the performance demands. The final lens design is often the result of an iterative process and always strives to be well-balanced, taking into account all individual considerations and client requirements. No matter the final product, the initial base assumptions are always the same:

• Glass elements offer low coefficients of thermal expansion (CTE), stability over lifetime, and a wide variety of refractive indices. Their behavior under stress and temperature cycling is predictable, making them a backbone for critical focusing elements.
• Plastic elements excel in cost-effective shaping. Aspherical surface geometries, non-radial forms, and even rectangular optics can be produced directly in molding. Prototyping is faster, and high-volume production can significantly reduce costs.

Reliability Considerations for Plastics

Despite their advantages, plastics present challenges when subjected to high thermal loads across wide operating temperature ranges. Elevated temperatures and repeated cycling can lead to:

• Aging effects, such as moisture absorption and index drift.
• Degradation in transmission, resulting in yellowing over time.
• Coating vulnerabilities, including abrasion and crazing.

High-end automotive plastics have been used in automotive camera optics for decades and are recognized for mitigating these risks. However, they come at a significant cost—often 10 to 20 times more than common PMMA or PC. Therefore, the role of plastics must be carefully defined in any hybrid design, ensuring they complement rather than compromise long-term reliability.

Engineering Trade-Offs in Hybridization

Designing hybrid stacks requires decisions across several dimensions:

• Element placement relative to the µLED source – glass remains preferred for high-flux, high-heat positions (close to the µLED source).
• Surface geometry – leveraging plastics for aspherical optical surfaces, outer flanges, and shapes that would be prohibitively expensive in glass.
• Lifetime stability – balancing the need for optical coatings (efficiency, straylight), abrasion resistance, and long-term exposure to high temperature gradients.
• Performance outcomes – optimizing resolution, uniformity, efficiency, and color performance while containing costs.

In many cases, hybrid stacks enable OEMs to reduce the overall mechanical dimensions of the optical system while maintaining performance and manufacturability.

Compact Architectures in Practice

Recent Sunex products demonstrate how hybridization and ultra-compact designs enable next-generation HD µLED-based headlamp systems:

• A 48° HFOV all-glass lens achieves compact packaging, with an outer diameter (OD) and total track length (TTL) just above 30 mm.
• A 24° HFOV hybrid lens reduces system size by 35% compared to conventional designs, supporting the trend toward smaller headlamp systems.

These examples demonstrate that hybrid stacks and innovative design solutions enable both performance and packaging improvements, supporting future µLED chip sizes. (Sunex DSL030 Compact Projector Lens; Sunex DSL092 Hybrid Projector Lens.)

Scan the QR code to access product datasheets, case studies, including a Design Guideline, and more.

Conclusion – Engineering for Reliability

The transition to µLED headlamps represents not just a new technology, but a new design paradigm. Success depends on integrating optical design, materials science, thermal analysis, and manufacturing expertise into a cohesive system strategy. Hybrid lens design offers a proven pathway: by combining glass and plastic elements thoughtfully, engineers can achieve the right balance of reliability, optical performance, and cost efficiency.

Key Takeaways

• Hybridization balances trade-offs between mechanical size, performance, and cost.
• Material selection matters: glass ensures the required thermal stability, while plastics enable innovative geometries.
• Reliability must remain a central goal, particularly when considering thermal cycling and extended lifetime.
• Compact projection lens designs are achievable, enabling the proliferation of µLED-based HD headlamp systems that balance performance and OEM aesthetics.

2 Key Facts: Account for thermal load. Compact is the future.

---

## Sunex Μlight Brochure 2026 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/01/Sunex_µLight_Brochure_2026_online.pdf
- Type: PDF whitepaper

Consulting – Design – Manufacturing – Support. ©2026 Sunex Inc. All Rights Reserved. sunex.com. 25+ year track record of success in taking customer concepts from design through mass production. SUNEX INC., 3160 Lionshead Ave, Suite B, Carlsbad, CA 92010, USA. Tel: +1 760-597-2966. sunex.com/contact. HD Projection Lenses.

Automotive Headlamp Solutions

Lighting functions in and around the car play an ever-increasing role in an OEM's brand recognition and are a signature piece for corporate design consideration.
From a purely technical perspective, modern high-definition (HD) headlamps are designed for two main applications: advanced lighting functions and road projections. Even with the underlying technologies advancing far beyond what the early trailblazers of automotive headlamps could have envisioned, we still optimize for the same goals:

• reduce glare
• increase efficiency and range
• make driving safer

Sunex's design, engineering, and manufacturing know-how is well known in the automotive industry. Our consistent quality and on-time delivery have made us a preferred supplier of imaging optics for leading Tier 1s and OEMs for over a decade. Building on that history and reputation, we are successfully advancing the new high-resolution automotive headlamp segment with our customers and partners.

All graphs are for illustration purposes only. Individual lens performance may differ. Sunex µLIGHT lenses: sunex.com/products

Fast Prototyping

We provide prototyping services for complete lens assemblies, often as the first step after a new custom design. In automotive headlamps, the series-production barrel design is a crucial part of the prototyping process. It defines the headlamp module's critical interface to the vehicle, making it as important as the optical system itself. Sunex can produce prototypes with short lead times to verify the design before advancing to OEM series production, using state-of-the-art fabrication processes for glass and plastic optical elements and all mechanical components.

Test & Measurement Capabilities

Sunex has expanded its existing design, test, and measurement capabilities to account for the specific needs of projection optics and our automotive headlamp customers. In-house equipment includes test systems for the characterization of large-format projection optics, a goniophotometer lab with industry-standard analysis software, and VDA 19.1 test equipment.

See the road ahead with µLIGHT lenses. Data does not constitute specifications.
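Several of the projector lenses in the table below are specified near F/0.7, and F-number directly sets light-gathering: on-axis flux scales roughly with 1/(2·F/#)². A back-of-envelope sketch under the paraxial approximation, for illustration only:

```python
def relative_flux(f_number: float) -> float:
    """On-axis flux proxy, proportional to 1 / (2 * F#)^2 (paraxial model)."""
    return 1.0 / (2.0 * f_number) ** 2

# Comparing two hypothetical projector lenses at F/0.66 and F/0.75:
gain = relative_flux(0.66) / relative_flux(0.75)
print(f"{gain:.2f}x")  # → 1.29x: the faster lens collects ~29% more light
```

This is why sub-F/0.8 designs dominate the segment: at a fixed µLED drive level, a lower F-number translates almost directly into projected luminous flux, though it also tightens aberration control and tolerance budgets.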
Additional µLIGHT lens options are available.

[Charts: Field vs. MTF and Efficiency vs. HFOV]

| Lens | HFOV (°) | F/# | Efficiency | TTL (mm) | Source | Notes |
|---|---|---|---|---|---|---|
| A | 48 | 0.66 | 46 | 31 | 16/26k | All glass, wide HFOV, compact design |
| B | 24 | 0.7 | 42 | 70 | 16k | Hybrid, narrow HFOV, good efficiency |
| C | 20 | 0.75 | 40 | 70 | 16k | All glass, narrow HFOV, OEM production |
| D | 40 | 0.7 | 49 | 57 | 16k | All glass, wide HFOV, OEM production |
| E | 24 | 0.65 | 43 | 63 | 16/26k | Hybrid, high resolution, OEM production |
| F | 30 | 0.7 | 50 | 59 | +3k | Hybrid, wider HFOV, high efficiency |

Expanding the Solution Space

The requirements for high-resolution automotive projector lenses are changing as adoption of this technology rapidly increases. Besides mechanical constraints and the need to meet OEM styling guidelines, we typically see MTF, Emax, efficiency, and color aberrations as crucial performance differentiators. Sunex has developed extensive design expertise, engineering capabilities, and manufacturing process know-how that address this expanding solution space.

---

## Sunex Customdesign Medical Edition 2026 Online [PDF]

- Source: https://sunex.com/wp-content/uploads/2026/02/Sunex_CustomDesign_Medical_Edition_2026_online.pdf
- Type: PDF whitepaper

SUNEX MED NEWS 2026. sunex.com

Use Cases That Benefit from a Custom Approach

There are many different indications, devices, and procedures in the medical field. Even though they all have their specific requirements, they all benefit from a custom approach to the optical system if one or more of the following are required:

• High-resolution and wide-angle disposable endoscopes requiring low distortion and superior edge-to-edge sharpness.
• Large depth-of-field (DOF) imaging where autofocus is needed to maintain clarity at variable object distances.
• Dual-imager or stereo vision systems, where precise calibration and alignment between channels are mandatory.
• Low-light or IR-capable applications requiring optimized optics and sensor pairing.

Scan the QR code to access product datasheets, case studies, including a Design Guideline, and more.

Conclusion: Choosing the Right Imaging Architecture

Fully integrated chip-level camera modules provide a valuable solution for many basic disposable imaging tasks, especially where space and cost constraints are paramount.
However, when performance matters, whether it’s higher image quality, autofocus capability, robustness, or system-specific sensor selection, a custom camera module built by Sunex offers differentiation by meeting the performance, size, and commercial objectives required to make the end- customer successful in their domain. From lens design and simulation to sensor integration, active alignment, tunable focus implementation, and custom PCB layout, Sunex provides a comprehensive pathway from concept to volume production, delivering imaging systems that meet the demands of next-generation medical devices. HIGH RESOLUTION FOR DISPOSABLE ENDOSCOPES Sunex MOD124 Hybrid Disposable Endoscope Module Lens Custom Optics vs. Chip-level Camera Modules INSIGHTS and Future PERSPECTIVES 2026 MORE IMPORTANT THAN EVER Custom Optics vs. Chip-level Camera Modules Custom Optics for Medical Devices offer a Flexible Alternative As the market for disposable endoscopes, catheter-based imaging, and minimally invasive diagnostic devices grows, medical OEMs face a key architectural decision: use a fully integrated sensor module like Omnivision’s CameraCubeChip®, or pursue a custom camera system where image quality, flexibility, and system integration are optimized for the end application. While integrated modules offer simplicity and ultra-compact size, they are not always the best fit, especially when device differentiation, superior image performance, or tight system integration is required. In these scenarios, Sunex’s custom optical and sensor module solutions present a powerful alternative. SUNEX MED NEWS 2026 p 2/4 sunex.com SUNEX MED NEWS 2026 p 3/4 sunex.com Understanding the Trade-Off: Convenience vs. Customization Chip-level integration delivers a compact, ready-to-use module by embedding a CMOS sensor, a fixed lens, and packaging them into a single unit. This approach is well-suited to basic visualization tasks, especially where cost and simplicity dominate. 
However, this integration comes at the cost of flexibility: • Fixed optics limit field of view, depth of field, and image plane tuning. • No autofocus capability for applications with variable working distances. • Limited sensor options, especially for newer sensors with specialized features (e.g., global shutter, large pixel size, or spectral sensitivity). Ophthalmoscopes & Vision Care Instruments Sunex: Tailored Imaging Systems Designed for Performance, Size, and Cost Sunex specializes in designing and manufacturing custom optical systems, miniature lenses, and complete camera modules for medical imaging applications, including disposable (aka single-use) endoscopes. Our expertise enables medical device companies to develop systems tailored to specific clinical tasks and imaging environments. Key Advantages of Sunex’s Custom Approach: • Application-Specific Optics Custom-designed lenses to achieve your desired FOV , working distance, MTF , distortion characteristics, and mechanical envelope. • Tunable Autofocus Options Possible integration of tunable lens elements (e.g., electrically tunable liquid lenses) enables autofocus capability in miniature imaging systems—ideal for multi-depth procedures or variable tissue distances. • Sensor & PCB Flexibility Support for any CMOS sensor of your choice—no lock-in to predefined imaging specs. • Sunex also offers custom PCB design and manufacturing, including image sensor integration for bare die and packaged CMOS sensors, power delivery, and connectivity. • Advanced Optical Alignment Active alignment between the lens and sensor ensures optimal focus, centering, angular performance, and the smallest part-to-part variance, which is particularly critical for small-pixel sensors, high-resolution systems, and large-volume applications. • Feasibility Studies & System Optimization Early-stage design feasibility services, optical simulations, and system-level analysis help de-risk development and accelerate product timelines. 
• Sterilization-Ready Materials Lens and housing materials are selected in accordance with the customer’s requirements for sterilization and approval processes commonly used for single-use medical devices. • Scalable Manufacturing Whether you need prototypes for clinical trials or full-scale production for disposable scope lines, Sunex’s vertically integrated capabilities can scale with your business from early prototypes to high-volume series production. Dental Camera Disposable/Single-use Endoscopes For advanced medical procedures, such as robotic- assisted surgery or high-resolution disposable endoscopes, the benefits of a custom module often outweigh the convenience of a one-size-fits-all chip- level integrated solution. Ingo Foldvari Sunex, Director of Business Development Custom Optics vs. Chip-level Camera Modules Custom Optics for Medical Devices offer a Flexible Alternative As the market for disposable endoscopes, catheter-based imaging, and minimally invasive diagnostic devices grows, medical OEMs face a key architectural decision: use a fully integrated sensor module like Omnivision’s CameraCubeChip®, or pursue a custom camera system where image quality, flexibility, and system integration are optimized for the end application. While integrated modules offer simplicity and ultra-compact size, they are not always the best fit, especially when device differentiation, superior image performance, or tight system integration is required. In these scenarios, Sunex’s custom optical and sensor module solutions present a powerful alternative. SUNEX MED NEWS 2026 p 2/4 sunex.com SUNEX MED NEWS 2026 p 3/4 sunex.com Understanding the Trade-Off: Convenience vs. Customization Chip-level integration delivers a compact, ready-to-use module by embedding a CMOS sensor, a fixed lens, and packaging them into a single unit. This approach is well-suited to basic visualization tasks, especially where cost and simplicity dominate. 
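The autofocus argument above can be made concrete with standard depth-of-field arithmetic: a fixed-focus module is sharp only within a band around its set focus distance, so variable working distances push the scene out of that band. A minimal sketch using the standard thin-lens DOF approximations; the example parameters (focal length, F/#, circle of confusion, working distance) are hypothetical illustration values, not Sunex or OmniVision specifications.

```python
# Illustrative only: depth-of-field band of a fixed-focus miniature lens.
# Standard thin-lens approximations; all parameters below are hypothetical.

def dof_limits(f_mm, f_number, coc_mm, focus_mm):
    """Near/far limits of acceptable sharpness for a lens focused at focus_mm.

    f_mm      : focal length in mm
    f_number  : working F/#
    coc_mm    : acceptable circle of confusion in mm
    focus_mm  : set focus (working) distance in mm
    """
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = focus_mm * (hyperfocal - f_mm) / (hyperfocal + focus_mm - 2 * f_mm)
    if focus_mm >= hyperfocal:
        far = float("inf")  # everything beyond `near` is acceptably sharp
    else:
        far = focus_mm * (hyperfocal - f_mm) / (hyperfocal - focus_mm)
    return near, far

# Hypothetical endoscope-style lens: f = 1.0 mm, F/4, 3 µm circle of
# confusion, fixed focus at a 15 mm working distance.
near, far = dof_limits(1.0, 4.0, 0.003, 15.0)
print(f"sharp from {near:.1f} mm to {far:.1f} mm")
```

With these assumed numbers the sharp band runs from roughly 13 mm to 18 mm, so tissue a few millimeters closer than the set focus distance falls outside it: exactly the situation where a tunable (autofocus) element earns its place.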
---