Image: © Scharfsinn86 | Dreamstime.com

COTS and OSS Promote Cost-Effective Custom Vision-System Design (Part 2)

April 5, 2024
COTS-based machine-vision systems reduce time-to-market while providing more robust solutions. Part 2 looks at how open-source software can be leveraged to meet those goals.

This article is part of the TechXchange: Machine Vision.

What you’ll learn:

  • How open-source software (OSS) like OpenCV is changing the way machine-vision systems are designed.
  • How vision systems are used in in-vehicle applications.
  • How COTS and OSS can reduce development costs.

 

For Part 1, click here.

An increasing number of OEMs are incorporating vision systems to automate their products. One way to do this in a custom, affordable, and scalable way is to utilize commercial-off-the-shelf (COTS) components and open-source software. 

The first article in this series introduced how external developers are constantly improving COTS camera sensors to provide the solutions sought by their customers. Part 2 discusses how open-source software can reduce cost and time-to-market, and the key considerations every OEM should know when designing a vision system. 

Open-Source Software Development Is Constant

Open-source software and artificial intelligence are emerging as essential building blocks for engineers developing embedded camera and machine-vision systems. Because the source code is available to the public, independent users can modify it and develop successful platforms collaboratively.

Such readily available software saves product developers many months of work. And because developers build on each other's contributions, the entire industry can rapidly advance sensor-technology software. Oh, and by the way, it's free!

One example of an open-source platform is OpenCV, an open-source computer-vision and machine-learning (ML) software library built to provide a common infrastructure for complex image-sensor-related processing. Its international community of over 47,000 people has downloaded the software more than 18 million times. According to OpenCV.org, the library offers 2,500-plus optimized computer-vision and ML algorithms whose code businesses can easily utilize and modify.
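
As a rough illustration of how little code it takes to tap that library, the short Python sketch below loads an image, converts it to grayscale, and runs the stock Canny edge detector, one of the library's bundled algorithms. It assumes OpenCV is installed (the opencv-python package), and the file name sample.jpg is a placeholder for any image on disk:

import cv2

# Load a test image from disk (the file name is a placeholder for illustration)
image = cv2.imread("sample.jpg")
if image is None:
    raise FileNotFoundError("sample.jpg not found")

# Convert to grayscale, then run OpenCV's stock Canny edge detector
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, threshold1=100, threshold2=200)

# Save the result for inspection
cv2.imwrite("edges.jpg", edges)

A comparable routine written from scratch would take far longer to develop and validate, which is precisely the leverage open source provides.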

It takes a special skill set for engineers to achieve the level of performance expected from a more complex system at a cost low enough to embed inside an OEM solution. Mechanical, electrical, and software engineers bring their diverse backgrounds to the table when they collaborate to build custom solutions with the help of open-source technologies and off-the-shelf products. 

Case Study: High-Performance In-Vehicle Camera System 

An OEM wanted to improve its existing high-performance image processor, which lived and operated inside the trunk of an active police car. The manufacturer needed a flexible, updatable platform to accommodate the ongoing advances in camera technology. 

EmbedTek was selected to design the new system and quickly identified that migrating from one camera to the next would only be possible if the processor could remain in place and handle the upgrades. Unfortunately, the data-capture components, cables, and connectors in the OEM’s current system stood in the way. These proprietary components would require the OEM to replace the entire system to support future cameras. 

Custom hardware and software developed by EmbedTek eliminated the need for proprietary… anything. It allowed the OEM to replace just the processor module while keeping the existing, already-deployed cameras. As a result, field retrofits could begin immediately without changing the nature of the video stream. 

Engineers can accommodate future architecture needs when the next-generation cameras and interfaces become available. The new image processor was designed using field-tested COTS components as main building blocks, drastically reducing the time needed for the innovation process. 

The new product outperforms the OEM’s previous processor in many ways: 

  • The protection and durability required for the brutal in-vehicle environment are balanced with the high-performance imaging the processor must deliver.
  • Able to withstand weather and climate anywhere in the world, from a near-Arctic winter to a Saudi Arabian summer. 
  • Reduction in the overall volume of the system by 50%.
  • Longer, more straightforward cable runs and connectors provide better power distribution, reduce noise and signal loss, and lower manufacturing costs.
  • Higher-performance image processing and analytics capture more accurate information at faster vehicle speeds—upwards of 180 mph. 

The OEM is happy with this solution: a less expensive, more reliable, and more durable design equipped to adapt to the fast pace of innovation in camera technology (see figure). Its end users are pleased with the better, more reliable image captures that are critical to their jobs.

Vision-system design requires careful selection of camera sensors and careful thought about the software that captures, processes, and interprets sensor data. COTS components and open-source software work efficiently alongside custom designs to make an extremely powerful tool for OEM equipment in every industry. The exciting part is that every application is genuinely different from the next. But that’s the challenging part, too. 

Key Considerations When Designing a Vision System

There are hundreds of ways to approach a vision-system challenge and thousands of COTS component configurations from which to develop a solution. Consider the following factors when selecting components and your approach for a new vision-system platform. 

Cameras—as an OEM Sensor

Camera sensors and integrated COTS cameras are available with many options. Start by listing out the specific minimum requirements needed for your application. The following are some of the key factors for sensor selection, lens selection, and illumination methods. If a COTS camera doesn’t exist with all of the specifications you’re looking for, an engineering partner can help design a custom solution using components that consider price, lifecycle, and manufacturability. 

  • Resolution: The size of the captured image, measured in X and Y pixels, which determines how far an image can be zoomed in before it becomes grainy. In machine vision, the resolution must be high enough to capture the needed data, but anything beyond that only slows processing time. Camera technology has advanced to offer a range from sub-megapixel to over 50 Mpixels, where text in an image may be recognized from a mile away. (A back-of-envelope sizing sketch follows this list.)
  • Shutter speed: The length of time the shutter lets light reach the sensor, which affects color penetration, depth, and contrast. It’s a major consideration in selecting a sensor: What’s needed to freeze the object? A faster shutter speed requires more light, but it can freeze a fast-moving object, which is important when adding artificial light is impractical. With strobing, this capability matters less, because the strobe will freeze an object even at a slower shutter speed. 
  • Shutter type: Two main types of shutters are available, rolling and global. In applications where the subject moves quickly, global shutters are superior because they expose all of the pixels at once during the capture. Rolling shutters are less expensive and work well for slower applications where the object doesn’t move during the exposure; however, they can produce artifacts if the subject moves while the sensor rows are read out sequentially.
  • Frame rate: The number of frames a camera can display per second. The higher the frames per second, the better the camera can capture the motion, and the better images synchronize with external events. A high frame rate is critical in many high-speed applications.
  • Lighting: Used in conjunction with resolution and shutter speed to capture the desired image; the light source can be a component internal or external to the camera. Proper use of strobing, for example, allows crisp capture of images and, with proper positioning, can eliminate issues such as glare.
  • Filters: Placed on camera lenses or within the optical path to augment light, control color, and reduce glare and saturation in a final image.
  • High dynamic range (HDR): Takes multiple photos at different exposures and combines them to compensate for oversaturated highlights and to recover darker areas in the background, creating the best final image possible. Custom post-processing can also be applied to control glare and shadowing within an image.
  • Lenses: Lens selection is critical when developing a system. There are many things to consider, including linearity, f-stop (how efficiently the lens lets light in), quality of optics (important for high-pixel-count cameras), focal length, and lens distortion. 
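
To make these trade-offs concrete, the Python sketch below runs two back-of-envelope checks: the horizontal pixel count needed to resolve a minimum feature across a field of view, and the motion blur (in pixels) for a moving subject at a given exposure time. All of the numbers are hypothetical placeholders, not figures from the case study above:

# Back-of-envelope sensor sizing with placeholder numbers for illustration
FIELD_OF_VIEW_M = 4.0      # horizontal scene width covered by the camera, meters
MIN_FEATURE_M = 0.005      # smallest feature that must be resolved, meters
PIXELS_PER_FEATURE = 3     # rule of thumb: several pixels across the smallest feature

# Horizontal pixel count needed to resolve the smallest feature
required_pixels = (FIELD_OF_VIEW_M / MIN_FEATURE_M) * PIXELS_PER_FEATURE
print(f"Required horizontal resolution: {required_pixels:.0f} pixels")

# Motion blur: how far the subject moves across the image during one exposure
SUBJECT_SPEED_MPS = 80.0   # roughly 180 mph expressed in meters per second
EXPOSURE_S = 1 / 2000      # shutter speed (exposure time), seconds

pixels_per_meter = required_pixels / FIELD_OF_VIEW_M
blur_pixels = SUBJECT_SPEED_MPS * EXPOSURE_S * pixels_per_meter
print(f"Motion blur at 1/2000 s: {blur_pixels:.1f} pixels")

If the computed blur spans more than a pixel or two, the remedies are exactly those discussed above: a faster shutter with more light, or strobed illumination.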

Software—to Analyze Data

Software developers write complex image-processing algorithms for their systems. Once data is captured from the camera, the system needs to analyze the data to identify trends, perform a function, or make a logical decision. A vision system is only as good as its data, so the analytics function is critical to a successful application. 

Open-source algorithms serve as a valuable foundation for a software process. Engineers can also learn from the algorithms and then rewrite, refactor, redesign, or hone the method to work more effectively in their specific application. The following are essential software features that are foundational to any vision system. 

  • Background subtraction: A widely used technique for generating a foreground mask from a static camera (a minimal sketch follows this list).
  • Object detection: Classifies objects within an image to know what they are, and returns bounding-box coordinates to know where each object is located.
  • Object tracking: Follows detected objects from frame to frame in a real-time video stream.
  • Attribute measurement: Size, area, contour, shape, color, reflectivity… any parameter that describes an object from a software perspective.
  • Distance measurement: The distance to an object in an image, typically calculated from the object’s known real-world width, its apparent width in pixels, and the camera’s focal length.
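
As a starting point for the first item on the list, the brief Python sketch below uses OpenCV's stock MOG2 background subtractor to generate a foreground mask from a fixed-camera video and draw bounding boxes around moving regions. The file name traffic.mp4 is a placeholder for any static-camera stream:

import cv2

# Open a video from a fixed (static) camera; the file name is a placeholder
capture = cv2.VideoCapture("traffic.mp4")

# MOG2 is one of OpenCV's stock background-subtraction algorithms
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = capture.read()
    if not ok:
        break  # end of stream

    # Foreground mask: moving pixels become white, the static background black
    mask = subtractor.apply(frame)

    # Outline each sufficiently large moving region with a bounding box
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) > 500:  # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(contour)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("foreground", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()

In practice, an OEM team would refactor and harden this kind of open-source building block for its specific sensor, lighting, and performance requirements, as described above.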

Partner with a Machine-Vision Expert

Historically, low-cost, long-life light sources, imaging devices, interfaces, computers, and software were unavailable to the machine-vision industry, limiting the number of applications and end users that could benefit from a vision system.

Now, off-the-shelf camera sensors, open-source software platforms, and a new segment of external developers with custom design expertise drive significant innovation. COTS components and open-source software applications are rapidly evolving the capabilities of vision systems while driving down the cost per unit, making vision systems accessible to any manufacturer in any industry.

When the sky is the limit, OEMs need an embedded technology partner they can work with to create a solution that has their customer's best interest in mind for today and throughout their product's lifetime. Select a partner that’s knowledgeable in the industry, understands how a new platform will impact the manufacturing supply chain, and has the skill to develop a truly unique and innovative solution.

Read more articles in the TechXchange: Machine Vision.
