TechTips

Embedded Vision Kit: Prototype development with the uEye Python interface and OpenCV

Classical machine vision is rapidly evolving towards embedded vision. But developing an embedded vision device can be very time-consuming and cost-intensive. Especially with proprietary developments, it may take a long time until the first results are available.

Today, there are a number of suitable embedded standard components that allow out-of-the-box testing. In combination with qualified software solutions, the first insights for vision applications can be derived very quickly.

Our TechTip shows, in a few steps, how to implement a simple embedded vision application with a uEye camera and a Raspberry Pi 3.

Without calibration of the RGB enhancement, the Bayer matrix is clearly visible

Greater resolution in monochrome mode: How to get greater resolution from your colour sensor

The AR1820HS 18-megapixel sensor in our UI-3590 camera models was launched by the sensor manufacturer ON Semiconductor as a pure colour sensor. As with all colour sensors, the Bayer filter means that colour images are obtained with effectively only around a quarter of the nominal sensor resolution, because the colour information for each pixel is interpolated from four neighbours.

 

To use each individual pixel, however, it is not sufficient simply to operate the sensor in RAW data format (i.e. without Bayer interpolation): the Bayer matrix gives the individual pixels a different brightness response. We will show you how to use the colour sensor as a "pure" mono sensor, with appropriate parameter settings and suitable light sources, in order to obtain a significantly higher resolution.
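The channel equalisation idea can be sketched in a few lines of Python with NumPy. This is only a minimal illustration of the principle, not the parameter settings described in the TechTip; an RGGB Bayer pattern and broadband illumination are assumed:

```python
import numpy as np

def equalize_bayer_channels(raw):
    """Equalize the brightness of the four Bayer channels of a RAW image.

    Assumes an RGGB pattern: the image is tiled with 2x2 cells, so each
    of the four colour channels lives on its own sub-grid. Scaling each
    sub-grid so its mean matches the overall mean removes the
    checkerboard brightness pattern under broadband light.
    """
    out = raw.astype(np.float64).copy()
    target = out.mean()
    # Visit the four sub-grids of a 2x2 Bayer cell
    for dy in (0, 1):
        for dx in (0, 1):
            channel = out[dy::2, dx::2]  # view into `out`
            channel *= target / channel.mean()
    return out
```

With real RAW data, the scale factors would of course be calibrated once under the intended light source rather than recomputed per image.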

The new "Adaptive hot pixel correction" is supported as of IDS Software Suite 4.82.

Flexible and dynamic: Using adaptive hot pixel correction

What's that dot in my image? If you're asking yourself this question, then you've probably just discovered a hot pixel. A certain number of hot pixels exist in all standard image sensors and are perceived as a defect in an image, as they appear brighter or darker than the other pixels. Hot pixels cannot be completely avoided in sensors, even if great care is taken during sensor production.

 

So, wouldn't it be a really nifty idea if hot pixels could be detected dynamically in the application directly under all operating conditions? Well, that's exactly what can now be done thanks to "adaptive hot pixel correction", available in the IDS Software Suite as of Version 4.82.
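The principle behind such a correction can be sketched in Python with NumPy. Note that this is only an illustration of the general idea (outlier detection against the local neighbourhood), not the algorithm used in the IDS Software Suite:

```python
import numpy as np

def correct_hot_pixels(img, threshold=50.0):
    """Replace pixels that deviate strongly from their neighbourhood.

    A pixel whose value differs from the median of its 3x3
    neighbourhood by more than `threshold` is treated as a hot (or
    dead) pixel and replaced by that median.
    """
    img = np.asarray(img, dtype=np.float64)
    padded = np.pad(img, 1, mode="edge")
    # Stack the 3x3 neighbourhood of every pixel along a new axis
    h, w = img.shape
    neighbours = np.stack(
        [padded[dy:dy + h, dx:dx + w]
         for dy in range(3) for dx in range(3)],
        axis=0,
    )
    median = np.median(neighbours, axis=0)
    return np.where(np.abs(img - median) > threshold, median, img)
```

An adaptive correction running in the camera or driver works on the live image in exactly this spirit: it needs no stored defect list, so it also catches hot pixels that only appear under certain operating conditions.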


Parameter change in real time: Using the sequencer mode

Do you want to capture image sequences with different exposure times or image sections? Without manually reconfiguring the camera during capture? Sounds difficult? It isn't!

 

A special feature that was previously reserved for camera models with e2v sensors is now available for the entire USB 3 uEye CP Rev. 2 camera family as of IDS Software Suite 4.81: the sequencer mode. To help you get started, there is a special "uEye sequencer demo".
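The idea behind the sequencer can be sketched in plain Python. The parameter names and the `capture` callback here are hypothetical, not part of the uEye API; with a real sequencer, the cycling through parameter sets happens inside the camera for every frame, without any host-side reconfiguration:

```python
from itertools import cycle

# Hypothetical sequencer sets, e.g. an exposure bracket where the last
# set also switches to a smaller image section (AOI: x, y, width, height).
sequencer_sets = [
    {"exposure_ms": 1.0, "aoi": (0, 0, 1920, 1080)},
    {"exposure_ms": 4.0, "aoi": (0, 0, 1920, 1080)},
    {"exposure_ms": 16.0, "aoi": (480, 270, 960, 540)},
]

def acquire(n_frames, capture):
    """Capture n_frames, applying the next sequencer set to each frame.

    `capture` stands in for the actual frame grab; it receives the
    parameter set that the (simulated) sequencer selects for that frame.
    """
    sets = cycle(sequencer_sets)
    return [capture(next(sets)) for _ in range(n_frames)]
```

Because the sets repeat cyclically, frame 4 uses the same parameters as frame 1, and so on, exactly the behaviour needed for HDR brackets or alternating inspection views.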

You can save the camera settings in an *.ini file or in the non-volatile user memory of the camera.

Parameterising instead of programming: The faster way to camera setup

The optimum settings are an important requirement for every application to run effectively. Spending time and effort establishing these settings is quite justified. But does this effort have to be repeated for each application, or can the initial configuration be saved and reused?

 

All available settings are already implemented in uEye Cockpit. You only have to choose, activate, adjust, parameterise and save. Configuration takes just a few clicks, eliminating the considerable effort of programming the camera configuration yourself.