TechTips


Triggering according to the Lego principle

We are all familiar with them: the colorful plastic bricks with the patented stud system that lets you combine the bricks in any way you wish and give your imagination free rein. The Lego principle is as ingenious as it is simple and has proven to be a recipe for success. The world of machine vision has made good use of this principle.

With the release of IDS GigE Vision Firmware V1.5, IDS has greatly enhanced the IDS vision cameras by adding many standard features from the GenICam Standard Feature Naming Convention. With the IDS Vision Cockpit, IDS also provides you with an ideal demo tool, which you can use to reconstruct and thoroughly test each of the trigger cases described here.

Bandwidth under control with IDS GigE Vision cameras

GigE Vision cameras transmit image data in small packets over the network, even before a captured sensor image is read out completely. This minimizes the delay of the image transfer. However, if too much data is transferred at the same time, the maximum bandwidth of a GigE network can be exceeded very quickly.

Multi-camera applications are especially affected. If data has to be requested repeatedly, this results in transmission losses and increased transmission times. The GigE Vision Standard allows the configuration of transmission parameters to avoid such situations. The extended settings of IDS GigE Vision firmware 1.3 make it easy to manage the available bandwidth.
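The core of such a bandwidth configuration is the inter-packet delay: by spacing the packets out, a camera's average data rate is held below a chosen share of the link. The following sketch computes that delay from the packet size and a target bandwidth. The feature names mentioned in the comments (`GevSCPSPacketSize`, `GevSCPD`) follow the GenICam SFNC; the 8 ns timestamp tick is an assumption and varies per camera model.

```python
# Sketch: estimate the GigE Vision inter-packet delay (GevSCPD) needed to
# cap one camera's share of the link, given its packet size (GevSCPSPacketSize).
# The 8 ns tick (125 MHz timestamp clock) is an assumption, not a spec value.

def inter_packet_delay_ticks(packet_size_bytes: int,
                             target_bandwidth_bps: float,
                             link_speed_bps: float = 1e9,
                             tick_ns: float = 8.0) -> int:
    """Delay (in timestamp ticks) to insert between packets so the camera's
    average data rate stays near target_bandwidth_bps."""
    # Time one packet occupies the wire at full link speed (ns):
    wire_time_ns = packet_size_bytes * 8 / link_speed_bps * 1e9
    # Total time budget per packet to hit the target rate (ns):
    budget_ns = packet_size_bytes * 8 / target_bandwidth_bps * 1e9
    delay_ns = max(0.0, budget_ns - wire_time_ns)
    return round(delay_ns / tick_ns)

# Two cameras sharing a gigabit link: give each one roughly 400 Mbit/s.
delay = inter_packet_delay_ticks(packet_size_bytes=1500,
                                 target_bandwidth_bps=400e6)
# delay -> 2250 ticks (18 us between 1500-byte packets)
```

The same arithmetic applies regardless of vendor; only the register or feature used to write the resulting delay differs.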

Embedded Vision Kit: Prototype development with the uEye Python interface and OpenCV

Classical machine vision is rapidly evolving towards embedded vision. But developing an embedded vision device can be very time-consuming and cost-intensive. Proprietary developments in particular can cost a lot of time before the first results are available.

Today, there are a number of suitable embedded standard components that allow out-of-the-box testing. In combination with qualified software solutions, the first insights for vision applications can be derived very quickly.

Our TechTip shows in a few simple steps how to implement a simple embedded vision application with a uEye camera and a Raspberry Pi 3.
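The central interop step in such an application is turning the raw frame buffer delivered by the camera driver into a NumPy array that OpenCV can process. The sketch below shows that step without camera hardware, using a stand-in byte buffer; the width, height and bit depth are illustrative values, not fixed properties of any uEye model.

```python
import numpy as np

# Sketch: convert a raw frame buffer (as a camera driver such as pyueye
# delivers it) into a NumPy image. The dimensions are illustrative.
width, height = 640, 480                    # assumed mono sensor, 8 bit/pixel
raw = bytes(width * height)                 # stand-in for the driver's buffer

# One reshape turns the flat buffer into an image array...
frame = np.frombuffer(raw, dtype=np.uint8).reshape(height, width)
# ...which OpenCV functions (e.g. cv2.GaussianBlur, cv2.Canny) accept directly.
```

With a real camera, the buffer would come from the driver's memory API instead of `bytes()`; everything downstream of the reshape stays the same.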

Without calibration of the RGB enhancement, the Bayer matrix is clearly visible.

Greater resolution in monochrome mode: How to get greater resolution from your colour sensor

The AR1820HS 18 megapixel sensor in our UI-3590 camera models was launched by the sensor manufacturer ON Semiconductor as a pure colour sensor. As with all colour sensors, the Bayer filter means that you get colour images with effectively only around a quarter of the nominal sensor resolution, as the colour information for each pixel is obtained from four neighbours.


To use each individual pixel, however, it is not sufficient to operate the sensor in RAW data format (without Bayer interpolation): the Bayer matrix results in different brightness responses of the individual pixels. We will show you how to use the colour sensor as a "pure" mono sensor by means of appropriate parameter settings and suitable light sources, in order to obtain a significantly higher resolution.
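One way to equalize those channel responses can be sketched as follows: from a RAW flat-field image, estimate the mean response of each of the four Bayer channels and derive per-pixel gains that bring them to a common level. This is a minimal illustration of the principle under an assumed RGGB layout, not the IDS parameter interface.

```python
import numpy as np

def bayer_channel_gains(raw: np.ndarray) -> np.ndarray:
    """Per-pixel gain map that equalizes the four Bayer channels of an
    RGGB RAW flat-field image (illustrative sketch, not the IDS API)."""
    gains = np.ones_like(raw, dtype=np.float64)
    # Mean response of each 2x2 Bayer channel position:
    means = [raw[dy::2, dx::2].mean() for dy in (0, 1) for dx in (0, 1)]
    target = max(means)          # lift every channel to the brightest one
    i = 0
    for dy in (0, 1):
        for dx in (0, 1):
            gains[dy::2, dx::2] = target / means[i]
            i += 1
    return gains

# Flat-field where green responds twice as strongly as red/blue (RGGB):
flat = np.tile(np.array([[100, 200], [200, 100]], dtype=np.float64), (4, 4))
mono = flat * bayer_channel_gains(flat)   # -> uniform 200 everywhere
```

In practice the gains would be derived once from a calibration shot under the intended (ideally narrow-band) illumination and then applied to every RAW frame.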

The new "Adaptive hot pixel correction" is supported as of IDS Software Suite 4.82.

Flexible and dynamic: Using adaptive hot pixel correction

What's that dot in my image? If you're asking yourself this question, then you've probably just discovered a hot pixel. A certain number of hot pixels exist in all standard image sensors and are perceived as a defect in an image, as they appear brighter or darker than the other pixels. Hot pixels cannot be completely avoided in sensors, even if great care is taken during sensor production.


So, wouldn't it be a really nifty idea if hot pixels could be detected dynamically in the application directly under all operating conditions? Well, that's exactly what can now be done thanks to "adaptive hot pixel correction", available in the IDS Software Suite as of Version 4.82.
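The idea behind such a dynamic correction can be sketched in a few lines: compare each pixel with the median of its eight neighbours and replace the outliers. The following is a minimal illustration of that principle with an assumed deviation threshold, not the algorithm IDS actually implements.

```python
import numpy as np

def correct_hot_pixels(img: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """Replace pixels deviating strongly from their 8-neighbour median
    (minimal illustration of the principle, not the IDS implementation)."""
    img = img.astype(np.float64)
    padded = np.pad(img, 1, mode="edge")
    h, w = padded.shape
    # Stack the 8 shifted neighbour views and take their median per pixel.
    shifts = [padded[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    neigh_median = np.median(np.stack(shifts), axis=0)
    hot = np.abs(img - neigh_median) > threshold
    out = img.copy()
    out[hot] = neigh_median[hot]
    return out

frame = np.full((5, 5), 20.0)
frame[2, 2] = 255.0                  # a simulated hot pixel
clean = correct_hot_pixels(frame)    # outlier replaced by its neighbourhood
```

Because the detection runs on the live image rather than a factory-calibrated pixel list, pixels that only become "hot" under certain gains, exposure times or temperatures are caught as well, which is exactly the benefit of an adaptive approach.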