TechTips

Bandwidth under control with IDS GigE Vision cameras

GigE Vision cameras transmit image data in small packets over the network, even before a captured sensor image is read out completely. This minimizes the delay of the image transfer. However, if too much data is transferred at the same time, the maximum bandwidth of a GigE network can be exceeded very quickly.

Multi-camera applications are especially affected: if the bandwidth is exceeded, data is lost and transfer times increase because packets have to be requested again. The GigE Vision standard therefore allows the transmission parameters to be configured to avoid such situations. With the extended settings of IDS GigE Vision firmware 1.3, you can easily manage the available bandwidth.
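
For instance, a bandwidth limit can be set from Python with any GenICam-compliant library. The sketch below uses the open-source Harvester package (version 1.3 or newer) and the standard GigE Vision/SFNC node names DeviceLinkThroughputLimit, GevSCPSPacketSize and GevSCPD; the producer path, the limit values and the exact node availability depend on your setup and camera firmware, so treat them as placeholders.

```python
# Sketch: limiting per-camera bandwidth on a GigE Vision camera via GenICam,
# using the open-source "harvesters" package and any GenTL producer (.cti).
# Node names follow the GigE Vision / SFNC standard -- check your camera's
# node map, as availability and exact names depend on the firmware.
from harvesters.core import Harvester

h = Harvester()
h.add_file("/opt/ids/lib/producer.cti")   # placeholder: path to a GenTL producer
h.update()

ia = h.create(0)                          # open the first camera found
nm = ia.remote_device.node_map

# Cap this camera at roughly 40 MB/s so that several cameras can share one link.
# (On some cameras a DeviceLinkThroughputLimitMode node must be set to "On" first.)
nm.DeviceLinkThroughputLimit.value = 40_000_000   # bytes per second

# Alternatively, stretch the stream in time via packet size and inter-packet delay.
nm.GevSCPSPacketSize.value = 1500   # streaming packet size in bytes
nm.GevSCPD.value = 2000             # inter-packet delay (timestamp ticks)

ia.destroy()
h.reset()
```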

Embedded Vision Kit: Prototype development with the uEye Python interface and OpenCV

Classical machine vision is rapidly evolving towards embedded vision. However, developing an embedded vision device can be very time-consuming and cost-intensive. With proprietary developments in particular, it may take a long time before the first results are available.

Today, a number of suitable standard embedded components allow out-of-the-box testing. Combined with qualified software solutions, first insights for vision applications can be gained very quickly.

Our TechTip shows, in a few simple steps, how to implement a basic embedded vision application with a uEye camera and a Raspberry Pi 3.
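
To give you an idea of what the TechTip walks through, here is a minimal sketch of a single-frame capture with the uEye Python interface (pyueye) and OpenCV. It assumes that the IDS Software Suite together with the pyueye, numpy and opencv-python packages is installed and that the first camera (device 0) is used; error handling is reduced to the bare minimum.

```python
# Minimal single-frame capture with pyueye + OpenCV (sketch, camera 0 assumed).
from pyueye import ueye
import numpy as np
import cv2

h_cam = ueye.HIDS(0)                      # 0 = first available camera
if ueye.is_InitCamera(h_cam, None) != ueye.IS_SUCCESS:
    raise RuntimeError("is_InitCamera failed")

# Query the current AOI to get the image dimensions.
rect_aoi = ueye.IS_RECT()
ueye.is_AOI(h_cam, ueye.IS_AOI_IMAGE_GET_AOI, rect_aoi, ueye.sizeof(rect_aoi))
width, height = rect_aoi.s32Width, rect_aoi.s32Height

# 8-bit BGR so the buffer can be handed to OpenCV directly.
bits_per_pixel = ueye.INT(24)
ueye.is_SetColorMode(h_cam, ueye.IS_CM_BGR8_PACKED)

# Allocate an image memory and make it the active buffer.
mem_ptr, mem_id = ueye.c_mem_p(), ueye.int()
ueye.is_AllocImageMem(h_cam, width, height, bits_per_pixel, mem_ptr, mem_id)
ueye.is_SetImageMem(h_cam, mem_ptr, mem_id)

# Capture one frame and copy it into a numpy array.
ueye.is_FreezeVideo(h_cam, ueye.IS_WAIT)
pitch = ueye.INT()
ueye.is_InquireImageMem(h_cam, mem_ptr, mem_id, width, height, bits_per_pixel, pitch)
data = ueye.get_data(mem_ptr, width, height, bits_per_pixel, pitch, copy=True)
frame = np.reshape(data, (height.value, width.value, 3))

cv2.imwrite("ueye_frame.png", frame)      # any OpenCV processing could follow here

ueye.is_FreeImageMem(h_cam, mem_ptr, mem_id)
ueye.is_ExitCamera(h_cam)
```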

Without calibration of the RGB enhancement, the Bayer matrix is clearly visible

Greater resolution in monochrome mode: How to get greater resolution from your colour sensor

The AR1820HS 18 megapixel sensor in our UI-3590 camera models was launched by the sensor manufacturer ON Semiconductor as a pure colour sensor. As with all colour sensors, the Bayer filter means that you get colour images with effectively only around a quarter of the nominal sensor resolution, as the colour information for each pixel is obtained from four neighbouring pixels.

However, to use each individual pixel it is not sufficient simply to operate the sensor in raw data format (without Bayer interpolation): the Bayer matrix causes the individual pixels to perceive brightness differently. We show you how to use the colour sensor as a “pure” mono sensor through appropriate parameter settings and the use of suitable light sources, in order to obtain a significantly higher resolution.
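
In pyueye terms, the basic idea looks roughly like the sketch below: switch the sensor to a raw (non-debayered) pixel format and balance the channel-dependent brightness behind the Bayer filter with the hardware RGB gains. The gain values shown are purely illustrative; the values that actually equalise the channels depend on your sensor and light source and have to be determined by calibration, as described in the TechTip.

```python
# Sketch: using a colour sensor as a "mono" sensor with pyueye (camera 0 assumed).
from pyueye import ueye

h_cam = ueye.HIDS(0)
ueye.is_InitCamera(h_cam, None)

# 1) Deliver raw sensor data, i.e. without Bayer interpolation.
ueye.is_SetColorMode(h_cam, ueye.IS_CM_SENSOR_RAW8)

# 2) Balance the different brightness response of the pixels behind the red,
#    green and blue filter elements with the hardware gains. The values below
#    are illustrative only -- determine your own with a homogeneous target and
#    a suitable light source, as described in the TechTip.
master_gain, red_gain, green_gain, blue_gain = 0, 18, 0, 25
ueye.is_SetHardwareGain(h_cam, master_gain, red_gain, green_gain, blue_gain)

ueye.is_ExitCamera(h_cam)
```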

The new "Adaptive hot pixel correction" is supported from IDS Software Suite 4.82 on.

Flexible and dynamic: Using adaptive hot pixel correction

What's that dot in my image? If you're asking yourself this question, then you've probably just discovered a hot pixel. A certain number of hot pixels exist in all standard image sensors and are perceived as a defect in an image, as they appear brighter or darker than the other pixels. Hot pixels cannot be completely avoided in sensors, even if great care is taken during sensor production.

So, wouldn't it be a really nifty idea if hot pixels could be detected dynamically in the application directly under all operating conditions? Well, that's exactly what can now be done thanks to "adaptive hot pixel correction", available in the IDS Software Suite as of Version 4.82.
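
If you work with the API rather than uEye Cockpit, hot pixel correction is controlled via the SDK function is_HotPixel(). The following pyueye sketch only outlines how enabling the adaptive correction might look; the adaptive command constant used here is an assumption and must be checked against the uEye SDK documentation for version 4.82 or later.

```python
# Rough sketch: enabling adaptive hot pixel correction via is_HotPixel() (pyueye).
# NOTE: the command constant below is an assumption -- look up the exact
# adaptive-correction commands in the uEye SDK documentation (4.82 or later).
from pyueye import ueye

h_cam = ueye.HIDS(0)
ueye.is_InitCamera(h_cam, None)

# is_HotPixel(hCam, nMode, pParam, SizeOfParam) is the documented entry point for
# hot pixel correction; IS_HOTPIXEL_ADAPTIVE_MARKER_ENABLE is assumed to be the
# command that switches on the adaptive (dynamic) correction.
ret = ueye.is_HotPixel(h_cam, ueye.IS_HOTPIXEL_ADAPTIVE_MARKER_ENABLE, None, 0)
if ret != ueye.IS_SUCCESS:
    print("Adaptive hot pixel correction could not be enabled:", ret)

ueye.is_ExitCamera(h_cam)
```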

Parameter change in real time: Using the sequencer mode

Do you want to capture image sequences with different exposure times or different image sections, without manually reconfiguring the camera while capturing? Sounds complicated? It isn't!

A special feature that was previously reserved for camera models with e2v sensors is now available for the whole USB 3 uEye CP Rev. 2 camera family as of IDS Software Suite 4.81: the sequencer mode. To help you get started, there is a special "uEye sequencer demo".
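
As a rough orientation, the sequencer is configured through the SDK function is_Sequencer(): enable the sequencer, select a set, configure the features the set should switch (for example the exposure time), and store the set. The pyueye sketch below only outlines this flow; the IS_SEQUENCER_* command constants are assumptions and should be verified against the uEye SDK manual and the "uEye sequencer demo".

```python
# Outline sketch of configuring sequencer sets via is_Sequencer() (pyueye).
# CAUTION: the IS_SEQUENCER_* command constants are assumptions -- verify them
# against the uEye SDK manual and the "uEye sequencer demo" before use.
from pyueye import ueye

h_cam = ueye.HIDS(0)
ueye.is_InitCamera(h_cam, None)

# Switch the sequencer on (assumed command name).
enable = ueye.INT(1)
ueye.is_Sequencer(h_cam, ueye.IS_SEQUENCER_MODE_ENABLED_SET, enable, ueye.sizeof(enable))

# Configure two sets with different exposure times and store each one.
for set_index, exposure_ms in enumerate([2.0, 20.0]):
    idx = ueye.INT(set_index)
    ueye.is_Sequencer(h_cam, ueye.IS_SEQUENCER_SET_SELECTED_SET, idx, ueye.sizeof(idx))

    exposure = ueye.double(exposure_ms)
    ueye.is_Exposure(h_cam, ueye.IS_EXPOSURE_CMD_SET_EXPOSURE,
                     exposure, ueye.sizeof(exposure))

    # Persist the current camera parameters into the selected sequencer set
    # (assumed command name).
    ueye.is_Sequencer(h_cam, ueye.IS_SEQUENCER_SET_SAVE, None, 0)

ueye.is_ExitCamera(h_cam)
```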