Embedded GigE Vision Server
Communication between image source and image sink can take place via various interfaces, each offering individual advantages depending on the requirements for cable length, data rate and system topology. High-resolution cameras with high frame rates generate extremely large data volumes that demand optimised interfaces and therefore require special interface cards in the computer (e.g. CoaXPress, CameraLink, CameraLink HS).
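A quick back-of-the-envelope calculation makes the point. The sketch below uses hypothetical but plausible camera figures (12 MP monochrome at 80 fps) to show why such a stream exceeds standard PC interfaces and calls for the specialised ones; the nominal bandwidths are the raw link rates, not the lower usable payload rates.

```python
# Rough bandwidth estimate for an uncompressed camera stream.
# Camera figures are illustrative: 12 MP, 8-bit mono, 80 fps.
width, height = 4096, 3000       # ~12.3 MP sensor (assumed)
bits_per_pixel = 8               # 8-bit monochrome
fps = 80

bits_per_second = width * height * bits_per_pixel * fps
gbit_per_second = bits_per_second / 1e9
print(f"raw data rate: {gbit_per_second:.2f} Gbit/s")

# Compare against nominal link rates (usable payload is lower still).
interfaces = {"1GigE": 1.0, "USB3 (5 Gbit/s)": 5.0, "10GigE": 10.0}
for name, capacity in interfaces.items():
    verdict = "fits" if gbit_per_second < capacity else "exceeds"
    print(f"{name}: stream {verdict} the nominal {capacity} Gbit/s")
```

At roughly 7.9 Gbit/s, this stream saturates both 1GigE and USB3, which is exactly the regime where CoaXPress or CameraLink HS cards become necessary.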
The roles of image source and image sink are usually defined unidirectionally. The camera interfaces mentioned above reflect this in their asymmetric transmission bandwidths between source and sink. However, most industrial applications use less sophisticated cameras whose data rates allow transmission via the standard interfaces found in ordinary PC systems (USB, Ethernet, FireWire).
Advantages of the Ethernet interface
The Ethernet interface, now mostly implemented as Gigabit Ethernet (1GigE), offers a special feature: "full duplex" operation. Unlike the other interfaces listed above, it can send and receive data at full bandwidth simultaneously and without mutual interference.
The control protocol (heartbeat packets) only fails when the bandwidth is saturated in both directions. The resulting disconnection can be avoided with appropriate settings (interpacket delay). The Ethernet interface therefore dissolves the fixed assignment of image source (camera) and image sink (computer) in image processing.
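The interpacket delay works by spacing out stream packets so the link is never fully saturated and control traffic such as heartbeats always gets through. A minimal sketch of the arithmetic, under assumed figures (1GigE link, 9000-byte jumbo packets, 10% headroom reserved for control traffic); real cameras express this delay in device-specific timestamp ticks, e.g. via the GigE Vision GevSCPD feature:

```python
# Sketch: choose an inter-packet delay that caps a stream's bandwidth,
# leaving headroom for control traffic (heartbeats). Figures are assumed.
link_bits_per_s = 1_000_000_000   # 1GigE link rate
target_share = 0.9                # reserve 10% for control packets
packet_bytes = 9000               # jumbo-frame payload (assumed)

packet_bits = packet_bytes * 8
wire_time_s = packet_bits / link_bits_per_s        # time one packet occupies the wire
budget_s = wire_time_s / target_share              # per-packet budget at the capped share
delay_s = budget_s - wire_time_s                   # idle gap to insert between packets

delay_us = delay_s * 1e6
print(f"inter-packet delay: {delay_us:.1f} us per {packet_bytes}-byte packet")
```

An 8 µs gap after every 72 µs packet keeps the stream at 90% of the link, so the remaining 10% stays free for the heartbeat even under full load.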
By now, almost all manufacturers of industrial cameras with a GigE interface and of industrial imaging software have agreed on GigE Vision as the hardware protocol and GenICam as the software interface. Thanks to this hardware and software standardisation, compatibility between established cameras and software packages across the most diverse computer systems is comprehensive and extensively tested.
Remarkably, the GigE Vision standard as the hardware protocol, together with GenICam as the software interface, is not restricted to the 1GigE interface. The advantages described apply equally when using Ethernet interfaces with higher bandwidths (e.g. 10GigE). It is essential here to distinguish clearly between hardware protocol and software interface, since GenICam support as a software interface is also required by CoaXPress. Used loosely, the terms carry a high risk of misunderstanding.
CVB with full GigE Vision support
With the CVB GigE Vision Server, STEMMER IMAGING has dissolved the classic role allocation between camera and computer by exploiting the described properties of the Ethernet interface with full use of the GigE Vision standard and corresponding certification. Using this standard software module from the Common Vision Blox programming library, the computer can now also act as an image source and, thanks to the full-duplex transmission of the Ethernet interface, send image data to other image sinks via GigE Vision. Because the GigE Vision support is complete, this image source behaves exactly like a "normal" GigE Vision camera and can therefore communicate – i.e. transmit images – with any GigE Vision-compatible image sink.
What initially looks like a technical gimmick opens up, at second glance, possibilities that will lead to completely new system topologies. Established mindsets in system design need to be questioned, especially now that STEMMER IMAGING has presented a CVB version for ARM processors and Linux.
Decentralised, compact embedded systems
Particularly in combination with current System-on-Chip (SoC) platforms, the previously clear border between imaging with "intelligent cameras" and "PC-based imaging" is becoming blurred. Decentralised, compact embedded systems based on highly specialised SoCs (e.g. Intel Cyclone V or NVIDIA Jetson TX1) can now record the image data from several cameras, pre-process them and output the generated result images again as a GigE Vision camera. The complete control and the transmission of results can take place fully transparently via the GenICam functionality, and thus require no proprietary adaptation of the downstream image sink.
Here the GenICam standard demonstrates its flexibility: the camera itself informs the software which special features it provides. In the case of the GigE Vision Server, these "camera features" are freely programmable and can therefore describe system functions that extend well beyond pure camera functions. The "virtual" camera created in this way can be fully remote-controlled via the corresponding camera features. The CVB GigE Vision Server technology is thus directly available to every user of GigE Vision-compatible software on any image sink – irrespective of manufacturer, operating system and platform.
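Conceptually, such freely programmable features form a registry of named, writable nodes, where writing a value can trigger a system function on the embedded device. The following is a simplified illustration of that idea only – it is not the CVB API, and all feature and function names are hypothetical:

```python
# Conceptual sketch (NOT the CVB GigE Vision Server API): a server-side
# registry mimicking how freely programmable "camera features" could map
# system functions onto GenICam-style nodes. All names are hypothetical.

class Feature:
    """A named value; writing it may trigger an action on the device."""
    def __init__(self, name, value, on_write=None):
        self.name = name
        self._value = value
        self._on_write = on_write

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        if self._on_write:                 # side effect on the embedded system
            self._on_write(new)

registry = {}

def add_feature(name, value, on_write=None):
    registry[name] = Feature(name, value, on_write)

# A conventional camera feature, and a system function that goes well
# beyond pure camera control - both remote-controlled the same way:
add_feature("ExposureTime", 10_000.0)
add_feature("PreprocessingMode", "EdgeFilter",
            on_write=lambda v: print(f"switching processing pipeline to {v}"))

registry["PreprocessingMode"].value = "Threshold"   # remote write by the sink
print(registry["ExposureTime"].value)
```

In a real GenICam device, this self-description is delivered as a GenApi XML file that the image sink reads, which is why any GigE Vision-compatible software discovers these extended features automatically.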
The computing power of the available SoCs and the variety of supported interfaces are impressive. With several USB3, GigE and MIPI interfaces, current SoCs can serve as decentralised vision systems recording from the most diverse image sources.
Possible applications range from the purely local conversion of a USB or MIPI camera to the GigE Vision standard, through the local pre-processing of a single camera image – for example on the FPGA of the Intel Cyclone V – with forwarding of the pre-processed image via GigE Vision, to the recording of several camera images and the forwarding of the complete result images after computationally intensive pre-processing on the local GPU of an NVIDIA Jetson TX1 SoC. There is also nothing to stop one building one's own GigE Vision-compatible camera with the CVB GigE Vision Server on an ARM-based SoC; all that additionally needs to be connected to the system is a CCD or CMOS sensor.
The longer one engages with the possibilities of the GigE Vision Server on SoC platforms, the more blurred the familiar definitions become. Whether these local imaging nodes of the complete system are called "computer", "camera", "intelligent camera" or "vision sensor" is ultimately a matter for the observer.