FireWire camera

FireWire cameras use the IEEE 1394 bus standard for the transmission of audio, video and control data. FireWire is Apple's trademark for the IEEE 1394 standard.

FireWire cameras are available in the form of photo cameras and video cameras, which provide image and audio data. A special form of video cameras is used in the domains of industry, medicine, astronomy, microscopy and science. These special cameras do not provide audio data.

Different forms of FireWire cameras

Structure

The basic structure of FireWire cameras is based on the following six modules:

Optics

The structure of FireWire cameras

FireWire cameras are based on CCD or CMOS chips. The light-sensitive area of these chips, as well as their pixels, is small. In the case of cameras with integrated optics, we can assume that the optics are adapted to these chips.

However, in the domains of professional, and semi-professional photography, as well as in the domain of special cameras, interchangeable optics are often used. In these cases, a system specialist has to adapt the optics and the chip to the application (see System integration). Besides normal lenses, such interchangeable lenses may be microscopes, endoscopes, telescopes, etc. With the exception of the standard C-mount and CS-mount, the mounts of interchangeable optics are company-specific.

Signal capture

Since a FireWire camera works with electrical signals, the module "signal capture" transforms the incident light, as well as the incident sound, into electrical signals. In the case of light, this is performed by a CCD or CMOS chip. The transformation of the sound is performed by a microphone.

Digitization

The first step of the image's digitization results from the structure of a CCD or CMOS chip. It dissects the image into pixels. If a pixel has collected many photons, it creates a high voltage. Should there only be a few photons, a low voltage is created. "Voltage" is an analog value. Therefore, during the digitization's second step, the voltage has to be transformed into a digital value by an A/D converter. Now the raw digital image is available.
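The two digitization steps can be sketched in code. This is a minimal simulation; the reference voltage, the 8-bit depth and the example voltages below are illustrative assumptions, not values from any particular camera:

```python
def adc(voltage, v_ref=1.0, bits=8):
    """Quantize an analog pixel voltage into a digital value.

    A pixel that collected many photons yields a voltage near v_ref and
    thus a code near 2**bits - 1; few photons yield a low code.
    """
    voltage = min(max(voltage, 0.0), v_ref)   # clip to the converter's range
    levels = 2 ** bits                        # e.g. 256 levels for 8 bits
    return min(int(voltage / v_ref * levels), levels - 1)

# A raw digital image: every pixel voltage is converted independently.
analog_image = [[0.02, 0.50], [0.99, 1.00]]
raw_image = [[adc(v) for v in row] for row in analog_image]
```

The same one-dimensional conversion applies to the microphone signal, sampled repeatedly over time instead of once per pixel.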

A microphone transforms the sound into a voltage. An A/D converter transforms these analog values into digital ones.

Signal enhancement

The creation of color is based on a color filter, which is located in front of the CCD or CMOS chip. It is red, green or blue and changes its color from pixel to pixel. Therefore, the filter is called a color filter array or, after its inventor, Bayer filter. Using these raw digital images, the module "signal enhancement" creates an image, which meets aesthetic requirements. The same is true for the audio data.
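How a color value emerges from the Bayer filter can be sketched as follows. The 2×2 RGGB layout and the simple averaging of the two green samples are a common textbook convention, not the method of any specific camera:

```python
# One 2x2 tile of an RGGB Bayer pattern: each sensor pixel saw one color only.
bayer_tile = {
    "R":  200,   # top-left pixel, behind a red filter
    "G1": 120,   # top-right pixel, behind a green filter
    "G2": 130,   # bottom-left pixel, behind a green filter
    "B":  60,    # bottom-right pixel, behind a blue filter
}

def demosaic_tile(tile):
    """Estimate one full RGB value from a 2x2 Bayer tile.

    Real demosaicing interpolates per pixel from its neighbors; averaging
    the two green samples is the simplest possible reconstruction.
    """
    return (tile["R"], (tile["G1"] + tile["G2"]) // 2, tile["B"])

rgb = demosaic_tile(bayer_tile)
```

A "ColorRAW" or Bayer camera skips this step and outputs the tile values themselves, leaving the reconstruction to software on the computer.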

In the final step, the module compresses the image and audio data and outputs them - in the case of video cameras - as a DV data stream. In the case of photo cameras, single images may be output and, if applicable, voice comments as files.

The application domains of industry, medicine, astronomy, microscopy and science often use special monochrome cameras. They forgo any signal enhancement and thus output the digital image data in its raw state.

Some special models of color cameras are only capable of outputting raw digital image data. Such cameras are called ColorRAW or Bayer cameras. They are often used in industry, medicine, astronomy, microscopy, and science. In the form of photo cameras, they are used by professional photographers. Semi-professional photo cameras often offer an optional RAW mode.

The enhancement of the raw digital data takes place outside the camera on a computer and therefore the user is able to adapt it to a particular application.

Interface

The preceding modules are part of any digital camera. The interface is the module that characterizes the FireWire camera. It is based on the IEEE 1394 standard, defined by the Institute of Electrical and Electronics Engineers (IEEE). This standard defines a bus, which transmits:

  1. time-critical data (for example, video) and
  2. data whose integrity is of critical importance (for example, parameters or files).

It allows the simultaneous use of up to 63 different devices (cameras, scanners, video recorders, hard disks, DVD drives, etc.).
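The two kinds of traffic can be modeled with the bus's timing: IEEE 1394 divides time into 125 µs cycles, of which isochronous channels may reserve up to about 80 %, leaving the remainder for asynchronous transfers. The following sketch assumes those figures; the individual channel sizes are invented for illustration:

```python
CYCLE_US = 125.0   # IEEE 1394 splits bus time into 125 microsecond cycles
ISO_LIMIT = 0.8    # at most ~80 % of a cycle may be reserved isochronously

def reserve(channels_us, request_us):
    """Try to reserve isochronous time for one more channel (e.g. a camera).

    Returns the updated reservation list, or None if the request would
    exceed the isochronous share of the cycle.
    """
    if sum(channels_us) + request_us > CYCLE_US * ISO_LIMIT:
        return None                      # bus cannot guarantee the bandwidth
    return channels_us + [request_us]

channels = []
channels = reserve(channels, 60.0)       # a video camera: accepted
channels = reserve(channels, 30.0)       # a second camera: accepted
assert reserve(channels, 20.0) is None   # would exceed 100 us: rejected
async_share = CYCLE_US - sum(channels)   # time left for asynchronous data
```

This is why a camera's video stream arrives without hiccups even while files are copied to a hard disk on the same bus.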

Other standards, called "protocols", define the behavior of these devices. FireWire cameras mostly use one of the following protocols:

AV/C
AV/C stands for "Audio Video Control" and defines the behavior of DV devices, for example, video cameras and video recorders. It is a standard defined by the 1394 Trade Association. The Audio/Video Working Group is in charge of it.
DCAM
DCAM stands for "1394-based Digital Camera Specification" and defines the behavior of cameras that output uncompressed image data without audio. It is a standard, defined by the 1394 Trade Association. The IIDC (Instrumentation and Industrial Control Working Group) is in charge of it.
IIDC
IIDC is often used synonymously with DCAM.
SBP-2
SBP-2 stands for "Serial Bus Protocol 2" and defines the behavior of mass storage devices, such as hard disks. It is an ANSI standard maintained by NCITS.

Devices that use the same protocol are able to communicate with each other. A typical example is the connection of a video camera and a video recorder. Thus, in contrast to the USB bus, there is no need to use a controlling computer. If a computer is used, it has to be compatible with the protocols of the device with which it is to communicate (please cf. Exchanging data with computers).

Control

The controlling module coordinates the other ones. The user may specify its behavior by:

  1. switches outside the camera,
  2. the FireWire bus, using application software or
  3. a hybrid of the first two cases.

Photo cameras

Professional and semi-professional photo cameras, and especially digital camera backs, offer FireWire interfaces to transfer image data and to control the camera.

The image data's transfer is based on the protocol SBP-2. In this mode, the camera behaves as an external hard disk and thus enables the simple exchange of image files with a computer (please cf. Exchanging data with computers).

To increase work efficiency in a photo studio, photo cameras and digital backs are additionally controllable via the FireWire bus. Usually the camera manufacturer does not publish the protocol used in this mode. Therefore, camera control requires specialized software provided by the camera manufacturer, which is mostly available for Macintosh and Windows computers.

Video cameras

Although FireWire compatibility is found only in high-end photo cameras, it has usually been present even in home-user video cameras. Video cameras mostly use the protocol AV/C, which defines the flow of audio and video data, as well as the camera's control signals.

The majority of video cameras provide only the output of audio and video data via the FireWire bus ("DVout"). Additionally, some video cameras are able to record audio and video data ("DVout/DVin"). Video cameras exchange their data with computers and/or video recorders.

Special cameras

In the domains of industry, medicine, astronomy, microscopy and science FireWire cameras are often used not for aesthetic, but rather for analytical purposes. They output uncompressed image data, without audio. These cameras are based on the protocol DCAM (IIDC) or on company specific protocols.

Due to their field of application, their behavior is considerably different from photo cameras or video cameras:

  1. Their cases are small, built mainly from metal, and follow functional rather than aesthetic design constraints.
  2. The vast majority of special cameras does not offer integrated optics, but a standardized lens mount called "C-mount" or "CS-mount". This standard is not only used by lenses, but also by microscopes, telescopes, endoscopes and other optical devices.
  3. Recording aids, such as autofocus or image stabilization are not available.
  4. Special cameras often utilize monochrome CCD or CMOS chips.
  5. Special cameras often do not apply an infrared cut filter or optical low pass filters, thus avoiding any alteration of the image.
  6. Special cameras output image data streams and single images, which are captured using an external trigger signal. In this way, these cameras can be integrated into industrial processes.
  7. Mass storage devices are not available since the images have to be analyzed more or less immediately by the computer connected to the camera.
  8. The vast majority of special cameras is controlled by application software, installed on a computer. Therefore, the cameras do not have external switches.
  9. Application software is rarely available off-the-shelf. It usually has to be adapted to the specific application. Therefore, camera manufacturers offer programming tools designed for their cameras. If a camera uses the standard protocol DCAM (IIDC), it can also be used with third-party software. A lot of industrial computers and embedded systems are compatible with the DCAM (IIDC) protocol (please cf. Structure / Interface and Exchanging data with computers).
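The trigger-driven behavior described in points 4 and 6 can be sketched as a small simulation. `SpecialCamera` and its methods are hypothetical names chosen for illustration, not a real driver API:

```python
class SpecialCamera:
    """Toy model of a triggered, monochrome special camera."""

    def __init__(self, width=4, height=3):
        self.width, self.height = width, height
        self.frames = []                 # no on-camera mass storage: frames
                                         # go straight to the computer

    def on_trigger(self, scene_brightness):
        """An external trigger pulse captures exactly one raw frame."""
        frame = [[scene_brightness] * self.width for _ in range(self.height)]
        self.frames.append(frame)
        return frame

# An industrial process fires the trigger once per part on a conveyor belt.
cam = SpecialCamera()
for brightness in (10, 200, 90):         # three parts pass the camera
    cam.on_trigger(brightness)
```

The computer connected to the camera would analyze each frame immediately, for example to reject the dark part in the first capture.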

In comparison to photo or video cameras, these special cameras are very complicated. However, it makes no sense to use them in an isolated manner. They are, like other sensors, only components of a bigger system (please cf. System integration).

Exchanging data with computers

FireWire cameras are able to exchange data with any other FireWire device, as long as both devices use the same protocol (please cf. Structure / Interface). The kind of data exchanged depends upon the specific camera.

Data exchange between FireWire cameras and computers. Left: company specific system; right: open system

If the camera is to communicate with a computer, the computer must have a FireWire interface and use the camera's protocol. The early days of FireWire cameras were dominated by company specific solutions: some specialists offered interface boards and drivers that were accessible only by their own application software. In this approach, the application software is in charge of the protocol. Since this solution uses computing resources very efficiently, it is still used in highly specialized industrial projects. However, it often leads to problems when other FireWire devices, such as hard disks, are used. Open systems avoid this disadvantage.

Open systems are based on a layer model. The behavior of the single layers (interface board, low level driver, high level driver and API) follows the constraints of the respective operating system manufacturer. Application software is allowed to access operating system APIs, but should never access any lower level. In the context of FireWire cameras, the high level drivers are responsible for the protocol. The low level drivers and the interface boards put the definitions of the IEEE 1394 standard into effect. The advantage of this strategy is the simple realization of application software that is independent of hardware and specific manufacturers.
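The layer model can be sketched as a chain in which each layer talks only to the one directly beneath it. The class and layer names below are illustrative labels, not actual driver components:

```python
class Layer:
    """One layer of the open-system stack; it knows only its lower neighbor."""

    def __init__(self, name, lower=None):
        self.name, self.lower = name, lower

    def path(self):
        """The chain a request travels from this layer down to the hardware."""
        return [self.name] + (self.lower.path() if self.lower else [])

# Built bottom-up: application software sees only the operating system API.
board      = Layer("interface board")
low_level  = Layer("low level driver", board)
high_level = Layer("high level driver (protocol, e.g. DCAM)", low_level)
api        = Layer("operating system API", high_level)
```

Because the application holds a reference only to `api`, swapping the board or the low level driver for another manufacturer's does not touch the application code.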

Especially in the domains of photo cameras and special cameras, hybrids of open and company specific systems are used. The interface boards and the low level drivers typically adhere to the standard, while the levels above are company specific.

The basic characteristic of open systems is that they do not use the APIs of the hardware manufacturers, but those of the operating system. For Apple and Microsoft, the subject of image and sound is of high importance. Accordingly, their APIs, QuickTime and DirectX, are very well known. However, in the public perception they are reduced to the reproduction of audio and video. Actually, they are powerful APIs that are also responsible for image acquisition.

Under Linux this API is called video4linux. It is less powerful than QuickTime and DirectX and therefore additional APIs exist besides video4linux:

Accessing FireWire cameras under Linux
Photo cameras
Photo cameras usually use Linux's infrastructure for mass storage devices. One of the typical applications is digiKam.
Video cameras
Video cameras are accessed by various APIs. The image to the right depicts the access of the video editing software Kino to the libavc1394 API. Kino also accesses other APIs which are not shown in the image to simplify matters.
Special cameras
The most important API for special cameras is libdc1394. The image to the right depicts the access of the application software Coriander to this API. Coriander controls FireWire cameras that are based on the protocol DCAM (IIDC) and acquires their images.

In order to simplify the use of video4linux and the dedicated APIs, the meta API unicap has been developed. It hides their differences behind a simple, unified programming model.

System integration

Often, FireWire cameras are only one cog in a bigger system. Typically, a system specialist combines a number of different components to solve a particular problem. Two basic scenarios can be distinguished:

  1. The problem at hand is interesting enough for a group of users. The typical indicator of this situation is the off-the-shelf availability of application software. Studio photography is an example.
  2. The problem at hand is only of interest to a particular application. In such cases, there is typically no application software available off-the-shelf. Therefore, it has to be written by a system specialist. The gauging of a steel plate is an example.

Many aspects of system integration are not directly related to FireWire cameras. For example, illumination has a very strong influence on the quality of the acquired images. This holds true for both aesthetic and analytical applications.

However, in the context of the realization of application software, there is a special feature that is typical for FireWire cameras: the availability of standardized protocols, such as AV/C, DCAM (IIDC) and SBP-2 (please cf. Structure / Interface and Exchanging data with computers). Using these protocols, the software can be written independently of any particular camera or manufacturer.

By leaving the realization of the protocol to the operating system, and by enabling access through a set of APIs, software can be developed independently of the hardware. If, for instance, a piece of application software under Linux uses the API libdc1394 (please cf. Exchanging data with computers), it can access all FireWire cameras that use the protocol DCAM (IIDC). Using the meta API unicap additionally permits access to other video sources, such as frame grabbers.
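This protocol-level independence can be sketched as follows. The class names and the `capture_frame` method are hypothetical stand-ins for what a DCAM-capable API exposes, not the actual libdc1394 interface:

```python
class DcamProtocol:
    """Minimal stand-in for the behavior the DCAM (IIDC) protocol defines.

    Every camera that speaks the protocol must honor the same operations.
    """

    def capture_frame(self):
        raise NotImplementedError

class VendorACamera(DcamProtocol):
    def capture_frame(self):
        return "raw frame from vendor A"

class VendorBCamera(DcamProtocol):
    def capture_frame(self):
        return "raw frame from vendor B"

def inspect(camera):
    """Application code: written once, independent of camera and vendor."""
    return camera.capture_frame().startswith("raw frame")

# The same application works with every camera that speaks the protocol.
results = [inspect(cam) for cam in (VendorACamera(), VendorBCamera())]
```

Exchanging one vendor's camera for another's requires no change to `inspect`; this is exactly what writing against the protocol, rather than against a company specific driver, buys the system specialist.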
