1st & Ten is a computer system that augments televised coverage of American football by inserting graphical elements on the field of play as if they were physically present; the inserted element stays fixed within the coordinates of the playing field, and obeys the visual rules of foreground objects occluding background objects. Developed by Sportvision and PVI Virtual Media Services, it is best known for generating and displaying a yellow first down line over a live broadcast of a football game—making it easier for viewers to follow play on the field. The line is not physically present on the field, and is seen only by the television audience.
1st & Ten is sometimes used generically to refer to the class of systems capable of adding first down lines and similar visual elements, and not just the Sportvision system. However, PVI's competing system is more accurately named L-VIS, for Live Video Insertion System.
Over time, usage has evolved. Some football broadcasts change the color of the line from yellow to red on 4th down, or show a second computer-generated line (usually blue) that marks the line of scrimmage. Lines can also be projected to show other kinds of field position, including markings for the red zone and the maximum distance of a placekicker's statistical field goal range. In extreme weather, an entire virtual field with yard lines and boundary markers can be projected onto the field so that league officials, broadcasters and viewers can follow the action when the real markings are obscured by snow, fog or mud.
The system uses a combination of motion sensors mounted on the broadcast cameras to record what they are viewing, and/or match moving computer graphics technology, together with an enhanced version of chroma key or "green screen" technology.
The idea of creating an on-field marker to help TV viewers identify first down distances was conceived and patented in 1978 by David W. Crain, who presented the concept to Roone Arledge and Roger Goodman of ABC News and Sports and to the CBS Technology Center. At the time, both decided the broadcast industry was not ready to use Crain's invention.
In 1998, ESPN programmer Gary Morgenstern and others revived the idea. ESPN's NFL coordinating producer, Fred Gaudelli, was tasked with overseeing an implementation for his network. The 1st & Ten line was first broadcast by Sportvision, a private company, during ESPN's coverage of a Cincinnati Bengals–Baltimore Ravens game on September 27, 1998. A few weeks later, on Thanksgiving Day in 1998, Princeton Video Image (PVI) aired its version of the virtual yellow down line on a CBS broadcast of a Pittsburgh Steelers–Detroit Lions game. Four years later, SportsMEDIA introduced a third version during NBC coverage of a Notre Dame game.
The rivalry between PVI and Sportvision began with a collaboration. In July 1995, PVI had successfully used its L-VIS (Live Video Insertion System) match moving technology to broadcast virtual advertising behind the home plate on a local broadcast of a Trenton Thunder baseball game. In January 1996, Roy Rosser, director of special projects at PVI, saw Sportvision's FoxTrax puck on the broadcast of the 1996 NHL All-Star Game and realized that a combination of L-VIS and FoxTrax would allow virtual insertions in a wider range of situations than either could do on its own, given the power of affordable computers. He contacted Stan Honey, CTO at Sportvision, and the two companies undertook a joint demonstration of their combined technologies during the 1996 World Series between the Atlanta Braves and the New York Yankees at the Atlanta–Fulton County Stadium. The test was not a success and the two companies parted ways, each developing complementary systems that were eventually used to broadcast Sportvision's "First and Ten" line and PVI's "Yellow Down Line".

In October 1999, Sportvision sued PVI alleging that PVI's virtual signage, first down line and other products infringed Fox/Sportvision patents. In August 2001, PVI counterclaimed against Sportvision in the federal court action, alleging that Sportvision's virtual strike zone and virtual signage products infringed a PVI patent. In 2002, the companies settled the lawsuits out of court through a cross-licensing deal.
Each football field has a unique crown and contour and is not perfectly flat in order to facilitate drainage, so a 3D model is made of the field prior to the game. Because the field changes little over a football season, this 3D model is usually generated only once a season at most. The field also has a unique color palette, typically various shades of green, depending on the type of surface (i.e. real or artificial grass) and the weather (e.g. bright, shady or even snowing). In addition, after cameras are set up, the position of each camera relative to the field is established for use in conjunction with the previously created 3D model of the field.
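Given the surveyed 3D field model and a calibrated camera position, placing a field point into the frame is at heart a camera-projection calculation. A minimal pure-Python sketch, assuming a simple pinhole model with pan and tilt only (the function name, coordinate conventions and 1920×1080 image centre are illustrative; the real system also models zoom, focus, extender state and lens distortion):

```python
import math

def project_point(point, cam_pos, pan, tilt, focal_px, center=(960, 540)):
    """Project a 3D field point into pixel coordinates for a camera at
    cam_pos with the given pan/tilt angles (radians) and focal length in
    pixels.  Hypothetical simplified pinhole model: x is right, y is the
    optical axis at zero pan/tilt, z is up.  Returns None if the point
    lies behind the camera."""
    # Translate into camera-centred coordinates.
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    # Rotate by pan (about the vertical axis)...
    cx = x * math.cos(pan) - y * math.sin(pan)
    cy = x * math.sin(pan) + y * math.cos(pan)
    cz = z
    # ...then by tilt (about the horizontal axis).
    dy = cy * math.cos(tilt) - cz * math.sin(tilt)  # depth along optical axis
    dz = cy * math.sin(tilt) + cz * math.cos(tilt)
    if dy <= 0:
        return None  # behind the camera
    # Perspective divide and shift to the image centre.
    u = center[0] + focal_px * cx / dy
    v = center[1] - focal_px * dz / dy
    return (u, v)
```

Run once per frame with the pan/tilt/zoom values reported by the camera encoders, this maps the fixed yard line in field coordinates to a moving stripe of pixels in the video.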
There are usually a number of cameras shooting the field, but typically only three or four main cameras are used for an American football broadcast (one on the fifty-yard line, and one on each twenty-yard line, with most high profile games also having a Skycam, as described below). The cameras with video that will be used with the graphics system have electronic encoders within parts of the camera assembly (in the lens and the moving platform the camera sits on, sometimes called a "panhead") that monitor how the camera is used during the game (pan, tilt, zoom, focus and extender). The encoders transmit that information live 30 or more times per second to the broadcaster's production truck, where it is processed by Sportvision computers (typically one for each camera). A camera with this type of extra hardware is usually called an "instrumented" camera. This information helps keep the yellow 1st & Ten line in the proper place without distortion whenever the camera follows the players or the ball.
In larger productions, several other cameras can be "instrumented" to work with the graphics system, but these are usually limited to two additional types: a camera placed in a high position to see all twenty-two players on the field, typically called the "all 22" camera, and a camera shooting from above one end zone, called an "end zone camera", or in the industry often just "camera 4". The Skycam (a moving camera suspended from cables above the field) can also be used to draw a yellow line over its video, but the mechanism has some major differences from the typical "instrumented" camera.
For the initial implementation, there were seven computers in total and a crew of four. Recent implementations require around four computers, one computer per camera plus a shared computer for chroma-keying and other tasks, that can be run by a single operator (although two is optimal). The primary operator usually uses a KVM to switch between camera computers and has an extra monitor, keyboard, and mouse setup for the chroma-keying computer.
Of the original four-member crew, two members, one inside the stadium and one in front of a computer, communicated the position of the real first down line to make sure everything was working. The third crew member was a troubleshooter. The last crew member monitored the various colors that make up the color palette onto which the line is drawn.
In recent setups only a single operator is required for all cameras. The operator clicks on the ball in the video to set the line of scrimmage and right-clicks where the first down line should be (or presses a button to automatically position it 10 yards in the direction of play). If lighting conditions do not change much, the primary operator can also monitor the chroma-key settings, but a secondary operator is often used when conditions become too variable.
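The "auto-advance 10 yards" button can be sketched as a small helper, using absolute field coordinates from 0 (the offense's own goal line) to 100 (the opponent's goal line); the function name and conventions are illustrative:

```python
def first_down_target(scrimmage_yd, direction):
    """Return where the first-down line should be drawn, given the line
    of scrimmage in absolute yards (0-100) and the direction of play
    (+1 or -1).  When fewer than 10 yards remain, the goal line itself
    is the line to gain, so the result is clamped to the field.
    Illustrative sketch only."""
    return max(0, min(100, scrimmage_yd + 10 * direction))
```

For example, a team at its own 25 driving in the +1 direction gets a line at 35, while a team at the opponent's 5 gets the goal line.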
Each camera's encoders transmit orientation and zoom data to an aggregator box that converts the digital information into a modulated audio signal, which is sent down to the corresponding camera computer in the truck. This data is synchronized with the video from that camera. At the camera computer, the position data is demodulated back into digital form for use by the program that draws the "yellow line" over the video.
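The digital round trip described above, with encoder readings serialized for transmission and recovered in the truck, can be illustrated with a simple fixed-layout record. The field list and byte layout here are invented for illustration; the actual wire formats are proprietary, and the real link additionally modulates the bytes onto an audio channel:

```python
import struct

# Illustrative record: camera id, frame number, pan, tilt, zoom, focus, extender.
FMT = "<BIffffB"  # little-endian, no padding

def pack_sample(cam_id, frame, pan, tilt, zoom, focus, extender):
    """Serialize one encoder reading for transmission down the cable."""
    return struct.pack(FMT, cam_id, frame, pan, tilt, zoom, focus, int(extender))

def unpack_sample(data):
    """Recover the reading at the camera computer in the truck."""
    cam_id, frame, pan, tilt, zoom, focus, ext = struct.unpack(FMT, data)
    return cam_id, frame, pan, tilt, zoom, focus, bool(ext)
```

The frame number ties each reading to the video frame it was captured with, which is what lets the truck keep camera data and picture in lockstep.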
Separately, the chroma-keying computer is told what colors of the field are okay to draw over (basically grass) and that information is sent to the camera computers.
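The "okay to draw over" decision is a chroma-key inclusion test: a pixel may be overwritten only if its color is close to one of the field colors in the palette. A minimal pure-Python sketch (the palette, tolerance and nested-list frame format are illustrative; a production keyer works in a perceptual color space with operator-tuned, per-channel tolerances):

```python
def make_key_mask(frame, palette, tol=30):
    """Return a per-pixel boolean mask marking pixels whose RGB color is
    within `tol` of one of the field colors in `palette` (a list of
    (r, g, b) tuples).  True means the line may be drawn over that pixel."""
    mask = []
    for row in frame:
        mask_row = []
        for (r, g, b) in row:
            keyable = any(
                abs(r - pr) <= tol and abs(g - pg) <= tol and abs(b - pb) <= tol
                for (pr, pg, pb) in palette
            )
            mask_row.append(keyable)
        mask.append(mask_row)
    return mask
```

A grass-green pixel near a palette entry is keyable; a player's red jersey is not, so the line will appear to pass behind the player.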
The first computer in the truck gathers all the separate readings from the cameras and transmits a single, consolidated data stream to the central computer.
The central computer takes these readings, the 3D field model and color palette, and the knowledge of which camera is on the air, and uses a geometric calculation to determine which pixels in the video frame make up the first down line. Pixels that are obstructed by a player, a referee, the ball or any other object are identified and excluded from the calculation. This ensures that the 1st & Ten line is drawn only onto the field itself.
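Combining the two stages, the line pixels from the geometric calculation and the chroma-key mask, the compositing step can be sketched as follows (the nested-list frame layout, opacity value and yellow color are illustrative, not the broadcast implementation):

```python
YELLOW = (255, 255, 0)

def composite_line(frame, line_pixels, key_mask, opacity=0.6):
    """Blend the yellow line into `frame` at the (x, y) positions produced
    by the geometric projection, but only where `key_mask` says the pixel
    is field-colored -- so players and officials occlude the line."""
    out = [row[:] for row in frame]
    for (x, y) in line_pixels:
        if key_mask[y][x]:
            r, g, b = frame[y][x]
            out[y][x] = (
                round(r + (YELLOW[0] - r) * opacity),
                round(g + (YELLOW[1] - g) * opacity),
                round(b + (YELLOW[2] - b) * opacity),
            )
    return out
```

Because occluded pixels are simply left untouched, the line appears painted on the grass underneath the players rather than drawn over them.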
The PVI Virtual Media system relies on a single spotter to relay the down and distance, and a single operator at the studio as their vision system does not need camera data to perform the insertion. The primary operator of the Sportvision system does the spotting by merely clicking on the video to place the line.
The only pixels that should change are the ones that are the same color as the field, typically several shades of green. As a result, a few situations are difficult. One is when a player's uniform color nearly matches that of the field (for example, the Green Bay Packers' jerseys on a bright, sunny day, or Bronco Stadium at Boise State University, where the field and the home team's uniforms share the same blue shade). Another is when the field itself changes, as during a rain or snow storm, or when a grass field becomes very muddy; in those cases, the field's color palette must be extended with brown and/or white shades. The most difficult situations arise when the shade of the field is constantly changing, as when moving clouds cast shadows on some parts of the field but not others.
Data collection and computation also take time. The audio feed passes through an audio delay so that it stays synchronized with the delayed video. The total delay for the viewer from the live feed ends up being about two-thirds of a second.
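The audio-delay step can be sketched as a fixed-length FIFO. At 30 frames per second, the roughly two-thirds-of-a-second delay corresponds to about 20 frames (the class name and interface are illustrative):

```python
from collections import deque

class DelayLine:
    """Delay a stream of samples by a fixed number of frames: a sketch of
    the audio delay that holds audio back so it stays in sync with video
    that arrives late from the graphics pipeline."""
    def __init__(self, frames, fill=None):
        # Pre-load the buffer so the first `frames` outputs are the fill value.
        self.buf = deque([fill] * frames)

    def push(self, sample):
        # Emit the sample pushed `frames` calls ago, then queue the new one.
        out = self.buf.popleft()
        self.buf.append(sample)
        return out
```

Each pushed sample re-emerges exactly `frames` calls later, so feeding live audio in while video takes the long way through the graphics computers keeps the two aligned at the output.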
After the camera computer has determined which pixels represent the 1st & Ten line, it takes that pixel information and draws the yellow line into the video around 60 times per second (depending on the video refresh rate). A 2011 study conducted by Sportvision determined the yellow line has an average margin of error of 1.38 inches compared with the official first down marker.
In recent years the system has been upgraded to add more features. During Fox broadcasts, the Sportvision system also generates an arrow-like graphic on the field with down and distance text information inside of an arrow pointing in the direction of play. Competitors have also added this feature in recent years.
Additionally, the Sportvision system can also place virtual graphics that have another embedded video feed inside them like a video picture frame. This is sometimes called "video-in-perspective".
This technology is also the basis for showing advertisements where they do not physically exist (for example, behind home plate in baseball during national broadcasts), and for Race F/X, in which graphics can be displayed on the race track and information can follow a specific car no matter what the camera does. The technology is used by CBS, ESPN, Fox, NBC, NFL Network, RDS, TSN, and TNT.