A subcarrier is a sideband of a radio frequency carrier wave, which is modulated to send additional information. Examples include the provision of colour in a black and white television system or the provision of stereo in a monophonic radio broadcast. There is no physical difference between a carrier and a subcarrier; the "sub" implies that it has been derived from a carrier, which has been amplitude modulated by a steady signal and has a constant frequency relation to it.
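To make the sum-and-difference arithmetic behind one common subcarrier application concrete, here is a minimal sketch of the FM stereo encoding and decoding discussed later in this article. The function names and sample values are invented for illustration; a real receiver performs these operations on continuous signals in analog or DSP hardware.

```python
# Illustrative sketch of FM stereo sum-and-difference (SDM) arithmetic.

def fm_encode(left, right):
    """Form the mono-compatible sum (L+R) and the subcarrier difference (L-R)."""
    return left + right, left - right

def fm_decode(sum_sig, diff_sig):
    """Recover the channels: (L+R)+(L-R) = 2L and (L+R)-(L-R) = 2R, then halve."""
    left = (sum_sig + diff_sig) / 2
    right = (sum_sig - diff_sig) / 2
    return left, right

# Round trip with exactly representable sample values:
s, d = fm_encode(0.75, 0.25)
assert fm_decode(s, d) == (0.75, 0.25)
```

A mono receiver simply plays the L+R signal and ignores the difference component on the 38 kHz subcarrier, which is what makes the scheme backward compatible.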
S-Video (also known as separate video, Y/C, and erroneously Super-Video) is an analog video signal format that carries standard-definition video, typically at 525 lines or 625 lines. It encodes video luma and chrominance on two separate channels, achieving higher image quality than composite video, which encodes all video information on one channel. It also eliminates several types of visual defects, such as dot crawl, which commonly occur with composite video. Although it
A radio reading service for the blind, which reads articles in local newspapers and sometimes magazines. The vision-impaired can request a special radio, permanently tuned to receive audio on a particular subcarrier frequency (usually 67 kHz or 92 kHz), from a particular FM station. Services like these and others on broadcast FM subcarriers are referred to as a Subsidiary Communications Authority (SCA) service by
A closed-circuit system as an analog signal. Broadcast or studio cameras use a single or dual coaxial cable system using serial digital interface (SDI). See List of video connectors for information about physical connectors and related signal standards. Video may be transported over networks and other shared digital communications links using, for instance, MPEG transport stream, SMPTE 2022 and SMPTE 2110. Digital television broadcasts use
A different connector. Although Commodore Business Machines did not use the term S-Video, as the standard did not formally exist until 1987, a simple adapter connects the computer's LCA (luma-chroma-audio) 8-pin DIN socket to an S-Video display, or an S-Video device to the Commodore 1702 monitor's LCA jacks. The four-pin mini-DIN connector is the most common of several S-Video connector types. The same mini-DIN connector
A natively progressive broadcast or recorded signal, the result is the optimum spatial resolution of both the stationary and moving parts of the image. Interlacing was invented as a way to reduce flicker in early mechanical and CRT video displays without increasing the number of complete frames per second. Interlacing retains detail while requiring lower bandwidth compared to progressive scanning. In interlaced video,
A number is available. Analog video is a video signal represented by one or more analog signals. Analog color video signals include luminance (Y) and chrominance (C). When combined into one channel, as with NTSC, PAL, and SECAM among others, it is called composite video. Analog video may be carried in separate channels, as in two-channel S-Video (YC) and multi-channel component video formats. Analog video
A particular refresh rate, display resolution, and color space. Many analog and digital recording formats are in use, and digital video clips can also be stored on a computer file system as files, which have their own formats. In addition to the physical format used by the data storage device or transmission medium, the stream of ones and zeros that is sent must be in a particular digital video coding format, for which
A photoconductive plate with the desired image and produce a voltage signal proportional to the brightness in each part of the image. The signal could then be sent to televisions, where another beam would receive and display the image. Charles Ginsburg led an Ampex research team to develop one of the first practical video tape recorders (VTR). In 1951, the first VTR captured live images from television cameras by writing
A ratio between width and height. The ratio of width to height for a traditional television screen is 4:3, or about 1.33:1. High-definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack (also known as the Academy ratio) is 1.375:1. Pixels on computer monitors are usually square, but pixels used in digital video often have non-square aspect ratios, such as those used in
A remote transmitter, often located in a difficult-to-access area at the top of a mountain. A station's engineer can carry a decoder and monitor the transmitter's condition from anywhere, as long as the station is on the air and the engineer is within range. This is the essence of a wireless transmitter/studio link. On wireless studio/transmitter links (STLs), not only are the broadcast station's subcarriers transmitted, but other remote control commands as well. Interruptible foldback, such as for remote broadcasting,
A satellite transponder or microwave relay). Extra subcarriers are sometimes transmitted at around 7 or 8 MHz for extra audio (such as radio stations) or low-to-medium-speed data. This is referred to as multiple channel per carrier (MCPC). This is now mostly superseded by digital TV (usually DVB-S, DVB-S2 or another MPEG-2-based system), where audio and video data are packaged together (multiplexed) in
A station "format" name, ALERT, to automatically trigger radios to tune in for emergency information, even if a CD is playing. While it never really caught on in North America, European stations frequently rely on this system. An upgraded version is built into digital radio. xRDS is a system with which broadcasters can multiply the speed of data transmission in the FM channel by using further normal RDS subcarriers, shifted into
Is also possible over subcarriers, though its role is limited. Analog satellite television and terrestrial analog microwave relay communications rely on subcarriers transmitted with the video carrier on a satellite transponder or microwave channel for the audio channels of a video feed. These are usually at frequencies of 5.8, 6.2, or 6.8 MHz (the video carrier usually resides below 5 MHz on
Is called Y, which is created from all three original signals based on a formula that produces an overall brightness of the image, or luma. This signal closely matches a traditional black and white television signal, and the Y/C method of encoding was key to offering backward compatibility. Once the Y signal is produced, it is subtracted from the blue signal to produce Pb and from the red signal to produce Pr. To recover
Is improved over composite video, S-Video has lower color resolution than component video, which is encoded over three channels. The Atari 800 was the first to introduce separate Chroma/Luma output in late 1979. However, S-Video was not widely adopted until JVC's introduction of the S-VHS (Super-VHS) format in 1987, which is why it is sometimes incorrectly referred to as Super-Video. Before
Is in rough chronological order. All formats listed were sold to and used by broadcasters, video producers, or consumers; or were important historically. Digital video tape recorders offered improved quality compared to analog recorders. Optical storage media offered an alternative, especially in consumer applications, to bulky tape formats. A video codec is software or hardware that compresses and decompresses digital video. In
Is less sensitive to details in color than brightness, the luminance data for all pixels is maintained, while the chrominance data is averaged for a number of pixels in a block, and the same value is used for all of them. For example, this results in a 50% reduction in chrominance data using 2-pixel blocks (4:2:2) or 75% using 4-pixel blocks (4:2:0). This process does not reduce the number of possible color values that can be displayed, but it reduces
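The chroma-subsampling savings quoted above can be checked with a short sketch. The interpretation used here, chroma samples counted over a 4x2 block of luma samples in J:a:b notation, is the conventional one; the helper name is invented for illustration.

```python
# Illustrative check of chroma subsampling savings over a 4x2 sample block.
# In J:a:b notation, 'a' is the number of chroma samples in the first row of
# four luma samples and 'b' the number in the second row; 4:4:4 keeps all 8.

def chroma_fraction(a, b):
    """Fraction of chroma data retained relative to full 4:4:4 sampling."""
    return (a + b) / 8

assert chroma_fraction(4, 4) == 1.0   # 4:4:4: no chroma reduction
assert chroma_fraction(2, 2) == 0.5   # 4:2:2: 50% reduction, as stated above
assert chroma_fraction(2, 0) == 0.25  # 4:2:0: 75% reduction, as stated above
```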
Is modulated with suppressed carrier AM, more correctly called sum and difference modulation or SDM, at 38 kHz in the FM signal, which is joined at 2% modulation with the mono left+right audio (which ranges from 50 Hz to 15 kHz). A 19 kHz pilot tone is also added at 9% modulation to trigger radios to decode the stereo subcarrier, making FM stereo fully compatible with mono. Once
Is often described as 576i50, where 576 indicates the total number of horizontal scan lines, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second. When displaying a natively interlaced signal on a progressive scan device, the overall spatial resolution is degraded by simple line doubling; artifacts such as flickering or "comb" effects in moving parts of the image appear unless special signal processing eliminates them. A procedure known as deinterlacing can optimize
Is pin compatible with, a standard 4-pin S-Video plug. The three extra sockets may be used to supply composite (CVBS), an RGB or YPbPr video signal, or an I²C interface. The pinout usage varies among manufacturers. In some implementations, the remaining pin must be grounded to enable the composite output or disable the S-Video output. Some Dell laptops have a digital audio output in a 7-pin socket. The 8-pin mini-DIN connector
Is reduced by registering differences between parts of a single frame; this task is known as intraframe compression and is closely related to image compression. Likewise, temporal redundancy can be reduced by registering differences between frames; this task is known as interframe compression, including motion compensation and other techniques. The most common modern compression standards are MPEG-2, used for DVD, Blu-ray, and satellite television, and MPEG-4, used for AVCHD, mobile phones (3GP), and
Is shot at a slower frame rate of 24 frames per second, which slightly complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve a comfortable illusion of a moving image is about sixteen frames per second. Video can be interlaced or progressive. In progressive scan systems, each refresh period updates all scan lines in each frame in sequence. When displaying
Is that decompressed video has lower quality than the original, uncompressed video because there is insufficient information to accurately reconstruct the original video.

Subcarrier

Stereo broadcasting is made possible by using a subcarrier on FM radio stations, which takes the left channel and "subtracts" the right channel from it, essentially by hooking up the right-channel wires backward (reversing polarity) and then joining left and reversed-right. The result
Is used in NTSC television, YUV is used in PAL television, YDbDr is used by SECAM television, and YCbCr is used for digital video. The number of distinct colors a pixel can represent depends on the color depth, expressed in the number of bits per pixel. A common way to reduce the amount of data required in digital video is by chroma subsampling (e.g., 4:4:4, 4:2:2, etc.). Because the human eye
Is used in both consumer and professional television production applications. Digital video signal formats have been adopted, including serial digital interface (SDI), Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI), and DisplayPort. Video can be transmitted or transported in a variety of ways, including wireless terrestrial television as an analog or digital signal, coaxial cable in
Is used in some ATI Radeon video cards. 9-pin connectors are used in graphics systems that feature the ability to input video as well as output it. Again, there is no standardization between manufacturers as to which pin does what, and there are two known variants of the connector in use. Although the S-Video signals are available on
Is used in the Apple Desktop Bus for Macintosh computers, and Apple Desktop Bus cables can be used for S-Video in a pinch. Other connector variants include seven-pin locking dub connectors used on many professional S-VHS machines, and dual Y and C BNC connectors, often used for S-Video patch panels. Early Y/C video monitors often used phono (RCA) connectors that were switchable between Y/C and composite video input. Though
1848-502: The Latin video (I see). Video developed from facsimile systems developed in the mid-19th century. Early mechanical video scanners, such as the Nipkow disk , were patented as early as 1884, however, it took several decades before practical video systems could be developed, many decades after film . Film records using a sequence of miniature photographic images visible to the eye when
1914-503: The MPEG-2 and other video coding formats and include: Analog television broadcast standards include: An analog video format consists of more information than the visible content of the frame. Preceding and following the image are lines and pixels containing metadata and synchronization information. This surrounding margin is known as a blanking interval or blanking region ; the horizontal and vertical front porch and back porch are
1980-748: The FCC in the United States , and as Subsidiary Communications Multiplex Operations (SCMO) by the Canadian Radio-television and Telecommunications Commission (CRTC) in Canada . The RDS / RBDS subcarrier (57 kHz) allows FM radios to display what station they are on, pick another frequency on the same network or with the same format, scroll brief messages like station slogans, news, weather, or traffic—even activate pagers or remote billboards. It can also broadcast EAS messages, and has
2046-468: The Internet. Stereoscopic video for 3D film and other applications can be displayed using several different methods: Different layers of video transmission and storage each provide their own set of formats to choose from. For transmission, there is a physical connector and signal protocol (see List of video connectors ). A given physical link can carry certain display standards that specify
2112-493: The PAL and NTSC variants of the CCIR 601 digital video standard and the corresponding anamorphic widescreen formats. The 720 by 480 pixel raster uses thin pixels on a 4:3 aspect ratio display and fat pixels on a 16:9 display. The popularity of viewing video on mobile phones has led to the growth of vertical video . Mary Meeker , a partner at Silicon Valley venture capital firm Kleiner Perkins Caufield & Byers , highlighted
2178-530: The advent of HDMI. It is possible for a player to output S-Video over SCART, but televisions' SCART connectors are not always wired to accept it, and if not the display would show only a monochrome image. In this case it is sometimes possible to modify the SCART adapter cable to allow full S-Video compatibility. Analog video Video is an electronic medium for the recording, copying , playback, broadcasting , and display of moving visual media . Video
2244-411: The applicable local standard. The Atari 800 introduced separate Chroma/Luma output in late 1979. The signals were put on pin 1 and 5 of a 5-pin 180-degree DIN connector socket. Atari did not sell a monitor for its 8-bit computer line, however. The Commodore 64 released in 1982 (with the exception of the earliest revisions using a 5-pin video port) also offers separate chroma and luma signals using
2310-450: The building blocks of the blanking interval. Computer display standards specify a combination of aspect ratio, display size, display resolution, color depth, and refresh rate. A list of common resolutions is available. Early television was almost exclusively a live medium, with some programs recorded to film for historical purposes using Kinescope . The analog video tape recorder was commercially introduced in 1951. The following list
2376-425: The camera's electrical signal onto magnetic videotape . Video recorders were sold for $ 50,000 in 1956, and videotapes cost US$ 300 per one-hour reel. However, prices gradually dropped over the years; in 1971, Sony began selling videocassette recorder (VCR) decks and tapes into the consumer market . Digital video is capable of higher quality and, eventually, a much lower cost than earlier analog technology. After
2442-449: The color resolution of S-Video is limited by the modulation on a subcarrier frequency of either 3.58 megahertz (NTSC) or 4.43 megahertz (PAL). This difference is meaningless on home videotape systems, as the chrominance is already severely constrained by both VHS and Betamax . Carrying the color information as one signal means that the color has to be encoded in some way, typically in accord with NTSC , PAL , or SECAM , depending on
2508-562: The commercial introduction of the DVD in 1997 and later the Blu-ray Disc in 2006, sales of videotape and recording equipment plummeted. Advances in computer technology allow even inexpensive personal computers and smartphones to capture, store, edit, and transmit digital video, further reducing the cost of video production and allowing programmers and broadcasters to move to tapeless production . The advent of digital broadcasting and
2574-651: The connectors are different, the Y/C signals for all types are compatible. The mini-DIN pins, being weak, sometimes bend. This can result in the loss of color or other corruption (or loss) in the signal. A bent pin can be forced back into shape, but this carries the risk of the pin breaking off. These plugs are usually made to be plug-compatible with S-video, and include optional features, such as component video using an adapter. They are not necessarily S-video, although they can be operated in that mode. Non-standard 7-pin mini-DIN connectors (termed 7P ) are used in some computer equipment (PCs and Macs). A 7P socket accepts, and
2640-438: The context of video compression, codec is a portmanteau of encoder and decoder , while a device that only compresses is typically called an encoder , and one that only decompresses is a decoder . The compressed data format usually conforms to a standard video coding format . The compression is typically lossy , meaning that the compressed video lacks some information present in the original video. A consequence of this
2706-435: The corresponding pins, neither variant of the connector will accept an unmodified 4-pin S-Video plug, though they can be made to fit by removing the key from the plug. In the latter case, it becomes all too easy to misalign the plug when inserting it with consequent damage to the small pins. In many European countries, S-Video was less common because of the dominance of SCART connectors, which were present on televisions until
2772-509: The display of an interlaced video signal from an analog, DVD, or satellite source on a progressive scan device such as an LCD television , digital video projector , or plasma panel. Deinterlacing cannot, however, produce video quality that is equivalent to true progressive scan source material. Aspect ratio describes the proportional relationship between the width and height of video screens and video picture elements. All popular video formats are rectangular , and this can be described by
2838-445: The fields one at a time, rather than dividing up a complete frame after it is captured, the frame rate for motion is effectively doubled as well, resulting in smoother, more lifelike reproduction of rapidly moving parts of the image when viewed on an interlaced CRT display. NTSC, PAL, and SECAM are interlaced formats. Abbreviated video resolution specifications often include an i to indicate interlacing. For example, PAL video format
2904-441: The film is physically examined. Video, by contrast, encodes images electronically, turning the images into analog or digital electronic signals for transmission or recording. Video technology was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) television systems. Video was originally exclusively live technology. Live video cameras used an electron beam, which would scan
2970-420: The final image, it is desirable to eliminate as many of the encoding/decoding steps as possible. S-Video is an approach to this problem. It eliminates the final mixing of C with Y and subsequent separation at playback time. The S-video cable carries video using two synchronized signal and ground pairs, termed Y and C . Y is the luma signal, which carries the luminance – or black-and-white – of
3036-466: The growth of vertical video viewing in her 2015 Internet Trends Report – growing from 5% of video viewing in 2010 to 29% in 2015. Vertical video ads like Snapchat 's are watched in their entirety nine times more frequently than landscape video ads. The color model uses the video color representation and maps encoded color values to visible colors reproduced by the system. There are several such representations in common use: typically, YIQ
3102-424: The higher frequencies of the FM multiplex. The extra RDS subcarriers are placed in the upper empty part of the multiplex spectrum and carry the extra data payload. xRDS has no fixed frequencies for the additional 57 kHz carriers. Until 2012, MSN Direct used subcarriers to transmit traffic, gas prices, movie times, weather and other information to GPS navigation devices, wristwatches , and other devices. Many of
3168-432: The horizontal scan lines of each complete frame are treated as if numbered consecutively and captured as two fields : an odd field (upper field) consisting of the odd-numbered lines and an even field (lower field) consisting of the even-numbered lines. Analog display devices reproduce each frame, effectively doubling the frame rate as far as perceptible overall flicker is concerned. When the image capture device acquires
3234-416: The luminance signal must be low-pass filtered, dulling the image. As S-Video maintains the two as separate signals, such detrimental low-pass filtering for luminance is unnecessary, although the chrominance signal still has limited bandwidth relative to component video. Compared with component video , which carries the identical luminance signal but separates the color-difference signals into Cb/Pb and Cr/Pr,
3300-639: The number of distinct points at which the color changes. Video quality can be measured with formal metrics like peak signal-to-noise ratio (PSNR) or through subjective video quality assessment using expert observation. Many subjective video quality methods are described in the ITU-T recommendation BT.500 . One of the standardized methods is the Double Stimulus Impairment Scale (DSIS). In DSIS, each expert views an unimpaired reference video, followed by an impaired version of
3366-474: The original RGB information for display, the signals are mixed with the Y to produce the original blue and red, and then the sum of those is mixed with the Y to recover the green. A signal with three components is no easier to broadcast than the original three-signal RGB, so additional processing is required. The first step is to combine the Pb and Pr to form the C signal, for chrominance . The phase and amplitude of
the other two from the luminance and taking the remainder (see YIQ, YCbCr, YPbPr). Various broadcast television systems use different subcarrier frequencies, in addition to differences in encoding. For the audio part, MTS uses subcarriers on the video signal that can also carry three audio channels, including one for stereo (the same left-minus-right method as for FM), another for second audio programs (such as descriptive video service for
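The "subtract and take the remainder" recovery of green can be made concrete under the standard BT.601/NTSC luma weights (Y = 0.299 R + 0.587 G + 0.114 B). The helper names are invented for illustration, and real decoders work with scaled color-difference signals rather than raw R and B values.

```python
# Illustrative sketch: recovering green from luma plus red and blue,
# assuming the BT.601/NTSC luma coefficients. Sample values are made up.

KR, KG, KB = 0.299, 0.587, 0.114  # standard BT.601 luma weights

def luma(r, g, b):
    """Compute luma Y from gamma-corrected R, G, B components."""
    return KR * r + KG * g + KB * b

def green_from_luma(y, r, b):
    """Solve Y = KR*R + KG*G + KB*B for G: the remainder after R and B."""
    return (y - KR * r - KB * b) / KG

r, g, b = 0.5, 0.25, 1.0
y = luma(r, g, b)
assert abs(green_from_luma(y, r, b) - g) < 1e-12
```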
3498-409: The picture, including synchronization pulses. C is the chroma signal, which carries the chrominance – or coloring-in – of the picture. This signal contains two color-difference components. The luminance signal carries horizontal and vertical sync pulses in the same way as a composite video signal. In composite video, the signals co-exist on different frequencies. To achieve this,
3564-593: The receiver demodulates the L+R and L−R signals, it adds the two signals ([L+R] + [L−R] = 2L) to get the left channel and subtracts ([L+R] − [L−R] = 2R) to get the right channel. Rather than having a local oscillator , the 19 kHz pilot tone provides an in-phase reference signal used to reconstruct the missing carrier wave from the 38 kHz signal. For AM broadcasting , different analog ( AM stereo ) and digital ( HD Radio ) methods are used to produce stereophonic audio. Modulated subcarriers of
3630-594: The receiver. The mono audio component of the transmitted signal is in a separate carrier and not integral to the video component. In wired video connections, composite video retains the integrated subcarrier signal structure found in the transmitted baseband signal, while S-Video places the chrominance and luminance signals on separate wires to eliminate subcarrier crosstalk and enhance the signal bandwidth and strength (picture sharpness and brightness). Before satellite , Muzak and similar services were transmitted to department stores on FM subcarriers. The fidelity of
3696-436: The same video. The expert then rates the impaired video using a scale ranging from "impairments are imperceptible" to "impairments are very annoying." Uncompressed video delivers maximum quality, but at a very high data rate . A variety of methods are used to compress video streams, with the most effective ones using a group of pictures (GOP) to reduce spatial and temporal redundancy . Broadly speaking, spatial redundancy
3762-605: The shift towards digital video the S-video format was widely used by consumers, but it was rarely used in professional studios where YPbPr or component was generally preferred. Standard analog television signals go through several processing steps on their way to being broadcast, each of which discards information and lowers the quality of the resulting images. The image is originally captured in RGB form and then processed into three signals known as YPbPr . The first of these signals
3828-439: The signal represent the two original signals. This signal is then bandwidth -limited to comply with requirements for broadcasting. The resulting Y and C signals are mixed together to produce composite video . To play back composite video, the Y and C signals must be separated, and this is difficult to do without adding artifacts. Each of these steps is subject to deliberate or unavoidable loss of quality. To retain that quality in
3894-513: The subcarrier audio was limited compared to the primary FM radio audio channel. The United States Federal Communications Commission (FCC) also allowed betting parlors in New York state to get horse racing results from the state gaming commission via the same technology. Many non-commercial educational FM stations in the US (especially public radio stations affiliated with NPR ) broadcast
3960-577: The subcarriers were from stations owned by Clear Channel . The technology was known as DirectBand . FMeXtra on FM uses dozens of small COFDM subcarriers to transmit digital radio in a fully in-band on-channel manner. Removing other analog subcarriers (such as stereo) increases either the audio quality or channels available, the latter making it possible to send non-audio metadata along with it, such as album covers, song lyrics, artist info, concert data, and more. Many stations use subcarriers for internal purposes, such as getting telemetry back from
4026-403: The subcarriers. A black and white TV simply ignores the extra information, as it has no decoder for it. To reduce the bandwidth of the color subcarriers, they are filtered to remove higher frequencies. This is made possible by the fact that the human eye sees much more detail in contrast than in color. In addition, only blue and red are transmitted, with green being determined by subtracting
the subsequent digital television transition are in the process of relegating analog video to the status of a legacy technology in most parts of the world. The development of high-resolution video cameras with improved dynamic range and color gamuts, along with the introduction of high-dynamic-range digital intermediate data formats with improved color depth, has caused digital video technology to converge with film technology. Since 2013,
4158-466: The type used in FM broadcasting are impractical for AM broadcast due to the relatively narrow signal bandwidth allocated for a given AM signal. On standard AM broadcast radios, the entire 9 kHz to 10 kHz allocated bandwidth of the AM signal may be used for audio. Likewise, analog TV signals are transmitted with the black and white luminance part as the main signal, and the color chrominance as
4224-567: The use of digital cameras in Hollywood has surpassed the use of film cameras. Frame rate , the number of still pictures per unit of time of video, ranges from six or eight frames per second ( frame/s ) for old mechanical cameras to 120 or more frames per second for new professional cameras. PAL standards (Europe, Asia, Australia, etc.) and SECAM (France, Russia, parts of Africa, etc.) specify 25 frame/s, while NTSC standards (United States, Canada, Japan, etc.) specify 29.97 frame/s. Film
4290-466: The vision-impaired, and bilingual programs), and yet a third hidden one for the studio to communicate with reporters or technicians in the field (or for a technician or broadcast engineer at a remote transmitter site to talk back to the studio), or any other use a TV station might see fit. (See also NICAM , A2 Stereo .) In RF-transmitted composite video , subcarriers remain in the baseband signal after main carrier demodulation to be separated in
4356-526: Was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) systems, which, in turn, were replaced by flat-panel displays of several types. Video systems vary in display resolution , aspect ratio , refresh rate , color capabilities, and other qualities. Analog and digital variants exist and can be carried on a variety of media, including radio broadcasts , magnetic tape , optical discs , computer files , and network streaming . The word video comes from