About Analog vs. Digital

An analog signal is a continuous, smoothly varying electrical signal whose amplitude and frequency change over time. In video and film work, such a signal typically represents sound or images. A digital signal, in contrast, comprises discrete samples of the analog wave taken at specific moments in time. Instead of a smooth, seamless curve, a digital signal forms a staircase of individual values, one for each time the original analog wave was sampled. Analog signals must be converted to digital for storage and manipulation in computer systems.

The sample rate of a signal is the number of samples taken each second; the higher the sample rate, the closer the digital signal matches the original analog wave. The bit depth determines the size of each step in the amplitude, or height, of the wave: a bit depth of n allows 2^n distinct amplitude values, so more bits mean smaller steps and a finer representation.

After a signal has been manipulated digitally, at some point it is returned to analog form for viewing. In a modern video system, a signal may be converted to digital right at the sensor plate of the camera, remain in digital form throughout the editing and finishing process, and only return to analog when it is projected through the LCD matrix of a movie theater projection system.
