
Video Compression: Glossary of Terms

Video Compression Guide – Glossary and Further Info

Need to know more about video compression? Here is a glossary of video terms and other information about this topic.

Glossary of Terms

Compression / Encoding

Video information is encoded so that it can be transported over the Internet and delivered to various kinds of video players. Along the way, it is often compressed to make the file smaller and easier to transfer. The more heavily a video is compressed, the less quality is retained and the more degraded the image usually becomes, so compression is always a compromise between quality and file size. Video compression is also often known as video encoding. More here on Wikipedia.
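
To make the quality-versus-size trade-off concrete, here is a minimal Python sketch (assuming ffmpeg is installed; the source file name input.mov is only a placeholder) that encodes the same clip at three quality levels and prints the resulting file sizes:

    # Sketch only: encode one clip at several H.264 quality levels and compare sizes.
    # Assumes ffmpeg is installed; "input.mov" is a hypothetical source file.
    import os
    import subprocess

    SOURCE = "input.mov"

    # Lower CRF means higher quality and a larger file; higher CRF means more compression.
    for crf in (18, 23, 28):
        output = f"output_crf{crf}.mp4"
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE,
             "-c:v", "libx264", "-crf", str(crf),
             "-c:a", "aac", "-b:a", "128k",
             output],
            check=True,
        )
        size_mb = os.path.getsize(output) / (1024 * 1024)
        print(f"CRF {crf}: {size_mb:.1f} MB")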

Codec

The word “codec” is short for compressor-decompressor (or coder-decoder). A codec is a compression algorithm used to compress the video information at one end with an encoding program and decompress it at the other end for playback, whether in a software player such as VLC or QuickTime or by a hardware decoder chip inside a DVD player. Basically, it is the piece of software that makes your video readable by your computer, allowing you to play it. Without the correct codec, you may not be able to play the audio, the video, or both. More here on Wikipedia.

Format

A format, or “container format,” is used to bind together video and audio information, along with other data such as metadata or even subtitles. You’ve probably heard of things like .mp4, .mov, .wmv, etc. These are all container formats that package the audio and the video together. For example, an .mp4 file might use the AAC audio codec together with the H.264 video codec, while an .avi container might use MP3 audio with the Xvid video codec. More here on Wikipedia.
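
If you want to see which container and codecs a particular file uses, one rough approach (assuming ffprobe, which ships with ffmpeg, is installed; movie.mp4 is just a placeholder name) is to ask ffprobe for the stream details:

    # Sketch only: list the container format and the codec of each stream in a file.
    # Assumes ffprobe (part of ffmpeg) is installed; "movie.mp4" is a placeholder name.
    import json
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-show_streams",
         "-of", "json", "movie.mp4"],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)

    print("Container:", info["format"]["format_name"])
    for stream in info["streams"]:
        print(stream["codec_type"], "codec:", stream["codec_name"])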

Standard

A standard, such as the MPEG standards set by the Moving Picture Experts Group, is a set of rules that video codecs and formats are designed to adhere to. This standardisation allows manufacturers and software designers to anticipate the kind of video, audio, and other information that their software or microchips will have to deal with. For example, MPEG-1 is used in VCDs, while MPEG-2 is used in DVDs. H.264, which is part of the MPEG-4 standard, is the codec currently used by most online video platforms. More here on Wikipedia.

Bitrate

The bitrate is the amount of data a file uses per second to store the audio and picture information. Video bitrates are much higher than audio bitrates, as they must describe the highly complex visual information in each frame. Bitrates are usually measured in kilobits per second, also written as kbit/s or kbps. Bitrates can be constant or variable; with a variable bitrate, the codec adjusts the amount of data based on how complex the video and audio are at the time. We use a constant bitrate in the recommended settings listed here, as many systems handle a steady bitrate better than spikes in the data throughout the file, and it also makes the resulting file size easier to predict. More here on Wikipedia.
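
Because a constant bitrate spends the same amount of data every second, the final file size is roughly (video bitrate + audio bitrate) × duration. A small sketch of that arithmetic, with example numbers only:

    # Sketch only: estimate the size of a constant-bitrate encode.
    def estimated_size_mb(video_kbps, audio_kbps, duration_seconds):
        """Bitrates in kilobits per second; result in megabytes."""
        total_kilobits = (video_kbps + audio_kbps) * duration_seconds
        return total_kilobits / 8 / 1024  # kilobits -> kilobytes -> megabytes

    # Example: a 10-minute video at 2500 kbit/s video plus 128 kbit/s audio.
    print(round(estimated_size_mb(2500, 128, 10 * 60)))  # about 192 MB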

Frame Rate

Frame rate is the number of video frames per second. PAL formats usually use 25 frames per second (fps), and NTSC formats usually use 29.97 or 30 fps; we tend to round a 29.97 frame rate to 30. Some cameras produce lower frame rates, and if your source file has a lower fps you can compress your video at that rate instead. It’s usually best to avoid frame rates higher than 30. More here on Wikipedia.
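
As an aside, NTSC’s “29.97” fps is not an arbitrary number but the exact fraction 30000/1001. A tiny Python illustration:

    # Sketch only: NTSC's "29.97" fps is the exact fraction 30000/1001.
    from fractions import Fraction

    ntsc = Fraction(30000, 1001)
    print(float(ntsc))             # 29.97002997...
    print(round(float(ntsc), 2))   # 29.97

    # Frames in one minute of footage at each rate:
    print(25 * 60)                 # PAL: 1500 frames
    print(float(ntsc * 60))        # NTSC: about 1798.2 frames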

Deinterlacing / Decombing

Older video technologies, such as Standard Definition PAL or NTSC video for broadcast television, used two interlaced fields per frame of video. When these files are played on a computer, or when other systems re-encode them, what are known as interlacing artefacts can occur. These often look like a “comb” effect of horizontal lines across the screen. If you have interlaced content, you need to de-interlace it. However, de-interlacing can introduce its own problems, including loss of fine resolution. Handbrake includes a sophisticated filter called Decomb that runs as part of the encoding process; you can leave it on all the time, and it will only deinterlace frames that show interlacing issues. The settings we recommend above use the Decomb filter. See more here.

Audio Sampling Rate

The audio sample rate refers to how many times per second the audio signal is sampled (sliced up) throughout the file. CDs are usually sampled at 44.1 kHz, while video audio is usually sampled at 48 kHz. For video, it is usually better to keep the original sample rate or change it to 48 kHz rather than using 44.1 kHz. More here on Wikipedia.
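
To get a feel for what these numbers mean, the sample rate is simply how many audio samples are stored per second for each channel, so the total sample count grows with duration. A quick sketch with example figures:

    # Sketch only: how many audio samples are stored at a given sample rate.
    def sample_count(sample_rate_hz, duration_seconds, channels=2):
        return sample_rate_hz * duration_seconds * channels

    # One hour of stereo audio at the CD rate versus the usual video rate:
    print(sample_count(44100, 3600))  # 317,520,000 samples
    print(sample_count(48000, 3600))  # 345,600,000 samples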

H.264

H.264 (or MPEG-4 AVC) is a modern, fast, and very efficient video codec, and it has been the best choice for web video since around 2014. It is part of the Moving Picture Experts Group’s MPEG-4 standard and is also supported by the HTML5 standard, which means many web browsers can play this codec natively. See more on H.264 here.

H.265

Also known as High Efficiency Video Coding (HEVC), H.265 is a video compression standard designed to be the successor to the widely used H.264. In comparison, H.265 offers roughly 25% to 50% better data compression at the same level of video quality, or substantially improved video quality at the same bitrate. See more about H.265 on Wikipedia.
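
A quick worked example of what that range of savings could mean for a constant-bitrate encode (the 4000 kbit/s starting point is only an illustration):

    # Sketch only: what 25-50% better compression could mean at the same quality.
    h264_kbps = 4000  # example H.264 bitrate in kbit/s (illustration only)

    for saving in (0.25, 0.50):
        h265_kbps = h264_kbps * (1 - saving)
        print(f"{int(saving * 100)}% saving: about {int(h265_kbps)} kbit/s in H.265")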

MP4

MP4 is a file container format (see above) developed by the Moving Picture Experts Group. It refers to the wrapper around the video and audio codecs, such as H.264 and AAC. It usually has the file extension “.mp4,” but related files may use “.m4v,” the closely related QuickTime “.mov,” and others. More here on Wikipedia.

Compression Artefacts

This refers to areas of the picture that can look grainy or blocky, or that contain other distortions of colour and movement, in a video file that has been highly compressed. These artefacts appear when there is not enough data to describe that part of the picture accurately. When compressing video, the aim is to create the smallest file with the fewest compression artefacts. More here on Wikipedia.

Ripping

Ripping refers to the process of copying a movie from a DVD and then compressing it into a new video file. Sometimes this requires working around anti-copying technologies. Handbrake was designed for this purpose. More here on Wikipedia.

There are many sources of information about compression on the web, but be careful: some articles contain misinformation, because few people really understand the technology and terminology involved. Below are links to guides, forums, and information about open-source video codecs and open standards.

Guides and Tutorials

Handbrake

Try Handbrake’s own wiki for a guide to all of Handbrake’s features and settings. Check it out here.

This is an incredibly in-depth 10-part series on using Handbrake to make H.264 video. If you watch this whole series, you will have a very thorough understanding of all the settings that are selected. It is a little out of date, as it was produced in 2010, but most of the information is still relevant today.

Forums

VideoHelp Forum

You can check the Video Conversion topic for threads about compression. There are also other topics such as authoring disc-based media, video streaming, camcorders, DVD ripping, editing, programming multimedia applications, and video issues specific to Mac and Linux.

CreativeCow

This is another forum, without listed topics, but you can search for solutions to many issues and read what other users have to say (or ask for help yourself).

Open Standards and FOSS Codecs

It is worth mentioning that Handbrake actually uses free and open-source software (FOSS) encoders for H.264 and H.265, called x264 and x265 respectively, and Handbrake itself is FOSS too. FOSS codecs are released under free software licenses such as the GPL, which ensure community ownership by allowing others to freely distribute, modify, and contribute to them. H.264 can be considered an open standard. Open standards are technical definitions for video formats and codecs that are publicly released and have had an open design process. It’s important to invest in FOSS and open standards to prevent video technology from falling exclusively into corporate hands, which could raise serious concerns about affordability, security, and access. As community media practitioners, it’s easy to see why we need to support independent media infrastructures such as radio and TV stations; the same principles apply to media software and internet video technologies.
