Understanding digital video and encoding

You recorded your video using a video camera, DSLR camera or even a smartphone or tablet.

You edited your video on your smart device, or using Adobe Premiere from your Adobe Creative Suite. All that's left is to encode your video, upload it and add it to your website.

Understanding the basics of encoding helps ensure you strike the right balance between video file size and video quality. With a little background knowledge you will be able to weigh the various encoding variables and find the perfect encoding recipe for your videos.

Containers and Codecs

Containers are files that can be likened to a bucket. You put your video, audio and metadata, each compressed using various methods, into one bucket, or file.

Some common container files that we have all seen are .mov, .mp4 and .wmv. Some of the newer containers are .webm and .ogg, which are often used inside HTML5 <video> tags.
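
If you ever want to peek inside one of these buckets, a tool like ffprobe (part of FFmpeg) will list every stream a container holds. Here is a minimal sketch, assuming ffprobe is installed and a file named video.mp4 is sitting in your working directory:

```python
import subprocess

# List the streams packed inside a container file.
# Assumes ffprobe (part of FFmpeg) is installed and "video.mp4" exists.
result = subprocess.run(
    ["ffprobe", "-hide_banner", "-show_format", "-show_streams", "video.mp4"],
    capture_output=True,
    text=True,
)

# ffprobe prints one [STREAM] block per video/audio/data stream and a
# [FORMAT] block describing the container itself.
print(result.stdout)
```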

Codec stands for compressor-decompressor (or coder-decoder). As the name suggests, a codec is a method of compressing and decompressing video and audio using various clever techniques.

These techniques reduce the file size of your videos while maintaining good quality and smooth streaming. Codecs you have probably seen before include H.265, DivX, VP8, ProRes, and DNxHD.

Lossy and Lossless Compression

You may see articles talking about lossy and lossless compression, or image compression and data compression. Let's get the skinny on compression (nerdy pun intended)…

Video codecs typically use lossy compression, also called image compression. File size is reduced by removing irrelevant or redundant image data, and the image is permanently altered in the process.

However, very clever algorithms have been developed that shrink the file size while making few, if any, changes that are perceivable to our eyes. Of course, the more a video file is compressed, the greater the chance of a noticeable change.
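
If you want to see the lossy trade-off for yourself, re-saving a still image at a few quality settings makes it obvious. This is only an illustrative sketch using the Pillow imaging library, and it assumes a test photo named photo.jpg; video codecs work on the same principle, frame after frame:

```python
from PIL import Image  # Pillow; assumes a test image "photo.jpg" exists
import os

img = Image.open("photo.jpg")
for quality in (95, 75, 40):
    out = f"photo_q{quality}.jpg"
    img.save(out, quality=quality)  # lower quality discards more image data
    print(out, os.path.getsize(out), "bytes")
```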

Lossless compression, or data compression, is compression where all of the original file data can be recovered when the file is uncompressed. Lossless formats we are all familiar with are .zip and .rar files. These forms of compression are not generally used for encoding video and audio.
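
Lossless compression is easy to demonstrate with Python's built-in zlib module: whatever goes in comes back out, byte for byte.

```python
import zlib

# Lossless (data) compression: the original bytes are recovered exactly.
original = b"Lossless compression recovers every original byte." * 100

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), "->", len(compressed), "bytes")
print("Exact match after decompression:", restored == original)
```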

Bit Rate

The definition of bit rate is the number of bits that are conveyed or processed per unit of time. But what does that mean?

Well, it’s all about figuring out how to transmit the best-quality video using the smallest amount of data. There are a number of different acronyms for bit rate, and because they look very similar it is a good idea to make sure we know what each one means:

  • Kbps = Kilobits per second
  • KBps = Kilobytes per second
  • Mbps = Megabits per second
  • MBps = Megabytes per second

Remembering our math class, 1 byte = 8 bits, so recognising whether we are looking at a lower case b or a capital B is very important, because 1 MBps = 8 Mbps. With this in mind, we can use some basic math to find out what the file size of our video will be based on the bit rate we set.

If we have a 5-minute video that we are encoding at 8 Mbps:
8 Mbps = 1 MBps (8 bits in a byte)
5 minutes = 5 × 60 = 300 seconds
300 seconds × 1 MBps = 300 MB
So our video will be 300 megabytes in size.
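
Here is that same math as a small Python sketch you can reuse for any bit rate and duration (audio and container overhead will add a little on top):

```python
def estimated_size_mb(bitrate_mbps: float, duration_seconds: float) -> float:
    """Rough video file size in megabytes: megabits per second times
    seconds, divided by 8 bits per byte."""
    return bitrate_mbps * duration_seconds / 8

# 5 minutes at 8 Mbps -> roughly 300 MB.
print(estimated_size_mb(8, 5 * 60))  # 300.0
```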

You knew that math class was going to pay off someday…

Frames Per Second

Frames per second, or FPS, is the rate at which a device produces unique consecutive images, or frames.

Humans can process roughly 10 to 12 separate images per second and still perceive them individually. With that in mind, 15 FPS is generally the slowest usable rate for animation or video. More common frame rates are 24, 29.97, and 30, or rates that are very close to those values.

It is a good rule of thumb to leave a video at its original frame rate. Sometimes, though, you can reduce the frame rate to shave a little off the file size or bit rate. If you do reduce the frame rate, make sure the reduction has not made the video look staggered or jumpy.
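
If you do decide to drop the frame rate, FFmpeg's -r option is one way to do it. A hedged sketch, assuming ffmpeg is installed and input.mp4 exists; codec and quality settings are left at their defaults here:

```python
import subprocess

# Re-encode input.mp4 at 24 frames per second.
# Assumes ffmpeg is installed; tweak codec/quality options as needed.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-r", "24", "output.mp4"],
    check=True,
)
```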

Bit Depth

Bit depth, also known as color depth, is basically the number of values available between dark and light. The more values available to each of the red, green and blue channels, the smoother the transitions between colors.

An 8-bit video has 2 to the 8th power, or 256, values for each of the red, green and blue channels, whereas a 10-bit video has 2 to the 10th power, or 1,024, values per channel. The increase between the two is considerable.
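
The numbers add up quickly once all three channels are combined, as this quick sketch shows:

```python
# Values per channel at a given bit depth, and the total number of
# possible colors once red, green and blue are combined.
for bits in (8, 10):
    per_channel = 2 ** bits          # 256 for 8-bit, 1,024 for 10-bit
    total_colors = per_channel ** 3  # every R/G/B combination
    print(f"{bits}-bit: {per_channel} values per channel, "
          f"{total_colors:,} possible colors")
```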

Take a look at the two images below to see how bit depth affects an image. The first is a regular 8-bit image, whereas the second has had its bit depth reduced. The difference is huge.

Chroma and Luma

Let's give the math a rest for a while. Chroma is color information while Luma is brightness information.

Phew, that one was easy.

Applying Your New Video Knowledge

Now we will put all that video encoding knowledge to work in a real-world environment.

Article Author: Dan Kellett

Dan comes to 123Muse with a background in fashion photography, videography and media creation.

He has worked with many media products as a museum interactive media designer and creative director of a large Miami-based agency. Dan works to integrate media software into the Muse workflow.

Skills Highlight:

Photo & Video

Photoshop & Lightroom

Video & Audio
