[A question on formats and codecs]

[A question on formats and codecs]

Post by halex1316 » Sat Jul 27, 2013 1:52 am

Hello,

Welcome to my slice of ignorance, where you, the reader, will hopefully be able to help me understand this stuff better than I currently do.

So this goes not only for my AMVs but also for the other videos that I'm posting on the web. I've gone ahead and read the Technical Guides to All Things Audio and Video, and there are just a few things that I want to clear up.

I suppose it would be best to tell you guys the "end result" I'm looking for, and I can "extract" the information as I go. I'm serious about learning this stuff, so all help is appreciated. I just do much better when it's explained so that I can ask questions when they arise, so patience is also appreciated.

I'm just confused by the amount of information presented in the guide, and after comparing certain videos (their compressed sizes and their quality) to my own attempts, I just can't seem to wrap my head around it.

I recently watched an AMV that was beautiful quality, three minutes long, and only 40MB in size. It was an .mp4 file. (Not sure how much relevance that has).

When discussing the different distribution codecs and formats, AVI is mentioned, but everything listed below it appears to be codecs of sorts? My understanding is that a format is basically the container for a video. For example, if I'm rendering a 3-minute AMV and I choose the file type .avi and render uncompressed, which is as lossless as I think you can get, what would my next steps be to get that file size down while keeping the quality? I've downloaded and messed around with zarx264gui, but I'm not quite sure what I'm doing. I want to be able to preserve quality and limit file size, for storage and distribution.

You have an AVI, with a video portion and an audio portion, and both of these can be encoded with codecs. Can someone please draw a clear line between formats and codecs? (A list of formats vs. codecs would be useful, because I couldn't tell the two apart as they were listed in the guide.)

I realize that I've kind of ranted about all the things I don't understand, and hopefully there's someone out there who can help me out. A live discussion would be amazing (if anyone is willing to help me out that much xD) and if you're able to, please message me and I'll figure out a way we can discuss this.


Thanks ahead of time,

halex1316

Re: Calling all encoding + compression experts

Post by halex1316 » Sat Jul 27, 2013 6:09 pm

I realize this is the wrong forum -__-' Can someone move this to the footage help section?

[MOD258: Topic moved and title changed.]

Re: Calling all encoding + compression experts

Post by mirkosp » Sun Jul 28, 2013 2:09 am

I think the first thing is differentiating between formats and codecs. Actually I think that might suffice to answer your question?

The format is the standard: it sets the rules that the codecs ─ the actual implementations ─ need to abide by.
Now, codecs can be actual codecs (as in, they do both COding and DECoding), or they could be just encoders or decoders.
Let's look at a few examples.

H.264 is the format. It also goes by the name of AVC (MPEG-4 Part 10 Advanced Video Coding would be the full name), which is the same thing, so you can use the names interchangeably.
x264 is an H.264/AVC encoder (it relies on other software for decoding and internally pretty much only supports uncompressed input).
LAV Video is an H.264/AVC decoder (but it can also decode other formats).
FFmpeg and Libav are H.264/AVC codecs (but they can also decode and encode other formats and do many other things).

You'll often see x264 or LAV referred to simply as a codec, or even H.264/AVC referred to as a codec. These definitions are technically wrong, but through widespread usage they have become somewhat accepted, much like Y'CbCr being called YUV despite that also being technically wrong.
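
If you want to see that distinction on your own machine, here's a quick Python sketch (just an illustration; it assumes you have the ffmpeg command-line tool installed and on your PATH) that asks ffmpeg which H.264 encoder and decoder implementations it ships with:

Code:

import subprocess

def list_h264_implementations(kind):
    # "kind" is either "-encoders" or "-decoders", both standard ffmpeg flags.
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", kind],
        capture_output=True, text=True, check=True,
    ).stdout
    # Every line describing an H.264 implementation mentions "264" somewhere.
    return [line.strip() for line in out.splitlines() if "264" in line]

print("H.264 encoders found:")
for line in list_h264_implementations("-encoders"):
    print("  " + line)
print("H.264 decoders found:")
for line in list_h264_implementations("-decoders"):
    print("  " + line)

You'll typically see several entries for the same format (libx264 among the encoders, a plain h264 decoder, maybe hardware ones), which is exactly the format-vs-implementation point above.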

Then we get to containers.
MKV is the container format.
MKVMerge (part of MKVToolNix) is an MKV muxer.
Haali Splitter is an MKV demuxer (but it can also deal with other containers).

For some more listings...
AVI is a container. VirtualDub can mux to AVI (and demux it too, actually).
ASP (MPEG-4 Part 2 Advanced Simple Profile) is a format. XviD and DivX are ASP encoders.
UTVideo is a format, and it has an official implementation which goes by the same name. The official installer comes with various encoders and decoders depending on the colorspace you want to employ, which is why it's called a "codec suite".
AAC is the format; Nero AAC and QTAAC are AAC encoders, and there are many more.
MP3 is the format; LAME is an MP3 encoder (yeah, despite its name).
MP4 is a container (defined in Part 14 of the MPEG-4 standard). L-SMASH and MP4Box are MP4 muxers.
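
To make the container/stream split concrete, here's a small Python sketch (assuming ffmpeg is installed; the filenames are just placeholders) that moves the same compressed video and audio from an MP4 into an MKV without re-encoding anything:

Code:

import subprocess

# "-c copy" copies every stream bit-for-bit into the new container,
# so only the container changes, not the compressed streams inside it.
# Assumes ffmpeg is on the PATH; "input.mp4" is a placeholder filename.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c", "copy", "output.mkv"],
    check=True,
)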

I think these should suffice to give you the idea?

PS: It would help in the future if you could make topic subjects clearer about what is to be discussed; otherwise, other people won't know the content of the topic at a glance.

Re: [A question on formats and codecs]

Post by halex1316 » Sun Jul 28, 2013 2:33 am

Yeah, I'm sorry. My question was really ambiguous, but I had spent about two hours trying to figure it all out, and there was just so much I wanted to know in the moment that I'm not sure it all came out correctly xD It's just that in the tutorial, it looked to me like file types and the names of codecs were being used interchangeably, and I just got really confused.

Really appreciate the help :)

Re: [A question on formats and codecs]

Post by halex1316 » Sun Jul 28, 2013 2:39 am

Now, what's the difference really between encoders and decoders? Like what's the purpose of having one that does something different than the other? If you have an H.264 file, what would decoding it do, and what would the result be? And vice versa?

If you have an .avi file, can you encode it with an x264 codec? I mean, in Sony Vegas (you're probably familiar with it) there's a MainConcept AVC/AAC preset, and the file type (container) is .mp4 or .avc. You explained that .avc is a file type, which I get. Can multiple file types use the same codec? What are the advantages of doing this in a different file type?

Another thing that I heard was that .avi is usually used for lossless rendering. So you'd render to that, then compress it with a "lossy" codec to make the file size smaller. However, can you render an .mp4 losslessly, or is that one of the limitations of the file type? It just can't use a lossless codec? :)

Re: [A question on formats and codecs]

Post by halex1316 » Sun Jul 28, 2013 2:46 am

Side note: any links to useful information that I can read through without lobbing questions at you would be greatly appreciated. I'm going to keep searching the web to try and find what information I can, but because I'm ignorant on the topic, I have to know what I'm looking for, and quite frankly, I don't xD

Re: [A question on formats and codecs]

Post by mirkosp » Sun Jul 28, 2013 7:58 am

Doom9 is a pretty good place to check out for guides and so on.

Onto the questions.
As the names say, an encoder encodes to a certain format, while a decoder decodes from it.
The exact implications depend on whether the codec itself is lossy or lossless.

The original idea is that an uncompressed source is just too big, so there are two things you can do (there's a rough sketch of both options right below the list):
1) You want to edit it, so you want to retain quality while still cutting down the filesize, and possibly avoid running into disk speed limits or spending too much CPU time on decoding. The solution is to use a lossless codec.
2) You want to share it. You'll then use a lossy codec to cut the filesize while losing quality in a way that is hopefully not too visible (it's still not going to be lossless, though).
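
Here's the sketch of those two options in Python (just an illustration: it assumes a reasonably recent ffmpeg is installed, the filenames are placeholders, and UTVideo / x264 / CRF 18 are only example choices, not the one true way):

Code:

import subprocess

SOURCE = "master.avi"  # placeholder for your uncompressed/lossless source

# Option 1: a lossless working copy (UTVideo video + untouched PCM audio).
# Smaller than uncompressed, but every pixel and sample is preserved.
subprocess.run(
    ["ffmpeg", "-i", SOURCE,
     "-c:v", "utvideo", "-c:a", "pcm_s16le", "lossless_work.avi"],
    check=True,
)

# Option 2: a lossy copy for sharing (x264 video + AAC audio).
# CRF 18 and preset "slow" are just reasonable starting values.
subprocess.run(
    ["ffmpeg", "-i", SOURCE,
     "-c:v", "libx264", "-crf", "18", "-preset", "slow",
     "-c:a", "aac", "-b:a", "192k", "share_me.mp4"],
    check=True,
)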

When the data is uncompressed, anything that can deal with a stream of the given type (video or audio) is going to support it, because there is no special knowledge involved. It's as if I gave you an Italian book of arithmetic exercises: you'd still be able to solve them, because arithmetic is the same all over the world; you don't really need to know Italian to solve the exercises.
But what if I told you to read a history book in Italian? Well, you'd need to learn Italian first. You'd need to decode the language. Of course, you could say that the information in the book was originally just information in the strict sense: people did stuff, it's not language related. So that information was encoded in a language first in order to be shared in the most efficient way; you could have made extensive drawings to depict what happened, but that's going to take more time and space to move around, ain't it?
This is basically the idea behind encoding and decoding.
Why would you want them separate? Well, you have a standard. By separating the encoder and the decoder, you can let people focus on one at a time, which gives you optimized results, so you can save even more space and time both when encoding and when decoding, as the two processes are pretty much opposite to each other and require different things to be done.
Clearly, once you have an encoded file, decoding it means that you can see/hear what it originally represented (albeit, in the case of H.264, with the differences implied by the lossy encoding used to save space ─ yes, you can use H.264 for lossless encoding, but I'd much rather avoid talking about that now).
And when you have an uncompressed file, you can just encode it to save space and easily decode it at a later time when the need arises.
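
To tie that to actual files: "decoding" just means expanding the compressed streams back into raw pictures and samples. A rough sketch (ffmpeg again, placeholder filenames, and brace yourself for the output size):

Code:

import subprocess

# Decode an H.264/AAC file back to uncompressed video (rawvideo) and
# uncompressed PCM audio, stored in an AVI. The result is what the
# encoder started from, minus whatever the lossy encode threw away.
subprocess.run(
    ["ffmpeg", "-i", "encoded.mp4",
     "-c:v", "rawvideo", "-c:a", "pcm_s16le", "decoded.avi"],
    check=True,
)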

And now, let's make further clarity about formats.
Let's say you have an apple and an orange. You want to keep these on your table. You could just leave them there around on their own, or you could put them in a basket.
This is basically the idea behind containers and audio and video streams.
AVI, MKV, and MP4 are container formats. They are the baskets in which you'd put the fruit.
AVC, ASP, AAC, and MP3 are the fruits themselves, what you want to ultimately eat.
You use the basket for convenient storage, but you aren't necessarily forced to use the basket to keep the fruit around. Without it, though, it gets kind of hard to move the fruit around and tell that it belongs together.
So basically what we do is, we say "this video goes with this audio" by putting them together so they can be played back at the same time.
On the other hand, you can also use the basket to just have only the apple or the orange there.
Mainconcept (which is an AVC encoder) can give you the fruit. You can tell it to put it in a basket (.mp4), possibly along with the audio; or you can tell it to just leave it on its own on the table (.avc ─ in MPEG terms, this is called an elementary stream).
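
If you're curious what a fruit with no basket looks like, here's a quick sketch (assuming ffmpeg is installed; the filenames are placeholders) that pulls the bare AVC elementary stream out of an MP4:

Code:

import subprocess

# "-c:v copy" keeps the video bits untouched, "-an" drops the audio,
# and the resulting .h264/.avc output has no container around it at all:
# no audio, and no reliable framerate information either.
subprocess.run(
    ["ffmpeg", "-i", "movie.mp4", "-c:v", "copy", "-an", "video_only.h264"],
    check=True,
)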

For various reasons (which I don't feel like explaining here right now, but I will explain in a later post if you want to know), it's best to just output in a lossless format.
You could straight up export uncompressed, but that's going to take even more space. With a lossless format you can save space while still retaining all the quality.
For audio it's actually common to just leave it uncompressed (so in the PCM format) and encode it to a lossy format later.
For video, there are a number of lossless formats which have been used over the years (you've probably seen huffyuv, lagarith, and utvideo mentioned around).
It's common to use the AVI container to store lossless data; you could use other containers as well, but AVI just sort of stuck around over time, mostly because most lossless codecs have a VfW interface, so it's easier to do it all in one fell swoop ─ encoding to a lossless format while muxing audio and video together, especially from the NLE. You'll realize that exporting from Vegas or Premiere is limited compared to what you could theoretically do, which is also one of the reasons why it's better to make the final distro encode with dedicated software.
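
If you're ever unsure what your NLE actually put in that AVI, a quick sketch like this (assuming ffprobe, which ships with ffmpeg, is on the PATH; "export.avi" is a placeholder) will print what's inside each stream:

Code:

import subprocess

# ffprobe prints its human-readable stream summary to stderr.
info = subprocess.run(
    ["ffprobe", "-hide_banner", "export.avi"],
    capture_output=True, text=True,
).stderr

for line in info.splitlines():
    if "Stream #" in line:
        # e.g. "Stream #0:0: Video: utvideo ..." / "Stream #0:1: Audio: pcm_s16le ..."
        print(line.strip())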

So now you have your AVI with lossless video and uncompressed audio. You could just play this back directly in your player, but if you were to share it around it'd take a lot of time, and it still takes up a huge chunk of space on your hard disk. That's why we use lossy codecs. Yes, you can feed this AVI to an AVC encoder (remember, AVC is the format, so there are many codecs which encode to that format: x264 is one specific encoder, MainConcept made their own encoder as well, which is the one you have in Vegas, and there are other encoders too), so you can get your encoded file.
As for .mp4/.avc, the answer is above: since .avc is an elementary stream for the video, you wouldn't have the audio (also, the elementary stream for AVC doesn't include framerate info, so that's another can of worms). But you aren't forced to use MP4, as you can put AVC video along with AAC audio in MKV as well, for example.
The various containers offer different features, so you'd have to look into what a container has to offer to decide what suits your taste (though if you just have to release audio+video, there isn't much of a difference at all between using MP4 or MKV).
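
And just to show the "separate tools" route in one place, here's a rough Python sketch of it (assuming the x264 command line, ffmpeg, and mkvmerge are installed, that your x264 build can read AVI input via lavf/ffms2, and with placeholder filenames and example settings):

Code:

import subprocess

# 1) Video: lossless AVI -> bare AVC elementary stream (video only, no audio).
subprocess.run(
    ["x264", "--crf", "18", "--preset", "slow",
     "-o", "video.264", "lossless.avi"],
    check=True,
)

# 2) Audio: grab just the audio track and encode it to AAC.
subprocess.run(
    ["ffmpeg", "-i", "lossless.avi", "-vn",
     "-c:a", "aac", "-b:a", "192k", "audio.m4a"],
    check=True,
)

# 3) Mux: put the AVC video and AAC audio together in an MKV container.
#    (mkvmerge has to assume a frame rate for the raw stream, per the
#    framerate caveat above, so check that the result plays at the right speed.)
subprocess.run(
    ["mkvmerge", "-o", "final.mkv", "video.264", "audio.m4a"],
    check=True,
)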
