AV1? That’s a codec, right? I see it in the preferences section for Piped. Is it better than AVC (H.264)?
It’s also better than H.265/HEVC. Plus, it’s open source and royalty-free.
Better quality per file size than HEVC? cite?
I was curious, so I looked it up. AV1 is more efficient than HEVC by like 28%! On the downside, encoding is horrifically slooow. It’ll be interesting to see how much hardware support AV1 gets in the coming years, because encoding time will have a dramatic effect on its adoption rate.
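Some back-of-the-envelope math on what that efficiency figure would mean for file sizes (the 30 GB starting size is a made-up example, not a measured result):

```python
# Rough space savings from AV1's ~28% bitrate efficiency gain over HEVC,
# assuming comparable perceived quality. The 30 GB input is hypothetical.
hevc_size_gb = 30.0                      # assumed HEVC encode of a movie
av1_size_gb = hevc_size_gb * (1 - 0.28)  # same quality at ~28% fewer bits
print(f"{av1_size_gb:.1f} GB")           # prints "21.6 GB"
```

Nearly 10 GB back per movie, if the 28% figure holds for your content.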
Interesting to note: AV1 can be played in Kodi, Plex, Emby, Jellyfin, VLC, Chrome, Firefox, Edge, and Opera. So on the software side, it’s pretty widely supported.
As long as your CPU/GPU can handle it
Ah, yes, that’s correct, thank you. Your CPU/GPU must support it, or it won’t play.
They didn’t say that. They just said it’s better, but didn’t mention their rationale. But IMHO, it being FOSS makes it about a million times better than HEVC alone
Sweet!
Yes, AV1 is the next big deal. You can compress the hell out of the video and it still looks near original. I’ve re-encoded some of my locally ripped movies for fun to see how it looks, and it’s really impressive.
I wonder if it’s possible to re-encode from H.265/HEVC to AV1
You can play around in HandBrake with AV1 encoding to see how it goes. I think I set the constant quality to 36 or something.
Thanks, I will give it a shot and see how it goes. The biggest thing holding me back is older hardware, like the Nvidia Shield for example, not supporting AV1.
Ahh, yeah that could be an issue. It takes my laptop like 11 hours to encode one of the Lord of the Rings Blu-rays. I also change the audio to eAC3 while I’m in there for better client support.
It’s always possible to re-encode video; it’s usually called transcoding. However, you lose a bit of quality every time you encode, so you might not gain much in the end. You can offset a bit of the quality loss by encoding at a higher bitrate/quality factor/etc than you otherwise would, but that of course takes up extra space.
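A toy model of that generational loss (nothing like a real codec — just a scalar quantizer with made-up step sizes — but it shows how error, once introduced, sticks around across re-encodes):

```python
def quantize(x, step):
    """Lossy 'encode': snap x to the nearest multiple of step."""
    return round(x / step) * step

original = 1.0
value = original
errors = []
# Re-encode repeatedly, alternating between two "codecs" modeled as
# quantizers with different grids. Once quality is thrown away, later
# generations can't recover it, and switching grids can make it worse.
for generation in range(4):
    step = 0.3 if generation % 2 == 0 else 0.4
    value = quantize(value, step)
    errors.append(round(abs(value - original), 10))

print(errors)  # per-generation error never returns to zero
```

Each pass through a different quantizer grid can move the value further from the original, which is why encoding the transcode at a higher quality setting only offsets, never eliminates, the loss.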
I wonder if the Apple Vision Pro is able to play AV1 files 🤔 I guess it would be really bad if not
Sorry, doesn’t look like it supports AV1.
Oh dear, 😂, that’s bad for the longevity of such an expensive device (or would it be doable with a SW update?)
Luckily I’m poor and don’t have to think about that 😂 but it would be nice to get this first edition, since it is the most likely to be jailbreakable Vision ever made, I guess. Well, assuming one never releases with a data port or similar.
Usually this kind of stuff is done in hardware for performance reasons. So likely no.
That’s probably not their main focus with that thing
Presumably you know, but for anyone else: the word for this is “transparent.” It’s when the codec leaves no noticeable artifacts.
How would you use that in a sentence? Like “You can compress the hell out of the video and it’s transparent”?
You’d describe the encoding, not the source. The fun part is that it also applies to audio. “At 256 kbps, MP3 is transparent.”
It only applies to lossy codecs. Lossless codecs, by definition, have no error. “Error” itself being a borrowed term. Good encodings don’t have fewer errors… they have less error. For example, measured as mean squared error, where an individual sample being very wrong counts more than many samples being slightly wrong.
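That weighting can be shown with a quick sketch (hypothetical 100-sample signals, not real codec output):

```python
def mse(original, encoded):
    """Mean squared error: average of squared per-sample differences."""
    return sum((a - b) ** 2 for a, b in zip(original, encoded)) / len(original)

reference = [0.0] * 100

one_big_miss = [0.0] * 99 + [10.0]   # 99 perfect samples, one off by 10
many_small_misses = [0.5] * 100      # every sample off by 0.5

print(mse(reference, one_big_miss))       # 1.0
print(mse(reference, many_small_misses))  # 0.25
```

Despite being wrong on every single sample, the second encoding has a quarter of the error of the first — squaring makes one badly wrong sample count more than many slightly wrong ones.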
“the encode is transparent”
deleted by creator
Huh, there’s a term for that? TIL
Y’know, there’s a similar one used in the gemstone industry: “eye clean,” which only applies if the stone has no inclusions (artifacts) that can be seen with the naked eye. As you can imagine, it’s usually a pretty desirable trait, especially in diamonds. It doesn’t really matter if there’s random garbage floating around in it, it just has to be undetectable to our human eyes.
Misread as “gaming industry” and was briefly very confused.
Can you tell me more about reencoding to save space?
If you reencode to a more efficient codec, you can save ridiculous amounts of space. If you’re interested in reencoding and are willing to play with self-hosting, look into Tdarr, an app that can reencode your whole library. I’ve been using it for a while after switching from my personal solution, and it has been wonderful. I just put files into my media directories and it picks them up, reencodes the file, and replaces the original if everything checks out.
It’s H.264, just FYI.