
Also, H265 encoded 1080p is hitting the sweet spot between quality and file size. About 500-ish MB per hour of content for decent quality.
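
Back-of-the-envelope, that figure implies a fairly modest bitrate; a quick sketch (using only the numbers quoted above):

```python
# What average bitrate does "500 MB per hour" of H265 1080p imply?
size_bytes = 500 * 1000**2          # 500 MB, decimal units
duration_s = 60 * 60                # one hour
bitrate_mbps = size_bytes * 8 / duration_s / 1e6
print(f"{bitrate_mbps:.2f} Mbps")   # → 1.11 Mbps, audio + video combined
```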

It's sort of like when VBR encoded MP3s became the standard in the privacy scene in the early 00s. The quality is good enough for most folks, plus the reduced file size means that it downloads really fast on average connections and isn't prohibitively expensive to store, so you can create a large library very quickly.

Also I think the shift to SSDs from spinning rust delayed the adoption of widespread video piracy 10-ish years ago, because the cost per GB was just too expensive and most people weren't going to buy a NAS. But now we are starting to move past the inflection point: SSDs are bigger, H265 reduces file sizes, hardware accelerated H265 encoding is easier (meaning more H265 content), and devices that can decode H265 are becoming cheap and ubiquitous. You can install Plex on your laptop, load up on content, and have a better UX than Netflix.



> About 500-ish MB per hour of content for decent quality.

That's still YIFY-quality video, which is a few steps below HBO/MAX, which happens to be among the lowest quality paid streams.

ATV+ is killing it with 30Mbps 4K HDR+ video (it used to be as high as 48Mbps). Disney+ is close. The Netflix 4K plan serves up high 1080p bitrates, 2x better than their 1080p plan. Worth paying for.


Note that Netflix and co. won't stream 4K at all to Linux devices, and they often don't serve 4K if they deem your internet connection to be too slow.

For many folks, pirated 1080p is on par with what they'd get streaming, and pirated 2k or 4k is better.


Amazon Prime Video is even worse, on Linux they only serve 480p. Any pirated file is much better quality.


It's unwatchable. I'd cancel Prime because of it if I didn't want the shipping. What is the real purpose of limiting the stream? It can't be because of pirating. The movies and shows show up on the torrent sites BEFORE they even hit Amazon Prime. So who's going to "record" it there? They are only hurting their customers.


Exactly! As it is unwatchable, it makes zero sense to pay for the service. It's either pirate or just don't watch it.


Lmao at paying for 2006-youtube quality video.


Still better than using some other OS.


You can use torrent clients under Linux though.


Do they even stream 1080p to Linux devices by default now? I always had to use a browser extension to make it do that[1].

However the extension seems to be gone.[2]

[1] https://github.com/vladikoff/netflix-1080p-firefox?tab=readm...

[2] https://github.com/vladikoff/netflix-1080p-firefox/issues/28



Lol. Why should I pay for aggressively hostile software that requires me to install spyware on my machine for the privilege of watching throttled content? No thanks, I'll download an mkv and watch it when I want, where I want, online or offline.

I could do so with DVD and blu-ray. You don't want to let me do so via the internet? Okay, I'll do it anyways.


1080p = 2k; both are 1920x1080. p counts rows in the vertical dimension (p for progressive, as opposed to i for interlaced; e.g. NZ terrestrial TV is 1080i). k counts columns in the horizontal direction, and comes from film making and visual effects (1920 rounds up to 2k and 3840 rounds up to 4k).
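
The rounding involved is just the horizontal pixel count taken to the nearest thousand; a small illustration (the resolution names are the common marketing ones, not from the comment above):

```python
# "k" labels come from rounding the horizontal resolution to the nearest thousand.
# Note how 2560-wide QHD actually lands nearer 3k than 2k.
widths = {
    "DCI 2K": 2048,
    "Full HD (1080p)": 1920,
    "QHD (1440p)": 2560,
    "UHD (2160p)": 3840,
    "DCI 4K": 4096,
}
for name, w in widths.items():
    print(f"{name}: {w} px wide -> ~{round(w / 1000)}k")
```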


Most relevant for anyone who also had this confusion:

https://en.wikipedia.org/wiki/4K_resolution#2160p_resolution


Wow, I didn't know this! I mistook it for 1440p (that's what my monitor is). I figured that, since 1440p sits between 1080p and 4k, it was synonymous with 2k, but I was mistaken.


When looking for a monitor:

> 1k = 1920x1080

> 2k = 2560x1440

> 4k = 3840x2160

[even though it's not usually referred to as 1k, but rather 1080p directly]


That makes no sense.


Wow, I thought K stood for thousand (as in 4K=4,000)! Nice piece of trivia if true.


Yes, k stands for thousand. Historically, 4k meant 4096 pixels wide in the context of digitised film or digital visual effects, and 2k meant 2048 pixels wide. TVs ended up 1920 pixels wide, which is "close enough" to use the same term, 2k. I think "4k" is used for marketing TVs as it's easier to remember and say than "2160p", so now we mix the terminology.


It's a loophole. "UHD 4k" is what TV advertisers used to dilute the meaning and not get sued. I believe the definition has been diluted to the point that 4k is expected to mean UHD.


What do you mean by 2k? Because people should not call 2560x1440 2k, and I've never seen a download that size either.


Why do you single out 2k? The term "4k" is just as wrong and purely marketing driven as well.

The resolution that's usually behind 2k, which is 1440p as you've correctly pointed out, is usually available as torrents too.


Rounding 1920 to 2k and 3840 to 4k is not too bad. And yes it's marketing to switch from height to width, but whatever.

Rounding 2560 to 2k is massively confusing. Don't do it. 2.5k or don't use "k" at all.

And when I go look at a couple torrent sites and scroll through movies and tv shows, I'm not seeing a single 1440p in the first couple pages. Some searches show barely anything at all.


The first site I checked has 317 pages of 50x 2160p listings per page going back seven years.

The most recent entry is:

How To Train Your Dragon The Hidden World (2019) 2160p 4K BluRay 5 1-LAMA

Format : HEVC

Width : 3 840 pixels Height : 1 634 pixels

Display aspect ratio : 2.35:1

Near the top is a recent TV episode:

True Detective S04E04 Night Country Part 4 2160p MAX WEB-DL DDP5 1 DoVi x265-NTb

Format : HEVC

Width : 3 840 pixels Height : 1 920 pixels

Display aspect ratio : 2.000

What makes things problematic is the overbearing love for letterbox-like aspect ratios; even pirates have standards, and they're having to bundle a slew of aspect ratios together. This comes from the production companies.


I think you agree with me?

You overwhelmingly see 1280x720, 1920x1080, and 3840x2160, sometimes with a truncated height because of aspect ratio but usually advertised with the full height for consistency reasons.

There's barely any 2560x1440. Anyone going above 1080p goes directly to 4k.

Youtube is pretty much the only place I've ever seen 1440p encodes, and that's because they're super version-happy and make 20 different variants of a video.


Perhaps, I thought your complaint was not finding enough "4K" (not a term I like much).

If it's about finding 1440p that'd mainly be because it's not a common broadcast format to the best of my knowledge - I just don't see it about much.

Articles such as: https://en.wikipedia.org/wiki/4K_resolution

don't mention it as a broadcast format, and articles that are specific to 1440p: https://en.wikipedia.org/wiki/1440p

have it as :

    As a graphics display resolution between 1080p and 4K, Quad HD is regularly used in smartphone displays, and for computer and console gaming.


1440p isn't really available on official streaming platforms, so it is indeed a lot more rare.

It's pretty much only available on original encodes, i.e. BluRay rips. This makes it a format that's very rarely seen on currently airing shows, which are mostly webrips from official streaming platforms.

You'll often see it alongside the usual resolutions for movies that have since been released on disc.


I have a proposal for you: Take back the k from the marketers.

Define: k = multiple of 1920x1080 pixel count

1920x1080 ~= 2M pixels = 1k

2560x1440 ~= 4M pixels = 2k

3840x2160 ~= 8M pixels = 4k

So now 1440p = 2k, and k becomes meaningful. Problem solved!

It also gives a solution for ultrawides like 7680 x 2160p, which are 8k.

More interestingly, “8K” TVs now become 16k TVs, which marketers should like. We’ve come full circle! Now the k nomenclature also gives you an idea of how difficult that 16k display will be to drive with a video card relative to your existing monitor.
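
If anyone wants to play with the proposed scheme, here's a tiny sketch (the `k_label` helper is made up for illustration):

```python
# Proposed redefinition: k = multiple of the 1080p pixel count (~2.07 MP)
BASE = 1920 * 1080

def k_label(width: int, height: int) -> int:
    return round(width * height / BASE)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160), (7680, 2160), (7680, 4320)]:
    print(f"{w}x{h} -> {k_label(w, h)}k")
# 1440p comes out as 2k, ultrawide 7680x2160 as 8k, and "8K" UHD-2 as 16k
```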


> I have a proposal for you: Take back the k from the marketers.

And how do you propose getting your definition out in front of more people than the combined marketers of all the legit sources of movie, TV and other streaming content in the world?

lol

Pick the battles you have a whelk's chance in a supernova of winning.


And now you also have to fight with the cinema/projector standard, where 4k is 4096x???.

Not to mention that 3840x2160 already has a (mostly) separate term : UHD-1.

There's also UHD-2 which is 4 times bigger, but I expect it to be renamed to something else soon enough.


A number that scales with pixel count is even more markety. Pixel width or height is a much better metric for quality.

But if you want to go down that path, instead of trying to redefine k just use megapixels.


>Why do you single out 2k?

Read the second clause!


> That's still YIFY-quality video, which is a few steps below HBO/MAX, which happens to be among the lowest quality paid streams.

You don't necessarily have to pick YIFY.

Qxr's uploads are a much better comparison: x265 encoded 1080p (with 5 - 6.5Mbps bitrate) from the highest quality sources, and they look very good even on 55" 4K panels.


Are these, like, famous pirates or something?


They are well-known pirate groups, yes.


I personally think streaming services are overestimating how much of an advantage their 2k or 4k streams give them over "pirate services". I don't think they have properly researched the consumer psychology or the network effect that is making piracy popular among a large segment of the middle class and lower strata.

An hour long 30 Mbps 4k HDR+ video file will be roughly around 10-15+ GB with H.265 encoding.
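
That estimate checks out; a one-line sanity check using the quoted 30 Mbps figure:

```python
# File size of a constant 30 Mbps stream over one hour
bitrate_bps = 30 * 1e6
duration_s = 60 * 60
size_gb = bitrate_bps * duration_s / 8 / 1000**3
print(f"{size_gb:.1f} GB")  # → 13.5 GB, within the 10-15+ GB range above
```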

As others have pointed out, a well encoded 720p or 1080p video offers a decent enough viewing experience at far, far smaller sizes than 2k or 4k videos (file sizes will be 10 to 20 times smaller at these resolutions). Note also that some pirates encode videos with the CPU rather than with hardware encoders, and these videos tend to have higher quality with better compression (hardware encoders, while blazing fast, tend to do a poorer job than CPU video encoding). Thus, these smaller video files don't require high-speed internet, can be downloaded fast, and also encourage people to save the videos longer. This allows some to create their own personal video library.

A side effect of this is that people store and share these videos longer, and their smaller sizes now allow streaming torrents of popular content. Some torrent sites today have even started offering this through the browser itself - so non-techies now don't even have to download any torrent software and learn how to use it. That's near-Netflix convenience, with more content, for "free" - and that's what these services are up against.

We also can't ignore that 4k videos are often only available at higher tier subscription plan. So even if Netflix, or other streaming services, think that some of these people can be enticed to subscribe to their services, with their high quality 2k or 4k videos, they will have to offer them at a lower price to beat the "free" model of piracy. (It's very hard to compete with "free" - just look at Google search engine's market share and its non-free competitors' market share to understand this).

All this is of course irrespective of the fact that 2k and 4k resolution HDR+ videos are also increasingly available on torrents nowadays.


And by "today", you mean a decade ago, which is when Popcorn Time got popular.


Not sure - did Popcorn Time ever allow streaming torrents through the browser itself? Some torrent sites today have a streaming section where you can browse a catalogue and watch torrent videos right in the browser itself. Check out https://ferrolho.github.io/magnet-player/ for an example of this.


Ah, well, I am not really certain that it was such a big deal to need to download a separate program, especially at the time, especially in Argentina?

Popcorn Time seems to have (already?) been based on browser techniques: JavaScript & NodeJS? (Or is that only for today's version?)

Also, I'm not exactly impressed, because PeerTube has used WebTorrent for quite some years now (before recently abandoning it for something even better?).

But yeah, you shouldn't expect that website to last long - but then this kind of game of whack-a-mole has been played by PirateBay-likes and illegal streaming sites for years now.


25% of all streaming globally is consumed on a mobile device. During the day (commuting to work, during lunch breaks) that number is much higher. Children's programming is also largely consumed on mobile devices (well, tablets).

On small devices like that 4K HDR+ video is kind of meaningless.


It’s also kind of meaningless at regular tv viewing distances. I have a 4k tv but from my couch there’s no way I can tell 4k apart from 1080. And honestly, 480 movies (dvd resolution) still look fine to me too. They have a certain aesthetic softness to them that I quite enjoy. 30 seconds into a 480 film I stop noticing the resolution at all.


There are no 'regular viewing distances' anymore; any guidelines were long ago thrown out of the window. Some want to have cinema at home, some want a tiny screen in the kitchen or bedroom, some want (or can only afford) something in between.

Just do what works for you. But with, let's say, a 'cinema experience' distance-to-size ratio you definitely can see the difference. I could clearly see it on Netflix 4k on a tiny 55" 4k screen from maybe 2-2.5m.

But of course after 1-2 minutes it becomes a meaningless difference and your brain blends it; what you should focus on is not seeing big banding pixels due to bad low-bitrate compression in very dynamic scenes.


That's probably because your TV is seamlessly upscaling every signal it can receive to 4K. Newer TVs are very good at that.


That’s sweet of you to say, but it’s because I’m pushing 40 and my eyesight isn’t what it once was.


So for 75%, including Youtube, it is meaningful?


Maybe your eyesight is a lot better than mine but I can't discern pixels at sub-1080p resolution, especially in a movie where everything is kinda smudged out and individual pixel values don't matter as much.


It's usually blocky compression artifacts that show up at 720p and some 1080p encodes. They're most noticeable in dark or fast moving scenes.

On Netflix, The Sandman looked abysmally bad at 720p. The complex backgrounds became a garbled mess and the dark scenes were filled with banding.


> blocky compression artifacts that show up at 720p

It's just the shitty compression, not the mark of 720p.

This is extremely evident on YouTube now: for a couple of years, if the source content is 4K native, they downgrade it to 1080p (so you can stream it with a comfortable bitrate), but despite a good enough resolution (1080p! 2K!) and the need to just downscale the resolution (yes, double re-encoding, but), the 1080p version looks abysmal. Especially evident on live videos with contrasting lights.


You're right that it's not a problem inherent to 720p itself. I was using it as a proxy since the issues weren't present in Netflix's 4k encoding and it's more familiar than other measures.


Do you like dark scenes?


I grew up with analog TV where the colours used to bend and you had to kick the antenna with a stick after the wind moved it a bit. If I can tell the characters apart and the audio is synchronised, I don't need any better quality!


You and I both. I had the fortune of taking the kids out to look after a field camp years ago, where a storm in the distance would cut internet and TV signals, and cloudy weather would knock stuff out for days. They learned a different way to use a compass and how to repoint an antenna to get signal using teamwork. Thoroughly rate it.


Depends on the movie I guess. Planet Earth 2 is a jaw-dropping experience on even an average UHD-1 tv (and it's not ALL about the HDR, I've tried turning it off). Distinctly more impressive than Planet Earth 1 (FullHD only).

It's a real shame that the same store that sold Blu-Ray players didn't have it and I had to pirate it (and also skip buying the player).


> The Netflix 4K plan serves up high 1080p bitrates

Only if you are lucky and everything aligns well. You can't ask customers to debug their hardware to make it work.


Step 1: use appletv 4k

As much as it pains a certain set of nerds, using a dedicated streaming device is still the best way to bypass the arbitrary hurdles that streamers throw at you. Building a general PC that gets Dolby Vision and Atmos working right requires at best some very specific hardware/software choices, and at worst some things just don't work at all on PC (unless you are willing to do things like hackintosh).

The audiovisual space has just always been intensely proprietary, even something as basic as divx was something you paid for or pirated back in the v3.x or v4.x days. Same story as HEVC: Fraunhofer and Dolby have been grubbing for licensing fees for decades now, and you either pay it or spend some time working around it. And if you’re a corporation then you just don’t ship that feature in your product.

Apple also does a really good job implementing those features at the margins, Apple TV is a nice device. But even if you buy (eg) something like a shield tv it’s gonna be easier than tilting at getting Netflix to send you a 4k stream on windows 10 or Linux. At some level it is platform discrimination plain and simple - they just won't send full-quality video to "open" platforms and you can either deal with it or go full pirate. Sometimes they won't stream it even if you have widevine set up properly etc - and it's all completely arbitrary and they could break it tomorrow. Or you just buy an apple tv/shield tv.

Plex, Infuse, and VLC also let you interact with plex/jellyfin on the apple tv, and there really is no question about what it can decode, the answer is "basically everything you'd interact with as a consumer", short of weird stuff (idk about something like 4:2:2 or ProRes). Ironically the A17 also might be my fastest processor (in single-thread) at the moment, lol - definitely faster than my laptop, might be faster than my gaming desktop.

Sucks but that's how it is - to this day, AMD doesn't have working HDMI 2.1 support on linux, because HDMI Forum won't do an open license. You either have closed-source blobs or firmware that implement it, or you don't get the feature. These are hard problems, and people expect to be paid to stare at them. And once you spend the time getting all the software and hardware vendors and studios onto the same page and then implement it etc you're going to want to be paid for that too.

It is one of the real problems for the linux kernel imo - not everything can be GPL licensed, and (for obvious reasons) there is no escape valve for "ok but this vendor is really proprietary and the feature is really important...". When the unstoppable force meets the immovable object, what you get is both vendors shrugging their shoulders. You want to fuck with NVIDIA over whatever GPL symbols? Then I guess you won't have HDMI 2.1 on your platform at all, because AMD's not gonna win that battle on licensing. The bully approach worked with ZFS because ultimately there's diversity in filesystems, but if your hardware works best with HDMI 2.1 then you really don't have a choice in the matter. The very popular RDNA2 cards only have DP1.4, for example, so if you don't get HDMI 2.1 then you lose half the bitrate the card can support.

https://gitlab.freedesktop.org/drm/amd/-/issues/1417

Anyway you can tilt at that, or... pay $125 for a refurb ATV4K ethernet 3rd-gen and go watch a movie. It sucks but that's kinda the reality of the space - it's only troublesome if you aren't willing to grease some palms to make the system work as designed. Do you want to consoom or not?


This is not correct; the bitrates are actually ca. 2x that high. I don't have much YIFY - I went for RARBG since they released literally everything I cared about and I wanted a bit of uniformity: always with subs, bad releases got a PROPER fix, etc. But they both aimed at the same file sizes.

x265 1080p movies for both rippers are roughly 1.5-2GB per average movie, which is on average 1.5-2h long. You won't find better quality in this size range, in the same way 7-10GB seems the sweet spot for 4k (especially with AV1 instead of HEVC). Not sure who downloads those 80GB full bluray releases, or why.
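
For what it's worth, those sizes do imply roughly double the bitrate of the 500 MB/hour figure upthread; a quick check using the numbers quoted here:

```python
# Average bitrate implied by ~1.5-2 GB for a ~1.5-2 h movie
for size_gb, hours in [(1.5, 1.5), (2.0, 2.0)]:
    mbps = size_gb * 1000**3 * 8 / (hours * 3600) / 1e6
    print(f"{size_gb} GB over {hours} h -> {mbps:.1f} Mbps")  # → 2.2 Mbps each
```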


I do. But I’m on private trackers where almost all content is well-seeded and long-term seeding is rewarded, so download time is less of a concern.

Why? Because I want the highest-quality possible copy. Same reason I download lossless versions of albums. It’s less about being able to perceive the quality increase, and moreso about archival. Then my Plex server can transcode to appropriate streaming bitrates on the fly without having to re-encode an already heavily compressed video stream.


Man, I want the best quality too - I have all my music in FLAC, even on my phone. But even on a 75" screen from 4m I simply don't see the difference, so for me that's just chasing numbers.

And building and properly maintaining some not-so-small array of massive HDDs for those capacities... I just don't see the point when I can get all I need (and, e.g. in sound, much much more) without caring beyond buying a single HDD and being done for the next 5 years.

It's not the bandwidth - 1gbps is easy to get here, and the stuff I saw in those sizes was well seeded too - just the overall added extra work.


I guarantee you can’t hear the difference between FLAC and 256kbps lossy either. The point is that you have the actual, canonical data as it was released by the creator.

You don’t need a massive HDD array, I have my whole collection of 10+ years of Blu-ray remuxes on a single Seagate Exos drive in my NAS.


Meh not everyone has a 4K 120Hz TV :)

I still have an old 1080p 32" LG from 10 years ago and it looks fine. In fact I find the motion upscaling weird, it gives this creepy uncanny effect.

I still download H264 too because not all of my devices support H265 yet.

I was thinking about something new but they're just too expensive.


The first two paragraphs: you are me. The last one: I don't think about upgrading because my TV is off all the time. I watch it on a tablet or on my phone. I use a Raspberry with a TV hat to stream free to air TV on my home network and apps for the IP based content. My TV is the backup device.

About the subject of this thread: too many paid streaming services are maybe too inconvenient. If they are, the market will fix that. That is, some of them won't make enough money and will close, or sell to a competitor, or just create content and license it to streaming services.

My preference would be not to pay per month but per view. There are months when I don't watch any series and months when there is something I like. As for movies, maybe no movie at all for a year or two, then a few interesting ones, including old ones that I never watched and feel like watching. The last one was Gilda, 1946.


Yeah, same issue. Though I use Raspberry Pi as a backup when the playback doesn't work (it's not just H265, though others formats failing to work on the TV happens much more rarely).


I don't pay extra for 4K streaming in part because I just don't have the bandwidth each month to handle 4x larger video downloads.

Even with 1080p and 2-3 downloaded modern video games per month (which are all greater than 100GB now), I tend to get up to my 1TB bandwidth cap each month. I start paying extra after that.


What is YIFY?


A pirate group that was known for releasing movies in small files, but with decent quality.


YIFY encodes were notorious for being bitrate starved to the point where good SD encodes trumped YIFY "HD".


True, but the overlap in quality between low/high resolution and bitrate encodes is neither historical nor limited to YIFY.

As someone that grew up with VHS and VGA: 2-3Mbps h264 720p is fine, and there are few practical gains above 5Mbps 1080p.


If we're looking at the storage cost, I'd say the 10x decrease in flash price is doing a lot more than the maybe-2x decrease in file size.


Few people are building a flash NAS, though the price pressure from flash storage on HDD prices did play a big part. Before flash storage prices started putting pressure on HDD price per TB, larger drives had become more expensive per TB than smaller drives. We are, I think, just 3-4 years away from 8TB SSDs being cheaper than HDDs; for me that is the point where we will see mass SSD adoption for NAS drives.


I'd guess that the kind of person that makes a NAS wouldn't have been stopped by h.264 sizes. The comment I replied to seemed pretty focused on ordinary system drives.


AV1 doing 1080p at 3-4 Mbps will reach the point of diminishing returns imo for 1080p content. I can't wait till that becomes mainstream. It'll be the perfect archival quality/space efficiency.

Too bad even devices that claim AV1 support ... even some from the makers of said codec ... ahem, Chromecast w/GTV ... still stutter or fail.


Sorry to focus on such a detail, but VBR was never the standard in the piracy scene. Back in the day it was 192kbps.


When I quit pirating music, I remember the standard being either LAME -V2 or --alt-preset standard which both are VBR https://opentrackers.org/scenerules.org/html/2007_MP3.html


Maybe the metadata is wrong? Somewhere around 2011 it started to be reported as CBR, but the encoding options are -m j -V 4 -q 3 -lowpass 20.5


That's not how I remember it but it was a long time ago and memory is unreliable so I won't claim that my statement is a fact.

But I remember a period of time when basically every rip was VBR. 192kbps then took over when bandwidth and disk space were no longer concerns.


My memory of old what.cd's and today's Red snatch counters disagrees with you. FLAC ~ V0 > 320


Also AV1. I got a 1080p version of a movie encoded in AV1 and it looks pretty damn good at only 1.5GB. The x264 version I have at over 2GB has a lot of obvious artifacting by comparison.


> the privacy scene

lol

> Also I think the shift to SSDs from spinning rust delayed the adoption of widespread video piracy 10-ish years ago

Huh? 500GB (and more) drives became available in 2006 and by 2008 dropped below $150.

Also there was no demand for extra-high-wanna-see-every-original-pixel quality, so XviD releases with atrocious quality but a fine file size were common and were fine on 40-80-250GB drives in 2001-2005.

> you can create a large library very quickly

Music libraries and video libraries are different things. While most people 'store forever' the music they downloaded (well, they did back in the 2000s, probably not so much today), a video library is mostly useless for most people, because it's downloaded for one-time consumption.


A 500GB HDD, yes. I'm thinking of the person who had a laptop in 2010-2012 and either replaced their large capacity 2.5 inch HDD with a smaller capacity 128GB or 256GB SSD for the massive performance increase or bought one that had an SSD preinstalled. This would also be a shared drive so it isn't dedicated to media.

2010-2012 is also when most folks would have already upgraded to a HD display and so there was more demand for HD content.


Nah, by 2012 newcomers didn't buy SSDs en masse yet; USB drives were extremely common (I think my USB3 Transcend 25M3 is from that time) and some still used mobile racks, slowly replaced by USB dock stations.

Most of the time people replaced the HDD with an SSD in their laptops only to use an ODD caddy to keep the HDD in there.

By 2010 storage was not a problem, at all.

If anything, torrenting reached its peak ~2008 and has been on a slow decline (but not in absolute numbers, because the userbase is still rising) ever since, because by 2010 almost any music could be found on YouTube (popular music of course, not something obscure known by 3 greybeards on the whole planet) and even a lot of movies too. For the parts of the world where broadband was still costly or capped, you could always go to the store and buy a 5-in-1 DVD for $2 (talk about quality, eh). After that, the successful multi-year campaign by Apple to convince users that they don't need to own music and that it is totally okay to stream "your" music over cellular networks, along with the rise of the streaming services, moved torrenting into a niche of nerds (only lossless music!), nerds (248 variants of Star Wars) or nerds (anime).



