Who introduces this reviled junk NOW, after years of mockery, and on a media player? Anyone who actually wants this (or doesn't care) already has it on his TV, making its addition to individual sources stupid and redundant.
Reading between the lines slightly, it sounds like this affects TVs running Roku software for inbuilt "smarts" rather than the media players themselves. Still not great, of course.
Anyone who allows their TV to connect to the internet deserves what they get.
I connected my TV to the internet a while back to test some Wi-Fi streaming, and after a few days I noticed how many queries my DNS server had blocked. So, I kept it running for a few more days. On average, it blocked more than 15k queries a day! And that's with the TV idle and/or turned off, so I wasn't running any YouTube, Netflix, or similar.
If you're just looking at the number of requests blocked, that can be highly deceiving. A lot of these "smart" TV OSes (and other embedded devices) will very rapidly spam the DNS server with no backoff when they don't get the expected response.
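A quick way to tell the difference is to tally blocked queries per domain instead of just counting them. A minimal sketch, assuming a Pi-hole/dnsmasq-style log line like "... gravity blocked tracker.example.com is 0.0.0.0"; the path and parsing here are illustrative, so adjust them for whatever your resolver actually writes:

```python
# Tally blocked DNS queries per domain, so no-backoff retry spam
# shows up as a few domains with huge counts.
from collections import Counter

blocked = Counter()
with open("/var/log/pihole.log") as log:  # illustrative path
    for line in log:
        parts = line.split()
        if "blocked" in parts and parts.index("blocked") + 1 < len(parts):
            blocked[parts[parts.index("blocked") + 1]] += 1

# 15k hits on a handful of domains is a retry loop, not 15k trackers.
for domain, count in blocked.most_common(10):
    print(f"{count:7d}  {domain}")
```

If the top two or three domains account for nearly all the hits, you're mostly looking at retry spam.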
Some of these TVs also hardcode DNS IPs. If your router supports it, check the outgoing traffic from the TV to the standard DNS ports.
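If your router can't report that directly, any box that can see the TV's traffic (a mirror port, or a bridge the TV sits behind) can flag a hardcoded resolver. A sketch using scapy; the addresses are made up, and it needs root:

```python
# Flag DNS packets from the TV that bypass your own resolver,
# which is the signature of a hardcoded DNS server.
from scapy.all import IP, UDP, sniff  # pip install scapy

TV_IP = "192.168.1.50"       # illustrative: your TV
MY_RESOLVER = "192.168.1.1"  # illustrative: the resolver you run

def check(pkt):
    if IP in pkt and UDP in pkt and pkt[UDP].dport == 53:
        if pkt[IP].src == TV_IP and pkt[IP].dst != MY_RESOLVER:
            print(f"hardcoded resolver? {pkt[IP].src} -> {pkt[IP].dst}")

sniff(filter="udp port 53", prn=check, store=False)
```

Note this won't catch DNS over HTTPS, which looks like ordinary TLS traffic on port 443.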
This is the way.
If you went into a data center with a random device from Amazon and asked to plug it in to see what happens, a sysadmin would slap you silly.
At home, folks are content to plug in whatever. Not only do you have no idea what it's doing now, you have no idea what it MIGHT do in the future.
https://www.theverge.com/23573362/anker-eufy-security-camera... ... Anker was a pretty respectable brand, yet their marketing department outright lied about a device where privacy is critical.
Control your network and trust nothing you plug into it... It won't stop, so cutting it off is the order of the day.
I said the same thing, until I wanted to connect to my private Jellyfin server.
Make a separate network for these devices which does not have access to the internet. Add your personal media servers to it and give the TV and other untrustworthy devices access to it. You get to access your media without having the devices spy on you and without having them auto-update. You can have several such networks for different purposes, e.g. 'internet of things', 'media' and 'systems management', to name a few.

All this takes is a reasonably capable router - OpenWRT in my case, running in a container on the server-under-the-stairs - and (if you use separate access points, which is a must when using a virtual router) a managed switch which can handle VLANs. For access points I use repurposed WiFi routers running OpenWRT with their routing functionality disabled. I went this route because I live on a farm with several access points spread out over the premises, but the same can be achieved with a single WiFi router running OpenWRT or similar; in that case, just add a number of interfaces and wireless networks ('media', 'iot', 'ops', etc.) to which you connect your devices.
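Once it's set up, it's worth confirming the isolation actually holds rather than trusting the firewall config. A smoke test you can run from any device on the untrusted network; the addresses are illustrative, and 8096 is Jellyfin's default HTTP port:

```python
# Run from a device on the 'media' VLAN: the media server should be
# reachable, the internet should not be.
import socket

def can_connect(host, port, timeout=3):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

assert can_connect("192.168.20.10", 8096), "media server unreachable"
assert not can_connect("1.1.1.1", 443), "VLAN leaks to the internet!"
print("isolation looks right")
```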
I agree that using the default OS on a smart TV means riddling oneself with ads and tracking.
But it is not common wisdom, and saying everyone deserves it is unproductive and lacks empathy.
I deserved it when I started getting ads on Twitch after re-enabling updates for it for no particular reason, because I know better. My dad who wants to use Criterion on his smart TV but can't figure out how to use a smartphone camera? Different story.
I loathe ads on my Roku TV, but free news channels and Jellyfin make it worthwhile in my case.
There's nothing inherently wrong with built-in Netflix etc.
There is, if you desire control of your television.
If you have no problem with Roku et al. pushing updates to your TV, then by all means use integrated software.
A separate streaming device, or a computer, has the same issues. Unless a TV screen is special here, it sounds like this logic won't let me use any streaming services.
I'm not saying that other people should go the route that I did, I'm just adding to the conversation.
I wanted to make my "smart" TV "dumb" without losing smart TV functionality, so I bought a mini PC and installed Kubuntu on it. Normally I don't use KDE, but it is a very highly customizable DE, so I customized the hell out of it to make the desktop look like a typical smart TV launcher page. I created launchers for streaming services, YouTube, etc., and bought a USB remote control from Amazon.
Now I have a Linux machine that is entirely under my control, runs FOSS software, and runs streaming services in a web browser instead of installing proprietary software on my TV. I also have way more features, like being able to use VLC to play local media, so it supports pretty much every file format ever (and I have a lot of older video files from decades ago, so that's nice).
My "smart" TV is no longer connected to any network, but I can still do every smart TV thing that I want to and more.
Unless things have changed, a Linux machine will not give you a high bit rate picture. Netflix requires DRM and will only give you 720p.
So you get freedom, but a worse product. Which is a trade off not all would be willing to make.
There are also a few major streaming services which require elevated levels of DRM, preventing them from being used on Linux.
I've not had issues with any streaming service on Linux. Disney+, Netflix, Prime Video, and Shudder all work fine for me. If it can be streamed in a browser on any OS, it works on Linux. Typically your browser just needs to have Widevine installed.
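If you're not sure whether your browser has the CDM, you can look for the library on disk. A rough check; install locations vary by browser and distro, and these globs only cover common Firefox and Chrome spots:

```python
# Look for libwidevinecdm.so in a couple of common locations.
from pathlib import Path

patterns = [
    (Path.home(), ".mozilla/firefox/*/gmp-widevinecdm/*/libwidevinecdm.so"),
    (Path("/opt/google/chrome"), "WidevineCdm/**/libwidevinecdm.so"),
]
found = [p for base, pat in patterns for p in base.glob(pat)]
if found:
    for p in found:
        print("Widevine CDM:", p)
else:
    print("No Widevine CDM found in the usual places.")
```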
As for Netflix quality, I've not noticed any quality issues at 1080p but I've not measured the stream to confirm if that's what they're actually streaming.
A streaming box can be factory reset or thrown in the trash. If your 65 inch TV doesn't let you roll back an update, you have to buy a new TV. Streaming boxes are usually faster and more feature-filled than the built-in TV stuff as well.
Without connecting my TV to the Internet for firmware upgrades, I would be missing out on several fixes and improvements, especially in the handling of higher frame rates and VRR, and IIRC at least one fix in the HDMI ARC subsystem for a bug that was causing audio drops.
Interesting! I haven’t connected my smart LG to the net since it was new, and it handles everything I throw at it.
I prefer to use an external box as opposed to the built-in apps, to have a more responsive interface, to keep the TV as clean as possible, and to have control over choosing to update the TV or not.
What brand is the one you’re talking about?
Which TV manufacturer is shipping out buggy TVs that require updates?
I imagine all of them. This is all controlled in software/firmware and as such is bound to have bugs. There are also updates that make better use of the hardware.
Then use USB offline update?
Samsung: https://www.samsung.com/us/support/answer/ANS00062224/
Vizio: https://support.vizio.com/s/article/Firmware-Information-215...
TCL: https://support.tcl.com/rokutv-common-questions/01-how-to-ma...
Name the specific one you're talking about. Either there is a manufacturer we should all avoid, or at least a model, or you made it up.
Lol ok so I'm making it up, then. I don't owe you anything, dude. You can believe whatever you want, I don't give a single f.
And now we're getting into much more technical details and why I said "inherently".
A proper built-in box will let you bypass it.
That's exactly the problem. There's no guarantee that the built-in will be and remain "proper" with updates. It's like saying we can trust our government with encryption backdoors as long as they behave properly.
My perspective: the incentive for manufacturers is to generate as much revenue as possible. In hardware manufacturing, margins are extremely tight.
Consequently, any additional revenue from selling customer data is incredibly attractive.
Especially if you can do so better/faster than your competitors, thereby competing at the same retail price points with better net margins.
As a result... they have every incentive to fuck their customers over from a privacy perspective.
And what they're doing is invisible.
And this is an entity people are comfortable trusting with unfettered access to their devices/network?
Sorry, I was unclear.
By 'bypass', I mean something that does not require the cooperation of the box.
I should have said the proper way of adding it will let you bypass the box, not phrase things as if the box is the gatekeeper.
I'd rather my TV get updates than have it end up as part of a botnet. The recent RCE in ssh shows nothing is immune.
> The recent RCE in ssh shows nothing is immune.
An unplugged network cable.
For myself, I’ve never had a reason to put it online.
If you’re using the TV online, then yes, get updates, but if you never let it have network access then this isn’t a deep concern.
Having an internet-connected TV running Android is great. With this, I can download and install SmartTube, and then use it to watch YouTube without ads or sponsor segments. I can also install Jellyfin, so I can watch various media served by my PC.
You're better off using an Nvidia Shield for that. You can swap that or the TV out independently as tech changes. You can also plug drives into it and play whatever media you want.
Except that my TV already has a computer running Android inside, so why would I want to spend a bunch more money on an additional device that just does the same thing?
If I could have saved a little money by getting a 65" monitor, this advice would make a lot of sense. But that's just not an option these days. You're getting a built-in computer whether you want it or not.
As for plugging in drives, it looks like you can only do that with either microSD or USB. Either way, doing that for every item I want to watch is very inconvenient compared to just using Jellyfin, which lets me browse everything on my PC.
Not at all. I have an SSD plugged into my Shield, to which I can copy whatever I want over my network. People also like to install torrent-based clients for illicit streaming services, although I don't consider that worthwhile.
Your observation about every TV forcing built-in junk on you is legit; but I use a projector, which doesn't have any of that bullshit. Every source I have goes through my receiver, which is another reason to use a media player that's not your TV.
The whole thing is a sad commentary on the state of A/V and people's laziness today. They're buying giant TVs but settling for the shitty audio from their built-in speakers... or fooling themselves with a sound bar.
I have a sound bar. I'm not fooling myself. It's much better than the built-in speakers (though my soundbar has a small subwoofer, so that helps a lot). Is it equivalent to a real A/V receiver/amplifier with large speakers? Definitely not, but that setup costs a LOT more money, plus I'm limited in how much I can turn up the volume anyway because of my neighbors.
The projector thing is a no-go for many people. They cost a lot of money, the bulb inside has limited life and is very expensive (and makes a lot of heat too), but worst of all, they just aren't bright enough for many viewing environments. I would never be able to see it in my apartment, except late at night.
There is also nothing inherently wrong with connecting random Windows machines directly to the Internet.
It’s still not going to work out in a good way over any significant timeframe.
Better article, with more details: https://www.theverge.com/2024/6/12/24177117/tcl-roku-tv-moti...
And Verge coverage submitted earlier... by OP?
https://news.ycombinator.com/item?id=40875351
From the article: "[Maybe] Roku is operating with such large numbers these days that every decision is a fly-by-wire corporate abstraction far from the bare metal of the user experience. This means upgrades and features gain unstoppable internal force and the only thing that can stop them is the immovable object of financial results months or years later."
This is the most optimized feedback loop: one without a user. See all the major tech companies for additional examples, from AMZN to X.
I like the other explanation: Roku is basically an appliance company with no internal culture of cinema, so the issue just never came up, and now they're dealing with all this stuff they didn't realize had anything to do with what they sold.
They could be selling video streaming, or video poker. They don't care, as long as they have subscribers.
I wouldn't be surprised to learn that Roku is managed by an AI.
What I don't understand is why cinephiles would be streaming through a Roku anyway. Shouldn't they be downloading 18K TURBO-HD straight from the Academy's server?
You don't have to be a cinephile to not want movies to look like soap operas.
If there is a video processing technology to make actual soap operas not look like soap operas then that could probably be used to make soap operas watchable.
> They could be selling video streaming, or video poker. They don't care, as long as they have subscribers.
first time dealing with capitalism, eh?
I used to make the joke, "Roku is the worst thing to happen to TVs since motion smoothing." Now it needs a rewrite.
I bought a Roku years ago, when Hulu was a viable alternative to Netflix, and was very surprised to discover that we had access to fewer shows on Hulu after subscribing to Hulu (and paying for the privilege) through Roku. Simply put, if you were just watching Hulu anonymously for free then you had access to more shows than if you were paying for Hulu through Roku.
Probably the fastest product return I have ever made.
I haven't noticed anything on my TCL with built-in Roku that I use as my computer display, but that's because I have game mode on all the time. I just checked by disabling game mode, and I'm pretty sure at least mine's not affected.
55R615 model
I wonder if this is only enabled for certain content?
I also have a TCL TV and would definitely notice this effect. I typically use the Movie picture mode, so you'd expect it to be triggered, but it doesn't seem to be present on YouTube or other streaming sources.
My 75” TCL running Roku seems to be unaffected. Thank goodness. I despise that (field rate) soap opera look.
If you don’t control the software, someone else does.
The latest in a long line of utterly stupid decisions by Roku demonstrating their lack of competency.
Why anyone willingly chooses Roku anymore is beyond me.
Perhaps The New York Times recommending it has something to do with it:
https://www.nytimes.com/wirecutter/reviews/best-media-stream...
Isn't a higher frame rate good? You wouldn't want a video game to intentionally limit itself to 24 fps. Isn't motion smoothing more realistic?
I think you're right. I used to be an fps purist like everyone in this thread, but now I have a much more unpopular opinion: I think the frame rate enhancing features of TVs are a good thing. Hear me out.

The only reason 24 fps is popular is that it's what people got used to in the early days of cinema. Objectively speaking, 24 fps is a horrible choice of frame rate for movies; it's terrible at conveying fast motion properly. We no longer have technological barriers that limit us to 24 fps. The main reason we're sticking with it is that older people associate 30 fps and higher frame rates with soap operas, which were shot at 30 fps. Young people don't really have this association and don't mind high frame rates; they're used to the 60 fps video shot by phones.

So I think if we could all agree to bite the bullet and get used to high frame rates, we could finally move forward to proper high-fps cinema, and we can do that by enabling the frame rate conversion features TVs already have. So please, just turn it on and get used to how it looks. It's for the betterment of mankind.
A really good video on the subject: https://youtu.be/_KRb_qV9P4g?si=x0pmLkBhLYXud4G0
The gist is that, for animation, frame interpolation messes with intended timing and can produce incoherent images on interpolated frames and odd frame rate issues for certain kinds of animations. Interpolation can thus cause animations to lose their punch and feel wrong.
While interpolation may be nice for live action films, it should still be an option to turn off.
He only talks about animation, and even then it's one specific piece of software and one method that does a really bad job. It's a strawman argument.
Please try watching the opening sequence of Aliens with motion smoothing enabled sometime.
Without motion smoothing, the model shots still look reasonably convincing 40 years later.
With motion smoothing, it looks like a low-budget B-movie.
I don't mind if a film is shot at a higher frame rate and then displayed that way, but interpolating additional frames looks terrible, and I don't know why any manufacturer turns it on by default anymore.
While I agree that more content, especially fight (or panning) scenes and sports, should probably be more than 24 fps, I can't agree that any interpolation would be able to provide sufficient quality.
Let's remove that feature to get rid of that dark, dark stain on high-FPS content and let platforms and content creators decide where it fits and where it doesn't.
The content creator should decide the frame rate. Higher real fps I can get behind, but I don't want a third-tier middleman making up fake frames.
This video is laced with profanity and focused specifically on animation, but it gives a pretty good overview of why interpolation does not inherently make things look better: https://www.youtube.com/watch?v=_KRb_qV9P4g
Higher frame rates are good, when you actually get more frames. But motion smoothing is a fraud. It interpolates additional frames, and in order to make things look consistent, it has to apply additional filtering to the original frames. The result is that you get less information overall, not more.
Even worse, it tends to ruin production values when it's a film with a DoP who knows how to exploit the characteristics of film.
It's like interlaced video. Yeah, you get motion that looks like 50 or 60 fps, but the actual information is still only 25 or 30 fps, and it's degraded due to the effects of interlacing.
Motion smoothing is this century's interlacing, and in a few decades we'll have archivists running video through a motion desmoothing counterpart to QTGMC.
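To make the "less information" point concrete, here's a toy version of the naivest possible interpolation, pure frame blending. Real TV processors do motion-compensated variants, but the in-between frames are still synthesised from the same two captured frames:

```python
# A white square moves right between two frames; the "interpolated"
# frame is their average, which produces ghosting, not motion.
import numpy as np

def blend_midframe(a, b):
    return ((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(np.uint8)

a = np.zeros((64, 64, 3), np.uint8); a[24:40, 8:24] = 255
b = np.zeros((64, 64, 3), np.uint8); b[24:40, 16:32] = 255
mid = blend_midframe(a, b)

# Instead of a full-brightness square halfway along, we get two
# half-bright squares: pixel values 0, 127 and 255.
print(np.unique(mid))  # [  0 127 255]
```

Motion-compensated interpolators avoid the ghosting by guessing motion vectors, but when the guess is wrong you get the smeared, rubbery artifacts people associate with the soap opera effect.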