I absolutely hate "smart" TVs! You can't even buy a quality "dumb" panel anymore. I can't convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.

I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates will be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and constant data breaches of the manufacturers that threaten to expose every bit of personal data that they suck up. Not to mention increased sluggishness after tons of unwanted "improvements" are stuffed into it over the years, as the chipset ages and can no longer cope.

I'd much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace it after similar things happen to it in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even basic things like the YouTube and Chrome apps don't work anymore. Thank goodness I can set the HDMI port as the default input at start-up, so I never need to see the TV's native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I'm not locked into the Roku ecosystem: if they begin (continue?) enshittifying their products, there are tons of other options available at a similar price.

Most people don't replace their TVs every couple of years. Hell, my decade-old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and in all that time I've only had to replace the streamer driving it twice: Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it in my basement until it dies completely before replacing it. The Shield TV goes to the LG C2 so that I never have to see LG's craptastic UI.

Sorry, just felt the need to vent. I'd be very interested in reading the community's opinions on this topic.

[–] dingus@lemmy.ml 125 points 1 year ago* (last edited 1 year ago) (8 children)

You actually can buy quality dumb TVs, but you have to do the legwork and research what are often referred to as "commercial displays." I see them everywhere in businesses, showing ads and menus. They're sometimes a little pricier, but they're usually built a little "beefier" too, since they're expected to stand up to rougher usage in, say, a restaurant.

However, the other solution is the one you've already mentioned where you never plug the Smart TV into the internet, and instead bypass the "smart" on the TV with your own streaming boxes.

I think as more people realize there is a market for dumb TVs, you'll see that market grow until they're no longer just "commercial displays." Just gotta get enough people buying them and not buying smart TVs.

[–] notfromhere@lemmy.one 55 points 1 year ago (2 children)

I think if enough people never gave them Internet access, the manufacturers would start adding cellular modems to keep the data flowing (your viewing habits going out, ads coming in).

[–] beefcat@beehaw.org 39 points 1 year ago* (last edited 1 year ago) (2 children)

Having worked in this field, I can tell you how it usually operates: You want the most data for the least amount of investment. As soon as your operational costs start to eat into your already thin margins, the equation falls apart.

Complex solutions designed to capture data from the 1-3% of users who actively avoid it end up costing far more than that data is worth. To make this particular solution work, the tiny amount of data you get from those users has to cover the cost of putting a cellular modem in every TV you ship, plus the ongoing cost of paying various regional cellular networks to deliver that data to you. You're likely tripling or quadrupling the total cost of your data collection operation, and all you have to show for it is a rounding error. And that's before factoring in that these users probably aren't using the built-in streaming apps, so the quality of the data you get from them is below average.

[–] 1993_toyota_camry@beehaw.org 4 points 1 year ago (2 children)

The cheaper option would be to set up an ad-hoc TV-to-TV network. You might not let your TV talk to the internet, but I bet your neighbour does; and if not, their neighbour will.

[–] jarfil@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

The "Anti-Fraud Community Group" already thought of that:

https://github.com/antifraudcg/proposals/issues/17

Device mesh (Androids/Chromes) to share suspicious behavior

The proposal is to use the consensus between devices on genuine and suspect characteristics

A device should be able to query from a safe and reliable source if another device has performed (within a defined period of time) some malicious action similar to the one it is going to perform, so it could make the decision not to perform that same action, autonomously.

...just in case you wanted to install ~~an ad blocker~~ malicious software, or something.

[–] Leafeytea@beehaw.org 2 points 1 year ago (1 children)

I mean, our computers and phones already do something like this while looking for available WiFi networks, so maybe it wouldn't be that far-fetched. On the other hand, I just got a flashback to Jim Carrey in Batman and... "the box," for some bizarre reason! 😂

Since I live in a small space and game a lot, I invested in a gorgeous 4K monitor, and I honestly love how all my movies and games look, so I have zero issues. It would be nice to someday buy a large TV that didn't constantly search, scan, and update crap I don't want or need, but I'm not holding my breath that they'll reverse course.

It's amazing how Batman Forever predicted the then-future of television, up to and including most people trading in security/privacy for convenience.

[–] jarfil@beehaw.org 3 points 1 year ago

1-3% of users might not be enough people, but what is the break-even % of people to justify adding a cheap cellular modem? 5%? 10%?

You are likely not even doubling the cost of the data collection operation. We're talking under $0.50 in additional hardware per unit, with relatively low data usage requirements. The servers to collect that data are likely already more expensive, and you can easily sell user viewing habits for way more than $1/month/user. You can get a prepaid low-usage, data-only eSIM with global roaming for less than $5/year, and renew it only for the devices that never get hooked up to a user's WiFi. Even if it were only needed for 5% of users, or 1 in 20, you could still recoup the cost in roughly a year.

With a device life of 5+ years, that's definitely much more than a rounding error. Keep in mind the profits go directly to the manufacturer, so they're measured against the factory cost of the product, not its MSRP... which is pretty much why all manufacturers have jumped onto the data collection bandwagon in the first place.
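A quick back-of-the-envelope sketch of that break-even claim, in Python. Every figure is an assumption taken from this comment ($0.50 modem in every unit, ~$5/year eSIM for offline units only, ~$1/month of sellable data per reached user, 5% of buyers offline), not industry data:

```python
# Rough payback model using the thread's assumed numbers.
modem_cost   = 0.50   # one-time cost, paid for EVERY unit shipped
esim_per_yr  = 5.00   # recurring, only for units never put on WiFi
data_per_yr  = 12.00  # ~$1/month of sellable viewing data per reached user
offline_rate = 0.05   # share of buyers who never connect the TV themselves

units   = 100                       # model a batch of 100 TVs
offline = units * offline_rate      # 5 sets reached only via the modem

upfront    = units * modem_cost                     # $50 for the whole batch
net_yearly = offline * (data_per_yr - esim_per_yr)  # 5 * $7 = $35/year

print(f"payback time: {upfront / net_yearly:.1f} years")  # ~1.4 years
```

With exactly $1/month of data value the payback is about 1.4 years; at the "way more than $1/month" the comment suggests, it drops under a year. Either way, over a 5+ year device life the modem would pay for itself several times over.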

[–] nickwitha_k@lemmy.sdf.org 6 points 1 year ago (1 children)

That's what they do with CPAP machines.

[–] furrowsofar@beehaw.org 2 points 1 year ago

My CPAP is always in airplane mode. Hopefully solved that problem.

[–] thejml@lemm.ee 35 points 1 year ago (2 children)

I feel like the market is only going to grow at the top end: the audiophile/videophile segment, with large, high-quality panels and top-end feature sets.

The low end tends to be partly subsidized by the “smart” features. Think TVs that show ads in the menu, or Amazon and Google home screens that push you toward their services because they’re “easy” and “right there,” in the hope that people will subscribe. Couple that with the “feature” that the smarts are already built in, saving buyers of cheap TVs an extra box and purchase, and I don’t see it going away anytime soon.

[–] tiramichu@lemm.ee 38 points 1 year ago* (last edited 1 year ago) (2 children)

Exactly this.

Manufacturers are NOT INTERESTED in selling low-cost dumb TVs when they can sell smart TVs and get long-term returns. They are even willing to sell the TVs at cost, because they will monetise them later through ads and selling your data.

Manufacturers don't want you to have a dumb TV; they want everyone to go smart. That's part of why business-targeted dumb panels are priced higher: to disincentivise regular end-customers from buying them.

[–] brihuang95@sopuli.xyz 12 points 1 year ago (1 children)

oh...is that why all these nice smart TVs are so affordable these days?! damn!!

[–] tiramichu@lemm.ee 10 points 1 year ago (1 children)

Normal manufacturing efficiencies and cost reductions are surely the biggest reason they're cheaper now, but the data monetisation is absolutely a factor.

So many companies in so many industries are trying to move from being product companies (make money selling a thing) to being service companies (make money from subscriptions, user data and other monetisation) and I'm doing my damnedest to keep away from any of it.

[–] Teppic@kbin.social 6 points 1 year ago

It could get interesting with right to repair, that probably includes the right to load custom firmware...

[–] upstream@beehaw.org 4 points 1 year ago

There’s no downside to selling a smart TV to someone who doesn’t want one or doesn’t use the features.

The features we “want” from modern TVs, like Dolby Vision and all the processing they do to the image to make it stand out in the store, require a significant amount of processing power anyway.

It’s simply better business to sell smart TVs to everyone than to make dumb TVs that compete for a tiny fraction of the market, when people buy smart TVs in every price segment.

[–] amju_wolf@pawb.social 4 points 1 year ago (1 children)

The paradox being that if there were "premium" smart TVs for people like us - with proper support, privacy, customization options, and no crap like ads - we'd probably buy them, and pay a premium to do so.

But that's just too much work for them and they probably don't even realize that kind of market exists.

[–] averagedrunk@lemmy.ml 3 points 1 year ago (2 children)

I think they know it, I just don't think they care. It's a niche market. On top of that, they'd have to convince the people in that market to trust them.

If they can get a 10-20% return on 10,000 Smart TVs, why waste the effort on properly developing and supporting 3 PrivaTVs (patent pending, exclusions apply, see your local drunk for details)?

I could be wrong, I just don't think the market is large enough that they'd be willing to throw manpower at it.

[–] amju_wolf@pawb.social 2 points 1 year ago

I think you're right mainly in that what they're doing now is sure and easy money. Why risk it, right?

[–] jarfil@beehaw.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

I think people underestimate the value of their tracking data. For a manufacturer, the benefits over the lifetime of the device can be way higher than 20% of the manufacturing cost.

They could still develop and support those 3 PrivaTVs, but the MSRP would easily be a few times higher than that of an equivalent Smart TV.

[–] averagedrunk@lemmy.ml 1 points 1 year ago (1 children)

How many times higher? If a regular TV costs $100 and they make 20% on each, that's $200k across 10,000 units, plus the marketing data. The three PrivaTVs would each need nearly a $67,000 markup for the same return.

Obviously these are made up numbers for illustration. I think that for big manufacturers it's not worth it for the return and amount of effort they would need to spend. Maybe a small manufacturer could do it. Maybe that would spur the big guys to buy them out and take it over once the hard work is done.
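Worked through explicitly, for what it's worth (a sketch only; the $100 price, 20% margin, and unit counts are the thread's made-up figures, not market data):

```python
# Matching the profit of a big smart-TV run with only 3 privacy-focused
# sets requires an absurd per-unit markup. All numbers are illustrative.
price, margin = 100, 0.20   # $100 TV, 20% return per unit
smart_units   = 10_000
priva_units   = 3

smart_profit = price * margin * smart_units   # $200,000
markup_each  = smart_profit / priva_units     # ~$66,667 per PrivaTV

print(f"required markup per PrivaTV: ${markup_each:,.0f}")
```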

[–] jarfil@beehaw.org 1 points 1 year ago (1 children)

A TV manufacturer doesn't need to develop that PrivaTV from scratch; they can take their SmartTV and just rip out the Smart part, for a much lower markup.

A big manufacturer would actually have it easier: they'd just need to make "privacy" a selling point, then slap a "Private" sticker where the "Smart" one goes.

Hopefully, with the right to repair, we might also see people ripping the smarts out of a SmartTV themselves, possibly by just flashing updated firmware, and that might convince manufacturers to give it a go too.

[–] averagedrunk@lemmy.ml 1 points 1 year ago

I was specifically talking about what the original commenter said.

with proper support, privacy, customization options and no crap like ads

Dumb TVs are already a thing as mentioned elsewhere. Commercial Displays cost more but you can beat someone to death with them and they'll still work.

I'm with you on hoping for more options. I'd hate for my next TV purchase (hopefully years from now) to be forced online under the guise of firmware updates to steal my viewing habits.

[–] beefcat@beehaw.org 27 points 1 year ago

They aren't very good though. They are durable, but usually expensive and missing a lot of features you might actually want for that price tag. For example, I've yet to find any OLED "commercial displays" that support Dolby Vision, VRR, and eARC.

It's way cheaper and easier to just buy the TV you want and not connect it to your wifi.

[–] shininghero@kbin.social 14 points 1 year ago (2 children)

Computer monitors should work too, and they're more readily available. Just dig through the business-oriented monitors and ignore the gaming ones, since cable providers aren't really going to offer anything that can take advantage of >60 Hz refresh rates anyway.

[–] dingus@lemmy.ml 9 points 1 year ago* (last edited 1 year ago) (2 children)

My personal experience with computer monitors is that they work great, except they always seem to cheap out on the speakers when they have them built in. Tiny, tinny things whose volume is always way too low.

I don't mind having separate speakers, but once in a while it would be nice to not need them.

[–] Mac@mander.xyz 11 points 1 year ago (1 children)

I don't think I've ever heard what my TV speakers even sound like. I've never used them.

[–] Frederic@beehaw.org 2 points 1 year ago

Same; I don't think I've ever used them. When I bought my latest TV, I already had my good old 5.1 system.

[–] FiniteLooper@lemm.ee 7 points 1 year ago (1 children)

Even on a high end TV the speakers are going to be bad. It’s just there to check a box. TVs are so thin that you cannot physically fit in speakers large enough to sound good.

A cheap sound bar will make a huge improvement to audio quality over any built in speaker system.

[–] dingus@lemmy.ml 4 points 1 year ago* (last edited 1 year ago) (3 children)

Right, but at that point, may as well just invest in a fucking PC monitor. Like what else is a TV really bringing to the game that a monitor can't?

Like, if they can't put in speakers worth a damn, what's the point of even including them?

[–] sodypop@beehaw.org 4 points 1 year ago

Like what else is a TV really bringing to the game that a monitor can’t?

A tuner and a remote control.

[–] BorgDrone@lemmy.one 3 points 1 year ago

Size and picture quality.

[–] natebluehooves@pawb.social 2 points 1 year ago

Find me a reasonably priced 70” monitor and i will hail you as the next coming of christ. That is the holy grail for me.

[–] Hamartiogonic@sopuli.xyz 1 points 1 year ago

Since I’m going to be skipping the TV part with my HTPC anyway, why not simply use a computer monitor? Nowadays you can get a 40+” monitor, and that should be big enough for most people. These things might not even have speakers, so you may need to plug in an audio system to make it all work.

[–] Banzai51@midwest.social 11 points 1 year ago (1 children)

The other option is to buy the smart TV, turn off the networking, and hook it up to a Shield, Apple TV, or Roku. All those box makers are going to support the devices longer than TV manufacturers, and the streaming apps can't ignore them.

[–] brihuang95@sopuli.xyz 4 points 1 year ago (1 children)

so is using something like an Apple TV or Roku box actually more secure than just using the apps directly on the TV?

[–] Nawor3565@lemmy.blahaj.zone 8 points 1 year ago

Yes, because streaming boxes can be upgraded independently of the TV and so you can always have hardware that's actively supported. My old Roku 3 was still getting updates as of a few years ago, while my "smart" TV from 2015 stopped getting security updates long ago.

[–] Frederic@beehaw.org 10 points 1 year ago* (last edited 1 year ago) (1 children)

Last time I looked for a commercial dumb TV, a Sharp was like $4000 for a 65" 1080p panel or something :-/

[–] clgoh@lemmy.ca 1 points 1 year ago (1 children)
[–] Frederic@beehaw.org 1 points 1 year ago

Not bad. I'm in Canada, so I'm wondering if I could find it here, but I'd want the 75" one, at about $2k US. I guess a Sony from Costco would have a better picture.

[–] storksforlegs@beehaw.org 5 points 1 year ago

This is good to know, thank you for the info. I am getting worried about my increasingly old TV (15+ years) and I do not want a smart TV to replace it.

[–] GenderNeutralBro@lemmy.sdf.org 5 points 1 year ago (2 children)

However, the other solution is the one you’ve already mentioned where you never plug the Smart TV into the internet, and instead bypass the “smart” on the TV with your own streaming boxes.

I did this for a long time on my old Vizio TV, but the experience was notably worse with external devices than with the built-in apps, due to the limited framerate support over HDMI. This led to awkward juddering when, e.g., trying to play 23.976 fps movies with only 30 Hz or 60 Hz output. It also meant built-in video features like motion interpolation did not work effectively.

I guess this is less of an issue today with VRR support on high-end TVs, but still, a lot of devices you might connect to a TV don't support VRR.
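The judder has a purely arithmetic cause. A small sketch of it (nothing device-specific here, just the frame timing math):

```python
# Why 23.976 fps content judders on a fixed 60 Hz output: 60 / 23.976 is
# not a whole number, so film frames can't all be held for the same number
# of refreshes. Players fall back to 3:2 pulldown, alternating 3- and
# 2-refresh holds, so some frames stay on screen 50% longer than others.

fps_content = 24000 / 1001   # 23.976... fps (NTSC film rate)
hz_output   = 60             # fixed 60 Hz panel/signal

print(f"refreshes per film frame: {hz_output / fps_content:.3f}")  # ~2.5

# On-screen time of each film frame under a 3:2 cadence:
for hold in (3, 2, 3, 2):
    print(f"held {hold} refreshes = {hold * 1000 / hz_output:.1f} ms")
# Alternating 50.0 ms / 33.3 ms pacing is the visible judder; a 24 Hz
# (or VRR) output would hold every frame an even ~41.7 ms instead.
```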

[–] beefcat@beehaw.org 6 points 1 year ago* (last edited 1 year ago) (1 children)

Your streaming box was either not configured properly, or was very low cost.

The most likely solution is that you need to turn on a feature on your streaming box that sets the output refresh rate to match that of the content you are playing. On Apple TVs it is called "match frame rate". I know Rokus and Android TV devices have similar options.

Newer TVs can detect when 24 fps content is being delivered in a 60 Hz signal and render it to the panel correctly, but this usually doesn't work if you have the selected input set to any of the low-latency modes ("Game", "PC", etc.).
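For the curious, that detection is usually done by looking for the 3:2 repeat cadence in the incoming frames. A toy sketch of the idea (not any vendor's actual algorithm; the "frames" here are stand-in numbers rather than pixel data):

```python
# 24 fps film in a 60 Hz signal arrives via 3:2 pulldown: each film frame
# is repeated for 3, then 2, video frames. A TV can spot this cadence by
# checking which consecutive frames are (nearly) identical.

def run_lengths(frames, tol=1e-6):
    """Collapse the frame sequence into run lengths of repeated frames."""
    runs, count = [], 1
    for prev, cur in zip(frames, frames[1:]):
        if abs(cur - prev) < tol:   # same frame shown again
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

def looks_like_pulldown(frames):
    """True if frames repeat in runs of 3s and 2s (the classic 3:2 cadence)."""
    runs = run_lengths(frames)
    return all(r in (2, 3) for r in runs) and 3 in runs

# Film frames 0..4 carried in a 60 Hz stream with 3,2,3,2,3 holds:
video = [0, 0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4, 4]
print(looks_like_pulldown(video))  # True -> TV can reassemble the 24p source
```

Real implementations compare actual (downscaled) pixel data and have to handle cadence breaks at edits; the low-latency Game/PC modes skip this kind of frame analysis entirely, which is why the trick stops working there, as noted above.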

[–] GenderNeutralBro@lemmy.sdf.org 1 year ago

Good to hear newer devices support this.

My experience was from quite a few years ago (2015-ish). At that time, no device I tried had such a feature, including a few brands of Android phones, Fire TV sticks, and MacBooks. I remember digging through the documentation of other devices at the time hoping to find something better, with no luck. That said, documentation was pretty poor all around, so who knows? The most useful info I found was in threads on the VideoHelp and AVS forums, where other users reported similar issues on various devices. Android TV was still very new and very shitty back then.

At this point I would simply not buy anything that doesn't support VRR.

[–] TJmCAwesome@feddit.nu 4 points 1 year ago (2 children)

This is one of the downsides of the widespread adoption of HDMI; it has quite a few. Something like DisplayPort would be better, but it's far less common. Such is life.

[–] beefcat@beehaw.org 5 points 1 year ago* (last edited 1 year ago)

How is this a downside of HDMI?

It sounds to me like the user's TV or streaming box was configured incorrectly. DisplayPort doesn't magically remove judder from 24 fps content being rendered into a 60 Hz signal.

DisplayPort never saw widespread adoption in the home theater space because it never tried to. The standard is missing a ton of features that are critical to complex home theater setups but largely useless in a computer/monitor setup. They aren't competing standards, they are built for different applications and their featuresets reflect that.

[–] GenderNeutralBro@lemmy.sdf.org 3 points 1 year ago (1 children)

Newer revisions of HDMI are perfectly good, I think. I was surprised and dismayed by how slow adoption was. I saw so many devices with only HDMI 1.4 support for years after HDMI 2.0 and 2.1 were in production (probably still to this day, even). It's the biggest problem I have with my current display, which I bought in 2019.

[–] beefcat@beehaw.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

GP's problem probably isn't even bandwidth; more likely they need to enable their TV's de-judder feature, or configure their streaming box to set the output refresh rate to match the content being played.

[–] GenderNeutralBro@lemmy.sdf.org 2 points 1 year ago (1 children)

VRR support came with HDMI 2.1.

You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

[–] beefcat@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

VRR is really meant for video games.

You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

That's interesting. Pretty much every Blu-Ray player should support this. I can confirm from firsthand experience that Apple TV, Roku, and Android TV devices also all support this. I can't speak for Amazon's fire stick thingy though.

The feature you are looking for is not to manually set the refresh rate, but instead for the device to set it automatically based on the framerate of the content being displayed. On Apple TV it’s called “match frame rate”.