Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
I know the rule of thumb is to go with a Vizio if the budget is less than $1k, but I just bought a Sony KDL-55W950B. It has the lowest input lag of any HDTV (17ms), passive 3D, and decent enough sound to last me until I buy a soundbar. I was gonna go with the 2015 model that comes out in a few weeks, but it has active 3D at 120 Hz, which isn't something I want. I tried to convince myself to get the Vizio but it just didn't look anywhere near as good as the Sony.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Aphrodite posted:

That's not a mistake or anything. With a model number like W950 I assume that's one of Sony's top end TVs, so yeah that's going to be very nice.

The W series is mainly average except for the W950B, I think. I've seen the W800 for $200 less, but it doesn't have 3D and I don't think the screen is edge lit. Dunno if it's an IPS panel, either. I wanted to get an XBR, but 4K at 55" isn't worth the extra $500 IMO. The LG OLED is the only other 1080 set I was even considering, but the Sony is half the price. My next TV after this is absolutely going to be OLED. I'd take the 55" OLED over an 85" 4K, even if they were the same price.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

bull3964 posted:

Many people were disappointed with the W950b because it came after the truly excellent W900a and couldn't match it in black levels or color accuracy. The W900a was the last 'top of the line' 1080p TV that Sony produced. I don't know if it's true or not, but supposedly the W900a was originally going to be an XBR set. However, Sony decided at the 11th hour that the 2013 XBR TVs had to be 4k only so it was re-branded.

I did see a 900A pop up, but it was $150 more and I assume it has active 3D. I don't plan on watching movies exclusively in 3D, and I'm willing to take passive 3D's cut in resolution over active 3D's flickering at 120 Hz. I like active on 240 Hz models, but the flickering drives me insane at 120. I'm more worried about backlight bleed than I am about black levels.

I'm upgrading from a 32" 720p Samsung I bought in 2007 for $700. I'm sure even "awful" black levels are better than what I'm used to using.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

bull3964 posted:

The W900A was a native 240hz panel.

God drat it. I could probably still cancel the order since I chose free shipping on Monday and it still hasn't shipped out. Odd, since Amazon hits me with a penalty if I don't ship a Fulfilled by Merchant order within 24 hours.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Good Will Hrunting posted:

How come I don't see this on the Display Lag link in the OP?

According to what I could find online, it is ~35ms in game mode. In May, I bought last year's 1080p KDL55W950B because it was ranked so highly on DisplayLag, and I use it mainly as a PC gaming display. It's listed at ~17ms and looks/plays crazy well with everything I've played on it (Shadow of Mordor, Fallout 4, Rocket League, Skyrim, Vermintide, Battlefront). I can't imagine another ~17ms would make that much of a difference, unless you're playing twitch shooters on it.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Number_6 posted:

Best Buy and Amazon have dropped the price of the 55" Sony A80J to a level I just can't refuse ($999). I hope I get a clean panel.

I've been clutching my beloved (and still working) Panasonic plasma with my cold dead hands for years now (2005 model) but it's time.

I ordered this from Costco yesterday and it looks like it is sold out online now, but some stores just got them in. It's the A80CJ and comes with a 3-year warranty instead of 1-year. The only other real difference I saw is 1 year of Bravia Core streaming + 5 movie credits with the A80J versus 2 years of Bravia Core + 10 credits with the A80CJ. FYI, the A90J remote with the brushed metal finish and motion-sensor backlight works with the A80J/CJ too if you ever want to upgrade.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

OldSenileGuy posted:

I need to buy an office/spare bedroom TV. I’m eyeing the 55” Sony A80J because it’s $999 and that’s about what I’m looking to spend. It’s actually marginally cheaper than the equivalent LG C1, and I’ve heard anecdotally that it’s a smidge better in picture quality and motion handling, at the expense of being a smidge worse for gaming (which doesn’t matter to me at all for this TV).

My only problem is the devil on my shoulder whispering “don’t pay for last year’s model - the new ones are only $700 more!”

Am I really missing out on anything huge by picking up the A80J instead of the A80K (or the C1 instead of the C2 for that matter)?

Also, side note, I see the version Costco sells is actually the “A80CJ” - I assume that just designates that it’s the Costco version, but other than that there’s no difference?

The model Costco sells has a 3-year warranty, a physical switch on the left-hand side of the TV to turn off the microphone, and 24 months/10 movie credits for Bravia Core instead of 12/5 like the normal A80J. The packaging on the Costco model is a little cheaper, too, in that they want you to tear into the box instead of lifting it off. If you carefully cut the tape on the bottom, it will still slide off, though; I recommend doing that and saving the box in case you need to move the TV in the future.

If you really want to spend more money, just spend a fraction of that on the backlit metal remote that comes with the A90J because it works with the A80J/A80CJ. The buttons also have a click to them instead of feeling mushy.

There weren't any available at my local warehouse when I checked on the 2nd, so I ordered one online and scheduled delivery for the 7th. Got a robocall on the 6th saying it wasn't going to be delivered and it instructed me to reschedule. The next date out was 8 days later so I wound up canceling the online order after discovering my local warehouse had gotten about a dozen in that same day. None of them were out on the floor yet.

The TV's VRR works fine as long as you go into the Google TV settings and change the input to "VRR" before connecting the HDMI cable to a GPU or console. It supports 4:4:4 at 4k/120 with VRR enabled on either of the two HDMI 2.1 ports as long as you are using a 48 Gbps cable, at least with an RTX 3xxx series GPU. I know you said it doesn't matter in your case, but I wanted to add it in case anyone else is interested.

Sound quality on the Sony is nearly infinitely better than on the C1, especially once you change it off the default audio preset to something like Cinema. Can't really recommend the $1,100 soundbar that will let your $1,000 TV act as a center channel though. :v:

Picture settings are controlled on a per-input or per-app basis so be prepared to toggle some settings to your taste. That said, the out of the box picture quality is way higher than either of the C1s that I helped friends configure over the past year and it's a matter of changing a handful of settings instead of doing deep dives into test patterns.

The built-in apps support advanced features of some streaming services like Dolby Vision or IMAX Enhanced. Bravia Core requires ~115 Mbps or faster internet access for its highest quality streaming, but in Sony's infinite wisdom, they used a 100 Mbit ethernet port. You'll need to use either a USB 3.0 ethernet adapter with a specific chipset for wired or 5 GHz WiFi if you want to use it properly. I haven't run into any issues using WiFi in a particularly congested area, but ymmv.

If you're going to connect an OTA antenna, note that the Sony has an ATSC 3.0 tuner and the LG C1 doesn't.

Finally, go to Settings > Accounts & Sign In > select your account > Apps only mode. This will remove all the ads except one from the home screen. No more garbage TV/movie recommendations; no, I don't want to watch the last 27 minutes of The Adventures of Pluto Nash free with ads.

Bloodplay it again fucked around with this message at 09:35 on Jul 9, 2022

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

morestuff posted:

Pulled the trigger on an A80J — anyone have any experience using the built-in speakers? Wouldn't mind ditching my soundbar so I can use the legs in low-profile mode

This is exactly what I did. I have a 360W soundbar/sub but haven't even bothered hooking them up. I'd give it a shot in low-profile mode (it's how the legs are set up by default if you just slide 'em in according to the instructions), and if the speakers aren't good enough, you can simply remove the legs and reattach them so that the TV sits higher and has room for your soundbar. I am in an apartment complex with 4 neighbors surrounding me, so it isn't like I ever turn the volume up past ~22 anyway. A soundbar was a necessity with the KDL55W950B to be able to hear dialogue clearly, but it hasn't been necessary with the A80J.

If you have ever stood next to a pair of large speakers at a low volume, the A80J speakers sound similar. There are only three 10W drivers in there, but because the actuators use the entire screen as the speaker surface, it has an impressive... presence?

My only complaint regarding sound is that the default sound profile is fairly flat. Cinema has been the best profile, imo, for movies, TV shows, and games. I can't find a single form of media where the default profile sounds better than cinema.

If you wind up using bluetooth headphones, there's of course a handy sync feature to change the a/v delay so that latency isn't ever an issue with TV/movies. It will probably not work well for games, though, because it would insert too much of a delay and make controls feel quite sluggish.

Bloodplay it again fucked around with this message at 23:03 on Jul 14, 2022

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
What's the model of your a/v receiver? My guess is it doesn't have HDMI 2.1 for 4k/120. If it does support 4k/120, are you sure you're using 48 Gbps HDMI cables? If 4k@60 is already saturating an HDMI 2.0 (or older) port or an older HDMI cable, there may not be enough bandwidth left to carry HDR color data on top of 4k@60, which is why you're seeing "HDR not supported."

Regarding res: 3840x2160 or thereabouts is what you'd see. A lot of the 4k TVs will also support 4096x2160, but only at 60 Hz. You are infinitely better off doing 3840x2160@120.
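
To make the bandwidth reasoning above concrete, here's a back-of-the-envelope sketch. It's a rough estimate that ignores blanking intervals and link-encoding overhead (which add roughly 10-25% in practice), but it shows why 10-bit HDR at 4k@60 chokes an 18 Gbps HDMI 2.0 link while 4k@120 HDR needs a 48 Gbps HDMI 2.1 one:

```python
# Rough uncompressed video bandwidth: pixels/sec * bits/pixel.
# Ignores blanking intervals and link encoding (TMDS/FRL) overhead,
# so real-world requirements are roughly 10-25% higher.
def video_gbps(width, height, hz, bits_per_component, chroma="4:4:4"):
    # 4:4:4 carries 3 full components per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits_per_component * components / 1e9

print(video_gbps(3840, 2160, 60, 8))    # ~11.9 Gbps: 4k@60 SDR fits HDMI 2.0
print(video_gbps(3840, 2160, 60, 10))   # ~14.9 Gbps: 4k@60 HDR is already past 2.0's ~14.4 Gbps effective rate
print(video_gbps(3840, 2160, 120, 10))  # ~29.9 Gbps: 4k@120 HDR needs HDMI 2.1
```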

Bloodplay it again fucked around with this message at 00:50 on Jul 15, 2022

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Elephant Ambush posted:

I'll check the receiver model but it's only like 3 years old. I also have no idea what type of HDMI cable I'm using. I thought they were all the same. I guess they're cheap on monoprice so I'll order one

Thanks for the response btw

You could see which of the cables are 48 Gbps by connecting your PS5 directly to one of the TV's HDMI 2.1 ports and checking for 3840x2160@120 Hz support. The one that came with the console may very well be 48 Gbps, but I don't know. Every cable that kicks back 4k@60 is < 48 Gbps. If your receiver is three years old, there's no chance it has HDMI 2.1 ports that support 4k@120 Hz, so you would need to connect the console directly to the TV to test the cables or play 4k@120. Even if you had all 48 Gbps cables, the a/v receiver's ports only support 24 Gbps at most, so you'd never get the 4k@120 signal to the TV.

Monoprice had an ETA of early September when I looked a couple weeks back for some 48 Gbps HDMI cables, so I wound up buying a 10' onn-branded cable from Walmart for about $15. I didn't actually believe the packaging's bandwidth claims and expected to have to return it, but it hasn't given me any issues. Besides, you know, the California Prop 65 warning that lists something like 6 or 7 carcinogens. I keep telling myself it's all in the interior solder... :yikes:

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
Bravia Core on Sony TVs requires at least 115 Mbps to use Purestream. The manual even specifies you need to use WiFi because the TV's built-in ethernet adapter tops out at 100 Mbps, but you can use a USB 3.0 ethernet adapter with a specific chipset if you must have a wired connection. As already mentioned, it also makes a big difference if you are using Steam Link.
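
The arithmetic behind that advice, as a trivial sketch (the ~6% protocol-overhead figure is an assumption; real margins vary):

```python
# Can a network link sustain a stream's bitrate once framing overhead
# (TCP/IP, etc.) eats into the headline link rate? Overhead is a guess.
def link_can_sustain(link_mbps, stream_mbps, overhead=0.06):
    return link_mbps * (1 - overhead) >= stream_mbps

print(link_can_sustain(100, 115))   # False: the built-in 100 Mbit port can't do Purestream
print(link_can_sustain(1000, 115))  # True: a gigabit USB adapter or fast 5 GHz WiFi is fine
```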

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

morestuff posted:

Kind of bummed out. I spent years hemming and hawing over whether an OLED would be bright enough for my living room before the price on the A80J was too good to pass up. Delivery came today and even after messing with the settings for a few hours it’s unwatchably dim in everything except for the Vivid preset — and even that’s just OK.

Gonna give it a few days but leaning towards returning it at this point.

The absolute brightest settings (for SDR content):

Picture mode: vivid
Advance contrast enhancer: high
Peak luminance: high
Brightness: max
Contrast: max
Color temp: cool
Live color: high

This will get it to about 700 nits, but the picture won't be super accurate. The same settings for HDR content will get you to about 900 nits. If it's still too dim with those settings, I doubt even a 3x as expensive A95K would really be bright enough and you should look at LEDs instead.

Edit: on second review, the new QD-OLEDs do get about twice as bright, so I take that last sentence back.

At a full white screen with no other colors, WOLED gets up to about 130 nits and QDOLED gets up to about 250 nits. The 700/900 values are assuming you are watching basically anything besides a hockey game.

Bloodplay it again fucked around with this message at 22:07 on Jul 20, 2022

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

m.hache posted:

poo poo I'm down the road from there tomorrow. Thanks!

They have the Sony A80J for $1,498 too. You miss out on two additional HDMI 2.1 ports, but you get better processing and Android instead of the awful LG OS. Both are really great TVs, though, and you will be happy with either.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Taffer posted:

Thanks for the answers everyone! Sounds like the general choices are either run an HDMI cable or get a Shield? I was planning a chromecast upgrade for a new TV (what I use for everything right now) so maybe getting a shield instead would be a good choice.

Probably worth noting: if you want to go the 4k/120Hz via HDMI route, you are limited to about 5m of cable. Anything beyond that needs to be actively powered and gets super pricy ($90+). I don't know what the max cable length for 1080/120Hz is, but I assume it is somewhere between 4 and 5 meters. When I upgraded to an A80J, I had to rearrange some things to be able to connect a 48 Gbps HDMI cable to the TV and my GPU. The Shield TV Pro will do 1080/120Hz, but you can't do 4k/120Hz, since it doesn't have HDMI 2.1 support. The only real benefit over the Steam Link would be the 1080/120Hz route (since the Link tops out at 60 Hz) and the ability to stream 4k videos with Dolby Vision HDR profiles to the TV. I think most TVs that support DV profiles already have native apps on the TV to play them, though, so it's usually easier to use something like Plex or Just Player natively on the TV.

Regarding 2560x1440 on a TV: be sure to check a site like RTINGS and look at the supported resolutions under the Input Lag section. If you were to get an LG C1, for example, you'd be fine, because 2560x1440@120 Hz gives just over 5ms of input delay, other factors aside. If you got a Sony A80J, you'd be very disappointed, because you have to force a 2560x1440 resolution at anything > 60 Hz (done via the AMD or Nvidia control panel) and it adds around 50 ms of input lag. It's not an officially supported resolution.
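
To put those lag numbers in perspective, a quick arithmetic sketch converting input lag into whole frames at a given refresh rate:

```python
# Express display input lag as frames at a given refresh rate,
# to make the C1-vs-A80J comparison above concrete.
def lag_in_frames(lag_ms, refresh_hz):
    frame_time_ms = 1000 / refresh_hz   # 120 Hz -> ~8.3 ms per frame
    return lag_ms / frame_time_ms

print(round(lag_in_frames(5, 120), 1))   # 0.6 frames: effectively imperceptible
print(round(lag_in_frames(50, 120), 1))  # 6.0 frames behind your inputs: very noticeable
```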

Out of curiosity, which GPU(s) are you using with your computer(s)? Only cards released in the past couple of years have HDMI 2.1 for 4k/120, so if you're rocking something like a GTX 1070, you'd never get a 4k/120 signal, even if you went through the effort of rearranging everything to be able to connect a <15' 48 Gbps HDMI cable.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Obviously ymmv, but my local warehouse has been selling the Denon S760-H on clearance for $349.97 since 10/06. I would highly recommend it over any Onkyo.

Item #3360461 if anyone wants to check.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Rescue Toaster posted:

I've been looking at the C2 a lot too, and supposedly it has like a 'Cinema' mode for motion interpolation that's supposed to help with low framerate material without going full blown Soap Opera Effect. No idea how true that actually is though, but it was something I was considering.

Yeah, the Sony and LG both have low-strength motion smoothing options useful for removing judder from panning shots in 24 FPS material. The newest James Bond movie and Encanto both benefit greatly from it. OLEDs have such fast response times that panning or spinning shots look jarringly bad if you don't enable it.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

OldSenileGuy posted:

Ok, just posting this now to help out anyone else that might be trying to make the same decision -

I think I'm gonna be returning the A80J. The first thing I did upon setting it up was head over to AVSForums to see what the recommended picture settings were, and the very first post is from some calibration dude that says to turn off Dolby Vision on this tv because it isn't implemented well. WTF. That's really something I wish I would have seen before I bought the drat thing because that would have been disqualifying.

However, I know that sometimes the AVSForum people can approach audiophile levels of crazy, so I moved forward with DV turned on and watched the TV. Sadly, I don't think they were wrong about the TV's poor implementation of DV. I watched the first 5 episodes of the ATV+ show "Bad Sisters" on my old LG C8 and didn't notice any issues. I watched the rest of the show on the A80J, and was frequently noticing wild swings in brightness, contrast, and saturation from shot to shot within the same scene. I tried many different settings, and while some were better than others, none made the issue go away entirely.

If you're interested in a deeper dive on this, you can check out my posts on the last couple pages of this thread at AVSForum:

https://www.avsforum.com/threads/2021-sony-a80j-oled-settings-tips-gaming-and-advanced-discussion-thread.3208007/page-264

And if anyone here has the A80J and watches DV content on it AND has ATV+, I would be interested in hearing them watch Bad Sisters, Season 01, Episode 08, around 38:30 into the episode. That whole bathroom scene is a good example of the shifts in brightness/contrast within the same scene that I'm seeing.

It's a shame because SDR content looks great, but I'm sure it'll look great on the C2 as well.

I think Dolby Vision Dark used to have issues in certain apps (I saw a lot of complaints about Disney+, for example), but a combination of app updates and firmware updates fixed all of those issues, afaik, months ago. I watch Dolby Vision content using Plex, Disney+, and HBO Max without any issues, but I am using the apps on the TV, not a separate box. I don't know much about the Apple boxes, but I know that despite the Nvidia Shield Pro being recommended for DV content, it only supports like three of the seven different types of DV profiles. I haven't had any DV issues, even when streaming huge uncompressed UHD rips with Dolby Vision profiles from my desktop.

I would recommend signing into the Apple app on the TV itself and trying the same scene. It's also worth noting that each app and each input has its own separate picture settings, so just because you turned DV on in one app doesn't mean it's now on for the other apps. It took me some time to fine-tune settings for 4 apps, OTA TV, a Nintendo Switch, and HDMI to my PC's GPU. Hilariously, the Apple TV UI is so bright that it's the only time I have ever seen any image retention on the panel.

One of the older issues with the software was with how it applied tone mapping after enabling/disabling DV. By habit, if I change the picture settings to either DV Bright or DV Dark, I also scroll down to disable tone mapping and re-enable it. This depends entirely on whether the content I'm watching has scene-by-scene or frame-by-frame DV mapping and may not be the right(tm) way to do it, but I am happy with the results. I disable it on frame-by-frame grading.

My advice is to grab whatever apps you might use off the Play Store, then go into the main TV settings and, under the account section, toggle apps-only mode. When you hit the Input button on your remote, you can then go to the + box and add whatever apps you use directly to the input menu. You can disable apps-only mode again if you need to install something else and toggle back when you're done. At the very least, this gets rid of the bulk of the ads if you hit the Apps button on the remote. I never personally use the Apps button, since all the apps I use are already in that input menu.

If you have your TV connected to wifi (not ethernet, as they cheaped out and put a 100 Mbit ethernet adapter in the TV), be sure to check the bonus offer app on the TV and open up Bravia Core. There are a handful of movies there (Hook, Ghostbusters and Ghostbusters 2, and a few others at the moment) that are free to stream without burning through one of your 5 vouchers. Again, since it's a different app, you'll have default picture settings. I think those movies might have the "IMAX Enhanced" profile enabled, too. You need ~120 Mbps+ internet access and I think there's a specific PureStream toggle in the settings you will want to enable. If your ISP has data caps, ignore this entirely because you are going to burn through tons of data.

Bloodplay it again fucked around with this message at 19:19 on Nov 18, 2022

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Animale posted:

I have a 2022 Sony that has an ATSC 3.0 tuner and my answer is not sure because I can't tell what is the ATSC 3.0 channel and what isn't since no one actually airs anything in 4k or HDR for me to verify. My receiver does show Dolby D+ while watching this football game on both YouTube TV and the TV tuner, so probably? YTV shows the game being streamed at 720p and I believe my local Fox channel is airing it at the same resolution so ATSC 3.0 has a long ways to go before it's actually viable.

Part of the holdup, besides the obvious expenses involved, is that OTA networks are more congested than at any other point in broadcast history. The broadcast spectrum is too saturated now, unlike during the switch to color TV, the addition of stereo audio, or the switch to digital 720p/1080i. Networks can't simply broadcast 4k alongside their older broadcasts anymore because the previously unused channels are now owned by weird religious media entities. It is all or nothing. Considering so few people have 3.0 tuners in their TVs, a switch would leave most OTA viewers unable to decode the new signals.

I used to be a huge proponent of the switch to 3.0 because it allows for cool interactivity (i.e. multiple choice quizzes) in edutainment broadcasts, but after having dipped my toes into the waters, it seems like it will be used almost exclusively as a Nielsen-type box that constantly runs content recognition to spy on everyone instead. :(

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
If anyone else in this thread uses a Sony A80J or Sony A80CJ OLED and does not have the newest system update from March installed, I highly recommend you go into settings and disable auto-update if you haven't done so. The 120 Hz VRR mode is now completely broken with Xbox, PS5, and Nvidia GPUs with HDMI 2.1 ports.

Mine was disabled, last I checked, and it still auto-updated overnight after I turned the TV off, sadly. Now I can't play games at 120 FPS from my PC because the screen flickers black with green artifacts as it renegotiates the HDMI handshake every 15 sec or so. It happens regardless of Nvidia driver version, and I haven't found any fix. The only way for me to game on it is to change the HDMI input to a 60 Hz mode.

I'm pretty frustrated, so hopefully this saves someone else from a headache. If I didn't use Plex or other Android apps for streaming with DV support, I would have disconnected the TV from the internet entirely already. Too late now.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Bloodplay it again posted:

If anyone else in this thread uses a Sony A80J or Sony A80CJ OLED and does not have the newest system update from March installed, I highly recommend you go into settings and disable auto-update if you haven't done so. The 120 Hz VRR mode is now completely broken with Xbox, PS5, and Nvidia GPUs with HDMI 2.1 ports.

Mine was disabled, last I checked, and it still auto-updated overnight after I turned the TV off, sadly. Now I can't play games at 120 FPS from my PC because the screen flickers black with green artifacts as it renegotiates the HDMI handshake every 15 sec or so. It happens regardless of Nvidia driver version, and I haven't found any fix. The only way for me to game on it is to change the HDMI input to a 60 Hz mode.

I'm pretty frustrated, so hopefully this saves someone else from a headache. If I didn't use Plex or other Android apps for streaming with DV support, I would have disconnected the TV from the internet entirely already. Too late now.

Update to this:

The June 2023 update has seemingly fixed this issue and both of my 48 Gbps cables are now working fine again. The only issue I currently have gaming at 2160@120 with HDR is that the audio will cut out entirely after an undetermined amount of time.

When I first got the TV in July 2022, the audio would cut out for a split second every 30-40 min and immediately recover. It only happened when the HDMI type was set to "Enhanced" for 4k@120. Now, the audio cuts out and does not recover unless the HDMI handshake is manually forced. I have been doing this by changing the audio output from TV speakers to external device and then back to TV speakers. The workaround only works if the game is borderless windowed; fullscreen-exclusive games get screwed up during the handshake, and it's useless for games that don't let you pause during cutscenes. A pain in the rear end, for sure, but I am glad to be gaming on the OLED again.

It is also worth mentioning that I have to use the Nvidia HDMI audio driver from January 2022, because anything newer causes the audio to fail and never recover; DPC latency goes through the roof and the TV simply cannot handle it. Fun note: if you don't select the HDMI audio driver in the custom install during the Nvidia driver installation, it ignores your choice and updates it anyway.

Bloodplay it again fucked around with this message at 15:42 on Jul 13, 2023

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

TITTIEKISSER69 posted:

For Prime Day I treated myself to a Roku voice remote. Is there any possibility of using it to turn on my Logitech z-5500s? I realize this is typically done by CEC and this sound system doesn't have an HDMI input.

Depending on how the Logitech setup behaves after power is connected, you could maybe get one of those Roku indoor smart plugs for the Z-5500's power and use the new remote with the plug. If there's no HDMI and you're using optical out from the TV, I don't know of any other way you could do it, though. You would still have to use the other remote for every other function, too.

If you leave it on the same input, you could try unplugging the Logitech hardware, let it sit for a bit (maybe press power to dissipate the remaining stored electricity first), and plug it back in. If it powers on to the right input, it might work for you. If it goes to some other default state, the plugs would be a wasted purchase.

Bloodplay it again fucked around with this message at 01:22 on Jul 21, 2023

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

codo27 posted:

So I got a TCL 6 series, and I notice a bit of blur around text, especially when I'm running something thats just plain white on black, like playing music through my receiver. What would be the first thing you'd look at in that case? I have replaced the HDMI cable, not as a troubleshooting step though.

The TCL 6 series uses a BGR subpixel layout, which causes color fringing around text, especially with white-on-black content. There's not a lot you can do about it.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
Disable Bluetooth and the mic on the remote won't work. If the TV came from Costco (the model number will have a C in it, e.g. A80CL instead of A80L), there is also a physical switch on the TV itself to disable the mic. Your remote will need to be pointed at the receiver on the TV if Bluetooth is disabled, so keep that in mind if you have something like a soundbar blocking it.

E: hmm it sounds like the mic on the TV itself is picking it up. If you can't find a physical switch on the TV to disable it, try this

Settings > Privacy > Google Assistant > Voice control, then disable the hands-free mic setting. If that doesn't work, PM me with the exact model number of the TV.

Bloodplay it again fucked around with this message at 23:07 on Feb 16, 2024

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

gwrtheyrn posted:

Wait, what? Is there a typical location? Because I didn't know this was a thing.

The 80/90/95 J/K/L models all have it on the left-hand side (assuming you are facing the screen), below the bottommost HDMI port. It is a physical switch you slide to kill the mic connection internally.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
What's doubly weird is that Assistant isn't available at all with apps-only mode enabled, so, regardless of the Bluetooth or privacy settings, it shouldn't be able to trigger or function. You've already checked for Play Store updates, so if the option still doesn't appear in settings, I am at a loss as to what to try besides a factory reset. Obviously not preferable, because then you have to set everything up again.

You could also check the support section on the TV and call Sony. They can remote into the TV, but your guess is as good as mine regarding how long it would take or how helpful it would be. Probably would go faster if it happens frequently or you can play media that regularly triggers it. Sounds frustrating as hell and I'm sorry I can't give you any other tips.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
I use my A80J with a 3080 and it is awesome. Just make sure you use a 48 Gbps HDMI cable and change the HDMI format to the VRR option. The sound cuts out for half a second every 40 min or so, but there are no issues otherwise. I also cap my global frame rate in the Nvidia Control Panel to 117 FPS, which keeps the frame rate just under the 120 Hz ceiling so VRR stays engaged.

Edit: You will need a video card with HDMI 2.1 ports to take full advantage. Not sure which GPU you have. Even if it is older, you could do 4k at 60 FPS.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
The TV got too hot, probably because it has not been powered off since purchase, and the adhesive has failed. Your options are to either buy a new one and an extended warranty or have them invest in commercial signage displays, which will be hideously expensive even compared to the Frame and will look awful in a home. I can't even fault your mom for this because the whole selling point of the drat thing is to show off art while not in use.

They could also hound Samsung about it but I think they'll be told to kick rocks because we have no consumer protections in the US. Maybe making a social media stink could help. Who knows?

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
The post was more to explain the likely culprit (that adhesive is nearly transparent when applied in a thin layer, but yellow where it is thicker or has pooled) and also to emphasize how ridiculous it is that a TV intended to display an image for most of its life managed to crap out in less than 2 years. Shame on Samsung, not su madre.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
I was also going to ask about the SSD's filesystem. Are you able to see the drive with the Shield at all? I know the apps aren't detecting it. You may need to reformat if it is NTFS; Android doesn't play well with NTFS. You can reformat it to exFAT with the Shield or with Windows, which will allow you to transfer the video back to the SSD. Obviously, back up anything you want to save before reformatting.
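
If you want to double-check what filesystem the drive is actually using before wiping anything, here's a minimal sketch using the third-party psutil library on a PC (the drive letters/mount points will vary on your machine; this only inspects, it doesn't format):

```python
import psutil  # third-party: pip install psutil

# List mounted volumes and flag NTFS ones, which the Shield's apps
# often can't use; exFAT is the safe cross-platform choice.
for part in psutil.disk_partitions():
    note = "reformat to exFAT for Shield use" if "ntfs" in part.fstype.lower() else "ok"
    print(f"{part.device} at {part.mountpoint}: {part.fstype} -> {note}")
```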

Bloodplay it again fucked around with this message at 12:28 on May 16, 2024
