Yaoi Gagarin
Feb 20, 2014

Why is there judder anyway? This never used to be a problem on old dumb TVs

KillHour
Oct 28, 2007


VostokProgram posted:

Why is there judder anyway? This never used to be a problem on old dumb TVs

It's because screens have such good responsiveness now. Old displays had slow response times, which naturally added blur to the image. If you go even further back, CRT TVs were interlaced, meaning they only drew every other line of the screen per field, so every picture you saw was a combination of the current field and the previous one. Motion was very smooth, at the expense of also being very blurry.
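
As a toy model of that line-weaving in Python (a handful of fake "scanlines" - real CRT field timing is nothing like this):

    # Each displayed picture keeps the previous field's scanlines and
    # overwrites every other line with the new field, so any motion
    # smears across two moments in time - that's the built-in blur.
    def weave(prev_lines, new_field, new_field_is_odd):
        shown = list(prev_lines)               # start from the last picture
        start = 1 if new_field_is_odd else 0   # odd field = lines 1, 3, 5...
        shown[start::2] = new_field[start::2]  # lay in the new field's lines
        return shown

    # A 6-line "screen": even lines still from frame A, odd lines from frame B.
    print(weave(["A"] * 6, ["B"] * 6, new_field_is_odd=True))
    # -> ['A', 'B', 'A', 'B', 'A', 'B']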

This is also why there has been some grumbling about whether film should switch from 24 FPS to 48 or even higher. Anime (and animation in general) is even worse, because a new image typically isn't drawn for every frame. Instead, the artists will draw a new image only every 2, 3, or even 4 frames, and the in-between frames are only subtly adjusted to smooth out the motion.
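
The counting behind drawing "on 2s" or "on 3s" is easy to sketch - purely illustrative of the cadence, not anyone's actual pipeline:

    # Which drawing is shown on each of the 24 frames in one second,
    # for animation on 1s, 2s, and 3s.
    def drawings_shown(on, fps=24):
        return [frame // on for frame in range(fps)]

    for on in (1, 2, 3):
        shown = drawings_shown(on)
        print(f"on {on}s: {max(shown) + 1} drawings/sec, starts {shown[:8]}")
    # On 2s, only 12 unique drawings cover a full second of screen time.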

KillHour fucked around with this message at 23:05 on Dec 30, 2022

Enderzero
Jun 19, 2001

The snowflake button makes it
cold cold cold
Set temperature makes it
hold hold hold
There are 2 types of judder: the natural version is seen when panning horizontally quickly - 24 fps is low enough that motion blur (from holding the shutter open for 1/48 of a second, which is half the 1/24 of a second each frame is shown) can't hide it, and you see large jumps from frame to frame. This is inherent; seeing a movie in the theater has the same issue.

Then there is judder caused by incompatible refresh rates. 60Hz TVs had this problem - you can't divide 24 into 60 evenly, so they would show one frame for 2 refresh cycles, then the next frame for 3 cycles - which causes a frame timing issue where a perfectly smooth horizontal pan "jumps" unevenly - 2 then 3 then 2 then 3, etc. Motion interpolation basically tried to create 5 separate frames for those 2 film frames in software, so the judder is gone but now it doesn't look like film but rather a soap opera.

With 120Hz screens that judder is gone - just hold each film frame for 5 screen refreshes. So it's really not more of an issue - it's less - and motion interpolation is garbage for everything but live sports and maybe anime.
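
If you want to see that cadence concretely, here's a quick Python sketch - purely illustrative, not anything a TV actually runs - of which film frame is on screen at each refresh:

    # Which 24fps film frame does refresh r show on an hz-rate display?
    # Refresh r happens at time r/hz, so it shows frame floor(r * 24 / hz).
    def cadence(hz, film_fps=24, refreshes=10):
        return [r * film_fps // hz for r in range(refreshes)]

    print("60 Hz: ", cadence(60))    # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] - uneven 3:2 holds
    print("120 Hz:", cadence(120))   # [0, 0, 0, 0, 0, 1, 1, 1, 1, 1] - even 5-refresh holds

The uneven 3-then-2 holds at 60Hz are exactly the timing problem described above; at 120Hz every film frame gets the same 5-refresh hold, so it disappears.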

KillHour
Oct 28, 2007


As you said, even film can't fully hide the panning judder, because there isn't enough blur in the actual image. Old TVs hid this by virtue of being poo poo. We only think 24 FPS looks 'correct' because we're used to it. :shrug:

There's been a bunch of articles and such by animation artists saying that drawing animation on 2s or 3s is an artistic decision and that technology is ruining their vision by interpolating, and they are also wrong, so it's not just film people I disagree with.

abelwingnut
Dec 23, 2002


as someone who religiously hates the soap opera effect, i just turned off my samsung's automotion plus/picture clarity.

i don't know if there's a better way to avoid it than that?

Enderzero
Jun 19, 2001

The snowflake button makes it
cold cold cold
Set temperature makes it
hold hold hold

KillHour posted:

As you said, even film can't fully hide the panning judder, because there isn't enough blur in the actual image. Old TVs hid this by virtue of being poo poo. We only think 24 FPS looks 'correct' because we're used to it. :shrug:

There's been a bunch of articles and such by animation artists saying that drawing animation on 2s or 3s is an artistic decision and that technology is ruining their vision by interpolating, and they are also wrong, so it's not just film people I disagree with.

I'd guess it's mostly a financial decision, to produce the work cheaper? I have heard of one artistic use of it though - in Into the Spider-Verse, when Miles swings for the first time they animated him on 2s with the background on 1s to make it seem more awkward. Once he learns how to use his powers well, they switch back to animating him on 1s.

KillHour
Oct 28, 2007


Enderzero posted:

I'd guess it's mostly a financial decision, to produce the work cheaper? I have heard of one artistic use of it though - in Into the Spider-Verse, when Miles swings for the first time they animated him on 2s with the background on 1s to make it seem more awkward. Once he learns how to use his powers well, they switch back to animating him on 1s.

It almost always is, because animation is expensive. You just get a certain type of person who has to insist that new technology always makes everything worse. That is a cool use of it though.

wash bucket
Feb 21, 2006

VostokProgram posted:

Why is there judder anyway? This never used to be a problem on old dumb TVs

I actually still have an old CRT TV and I notice panning stutter on it too. I guess it wasn't a "problem" back then because that's just how movies looked and there wasn't anything you could do to change it. But now TVs have all these fancy processing features that we can adjust and get really fussy with the details.

I kinda miss the old days when adjusting the image on a new TV took like 3 minutes because there just wasn't much to fuss with.

mancalamania
Oct 23, 2008
I (stupidly?) thought 24 fps stutter wasn't an issue in a movie theatre because of flickering?

Enderzero
Jun 19, 2001

The snowflake button makes it
cold cold cold
Set temperature makes it
hold hold hold

mancalamania posted:

I (stupidly?) thought 24 fps stutter wasn't an issue in a movie theatre because of flickering?

It's not stutter; that's when you freeze on one frame for a bit and then jump forward. To get a "frozen in time" effect you need a shutter of around 1/80 of a second or shorter. At 1/48, the shutter is open long enough for fast movement to blur parts of the image noticeably. Usually movement is slow enough that we get a pleasing motion blur - that's effectively the cinematic look. But on a fast horizontal pan, the blur isn't long enough to cover the distance the image moves between frames, so each frame reads as a separate blurred image and the pan appears "juddery" - you see one blurred picture, then suddenly the next, without enough overlap, and it looks jumpy.
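
To put rough numbers on that in Python - every value here (the 60 px/frame pan speed especially) is just assumed for illustration:

    # Rough pan-judder arithmetic; all numbers assumed for illustration.
    pan_px_per_frame = 60    # how far the image shifts on each 24fps frame
    frame_time = 1 / 24      # seconds each frame is on screen
    shutter = 1 / 48         # 180-degree shutter: open for half of each frame

    pan_speed = pan_px_per_frame / frame_time   # 1440 px/second
    blur_px = pan_speed * shutter               # smear captured per frame

    print(f"jump per frame: {pan_px_per_frame} px, blur: {blur_px:.0f} px")
    # The blur covers only 30 of the 60 px the image moves between frames;
    # the uncovered half shows up as a discrete step - the judder.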

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Well, nowadays there's not really flicker in the theaters because it's mostly DLP projection at a higher real framerate (at a non-interpolated multiple of the 24fps source) with either a lamp or laser as the light source.

Back in the film days, stuff was actually projected at 48fps to reduce flicker; each frame was just doubled via the shutter and shown for half as long each flash.

It was funny - after years of seeing only digital projection, when I went to "The Hateful Eight" 70mm roadshow I was struck by just how much flicker there was in an actual film projection.

Modern DLP projectors are usually triple-flashed at 72fps (some can do 72Hz per eye for 3D).
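
The arithmetic behind those flash rates, as a trivial illustration:

    # Flash rate = source fps x flashes per frame.
    fps = 24
    for flashes in (1, 2, 3):
        print(f"{flashes}x shutter: {fps * flashes} flashes/sec")
    # 1x: 24/sec (visible flicker), 2x: 48/sec (the old film standard),
    # 3x: 72/sec (modern DLP triple flash).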

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

qbert posted:

Hot take: TruMotion set to Cinematic Movement is perfectly fine and is worth the trade off of not having judder during panning shots.

I don't watch sports, but I like it for anime. Though to be honest, I'm not sure it actually does anything with a framerate that low to begin with. :haw:

Quixzlizx
Jan 7, 2007

Mister Facetious posted:

If I recall correctly it's 60Hz vs 120Hz, and it might have a lower peak brightness? If you're serious about TV gaming on OLED, the C series is the way to go; otherwise it's not THAT big of a deal. Very few PS5 games even have a 4K/120Hz option anyway.

The B series is 120Hz; it's the A series that isn't.

pseudanonymous
Aug 30, 2008

When you make the second entry and the debits and credits balance, and you blow them to hell.
Is there a consensus about putting a TV over/in front of the fireplace? I know obviously heat/fire = bad for electronics. I would basically only ever use the fireplace in an emergency; I've lived here for about 3 years and haven't lit it once.

I know having the TV over the fireplace would also not be a good viewing angle, but I've seen some mounts that kind of extend it downwards.

Strabo4
Jun 1, 2007

Oh god, I'm 'sperging all
over this thread too!


pseudanonymous posted:

Is there a consensus about putting a TV over/in front of the fireplace? I know obviously heat/fire = bad for electronics. I would basically only ever use the fireplace in an emergency; I've lived here for about 3 years and haven't lit it once.

I know having the TV over the fireplace would also not be a good viewing angle, but I've seen some mounts that kind of extend it downwards.

I've never used one myself, but I've seen them recommended here multiple times - you might want to check out MantelMount mounts.

CatHorse
Jan 5, 2008

pseudanonymous posted:

Is there a consensus about putting a TV over/in front of the fireplace?

Put it on a table/stand in front of fireplace. So you can move it in an emergency.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

MikusR posted:

Put it on a table/stand in front of fireplace. So you can move it in an emergency.

Get an A/V cart from a local school for even easier moving

HONKER24
Dec 15, 2000

cubicle_whore
Hair Elf
It really depends on the fireplace type.

I currently have a TV above a fireplace but the house was built in the 2000s and the fireplace itself is a self-contained model, meaning that there is a metal tube behind the wall connecting the source of the fire all the way up to the roof vent.

This allowed me to drill into the wall above the fireplace without worrying about breaking through a barrier where the smoke travels. Also, since there is a metal tube for heat/smoke to flow through, the heat isn't directly against the wall the TV is mounted to.

I had a chimney inspector come in to verify all of this before I started and would recommend you do too.

I was also told that mantels help direct heat away from the TV above, so we had one installed beforehand as well.

KillHour
Oct 28, 2007


The fireplace in my living room has drywall around the chimney with an air gap between them, so I just mounted the TV to that and it's been totally fine. If it's just the raw brick face, it might be different.

qirex
Feb 15, 2001

The "official" [such as things are] optimal height is where your natural eye level is between the center and bottom third of the screen.
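
As a concrete example of that rule in Python - the 65" TV and 40" seated eye height are made-up numbers, just for illustration:

    import math

    # "Eye level between screen center and bottom third" as a mount height.
    # All inches; the 65" TV and 40" seated eye height are assumed examples.
    diag, eye_level = 65.0, 40.0
    height = diag * 9 / math.hypot(16, 9)   # 16:9 panel height, ~31.9"

    lo = eye_level               # eyes level with the screen's center
    hi = eye_level + height / 6  # eyes at the top of the bottom third
    print(f'put the screen center {lo:.1f}-{hi:.1f}" off the floor')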

pseudanonymous
Aug 30, 2008

When you make the second entry and the debits and credits balance, and you blow them to hell.

HONKER24 posted:

It really depends on the fireplace type.

I currently have a TV above a fireplace but the house was built in the 2000s and the fireplace itself is a self-contained model, meaning that there is a metal tube behind the wall connecting the source of the fire all the way up to the roof vent.

This allowed me to drill into the wall above the fireplace without worrying about breaking through a barrier where the smoke travels. Also, since there is a metal tube for heat/smoke to flow through, the heat isn't directly against the wall the TV is mounted to.

I had a chimney inspector come in to verify all of this before I started and would recommend you do too.

I was also told that mantels help direct heat away from the TV above, so we had one installed beforehand as well.

I think this is what I have - no exposed brick above the mantel, just drywall - but I'll reach out. We had the chimney fixed up and some other things done, so I can probably get some information from when that work was done.

George H.W. Cunt
Oct 6, 2010

Anything cool from CES?

FilthyImp
Sep 30, 2002

Anime Deviant
LG OLED G3 EVO with panel brightness 75% better than current models. New image processor.

LG OLED Z3 with 8K and an ATSC 3.0 tuner.

LG OLED M with a wireless box that transmits picture and sound data. The TV only requires power and has a zero-gap mount.

KillHour
Oct 28, 2007


FilthyImp posted:

LG OLED M with a wireless box that transmits picture and sound data. The TV only requires power and has a zero-gap mount.

I feel like in practice, this is going to be flaky and annoying and probably have noticeable artefacting.

Tacier
Jul 22, 2003

We got news that TCL’s got a QD-OLED in the works too. Not sure if it’s coming this year, but it’s already on my short list knowing nothing about it.

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



Are the ATSC 3.0 tuners on these newest TVs fully compatible now? I know the HDHomeRun doesn’t natively support the audio codec and has to send it off for transcoding or something weird.

FilthyImp
Sep 30, 2002

Anime Deviant
Hisense has an 86" TV with mini-LEDs capable of pushing 2,500 nits as they try to expand from being New TCL

Animale
Sep 30, 2009

EL BROMANCE posted:

Are the ATSC 3.0 tuners on these newest TVs fully compatible now? I know the HDHomeRun doesn’t natively support the audio codec and has to send it off for transcoding or something weird.

I have a 2022 Sony with an ATSC 3.0 tuner, and my answer is "not sure," because I can't tell which channels are ATSC 3.0 and which aren't, since no one actually airs anything in 4K or HDR for me to verify. My receiver does show Dolby D+ while watching this football game on both YouTube TV and the TV tuner, so probably? YouTube TV shows the game being streamed at 720p, and I believe my local Fox channel is airing it at the same resolution, so ATSC 3.0 has a long way to go before it's actually viable.

TheWevel
Apr 14, 2002
Send Help; Trapped in Stupid Factory
Everything that's being hyped right now about atsc3 is theoretical. There are like 1 or 2 markets doing some feature tests, but I don't think they're signals regular consumers can pick up.

If you’re in a launched atsc3 market, you’re only getting a transcoded atsc1 signal. Most of the transition to 3.0 was because of the FCC UHF repack over the last couple of years. Since the FCC checkbook was open, stations (and station groups) used that as an opportunity to make the move.

The modulation in 3.0 is way better, but that's the only benefit at the moment. Station groups (Sinclair) see this as the only way they can compete with streaming/cord cutting, but there's still no real business model. All of the "advanced" 3.0 features still require an internet connection. Why would I supplement my antenna with an internet connection when I could just subscribe to a streaming service? I wouldn't bank on the 3.0 features being free; at the very least you'll be datamined to hell.

Also, the HDR being featured is Technicolor's SL-HDR1, which no TV currently supports. Hisense might soon, maybe TCL, but don't be surprised if your fancy Sony never gets it.

I still think we’re 3 years away from anything meaningful with 3.0.

Note: I work for Sinclair. All of this applies to the call-letter stations…on the sports side, we did a 1080p HDR10 broadcast test in mid-December. It's not currently supported in-app, but we're working on it.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

Animale posted:

I have a 2022 Sony with an ATSC 3.0 tuner, and my answer is "not sure," because I can't tell which channels are ATSC 3.0 and which aren't, since no one actually airs anything in 4K or HDR for me to verify. My receiver does show Dolby D+ while watching this football game on both YouTube TV and the TV tuner, so probably? YouTube TV shows the game being streamed at 720p, and I believe my local Fox channel is airing it at the same resolution, so ATSC 3.0 has a long way to go before it's actually viable.

Part of the holdup, besides the obvious expenses involved, is that OTA networks are super congested compared to any other point in broadcast history. The broadcast spectrum is too saturated now, unlike during the switch to color TV, the addition of stereo audio, or the switch to digital 720p/1080i. Networks can't simply broadcast 4K alongside their older broadcasts anymore, because the previously unused channels are now owned by weird religious media entities. It is all or nothing. Considering so few people have 3.0 tuners in their TVs, a switch would leave most people who watch OTA without the ability to decode the new signals.

I used to be a huge proponent of the switch to 3.0 because it allows for cool interactivity (e.g. multiple choice quizzes) in edutainment broadcasts, but after having dipped my toes into the water, it seems like it will nearly exclusively be used as a Nielsen-type box that constantly uses content recognition to spy on everyone instead. :(

Fate Accomplice
Nov 30, 2006

George H.W. oval office posted:

Anything cool from CES?

Samsung 140" Micro LED

https://www.sammobile.com/news/hands-on-with-samsung-140-inch-micro-led-tv-ces-2023/

A Proper Uppercut
Sep 30, 2008

Looks like they are focusing on the larger sizes for that one. What would the cost be on something like the 89"? I'm kind of assuming if I have to ask then the answer is "too loving much".

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


The reason is density. It's still hard to make micro LED in smaller sizes and still have a high pixel count.

Animale
Sep 30, 2009

A Proper Uppercut posted:

Looks like they are focusing on the larger sizes for that one. What would the cost be on something like the 89"? I'm kind of assuming if I have to ask then the answer is "too loving much".

The cheapest one in 2022 was $80k, so it's gonna be a while before it becomes a viable joke solution.

serebralassazin
Feb 20, 2004
I wish I had something clever to say.
Apparently they actually had prototypes of smaller sizes, like 50" and 63", on hand. I think this video shows them.

https://youtu.be/G4_oGaryiwc

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Yeah, but they aren't planning on selling them. So it seems likely it's a prototype process to get that density, one that's probably absurdly expensive.

codo27
Apr 21, 2008

I hate that the wall in my modest little house's living room cant really accommodate any more than the 65" screen I have now

Kingtheninja
Jul 29, 2004

"You're the best looking guy here."
How many weeks before the super bowl do the big deals start hitting?

serebralassazin
Feb 20, 2004
I wish I had something clever to say.

bull3964 posted:

Yeah, but they aren't planning on selling them. So it seems likely it's a prototype process to get that density, one that's probably absurdly expensive.

Oh yea I'm sure if these things materialize down the road they'll be heinously expensive. We will be using QD-OLED and regular OLED for years to come.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


I think the real question is whether it will ever get to the point where it's viable home tech. Sometimes display tech just dies when other things are good enough *cough*SED*cough*.

Between OLED, QD-OLED, miniLED, and UST projection, I'm not sure micro LED is going to get a foothold unless it gets to the point where it can undercut all of those.

Where I do see it gaining ground is in theaters. It's the perfect density for small-to-mid-size multiplex screens and could greatly increase quality while reducing maintenance costs.
