Tom Cruise tells fans to just say no...to video interpolation

Thank you - I was actually having trouble telling what the issue was. But I’m going to claim that’s because I’m viewing on my monitor and not on my TV, so it already looks ‘different’. I do see the blur, but on my screen it isn’t all that distracting. I’ll have to try it directly on the TV and see how it compares.

I’m wondering if some of this is the whole black/blue or white/gold dress scenario - some of us just have broken eyes.

Wow, I couldn’t even finish watching the subway fight like that, it’s so awkward.

I know, right? And the framerate/interpolation issue is pretty bad, too.

For those with LG OLED TVs: I understand frame interpolation is in the TruMotion settings, but if you turn it off, movies look like crap - there’s judder all over the place. I have to turn TruMotion on and then set de-judder to 3/10 or thereabouts to get something that looks smooth but (I think) doesn’t have this soap-opera effect, though I’m not sure.

Am I doing something wrong?

I did some research into OLEDs a few months ago, and the burn-in issue put me on the fence (as did finding out their vibrancy fades over time). Then I read they had to have motion smoothing on to avoid judder, and I completely dismissed owning one. :(

You have to enable Real Cinema in the settings. That lets the TV automatically detect 24p content and adapt its refresh rate to match.
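
For anyone wondering where the judder comes from in the first place, the arithmetic is pretty simple. Here’s a rough sketch (just illustrating the general idea, not LG’s actual pipeline):

```python
# Rough sketch: why 24 fps film judders on a 60 Hz panel.
# Each film frame has to be held for a whole number of refresh
# cycles, and 60 / 24 = 2.5 isn't whole, so the holds alternate
# between 2 and 3 cycles (the classic 3:2 pulldown).

def hold_pattern(refresh_hz, film_fps, n_frames=6):
    """How many refresh cycles each film frame stays on screen."""
    per_frame = refresh_hz / film_fps
    holds, shown = [], 0
    for i in range(1, n_frames + 1):
        total = int(i * per_frame)  # cumulative cycles displayed so far
        holds.append(total - shown)
        shown = total
    return holds

print(hold_pattern(60, 24))   # [2, 3, 2, 3, 2, 3] -> uneven holds = judder
print(hold_pattern(120, 24))  # [5, 5, 5, 5, 5, 5] -> even holds = smooth
```

As I understand it, that’s why Real Cinema can work without interpolation: LG’s OLED panels refresh at 120 Hz, which 24 divides evenly, so each film frame is simply shown five times.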

Cool, thanks - will give it a go.

I was visiting my brother this weekend, and he showed off his new Panasonic TV … we watched a DVD and I was already expecting some weird stuff with the picture. It looked OK, so we checked some settings; there was something with frame interpolation set to “low” … Turning it off did not make much difference. Then he told me he had already set it to “true cinema” after he bought it, because everything looked like cheap video material.

Wow, we set the mode to “normal” and there it was: the soap opera effect. It looked so bad, it was unbelievable that they sell it as “normal” … My brother was under the impression that “true cinema” did some magic post-processing so that it looks like a cinema, whereas the truth is that “true cinema” does nothing to the picture and “normal” does the whole frame-manipulation thing …

So what are these companies - Sony, Panasonic, Samsung - up to? Is this just some marketing BS, or what is their motivation? I don’t get it.

I’ll point out that this is a bad example. Parts of that fight already used interpolation at 24fps (the bullet-time sequence), and many other parts are in front of green screens and/or were likely not filmed at the same speed they’re shown. (Fight scenes are often sped up in post-production.) When you’re doing interpolation on top of a bunch of other effects that were generated for 24fps, it does look kind of bad, but it’s like making a copy of a copy. A fair comparison would be a wide-angle outdoor panning shot without effects or something. I don’t doubt that the interpolation algorithms are imperfect. But I’ll be one of the (apparent) weirdos who’ll say I’ve never (rarely?) been bothered by having the interpolation setting on my monitor on.

Would the first Hobbit movie be a better example to view? I think Peter Jackson filmed that at a higher frame rate, correct?

All early LCDs had this bullshit; I can’t remember if first-gen Panny plasmas had it.

Yes, it was shot at 48 frames per second. Can’t say that it worked for me, but I know others found it less problematic. For me it made everything look even more artificial than it already was.

What about watching The Matrix on an auto-smoothing TV? Wouldn’t that provide the same horrifying result?

I think this is more an HDTV issue than a PC monitor issue.

Right, I should have been more precise. I’ve got a 43" Sony 4k HDR Bravia TV with all the interpolation options. That’s where I don’t really notice the problems. (Huh, I just noticed via a post at NeoGaf that HDMI ports 1 & 4 on my set only have 4k HDR at 30 Hz. That’s why my Roku yells at me.)

I wonder if Sony’s version of the technology isn’t just smarter about not using it when it can detect it isn’t needed. That’d be handy.

Not on my parents’ Sony Bravia 4K TV. They had it on, and the effect was completely obvious. I immediately went in and turned their “Tru-motion” or whatever it was called off. Oh, and I noticed it’s usually set on a per-input basis, so be aware if you have multiple devices plugged into the TV.

This would be essentially a non-issue if it were set to off by default on all these TVs instead of on.

For sure. I’ve seen two sides - people who hate it, and people who don’t notice it. Why leave a setting on when the best-case scenario is that it goes unnoticed? I don’t think I know anyone who has said “oh man, I love that effect!”

Uh, yes. That’s why the Matrix clips are being bandied about, remember? Because they illustrate what motion interpolation of 24p content looks like. I thought everyone understood that.


Considering this thread is about frame interpolation, and is NOT about native high-framerate content… what do you think?

Eehhhh…

I think the most legitimate criticism of interpolation is that the algorithms aren’t perfect, so you get all kinds of weird artifacting. I notice it on hands and (in video games) around HUD elements.
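
To make that concrete, here’s a toy numpy sketch of where the artifacting comes from (illustrative only, nothing to do with any particular TV’s implementation). The cheap way to make an in-between frame - blending the neighbors - ghosts anything that moves, so real sets estimate motion vectors and shift pixels along them, and the weirdness shows up wherever that estimate guesses wrong:

```python
import numpy as np

# Toy 1-D "frames": a 3-pixel-wide bright object moving right.
frame_a = np.zeros(16); frame_a[2:5] = 1.0   # object at pixels 2-4
frame_b = np.zeros(16); frame_b[8:11] = 1.0  # object at pixels 8-10

# Naive interpolation: average the two frames. The object doesn't
# show up at the midpoint; it ghosts half-bright in BOTH positions.
blended = 0.5 * (frame_a + frame_b)

# Motion-compensated interpolation: estimate the displacement
# (+6 pixels here) and shift the object halfway (+3 pixels).
motion_comp = np.roll(frame_a, 3)            # object cleanly at 5-7

print(blended)      # 0.5s at pixels 2-4 and 8-10 -> double image
print(motion_comp)  # 1.0s at pixels 5-7 -> plausible in-between frame
```

Hands and HUD overlays are exactly the cases where the motion estimate breaks down: moving and static content overlap, the shifted pixels land in the wrong place, and you get that smeary halo.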

But that’s not what people in this thread are talking about. They’re talking about the “soap opera effect” and how things look “cheap” and “weird.” Those criticisms aren’t going to go away when (if) we start moving to native high-framerate movies.

Uh, I agree - that’s why I asked Matt about it when he pointed it out as a bad example?