Tom Cruise tells fans to just say no...to video interpolation

If Tom Cruise is involved, then it must be some heinous plot instigated by North Korea and Iran. Save us Tom Cruise!

Ya, I know that film cameras do… but that’d only be older movies, right? Like, stuff prior to 2010 or so?

Nope. If they’re recording for display on the silver screen, they still film at 24fps to this day, with a couple of exceptions, e.g. The Hobbit movies, which were filmed at 48fps.

Modern TV cameras typically film at 30 or 60Hz in the U.S. (and anywhere that historically used NTSC and 60Hz electrical systems) and at 25 or 50Hz in the UK (and anywhere that historically used PAL and 50Hz electrical systems).

Well, so here’s where we fall into the other discussion that Zylon was so careful to make sure we weren’t accidentally confusing ourselves with earlier (gee, he’s a real helpful guy :P).

Because most movies have been filmed at 24fps on, well, film, for the majority of the medium’s existence and for virtually all of its more notable fare, it’s been said (and maybe even proven; I’m a lazy Googler today) that people tend to associate that framerate, with all its consequent advantages and disadvantages, with how good movies are “supposed” to look.

Whereas with the so-called soap opera effect, people first got used to the kind of slick, uncanny-valley movement inherent to higher-framerate video through low-budget TV offerings, especially soap operas, because, for reasons that I’m once more too lazy to Google, that kind of video was what was most available and affordable for those productions as they came into their own over the last few decades.

I think a lot of people try to wrap way more meaning and substance around those semi-conscious associations than there really is; I’ve heard lots of “film buff” types tell me how obviously the lower frame rate of movies lends them an air of dramatic, epic weightiness that flightier higher-FPS video would lose, for instance.

I suspect it’s one of those things that, if everyone and their mother did start filming and playing back Hollywood blockbusters at 60 or 120 or 144Hz or whatever for 20 years, people would just get over it. But for now, there’s a ton of inertia and tradition associated with it.

I come down on the side of “We live in the future, so I want you to give me the highest resolutions and highest framerates you can. Make my eyes bleed with it.”

I come down on the other side. Films just look right. High framerate stuff just looks wrong and cheap somehow. I know I’ll probably “get over it” if I don’t have a choice, but I’ll enjoy movies less while I’m adapting, so I’d rather not.

The discussion is a little confusing. Soap operas looked the way they did because they used flat lighting, static cameras, closeups, small sets, and lower quality videotape. These were all concessions to budget, mostly so they could set up cameras and lighting once for multiple shots. It didn’t have to do with frame rate. But high frame rate film tends to produce images where the foreground stands out sharply from the background (which is how things actually look in real life–when your eye focuses on something close to you, the background is out of focus), which contrasts with the more blended, consistent look you’ve gotten used to in film. The sharp foregrounds bear a resemblance to how soaps looked with their flat lighting, so high frame-rate video is subject to the “soap opera effect.” And the interpolation algorithms seem to exacerbate the problem by making camera subjects look plastic, with hard outlines against the background.

Don’t forget all those old silent movies where the action seems to happen too fast because they’re played back at the wrong speeds.

Also a good video:

Well, the noobs also think watching some jerk in a small box play a video game while yelling out curses is high art, so screw the noobs’ opinion.

That video is awesome. Thanks!

Wait, what?
That makes no sense.
If you’re filming with a digital camera, you can’t film at 24fps, right? Because the digital camera isn’t going to have motion blur, so you’re just going to get choppy 24fps footage, aren’t you?

Or are you saying that they still use analog film? Cause I didn’t think that was true these days either. As far back as 2009, I think Avatar was filmed entirely with digital cameras.

Or do digital cameras also create some blur effect? I guess I never thought of them doing that, but I guess they do?

I still don’t really understand why anyone would want a blur effect that just tricks your brain, rather than just feeding a higher fidelity signal to your brain.

See, this doesn’t make sense to me.

I can certainly understand how interpolating stuff from different framerates would look messed up, because it actually is messed up.

But if you look at stuff that is a suitably high enough framerate, it’s just gonna look like the real world.

I guess maybe some folks like the fact that old movies didn’t look that real?

12fps is the bare minimum to create the illusion of movement. It would obviously look terrible on pans though. “Motion blur” in terms of movies is a function of the display tech, not the recording tech.

With analog film, blur is created as a result of stuff moving in front of the camera during the exposure time of each frame, isn’t it? It has to do with the way that light affects the chemicals in the film over time.

Digital cameras use the same apertures and shutters that analog cameras do. Just the recording media is different: CCDs instead of photographic film. They have motion blur due to shutter speed just like analog cameras do. You can see this with your phone in low light or if you can change the shutter speed manually.
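To put a number on it: film people usually describe the per-frame exposure as a shutter angle, and the classic 180° shutter means each frame is exposed for half the frame interval, digital or analog. Here’s a quick back-of-the-envelope sketch (just the standard convention, nothing specific to any particular camera; the function name is mine):

```python
def exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Exposure time per frame for a given frame rate and shutter angle.

    At the classic 180-degree shutter, each frame is exposed for half
    of the frame interval (the usual film-look convention).
    """
    return (shutter_angle_deg / 360.0) / fps

for fps in (24, 30, 48, 60):
    t = exposure_time(fps)
    print(f"{fps:>3} fps @ 180 deg shutter -> exposure of about 1/{round(1 / t)} s")
```

So at 24fps you get roughly 1/48s of blur baked into every frame, and at 60fps it drops to about 1/120s, which is part of why higher frame rates look so much crisper in motion.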

Ya, I guess it would… I wasn’t really thinking about it that way, since there’s no chemicals getting exposed to light, but I guess the light sensor still does something similar?

Do the light sensors in digital cameras have some upper limit in terms of the maximum framerate they could support? I would think that they must, as a result of some sort of response time built into the hardware.

Here’s some more info. Digital is still at 24fps.

As with anything to do with movies, decisions quite often come down to cost. Number one, the majority of digital theaters can’t currently project 60fps (big projectors are crazy expensive to replace), and for compatibility any movie released internationally has to support a 24fps film and digital release. A movie also eventually needs to be able to play on TV (North America/Japan at 30fps, most of the rest of the world at 25fps). 24 is awesome: it can be played natively at 24 via film or digital and then easily converted to 25 and 30. No frames are wasted in any conversions.
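To make the “easily converted” part concrete, the two standard tricks are 3:2 pulldown for 24→30 (alternate film frames get held for three video fields, then two) and, for 24→25, usually just running the whole film about 4% fast. A rough sketch of the idea, not a real telecine implementation (the function name is mine):

```python
from itertools import cycle

def pulldown_32(film_frames):
    """3:2 pulldown: spread 4 film frames over 10 video fields
    (i.e. 5 interlaced 30fps frames) by holding film frames for
    3 fields and 2 fields alternately."""
    fields = []
    for frame, hold in zip(film_frames, cycle([3, 2])):
        fields.extend([frame] * hold)
    # pair the fields up into interlaced video frames
    return [tuple(fields[i:i + 2]) for i in range(0, len(fields) - 1, 2)]

film = ["A", "B", "C", "D"]      # 1/6 of a second of 24fps film
print(pulldown_32(film))         # [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]

# 24 -> 25 (PAL) is normally just a speed-up: same frames, faster clock.
print(f"PAL speed-up: {25 / 24 - 1:.1%}")
```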

And then there are the special effects. Let’s say you have a 5 second shot of a dragon flying through an exploding canyon. It is a very, very complex shot. At 24fps that is 120 frames to render; at 60fps it is 300. Every frame costs money, so every second now has 150% more frames at 60. The same applies to cars exploding, green screen compositing, rotoscoping, removing blemishes, etc.
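Just to put numbers on that (toy arithmetic, obviously the actual per-frame render cost varies wildly):

```python
def frames_to_render(seconds: float, fps: int) -> int:
    return round(seconds * fps)

f24 = frames_to_render(5, 24)    # 120 frames
f60 = frames_to_render(5, 60)    # 300 frames
print(f"{f60} vs {f24} frames: {(f60 - f24) / f24:.0%} more to render at 60fps")
```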

But hold it: because film and most digital projectors can’t project at 60, the producers would not only need to shoot and edit 150% more frames but also generate two versions of the movie, one at 60 and one at 24. So it gets very complex and expensive very quickly. Another problem is that if a movie is shot at 60fps and then converted to 24, 25, and 30fps for distribution, those extra frames must get dropped somehow. No trouble going from 60 to 30: you can either drop every other frame or blend every two together. But going from 60 to 24 or 25 means dropping or blending frames unevenly across each second, which introduces motion artifacts and blurs the image.
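Here’s a quick way to see why 60→30 is clean but 60→24 and 60→25 aren’t: map each output frame to the nearest source frame and look at the spacing. (A naive frame-drop sketch; real converters blend or interpolate instead, but the uneven cadence is the same.)

```python
def decimate(src_fps: int, dst_fps: int, seconds: int = 1):
    """Naive frame-drop conversion: for each output frame, pick the
    nearest source frame, then look at how evenly the picks are spaced."""
    picks = [round(i * src_fps / dst_fps) for i in range(dst_fps * seconds)]
    gaps = sorted(set(b - a for a, b in zip(picks, picks[1:])))
    return gaps

for dst in (30, 25, 24):
    gaps = decimate(60, dst)
    verdict = "even cadence" if len(gaps) == 1 else "uneven cadence, visible judder"
    print(f"60 -> {dst}fps: source-frame gaps {gaps} ({verdict})")
```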

And then there are the tests: most of the audience tests at 48fps and above that have been done don’t indicate that people notice enough of a positive difference to be willing to pay extra for it.

So in a nutshell, it would cost way more, be a huge amount of hassle, make non-60fps movie projection and broadcast way worse, and in the mainstream market there would be no additional revenue streams to compensate for the additional cost.

For specialized theaters, though, any high frame rate could be used. High frame rates are great for experiential things like being in a luge, a race car, going down a mountain, etc., where a POV shot can look a lot more real. So in the near future, high frame rate productions will mostly be limited to specialty installations at fairs or science centres.

You’ll get a blur on a given frame, but it has nothing to do with how “choppy” it looks. That has to do with persistence of vision and the response time (or equivalent for non-digital) of the display.

Ah, but it totally does have to do with how choppy it looks.

The blur is the only reason it doesn’t look choppy. Because your brain interprets that blur and kind of “fills in” the missing frames.

If you run purely digital content, like a video game, at 24fps with no blur, it looks choppy for exactly that reason. 24fps is well below the rate at which the eye stops noticing individual frames.
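For what it’s worth, that’s also how games fake it when they bother: render several sub-frame samples per displayed frame and average them, which puts camera-style blur back into footage that never had a shutter. A minimal numpy sketch of the idea, with a toy render(t) standing in for a real renderer (all the names and numbers here are made up for illustration):

```python
import numpy as np

def render(t: float) -> np.ndarray:
    """Stand-in for a real renderer: a bright dot sliding along a
    1-D scanline at 240 pixels per second (a fast pan)."""
    line = np.zeros(32)
    line[int(t * 240) % 32] = 1.0
    return line

def blurred_frame(t0: float, exposure: float, samples: int = 8) -> np.ndarray:
    """Average several sub-frame samples across the exposure window,
    which approximates what a camera shutter does for free."""
    times = t0 + exposure * np.arange(samples) / samples
    return np.mean([render(t) for t in times], axis=0)

# One 24fps frame with a 180-degree-shutter-style exposure (1/48 s):
print(np.round(blurred_frame(t0=0.0, exposure=1 / 48), 2)[:8])
# -> [0.25 0.25 0.12 0.25 0.12 0.   0.   0.  ]  (the dot is smeared, not a single pixel)
```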

Regardless, I think the big thing I didn’t consider was that digital cameras also have a blur effect, and they are still filming at a fairly low framerate (because of reasons?.. I’m still not sure why anyone would intentionally record at low framerates, given we can do higher ones now, and you can create a more realistic film now).

You’re right. But people have done tests, and most think it looks like crap. Studios and theaters don’t want to invest in something the audience thinks looks worse. That’s the #1 reason.

Yep. Digital light sensors (like CMOS sensors and CCDs) just measure light exposure. There’s typically a color filter over the sensor array in a digital camera to determine the color for each pixel.
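That color filter is usually a Bayer mosaic: each photosite only sees red, green, or blue, and full color is interpolated (demosaiced) from its neighbors afterwards. A tiny sketch of what the raw sensor data looks like, assuming the common RGGB layout (the function is just illustrative):

```python
import numpy as np

def bayer_mosaic(rgb: np.ndarray) -> np.ndarray:
    """Keep only the one channel each photosite actually sees,
    assuming an RGGB Bayer layout: R at (even, even), G at
    (even, odd) and (odd, even), B at (odd, odd)."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w), dtype=rgb.dtype)
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red photosites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green photosites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green photosites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue photosites
    return raw

# A 4x4 scene that is pure red: only a quarter of the photosites respond.
scene = np.zeros((4, 4, 3))
scene[..., 0] = 1.0
print(bayer_mosaic(scene))
```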

The sensors are faster than the electronics needed to read them, i.e. the bottleneck is shuttling the pixel information off to memory, not the speed of the sensors. But there are extreme slo-mo cameras with frame rates in the tens of thousands of fps:

To be fair, the only movie I know of that was filmed at a high frame rate was Peter Jackson’s first Hobbit movie, and I didn’t watch the high frame rate (48fps) version of that. So it’s just an assumption on my part that 48fps will look “wrong” compared to 24fps in a movie theater. I don’t actually know because I haven’t seen it. The only thing I have seen is movies on people’s TV sets that use interpolation to get high frame rates. So you might be right. Maybe the higher frame rate stuff won’t look weird.