change it back
April 30, 2019 6:21 PM

 
Thanks, I hate it!
posted by Homo neanderthalensis at 6:26 PM on April 30, 2019 [33 favorites]


Post title is true. It's impressive to watch, but it just doesn't look right.
posted by Lukenlogs at 6:26 PM on April 30, 2019 [1 favorite]


I wonder what method they used to interpolate it. Between stuff like this and neural nets that process and upscale low-rez textures and prerendered visuals in old games, the future is going to be weird.
posted by Mr.Encyclopedia at 6:35 PM on April 30, 2019 [8 favorites]


For the most part, it seems fine to me. From 0:02 to 0:03 Tom's back does some gross undulating thing, but after that it just looks different, not bad.
posted by Bugbread at 6:37 PM on April 30, 2019 [1 favorite]


This Spongebob clip is a lot clearer (and surreal-er)
posted by RobotVoodooPower at 6:41 PM on April 30, 2019 [4 favorites]


That's less unnerving to me than some of the live action interpolations I've seen. I stumbled across this ST:TNG clip without realizing it was 60 FPS and thought that I was watching a performance of a Star Trek play or something.
posted by Betelgeuse at 6:45 PM on April 30, 2019 [16 favorites]


Is there a side-by-side of this and the original anywhere? I don't notice anything unusual and I seem to be the only one!
posted by ragtag at 6:56 PM on April 30, 2019 [8 favorites]


Change it back to when Tom and Jerry were vile little bastards to each other. But leave out the racist bits.
posted by Abehammerb Lincoln at 6:56 PM on April 30, 2019 [1 favorite]


That Star Trek clip is crazy. I know a little about video, but I'm perplexed as to why even a still frame looks like it was shot on videotape. I can understand why it looks weird in motion, since we're used to seeing it as filmed at 24FPS, pulled down to 30FPS (or 29.97, or whatever it is), but I can't figure out how that changes the look of a still. Even the lighting looks different — more harsh.
posted by jonathanhughes at 7:00 PM on April 30, 2019 [2 favorites]


Gods, that’s horrible.
posted by Thorzdad at 7:00 PM on April 30, 2019


Another example: the opening bike scene from Akira in 60fps (and at the original framerate). It looks pretty good to me.
posted by A Thousand Baited Hooks at 7:03 PM on April 30, 2019 [5 favorites]


..., but I can't figure out how that changes the look of a still. Even the lighting looks different — more harsh.

I suspect the interpolation includes an overly hefty amount of edge sharpening and color/saturation boosting.
posted by Thorzdad at 7:04 PM on April 30, 2019 [5 favorites]


Frame rate is weird. We aren't exactly conscious of it once it gets above about 10 fps (if there is no flicker), but we can feel it.
The other thing we can feel with live action material is the shutter angle, which is the proportion of each frame the shutter was open for. It determines the amount of motion blur and we are very sensitive to that and how it changes the feel, even though we don't really perceive the individual frames.

Sometimes I'll be watching tennis on TV and they cut to the action replay hi-speed camera to get some casual shots of the players relaxing. It is very jarring because the shutter angle is different so there is no motion blur and everything feels jerky and fake even though the frame rate is normal and the action is normal speed.
posted by w0mbat at 7:05 PM on April 30, 2019 [12 favorites]
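
A quick sketch of the arithmetic behind the shutter-angle point above: exposure (blur) time per frame is just the shutter angle as a fraction of 360 degrees, divided by the frame rate. The numbers below are illustrative, not taken from any particular camera.

```python
def exposure_time(fps, shutter_angle_deg):
    """Exposure (motion blur) time per frame: the fraction of the frame
    interval during which the shutter is open."""
    return (shutter_angle_deg / 360.0) / fps

# Classic film look: 24 fps with a 180-degree shutter.
print(exposure_time(24, 180))   # 1/48 s ~= 0.021 s of blur in every frame

# A sports/replay camera: 60 fps with a narrow 45-degree shutter.
print(exposure_time(60, 45))    # 1/480 s ~= 0.002 s -- crisp, almost blur-free frames
```

The second case is the "jerky and fake" feel described above: each frame is razor sharp, so the eye gets no blur to carry the motion between frames, even though the frame rate and playback speed are normal.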


That's less unnerving to me than some of the live action interpolations I've seen. I stumbled across this ST:TNG clip without realizing it was 60 FPS and thought that I was watching a performance of a Star Trek play or something.

Gah! That is unnerving.
posted by mandolin conspiracy at 7:15 PM on April 30, 2019 [2 favorites]


I wish it went longer. I want to know what happens next with Tom, the bird, and the bowling ball.
posted by dfm500 at 7:18 PM on April 30, 2019 [12 favorites]


Most animation was made with extreme consciousness of the frame rate it was made for... the theatrical Tom and Jerry toons had "full animation", compared to the work Hanna and Barbera did to give their later TV cartoons "limited animation". The more extreme examples of "squash and stretch" that were not used in the clip shown would make 60fps much harder. I recommend the animation blog Tralfaz for interesting frame-by-frame analysis... and remember that's at 24fps.
posted by oneswellfoop at 7:26 PM on April 30, 2019 [8 favorites]


Has anyone here seen a film projected at high frame rate? The first Hobbit film or Billy Lynn's Long Halftime Walk?
posted by octothorpe at 7:29 PM on April 30, 2019


Yeah this is what I went through with every anime when I got my first new, modern TV last year (TCL 55R615), until I fiddled with the settings. Before that, it did occasionally look cool, but mostly it just looked wrong, and the smoothness would break and drop out and restart and look comb-y at random. Once I turned it off I don't miss it at all.
posted by glonous keming at 7:29 PM on April 30, 2019 [3 favorites]


A Thousand Baited Hooks: "Another example: the opening bike scene from Akira in 60fps (and at the original framerate). It looks pretty good to me."

Huh. Interpolation bugs me with live action stuff, it seems just fine in the Tom & Jerry clip, and I couldn't tell the difference at all between the Akira clips.
posted by Bugbread at 7:32 PM on April 30, 2019


Is there a side-by-side of this and the original anywhere? I don't notice anything unusual and I seem to be the only one!

Not the only one. Looks fine to me
posted by 922257033c4a0f3cecdbd819a46d626999d1af4a at 8:23 PM on April 30, 2019 [1 favorite]


It's like a bad Simpsons episode. It doesn't catch the thousands of variables that can happen between actual frames, and just takes the average of them. It smooths out what doesn't need smoothing. And makes it look like a Flash cartoon. DAMN KIDS AND YOUR NEWGROUNDS
posted by not_on_display at 8:29 PM on April 30, 2019 [1 favorite]


The Tom and Jerry has an unsettling sense of movement, like watching an old silent movie where things don't seem to be going at a consistent speed. I felt that way watching the Hobbit movie in the theater - I was expecting it to be this awesome upgrade but it just made it feel like an uncanny valley of motion.

It doesn't make logical sense to me that live action should look worse in any way than a normal movie FPS, or that I'm somehow conditioned to seeing slower FPS motion and I'm just "used to it". I feel like it might be a byproduct of the FPS exposing either the green screen fakeness or the fake lighting/extra lighting used in filming at that rate.
posted by lubujackson at 8:30 PM on April 30, 2019


I think it's in the same thread there, but just for emphasis: this explanation.
posted by wordless reply at 8:57 PM on April 30, 2019 [3 favorites]


Has anyone here seen a film projected at high frame rate?

I saw one of the Hobbit movies in 48FPS, and I agreed with many other critics that it looked wrong, too clear, "like a soap opera." The Tom and Jerry clip here doesn't bother me, though.
posted by Western Infidels at 9:04 PM on April 30, 2019 [2 favorites]


Here's the original. The train sequence starts at about 2:00.

What's strange to me is that the interpolated version looks so fuzzy. The original 24 fps is very carefully crafted to be sharp even when the motion is crazy. (Look especially at the bird's flight.)

(Also, I didn't notice any weirdness in the motion in the Star Trek clip, but I couldn't stop staring at Data's caked-on makeup in high-def.)
posted by zompist at 9:05 PM on April 30, 2019 [4 favorites]


Western Infidels: "I saw one of the Hobbit movies in 48FPS, and I agreed with many other critics that it looked wrong, too clear, "like a soap opera." The Tom and Jerry clip here doesn't bother me, though."

I think the difference between live action and cartoon is really important here.

When you up the frame rate, things look more like real life. That means that for some people (like me), instead of looking like a wizard and a hobbit are having a conversation in an old wooden house, it looks like an actor dressed up in a wizard costume is talking to an actor dressed up in a hobbit costume in a set painted to look like an old wooden house.

With a cartoon, though, no amount of that kind of tweaking is going to make it look like real life, because it was never real life in the first place (I'm strictly talking about cartoons that were drawn from the ground up to not appear photorealistic). So upping the frame rate just makes it look different, but it doesn't tear down the veil. Some people like the different look, some people dislike it, and some people are indifferent.
posted by Bugbread at 9:25 PM on April 30, 2019 [6 favorites]


The make-up thing reminds me of something that I noticed in photos of the TNG-era Star Trek cast: a lot of the heavy-makeup aliens, the Klingons and Cardassians and whatnot, have that same grainy look to their skin. It looks fine at the definition used in television, but still photographs (and, obviously, here) really make it pop out. It reminds me of predictions that some movie stars with imperfections that could be hidden with regular TV/movie makeup wouldn't survive the transition to hi-def, just as some silent actors didn't have the voices for talking pictures, and the likes of William Conrad, who was a fine Marshal Dillon on the radio Gunsmoke, got bumped in favor of the more photogenic James Arness.
posted by Halloween Jack at 9:27 PM on April 30, 2019 [3 favorites]


This is pretty much how I feel about HD tvs in general. It looks unnaturally sharp and is distracting. Just like I do not need every story to be illustrated in photo-realistic detail.
posted by ctmf at 9:37 PM on April 30, 2019 [3 favorites]


I've watched Tom and Jerry for at least two generations. Tom is the cat, right?
posted by Brian B. at 9:43 PM on April 30, 2019 [3 favorites]


In particular, it's interesting that 3D animated kids cartoons that looked...fine on our crappy old plasma TV, now look like bad early-00s low-poly video games on a 1080 LCD TV. Peter Rabbit is a prime example.
posted by Jimbob at 9:45 PM on April 30, 2019


zompist: "What's strange to me is that the interpolated version looks so fuzzy."

ctmf: "It looks unnaturally sharp and is distracting."
posted by Bugbread at 9:51 PM on April 30, 2019 [1 favorite]


The top reply has a great explanation thread

@egoraptor
Interpolated frames are a pretty beautiful example of the importance of pacing and easing and anticipation in animation and I’d love to illustrate that in a way that applies to hand drawn animation

@egoraptor
Here’s a basic breakdown (from https://medium.com/motion-in-interaction/animation-principles-in-ui-design-understanding-easing-bea05243fe3): the two balls use the same frames but are timed out differently, because the artist had a hand in designing the movement on the green ball.

@egoraptor
This is called “easing,” and gives the ball a sense of varying speed in its movement. Frame interpolation automatically chooses the most efficient frames to create and therefore usually makes it look like the red ball, even if the drawn movement is already like the green ball.

@egoraptor
So in between each crafted frame, you have 3 or more “interpolated” frames that do not take the intended movement speed into account, so you end up with these sort of lifeless jerky frames linking already fluid frames.

@egoraptor
The additional problem is that, even at a low frame rate, your mind will fill in the gaps automatically in a way that understands the artist’s intent. Adding frames ruins your brain’s ability to do that. Think of Spiderverse where you forget about the limited frames 1 minute in.

@egoraptor
This is a basic breakdown, I think it’d be fun to show more detailed explanations in more cartoony examples, you also have to consider animation basics like arcs and anticipation which PCs mostly can’t interpolate because no matter how much detail there is, it’s still artist intent.
posted by Rainbo Vagrant at 10:09 PM on April 30, 2019 [6 favorites]
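
A rough illustration of the easing point in the tweets above, as a toy sketch (not any real interpolator): linear in-betweens space positions evenly (the "red ball"), while an eased timing curve bunches them near the start and end (the "green ball").

```python
def lerp(a, b, t):
    """Linear interpolation: evenly spaced in-betweens, constant speed."""
    return a + (b - a) * t

def ease_in_out(t):
    """A simple ease-in/ease-out curve (smoothstep): slow, fast, slow."""
    return t * t * (3 - 2 * t)

start, end, frames = 0.0, 100.0, 9   # a ball travelling 100 units

for i in range(frames):
    t = i / (frames - 1)
    linear = lerp(start, end, t)               # the "red ball"
    eased = lerp(start, end, ease_in_out(t))   # the "green ball"
    print(f"frame {i}: linear={linear:6.1f}  eased={eased:6.1f}")
```

Generating extra frames by the linear rule in between frames an artist has already eased is roughly what produces the "lifeless jerky frames linking already fluid frames" described in the tweets.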


Frame rate is weird. We aren't exactly conscious of it once it gets above about 10 fps (if there is no flicker), but we can feel it.
The other thing we can feel with live action material is the shutter angle, which is the proportion of each frame the shutter was open for. It determines the amount of motion blur and we are very sensitive to that and how it changes the feel, even though we don't really perceive the individual frames.

Sometimes I'll be watching tennis on TV and they cut to the action replay hi-speed camera to get some casual shots of the players relaxing. It is very jarring because the shutter angle is different so there is no motion blur and everything feels jerky and fake even though the frame rate is normal and the action is normal speed.


And correct me if I'm wrong, there is no camera setting that's just ... correct. There is no camera setting that just neutrally and accurately records reality with no interference or interpretation. A camera inherently interprets and mediates what it picks up, and it cannot do anything else.

like, at first I thought that you could do artsy stuff with a camera but the default was to take "just normal pictures" and slowly I've learned that there is no such thing. And it's fascinating to me. That's what finally convinced me that photography is art.
posted by Rainbo Vagrant at 10:15 PM on April 30, 2019 [10 favorites]


I wonder if those who are not bothered by or don't really notice the issue in this have become accustomed to watching TV over the last several years on TVs that have the interpolation turned on, such that everything they watch looks like this.

I think this may be part of it, yeah; I think there may also be a kind of "cilantro y/n?" whammy factor at play where different folks are at a basically individually varying level predisposed for whatever random reason to be more or less sensitive to this or that little wrinkle of optical processing (internal and external). Not so much an inability to perceive a given visual phenomenon if put flatly to it, but a tendency to have a visceral reaction to it.

I lose my mind a little bit on those odd occasions where I'm in a motel or hotel room or bar and the TV's set to the wrong aspect ratio because someone wanted SD content to fill up a widescreen TV. That happens less and less these days as TV content has become ubiquitously 16:9 but man that shit drove me craaaaazy for a good stretch there. I have a lot of friends it never much bothered. I don't think it's that I'm some visual supertaster, I think I just got conditioned at some point to really care about aspect ratio, or a bit of my visual processing hardware is tuned a little more on the "wtf?" side there.

I can absolutely see the weird smeary uncanniness in the Tom & Jerry clip, and likewise the TNG clip feels weird and off (though I agree there's probably also some unsharp mask or whatnot nonsense contributing to that), but I don't think it's that the effect is in an abstract sense obviously generally uncanny or weird; it doesn't surprise me that a lot of people would shrug and be like "eh, seems the same basically", because it is basically the same. It's just different in the details. But if you're tuned to notice those details it's a weird fucking slippery strangeness, like some mild hallucinatory aura over a familiar thing, like trying to track action after two too many drinks when the visual pipeline gets a little out of sync and other bits of brain hardware try to compensate.
posted by cortex at 10:20 PM on April 30, 2019 [6 favorites]


Yeah, to me this looks exactly like all other motion smoothing, ie absolutely gut-wrenchingly awful, in this particular case like someone recreated 20 seconds of T&J in flash.

But!
- Some friends swear they don't see motion smoothing at all
- My partner used to swear they didn't see it at all, but apparently I complained about it so much they suddenly did see it and now it drives them bananas too, like a shitty FedEx arrow
- but for live sport, nothing registers for me as wrong at all - maybe my brain in that case read the screen as a window into actual reality rather than something created? Brains are weird, man.
posted by ominous_paws at 10:36 PM on April 30, 2019


I watched the 60 fps Akira first, then went to the original, and for the first, I don't know, 10 seconds or so, the original looked so much jerkier, like I could see the individual animation cels. It was kind of a weird brain effect. I recommend giving it a shot to see if it works for you! (It doesn't last very long, as mentioned above, your brain calibrates to the frame rate pretty quickly.)
posted by Rev. Syung Myung Me at 11:07 PM on April 30, 2019 [1 favorite]


What's strange to me is that the interpolated version looks so fuzzy. The original 24 fps is very carefully crafted to be sharp even when the motion is crazy. (Look especially at the bird's flight.)

It's that careful crafting that's the reason that the interpolated version is fuzzy. As pointed out above, old school full animation used "smear" frames to make the animation look smooth - and it's those smear frames that cause the interpolated version to be so rough.
posted by NoxAeternum at 11:17 PM on April 30, 2019 [1 favorite]


ragtag: "Is there a side-by-side of this and the original anywhere? I don't notice anything unusual and I seem to be the only one!"

Looked good to me. The Star Trek one too. I guess I don't watch right. As far as I know, my TV does not have interpolation on. I do watch a lot, a real lot of sporting events on TV (and a lot of car shows and DIY shows).

Oh, I am pretty sure the bowling ball ends up on Tom's head and Jerry is saved. Just a hunch.
posted by AugustWest at 12:43 AM on May 1, 2019


I lose my mind a little bit on those odd occasions where I'm in a motel or hotel room or bar and the TV's set to the wrong aspect ratio because someone wanted SD content to fill up a widescreen TV. That happens less and less these days as TV content has become ubiquitously 16:9 but man that shit drove me craaaaazy for a good stretch there.

At one stage during the transition to widescreen, Sky was filming all the UK soccer games in 16:9 and advertising it as a feature, but presumably some of the other European broadcasters were still filming in 4:3, because when Sky was broadcasting a Champions League game from, say, Italy, it would look weird. And at first I thought the TV was on the wrong setting, but no, after spending some time fiddling with it and failing, I realised they had trimmed off the top and bottom of the frame and stretched the remainder slightly to fit widescreen.

The trimmed aspect wasn’t usually that obvious, although when a player went to take a throw-in, it would tend to cut off the top of their head. But once you noticed that the ball wasn’t round, it was just infuriating. They had sabotaged their own broadcast so that it was literally impossible to watch it correctly, because the marketing was more important than the actual quality.
posted by Bloxworth Snout at 2:13 AM on May 1, 2019 [3 favorites]
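
To put rough numbers on that broadcast trick (a back-of-the-envelope sketch, not Sky's actual processing): filling 16:9 from a 4:3 source by cropping alone costs a quarter of the picture height, so a broadcaster splitting the difference crops some and stretches the rest, which is exactly what makes the ball non-round.

```python
SOURCE_AR = 4 / 3    # 4:3 source feed
TARGET_AR = 16 / 9   # widescreen output

# Pure crop (no distortion): fraction of the height that gets cut off.
print(f"pure crop loses {1 - SOURCE_AR / TARGET_AR:.0%} of the height")      # 25%

# Pure stretch (no cropping): how much wider everything gets drawn.
print(f"pure stretch makes everything {TARGET_AR / SOURCE_AR:.0%} as wide")  # 133%

# A compromise: crop ~13% of the height, then stretch the remainder to fit.
partial_crop = 0.13
residual_stretch = TARGET_AR / (SOURCE_AR / (1 - partial_crop))
print(f"crop {partial_crop:.0%}, then stretch to {residual_stretch:.0%} width")  # ~116%
```

Even the compromise leaves a roughly 16% horizontal stretch, easily enough to make a round ball visibly oval.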


Between stuff like this and neural nets that process and upscale low-rez textures and prerendered visuals in old games the future is going to be weird.

There's a neat video renderer called MadVR that uses a neural net upscaling technique called NGU, and seems like a magic "enhance" button for how well it works. There's a graphic on their website (hover your cursor over the different text labels under the image) that demonstrates the difference it makes.

I'm fairly certain that whatever video codec comes next is going to have some neural net trickery added in by default to make the video look higher resolution than it actually is...
posted by Eleven at 2:36 AM on May 1, 2019 [1 favorite]


So I saw the Hobbit in 48fps and it was GLORIOUS! That said, I have had interpolation turned on (intentionally) on my TV for years, and I consume a decent amount of video games. However, I should also mention that the first time I saw interpolated frames on a TV my brain immediately broke and thought it was a weird live-for-tv episode of Star Trek TNG that I was watching.

My current position on this is that, unsurprisingly after something like a century of film at 24fps and 3/4 of that with TV at 30fps, with only special exceptions for live sports and soap operas, we’ve become highly attuned to these framerates and have strong associations with the type of content.

But give someone the opportunity to get accustomed to higher framerate content, and they will not want to go back. Most commercial video looks jerky and crap to me now if it's not on my TV, and it broke my heart when Peter Jackson abandoned the high frame rate cinema thing for his next movies...
posted by Nutri-Matic Drinks Synthesizer at 3:27 AM on May 1, 2019 [3 favorites]


Does Twitter auto-downsample video if you're on a low-bandwidth internet connection? Maybe I'm not even looking at a 60FPS video?
posted by ragtag at 4:52 AM on May 1, 2019 [1 favorite]


Are sports broadcast at a higher frame rate so that would be why they don’t look as wrong with the interpolation on?

Broadcast TV can't change the frame rate. But the cameras can use a faster shutter speed, which makes slow-motion replay better but also has the side effect of assisting interpolation since the frames are crisper with less motion blur.

A lot of broadcast networks in the US also use a lower compression method on sports since they get a lot more visual attention than a news broadcast or sitcom. In my town the Fox network actually uses the entire 18 mbit HDTV stream and doesn't waste bandwidth on side channels so sports look pretty amazing over an antenna when compared to a compressed cable/sat simulcast. Lower compression means fewer artifacts, which also helps interpolation.
posted by JoeZydeco at 5:06 AM on May 1, 2019 [1 favorite]
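
A small sketch of why the faster shutter matters for interpolation (illustrative numbers, not any broadcaster's actual settings): the smear an object leaves across one frame is just its speed multiplied by the exposure time, so a short shutter means crisp edges the interpolator can match between frames.

```python
def blur_length(speed_px_per_s, shutter_s):
    """Length in pixels of the motion-blur streak an object leaves in one frame."""
    return speed_px_per_s * shutter_s

ball_speed = 3000  # a fast ball crossing the frame, in pixels per second

# 60 fps with the shutter open for the whole frame interval (1/60 s):
print(blur_length(ball_speed, 1 / 60))    # 50 px smear per frame

# 60 fps with a 1/1000 s shutter, typical for sports and replay cameras:
print(blur_length(ball_speed, 1 / 1000))  # 3 px smear per frame
```

Crisper frames give both the slow-motion replay and any interpolation step cleaner edges to work with, which is consistent with the point above.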


This is one of things I absolutely hate about going over to someone else's home and accidentally watching something on their "all-default-settings-on" TV... (Well, that and the fact that they are all still on regular cable with commercials every couple minutes).

Everything looks wrong - automatic motion interpolation is crap - you know it's bad when even Hollywood directors band together to drum up publicity about the issue.
posted by jkaczor at 5:28 AM on May 1, 2019 [3 favorites]


I don’t know why, but my first reaction to anything in 60fps is that it looks “cheap,” and I’m absolutely unable to take it seriously. It looks like a soap opera or a locally produced children’s tv show. Yeeech. Why would anybody do that to their entertainment?

The Tom and Jerry thing is slightly different, I think. 60 FPS makes things look more “lifelike” in a very uncanny valley way, and Tom and Jerry is already pretty disturbing. The two together is just one big NOPE for me.

Side note: how/why did any of us watch Tom and Jerry as children? Or, better question, whose idea was it to show that to children? And how did we not all wind up having constant nightmares every night? That murdery look on Tom’s face... yuck!
posted by panama joe at 5:37 AM on May 1, 2019 [1 favorite]


When I got my most recent TV - and it obviously defaulted to interpolation on - I was watching the classic Powell & Pressburger film A Canterbury Tale (1944) and realised it looked like it was shot on video last week (but in black and white). It made me feel so uncomfortable that, until I was able to drill down through the many menus necessary to turn it off, I was afraid I'd never be able to watch anything on the television again.
posted by Grangousier at 5:39 AM on May 1, 2019 [3 favorites]


Oh god yes, one real measurement of how close a friend is is now whether or not I feel comfortable immediately digging into their TV menu and turning motion smoothing off, while they look on with bemused indulgence.
posted by ominous_paws at 5:45 AM on May 1, 2019 [2 favorites]


I don’t see or feel anything different on the 60fps, but I’m watching it on my phone. Would it be more noticeable on a larger screen?
I watched a bunch of football in the 80’s (on CRTs) but not much since then, and HDTV football looks very weird to me. Maybe it’s just what you’re used to.
posted by MtDewd at 5:47 AM on May 1, 2019


The biggest reason opinions differ on the usefulness of frame interpolation is that implementations of frame interpolation vary widely between models. Some algorithms are intentionally designed to make motion appear unnaturally smooth. Others are less aggressive in their operation, attempting to mimic the look of the strobing light bulb that is a CRT or plasma screen. Additionally, other algorithmic processing can have a dramatic effect on how a given interpolation scheme ends up looking to the eye.
posted by wierdo at 6:05 AM on May 1, 2019 [1 favorite]
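
For a sense of how much implementations can differ, here is the crudest interpolation scheme of all, simple frame blending, sketched with NumPy. Real TVs use motion estimation and compensation rather than a plain cross-fade, but the contrast shows why two sets can produce such different-looking "smooth" motion.

```python
import numpy as np

def blend_interpolate(frame_a, frame_b, n_new):
    """Naive interpolation: cross-fade n_new frames between two originals.
    Moving objects ghost (appear twice, fading) rather than actually move."""
    new_frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)
        new_frames.append((1 - t) * frame_a + t * frame_b)
    return new_frames

# Toy example: a bright "object" jumps from column 2 to column 7 between frames.
a = np.zeros(10); a[2] = 255
b = np.zeros(10); b[7] = 255

for f in blend_interpolate(a, b, 3):
    print(f.astype(int))
# Each in-between frame contains two faded copies of the object instead of one
# object partway along its path; a motion-compensated interpolator would try to
# estimate the movement and shift pixels along it, with its own artifacts.
```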


> Does Twitter auto-downsample video if you're on a low-bandwidth internet connection? Maybe I'm not even looking at a 60FPS video?

Twitter generates downsampled versions of videos but will try to serve the highest quality it estimates your connection can handle.

Network providers (particularly the mobile telcos and American cable operators) are also known to cache videos from Youtube, Twitter, Facebook, Netflix etc., and deliver those in recompressed forms.

I think this might be why some people aren't noticing a difference, in addition to some of the thoughts cortex had about it.
posted by ardgedee at 6:12 AM on May 1, 2019


When you up the frame rate, things look more like real life. That means that for some people (like me), instead of looking like a wizard and a hobbit are having a conversation in an old wooden house, it looks like an actor dressed up in a wizard costume is talking to an actor dressed up in a hobbit costume in a set painted to look like an old wooden house.

Thank you for articulating this so well, Bugbread! And, until this thread, I really thought I was the only one who noticed and/or was bothered by this. Higher frame rates/HD/whatever-is-going-on-in-tvs-these-days makes suspension of disbelief incredibly difficult for me for exactly the above reason.
posted by treepour at 6:37 AM on May 1, 2019 [2 favorites]


I always feel vaguely amused at the fact that among gamers the trend is in the other direction, where 144 frames per second is becoming more and more mainstream.

The reason why manufacturers of flat-screen TVs have felt the need to add interpolation is to eliminate motion judder. And motion judder is a thing because they don't work like film cinema used to - the film "frame" gets moved into position and a shutter opens and the image gets projected onto the screen, then the shutter closes while the film frame gets advanced. **

This means that the screen is actually showing complete darkness 50% of the time while the shutter is closed and the mechanism is advancing the film strip. This allows your brain to "interpolate" the motion smoothly on its own, in a post hoc manner. Because these moments of darkness are briefer than the flicker fusion threshold, you don't consciously perceive the blank frames, yet they help you interpolate the motion more smoothly. See also saccadic masking for other visual adaptations your brain uses in the absence of sufficient information.

This is opposed to regular LCD screens, which don't have these moments of blank, dark frames - you perceive a hard "step" from one frame to another. It was particularly an issue because LCD screens were supposed to be an "upgrade" from CRT screens... which, by virtue of their scanning electron beam and quickly decaying phosphors, also did have moments of darkness between "frames" and thus did not suffer from frame judder.

Enter pulse width modulation. Modern, high-end LCD screens can simulate old film projectors by "pulsing" the image on the screen in a similar fashion. So, for a monitor displaying anywhere from 24 to 144 frames per second, it can be adjusted to have a display time for each frame between 20% and 100% of the time allotted to it. The difference is astounding: instead of an image looking like it is being displayed on the screen in consecutive frames, it looks like the image is physically moving on the screen.

I agree that motion interpolation is crap, it's adding "fake" information, no different to a sharpening filter or saturation boosting filter on the image. It creates all kinds of visual artifacts on fast moving objects.

** In practice, to achieve even lower flicker, the more modern film systems bumped up to 48 flashes per second by displaying each frame twice, or 72 by displaying each frame three times - this is likely what most people have been watching in movie theatres all this time, and (IIRC) even digital projection systems have maintained similar projection methods.
posted by xdvesper at 6:52 AM on May 1, 2019 [8 favorites]
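
A small sketch of the timing being described (illustrative, not any particular display's spec): given a refresh rate and the fraction of each frame interval the image is actually lit, you can work out how long each frame is shown and how long the screen sits dark.

```python
def strobe_timing(frames_per_second, duty_cycle):
    """Per-frame lit/dark times for a strobed (low-persistence) display.
    duty_cycle is the fraction of each frame interval the image is visible."""
    interval = 1.0 / frames_per_second
    lit = interval * duty_cycle
    return lit, interval - lit

# Film projector with a two-blade shutter: 24 fps flashed twice = 48 flashes/s,
# image up roughly half the time.
print(strobe_timing(48, 0.5))    # ~(0.0104 s lit, 0.0104 s dark) per flash

# A 144 Hz monitor strobing at a 25% duty cycle.
print(strobe_timing(144, 0.25))  # ~(0.0017 s lit, 0.0052 s dark)

# A conventional sample-and-hold LCD: lit the whole time, no dark gap at all.
print(strobe_timing(60, 1.0))    # (0.0167 s lit, 0.0 s dark)
```

The dark gaps are what let the visual system fill in the motion itself; a sample-and-hold panel instead presents a hard step from one frame to the next.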


We've got an old (oooold) 1080i plasma, and when we visited my sister-in-law her 4k TV just looked wrong. Maybe it was the 60fps interpolation, but everything we watched just looked fake and unnatural. Like pod people doing their best guess at how humans should move.
posted by xedrik at 7:16 AM on May 1, 2019


Even perfect TV frame interpolation would feel bad because it's increasing the frame rate without proportionally reducing the motion blur on each frame. Effectively it is changing the perceived shutter angle (see my post above) and that is something we are sensitive to.
posted by w0mbat at 11:29 AM on May 1, 2019 [2 favorites]
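
Putting a number on that (a sketch that assumes the interpolator leaves each original frame's blur untouched): re-expressing the blur baked into 24 fps, 180-degree footage as a shutter angle at 60 fps gives more than a full frame's worth of blur, a combination no real camera produces.

```python
def effective_shutter_angle(orig_fps, orig_angle_deg, new_fps):
    """Shutter angle that the original per-frame blur corresponds to when the
    footage is presented at a higher frame rate (the blur itself is unchanged)."""
    blur_time = (orig_angle_deg / 360.0) / orig_fps   # seconds of blur per frame
    return blur_time * new_fps * 360.0

# 24 fps film shot with a 180-degree shutter, interpolated up to 60 fps:
print(effective_shutter_angle(24, 180, 60))   # 450.0 degrees
```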


instead of looking like a wizard and a hobbit are having a conversation in an old wooden house,

My opinion is that the look of the movie was so noticeable because the Hobbit movies were terrible in every other regard.
posted by The_Vegetables at 11:38 AM on May 1, 2019 [1 favorite]


We like our TV and movies to be 24fps simply because that's what we're used to. 24fps was set as the standard for film nearly a hundred years ago. High framerate looks like "video" simply because we're used to video having that framerate. I don't know how I feel about high framerate movies and TV... the thing that bugs me is the panning right and left; that seems really artificial somehow.
posted by zardoz at 3:17 PM on May 1, 2019 [1 favorite]


ominous_paws: "- but for live sport, nothing registers for me as wrong at all - maybe my brain in that case read the screen as a window into actual reality rather than something created? Brains are weird, man."

That's what I was getting at up above.
  • I can see the interpolation in live action TV (movies, shows, etc.) and I hate it.
  • I can see the interpolation in the T&J clip, and it doesn't bother me at all.
When interpolation makes things look like what they actually are -- "actors in costumes on a painted set" -- and you don't want them to look like what they actually are, it is horrible.
When interpolation makes things look like what they actually are -- "athletes in a real competition" -- and you do want them to look like what they actually are, then it's fine.
When interpolation doesn't even make things look like what they actually are -- "non-realistic cartoons" -- then it's neither here nor there, and is purely a matter of personal taste.

This is why the T&J interpolation doesn't bother me. I can see it, but it doesn't expose the inherent artifice of the cartoon, it just makes it smoother, which I am okay with.

I enjoyed Spider-Man: Into the Spider-Verse. If the frame rate had been much higher, I would have enjoyed it just as much. I also enjoyed Spider-Man: Homecoming. If the frame rate had been much higher, it would have interfered with my enjoyment.
posted by Bugbread at 5:24 PM on May 1, 2019


It is surprisingly difficult to find legitimate side-by-side comparisons of 24/30 and 60fps: almost all the videos are either video games or are comparing 30fps originals to 60fps interpolations.

Video games though are an interesting experiment, because a video game can adjust its framerate completely independently of everything else-- and in video games I think most people would agree that the visual difference between 30fps and 60fps is not that dramatic, and relatively few people would say, I think, that a 30fps game looks more 'cinematic' or 'realistic' than that same game running at 60fps.

As another point, I've sometimes noticed that 60fps streams of video games on twitch do have an uncanny quality to them, in a way that live games running at 60fps do not.

Taken together, these make me suspect that a significant source of the negative reaction is not the framerate itself, but rather production / encoding artifacts that become more pronounced at higher framerates.
posted by Pyry at 6:39 PM on May 1, 2019 [1 favorite]


There are times I really miss my old big bastard Trinitron monitor. Why? Because unlike most modern displays, old multisync CRT monitors had no native resolution or refresh rate. 24p, 60i, 60p, 72p, 120i, or anything in between would work just fine in any resolution your video output hardware or display had the bandwidth to handle. The lack of intelligence in the display itself made them much more versatile in that sense.

There was none of the confusion between recreated examples intended to convey concepts and what things actually looked like in real life that causes endless arguments on the Internet. The downside to that flexibility was that taking advantage of it required a ridiculously in-depth understanding of how analog video works. Configuring custom modes in X Windows was... tricky.
posted by wierdo at 8:55 PM on May 1, 2019


I'm 54 and from earliest memory shot-on-video and shot-on-film, as displayed on television, have been quite distinct to me. I inferred, though with minimal comprehension, that this was a film/video distinction, but only by association when the medium was known a priori. As it was for many, this was maximally evident to me in the contrast between the rebroadcast of cinema on television and soap operas. Thus, for myself and many people, because of production values (everything from lighting and photography to sets, costuming and makeup) video was strongly associated with "cheap" and "fake".

For those of you who are much younger, that association still exists, though attenuated, because of cultural inertia combined with the fact that technology influences technique which becomes style. Style has a great deal of semantic content.

We might agree that there is some sort of continuum between "realistic" and "unrealistic", while granting for the sake of the argument that we could, if we worked very hard at it, clearly and rigorously define those terms -- but, even so, as Rainbo Vagrant wrote above, "There is no camera setting that just neutrally and accurately records reality with no interference or interpretation." I'm not saying "realism" is meaningless in this discussion, just that it is less meaningful than we naively intuit and it's more useful to think about culturally embedded signals of what is and isn't verisimilitude as being not quite arbitrary, but extremely variable. This is basic stuff to many or most people with regard to, say, painting, but it's just as true in photography and cinema.

If we take a step back and see different technique not as better or worse, but that it is as semantically important as anything else, then what we're left with in this discussion is what we might call the integrity of any particular presentation of the work to the work. (That's contentious, too, but for the sake of argument we can probably temporarily agree about this.)

So the essential problem here is whether the presentation is true to the work. When you interpolate 12fps animation away from the frame rate and shutter angle the artist presumed would be its presentation, it shouldn't be a surprise that you don't merely cross your cultural signifier wiring, you very likely objectively damage the integrity of the work and get something that looks "wrong" because it is.

Everything about photography + cinema is culturally-mediated technical artistic expression and it's not helpful to try to say that any one thing is better or worse than another. The interaction between the technologies of photography and the mechanisms of our visual perception creates vast possibilities for varying visual experiences, many of which will develop strong cultural associations that will evolve over time.

A stage play has a great deal of verisimilitude relative to a film in some respects, and in other respects it's very much the opposite. We don't really have much difficulty accepting the low relative verisimilitude of a stage set; nor, conversely, is it the case that the fact that there are actual living human beings just feet from the audience causes an "uncanny valley" experience.

With that in mind, it might be easier to understand that there is no "correct" frame rate or shutter angle / motion blur or, for that matter, aspect ratio or field-of-view or depth-of-field or color. It's just not as meaningful as we think it is to attempt to evaluate "realism" -- there are just varying artistic choices within a particular cultural context.

In contrast, even if we're cautious enough to avoid saying a particular presentation is "wrong", I think it's self-evident that choices in presentation can be transformative...and not in a good way. It's much more defensible to argue that frame-interpolation from 24fps to 60fps of The Fellowship of the Ring is "bad" than it is to argue that Jackson's choice to shoot The Hobbit at 48fps is "bad".
posted by Ivan Fyodorovich at 6:09 AM on May 2, 2019 [2 favorites]


> in video games I think most people would agree that the visual difference between 30fps and 60fps is not that dramatic

*Lunges out of the way of the incoming stampede like a fainting goat.*
posted by lucidium at 12:06 PM on May 2, 2019 [2 favorites]


It wouldn't surprise me if portions of a number of the animations in this thread were animated on 2s originally, which would make the effect even more jarring.
posted by Aleyn at 12:59 PM on May 2, 2019


I’d be surprised if the Tom & Jerry cartoon was animated on 2’s. There’s an awful lot of frenetic movement, very smoothly animated. (Of course there are holds when e.g. Tom is baffled.) The short ("Kitty Foiled") was made in 1948, when animation was done pretty lavishly.

Various websites say that anime is usually animated on 2's, but Akira in particular was animated on 1's.
posted by zompist at 3:19 PM on May 2, 2019
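
For anyone unfamiliar with the jargon, a tiny sketch (toy data) of "on 1's" versus "on 2's" at 24 fps, and why it matters to an interpolator: on 2's, every other frame is a duplicate drawing, so the machine sees movement in only half of the frame pairs it compares.

```python
def shoot(drawings, hold):
    """Lay drawings out as a frame sequence: hold=1 is 'on 1's' (a new drawing
    every frame), hold=2 is 'on 2's' (each drawing held for two frames)."""
    return [d for d in drawings for _ in range(hold)]

drawings = ["A", "B", "C", "D"]
print(shoot(drawings, 1))  # ['A', 'B', 'C', 'D'] -- 24 drawings per second
print(shoot(drawings, 2))  # ['A', 'A', 'B', 'B', 'C', 'C', 'D', 'D'] -- 12 per second
```

Interpolating the "on 2's" sequence means half the frame pairs are identical, so the synthesized motion lurches (hold, jump, hold, jump), which is the "even more jarring" effect suggested above.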


zompist: "Various websites say that anime is usually animated on 2's, but Akira in particular was animated on 1's."

I would imagine it depends intensely on whether we're talking about TV or movies, right? Dragon Ball movies are probably animated on 1's, but Dragon Ball TV episodes are probably animated on 24's.
posted by Bugbread at 5:22 PM on May 2, 2019 [2 favorites]



