High Frame Rate movies
December 4, 2014 12:43 AM

HFR (High Frame Rate) movies: why do they look fake?
posted by Chocolate Pickle (69 comments total) 24 users marked this as a favorite
 
I don't understand how John Knoll's explanation has any bearing on a comparison between an HFR version and a standard rate version made from the same set of frames.
posted by edd at 1:14 AM on December 4, 2014 [4 favorites]


Yeah, I thought that sounded hand-wavy too. Honestly, and I say this as someone who works in film postproduction and daily gets up close and personal with frames of movies (some shot on film, most not, these days), it's a matter of habit. We're used to films looking one way and TV looking another way, and this ends up somewhere in the middle, so it requires you to get used to it. I think it looks better, generally speaking.

Although in some scenes there is more opportunity to see detail, because you get half the motion blur when elements in the frame or the camera itself move, so it's easier to pick stuff out. Normally, you can keep a moving object free from motion blur by moving your eyes along with it, so that the static background will get blurry, but the moving object will not. In film, this decision is largely made for you: either the camera moves with a moving object or it doesn't, and whatever moves in the frame gets blurry. In HFR, there are more and shorter exposures, so there's less inherent motion blur, and it's easier to pick what you want to focus on, which is why it feels more real. I can see some negatives with this, because choosing what to allow to get blurry and what not can be a tool in the cinematographer's toolbox.

However, we didn't originally get 24 fps because it was "painterly" or "a romantic illusion" or whatever. We got it because people thought 18 fps was a bit on the jerky side, so 24 was the minimum they could get away with and still not use a huge amount of film. Higher frame rates use more film, so they're more expensive, make your magazines last a shorter time (a 400 foot 35mm magazine is just over 4 minutes in 24 fps, so this mattered), and are generally more of a pain to work with.
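
(A quick sketch of that magazine math, assuming the standard 16 frames per foot of 4-perf 35mm; the function name is just for illustration:)

```python
# Runtime of a 35mm film magazine at a given frame rate.
# 4-perf 35mm runs at 16 frames per foot (the standard figure).
FRAMES_PER_FOOT = 16

def magazine_runtime_seconds(feet, fps):
    """Seconds of footage a magazine of the given length holds."""
    return feet * FRAMES_PER_FOOT / fps

print(magazine_runtime_seconds(400, 24) / 60)  # ~4.4 minutes at 24 fps
print(magazine_runtime_seconds(400, 48) / 60)  # ~2.2 minutes at 48 fps
```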

Digital made all of that easier (although HFR generates twice as much data, for larger productions that's not even a dent in the budget), and it's not surprising that filmmakers want to try it out.

I'm not sure everything will be HFR any time soon, but I'm sure we'll see a lot more of it. It'd be interesting to see how it works for a jumpy, jittery action movie, shot and edited in the style of the Bourne movies, for instance. Intuitively, it should be possible to express the same speed and energy, but make it all a lot easier to follow. (Although that might not always be desirable either).

(As an aside, I'm still very iffy on 3D, I'm not at all convinced it's going to stick this time around either. I think HFR and 4k, and possibly HDR in a few years, are much more likely to get adopted quickly and become more or less standard.)
posted by Joakim Ziegler at 2:00 AM on December 4, 2014 [45 favorites]


I presume the 24fps version isn't just every other frame of the 48fps version. Although that would make for an interesting comparison.
posted by inpHilltr8r at 2:17 AM on December 4, 2014


inpHilltr8r: That's a really good question - ideally, you'd film the scenes twice, once for each framerate, with the lighting set up appropriately for each. If The Hobbit didn't do this (which, given it would more than double the time taken to film all the scenes, wouldn't be surprising) then it might explain some of the odd look of the interior scenes.
posted by pharm at 2:33 AM on December 4, 2014


It's about how much visual information is being thrown at the screen to be reflected back into your eyeballs.

24fps movies aren't actually 24fps. They're 24x2fps. Each frame of a standard frame rate film is actually shown twice in a row. This is a legacy of physical film media: the film advances through the projector more slowly than the shutter spins to give each frame a separate image on the screen (otherwise all you'd see is a blur), so each image is flashed by the shutter twice before the film pulls down to the next frame. (This is a limitation created by the physical movement of the film stock -- moving it faster increases the chance it will tear under the stress of being wound through the projector.)

[Actually, many modern film projectors show each frame 3 times, making the effective frame rate 72fps. This is done to decrease the classic old film "blinking" effect.]

The human eye is a very interesting visual sensory organ. It can detect very VERY VERY subtle things. At 24fps (that is, 24x2fps), the darkness between projected images created by the shutter is brief enough that the conscious mind cannot see that there are bits of black happening 48 times a second, and the doubling of each image between those bits of black is still enough to give the persistence of vision a sense that the strung-together images are creating fluid movement. (Persistence of vision is believed to run at around 1/25th of a second, so the second projected image of the same frame with the shorter blackout time is thought to help overcome the flicker problem of older projection methods.)

With HFR movies, you don't get each image projected twice. Each image is projected once. It's the same number of shutter moves, but with every shutter move you get a new, in-between image, something which our brains, trained from years of exposure to fewer frames with traditional cinema, feel to be a bit unnatural when it comes to watching something on a screen. We've come to accept that, when we watch a movie, there are certain things which give us cues that it is not something we are watching happening live in front of us, but instead is something filmed and being presented. This includes film grain (which is actually visible in a standard film, but because each frame of the film has its own grain composition, and all the different frames are thrown into the eye so quickly, we don't consciously realize we are seeing it) and the shutter effect (which is very pronounced in older films, particularly ones which were meant to be hand-cranked, but which is still subliminally noticed even in modern 35mm 72fps projection, which is really 24x3fps). This subliminal effect, where the eye is shown the same image 3 times in a row with very minimal blackout between frames, is part of what makes watching a movie feel like "watching a movie".

The modern 48fps HFR frame rate gives the classically-trained movie watching brain a bit of a flummox. It's literally twice as much visual information, but it still contains the same number of shutter-flickers (that is, black moments) as traditional film presentation. Where there used to be two frames that were the same presented before, now there is a subtle (yet very noticeable, to the human eye, that marvel of evolutionary engineering) difference, and things are more fluid, more real, and more interpreted by the visual cortex into the realm of "this is a thing I am actually watching happen" as opposed to "this is a thing I am watching that was filmed".
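
(To make the flash arithmetic concrete, here's a toy calculation; just the numbers, not any particular projector's implementation:)

```python
# Effective flash rate = frames per second x flashes per frame.
# Traditional projection repeats each frame; HFR shows a new image every flash.
def flashes_per_second(fps, flashes_per_frame):
    return fps * flashes_per_frame

print(flashes_per_second(24, 2))  # 48 flashes/s, 24 unique images (double-shutter film)
print(flashes_per_second(24, 3))  # 72 flashes/s, 24 unique images (triple-shutter film)
print(flashes_per_second(48, 1))  # 48 flashes/s, 48 unique images (48fps HFR)
```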

So when they used the same "film" stock to create both regular frame rate and HFR versions of movies, what happened was, a brain that was used to watching basically all media at ~24fps (theater projection, or film on NTSC, which runs at 23.976fps, or on PAL at 25fps) suddenly had twice as much visual information to take in, because it was used to seeing things projected twice (or three times) in a row, with the shutter blanks (black spots) taking up a certain amount of visual information. It was trained to recognize the information it processed when viewing movies as "repeated frames with black space in-between". And it accepted that as "this is a picture that is moving" as opposed to "this is a thing that is happening in real life in front of me", because real life doesn't have a frame rate (beyond what the normal brain cycle processing rate is, which is about 60 cycles/second), and there are no black spaces in our normal visual processing.

People have been playing with different frame rates for movie projection for a long time. The ill-fated 1981 final film starring Natalie Wood, Brainscan, was originally supposed to film all the real life segments of the movie in traditional 24fps format, while all the sequences showing the actual "brain recordings" were going to be in the new Showscan 60fps format, giving the audience a sense of heightened reality for those segments which were supposedly (according to the plot) recorded from a person's actual mind while the rest of the movie played out in regular movie frame rate "reality".

High Frame Rate filming is a pretty interesting thing, overall. When it comes to things like the Peter Jackson Hobbit films, it isn't being used very intelligently or effectively. This is due to the legacy of older, lower frame rate film and the lack of experience dealing with the additional information the visual cortex is receiving. This leads to problems with makeup and lighting, because directors and cinematographers aren't yet skilled in how to shoot for the higher frame rates and how the visual cortex will interpret what is being presented.

HFR would be awesome for big screen presentations of things like Great Performances filmings of stage shows. Or if Wim Wenders had used HFR for his filming of Pina, that would have been amazing. But for effect-heavy, make-up heavy, fantasy-set heavy productions like the Hobbit movies, it's probably WAY too early in the development cycle (and viewership-training cycle) for it to be seen as anything other than garish and suspension-of-disbelief-breaking.

Because it looks TOO real. We're used to movies, even really great ones, having a veneer of "this is fake, this is a thing you are watching that was filmed". The colors, the lighting, the makeup, the flaws in the set... These are all things which, in 24fps, wash out a little, and our brain fills in the gaps, and we get a sense of reality because of what is missing.

At higher frame rates, our brains don't get to fill in the gaps as much, and we see something closer to reality, which is a bunch of people in a lighted, flawed set wearing heavy makeup.
posted by hippybear at 2:37 AM on December 4, 2014 [206 favorites]


hippybear: When it comes to things like the Peter Jackson Hobbit films, [HFR] isn't being used very intelligently or effectively.

This, really. I think a lot of that is because of the constraints imposed by having to make a 24fps version alongside and presumably not shooting every scene twice. I still think motion blur is likely an issue, as I speculated when I commented on that article (2 years ago! blimey!). I don't work in the industry, so I'm surprised that lighting is so widely considered a source of problems - I would have expected the cameras in use for The Hobbit to be able to cope with low light well enough to avoid most of the interior vs exterior issues mentioned. It will be interesting to see if things are any better in this year's Hobbit (if I can get past the "Orcs in the Hobbit" problem I have with it and actually go see it :)).

3D is a whole other rant but suffice it to say that I won't be going to see many more 3D movies unless I can arrange to have my head in a vice for the duration of the showing, which I suspect would degrade the experience enough to make it not worthwhile.
posted by merlynkline at 3:22 AM on December 4, 2014


The Hobbit films aren't as enjoyable as LotR for a lot of reasons; HFR is only one of them.

That I enjoy neither HFR nor 3D (mostly because the glasses chafe and it's all so overdone) I take to be a sign that I'm getting older—not that I fetishize film grain or anything.
posted by flippant at 3:27 AM on December 4, 2014 [1 favorite]


That is a terrible explanation. I'm sure the co-inventor of Photoshop did not say that. What probably happened is that Jackson overexposed the frames deliberately so it wouldn't look like ass in 3D, and this person's 2D viewing didn't compensate.
posted by Yowser at 3:31 AM on December 4, 2014 [1 favorite]


I think the main reason why HFR looks 'fake' is we associate 50 or 60 fps images with TV (actually fields* rather than frames per second) and 24fps images with cinema. Some people see video images as 'the present' and film images as 'the past'. The connotations are mostly historical - movies seem somehow classier than TV - and, for this reason, much digital TV is shot at 24 (or 25 in Europe) fps rather than 60 (or 50 in Europe). People just like the look of it better.

As well as the historical connotations, there are theories that the 24 frame rate puts the viewer in a semi-trance state (think of Brion Gysin's dream machine) or that the spaces between frames allow the viewer's imagination to intervene.

I can't say whether those theories are true or not, but I know for sure (as some who works in TV production) that most people prefer the look of the slower frame rate. Rather than being 'fake', I think HFR looks 'cheap'. But John Knoll's explanation I find particularly unconvincing.

*TV has traditionally been 50 or 60 interlaced 'fields', meaning half the picture (alternate odd and even lines) is transmitted 50 or 60 times a second - called 50i or 60i. This transmits the same information as 25 or 30 frames per second (25p or 30p) but doubles the flicker rate so flicker is imperceptible.
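
(A toy sketch of that field weaving, with strings standing in for scan lines, just to show the odd/even interleave; the function name is made up for illustration:)

```python
# Weave two interlaced fields (odd and even scan lines) into one frame.
# Each field carries half the vertical resolution; together, a full frame.
def weave(odd_field, even_field):
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# 50i/60i sends the fields alternately: 25p/30p worth of picture information,
# but at double the flicker rate.
print(weave(["line1", "line3"], ["line2", "line4"]))
```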
posted by rolo at 3:38 AM on December 4, 2014 [3 favorites]


(... Brainstorm)
posted by Auden at 3:40 AM on December 4, 2014 [3 favorites]


Am I really the only one who watched the HFR version and thought it looked great right off the bat?

Maybe my history of playing video games has accustomed me to seeing things in a decent, non-Edison-enforced framerate? That is, my history is of playing games on PC, where you can get decent framerates; I guess I can see how console gamers might not be used to framerates higher than 30.

Or maybe I just didn't decide subconsciously that low framerates made movies special and superior to TV?

Apparently I'm in the tiny minority here. But I do hope we keep pushing HFR until it gets past the unreason and we can shake people out of this stupid "it looks fake/cheap" thing. I don't think linking art/quality to flickery low framerates is really a great thing.
posted by sotonohito at 3:54 AM on December 4, 2014 [14 favorites]


…and in terms of interior lighting looking 'fake', this has nothing to do with HFR and everything to do with cinematic style. Any decent cinematographer can light naturalistically if needed, but naturalistic lighting is often very dull. How theatrical or stylised you light is an aesthetic choice.

And even before the era of HD digital, cinematographers were finding new lenses and filmstocks were producing images which looked too sharp and clean for their taste, so they would use diffusion filters or other techniques to soften the images. But there are no absolutes here - what looks real and what looks fake is mostly a matter of convention. I haven't seen The Hobbit, nor any hobbit for that matter, so I cannot comment on what looks real or unreal in that world, but if it's not convincing to the audience then that's a fail on the filmmakers' part.

It's also worth pointing out that although modern cameras can be more sensitive than the human eye, shooting by available light does not necessarily give you an image that looks more 'real' - it often just gives you an image that looks more like a documentary than a drama. To make something look 'real' requires a lot of artifice.
posted by rolo at 4:01 AM on December 4, 2014 [2 favorites]


I am unconvinced that this phenomenon is solely or even largely caused by framerate. Video games frequently run at framerates that vary, even within a single playing session, between 15 and 60 fps, and although this variation is noticeable, it does not result in the types of drastic secondary qualitative effects that are reported with The Hobbit. Likewise, many digital video recording devices (phone cameras, GoPros, etc.) provide a selection between 30fps and 60fps, and yet the qualitative difference between recordings in the two framerates is not especially dramatic.

(by the way, motion perception is not explained by the 'persistence of vision' [pdf])
posted by Pyry at 4:01 AM on December 4, 2014 [2 favorites]


Oh dear.

I still fret over the airbrushed faces which I can now plainly see on our large-format HDTV. Everyone looks like a mannequin and many formerly familiar faces have been nipped and tucked and sprayed until their owners look mummified.

Yes, it's great that we can now see the facial expressions on everyone in the football stadium during games. What grieves me is that with the scrutiny and demands of each new paradigm there is a tendency to play to it--we don't need good scripts anymore, heck, who will care when we have such amazing stuff going on with CGI? But now every wrinkle is visible, so head for the surgeon and bring on the industrial-strength airbrush.

I seem to spend more and more time watching older movies, not because I'm a "film snob" or any such thing--it's because I enjoy seeing real faces with real warts and wrinkles, performing beautifully written scripts that engage me and don't involve explosions.

Damnit, kids, get offa my lawn... and take your damn HFR stuff with you.
posted by kinnakeet at 4:27 AM on December 4, 2014 [3 favorites]


You know the way that UK programmes like Dr Who, Monty Python et al used to use film for exteriors and videotape for interiors?

Haven't seen HFR yet but it sounds like it would be as jarring as that now looks to us.
posted by GallonOfAlan at 5:24 AM on December 4, 2014 [1 favorite]


Color TV, technicolor, and Kodakchrome all had its detractors who found a purity and monumentalism in black and white.
God, I miss editors.
posted by ricochet biscuit at 5:30 AM on December 4, 2014 [17 favorites]


We saw the last Hobbit movie on our new big screen, and I'd been warned it looked kind of like video but it was more that it just looked... off. Like, not film, not video, not real. I suppose this will be another one of those things, like the cancerous spread of fakey CGI all over modern movies, that other people will soon accept as just the way things are now while I sit there not being able to deal.

(It didn't help that a bunch of the big scenes in that movie honestly looked like cutscenes from a middle-grade PS4 title. Hell, some of them looked like middle-grade PS3. WTF, Weta?)

But I've found some other things look weird in HD too. Stuff shot on film can have kind of a weird, real life/video look, instead of the "film" look I'm used to. To me video has always looked more like real life, and film has been sharper. It's hard to explain, but if you look at a show where they cut back and forth between video and film (Monty Python's Flying Circus comes to mind) there's this very obvious difference between the two formats. But now I'll watch a show that I presume is shot on film - like Supernatural - and in some shots I'll get that "video" feeling and in others it's like looking through a window into a real life scene. It happens with some old movies too.

HD is weird, and I like it but it also creeps me out. I don't want to see every little crinkle on Jensen Ackles' face.
posted by Ursula Hitler at 5:30 AM on December 4, 2014


I think that video tape looks old because it is old. If you watch old episodes of the Twilight Zone or even Barney Miller or some soap operas it's not that higher resolution is worse but that it seems dated.
posted by vapidave at 5:39 AM on December 4, 2014


Jackson is pretty slapdash with his make up anyway, says I. Gimli's face in the LOTR didn't stand up at 24fps.
posted by Trochanter at 6:05 AM on December 4, 2014


I play a fair number of PC games (and Japanese console games, which often shoot for 60 fps). And they've always looked weird to me too! I remember playing Devil May Cry and trying to figure out why everything looked somehow wrong. It's as though foreground and background aren't composited quite correctly, perhaps because of the motion blurring noted by Joakim Ziegler above.

After seeing the terrific 3D dance doc Pina, though, I'm wondering if it's ultimately going to be a matter of the right tool for the job. In Pina, it provided a terrific choreographer's view; the sort of exaggeration of z-axis planes is how a lot of directors see the stage. Where HFR and 3D really shine is sports, where there's a lot of very fast movement and physical depth is really important. So I wonder if we'll ultimately get a breakdown where 3D and HFR are terrific for sports, 2D 24p is for narrative film, and documentaries choose whichever seems right.
posted by ThatFuzzyBastard at 6:12 AM on December 4, 2014 [1 favorite]


I was skeptical of the article until the end, when it started talking about vinyl vs. CD; then I knew the author didn't know what he was talking about. Light and audio recording are not comparable for the purpose of this analysis. Digital music is a sampling of native analog sound waves. In that way, digital captures frames of sound the way film or digital video captures frames of images. Analog-to-analog recording on vinyl has no limiting "frame rate" in playback, aside from maybe the physical ability to cut and read the vinyl, and is therefore the opposite of the old film standard frame rate limitation problem. If the author wanted to talk audio, the better analogy would be between lower quality mp3s and the higher-definition, better-quality high sample rate digital formats.
posted by Muddler at 6:19 AM on December 4, 2014 [1 favorite]


I dunno, while on a technical level the mp3/cd comparison may be better, experientially (and culturally) the distinction is probably more similar to cd/vinyl.

Not that I'm convinced either analogy is that useful.
posted by Dysk at 6:22 AM on December 4, 2014 [2 favorites]


I still fret over the airbrushed faces which I can now plainly see on our large-format HDTV.

This is probably going to be even more of a thing in 4k and 8k - and software to apply makeup-like effects digitally in realtime, to make people seem at once more real and more beautiful under hi-rez closeups, is going to make some Mac developer a dumptruck full of money.
posted by Slap*Happy at 6:38 AM on December 4, 2014 [1 favorite]


It isn't the lighting. The same effect happens when you watch a 24fps movie on a TV with motion interpolation. It just looks like video. The motion blur of film just makes it seem more dramatic, kind of like adding reverb to a vocal track.
posted by grumpybear69 at 6:38 AM on December 4, 2014 [5 favorites]


I have been on a number of movie sets in a previous career, and the first thing that struck me was how incredibly unconvincing they usually were.

Once in a while one would be surprisingly immersive, like Sarah Polley's bedroom from the opening scenes of Zack Snyder's Dawn of the Dead remake, which was just a bedroom built inside the cavernous back room of one of the empty anchor stores from the mall they were using. You walked past the big fabrication shop they'd turned that area into, through the door, and bam, you were in a suburban bedroom.

But most of the time they just looked ridiculous, like something a high school drama club came up with on no budget. The example that hit me the hardest was the ship sets from Stargate Atlantis, which were just horrible when you walked through them. Plain cement floor, wooden bulkheads with random bits of painted styrofoam glued on them. You're like, there's no way this is ever going to work. And yet, when you see it on screen, it does. Ditto for character makeup, and even acting to a certain extent. Somehow, in the process of moving all that imagery out of reality and onto a screen, it is made to work. If HFR somehow strips that veneer away, then yeah, they're going to have a problem, and I'm sure filmmakers will quickly figure out what not to do, much the way acting styles had to be tamped way down when sound replaced silent film.
posted by Naberius at 6:43 AM on December 4, 2014 [7 favorites]


“Instead of feeling like we’ve been transported to Middle-earth, it’s as if we’ve dropped in on Jackson’s New Zealand set…”

If I watched a movie during my acid-taking days, I would often feel this way about it. It had never occurred to me before that this might be a function of 'mechanical' perception, rather than something higher up the chain of cognition.
posted by not_that_epiphanius at 6:44 AM on December 4, 2014 [2 favorites]


You are not alone, sotonohito. I fell in love with 48fps in The Hobbit the moment I laid eyes on it, and have tried my hardest to make sure my first viewing of subsequent Hobbits is also in 48fps.

The only downside for me is that there is no convenient home media format capable of doing HFR, which makes this and all future potential HFR films an exclusively Theater Experience, instead of something I can take home with me.
posted by Joviwan at 6:46 AM on December 4, 2014 [1 favorite]


I always think of digital versus film projection as a closer analog to the CD/vinyl thing. I fully sympathize with film fetishists who love the texture and the grain and all that, but (leaving aside the huge benefits of moving to 65mm or true IMAX formats) there is no question in my mind that the digital projection is a closer representation of what the finished film actually looks like — at least in cases where the film went through a digital intermediate process, which is nearly all feature filmmaking these days. (The film release of Interstellar would be a current exception.)
posted by Mothlight at 6:48 AM on December 4, 2014


Ugh, HFR and the impending death of the incandescent bulb have me thinking that I'd better be lookin' around while I can, because in like 2 years I'm just going to have a migraine 24/7.
posted by We put our faith in Blast Hardcheese at 7:14 AM on December 4, 2014 [1 favorite]


I guess I just plain don't get the people who complain that it looks too real, like a window into another place. Isn't that the whole point of movies and other video entertainment? Don't we complain when stuff doesn't look real (like we did when Lucas dropped sub-video game level CG into Star Wars and it just looked awful compared to the more real muppets and models)? Did I miss a memo saying that real was good if it referred to old tech, but bad if it refers to new tech?

I want to have it look like I'm watching something real. Why wouldn't I?

kinnakeet, you do know that high quality movies with good scripts are not the exclusive property of the past, right? You only get that impression because if (to pick some numbers at random) 5% of movies per year are high quality and non-explosion-centric, then the number of movies matching those criteria made in the past three years will, of course, be fewer than the number made in the past 20 years.

It isn't like somewhere along the line people said "nope, no more good scripts like we had in the Golden Age of the Past, let's just let Michael Bay blow shit up". There has never been a majority, or even a sizeable plurality, of movies that weren't basically the Michael Bays of past eras blowing shit up.

Go take a peek at the IMDB page for 1972. That's the year of The Godfather. But look at the other movies, most of them are cheesy action flicks or cheesier adventure/SciFi crap.

There's good stuff in with all the Michael Bay stuff today. Look at 2013: sure, it gave us the usual examples of explosion-based blockbusters you see every year since movies were first created. But it also saw 12 Years a Slave.

The idea that good stuff only happened in the past is poison.
posted by sotonohito at 7:17 AM on December 4, 2014 [10 favorites]


The idea that good stuff only happened in the past is poison.

Seconded!
posted by grubi at 7:29 AM on December 4, 2014 [2 favorites]


I don't think it is a matter of looking too realistic, as in "wow those spaceships look so authentic that I no longer believe this is a film." It is that, at least historically, the 24fps motion-blur provided a scrim of unreality upon which the suspension of disbelief necessary for a Star Wars or Poltergeist or LOTR is propped. So in this case "too real" means "it looks like actors on a set" which is a side-effect of the suspension of disbelief being broken by the motion detail made available via HFR.

Doubtless filmmakers will find a way to overcome this, but right now they haven't.
posted by grumpybear69 at 7:32 AM on December 4, 2014 [6 favorites]


I guess I just plain don't get the people who complain that it looks too real, like a window into another place. Isn't that the whole point of movies and other video entertainment?

Depends on what you're in it for, I guess, but to me "realism" in film has always been overrated. Give me expressionism every time—Scorsese's Steadicams and Gilliam's fisheyes and Argento's colored gels. I'm also that guy that much prefers Willis O'Brien's stop-motion King Kong to Weta's CG version, impressive as the CG is. I even like film grain's velvety texture and the way the image dances around a little in the gate during projection. (Those things are lost with digital, of course.) So when you tell me that shooting at 48fps yields a more "realistic" image, I sort of shrug. You know, whatever turns you on. I've always liked the way the picture flickers in the dark.

All that romantic claptrap aside, 48fps does have the positive effect of making scenes with lots of action on different planes of the frame much more readable when viewed in 3D by dramatically reducing strobing, and so I think it's compelling in combination with stereo imagery. If I decide to see the next Hobbit movie in 3D, I will definitely seek out a theater with HFR projection. But I saw the last one in 2D and didn't feel like I missed a thing. I saw the first Hobbit movie in HFR and thought it was OK, if a little distracting in certain shots, and I think that as people see more HFR content their perceptions will start to adjust to the new image type. (It's possible that future generations will come to be as distracted by 24p images as I am by 48p!)
posted by Mothlight at 7:43 AM on December 4, 2014 [1 favorite]


not_that_epiphanius: "“Instead of feeling like we’ve been transported to Middle-earth, it’s as if we’ve dropped in on Jackson’s New Zealand set…”

If I watched a movie during my acid-taking days, I would often feel this way about it. It had never occurred to me before that this might be a function of 'mechanical' perception, rather than something higher up the chain of cognition."

Haha, oh man. Forgive a self-indulgent little aside/druggie story, but I once ended up watching one of the LotR movies while under the influence of some morning glory seeds. I remember it very vividly because it's always kind of coloured how I see certain films since then. In my semi-dissociated, giggly state, the swirls and sparkles of the visual effects were much less interesting than how it changed my viewing of the events on-screen.

Specifically, it was impossible for me to suspend my disbelief and see the characters - I could only see the actors, and the tension between their roles and their selves -- all the subtle little cues and signs that showed what they thought of having to dress up like a portentous wizard or a surly dwarf and spout Jacksonian dialog. I could feel and almost see the set around them, the context of their life off-camera, and their true demeanours.

However my tripped-out brain was far more amenable to the special effects and the CGI landscapes and monsters. Those seemed totally, magically real. So the picture painted was of a load of mildly bemused actors who'd somehow found themselves forced to act out this Tolkien fantasy, by travelling to Middle Earth with a bunch of cameramen and lighting crew and Hollywood execs, and actually enacting the heroic, epic myth they needed to film. All of the characters' reactions to the events of the story were transformed for me into the actors' reactions to the characters' reactions to the story they were now also going through, but this time just for the ticket sales and Hollywood acclaim.

I finally lost it when some inscrutable observation led me to "deduce" that at some point the two parties had actually caught up with each other and somehow become intermixed, so half the characters were the real ones and half of them were just there with the film crew. That kind of... broke my brain, and resulted in what felt like a 5-year stretch of manic chortling.

Good times!
posted by Drexen at 7:58 AM on December 4, 2014 [22 favorites]


(Orlando Bloom, on the other hand, seemed to have no idea he wasn't really an elf. Which I guess is a... compliment?)
posted by Drexen at 8:09 AM on December 4, 2014 [1 favorite]


Slightly off-topic, but I'm thrilled that YouTube is starting to offer 60fps video. Watching this man dance is hypnotic; the unnatural look is transformed into a virtue. It's almost like watching something in slow motion and yet running in natural time, with the emphasis on the movement. It helps that it's a fixed camera and a static background, so just his body movement is isolated. Also the young man is appealing and can dance.
posted by Nelson at 8:21 AM on December 4, 2014 [6 favorites]


I don't get how people are saying that "the lighting is wrong for 48 fps" is obviously a wrong explanation. It seems obvious to my eyes that the main difference from regular film is that the lighting appears to be all different.

You don't think it would be possible to compensate for that with different lighting and get something that looks more like the movies we're used to?
posted by straight at 8:29 AM on December 4, 2014


Metafilter: because real life doesn't have a frame rate
posted by sylvanshine at 9:17 AM on December 4, 2014 [1 favorite]


Movies aren't supposed to look REAL; they're supposed to have an element of surrealism which makes them removed from reality. That's what the lower frame rate (and B/W film) does: it makes things look more pleasing and slightly altered from hyper-realism. And it's why video looks boring.
Why the fuck would a director want a movie to look "real"?
Less is more.
posted by Liquidwolf at 9:39 AM on December 4, 2014 [1 favorite]


Go take a peek at the IMDB page for 1972.

Holy crap, a ton of fantastic movies came out that year, and The Godfather is maybe only #8 or 9 on the list - any year with both Fritz the Cat and Snoopy Come Home is a daring year for cinema, and I mean that sincerely. Silent Running, Pink Flamingos, "Aguirre, the Wrath of God" - wow. All that and John Wayne and Bruce Lee popcorn flicks - you could have a "Bruce Wayne" double feature!

I get the point and agree with you, but '72 may have been the wrong year to pick as the exemplar...
posted by Slap*Happy at 9:40 AM on December 4, 2014 [3 favorites]


Slumdog Millionaire was an Oscar winner for cinematography and used 12 frames per second:

"And those emotional moments, almost without exception, featured key shots captured at 12 frames per second (or less) and double-printed for a staccato, dreamy feel."
posted by sol at 10:00 AM on December 4, 2014 [1 favorite]


As someone who dreads camera pans in the theater because of the choppy/blurry mess that makes me go crosseyed, I'd love to see an actual, good movie in 48 FPS, to see if it solves the problem. Basically, I wish someone aside from Peter Jackson would decide to try it, and have the clout to.
posted by destructive cactus at 10:22 AM on December 4, 2014


> With HFR movies, you don't get each image projected twice. Each image is projected once

Digital projectors don't have rotary shutters (or sprocket-hole pull-down claws).
posted by morganw at 10:29 AM on December 4, 2014


Digital projectors don't have rotary shutters (or sprocket-hole pull-down claws).

True, but part of the design of digital projectors is to mimic film projection as much as possible. I remember reading something a few years back (cursory Google searching isn't yielding any satisfactory results) about how a lot of digital projection systems actually have an effect built in which mimics the shutter, because that's what movie watchers expect.

It's possible that part of what is going on with The Hobbit HFR projections is that they aren't using that effect, which leads to an even greater sense of unexpected dislocation for a theater patron.
posted by hippybear at 10:41 AM on December 4, 2014


With The Hobbit, it immediately looked like bad daytime television to me. I think the way we've been trained, combined with a lack of experience lighting for these new technologies, is making for a less than immersive experience.

I'd be interested to see a character set piece, shot in an actual house with available lighting with this technology.
posted by lumpenprole at 11:36 AM on December 4, 2014 [2 favorites]


This claim is just like the claim that some actors' careers will be ruined, ruined I say!, by hi-def TV because suddenly their very skin pores will be visible. Blackheads! Tooth plaque! Pre-cancerous DNA mutations! All visible!

Hi-def TV doesn't even double the resolution along either axis.

You know what will more than double the resolution along either axis? Moving in for a closer shot.

No actor who has ever been in a head shot during any TV scene in their entire career was affected by hi-def TV... although a lot of shit journalism hand-wringing articles were written about it.

This is the same category of "problem".
posted by IAmBroom at 11:41 AM on December 4, 2014 [3 favorites]


IAmBroom: "Hi-def TV doesn't even double the resolution along either axis."

Um, yes, it does. NTSC SD is 720x486 (with non-square pixels). HD is 1920x1080 (with square pixels). That's pretty clearly more than double the resolution on each axis.
posted by Joakim Ziegler at 11:52 AM on December 4, 2014 [7 favorites]


    Color TV, technicolor, and Kodakchrome all had its detractors who found a purity and monumentalism in black and white.
God, I miss editors.


That's only because your eye is trained from lifelong practice to spot grammatical errors in a line of text.

In ten years that kind of writing will have become the universal standard, while anything that's been subjected to the administrations of a proofreader or in any way revised ahead of publication will look as odd and archaic as something out of The Canterbury Tales</em
posted by Atom Eyes at 11:54 AM on December 4, 2014 [7 favorites]


Why does a movie look better at 24fps than the same movie at 48fps?

In response, let me ask you this:

Why do dancers look better in a strobe light than they do with the house lights fully on?

Why do classic video games look better when you insert emptiness between the scan lines to emulate an old CRT?

Why does a person in a half-toned photograph often look more attractive than in the original?

Beauty is often found when we leave it to the brain to interpolate missing information from images.
posted by eschatfische at 12:00 PM on December 4, 2014 [4 favorites]


By the way, conversion of things shot in 48fps to 24fps can be done in several ways, many of them yielding basically the same result as if the material was originally shot in 24fps.

Normal shutter in 24fps is 180 degrees, that is, the shutter is open half the time (1/48th of a second) and then closed the other half (1/48th of a second), which is when the film advances (this is not entirely true, but true enough for the purposes of this example).

Digital cameras can shoot with shutters practically approaching 360 degrees, because sensor readout is very fast. So, if you want, you can shoot 48fps with a 360 degree shutter, then just drop every second frame, and it'll look 99.5% identical to how it would look if you shot it in 24fps.

However, this will not give as much crispness; essentially, each frame will have the same motion blur as a 24fps frame, just more of it. Motion will appear smoother, though. If you want extreme crispness, you need to shoot 48fps at 180 degrees. In this case, to convert to 24fps, you can either blend frames together in pairs, either naïvely, by just mixing them, or by applying some sort of motion compensating algorithm to avoid doubling artifacts in very fast motion, or you can just drop every second frame again, which will maintain more crispness, but might result in some strobing.

I'm a little uncertain what the most common practice is for HFR on large productions today, since there have been so few. But in general, conversion from 48 to 24 is trivial.
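
(A minimal sketch of the two simplest conversions described above, with frames modeled as flat lists of pixel values; a real pipeline would add the motion compensation mentioned:)

```python
# Two naive 48fps -> 24fps conversions. Frames are flat lists of pixel values.

def drop_alternate(frames_48):
    """Keep every second frame: crisp, but fast motion may strobe."""
    return frames_48[::2]

def blend_pairs(frames_48):
    """Average adjacent frame pairs: smoother, approximates a longer exposure."""
    out = []
    for a, b in zip(frames_48[::2], frames_48[1::2]):
        out.append([(pa + pb) / 2 for pa, pb in zip(a, b)])
    return out

frames = [[0, 0], [2, 2], [4, 4], [6, 6]]  # four toy 48fps "frames"
print(drop_alternate(frames))  # [[0, 0], [4, 4]]
print(blend_pairs(frames))     # [[1.0, 1.0], [5.0, 5.0]]
```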
posted by Joakim Ziegler at 12:21 PM on December 4, 2014 [1 favorite]


JZ, your use of the term "180 degrees" in this context strikes me as odd. Maybe it's a term of art in the industry (and I'm not in the industry) but I would have referred to the "duty cycle". What you call "180 degrees" is to me a 50% duty cycle. (Which is another term of art, of course.)

Digital projectors do flicker at 48 fps even if they're showing a 24 fps movie. That's because our eyes can easily see a 24 fps flicker rate and find it thoroughly annoying. That's also why film movie projectors are designed to flash twice each frame of a 24 fps movie.

48 flashes per second is still perceptible; if you want flicker to really seem smooth, you need something above 70 flashes per second.
posted by Chocolate Pickle at 12:26 PM on December 4, 2014


I have the same reaction to HDTV (which I have little experience with).
Every time I see a football game on HDTV, it looks fake to me, like it's a video game.
posted by MtDewd at 1:25 PM on December 4, 2014


Chocolate Pickle: "JZ, your use of the term "180 degrees" in this context strikes me as odd. Maybe it's a term of art in the industry (and I'm not in the industry) but I would have referred to the "duty cycle". What you call "180 degrees" is to me a 50% duty cycle. (Which is another term of art, of course.)"

Yes, the "degrees" is an artifact of the way film camera shutters work, they're basically a pair of rotating half-discs that you move relative to each other (the rotate together, you move them relative to each other to adjust shutter). The maximum possible shutter is 180 degrees open (the discs fully overlap), which means that the shutter is open half the rotation cycle.

Your duty cycle term isn't that different, though. If you're looking at a sinusoidal waveform, 50% of that is indeed 180 degrees of the waveform.
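
(In formula terms, shutter angle is just the fraction of the frame interval the shutter spends open. A quick sketch; the function name is made up for illustration:)

```python
def exposure_time(fps, shutter_degrees):
    """Exposure per frame: (open fraction of the rotation) x (frame interval)."""
    return (shutter_degrees / 360) / fps

print(exposure_time(24, 180))  # 1/48 s, the classic film look
print(exposure_time(48, 360))  # 1/48 s again: 48fps with a 360-degree shutter
                               # matches the per-frame blur of 24fps at 180
```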
posted by Joakim Ziegler at 1:54 PM on December 4, 2014 [1 favorite]


Joakim Ziegler: IAmBroom: "Hi-def TV doesn't even double the resolution along either axis."

Um, yes, it does. NTSC SD is 720x486 (with non-square pixels). HD is 1920x1080 (with square pixels). That's pretty clearly more than double the resolution on each axis.
Sorry, I was technically incorrect. Which is the best kind! (But the rest of my argument stands.)
posted by IAmBroom at 1:57 PM on December 4, 2014


> digital projection systems an effect which mimics the effect of the shutter because that's what movie watchers expect

For home viewing, it's called dark or black frame insertion.

Christie has a write-up on HFR and 3D. They say (on a single projector DLP) they're doing 3 flashes per eye per frame or 144 time-divisions per second and 2 per eye per frame for HFR, so 192 (from 48fps) or 240 (from 60) divisions per sec.

That's to give less time between left and right eye images. Rather than left for 1/48, right for 1/48, it's left for 1/144, right for 1/144, and repeat 2 more times. This doc about Real D, which also mentions "triple flash", claims blanking accounts for only 8% of light loss, which suggests a pretty short blanking period (compared with film projection).
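
(For anyone checking those figures, the arithmetic is just flashes per eye x 2 eyes x frames per second; a toy sketch with a made-up function name:)

```python
# Time divisions per second for single-projector stereo 3D.
def time_divisions(fps, flashes_per_eye):
    return flashes_per_eye * 2 * fps

print(time_divisions(24, 3))  # 144 (triple flash, 24fps 3D)
print(time_divisions(48, 2))  # 192 (double flash, 48fps HFR 3D)
print(time_divisions(60, 2))  # 240 (double flash, 60fps HFR 3D)
```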
posted by morganw at 2:43 PM on December 4, 2014


No actor who has ever been in a head shot during any TV scene in their entire career was affected by hi-def TV...

How do you know that? I'd argue that plenty of actors have probably seen their careers suffer because they didn't look so hot in HD. If an actress has dark circles under her eyes or, heaven forbid, some actual wrinkles, that stuff is going to stand out in HD and it's definitely not going to help her career any.

As I mentioned in a comment up there someplace, HD weirds me out sometimes because I can see the actors in too much detail. A lot of actors instantly looked 10 years older, and a lot more bumpy and pimply, when we switched to an HD set. I'm getting used to it, but there was an adjustment period where everybody looked like a Drew Friedman cartoon.
posted by Ursula Hitler at 2:56 PM on December 4, 2014 [1 favorite]


We need this technology in banks and credit unions, and on the police.
posted by Oyéah at 5:47 PM on December 4, 2014


Oyéah: "We need this technology in banks and credit unions, and on the police"

CCTV and police body camera specs vary, but many offer 1280x720 resolution in 60 fps, which is basically this. I'm assuming even the ones that are lower frame rate use a fairly fast shutter to get clearer images of chaotic scenes.
posted by Joakim Ziegler at 7:21 PM on December 4, 2014


destructive cactus: As someone who dreads camera pans in the theater because of the choppy/blurry mess that makes me go crosseyed, I'd love to see an actual, good movie in 48 FPS, to see if it solves the problem. Basically, I wish someone aside from Peter Jackson would decide to try it, and have the clout to.

A lot of people openly hate on the "shitty soap opera effect" that a lot of TVs have now in motion smoothing/"clear motion"/etc, but have you actually tried it out on a properly high end set?

the samsung ones on their 240hz panel refresh screens that cheat their way up to over 1000fps (perceptible, they claim) look really awesome when they work right. I have one of the 240hz/"720cmr" ones and it's a very impressive effect. It introduces some slight amount of delay, but seems to use that to cheat and read ahead on what the next couple frames are like or... something. There's more to it than just tweening frames; it seems to work like a video encoder and actually break down the video in various ways.

The only problem at least on mine is that sometimes certain shots fuck it up and it drops back to 24fps for a second or two and this ruins it. So i mostly leave it at the setting that eliminates any 3:2 pulldown weirdness from any sources, but doesn't smooth.

Pan shots are one of the things it does really well though, and i share your hatred of that choppy effect. It also does long pulls/zooms/dolly shots pretty awesome.

I see it kind of the way that stuff like izotope ozone was in its infancy as a winamp plugin. Cool, but hasn't hit its stride yet. In a couple years, when every tv has last year's smartphone SoC in it and those are incredibly powerful (and they're already hitting laptop-from-a-couple-years-ago levels of power today), and most people are using the tv's internal hardware to play videos so they can REALLY look ahead, i think this stuff will be very robust... and you'll be able to dial in as much or as little smoothing as you want and cheat your way up to 120fps or whatever.

The only thing this won't solve is motion blur inherent in the medium in older stuff, but at least it'll get rid of the choppiness, which yea, i have some hate for.
posted by emptythought at 7:31 PM on December 4, 2014


emptythought: "There's more to it than just tweening frames, it seems to work like a video encoder and actually be breaking down the video in various ways. "

Yeah, this is what I was talking about when I mentioned motion compensated frame blending earlier; this is just in reverse. It's fairly common in VFX work to slow down material using algorithms like these, which track elements in the frame and create intermediate frames with each element in the intermediate position. It works quite well for a lot of types of material, and if it doesn't, there are more manual ways to help it out.

Most algorithms like this are not good enough to always work totally automatically yet, but we're getting there.

The only thing this won't solve is motion blur inherent in the medium in older stuff, but at least it'll get rid of the choppiness, which yea, i have some hate for.

There are actually algorithms to get rid of a great deal of motion blur too. They do something similar, track motion, then work backwards to figure out what features would generate the motion blur that's in the image. It's obviously not perfect, but it can work pretty well.
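
(The crudest version of the interpolation idea looks something like this: pure linear blending with no motion tracking, which is exactly what the motion-compensated algorithms improve on. A toy sketch, with frames as flat lists of pixel values and made-up function names:)

```python
# Naive frame interpolation: synthesize an in-between frame by blending
# neighbors. Real "motion smoothing" tracks features and shifts them to
# intermediate positions instead of just cross-fading.
def tween(frame_a, frame_b, t=0.5):
    """Linear blend; t=0.5 gives the halfway frame."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(tween(a, b))
    out.append(frames[-1])
    return out

print(double_frame_rate([[0], [10], [20]]))  # [[0], [5.0], [10], [15.0], [20]]
```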
posted by Joakim Ziegler at 8:06 PM on December 4, 2014


Porn. Porn will be the place where this gets pioneered or widely accepted first. Oh look at that....
posted by now i'm piste at 8:21 PM on December 4, 2014 [2 favorites]


This is why I was really impressed by Douglas Trumbull's (2001, Blade Runner) tech-demo video UFOTOG at the Seattle Cinerama. Key elements from that:

* Recording at 120fps in 3D, downsampling frame rates all the way down from there
* Being able to dynamically shift frame rate from scene to scene, so action scenes could be cranked up and dialogue scenes could be brought back down
* Pixel-independent framerates, so you could have a video playing in a scene at 24fps while people talk at 60fps and something goes on in the background at 120fps
* Being able to 'blur' framerates, so it could transition smoothly rather than being a jarring cut from 24fps to 120fps
* End-to-end software for managing and working with all of the above

Cranking the framerate all the way up (He said that 120 was possible/chosen because the widespread upgrade forced by Avatar left theaters worldwide with the capability, but it was otherwise unused) seemed to fix the over-reality issue (for me, at least), so I suspect there's something of an uncanny valley effect with this. I've seen some comments from directors that suggested that this may be because it got up high enough to be able to capture the *really tiny* facial/eye movements in actors which we process for emotional cues.

Handling the datastreams for 4k @ 120fps * 3d (two cameras) is pretty impressive stuff, so it really sounds/feels like this is the sort of thing that's just recently becoming feasible.
posted by CrystalDave at 8:47 PM on December 4, 2014 [1 favorite]


"The ill-fated 1981 final film starring Natalie Wood, Brainscan, was originally supposed to film all the real life segments of the movie in traditional 24fps format while all the sequences showing the actual "brain recordings" was going to be in the new Showscan 60fps format, giving the audience a sense of heightened reality for those segments..."

I was the sort of nineteen-year-old nerd back in 1983 who was very aware of Trumbull's Showscan format, and when I read that he was opening a demonstration theater at the Galleria in Dallas, where I lived, I dragged my skeptical friends to see it. Not only was this a 70mm, 60 frames-per-second format, but it was a specially constructed theater, very much like the later IMAX theaters, and included (IIRC) a ten-channel, very high fidelity audio system.

This was all a technology demonstration and the film was a proof-of-concept demo. And it was stunning. I guess the image was projected from behind the screen, because they did this amazing thing where a real person stood on the stage next to a projected person and it was difficult to tell them apart. There was a close-up of a woman speaking, and her giant head was so lifelike, so visually real, that it was frightening. Then they did some multi-channel effects of someone walking around the theater, which probably seems ho-hum now but was pretty incredible then.

Overall, it made a very big impression on me. I'd read what Trumbull and others had written about Showscan: that 70mm filmstock at 60fps reaches the magic threshold where a projected image on a large movie screen transitions into something the eye perceives as real life. But it really was true. I've never seen anything like it since, and that's because even today's popular formats don't reach that 70mm-by-60fps information density.

It's funny that this was posted today because my mother and her husband have finally decided to get an HDTV and they had me explain the tech to them and help them make a decision. And with the advertisements of the screen refresh rates, I had to explain what some of the advertised features related to that would do and wouldn't do, and how some people like interpolated high framerates, other people don't. But I had to start with trying to explain why film looks different from video. And I had the same problem as I've had in the past -- believe it or not, there's a lot of people who don't even notice the difference. When I first learned that, I was astonished because I can't remember a time when I wasn't acutely aware of the difference between stuff shot on film and shot on video. And so I've also been thinking a lot the last few days about the stuff discussed in this piece and thread.

I share the skepticism of others here that the artifice of conventional film set lighting is the sole or primary culprit in this. I'm perfectly willing to believe that it plays a role, but I also think that the stuff hippybear and Joakim Ziegler describe are the bigger factors.

What I'd like to see is an accessible and (relatively) complete analysis of all the important factors involved in what reaches the eye. For example, no one here has really talked more than a little bit about the interplay between the (effective?) shutter speed at exposure, the frame-rate of exposure and display, and motion-blur. But that's clearly very important, because it's behind the impetus for higher frame-rates for sports broadcasts, even just interpolated ones: higher frame rates with lower motion blur result in realistic, or even hyperrealistic, images that include a lot of motion. Also, my very strong impression is that there's a subjective, perceptual interplay between this kind of high framerate crispness and perceived light levels, in a way independent of the actual lighting. I would be surprised if greater information density didn't reveal lighting artifice, but I'm not convinced there's not something independent going on.

This discussion is closely related to the discussion in the other thread about the HD conversion of The Wire and aspect ratios, and specifically my most recent comment. People arguing for absolutist positions based upon naturalistic claims are going way too far. And that's in either direction -- with both sides agreeing that the greater realism of HFR makes some decisive difference, though some say it's positive and others negative. With regard to what's objectively true, yes, there's a very strong naturalistic argument supporting the claim that HFR is "more realistic". But that alone doesn't account for how people react to it -- they also react the way they do because of the conventions of the medium and in this regard the author of the linked piece is quite right. But he's wrong if he is arguing that this means there's no reason to use 24fps. And in the same way that audiences just naturally bring some expectations to the table with things like a wide format versus a square-ish format, and this is part of the experience, it's true about frame rate, whether lower or higher. If people feel that it makes it look cheap, then that's a valid response and the filmmaker should be aware of that and work with it.

Obviously, over time this can change. Look at shaky handheld cameras, for example: if you could show a lot of today's single camera dramas, with their shaky-cam techniques, to an audience thirty years ago, they'd most likely react negatively and tell you that it makes it look amateurish. Originally, that was sort of the point of deliberately choosing it when you could have used a Steadicam or a dolly. Now, the connotations are all deeply implicit and we don't notice it much at all. It's just part of the grammar of filmmaking, not something that's the result of practical necessity (or budget) and not something that calls attention to itself, but just one of the tools in the toolbox. In this sense, the use of HFR by directors is unavoidably going to go through a period where it calls too much attention to itself, and then, later, it will just become part of the toolkit. It may retain some of these connotations for a long time -- that it's like a "soap opera" or whatever. I don't know. It depends upon how filmmakers use it. But what the linked piece's author says is true -- to the degree to which it's more used, and especially the more it's used well, the more we'll accept it. That doesn't mean that everything should be HFR.
posted by Ivan Fyodorovich at 10:19 PM on December 4, 2014 [1 favorite]


CrystalDave: "* Recording at 120fps in 3D, downsampling frame rates all the way down from there"

Did he happen to say anything about how this will require five times as much light to achieve the same image quality/noise level?
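
(For reference, the factor of five falls straight out of the exposure times: at a 180 degree shutter, 24fps exposes each frame for 1/48th of a second, while 120fps gets only 1/240th, and (1/48)/(1/240) = 5, so each frame sees a fifth of the light.)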
posted by Joakim Ziegler at 2:05 AM on December 5, 2014 [1 favorite]


Ursula Hitler: No actor who has ever been in a head shot during any TV scene in their entire career was affected by hi-def TV...

How do you know that? I'd argue that plenty of actors have probably seen their careers suffer because they didn't look so hot in HD. If an actress has dark circles under her eyes or, heaven forbid, some actual wrinkles, that stuff is going to stand out in HD and it's definitely not going to help her career any.
Since I've already explained how I know that, I'll just tl/dr: magical sciencing! That's how I know that.
posted by IAmBroom at 11:47 AM on December 5, 2014


IAmBroom: "Since I've already explained how I know that, I'll just tl/dr: magical sciencing! That's how I know that."

Well, to be fair, HD/4k/whatever ultra high definition flavor of the month will show much more detail for an equivalent shot. The closest closeups are, so to speak, closer than ever now. This is not just because of digital sensors being very high resolution, it's also because of modern PL mount optics being extraordinarily sharp (if that's what you want).

But I agree that it's unlikely anyone's had their career affected by this. I do remember that when the Matrix sequels came out in IMAX, there was a lot of talk about the sex scene and how skin and pores looked in that, though...
posted by Joakim Ziegler at 3:23 PM on December 5, 2014 [1 favorite]


But I agree that it's unlikely anyone's had their career affected by this.

Partly because there are so many good options for soft filtration on the lens, right? My understanding is filters are in pretty wide use to soften the HD (or, gadzooks, 4K) image, especially when shooting actors in close-up or news anchors in a studio environment.

That's the funny thing — sure, HFR promises to increase temporal resolution, which sounds like progress. But cinematographers often work at cross-purposes with spatial resolution. There was a major music video a couple years ago that was shot partly as a showpiece for Sony's then-new 4K camera system. It was to be shot and post-produced at 4K and then displayed on the big 4K screens at the Sony store and wherever the hell. And the cinematographer on the shoot was using diffusion to soften the image up during acquisition, and the director was putting artificial grain in during post to give the picture a little added texture and the Sony engineers were going nuts because in their view it was all wrecking the image. But here's the bottom line — that video looked fucking great on those 4K TVs.

tl;dr: Resolution isn't everything.
posted by Mothlight at 5:22 PM on December 5, 2014 [2 favorites]


Mothlight: "Partly because there are so many good options for soft filtration on the lens, right? My understanding is filters are in pretty wide use to soften the HD (or, gadzooks, 4K) image, especially when shooting actors in close-up or news anchors in a studio environment. "

Some of it is filters on the lens, some is due to the limitations of the optics, and some is soft filtering in post (more precise, selective, and predictable). I don't think they use soft filters on news anchors; news and other TV featuring talking heads has used airbrush makeup (which looks freaky as hell to me, but is indeed very perfect) for years now.
posted by Joakim Ziegler at 11:55 PM on December 5, 2014


Joakim Ziegler: I don't think they use soft filters on news anchors,
No, but neither do they really do tight headshots on them. Those are for either drawing the viewer in to put themselves in the subject's place, or to emphasize the emotional reactions on their face. Newspeople are kept mostly in wider shots, to keep their formal attire and emotional distance (making them look more "objective").
posted by IAmBroom at 8:23 AM on December 8, 2014




This thread has been archived and is closed to new comments