But why does it all appear so odd, so abnormal? And is it, actually? One of the major complaints about the new high-frame-rate projection is how smooth it all looks. To perceive motion on a screen, our brains need a projection rate of at least fourteen frames per second — and the higher the frame rate, the smoother the motion seems. But there is no known upper limit to what we can perceive. Brightness is no different. There were negative reactions to all of these kinds of improvements at first, too.
The characters look too sharp? Some of the sets look like sets? Of course they do. I don't think it's better because it's new. I think it's better because it's a higher frame rate. It's not some new stylistic choice, it's just more information.
It's better in the same way that a higher-bandwidth Internet connection is better, or a phone with longer battery life. I also don't think that 3D film is better in the same absolute sense, although I actually do think 3D is pretty good in theaters. The dizziness complaints about 3D are very valid because of physical implementation details, but I don't think the same about similar complaints about HFR. People are extremely defensive about the 24 frames, so I feel for you trying to take a stance.
When I watched The Hobbit I thought to myself that some of the breathing movements in the CGI were amazingly fluid, and they kind of reminded me of a computer game. And then I thought, what a loss to Hollywood, that beautiful fluid movements have come to be associated with computer games and not films.
You're trying to make an opinion a fact. That's fair. Perhaps 'closer to reality' is a better statement. Reality doesn't render at 24fps. Dylan on Dec 24, root parent prev next [—].
Subjectivity is not the same thing as placebo. There has to be a perceptible difference to have a real preference. MLR on Dec 24, parent prev next [—]. People need to take a step back and remember that film is art; it's not always meant to be the best possible depiction of reality. The issue is when people start trying to prescribe 48 fps, or 24 fps, as inherently superior; it's all situational. We can still watch an old Chaplin movie and love it, because it's good. But if new movies were released at 16fps, nobody would watch them; if it were just one movie, people might go because of the novelty.
Especially for action movies, there is no excuse not to reach 48, or better yet 60, fps, since we now have the technology to do this. Peter Jackson and James Cameron are pioneers, and they will have the knowledge and the technique before everyone else once everyone starts making 60fps movies. In action movies, they often use an even faster shutter to reduce motion blur. The choppy motion makes hits look faster and harder. Lately I've noticed they'll even drop several frames from the middle of a punch to make it look harder.
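A quick back-of-the-envelope sketch (in Python) of the shutter/blur relationship being described; the object speed and shutter angles below are made-up numbers for illustration, not figures from any actual film:

    # Exposure per frame follows from shutter angle: (angle/360) / fps.
    def exposure_time(shutter_angle_deg, fps):
        """Seconds the shutter stays open for each frame."""
        return (shutter_angle_deg / 360.0) / fps

    def blur_length(speed_px_per_s, shutter_angle_deg, fps):
        """Pixels an object smears across during one exposure."""
        return speed_px_per_s * exposure_time(shutter_angle_deg, fps)

    speed = 2000.0  # a fast punch crossing the frame, in px/s (assumed)
    for fps, angle in [(24, 180), (24, 90), (48, 180)]:
        print(f"{fps} fps at {angle} deg shutter: "
              f"{blur_length(speed, angle, fps):.0f} px of blur")
    # Narrowing the shutter from 180 to 90 degrees at 24 fps halves the blur
    # (the choppy action look described above); doubling the frame rate at a
    # constant shutter angle has the same effect.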
He is very much a couldn't-care-less and very non-technical person. After the show he asked me why the movie looked not like a movie but like TV video. He said the mountains looked plain, not as grand as in LOTR. Kurtz79 on Dec 25, parent prev next [—]. When something is clearly better than a previous technology, you "get used to it" pretty quickly. When the first "retina" displays appeared, the improvement over lower-DPI screens was clear, and nobody complained it was "too sharp". In the HFR case the issue is not so clear cut: the fact that there are discussions about it, and articles like this one that try to put a scientific basis under the fact that many do not like HFR movies, shows that it's not just a matter of "getting used to it".
Actually, if you look back via the web archive, at the time when Retina appeared, a lot of people complained about it. It was exactly the same as with 3D or CGI, even though it is definitely useful.
Has anybody else here tried interpolation? What's your opinion of it? AndrewDucker on Dec 24, prev next [—]. So, basically, at 24FPS things are blurry enough that you can't see the fine details, which means that special effects and costumes look realistic. Increase the frequency to 48FPS and the blur goes away, meaning that we can see the fine detail, and suddenly sets look like sets, costumes look like costumes, and CGI looks like a computer game.
TillE on Dec 24, parent next [—]. To be fair, CGI creatures look like a computer game regardless. A very pretty computer game, but still.
Sometimes it seems like people want to believe that CGI is a whole lot better than it actually is. Even at 24fps, it's not great. It certainly doesn't look real. I agree with this, but I think it is more complicated than people "wanting to believe CGI is better than it is"; it is more like some sort of innate suspension of disbelief we all have, as long as what we are seeing is as good as or better than what we've seen before.
The way the CGI in all the movies in between looks to me now is very different from how it looked to me when first viewed, and as a graphics nerd I was always interested and up to date on how the technology worked. I don't believe my brain "wanted" to believe CGI was better than it was then; I just had no context for how it would look when it was done even better, and as the goalposts moved, what came before looked increasingly bad in comparison.
CGI backgrounds are very good though. Incidentally, have you re-watched Fight Club recently? I remember when I first watched it I was completely blown away by the "impossible" cuts. I watched it again a couple of months ago, and all I could think was that it looked like a Half Life 2 cut-scene. The standard for CGI rendering has gotten a lot higher. I have wondered about this as well. Maybe the audience does not care so much, and thus it makes no sense to go beyond what can be pulled off within the budget.
That said, there are more and more things that can be done well every year. I think something like Gravity looked pretty cool, and it will be interesting to see if it still looks good a few decades down the road. There weren't many organic things in Gravity.
Mechanical things are much easier to get to look right. The easiest would be something made of plastic, I guess. Could that be due to the brightness difference between seeing it in the theatre and watching it on your TV? TVs tend to be set brighter.
I was an extra in LOTR 3, and was on some of these sets. The Weta people put insane energy and time into making them look and feel realistic: dirt on the floors, dirt on the costumes, peeling paint, heavy chain mail (even though it's electroplated plastic, it's still heavy!).
It's not just that the lack of realism jumps out. Everyone instinctively knows movies look strange compared to real life, but this dream-like quality is part of what makes them so seductive. Exactly; though movies often aim not at realism but at something better. How do you achieve that? So far, it's taken a lifetime of trial-and-error in every aspect of movie making to find what works, and what works happened to be at 24fps.
It doesn't have to be real, it just has to take you there fully enough for you to have an experience intended by the creator. EpicEng on Dec 24, prev next [—]. Seems to be down. UhUhUhUh on Dec 24, prev next [—]. There's also a high-level processing aspect. The brain excels at extracting relevant information, which includes discordant information.
Back in the day, a solo violin was tuned slightly off to allow the audience to hear it over the orchestra. Barthes also came up with the "punctum" idea, whereby an odd detail in a picture will generate an impression.
What I'm saying is that higher-level processing is probably responsible for a number of "impressions" that might have little to do with fps. Qiasfah on Dec 24, prev next [—]. Most serious FPS gamers swear by screens that have a refresh rate higher than 60Hz. In the past this was achieved by setting your CRT to a low resolution and upping the refresh rate.
Moving the mouse in small quick circles on a high-refresh-rate screen compared to a 60Hz screen is a very different experience. On a 60Hz screen you can see distinct points in the circle where the cursor gets drawn.
At a higher refresh rate you can still see the same effect if you go fast enough, but it is way smoother. This makes a huge difference for being able to perceive fast-paced movements in twitch-style games, and it is the reason there has been a shift to these monitors across every competitive shooter.
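A rough Python sketch of that "distinct points in the circle" effect; the radius, the circling speed, and the 144Hz figure for the faster monitor are all assumed values for illustration:

    import math

    def cursor_gap(radius_px, revs_per_s, refresh_hz):
        """Distance between successive drawn cursor positions (px)."""
        circumference = 2 * math.pi * radius_px
        return circumference * revs_per_s / refresh_hz

    # Small quick circles: 50 px radius, 4 revolutions per second (assumed).
    for hz in (60, 144):
        print(f"{hz}Hz: cursor jumps {cursor_gap(50, 4, hz):.0f} px per redraw")
    # At 60Hz the cursor lands ~21 px apart (clearly distinct dots);
    # at 144Hz the gaps shrink to ~9 px, which reads as a smooth arc.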
My thoughts on this are that the behavior is similar to signal sampling theorems. Specifically, the Nyquist theorem says you have to sample at at least 2x the max frequency of a signal to accurately represent that frequency. For signal generation this means that you have to generate a signal at at least twice the rate of the max frequency you want to display. If you want to accurately reconstruct the shape of that signal, you need something like 10x the max frequency: for example, two samples in one period of a sine wave make it look like a sawtooth wave, while ten samples make it look like a sine wave.
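A small numpy sketch of the claim above, taking the near-2x and 10x rates as given and using naive connect-the-dots linear interpolation as the "reconstruction" (the later replies explain why this is not how a real DAC reconstructs a signal):

    import numpy as np

    f = 1.0                                  # 1 Hz sine to be sampled
    t_fine = np.linspace(0.0, 2.0, 1001)     # dense "true" timeline
    true_signal = np.sin(2 * np.pi * f * t_fine)

    for rate in (2.5, 10.0):                 # samples/s: near-Nyquist vs 10x
        t_s = np.arange(0.0, 2.0001, 1.0 / rate)
        samples = np.sin(2 * np.pi * f * t_s)
        # Connect the dots with straight lines, as a naive display would:
        linear = np.interp(t_fine, t_s, samples)
        err = np.max(np.abs(linear - true_signal))
        print(f"{rate:>4} samples/s: max connect-the-dots error = {err:.2f}")
    # Near 2x the dots trace a jagged triangle-ish shape (large error);
    # at 10x the straight-line reconstruction already hugs the sine.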
So, if you're moving your mouse cursor quickly on a screen, or playing a game with fast-paced model movement, even if your eyes can only really sample at some comparatively low rate, the ideal monitor frequency might be several times higher. There's a lot of complexity throughout the system before we can get anything close to this: game engines being able to run at that high a framerate, video interfaces with enough bandwidth to drive that high a framerate, monitor technology being able to switch the crystals that fast, etc.
Yes, 48fps movies typically look less cinematic, but I think this is a flaw in movie-making technology and not of the framerate. The fight scenes in The Hobbit sometimes look fake because you can start to tell that they aren't actually beating up the other person. This detail is lost at 24fps, which is why they have been able to use these techniques. A perfect sawtooth wave actually contains infinitely high frequency content and thus can't be perfectly represented digitally.
I just upgraded to high-refresh-rate monitors this Christmas, to celebrate them working on Mesa. And yeah, they work pretty flawlessly, even in my now 3-monitor setup. And Quake. Holy shit. Playing that game at that refresh rate makes it feel incredibly real; even if it's blocky and pixelated, the movements are incredibly organic, and the camera turning feels like a head turning rather than spinning around on Google Maps.
Wait a second... It's my understanding that you just need 2x (two points per period of a sine wave) to reconstruct a unique wave. If you're getting a sawtooth, it means that you're sampling a wave that is composed of very high frequencies, and you're accurately sampling it, so a DAC can reconstruct it uniquely. Qiasfah on Dec 24, root parent next [—]. What that whitepaper is saying is: "if you only sample at 2x the max frequency and then connect the dots with straight lines, it doesn't really look like a sine wave, so buy 5x as much instrument from us".
That's a total cheat, as the sawtooth graph they show is only possible if you allow higher frequencies. If the signal is bandwidth-limited at the frequency of the sine wave, the points you sample at 2x that frequency have only one possible solution for the graph: the sine wave again.
So if our eyes really sample at 100Hz, we can't see anything above 50Hz. I just finished going through a Fourier Transform course. The technical answer is that you don't interpolate the samples with lines, but with the sinc function.
The sinc function is sinusoidal in shape and so it more naturally approximates waves. In this case 2x the max frequency is enough to reproduce the signal exactly. Using linear interpolation in the whitepaper is a blatant lie. I'm not sure this follows, as we're not perceiving waveforms when light hits our eyes; we're perceiving the intensity of energy hitting our receptors. This paper has a lot of false information in it. The sawtooth wave example is just not correct.
There is exactly one band-limited (i.e. containing no frequencies above half the sampling rate) signal that passes through a given set of samples. In the case of a sine wave sampled at twice its frequency, that solution is the exact sine wave that was produced. The video I linked to above has a demonstration of this signal reconstruction, using an analog oscilloscope to show that sine waves are reconstructed perfectly when sampled at only 2x the fundamental frequency.
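A minimal numpy sketch of that reconstruction (Whittaker-Shannon sinc interpolation), assuming an ideally band-limited sine and a generous window of samples; the 2.5 samples-per-period rate is an arbitrary just-above-Nyquist choice:

    import numpy as np

    fs = 2.5                       # sample rate: just above Nyquist for 1 Hz
    f = 1.0
    n = np.arange(-50, 51)         # a generous window of sample indices
    t_n = n / fs
    x_n = np.sin(2 * np.pi * f * t_n)

    def sinc_reconstruct(t, samples, t_samples, fs):
        """Whittaker-Shannon: x(t) = sum_n x[n] * sinc(fs * (t - n/fs))."""
        # np.sinc is the normalized sinc, sin(pi*x) / (pi*x)
        return np.sum(samples * np.sinc(fs * (t - t_samples)))

    for t in (0.10, 0.37, 0.73):
        rec = sinc_reconstruct(t, x_n, t_n, fs)
        print(f"t={t}: true={np.sin(2 * np.pi * f * t):+.4f}, "
              f"reconstructed={rec:+.4f}")
    # The values match to within truncation error, despite there being only
    # 2.5 samples per period -- no sawtooth appears.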
Ah ok, so here I think is the slight confusion. For example, Figure 2 in the article states that 2x sampling only provides frequency information, and not amplitude and shape. This is true if we assume that we're trying to directly reconstruct -any- periodic signal. Then, if we sample at only 2x of the signal's fundamental frequency, we are in fact stuck.
This can certainly cause confusion. So I think the usual way (I just dinker with DSP for funsies and a little bit at work, so I might have it mangled) to deal with this confusion is to remember that sawtooth and square and whatever signals are chock-full of high harmonics that also must be sampled at or beyond the Nyquist limit for you to be able to reconstruct them.
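A tiny numpy sketch of that point: building a square wave from its odd harmonics and keeping only the ones below Nyquist. The sampling rate and fundamental below are arbitrary assumed numbers:

    import numpy as np

    fs = 100.0                  # assumed sampling rate; Nyquist = 50 Hz
    f0 = 7.0                    # assumed square-wave fundamental
    t = np.linspace(0.0, 1.0, 2000)

    def square_series(t, f0, max_freq):
        """Fourier series of a square wave, keeping harmonics below max_freq."""
        x = np.zeros_like(t)
        k = 1
        while k * f0 < max_freq:            # odd harmonics: f0, 3*f0, 5*f0...
            x += (4.0 / np.pi) * np.sin(2 * np.pi * k * f0 * t) / k
            k += 2
        return x

    ideal = np.sign(np.sin(2 * np.pi * f0 * t))
    band_limited = square_series(t, f0, fs / 2.0)
    kept = len(range(1, int((fs / 2.0) / f0) + 1, 2))
    print(f"odd harmonics below Nyquist: {kept}")
    print(f"max deviation from the ideal square: "
          f"{np.max(np.abs(band_limited - ideal)):.2f}")
    # Only the 7, 21, 35, and 49 Hz components survive; everything above
    # 50 Hz would alias, which is why a perfect square (or sawtooth) can't
    # be sampled exactly.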
I see the same arguments arise about HFR as I do with stereoscopy, and the rhetoric follows the same pattern as the switch from vinyl to digital music formats: it is no longer art. It feels like you lose the artistic effect when you feed a multiple of the information to your brain. The reality is artists need to learn how to be mindful of the new medium, and the old tricks they used to overcome older media's defects need to be removed from the process.
Overuse of makeup is one such trick. I am excited because we have a bright future with better media technology, and pioneers like James Cameron are leading the way. MLR on Dec 25, parent next [—]. Film is art; it's important people remember that. HFR is just another tool, and it shouldn't be forced upon people.
You won't see people claiming 3D is an inherently superior format to film in; we shouldn't see the same for HFR. Conversely, if a director feels it's best for their film to use HFR, in full or in part, people shouldn't be jumping on their back about it until they've seen the end product.
Animats on Dec 25, prev next [—]. James Cameron (Titanic, Avatar, etc.) has been pushing for higher frame rates. He considers that more important than resolution, pointing out that higher resolution only benefits the first three rows in a theater.
With the low 24FPS frame rate, pans over detailed backgrounds look awful. This is a serious constraint on filmmaking. Cameron's films tend to have beautifully detailed backgrounds, and he has to be careful with pan rates to avoid "judder". The problem dates back to when good color and wide screen came in, and films contained gorgeous outdoor shots of beautiful locations. With, of course, pans. Some of the better Westerns of the period have serious judder problems. Directors then discovered the seven-second rule: a pan across the full frame should take at least about seven seconds.
Or they defocused the background slightly, if there was action in the foreground. The author's analysis of the human visual system is irrelevant for pans: during a pan, the viewer's eyes track the moving background, so the image is not moving with respect to the retina.
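A back-of-the-envelope Python sketch of why slow pans help; the 1920-pixel frame width and the pan durations are assumed numbers, with the seven-second figure taken from the rule mentioned above:

    # How far the background jumps between consecutive frames of a pan.
    WIDTH_PX = 1920   # assumed frame width

    def per_frame_shift(pan_duration_s, fps=24, width=WIDTH_PX):
        """Pixels the background moves between two consecutive frames."""
        return width / (pan_duration_s * fps)

    for seconds in (2, 7):
        print(f"full-width pan in {seconds}s at 24 fps: "
              f"{per_frame_shift(seconds):.0f} px jump per frame")
    # A 2-second pan jumps 40 px per frame, which strobes visibly over a
    # detailed background; stretching the pan to 7 seconds cuts it to ~11 px.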
It also felt like there was far too much depth of field; the depth was overwhelming. I can honestly say I found it visually repugnant at times (harsh words, I know), but you have to realize I almost RAN out of the theater within the first 5 minutes. Yet when I saw the exact same scene in 2D, guess what? I loved the lighting. The depth of field wasn't there anymore. The image was cinematic. And this was with the exact same scenes. And guess what else? I connected with the actors. I was left to let my eyes wander and tunnel-vision, if you will, to the detail or actor that I wanted to "listen" to or see.
I caught every joke and chuckled. I became immersed. And I found this absolutely fascinating, even stunning, to the point that I had to ask myself (even though I knew the answer) whether the same scene had been re-lit and re-shot in 2D (it wasn't; they simply used only one of the 2 cameras they shot with).
And this is coming from someone who has been studying lighting and the visual medium for 22 years. I then saw the same scene towards the end of the film with Gollum in all 3 formats. In 3D, I got into it and I actually liked it just fine. In 2D, I made the closest connection with the actors, even though one was but a CGI character performed by the pioneering and amazing actor Andy Serkis, who has defined motion capture. The first battle scene was also fascinating, and in many ways a death blow to 3D HFR for me.
The purpose of HFR is supposedly to make these very fast-moving scenes much easier to see. Everything was in focus and semi-sharp, but I didn't know where to look. I found it horrendous. The same scene in 2D was easy to follow, very dynamic, and poignant when the severed king's head rolled by at the end of the battle. Because of the motion blur, it worked. The 3D HFR? Not at all.
When Richard Armitage's character Thorin picked up a sword to cut the main opponent's forearm off, I couldn't make out the sword in the 3D HFR at all, ironically. And this was confusing, as he had been fighting the creature with the trunk of a tree which had been split in two; I didn't know how he'd managed to sever an arm with half of a tree trunk.
In the 2D version, my eye was able to "punch in" on the wider frame and easily catch him picking up a sword. So with all of this, here's the "Master Class" that I took away, and that Peter Jackson shared with every filmmaker out there who is willing to study these 3 versions of the same film: shallow depth of field, motion blur, lack of sharpness, and movement all help to create movie magic. If images are too sharp and you see too much detail... The Canon 5D MKII showed us that in many ways: its large sensor and resulting shallow depth of field, combined with what was a relatively "soft" image relative to video cameras, made it what it was when I shot "Reverie".
High frame rates belong on bad TV shows and perhaps sports. That is, unless this next generation of video game players changes the rules on us, of course. I can see this working for animation, sports, and nature films though. I'd also like to see it used on only certain moves (fast ones) in a film perhaps, and not the entirety of a film. It highlights the weaknesses of both techniques exponentially. In fact, just yesterday afternoon a VFX friend of mine said, verbatim: "Motion blur is extremely important to what I do..." This latest technological "advance" reaffirms one of my key beliefs: we're far too focused on technology these days, and we are creating a lot of distractions from what can make a film truly powerful.
So many of these new technologies threaten the magic of film by making the experience a little too "hyper-real" if you will. Having only one of 8 characters in focus during an important soliloquy, or another person crossing frame out of focus and motion-blurred, can be a good thing to make the audience become more immersed in the film. Something to think about. I wonder if this was a combination of not being able to focus my eyes on the lips when things were tough to hear in the 3D version, or if I was just being overwhelmed visually and couldn't refocus my mind on paying attention to the dialogue. I noticed this pretty acutely in the scene with Gollum, as he was hard to understand at times.
As we invariably move towards 4K, directors will need to make sure that things look as "real" on set as possible. It's damn hard to hide your cheats. I'd like to selfishly think that this will lead us to shoot things more practically than with effects shots. That can also prove limiting for filmmakers, as they may have to choose to limit how fast they move the camera and/or how fancy their moves are in terms of speed and degree of focusing difficulty.
That being said, if the film is projected at 2K, this isn't an issue. Most films shot on the RED Epic at 5K, such as Fincher's Girl with the Dragon Tattoo, have only been finished in 2K, which is why people aren't really talking about this much out there yet. I have seen more than a dozen 4K projections: when the production value of the film is high, the makeup and wardrobe good, the lighting excellent, and the focus pulling excellent, with attention to not moving the camera too fast, it looks STUNNING.
If you fail to do any of the aforementioned, it can be deadly. Absolutely unforgiving.