Archive for October, 2016

24 fps vs 48/60 fps in movies

I watched the trailers of The Hobbit at 48fps and Avatar’s at 60fps (as their directors wished), and I really don’t like the smoothness. In principle, I was originally a high frame rate advocate for cinema movies (thinking that 24fps is simply a relic of a century-old tech requirement); now, however, I’m against increasing the frame rate for regular films. The difference between my old opinion and my new one is that now I know why 48 or 60 fps don’t work as well as 24 fps. I had to become a collage artist to understand why.

The reason is that a high frame rate becomes extremely distracting. More frames mean far more information on the screen. The eye and the brain get way too tired following and analyzing it all (they follow it by default; you can’t turn off that natural process). Because of that, the brain runs out of steam to follow the story fully, and so the movie fails, because the story doesn’t shine through.

The same is true for color: if you look at Hollywood color grading, apart from black & white, only one or two additional color families are actively visible. For example, you will get red, yellow, and that teal color that covers both green and blue. Basically, the fewer color families on screen, the less processing the brain has to do, and the deeper the viewer dips into the story.

It’s the same in collage: the fewer the color families used, the more successful a collage is. Otherwise, it looks like a mess.

Having said all that, there is a future for high frame rate (and more colors), but only in VR, or in futuristic systems where the image is projected directly onto the brain. Then, yes, there’s a requirement for “more”, since the whole point of VR is to “fool” the brain about the reality it displays.

But for TV & film, which are projected away from ourselves and perceivable by only one of our senses (so they must provide less information in order to be processed fast enough), “less is more”. That’s why 24fps is here to stay for these mediums in particular.

iPod Touch sensor crop and wide field of view

I received my Moondog Labs anamorphic lens for mobile devices today, and ran some tests. It is wonderful to have such a wide field of view with a device like the iPod Touch 6th gen. As you can see in the picture, the image is way wider than when shooting in standard 1080p. But don’t make the mistake of thinking that this is all the anamorphic lens’ doing. There are THREE factors that extend the field of view that much (a back-of-the-envelope sketch of how they combine follows the list):

1. Shooting in 3k instead of 1080p (using the ProMovie app). The sensor gives you a completely different field of view, because the sensor crop is smaller. This is the biggest hack you can do to get a wider field of view (it’s even bigger than the anamorphic lens hack!).

2. The anamorphic lens.

3. Turning off stabilization (which means that you must have some sort of other stabilizer at hand to shoot properly).
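
To get a feel for how these three factors multiply together, here is a back-of-the-envelope Python sketch. The two crop fractions are hypothetical placeholders (Apple does not publish the exact sensor crop of each mode), so treat the output as an illustration only; the 1.333 stretch is the lens’ stated factor.

```python
# How the three factors multiply into a wider horizontal field of view.
# The two crop fractions are HYPOTHETICAL -- measure your own device.

ANAMORPHIC_STRETCH = 1.333     # Moondog Labs lens (vendor-stated)

crop_1080p_stabilized = 0.70   # assumption: 1080p mode + stabilization crop
crop_3k_no_stab = 0.95         # assumption: 3k mode, stabilization off

widening = (crop_3k_no_stab / crop_1080p_stabilized) * ANAMORPHIC_STRETCH
print(f"~{widening:.2f}x wider than stabilized 1080p")
# With these placeholder numbers: ~1.81x. Note that the sensor-crop part
# (0.95 / 0.70 = ~1.36x) is bigger than the lens' 1.333x, which is exactly
# the point made in factor 1 above.
```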

ipod-touch-crop

Here’s the test video I shot today:

Here’s how I shot it:

1. I used the ProMovie iOS app, which allows me to record at 3k (3200×1800 resolution). I used 100 mbps, at 24 fps. I locked the shutter speed at 1/48th, and then set and locked the ISO to get the exposure I wanted. I also set and locked focus and white balance. The ProMovie app even has an anamorphic screen view! I set stabilization to OFF (that’s why the video is very shaky). Obviously, when shooting something serious, use a tripod or a stabilizer/gimbal.

2. When using the Moondog Labs anamorphic lens, you apply the 1.333 pixel aspect ratio in the project properties and on each clip (I use Sony Vegas), and the effective resolution becomes 4267×1800 (the arithmetic is shown in the sketch after this list).

3. Then, I color graded it this way, plus I added the FilmConvert filter with its “FJ H160 Pro” template, and also tweaked the template’s levels a bit.

4. Then, I exported at exactly 3840×1620, at 100 mbps bitrate (I exported no audio in my case). If using Sony Vegas, you must “disable resample” on all clips in the timeline before you export. Then, I uploaded to YouTube. By the way, it is very important to export at the exact resolution stated above for 4k anamorphic; otherwise, people with ultra-wide monitors will get black bars on all four sides! The above resolution is ultra-wide UHD (3840 px wide).
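
To make the numbers in steps 2 and 4 concrete, here is a minimal Python sketch of the de-squeeze and export arithmetic, using only the resolutions stated above:

```python
# De-squeeze and export arithmetic for the 3k anamorphic workflow above.

RECORDED_W, RECORDED_H = 3200, 1800   # ProMovie "3k" mode
PIXEL_ASPECT = 4 / 3                  # the 1.333 anamorphic stretch

effective_w = RECORDED_W * PIXEL_ASPECT   # 4266.67, rounds to 4267
aspect = effective_w / RECORDED_H         # ~2.37:1

EXPORT_W = 3840                           # ultra-wide UHD width
export_h = EXPORT_W / aspect              # exactly 1620

print(round(effective_w), f"{aspect:.2f}:1", round(export_h))
# -> 4267 2.37:1 1620
```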

The Reflex

I slept a bit this morning, after my sunrise walk, and not surprisingly, I had a lucid dream. I was caught in the middle of a game between two factions, taking place in my own house, and I thought I had to take part. Soon, I realized it was just a game and that these were not real adversaries. So, I approached one of the entities (they were not human), and started asking questions about the reality of everything.

I don’t remember much of the early questions, but the answers didn’t really surprise me, which means that they were in sync with my own cosmic ideas. But towards the end, there was a Q&A that did surprise me.

I asked if there was a chance that our scientists would eventually be able to become aware of their dimension, or other dimensions.

The reply was:

“They already did that in an experiment in Shanghai. Chinese scientists call this ‘The Reflex’.”

Whoah!

What really surprised me wasn’t the fact that some scientists might have gone a bit “too far” with their physics experiments, but rather the fact that they called this a “reflex”, and not an “echo”. You see, the reality I was in during my lucid dream was very close to our own. My house was probably about 80% the same as it is in my waking state. So if an experiment were able to identify a similar arrangement of matter in another operating frequency, they would normally call it an “echo”: being the materialist scientists that they probably are, they would prefer the explanation that this is simply an echo of our matter in another frequency, rather than a whole dimension in its own right.

But calling it a “reflex” might mean (my interpretation) that whatever changes here or there has a reflective capability on the matter at the other end. So basically, the two (or more) frequencies of reality are both connected and separate, and one can influence the other (if only at the quantum level).

If what I was told is true, the scientists already know that there are other “dimensions” of reality, and not just “echoes” of our own. There is a huge difference between the two! Also, I was not specifically told that there was communication between these scientists and the lifeforms that live there. I don’t think there is any, to be honest.

So yeah, there was that today…

How to color correct for iOS devices

It is definitely possible to shoot a movie with a mobile device, just like the Sundance movie “Tangerine” did. And in fact, today it would look much better than Tangerine did (which had pretty bad lighting throughout). Being the bad girl I am, I ordered the Moondog Labs anamorphic lens with a 52mm filter ring, so I can add a variable ND filter for outdoor shots and shoot anamorphically for more cinematic shots.

So, I spent $150 to buy FilmConvert today too, only to find out that I could do a better job grading my iPod Touch 6th Gen footage myself. Click the images below to see the before and after properly (click through again to see them in full resolution).

For this tutorial, you will need the Sony/Magix Vegas video editor.

1. Shoot your movie with either Filmic Pro, or, if you have an iPod Touch instead, the ProMovie app. ProMovie allows up to 100 mbps bitrate, and on certain newer iPods it can shoot at 3k instead of just 1080p. At the end, you will be exporting at 1080p (or 2k) again, but you will have a wider angle and more pixels to work with than when shooting in 1080p.

2. Make sure lighting is adequate indoors. If shooting outdoors, always have the sun on your back.

3. Set your app at 24 fps, and lock the shutter speed at 1/48th. Lock white balance to the best value you can, and lock focus. For ISO, observe the exposure meter, and always lock the ISO half a stop below what the app thinks is the best exposure, because mobile apps tend to overexpose. This is mostly true for outdoor or brightly lit scenes. (A small sketch of this exposure math follows step 6.)

4. Record (preferably with a tripod or a gimbal), and save the video in your gallery (there’s a small icon to do that). Connect your device, and copy the MOV file(s) over to your PC.

5. In Sony Vegas, it’s very important to set the right project properties to match the clips (right frame rate, tell it it’s progressive, etc).

6. Bring the footage into your timeline. Select all the clips in the timeline, right click on any of them, and hit Switches/Disable-Resample. If you don’t do that for ALL the clips in your timeline, you will end up with “ghosting” (blurred images).
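
Here is a small Python sketch of the exposure math from step 3: the 1/48th shutter is the classic 180-degree rule applied to 24 fps, and “half a stop below” amounts to dividing the metered ISO by √2. The metered ISO of 400 below is just a hypothetical example value.

```python
from math import sqrt

def shutter_for(fps: float) -> float:
    """180-degree shutter rule: expose each frame for half the frame interval."""
    return 1 / (2 * fps)

def iso_half_stop_under(metered_iso: float) -> float:
    """Half a stop less light = divide the ISO by sqrt(2)."""
    return metered_iso / sqrt(2)

print(shutter_for(24))           # 0.020833... seconds, i.e. 1/48th
print(iso_half_stop_under(400))  # ~283 -- lock the nearest available ISO
```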

ipod-1b

7. Pick a clip in the timeline, and click the little + icon at its far right to add plugins to it. In the new window that opens, click the little + icon again on the right of the window, and add, in this order (a rough sketch of what these adjustments do to pixel values follows the note after step 8):
– White Balance: amount 0.100
– Saturation Adjust: pick the preset “Reduce minor color noise”
– Brightness & Contrast: Brightness -0.040, Contrast: 0.075
– Color Corrector: Saturation 0.800
– Gaussian Blur: 0.0003 for both horizontal & vertical ranges

8. For exterior, sunny shots, it’s the same as above, except for a few small changes:
– Brightness -0.040, contrast 0.000
– Color Corrector: Saturation 0.750

No scene is the same as another, so you will need to slightly adjust the above to better match your scenes.
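
If you want intuition for what these numbers actually do to pixel values, here is a rough numpy illustration using textbook formulas. This is not Vegas’ exact internal math (which isn’t documented), just the general idea behind each adjustment in the interior chain:

```python
import numpy as np

def grade_interior(rgb: np.ndarray) -> np.ndarray:
    """Textbook approximations of the step 7 settings, on 0.0-1.0 RGB."""
    rgb = rgb - 0.040                              # Brightness -0.040
    rgb = (rgb - 0.5) * (1 + 0.075) + 0.5          # Contrast 0.075 around mid-grey
    luma = rgb @ np.array([0.299, 0.587, 0.114])   # Rec.601 luma weights
    rgb = luma[..., None] + 0.800 * (rgb - luma[..., None])   # Saturation 0.800
    return np.clip(rgb, 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3)   # stand-in for a video frame
print(grade_interior(frame).shape)      # (1080, 1920, 3)
```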

ipod-2b

9. After you have color corrected all clips separately, click the + plugin icon on the left of the video timeline (that’s the icon for the global plugins). Add the “Levels” plugin, and select the “Computer RGB to Studio RGB” preset. This will make your footage look “flat”. That’s OK; it won’t look like that when it’s rendered at the end. We need to do this because otherwise all h.264 exports will come out way too contrasty (they would differ from your Vegas working preview, and this plugin prevents that). A sketch of what this preset does follows the export screenshot below.

10. Export by clicking File/RenderAs and opening the MainConcept AVC/AAC format. Select the “Internet HD 1080p” template, and click “Customize Template”. Make it look exactly like this (and give the AAC audio 160kbps at the very least). Then, upload the resulting .MP4 file to YouTube if desired.

export
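
For the curious, the “Computer RGB to Studio RGB” preset from step 9 corresponds to the standard full-range to studio-range levels mapping, sketched below; Vegas’ preset should be equivalent to this textbook formula:

```python
def computer_to_studio(v: int) -> int:
    """Map a full-range 0-255 code value into the 16-235 studio range."""
    return round(16 + v * (235 - 16) / 255)

print(computer_to_studio(0), computer_to_studio(128), computer_to_studio(255))
# -> 16 126 235: blacks lift to 16 and whites drop to 235, which is
# exactly why the preview looks "flat" until the final h.264 render.
```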

Note: Interior shots might need denoising. You can do that using the Neat plugin (commercial), or by bringing your noisy scenes into Photoshop one by one (use an intermediate codec in that case). I used Photoshop above for the interior shot of my living room.

Note 2: A very interesting Vegas plugin is the LAB Adjust. With it, you can mute overly green (or overly orange) colors by nudging “Channel b” very slightly (bring it towards the left). Some shots might require this plugin. Hollywood movies have strong reds and blues, but greens are rather muted.
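
For reference, in standard CIELAB the b channel is the yellow-blue axis (a is the green-magenta one), so Vegas’ slider naming may not match the textbook axes exactly. Here is a rough approximation of the trick in Python, assuming scikit-image is installed; it mirrors the idea, not the plugin’s exact behavior:

```python
import numpy as np
from skimage import color   # assumption: scikit-image is available

def mute_channel_b(rgb: np.ndarray, shift: float = -2.0) -> np.ndarray:
    """Nudge LAB's b channel slightly negative ("towards the left")."""
    lab = color.rgb2lab(rgb)          # L: 0-100, a/b: roughly -128..127
    lab[..., 2] += shift              # channel b: yellow (+) <-> blue (-)
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

frame = np.random.rand(720, 1280, 3)  # stand-in for a video frame
print(mute_channel_b(frame).shape)    # (720, 1280, 3)
```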