February 23, 2017 at 9:37 pm #4024
Animation Pagoda Staff (Moderator)
Digital vs. Film
Digital cameras have made indie filmmaking affordable and accessible to everyone. However, many purists still prefer to shoot on analog film. The main advantage film has over digital is the nostalgic quality of its grain, which digital footage supposedly lacks. If you're a cinephile, you might be able to tell whether a video was shot on film or digitally, but to most viewers the two look about the same. Film may be the better option if you prefer working with physical media rather than software.
Working with Film
8 mm is the classic amateur film gauge. Super 8 film has the same 8 mm width, but yields a slightly larger image because its perforations are smaller and spaced wider apart. 16 mm and 35 mm are two larger gauges. 35 mm and 70 mm are used by Hollywood movie studios to capture the detailed images shown on big theater screens, though the majority of studios are transitioning entirely to digital. Film isn't completely dead, though. It's possible to shoot on film and scan the reel into a computer for editing without any adverse effects.
There are some things to be aware of when working with film. The equipment is expensive. Reels can be heavy and difficult to transport: a 3 inch reel holds only 50 feet of film, about four minutes of footage, while a 5 inch reel holds 200 feet, about 15 minutes. Developing film is slow and involves chemicals that are not particularly environmentally friendly. Old cellulose nitrate film stock is highly flammable (modern film uses safer polyester or acetate bases). Film stock scratches and fades. Editing by hand (cutting, splicing, and taping) isn't as glorious as some people say it is, and it's nowhere near as efficient as editing in software. Most old film projectors are being replaced by digital projectors. On the other hand, film captures high quality images with exceptional levels of information, so it remains a good archival medium.
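As a rough sanity check on those reel runtimes, you can work backwards from feet of film to minutes of footage. This sketch assumes regular 8 mm stock (80 frames per foot) projected at the traditional silent speed of 16 fps; those two figures are typical values I'm assuming, not numbers from the post.

```python
# Rough reel runtime: frames on the reel divided by projection speed.
# Assumes regular 8 mm film (80 frames per foot) projected at 16 fps --
# both are common textbook values, swap in your own gauge and speed.

def reel_runtime_minutes(feet, frames_per_foot=80, fps=16):
    """Approximate runtime in minutes for a reel of film."""
    total_frames = feet * frames_per_foot
    return total_frames / fps / 60

print(round(reel_runtime_minutes(50), 1))   # 3 inch reel, roughly 4 minutes
print(round(reel_runtime_minutes(200), 1))  # 5 inch reel, roughly 15-17 minutes
```

The same formula works for 16 mm or 35 mm once you plug in that gauge's frames-per-foot count and a 24 fps sound speed.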
Aspect ratio determines the proportions of the video frame. 16:9 is the most common standard for HD; 4:3 is the old standard. Most modern televisions, monitors, and computer screens are manufactured widescreen, so 16:9 is the slightly more adaptable format. Widescreen footage shown on a 4:3 display gets black letterbox bars, but that is preferable to having the image compressed horizontally, which makes it look squashed.
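The letterbox bars above are simple arithmetic: scale the footage to fit the screen's width, and whatever height is left over splits evenly into the top and bottom bars. The 640x480 screen below is just an example resolution I picked, not one from the post.

```python
# Letterboxing: when 16:9 footage is shown on a 4:3 display, the image is
# scaled to fill the screen's width and black bars fill the leftover height.

def letterbox_bar_height(screen_w, screen_h, content_w=16, content_h=9):
    """Height in pixels of each black bar (top and bottom)."""
    scaled_h = screen_w * content_h / content_w  # content height at full screen width
    return (screen_h - scaled_h) / 2

print(letterbox_bar_height(640, 480))  # 60.0 pixels top and bottom
```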
24 frames per second is the long-standing standard for film, adopted in the early sound era as roughly the slowest rate that still reads as smooth motion. Below about 24 fps, motion shows the undesirable flickering jitter seen in flipbooks and old movies. Higher frame rates of 30, 48, or 60 fps can capture extra motion detail, but they should only be used when necessary, since they consume more storage often without a noticeable difference in quality. Very high frame rates of 120-300 fps are used for creating slow-motion effects.
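The storage cost of higher frame rates scales linearly, since every extra frame per second is another frame to store. A quick sketch of the frame counts involved (the durations are example values):

```python
# Frame count grows linearly with fps, so storage does too
# (assuming the same per-frame size, which real codecs complicate).

def total_frames(duration_s, fps):
    """Number of frames captured over a duration at a given frame rate."""
    return duration_s * fps

minute_at_24 = total_frames(60, 24)  # 1440 frames
minute_at_60 = total_frames(60, 60)  # 3600 frames
print(minute_at_60 / minute_at_24)   # 2.5x the frames, roughly 2.5x the storage
```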
Motion Blur and Motion Smoothing
Broadcast networks use different standard frame rates than Hollywood. These standards are based on the refresh rates of the display equipment, namely HD monitors and television screens. American broadcast television uses 29.97 fps (NTSC), while European broadcast television uses 25 fps (PAL). YouTube and Vimeo also have slightly different codec requirements.
These differences are small enough that they don't cause any perceptible change in viewing quality. However, the recording frame rate can have an impact on motion blur. Motion blur is a visual effect that occurs naturally due to persistence of vision. Movie film integrates motion blur as a vital component of the Hollywood cinematic look that audiences have grown accustomed to. Digital displays can't cycle through frames the way a film projector does, but they can reproduce motion blur artificially.
Television shows often have less action and more emphasis on dialogue, so many TVs smooth out motion using an automated process called motion interpolation (motion smoothing). It synthesizes extra frames between the existing ones, which reduces judder and perceived blur and makes people and objects stand out from the background in sharper contrast. Depending on the situation, the lack of motion blur can look slightly artificial to the human eye, the so-called "soap opera effect." Motion smoothing can usually be turned on or off easily in a TV's display settings.
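The simplest way to picture what motion interpolation does is to synthesize an in-between frame from its two neighbors. Real TVs estimate motion vectors rather than just blending pixels, so this is strictly a toy illustration of the "extra frames between the existing ones" idea:

```python
# Toy motion interpolation: make an in-between frame by linearly blending
# two neighboring frames (each represented here as a flat list of pixel
# values). Real motion smoothing uses motion-vector estimation instead.

def blend_frames(frame_a, frame_b, t=0.5):
    """Interpolated frame at position t between frame_a (t=0) and frame_b (t=1)."""
    return [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

frame1 = [0, 100, 200]
frame2 = [50, 150, 250]
print(blend_frames(frame1, frame2))  # [25, 125, 225]
```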
Camera lenses are named after their focal length. Different lenses can give your shots different looks. 14mm is close to your average smartphone camera. 50mm is the standard for general shots. 85mm is usually said to be good for portraits. Telephoto lenses (85mm and up) are used for faraway shots where you want to bring the subject in close, and telephoto zooms magnify shots even more.
Fisheye lenses have high distortion around the edges like a convex mirror, and are used mainly for experimental effects. Wide angle lenses are generally used for panoramic or landscape shots.
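The relationship between focal length and how much of the scene a lens sees follows a standard formula: angle of view = 2 * atan(sensor width / (2 * focal length)). This sketch assumes a full-frame sensor 36 mm wide, which the post doesn't specify; crop sensors give narrower angles at the same focal length.

```python
import math

# Horizontal angle of view from focal length, via the standard formula
# fov = 2 * atan(sensor_width / (2 * focal_length)).
# Assumes a full-frame sensor (36 mm wide) -- an assumption, not from the post.

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for f in (14, 50, 85):
    print(f"{f} mm -> {horizontal_fov_deg(f):.1f} degrees")
```

This makes the post's rules of thumb concrete: shorter focal lengths like 14mm take in a very wide view, while 85mm and up see only a narrow slice of the scene, which is what "zooming in" on distant subjects means.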
When to use lens filters
Filters can be fairly pricey for something you might not use all that frequently, but in certain cases they can really add some nice effects to your shots if you know how to use them properly. The most common types of lens filters are UV, ND, polarizer, tilt-shift, hood, and various types of color gels. It's also possible to create your own homemade filters, but be careful not to touch the camera lens, since its sensitive coating scratches easily.
There are ways to adjust lighting, contrast, and audio in post, but that can be time consuming and doesn't always work. Getting the best quality footage the first time around is always the most efficient approach and leaves less work to do later. Always double check that your camera and recording equipment settings are where you want them to be. Staged scenes and lighting may be necessary. If possible, plan the location and the time of day you shoot. Make sure the battery is fully charged and take extra footage, preferably in relatively high resolution. Things can always be scaled down in post.
Blurry Footage or Low Resolution Images
The camera was probably out of focus, or else the video dimensions weren't set at a high enough resolution. If sharpening the image and reducing noise in After Effects doesn't help, there's not much that can be done to salvage the footage other than reshooting. Unfortunately, there isn't any way to magically make low-resolution footage look high quality.
You could reduce camera shake in post with After Effects' stabilization tools (such as Warp Stabilizer), but it is generally recommended to shoot with a tripod. If a tripod won't work for your scene, there are DIY methods for steadying shots using string, PVC pipes, or homemade dollies.
Different cameras and lenses can produce very different moods in a video, depending on the context. It takes some practice to become familiar with the subtle effects different types of equipment will produce, but a good director can take advantage of technology quirks to create more interesting shots.
DSLR cameras are fairly versatile, but GoPro cameras are specially tailored for extreme POV action videos. They are built to be easy to carry around or clip onto equipment. The fisheye lens causes distortion, but captures a wider angle of view. This might not look very good in the context of a general conversation scene between two characters, but it could produce a cool effect for a chase sequence or a flashback.
Three-point lighting is the key to lighting most scenes. A regular lighting setup just fills a scene and is kind of generic and boring, but by adjusting the lighting levels and positioning it is possible to create more interesting and dramatic effects. Generally you don’t want to have many lights in one scene. Be sure to think about other sources of light such as windows or light fixtures. You don’t have to have expensive equipment to light a scene, but renting specialized lighting equipment is an option.
When adding audio tracks to video, it is generally recommended to cut on the downbeats of the music. For animation, creating exposure sheets breaks down every sound or syllable so it is easier to time actions with the audio. Audio syncing can be a very time-consuming process. In the industry, ADR (Automated Dialogue Replacement) voice dubbing is one of the most important post-production jobs.
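Cutting on downbeats is just timecode arithmetic: at a known tempo, each beat lands a fixed number of seconds apart, and multiplying by the frame rate gives the frame to cut on. The tempo and frame rate below are example values, not from the post.

```python
# Map musical downbeats to frame numbers so cuts (or animation actions on an
# exposure sheet) land exactly on the beat. Tempo and fps are example values.

def beat_frames(bpm, fps, num_beats):
    """Frame number of each of the first num_beats downbeats."""
    seconds_per_beat = 60.0 / bpm
    return [round(i * seconds_per_beat * fps) for i in range(num_beats)]

print(beat_frames(bpm=120, fps=24, num_beats=4))  # [0, 12, 24, 36]
```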