Thursday 25 October 2012

Doppler Effect 2

Here are some more clips showing how the Doppler effect works.












Wednesday 24 October 2012

Production Techniques
Sequencing
In film and TV, the audio portion of a project is recorded separately from the video. Unlike your home video camera, the film or video cameras used in professional productions don't have built-in microphones. Instead, all dialogue is recorded with either a boom or a tiny, wireless lavalier mic that can be hidden in an actor's clothing. Most other audio, like ambient background noise and music, is added in post-production.
Sequencing is putting all the parts of a song together. For sound designers working in post-production, sequencing means putting all the sound effects, Foley effects, background noise and music together and timing them with what's going on in the footage. By sequencing these sounds you move them around within the realm of the footage to arrange them in the way you want them to come across.
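To make the idea concrete, here is a minimal sketch in Python of what sequencing boils down to: placing sound clips at chosen timestamps on one timeline. The file names and the timings are just made-up placeholders, and it assumes every clip was recorded at the same sample rate.

```python
# A minimal sketch of sequencing: placing clips at chosen timestamps on a
# single timeline, the way a sound designer lines effects up with footage.
# File names and times are placeholders; all clips are assumed to share
# the same sample rate.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 48000  # common film/video audio rate

def load_clip(path):
    """Load a WAV file and return it as mono floating-point samples."""
    rate, data = wavfile.read(path)
    if data.ndim > 1:                      # fold stereo down to mono
        data = data.mean(axis=1)
    return data.astype(np.float32)

# (timestamp in seconds, clip) pairs -- the "sequence"
events = [
    (0.0, load_clip("ambience.wav")),      # background noise under everything
    (2.5, load_clip("door_slam.wav")),     # Foley effect timed to the picture
    (4.0, load_clip("music_cue.wav")),     # music entering after the slam
]

# Build a timeline long enough to hold the last event, then mix each
# clip in at its offset by simple addition.
end = max(int(t * SAMPLE_RATE) + len(c) for t, c in events)
timeline = np.zeros(end, dtype=np.float32)
for t, clip in events:
    start = int(t * SAMPLE_RATE)
    timeline[start:start + len(clip)] += clip

# Normalise and write out as 16-bit audio.
timeline /= max(1.0, np.abs(timeline).max())
wavfile.write("sequenced_scene.wav", SAMPLE_RATE, (timeline * 32767).astype(np.int16))
```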
Synthesis
Synthesizers are almost always used in Sci-Fi and horror films because they can produce otherworldly sounds. But for straightforward emotion, horns are used too. These are associated with pageantry, the military, and the hunt, so they are used to suggest heroism. Movies featuring death-defying heroes such as Star Wars and RoboCop use a lot of horns.
In this short clip, the sound of the steam-powered mono-wheel the boy rides was created by synthesis.
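This is not how the mono-wheel effect was actually made, but here is a rough sketch of what synthesis means in practice: building a sound from nothing but maths, rather than from a recording.

```python
# A rough sketch of synthesis: two slightly detuned oscillators beating
# against each other, with a slow LFO wobbling the volume -- the kind of
# drone sci-fi and horror scores lean on. Purely illustrative.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
DURATION = 4.0                                   # seconds
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Two sine oscillators a couple of hertz apart beat against each other.
osc = np.sin(2 * np.pi * 110.0 * t) + np.sin(2 * np.pi * 112.5 * t)

# A slow low-frequency oscillator (LFO) pulses the volume, which is what
# gives many "otherworldly" drones their unsettling throb.
lfo = 0.5 * (1 + np.sin(2 * np.pi * 0.8 * t))
drone = osc * lfo

# Normalise and write out as 16-bit audio.
drone /= np.abs(drone).max()
wavfile.write("synth_drone.wav", SAMPLE_RATE, (drone * 32767).astype(np.int16))
```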
Sampling
A software sampler is a piece of software which allows a computer to emulate the functionality of a sampler.
In the same way that a sampler has much in common with a synthesizer, software samplers are in many ways similar to software synthesizers, and there is a great deal of overlap between the two. But whereas a software synthesizer generates sounds algorithmically from mathematically described tones or short-term waveforms, a software sampler always reproduces samples, often much longer than a second, as the first step of its algorithm.
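To show the contrast with the synthesis sketch above, here is a small sketch of what a software sampler does: it starts from a recording and plays it back, here repitched by simple resampling. "hit.wav" is a placeholder file name.

```python
# A sampler sketch: take one recorded sample and replay it at different
# pitches by resampling (classic sampler behaviour, where pitch and speed
# change together). "hit.wav" is a placeholder file name.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

rate, sample = wavfile.read("hit.wav")
if sample.ndim > 1:
    sample = sample.mean(axis=1)
sample = sample.astype(np.float32)

def play_at_pitch(sample, semitones):
    """Return the sample repitched by a number of semitones."""
    factor = 2 ** (semitones / 12.0)        # shorter = higher, longer = lower
    new_length = int(len(sample) / factor)
    return resample(sample, new_length)

# The same recording an octave down, unchanged, and a fifth up, in a row.
versions = [play_at_pitch(sample, s) for s in (-12, 0, 7)]
out = np.concatenate(versions)
out /= np.abs(out).max()
wavfile.write("sampler_demo.wav", rate, (out * 32767).astype(np.int16))
```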
Equalizing
In film sound, the sound designer matches sound to the look of the film. A sad movie has mood lighting, and the sound will be designed to match it in emotional tone. Its dialogue is EQ'd less crisply, with a lower-frequency boost.
In a happy comedy, lower frequencies are rolled off, and it's EQ'd and mixed to be "brighter."
Film sound is "sweetened" by manipulating room tone, premixing audio levels, and carefully considering dialogue, music, and effects for their proper audio EQ.

Film sound expects post-production sweetening, which makes film audio sound so different from audio for video. Video sound can be sweetened, but indie productions use it pretty much as it is recorded.

EQ can also alter the frequencies of the human voice to make it sound like it is coming through a phone, which is useful in a scene where a character is on the phone and you hear the voice of the person on the other end.
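Here is a hedged sketch of that telephone trick: a band-pass filter that keeps roughly 300-3400 Hz (the old telephone bandwidth) and rolls off everything above and below it. "dialogue.wav" is a placeholder file name.

```python
# Telephone-style EQ sketch: keep only the 300-3400 Hz band of a voice
# recording so it sounds like it is coming down a phone line.
# "dialogue.wav" is a placeholder file name.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

rate, voice = wavfile.read("dialogue.wav")
if voice.ndim > 1:
    voice = voice.mean(axis=1)
voice = voice.astype(np.float32)

# 4th-order Butterworth band-pass across the telephone band.
b, a = butter(4, [300, 3400], btype="bandpass", fs=rate)
phone_voice = lfilter(b, a, voice)

phone_voice /= np.abs(phone_voice).max()
wavfile.write("phone_voice.wav", rate, (phone_voice * 32767).astype(np.int16))
```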
Mixing
The key to mixing audio is to make it sound exactly how you want it to sound and to make the recording even better. You do this by adding tools like a compressor, which reduces the dynamic range so that nothing is too loud or too quiet. For a sound effect, though, you might want something to start at a low volume and then increase; in that case you would not use a compressor but a fader or filter instead. It all depends on what kind of sound you are going for. Another useful tool is a noise gate, which eliminates any background noise in a sound below a certain threshold.
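As a very simplified sketch of those two tools, the snippet below gates out quiet background noise and then squashes anything over a threshold. Real plug-ins add attack and release smoothing; this only shows the core idea, and "effect.wav" is a placeholder file name.

```python
# Crude noise gate + compressor sketch. A real gate/compressor works on a
# smoothed level with attack and release times; this is per-sample only.
import numpy as np
from scipy.io import wavfile

rate, audio = wavfile.read("effect.wav")          # placeholder file name
if audio.ndim > 1:
    audio = audio.mean(axis=1)
audio = audio.astype(np.float32)
audio /= np.abs(audio).max()                      # work in the -1..1 range

def noise_gate(x, threshold=0.05):
    """Mute samples whose level falls below the threshold."""
    return np.where(np.abs(x) < threshold, 0.0, x)

def compress(x, threshold=0.5, ratio=4.0):
    """Reduce the level of anything above the threshold by the ratio."""
    over = np.abs(x) > threshold
    squashed = np.sign(x) * (threshold + (np.abs(x) - threshold) / ratio)
    return np.where(over, squashed, x)

mixed = compress(noise_gate(audio))
wavfile.write("mixed_effect.wav", rate, (mixed * 32767).astype(np.int16))
```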


Non-Pitched Atmospheric Sounds

Non-pitched atmospheric sounds are used in filmmaking to create a similar feeling to what the score tries to establish, but in a more subtle fashion. They aren't usually very noticeable unless you're listening for them, but they will still have an effect regardless of whether you're aware of a sound being played.
A brilliant example of the effectiveness of atmospheric sounds is Oren Peli's "Paranormal Activity". Often throughout the film, when there is a silence, an extremely low rumble is used to create subtle suspense and a feeling of uneasiness in the viewer.




Note how the extremely low frequency sound stops as soon as the door slams shut. The sound is used to build suspense, to make the viewer aware that something is going to happen. This is very effective on high-end sound systems such as those in cinemas, where the subwoofers can reproduce extremely low sounds. Found-footage films like this generally have no score or non-diegetic sounds, so these techniques are relied on to keep things subtle and make the film feel more realistic than a high-budget blockbuster.
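One way a rumble like this could be built (not necessarily how Paranormal Activity did it) is to push random noise through a steep low-pass filter so that only the sub-bass survives:

```python
# Sketch of a non-pitched sub-bass rumble: low-pass filtered noise with a
# hard fade at the end, mirroring how the rumble cuts off at the door slam.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

SAMPLE_RATE = 44100
DURATION = 10.0
noise = np.random.default_rng(0).standard_normal(int(SAMPLE_RATE * DURATION))

# Keep only content below ~40 Hz -- right in subwoofer territory.
b, a = butter(4, 40, btype="lowpass", fs=SAMPLE_RATE)
rumble = lfilter(b, a, noise)

# Quick fade-out at the end so the rumble stops abruptly.
fade = np.linspace(1.0, 0.0, int(SAMPLE_RATE * 0.2))
rumble[-len(fade):] *= fade

rumble /= np.abs(rumble).max()
wavfile.write("rumble.wav", SAMPLE_RATE, (rumble * 32767).astype(np.int16))
```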





Bernard Herrmann
Bernard Herrmann was an American composer noted for his work in motion pictures.
Herrmann's involvement with electronic musical instruments dates back to 1951, when he used the theremin in The Day the Earth Stood Still. Robert B. Sexton has noted that this score involved the use of treble and bass theremin electric strings, bass, prepared piano, and guitar together with various pianos and harps, electronic organs, brass, and percussion, and that Herrmann treated the theremins as a truly orchestral section.

Herrmann was a sound consultant on The Birds, which made extensive use of an electronic instrument called the Mixtur-Trautonium, although the instrument was performed by Oskar Sala on the film’s soundtrack. Herrmann used several electronic instruments on his score for It’s Alive as well.
Ben Burtt
Burtt pioneered modern sound design, especially in the science-fiction and fantasy film genres. Before his work on the first Star Wars in 1977, science-fiction films tended to use electronic-sounding effects for futuristic devices. He sought a more natural sound, blending in "found sounds" to create the effects. For example, the lightsaber hum was derived from a film projector idling combined with feedback from a broken television set, and the blaster effect started with the sound of hitting a guy-wire on a radio tower with a wrench.

He is personally responsible for some of the sounds heard in films. In the Star Wars series, part of R2-D2's beeps and whistles are Burtt's own vocalizations combined with sounds from an ARP 2600 synthesizer, as are some of the squawks made by the tiny holographic monsters on the Millennium Falcon spacecraft. In Star Wars Episode III: Revenge of the Sith, he provides the voice for Lushros Dofine, captain of the Invisible Hand cruiser. The heavy breathing of Darth Vader was created by recording Burtt's own breathing through an old Dacor scuba regulator.




This is a clip of a lightsaber fight in Star Wars; the sounds are the ones Ben Burtt made.




This is R2-D2 and all the beeps and noises he makes.



Walter Murch
While Walter Murch was editing directly on film, he took notice of the crude splicing used for the daily rough-cuts. In response, he invented a modification which concealed the splice by using extremely narrow but strongly adhesive strips of special polyester-silicone tape. He called his invention "N-vis-o".

In 1979, he won an Oscar for the sound mix of Apocalypse Now as well as a nomination for picture editing. Murch is widely acknowledged as the person who coined the term Sound Designer, and along with colleagues developed the current standard film sound format, the 5.1 channel array, helping to elevate the art and impact of film sound to a new level. Apocalypse Now was the first multi-channel film to be mixed using a computerized mixing board.

In 1996, Murch worked on Anthony Minghella's The English Patient, which was based on Michael Ondaatje's novel of the same name. Murch won Oscars both for his sound mixing and for his editing. Murch's editing Oscar was the first to be awarded for an electronically edited film (using the Avid system), and he is the only person ever to win Oscars for both sound mixing and film editing.

In 2003, Murch edited another Anthony Minghella film, Cold Mountain, on Apple's sub-$1000 Final Cut Pro software using off-the-shelf Power Mac G4 computers. This was a leap for such a big-budget film, where expensive Avid systems were usually the standard non-linear editing system. He received an Academy Award nomination for this work; his efforts on the film were documented in Charles Koppelman's 2004 book Behind the Seen.
The Doppler Effect
The Doppler effect is the change in frequency of a wave (or other periodic event) for an observer moving relative to its source. It is commonly heard when a vehicle sounding a siren or horn approaches, passes, and recedes from an observer. The received frequency is higher (compared to the emitted frequency) during the approach, identical at the instant of passing by, and lower during the recession.
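As a quick worked example, the standard formula for sound is f_observed = f_source x (v + v_observer) / (v - v_source), where v is the speed of sound. The numbers below (a 700 Hz siren passing at 30 m/s) are just illustrative.

```python
# Worked Doppler example: a 700 Hz siren passing a stationary listener.
SPEED_OF_SOUND = 343.0      # m/s in air at roughly 20 degrees C

def doppler(f_source, source_speed, observer_speed=0.0):
    """Observed frequency; positive speeds mean moving towards each other."""
    return f_source * (SPEED_OF_SOUND + observer_speed) / (SPEED_OF_SOUND - source_speed)

siren = 700.0  # Hz
print(doppler(siren, +30.0))   # approaching: about 767 Hz, heard higher
print(doppler(siren, 0.0))     # passing:     700 Hz, unchanged
print(doppler(siren, -30.0))   # receding:    about 644 Hz, heard lower
```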


This is a short clip of Sheldon from The Big Bang Theory giving an example of what the Doppler effect is.