Sound effect

Various acoustic devices in a Greek radio studio
Deep, pulsating digital sound effect
Voice saying "Ja", followed by the same recording with a massive digital reverb
A blackbird singing, followed by the same recording with the blackbird singing with 5 voices

A sound effect (or audio effect) is an artificially created or enhanced sound, or sound process used to emphasize artistic or other content of films, television shows, live performance, animation, video games, music, or other media.

In motion picture and television production, a sound effect is a sound recorded and presented to make a specific storytelling or creative point without the use of dialogue or music. Traditionally, in the twentieth century, such sounds were created with Foley. The term often refers to a process applied to a recording, without necessarily referring to the recording itself. In professional motion picture and television production, dialogue, music, and sound effects recordings are treated as separate elements. Dialogue and music recordings are never referred to as sound effects, even though the processes applied to them, such as reverberation or flanging, are often called sound effects.

This field and sound design have gradually merged since the late twentieth century.

History

A live rooster in the Yle recording studio in 1930s Finland

The term sound effect dates back to the early days of radio. In its Year Book 1931, the BBC published a major article about "The Use of Sound Effects". It considers sound effects deeply linked with broadcasting and states: "It would be a great mistake to think of them as analogous to punctuation marks and accents in print. They should never be inserted into a program already existing. The author of a broadcast play or broadcast construction ought to have used Sound Effects as bricks with which to build, treating them as of equal value with speech and music." It lists six "totally different primary genres of Sound Effect":

  • Realistic, confirmatory effect
  • Realistic, evocative effect
  • Symbolic, evocative effect
  • Conventionalised effect
  • Impressionistic effect
  • Music as an effect

According to the author, "It is axiomatic that every Sound Effect, to whatever category it belongs, must register in the listener's mind instantaneously. If it fails to do so its presence could not be justified." [1]

Film

In the context of motion pictures and television, sound effects refers to an entire hierarchy of sound elements, whose production encompasses many different disciplines, including:

  • Hard sound effects – common sounds that appear on screen, such as door slams, weapons firing, and cars driving by
  • Background (BG) sound effects – sounds that do not explicitly synchronize with the picture but indicate setting to the audience, such as forest ambience, buzzing fluorescent lights, and car interiors
  • Foley sound effects – sounds synchronized to the picture and performed by Foley artists, such as footsteps, the movement of hand props, and the rustle of cloth
  • Design sound effects – sounds that do not normally occur in nature, or are impossible to record in nature, created by sound designers for elements such as futuristic technology or otherworldly creatures

Each of these sound effect categories is specialized, with sound editors known as specialists in a particular area of sound effects (e.g., a "car cutter" or "guns cutter").

Foley is another method of adding sound effects. Foley is more of a technique for creating sound effects than a type of sound effect, but it is often used for creating the incidental real-world sounds that are very specific to what is going on onscreen, such as footsteps. With this technique, the action onscreen is essentially recreated to try to match it as closely as possible. If done correctly it is very hard for audiences to tell what sounds were added and what sounds were originally recorded (location sound).

In the early days of film and radio, Foley artists would add sounds in real time or pre-recorded sound effects would be played back from analog discs in real time (while watching the picture). Today, with effects held in digital format, it is easy to create any required sequence to be played in any desired timeline.

In the days of silent film, sound effects were added by the operator of a theater organ or photoplayer, both of which also supplied the soundtrack of the film. Theater organ sound effects are usually electric or electro-pneumatic, and activated by a button pressed with the hand or foot.

Photoplayer operators activate sound effects either by flipping switches on the machine or by pulling cow-tail pull-strings that hang above. Sounds like bells and drums are made mechanically, sirens and horns electronically. Due to its smaller size, a photoplayer usually has fewer special effects than a theater organ, or less complex ones.

Notable early uses of sound effects in film

Video games

The principles involved with modern video game sound effects (since the introduction of sample playback) are essentially the same as those of motion pictures. Typically a game project requires two jobs to be completed: sounds must be recorded or selected from a library and a sound engine must be programmed so that those sounds can be incorporated into the game's interactive environment.

In earlier computers and video game systems, sound effects were typically produced using sound synthesis. In modern systems, increases in storage capacity and playback quality have allowed sampled sound to be used. Modern systems also frequently use positional audio, often with hardware acceleration, and real-time audio post-processing, which can also be tied to the 3D graphics development. Based on the internal state of the game, multiple different calculations can be made, allowing, for example, realistic sound dampening, echoes, and the Doppler effect.
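
As a rough illustration of the kind of per-frame calculation such a sound engine might perform, the sketch below combines inverse-distance attenuation with a Doppler playback-rate factor for a single point source. It is a minimal Python example; the function name, rolloff model, and parameter values are illustrative assumptions and are not taken from any particular engine.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second in dry air at about 20 degrees C

def positional_gain_and_pitch(src_pos, src_vel, listener_pos, listener_vel,
                              ref_distance=1.0):
    """Toy per-frame positional audio for one point source: inverse-distance
    attenuation plus a Doppler playback-rate factor (illustrative only)."""
    offset = np.asarray(src_pos, float) - np.asarray(listener_pos, float)
    distance = max(float(np.linalg.norm(offset)), 1e-6)
    direction = offset / distance   # unit vector from listener toward source

    # Inverse-distance rolloff: full volume at ref_distance, quieter beyond it.
    gain = min(1.0, ref_distance / distance)

    # Doppler: v_listener > 0 means the listener moves toward the source,
    # v_source > 0 means the source moves away from the listener.
    v_listener = float(np.dot(np.asarray(listener_vel, float), direction))
    v_source = float(np.dot(np.asarray(src_vel, float), direction))
    pitch = (SPEED_OF_SOUND + v_listener) / (SPEED_OF_SOUND + v_source)

    return gain, pitch

# A car 10 m away, driving past a stationary listener at 30 m/s.
gain, pitch = positional_gain_and_pitch((10.0, 0.0, 2.0), (-30.0, 0.0, 0.0),
                                        (0.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

In a real engine the gain would typically be split across output channels for positional panning, and the pitch factor applied by resampling the sampled effect during playback.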

Historically the simplicity of game environments reduced the required number of sounds needed, and thus only one or two people were directly responsible for the sound recording and design. As the video game business has grown and computer sound reproduction quality has increased, however, the team of sound designers dedicated to game projects has likewise grown and the demands placed on them may now approach those of mid-budget motion pictures.

Music

Some pieces of music use sound effects that are made by a musical instrument or by other means. An early example is the 18th-century Toy Symphony. In the opera Das Rheingold (1869), Richard Wagner has a chorus of anvils introduce the scene of the dwarfs who must work in the mines, much as the dwarfs are introduced in the 1937 Disney movie Snow White. Klaus Doldinger's soundtrack for the 1981 movie Das Boot includes a title score with a sonar sound to reflect the U-boat setting. John Barry integrated into the title song of Moonraker (1979) a sound representing the beep of a Sputnik-like satellite.

Sound effects in theater

Gao, Zhao, and Pan examined how sound absorption in the stage house influences the acoustics of an opera house auditorium. [4] Their research, using computer models and scale experiments, showed that stage absorption significantly affects sound clarity and reverberation time, but not loudness: more absorption produced clearer sound with a quicker decay, illustrating the close interaction between stage and auditorium acoustics.

In his book "Sound: A Reader in Theatre Practice", Brown [5] connects theory and practice in theater sound. He traces how sound design in theater has evolved, blending historical insight with current philosophical thought, and argues that the immersive nature of theater sound goes beyond traditional analysis, offering fresh perspectives on how sound interacts with societal contexts. [6]

Brown [7] offers a fresh look at Ovadija's exploration of sound in theater, questioning the traditional focus on visuals over audio. He points out the significant, yet often overlooked, role of sound in shaping theater's impact and experience. Brown pushes for a broader appreciation of sound's essence in theater, beyond just supporting visuals, to acknowledge its deep influence on storytelling and audience immersion.

Rost explores the criteria for 'good sound' in theater through handbooks and prioritization as guiding principles. These criteria not only dictate the creation and selection of sounds to complement the narrative and mood but also aim to maintain audience focus. Rost's analysis reveals underlying hierarchies in sound selection and emphasizes the need for further research into the historical and practical aspects of theater sound. [8]

Recording

A man recording the sound of a saw in the 1930s

The most realistic sound effects may originate from original sources; the closest sound to machine-gun fire could be an original recording of actual machine guns.

Despite this, real life and actual practice do not always coincide with theory. When recordings of real life do not sound realistic on playback, Foley and effects are used to create more convincing sounds. For example, the realistic sound of bacon frying can be the crumpling of cellophane, while rain may be recorded as salt falling on a piece of tin foil.

Less realistic sound effects are digitally synthesized or sampled and sequenced (the same recording played repeatedly using a sequencer). When the producer or content creator demands high-fidelity sound effects, the sound editor usually must augment his available library with new sound effects recorded in the field.

When the required sound effect is of a small subject, such as scissors cutting, cloth ripping, or footsteps, the sound effect is best recorded in a studio, under controlled conditions in a process known as Foley. Many sound effects cannot be recorded in a studio, such as explosions, gunfire, and automobile or aircraft maneuvers. These effects must be recorded by a professional audio engineer.

When such big sounds are required, the recordist will begin contacting professionals or technicians in the same way a producer may arrange a crew; if the recordist needs an explosion, they may contact a demolition company to see if any buildings are scheduled to be destroyed with explosives in the near future. If the recordist requires a volley of cannon fire, they may contact historical re-enactors or gun enthusiasts.

Depending on the effect, recordists may use several DAT, hard disk, or Nagra recorders and a large number of microphones. During a cannon- and musket-fire recording session for the 2003 film The Alamo, conducted by Jon Johnson and Charles Maynes, two to three DAT machines were used. One machine was stationed near the cannon itself, so it could record the actual firing. Another was stationed several hundred yards away, below the trajectory of the ball, to record the sound of the cannonball passing by. When the crew recorded musket fire, a set of microphones was arrayed close to the target (in this case a swine carcass) to record the musket-ball impacts.

A counter-example is the common technique for recording an automobile. For recording onboard car sounds (which include the car interiors), a three-microphone technique is common. Two microphones record the engine directly: one is taped to the underside of the hood, near the engine block. The second microphone is covered in a wind screen and tightly attached to the rear bumper, within an inch or so of the tailpipe. The third microphone, which is often a stereo microphone, is stationed inside the car to get the car interior.

Having all of these tracks at once gives a sound designer or audio engineer a great deal of control over how they want the car to sound. In order to make the car more ominous or low, they can mix in more of the tailpipe recording; if they want the car to sound like it is running full throttle, they can mix in more of the engine recording and reduce the interior perspective. In cartoons, a pencil being dragged down a washboard may be used to simulate the sound of a sputtering engine.
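
As a sketch of the blending described above, the following Python fragment mixes the three onboard microphone tracks with different gain presets. The preset names and values are purely illustrative assumptions, not production settings.

```python
import numpy as np

def mix_car_tracks(engine, exhaust, interior, mood="neutral"):
    """Blend the three onboard recordings into one car effect.
    Gain presets are purely illustrative, not production settings."""
    presets = {
        "neutral":       (0.5, 0.3, 0.4),
        "ominous_low":   (0.3, 0.8, 0.2),  # favour the tailpipe microphone
        "full_throttle": (0.9, 0.5, 0.1),  # favour the engine, pull back the interior
    }
    g_engine, g_exhaust, g_interior = presets[mood]
    n = min(len(engine), len(exhaust), len(interior))
    mix = (g_engine * np.asarray(engine[:n], float)
           + g_exhaust * np.asarray(exhaust[:n], float)
           + g_interior * np.asarray(interior[:n], float))
    return mix / max(1.0, float(np.max(np.abs(mix))))  # normalise to avoid clipping
```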

What is considered today to be the first recorded sound effect is a recording of Big Ben striking 10:30, 10:45, and 11:00. It was recorded on a brown wax cylinder by technicians at Edison House in London on July 16, 1890. This recording is currently in the public domain.

Processing effects

As the car example demonstrates, the ability to make multiple simultaneous recordings of the same subject, through the use of several DAT or multitrack recorders, has made sound recording into a sophisticated craft. The sound effect can be shaped by the sound editor or sound designer, not just for realism, but for emotional effect.

Once the sound effects are recorded or captured, they are usually loaded into a computer integrated with an audio non-linear editing system. This allows a sound editor or sound designer to heavily manipulate a sound to meet his or her needs.

The most common sound design tool is the use of layering to create a new, interesting sound out of two or three old, average sounds. For example, the sound of a bullet impact into a pig carcass may be mixed with the sound of a melon being gouged to add to the stickiness or gore of the effect. If the effect is featured in a close-up, the designer may also add an impact sweetener from his or her library. The sweetener may simply be the sound of a hammer pounding hardwood, equalized so that only the low-end can be heard. The low end gives the three sounds together added weight, so that the audience actually feels the weight of the bullet hit the victim.
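
A minimal Python sketch of this layering idea follows, assuming the three recordings are available as one-dimensional sample arrays at the same sample rate. The crude one-pole filter stands in for the equalization that keeps only the sweetener's low end, and the mix levels are arbitrary.

```python
import numpy as np

def lowpass(x, sr, cutoff_hz):
    """Crude one-pole low-pass filter: keeps only the low end of a signal,
    standing in for the equalised 'sweetener' described above."""
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sr)
    y = np.zeros(len(x))
    acc = 0.0
    for i, sample in enumerate(np.asarray(x, dtype=float)):
        acc += alpha * (sample - acc)
        y[i] = acc
    return y

def layered_impact(bullet_hit, melon_gouge, hammer_sweetener, sr=48000):
    """Layer three recordings into one impact effect (levels are arbitrary)."""
    n = min(len(bullet_hit), len(melon_gouge), len(hammer_sweetener))
    sweetener_lows = lowpass(hammer_sweetener[:n], sr, cutoff_hz=150.0)
    mix = (0.8 * np.asarray(bullet_hit[:n], float)
           + 0.5 * np.asarray(melon_gouge[:n], float)
           + 0.7 * sweetener_lows)
    return mix / max(1.0, float(np.max(np.abs(mix))))
```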

If the victim is the villain, and his death is climactic, the sound designer may add reverb to the impact in order to enhance the dramatic beat. Then, as the victim falls over in slow motion, the sound editor may add the sound of a broom whooshing by a microphone, pitch-shifted down and time-expanded to further emphasize the death. If the film is science fiction, the designer may run the whoosh through a phaser to give it a more sci-fi feel. (For a list of many sound effects processes available to a sound designer, see the bottom of this article.)
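
One simple way to approximate the "pitch-shifted down and time-expanded" treatment is tape-style varispeed, where slowing playback lowers the pitch and stretches the duration at the same time. The Python sketch below uses linear interpolation for this; independent control of pitch and time, as a modern workstation offers, would need a more elaborate method such as a phase vocoder, which is not shown.

```python
import numpy as np

def varispeed(x, rate):
    """Tape-style varispeed: playing a recording back slower (rate < 1.0)
    lowers its pitch and stretches it in time at once."""
    x = np.asarray(x, dtype=float)
    n_out = int(len(x) / rate)
    read_pos = np.arange(n_out) * rate               # fractional read positions
    return np.interp(read_pos, np.arange(len(x)), x)

# Example: a synthetic noise 'whoosh' slowed to half speed
# (roughly an octave lower and twice as long).
sr = 48000
envelope = np.hanning(sr // 2)
whoosh = np.random.randn(envelope.size) * envelope
slow_whoosh = varispeed(whoosh, rate=0.5)
```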

Aesthetics

When creating sound effects for films, sound recordists and editors do not generally concern themselves with the verisimilitude or accuracy of the sounds they present. The sound of a bullet entering a person from a close distance may sound nothing like the sound designed in the above example, but since very few people are aware of how such a thing actually sounds, the job of designing the effect is mainly an issue of creating a conjectural sound which feeds the audience's expectations while still suspending disbelief. Sci-fi and fantasy genres can be more forgiving in terms of audience expectations; the listener will not be caught off guard as much by unusual sound effects. In contrast, when creating sound effects for historical accuracy and realism, the listener has likely had a lifetime of exposure to some of these sounds, and so there are expectations of what they should sound like. [9]

In the previous example, the phased 'whoosh' of the victim's fall has no analog in real-life experience, but it is emotionally immediate. If a sound editor uses such sounds in the context of emotional climax or a character's subjective experience, they can add to the drama of a situation in a way visuals simply cannot. If a visual effects artist were to do something similar to the 'whooshing fall' example, it would probably look ridiculous or at least excessively melodramatic.

The conjectural sound principle applies even to happenstance sounds, such as tires squealing, doorknobs turning or people walking. If the sound editor wants to communicate that a driver is in a hurry to leave, they will cut the sound of tires squealing when the car accelerates from a stop; even if the car is on a dirt road, the effect will work if the audience is dramatically engaged. If a character is afraid of someone on the other side of a door, the turning of the doorknob can take a second or more, and the mechanism of the knob can possess dozens of clicking parts. A skillful Foley artist can make someone walking calmly across the screen seem terrified simply by giving the actor a different gait.

Techniques

Original sound sample for comparison

In music and film/television production, some typical effects used in recording and amplified performances are:

  • 75 millisecond echo
  • Flanger
  • Phaser
  • Chorus
  • Equalizer

See also

Related Research Articles

<span class="mw-page-title-main">Effects unit</span> Electronic device that alters audio

An effects unit, effects processor, or effects pedal is an electronic device that alters the sound of a musical instrument or other audio source through audio signal processing.

<span class="mw-page-title-main">Production sound mixer</span> Member of a film crew or television crew

A production sound mixer, location sound recordist, location sound engineer, or simply sound mixer is the member of a film or television crew responsible for recording all sound on set during filmmaking or television production using professional audio equipment, for later inclusion in the finished product, or for reference to be used by the sound designer, sound effects editors, or Foley artists. This requires the choice and deployment of microphones, the choice of recording media, and the mixing of audio signals in real time.

<span class="mw-page-title-main">Surround sound</span> System with loudspeakers that surround the listener

Surround sound is a technique for enriching the fidelity and depth of sound reproduction by using multiple audio channels from speakers that surround the listener. Its first application was in movie theaters. Prior to surround sound, theater sound systems commonly had three screen channels of sound that played from three loudspeakers located in front of the audience. Surround sound adds one or more channels from loudspeakers to the side or behind the listener that are able to create the sensation of sound coming from any horizontal direction around the listener.

Flanging is an audio effect produced by mixing two identical signals together, one signal delayed by a small and (usually) gradually changing period, usually smaller than 20 milliseconds. This produces a swept comb filter effect: peaks and notches are produced in the resulting frequency spectrum, related to each other in a linear harmonic series. Varying the time delay causes these to sweep up and down the frequency spectrum. A flanger is an effects unit that creates this effect.
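
A minimal Python sketch of this process: the input is mixed with a copy of itself whose delay is swept by a low-frequency oscillator, staying well under 20 milliseconds. The parameter values here are illustrative assumptions.

```python
import numpy as np

def flanger(x, sr, max_delay_ms=5.0, lfo_hz=0.25, depth=1.0):
    """Mix the signal with a copy of itself whose delay is swept by a slow
    LFO (well under 20 ms), giving the swept comb-filter effect."""
    x = np.asarray(x, dtype=float)
    n = np.arange(len(x))
    max_delay = max_delay_ms * sr / 1000.0
    # The LFO sweeps the delay between 0 and max_delay samples.
    delay = 0.5 * max_delay * (1.0 + np.sin(2.0 * np.pi * lfo_hz * n / sr))
    read_pos = np.clip(n - delay, 0.0, len(x) - 1.0)
    delayed = np.interp(read_pos, n, x)   # fractional-delay read by linear interpolation
    return 0.5 * (x + depth * delayed)
```

Feeding some of the delayed signal back into the delay line deepens the notches; many hardware flangers expose this as a "regeneration" control, which the sketch above omits.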

<span class="mw-page-title-main">Echo chamber</span> Hollow enclosure used to produce reverberated sounds

An echo chamber is a hollow enclosure used to produce reverberation, usually for recording purposes. A traditional echo chamber is covered in highly acoustically reflective surfaces. By using directional microphones pointed away from the speakers, echo capture is maximized. Some portions of the room can be moved to vary the room's decay time. Effects units are now more widely used to create such effects, but echo chambers are still in use, such as the famous chambers at Capitol Studios.

<span class="mw-page-title-main">Sound reinforcement system</span> Amplified sound system for public events

A sound reinforcement system is the combination of microphones, signal processors, amplifiers, and loudspeakers in enclosures all controlled by a mixing console that makes live or pre-recorded sounds louder and may also distribute those sounds to a larger or more distant audience. In many situations, a sound reinforcement system is also used to enhance or alter the sound of the sources on the stage, typically by using electronic effects, such as reverb, as opposed to simply amplifying the sources unaltered.

Automatic double-tracking or artificial double-tracking (ADT) is an analogue recording technique designed to enhance the sound of voices or instruments during the mixing process. It uses tape delay to create a delayed copy of an audio signal which is then played back at slightly varying speed controlled by an oscillator and combined with the original. The effect is intended to simulate the sound of the natural doubling of voices or instruments achieved by double tracking. The technique was developed in 1966 by engineers at Abbey Road Studios in London at the request of the Beatles.

A phaser is an electronic sound processor used to filter a signal by creating a series of peaks and troughs in the frequency spectrum. The position of the peaks and troughs of the waveform being affected is typically modulated by an internal low-frequency oscillator so that they vary over time, creating a sweeping effect.
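
One common way to build such a processor, not specified in the description above, is a chain of first-order all-pass filters whose corner frequency is swept by a low-frequency oscillator; mixing the filtered signal with the dry input produces the moving notches. The rough Python sketch below uses arbitrary stage counts and sweep ranges.

```python
import numpy as np

def phaser(x, sr, stages=4, lfo_hz=0.5, min_f=200.0, max_f=2000.0, mix=0.5):
    """Moving notches from a chain of first-order all-pass filters whose
    corner frequency is swept by an LFO, mixed with the dry signal."""
    x = np.asarray(x, dtype=float)
    n = np.arange(len(x))
    # LFO sweeps the all-pass corner frequency between min_f and max_f.
    f = min_f + 0.5 * (max_f - min_f) * (1.0 + np.sin(2.0 * np.pi * lfo_hz * n / sr))
    a = (np.tan(np.pi * f / sr) - 1.0) / (np.tan(np.pi * f / sr) + 1.0)

    wet = x.copy()
    for _ in range(stages):
        out = np.zeros_like(wet)
        x_prev = y_prev = 0.0
        for i in range(len(wet)):
            # First-order all-pass: y[n] = a*x[n] + x[n-1] - a*y[n-1]
            out[i] = a[i] * wet[i] + x_prev - a[i] * y_prev
            x_prev, y_prev = wet[i], out[i]
        wet = out
    return (1.0 - mix) * x + mix * wet
```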

<span class="mw-page-title-main">Moogerfooger</span> Analogue effects pedals

Moogerfooger is the trademark for a series of analog effects pedals manufactured by Moog Music. There are currently eight different pedals produced; however, one of these models is designed for processing control voltages rather than audio signal. A sixth model, the Analog Delay, was released in a limited edition of 1000 units and has become a collector's item. Moog Music announced on August 28, 2018, that the Moogerfooger, CP-251, Minifooger, Voyager synthesizers, and some other product lines were being built using the remaining parts on hand and discontinued thereafter.

<span class="mw-page-title-main">Field recording</span> Audio recording produced outside a recording studio

Field recording is the production of audio recordings outside recording studios, and the term applies to recordings of both natural and human-produced sounds. It can also include the recording of electromagnetic fields or vibrations, using devices such as a passive magnetic antenna for electromagnetic recordings or contact microphones, or underwater recordings made with hydrophones to capture the sounds and movements of whales or other sea life. These recordings are often regarded as being very useful for sound designers and Foley artists.

Chorus is an audio effect that occurs when individual sounds with approximately the same timing and very similar pitches converge. While similar sounds coming from multiple sources can occur naturally, as in the case of a choir or string orchestra, it can also be simulated using an electronic effects unit or signal processing device.

<span class="mw-page-title-main">Boundary microphone</span> Microphone for use on or near a surface

A boundary microphone is one or more small omnidirectional or cardioid condenser mic capsule(s) positioned near or flush with a boundary (surface) such as a floor, table, or wall. The capsule(s) is/are typically mounted in a flat plate or housing. The arrangement provides a directional half-space pickup pattern while delivering a relatively phase-coherent output signal.

<span class="mw-page-title-main">Delay (audio effect)</span> Echo-like effect

Delay is an audio signal processing technique that records an input signal to a storage medium and then plays it back after a period of time. When the delayed playback is mixed with the live audio, it creates an echo-like effect, whereby the original audio is heard followed by the delayed audio. The delayed signal may be played back multiple times, or fed back into the recording, to create the sound of a repeating, decaying echo.
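
A bare-bones Python version of such a feedback delay is sketched below: the input is written into a delay line, the delayed signal is mixed with the dry sound, and a feedback term re-injects each echo so that it repeats and decays. The delay time and feedback amount are illustrative.

```python
import numpy as np

def delay_effect(x, sr, delay_ms=300.0, feedback=0.4, mix=0.5):
    """Repeating, decaying echoes: the delayed signal is mixed with the dry
    sound, and each echo is fed back into the delay line."""
    x = np.asarray(x, dtype=float)
    d = int(sr * delay_ms / 1000.0)
    tail = 8 * d                       # room for the echoes to die away
    buf = np.zeros(len(x) + tail)
    buf[:len(x)] = x
    for i in range(d, len(buf)):       # feedback comb: w[n] = x[n] + fb * w[n - d]
        buf[i] += feedback * buf[i - d]
    wet = np.zeros_like(buf)
    wet[d:] = buf[:-d]                 # the echoes start one delay time later
    dry = np.zeros_like(buf)
    dry[:len(x)] = x
    return (1.0 - mix) * dry + mix * wet
```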

Re-amping is a process often used in multitrack recording in which a recorded signal is routed back out of the editing environment and run through external processing using effects units and then into a guitar amplifier and a guitar speaker cabinet or a reverb chamber. Originally, the technique was used mostly for electric guitars: it facilitates a separation of guitar playing from guitar amplifier processing, in which a previously recorded audio program is played back and re-recorded at a later time for the purpose of adding effects, ambiance such as reverb or echo, and the tone shaping imbued by certain amps and cabinets. The technique has since evolved over the 2000s to include many other applications. Re-amping can also be applied to other instruments and program material, such as recorded drums, synthesizers, and virtual instruments.

<span class="mw-page-title-main">Microphone practice</span> Microphone techniques used for recording audio

There are a number of well-developed microphone techniques used for recording musical, film, or voice sources or picking up sounds as part of sound reinforcement systems. The choice of technique depends on a number of factors.

<span class="mw-page-title-main">Foley (filmmaking)</span> Addition of sound effects to visual media

In filmmaking, Foley is the reproduction of everyday sound effects that are added to films, videos, and other media in post-production to enhance audio quality. Foley is named after sound-effects artist Jack Foley. Foley sounds are used to enhance the auditory experience of a movie. They can be anything from the swishing of clothing and footsteps to squeaky doors and breaking glass. Foley can also be used to cover up unwanted sounds captured on the set of a movie during filming, such as overflying airplanes or passing traffic.

<span class="mw-page-title-main">Audio mixing (recorded music)</span> Audio mixing to yield recorded sound

In sound recording and reproduction, audio mixing is the process of optimizing and combining multitrack recordings into a final mono, stereo or surround sound product. In the process of combining the separate tracks, their relative levels are adjusted and balanced and various processes such as equalization and compression are commonly applied to individual tracks, groups of tracks, and the overall mix. In stereo and surround sound mixing, the placement of the tracks within the stereo field are adjusted and balanced. Audio mixing techniques and approaches vary widely and have a significant influence on the final product.

Musical outboard equipment or outboard gear is used to process or alter a sound signal separately from functionality provided within a mixing console or a digital audio workstation. Outboard effects units can be used either during a live performance or in the recording studio.

A mixing engineer is responsible for combining ("mixing") different sonic elements of an auditory piece into a complete rendition, whether in music, film, or any other content of auditory nature. The finished piece, recorded or live, must achieve a good balance of properties, such as volume, pan positioning, and other effects, while resolving any arising frequency conflicts from various sound sources. These sound sources can comprise the different musical instruments or vocals in a band or orchestra, dialogue or Foley in a film, and more.

A reverb effect, or reverb, is an audio effect applied to a sound signal to simulate reverberation. It may be created through physical means, such as echo chambers, or electronically through audio signal processing. The American producer Bill Putnam is credited for the first artistic use of artificial reverb in music, on the 1947 song "Peg o' My Heart" by the Harmonicats.
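
As an illustration of the signal-processing route, the Python sketch below approximates reverberation by convolving a signal with a synthetic impulse response, a burst of noise that decays exponentially over the chosen reverb time. Real reverb units and plugins use far more refined impulse responses or algorithmic networks; the parameters here are arbitrary.

```python
import numpy as np

def simple_reverb(x, sr, decay_s=1.5, mix=0.3, seed=0):
    """Approximate reverberation by convolving the signal with a synthetic
    impulse response: exponentially decaying noise standing in for the
    dense reflections of a room."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    t = np.arange(int(decay_s * sr)) / sr
    ir = rng.standard_normal(t.size) * np.exp(-6.9 * t / decay_s)  # about -60 dB after decay_s
    ir /= np.sqrt(np.sum(ir ** 2))                                 # rough level match
    wet = np.convolve(x, ir)[:len(x)]                              # tail truncated for simplicity
    return (1.0 - mix) * x + mix * wet
```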

References

  1. The BBC Year Book 1931 (PDF). p. 194.
  2. Schrader, Paul (1995). "Paul Schrader". Projections 4½. doi:10.5040/9780571344406.0063. ISBN 978-0-571-34440-6.
  3. Bottomore, Stephen (2006). "Cinema museums - a worldwide list". Film History: An International Journal. 18 (3): 261–273. doi:10.1353/fih.2006.0020. ISSN 1553-3905.
  4. Gao, Jianliang; Zhao, Yuezhe; Pan, Lili (March 2024). "The Effects of Sound Absorption of Stage House on the Acoustics of Auditorium in an Opera House". Buildings. 14 (3): 718. doi:10.3390/buildings14030718. ISSN 2075-5309.
  5. Curtin, Adrian (2010). "Sound: A Reader in Theatre Practice (review)". Theatre Journal. 62 (4): 706–707. doi:10.1353/tj.2010.a413949. ISSN 1086-332X.
  6. Curtin, Adrian (2010). "Sound: A Reader in Theatre Practice (review)". Theatre Journal. 62 (4): 706–707. doi:10.1353/tj.2010.a413949. ISSN 1086-332X.
  7. Brown, Ross (2014). "Dramaturgy of Sound in the Avant-Garde and Postdramatic Theatre by Mladen Ovadija (review)". Comparative Drama. 48 (4): 441–443. doi:10.1353/cdr.2014.0026. ISSN 1936-1637.
  8. Rost, Katharina (October 2016). "The shaping of 'good sound' in handbooks for theatre sound creation". Theatre and Performance Design. 2 (3–4): 188–201. doi:10.1080/23322551.2016.1226073. ISSN 2332-2551.
  9. "How Darren Blondin takes sound to the next level – with sonic contraptions, experiments and custom tools". 20 June 2018. Archived from the original on January 6, 2022. Retrieved January 6, 2022.
  10. Jimmy Page of Led Zeppelin used this effect in the bridge of "Whole Lotta Love".