A talkbox creates a vocal effect which is often applied to a guitar. The instrument's sound is fed through a tube into the musician's mouth; as they change the shape of their mouth, the frequency content of the sound is shaped, and the result is picked up by a microphone.
A vocoder analyses the frequency content of a vocal signal and applies it to a synth timbre. The result sounds ‘robotic’, and the voice follows the pitch of the synth.
Other Creative FX
A ring modulator multiplies two signals together, producing new frequencies equal to the sums and differences of the input frequencies. The result is a dissonant effect with metallic, ringing sounds.
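As a rough illustration, ring modulation is simply multiplying two signals. A short numpy sketch (frequencies chosen arbitrarily for easy numbers) shows the sum and difference products appearing in the spectrum:

```python
import numpy as np

sr = 44100
t = np.arange(sr) / sr                   # one second of samples
carrier = np.sin(2 * np.pi * 440 * t)    # input signal (A4)
mod = np.sin(2 * np.pi * 300 * t)        # modulator oscillator

ring = carrier * mod                     # ring modulation = multiplication

# sin(a)sin(b) = 0.5*cos(a-b) - 0.5*cos(a+b), so the output contains only
# the difference (440 - 300 = 140 Hz) and sum (440 + 300 = 740 Hz)
spectrum = np.abs(np.fft.rfft(ring))     # 1 Hz per bin over one second
peaks = np.argsort(spectrum)[-2:]        # the two strongest bins
print(sorted(int(p) for p in peaks))     # [140, 740]
```

Note that neither input frequency (300 Hz or 440 Hz) survives in the output, which is why the effect sounds so inharmonic.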
Pitch correction (autotune) forces each note to the nearest pitch in a scale set by the user. With a quick response time it is often over-used in R&B music to create a slightly robotic effect; applied subtly, it simply keeps vocalists in tune.
The term pan comes from ‘panorama’ in cinema. Auto-pan moves a sound's position in the stereo field automatically, introducing creative panning throughout a track. In recordings from the 60s and 70s, parts were often panned to extremes due to the limitations of the technology.
A pitch shift alters frequencies and changes the musical note that is played. Harmonisers are intelligent pitch shifters that can add a musical interval to a part (e.g. a 3rd above). These effects would have been created historically by slowing down or speeding up a tape.
- Distortion is created by increasing the level of a sound beyond what the equipment can handle.
- It is primarily a guitar effect, created using a pedal (stompbox) or a DAW plugin.
- We can control three parameters of a distortion effect:
  - Drive / Gain – the volume of the input signal (before distortion)
  - Level / Output – the volume of the output signal (after distortion)
  - Tone – changes the frequency content of the output
- Historically, artists created distortion by damaging their amplifiers
- When the gain is increased, the wave peaks and troughs are clipped
- Soft clipping rounds the tops of the waves and sounds warmer. It is often seen in valve amps. Hard clipping creates a wave with square peaks & troughs and sounds harsher; it is commonly seen in transistor amps.
- The distorted signal has more harmonics, and filtering / EQ may be required to fit it well into the mix.
- Fuzz is a dense distortion used by guitarists such as Jimi Hendrix
Jimi Hendrix – Purple Haze
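The soft and hard clipping described above can be sketched in a few lines of numpy; the curves here are generic stand-ins (`tanh` for a valve-style soft clip, a simple limit for a transistor-style hard clip), not a model of any particular amp:

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 9)    # input samples, some beyond the clip level

hard = np.clip(x, -1.0, 1.0)     # hard clipping: flat, square-edged tops
soft = np.tanh(x)                # soft clipping: tops are rounded gradually

# Hard clipping is flat above the threshold; soft clipping keeps rising slowly
print(hard[-1])                  # 1.0
print(round(soft[-1], 3))        # 0.964 (tanh(2))
```

The rounded `tanh` transfer curve produces mostly lower-order harmonics (the warmer valve sound); the abrupt `clip` corners produce stronger high harmonics (the harsher transistor sound).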
- Overdrive gives the sound more crunch.
- It was originally created by pushing amps to play louder than they were able to
Chuck Berry – Johnny B. Goode
- Lowering the bit depth leads to distortion of audio
- The track can sound ‘electronic’ or ‘lo-fi’
La Roux – Colourless Colour
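Lowering the bit depth can be approximated by quantising each sample to a small number of levels. This is a minimal sketch of the idea, not any particular plugin's algorithm:

```python
import numpy as np

def bitcrush(signal, bits):
    """Quantise samples in [-1, 1] to the given bit depth."""
    half_levels = 2 ** bits / 2            # steps either side of zero
    return np.round(signal * half_levels) / half_levels

t = np.arange(100) / 100
sine = np.sin(2 * np.pi * t)
crushed = bitcrush(sine, 3)                # only 8 levels: audible 'steps'

print(len(np.unique(np.round(sine, 6))))   # many distinct sample values
print(len(np.unique(crushed)))             # very few distinct values remain
```

The staircase-shaped output adds the harmonically rich quantisation distortion that gives the ‘lo-fi’ character.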
- We can use plugins to model the sound of distortion
- Software plugins include Logic’s ‘Overdrive’ and ‘Bitcrusher’
- We can also use hardware to emulate the sound of different distortions and amplifiers
- Line6’s Pod emulates the sound of many different guitar rigs
- This is beneficial because it gives you the option of lots of different amplifier sounds without needing to own or transport the physical amplifiers
- An LFO is a low frequency oscillator
- The frequencies used are below 20Hz, so they are inaudible
- An LFO changes a parameter of an effect or processor according to another wave
- An example might be altering the delay time according to a sine wave. Other waveshapes (e.g. square, triangle or pulse) can be used to change the character of the modulation.
Controls on Modulation Effects
- Modulation effects have controls related to the rate and the depth of the modulation
- Rate is how fast the periodic change of the parameter takes place (wave frequency)
- Depth is how much the parameter is varied (wave amplitude)
- The combination of both of these parameters can change whether the effect sounds subtle or very obvious
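An LFO with rate and depth controls is just a slow wave scaled and offset. A minimal numpy sketch (all values chosen arbitrarily; here it modulates a delay-time parameter):

```python
import numpy as np

sr = 1000                  # control rate in Hz (assumed for illustration)
t = np.arange(2 * sr) / sr # two seconds of control values

rate = 2.0                 # LFO frequency in Hz: how fast the sweep repeats
depth = 3.0                # modulation amount in ms: how far the value swings
base_delay_ms = 5.0        # centre value of the parameter being modulated

# The LFO is a slow sine wave, scaled by depth and offset by the base value
lfo = np.sin(2 * np.pi * rate * t)
delay_ms = base_delay_ms + depth * lfo

print(round(delay_ms.min(), 3), round(delay_ms.max(), 3))  # 2.0 8.0
```

Raising `rate` makes the sweep faster; raising `depth` widens the 2–8 ms swing, which is how the same two controls make the effect subtle or obvious.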
- A comb filter adds a delayed version of a signal to itself, causing constructive and destructive interference.
- The frequency response of a comb filter consists of a series of regularly spaced notches, giving the appearance of a comb.
Historical Production of Modulation Effects
- The Leslie speaker is a system of rotating speakers, used primarily with the Hammond organ in the 1960s; it is sometimes called ‘Rotary’ on effect plugins.
- The rotation creates modulation via the Doppler Effect: the pitch a listener hears changes because the source of the sound changes position relative to them.
- This is similar to the changing pitch heard when an ambulance passes you.
The Eagles – Hotel California
The Beatles – Let It Be
Tape-Based Origins of Modulation Effects
- Modulation effects were originally created using two tape recorders playing the same recording slightly out of time with each other
Reverb is short for reverberation. This is a natural acoustic phenomenon where sound bounces off surfaces in a space and tells our brain lots of details about where we are. Throughout the history of music recording and production, it has been crucial for us to gain control over this and ensure that we can add or remove ambience to recordings where the space doesn’t necessarily sound how we want it to.
- Originally, reverb was created through a variety of ‘live rooms’ that had very different sounding reverb characteristics.
- This was expensive, and engineers soon found ways of recreating these reverbs in other forms in smaller rooms.
- One way was to find a very reverberant room and replay your mix on speakers in that room, using a pair of omnidirectional microphones to record the resulting echoes.
- Some studios used reverberant spaces they already had – for example the office toilet!
The Crystals – Da Doo Ron Ron
Tina Turner – River Deep Mountain High
Plate Reverb and Spring Reverb
- Plate and spring reverb are both methods of artificial reverb production devised in the 1950s.
- Both used the vibration of metal objects to create the sound of reverb
- Plate reverb was often used in studios and avoided the need for an echo chamber.
- Spring reverb was often used in amplifiers; Fender was one of the first manufacturers to build spring reverb into their amps.
Beach Boys – God Only Knows
A Simple Plate Reverb
A Simple Spring Reverb
- Digital reverb was first seen in effect units made by Yamaha and Lexicon in the 1980s
- It models a reverb by using lots of delays, which are mathematically calculated
- Digital reverbs enabled users and manufacturers to store lots of presets
- As computer processors got faster in the 1990s, digital reverb was incorporated as a software plugin
En Vogue – Hold On
- Convolution reverb reproduces a real reverb from a real space, and was pioneered by Sony in 1996
- An impulse (such as a starter-pistol shot or a sine sweep) is generated in a space, and the room's response – the impulse response – is recorded
- Mathematical algorithms extract the impulse response from the recording; it can then be combined (convolved) with other sounds to apply the room's reverb to them
- Convolution reverb is very heavy on processor usage
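The convolution itself can be sketched with numpy. Here a decaying burst of noise stands in for a real recorded impulse response (an assumption purely for illustration):

```python
import numpy as np

# Hypothetical impulse response: decaying noise standing in for a real room
rng = np.random.default_rng(0)
sr = 8000
ir = rng.standard_normal(sr) * np.exp(-np.arange(sr) / (0.3 * sr))

# A short dry sound: a single click followed by silence
dry = np.zeros(sr)
dry[0] = 1.0

# Convolution applies the room's response to every sample of the dry signal
wet = np.convolve(dry, ir)

print(len(wet))      # len(dry) + len(ir) - 1: the tail extends the sound
```

Every input sample triggers its own scaled copy of the whole impulse response, which is why convolution reverb is so heavy on the processor compared with a handful of delays.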
Other Types of Reverb
- Reverse reverb is created by reversing a recording, adding reverb, and then reversing the wet reverb signal again. The original recording is played forwards with the reversed reverb placed in front of it, so the reverb swells up before the sound.
Beach Boys – Feel Flows
The Only Ones – Miles From Nowhere
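The reverse-reverb trick can be sketched in numpy; the `simple_reverb` function here is a crude stand-in (a decaying noise tail) for a real reverb:

```python
import numpy as np

def simple_reverb(x, sr):
    """Stand-in reverb (an assumption): convolve with a decaying noise tail."""
    rng = np.random.default_rng(1)
    n = sr // 2
    tail = rng.standard_normal(n) * np.exp(-np.arange(n) / (sr * 0.1))
    return np.convolve(x, tail)

sr = 4000
dry = np.zeros(sr)
dry[-1] = 1.0                       # a single click at the end of the clip

# Reverse the audio, add reverb, then reverse the result back again:
# the reverb tail now swells up *before* the original sound.
reversed_wet = simple_reverb(dry[::-1], sr)[::-1]

# All the energy now sits late in the clip, building towards the click
half = len(reversed_wet) // 2
print(np.abs(reversed_wet[:half]).sum() < np.abs(reversed_wet[half:]).sum())  # True
```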
- In many styles of music, particularly in the 1980s, the reverb decay was gated, meaning that a very dense reverb could be used without causing ‘muddiness’ in the mix. Gated reverb is often used on snare drums and sometimes vocals.
Phil Collins – In The Air Tonight
There are a number of ways of communicating between different musical instruments. Without a doubt, the most common is MIDI, but since this standard was first created in 1983, there have been a number of revisions, and indeed new developments since then. This page on StudyMusicTech.com outlines some of the key information about this for your further study and research.
- MIDI stands for Musical Instrument Digital Interface.
- It allows different electronic musical instruments and equipment to be connected together
- MIDI was introduced in August 1983
- It was a joint standard across different manufacturers
- It is a language; it doesn’t store audio data. Rather it is a set of commands sent to connected instruments.
- Each instrument has its own MIDI channel. This means that it only responds to messages meant for it.
- General MIDI was a further set of requirements for MIDI devices that was intended to ensure consistent playback between devices (particularly with regard to instrument patches)
- GM requires 24-voice polyphony, standardised controller numbers, and standardised patch numbers so that instruments are found in the same locations on every device
- Whilst MIDI files are very small, they are often criticised for sounding unrealistic; this is really a criticism of the equipment playing them back rather than the protocol itself.
Commands and Controllers
- MIDI controllers change a characteristic of the sound or track
- Controllers are either switched or continuous; switched controllers either have a value of 0 or 127, so are on or off. Continuous controller values range between 0 and 127.
- You can view controller data in the MIDI event list in your DAW, or sometimes in the piano roll editor.
- Modulation (CC 1) – adds vibrato (modulates pitch)
- Volume (CC 7) – changes overall track volume
- Pan (CC 10) – moves between L, C and R
- Sustain (CC 64) – holds on notes
- Portamento (CC 65) – bends between notes (glide)
- Reverb (CC 91) – adds reverb (amount)
- Chorus (CC 93) – adds chorus (amount)
- Reset All Controllers (CC 121) – stops all controllers
MIDI Controllers and Numbers
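Under the hood, a controller message is just three bytes. A small sketch of generic MIDI (not tied to any DAW or library):

```python
# A MIDI Control Change message is three bytes: a status byte (0xB0 plus the
# channel), the controller number, and the value. Channels are 0-15 in the
# data but shown as 1-16 to the user; controller numbers and values are 0-127.

def control_change(channel, controller, value):
    """Build the raw bytes of a MIDI Control Change message."""
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, controller, value])

# Set channel 1 (internally 0) to full volume: controller 7, value 127
msg = control_change(0, 7, 127)
print(msg.hex())   # b0077f
```

This is why switched controllers can only be 0 or 127 and continuous ones range 0–127: every data byte has 7 usable bits.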
- Quantise is used to move notes that are poorly timed back onto the beat where they are meant to be. It achieves this by ‘snapping’ each note to a grid, the size of which is determined by the note resolution you choose.
- Resolutions are defined by the number of notes that fit in a bar of 4/4 (e.g. 1/1 – one semibreve; 1/2 – two minims, etc.)
- It has been criticised for removing the musical expression from the music. Thus, some more advanced quantise functions were created:
- Swung quantise
- Groove templates
- Iterative quantise
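Basic quantising is simple arithmetic: divide each note's start time by the grid size, round to the nearest whole grid line, and multiply back. A minimal sketch:

```python
def quantise(times_in_beats, grid):
    """Snap note start times to the nearest grid line (grid size in beats)."""
    return [round(t / grid) * grid for t in times_in_beats]

# Sloppy note starts, measured in beats; a 1/16 resolution in 4/4 means a
# grid of 0.25 beats (sixteen grid lines per bar)
played = [0.02, 0.98, 1.51, 2.26, 3.73]
print(quantise(played, 0.25))   # [0.0, 1.0, 1.5, 2.25, 3.75]
```

The advanced variants listed above soften this: swung quantise offsets alternate grid lines, iterative quantise moves notes only part of the way towards the grid, and groove templates replace the regular grid with one captured from a real performance.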
Articulation and Velocity
- Articulation is how the notes are played; short or long, slurred together or detached.
- This is recreated by altering parameters of the MIDI note, in particular the length and velocity
- You could change the synth’s envelope to mirror the specific articulation of an instrument
- In Logic, ‘Gate Time’ can be used to change the feel of the MIDI data
- When a note on command is sent to a MIDI device, it is sent with a specific velocity value (from 0 to 127). This relates to how hard the note is struck
- Velocity is a useful way of recreating the sound of real instruments and avoiding MIDI sequences sounding mechanical.
- Synthesisers commonly assign filter envelopes that are dependent on velocity
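One common way a synth might map velocity to loudness can be sketched like this; the squared curve is an assumption for illustration (real instruments use a variety of curves, and the same idea applies to velocity-dependent filter envelopes):

```python
def velocity_to_gain(velocity):
    """Map MIDI velocity (0-127) to an amplitude gain (0.0-1.0).

    The squared curve is an assumption: it makes quiet notes noticeably
    quieter, which tends to feel more natural than a linear mapping."""
    return (velocity / 127) ** 2

for v in (32, 64, 127):
    print(v, round(velocity_to_gain(v), 3))   # 32 -> 0.063, 64 -> 0.254, 127 -> 1.0
```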
Other Programming Languages / Control Environments
- DMX stands for Digital Multiplex
- It is faster than MIDI and provides a higher resolution
- It is mostly used to control lighting fixtures and other parts of a show.
- It was originally intended as a protocol for controlling light dimmers, which were not compatible with each other before the introduction of the standard (known in full as DMX512)
- It soon became the preferred method for linking a lighting console to dimmers and other stage effects such as smoke machines and for the control of moving head lights.
- It uses a 5 pin XLR cable to connect controllers to peripherals
- OSC stands for Open Sound Control and was designed between 2002 and 2004
- It is becoming increasingly popular and is used by programs such as MAX/MSP and Reaktor
- Like MIDI, it can be used to connect together musical equipment such as synthesisers and computers
- It was originally intended for sharing music performance data (gestures, parameters and note sequences) between musical instruments (especially electronic musical instruments such as synthesizers), computers, and other multimedia devices.
- It is sometimes used as an alternative to MIDI, and it provides higher resolution and more detailed parameter information
- Connections use computer network protocols such as UDP/IP and Ethernet, and controllers are often connected using USB cables
- This means that it is possible to send OSC messages over a wired or wireless network
- OSC has been used extensively in experimental music controllers
I am currently working on a number of resources for the new specification in Music Technology (for first teaching in 2017-18).
To be released by mid-February (keep visiting this site for more information)
Full Content Guide for Edexcel AS and A Level Music Technology 2017
Guide to all content to be delivered for the course; suitable for teachers or for student revision.
Student Workbook for Edexcel AS and A Level Music Technology 2017
Suitable for use in lessons and for recording notes / key terminology.
Component 3 Practice Paper Pack
Three practice papers with mark schemes and links to audio files.
Component 4 Practice Paper Pack
Three practice papers with specially written audio tracks with mark schemes.