1.2 Proximity Effect

  • The proximity effect is the emphasis of low frequencies that occurs as a sound source moves closer to the microphone
  • It can be reduced by moving further back or corrected later with EQ. Sometimes the effect is used deliberately to give a ‘warmer’ sound
  • It occurs in directional (cardioid, hypercardioid or figure-of-8) microphones, but not in omnidirectional ones
  • The diagram below shows the proximity effect at different distances away from the microphone.

proximity.png

1.1 Hardware: DI Boxes

DI Boxes

  • DI boxes (direct input) are used to eliminate the need to mic up electronic instruments.
  • They can be plugged directly into a mixer or audio interface.
  • They convert an unbalanced (two-wire) signal to a balanced (three-wire) one.
  • Microphones usually use XLR connectors (the name comes from Cannon’s X-series connector with the later addition of a Latch and Rubber insulation, though it is often glossed as ‘Extra Low Resistance’).
  • Balanced cables use phase cancellation to reduce noise: the sender transmits the signal and an inverted copy; the receiver re-inverts the copy and sums the two, so interference picked up equally along both wires cancels out.
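The cancellation described above can be sketched numerically. This is a toy model with made-up sample values, not real electronics:

```python
# Sketch of balanced-line noise rejection.
# The source sends the signal on the "hot" wire and an inverted copy on the
# "cold" wire; interference hits both wires equally (common-mode noise).
signal = [0.5, -0.2, 0.8, -0.6]      # original audio samples
noise  = [0.1,  0.1, -0.3, 0.05]     # interference picked up along the cable

hot  = [s + n for s, n in zip(signal, noise)]    # signal + noise
cold = [-s + n for s, n in zip(signal, noise)]   # inverted signal + same noise

# The receiver re-inverts the cold wire and sums: the noise cancels out
# and the signal doubles, so halving recovers the original samples.
received = [(h - c) / 2 for h, c in zip(hot, cold)]
print(received)  # matches `signal` (up to float rounding)
```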

di1.png

di2.png

1.1 Hardware: Audio Interfaces

The picture below shows a simple Focusrite audio interface with two separate inputs.

focusrite.png

  • Latency can be an issue when using external audio interfaces: a delay between playing a note and the computer recording it, or between playing and hearing it monitored back.
  • This is very off-putting when performing; the ‘feel’ or ‘groove’ of the song can be lost and the performance can end up out of time
  • Reducing the buffer size in your DAW reduces latency, and audio quantise functions can sometimes tidy up the timing errors it causes
  • CD quality recordings use a sample rate of 44.1kHz and a bit depth of 16 bits.
  • Even some relatively inexpensive audio interfaces can record at higher sample rates and bit depths, such as 48kHz and 24-bit.
  • The Red Book standard outlines the requirements for CD digital audio.
  • A higher sample rate improves the capture of higher frequencies and thus the higher frequency response. Nyquist’s theorem states that the highest frequency captured is half of the sample rate, and thus to capture the human hearing range, a sample rate of at least 40kHz must be used.
  • Benefits of using a higher bit depth include capture of wider dynamic ranges and minimising noise.
  • In order to capture or play back digital audio, some conversion needs to take place. Thus, audio interfaces incorporate analogue to digital converters (ADCs) and digital to analogue converters (DACs)
  • Audio interfaces will normally incorporate some kind of meter. This shows the volume of the input or output signal and allows the operator to avoid distortion.
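The Nyquist and bit-depth points above can be expressed as two rules of thumb. The function names here are illustrative:

```python
# Rough numbers for common digital audio formats: the Nyquist limit and the
# ~6 dB-per-bit rule of thumb for theoretical dynamic range.
def nyquist_hz(sample_rate_hz: float) -> float:
    """Highest frequency that can be captured at a given sample rate."""
    return sample_rate_hz / 2

def dynamic_range_db(bit_depth: int) -> float:
    """Approximate theoretical dynamic range: about 6.02 dB per bit."""
    return 6.02 * bit_depth

print(nyquist_hz(44_100))    # 22050.0 Hz -- just above the ~20 kHz hearing limit
print(dynamic_range_db(16))  # ~96 dB (CD quality)
print(dynamic_range_db(24))  # ~144 dB
```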

 

XLR / Jack / Combi Inputs

·      There are two combi inputs. These can take both XLR connections and jack connections.

·      Jack connections can be TS (tip/sleeve) and TRS (tip/ring/sleeve).

·      XLR connections are balanced. This reduces noise using phase cancellation: two copies of the signal with opposite polarity are combined, so interference common to both is removed by destructive interference

·      XLR connections are often used for microphones; jack connections for electric guitars and synthesisers

·      A locking tab at the top of the connector stops the XLR cable coming loose when pulled; it can only be released by pressing the tab

·      A pre-amp will amplify the input signal as part of the audio interface

Gain / Level / Pad

·      For each input, you can select between instrument and line level.

·      The gain control is used to set the input level for a good signal-to-noise ratio, reducing noise while preventing distortion

·      The pad switch reduces the sensitivity in order to prevent distortion if the signal being recorded is loud. Pad switches often reduce the input signal by up to 20dB.
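As a quick check on the pad figure, a level change in dB converts to a linear amplitude ratio with 10^(dB/20):

```python
def db_to_amplitude_ratio(db: float) -> float:
    """Convert a level change in dB to a linear amplitude ratio."""
    return 10 ** (db / 20)

# A -20 dB pad scales the input voltage to one tenth of its original level.
pad = db_to_amplitude_ratio(-20)
print(pad)  # 0.1
```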

Phantom Power / LED Indicators

·      The 48V switch activates or deactivates phantom power.

·      This is used to supply power to a condenser microphone or active DI box

·      The interface only has one 48V switch, so this controls phantom power for all channels

·      This would create issues if using a condenser and a ribbon microphone together, as phantom power can damage ribbon microphones (particularly vintage or passive designs)

·      MIDI stands for Musical Instrument Digital Interface

·      It is used as a language to communicate between different electronic instruments

·      The LED identifies when a MIDI signal is being sent or received

·      The second LED identifies when a USB connection to a computer is active

Headphones

·      Headphones can be used to monitor the signal

·      The headphone socket is a TRS/stereo jack connection. It can be used to provide a monitor mix (this may be different to what would be coming out of the monitor speakers).

·      On this interface, you can change which output the headphone mix mirrors, and use the knob to set the headphone volume, ensuring a good monitoring level while avoiding damage to your hearing

USB Connector

·      USB stands for Universal Serial Bus

·      It allows the interface to be connected to a computer and is used for transferring data

·      Power can be provided on a USB cable

·      USB 2.0 is a development of the original USB protocol, but is not as fast as Thunderbolt (Apple/Intel) or FireWire 800 (IEEE 1394b) connections.

MIDI Ports

·      These connectors are 5 pin DIN-180 connectors

·      They can be used to connect other equipment such as synthesisers and effects units

·      Equipment is daisy-chained, with the MIDI out of one piece of equipment connected to the MIDI in of the next

·      Some equipment incorporates a MIDI thru connector, which passes the incoming data straight through to the next device, reducing the delay that builds up along a long chain

Outputs

·      All sets of outputs here are in stereo, and are analogue

·      There are two sets of unbalanced phono / RCA connections.

·      The red connector is for the right channel and the white for the left

·      Unbalanced connectors are more susceptible to interference and have a poorer signal-to-noise ratio than balanced connectors

·      The two jack outputs are balanced, using TRS jack connectors

 

3.3 Tape Delay

Tape Delay

roland_space_echo_02

  • The peak level will illuminate if the signal clips or distorts and the VU meter shows the input level in volume units.
  • The mic/instrument volume is the gain. This can be decreased to avoid clipping. It can be turned up to obtain a better signal to noise ratio.
  • The instrument volume is for high-impedance instrument-level sources such as electric guitar.
  • The mode selector is used to select different taps / patterns / rhythms / types of delay / number or volume of repeats / combinations of playback heads.
  • The bass and treble controls are used to modify the tone of the delay (not the dry signal). It is a shelving filter that adjusts the levels of low and high frequencies.
  • The reverb/echo volume sets the wet/dry mix of the effect; “Straight” passes the dry signal with no echo at all. The unit also contains a spring reverb.
  • Repeat rate is the delay time / the amount of time between each repeat.
  • Intensity is the feedback amount, i.e. the gain of the feedback loop; it sets the number of repeats, and high settings cause self-oscillation.
  • The unit should be switched off when not in use to preserve the life of the tape.
  • Echo normal / footswitch works as a bypass, switching the effect (not the whole machine) out of the signal path.
  • HML gives different output levels/volumes so that the unit can match the different signal levels required by different studio equipment.
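The repeat rate, intensity and echo volume controls map onto the three parameters of a digital delay line. This is a minimal sketch of that signal flow under simplified assumptions, not an emulation of the actual unit:

```python
# Toy delay line: repeat rate = delay length, intensity = feedback gain,
# echo volume = wet mix.
def tape_delay(dry, delay_samples, feedback, mix):
    buffer = [0.0] * delay_samples   # stands in for the loop of tape
    out = []
    for i, x in enumerate(dry):
        delayed = buffer[i % delay_samples]
        # Feedback: the delayed signal is re-recorded along with the input,
        # so each repeat is quieter (if feedback >= 1, it self-oscillates).
        buffer[i % delay_samples] = x + delayed * feedback
        out.append(x + delayed * mix)
    return out

impulse = [1.0] + [0.0] * 7
print(tape_delay(impulse, delay_samples=2, feedback=0.5, mix=1.0))
# [1.0, 0.0, 1.0, 0.0, 0.5, 0.0, 0.25, 0.0] -- an echo every 2 samples, halving each time
```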

1.12 Creative Effects

Vocal Effects

A talkbox creates a vocal effect which is often applied to a guitar. The instrument’s sound is fed through a tube into the musician’s mouth; by changing the shape of the mouth, the player shapes the frequency content of the sound, and that filtering is heard in the original audio.

 

A vocoder analyses the vocal signal and applies it to a synth timbre. It sounds ‘robotic’ and the voice follows the pitch from the synth.


Other Creative FX

A ring modulator creates a dissonant effect with metallic, ringing sounds. It works by multiplying two signals, creating frequency products at the sum and difference of the input frequencies.
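Mathematically, ring modulation is just multiplication of the two input signals; a short numerical sketch shows where the sum and difference frequencies come from:

```python
import math

# Multiplying two sine waves at f1 and f2 produces components at f1 + f2 and
# f1 - f2, and neither original frequency survives -- which is what makes the
# result sound metallic and inharmonic.
sr = 8000                  # sample rate (Hz)
f1, f2 = 440.0, 300.0      # the two input frequencies
n = 1024
ringmod = [math.sin(2 * math.pi * f1 * t / sr) *
           math.sin(2 * math.pi * f2 * t / sr) for t in range(n)]

# Product-to-sum identity: sin(a)*sin(b) = 0.5*cos(a-b) - 0.5*cos(a+b),
# i.e. the output contains 140 Hz and 740 Hz, not 440 Hz or 300 Hz.
```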

Autotune

Pitch correction or autotune with a very fast response time forces each note to the nearest pitch in the scale set by the user. It is often used heavily in R&B music to create a slightly robotic effect, or subtly to keep vocalists in tune.

 

The term pan comes from ‘panorama’ in cinema. Auto-pan automatically moves a sound’s position in the stereo field, introducing creative movement across the image. In tracks from the 60s and 70s, parts were often hard-panned left or right due to the limitations of the technology.

A pitch shifter alters frequencies and so changes the musical note that is played. Harmonisers are intelligent pitch shifters that can add a musical interval to a part (e.g. a 3rd above). Historically, these effects were created by slowing down or speeding up a tape.
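The tape-speed relationship can be expressed directly: playing back at speed ratio r multiplies every frequency by r, and equal-tempered intervals correspond to ratios of 2^(semitones/12). A small illustrative helper:

```python
# Frequency scaling for pitch shifting: each semitone is a factor of 2**(1/12).
def shifted_frequency(freq_hz: float, semitones: float) -> float:
    """Frequency after shifting by the given number of equal-tempered semitones."""
    return freq_hz * 2 ** (semitones / 12)

print(shifted_frequency(440.0, 12))  # one octave up: 880.0 Hz (double the speed)
print(shifted_frequency(440.0, 4))   # a major third up: ~554.4 Hz
```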

 

1.1 MIDI, OSC and Communicating Between Instruments

There are a number of ways of communicating between different musical instruments. Without a doubt, the most common is MIDI, but since this standard was first created in 1983, there have been a number of revisions, and indeed new developments since then. This page on StudyMusicTech.com outlines some of the key information about this for your further study and research.

MIDI

  • MIDI stands for Musical Instrument Digital Interface.
  • It allows different electronic musical instruments and equipment to be connected together

MIDI Specification

  • MIDI was introduced in August 1983
  • It was a joint standard across different manufacturers
  • It is a language; it doesn’t store audio data. Rather it is a set of commands sent to connected instruments.
  • Each instrument has its own MIDI channel. This means that it only responds to messages meant for it.

General MIDI

  • General MIDI was a further set of requirements for MIDI devices that was intended to ensure consistent playback between devices (particularly with regard to instrument patches)
  • GM requires 24-voice polyphony, standardised controller numbers, and standardised locations of instrument patches
  • Whilst MIDI files are very small, they are often criticised for sounding unrealistic; this is really a criticism of the equipment playing them back rather than the protocol itself.

Commands and Controllers

  • MIDI controllers change a characteristic of the sound or track
  • Controllers are either switched or continuous. Continuous controller values range from 0 to 127; switched controllers are effectively on/off (values 0-63 read as off and 64-127 as on, so they are often sent simply as 0 or 127).
  • You can view controller data in the MIDI event list in your DAW, or sometimes in the piano roll editor.
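On the wire, a Control Change message is three bytes; a minimal sketch (the helper function name here is made up):

```python
# A MIDI Control Change message is three bytes: a status byte (0xB0 plus the
# channel number 0-15), the controller number, and the controller value (0-127).
def control_change(channel: int, controller: int, value: int) -> bytes:
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, controller, value])

# CC 7 (volume) set to full on the first channel (channels are 0-based here):
msg = control_change(0, 7, 127)
print(msg.hex())  # b0077f
```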

 

Name           Number   Function                                 Switched/Continuous
Modulation     1        Adds vibrato (modulates pitch)           Continuous
Volume         7        Changes overall track volume             Continuous
Pan            10       Position between L, C and R              Continuous
Expression     11       Volume fine-tuning                       Continuous
Sustain        64       Holds on notes                           Switched
Portamento     65       Bends between notes (glide)              Switched
Reverb Depth   91       Adds reverb (amount)                     Continuous
Chorus Depth   93       Adds chorus (amount)                     Continuous
Reset All      121      Resets all controllers to their defaults Switched

MIDI Controllers and Numbers

Quantise

  • Quantise is used to move notes that are poorly timed back onto the beat where they are meant to be. It achieves this through ‘snapping’ each note to a grid, the size of which is determined by the note resolution you choose
  • Resolutions are defined by the number of notes of that value that fit in a bar of 4/4 (e.g. 1/1 – one semibreve; 1/2 – two minims, etc.)
  • It has been criticised for removing the musical expression from the music. Thus, some more advanced quantise functions were created:
    • Swung quantise
    • Groove templates
    • Iterative quantise
    • Humanise
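Strict quantise and humanise are both simple arithmetic on note positions; a sketch, with positions measured in beats and illustrative function names:

```python
import random

# Strict quantise snaps each note-start to the nearest grid line; "humanise"
# then adds a small random offset back to avoid a mechanical feel.
def quantise(positions, grid):
    return [round(p / grid) * grid for p in positions]

def humanise(positions, max_offset):
    return [p + random.uniform(-max_offset, max_offset) for p in positions]

# A 1/8-note grid in 4/4 is every 0.5 beats.
played = [0.02, 0.47, 1.06, 1.52]    # slightly sloppy timing
print(quantise(played, grid=0.5))    # [0.0, 0.5, 1.0, 1.5]
```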

 

Articulation and Velocity

  • Articulation is how the notes are played; short or long, slurred together or detached.
  • This is recreated by altering parameters of the MIDI note, in particular the length and velocity
  • You could change the synth’s envelope to mirror the specific articulation of an instrument
  • In Logic, ‘Gate Time’ can be used to change the feel of the MIDI data
  • When a note on command is sent to a MIDI device, it is sent with a specific velocity value (from 0 to 127). This relates to how hard the note is struck
  • Velocity is a useful way of recreating the sound of real instruments and preventing MIDI sequences from sounding mechanical.
  • Synthesisers commonly assign filter envelopes that are dependent on velocity
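How velocity maps to loudness is left to the receiving instrument. One common illustrative mapping (an assumption for this sketch, not something mandated by the MIDI specification) is a power curve on the 0-127 range:

```python
# Map MIDI velocity (0-127) to an amplitude between 0.0 and 1.0.
# The curve exponent shapes the response: higher values make quiet
# notes fall away faster, which can feel more natural than a linear map.
def velocity_to_amplitude(velocity: int, curve: float = 2.0) -> float:
    return (velocity / 127) ** curve

print(velocity_to_amplitude(127))  # 1.0 (full velocity, full level)
print(velocity_to_amplitude(64))   # ~0.25 with the default quadratic curve
```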

 

MIDI Messages

midi-msg

Other Programming Languages / Control Environments

DMX

  • DMX stands for Digital Multiplex
  • It is faster than MIDI and provides a higher resolution
  • It is mostly used to control lighting fixtures and other parts of a show.
  • It was originally intended as a protocol for controlling light dimmers, which were not compatible with each other before the introduction of the standard (known in full as DMX512).
  • It soon became the preferred method for linking a lighting console to dimmers and other stage effects such as smoke machines and for the control of moving head lights.
  • It uses a 5 pin XLR cable to connect controllers to peripherals

 

OSC

  • OSC stands for Open Sound Control and was designed between 2002 and 2004
  • It is becoming increasingly popular and is used by programs such as Max/MSP and Reaktor
  • Like MIDI, it can be used to connect together musical equipment such as synthesisers and computers
  • It was originally intended for sharing music performance data (gestures, parameters and note sequences) between musical instruments (especially electronic musical instruments such as synthesizers), computers, and other multimedia devices.
  • It is sometimes used as an alternative to MIDI, and it provides higher resolution and more detailed parameter information
  • Connections use computer network protocols such as UDP/IP over Ethernet, and controllers are often connected using USB cables
  • This means that it is possible to send OSC messages over a wired or wireless network
  • OSC has been used extensively in experimental music controllers
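A basic OSC message has a simple binary layout. This sketch (the address pattern is a made-up example) follows the OSC 1.0 encoding rules for a single float argument:

```python
import struct

# An OSC message is: a null-terminated address pattern padded to a 4-byte
# boundary, a type-tag string (",f" for one float) padded the same way,
# then the arguments encoded big-endian.
def pad4(b: bytes) -> bytes:
    """Null-terminate and pad to the next multiple of 4 bytes."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    return pad4(address.encode()) + pad4(b",f") + struct.pack(">f", value)

msg = osc_float_message("/synth/cutoff", 0.75)
print(msg.hex())
```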

OSC Messages

osc