Eletronik Fields
The project aims to use electromagnetic noise to compose music, creating a novel interaction between machine and artist. This interaction is presented through performance, with the noise controlling virtual instruments and sound-processing chains.
The project uses the Fourier Transform to analyze the audio signal and convert it into MIDI notes, which are then sent to a set of VSTs that synthesize sound through random patterns. The resulting audio is subjected to a second Fourier analysis, producing a real-time visualization in the form of an oscilloscope.
The MIDI notes generated correspond to the most intense frequencies of the captured noise. For example, a noise peaking at 440Hz generates an A4 MIDI note.
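For reference, this is the standard conversion from frequency to MIDI note number; a few lines of Python illustrate the general formula (this is not code from the project itself):

```python
import math

def frequency_to_midi(freq_hz: float) -> int:
    """Map a frequency to the nearest MIDI note number.
    A4 = 440 Hz corresponds to MIDI note 69."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(frequency_to_midi(440.0))  # 69, i.e. A4 as in the example above
```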
An electromagnetic induction microphone captures the noise, while a MIDI controller manipulates effects and parameters. A projector displays the oscilloscope-generated visuals.
This performative setup is intuitive, requiring minimal expertise from the performer; it fosters a direct connection with the result and allows little interference in its interpretation.
TOOLS
The main tools used for the development of the project:

- Ableton Live 11 (DAW): hosts an example MIDI track that allows you to create variations of MIDI notes.
- Max 8 (patching platform): runs the patch that turns noise into MIDI notes.
- Zwobot (library for VJing): uses JIT.MO to create the visuals.
CREATIVE REFERENCES
The concept of sonifying non-audible elements of nature, such as electromagnetic waves, was explored in Christina Kubisch’s Electrical Walks. Drawing on similar techniques, I captured environmental noise and transformed it into harmonic, musical output via an Ableton patch, making it more accessible to an audience unfamiliar with experimental music.
I also referenced the Max community and YouTube tutorials for additional insights into integrating tools such as Ableton and Max into the same project.
Additionally, Brian Eno’s conceptual approach to ambient music, detailed in Reverb Machine’s article "How Brian Eno Created Ambient 1: Music for Airports," helped shape my understanding of the piece’s structure and signal processing.
SOUND DESIGN
The project transforms electromagnetic signals into music via a magnetic induction coil, capturing noise and converting it into MIDI data to control VST instruments, loops, Max for Live patches, and various effects. The interface allows interaction with electronic objects, each producing unique signals with different frequencies, timbres, and repetition patterns.
A Zoom H5 recorder manages gain control, routing the signal to a Max for Live patch for detailed FFT analysis. The sound is then transformed into MIDI notes that control the VSTs. Both audio and MIDI data are incorporated into the composition, with randomized values generating variations in execution, timbre, and intensity.
The objective is to facilitate both the performer and audience’s understanding of the work, transforming raw noise into musical notes within a predefined scale.
Two open-source patches from the Max for Live community support the project. The first converts sound into MIDI notes, mapping frequencies to corresponding MIDI values (e.g., C-2 to C5) based on FFT readings. It defines note duration, BPM, and harmonic range. The instruments used are SynthBass, Organ, Lead, and Keys.
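A minimal Python sketch of the kind of conversion such a patch performs follows; the sample rate, buffer handling, and C-major scale are illustrative assumptions, not values from the actual audio2midi device:

```python
import numpy as np

SAMPLE_RATE = 44100                # assumed audio sample rate
NOTE_MIN, NOTE_MAX = 0, 84         # C-2..C5 in Ableton-style MIDI naming
SCALE = {0, 2, 4, 5, 7, 9, 11}     # illustrative predefined scale (C major)

def noise_to_midi(buffer: np.ndarray) -> int:
    """Find the strongest frequency in an audio buffer and map it to
    the nearest in-scale MIDI note within the patch's range."""
    spectrum = np.abs(np.fft.rfft(buffer))
    freqs = np.fft.rfftfreq(len(buffer), d=1.0 / SAMPLE_RATE)
    peak_hz = freqs[np.argmax(spectrum[1:]) + 1]           # skip the DC bin
    note = int(round(69 + 12 * np.log2(peak_hz / 440.0)))  # 440 Hz -> MIDI 69
    note = int(np.clip(note, NOTE_MIN, NOTE_MAX))
    while note % 12 not in SCALE:                          # snap down into scale
        note -= 1
    return note
```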
HARMONIZING
A CMI V VST is employed to manipulate samples captured via a Zoom H5 recorder and processed with Praat software. Ableton Live audio racks are utilized to modulate the sound with equalizers, delays, convolutions, and other plugins.
The second patch analyzes dynamics, frequency, and intensity to modulate effects such as delays, convolution reverb, and pattern variation. After processing, the output controls visual plugins that generate real-time sound-reactive visuals.
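This analysis-to-effects mapping can be sketched in Python as follows; the chosen features (RMS level, spectral centroid) and parameter ranges are illustrative assumptions, not the actual patch logic:

```python
import numpy as np

def modulate_effects(buffer: np.ndarray, sample_rate: int = 44100) -> dict:
    """Derive effect parameters from a buffer's intensity and brightness
    (illustrative ranges, not the real patch's values)."""
    rms = float(np.sqrt(np.mean(buffer ** 2)))              # dynamics
    spectrum = np.abs(np.fft.rfft(buffer))
    freqs = np.fft.rfftfreq(len(buffer), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {
        "delay_feedback": min(0.9, rms * 2.0),      # louder input, longer tail
        "reverb_wet": min(1.0, centroid / 5000.0),  # brighter input, wetter mix
    }
```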
After all the sound processing, the final output of the work is routed to plugins that use the audio input to generate and modulate visuals, producing a visualization that corresponds to what is heard. These visuals are created with a library developed for Max for Live (Zwobot), which provides an oscilloscope and other functions; they react in real time to the artist's performance with the induction coil.
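As a rough stand-in for that oscilloscope mode, a minimal Python sketch using matplotlib (in the actual setup, Zwobot renders this in real time inside Ableton Live):

```python
import numpy as np
import matplotlib.pyplot as plt

def draw_oscilloscope(buffer: np.ndarray) -> None:
    """Plot one audio buffer as a static oscilloscope trace."""
    plt.figure(figsize=(8, 2), facecolor="black")
    plt.plot(buffer, color="lime", linewidth=0.8)
    plt.axis("off")
    plt.show()

# A 440 Hz test tone stands in for the processed audio.
draw_oscilloscope(np.sin(2 * np.pi * 440 * np.arange(2048) / 44100))
```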
VISUALIZING
The performance is staged in a dark room, with a central table holding the artist’s station. A projector displays the sound visualization in real-time, while fabric panels (approx. 2x2m) enhance the immersive atmosphere.
This setup allows the artist to physically integrate with the performance, becoming part of the work itself during its execution: their movements and the way they manipulate the electronic devices accompany the sound and visual projections.
SCORE AND PERFORMANCE
The performance score is divided into two clefs. The upper clef, marked by the phi symbol, represents the right hand controlling the electromagnetic induction coil. Its lines correspond to the distance from the electromagnetic source, with the upper line indicating proximity and greater noise intensity.
The lower clef, marked by a shuffle symbol, represents the left hand controlling the MIDI controller, adjusting parameters of the generated MIDI notes and effects. Lines indicate the value range for each control knob.
Color coding is applied as follows:

- Blue, yellow, and purple for electronic devices
- Pink and red for MIDI control elements
In the final setup, electromagnetic noise is captured from devices such as a laptop charger, amplified by the Zoom H5, and processed in Ableton. This processed audio is then presented both sonically and visually, creating an immersive performance experience.
LINK TO VIDEO:
https://youtu.be/-LA7Wil530s
COGNITIVE ANALYSIS

The cognitive effort required to interact with the system is relatively low, since no particular technique is needed to perform with the induction microphone and the knobs of the MIDI controller. The work is also easy to interpret, since the system is automated to do most of the work: processing the sound, creating the melodies, controlling the samples through the VSTs, and randomizing values for the audio effects.
With small stimuli the system can create intense sound layers, behaving somewhat unpredictably due to the randomness applied to MIDI note generation: the same microphone input signal can produce several variations of the same note in duration and ADSR.
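A minimal Python sketch of this randomization; the value ranges are illustrative assumptions, not the patch's actual settings:

```python
import random

def vary_note(pitch: int) -> dict:
    """Return one randomized rendition of a MIDI note: same pitch,
    but varied duration, velocity, and ADSR envelope."""
    return {
        "pitch": pitch,
        "duration_beats": random.choice([0.25, 0.5, 1.0, 2.0]),
        "velocity": random.randint(60, 127),
        "adsr": {
            "attack_ms": random.uniform(5, 200),
            "decay_ms": random.uniform(20, 300),
            "sustain_level": random.uniform(0.3, 0.9),
            "release_ms": random.uniform(50, 800),
        },
    }

# The same input note yields a different rendition on every call:
print(vary_note(69))
print(vary_note(69))
```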
Since the image is directly correlated with certain frequencies of the sound, discontinuity can be experienced in the visual part of the work: depending on the random variables of the sound effects, the visuals show more glitch effects, changing colors, distorting the image, and shifting shapes.
The dynamic range from C-2 to C8 allowed the notes to be distributed among the four instruments that interpret the MIDI data, giving the piece a versatility that depends on the frequency of the captured noise: bass noise is interpreted by the bass synthesizer, low mids by the keys synthesizer, high mids by the lead synthesizer, and the highs by the organ synthesizer alone.
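This register-based routing can be sketched as follows; the boundary notes are assumptions, since the text gives only qualitative bands:

```python
def route_instrument(midi_note: int) -> str:
    """Route a MIDI note to one of the four VST instruments by register.
    The split points are illustrative guesses, not the patch's values."""
    if midi_note < 36:      # bass register
        return "SynthBass"
    elif midi_note < 60:    # low mids
        return "Keys"
    elif midi_note < 84:    # high mids
        return "Lead"
    else:                   # highs
        return "Organ"
```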
REFERENCES
- audio2midi version 1.1 by SebastienClara. (n.d.). Maxforlive.com. Retrieved January 2, 2023, from https://maxforlive.com/library/device/7868/audio2midi
- Electrical Walks. (n.d.). Christina Kubisch. https://christinakubisch.de/electrical-walks
- Zwobot - VJing & Visuals in Ableton Live. (n.d.). Zwobot. Retrieved January 2, 2023, from https://www.zwobotmax.com/
- How Brian Eno Created “Ambient 1: Music For Airports.” (2019, July 11). Reverbmachine.com. https://reverbmachine.com/blog/deconstructing-brian-eno-music-for-airports/?ph=a8d856f075fe696e4fe44033