Eletronik Fields
This project explores the use of electromagnetic noise as a compositional tool, creating a dynamic interaction between machine and artist. Through the control of virtual instruments and sound-processing chains, this interaction unfolds in a live performance.
The system applies Fourier-transform analysis to the audio signal and converts the result into MIDI notes that drive a set of VSTs, generating sound through randomized patterns. The resulting audio undergoes a second Fourier analysis, producing a real-time oscilloscope visualization.
The MIDI notes correspond to the dominant frequencies of the captured noise: a peak at 440 Hz, for instance, translates into the MIDI note A4.
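As a rough illustration of this mapping (the actual conversion happens inside a Max patch, so the Python below is only a sketch), the standard equal-temperament formula pins A4 = 440 Hz to MIDI note 69:

```python
import math

def freq_to_midi(freq_hz: float) -> int:
    """Map a frequency to the nearest MIDI note, with A4 = 440 Hz
    pinned to MIDI note 69, as in standard equal temperament."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(freq_to_midi(440.0))   # 69 -> A4, as in the example above
print(freq_to_midi(261.63))  # 60 -> middle C (C3 in Ableton's octave naming)
```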
An electromagnetic induction microphone captures the noise, while a MIDI controller is used to manipulate effects and parameters. The visuals, generated from the oscilloscope output, are projected during the performance.
Designed for intuitive interaction, this setup requires minimal expertise from the performer, fostering an immediate connection with the sound.
TOOLS
The main tools used in the development of the project:

Ableton Live 11 (DAW): hosts the example MIDI track used to create variations of MIDI notes.

Max 8 (patching platform): runs the patch that turns noise into MIDI notes.

Zwobot (library for VJing): uses JIT.MO to create the visuals.
CREATIVE REFERENCES
The idea of sonifying inaudible elements of nature, such as electromagnetic waves, was notably explored in Christina Kubisch’s Electrical Walks. Building on similar techniques, I captured environmental noise and transformed it into harmonic, musical output using an Ableton patch, making it more accessible to audiences unfamiliar with experimental music.
For technical refinement, I drew insights from the Max community and YouTube tutorials, exploring ways to integrate Ableton and Max within the same project.

Additionally, Brian Eno’s conceptual approach to ambient music, as explored in Reverb Machine’s article "How Brian Eno Created Ambient 1: Music for Airports," influenced my understanding of structure and signal processing, helping to refine the project’s sonic framework.

SOUND DESIGN
The project converts electromagnetic signals into music using a magnetic induction coil, capturing noise and transforming it into MIDI data to control VST instruments, loops, MaxLive patches, and various effects. The interface enables interaction with electronic objects, each generating distinct signals with unique frequencies, timbres, and repetition patterns.
A Zoom H5 recorder handles gain control, routing the signal to a MaxLive patch for detailed FFT analysis. The extracted data is converted into MIDI notes, which drive the VSTs. Both audio and MIDI elements are integrated into the composition, with randomized values introducing variations in execution, timbre, and intensity.
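To make that stage concrete, here is a small NumPy sketch of the same idea (a stand-in, since the project does this inside the MaxLive patch): take the dominant FFT bin of an audio block, convert it to a note, and randomize the velocity to vary the intensity.

```python
import numpy as np

def noise_block_to_midi(block: np.ndarray, sample_rate: int,
                        rng: np.random.Generator) -> tuple[int, int]:
    """Turn one audio block into a MIDI (note, velocity) pair:
    dominant FFT peak -> pitch, randomized velocity -> intensity."""
    windowed = block * np.hanning(len(block))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sample_rate)
    peak_hz = freqs[np.argmax(spectrum[1:]) + 1]  # ignore the DC bin
    note = round(69 + 12 * np.log2(peak_hz / 440.0))
    velocity = int(rng.integers(60, 120))  # randomized dynamics
    return int(note), velocity

# A noisy 440 Hz tone should come out near MIDI note 69 (A4).
rng = np.random.default_rng(0)
sr = 44100
t = np.arange(2048) / sr
block = np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(2048)
print(noise_block_to_midi(block, sr, rng))
```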
The goal is to create an intuitive experience for both performer and audience, reshaping raw noise into structured musical notes within a predefined scale.
Two open-source patches from the MaxLive community support the project. The first converts sound into MIDI notes, mapping frequencies to corresponding MIDI values (e.g., C-2 to C5) based on FFT readings. It defines note duration, BPM, and harmonic range. Instruments used include SynthBass, Organ, Lead, and Keys.
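The range and scale handling can be sketched as follows. The C major scale below is my assumption for illustration; the patch itself only fixes the C-2 to C5 range (Ableton/Max octave numbering, where C-2 is MIDI note 0).

```python
SCALE = (0, 2, 4, 5, 7, 9, 11)  # C major pitch classes; assumed, not documented
NOTE_MIN, NOTE_MAX = 0, 84      # C-2 .. C5 in Ableton/Max octave numbering

def quantize(note: int) -> int:
    """Clamp a raw note into the patch's range, then snap it down
    to the nearest pitch that belongs to the chosen scale."""
    note = max(NOTE_MIN, min(NOTE_MAX, note))
    while note % 12 not in SCALE:
        note -= 1
    return note

print(quantize(90))  # 84: clamped to C5
print(quantize(61))  # 60: C#3 snapped down to C3
```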
HARMONIZING
A CMI V VST manipulates samples captured with a Zoom H5 recorder and processed in Praat. Ableton Live audio racks then shape the sound with equalizers, delays, convolution reverbs, and other plugins.

The second patch analyzes dynamics, frequency, and intensity to modulate effects such as delays, convolution reverb, and pattern variation.
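As a toy illustration of that kind of mapping (the parameter range and the "loud" reference level below are assumptions, not values from the patch), a louder input block pushes an effect parameter higher:

```python
import numpy as np

def rms_to_param(block: np.ndarray, lo: float, hi: float) -> float:
    """Map the RMS level of an audio block onto an effect-parameter
    range, e.g. delay feedback or reverb wet/dry between lo and hi."""
    rms = float(np.sqrt(np.mean(block ** 2)))
    level = min(1.0, rms / 0.5)  # treat ~0.5 RMS as "loud" (assumed)
    return lo + level * (hi - lo)

# A loud block drives a hypothetical delay-feedback control toward 0.9.
loud = 0.8 * np.ones(1024)
print(round(rms_to_param(loud, 0.2, 0.9), 2))  # 0.9
```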
After all sound processing, the final output is routed to a set of plugins that use the audio signal to generate and modulate visuals, creating a direct correspondence between sound and image. These visuals are produced using a MaxLive-based library, which includes an oscilloscope and other functions to dynamically shape the visual output. The system reacts in real time to the artist’s performance with the induction coil, reinforcing the connection between sound and movement.
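For intuition, a static Python/matplotlib version of the oscilloscope view might look like this; the real visuals run in Zwobot/JIT.MO and react in real time, so this is only a sketch of the idea.

```python
import numpy as np
import matplotlib.pyplot as plt

# Render one audio block as an oscilloscope-style trace.
sr = 44100
t = np.arange(2048) / sr
block = np.sin(2 * np.pi * 220 * t) * np.hanning(2048)  # stand-in audio

plt.figure(figsize=(6, 2))
plt.plot(t * 1000, block, linewidth=0.8)
plt.xlabel("time (ms)")
plt.ylabel("amplitude")
plt.title("Oscilloscope view of one audio block")
plt.tight_layout()
plt.show()
```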

VISUALIZING
The performance is staged in a dark room, with a central table holding the artist's station. A projector displays the sound visualization in real time, while fabric panels (approx. 2 × 2 m) enhance the immersive atmosphere.
This setup allows the artist to physically integrate with the performance, manipulating electronic devices in sync with both the audio and visual projections.

With this arrangement, the artist becomes part of the work itself, linking their movements and their handling of the electronic devices to the unfolding sound and visual projections.