The Sun as Composer: How Solar Flares Became Music
Explore how the oddly high solar flare rates discovered by volunteers may offer new raw material for musical composition, mirroring the patterns of natural fluctuation.
The sun does not keep time the way we expect it to.
For decades, solar physicists predicted flare activity with reasonable confidence: an eleven-year cycle, peaks and troughs, a rhythm almost musical in its regularity. Then citizen scientists broke the model.
Volunteers working with NASA's heliophysics programs identified unusually high flare rates that challenged existing predictions. These weren't professionals with access to magnetograms and spectral data. They were ordinary people sifting through observations that professional teams lacked the bandwidth to process. The finding matters not only for what it reveals about the sun but for what it says about pattern: complex systems refuse to behave on schedule, and the anomalies hold the most interesting information.
Composers have known this for forty years.
A Tradition Built on Listening
Artists have mapped solar data to sound since the 1980s. That history is worth knowing because it proves the idea is not speculative: it has already produced real, performed, recorded music.
Maggi Payne, an American experimental composer in the San Francisco Tape Music Center lineage, released Solar Wind on her 1986 album Crystal. Working in electroacoustic textures and spatial sound design, she was among the first to treat the sun not as metaphor but as raw material, a source to draw from, not describe.
Robert Alexander went further. Working with space-physics researchers at the University of Michigan, Alexander mapped data from NASA's Advanced Composition Explorer satellite into musical structures: particle velocity became pitch, intensity became amplitude, charge states shaped vocal harmonies. His works, including Music from the Sun and Solar Heartbeat, stand among the most prominent examples of scientific sonification crossing into composed music. The sun is credited, in effect, as co-composer; its part is written in the data.
Lawrence Casserley and saxophonist Evan Parker recorded Solar Wind in 1997, bringing real-time signal processing into the tradition. Their improvisation treated the name as artistic metaphor rather than data source, exploring what solar-scale turbulence might sound like through the dialogue of saxophone and electronics. More recently, composer Klaus Nielsen collaborated with the European Space Agency to convert multi-year solar flare datasets into musical structures for installation and audiovisual work.
This is not a niche. This is a lineage.
It runs back further, through composers who built music from scientific and mathematical processes before satellite data existed: John Cage's indeterminate methods drawn from the I Ching, Iannis Xenakis's stochastic models pulled from physics and probability theory. What separates contemporary solar composition from those earlier experiments is that the data is real and scientifically grounded, sometimes archival, sometimes near-real-time, rather than purely abstract. A feed from a star. Not an abstraction.
Why Anomaly Is the Point
The citizen scientists who flagged unusual flare rates weren't looking for regularity. They were trained to notice deviation, moments where the data broke from the expected curve. NASA's citizen science programs operate on the premise that distributed human attention catches what automated systems smooth over. The algorithm averages. The human eye catches the outlier.
Composition works the same way. Messiaen transcribed birdsong because birds don't follow scales. Steve Reich let tape loops drift out of phase because the drift was the music. Björk built Biophilia around the principle that natural systems, from crystal formation to lunar pull, carry rhythmic logic worth composing with. The question is never whether natural phenomena contain pattern; they always do. The question is whether the pattern serves: whether it translates into something a human body responds to.
Solar flare data answers that question compellingly because it operates on multiple timescales at once. Individual flares last minutes to hours. Active regions persist for weeks. The solar cycle spans roughly a decade. And we now know the cycle contains unexpected surges, pockets of intensity that don't fit the predicted envelope. In compositional terms: micro-rhythms nested inside macro-structures, with aperiodic bursts that shatter expected form. No human composer would write this rhythm. The sun already did.
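A back-of-envelope calculation shows why those nested timescales demand more than simple time compression. The figures below are illustrative assumptions rather than mission data, but the arithmetic is the point: squeeze the whole cycle uniformly into a five-minute piece and the flares all but vanish.

```python
# Illustrative arithmetic: uniform compression of the ~11-year solar
# cycle into a 5-minute piece flattens the micro-scale events.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
cycle_s = 11 * SECONDS_PER_YEAR        # ~347 million seconds
piece_s = 5 * 60                       # a 5-minute composition
factor = cycle_s / piece_s             # ~1.16 million to 1

flare_s = 3600                         # an hour-scale flare
region_s = 3 * 7 * 24 * 3600           # an active region lasting ~3 weeks

print(f"compression: {factor:,.0f}:1")
print(f"hour-long flare -> {flare_s / factor * 1000:.1f} ms")  # ~3 ms click
print(f"3-week region   -> {region_s / factor:.1f} s")         # ~1.6 s gesture
```

At a single compression ratio, an hour-long flare shrinks to a three-millisecond click. To keep it audible, each timescale has to be mapped to its own musical level, which is exactly the nesting the data suggests.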
The Tools Already Exist
Sonification, mapping non-audio data to sound, is established practice. NASA has produced space data sonifications for years, and Alexander's work demonstrates it can reach concert halls. But sonification is translation. The next step is using solar activity's structural logic as a compositional scaffold: letting flare timing drive rhythmic variation, letting intensity curves shape dynamic contour, letting unexpected surges mark the moments a piece breaks its own rules.
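As one concrete sketch of that scaffold: flare onset times can become rhythmic placement, and peak X-ray flux, recovered from the standard GOES class letters (each letter a decade step in W/m²), can become dynamics. The record layout and field names below are assumptions for illustration, not any particular archive's schema.

```python
import math
from datetime import datetime

# GOES class letters are decade steps in peak X-ray flux (W/m^2).
FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def class_to_flux(class_type: str) -> float:
    """'M1.2' -> 1.2e-5 W/m^2."""
    return FLUX[class_type[0]] * float(class_type[1:])

def flares_to_notes(flares, piece_seconds=300.0):
    """Map flare onsets to rhythm and peak flux to dynamics (1-127)."""
    times = [datetime.fromisoformat(f["begin"]) for f in flares]
    t0 = min(times)
    span = (max(times) - t0).total_seconds() or 1.0
    notes = []
    for f, t in zip(flares, times):
        onset = (t - t0).total_seconds() / span * piece_seconds
        flux = class_to_flux(f["class_type"])
        # log-scale five decades of flux (A1=1e-8 .. X10=1e-3) onto velocity
        vel = round(127 * (math.log10(flux) + 8) / 5)
        notes.append({"onset_s": onset, "velocity": max(1, min(127, vel))})
    return sorted(notes, key=lambda n: n["onset_s"])

demo = [
    {"begin": "2024-05-01T02:10:00", "class_type": "C3.4"},
    {"begin": "2024-05-02T17:45:00", "class_type": "M1.2"},
    {"begin": "2024-05-03T06:02:00", "class_type": "X1.0"},
]
for note in flares_to_notes(demo):
    print(note)
```

The log scaling matters: flare fluxes span five orders of magnitude, far wider than any useful dynamic range, so the mapping compresses them the way ears already do.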
The tools to build this sit on composers' hard drives.
Cycling '74's Max/MSP, the patching environment that has powered live electronic music for three decades, reads external data streams and routes them to any sound parameter a composer can imagine. Patch a NASA open-data feed into a Max object, map particle velocity to pitch, and you have a live instrument driven by the sun. Csound, the foundational computer music language, offers finer control: its score and orchestra files can ingest tabular solar data, translating flare intensity into amplitude envelopes or spectral density across thousands of grain-level events. Granular synthesis environments, which generate sound from clouds of micro-events rather than continuous waveforms, match solar physics with particular precision: building large-scale texture from the accumulation of discrete bursts is how granular synthesis works, and it is also how individual particle events add up to plasma behavior. The compositional metaphor isn't forced. It is structural.
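To make the Csound path concrete: a score file is just text, so a short script can render note events as "i" statements. A minimal sketch follows; the instrument number and p-field layout (start, duration, amplitude, frequency) are illustrative choices, and the matching orchestra file defining instrument 1 is left to the composer.

```python
# Sketch: render note events (onset + velocity) as a Csound score.
# Instrument 1 and the p-field layout are illustrative, not standard;
# they assume an orchestra file that defines a matching instr 1.

def notes_to_csound_score(notes, base_cps=110.0):
    lines = [";   inst start    dur    amp    cps"]
    for n in notes:
        amp = n["velocity"] / 127.0      # dynamics follow flare flux
        dur = 0.1 + 2.0 * amp            # louder flares ring longer
        cps = base_cps * (1 + 3 * amp)   # ...and sit higher in pitch
        lines.append(f"i1 {n['onset_s']:8.3f} {dur:6.3f} {amp:6.3f} {cps:7.1f}")
    return "\n".join(lines)

demo_notes = [
    {"onset_s": 0.0,   "velocity": 64},   # C-class flare
    {"onset_s": 183.2, "velocity": 78},   # M-class flare
    {"onset_s": 240.5, "velocity": 102},  # X-class flare
]
print(notes_to_csound_score(demo_notes))
```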
Robert Alexander built his pipeline with the tools available in the 2000s. Today's composer has Max, Csound, SuperCollider, and access to NASA's heliophysics data archives. The infrastructure gap has closed. What remains is the human decision to wire these things together, and a collaborator on the physics side willing to explain what the data means.
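For the data side, one public option is NASA's DONKI space weather service, which lists flare events over plain HTTP. The endpoint and field names below follow DONKI's documented flare listing, but treat the exact schema as something to verify before building on it; the shared DEMO_KEY works for light experimentation.

```python
# Sketch: pull a month of solar flare records from NASA's DONKI API.
# Endpoint and field names follow the public documentation; verify
# the schema before wiring it into a piece. DEMO_KEY is rate-limited.
import json
import urllib.request

URL = ("https://api.nasa.gov/DONKI/FLR"
       "?startDate=2024-05-01&endDate=2024-05-31&api_key=DEMO_KEY")

with urllib.request.urlopen(URL) as resp:
    flares = json.load(resp)

for f in flares:
    # e.g. beginTime "2024-05-01T02:10Z", classType "M1.2"
    print(f.get("beginTime"), f.get("classType"), f.get("activeRegionNum"))
```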
The Score Isn't Finished
The lineage from Payne to Alexander to Nielsen proves this intersection produces real work. The anomalous flare data from NASA's citizen science programs proves there is more to discover: patterns no single institution would have found alone. The open data archives exist. Max, Csound, and SuperCollider sit on hard drives, waiting.
What's missing is the collaboration: the moment a heliophysicist and a composer sit at the same table and ask, What does this flare sequence sound like if we map intensity to timbre and timing to rhythm? The most interesting problems live at those borders. Solar physics and musical composition both work in waveforms, in periodicity and its violations, in the gap between predicted behavior and actual behavior. They are closer than they look.
The anxiety around AI and creativity comes from the fear that the machine replaces the artist: algorithm decides, human rubber-stamps. A solar data instrument inverts that entirely. The data is non-human in origin but produces nothing without a composer to interpret it. Alexander proved this: he needed physicists to read the data and musicians to perform it. The collaboration ran in both directions. That is the model worth building toward, not AI generating music for you, but the sun handing you a rhythm no human would have written, and a composer deciding what it means.
The volunteers noticed the sun was stranger than the models predicted. That strangeness is waiting to be heard.
References
- https://science.nasa.gov/get-involved/citizen-science/volunteers-find-oddly-high-solar-flare-rates
- https://science.nasa.gov/get-involved/citizen-science/more-than-36000-volunteers-helped-do-nasa-eclipse-science
- https://www.zooniverse.org/projects/eimason/solar-active-region-spotter/about/results
- https://iopscience.iop.org/article/10.3847/1538-4357/ae197d
If this resonated, SouthPole is a slow newsletter about art, technology, and the old internet — written for people who still enjoy thinking in full sentences.