Using data recorded from space, researchers say they've created a soundscape of the Earth.
Performed live last week at the NASA booth at the supercomputing conference in Austin, Texas, the experiment used a process called data sonification to transform images of Earth beamed back by the space weather satellite DSCOVR into sound.
The result is an atmospheric soundscape reminiscent of Brian Eno's ambient music.
DSCOVR was launched in February this year from Cape Canaveral in Florida and has NASA's Earth Polychromatic Imaging Camera (EPIC) on board.
Domenico Vicinanza, from Anglia Ruskin University and GÉANT in Cambridge, U.K., and colleague Genevieve Williams developed algorithms to give a specific pitch and melody to each image sent back from the satellite.
"Sonification gives space research a new dimension. When you hear the resulting music you really are hearing the data," Vicinanza, a physicist, classical composer and director of Anglia Ruskin University's Sound And Game Engineering (SAGE) research group, told Discovery News.
Vicinanza has previously produced music based on data from NASA's Voyager mission and even sonified data from the ATLAS experiment at the Large Hadron Collider (LHC) in Switzerland, converting the Higgs boson-like particle into sound.
For the new experiment, Vicinanza and Williams's algorithms relied on specific features of the image's pixels: brightness determines pitch (the brighter the pixel, the higher the pitch), while color determines octave.
"Green-brown colors determine lower octaves, while blue-gray colors produce higher octaves," Vicinanza said.
"Clusters of pixels with similar colors create a smoother melody. On the contrary, metereological events, such as clouds over the sea, produce sudden changes," he added.
The resulting music is a mix of ethereal, calming sounds and dotted rhythms.
"The melody changes and evolves as the EPIC picture is analyzed from left to right and from top to bottom. Clusters of pixels of different sizes have been used to analyze the picture at different scales," Vicinanza said.
He explained that bigger clusters returned longer notes, building the harmonic background, while smaller clusters were mapped to shorter notes, moving smoothly within a range of sounds or jumping to a different octave depending on the color of the pixels.
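The cluster-to-duration idea can also be sketched: scan a row of pixels left to right, group runs of similar values into clusters, and give each cluster a note whose length grows with the cluster's size. The similarity threshold and the duration scale below are assumptions, chosen only to make the behavior visible.

```python
# Illustrative sketch: runs of similar pixels become clusters; bigger
# clusters yield longer notes, sudden color changes start new (short) notes.

def clusters_to_notes(row, threshold=20):
    """Group a row of grayscale values (0-255) into runs of similar pixels
    and return a (mean_value, duration_in_beats) pair per cluster."""
    notes = []
    cluster = [row[0]]
    for value in row[1:]:
        if abs(value - cluster[-1]) <= threshold:
            cluster.append(value)          # similar pixel: extend the run
        else:                              # sudden change: close the note
            notes.append((sum(cluster) / len(cluster), 0.25 * len(cluster)))
            cluster = [value]
    notes.append((sum(cluster) / len(cluster), 0.25 * len(cluster)))
    return notes

# A smooth stretch of sea followed by an abrupt cloud edge yields one long
# note, then a short bright one.
print(clusters_to_notes([60, 62, 61, 64, 63, 200, 205]))
# → [(62.0, 1.25), (202.5, 0.5)]
```

Run over every row of an image, top to bottom, this produces exactly the contrast the article describes: uniform regions sustain long harmonic notes while cloud edges interrupt them with short, dotted figures.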
According to Vicinanza, the algorithm can be further developed for specific requirements.
"For example, we can use maps of the global geographic distribution of CO2 in the atmosphere and create sonifications to pinpoint where carbon dioxide is being emitted and absorbed," he said.