No sound can travel through space (sound requires a medium to travel through, and outer space is a near vacuum), but astronomers on Earth can experience data from space as sound with a new method known as sonification. They do not hear it in the sense that the sonified data sounds like the event itself; rather, non-sound data is turned into something that can be heard. There are many different types of sonification, which let you experience many phenomena as sound: for example, gamma-ray bursts, X-ray spectra, and even images. Almost everyone has an innate scientific curiosity, and sonification is a way to share data with those who may not be able to see it.
In a TED Talk, astrophysicist Wanda Diaz Merced shares how she used sonification to turn light curves of gamma-ray bursts into sound. This let her hear the data she had once been able to see as a sighted astrophysicist: the lines of the graphs she used to read became high and low pitches she could hear instead. She could tell when and where events became energetic and hear the fluctuations in the data as hums and blips. This allowed her to keep working alongside her colleagues as an astrophysicist studying these high-energy events, despite losing her vision.
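To get a feel for how a light curve can become sound, here is a minimal Python sketch of the general idea (not Diaz Merced’s actual pipeline): it maps synthetic brightness measurements onto a pitch range and writes a short audio file, so that brighter moments play as higher notes. The data, the frequency range, and the file name are all arbitrary choices for illustration.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical light curve: flux measurements over time (synthetic data).
flux = np.abs(np.sin(np.linspace(0, 8 * np.pi, 64))) + 0.1 * np.random.rand(64)

RATE = 44100                 # audio sample rate (Hz)
NOTE_LEN = 0.15              # seconds of audio per data point
F_LO, F_HI = 220.0, 880.0    # pitch range the flux is mapped onto (A3 to A5)

# Normalize the flux to [0, 1], then map it linearly onto the frequency range,
# so brighter measurements become higher pitches.
norm = (flux - flux.min()) / (flux.max() - flux.min())
freqs = F_LO + norm * (F_HI - F_LO)

# Render each data point as a short sine tone and string the tones together.
t = np.linspace(0, NOTE_LEN, int(RATE * NOTE_LEN), endpoint=False)
audio = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])

# Write 16-bit PCM audio.
wavfile.write("light_curve.wav", RATE, (audio * 32767).astype(np.int16))
```

Playing the resulting file traces out the light curve in pitch: a flare shows up as a rising run of notes, a dip as a falling one.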
The Fermi Large Area Telescope allows us to see beyond the visible spectrum and into gamma rays. Following is an example of a sonified gamma-ray burst. In the video, different qualities of gamma rays are played by different instruments: low-quality on a harp, medium-quality on a cello, and high-quality on a piano. Time in the sonification corresponds to time in the burst, though the actual burst was shorter; it was stretched out so it could be more easily understood. The volume corresponds to intensity.
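A rough Python sketch of that kind of mapping, with made-up data, might look like the following. The photon list, the pitch chosen for each quality bin, and the time-stretch factor are all illustrative assumptions; sine tones at different pitches stand in for the harp, cello, and piano, and louder notes stand in for more intense gamma rays.

```python
import numpy as np
from scipy.io import wavfile

RATE = 44100
STRETCH = 10.0     # slow the burst down so its structure is easier to hear

# Hypothetical photon list: (arrival time in seconds, quality bin, relative intensity).
photons = [(0.02, "low", 0.3), (0.05, "med", 0.8), (0.07, "high", 1.0),
           (0.12, "med", 0.6), (0.20, "low", 0.4)]

# Stand-ins for the harp / cello / piano timbres: one base pitch per quality bin.
PITCH = {"low": 220.0, "med": 440.0, "high": 880.0}

duration = (max(t for t, _, _ in photons) + 0.1) * STRETCH
audio = np.zeros(int(RATE * duration))
note = np.linspace(0, 0.2, int(RATE * 0.2), endpoint=False)   # 0.2 s per note

for arrival, quality, intensity in photons:
    start = int(arrival * STRETCH * RATE)
    tone = intensity * np.sin(2 * np.pi * PITCH[quality] * note)  # louder = more intense
    audio[start:start + len(note)] += tone

audio /= np.max(np.abs(audio))    # normalize to avoid clipping
wavfile.write("grb.wav", RATE, (audio * 32767).astype(np.int16))
```

Swapping the sine tones for sampled instruments would get closer to the feel of the Fermi video, but the mapping idea is the same.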
Another example of sonified gamma-ray bursts lies in the death of stars. In 2016, the National Radio Astronomy Observatory (NRAO) observed the death of a star. The star collapsed and triggered a supernova, leaving a black hole at its center. The collapse lasted but a few moments. Then the newborn black hole sent out a gamma-ray burst that lasted seven seconds and continued to shine in X-rays, visible light, and radio waves for weeks. These observations in different bands of light allowed scientists to create a time-lapse movie of the star’s explosion, which gave surprising results. The reverse shock, a shockwave pushing back on the jet of exploded material (the jet pushes on the material around it, and the material pushes back), was expected to last only a few seconds, but it lasted almost an entire day. The following sonification is the death of this star: several weeks of data from gamma rays, X-rays, visible light, and radio waves, turned into under a minute of sound.
X-ray spectra can also be sonified. These spectra have many uses, such as observing neutron stars, black holes, and high-temperature plasmas, and determining the elemental composition of these objects. Following is an X-ray spectrum that has been put through X-Sonify, a software tool created by NASA. Each note represents the intensity at a certain wavelength, with the wavelengths getting longer as the sound progresses. Gerhard Sonnert of the Harvard-Smithsonian Center for Astrophysics felt that sonification can be both a way of understanding data and an art form.
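The recipe is much like the light-curve example above, just with wavelength as the time axis. Below is a rough Python sketch (not X-Sonify itself) using a made-up spectrum: it steps from short to long wavelengths, plays one note per wavelength bin, and quantizes each bin’s intensity onto musical semitones so the result sounds more like melody than siren.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical X-ray spectrum: wavelength bins (angstroms) and measured intensity,
# here a single synthetic emission bump plus noise.
wavelength = np.linspace(1.0, 25.0, 48)
intensity = np.exp(-((wavelength - 12.0) ** 2) / 20.0) + 0.05 * np.random.rand(48)

RATE, NOTE_LEN = 44100, 0.12

# Step through the spectrum from short to long wavelengths, one note per bin.
order = np.argsort(wavelength)

# Quantize the normalized intensity onto two octaves of semitones above A3,
# so stronger emission plays as a higher note.
norm = (intensity - intensity.min()) / (intensity.max() - intensity.min())
semitones = np.round(norm * 24)
freqs = 220.0 * 2 ** (semitones / 12.0)

t = np.linspace(0, NOTE_LEN, int(RATE * NOTE_LEN), endpoint=False)
audio = np.concatenate([np.sin(2 * np.pi * freqs[i] * t) for i in order])
wavfile.write("xray_spectrum.wav", RATE, (audio * 32767).astype(np.int16))
```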
A rather famous (to astronomers, at least!) example of sonification happened in 2015. On September 14th, LIGO detected gravitational waves from the merger of two black holes, each with a mass about 30 times that of the Sun. According to LIGO Caltech, the power this event put out at its peak was more than 50 times that of all the stars in the observable universe combined, yet it lasted only fractions of a second. LIGO released the following clip, which turns the detected gravitational waves into sound waves. I can still remember that day, and first hearing the sonified blip. The sound clip is described as follows: “In the first two runs of the animation, the sound-wave frequencies exactly match the frequencies of the gravitational waves. The second two runs of the animation play the sounds again at higher frequencies that better fit the human hearing range. The animation ends by playing the original frequencies again twice.”
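A toy version of that presentation can be sketched in Python. The snippet below synthesizes a stand-in chirp (a sweep from tens of hertz up to a few hundred hertz with growing amplitude, roughly the shape of a binary black hole signal), plays it twice at its “true” frequencies, twice shifted up in pitch, and twice more at the original frequencies, echoing the structure of the LIGO clip. The sweep parameters and the shift factor are illustrative assumptions, not LIGO’s actual data or processing.

```python
import numpy as np
from scipy.io import wavfile

RATE = 44100

# Synthetic stand-in for the detected chirp: frequency sweeping from about
# 35 Hz up to 250 Hz over 0.2 s while the amplitude grows toward the merger.
t = np.linspace(0, 0.2, int(RATE * 0.2), endpoint=False)
f_inst = 35 + (250 - 35) * (t / t[-1]) ** 3     # instantaneous frequency (Hz)
envelope = (t / t[-1]) ** 2                     # amplitude growing toward the end

def render(freq_scale):
    """Integrate the (scaled) instantaneous frequency to get the phase, then
    synthesize a sine; scaling the frequency moves the whole chirp up in pitch."""
    phase = 2 * np.pi * np.cumsum(freq_scale * f_inst) / RATE
    return envelope * np.sin(phase)

audio = np.concatenate([render(1.0), render(1.0),   # true frequencies, twice
                        render(4.0), render(4.0),   # shifted up, twice
                        render(1.0), render(1.0)])  # true frequencies again
wavfile.write("chirp.wav", RATE, (audio * 32767).astype(np.int16))
```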
The last example of sonification that I will share in this article is sonification of images from telescopes. A telescope captures data in the form of ones and zeros and turns it into an image, but we can take the same data and turn it into sound instead of visuals. The NASA Chandra “A Universe of Sound” site has many such examples. I will go over the example using the Crab Nebula, as that supernova remnant is near and dear to my heart. The Crab is the remains of a star that exploded in 1054 CE. As the site describes, it has a powerful, spinning neutron star at its center, formed after its progenitor star collapsed. The neutron star’s strong magnetic field, combined with its rotation, generates jets of matter and antimatter flowing away from its poles, along with winds that blow outward from its equator. These data were translated into sound, moving from left to right across the image. Each wavelength of light was paired with a different family of instruments: X-rays are brass, the visible spectrum is strings, and infrared data is woodwinds. Light near the top of the image has the highest notes, and the further down you go, the deeper the note. Brighter light is played louder.
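As a rough single-band illustration of that scheme (not Chandra’s actual pipeline), the Python sketch below sweeps across a made-up image column by column: each row is assigned a pitch that gets lower toward the bottom of the image, and each pixel’s brightness sets how loudly its note is played. A real version would load the X-ray, visible, and infrared images from FITS files and give each band its own timbre.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical single-band image (rows x columns, brightness from 0 to 1).
# A real image would be loaded from a FITS file instead.
rng = np.random.default_rng(0)
image = rng.random((32, 64)) ** 4      # mostly dark, with a few bright pixels

RATE, COL_LEN = 44100, 0.08            # 0.08 s of audio per image column
rows, cols = image.shape

# Pitch depends on vertical position: the top row gets the highest note,
# and each row below it is a little deeper.
freqs = 1760.0 * 2 ** (-np.arange(rows) / 6.0)

t = np.linspace(0, COL_LEN, int(RATE * COL_LEN), endpoint=False)
columns = []
for c in range(cols):                  # sweep left to right across the image
    tones = [image[r, c] * np.sin(2 * np.pi * freqs[r] * t) for r in range(rows)]
    columns.append(np.sum(tones, axis=0))   # brighter pixels play louder

audio = np.concatenate(columns)
audio /= np.max(np.abs(audio))         # normalize to avoid clipping
wavfile.write("image_scan.wav", RATE, (audio * 32767).astype(np.int16))
```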
Sonification is a powerful tool. It can help astronomers better understand and explain data, and it can share something visible with those who cannot see. Sonification is slowly becoming more widespread, and I hope its popularity continues to grow so that more people can come to enjoy and understand astronomy.