An app that allows you to hear images from space

Photo of WUT students with symbolic checks

 The NASA International Space Apps Challenge hackathon took place in Stalowa Wola between October 7 and 8, 2023.

What could space sound like? NASA is working on turning cosmic views into sound. The agency is looking for new solutions. One of them was proposed by the students from the Warsaw University of Technology during the hackathon.

Thanks to so-called sonification, NASA makes it possible for people, including those who are blind or visually impaired, to "listen" to astronomical images obtained with the Hubble Space Telescope or the Chandra X-ray Observatory and explore their data.

Over the past few years, the agency has been translating two-dimensional astronomical data into notes and sounds. Advanced tools already allow for the production of 3D images, and it was their sonification that our representatives tackled during the NASA International Space Apps Challenge hackathon.
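The basic idea behind this kind of sonification can be illustrated with a minimal sketch (an illustration only, not NASA's actual pipeline): scan a 2D brightness array from left to right, map each column's brightest pixel to a pitch, and map the column's total brightness to loudness.

```python
# Minimal sonification sketch (illustrative, not NASA's actual pipeline):
# scan a 2D brightness image left to right; each column's brightest
# pixel sets the pitch, and the column's overall brightness sets volume.
import numpy as np

def sonify_columns(image, f_min=220.0, f_max=880.0):
    """Return (frequency_hz, amplitude) pairs, one per image column."""
    image = np.asarray(image, dtype=float)
    notes = []
    for column in image.T:                      # left-to-right scan
        row = int(np.argmax(column))            # brightest pixel's row
        # higher position in the image -> higher pitch (a linear mapping,
        # chosen here for simplicity)
        frac = 1.0 - row / max(len(column) - 1, 1)
        freq = f_min + frac * (f_max - f_min)
        amp = float(column.sum() / (column.size * 255.0))  # 0..1 loudness
        notes.append((freq, amp))
    return notes

# tiny 3x3 "image": the bright pixel climbs from left to right,
# so the resulting pitches ascend
demo = [[  0,   0, 255],
        [  0, 255,   0],
        [255,   0,   0]]
print(sonify_columns(demo))
```

Feeding these (frequency, amplitude) pairs to any synthesizer yields a rising melody that traces the bright feature across the image.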

As part of the "Immersed in the sounds of space" challenge, the team had to design a method for creating sonifications of NASA's 3D space datasets, helping users better understand them and appreciate the wonders of the universe. The team comprised Maciej Leszek and Piotr Wojciechowski, students of Internet of Things engineering, and Jakub Budrewicz, a computer science graduate currently studying electronics (all from the Faculty of Electronics and Information Technology of the Warsaw University of Technology), together with Michał Oręziak from the Military University of Technology.

– From among several dozen challenges, we chose the one centered around sonification. Although we were not previously familiar with this technology, we quickly realized its significance – remarks Maciej Leszek. – Our solution is based mainly on the emotions evoked by sounds. We developed the AstroSonix application, which allows users, including people with disabilities, to experience the sonification of images captured by space telescopes, ground-based telescopes, and satellites – he explains.

The students used large language models to analyze the emotions evoked by the visualizations. In parallel, specially crafted heuristics parameterize the image and convert those parameters into soundtrack parameters. By combining data from both approaches, the application generates a soundtrack using open machine learning models. As a result, anyone can create a sonification free of charge using a smartphone or laptop.
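The two-stream idea described above might be sketched roughly as follows. All names and mappings here are hypothetical, chosen only to illustrate the concept: simple image statistics drive sound parameters, while a mood label (in AstroSonix derived from a language model; hard-coded in this sketch) selects the musical scale.

```python
# Hypothetical sketch of a two-stream sonification setup: image
# statistics set tempo and reverb, while a mood label picks the scale.
# All parameter names and mappings are illustrative assumptions.
from statistics import mean, pstdev

SCALES = {                      # semitone offsets; choice driven by mood
    "serene":  [0, 2, 4, 7, 9],         # major pentatonic
    "ominous": [0, 2, 3, 5, 7, 8, 10],  # natural minor
}

def image_to_sound_params(pixels, mood):
    """Map flat 0-255 pixel values plus a mood label to sound parameters."""
    brightness = mean(pixels) / 255.0    # overall luminance, 0..1
    contrast = pstdev(pixels) / 255.0    # spread of values, 0..1
    return {
        "tempo_bpm": 60 + round(brightness * 80),  # brighter -> faster
        "reverb": round(contrast, 2),              # more contrast -> more space
        "scale": SCALES.get(mood, SCALES["serene"]),
    }

print(image_to_sound_params([0, 64, 128, 192, 255], mood="ominous"))
```

These parameters could then be handed to a music-generation model or a conventional synthesizer to render the final soundtrack.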

Image from the application's demo, available on both mobile and desktop

The JeansMasters team secured sixth place, competing against 71 teams. – The time pressure was something I hadn't experienced before. Throughout the 30-hour hackathon, we managed just 30 minutes of sleep – says Maciej Leszek. – The effort paid off. We all agreed that we'd like to participate again next year – he adds.

The solution could find applications in planetariums, museums, or art galleries, enhancing the overall visitor experience. – Through the interactive integration of image and sound, visitors would have the opportunity to explore a new combination of the senses of sight and hearing, enriching their experience and evoking additional emotions – states Maciej Leszek. During the hackathon, he also won a trip to the Betzdorf Campus in Luxembourg – SES's spaceflight operations center.

– It will certainly serve as a great opportunity to broaden my understanding of satellites, which I hope to use in the Students' Space Association, to which I'm applying – he points out.

The students are already planning to participate in the next hackathon. They are also considering setting up a research club dedicated primarily to participating in such events.