You’re a landscape photographer looking to spot a very specific type of tree. Here’s the problem: it’s a foggy, rainy, humid day, and on top of that it’s hot, so your camera lens is steaming up. Through the lens you can only make out the outline of an odd-looking object. Is it a tree, or perhaps a windmill? If only there were a way to classify it, along with the huge range of other odd-looking objects scattered across the field. One would hope that an expert photographer, who had stood at this very spot a number of times before, would be able to classify the objects in the field, be it a tree, a windmill, a cow, or a shed. My bet is that they could, because they understand how to account for the different weather conditions between them and the object, and they can recognise the patterns that distinguish the different objects in the field.
The challenge for astrophysicists is very similar. As Deep told me in conversation, “the main issue that we have to go around is the noise in the data”. Only this noise spans moving objects, hidden light patterns, the effect of gravity, weather effects, instrumental effects, heat, and pretty much anything else you could think of in this universe, or another.
Deep Chatterjee is an astrophysicist specialising in gravitational waves, with a particular interest in the electromagnetic signatures given off by colliding compact objects in space, such as neutron stars and black holes. When I spoke with him he was with the National Center for Supercomputing Applications; he has since taken a post at MIT. We discussed his work in astronomy so I could understand exactly where the AI comes in, and the following is taken from our conversation.
There are two key ways in which machine intelligence is used in astronomy and astrophysics.
1. Deep Learning – training models on archives of recorded data to do the science.

“In essence, I use deep learning libraries to train models to do the science essentially,” he said.
What this means: he knows the science of gravity and the other physical principles at play in space; he has the recorded libraries of data from space; and he knows the techniques needed to build intelligent mathematical models. Put the three together, and you have an AI that can retrieve images of an elliptical galaxy from 10 years of image data given the phrase “elliptical galaxy”.
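The retrieval step above can be sketched in miniature. Everything here is invented for illustration: the “images”, the `classify` stand-in, and the query function are hypothetical placeholders for a real trained deep-learning classifier running over a telescope archive.

```python
# Hypothetical sketch: retrieving archive images by class label.
# classify() stands in for a trained deep-learning model; here it
# just thresholds a fake brightness profile to pick a label.

def classify(image):
    # A smooth, bright profile is labelled "elliptical galaxy";
    # anything else is labelled "spiral galaxy" in this toy example.
    return "elliptical galaxy" if sum(image) / len(image) > 0.5 else "spiral galaxy"

def query_archive(archive, phrase):
    # Return every image whose predicted label matches the query phrase.
    return [img for img in archive if classify(img) == phrase]

archive = [
    [0.9, 0.8, 0.7],  # bright, smooth profile
    [0.2, 0.9, 0.1],  # fake spiral-arm profile
    [0.8, 0.9, 0.6],
]
matches = query_archive(archive, "elliptical galaxy")
```

The point is the shape of the pipeline, not the model: classify everything once, then a text phrase becomes a filter over the labels.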
This in itself is a step forward, and it has only been possible thanks to the huge amounts of data collected over the years, and the huge body of knowledge built up around that data describing what it actually records.
2. Generative Modelling – having enough data that an already-known model of something can be used to flip the problem on its head and solve it backwards. (link here)
“And what that is, it’s not classification or regression (a mathematical approach to finding the relationship between two or more variables), but more how you can generate data from already existing data.”
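To make the parenthetical concrete, regression in its simplest form can be shown in a few lines. The numbers below are made up: the data follow y = 2x + 1 exactly, so a straight-line fit should recover that relationship.

```python
import numpy as np

# Regression: recover the relationship between two variables.
# The data are generated from y = 2x + 1, so a degree-1 polynomial
# fit should return slope ~2 and intercept ~1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
slope, intercept = np.polyfit(x, y, 1)
```

Generative modelling, by contrast, asks the model to produce plausible new data, not just a line through existing points.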
Another way to look at generative modelling is through the phrase “simulation-based inference”: taking existing simulations of something, e.g. the dynamics of a fluid at a certain density, and projecting that modelled simulation onto new densities to gather data on how the fluid might behave in different conditions.
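A minimal sketch of that surrogate-model idea, with everything invented for illustration: `run_simulation` stands in for an expensive fluid solver (here it is just an analytic curve), and the “model” is a simple polynomial fit rather than the neural networks used in practice.

```python
import numpy as np

def run_simulation(density):
    # Stand-in for a costly fluid simulation: in this toy example the
    # output quantity grows smoothly with density.
    return 3.0 * density + 0.5 * density ** 2

# 1. Run the "expensive" simulator at a handful of training densities.
train_densities = np.linspace(0.1, 2.0, 20)
train_outputs = np.array([run_simulation(d) for d in train_densities])

# 2. Fit a cheap surrogate model to the simulated data
#    (a quadratic here; a neural network in real pipelines).
coeffs = np.polyfit(train_densities, train_outputs, 2)
surrogate = np.poly1d(coeffs)

# 3. Query the surrogate at a density the simulator never saw.
predicted = surrogate(1.37)
truth = run_simulation(1.37)
```

The win is cost: once the surrogate is trained on existing simulations, new conditions can be explored without re-running the simulator.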
The same principle, applied to astrophysics, would be using simulation-based inference to study a supernova. A supernova will be emitting energy across all wavelengths, but an observatory can only see in certain wavelengths. So a neural network (the bedrock mathematics of intelligent computing) is trained to understand the relationships between all the wavelengths in the spectrum. That understanding of the full spectrum can then be used to expand what can be learned from the select wavelengths of the supernova that the telescope is actually able to see.
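A toy version of that idea, assuming made-up spectra and a linear model in place of the neural network: learn the structure of full simulated spectra, then reconstruct a full spectrum from the few bands a “telescope” observes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths = 50
observed_bands = [5, 12, 30, 41]   # indices the "telescope" can see

# 1. Simulated training spectra: smooth bumps with random
#    amplitudes and centres (invented for illustration).
grid = np.linspace(0.0, 1.0, n_wavelengths)
amplitudes = rng.uniform(0.5, 2.0, size=(200, 1))
centers = rng.uniform(0.2, 0.8, size=(200, 1))
spectra = amplitudes * np.exp(-((grid - centers) ** 2) / 0.02)

# 2. Learn a linear map from the observed bands to the full spectrum
#    (a neural network would play this role in a real pipeline).
X = spectra[:, observed_bands]               # what the telescope sees
W, *_ = np.linalg.lstsq(X, spectra, rcond=None)

# 3. Reconstruct a new spectrum from its observed bands alone.
true_spectrum = 1.3 * np.exp(-((grid - 0.5) ** 2) / 0.02)
reconstructed = true_spectrum[observed_bands] @ W
```

The structure is the same as the supernova case: the model is trained where full spectra exist (simulations), then applied where only a few wavelengths are visible.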
He explained that “generative modelling has picked up a lot of interest in the physics community, and in astronomy as well”, and that while the physics-informed neural-network technique is still in its infancy, the proof-of-principle work has been done, marking a paradigm shift in the mathematical techniques used to predict what a particular system will do at a future point in time.
Though the field is still in its infancy, there is a clear direction towards scientific research institutions requiring AI-specialised computing capabilities. One example is the US National Energy Technology Laboratory partnering with Cerebras to meet its AI compute needs, a key step forward for research institutions scaling AI at the level of infrastructure and hardware design.