It has become exceedingly clear in recent months that NASA’s James Webb Space Telescope is doing exactly what it set out to do. Just as its creators had hoped, the multibillion-dollar machine is flawlessly “unfolding the universe,” revealing cosmic light we can’t see with our own eyes, and the outstanding results are making even the most unlikely stargazers feel alive.
One day, the gold-plated telescope sent Twitter into a frenzy over a blurry red dot. For 48 hours, people worldwide stared at a galaxy born shortly after the birth of time itself. Thanks to the technological prowess of the JWST, it seems, humanity stands united over stardust.
But here’s the thing.
Amid the awe, scientists at the Massachusetts Institute of Technology warn that there’s one crucial scientific consequence of having a superhero telescope that we shouldn’t overlook.
If the JWST is some sort of upgrade from zero to 100, they wonder, is it possible that our scientific models also need a reboot from zero to 100? Can the models scientists have been using for decades keep up with the power of the device, or will they fall short in revealing what it’s trying to tell us?
“The data we’ll get from the JWST will be incredible, but…our insights will be limited if our models don’t match in quality,” Clara Sousa-Silva, a quantum astrochemist at the Center for Astrophysics, Harvard & Smithsonian, told CNET.
And according to a new study she co-authored, published Thursday in the journal Nature Astronomy, the answer is yes.
More specifically, the paper suggests that some of the light-decoding tools scientists normally use to understand the atmospheres of exoplanets aren’t fully equipped to handle the JWST’s exceptional light data. In the long run, that barrier could hamper the most enormous JWST quest of all: the hunt for extraterrestrial life.
“Currently, the model we use to decrypt spectral information is not consistent with the precision and quality of the data we have from the James Webb telescope,” Prajwal Niraula, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences and co-author of the study, said in a statement. “We have to improve our game.”

The Carina Nebula, as imaged by NASA’s James Webb Space Telescope.
NASA
Here’s one way to think about the riddle.
Imagine pairing the latest, most powerful Xbox console with the very first version of a TV. (Yes, I’m aware of how far-fetched this scenario is.) The Xbox would try to send the TV gorgeous, colorful, high-resolution graphics to show us, but the TV wouldn’t have the capacity to render any of it.
I wouldn’t be surprised if the TV straight-up exploded. But the point is that you wouldn’t know what the Xbox is trying to offer you unless you got a TV with an equally high resolution.
Similarly, in the realm of exoplanet discovery, scientists feed lots of data about light, or photons, from deep space into models that account for “opacity.” Opacity measures how easily photons pass through a material, and it differs depending on things like the wavelength of the light and the material’s temperature and pressure.
This means that each such interaction leaves a telltale mark of the photon’s properties and, therefore, when it comes to exoplanets, of the kind of chemical atmosphere those photons passed through to reach the light detector. That’s how scientists perform a kind of inverse calculation, working backward from light data to figure out what an exoplanet’s atmosphere is made of.
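To make the idea concrete, here is a minimal sketch of the basic physics involved: a Beer-Lambert-style relation between opacity and how much light survives a trip through a gas. The function name, opacity values and column density below are illustrative assumptions, not the study’s actual model.

```python
# Minimal illustration (not the study's model): the more opaque a gas is at a
# given wavelength, the smaller the fraction of photons that get through it.
import numpy as np

def transmitted_fraction(opacity_cm2_per_g, column_density_g_per_cm2):
    """Fraction of photons that survive a slab of gas.

    opacity_cm2_per_g        : mass opacity at a given wavelength (cm^2/g)
    column_density_g_per_cm2 : how much material the light crosses (g/cm^2)
    """
    optical_depth = opacity_cm2_per_g * column_density_g_per_cm2
    return np.exp(-optical_depth)

# A gas that absorbs strongly at one wavelength leaves a dip in the spectrum;
# matching the depth of such dips against an opacity model is, loosely,
# the "inverse calculation" described above. All numbers here are made up.
wavelengths = np.array([1.0, 1.4, 2.0])        # microns (illustrative)
opacities   = np.array([0.01, 5.0, 0.05])      # cm^2/g  (illustrative)
column      = 2.0                               # g/cm^2  (illustrative)
print(transmitted_fraction(opacities, column))  # deep dip at 1.4 microns
```

In this toy picture, the dip at 1.4 microns is the kind of fingerprint an opacity model has to decode back into a temperature, a pressure and a chemical makeup.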
In this case, the detector in question sits aboard the James Webb Space Telescope. But in the team’s new study, after putting the most widely used opacity model to the test, the researchers saw JWST light data hit what they call an “accuracy wall.”
The model wasn’t sensitive enough to tell, for instance, whether a planet has an atmospheric temperature of 300 or 600 Kelvin, the researchers say, or whether a particular gas makes up 5% or 25% of the atmosphere. Not only is such a difference statistically significant, but according to Niraula it is also “important for us to constrain planetary formation mechanisms and reliably identify biosignatures.”
That is, evidence of alien life.
“We need to work on our interpretive tools,” Sousa-Silva said, “so we don’t find ourselves seeing something great through the JWST but not knowing how to interpret it.”

Are we looking at the oldest galaxy ever found?
T. Treu/GLASS-JWST/NASA/CSA/ESA/STScI
The team also found that the model tends to obscure its own uncertainty: with a few adjustments to its parameters, the uncertainty seems to disappear, and the results look like a good fit even when they’re wrong.
“We found that there are enough parameters to tweak, even with the wrong model, to still get a good fit, meaning you wouldn’t know your model is wrong and what it’s telling you is wrong,” Julien de Wit, an assistant professor in MIT’s EAPS and co-author of the study, said in a statement.
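To see how that can happen, here is a heavily simplified toy example, not the team’s analysis: an invented absorption formula in which a cooler atmosphere with less of a gas produces exactly the same dips as a hotter one with more of it, so a fit can look “good” while hiding the ambiguity. The formula, abundances and temperatures are all assumptions made for illustration.

```python
# Toy illustration of parameter degeneracy (not the study's analysis): two
# different atmospheres yield identical spectra under a crude, invented model.
import numpy as np

def toy_absorption_depth(abundance, temperature_k, base_opacity):
    """Toy dip depth: more gas absorbs more, hotter gas (here) absorbs less.
    The scaling is invented purely to show a trade-off, not real physics."""
    return 1.0 - np.exp(-base_opacity * abundance * (300.0 / temperature_k))

base_opacity = np.array([0.2, 3.0, 0.5])  # made-up opacities at 3 wavelengths

atmosphere_a = toy_absorption_depth(abundance=0.05, temperature_k=300.0,
                                    base_opacity=base_opacity)
atmosphere_b = toy_absorption_depth(abundance=0.10, temperature_k=600.0,
                                    base_opacity=base_opacity)

print(atmosphere_a)  # identical to atmosphere_b in this toy model, so a fit
print(atmosphere_b)  # can't say which atmosphere it's actually looking at
```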
Going forward, the team is pushing for opacity models to be improved so they can keep pace with the JWST’s spectacular revelations, and it is calling in particular for cross-disciplinary studies between astronomy and spectroscopy.
“So much could be done if we knew perfectly how light and matter interact,” Niraula says. “We know that well enough around Earth’s conditions, but once we move into different kinds of atmospheres, things change, and that’s a lot of data, of increasing quality, that we risk misinterpreting.”
De Wit compares the current opacity model to the Rosetta Stone, the ancient translation tool, and explains that this Rosetta Stone has worked well enough so far, such as with data from the Hubble Space Telescope.
“But as we move to the next level with Webb’s precision,” the researcher said, “our translation process will prevent us from catching important subtleties, such as those that make the difference between a planet being habitable or not.”
As Sousa-Silva puts it, “It’s a call to improve our models so we don’t miss the subtleties of data.”