Could someone explain why there are … I’m not sure this is the right word, but “rays” coming off the brighter stars in the JWST image? Is that an artifact from a sensor that isn’t quite finished calibrating, or something real? I’m quite probably wrong, but I thought rays from light sources as seen on Earth were the result of atmospheric scattering.

Those are diffraction spikes, usually caused by the support struts that hold up the secondary mirror in reflecting telescopes.

You’re seeing diffraction spikes from the 3 arms holding the secondary mirror in place (each arm creates a spike along a different axis). This page has a good guide to the artifacts you might see in an astronomical image.
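The strut-to-spike relationship is easy to see numerically. Here’s a toy sketch (not JWST’s actual optics; all sizes are made-up pixel values) using the fact that a telescope’s far-field point-spread function is the squared magnitude of the Fourier transform of its aperture: a vertical occluding strut produces a horizontal spike, i.e. the spike runs perpendicular to the strut.

```python
import numpy as np

N = 256
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

# Toy aperture: a circular mirror with one vertical support strut
# blocking a thin stripe through the middle (sizes are arbitrary).
aperture = (np.hypot(x, y) < 60).astype(float)
aperture[np.abs(x) < 2] = 0.0

# The point-spread function is |FFT of the aperture|^2, shifted so
# the star's image sits at the center of the array.
psf = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
psf /= psf.max()

c = N // 2
# Light along the horizontal axis (perpendicular to the strut),
# well outside the bright core...
horizontal = psf[c, c + 20:c + 80].sum()
# ...versus light along a 45-degree diagonal at comparable radii,
# where only the circularly symmetric Airy rings contribute.
diagonal = sum(psf[c + i, c + i] for i in range(14, 57))
```

In this setup `horizontal` comes out much larger than `diagonal`; that excess is the spike. Add two more struts at 60° and 120° and you get the six-spike pattern the comment above describes.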

Cool, thank you!

James Webb telescope or JJ Abrams telescope… Amiright?

Well, JJ Abrams is more known for lens flares than diffraction spikes, but they both understand a thing or two about striking visuals. ;)

Geez, Boeing, don’t you guys have duct tape?

They tried but NASA was being squirrely about the $560k per roll that Boeing quoted. So really this is NASA’s fault.

Holy shit. This is a comparison of Spitzer (the previous generation of orbiting infrared telescope) and a calibration image from Webb of the same region of space. The difference is pretty staggering.

Speaking as someone whose dissertation was in infrared astronomy, this is going to be a total revolution in how we understand the galaxy around us and the universe beyond.

Edit: And I see the image (though not the gif) was posted above too. Fair enough - I’m just completely blown away. Frankly, I got used to thinking of infrared images of small regions of space as inherently fuzzy - longer wavelengths make it harder to resolve things sharply.

I can’t wait til they point that at individual planets :)

Enhance!

I was looking at this again in some article and wondered why those spikes aren’t digitally removed. Or will they be at some point? Or are these pics more real (digital, but not really adjusted), unlike the Hubble images that are heavily adjusted with various filters assigning visible colors to wavelengths that wouldn’t otherwise be visible? Hope that makes sense.

There’s no point in changing those, as they are what was actually “received” by the sensor. You could remove them, sure, but there’s no actual gain (from a scientific standpoint) in doing so; you can’t recover the data “obscured” by those spikes, so you might as well leave them there.

From an aesthetic point of view, sure, removing them would be a mostly trivial thing to do, but there’s little benefit. And in aesthetic terms, people tend to actually like the spikes - there’s a reason stars in children’s drawings are depicted with pointy parts. It’s likely most people actually expect those spikes nowadays.

Makes sense. Thanks for the quick answer.

If you need the data “obscured” by a spike, the better strategy is to take the image again with the platform slightly rotated. This is pretty practical unless it’s an extremely long exposure.
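As a rough illustration of how two rotated exposures could be combined (everything here is made up for the example - the arrays, the NaN flagging, all of it; a real pipeline would re-register the frames and use proper masks), pixels contaminated by a spike in one frame can be filled from the other:

```python
import numpy as np

# Two exposures of the same tiny field, already aligned to a common
# grid. Spike-contaminated pixels have been flagged as NaN.
exp1 = np.array([[1.0, np.nan, 1.0],
                 [1.0, np.nan, 1.0],
                 [1.0, np.nan, 1.0]])   # spike runs vertically
exp2 = np.array([[1.0, 1.0, 1.0],
                 [np.nan, np.nan, np.nan],  # spike rotated 90 degrees
                 [1.0, 1.0, 1.0]])

# Wherever at least one exposure saw the sky cleanly, nanmean
# recovers a value; only pixels masked in *both* frames stay NaN
# (here the center, where the star core sits in both rotations).
combined = np.nanmean(np.stack([exp1, exp2]), axis=0)
```

Each spike wipes out a different set of pixels, so the combination recovers everything except where the two spikes overlap.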

That’s an excellent addendum; I believe they have full rotational control over the telescope via gyroscopes and whatnot, so that’s a way to retrieve “obscured” data if they think there’s something interesting behind a spike.

In most scientific images they won’t have objects that bright in the field - at least not without some kind of masking by the instrument in question (as in exoplanet searches) - so it shouldn’t be an issue most of the time.

/cries in Joss Whedon

Was just coming to post this. Amazing increase in detail.