February 8, 2023


You’ve seen the first full-color images from the James Webb Space Telescope, right? A stellar nursery revealing previously unseen stars, a massive exoplanet whose atmosphere has been examined, a cluster of galaxies, a beautiful planetary nebula and the deepest image of the universe we’ve ever taken.

Cool, right? But are they real?

Of course they are real!

Are they exactly the same as what Webb captures in a single image, like when you take a picture with your phone?

Not at all.

Webb is designed to be sensitive to light we cannot see. It also has four scientific instruments and seventeen modes.

“When you collect data, they don’t look like beautiful color images at all,” said Klaus Pontoppidan, Webb project scientist at STScI, who led a team of 30 image processing experts. “They don’t look like anything at all [and] you can only appreciate them if you know what to look for.”

Webb’s engineers had to process the images we see before they were published, for some very simple, common-sense reasons.

How does that work?

It’s nothing like snapping a picture on your phone.

Planning the Images

The first step is target selection. NASA is always looking for objects that make a beautiful frame, have structure, and use color well, while also highlighting the science.

Webb can’t see every part of the sky at any given time. And because the telescope’s launch had been delayed many times, engineers couldn’t finalize plans for the first images until Webb lifted off in December 2021.

When it did, the engineers had a list of about 70 targets, chosen to demonstrate the breadth of Webb’s science and to promise spectacular color images.

“Once we knew when data would be available, we could look at that list and choose the highest priority target that was visible at that time,” Pontoppidan said. “These images were planned for a long time. [and] We’ve done a lot of work to anticipate what the observations should look like so that everything could be configured correctly.”

How Webb’s Data Gets Back to Earth

Before engineers can start processing Webb’s images, the raw data must be returned to our planet from a million miles away. This is done using NASA JPL’s Deep Space Network (DSN), which is how engineers communicate with, and receive data from, more than 30 robotic probes throughout and beyond the solar system, including Webb. The DSN has three complexes, spaced roughly 120 degrees apart in longitude: Goldstone, California; Madrid, Spain; and Canberra, Australia.

Radio waves are very reliable, but slow: data comes in at several megabits per second (Mbps). However, the DSN will eventually be upgraded from slow radio transmissions to ultra-fast “space lasers” (optical links) that could increase data rates by a factor of 10 or even 100.
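To get a feel for those rates, here is a back-of-the-envelope sketch. The "several Mbps" figure comes from the article; the 50 GB batch size and the exact radio/laser rates are illustrative assumptions, not DSN specifications.

```python
# Rough downlink times at radio vs. hypothetical optical-link rates.
# Rates and data volume are illustrative assumptions.

def downlink_hours(data_gigabytes: float, rate_mbps: float) -> float:
    """Hours needed to transfer `data_gigabytes` at `rate_mbps` (megabits/s)."""
    bits = data_gigabytes * 8e9          # gigabytes -> bits
    seconds = bits / (rate_mbps * 1e6)   # bits / (bits per second)
    return seconds / 3600

# A hypothetical 50 GB batch of raw image data:
radio = downlink_hours(50, 3)      # assumed ~3 Mbps radio link
laser = downlink_hours(50, 300)    # the article's "100x" laser speedup
print(f"radio: {radio:.1f} h, laser: {laser:.1f} h")
```

At a few megabits per second, even a modest data batch ties up a ground antenna for many hours, which is why the observations and downlinks are scheduled so carefully.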

“We plan things, upload them to the observatory, get the data and bring them back to Earth — and then we have another long time to process the data,” Pontoppidan said.

Why the Colors in Webb’s Photos Are “Fake”

Are Webb’s images in color? Are the colors the ones the telescope actually recorded? No, they are not. Webb sees red, or rather infrared: it is specifically designed to detect infrared light, the faintest and farthest-traveling light in the universe.

It essentially sees thermal radiation, not visible light. It sees a different part of the electromagnetic spectrum:

Think of a rainbow. One end is red and the other is blue or violet. In reality, that rainbow extends much further, but those two extremes mark the limits of the colors the human eye can perceive. Beyond blue there are shorter and shorter wavelengths we have no color names for. The same is true beyond red, where the wavelengths of light grow longer.

That’s where Webb is looking — the infrared part of the electromagnetic spectrum.

It uses filters and masking techniques to detect faint light sources next to very bright ones. But none of what it records is “color”.

So how can the photos we see be in color?

How Webb’s Photos Are Colored

To make an image, Webb’s data is shifted along the electromagnetic spectrum, from the infrared parts we can’t perceive into the visible light we can see.

The team takes monochrome intensity images from Webb through up to 29 different narrowband filters, each detecting a different wavelength of infrared light. They assign a different visible color to the light collected by each filter, from red for the longest wavelengths to blue for the shortest. Then they combine them into a composite image.
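The idea of "chromatic ordering" can be sketched in a few lines: each filter's grayscale frame is tinted with a hue chosen from its wavelength (shortest toward blue, longest toward red), and the tinted frames are summed into one RGB image. The filter names below are real Webb filters (F090W is roughly 0.9 microns, and so on), but the tiny 2x2 "images" are made-up toy data, and real processing involves many more steps (calibration, stretching, white balance).

```python
import colorsys

filters = {            # filter -> central wavelength in microns
    "F090W": 0.9,
    "F200W": 2.0,
    "F444W": 4.4,
}
frames = {             # filter -> tiny grayscale frame (values 0..1)
    "F090W": [[0.2, 0.8], [0.0, 0.5]],
    "F200W": [[0.5, 0.1], [0.9, 0.3]],
    "F444W": [[0.7, 0.4], [0.2, 0.6]],
}

lo, hi = min(filters.values()), max(filters.values())

def hue_for(wavelength: float) -> float:
    # Shortest wavelength -> hue 2/3 (blue), longest -> hue 0 (red).
    t = (wavelength - lo) / (hi - lo)
    return (1 - t) * (2 / 3)

# Sum each tinted frame into a 2x2 RGB image.
rgb = [[[0.0, 0.0, 0.0] for _ in range(2)] for _ in range(2)]
for name, wl in filters.items():
    r, g, b = colorsys.hsv_to_rgb(hue_for(wl), 1.0, 1.0)
    for y in range(2):
        for x in range(2):
            v = frames[name][y][x]
            px = rgb[y][x]
            px[0] += v * r
            px[1] += v * g
            px[2] += v * b

# Clip to [0, 1] as a crude stand-in for the stretch/white-balance steps.
rgb = [[[min(1.0, c) for c in px] for px in row] for row in rgb]
```

Here the longest-wavelength filter (F444W) paints its signal red and the shortest (F090W) paints its signal blue, preserving the relative ordering of the infrared data even though none of these wavelengths is visible to the eye.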

Is this cheating? Hardly. All the engineers do is take radiation from a part of the spectrum our eyes can’t see and shift it into a part we can.

It’s like playing a song in a different key.

Also, every camera, including the one in your smartphone, uses filters to capture the image you see. Not Instagram filters, but individual red, green, and blue filters whose outputs, when combined, produce the visible images that look “real”.

If you think Webb’s images aren’t real, then you must think your own smartphone photos are fake too.

How Long It Takes to Process Webb’s Images

This is a complex process that had never before been done with Webb data. As a result, it takes weeks for each image to take on its colorful brilliance.

“Typically, the process from raw telescope data to a final, clean image that conveys scientific information about the universe can take anywhere from weeks to a month,” said Alyssa Pagan, a science visuals developer at STScI.

It’s definitely worth the wait.

“For the first images, we only had a few days of observations,” Pontoppidan said. “This is really just the beginning; we’re just scratching the surface.”

May you have a clear sky and open your eyes.


