Another cloud-free day in Scotland let me catch almost 9 hours of this huge and lively prom. Taken with my homemade 90mm modded Coronado PST and DMK21 camera. Software: CdC, Eqmod, DSSR, AutoStakkert!, Wavesharp, DVS, Shotcut and Gimp.
David Wilson on April 8, 2025 @ Inverness, Scotland
https://spaceweathergallery2.com/indiv_upload.php?upload_id=221951
The “actual image your camera sees” is hard to define in astrophotography, because it’s kinda hard to define in regular digital photography, too.
The sensor collects raw data at each pixel: whatever light makes it past that pixel’s color filter excites electrons in the pixel, and that charge is read out by the image processing chip, which assigns each pixel a color and combines neighboring values into the larger picture.
So what does a camera “see”? It depends on how the lenses and filters in front of that sensor are set up, on how susceptible that sensor is to electrical noise, and on how long it’s configured to expose each frame. Many of these sensors are sensitive to a wide range of wavelengths, so the filter over each pixel determines whether it sees red, green, or blue light. Some get configured to filter out all but ultraviolet or infrared wavelengths, at which point the camera can “see” what the human eye cannot.
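To make that filtering concrete, here’s a minimal sketch, assuming a standard RGGB Bayer layout and a made-up raw frame (not any particular camera’s actual pipeline), of how one brightness value per pixel gets split into color channels:

```python
import numpy as np

# Hypothetical raw frame: each pixel records one brightness value, and
# the filter sitting over it decides which color that value represents.
raw = np.random.randint(0, 4096, size=(4, 4))  # 12-bit sensor values

# Standard RGGB Bayer layout: even rows alternate R, G; odd rows G, B.
red   = raw[0::2, 0::2]  # pixels behind red filters
green = np.concatenate((raw[0::2, 1::2].ravel(),
                        raw[1::2, 0::2].ravel()))  # twice as many greens
blue  = raw[1::2, 1::2]  # pixels behind blue filters

# A demosaicing step then interpolates full RGB at every pixel; swap the
# filter array and the same sensor can "see" UV or IR instead.
print(red.shape, green.shape, blue.shape)
```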
A long exposure collects light over a long period of time, so even very faint sources eventually register, at least against a dark background.
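The reason that works: photon arrivals are a Poisson process, so the collected signal grows linearly with exposure time while the shot noise grows only with its square root. A toy simulation with a made-up photon rate shows the signal-to-noise ratio climbing roughly as the square root of the exposure:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 5.0  # hypothetical photons per second from a faint source

for t in (1, 10, 100):  # exposure times in seconds
    # Simulate many exposures of length t; counts are Poisson-distributed.
    counts = rng.poisson(rate * t, size=10_000)
    snr = counts.mean() / counts.std()
    print(f"t={t:>3}s  mean={counts.mean():8.1f}  SNR~{snr:.1f}")

# SNR scales like sqrt(rate * t): a 100x longer exposure buys ~10x SNR.
```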
There are all sorts of mechanical tricks at that point. Image stabilization tries to keep the beams of focused light steady on the sensor, compensating for camera movement with an offsetting movement, so that each pixel collects light from the same direction over the course of the entire exposure. Or, for celestial subjects, you can rotate the camera along with the sky, tracking the star or planet you’re photographing to compensate for the Earth’s rotation over the long exposure.
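To get a feel for why that tracking matters: the Earth turns 360° per sidereal day (about 86,164 seconds), which works out to roughly 15 arcseconds of sky per second. A back-of-the-envelope sketch, with a hypothetical focal length and pixel size rather than any real rig:

```python
# Earth's rotation: 360 degrees per sidereal day (~86,164 seconds).
DRIFT_ARCSEC_PER_SEC = 360 * 3600 / 86164  # ~15.04 arcsec/s

# Made-up example rig, not any specific setup.
focal_length_mm = 400.0
pixel_size_um = 5.6

# Plate scale: arcseconds of sky per pixel = 206.265 * pixel / focal length.
plate_scale = 206.265 * pixel_size_um / focal_length_mm  # arcsec per pixel

drift_px_per_sec = DRIFT_ARCSEC_PER_SEC / plate_scale
print(f"~{drift_px_per_sec:.1f} pixels of star trail per second untracked")
```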
And then there are computational tricks. Just as you might physically move the sensor or lens to compensate for motion, you can instead process the incoming sensor data with the knowledge that a particular subject’s light will land on different pixels over time, and add those values together in software rather than accumulating charge at the sensor itself.
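Here’s a minimal sketch of that software-side addition, in the spirit of (though far simpler than) stackers like AutoStakkert!: shift each frame by a known per-frame offset, then average, so light that landed on different pixels adds back together. The offsets here are assumed to be already measured; real tools estimate them by correlating each frame against a reference:

```python
import numpy as np

def stack(frames, offsets):
    """Shift each frame by its (dy, dx) offset and average the results."""
    aligned = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, offsets)]
    return np.mean(aligned, axis=0)

# Toy data: the "subject" is a bright pixel that drifts one column per frame.
frames = [np.zeros((8, 8)) for _ in range(4)]
for i, f in enumerate(frames):
    f[3, 2 + i] = 100 + np.random.randn()  # signal plus a little noise

offsets = [(0, i) for i in range(4)]  # drift measured relative to frame 0
result = stack(frames, offsets)
print(result[3, 2])  # the stacked signal concentrates back at one pixel
```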
So astrophotography is just an extension of what normal photography already does: filter out the wavelengths you don’t want, and process the data that hits the sensor. It just takes a lot more thought and configuration of those filters and processing algorithms than the defaults baked into a typical phone’s camera app and hardware.