Another cloud-free day in Scotland let me catch almost 9 hours of this huge and lively prom. Taken with my home-made 90mm modded Coronado PST and DMK21 camera. Software: CdC, Eqmod, DSSR, AutoStakkert!, Wavesharp, DVS, Shotcut and Gimp.

David Wilson on April 8, 2025 @ Inverness, Scotland

https://spaceweathergallery2.com/indiv_upload.php?upload_id=221951

  • nexguy@lemmy.world · 62 points · 2 days ago

    Looks like the video is about 20 minutes of real time per 1 second of video. There are drops of plasma that fall farther than the diameter of Earth in less than one video second… which means the plasma is falling more than the diameter of Earth in less than 20 minutes. That’s close to 100,000 mph or 160,000 kph. Dang
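
    A quick sanity check on these numbers (a sketch: the 20-minutes-per-video-second rate comes from this comment; the quarter-video-second fall time is a hypothetical reading of "less than one video second", not something stated in the post):

```python
# Sanity-check the fall-speed estimate from the timelapse rate.
EARTH_DIAMETER_KM = 12_742                 # mean Earth diameter
REAL_SECONDS_PER_VIDEO_SECOND = 20 * 60    # ~20 min of real time per video second

# Lower bound: one Earth diameter per full video second (i.e. per 20 real minutes)
lower_kph = EARTH_DIAMETER_KM / (REAL_SECONDS_PER_VIDEO_SECOND / 3600)

# Hypothetical faster reading: one diameter in ~0.25 video seconds (~5 real minutes)
fast_kph = EARTH_DIAMETER_KM / (0.25 * REAL_SECONDS_PER_VIDEO_SECOND / 3600)

print(f"lower bound: {lower_kph:,.0f} km/h, faster reading: {fast_kph:,.0f} km/h")
```

    One diameter per full video second already gives roughly 38,000 km/h; the ~160,000 kph quoted corresponds to the drop crossing that distance in about a quarter of a video second.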

    • andros_rex@lemmy.world · 22 points · 2 days ago

      About 0.01% of the speed of light. I got a Lorentz factor of 1.00000001 so not quite fast enough for relativistic stuff.
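
      The Lorentz-factor arithmetic can be checked directly; a minimal sketch using the ~160,000 kph figure from the parent comment:

```python
import math

# Lorentz factor for the ~160,000 km/h plasma-fall estimate.
C = 299_792_458.0          # speed of light, m/s
v = 160_000 / 3.6          # 160,000 km/h in m/s (~44.4 km/s)

beta = v / C                            # fraction of light speed
gamma = 1.0 / math.sqrt(1.0 - beta**2)  # Lorentz factor

print(f"beta  = {beta * 100:.4f}% of c")
print(f"gamma = {gamma:.10f}")
```

      The result is on the order of 0.01% of c with a Lorentz factor of about 1.00000001, matching the comment.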

      • Sconrad122@lemmy.world · 24 points · 2 days ago

        Description says the poster caught 9h of video, but based on the clock watermark in the top left, what is shown is about 7.5h of footage (maybe cut for the interesting bits/highest quality), from 08:30ish to 16:00ish, at a rate of roughly 20 minutes of real time per 1 second of video time, as the original commenter pointed out
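
        The arithmetic here can be sketched as follows (assuming the rough 7.5 h span and 20 min/s rate above):

```python
# Rough timelapse arithmetic: ~7.5 h of real time shown at
# ~20 minutes of real time per second of video.
real_hours = 7.5                       # 08:30ish to 16:00ish
real_seconds = real_hours * 3600
rate = 20 * 60                         # real seconds per video second

video_seconds = real_seconds / rate    # length of the final clip
speedup = rate                         # 1200x real time

print(f"~{video_seconds:.1f} s of video, a {speedup}x speed-up")
```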

  • gcheliotis@lemmy.world · 36 points · 2 days ago

    Absolutely amazing that you could capture that with “amateur” equipment, although it is clear from your post that a lot went into this. Bravo!

  • Lovable Sidekick@lemmy.world · 14 points · edited · 2 days ago

    For comparison, the distance from the plasma cloud to the sun’s surface is about how far communication satellites in geostationary orbit are above Earth.

    I know all kinds of nerdy things.
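
    For reference, the geostationary altitude used as a yardstick here can be derived from Kepler's third law; a sketch using standard textbook constants:

```python
import math

# Geostationary orbit radius from Kepler's third law: r^3 = GM * T^2 / (4 pi^2)
GM_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86_164.1        # one sidereal day, s
R_EARTH_KM = 6_378.1         # Earth's equatorial radius, km

r_m = (GM_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = r_m / 1000 - R_EARTH_KM

print(f"geostationary altitude ≈ {altitude_km:,.0f} km")
```

    That comes out near 35,800 km, roughly three Earth diameters.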

  • 1luv8008135@lemmy.world · 55 points · 3 days ago

    So dumb question, but what’s causing the gap between the plasma cloud(?) and the surface? And is that gap filled with something that is invisible?

    • crapwittyname@lemm.ee · 87 points · 3 days ago

      Plasma is electrically charged, so it interacts with magnetic field lines.
      The sun has magnetic field lines just as the earth does. It also rotates. But since it’s not solid, it doesn’t have to rotate all at the same speed. The plasma in fast-rotating regions drags the field lines further than the plasma in slow-rotating areas, creating weird loops, breaks and reconnections in the field lines. I’m almost certain that what we’re seeing in this lovely bit of photography is a cloud of plasma travelling across, or trapped by, one of those rogue field lines which has been pushed upwards from the surface by differential rotation.

    • niktemadur@lemmy.world · 35 points · 3 days ago

      The dynamics there are due to sheer gravity, magnetism, and levels of energy/radiation that are utterly alien to our daily experience.

        • perestroika@lemm.ee · 20 points · edited · 3 days ago

          A guess: doubly ionized helium vs. singly ionized helium. They absorb different amounts of radiation (have different opacity). At high opacity, the gas gathers heat and subsequently expands. At low opacity, it lets the heat pass through, subsequently cools and condenses.

          (This is the mechanism that makes Cepheid stars regularly and predictably change intensity. The same mechanism is probably present in other stars too, and causes local processes that we cannot observe from another star system… but can observe in the Sun.)

          Alternatively, there could be a multitude of other effects doing something similar.
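
          The trap-heat/release-heat feedback described above can be caricatured as a relaxation oscillator. A toy sketch in arbitrary units (the thresholds stand in for the ionization transitions; this is an illustration of the feedback loop, not real solar physics):

```python
# Toy kappa-mechanism caricature: while the layer is opaque it traps
# radiation and heats up; once hot/expanded enough it becomes transparent,
# lets heat escape, and cools back down. All numbers are arbitrary units.
T_HIGH, T_LOW = 2.0, 1.0   # stand-ins for the ionization thresholds
temp, opaque = 1.5, True
history = []
for _ in range(200):
    if opaque:
        temp += 0.05           # high opacity: gathers heat
        if temp >= T_HIGH:
            opaque = False     # drops to low-opacity state
    else:
        temp -= 0.05           # low opacity: heat passes through, cools
        if temp <= T_LOW:
            opaque = True      # returns to high-opacity state
    history.append(temp)

# history now oscillates between roughly T_LOW and T_HIGH
print(f"min {min(history[40:]):.2f}, max {max(history):.2f}")
```

          The toggle between the two opacity states is what keeps the oscillation going instead of settling to an equilibrium.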

          • niktemadur@lemmy.world · 2 points · 1 day ago

            “This is the mechanism that makes Cepheid stars regularly and predictably change intensity”

            Doesn’t it also make the Cepheid noticeably swell (then deflate) in circumference? Or does it maintain the same basic size, and it’s just storing magnetic bubbles of hot plasma like a halo, before bursting and releasing all that accumulated material?

            • perestroika@lemm.ee · 2 points · edited · 1 day ago

              To my understanding they do change circumference. The opaque doubly ionized helium forms at high temperature, expands until the temperature drops (a change in circumference), drops to singly ionized after expansion, and gets doubly ionized again after contraction (another change in circumference). In Cepheids, this is uniform across the whole star.

              Thus, your question makes me doubt my original speculation that it’s helium changing ionization levels. The way some material “climbs up” into the arc in this video (from the right end, at one point in time) while other material “rains down” makes a magnetic explanation (proposed by others here) seem more plausible.

  • trotfox@lemmy.world · 64 points · 3 days ago

    It’s crazy this guy is just doing this on his own. Looks like something from NASA to me.

  • Higgs boson@dubvee.org · 11 points · 2 days ago

    “Another cloud free day in Scotland let me catch almost 9 hours of this huge and lively prom.”

    As soon as I read the word “Scotland”, my brain went back and revised this to be read in Scott Manley’s voice.

  • Sgarcnl@lemmy.world · 18 points · 3 days ago

    If the earth-to-scale marker is accurate, the drops coming down to the surface might be roughly the size of a large continent.

  • danc4498@lemmy.world · 14 points · 2 days ago

    Is this the actual image your camera sees? Or is it more like heat sensors visualized, or something like that?

    • lurker2718@lemmings.world · 2 points · edited · 1 day ago

      To add to what the others said, this image is most likely taken with a special filter that passes only one specific wavelength, i.e. one color. In this case it’s H-alpha, the deep-red light of excited hydrogen atoms. With this filter (and additional filters for safety) you can see more or less this image yourself, except in red. I’ve already had the opportunity to try this.
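
      The H-alpha wavelength mentioned here falls out of the Rydberg formula for hydrogen's n=3 → n=2 transition; a sketch (the simple infinite-nuclear-mass formula lands within a fraction of a nanometre of the usual 656.3 nm figure):

```python
# H-alpha: the n=3 -> n=2 transition of hydrogen (Balmer series).
R_INF = 1.0973731568e7   # Rydberg constant, 1/m

inv_wavelength = R_INF * (1 / 2**2 - 1 / 3**2)
wavelength_nm = 1e9 / inv_wavelength

print(f"H-alpha ≈ {wavelength_nm:.1f} nm (deep red)")
```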

      Here is a site showing daily images of the sun taken with different filters. Red is H-alpha, as shown in OP; only with this filter can you see the prominences. White is white light, i.e. what you would see if you could look directly without burning your eyes, or what you see with eclipse goggles. The third is another special line, Calcium K. All of these you can view with the right filters and a telescope, and it looks similar to the images here, except the two colors are even more saturated than shown. However, changes happen on the order of minutes, so it looks more like a still image.

      However, the sun and planets are pretty much the only objects where the images are similar to what you could see yourself with a telescope and filters. Colorful images of the moon are always heavily processed. For nebulae and galaxies it’s even more different: they are just too dark to see more than a grey blob, and a telescope does not help much, similar to how a lens does not help you see in the dark. So nebulae and galaxies are shown at best as they would look if they were brighter, and most of the time with much brighter colors than in reality.

    • GamingChairModel@lemmy.world · 20 points · 2 days ago

      “actual image your camera sees” is a term that is hard to define with astrophotography, because it’s kinda hard to define with regular digital photography, too.

      The sensor collects raw data at each pixel: the radiation that makes it past that pixel’s color filter excites electrons on that particular pixel, and the result gets processed on the image-processing chip, where each pixel is assigned a color and the pixels are added together into the final image.

      So what does a camera “see”? It depends on how the lenses and filters in front of that sensor are set up, and it depends on how susceptible to electrical noise that sensor is, and it depends on the configuration of how long it looks for each frame. Many of these sensors are sensitive to a wide range of light wavelengths, so the filter determines whether any particular pixel sees red, blue, or green light. Some get configured to filter out all but ultraviolet or infrared wavelengths, at which point the camera can “see” what the human eye cannot.

      A long exposure can collect light over a long period of time to show even very faint light, at least in the dark.

      There are all sorts of mechanical tricks at that point. Image stabilization tries to keep the beams of focused light stabilized on the sensor, and may compensate for movement with some offsetting movement, so that the pixel is collecting light from the same direction over the course of its entire exposure. Or, some people want to rotate their camera along with the celestial subject, a star or a planet they’re trying to get a picture of, to compensate for the Earth’s rotation over the long exposure.

      And then there are computational tricks. Just as you might physically move the sensor or lens to compensate for motion, you may just process the incoming sensor data to understand that a particular subject’s light will hit multiple pixels over time, and can get added together in software rather than at the sensor’s own charged pixels.
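
      That “added together in software” step is essentially frame stacking: averaging many aligned frames of the same scene so the random sensor noise shrinks roughly as 1/sqrt(N). A minimal illustrative sketch (this is not how stacking software like AutoStakkert! actually aligns frames, just the averaging idea):

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed 1-D "scene" plus independent per-frame sensor noise.
scene = np.sin(np.linspace(0, 2 * np.pi, 256))
frames = [scene + rng.normal(0, 0.5, scene.shape) for _ in range(100)]

single_err = (frames[0] - scene).std()   # noise level in one frame
stacked = np.mean(frames, axis=0)        # software "stacking" by averaging
stacked_err = (stacked - scene).std()    # noise level after averaging

print(f"noise: single frame {single_err:.3f}, 100-frame stack {stacked_err:.3f}")
```

      With 100 frames the residual noise drops by about a factor of ten, which is why stacking thousands of video frames recovers detail a single frame can’t show.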

      So astrophotography is just an extension of normal photography’s use of filtering out the wavelengths you don’t want, and processing the data that hits the sensor. It’s just that there needs to be a lot more thought and configuration of those filters and processing algorithms than the default that sits on a typical phone’s camera app and hardware.

    • jdnewmil@lemmy.ca · 12 points · 2 days ago

      Not OP, but solar photography requires super-dense filters. Just as sunglasses alter what you see from “actual”, the filters also alter the image from “actual”, yet this is what would “actually” be “seen” by the camera. So yes and no, depending on how you want to interpret “actual”.

      • danc4498@lemmy.world · 3 points · 2 days ago

        Thanks, this makes sense. I’ve heard there are some great astronomy photos where what we are seeing isn’t actually visible to the naked eye. Rather it’s invisible gases or something, and the photos are just visualizations based on assigning colors to density… I guess I was wondering if it was something like that. It sounds like it’s not.

        • jdnewmil@lemmy.ca · 2 points · 2 days ago

          When they sense invisible electromagnetic wavelengths like X-rays or microwaves and “assign” colors to those completely invisible wavelengths, that is false-color imaging. Possible to do with the sun… but unlikely with an amateur rig.