• lefty7283@lemmy.world (OP)

    So I shot the Bubble Nebula in true color last year, but I decided to shoot it again this past month in false color. It really helps to show the extended nebulosity, and gives me an excuse to compare my image to Hubble’s. This false-color image uses the SHO palette, where the sulfur-ii wavelength is mapped to red, hydrogen-alpha to green, and oxygen-iii to blue. I’m really happy with how the colors turned out on this one. There are also a number of other nebulae and a star cluster in the frame. Captured over 14 nights in Jan/Feb 2024 from a Bortle 9 zone (I could only get a couple of hours max per night on it).
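    For anyone new to narrowband palettes, here’s a minimal sketch of what that mapping amounts to (the arrays are hypothetical stand-ins for stretched, registered single-channel frames, not my actual data):

    import numpy as np

    # hypothetical stretched, registered narrowband frames with values in [0, 1]
    sii  = np.random.rand(100, 100)   # sulfur-ii
    ha   = np.random.rand(100, 100)   # hydrogen-alpha
    oiii = np.random.rand(100, 100)   # oxygen-iii

    # SHO palette: S-ii -> red, Ha -> green, O-iii -> blue
    sho = np.stack([sii, ha, oiii], axis=-1)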

    Places where I host my other images:

    Flickr | Instagram


    Equipment:

    • TPO 6" F/4 Imaging Newtonian

    • Orion Sirius EQ-G

    • ZWO ASI1600MM-Pro

    • Skywatcher Quattro Coma Corrector

    • ZWO EFW 8x1.25"/31mm

    • Astronomik LRGB+CLS Filters- 31mm

    • Astrodon 31mm Ha 5nm, Oiii 3nm, Sii 5nm

    • Agena 50mm Deluxe Straight-Through Guide Scope

    • ZWO ASI-290mc for guiding

    • Moonlite Autofocuser

    Acquisition: 37 hours 36 minutes (camera at -15°C, unity gain)

    • Ha - 95x360"

    • Oiii - 140x360"

    • Sii - 141x360"

    • Darks- 30

    • Flats- 30 per filter
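    (For reference, the 37 h 36 min total works out exactly from the sub counts above: (95 + 140 + 141) × 360 s = 135,360 s = 2,256 min = 37 h 36 min.)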

    Capture Software:

    • Captured using N.I.N.A. and PHD2 for guiding and dithering.

    PixInsight Preprocessing:

    • BatchPreProcessing

    • StarAlignment

    • Blink

    • ImageIntegration per channel

    • DrizzleIntegration (2x, Var β=1.5)

    • Dynamic Crop

    • DynamicBackgroundExtraction

    Duplicated each image and removed the stars via StarXTerminator, then ran DBE with a shitload of points to generate a background model. The model was then removed from the original image using the following PixelMath (math courtesy of /u/jimmythechicken1):

    $T * med(model) / model
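    In case anyone’s wondering what that expression actually does: the DBE model is normalized to its own median and divided out, which flattens the gradient while keeping the overall image level roughly where it was. Here’s a rough numpy sketch of the same idea (just an illustration, not PixInsight’s code):

    import numpy as np

    def flatten_background(image, model):
        # mirrors the PixelMath: $T * med(model) / model
        # dividing by the model (normalized to its median) removes the gradient
        # without changing the overall brightness much
        return image * np.median(model) / model

    # toy example: flat signal of 0.2 multiplied by a left-to-right gradient
    gradient = np.linspace(0.9, 1.1, 100)        # stand-in for the DBE model
    image = 0.2 * gradient
    flat = flatten_background(image, gradient)   # ~0.2 everywhere again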

    Narrowband Linear:

    • Blur and NoiseXTerminator

    • Duplicated the images before stretching to be used for separate stars-only processing

    • Slight stretch using HistogramTransformation

    • iHDR 2.0 script to stretch each channel the rest of the way.

    This is a great new PixInsight script from Sketch on the Discord. Here’s the link to the repo if you want to add it to your own PI install.

    Stars Only Processing:

    • PixelMath to combine star images (SHO palette)

    • SpectroPhotometricColorCalibration (narrowband working mode)

    • StarXTerminator to make a stars-only image from each channel

    • SCNR > invert > SCNR > invert to remove greens and magentas (see the sketch after this list)

    • ArcsinhStretch + HT to stretch nonlinear - to be combined later with starless pic
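    For anyone wondering what the SCNR > invert > SCNR > invert trick above is actually doing, here’s a rough numpy sketch of the idea. I’m assuming SCNR’s “average neutral” mode here (green clamped to the mean of red and blue), so treat it as an illustration rather than PixInsight’s exact math:

    import numpy as np

    def scnr_green(rgb, amount=1.0):
        # assumed "average neutral" SCNR: cap green at the mean of red and blue
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        neutral = np.minimum(g, 0.5 * (r + b))
        out = rgb.copy()
        out[..., 1] = g * (1.0 - amount) + neutral * amount
        return out

    def remove_green_and_magenta(rgb):
        # assumes pixel values in [0, 1]
        # SCNR kills green; inverting turns magenta into green, so a second
        # SCNR pass on the inverted image kills the magenta, then invert back
        out = scnr_green(rgb)
        out = 1.0 - out
        out = scnr_green(out)
        return 1.0 - out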

    Nonlinear:

    • PixelMath to combine stretched Ha, Oiii, and Sii images into color image (SHO palette)

    • StarXTerminator to remove stars

    • HistogramTransformations to tone back the greens and apply a more aggressive stretch to red and blue channels

    • Shitloads of Curve Transformations to adjust lightness, hues, contrast, saturation, etc

    • LRGBCombination with stretched Ha as luminance

    • DeepSNR

    • more curves

    • ColorSaturation to bring up the blues in the bubble

    • LocalHistogramEqualization

    • even more curves

    • MLT for chrominance noise reduction

    • PixelMath to add the stretched stars-only image from earlier back in

    This basically re-linearizes the two images, adds them together, and then stretches the result back to where it was before

    (Jimmy is a processing wizard when it comes to writing up this independent starless processing stuff)

    mtf(.005, mtf(.995, Stars) + mtf(.995, Starless))
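    If you want to see what that’s doing in more familiar terms, here’s a small numpy sketch (assuming the standard definition of PixInsight’s midtones transfer function). The neat part is that mtf(0.005, x) is the exact inverse of mtf(0.995, x), which is why this roughly un-stretches both images, adds them, and re-stretches the sum:

    import numpy as np

    def mtf(m, x):
        # standard midtones transfer function: mtf(m,0)=0, mtf(m,m)=0.5, mtf(m,1)=1
        return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

    def add_stars(starless, stars):
        # squash both stretched images back toward linear, add, then stretch back
        return mtf(0.005, mtf(0.995, stars) + mtf(0.995, starless))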

    • A round of NoiseXTerminator for good measure

    • Resample to 60%

    • Annotation

  • Guenther_Amanita@feddit.de

    Wow… I will probably never be able to do something like this, but it’s fucking awesome and interesting to see the amount of effort you’ve put into this one picture 😊

    One thing that would interest me is what a single, unstacked frame looks like, just for comparison.

    Is there a reason you use PixInsight instead of Siril?
    I’m very new to this and would like to hear about the differences and other people’s experiences with the software.

    • lefty7283@lemmy.world (OP)

      Here’s what single Ha, Oiii, and Sii frames look like.

      I’ve never actually used Siril, but for the longest time PixInsight has been considered the be-all and end-all for deep-sky processing, and I decided to dive completely into it once I moved on from Photoshop. There are also a number of processes and scripts made by the community just for PixInsight which have become essential for some of my workflows, like BlurXTerminator (paid), and a bunch of PixelMath expressions from the guys in the Discord.

      • Guenther_Amanita@feddit.de

        Here’s what single Ha, Oiii, and Sii frames look like.

        Awesome, thanks! For a newcomer like myself, I always find it super fascinating to see how “uninteresting” a single frame looks, and then how blastingly colorful, detailed, and amazing the final result is by comparison.


        I just shared my own story a few hours ago here in this community, where you can see my progress over the last few months.
        How long did it take you to get to this stage of awesomeness? How much does the equipment cost, and where do you shoot? Do you travel explicitly to very remote areas (e.g. the desert or a forest), or do you shoot from your front yard? :D

        Are you interested in being my “guide”, in terms of telling me what maximum quality and detail I can achieve with my current, shitty, absolutely-not-comparable-to-yours equipment? I mean, you had to start somewhere too, right? What did your results look like at the stage when you decided to upgrade from your first camera to a better one? Would you mind sharing a similar story?

        I think I’ll just make a post about that question so it reaches a wider audience :D

        • lefty7283@lemmy.world (OP)

          It’s taken me several years to get to this point, and honestly I’m still continuing to learn new techniques and improve my processing to this day. I’ve never actually added up how much all my equipment costs, because then I’d have to give an honest answer when my family asks. I did buy a lot of it on the used market, and I haven’t really upgraded anything since COVID, when astronomy gear prices shot up. Most of my images I take from my apartment balcony, which has horrific light pollution. A couple of times a year I’ll head out to a dark site like the Deerlick Astronomy Village for a weekend. If there are any astronomy clubs in your town, they’ll tell you what dark sites are best near you.

          I’m not sure how much of a ‘guide’ I could be, but I can help give advice and constructive criticism if you need it! If you use discord there’s also a ton of beginner info and people able to help out in ours (link in the sidebar).

          • Guenther_Amanita@feddit.de

            If there are any astronomy clubs in your town, they’ll tell you what dark sites are best near you.

            Sadly I don’t, but you can check out this. Luckily, I’m at the edge of a dark-yellow zone, next to a green one, meaning it’s pretty good. The difference just 500 or 1,000 meters outside the city center is actually super noticeable, even with the naked eye. Maybe consider checking out the map and looking for less polluted areas, e.g. a forest or a big field :)