Golden-i
Sep 18, 2006

One big, stumpy family
That's a surprisingly good shot for the scope specs. Then again, I guess I shouldn't really be surprised; it's a 50mm triplet APO with a ZWO ASI462MC installed. I'm curious to see what kind of shot you could get with an hour or more on the target.

Even for non-seestar shooting, things have gotten so much easier lately with things like the ASIAIR. It used to take me a couple hours to get set up - build things out, polar align, test, correct alignment, test exposures, guiding, swearing at the clouds that rolled in when I was getting this all set up. A couple days ago I was up and capturing in 20 minutes. It'd be even faster if I had a fixed place to shoot from and I didn't need to haul all of the gear out. It's really neat how far things have come in just the handful of years I've been doing this.

Achmed Jones
Oct 16, 2004



i took a couple dozen shots last night and naively shoved them through astropixelprocessor. i might do it again tonight! my pixinsight trial came so it might be fun to run through both. conditions weren't good last night and won't be tonight, though.

fwiw the picture i got last night was _not good_. but it at least showed that im on some kinda right path

QuarkJets
Sep 8, 2008

Hey thread, I've been a professional astronomer for over 10 years, and I specialize in imaging through turbulence - e.g. adaptive optics, but also boutique image processing to try to get a better image.

Professor Chan at Purdue University has some really great slides going through the whole process and modern mitigation techniques. These are probably beyond the scope of what most amateurs will want to try, but they may still be interesting:
https://engineering.purdue.edu/ChanGroup/project_turbulence.html

The tl;dr is that the atmosphere acts like a massive cloud of constantly-fluctuating lenses, causing point sources like stars to look like big fuzzy blobs when you take a long exposure. This reduces the resolution of your telescope, making it harder to resolve details and often reducing SNR a bit. For the purposes of resolution, it's as though your telescope's diameter is limited to the coherence length (or Fried's parameter): the stronger the turbulence, the smaller the coherence length, and the smaller the effective diameter of your telescope. Some of the best astronomy sites in the world on the best nights may only have a coherence length of around 30 cm, and your back yard may be something much smaller, like 1 to 5 cm. If you buy a 50 cm telescope, you'll still benefit from that extra collection power, but your resolution is going to be crap if you don't do something about it.
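
To put some illustrative numbers on that, here's a rough back-of-the-envelope sketch (my example values, using the usual 1.22*lambda/D diffraction limit and approximating the seeing limit as lambda/r0 - the exact constants don't matter much for the point):
code:
import math

# Back-of-the-envelope: compare the diffraction-limited resolution (1.22*lambda/D)
# with the seeing-limited resolution (~lambda/r0). Numbers are illustrative only.
wavelength = 550e-9   # visible light, metres
D = 0.5               # telescope aperture, metres
r0 = 0.05             # Fried parameter on a mediocre night, metres

rad_to_arcsec = 206265.0
diffraction_limit = 1.22 * wavelength / D * rad_to_arcsec
seeing_limit = wavelength / r0 * rad_to_arcsec  # rough approximation

print(f"diffraction limit: {diffraction_limit:.2f} arcsec")
print(f"seeing limit:      {seeing_limit:.2f} arcsec")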

So what do we do about it? Adaptive optics is the main thing, if you have tens (or possibly hundreds) of thousands of dollars to burn, but let's assume you don't have that. That brings us to the fabulous world of turbulence mitigation with software! These are the techniques that I think most amateurs should be able to access and may even already use.

Lucky Imaging
Since the blur is randomly varying over time, some images will look much better than others. Looking at a stack of images and selecting the best ones is the basis of this technique. It's simple and not computationally advanced, requiring only a hard drive and a camera that can take short-exposure images. From Fried's original 1967 paper, the probability of getting a lucky image decreases exponentially, with (D/r0)^2 in the exponent. This means that if you've got a 20 cm telescope on a 5 cm night, you stand a reasonable chance of getting some lucky images that are almost diffraction limited if you gather enough of them! The trick is that you have to be taking short exposures; with long exposures, the turbulence is guaranteed to change too much during the exposure for this technique to help. We're talking an absolute maximum of 20-30 milliseconds even if you're on a mountain on a calm night. This obviously limits the kinds of targets that you can pursue with this technique.
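
As a rough illustration of the frame-selection step, here's a minimal sketch (not a polished tool - it assumes astropy, a hypothetical short_exposures/ folder of FITS frames, and a crude gradient-energy score standing in for whatever sharpness metric you prefer):
code:
import glob
import numpy as np
from astropy.io import fits

# Lucky-imaging selection sketch: score each short exposure by a crude
# sharpness proxy and keep the best 10% of frames.
def sharpness(img):
    gy, gx = np.gradient(img.astype(float))
    return np.mean(gx**2 + gy**2)  # higher = crisper stars

paths = sorted(glob.glob("short_exposures/*.fits"))
scores = [(sharpness(fits.getdata(p)), p) for p in paths]
scores.sort(reverse=True)

keep = [p for _, p in scores[: max(1, len(scores) // 10)]]
print(f"keeping {len(keep)} of {len(paths)} frames")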

Image Stacking
Take your short-exposure images and add them together. This is like taking a long exposure, but if the individual exposures were short enough then you're able to average out some of the speckle and get a crisper image. The challenge here is that the images must be well-registered (e.g. aligned to one another), and the downside is that you'll have a higher effective readout noise (because every image gets readout noise applied to it, so 1 long exposure will have less total readout noise than the sum of 10 images with 1/10th of the long exposure time).
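
For reference, a register-and-average stack really is a small amount of code. A minimal sketch (assuming scikit-image and scipy, with `frames` being a hypothetical list of 2D numpy arrays of short exposures of the same field):
code:
import numpy as np
from scipy.ndimage import shift
from skimage.registration import phase_cross_correlation

# Register every frame against the first one, then average the stack.
def stack(frames):
    reference = frames[0].astype(float)
    accum = reference.copy()
    for frame in frames[1:]:
        # Estimated (row, col) shift needed to register this frame with the reference
        offset, _, _ = phase_cross_correlation(reference, frame.astype(float))
        accum += shift(frame.astype(float), offset)  # re-register, then add
    return accum / len(frames)  # averaging; a plain sum only differs by a scale factor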

Blind deconvolution
This is the edge of where professional mitigation really begins. Use a forward model of the atmosphere with some kind of numerical solver to estimate the atmospheric blur, hopefully giving you a cleaner image. Some groups just use an estimate of the coherence length to synthesize a reasonable blur function and then deconvolve it everywhere. Other groups will use lone stars in the field of view, which should be point targets, as an estimate for the blur function across the entire image - this is an approximation but it can work okay sometimes. If there are a lot of stars across the field of view, then the blur function can be interpolated across the image for an even better estimate. All of these techniques are accessible to any amateur with a laptop; I remember using ds9 back in the early 2000s to do blind deconvolution. It's not particularly expensive, but there is a learning curve.
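
As one hedged illustration of the "use a star as the PSF" flavour (a sketch assuming scikit-image, with `image` and `star_cutout` being 2D arrays you've already pulled out by hand - this is not the ds9 workflow I mentioned, just the same idea in Python):
code:
import numpy as np
from skimage.restoration import richardson_lucy

# Star-as-PSF deconvolution sketch: treat a centred, background-subtracted star
# cutout as the blur kernel and deconvolve the whole frame with it.
def deconvolve_with_star(image, star_cutout, iterations=30):
    psf = np.clip(star_cutout.astype(float), 0, None)
    psf = psf / psf.sum()                        # PSF must be normalized
    scaled = image.astype(float) / image.max()   # keep values in a [0, 1]-ish range
    restored = richardson_lucy(scaled, psf, iterations)
    return restored * float(image.max())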

Multi-frame blind deconvolution (MFBD)
Same as blind deconvolution, but throw more data at the problem. Think of this as combining blind deconvolution with image stacking. Since each short exposure is a different realization of the randomly varying turbulence, you can get much better estimates of the undisturbed scene. This is the gold standard, and it's where people are still working and publishing new implementations. I'm not sure where any open-source implementations live, but I imagine there have to be some.

Deep Learning
There's a lot of exciting work in this area, teams of researchers sifting through huge amounts of data, debates over which physical model is most suitable for a GAN, etc. The slides I linked above go through some of the recent results. Try your hand at it if you'd like, but the learning curve / amount of work required is high.

QuarkJets
Sep 8, 2008

The tl;dr of what I'd want to tell an amateur astronomer is that longer exposure is not always better, you can get a better image by simply stacking a bunch of short exposures. But YMMV! If you're imaging something very faint for your system then you may be better off with a long exposure! And don't be afraid of exploring some of these algorithms; image stacking is a tried-and-true, decades-old technique that you can probably implement yourself with just a bit of Python code.

duodenum
Sep 18, 2005


Very interesting post!

I thought there was a system that used a laser to measure the problems in the atmosphere and mechanical mirror deformations to correct it live. Do you know anything about that?

pumped up for school
Nov 24, 2010

QuarkJets posted:

The tl;dr of what I'd want to tell an amateur astronomer is that longer exposure is not always better, you can get a better image by simply stacking a bunch of short exposures. But YMMV! If you're imaging something very faint for your system then you may be better off with a long exposure! And don't be afraid of exploring some of these algorithms; image stacking is a tried-and-true, decades-old technique that you can probably implement yourself with just a bit of Python code.

So before I finally get off my rear end and tell myself "No this is the year for real I code again" (I'd been more admin than technical for a long time), the thing that I keep giving the sideeye is:

quote:

Image Stacking
Take your short-exposure images and add them together. This is like taking a long exposure, but if the individual exposures were short enough then you're able to average out some of the speckle and get a crisper image. The challenge here is that the images must be well-registered (e.g. aligned to one another), and the downside is that you'll have a higher effective readout noise (because every image gets readout noise applied to it, so 1 long exposure will have less total readout noise than the sum of 10 images with 1/10th of the long exposure time).

at first glance the stacking algos I've seen look like simple summation (after registration). I had to write code to do that my sophomore year in school, super basic. And back then in the 90s that was still old hat. Has diversity stacking made it out of my niche world (geophysics) to other processing yet?

I honestly don't think it will make any improvement given the number of sub-frames involved, but at least it'd be different.

hannibal
Jul 27, 2001

QuarkJets posted:

The tl;dr of what I'd want to tell an amateur astronomer is that longer exposure is not always better, you can get a better image by simply stacking a bunch of short exposures. But YMMV! If you're imaging something very faint for your system then you may be better off with a long exposure! And don't be afraid of exploring some of these algorithms; image stacking is a tried-and-true, decades-old technique that you can probably implement yourself with just a bit of Python code.

From what I remember reading at various places like Cloudy Nights, the main concern with lots of shorter exposures was increased read noise. However sensors are a lot better these days so that's probably not as big of a problem as before.

But I agree that chasing long exposures is an uphill battle sometimes, and people can get frustrated with tracking issues, crappy mounts, field rotation, etc. Just take some exposures and see what you can do; you can always try again later with different settings.

PerniciousKnid
Sep 13, 2006

Golden-i posted:

I really do like to see how people's artistic interpretations of the data come out when given the chance. This can be both a science and an art, and it's up to the one processing the data to decide exactly how much they want to lean into either.

All photography (or just vision) is interpretation, intentional or not.

El Grillo
Jan 3, 2008
Fun Shoe
Anyone got any recommendations for good filters for viewing deep sky objects? My knowledge is limited and this is for a gift for a family member who uses a Sky watcher 200 scope. We are in a relatively low light rural UK village, though there are some (not many) street lights nearby. Would love to e.g. be able to see any detail on Andromeda at all. I hear there are some kinds of filters that might help?

Also - any recommendations for a finder scope with a right angle viewer that has a shoe that can fit onto the standard shoe mount on the Skywatcher? We use a Rigel Quikfinder at the moment which does OK (though we have to stick it on with two-sided tape, which is annoying) but I was wondering about getting a kind of intermediate finder scope with some magnification as well, for use alongside the Rigel. In particular to help locate deep sky objects. And having one with a right angle fitted eyepiece, so we don't have to squat down next to the scope and try to look up the length of the scope like we do with the Rigel, would be awesome.

Thanks all and sorry if either question is dumb!

Liquid Chicken
Jan 25, 2005

GOOP

El Grillo posted:

Anyone got any recommendations for good filters for viewing deep sky objects? My knowledge is limited and this is for a gift for a family member who uses a Sky watcher 200 scope. We are in a relatively low light rural UK village, though there are some (not many) street lights nearby. Would love to e.g. be able to see any detail on Andromeda at all. I hear there are some kinds of filters that might help?

Also - any recommendations for a finder scope with a right angle viewer that has a shoe that can fit onto the standard shoe mount on the Skywatcher? We use a Rigel Quikfinder at the moment which does OK (though we have to stick it on with two-sided tape, which is annoying) but I was wondering about getting a kind of intermediate finder scope with some magnification as well, for use alongside the Rigel. In particular to help locate deep sky objects. And having one with a right angle fitted eyepiece, so we don't have to squat down next to the scope and try to look up the length of the scope like we do with the Rigel, would be awesome.

Thanks all and sorry if either question is dumb!

The problem with so-called "light pollution" filters is that modern LED lights are full spectrum. Filters won't do a thing against them. Those filters were designed to block the wavelengths from older sodium and mercury lights. If the light pollution is from the older lights then maybe, just maybe a good broadband filter might help. The results won't be dramatic.

As for nebulae, a good narrowband UHC filter can be quite useful. I'll leave this link for some reading on DSO filters. The better brands are noted in the reading.

https://www.prairieastronomyclub.or...FzfC7LOAJRcxd_U

As for RACIs, I like the APM RACIs. I have the 50mm and 80mm models on my 10" and 16" dobs. They are well built and they fit in the mounting shoe. HOWEVER, they don't come with an eyepiece. Any 1.25" eyepiece will fit, so you can change the magnification and field of view. I'll use a different eyepiece depending on whether I'm viewing planets or DSOs. There are other brands of RACIs that are very good as well, like Stellarvue, but check out what First Light Optics or Teleskop-Express has to offer.

El Grillo
Jan 3, 2008
Fun Shoe
Awesome, thanks that's super helpful.

Found a shop not too far off that does the 50mm APM. Would have liked one of the larger ones but hopefully this will do a decent job for now. Have also picked up the reticule eyepiece to make sure we have one that works properly with it, since the guy in the shop said it has a short range of focus and plenty of eyepieces won't work.

pumped up for school
Nov 24, 2010

Interest check - I've got a Star Adventurer 2i Pro I found buried in my closet. It hasn't been used all year. I got an MSM for the camera-based stuff and a ZWO mount for the scope. Anyone want this weird in-between, or have a friend getting into the hobby who might benefit? I'd much rather keep it friendly than go to eBay.

I think most of the bits and bobbins are there but I will go through it thoroughly if there's interest. It'd be BYO tripod. There's no way I'd get it out before Christmas.

Jewmanji
Dec 28, 2003

pumped up for school posted:

Interest check - I've got a Star Adventurer 2i Pro I found buried in my closet. It hasn't been used all year. I got an MSM for the camera-based stuff and a ZWO mount for the scope. Anyone want this weird in-between, or have a friend getting into the hobby who might benefit? I'd much rather keep it friendly than go to eBay.

I have no idea what I’m doing but would be interested.

pumped up for school
Nov 24, 2010

Jewmanji posted:

I have no idea what I’m doing but would be interested.

Ok. Sometime next week I'll get it all together and pm you when I am ready to put it on SA Mart. Give you first dibs.

Raikyn
Feb 22, 2011

Another wide-angle shot I've been working on over the last week.
Vela Supernova remnant, big object in the southern sky.

The last object I posted here was the area around Orion using a Canon 50mm lens - it showed Orion, the Horsehead, etc., and Barnard's Loop around the outside.
Well, this is a comparable size; I used the 50mm lens for this as well, and it fills up the frame quite nicely.

About 5.5hrs of exposure

Vela Supernova Remnant by Marc, on Flickr

QuarkJets
Sep 8, 2008

duodenum posted:

Very interesting post!

I thought there was a system that used a laser to measure the problems in the atmosphere and mechanical mirror deformations to correct it live. Do you know anything about that?

That's a type of Adaptive Optics. You can split this up into two categories:
1 - Natural Guide Star, which looks at a real star
2 - Laser Guide Star, which is the same as Natural Guide Star except you create your own star with a powerful laser. Commonly this laser is designed to excite the sodium layer in the atmosphere (~100 km up) so that it fluoresces. The laser power required is on the scale of "you will go blind if you are even in the same room as it without the correct eye protection when it turns on". They make you take terrifying laser safety courses every year when you work with laser guide stars - it's great!

I know all about AO systems! They're so ubiquitous that even a lot of space telescopes have a "slow" AO system on board, to correct for aberrations in their optical system (that would have been a lot cheaper than sending an astronaut to replace optics, like they had to do on the Hubble Space Telescope)

Here's a cool rear end time-lapse movie of the Robo-AO system, it's fully autonomous!
https://www.youtube.com/watch?v=WiGUBS1ByxE

Jewmanji
Dec 28, 2003

Raikyn posted:

Another wide-angle shot I've been working on over the last week.
Vela Supernova remnant, big object in the southern sky.

Nothing to contribute except to say wow, and thanks for sharing.

pumped up for school posted:

Ok. Sometime next week I'll get it all together and pm you when I am ready to put it on SA Mart. Give you first dibs.

Thank you! I'm excited to dive in.

QuarkJets
Sep 8, 2008

pumped up for school posted:

So before I finally get off my rear end and tell myself "No this is the year for real I code again" (I'd been more admin than technical for a long time), the thing that I keep giving the sideeye is:

at first glance the stacking algos I've seen look like simple summation (after registration). I had to write code to do that my sophomore year in school, super basic. And back then in the 90s that was still old hat. Has diversity stacking made it out of my niche world (geophysics) to other processing yet?

I honestly don't think it will make any improvement given the number of sub-frames involved, but at least it'd be different.

I don't know what diversity stacking is :shrug:

Image stacking is not going to do anything miraculous - a long exposure is image stacking without registration, and with only 1 realization of read noise (instead of 1 per image), so that's often what your result is going to look like if your mount and object are stable. Some users remove the worst images from the stack (they estimate the coherence length in each image and throw away the worst ones - this is easy to automate) but even that is going to be hard to see unless you are using a lot of images

pumped up for school
Nov 24, 2010

QuarkJets posted:

Image stacking is not going to do anything miraculous - a long exposure is image stacking without registration, and with only 1 realization of read noise (instead of 1 per image), so that's often what your result is going to look like if your mount and object are stable. Some users remove the worst images from the stack (they estimate the coherence length in each image and throw away the worst ones - this is easy to automate) but even that is going to be hard to see unless you are using a lot of images

Whereas a vertical stack is just a summation, for a diversity stack you scale your data by the inverse of its average power prior to stacking. You take that composite and divide it by the sum of those scalars for a new normalized value. I'm usually thinking of data as samples, or binned samples. I think the imagery analogue would be binned pixels.

Diversity stacks would do zero for a bad image in the traditional sense: shake, mis-alignment, tracking error, etc. But what it would do is allow part of an image to be used when you have a noise burst – like partial occlusion in an otherwise clean piece of data. A different way to filter satellite trails beyond simple stacking, or rolling cloud coverage. For people taking 100s or 1000s of images, when you get a noisy subframe you just throw it out and forget about it. In my world we're using it when we may only collect 2, maybe 3 sub "frames" before stacking; it allows us to retrieve a bit more data. The diversity stack is pretty useful when you don't have the luxury of that much data collection time.
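
If I'm reading that description right, in numpy terms it would be something like this sketch (my interpretation only - registered sub-frames in, inverse-mean-power weights, result normalized by the sum of the weights):
code:
import numpy as np

# Diversity-style stack as described above: weight each sub-frame by the
# inverse of its average power, then renormalize by the sum of the weights.
# `frames` is assumed to be a list of registered 2D numpy arrays.
def diversity_stack(frames):
    weights = [1.0 / np.mean(f.astype(float) ** 2) for f in frames]
    composite = sum(w * f.astype(float) for w, f in zip(weights, frames))
    return composite / sum(weights)   # normalized by the sum of the scalars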

pumped up for school
Nov 24, 2010

I bought a Seestar S50 last week and had a very short break in clouds a few nights ago. I think I have the same impressions as most: more impressive than it ought to be, unfortunate sensor dimensions at that focal length, fun and simple.

Since I only saw about an hour of clearing and the moon was so bright, I started there. Definitely, at phone-size scale these are amazing to view. I do want to play with the subframes of the avi, though I didn't save these to the internal memory. I guess they went to phone only.


6 minutes under that moon; left is straight out of the device, right is the same 6 minutes processed in PI. I need to figure out how to blend 2 different processed images (like layers in Photoshop) and start over so I can reduce the blowout of the core.


I also did 10 minutes of the Pacman and 5 of NGC 891, but that last one was way blown out by the moon - really strong banding/gradient as the clouds started scattering all that white light.



Yesterday morning the dog woke me up stupid early in the AM and we had a great "moon-dog" halo. I was too fuzzy-headed to grab a camera w/ wide lens. Kind of mad at myself about that. It was very well-defined.

Golden-i
Sep 18, 2006

One big, stumpy family
A PSA for anyone who is getting into astro processing - keep your data organized and standardized.

What I mean is: With the weather crappy right now, I'm going back through old data to reprocess and am finding the bulk of it completely unusable. Weird gain values on cameras that I don't have dark frames for, weird temperatures that I don't have bias frames for (FITS header says it was shot at -17C? that's a nice round number, idiot), missing flat frames. Stuff that was easily avoidable if I had planned ahead even slightly.

My advice:

1. Standardize your configuration and figure out your variables
There's a lot of variables in this hobby, so it's really best to figure out what's static and what needs to be adjusted for. Work out an ideal gain value for your camera and use it for everything unless you have a very good idea what you're doing or have a specific need to change it. I believe cameras using Sony sensors (maybe others too) have a point where they move from low gain to high gain mode, where the sensor's SNR improves massively at a certain gain setting. Example for the ZWO ASI294MC-Pro (via CloudyNights):



At Gain 117, your SNR improves significantly and you start to see a better dynamic range. Therefore, I would now use a gain of 120 on this camera for all shooting (unless there's some very good reason to change it - I'm not an expert and don't know a ton about camera sensors so I'm not going to deviate from this). On my ASI6200MM, this value is at gain 100. Once an ideal gain is chosen for your camera, you've got a couple of variables to work with: temperature and exposure time.

That leads to the next point:


2. Set up a calibration library
For each of my cameras, I took an afternoon and took dark and bias frames for a fixed gain at various temps (bias) and temps/exposures (darks). These types of calibration frames are not dependent at all on the telescope or optical configuration of any given setup, so you can shoot them once per camera for these settings and always use them.

I've got a library for each camera:


Sorted darks and bias:


And my most common settings for each camera saved so I can grab them later as needed:


I set this up last winter and it's been a massive time saver for me to not worry about figuring these frames out when shooting. It all feels very obvious in hindsight but I've only gotten good about this stuff in the last year or so, and have some major issues with any data older than that.


3. Organize your data and keep notes
This one's simpler: keep an organized folder structure of your original data, and note down what gear you used for everything (even if it's just a txt file in the folder with the raw data - something that notes gear, camera settings, location/seeing quality, anything else you can think of). I have hundreds of GB of old data that I just don't know anything about at this point, which makes reprocessing that much harder when you're trying to figure things out from file headers.
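
If you want to automate some of that bookkeeping, one option is to index frames straight off their FITS headers. A minimal sketch (assuming astropy, a hypothetical calibration/ folder, and the common IMAGETYP/GAIN/CCD-TEMP/EXPTIME keywords - your capture software may write different ones):
code:
import glob
from collections import defaultdict
from astropy.io import fits

# Group frames by (type, gain, temperature, exposure) read from the FITS headers,
# so it's obvious which darks/biases/flats you actually have on hand.
index = defaultdict(list)
for path in glob.glob("calibration/**/*.fits", recursive=True):
    hdr = fits.getheader(path)
    key = (hdr.get("IMAGETYP"), hdr.get("GAIN"), hdr.get("CCD-TEMP"), hdr.get("EXPTIME"))
    index[key].append(path)

for key, paths in sorted(index.items(), key=str):
    print(key, len(paths), "frames")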

Here's a reprocess of 15 mins on the Dumbbell Nebula I took back in 2019 - all the other frames were blurry, I think this was before I used a guide camera and tracking was not great. There were no notes on this data at all aside from what I could get out of the FITS headers from the raw light frames. ASI294MC-Pro, 99% sure that this was on my old Celestron 8" Reflector (no coma correction/field flattener/filters of any kind, but the flares from the brighter stars are indicative of a Newtonian reflector) @1000mm FL. 15x60sec RGB combined, processed in PI.

simble
May 11, 2004

I’d suggest only keeping calibrated frames. (Light - bias - dark) / (flat - flat dark). This math is always the same so reprocessing it from the raw light frame doesn’t get you a different result.

I store calibrated data by target and date. As soon as the raw lights are calibrated, they get deleted. If you have data from different cameras (and therefore different fovs), keep that separate. This point of calibration is also a good time to make sure you’re compressing everything if you’re storing xisf’s.

At that point the offset, gain, and temp they were shot at doesn’t matter. And I don’t have to keep around a set of flats (and maybe darks depending on your setup) that I can’t remember what they were for.

Luneshot
Mar 10, 2014

Golden-i posted:


2. Set up a calibration library
For each of my cameras, I took an afternoon and took dark and bias frames for a fixed gain at various temps (bias) and temps/exposures (darks). These types of calibration frames are not dependent at all on the telescope or optical configuration of any given setup, so you can shoot them once per camera for these settings and always use them.


Nice work. Having a calibration library is a great time-saver for astrophotography, because you're not really concerned with making scientific measurements that require careful control of image statistics. However, you may eventually need to take new biases and darks; it's standard practice in professional astronomy to take a set of biases and darks basically every night that you observe, because bias levels and dark current can slowly vary over time.

Achmed Jones
Oct 16, 2004



dark frames: taken at your actual exposure time and settings (including temperature) with the lens cap on

bias frames: taken at your fastest shutter speed with the lens cap on. other settings, temp, etc are the same as the image frames. you can reuse these (but not forever)

flat frames: massive pain in the butt. taken with flat illumination, so you need to bring a light to the shoot. must be taken in the exact same location and without bumping or moving the equipment, so that dust isn't disturbed. if you have a CMOS sensor, you take flat darks instead. you can reuse these (but not forever)

flat dark frames: do all the stuff for flat frames (no bumping etc) except you don't need the light and can leave the lens cap on.

am i missing anything? can non-cmos sensors use flat darks instead of flats, since they're so much less of a pain in the neck to take?

duodenum
Sep 18, 2005

These are the conversations I think about when someone Kramers into an astronomy community discussion asking about taking pictures through their Dad's old Kmart 114 mm bird-jones on a teeeny EQ1.

The can of worms is unimaginably deep and wide.

simble
May 11, 2004

Achmed Jones posted:

dark frames: taken at your actual exposure time and settings (including temperature) with the lens cap on

bias frames: taken at your fastest shutter speed with the lens cap on. other settings, temp, etc are the same as the image frames. you can reuse these (but not forever)

flat frames: massive pain in the butt. taken with flat illumination, so you need to bring a light to the shoot. must be taken in the exact same location and without bumping or moving the equipment, so that dust isn't disturbed. if you have a CMOS sensor, you take flat darks instead. you can reuse these (but not forever)

flat dark frames: do all the stuff for flat frames (no bumping etc) except you don't need the light and can leave the lens cap on.

am i missing anything? can non-cmos sensors use flat darks instead of flats, since they're so much less of a pain in the neck to take?

Flats and flat darks are not interchangeable. Flat darks are darks for calibrating your flat frames. They are taken with the cap on with a similar exposure time and temperature as your flats. Flats should be calibrated with bias and flat dark before they are used to calibrate your lights.

Also depending on the sensor (imx571 ime), darks and flat darks are practically unnecessary as the read noise is basically indistinguishable from the bias.

simble fucked around with this message at 19:52 on Jan 1, 2024

Golden-i
Sep 18, 2006

One big, stumpy family

Luneshot posted:

Nice work. Having a calibration library is a great time-saver for astrophotography, because you're not really concerned with making scientific measurements that require careful control of image statistics. However, you may eventually need to take new biases and darks; it's standard practice in professional astronomy to take a set of biases and darks basically every night that you observe, because bias levels and dark current can slowly vary over time.

I've never thought about this before, but that makes sense. It's probably worth reshooting calibrations every once in a while. I wonder what variables may make this change happen - different power sources, camera/CMOS age, basic wear-and-tear? It's really interesting in any case. I suppose if you want to be super accurate you'd shoot dark/bias immediately before or after every session with the same power source, cabling, etc to reflect any imperfections in power conditioning.


simble posted:

As soon as the raw lights are calibrated, they get deleted.

"Deleted"? What's that? :)



But I take your point here - I'm just a pack-rat, paranoid about deleting anything that took effort to collect. Calibration happens as part of my pre-processing workflow (WBPP in PI) and the calibrated frames are saved as part of the project, so I could delete the original frames at that point and restart processing at a later date at the post-calibration step.


duodenum posted:

These are the conversations I think about when someone Kramers into an astronomy community discussion asking about taking pictures through their Dad's old Kmart 114 mm bird-jones on a teeeny EQ1.

The can of worms is unimaginably deep and wide.

If you give a mouse a cookie, he'll want a glass of milk to properly calibrate his light frames

Raikyn
Feb 22, 2011

I bought a new camera over the holidays, and tested it out on a bright object last night.

Orion Nebula

ASI2600MC + Askar FRA300

No filter, as I only had a couple of hours before the moon got too high

120 x 60sec

duodenum
Sep 18, 2005

Gorgeous. What's your light pollution situation?

Raikyn
Feb 22, 2011

Bortle 5 I think.

Not great but not too bad.
You can clearly see stars and satellites, and you can see the Milky Way on a clear dark night.

This is a straight 30 sec DSLR shot from a few weeks ago while looking for a meteor shower; a bit cloudy, but the light pollution doesn't look too bad. Still twilight, about an hour after sunset and about 40 min before dark sky.

Raikyn fucked around with this message at 23:59 on Jan 4, 2024

Liquid Chicken
Jan 25, 2005

GOOP
Looks like Celestron has joined the smart telescope wars. Maybe in April, their Origin smart telescope will be out. It's basically an f/2.2 6" RASA that comes with a camera. The mount and tripod look like those of the Evolution scopes. 41 lbs total. It has a filter drawer for 1.25" and 2" filters. Built-in Raspberry Pi computer. It appears to do everything the ZWO Seestar S50 can do - auto focus, dew heater, etc. - but also has dual fans and probably other tricks.

Might be very interesting if you have $4K burning in your pockets.

Looking over at Orion...."Anything?"

Raikyn
Feb 22, 2011

Unistellar and Nikon are doing a new product as well, the Odyssey Pro.

Jewmanji
Dec 28, 2003
How are the folks in this thread who have been in the hobby for a long time and who have amassed these very complicated, expensive rigs feeling about these telescopes? As arbitrary as it is, it seems like making it too easy would take the fun out of it, in a way. Though I don't know any serious or professional photographers who bemoan the advent of smartphone cameras.

pumped up for school
Nov 24, 2010

Jewmanji posted:

How are the folks in this thread who have been in the hobby for a long time and who have amassed these very complicated, expensive rigs feeling about these telescopes? As arbitrary as it is, it seems like making it too easy would take the fun out of it, in a way. Though I don't know any serious or professional photographers who bemoan the advent of smartphone cameras.

I haven't been in this hobby for any length of time but I'll chime in saying I think we're "living in interesting times." Just wrt the speed of tech advancements from other disciplines mixing and comingling.

I do have several full-time professional photographer friends. And they'll all use their phone if that's the right/available tool at hand. But we've had exactly that conversation, and they'll admit there was lots of whining, bitching and moaning, and gatekeeping when (good) phone cameras really started to be A Thing. Now it is just acceptance. Their current grumbling is how the software is getting too mobile-focused. And the standby complaint of "now everyone thinks they're a photographer."

Aside - My wife used to work at an old print newspaper doing graphic design and layout. Small-town paper, so she also split time in the darkroom. When I ask her about those days it is very much an "uphill in the snow both ways" laugh.

Achmed Jones
Oct 16, 2004



i dont know anything about astrophotography, but the impression i get is that dicking with images sucks, doing nothing sucks (half the products exist so you can go back inside and not even _be there_), and that it's only the end result that's awesome. being outside looking through a telescope is great fun, but i don't really see a lot of "enjoyable activity" from the astrophotography side. i guess there's some fun to be had with a tripod and a tracker, but as soon as you start stacking, things start being pretty toilsome

AstroZamboni
Mar 8, 2007

Smoothing the Ice on Europa since 1997!
I'm with you. Visual astronomy is my happy place. Has been for thirty years. Astrophotography is just too much like work.

duodenum
Sep 18, 2005

Christmas return season, B&H has an XW5 for $177 and two 8" Starsense dobs for ~$100 off and free shipping. Plenty of other fun stuff too.

For those of you newbies that got a (common Celestron, Orion, SkyWatcher) 6, 8, or 10" dob for Christmas, they're all 1200mm focal length and the 5mm XW eyepiece would get you 240x. It'd take practice to track a planet at that magnification, but the Pentax XW will ruin you for your cheap "it came with the scope" eyepieces. Comfortable to view through and 70 degrees wide. B&H also has a used XCel-Lx 5mm which is much cheaper but also pretty comfy and just 60 degrees. The XCel-LX series was my first eyepiece purchase outside of my first super cheapies and it was like getting a whole new telescope. Don't give up on your scope until you've tried a decent eyepiece.
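
The arithmetic behind that is just telescope focal length divided by eyepiece focal length; a quick sketch with a few example eyepieces:
code:
# Magnification = telescope focal length / eyepiece focal length.
scope_fl_mm = 1200
for eyepiece_fl_mm in (25, 12, 5):
    print(f"{eyepiece_fl_mm}mm eyepiece -> {scope_fl_mm / eyepiece_fl_mm:.0f}x")
# 25mm -> 48x, 12mm -> 100x, 5mm -> 240x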

Adorama has these in used condition: XW3.5, XW7, XCel-LX 25, 18, and 12mm, etc. The 12mm is $39, which is like half price. Crazy good deal for babby's first real eyepiece.

duodenum fucked around with this message at 17:25 on Jan 16, 2024

Raikyn
Feb 22, 2011

Don't get a lot of time at the moment with it being the middle of summer, but still get the odd night out.
Dolphin Head nebula
About 3hrs exposure

slidebite
Nov 6, 2005

Good egg
:colbert:

God drat, that's nice.

Liquid Chicken
Jan 25, 2005

GOOP
Sadly, it looks like Farpoint Astro might have crashed and burned. Website down. People on Cloudy Nights are complaining about service and unfulfilled orders. If you have any paid standing orders with them, you might want to think about contacting your credit card or payment service about a refund.

Liquid Chicken fucked around with this message at 03:22 on Jan 30, 2024
