Spending time to save time… it’s kinda what ORSAs do.

Over the past couple of weeks, I've been putting in a lot of work now so that future work is worth more. I've been installing the various drivers required to get my laptop talking to the scope, then getting those drivers onto Kelly's laptop, and then getting the two machines talking to each other. Lots of technical mumbo-jumbo could follow here, but the long and short of it is that I've been trying to get this setup fully operational:
[Photo: the scope, camera, and laptop setup]

The laptop talks to the camera so I can control taking exposures. The laptop will also (eventually) talk to the scope itself so I can control where the scope is aimed without having to be outside in 14° weather. I still have to align the scope in person, as well as reset the focus when I switch from the eyepiece to the camera. I have a couple of ideas to work out the focus issue, but alignment will remain an outside job until I have a permanent pier installed for mounting the scope (not at this house). In any case, once the alignment and focus get sorted, this is what my imaging session looks like:
[Photo: my laptop indoors, remote-controlling the laptop out by the scope]

Yes, that's my laptop running RealVNC (remote desktop software), controlling the laptop sitting by the scope over the WiFi network. This lets me sort out the initial exposure-length and ISO settings without having to be physically present at the scope, and then program in a series of exposures to shoot automatically. Since all the imaging after I start the program is automatic, I get more free time while the computer is doing its thing. Eventually this will mean I can get the scope set up before sunset, aligned as soon as it's dark enough to see a couple of bright stars, and then image for hours while still being able to spend time with the family!
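The "program in a series of exposures" part is conceptually just a loop. As a sketch of the idea (not my exact setup), here's how it could look with the gphoto2 command-line tool driving a Canon DSLR over USB; the frame count, settle time, and config names/values are placeholders to check against your own camera:

```python
# Rough sketch of an automated exposure run using the gphoto2 CLI.
# Config names ("iso", "shutterspeed") are the usual gphoto2 names for
# Canon bodies, but they vary by model -- check `gphoto2 --list-config`.
import subprocess
import time

FRAME_COUNT = 50        # how many light frames to shoot
EXPOSURE = "30"         # shutter speed in seconds (camera must support this value)
ISO = "1600"
SETTLE_SECONDS = 5      # pause between frames to let vibrations die down

# Set ISO and shutter speed once up front.
subprocess.run(["gphoto2", "--set-config", f"iso={ISO}"], check=True)
subprocess.run(["gphoto2", "--set-config", f"shutterspeed={EXPOSURE}"], check=True)

for i in range(1, FRAME_COUNT + 1):
    filename = f"m45_light_{i:03d}.cr2"
    # Trigger the exposure and pull the raw file back to the laptop.
    subprocess.run(
        ["gphoto2", "--capture-image-and-download", "--filename", filename],
        check=True,
    )
    print(f"Saved {filename}")
    time.sleep(SETTLE_SECONDS)
```

Point being, once something like this is kicked off, the computer does the babysitting.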

I currently have a vibration issue that is causing about a third of my images to have 'lollipop' stars (a bright point with a 'stick' poking out one side). Since I can take a BUNCH of images with this setup, I can simply chuck the bad shots while I sort out what's causing the problem. I'm also learning how to better use the Deep Sky Stacker (DSS) software, and how having the computer spend more time processing each stack can turn out MUCH better images. Again, the goal is to get the process streamlined so I can have the computer just crunch away, and then come back to process the stacked image after the baby goes to bed.

I wanted to take another crack at Andromeda this weekend, but based on where it is in the sky I can't get images before it drops down into the trees. Opting instead for a 'target of opportunity', I shot over a hundred images of the Pleiades, resulting in 50 usable lights plus the associated dark, flat, and bias frames. The star cluster has some faint reflection nebulosity, which can be seen in longer exposures. Since I'm still limited to around 30 seconds per exposure due to field rotation, that just means I have to take a LOT more images. I realized after I started that I didn't have all of the stars in the cluster in the frame, but couldn't get them all due to camera/telescope base interference. A couple rounds of processing later, I ended up with something reasonable, with a hint of nebulosity starting to show up:
[Image: the Pleiades (M45), stacked result]
Nerd alert: here's all the pertinent data for the above image. Pleiades (Messier 45): 50x 30-second exposures, ISO 1600, 1280 mm, f/6.3. Canon T1i on a Celestron NexStar 8 GPS with focal reducer/corrector. 50x lights, 25x darks, 20x flats, and 20x bias frames stacked in DSS, levels edited in Photoshop.
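For anyone wondering where that ~30-second limit comes from: since the fork mount is tracking in alt-az (no equatorial wedge), the field slowly rotates during each exposure, and stars at the edge of the frame trail first. Here's a rough back-of-envelope sketch using the usual rule-of-thumb formula; the latitude and pointing numbers are made up for illustration, not my actual site:

```python
# Back-of-envelope: how much do stars at the corner of the frame trail
# during a 30-second exposure on an alt-az mount?
import math

SIDEREAL_RATE = 15.04          # arcsec of Earth rotation per second of time
LATITUDE = math.radians(39.0)  # observer latitude (example value)
AZIMUTH = math.radians(160.0)  # where the scope is pointed (example value)
ALTITUDE = math.radians(55.0)
EXPOSURE = 30.0                # seconds

# Rule-of-thumb field rotation rate for an alt-az mount (arcsec/sec of
# field position angle): 15.04 * cos(lat) * cos(az) / cos(alt)
rotation_rate = SIDEREAL_RATE * math.cos(LATITUDE) * abs(math.cos(AZIMUTH)) / math.cos(ALTITUDE)
rotation_angle = math.radians(rotation_rate * EXPOSURE / 3600.0)  # radians per exposure

# How far a star near the corner of the APS-C frame moves: 1280 mm focal
# length (8" SCT with the f/6.3 reducer), ~13.4 mm from center to corner.
FOCAL_LENGTH_MM = 1280.0
CORNER_RADIUS_MM = 13.4
corner_radius_arcsec = math.degrees(CORNER_RADIUS_MM / FOCAL_LENGTH_MM) * 3600.0
trail_arcsec = rotation_angle * corner_radius_arcsec

print(f"Field rotation rate: {rotation_rate:.1f} arcsec/sec")
print(f"Corner-star trail in {EXPOSURE:.0f} s: {trail_arcsec:.1f} arcsec")
```

The trailing scales with distance from the center of the frame, so stars near the middle hold up much better than the corners, which is why lots of short subs is the workaround rather than a few long ones.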

All the practice crunching this set of images taught me some tricks about the DSS settings, which I was then able to use on a previous set of images: M42 (the Great Nebula in Orion). I would be better off had I done flats and bias shots for that session, but I did at least manage to take some darks. I'll post both the second reprocess I did before Christmas and this new one side by side so you can see the comparison. Same original data, but significantly more processor effort went into the latest run. I already own the computer, and processor time while I'm sleeping is essentially free, so I'll definitely be opting for the more time-intensive stacking with future telescope efforts. I think I got the same level of nebulosity out of the data, but without resorting to so much post-processing in Lightroom. The result is more aesthetically pleasing and also doesn't contain the satellite tracks from the sub-frames:
OLD: [wpid3416-M42-reprocessed.jpg]    NEW: [wpid3473-M42.jpg]
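As an aside on why skipping the flats and bias frames hurts: every stacker, DSS included, boils down to some simple frame arithmetic before the alignment and fancy rejection math kicks in. This is just a sketch of that arithmetic in numpy, not DSS's actual implementation:

```python
# The basic idea behind calibration frames: darks remove thermal signal,
# flats map out vignetting and dust, bias captures the sensor's read offset.
import numpy as np

def calibrate_and_stack(lights, darks, flats, biases):
    """Each argument is a list of 2-D numpy arrays (one per frame)."""
    master_bias = np.mean(biases, axis=0)
    # Darks are shot at the same exposure/ISO as the lights, so they
    # already contain the bias level and can be subtracted directly.
    master_dark = np.mean(darks, axis=0)
    # Flats get the bias removed, then are normalized so dividing by the
    # flat rescales the image instead of dimming it.
    master_flat = np.mean(flats, axis=0) - master_bias
    master_flat /= np.mean(master_flat)

    calibrated = [(light - master_dark) / master_flat for light in lights]
    # Simple average stack; DSS also offers median and kappa-sigma modes
    # that reject outliers between frames.
    return np.mean(calibrated, axis=0)
```

Those outlier-rejecting stacking modes (median, kappa-sigma) are presumably what dropped the satellite tracks out of the new M42 stack.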

I have hobbies. Some hobbies can be done at home, and those tend to percolate toward the top with family time being a priority. While I would love to be able to go to the racetrack, that kind of thing is not especially feasible with a 10-month-old. Not un-doable, but difficult enough that I'll just say more power to the guys who can pull it off; for me it's a bridge too far. So for now I'll concentrate on the things I can spend time optimizing quietly in the mornings before everyone else in the house is up, so that when I do have to take family time (evenings) for things like scope setup, the payoff is bigger.
