Friday, November 7, 2014

Nikon D5300 long exposure sensor noise comparison

Continuing with my camera testing, I picked up a Nikon D5300.  According to DXOMark.com, the D5300 is one of the best low-light performers among cameras with APS-C sized sensors.

Make        Model     Low-Light ISO Score
Panasonic   DMC-G3    667
Panasonic   DMC-GH3   812
Canon       60D       813
Nikon       D5000     868
Olympus     E-P5      895
Canon       70D       926
Nikon       D5100     1183
Nikon       D5300     1338
Nikon       D610      2925

A few micro four thirds cameras are in the chart, including my trusty Panasonic DMC-G3, as well as the full-frame Nikon D610.  From what I have read online, part of the D5300's increased performance comes from the removal of the anti-aliasing filter.  My requirements include an articulated screen and the ability to use a remote for long "bulb" exposures.

As mentioned in my previous post, there is some concern about amp glow in the D5300, so I repeated my previous procedure to assess sensor noise.

Materials
Nikon D5300
Michron intervalometer and DC2 cable
iPhone 5

Methods
I configured the Michron to capture the following series of images with delays:
24 images x (1 minute exposure with 15 second delay)
20 images x (2 minute exposure with 15 second delay)
21 images x (4 minute exposure with 15 second delay)
15 images x (8 minute exposure with 15 second delay)

The delay was chosen to allow the sensor to cool down a bit between image captures, which is frequently done in astrophotography imaging.  The camera and Michron with cable were placed in my basement with the lights out.  The ambient temperature was set to 69° F.  Image recording was set to capture Fine JPG + 14-bit RAW images at full resolution.  In-camera noise reduction was turned off for both long exposures and high ISO.  The lens of the camera was removed and replaced with the body cap.  The articulated screen of the D5300 was opened to provide better ventilation.  An ISO setting was chosen, and the intervalometer was started.

By taking multiple photos using the intervalometer function of the Michron, the camera's sensor was allowed to warm up to a rough equilibrium point.  The final JPEG image of each exposure time series was then used for image comparisons.
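
To put a rough number on "noise" beyond eyeballing the frames, the final dark frame from each series can be scored programmatically.  Below is a minimal Python sketch of that idea; the folder layout, file naming, and hot-pixel threshold are my own assumptions for illustration, not part of the procedure above.

import glob
import numpy as np
from PIL import Image

def dark_frame_stats(path, hot_threshold=64):
    """Mean level, standard deviation, and count of bright ("hot") pixels in a dark frame."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    return img.mean(), img.std(), int((img >= hot_threshold).sum())

# Assumed layout: one folder per series, e.g. iso1600_4min/ holding its JPGs in capture order.
for series in sorted(glob.glob("iso*_*min")):
    final_jpg = sorted(glob.glob(series + "/*.JPG"))[-1]   # last frame = warmed-up sensor
    mean, std, hot = dark_frame_stats(final_jpg)
    print("%-16s mean=%6.2f  std=%6.2f  hot pixels=%d" % (series, mean, std, hot))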

Results
Figure 1:  Nikon D5300. Final JPG image of each exposure series was scaled to 4.55% to show overall sensor noise profile.

Figure 2: Nikon D5300. Final JPG image of each exposure series was cropped to 274x182 pixels, center weighted, to show detailed noise and "hot" pixel profile.


Figure 3: Comparison of sensor noise between Canon 70D and Nikon D5300.  Final JPG image of each exposure series was cropped to 274x182 pixels, center weighted, to show detailed noise and "hot" pixel profile.

The Nikon D5300 shows very little noise overall.  At the extremes of the testing, some sensor glow starts to appear on the top left-hand side of the images, which is not readily seen in the scaled images (Figure 1).  In the center-cropped images, noise is minimal except for sporadic hot pixels (Figure 2).

From a visual comparison of the center-cropped images, the Canon 70D sensor shows significantly more diffuse noise than the Nikon D5300, which is evident at ISO 800 and above (Figure 3).

Discussion
For astrophotography and general long exposure purposes, these tests represent a best-case scenario.  Light pollution will reduce the usable exposure/ISO values, but this can be partly offset by a light pollution filter.  What we can take away from these tests is the following.  First, the Nikon D5300 has significantly less noise than the Canon 70D at equivalently labeled ISO values.  Objectively, the noise in the Nikon D5300 at ISO 6400 is close to that of the Canon 70D at ISO 1600 for 4 minute exposures.  For 8 minute exposures, the difference is even greater.  Second, hot pixels are not a large issue for the Nikon until the ISO goes over 800 and the exposure time is over 4 minutes.  Finally, there was a minimal amount of glow noticed in the top left of the Nikon sensor at high ISO and long exposure durations.  This glow pales in comparison to the overall diffuse noise of the Canon.

Subjectively, I found the D5300 more difficult to use than the Canon 70D or Panasonic DMC-G3.  While all three cameras have articulating LCD screens, the Nikon D5300 lacks a touch screen.  This means that for each settings change, additional buttons and dials need to be pressed.  I was able to assign ISO settings changes to a function button, which made the process only moderately easier.  While the quality of photos from the D5300 is great, the addition of a touchscreen or dedicated ISO dial (which my ancient Canon Powershot G9 had) would be a marked improvement.

A parting shot of the Pacman nebula, captured with the Nikon D5300.
http://astrob.in/134619/0/



Thursday, October 16, 2014

Canon 70D long exposure sensor noise comparison

I tend to lurk around the forums at cloudynights.com.  Recently, I have been looking into getting a new camera for doing astrophotography.  This was partly prompted by some nice, short exposure photos at ISO 6400 that my brother took with his Nikon D600.  My micro four thirds Panasonic DMC-G3 just cannot compete with his full frame sensor.  "Amp glow" in the DMC-G3 makes the top 1/6th of the sensor close to useless for any exposures over 30 seconds, and forget about using ISO settings beyond 400 due to noise.

For my (microprocessor controlled) astrophotography goals, and for my budget, I need a camera with little noise at high ISO values and long exposures.  A "soft" requirement is an articulated LCD screen so that I don't have to kneel on the ground to see what is being imaged through my refractor (yes, I could raise the tripod legs at the risk of decreased stability).  DXOMark.com does comparisons of many cameras, so I looked at the information they had.  Most of their testing is geared towards daylight photography, but they do some tests for "Sports (Low-Light ISO)", which is a starting place for us.

Make        Model     Low-Light ISO Score
Panasonic   DMC-G3    667
Panasonic   DMC-GH3   812
Canon       60D       813
Nikon       D5000     868
Olympus     E-P5      895
Canon       70D       926
Nikon       D5100     1183
Nikon       D5300     1338
Nikon       D610      2925

A full frame sensor (which scores great) is not in my budget, but I added the Nikon D610 to the list for a comparison.

So, in my lurking, I came upon the possibility that a number of cameras are amenable to modifications that allow H-alpha wavelengths to be captured by the CMOS sensors in digital cameras.  To be more accurate, the CMOS sensors are already able to capture H-alpha wavelengths (and IR and UV).  Camera manufacturers put glass and coatings over the sensors to block out UV and IR light so that terrestrial photography represents what our eyes see.  Canon has two models, the 20Da and 60Da, which come from the factory with the modification.  Third party services such as Hypercams & Mods do the modification for reasonable fees, and there is always the DIY approach.

For the price and DXOMark performance, the Nikon D5300 is looking great against similar competitors, but there is some concern about amp glow.  With more lurking, I came across a nice comparison of ISO and exposure times for the Nikon D5300.  Being a scientist, I thought that there might be a systematic way to evaluate the performance of digital camera sensors for astrophotography (and for long exposures in general).  I saw a Canon 70D on sale, so I picked it up for testing (since I could quickly obtain a Canon EF to T-ring adapter for astrophotography).

Materials
Canon 70D
Michron intervalometer and E3 cable
iPhone 5

Methods
I configured the Michron to capture the following series of images with delays:
24 images x (1 minute exposure with 15 second delay)
20 images x (2 minute exposure with 15 second delay)
21 images x (4 minute exposure with 15 second delay)
15 images x (8 minute exposure with 15 second delay)

The delay was chosen to allow the sensor to cool down a bit between image captures.  The camera and Michron with cable were placed in my basement with the lights out.  The ambient temperature was set to 69° F.  Image recording was set to capture JPG + RAW images at full resolution.  In-camera noise reduction was turned off for long exposures.  The lens of the camera was removed and replaced with the body cap.  The articulated screen of the 70D was opened to provide better ventilation.  An ISO setting was chosen, and the intervalometer was started.

By taking multiple photos using the intervalometer function of the Michron, the camera's sensor was allowed to warm up to a rough equilibrium point.  The final JPEG image of each exposure time series was then used for image comparisons.

As a control, the same procedure was performed on a Panasonic DMC-G3.  "Bulb" exposures are limited to 2 minutes on the G3, so data could not be collected at 4 and 8 minute exposure lengths.

Results

Comparison exposure vs ISO for overall sensor
Figure 1:  Canon 70D. Final JPG image of each exposure series was scaled to 5% to show overall sensor noise profile.

Comparison exposure vs ISO for center of sensor
Figure 2: Canon 70D. Final JPG image of each exposure series was cropped to 274x182 pixels, center weighted, to show detailed noise and "hot" pixel profile.

Figure 3: Control, Panasonic DMC-G3.  Final JPG image of each exposure series was scaled 5.3% to show overall sensor noise profile or cropped to 244x182 pixels, center weighted, to show detailed noise and "hot" pixel profile.

From a visual comparison of the center-cropped images, the Panasonic DMC-G3 at a 1 minute exposure performs roughly on par with the Canon 70D at an 8 minute exposure. While the 70D is uniformly dark at ISO 200 and 400, the G3 has noticeable gray 2x4 pixel marks over the entire sensor.

The 70D is relatively free of banding, though some vertical banding is observed at high ISO and long exposures.  The DMC-G3 has a noticeable band across the top of the image, plus a diagonal fault of some sort.

Figure 4: A comparison of RAW file size versus exposure time for selected ISO values

The recorded value of each pixel, in terms of data, will depend on the amount of light (or noise) captured.  Consequently, the final RAW image size on disk will correlate with the amount of noise captured at the given settings.  By plotting image size versus exposure time (Figure 4), we can see that a doubling of exposure time roughly corresponds to a halving of ISO in terms of sensor noise.  Additionally, at higher ISO settings, a doubling of exposure time shows a much greater increase in sensor noise than at lower ISO values.
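
A plot like Figure 4 can be reproduced with a few lines of Python once the final RAW frame of each series is on disk.  The folder and file naming below are assumptions made for the example, not how the data was actually organized.

import glob
import os
import matplotlib.pyplot as plt

# Assumed layout: one folder per ISO (iso200, iso400, ...) containing the final
# RAW frame of each exposure series, named by exposure length (1min.CR2, 2min.CR2, ...).
for iso_dir in sorted(glob.glob("iso*")):
    minutes, sizes_mb = [], []
    for raw in sorted(glob.glob(os.path.join(iso_dir, "*.CR2")),
                      key=lambda p: int(os.path.basename(p).split("min")[0])):
        minutes.append(int(os.path.basename(raw).split("min")[0]))
        sizes_mb.append(os.path.getsize(raw) / 1e6)   # more noise shows up as a bigger file
    plt.plot(minutes, sizes_mb, marker="o", label=iso_dir)

plt.xlabel("Exposure time (minutes)")
plt.ylabel("RAW file size (MB)")
plt.legend()
plt.savefig("size_vs_exposure.png")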


Discussion
For astrophotography and general long exposure purposes, these tests represent a best-case scenario.  Light pollution will reduce the usable exposure/ISO values, but this can be partly offset by a light pollution filter.  What we can take away from these tests is the following.  First, the 70D sensor readout is fairly uniform.  There does not seem to be any unusual "amp glow", and "banding" is minimal.  Second, we can subjectively observe when "hot" pixels start to become a nuisance.  An objective test would be to determine when the first hot pixel reaches saturation, since dark frame subtraction would leave a hole.
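
For that objective test, the raw data is the place to look, since saturation is defined at the sensor level.  Here is a sketch of how it could be done with the third-party rawpy package (my choice for the example, not something used above); the file naming is again assumed.

import glob
import rawpy   # LibRaw bindings; not used elsewhere in this write-up

def saturated_count(path):
    """Number of photosites at or above the sensor's white level in a dark frame."""
    with rawpy.imread(path) as raw:
        return int((raw.raw_image_visible >= raw.white_level).sum())

# Walk one ISO series in exposure order and report where the first clipped pixel appears.
for cr2 in sorted(glob.glob("iso800_*min.CR2")):
    n = saturated_count(cr2)
    print(cr2, "saturated pixels:", n)
    if n:
        print("dark frame subtraction would start leaving holes at this exposure length")
        break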

From experience, the DMC-G3's gray 2x4 pixel marks wind up making long streaks on stacked images if there is any drift in your telescope tracking, even with dark frame subtraction.  This effectively limits the DMC-G3 to ISO 200 at less than 2 minute exposures or ISO 400 at less than 1 minute exposures.  The 70D does not seem to have similar, inherent noise issues.  However, the sensor noise in the 70D seems to be more "blobby" as compared to the granular noise in the DMC-G3.

Sensor noise increases significantly more at higher ISO values as the exposure time increases.  If we declare an acceptable amount of noise, for instance at ISO 800 and a 4 minute exposure with the 70D, our plot of file size versus exposure time gives alternate ISO/exposure combinations which are roughly equivalent in terms of noise.  Therefore, if our mount tracking is fantastic, an 8 minute exposure at ISO 200 might help us capture those rare photons from a deep sky object.  With a less precise mount (or no mount at all), a 1 minute ISO 800 exposure would reduce the degree of star trails and have roughly equivalent, and actually slightly less, sensor noise.

Friday, December 20, 2013

Beaglebone Black and Astrophotography: Phase 2: iAstroHub tweaks

While I was able to set up nearly all of the software for iAstroHub on my Beaglebone Black (still some issues with gphoto), the interface just wasn't working quite right.  So, I enlisted my PHP guru brother to help.

Nginx setup
Edit /etc/php5/fpm/php.ini and turn short_open_tag On in order for things to be happier.
@awnage: The JavaScript on the main page calls several other pages to display data. This is usually termed AJAX. Those calls were being made to PHP scripts that used <? (short tags) instead of <?php. <? is an old PHP 3 convention that has been out of favor for quite some time. Changing the ini setting allows those files to run properly. There is no security concern with this.
Software locations and additional setup
Your lin_guider package is expected to be in the /home/pi directory.  So, the executable path should look like:
/home/pi/lin_guider_pack/lin_guider/lin_guider

When lin_guider is run, it will be run as root.  Consequently, the setup of lin_guider needs to be done as root so that the appropriate configuration files are placed in root's home directory.  If you are logging in remotely via ssh -X, you will probably have to copy the .Xauthority file from your user account to root's home directory in order to export the lin_guider GUI.  Or, once you have the web interface running, you can use VNC to connect and see the lin_guider interface for setup.

Permissions
Make sure all the text files in /home/pi/www are editable by pi:www-data.
# chown -R pi:www-data /home/pi/www
# chmod 644 /home/pi/www/*.txt
Make sure the executables in /home/pi/www/guidestar can be executed by pi.
# chmod ug+x /home/pi/www/guidestar/cmd_getimage
# chmod ug+x /home/pi/www/guidestar/cmd_setposition
SkySafari and iOS Browsers
So, iAstroHub has the cool feature of being able to plate-solve (i.e. figure out where in the sky an image was taken) and then load the solution in SkySafari Plus/Pro.  Anat has a video of the process.  Unfortunately, most browsers on iOS don't allow for downloading of files.  You can of course look at the raw text of the file when clicking on the "Get result for SkySafari" link.  The Mercury browser does allow downloading, but it strips the file extension, and you cannot open it in SkySafari.

After a bit of searching, the Atomic browser is what Anat is using.  You will need the full version, which is a bargain at $1.99 currently.  Unfortunately, the download functionality doesn't work in iOS 7.  Fortunately, my wife's iPad is still on iOS 6.

I see that the iCab browser allows for downloading of files.  Using the same procedure (click and hold on link->save->open in external app) you can get the SkySafari file into your SkySafari app.  $1.99 also.  Wonder if I can get my $1.99 back for Atomic....

It is probably worthwhile to have a VPN Client in case something goes funky with lin_guider.  I also have the pTerm app so that I can login to the BBB from iOS.

Monday, December 16, 2013

Beaglebone Black and Astrophotography: Phase 2: Building astronomy software

Install astronomy software.
So, this is going to be based on the work by Anat Ruangrassamee, written up at www.cloudynights.com, and his iAstroHub project.  His project files can be obtained here: http://sourceforge.net/projects/iastrohub/

Open the README.txt and follow the "HARD WAY".  Some of the stuff we have already done.
Step 1-2: Skip.  Steps 3-4: I am not sure it is entirely necessary to give open access in sudo, but for now we will follow the instructions.  Steps 5-6: BTDT.  Step 7: Sure.  I use vi instead of nano, so I left that out.

Step 8: We need to shutdown apache in favor of nginx.
# update-rc.d -f  apache2 disable
# apachectl stop
Then continue to follow the iAstroHub instructions.
Make sure that you can still connect to your access point and webpage.  Mine is still at 192.168.4.1, and I now see that it is running nginx as opposed to Apache.

Step 9:  I followed the instructions, and got a failure.  Instead (you will want unzip for later):
# apt-get install gphoto2 unzip
That being said/written, it does not look like the gphoto2-6 package works later in step 20, so I will revisit compilation later.

Step 10: It is not mentioned, but grab the latest lin_guider from linguider.sourceforge.net.  The "copy modified files" means to copy the files from the Modified codes/linguider folder in the iAstroHub package to the lin_guider_pack/lin_guider/src directory of lin_guider.

The setup of lin_guider is through a GUI.  So, you either need to have a monitor hooked up to the BBB, or you need to be exporting the graphics via ssh -X.  Woohoo!  I can see from my webcam after the device setup.

Step 11: The mentioned package is available from http://en.sourceforge.jp/projects/sfnet_cccd/releases.  I am uncertain if it does any good for webcams.

Step 12-13: Sure.  This is starting to take up a fair bit of space....  

Step 14: The pip install pyephem step should probably be sudo pip install pyephem. When setting up astrometry, the modified starutil.c goes in util, and solve-field.c goes in the blind directory.

The make on astrometry fails for me with:
collect2: error: ld returned 1 exit status
make[1]: *** [libbackend.so] Error 1
make[1]: Leaving directory `/home/ubuntu/projects/astrometry.net-0.40/blind'
make: *** [subdirs] Error 2
According to the astrometry Google group, you can get around this with sudo make -k, since libbackend.so apparently isn't needed for astrometry.  At this point I needed to do a sudo apt-get clean so that I had enough space on my BBB to continue.  Actually, at this point I realized that there will not be enough space on the BBB's eMMC memory for all of this, which means expanding to an 8+ GB microSD card and booting off of it (and editing my first post).

You need to get plate solving index files from http://data.astrometry.net/4200/.  Read http://www.astrometry.net/doc/readme.html to figure out which index files you need.  Anat has some recommendations, post 6193995.  Since my scope is a WO GTF81 using prime focus, I will try indices 4208-4210.fits.  Just copy them to the /usr/local/astrometry/data folder.
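
To sanity-check which index files cover your setup, the field of view can be estimated from the focal length and sensor size.  Here is a quick Python back-of-the-envelope; the numbers are example values to be replaced with your own gear, and the result should be compared against the scale ranges listed for each index series in the readme.

import math

# Example numbers only -- plug in your own scope and camera.
focal_length_mm = 478.0                  # e.g. an 81 mm f/5.9 refractor at prime focus
sensor_w_mm, sensor_h_mm = 17.3, 13.0    # roughly a micro four thirds sensor
pixel_pitch_um = 3.75

def fov_deg(dim_mm, f_mm):
    return math.degrees(2 * math.atan(dim_mm / (2 * f_mm)))

scale = 206.265 * pixel_pitch_um / focal_length_mm   # arcsec per pixel
print("image scale: %.2f arcsec/px" % scale)
print("field of view: %.2f x %.2f degrees" % (fov_deg(sensor_w_mm, focal_length_mm),
                                              fov_deg(sensor_h_mm, focal_length_mm)))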

Step 15: Already configured our Wifi.

Step 16:  I have a Sabrent SBT-USC1M USB to Serial cable using a Prolific PL2303 chip (according to Amazon), so I am more interested in the pl2303.ko module.  Fortunately, this build of Ubuntu already has the modules.  Bonus!  Thanks to Anat for pointing this out to me and suggesting I do serial via UART and save a USB port (Phase 3/4).

Step 17: This configuration might change if I do serial via UART.  Copying for now.  Port is 3300.

Step 18: We already set up our access point (previous post).  The MK808 apparently uses eth0 for its wireless, while we use wlan0.  I am keeping my DHCP range in 192.168.4.x, but it is probably a good idea to change the lease time to 8 or 12 hours in /etc/dnsmasq.conf and restart the service.

Step 19: This section is for starting up services on boot and creating a virtual X server.

Step 20: ccd_1.2.7.tar.bz2 is obtained from http://sourceforge.net/projects/cccd/files/ccd/1.2.7/.  This is for controlling your imaging CCD or DSLR.  I don't have one (yet, since I am using a micro four thirds camera) but I will build it anyhow.  My Ubuntu distribution has gphoto2-6 and fails to compile.  It looks like there is a known bug.  I will come back to this.  Guess I will have to build gphoto, per step 9.

Step 21: Sure.  

Step 22: lazarus 1.0.10 is in the repository, so I apt-get'd it instead of building, although it added an additional 698 MB worth of packages.  Unfortunately, this doesn't work, as a Raspberry Pi user found out.  So, let's build from source.
svn checkout http://svn.freepascal.org/svn/lazarus/tags/lazarus_1_0_12
cd lazarus_1_0_12
make clean; make
chmod 777 /home/pi/lazarus_1_0_12 -R
Continue to configure lazarus as appropriate.  With lazarus and fpc installed, the skychart configure line looks like:
# ./configure fpc=/usr/lib/fpc/2.6.2 lazarus=/home/pi/lazarus_1_0_12 prefix=/usr/local
Some directories are missing in skychart which prevents it from compiling.  While in skychart-3.8-2450-src:
# mkdir skychart/component/lib/arm-linux-gtk2
# mkdir skychart/units/arm-linux-gtk2
# mkdir varobs/units/arm-linux-gtk2
Then proceed with building and installing.  During configuration of "Telescope type", there was not a direct option for my mount (Celestron Advanced VX) so I chose Celestron GPS.  Fingers crossed.

Step 23-25: Don't really apply.  No audio DSP in the BBB from what I understand.

Reboot.  Grabbed my (wife's) iPad, installed SkySafari Plus, and pointed my WiFi at the BBB.  Go to settings in SkySafari Plus (or Pro if you have that).  Choose your mount (Celestron NexStar/Advanced GT works for my VX).   Use the appropriate network settings.  Mine is still 192.168.4.1, but Anat's instructions have the network on 10.0.0.1 IIRC.  We previously (step 17) set the serial port to communicate on port 3300.  Connect to the telescope, and presto!  I can control the scope!  Of course, if I just wanted SkySafari control, all I really needed to do was get the access point setup and install/configure ser2net (Step 17).

Phase 3: GPIO-based guide port interface
for the next post...

Tuesday, December 10, 2013

Beaglebone Black and Astrophotography: Phase 1: Configuring the BBB

Goals for Phase 1
Boot Linux on BBB.
Activate WiFi access.
Activate webcam.

Parts list for Phase 1
Beaglebone Black (BBB)
4-Port powered USB hub
USB WiFi adapter with Access Point capability
5V 2A PSU
Ethernet cable
USB webcam
HDMI to mini-HDMI cable (optional)
8GB+ microSD card

I purchased my Beaglebone Black version A3, PSU, and USB WiFi adapter from Adafruit, and they were quickly delivered.  The BBB comes with a USB to mini USB cable.  The USB hub, webcam (a Logitech C300), and microSD (32GB UHS-I Class 10) were ordered via Amazon.com.  I had an ethernet cable sitting around.

Useful online tutorials
The following are some resources that I found useful for setting up my BBB.
http://learn.adafruit.com/beaglebone
http://derekmolloy.ie/category/embedded-systems/beaglebone/
Plus others

Boot Linux on BBB
Technically, you can just power-up the BBB, and Linux will boot.  However, after a bit of playing, I found out that the Angstrom Linux distribution that was preinstalled on my BBB's eMMC memory was from 2012.  So, I wanted to get updated...
[edit 2013/12/12: due to size limitations of the eMMC memory, you will need to boot off the microSD and install your software there.  Follow these instructions for resizing your microSD boot partition]

I started by using Debian Linux (3.2.51-1) via Virtualbox on a Windows 8 PC in order to control and program my BBB.  You can look elsewhere for how to set up Virtualbox and install a version of Linux.  I was originally thinking of using a Raspberry Pi, and Raspbian is a derivative of Debian, hence the Linux "flavor" choice.  Ubuntu seems to be popular for BBB users, so the choice of *nix/*nux is up to you.  I switched to Ubuntu to make my life easier, but found it had connectivity problems with the usb-to-ethernet system, so I went back to Debian until I got ethernet running.

I suggest you follow some of the online tutorials for getting started, since there are plenty out there.  When you first connect your BBB to your computer via the USB cable, it will appear as a mass storage device.  There are Windows and OS X drivers on it.  I noticed that the Windows drivers are unsigned, so they will not load in Windows 8 without some tricks.  I didn't bother with that.  Instead I passed the BBB through to my Linux virtual machine (VM) in Virtualbox.  Then do the initial setup as suggested at beagleboard.org/static/beaglebone/latest/README.htm

Another good site:
http://derekmolloy.ie/beaglebone/getting-started-usb-network-adapter-on-the-beaglebone/
BBB Angstrom flasher 2013.09.12.  http://downloads.angstrom-distribution.org/demo/beaglebone/

I fought with the Angstrom distribution for a while (and had a long blog post written) before I got frustrated enough with connmand and wifi network configuration to scrap it.  Install Ubuntu on your BBB instead and save some headaches.  You can probably save some headaches by not getting a RTL8192CU WiFi adapter, but that is a different story.  Many online tutorials use Ubuntu too.
http://elinux.org/Beagleboard:Ubuntu_On_BeagleBone_Black
I got a Ubuntu 13.10 Saucy build from Robert Nelson at http://rcn-ee.net/deb/flasher/
You WILL want to read the following page for flashing a new Linux distribution onto the BBB.
http://elinux.org/Beagleboard:Updating_The_Software
Then resize your microSD and use it as the boot device (the eMMC memory isn't enough for this project).

With Ubuntu installed, plug your BBB into your Linux computer (VM) again.  You will then be able to ssh into the BBB (ssh -l ubuntu 192.168.7.2, with the password being temppwd).  Set up your usb-to-ethernet networking.  That being said, I hooked my ethernet cable up (since usb-to-ethernet was acting odd in Ubuntu) and waited for the BBB to show up on my network.  Then I ssh'd into it over eth0.

# sudo apt-get update
# sudo apt-get upgrade
# sudo apt-get install avahi-daemon

That last line will get Zeroconf installed so that we can access the BBB via arm.local.  Of course, I changed the name of mine to astrobeagle in /etc/hostname and /etc/hosts.  Reboot.

Activate WiFi access
From what I have heard (November 2013), the Adafruit tutorial for getting WiFi working is not applicable to the BBB.  The dongle I purchased from them uses the RTL8192cu Chipset, and after a week's worth of compiling things on my own, here are the best instructions I have found on getting it to work:
http://wiesel.ece.utah.edu/redmine/projects/hacks/wiki/BeagleBone_Black_AP

My kernel is saucy 3.8.13-bone30, so I had to make adjustments as mentioned by Anh Long.  Additionally, for step 9, I had to adjust the Makefile to compile the driver.  Having tried many approaches previously, I found some good info from Evan Hanau on fixes to the Makefile.  By default, the Makefile is set up to build for i386, so it needs to be switched to arm.

With WiFi up and running, I want to get hostapd running so that I can turn the BBB into an access point.  Anh's writeup is good.  After configuration and a reboot, I see my Beaglebone as an access point with my iPhone!  Pointing Safari to http://192.168.4.1 (I used the suggested 4 subnet), I can see Apache running.

Time to hook up a USB hub and get the webcam going...

Activate webcam
I chose the Logitech C300 webcam due to established support in Linux (link).  HD resolutions were not necessary for my purposes (I hope).  The shape and design of the C300 also lends itself to disassembly for eventual mounting on a guidescope.  [Edit: DO NOT get a Microsoft LifeCam Studio.  It has a bug that pulls full bandwidth from the USB bus and will effectively disable your WiFi adapter on the hub.  Look at this list.]  Plug in the webcam and check to see if it is recognized:

# lsusb

Now, we need to grab a few things if you want to follow Derek Molloy's example video:
# apt-get install v4l-utils pkg-config libopencv-dev libv4l-dev uvccapture
OpenCV isn't necessary for my final project, only if you want to do OpenCV stuff.  With that installed, you can run v4l2-ctl to get all sorts of information on the camera.  I found out that my autoexposure was turned on and creating trouble, so...

# v4l2-ctl --set-ctrl=exposure_auto_priority=0

You can use uvccapture to grab some still frames.  My webcam doesn't do h264, so if you are following Derek's tutorial, I did the following instead since my output is in MJPEG format:

# ./capture -o > output.raw (then scp'd the output file to my Linux desktop for ffmpeg...)
# ffmpeg -f mjpeg -i output.raw tmp.mp4

Phase 2: Install astronomy software
for the next post...

Tuesday, November 26, 2013

Beaglebone Black and Astrophotography: Preamble

Background
Astrophotography has many aspects, depending on one's goals.  In a simplistic categorization, astrophotography can be divided into two classes: wide field and narrow field.  In wide field astrophotography, wide angle lenses are used to capture large regions of the sky.  Frequently, digital cameras with large aperture lenses are used, and tracking systems that match the Earth's rotation are often not necessary since exposure times are short.  Photographic systems with smaller apertures, be they cameras alone or digital imaging devices attached to telescopes, may need tracking systems to avoid star trailing and/or to capture more detail.  In narrow field astrophotography, typically when imaging faint deep space objects (DSOs), accurate tracking systems that match Earth's sidereal rate are necessary due to the requisite longer exposures.  Cameras need sufficient time (on the order of minutes) to capture enough photons from DSOs to resolve the distant objects, and any movement degrades the image quality.  Therefore, for high quality pictures, a sturdy photographic mount that accurately tracks celestial objects is necessary.  For this, I once built my own "barn door tracker" and got decent results with up to 55mm camera lenses and 60 second exposures.

Telescope mounts come in a few varieties, but the altazimuth (alt-az) and equatorial mounts are perhaps the most commonly known.  For my purposes, I will be focusing on the German equatorial mount (GEM) found in the Celestron Advanced VX.  In a GEM, an axis of rotation is aligned with the Earth's rotation axis.  If the telescope is then rotated around this axis at the same rate as the Earth's rotation, all objects will stay stationary in the telescope's view.  Accurate alignment with the celestial pole is therefore important for proper tracking, as well as matching the Earth's rotation speed.  A perfect polar alignment combined with perfectly matched mechanical rotation could in principle support indefinitely long photographic exposures without star trails.  Unfortunately, that is not achievable in practice.  Consequently, shorter exposures that don't exhibit noticeable star trailing are typically taken in astrophotography, and multiple images are "stacked" with computer software to produce an enhanced image.
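
To get a feel for why tracking matters at all, here is a small worked example in Python.  The camera and lens numbers are illustrative only; the plate-scale formula (206.265 x pixel pitch in microns / focal length in mm) is the standard one.

import math

# Rough worked example: how quickly does an untracked setup start to trail?
sidereal_rate = 15.04            # arcsec of sky motion per second (for a star near the celestial equator)
focal_length_mm = 300.0          # illustrative lens
pixel_pitch_um = 4.8             # illustrative sensor

image_scale = 206.265 * pixel_pitch_um / focal_length_mm   # arcsec per pixel
seconds_per_pixel = image_scale / sidereal_rate
print("image scale: %.2f arcsec/px" % image_scale)
print("a star drifts one pixel every %.2f s without tracking" % seconds_per_pixel)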

One way to enable longer photographic exposure times without star trailing is to actively adjust the direction in which the telescope is pointed when deviations are detected.  In this scenario, a "guide star" is used as a reference point, and the telescope is nudged in whichever direction is necessary to maintain the relative position of the guide star in the field of view.  To that end, autoguiders were created - smaller scopes attached to a main scope whose duty it is to keep a guide star stationary in the view.  Autoguiders typically have a CCD (or other digital sensor) linked to a computer that analyzes the position of a star or stars.  If a deviation in position is observed, corrective motor signals are sent back to the telescope mount, often to a dedicated autoguider port, to reestablish the initial alignment.  This system requires an imaging device separate from the main photographic imager as well as an attached computer to make all the adjustments.
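
In code, the guiding idea boils down to a simple feedback loop: measure where the guide star is, compare with where it started, and nudge the mount to cancel the difference.  The Python sketch below is a toy illustration only; the camera and mount objects are hypothetical stand-ins, not a real autoguider interface.

import time
import numpy as np

def centroid(frame):
    """Intensity-weighted centroid of a grayscale guide frame (numpy array)."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return (xs * frame).sum() / total, (ys * frame).sum() / total

def guide(camera, mount, gain=0.8, interval_s=2.0):
    # camera.grab_frame() and mount.pulse() are hypothetical stand-ins.
    ref_x, ref_y = centroid(camera.grab_frame())        # lock onto the guide star
    while True:
        x, y = centroid(camera.grab_frame())
        err_x, err_y = x - ref_x, y - ref_y              # drift in pixels
        # Nudge the mount back toward the reference; pulse length scales with the error.
        mount.pulse("east" if err_x > 0 else "west", abs(err_x) * gain)
        mount.pulse("south" if err_y > 0 else "north", abs(err_y) * gain)
        time.sleep(interval_s)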

Telescope directional control with a microprocessor
Many modern telescope mounts come with attached, handheld control boxes for adjusting the aim of the telescope.  With the advent of "GoTo" systems, the control boxes contain microprocessors with positional information on a large number of celestial objects.  If a telescope mount with GoTo is accurately aligned, for instance a GEM with good polar alignment, a simple press of a few buttons can "slew" the telescope to view stars, planets, and DSOs.  "Hobbyist" microprocessor boards such as the Raspberry Pi have also been configured to control GoTo mounts, allowing for connections to popular software such as Stellarium for slewing.

Long exposure astrophotography requirements
1. Motorized telescope mount
2. Accurate polar alignment (or as accurate as possible)
3. Photographic camera with "Bulb" setting
4. Imaging telescope or long focal length camera lens
5. Guide scope or off-axis guider
6. Guide camera
7. Hardware for analyzing guide camera data and sending adjustments to mount
8. Power supply

Item 7 in the list has typically been taken care of with a laptop PC, which adds a significant cost if one is not already available.  A table would likely also be desired to keep the laptop off of the ground, and USB/serial cables of sufficient length would be needed to connect the computer to the guide camera and motorized mount.  To me, this looked like a hassle, but I still wanted to produce some great astrophotographs.

Not (conveniently) having a laptop, and not really interested in buying one nor leaving one outside for long periods of time, I wanted to build a different solution.  Could a microprocessor take the place of the computer?  Having toyed with an MSP430, I knew it could be configured to control the telescope mount, but it would not have the processing power or connectivity for analyzing CCD/webcam images.  Next up from that, I looked at the Arduino and came to a similar conclusion at the time.  The Raspberry Pi (RPi), on the other hand, had some distinct advantages.  The RPi can run Linux, and there are Linux software packages (specifically lin_guider, INDI, and open-phd-guiding) for controlling telescope mounts via autoguiding.  A thread at Cloudy Nights Telescope Reviews described a project to do just that.  While remote GoTo control was achieved with the RPi, the original author decided to move to an MK808 ARM® system since the RPi didn't have the desired processing power.  The MK808 system additionally needed a USB hub for a GPUSB guide port interface and webcam, although it had built-in WiFi.  A main downside to this solution, in my opinion, was the lack of expansion via GPIO and the extra guide port interface.

It was at this point that I learned about the Beaglebone Black (BBB).  While not dual-core like the MK808, I figured that the fast, 1GHz ARM® Cortex-A8 processor would be able to handle the necessary image processing for my project.  The BBB also has a large number of configurable pins available for all sorts of expansion.  I recognized that I could build an equivalent guide port interface onto the BBB, avoiding the GPUSB dongle.  However, since the BBB only has one USB host port, a separate USB hub would still be necessary to simultaneously use a WiFi adapter and webcam (at least until a USB or WiFi expansion cape becomes available for the BBB).  Moreover, the BBB can drive LCD and LED displays for visual feedback without needing an external monitor, which the MK808 requires.

And here starts my tale with the BBB.

Phase 1: Configuring the BBB
Phase 2: Building astronomy software
Phase 3: GPIO-based guide port interface
Phase 4: Autoguiding
Phase 5: Celebrate
Phase 6: Digital camera control
Phase 7: Cape with USB hub and guide port interface

Thursday, July 12, 2012

Review: Lexar Professional 16GB SDHC 133x Class 10 vs SanDisk Extreme 45MB/s 16GB SDHC

My Panasonic DMC-G3 is getting close to a year old now, and since I do time lapse photography, the Lexar Professional 16GB SDHC 133x Class 10 card I have installed sometimes fills up quickly.  This of course is an annoyance to my wife when she grabs the camera to get some photos of the kids.  So, I figured it was time to get a new memory card.  Sure, I could have bought a huge capacity SDXC card, but I tend to offload images to my computer frequently enough that a 16GB card suits me well (500+ JPEG+RAW images).

The big question: what to buy?  I typically take photos of fast moving kids, nature, the stars, and some time lapse work, so burst mode and an intervalometer are my friends.  The Tom's Hardware review I went to back in 2011 to initially choose the Lexar is still the most current roundup of SDHC cards as of this writing.  Of course, in a year, SDHC prices have fallen, especially for the speedier "ultra high speed" UHS-I models.  The Sandisk Extreme Pro 16GB UHS-I from the review is now marketed as just a Sandisk Extreme 45MB/s (SDSDX-016G-AFFP) and can be found online for about $18, oddly less expensive than my old Lexar.  In terms of camera speed, the metric I was interested in for the memory card was the sequential write speed, and the Sandisk was reportedly 1.74x faster for writes.  The SDHC comparison used benchmarks from the computer world, so how does this translate to cameras?

Test equipment:
Panasonic DMC-G3
Panasonic Leica Summilux DG 25mm f/1.4
AC/DC adapter
Intervalometer/remote
Timer
Lexar Professional 16GB SDHC 133x Class 10
SanDisk Extreme 45MB/s 16GB SDHC

Camera setup:
mode: manual
focus: manual
aperture: f/1.4
shutter speed: 1/125
white balance: sunlight
continuous shutter burst, high
format: 4:3, 4592x3448
Quality: JPEG (J), fine JPEG + RAW (J+R) or RAW (R)

Methods:
Memory card was formatted before each test.

For the first burst test, I held down the shutter button using the remote until the buffer filled and counted the number of photos taken.  I repeated this three times and took the average of measurements.


For the second burst test, I held down the shutter button using the remote until the buffer filled (7 shots), started the timer, and counted the number of photos taken in 1 minute.  I repeated this three times and took the average of measurements.


For the intervalometer test, I set the intervalometer to a 1, 2 or 3 second interval, unlimited shots.  I counted the number of pictures taken with the interval over a given length of time.  I repeated this three times and took the average of measurements.

Results:
sample JPEG image from burst test 2
Bookshelf. Originally 4.6 MB before rotation.


Burst test - capture rate (J):
JPEG image size: 4.5MB
Lexar - 9.0 images.  Sandisk - 13.0 images.

Burst test - post-buffer capacity (J+R):
JPEG image size: 4.6MB
RAW image size: 19.3MB
Lexar - 15.0 images/minute, 5.98 MB/s.  Sandisk - 23.0 images/minute, 9.13 MB/s.

Intervalometer test:
JPEG image size: 6.2MB
RAW image size: 19.4MB


Interval (s)  delta T (min)  Theoretical max shots  Lexar (J+R)  Sandisk (J+R)  Lexar (R)  Sandisk (R)
1             1              60                     20.3         26.0           27.0       35.0
2             1              30                     20.6         26.0           27.0       30.0
3             2              40                     35.0         40.0           40.0       40.0
4             2              30                     30.0         30.0           N/A        N/A

Discussion:
The Panasonic DMC-G3, like many cameras, has a "burst" mode to quickly capture a series of photos.  The photos are temporarily stored in an internal memory buffer before being written to media.  While holding down the shutter button in burst mode, the camera will take as many pictures as it can, as fast as it can, until the buffer is full.  While the shutter button remains depressed, subsequent photos are taken when enough memory has cleared from the buffer (i.e., a photo has been written to the media card).  Therefore, after the buffer is full, the rate at which photos are subsequently taken depends on the speed of data transfer to the memory card.
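
That buffer behavior can be modeled with a couple of lines of arithmetic.  The Python below is only a rough sketch: the burst rate and card speeds are illustrative guesses, while the 7-image buffer figure and image size come from the tests above.

# A small model of the buffer behaviour described above.  Burst rate and card
# speeds are illustrative guesses, not measured DMC-G3 specifications.
image_mb  = 23.9               # one fine JPEG + RAW pair
buffer_mb = 7 * image_mb       # the G3 buffered about 7 such pairs in my tests
burst_fps = 4.0                # shooting rate while the buffer still has room

for write_mb_s in (6.0, 9.0):  # slower card vs. faster card
    # Phase 1: the buffer fills because the camera shoots faster than the card writes.
    net_fill_rate = burst_fps * image_mb - write_mb_s          # MB/s accumulating
    seconds_to_fill = buffer_mb / net_fill_rate
    # Phase 2: once full, each new shot must wait for one image to drain to the card.
    sustained_per_minute = 60.0 * write_mb_s / image_mb
    print("%.1f MB/s card: buffer full after ~%.1f s, then ~%.1f shots/min"
          % (write_mb_s, seconds_to_fill, sustained_per_minute))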


In the first burst mode test, the Sandisk UHS-I card handily beat the Lexar, capturing 4 more images before the buffer was full.  Since the buffer capacity is constant, this means that the Sandisk was able to offload data from the buffer faster, thereby giving more room to acquire more photos in the buffer.  No big surprise.

In the second burst mode test, the Sandisk UHS-I card again beat the Lexar, capturing 8 more images per minute.  Since we were capturing photos in JPEG+RAW mode, the combined data size per image was 23.9 MB.  Based on the recorded transfer rates, 4 and 2.6 seconds are required to transfer each photo of this size for the Lexar and Sandisk, respectively.

In the interval test, values less than the theoretical maximum indicate that the buffer has filled and shots are missed/skipped while space is being made in the buffer for the next shot.  The interval data corroborates the second burst test.  In JPEG+RAW mode, even up to a 3 second interval, the Lexar cannot empty the buffer fast enough to prevent shots from being skipped, but the card does fine at 4 second intervals.  The Sandisk handles 3 and 4 second intervals without trouble, but below 3 second intervals the card cannot keep up with the data.  When in purely RAW quality mode, the Sandisk can handle 2 second intervals without filling the buffer, and the Lexar has improved to handle the 3 second intervals.  These results point to the Sandisk as having a 9.7-11.1 MB/s data transfer speed, while the Lexar is just 7.5-8.7 MB/s.
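
As a cross-check on those figures, an effective write speed can be estimated from the interval data itself: when the buffer is the bottleneck, shots taken x image size / elapsed time approximates the card's sustained throughput.  Which rows count as buffer-limited is my own reading of the table, so treat the Python below as a rough sanity check rather than the exact calculation behind the ranges above.

raw_mb, jpeg_mb = 19.4, 6.2    # image sizes from the intervalometer test

def throughput_mb_s(shots, mb_per_shot, minutes):
    # Average write speed implied by a buffer-limited interval run.
    return shots * mb_per_shot / (minutes * 60.0)

print("Lexar,   RAW,  1 s interval: %.1f MB/s" % throughput_mb_s(27.0, raw_mb, 1))
print("Lexar,   J+R,  3 s interval: %.1f MB/s" % throughput_mb_s(35.0, raw_mb + jpeg_mb, 2))
print("Sandisk, RAW,  1 s interval: %.1f MB/s" % throughput_mb_s(35.0, raw_mb, 1))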

We can learn a few things from these results.  First, the computer benchmark comparison between the cards translates moderately well to cameras (at least the DMC-G3).  In the burst test, the Sandisk card was able to acquire images 1.5x faster than the Lexar.  In the interval test, the Sandisk was about 1.3x faster in transferring data.  Second, the speed of the Lexar card rather than the camera hardware is limiting the data transfer rate.  This may also be true for the Sandisk, but we cannot tell until a faster card is tested or we use other methods.

As an aside, I was hoping against hope that the Sandisk UHS-I card would get me at least one more JPEG+RAW photo in the buffer before capacity was reached (i.e. that one more photo would be written to the card from the buffer) but that was not the case.  In J+R mode, I was still stuck with 8 images, but for Fine JPEG only mode, more images could be caught.  Of course, a longer shutter speed would increase the time available to write to the card - possibly a benefit for time lapse photography but a burst shutter mode is usually meant for capturing quick action.

Conclusion:
Definitely pick up a Class 10 UHS-I SDHC card versus a standard Class 10.  The SanDisk Extreme 45MB/s 16GB SDHC card did not live up to the 45MB/s label, but it was quite a bit faster than the older 133x Lexar and could handle short-interval time lapse photography better.  If you need to take a lot of consecutive shots in burst mode, you are better off switching off RAW mode.  Either card will handle short bursts in JPEG+RAW, but the UHS-I will allow for quicker, repeated bursts.