Friday, November 6, 2009

Some Thoughts on the Processor Module

Back in my post on Broad Strokes, I described how I plan to divide the camera into a sensor module and a processor module. Since then I've rambled on about the sensor, but haven't said much about the processor. So let me run through my current thinking on the processor module.

I've settled on using a Gumstix Overo Air as a processing core. Gumstix as a company is very open source friendly, and all their products already have Linux ported and ready to go. Also, they publish full schematics for their various accessory boards. I can use these as design hints for the actual accessory board that I will have to make for the camera.

The Overo Air includes the processor, 256Meg SDRAM, 256Meg flash, a microSD card slot, and Wireless (802.11g) support. These alone make the board a reasonably complete Linux server. Load Linux on this board and I've got a lot of stuff for free. The Overo also includes connectors that carry many important processor signals out to an accessory board. This is where the application specific functionality gets added, so that's where my development effort needs to go.

Networking

The Overo Air COM board includes wifi already, so it is possible to declare victory right there. However, I've decided to throw in ethernet support as well. Although I expect the wireless link to be sufficient in most cases (and it has the obvious advantage of being wireless), I think it prudent to include a wired interface too. Wifi in my home sometimes flakes out when the neighbors turn on their routers; and besides, the wired link should be faster. So I will include ethernet support for extra stability, speed and flexibility.

Some Gumstix boards use the LAN9221 ethernet interface chip. This is a pretty nice chip as it includes the MAC and PHY for 100baseT in a single package. There are also known Linux drivers for this chip. All I need to do is add a socket, some magnetics and a few other discretes, and it's done. In fact, sockets are available with the magnetics and LEDs built in.

USB

I'm actually interested in USB host support. I have found optical filter changers that use USB as the source of power and control. I would like to be able to control these devices from the processor module, so the processor will need at least one USB Type A socket. The Type-A socket is the kind that you find on your computer and into which you plug your peripherals. So definitely I will include USB host support. And conveniently, the Overo includes most of the electronics for the USB host; beyond the socket and a few discretes, the problem is largely solved out of the box.

What about USB peripheral support? That would mean including a Type-B socket (like what you would find on a USB peripheral) and would allow one to connect to a host computer via USB. I've decided against it. It wouldn't take much to add the support (the Overo has the necessary signals) but with two forms of networking already supported, I don't think a third is required. The only possible advantage I might see is the ability to mimic other USB cameras and thus gain compatibility with some existing software.

I'm still wavering on the idea of also adding a USB connection to act as a serial console for the processor. This would require some extra parts (probably an FTDI FT232 or the like) to transparently connect a standard UART to USB. This would be darned handy when initially bringing up the board, but useless otherwise. So I'm undecided. Perhaps the thing to do is design it in, but leave the parts unpopulated if I decide not to do it.

User Interface

Most existing astronomy cameras have no user interface. Or, rather, all the user interface is on a host computer that drives the camera and collects the images. But my camera system will have the capacity for some autonomy. It would be completely silly to expect to do image post processing in the camera module (that is a job for when you get home) but there are some telescope operations that are best done without a computer getting in the way. I'm thinking specifically of focus adjustment and mount/tracker alignment. There may be other things as well, if the user interface supports them.

A small LCD display with a touch screen seems perfect. A 480x272 display should be big enough to display a small star field (or large planet) along with some simple controls, or more complicated menus. A touch screen will allow all the input to be through drawn buttons. This should be plenty for many camera manipulations, and it is infinitely flexible, a highly desirable feature for open source projects.

So I will include the Samsung LTE430WQ LCD touch screen display. This is the device recommended by Gumstix (they sell them as accessories) so I'm confident that there will be available drivers. I'll be able to find reference schematics and other support, so not a lot of risk.

Motors

One of the tasks for the processor module is to provide optical tracking. To do this, it will need to be able to control the stepper motors of the mount, and for that it will need motor drivers. The Overo does not directly include stepper motor controls, but they should be relatively easy to implement in an accessory FPGA, along with some external circuitry to safely power the motors. This will have to be somewhat configurable in order to be able to handle different mounts. That can probably be handled by jumpers/pots to select voltages, and software to adjust step timings and calibrate angles.

To get full motion control, the controller will need to be able to drive at least 2 motors: R.A. and declination. I also plan on adding two more motor drivers, one for possible focus control, and another for random mechanical accessories that I haven't thought of yet.
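To make the "configurable" part concrete, here is a hedged C model of the kind of step sequencing and timing the FPGA (plus calibration software) would need to implement. The coil-state encoding, the function names, and the arcsec-per-step parameter are my own sketch, not a finished design; only the sidereal rate (15.041 arcsec/sec) is a known constant.

```c
#include <stdint.h>

/* Hypothetical model of full-step drive for a bipolar stepper, as the
 * accessory FPGA might sequence it.  Coil states are packed as bits
 * [A+ A- B+ B-]; the encoding is my own assumption, not from any mount. */
uint8_t step_pattern(int position)
{
    static const uint8_t seq[4] = {
        0xA, /* A+ B+ : 1010 */
        0x6, /* A- B+ : 0110 */
        0x5, /* A- B- : 0101 */
        0x9, /* A+ B- : 1001 */
    };
    return seq[((position % 4) + 4) % 4]; /* handles negative positions */
}

/* Step period needed to track at the sidereal rate (15.041 arcsec/sec),
 * given how many arc-seconds the mount moves per motor step.  That figure
 * depends on the gear train, hence the calibration mentioned above. */
double step_period_us(double arcsec_per_step)
{
    return 1.0e6 * arcsec_per_step / 15.041;
}
```

Stepping the position up or down walks the sequence forward or backward, which is how direction would be handled; the period function is where the software-adjustable step timing comes in.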

Power

And of course all of this needs power. The processor module will receive DC power from an external source, probably a rechargeable 12V battery. Besides taking power for itself, the processor will create the right voltages and power up the sensor module, the stepper motors and any USB peripherals. I will also need some sort of power control circuitry to allow a clean start/stop of the various units. This part could get interesting as I am no master of this subject. All I know for certain is that the processor module will take in battery power and turn it into clean, filtered supply voltages for the rest of the system.

Summary

The processor board clearly hosts a lot of the complexity of the Icarus Astronomy Camera. Fortunately, the Gumstix Overo bundles up a lot of the pieces. It provides the processor and the system basics, and gives me a head start on most of the peripherals. A lot of the remaining bits can be handled using off the shelf components, and many implementation questions can be answered by reference schematics. Incidentally, I had considered adding a kitchen sink, but concluded that functionality was already covered elsewhere.

Monday, October 19, 2009

Open Hardware Open Source

This is a slight digression from the technical details of the camera itself, and into matters of infrastructure. Specifically, I'd like to spell out my choices for development and sharing tools. A goal of this project is to share the source code (i.e. design files) openly; but rather than complete the project and release only the final result, I intend to make the design files available as I work on them. That obviously means that the tools I use to manipulate the source files need to be tools available to potential users. The easiest way to manage that is to use open source tools wherever possible. And for electronics design and embedded programming, open source tools get me pretty far.

Schematics and PCB Layouts

This suite of tools is going to be the first to be put to use, as I'm going to be drawing at least partial schematics before anything else. There are a few open source schematic editors available: KiCAD, Electric and GEDA/gschem are common examples. Personally, I choose GEDA/gschem because that's what I'm most comfortable with. I also know that the integration with PCB, the circuit board layout editor, is thorough and advanced. So I see no reason to change my habits.

Logic Design

The camera system as a whole will have at least 2 FPGA devices: one in the camera module and one in the processor module. The design that goes into these devices will be described in Verilog, with the designs simulated by Icarus Verilog and synthesized by the vendor tools for the chosen device. Since the chosen devices are likely to be Xilinx parts, the vendor synthesizer is likely to be WebPACK. (Altera does not support Linux with its web edition software. Big strike against Altera.)

Embedded Systems/Software

As I mentioned back in an earlier blog entry, the processor module will likely be built around a Gumstix single board computer. That right there implies the embedded operating system is Linux. The tools and techniques for programming embedded Linux are well known and mature. There is no need right now to pin it down any further than that.

Source Code Control/Distribution

The source code control system will need to carry schematics, printed circuit board layouts, Verilog design files, and C/C++ source code for the processor module, as well as assorted text files for various other purposes. The source code control should also act as a repository that helps make all these files available to interested users. There is a convenient hosting service at GitHub that I've used in the past for other purposes. This service (and others like it) offer storage and services for public "git" repositories. GitHub in particular has far better performance than sourceforge.net, and has been reliable for other projects that I've hosted there.

The git URL for the astrocam project is "git://github.com/steveicarus/astrocam.git". The quick way to get the current sources, then, is:
git clone git://github.com/steveicarus/astrocam.git
For now, that is where the source lives. There is very little there now, but that is where it will go. Text files within the distribution directory will describe the finer details of how to view/compile the source files.

Thursday, October 15, 2009

Moving Data Between Modules

In this post, I'll describe how I plan to link the camera sensor module to the processing module. Recall that I want the processor module and sensor module physically separate, with the sensor module compact and light weight; so naturally I'm after a cable link that can use common cables without bulky connectors. The link needs to be fast enough to carry video down from the sensor and commands up from the processor, and simple enough to use that a tiny FPGA with simple logic can operate the sensor end. (In fact, I also plan to program the FPGA on the sensor module through this link.) It should also be reliable and fault tolerant.

But first, how much data do I need to transfer, and how fast? The KAI-04022 sensor (see my previous blog entry) has a maximum pixel rate of 40MHz, and each pixel will be 12 bits. Add to each pixel a few bits for framing (at least 3 bits) and we get a target rate of around 640MBits/sec. The link from the sensor to the processor must be at least that fast, because there are no plans for any storage or significant buffering capacity on the sensor module. In the other direction, there is not much need to send data from the processor to the sensor. There will be an FPGA to load, but that's a one-time thing and speed is not critical. The only other data up are commands to configure the hardware and start captures. What I can say for sure at this point is that the uplink will demand much less than the downlink, but it will also turn out that an uplink the same speed as the downlink is easy and convenient. The video speed is therefore the driving consideration.
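The arithmetic is simple enough to check: 12 data bits plus at least 3 framing bits rounds up to a 16-bit word, clocked at the sensor's maximum pixel rate.

```c
/* Link budget sanity check: each pixel travels as a 16-bit word (12 data
 * bits plus framing bits) at the KAI-04022's 40 MHz maximum pixel rate. */
long link_bits_per_sec(long pixel_rate_hz, int bits_per_word)
{
    return pixel_rate_hz * bits_per_word;
}
```

40,000,000 words/sec times 16 bits/word gives the 640MBits/sec target rate quoted above.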

I've chosen the National Semiconductor DS92LV16 16-Bit Bus LVDS Serializer/Deserializer to handle the video link. This chip can take a stream of 16-Bit words clocked by a 25-80MHz clock, and serialize them onto a single LVDS pair. Route that pair over a cable, and a matching receiver chip can deserialize the signal back to a stream of 16-Bit words with the recovered clock. Each chip contains both a serializer and deserializer, so with two identical chips (one in the sensor module and one in the processor module) and 4 wires arranged as 2 twisted pairs I can have a full-duplex connection between the camera module and the processor module. Given the desired video rate, it makes sense to run this whole business at 40MHz, to get 40*16 = 640MBits/sec.

At the board level, the DS92LV16 is very easy to use. The digital interface is a pair of 16bit wide synchronous word streams; one stream from the remote, and the other to the remote. Super simple. There are also some link control pins. The LOCK* pin signals receiver lock and the SYNC pin controls transmitter resynchronization. Connect the LOCK* pin to the SYNC pin and the entire link initialization can be controlled remotely by the processor module. The data bits can also be connected directly to an FPGA in such a way that after link-up the processor can load the FPGA configuration stream from reset. This chip is simple enough to operate that it can be used to remotely bootstrap the sensor module as a whole. Nice!
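A hedged sketch of what the processor-side bring-up might look like: pulse our SYNC output so both ends resynchronize, then poll our LOCK* input until the deserializer reports lock. The struct and the gpio-style helpers are hypothetical stand-ins for whatever GPIO lines end up wired to SYNC and LOCK*; the simulated lock delay exists only to make the sketch runnable.

```c
#include <stdbool.h>

/* Hypothetical software view of one DS92LV16; polls_until_lock simulates
 * the time the receiver PLL needs before asserting lock. */
typedef struct {
    int sync;             /* simulated SYNC output level */
    int polls_until_lock; /* simulated lock delay */
} ds92lv16_t;

static void gpio_set_sync(ds92lv16_t *c, bool level) { c->sync = level; }

static bool gpio_lock_asserted(ds92lv16_t *c)
{
    /* Simulated LOCK*: comes up after a few polls once SYNC has pulsed. */
    if (c->polls_until_lock > 0) {
        c->polls_until_lock--;
        return false;
    }
    return true;
}

/* Pulse SYNC, then wait for receiver lock.  Returns true if the link
 * locked within max_polls polling attempts. */
bool link_bringup(ds92lv16_t *chip, int max_polls)
{
    gpio_set_sync(chip, true);  /* transmit SYNC patterns to the far end */
    gpio_set_sync(chip, false);
    for (int i = 0; i < max_polls; i++)
        if (gpio_lock_asserted(chip))
            return true;
    return false;
}
```

The real code would of course poke memory-mapped GPIO registers rather than a struct, but the shape of the sequence should survive.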

The Design Guide from National Semiconductor says that I can use CAT5 twisted pair ethernet cable with RJ-45 connectors to carry the LVDS pairs. I only need to operate at 40MHz, so cables as long as 16 meters can work. That is plenty. More likely, I'll be using 1 or 2 meter long cables. The RJ-45 connectors are compact and cheap, and the cable itself carries 8 conductors arranged as 4 pairs. I'll use 2 of the pairs to carry the high speed data in both directions. That is plenty of high speed data, so the remaining 2 pairs can be set aside for low speed signals. I'm thinking 1 of the extra pairs can carry a hard reset signal from the processor module to the sensor module. The remaining pair, for now, I'll leave unconnected. Later in the design process, I might find a better use for it. (No, I do not think it can carry power.)

So the plan is simple. Use 2 DS92LV16 chips, one in the processor module and one in the sensor module, and CAT5 ethernet cables to physically carry the links. Clock both directions at 40MHz to reduce the number of different frequencies running around and simplify the overall design. Wire the DS92LV16 in the sensor module so that the processor module can remotely initialize both link directions, and wire the FPGA on the sensor module to the DS92LV16 so that the processor module can use the link to configure the FPGA. And finally, use one of the pairs in the CAT5 cable to carry a hard reset signal. That takes care of the data link from the sensor module.

Tuesday, September 15, 2009

All Good Cameras Have a Sensor

I spent some time choosing a sensor early in the project, as much to test the feasibility of the project as anything. The CCD sensor will make or break the whole project, so as a sanity test, I decided to go ahead and do the research to choose a chip that fits the application.

The sensor of choice for this project needs to be usable in a camera system that does not include a mechanical shutter. That requirement alone narrows the field. I found that the sensors that go into your basic digital camera would not work because they need that mechanical shutter. Specifically, they need to be covered while the captured image is read out of the sensor. Yes, you can connect your DSLR camera to a telescope, but that camera body includes the mechanical shutter. But then again, newer digital cameras have video modes, so the right sensors for this project surely exist.

We also want our sensor to have an area of around 4Meg pixels. This eliminates most cheap webcam sensors. In fact, 4Meg is large even for HD video. That pretty much rules out webcam sensor chips. We are getting into the realm of scientific imaging sensors.

Fortunately, Kodak has a whole range of sensors, the Interline CCD family, that fits the bill nicely. Not only do these chips support readout while the sensor is uncovered, they have an electronic shutter that allows for precise control of integration time. The KAI-04022 in particular is the right size (4meg) and is square (2048x2048). That is a nice feature. (Square pegs fit better in round holes than do rectangular pegs.) These sensors are indeed intended for scientific imaging. Looking good.

The pixels of the KAI-04022 are 7.4um square, which gives approximately 1.5 arc-seconds per pixel and a total of about 52 arc-minutes field of view when put behind my 1000mm focal length telescope. The Dawes limit for this 120mm refractor is about 0.97 arc-seconds, so this strikes me as a good compromise. I can add a barlow lens if I really want to push the resolution. If I want to increase the field of view, I'll just need a shorter telescope. There are physically larger sensors in the series, but they are getting into the 8-10MPixel range. Maybe later.

The chip is able to read out pixels at around 40MPixels/second, which gives about 10 frames per second. This is plenty fast enough for a plausible focus mode. It's not fast enough for full-motion video, but that's fine for what I'm after.
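These back-of-the-envelope figures are easy to reproduce. The 206.265 constant is arc-seconds per radian (206265), pre-scaled so the pixel size can be given in microns and the focal length in millimeters. Just rough planning numbers, not a calibrated optical model.

```c
/* Plate scale in arc-seconds per pixel: pixel size (um) over focal
 * length (mm), times 206.265 (arcsec per radian, scaled for the units). */
double plate_scale_arcsec(double pixel_um, double focal_mm)
{
    return 206.265 * pixel_um / focal_mm;
}

/* Full field of view in arc-minutes along one side of a square sensor. */
double field_of_view_arcmin(double pixel_um, double focal_mm, int pixels)
{
    return plate_scale_arcsec(pixel_um, focal_mm) * pixels / 60.0;
}

/* Frame rate achievable at a given readout speed. */
double frames_per_sec(double pixel_rate_hz, int width, int height)
{
    return pixel_rate_hz / ((double)width * height);
}
```

For the KAI-04022 behind the 1000mm scope: about 1.5 arcsec/pixel, a 52 arc-minute field, and 40MPixels/sec over 2048x2048 pixels works out to roughly 9.5 frames per second.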

And the Kodak chips are well documented, with thorough datasheets, and evaluation boards with complete schematics. That's good.

Finally, I checked with Kodak sales, and the KAI-04022 is currently in production. There are color and grayscale versions of the chip, so I have some flexibility there, and for small quantities the chips themselves can be purchased directly from Kodak for something in the range of $1,000.00. A lot of money, but within the total budget for the camera as a whole, so that's what it will be.

By the way, it turns out that many store-bought astronomy cameras in the $3000+ price range use these or similar chips as well, so that gives me some extra confidence that I'm not too far off the mark.

Sunday, September 13, 2009

Broad Strokes

In my getting started post, I described the design goals for this camera. I can tell you right off that one particular requirement somewhat contradicts all the others. How does one shove all that functionality into a lightweight sensor module? That also raises the preliminary question: how light?

The answer: as light as possible. Remember, the sensor is going to attach to the telescope far from the balance point of the optical tube. Every gram, multiplied by its moment arm from the center of mass, becomes a burden on the tracking motors and can therefore be a real irritation in the field. When I attach a Nikon D80 camera body to my telescope (a Celestron XLT120 refractor) I practically have to remount the tube on the equatorial mount to re-balance it. Heck, even fiddling with the focus might throw the balance out, never mind adding a barlow lens. So I'm motivated to keep it light, light, light. (Can I get it as light as an eyepiece? Probably not, but that's an interesting impossible goal.)

Fortunately, the first major design decision follows rather naturally. The whole device can be broken into two major parts: the sensor module and the processing module. The sensor module need only contain the sensor chip and the bare minimum electronics to send the signal out through a cable to the processing module. The processing module can contain all the featurisms of the camera, and can be strapped to the telescope near its center of mass. The size and weight of the sensor module can be minimized by moving functionality into the processing module.

The sensor module needs to hold very little stuff. I can probably keep it down to only the CCD sensor, a few ASIC chips (including a tiny FPGA) and the discrete parts needed to handle some non-digital bits. It should be possible to keep the sensor circuit board down to little more than twice the area of the sensor itself. The sensor chip is going to be pretty large as semiconductor devices go, so this shouldn't be much of a challenge. Keeping the chip count low (and hopefully minimizing the power draw) should have the added advantage of limiting the heat sources in the sensor assembly. Sensors like cold.

The processing module need not have the size/weight limits, so I can go crazy there. And crazy is where I shall go. The basis can be a Gumstix running Linux. This gets me a whole host of possibilities for free, including networking software and hardware. To that I will attach an interface board that includes the electronics needed to communicate with the sensor board. I can also include on the interface board the components needed to manage power supplies (the sensor board will likely have tricky power needs) and even interface to stepper motors. This will allow me to interface not only with the sensor board, but also with the mount motors.

Actually, there will probably be a third box. The battery pack will likely be something off the shelf that generates some standard voltages (DC) that the processing module interface board can convert to the voltages needed to run the whole system. I like the idea of having Li-ion power for the whole system, but adding the weight of battery cells to the processor module really is a bit over the top. So batteries get a module of their own.

So that's the very broad stroke. A sensor module will contain the CCD sensor and minimal support electronics; a processor module will contain the interesting image processing hardware/software, the interfaces to the mount, and some power supply management; and a battery pack will supply the power. Next post, I will probably discuss my choice of sensor chip.

Saturday, September 12, 2009

Time to start a new project

I've come to the conclusion recently that I want a hobby project that is a little bit more physical and direct than the tools projects that I've been working on to date. And at the same time I want to combine some interests of mine: electronics, software engineering, photography and astronomy. So I'm going to design, build and use a digital astronomy camera.

And not just any camera. Here are some design goals:
  • Reasonable pixel count for the sensor (say, 4MPixels),
  • Lightweight sensor module,
  • Wireless and/or wired network operation,
  • Focus mode,
  • Support for auto-guiding, and
  • Open source all the design files, firmware and software.
This is a non-trivial set of goals. I expect that the project will run me on the order of $3000 in parts and countless hours of labor. That's in line with a similar device purchased off the shelf (except for the "labor" part) so not too wacky a plan. The project may at times suffer from "Creeping Featurism", but so long as it remains open source and the price doesn't escalate too far, so be it. No managers are telling me it has to be done now, or ever. It's for fun, so I'll have fun features, gosh darn it! But I have some justification for the major features.

Reasonable Pixel Count

Well, this goes without saying, doesn't it? But I'm specifically thinking of around 4MPixels. Smaller cameras are already available for embarrassingly cheap prices, especially if you get down to the "VGA" range. People have even used low-end webcams at this level. I would like my device to be worth building, so I really think it needs a few more pixels than that. But too many pixels and the device becomes crazy expensive. The sensors for large pixel counts get pricey pretty fast. Since the design is open source, one can alter the design and substitute a larger sensor if one has the financial means. The 4MPixel sensors seem to sit around the sweet spot where it is worth building the thing, but it doesn't break the budget.

Lightweight Sensor Module

The sensor module attaches to a telescope in the place of an eyepiece (or in addition to the eyepiece for eyepiece projection) so it is way out there beyond the center of mass. I know that my Nikon D80 on the end of a 1 meter long telescope is enough to throw the telescope balance way out of whack. A light sensor module (much lighter than a DSLR camera body) will be easier to mount.

Wireless and/or wired network access

The ideal is to be able to access the camera via a wireless network. Not too crazy, people can access their printers that way these days. Wires on a moving telescope are a particular irritation because of potential binding. (Also, I have another motive here. I have access to some larger telescopes, where a wire to the camera could get quite long. Think 20" refractor.)

Focus mode

This is a mode where the camera makes a live full resolution display of a small region of interest. One can use this display to adjust the telescope focus on a nearby star. Getting accurate focus is a persistent problem with astrophotography, and is downright exasperating with DSLR cameras.

Support for auto-guiding

This is where the camera not only collects the light to form the image, but also provides a sense of the motion of the image. This allows computer controlled mounts to be adjusted by software to keep the target centered. Getting this feature implies some sort of microprocessor that can do image processing, and also obviously implies that the sensor exposure be collected as a stack of images.
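To make the image-processing requirement concrete, here is a minimal sketch of one standard way guiding software senses motion: compute the intensity-weighted centroid of a guide star in a small region of interest, and watch how it drifts from frame to frame. This illustrates the technique in general; it is not the camera's actual firmware, which doesn't exist yet.

```c
#include <stddef.h>

/* Intensity-weighted centroid of a w-by-h region of 12-bit pixel values.
 * Guiding software would compare (cx, cy) between frames and feed the
 * drift into mount corrections. */
void centroid(const unsigned short *img, size_t w, size_t h,
              double *cx, double *cy)
{
    double sum = 0.0, sx = 0.0, sy = 0.0;
    for (size_t y = 0; y < h; y++) {
        for (size_t x = 0; x < w; x++) {
            double v = img[y * w + x];
            sum += v;
            sx += v * x;
            sy += v * y;
        }
    }
    /* Guard against an empty (all-dark) region. */
    *cx = sum > 0.0 ? sx / sum : 0.0;
    *cy = sum > 0.0 ? sy / sum : 0.0;
}
```

A real implementation would subtract the sky background first and window tightly around the star, but the sub-pixel precision of the weighted centroid is what makes software guiding corrections possible at all.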

Open source all the design files, firmware and software

Mechanical drawings, board schematics and layouts, FPGA designs, firmware and host software will use open source tools wherever reasonable versions exist, and the designs will be released under the GPL. Everything will be published, so that you can make a camera of your own.

Onward!

So that's my plan in general. I'm getting used to this blogging business, but I intend to try and write up my thinking and progress as a sort of journal for this project. Hopefully others will find this of interest.