Wednesday, November 4, 2015

Pixie - Bright Things Come in Small Packages


Exciting! A new product of my design just hit the market last week! World, meet Pixie - a 3W chainable smart LED Pixel. Kind of a long title... what does it mean?

LED Pixel: The Pixie is a color LED module, allowing an external controller to change its color and brightness dynamically.
Chainable: The module is designed so that you can chain many of them and control each one individually. If you know NeoPixels, this concept should be clear, but in case you don't, imagine you want to build a project that requires 50 individually controlled LEDs. Naively, you would need to power each one of them individually, then connect each one of them individually to a controller. This would require tons of wiring and many pins on the controller, each one possibly driven by a specialized peripheral, such as UART or PWM. In short, this is not practical. With the chainable Pixie, you connect the first LED's input pins to power and a single control pin (serial TX) on the controller. Then you connect the first LED's output pins (power, ground, data) to the input of the second LED, and so on. Each Pixie in the chain consumes its own data, then relays the rest of the data down the chain, so the controller can control each Pixie individually without being connected to each one.
3W: 3 Watts of power drive the LED, or 1 Watt for each Red, Green, Blue. This is a VERY bright LED. Compare to typical NeoPixels, which are around 0.2W.
Smart: Some really high-end features are available on each Pixie, such as gamma correction (8-bit to 16-bit) for super-smooth color gradients, over-heating protection (these things do get hot if left on at full blast for too long), and communication-loss protection.


The idea was born about a year back, when I started designing my Burning Man LED jacket. The jacket required 50 x 3W RGB LEDs and I couldn't find any off-the-shelf ones that include the required drive circuitry and are chainable. So I decided to make one myself. It was only a couple of months later that I recognized that this might actually be useful for other people, so I decided to try to turn it into a product. It took a couple of iterations to get it right, fixing some issues that I'll describe in detail below.

I chose to collaborate with Adafruit on this project, partly because I've already worked with SparkFun and SeeedStudio in the past, and partly because Adafruit has a great selection of LED-related products and a solid reputation in this field (and in general). I think I chose well; they were great to work with!

So... before this thing starts sounding too much like a sales pitch, let me share with you some of the design insights I've had while working on this module. As you'll see, sometimes the devil is in the details and what might appear to be a simple product initially may turn out to be quite a challenge. I've shared the same information as a Design Notes page on Adafruit. Keep in mind that as with my previous products, this one is also fully open-source software and hardware for maximum hackability and fun.

Microcontroller

Fairly early down the design path, it became clear that implementing all the features we wanted is a task best suited for a small microcontroller. We chose the 8-bit Microchip PIC12F1571, which had just about everything we could hope for in this application. It is small and cheap, works on 5V, has exactly 5 I/O pins (used for R, G, B, Din, Dout), an internal oscillator, a 16-bit, 3-channel PWM module, an on-die temperature sensor and more. Pretty amazing!
Programming the PIC12 is done through exposed pads featured on the circuit for that purpose (labeled rst/pgd/pgc). A cheap PIC programmer can be used, but the programming protocol is so simple that we implemented an Arduino library to do it for our testbed.
The possibilities with having an on-board microcontroller are endless! The Pixie can be reprogrammed for standalone operation, and the Din/Dout pins can be repurposed to support different protocols or to directly connect to buttons, etc. The Dout pin can even be used for analog input!
The existing firmware can be found in Pixie's GitHub repository.

Constant Current Driver

In order to provide a consistent level of illumination each of the R, G, B LEDs needs a constant current supply of about 350mA. We opted for linear regulation for its simplicity and low-cost, despite it being less efficient (and as such, dissipating more heat) than switching regulation.
The constant current circuit is pretty cool. Let's explain it by first considering the path of the current through the LED. The current comes from the 5V supply, through the LED, then through an nFET (Q1/3/5), then through a 1.74[Ohm] shunt resistor. The more resistive the FET becomes between its drain and source, the smaller the current flowing through this path. Now let's see how we can use this to our advantage.
The NPN transistors Q2/4/6 have a specified 0.6V drop between base and emitter when on. This means the voltage across their respective shunt resistors R1/3/5 will always be 0.6V. According to Ohm's law, the current through them will be 0.6[V]/1.74[Ohm], or about 345mA. Close enough. If the current were to decrease, the base voltage would decrease proportionally, resulting in the NPN having more resistance between its collector and emitter, thus causing a higher voltage on the collector (recognize the voltage divider formed between the NPNs and their R2/4/6 pull-ups?). But this would result in a higher voltage on the FET gate, causing it to become less resistive between source and drain and, as a result, a higher current through the LED. The same logic applies in the opposite direction. The conclusion is that this circuit self-regulates the LED current.
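To put numbers on it, the Ohm's-law arithmetic above fits in a few lines of Python (the values are the ones quoted in this section; treat it as a back-of-the-envelope check, not a circuit simulation):

```python
# Self-regulating linear current sink: the NPN clamps the shunt
# voltage at ~Vbe, so the LED current is fixed by Ohm's law.
V_BE = 0.6      # volts, typical base-emitter drop of the NPN (Q2/4/6)
R_SHUNT = 1.74  # ohms, shunt resistor (R1/3/5)

i_led = V_BE / R_SHUNT  # amperes through LED -> FET -> shunt
print(f"LED current: {i_led * 1000:.0f} mA")  # ~345 mA per channel
```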

Color and Brightness Control

Different colors are achieved via Pulse Width Modulation (PWM) on each of the R, G, B LEDs. The PIC has a built-in 3-channel, 16-bit PWM peripheral. This allows us to be fancy and do gamma correction, which means we do a non-linear mapping of the 8-bit color value we are commanded with to a high-resolution 16-bit color, resulting in a much more natural color gradient compared to a straight linear mapping. The PWM peripheral runs at about 500Hz. The generated signals switch the constant-current circuit described above.
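To illustrate what that gamma mapping looks like, here is a Python sketch; the exponent of 2.2 is a common choice and purely my assumption (the actual curve in the firmware may differ), but the shape is the point:

```python
GAMMA = 2.2  # assumed exponent; the firmware's actual curve may differ

# Map each 8-bit input value to a 16-bit PWM duty cycle along a
# gamma curve instead of a straight line.
gamma_lut = [round((x / 255) ** GAMMA * 65535) for x in range(256)]

# Endpoints are preserved...
assert gamma_lut[0] == 0 and gamma_lut[255] == 65535
# ...but the low end gets much finer steps than a linear map,
# which is what makes dim gradients look smooth.
print(gamma_lut[64])  # far below the 16448 a linear map would give
```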

Daisy-Chaining

Originally, we designed the Pixie to support the same serial protocol as the WS28x family (aka NeoPixel). It worked. However, this compatibility, originally considered a feature, was eventually deemed a drawback: the WS28x protocol doesn't easily lend itself to common microcontroller peripherals, and in most cases ends up being bit-banged by the controller. This requires relatively high CPU usage, making it hard to do other timing-sensitive operations at the same time, not to mention driving another chain on a different pin… Our solution: stick to the good ol' 115.2k asynchronous serial. Almost every microcontroller has a UART peripheral capable of easily generating this protocol without much CPU intervention. Many have more than one. Even a PC with a simple USB-serial dongle can do it fairly easily. Seems like a win! The only drawback we could see was that with the data rate being relatively low, we run into frame-rate / chain-length limitations (about a 60-long chain @ 50 frames/sec). However, at about 1[A] per Pixie, we concluded that typical chains would not be super-long.
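For the curious, the chain-length arithmetic works out like this (assuming standard 8N1 framing, i.e. 10 bits on the wire per byte, and the 1ms latch silence used by the protocol):

```python
BAUD = 115200
BITS_PER_BYTE = 10        # 8N1 framing: start + 8 data + stop bits
FPS = 50
LATCH_SILENCE_S = 0.001   # quiet period that triggers the latch
BYTES_PER_PIXIE = 3       # R, G, B

bytes_per_sec = BAUD / BITS_PER_BYTE               # 11520 bytes/s
data_time_per_frame = 1 / FPS - LATCH_SILENCE_S    # 19 ms of payload
max_chain = int(bytes_per_sec * data_time_per_frame / BYTES_PER_PIXIE)
print(max_chain)  # ~72 in theory; ~60 once you leave some margin
```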
The resulting serial protocol is very simple: the controller sends a byte string containing a color value for each LED in the chain, as follows:
<R1>, <G1>, <B1>, <R2>, <G2>, <B2>, …, <Rn>, <Gn>, <Bn>
Each of these is a byte representing the brightness of a single color of a single Pixie, where 0 is off, 255 is fully on, and everything in between is, er, everything in between. <R1>, <G1>, <B1> determine the color of the Pixie that is first in the chain, counting from the controller end. <R2>, <G2>, <B2> is the next one, etc.
Each Pixie listens on its Din pin for serial data. It will consume the first 3 bytes it sees and store them. It will then echo any subsequent bytes to its Dout pin (with less than a microsecond of latency). It will keep doing so until it detects a 1ms-long silence on Din. Then, it will immediately apply (latch) the color values it got and go back to listening for a new color. This yields a very effective mechanism for addressing LEDs individually and making sure they all latch at the same time.
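On the controller side, a frame is then just 3 bytes per Pixie in chain order, followed by an idle line. A minimal Python sketch (the function name and structure are mine, not from an official library):

```python
def pixie_frame(colors):
    """Build one frame for a Pixie chain.

    colors: list of (r, g, b) tuples, first element being the Pixie
    nearest the controller. Returns the byte string to send over a
    115200-baud serial port; after sending, keep the line idle for
    at least 1 ms so the whole chain latches.
    """
    frame = bytearray()
    for r, g, b in colors:
        frame += bytes((r, g, b))
    return bytes(frame)

# First Pixie full red, second one dim white:
payload = pixie_frame([(255, 0, 0), (16, 16, 16)])
assert payload == bytes([255, 0, 0, 16, 16, 16])
```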

Dealing With Supply Noise

Having a chain with multiple nodes constantly switching 1[A] loads is no small feat! Even an otherwise negligible wire resistance would result in noticeable voltage glitches. Not to mention wire inductance, which likes sudden current changes even less, and reacts with furious voltage surges unless dealt with. To make things worse, being a chain-oriented product, we’re expecting people to use rather long (several meters) wires, which inevitably have more resistance and inductance. And worse still, the on-chip temperature indicator that we really really wanted to use is extremely sensitive to the slightest of noise on the supply. Did we get your attention?
We took several measures to mitigate those issues. First, we made sure the holes for the supply wires are large enough to fit a 16AWG wire. Thicker wires = less resistive = good. The PCB traces connecting the input and output supply are super wide for the same reason.
Then, bulk capacitance! A large 22uF ceramic (hence, low-ESR) capacitor across the supply on every node absorbs voltage transients, especially those caused by wire inductance. That supply then gets further filtered by an R/C circuit comprising R7, R9 and C1, the latter being yet another 22uF ceramic; the relatively high resistor values ensure that C1 reacts very slowly to any change in the supply voltage. One thing to notice is that we've used two resistors and let the capacitor "float" in the middle. Why? Assuming fairly equal wires for 5V and GND, this setup keeps the filtered midpoint for all microcontrollers in the chain at the same Vcc/2 potential, even if their Vcc voltages differ as a result of wire resistance x high current. This makes it easier to discern the 0's from the 1's between consecutive nodes and thus get a reliable communication channel despite the supply noise. Otherwise, since the detection threshold is relative to the supply rails, we would have smaller error margins.
That simple circuit took a lot of tweaking to get right, but the result is very satisfactory noise-immunity characteristics.

Power Dissipation and Over-Temperature Protection

Despite LEDs being relatively efficient light sources, they still convert the vast majority of their consumed power into heat. Furthermore, our linear constant current circuit uses resistance (across the FET) to limit the current, so the excess power is also converted to heat. In total, at full steam (driving all 3 LEDs at 100% duty cycle) our little circuit dissipates around 5W! Keeping it from over-heating in this condition is unfeasible. We've allocated largish thermal planes on the PCB to improve cooling efficiency, but really, the intention is not to leave the LED full-on for more than a couple of seconds. Rather, working continuously at lower brightness is perfectly fine, as is generating fast, bright pulses periodically.
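For a feel of where those watts go: with a linear driver, whatever voltage the LED doesn't drop is burned in the FET and shunt. A rough Python budget, using assumed typical forward voltages (red ~2.2V, green/blue ~3.2V; these are illustrative guesses, not measurements from the board):

```python
V_SUPPLY = 5.0
I_CH = 0.345  # amperes per channel, set by the constant-current sink
V_F = {"red": 2.2, "green": 3.2, "blue": 3.2}  # assumed forward voltages

total = 3 * V_SUPPLY * I_CH      # power drawn from the 5V rail, all on
led = sum(V_F.values()) * I_CH   # power dropped in the LED dies
driver = total - led             # burned in the FETs and shunt resistors
print(f"total {total:.2f} W, LEDs {led:.2f} W, driver {driver:.2f} W")
```

So roughly half the "around 5W" never reaches the LEDs at all, which is the price of the simple linear regulation.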
But we wanted to make sure that the LEDs would not get dangerously hot even by accident. For that reason, the Pixie firmware uses the PIC's on-chip temperature indicator to estimate the board's temperature and will shut down the LED when it gets too hot (above about 70 degrees Celsius). It will automatically resume operation when it cools down. Getting this temperature indicator to work with reasonable precision was a challenge. First, the PIC's supply voltage needed to be extra clean (as described above), and second, to account for variability between different instances of the PIC, each and every unit goes through an automated calibration process during manufacturing and the temperature calibration data gets written to the PIC's flash memory.

Loss of Communication

Have you ever noticed how NeoPixels retain their color if their controller goes away? While this can be considered a convenient feature in some cases, it is an absolute no-go in a 3W LED case. Losing communications with the controller for any reason during a high-brightness pulse, that was otherwise intended to be very short, could potentially result in LEDs being left on for extended periods, consuming a lot of power and dissipating a lot of heat (limited by the over-temperature feature described above). Even more, what if we have a firmware bug (not that we ever have bugs, but just for the sake of the discussion ;D) causing the PIC to hang while its LED is on? That would be unacceptable.
Watchdog to the rescue! Remember we told you how awesome the PIC12 is? Another feature that is useful for us is the watchdog. It will reset the PIC if it doesn’t hear from our firmware that everything is fine for about 2 seconds. In turn, our firmware will only pet the watchdog every time it gets a valid color and successfully latches it. So unless we hear from our controller at least every 2 seconds (and in practice, better leave a little margin), the Pixie resets, causing the LED to turn off until told otherwise.
So unlike NeoPixels, if you want your Pixies to stay on for extended periods, even with no color change, you need to constantly remind them that you’re alive by sending them their favorite string.
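The keep-alive behavior is easy to model; this Python toy is only an illustration of the logic described above (the real thing is the PIC's hardware watchdog, petted by the firmware on every successful latch):

```python
WATCHDOG_TIMEOUT_MS = 2000  # reset if no valid frame for ~2 seconds

class PixieSim:
    """Toy model of the keep-alive behavior: the LED stays lit only
    while valid frames keep arriving."""
    def __init__(self):
        self.color = (0, 0, 0)
        self.ms_since_feed = 0

    def on_valid_frame(self, color):
        self.color = color        # latch the new color...
        self.ms_since_feed = 0    # ...and pet the watchdog

    def tick(self, ms):
        self.ms_since_feed += ms
        if self.ms_since_feed >= WATCHDOG_TIMEOUT_MS:
            self.color = (0, 0, 0)  # watchdog reset: LED goes dark

p = PixieSim()
p.on_valid_frame((255, 255, 255))
p.tick(1999)
assert p.color == (255, 255, 255)  # still within the timeout
p.tick(1)
assert p.color == (0, 0, 0)        # controller went silent: LED off
```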

Conclusion

Who would have guessed that designing a circuit having only about 20 simple parts could get so complicated? Certainly not us! We have done our best to provide a high quality, useful product and learned a lot along the way. We’re hoping you’ll like the result and enjoyed reading about some of the reasoning behind it.

I have some cool ideas for projects now that this is done. Stay tuned :)

Tuesday, May 19, 2015

Electro-Legos

My oldest son is almost seven years old. Seven is right about when I got my first Atari 800XL and learned to program. Today's kids have much cooler choices for computers. Personally, I think that the ability for programs to interact with the physical world makes a huge difference in motivation.
I recently backed a Kickstarter campaign for the Espruino Pico. While being "yet another" small microcontroller-based board, this one has quite a neat software stack: it runs a JavaScript interpreter and exposes a very extensive JavaScript API for controlling its I/O. I was thinking this would make a great tool for super-quick hacks that I sometimes need for testing purposes / one-off tooling. But it quickly dawned on me that this would be a perfect way to get my son into programming! The Espruino has an IDE that is a Chrome extension with Blockly support, allowing us to code without having to do much typing and without risking syntax errors (which is often a big source of frustration for beginners). I also found that it was fairly easy to add new blocks, so I quickly made a couple of useful ones (for example, there wasn't a stock block for controlling a servo).
And so we've been playing for a while, controlling the on-board LEDs with the on-board button, and later adding a servo motor. But blinking LEDs and moving a servo for their own sake gets boring after a while. Luckily, the tooth fairy has kindly provided us with a kit of accessories which greatly extends the possibilities! Since my son loves Legos (who doesn't?), my idea was to allow him to embed the I/O devices in his creations. Not a new concept, but mine was totally DIY, taking only a few hours of work in my garage and costing next to nothing.
The kit I made (shhhhh, don't tell it was me!) contains:

  • A micro servo, fitted with 2x2 Lego bricks on the top and bottom with correct alignment and a 1x4 brick on the arm. I used Gorilla glue for attaching the bricks to the servo, then "cast" the thing in hot glue for extra strength and cleaner shape. For alignment, I built a Lego fixture that held everything in place while the glue dries. For the casting, I found a cool trick you can do with hot glue: I pressed the workpiece with the molten glue against a piece of parchment paper placed on the desk. Once cool, the hot glue easily separates from the parchment paper, resulting in a really clean, flat surface.
  • A selection of LEDs. I found that the "Technic" Lego pieces have holes that are the perfect size for a 5mm LED. With a bit of pressure it goes all the way in and will never come out. I made a set including: a double-LED (red, green) piece, an auto-color-changing LED piece and an RGB LED piece. I soldered the appropriate (for 3.3V) current-limiting resistors to the back of all LEDs to make hook-up simpler for my son.
  • Two push-buttons. It turns out those through-hole push-buttons fit perfectly between the bumps of a 2x2 piece, when placed diagonally. A dab of Gorilla glue, and we're done.
I'm really happy with the result and if it works out well I can easily add more pieces.
Here's what happens when you let dad play with the kids' toys...





Tuesday, September 16, 2014

Fairydust


The Story of My Obnoxiously Bright LED Jacket

Burning Man. The most awesome experience ever. My story begins last year, which was the first time I went. Long story short, having been a little unprepared, I was just about the darkest creature on the playa (one of the most typical things about Burning Man is that at night everybody and everything is lit up beautifully). I had a head lamp to keep cyclists from running me over. Me of all people! So for this year I decided to over-compensate. Inspired by a super-bright LED that I saw at work, I decided I'm going to make an LED jacket that would be seen from a distance and will blind anyone that's standing close. Mission accomplished :)


This jacket has 48 super-high-brightness (150 lumen) variable-color LEDs integrated into it. The LED colors are created by a controller that is installed on the wrist. The controller has a little graphic display and a rotary push-button that allow selection from different available patterns. Some of those patterns react to sound in real time. The jacket is powered by two integrated battery packs weighing a total of about 1kg and lasting for over 12 hours of typical usage.
I designed and hand-built all the electronics, wrote the software, and sewed the actual jacket over a few months in my not-so-spare time.

The Attire

Extracting the pattern from an existing jacket
This is by far the most advanced sewing project I've ever done. It includes a 5-piece pattern (that I had reverse-engineered from an existing jacket) times two layers. The outer layer is black felt with fine glitter. The inner layer is black fleece. All the electronics go in between, so the jacket looks clean and feels smooth from both the outside and the inside. The jacket features a hood and a separating zipper. In order to allow washing of the jacket as well as easy repair of the electronics, everything can be detached and re-attached: the connections between the two layers, as well as between the electronics and the fabric, are all implemented using hook-and-loop fasteners (Velcro). Each LED is mounted on a small PCB that is attached to the inside of the outer fabric layer. Only the LED itself is visible from the outside, through a 5mm hole. The hole is reinforced (on the inside) with a fusible (iron-on) female Velcro, also serving for mounting the PCB, which has a male Velcro glued to it. The controller is attached in a similar fashion, with the display and control knob peeking through matching holes. Two battery packs are carried inside specialized flame-proof inner pockets that are attached near the waist.
Marking the fabric with a chalk
The cut pieces for one of the layers
Complete jacket, prior to mounting electronics
All in all, it was an amazing experience to design the jacket and make it. I'm very pleased with the result too. To my untrained eye it looks not much different in quality than a jacket that you might see in a store.

LED Modules

LED modules, top and bottom
It started with finding these amazing 3W RGB LEDs on eBay that cost $30 / 50 pieces. Sold! My idea was to make modules around them that allow them to be daisy-chained. Controlling a large number of LEDs would require a hell-of-a-lot of wiring if I had to run 4 wires (R, G, B, GND) from the controller to each LED! So the standard solution is to daisy-chain them. In other words, the first LED is connected to the controller and each LED is connected to the next one down the chain. Particularly popular with the Maker crowd are the WS281x series LEDs (more commonly known as "NeoPixel") that can be daisy-chained with as little as three wires by using a cleverly designed single-wire serial protocol. I decided to use the same technique, maybe even to make something that's compatible with the WS28x, only 50x more powerful.
Having considered different approaches (including using the WS2811 and boosting its current), I eventually opted for having a little MCU (PIC12F1571) on each module as well as 3 discrete constant current drivers. My goal was to optimize for cost (I made 50 of these, so every dime matters) while maintaining a nice feature set. I managed to reduce the cost to about $3/module at quantity 50, which is pretty good I think.
The boards were made at OSH Park (awesome!!!). Making 50 of something is not like making 1 or 2. Soldering by hand is prohibitively slow (around 20 small parts on each module). I'd been wanting to try poor-man's reflow soldering for a while now, so this was a good excuse. Big shout out to OSH Stencils (surprisingly unrelated to OSH Park), which makes cheap solder paste stencils in small quantities and fast. The stencil is a piece of Kapton film that is placed on top of the bare PCB and has precision-cut holes in all the places where solder should be applied. A simple jig makes alignment of the stencil with respect to the PCB fast and accurate. A single swipe of a squeegee puts solder paste on all the pads. Then the components were manually placed with tweezers. This is by far the most time-consuming step of the process. Once the components are all in place, several modules at a time can be put on a hot electric skillet. Shortly, the solder paste melts and all the components are cleanly soldered to their pads. That easy. 100% yield after the first batch, which easily revealed what I should look out for when applying the paste (answer: apply some pressure on the squeegee to keep the stencil tight against the board and the paste layer as thin as the stencil). The LEDs were the only part soldered by hand, on the opposite side of the board. I think I'm going to use this technique a lot going forward: the extra few bucks for the stencil are totally worth the time savings and superior quality of the product.
An LED module on the solder paste jig
The stencil is perfectly aligned with the PCB
Reflow soldering a batch of LED modules on a skillet
Check out the quality of the solder joints!
These 8-pin MCUs are awesome! At 50 cents a piece they feature 3 channels of 16-bit PWM, an on-die temperature sensor and 8MIPS with no external oscillator, over a wide voltage range. They implement the serial protocol in software (bit-banging), running at 800kbps. 3 of the MCU pins are each connected to a constant current driver (comprising a FET, a BJT and a shunt resistor), feeding current to each of the Red, Green and Blue LEDs. Constant current is important for getting a consistent color that is not affected by fluctuations of the supply voltage, variation between LEDs, etc. Since the LED modules dissipate a lot of heat (5W at full brightness), they can get dangerously hot if left on for long. Thus, I used the on-die temperature sensor to shut down if the temperature exceeds 60°C. Another protection feature handles cases of software malfunction or loss of communication with the controller. For that purpose I've used the hardware watchdog on the PIC, so that if we don't successfully decode a new command for 2 seconds, we reboot (and as a side effect turn the LED off).
First assembled LED module, instrumented for firmware development
The serial protocol is pretty simple and clever: "zeros" and "ones" are represented by pulses of different widths (durations) on the wire. Each node in the chain consumes the first 24-bits it receives to be its own color, then echoes any following bits to the output (connected to the next node in the chain). When the line is silent (neither zeros nor ones) for a certain duration (milliseconds), all nodes latch at once (i.e. set the color of the LED to the given command and start listening for a new command again). In order to meet the high bandwidth requirements (they translate to allowing long chains at good refresh rates), I wrote assembly interrupt handlers with carefully calculated timing.
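The consume-and-echo rule can be modeled in a few lines; this Python sketch abstracts away the pulse widths and the assembly-level timing and just shows how a bit stream gets distributed along the chain:

```python
def run_chain(n_nodes, bits):
    """Each node keeps the first 24 bits it sees and echoes the rest
    to the next node. Returns the 24-bit color word each node latches
    when the line goes silent."""
    latched = []
    stream = list(bits)
    for _ in range(n_nodes):
        own, stream = stream[:24], stream[24:]
        latched.append(own)  # all nodes latch together at the silence
    return latched

# Two nodes: the first gets all-ones, the second all-zeros.
frame = [1] * 24 + [0] * 24
colors = run_chain(2, frame)
assert colors[0] == [1] * 24 and colors[1] == [0] * 24
```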
Once I had everything ready and tested, I wired 48 modules in a long chain using silicon-insulated wires. These wires are very flexible and durable.
Many LEDs
First test of daisy-chaining 
Every little task becomes time consuming at large quantities
Making holes for the LEDs using a soldering iron
The module is attached to the inside with only the LED peeking through the hole
Outside close-up view of an LED
All LED modules attached (jacket is inside out) prior to wiring
Outside view with all the LEDs attached

Power System

Buck regulator PCB
The LED modules are all designed for 5V operation. Each LED module takes around 1A when fully on. I designed the power system so that it is able to deliver up to 5A continuously. In other words, I had to constrain my software so that the maximum current is not exceeded. I designed a buck regulator module around the TPS54560 chip. One of the nice features of this chip is that it is pretty easily configurable for different input and output voltages, current limit, etc. So the module I designed and the spare board I now have can easily be used on my next project. Power is supplied by two 2S LiPo packs with 5Ah capacity, for a total of about 75Wh. Under normal usage this can easily last all night on a charge. The two packs were connected in series to form a 4S pack with a nominal voltage of 14.8V. I've also joined the balance connectors of the packs, so the battery interface looks exactly like an off-the-shelf 4S pack despite the fact that it is split in two (for ergonomic reasons). I also hooked up a cheap off-the-shelf LiPo voltage indicator / buzzer alarm to be able to constantly monitor the cell voltages and protect against over-discharge.
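Sanity-checking the pack arithmetic in Python (the 12-hour runtime is the figure quoted earlier; the average draw it implies is my own derivation, not a measurement):

```python
CELLS = 4          # two 2S packs in series -> 4S
V_CELL_NOM = 3.7   # nominal LiPo cell voltage
CAPACITY_AH = 5.0
RUNTIME_H = 12.0   # "over 12 hours of typical usage"

energy_wh = CELLS * V_CELL_NOM * CAPACITY_AH  # 74 Wh, i.e. "about 75Wh"
avg_draw_w = energy_wh / RUNTIME_H            # average power that implies
print(round(energy_wh), round(avg_draw_w, 1))
```

So typical usage averages only around 6W, a small fraction of the 25W the 5A rail can deliver, which is consistent with the patterns being mostly dim or pulsed.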
Buck module, hardened and ready to go 
For safety reasons, I mounted the batteries inside safety LiPo pouches made out of thick flame-proof silicone fabric. The pouches were Velcro'ed to the inside of the jacket so that if worst comes to worst (in extreme cases LiPo batteries can ignite), I can quickly remove them from my body.

Controller



Controller mounted
Just for the hell of it, I wanted the controller to be somewhat fancy. I resisted my initial urge to make an IOIO-based controller, because I wanted the jacket to be 100% standalone. I did, however, use the same MCU that's used on the IOIO, both because it's awesome and because I'm very familiar with it and I have many of them lying around the house. The controller board features:
  • A PIC24FJ256DA206 MCU.
  • A 128x32 pixel, white OLED display module that I got from eBay on the cheap. It talks to the MCU over SPI.
  • A rotary + push encoder (knob that can be either turned or pushed) for input.
  • A tiny microphone with some analog magic circuitry that extracts the audio envelope on a logarithmic scale. The output is fed to an analog input.
  • A 3-axis accelerometer. It talks to the MCU over I²C.
  • An output for the LED chain, which also powers the controller.
  • Several extension ports (digital, analog, I²C, UART, etc.) - I have some ideas for improvements :)
Being all spoiled from the LED modules, I've ordered a stencil for that one too, even though I only needed one. It was up and running in no time.
I chose to use an RTOS to facilitate easy authoring of the software while maintaining a very low power consumption when the LEDs are not running. FreeRTOS was a natural choice for me, as it is free and also very familiar to me. There are two main tasks: the higher priority task reads the microphone and controls the LEDs at 50 frames per second; the lower priority task handles the UI (display + knob). The UI presents a list of available LED animations that can be scrolled through by turning the knob and activated by pressing it. I developed a framework that makes it very easy to author new animations, so it was then easy to quickly make about 20 such programs with different feels. Most of them are random in nature because I like this style.
An animation gets a reading of the audio level and acceleration on every frame in case it wants to react to any of them. Due to lack of time, I ended up not implementing the accelerometer feature.
Controller front (OLED, knob, microphone)
Controller back

Aftermath

Off to Black Rock City, NV (where Burning Man lives)! The jacket was a huge success. A lot of people came over to compliment, ask questions and take pictures. One of the first people who saw me passing by with my "Mr. Pink" animation (fast random blinks of shades of red / pink / white) said: "Hey! This is so cool! You look like you're leaving a trail of fairy dust behind you". From that moment on I shall be known on the Playa as Fairydust!
As with every project, not everything went perfectly. The LED modules are not as robust to their noisy power lines as I had hoped, so every once in a while one of them might suddenly have the wrong color on for a split second. However, I think the general idea was good. I'm considering making another revision on those modules and perhaps offer them as a product (think super-bright drop-in replacement for NeoPixels). If you're interested, drop me a comment below and it will increase the likelihood that I'll actually get myself to do this.
And last, I've now started working on a scaled-down version (lower power, simpler, lower cost) to use as magician costumes for my kids for Halloween.

Tuesday, November 5, 2013

How I Became an Artist

The IOIO Plotter strikes again!

I'll start with a story and move on to some technical detail on my cool new image processing algorithm.
A couple of weeks ago I was presenting at the East Bay Maker Faire with my friend Al. I was showing off my plotter among other things, taking pictures of passers-by, making their edge-detection portraits and giving them out as gifts. At some point, a woman came in, introduced herself as a curator for an art and technology festival called Codame and invited us to come and present our works there.

Frankly, at this point I had no idea what Codame was and pretty much decided not to go. Fortunately, I have friends that are not as stupid as me, and a couple of days later Al drew my attention to the fact that some reading on codame.com and our favorite search engine suggested that this is going to be pretty badass.

OK, so we both said yes, and now we were a week from D-day when I decided that presenting the same thing again would be boring, and anyway edge detection is not artistic enough... I had a rough vision in my head of some random scribbles that would make sense when viewed from a distance, but didn't quite have an idea of how to achieve that. At this point I became mildly obsessed with the challenge, as well as mildly stressed by having to present the thing by the end of the week, so I gave up sleep and started trying out my rusty arsenal of computer vision and image processing tools from days long forgotten.

I got lucky! Two nights later, and there it was. Just like I imagined it!
If you step away from your monitor a few meters (or feet if you insist) the right picture will start looking like the left one. And if you zoom in, it looks like nothing really. It's all, by the way, a single long stroke, made out of straight line segments.

From this point, all that was left was to not sleep for a few more nights and integrate this algorithm into the Android app that controls the plotter. I barely made it, as usual. The first end-to-end plot saw the light of day on the morning of the opening.

The event was amazing! Seriously, anyone reading this blog who happens to be around San Francisco should check it out next time. The Codame organization is committed to giving a stage and other kinds of help to works of art that conventional art institutions refuse to acknowledge: art made by code, by circuits, by geeks and by engineers. The exhibits included a lot of video art, computer games and mind-boggling installations, as well as tech-heavy performance art (music and dance).

My plotter worked non-stop and I was on a mission to cover an entire wall with its pictures. By the end of two evenings, covered it was! I even gave some away. The reactions were very positive - watching people stand in front of this piece fascinated and smiling joyfully was worth all the hard work. The first time one of them approached me and asked "are you the artist?" I looked behind me to see who he was talking to. "No", I said, "I'm an engineer". But after the tenth time, I just smiled and said "yes" :)


Now for the geeky part of our show.

How Does It Work?

After struggling with complicated modifications of the Hough transform and whatnot, the final algorithm is surprisingly simple. So much so that I wouldn't be surprised if somebody has already invented it.
The main idea is:
  • At every step (each generating one segment), generate N (say, 1000) random candidate lines within the frame. Each line can have both of its endpoints random, or one endpoint random and the other forced to where the previous segment ended, in order to generate a one-stroke drawing like the one above.
  • From those N lines, choose the one for which the average darkness it covers in the input image is the greatest.
  • Subtract this line from the input image.
  • Repeat until the input image becomes sufficiently light.
A small but important refinement is now required. If we do just that, the algorithm will hopelessly try to cover every non-white pixel with black, and in particular will have an annoying tendency to get stuck in dark areas for too long and darken them too much. To fix that, instead of treating a stroke as a black line of thickness 1, let's treat it as a 20% gray line of thickness 5 (or any other such reciprocal pair). The intuition is that when viewed from afar (or down-sampled), a thin black line slightly darkens its entire environment rather than covering a very small area in black.

In practice, an efficient way to implement this is simply to work on an input image resized to 20% and draw 20% gray lines on it. This saves a lot of computation too. The coordinates of the actual output lines can still be of arbitrarily fine resolution, limited only by the resolution of the output medium.
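The whole loop fits in a few lines. Here is a minimal, pure-Python sketch of the idea (the real app is an Android/Java implementation; all names and parameters here are my own illustration), operating directly on the downscaled grayscale image as suggested above:

```python
import random

def line_pixels(x0, y0, x1, y1):
    """Integer pixels along a segment, by simple linear interpolation."""
    n = max(abs(x1 - x0), abs(y1 - y0), 1)
    return [(round(x0 + (x1 - x0) * i / n), round(y0 + (y1 - y0) * i / n))
            for i in range(n + 1)]

def scribble(img, n_candidates=1000, gray=0.2, light_enough=0.05,
             max_steps=10000):
    """img: 2D list of darkness values in [0, 1] (1.0 = black), already
    downscaled to ~20%. Mutates img; returns one long stroke as a list of
    points, every consecutive pair being a line segment."""
    h, w = len(img), len(img[0])

    def avg_darkness(seg):
        pts = line_pixels(*seg)
        return sum(img[y][x] for x, y in pts) / len(pts)

    stroke = [(random.randrange(w), random.randrange(h))]
    for _ in range(max_steps):
        if max(max(row) for row in img) <= light_enough:
            break  # input image is sufficiently light: done
        x0, y0 = stroke[-1]  # chain onto the previous endpoint
        candidates = [(x0, y0, random.randrange(w), random.randrange(h))
                      for _ in range(n_candidates)]
        best = max(candidates, key=avg_darkness)  # darkest candidate wins
        for x, y in line_pixels(*best):
            img[y][x] = max(0.0, img[y][x] - gray)  # subtract a gray line
        stroke.append(best[2:])
    return stroke
```

The `max_steps` guard is my own addition so the sketch always terminates; the actual stopping criterion is the "sufficiently light" test from the list above.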

Source code, you say? Here: https://github.com/ytai/ScribblerDemo

Sunday, October 20, 2013

Who Wants an IOIO-OTG for $30?



No, this picture is not a negative: it's the new IOIO-OTG from SeeedStudio!

What's new?


  • It's black.
  • It's selling for $30, including male and female headers and a USB-OTG cable.
Otherwise it's exactly the same.

Why?


Ever since I set out to develop the IOIO-OTG, my goal has been an end-user price of $30. For various reasons that didn't quite work out on the first attempt, but I kept pursuing this goal, believing that this is the right price for it. I got strong recommendations for SeeedStudio of Shenzhen, China, and soon contacted them with an offer. I was not disappointed! They were very enthusiastic to collaborate and very professional. Working together with them, I closely inspected their manufacturing and testing procedures, and I'm happy to report that the quality of their boards meets my highest standards. I feel that they are very serious and committed to quality and user happiness.
We decided to make it black in order to differentiate it from the existing boards and get people's attention. I also personally love the way it came out looking.
So a big shout-out to those guys for all their hard and excellent work, and I encourage you to support them and the healthy competition they are bringing.

What's next for IOIO?

I'm now working full-steam on the next software release. It will include:
  • The motor control library mentioned on my previous post. Took me a while to get it from "working" to "polished" state.
  • The IOIO-OTG can also work as a USB slave with Android devices that support USB-OTG. That means that the IOIO can be powered by the Android without needing an external power source. This is a contribution by my friends Misha and Nadir (thanks!).
  • Some bug fixes and cleanup.
I'm also planning a small hardware revision that will add some improved protection against input voltage surges. My friends from Seeed drew my attention to a subtle problem with the current design which I'm intending to fix with that. Stay tuned!

Tuesday, May 14, 2013

IOIO Plotter and the Motor Control Library


This is the story of how I built my Android/IOIO/OpenCV-based interactive plotter, as well as the soon-to-be-released motor control library for the IOIO. I'll tell the story of the plotter from the bottom up, which is the order in which I've designed and built it. But before diving into the geeky technical details, a few words about the final product.
It started with me looking for a cool example application for the motor control library I was about to develop for the IOIO, one that I could also present at Maker Faire. I wanted it to demonstrate the ability to control a multi-axis machine in a simple, reliable and precise way using high-level Java code running on an Android device or a PC. I finally decided on the plotter, as it seemed like a fun thing to make as well as an interesting piece to exhibit. The plotter is based on a very elegant design, which unfortunately isn't my idea (just Google "whiteboard plotter"). Since building yet another one is boring, I wanted to make this one with a twist, taking advantage of the fact that I can easily put an Android device in the system. So my plotter is interactive: you take a picture with it, and it immediately converts the picture to paths (via edge detection) and plots them.
Now we're ready for the geeky stuff :)

Ultra-Productive Development Environment

My friend and colleague Anton Staaf introduced me to a really cool development strategy he's been using on his projects. He developed a nice little "shell" library, which is essentially a simplified, very portable command shell, to which it is very easy to add new commands. I borrowed his code and ported it to the PIC24, using the USB CDC as the underlying serial link. Shortly after, I had an IOIO board that I could plug into the USB port of my PC, open a "screen" session to, and run commands on to exercise whichever new feature I was working on. Combined with the device-mode bootloader of the IOIO-OTG, I had super fast code-compile-flash-test cycles. It was really fun to work like that. I'm hoping to eventually release this shell app for the IOIO for others to hack with - it's totally awesome!
Another tool that served me really well here is the XProtoLab from Gabotronics. Since this library is all about generating perfectly synchronized, precisely-timed signals, I needed a way to validate the output. I don't own a scope or a logic analyzer, but the XProtoLab is a tiny, beautifully designed oscilloscope, logic analyzer and signal generator. I bought mine second-hand for $30 a while back, and it is worth its weight in gold. Highly recommended!
Now I was in a good position to start playing around with ideas for how my library would work, which I eventually ported into the proper IOIO app firmware.

H-A-R-D-R-E-A-L-T-I-M-E

OK, so I develop realtime software for a living, but I never before got to this level of realtime...
I initially drafted the following design principles:

  • The IOIO will play the role of a sequencer. It will have a buffer of "cues", which keeps getting filled by the client (Android or PC). Those cues are essentially "over this period of time, I want this channel to do this and that channel to do that". A channel can be for example a stepper motor pulse train, a PWM signal for a DC motor or a servo or a digital output pin.
  • As usual with the IOIO library, I want this to happen with as little as possible CPU intervention, so that it can run in parallel with all the other IOIO functions. So I decided I'll use the output compare modules for generating all the pulse signals and a timer for timing the cues.
  • But stepper signals are slightly tricky, as you want a precise number of steps over the period of the cue. Not one too many, not one too few. Ever. One might generate those pulses one by one, but that would place a lot of burden on the CPU and would be very difficult to time correctly when multiple channels are involved. So I decided I'd just let the pulse trains run freely during a cue, and be really, really precise about stopping them at the right time, before they generate an extra pulse.
So, OK, one might think that setting a timer to interrupt at the highest possible priority should suffice, but it really doesn't, unless you are willing to be way too conservative about how close you let the last step of each cue get to the cue boundary. That would put a very serious limitation on the maximum pulse rate, which I didn't want to accept.
It took me some time to convince myself that C can't cut it. You just can't really know how many cycles your code is going to take. And even if you could, this could change the next time you upgrade your compiler. So I resorted to assembly. It was actually a lot more fun than I expected, after not having done it for years. And the end result is something I'm really, really happy with! Cycle-accurate timing for everything. I know exactly when each instruction runs with respect to the output waveforms, and everything is super fast, so it doesn't place any significant overhead on the CPU. You can generate signals of up to about 30 kHz, up to nine of them in parallel, in addition to twenty-something binary outputs (e.g. for controlling stepper direction, solenoids, or LEDs), without ever missing a step or jittering on the timing by even one CPU cycle.
Of course, it took me about 10 times longer to complete than it would have in C...
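To make the cue/buffer idea above concrete, here is a toy Python model of the sequencer. This is purely illustrative - the real thing is PIC24 firmware and hand-tuned assembly, and none of these names come from the actual IOIO API:

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Cue:
    """One sequencer cue: 'over this period of time, run these channels'."""
    duration: float        # seconds this cue lasts
    pulse_rates: dict      # channel -> free-running pulse rate in Hz
    digital_levels: dict = field(default_factory=dict)  # pin -> 0/1

class Sequencer:
    """The client keeps pushing cues into a buffer; the firmware pops and
    executes them in order."""
    def __init__(self):
        self.buffer = deque()

    def push(self, cue):
        self.buffer.append(cue)

    def run_next(self):
        cue = self.buffer.popleft()
        # Let each pulse train run freely for the cue's duration. The whole
        # trick in the firmware is stopping each train at exactly the right
        # moment, so a rate-f train over time t emits exactly round(f * t)
        # pulses: not one too many, not one too few.
        return {ch: round(rate * cue.duration)
                for ch, rate in cue.pulse_rates.items()}
```

For example, a 0.5-second cue with channel 0 running at 1000 Hz must emit exactly 500 pulses; getting that guarantee without burdening the CPU is what the assembly below-the-hood is for.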
Here's a little (underground) video I shared a while ago, demonstrating some early stages of the motor control library:

Protocol Glue

From this point, bootstrapping this library to the IOIO protocol was pretty straightforward, and shortly after I had a semi-baked Java API for feeding the sequencer. The current API is pretty bare-metal, and I'm thinking about providing it with some higher level abstraction layer when I release it, or at least a decent set of handy utilities. For the time being, I developed some utilities that are specific to stepper motors, which is what I needed for the plotter.

Plotter Design

Finally came the time to make the plotter. The design is pretty simple: two pulleys driven by stepper motors controlling the length of two strings. The two strings are joined in one point attached to the carriage. A rather simple geometric transformation can tell you how long you want each of the strings to be in order to get the carriage to a given point on the sheet in XY coordinates.

On the carriage is a sharpie. In order to minimize the effect of the carriage swinging on the position of the sharpie's tip, the hanging point, where the strings connect, needs to be as close as possible to the tip. This way, even if the carriage tilts a little, the tip won't deviate by much.
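For the curious, the transformation mentioned above is just two applications of Pythagoras. A sketch in Python (coordinates and names are my own; the pulleys are taken to be at the top corners, with y measured downward):

```python
import math

def string_lengths(x, y, width):
    """Lengths of the left and right strings that place the pen at (x, y),
    with the left pulley at (0, 0) and the right pulley at (width, 0)."""
    return math.hypot(x, y), math.hypot(width - x, y)

def pen_position(left, right, width):
    """The inverse: recover (x, y) from the two string lengths
    (law of cosines for x, then Pythagoras for y)."""
    x = (width ** 2 + left ** 2 - right ** 2) / (2 * width)
    y = math.sqrt(max(left ** 2 - x ** 2, 0.0))
    return x, y
```

With the pulleys 100 units apart, placing the pen at (30, 40) requires a 50-unit left string, and the inverse transform recovers the same point, so the mapping is invertible everywhere below the pulley line.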


I found a neat little trick for the carriage design: when drawing, the carriage is supported by two ball casters and the tip of the sharpie. This way the sharpie stays perpendicular to the sheet. In order to raise the sharpie (for moving the carriage without plotting), a third ball caster is mounted on an actuated linear bearing, driven by a small hobby servo. The servo can push this third caster into the sheet, so that it is "deeper" than the tip of the sharpie, thus causing the carriage to be supported on the three casters and the sharpie tip to float.
Oh, and remember that I just used the fancy term "linear bearing", causing you to imagine some precision machined awesomeness? Think more like a piece of a shampoo bottle pump in this case :) What can I say, I just hate waiting for parts to arrive or worry about how to fabricate the perfect bearing, when all I really care about is getting this app up and running...


Another little trick I found is for mounting the main assembly (with the motors and pulleys) on the easel: I used some square pieces of plastic, originally intended for hot cups, and bent them with the hot air gun I used for soldering. It ended up pretty cool, and I can easily unmount the thing for transportation.
And of course, the IOIO is mounted on the front, and a pair of DRV8825 stepper drivers behind it to help with the heavy-lifting. I dialed them to about 1A per motor, which seems to work well.


Now, Plot!

There's this point in a project when the hardware is pretty much done and the rest is "just a simple matter of software", as my friend Ed likes to joke. At this point, having all these layers of infrastructure at my disposal, I just couldn't wait to draw something and see if it was any good. For all I knew, there could have been a million hidden problems. I coded like crazy for a couple of days, until finally plotting a first circle! Well, let's call it circle-ish, since it did uncover some small mechanical issues that needed addressing. But overall, the entire stack of electronics and software worked flawlessly! I wrote some basic utilities to do the coordinate transformations and to expose a high-level Plotter API, which takes an arbitrary list of paths, each represented by x(t), y(t), and plots them.
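As a sketch of what such an API boils down to (the actual Plotter API is Java, and the names and sampling density here are my own illustration): each path, given as a pair of functions x(t), y(t), is sampled into short line segments, and each sample is converted into a pair of target string lengths for the steppers.

```python
import math

def plot_path(x_of_t, y_of_t, width, n_samples=100):
    """Sample a parametric path (x(t), y(t)), t in [0, 1], into the
    sequence of (left, right) string lengths the steppers must track.
    Pulleys assumed at (0, 0) and (width, 0), y measured downward."""
    targets = []
    for i in range(n_samples + 1):
        t = i / n_samples
        x, y = x_of_t(t), y_of_t(t)
        targets.append((math.hypot(x, y),           # left string length
                        math.hypot(width - x, y)))  # right string length
    return targets

# A circle of radius 10 centered at (50, 50), pulleys 100 units apart:
circle = plot_path(lambda t: 50 + 10 * math.cos(2 * math.pi * t),
                   lambda t: 50 + 10 * math.sin(2 * math.pi * t),
                   width=100)
```

Dense enough sampling makes the straight-segment approximation invisible on paper, which is why the plotted "circle-ish" looks round from any reasonable distance.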

Pimp Your Plotter



Since this piece is to be presented, some aesthetics can't hurt. I coincidentally noticed that the pulleys look a lot like yo-yo's and decided to have fun with this concept. My friend Ali took me for an awesome tour in his workshop and let me cut some pieces of MDF on his laser cutter. It's funny that the only precision part in this entire system is the decoration :D

Finally: The App Layer

So many layers on layers on layers: I finally had a working plotter, and it was time for the application. It's been a long time since I wanted to get my hands dirty with OpenCV for Android. It is 100% pure awesome! It makes image manipulation and standard computer vision algorithms really simple to implement. I developed a simple GUI that allows you to pick a picture from your Android gallery (local storage, Web albums, or an image captured from the camera) and then interactively tweak the parameters of a Canny edge detector. The detected edges are displayed in red against a grayscale image, so you can easily see what result you are going to get and keep moving the sliders until you're happy.

Then I found that an essential step is thinning the resulting edges; otherwise a single edge may actually be two pixels thick at times, which makes it really annoying to convert into paths. I used the algorithm proposed here (thanks, guys!) and implemented it pretty easily with OpenCV. Last, I needed to trace the edges into paths. I did this step pretty dumbly, because I started to run out of time and juice. In the future, it could be pretty nice to:
  • Smooth the edges rather than just connect pixels coordinates with straight lines. A nice algorithm to borrow ideas from is here.
  • Be clever about the ordering of paths within an image, so as to minimize the total travel of the carriage. In the current implementation, the ordering is pretty stupid, causing the carriage to move from side to side way more than necessary, which significantly slows down the process.
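The path-ordering improvement suggested above could be as simple as a greedy nearest-neighbor pass. A Python sketch (my own illustration, not the app's code):

```python
import math

def order_paths(paths, start=(0.0, 0.0)):
    """Reorder paths so the pen always jumps to the path whose start point
    is nearest its current position. Greedy, so not optimal, but it cuts
    most of the wasted side-to-side travel. Each path is a list of (x, y)."""
    remaining = list(paths)
    ordered = []
    pos = start
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(pos, p[0]))
        remaining.remove(nearest)
        ordered.append(nearest)
        pos = nearest[-1]  # the pen ends up at the path's last point
    return ordered
```

A fancier version would also consider plotting each path in reverse, picking whichever endpoint is closer, but even the greedy pass above removes most of the back-and-forth.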

Aftermath

This weekend I'm going to present this project, along with some other IOIO projects, at Maker Faire Bay Area 2013. If you happen to be around, drop by to say hi. Special thanks to my good friend Al Linke, who made the video shown at the top. Al has made some awesome IOIO-based projects himself and is going to share the booth with me at Maker Faire.

I intend to release the motor control library within a few weeks for everybody to enjoy. I hope it will pave the way for driving 3D printers and other CNC machines with an IOIO/Android combo, which seems to me like an elegant way to give these machines a great user interface, standalone computing and connectivity on the cheap.

Some more fun pictures for those who persisted this far.

Friday, January 25, 2013

Go, Go, IOIO-On-The-Go!

A year and nine months have passed since the first announcement of the IOIO. Eight months have flown by since my announcement of the upcoming IOIO-OTG. It's been almost a year since I started development. Waaaaaay longer than planned. Everything that could have gone wrong went wrong. Hell, even some things that couldn't have gone wrong went wrong! But finally:

IOIO-OTG is HERE!!!

Let's start from scratch, as for some of the readers (who've spent the last two years on Mars) this might be your first encounter.

I/O for the I/O-less

The IOIO-OTG is a printed circuit board for electronics hobbyists and prototypers, which addresses a very common problem: how do I use my {computer, tablet, phone} to control my {robot, dish-washer, cat-feeder, etc.}. The original IOIO board was the first to offer a complete solution to this problem for Android devices. The IOIO-OTG adds PC support (Windows, Linux, OSX).

Most other solutions require you to write two pieces of software (board-side and client-side), at least one of which is typically complicated. The IOIO takes a different approach: you only write the client-side (Android, PC) software, using a high-level programming language (currently Java) with a rich and intuitive API that lets you manipulate the I/O pins and hardware peripherals on the board, giving you the experience that those I/O capabilities are an integral part of your client machine. This makes prototyping and development with the IOIO very fast and simple. Another great selling point is that it works with virtually any Android device, even very old ones (Android 1.5 or higher), while most other boards only work with the latest and greatest.

You can connect the IOIO to its client either over USB or over Bluetooth (using a standard dongle). The pre-installed firmware and the provided software library completely hide away the gory details of the underlying connection. The same code you write will work over either connection type seamlessly.

Smaller, Cheaper, Stronger

The IOIO-OTG is significantly smaller, cheaper or richer in features than alternative solutions. For example, in the Android world, the IOIO-OTG's I/O specs are comparable with an Arduino Mega ADK at a fraction of the size and cost, with all the software advantages mentioned above. In the PC world it provides attractive competition to boards of similar capabilities as well, and once again, the savings in software development time are huge.

To make it super-loud-and-clear: I'm not making the case that the IOIO-OTG is better than Arduino or vice versa. Arduino is awesome at what it's good at, which IMHO is standalone operation. The IOIO boards were never intended to work standalone, but excel when it comes to offering I/O to an existing machine that lacks it. There is an ever-increasing number of applications in this category, especially given the attractiveness of using Android devices in physical computing: you get tons of sensors, Internet connectivity, lots of computation power, a touch screen, etc. in a cheap, easy-to-use package.

Constantly Improving

The IOIO-OTG is all open-source: software, firmware and hardware. The development has been an ongoing process, with new features and bug fixes introduced on a regular basis. New firmware versions are distributed in a way that makes upgrading very simple: connect the IOIO-OTG to a PC and run a small program to flash the latest version, or your own custom one if you're into such adventures. Download the software package with libraries and examples from the website and you're good to go.

Hardware Specs

That's what we're here for, aren't we? Here are the main features:
  • USB-OTG dual-role (host, device).
  • Input voltage: 5V-15V, from external source or through USB (when connected to a computer).
  • Output voltages: 5V at up to 3A (!) and 3.3V at up to 500mA.
  • 46 I/O pins (digital I/O), built-in pull-ups / pull-downs / open-drain on all pins.
  • 16 Analog inputs.
  • 9 PWM (for driving servos, DC motors, dimming LEDs, etc).
  • 4 UART.
  • 3 TWI (I2C, SMBUS).
  • 3 SPI.
  • 6 Pulse Input (precise pulse-width / frequency measurement).
  • USB current limiting when acting as USB host (useful in Android mode).
  • Switch for forcing host mode (for using non-standard USB cables, which are more common than the standard ones...)
  • On-board LED under user control.
For those who know the IOIO V1, the main changes are the dual-role stuff, the beefier 5V regulator and two fewer I/O pins. There's also improved circuitry for cleaner analog input and better protection against user error (you can still fry it if you really want to :D)

Cutting Costs

My main goal in this project is to make it available and useful for as many people as possible. Really. No marketing BS here. This is not my day job. One of the key factors in meeting this goal (assuming the product is great, of course) is making it affordable for as many people as possible. Making it cheaper also means people will be less frustrated if they happen to fry a board, and makes them feel more comfortable leaving the board permanently attached to one project and buying a new one for the next.

In that respect, I feel I have yet to improve. We set our goal at a $30 end-user price. It is currently about $40, and during the struggle to cut costs I decided to forego my own royalties on the product. I'm not particularly happy about it, to be honest, but I feel I have done the right thing. I'm currently considering my options for reducing the end-user price while leaving a little something for myself too.

Coming Up

  • Raspberry Pi support. Already have a working prototype, but need to polish.
  • More application-level features: capacitive sensing, extended stepper motor control, encoder interface.

Thanks

Luckily, I didn't do this alone!
Aaron Weiss from SparkFun took part of the board design and walked me through the winding road of getting a product out the door.
David Stadler designed the beautiful new graphics, which preserve (and improve on) the yo-yo from the original IOIO, but get rid of the only-for-Android feel.
Kustaa Nyholm contributed his PureJavaComm library to the community, which opened the door for easy PC integration.
All the IOIO users, who are slowly turning into a nice community, gave me the inspiration and motivation to keep working when things got hard.
My dear wife, kids and friends gave me the huge amount of support required for such a project and patiently listened to my boring geeky stories all along.

Read More

Main IOIO homepage: https://github.com/ytai/ioio/wiki
IOIO project gallery (links I'm collecting): http://pinterest.com/ytaibt/ioio