Last week, the Living Molecules app was released on the iTunes App Store, in its “minimum viable product” form, to use the parlance of our times. To no great surprise, the recognition sequence, which takes camera images and applies an algorithmic transform to decode the binary payload, was less than perfect in version 1.0. Thanks to a couple of rookie mistakes on my part, the nature of those imperfections taught me a couple of things I should already have known.
The first mistake was failing to appreciate just how much faster an iPhone 5 is than an iPhone 4. For some reason I assumed that the biggest jump in performance was from 3 to 4, but in fact the 4 to 4S transition was the almighty leap: it turns out that the company-provided benchmarks actually are realistic, which is a shock, since I’ve gotten used to assuming that they’re one step beyond damned lies. That is probably in large part because the biggest improvements happened in the GPU, which is the rate-limiting step for inbound camera image processing, as well as for many of the decoding steps for the glyphs. Unfortunately there are a lot of iPhone 4 handsets out there in the wilderness, and on them Living Molecules runs unbearably sluggishly, so it was time to roll up my sleeves and start optimising code, squeezing out any unnecessary cycles, so everyone can enjoy the magic that is molecular QR codes.
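For a rough flavour of the housekeeping involved, the sketch below (written in Swift, with made-up stage names) shows one way to time each per-frame processing step so the worst offenders on older hardware stand out; it is purely illustrative, not the app’s actual instrumentation:

```swift
import Foundation

// Hypothetical helper (not from the app): time a single processing stage and
// print how long it took, so the slow steps show up when testing on an iPhone 4.
func timeStage<T>(_ label: String, _ work: () -> T) -> T {
    let start = CFAbsoluteTimeGetCurrent()
    let result = work()
    let elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000.0
    print(String(format: "%@: %.2f ms", label, elapsedMs))
    return result
}

// Usage with placeholder stage functions:
// let gray = timeStage("grayscale") { convertToGrayscale(frame) }
// let bits = timeStage("decode glyph") { decodeGlyph(gray) }
```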
Which led me to understand the nature of my second mistake: as the hours went by and my optimisation efforts improved the timings, I noticed that my test poster + iPhone combination was failing to recognise the glyphs quickly more and more often, which was counterintuitive, since I was quite sure I was making the algorithm faster and better, not worse. It turns out that this had something to do with the rotation of the earth, i.e. the amount of sunlight entering the secret laboratory was steadily decreasing. Aha, thought I: it looks like I will have to code up that binormal filter after all, for deciding exactly what is black and what is white. That seemed to help the recognition process a little, though not enough, and it also rolled back my previous optimisation gains.
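For the curious, here is a minimal sketch of the kind of adaptive black/white decision I mean, using the well-known Otsu method of splitting a brightness histogram into two clusters; the app’s actual binormal filter is not identical, so treat the names and details as illustrative:

```swift
// A sketch of Otsu's method: pick the brightness threshold that best separates
// the histogram into two clusters (dark vs light). This illustrates the idea of
// adapting the black/white cutoff to the ambient lighting instead of hard-coding it.
func otsuThreshold(histogram: [Int]) -> Int {
    let total = histogram.reduce(0, +)
    guard total > 0 else { return 128 }

    // Weighted sum of all intensities, used to derive the foreground mean cheaply.
    let totalSum = histogram.enumerated().reduce(0.0) { $0 + Double($1.offset * $1.element) }

    var sumBackground = 0.0
    var weightBackground = 0
    var bestThreshold = 0
    var bestVariance = 0.0

    for t in 0..<histogram.count {
        weightBackground += histogram[t]
        if weightBackground == 0 { continue }
        let weightForeground = total - weightBackground
        if weightForeground == 0 { break }

        sumBackground += Double(t * histogram[t])
        let meanBackground = sumBackground / Double(weightBackground)
        let meanForeground = (totalSum - sumBackground) / Double(weightForeground)

        // Between-class variance: largest when the two "humps" are best separated.
        let betweenVariance = Double(weightBackground) * Double(weightForeground)
            * (meanBackground - meanForeground) * (meanBackground - meanForeground)
        if betweenVariance > bestVariance {
            bestVariance = betweenVariance
            bestThreshold = t
        }
    }
    return bestThreshold
}
```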
It seems the difficulties have more to do with the phone’s camera being able to get a good focus lock on the object, and it works a lot better with a light on. But that’s not good, I thought to myself, because this is intended to be used for posters, which are not always pinned up in the most fluorescent of corridors. So it’s time to dig out another iPhone feature, the torch:
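The gist of it, in a minimal Swift sketch using the standard AVFoundation torch API (the app’s own code is organised differently, and the function name here is just for illustration):

```swift
import AVFoundation

// A minimal sketch (not the app's actual code): switch the torch on or off,
// guarding against devices that don't have one.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Unable to configure torch: \(error)")
    }
}
```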
One quick new feature later, and the Capture panel shows a little round lightbulb icon at the top right: tap on it to switch the torch on or off. It really does make quite a difference, so hold on for Living Molecules v1.0.1. If nothing else, the app will still be useful for finding the light switch in the middle of the night.