## Towards a blinky simulator

It’d be convenient if the simulator let us test the projection algorithm. To do this, coloured spheres should be “stuck” to the facets of the icosahedron, arranged much like the holes in my last post’s triangle. I haven’t written the sphere layout algorithm yet, but as a proof of concept, I put a sphere above the icosahedron and made it orbit the icosahedron when the latter rotates, by rotating its position vector with the incoming quaternion. Once all the LEDoids are created, for each input quaternion the software will loop through all the spheres and rotate them around the icosahedron, so that they appear motionless in its frame of reference.
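The per-sphere rotation can be sketched as follows — a minimal stand-alone version, assuming quaternions arrive as (w, x, y, z) tuples (the actual wire format may differ):

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z), i.e. compute q * v * q⁻¹."""
    w, x, y, z = q
    # Matrix-free expansion: v' = v + w*t + r × t, with r the vector part and t = 2 (r × v).
    r = np.array([x, y, z])
    t = 2.0 * np.cross(r, v)
    return v + w * t + np.cross(r, t)

# A 90° rotation about the z axis moves a point on the x axis onto the y axis.
q90z = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
p = np.array([1.0, 0.0, 0.0])
print(quat_rotate(q90z, p))  # ≈ [0, 1, 0]
```

Every sphere’s position vector would be run through `quat_rotate` with each incoming orientation quaternion.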

That’s a scene graph: the LEDs are children of the icosahedron, and rotating the icosahedron rotates the LEDs. PyQtGraph doesn’t include scene graph handling, but ours is a pretty simple one, so doing it manually will probably be less hassle than switching to another library (VisPy, for instance).
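A hand-rolled scene graph for this case can stay tiny — a hypothetical sketch (the class and names are illustrative, not the simulator’s actual code), where children simply inherit the parent’s rotation:

```python
import numpy as np

class Node:
    """Minimal scene-graph node: rotating the parent rotates all children too."""
    def __init__(self, position):
        self.position = np.asarray(position, dtype=float)
        self.children = []

    def world_positions(self, rotate):
        """Yield this node's and every descendant's position, with `rotate`
        (any vector -> vector function) applied to each."""
        yield rotate(self.position)
        for child in self.children:
            yield from child.world_positions(rotate)

icosahedron = Node([0.0, 0.0, 0.0])
led = Node([0.0, 0.0, 1.5])            # a sphere "stuck" above the icosahedron
icosahedron.children.append(led)

flip = lambda v: -v                    # stand-in for a real quaternion rotation
print(list(icosahedron.world_positions(flip)))
```

In the real simulator, `rotate` would apply the incoming orientation quaternion instead of this stand-in.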

In the end, the simulated projection algorithm should be able to change the spheres’ colours: this will allow us to test it.

## First ideas for a projection algorithm

There is one physical icosahedron and a virtual, stable icosahedral image, which we’ll call V. To find which facet of V an LED is in, rotate the LED’s position vector using the quaternions from the sensor fusion algorithm, normalize it, and find the facet of V whose normalized normal (rolls off the tongue, huh?) vector has the highest dot product with the normalized LED position vector.

Once the facet of V is known, there remains the task of finding the right colour, but I haven’t given that much thought yet. Finding this triangle for each of the 2000+ LEDs is going to be really computationally expensive, so perhaps we could use some kind of dichotomy, with a first dot product to find which hemisphere we’re interested in.
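The facet lookup boils down to an argmax over dot products. Here is a sketch using an octahedron’s eight normals as a stand-in for the icosahedron’s twenty (which aren’t computed here):

```python
import numpy as np

# Unit normals for a stand-in shape: the 8 facets of an octahedron are easy to
# write down; the real code would use the icosahedron's 20 facet normals.
normals = np.array([[sx, sy, sz] for sx in (1, -1) for sy in (1, -1) for sz in (1, -1)],
                   dtype=float) / np.sqrt(3)

def find_facet(led_dir, facet_normals):
    """Index of the facet whose outward unit normal has the largest dot product
    with the unit LED direction vector, i.e. the facet the LED sits over."""
    return int(np.argmax(facet_normals @ led_dir))

led_dir = np.array([0.8, 0.5, 0.3])
led_dir /= np.linalg.norm(led_dir)
print(find_facet(led_dir, normals))  # facet 0: the (+, +, +) octant
```

The hemisphere pre-test mentioned above would just discard the normals whose dot product with a chosen axis has the wrong sign before running the argmax, halving the candidates for a small constant cost.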

## MCU choice

So far, we’d like to use an STM32F7 MCU, especially for its FPU and L1 cache (16 KB + 16 KB for instructions and data in STM32F7x8/9 MCUs!). A specific STM32F7 model has not been chosen yet.

## A new simulator

I was trying to replace the cube in Lucas’ simulator (see here) with an icosahedron and add some kind of visual cue for the icosahedron’s orientation, but the Python 3D library we used, VTK, was getting on my nerves. Adding an icosahedron worked fine, but I wasn’t able to change its colours, and what’s worse, even with the default colours (blue everywhere), one of the model’s facets stayed red, which was pretty jarring. I also added an axes widget that was supposed to rotate with the icosahedron, but to no avail: it wouldn’t rotate. One of us had to go, and it wasn’t going to be me.

Alexis sent us a script 11 days ago that displayed a colourful icosahedron with PyQtGraph, which provides a light abstraction over OpenGL. It made a nice starting point for a new simulator, with a rotating icosahedron, a fixed grid and axes. Behold!

Granted, it’s still ugly, but it works and PyQtGraph is way nicer to deal with than VTK.

## 3D modeling

We’d like our icosahedron to have facets that are 13 cm equilateral triangles, which would make it fit snugly inside a 25 cm transparent spherical shell for protection.
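A quick back-of-the-envelope check of that claim: the circumradius of an icosahedron with edge length a is (a/4)·√(10 + 2√5) ≈ 0.951·a, so:

```python
import math

a = 13.0  # cm, edge length of each triangular facet
# Circumradius of a regular icosahedron: R = (a / 4) * sqrt(10 + 2 * sqrt(5)).
R = (a / 4.0) * math.sqrt(10.0 + 2.0 * math.sqrt(5.0))
print(2 * R)  # ≈ 24.7 cm: the vertices just fit inside a 25 cm sphere
```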

At first, we wanted to build the icosahedral display with triangular PCBs, but last week our teachers suggested 3D-printing a facet and putting an LED strip (one of the ubiquitous APA102 strips) on it, to check the density and whether triangular PCBs are really necessary: perhaps a single PCB inside the icosahedron could drive LED strips glued to the facets.

Hichem and I made a model using OpenSCAD to work out how to lay the LEDs out. It’s a pretty neat piece of software for declarative 3D modelling, and I appreciated that the approach forced us to be explicit about our constraints. So far, here’s what we’ve got:

The LED strips are meant to go under this. Using this model, we can see that with the strips and dimensions we chose, there are 111 LEDs per facet, so 2220 LEDs overall. That’s huge, and we’ll have to discuss whether having that many LEDs is feasible (or desirable, for that matter).

## [bouLED] Visualizing an orientation

The first thing I did for our project bouLED was to write a Python script to visualise the orientation of our test board.

bouLED will need to know its orientation, which will be computed from a MARG (Magnetic, Angular Rate, and Gravity) sensor using Madgwick’s algorithm. To easily check the results of this computation, we want to draw a 3D shape and rotate it as we rotate the board carrying the MARG sensor. The board’s orientation is represented as a quaternion and sent to my Python script over a serial port.

We haven’t implemented Madgwick’s algorithm yet, so I just sent arbitrary quaternions to test the visualisation. Its latency was unacceptable at first (~0.5 s) when I used PySerial to read the serial port; piping the serial port to the script’s standard input made the latency imperceptible.
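A minimal sketch of the stdin-based reader, assuming one whitespace-separated quaternion per line (the actual wire format may differ):

```python
import sys

def read_quaternions(stream=sys.stdin):
    """Yield (w, x, y, z) tuples from a text stream carrying one quaternion
    per line. The line format here is an assumption, not the real protocol."""
    for line in stream:
        parts = line.split()
        if len(parts) == 4:  # silently skip malformed or partial lines
            yield tuple(float(p) for p in parts)

# Typical invocation would pipe the serial port straight in, e.g.:
#   ./visualizer.py < /dev/ttyACM0
```

Reading stdin this way avoids the buffering behaviour we hit with PySerial, at the cost of losing control over the port settings from within the script.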