It’s been a while since my last post, so I’ll try to sum up what’s going on right now.
Our PCB is finished (!) and should arrive at the beginning of next week.
We started soldering the LEDs and it turns out that our wooden support works very well! The columns are straight and it doesn’t take long to make them. We have done 33/144 so far. It will be a long weekend…
We also made some decisions, like how to exchange data between the boards: we’ll use a triple buffer in the STM32 to hold the images received from the BeagleBoard.
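To give an idea of the triple-buffer scheme, here is a minimal sketch in C. All the names and the frame layout here are hypothetical, not our actual firmware: the receiver fills a "back" buffer, promotes it to "ready" when a frame is complete, and the display loop swaps "ready" into "front" whenever a fresh frame is waiting.

```c
#include <string.h>

/* Hypothetical triple-buffer sketch. One byte per LED is an assumed
 * layout, not the real frame format. */
#define FRAME_SIZE (12 * 12 * 12)

static unsigned char bufs[3][FRAME_SIZE];
static int front = 0, ready = 1, back = 2;
static int fresh = 0;   /* set when "ready" holds a frame not yet shown */

/* Buffer the receiver should write the incoming frame into. */
unsigned char *write_buffer(void)
{
    return bufs[back];
}

/* Called by the receiver once a full frame has been written. */
void frame_complete(void)
{
    int tmp = ready;
    ready = back;
    back = tmp;
    fresh = 1;
}

/* Called by the display loop; returns the buffer to show. */
unsigned char *latest_frame(void)
{
    if (fresh) {
        int tmp = front;
        front = ready;
        ready = tmp;
        fresh = 0;
    }
    return bufs[front];
}
```

The nice property is that neither side ever blocks: the receiver always has a buffer to write into, and the display always has a complete frame to show. In real firmware the index swaps would need to be protected (interrupts disabled, or an atomic flag), since the two sides run concurrently.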
After succeeding in installing the freenect library on my laptop, I also got the OpenNI library working. The difference between the two is that while freenect can manage the Kinect’s hardware (motor and LED), it doesn’t support higher-level functions like hand detection. OpenNI does.
The next obvious step was to use it on our BeagleBoard, which had just arrived. That was another challenge, since we are working with an ARM processor running a different version of the OS. After a very long day, Thibaut and I managed to make it work. The video performs poorly due to the limited power of the hardware, but it had no problem giving us depth data.
To handle the presence sensors of our cube, we’ll use tinyAVRs (ATtiny85). There will be 9 sensors per side, so 5×9 = 45 tinyAVRs. They will be connected to the main STM32 over an I2C bus, and I am going through the datasheet to learn how to flash them.
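With 45 identical slaves on one I2C bus, each ATtiny85 needs its own 7-bit address. One simple scheme (purely a sketch, the base address and layout are assumptions, not our final design) is to derive the address from the side and sensor indices and bake it in when each chip is flashed:

```c
/* Hypothetical addressing scheme for the 45 sensor controllers:
 * 7-bit I2C addresses derived from side (0..4) and sensor position
 * on that side (0..8). Base 0x20 is an assumption, chosen to stay
 * clear of the reserved low addresses (0x00-0x07). */
#define SENSOR_I2C_BASE  0x20
#define SENSORS_PER_SIDE 9

unsigned char sensor_address(int side, int sensor)
{
    return (unsigned char)(SENSOR_I2C_BASE + side * SENSORS_PER_SIDE + sensor);
}
```

This yields addresses 0x20 through 0x4C, well inside the valid 7-bit range. One detail worth noting: the ATtiny85 has no full TWI hardware, so the I2C slave side is usually implemented on its USI peripheral.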