last two weeks

It’s been a wild ride. Over the past two weeks we worked more than ever, with some sleepless nights, to get to where we are now. We ran many tests with the proximity detectors; unfortunately we decided to drop them in the end because, although they work, they are ugly and not very stable. The animations are progressing and we are trying to exploit the cube’s full potential. We also went back to the Kinect and, even though we didn’t get much time on it, we have some cool surprises to show you.

We are finishing the presentation and rehearsing it. Hope to see you all tomorrow.

main card and bootloader

Hello,

As we get closer to the end, the days are getting longer. We spent the weekend trying to make our main board work and light an LED. We had a lot of trouble getting the right OpenOCD configuration file, and then controlling the board, because of a bug in our board.h (the MCU pin configuration file in ChibiOS). Our driver boards had to be remade, but we managed to use the old one to light an LED through the drivers.
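For anyone fighting the same battle, an OpenOCD configuration can be as short as this (a sketch only; the adapter and target named here are assumptions to adapt to the actual hardware):

```
# minimal openocd.cfg (assumed: ST-LINK/V2 adapter, STM32F4 target)
source [find interface/stlink-v2.cfg]
source [find target/stm32f4x.cfg]
```

Most of our trouble was simply picking the right pair of files and then having board.h match the real pinout.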
Later, I started working on the sensors. The final decision is that we’ll put a bootloader, TinySafeBoot (http://jtxp.org/tech/tinysafeboot_en.htm), on the tinyAVRs so we can easily reflash our code in case something goes wrong once the microcontrollers are on the cube. I also studied how to use the BeeProg, a universal device programmer, together with the bootloader.

Today I finally got the bootloader into the AVR after many (43) tries, but I still need a cable to connect it to the TP board so I can flash our code.

I also wrote a small program to light an LED, and I hope it will work once I can test it.
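For the record, it’s nothing more than the classic blink, more or less this (assuming the LED sits on PB0 and the ATtiny85 runs at its factory default 1 MHz clock):

```c
/* Minimal ATtiny85 test: toggle an LED on PB0 (assumed pin) twice a
 * second. F_CPU matches the factory default (8 MHz internal / 8). */
#define F_CPU 1000000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void) {
    DDRB |= _BV(PB0);           /* PB0 as output */
    for (;;) {
        PORTB ^= _BV(PB0);      /* toggle the LED */
        _delay_ms(500);
    }
}
```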

[photo: the BeeProg programmer]

Roled so far

Hello,

It’s been a while since my last post, so I’ll try to sum up what is going on right now.
Our PCB is finished (!) and should arrive at the beginning of next week.
We started soldering the LEDs and it turns out that our wooden support works very well! The columns are straight and it doesn’t take long to make them. We have done 33 of the 144 so far. It will be a long weekend…
We also made some decisions, such as how to exchange data between the boards: we’ll use a triple buffer in the STM32 to hold the images received from the BeagleBoard (a sketch of the idea below).
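To give an idea, the triple buffer would work roughly like this (my own illustration, not our actual code; the frame size and names are made up):

```c
/* Triple buffering: the receive side fills `back` and publishes it as
 * `pending`; the display refresh promotes `pending` to `front` whenever
 * a complete frame exists, so neither side ever waits on the other. */
#include <stdbool.h>
#include <stdint.h>

#define FRAME_SIZE (12 * 12 * 12 * 3)   /* one byte per color per LED (assumed) */

static uint8_t buffers[3][FRAME_SIZE];
static uint8_t *front   = buffers[0];   /* being displayed      */
static uint8_t *back    = buffers[1];   /* being received       */
static uint8_t *pending = buffers[2];   /* last complete frame  */
static volatile bool frame_ready = false;

/* Called by the receiver when `back` holds a full frame.
 * (In real code the swaps would run with the receive interrupt masked.) */
void publish_frame(void) {
    uint8_t *tmp = pending;
    pending = back;
    back = tmp;
    frame_ready = true;
}

/* Called by the display refresh before scanning out a frame. */
uint8_t *acquire_frame(void) {
    if (frame_ready) {
        uint8_t *tmp = front;
        front = pending;
        pending = tmp;
        frame_ready = false;
    }
    return front;
}
```
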
After succeeding in installing the freenect library on my laptop, I also got the OpenNI library to work. The difference between the two is that while freenect can manage the Kinect’s hardware (motor and LED), it gives no support for higher-level functions like hand detection; OpenNI does.
The next obvious step was to use it on our BeagleBoard, which had just arrived. That was another challenge, because we are working with an ARM processor running a different version of the OS. After a very long day, Thibaut and I managed to make it work. The video performs poorly given the limited hardware, but it had no problem giving us depth data.
To handle the presence sensors of our cube, we’ll use tinyAVRs (ATtiny85). There will be 9 sensors per side, so 5×9 = 45 tinyAVRs. They will be connected to the main STM32 over an I2C bus, and I am going through the datasheet to learn how to flash them.
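None of it is written yet, but the STM32 side could poll a node roughly like this with ChibiOS (the address scheme, the register number and the exact API details are my assumptions and depend on the ChibiOS version):

```c
/* A sketch of the STM32 polling one sensor node over I2C with ChibiOS.
 * Assumed: node i answers at 7-bit address 0x20 + i and exposes a
 * one-byte "presence" register; I2CD1 was already started with i2cStart(). */
#include "ch.h"
#include "hal.h"

#define SENSOR_BASE_ADDR 0x20

/* Read the presence byte of sensor node `i`; returns -1 on bus error. */
int read_presence(uint8_t i) {
    uint8_t reg = 0x00;          /* assumed register number */
    uint8_t value;
    msg_t status = i2cMasterTransmitTimeout(&I2CD1, SENSOR_BASE_ADDR + i,
                                            &reg, 1, &value, 1, MS2ST(10));
    return (status == RDY_OK) ? (int)value : -1;
}
```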

tests, definitions and playing with kinect

Hello all,

So last week was Athens week, and with classes from 8:30 to 17:00 we had less time to work on our projects. However, we made some important decisions and advanced in some areas. We ran the USB speed test to see whether our bus would be enough for transferring images to the cube, and it is! We got 4–5 Mbit/s when we needed just 1 Mbit/s. The components have been chosen and the PCB is being designed.
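As a rough sanity check of that 1 Mbit/s figure (my own back-of-the-envelope numbers, assuming 24-bit color at 25 frames per second): 12×12×12 = 1728 LEDs × 24 bits × 25 fps ≈ 1.04 Mbit/s, so the 4–5 Mbit/s we measured leaves a comfortable margin.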

This weekend, after a lot of research, we had some initial success with the Kinect. Since we want it to work “embedded”, we’ll have to run it over Linux on our BeagleBoard. This makes things much more complicated, since we have neither the official SDK nor support (both are Windows-only).

Looking at the open source options, the first one tested was OpenKinect (the freenect library). It wouldn’t get along with my GPU, but after some googling I found out that I needed to install Bumblebee (https://wiki.ubuntu.com/Bumblebee). The first impressions can be seen in the image below.

[image: first Kinect depth test]

The left image is colored by distance from the camera. The library also gives me control of the tilt of the camera (it’s motorized!) and of the color of its LED.
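Driving the motor and the LED takes only a few calls; here is a minimal sketch using libfreenect’s C API (the tilt angle and LED color are arbitrary, and the depth streaming itself is left out):

```c
/* Open the first Kinect on the bus, tilt its head and change its LED. */
#include <stdio.h>
#include "libfreenect.h"

int main(void) {
    freenect_context *ctx;
    freenect_device *dev;

    if (freenect_init(&ctx, NULL) < 0) {
        fprintf(stderr, "freenect_init failed\n");
        return 1;
    }
    if (freenect_open_device(ctx, &dev, 0) < 0) {   /* device index 0 */
        fprintf(stderr, "no Kinect found\n");
        freenect_shutdown(ctx);
        return 1;
    }

    freenect_set_tilt_degs(dev, 15);     /* point the head 15 degrees up */
    freenect_set_led(dev, LED_GREEN);    /* and make the LED green */

    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}
```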

The next step is to test a higher-level library so that perhaps we can get access to some body detection. And let another week begin!

capacitive sensor and first PSSCs

Friday was a long day. After finishing the schematic of the TP PCB, we continued working on the project, choosing some components and running tests with the capacitive sensor. The idea is that the proximity of our hands to an iron wire changes its capacitance in a way we can measure, by analyzing the time delay between a signal sent at one end and received at the other. After many different tests, we only obtained significant differences when touching the wire, not when merely near it, as we had hoped. This means we’ll have to use another kind of sensor.
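For the curious, the measurement loop looked roughly like this (an AVR-style sketch with assumed wiring, not our exact test code):

```c
/* Assumed wiring: SEND drives the wire through a ~1 MΩ resistor into
 * RECV; a nearby hand should add capacitance and lengthen the charge
 * time. DDRB must already be set up: SEND_PIN output, RECV_PIN input. */
#include <avr/io.h>
#include <stdint.h>

#define SEND_PIN PB0
#define RECV_PIN PB1

uint16_t measure_charge_time(void) {
    uint16_t count = 0;
    PORTB &= ~_BV(SEND_PIN);                /* discharge the wire */
    while (PINB & _BV(RECV_PIN))
        ;                                   /* wait until it reads low */
    PORTB |= _BV(SEND_PIN);                 /* start charging */
    while (!(PINB & _BV(RECV_PIN)) && count < 0xFFFFu)
        count++;                            /* longer count = more capacitance */
    return count;
}
```

In practice the count changed clearly on contact but barely at all at a distance, hence the decision to look for another sensor.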

We also started working on some PSSCs such as the architecture and the choice of components.

PCB and PSSCs

Hello,

This week we made some important decisions in the project. We now have the final version of the PSSCs, with the respective person responsible for each one. We went to the Robotics Club and saw a bit of their work on an 8x8x8 cube with characteristics resembling ours, which gave us a more concrete idea of where we are going. We also found some nice LED drivers, thanks to Alexis, that might do the job. The next big choice is the microcontroller.

I also worked on the TP PCB. The installation was tough and I am still getting used to the software, but I am slowly progressing.

back to work

On Thursday the 28th we stayed until late at night getting ready for the communication challenge. Fortunately, I got my XBee to work, and the LCD too. The challenge on Friday was very exciting and fun. I think I got good results, but I still have to work on my code because I didn’t have enough time and it’s not very well written.

This week I took a little break, though we had some email discussions about the project. We have just ordered the LEDs (10 mm RGB) after testing some of their characteristics (brightness, notably). Now I’m ready to come back with my energy refilled.

preparation for the challenge

On Tuesday and yesterday I spent a lot of time on the code for the communication challenge. It was hard in the beginning and I didn’t really know where to start, but once I got going I advanced a little. Yesterday night, though, I got stuck on the buttons. If I connect them directly to the LEDs, aren’t they supposed to work? When I do that, pressing the left button turns the LED on, but once I release it, it takes almost a second to turn off. With the right button, the LED is always on, and when I press it, nothing happens 99% of the time (the other 1% it turns off). I don’t think the bounce problem would cause this. Is there something I’m not getting?
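In the meantime, here is the kind of polled debounce I plan to try, in case bounce is part of it after all (a sketch only; the button polarity and the 1 ms tick are assumptions):

```c
/* Accept a new button state only after it has been stable for ~20 ms. */
#include <stdbool.h>
#include <stdint.h>

/* read_button() returns the raw level, already inverted for an
 * active-low button with a pull-up (true = pressed). Assumed helper. */
extern bool read_button(void);

static bool stable_state = false;
static uint8_t counter = 0;

bool debounced_button(void) {    /* call once per 1 ms tick */
    bool raw = read_button();
    if (raw != stable_state) {
        if (++counter >= 20) {   /* stable for 20 ms: accept it */
            stable_state = raw;
            counter = 0;
        }
    } else {
        counter = 0;
    }
    return stable_state;
}
```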

research and meeting

Yesterday I spent some time looking into how to use the Kinect in our project. Since the profs told us it may not be necessary to make the Kinect “embedded” (we can use it through a computer or a smartphone), I think it will be a good idea to start on the computer, because that seems much easier to implement and test. Last year Microsoft released a PC version of the Kinect with an SDK included. I’ll be looking deeper into this, as we decided in our meeting today. We divided the research tasks and we expect to figure out what hardware to use in the next few days. It will certainly be fun to work with the Kinect, but very demanding too.

tomorrow’s presentation

This week we figured out the characteristics of the cube. It will be a 12x12x12 cube, and at any given time only the LEDs of one “floor” (a 12×12 board) will be on. This means each LED will be on 1/12 of the time, so it has to be bright enough to exploit persistence of vision. It also means we can get away with “only” about 400 cables. We also finished the presentation for tomorrow and rehearsed it.
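Nothing is implemented yet, but the refresh loop this implies would look something like this sketch (the driver hooks and the timing are assumptions):

```c
/* Scan the 12 floors fast enough that each one, lit 1/12 of the time,
 * still looks continuously on thanks to persistence of vision. */
#include <stdint.h>

#define FLOORS 12

/* Assumed hardware hooks: select_floor() enables one floor's common
 * line, write_floor() latches a 12x12 RGB pattern into the drivers. */
extern void select_floor(uint8_t floor);
extern void write_floor(const uint8_t pattern[12][12][3]);
extern void delay_us(uint32_t us);

void refresh_cube(const uint8_t frame[FLOORS][12][12][3]) {
    for (uint8_t f = 0; f < FLOORS; f++) {
        write_floor(frame[f]);   /* prepare the floor's data first */
        select_floor(f);         /* then turn that floor on */
        delay_us(800);           /* 12 floors x 800 us ≈ 100 Hz refresh */
    }
}
```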