Evaluation

Hi,

today we had an evaluation of the project and… there are still a lot of things to do. Our project works, but there are still some bugs, and it is not easy to use for someone outside our group. I also worked on the compass using the magnetometer, which works nicely, and I fixed one of the bugs I mentioned: we had to keep the PCB plugged into USB for it to work, and I think I have solved this, it was just a mistake in the code. So all the major features are operative, but the integration still needs to be more thorough, as does the user-friendliness.
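
For the curious, the heart of the compass is a single trigonometry step. Here is a minimal sketch, assuming the magnetometer is held roughly level and already calibrated (hard-iron offsets removed); the function name is mine, and the axis signs depend on how the sensor sits on the PCB:

    #include <math.h>

    /* Magnetic heading in degrees from the two horizontal components of the
       magnetic field. Assumes the sensor is level and calibrated; whether the
       angle grows clockwise depends on the axis orientation on the board. */
    static float compass_heading_deg(float mx, float my)
    {
        float heading = atan2f(my, mx) * (180.0f / (float)M_PI);
        if (heading < 0.0f)
            heading += 360.0f;   /* map from -180..180 to 0..360 */
        return heading;
    }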

Hugues

D-Day

During the last three days we have been in quite a rush. I am happy to say that we eventually succeeded in getting a good motion recognition process: it is based on two algorithms. As we saw that DTW wasn't as effective as expected, we split into two groups: one working on improving the DTW, and another working on detecting the movement from the orientation of the wand. For the latter we use the quaternions: we split the space around the user into slices, or quadrants, and a movement is considered to be a succession of different orientations, which means a succession of different slices. This is quite useful, as the amplitude of the gesture no longer matters. I have also been working on making a compass using the magnetometer, but unfortunately it won't be ready for tomorrow… a shame, because it actually started working today.
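
To give an idea of the principle, here is a minimal sketch of the orientation quantization, assuming the wand's pointing direction is a unit vector in the user's frame and the space is cut into 3×3 angular slices; the names and the exact slicing are mine, simplified for the example:

    #include <math.h>

    /* Quantize a pointing direction (unit vector in the user's frame, z up)
       into one of nine slices: 3 azimuth sectors x 3 elevation sectors.
       A gesture then becomes the sequence of distinct slices it crosses,
       which makes recognition independent of the gesture's amplitude. */
    static int orientation_slice(float x, float y, float z)
    {
        float azimuth   = atan2f(y, x);   /* -pi .. pi */
        float elevation = asinf(z);       /* -pi/2 .. pi/2 */
        int a = (int)((azimuth + (float)M_PI) / (2.0f * (float)M_PI) * 3.0f);
        int e = (int)((elevation + (float)M_PI / 2.0f) / (float)M_PI * 3.0f);
        if (a > 2) a = 2;   /* clamp the upper boundary */
        if (e > 2) e = 2;
        return 3 * e + a;   /* slice index in 0..8 */
    }

Matching a new gesture then boils down to collapsing repeated slice indices and comparing the resulting short sequence against the recorded spells.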

See you,

Hugues

Motion recognition again

Hi,

obviously I was too optimistic a few days ago when I said that the motion recognition was going well: the results were not conclusive and looked rather random. So for the last few days I have been running tests, plotting the IMU's output to check that the data were coherent with what I expected. They were, so the problem has to come from the DTW algorithm, which doesn't seem to be as effective as the documents, theses and other material we read led us to believe. We have added pre-processing and tried more specific gestures, and the results have slightly improved, but they are still quite disappointing. Still, we are not giving up, and we will find a way even if it means completely changing how we recognize the spells. I also worked on the microphone: we spent a lot of time trying to generate its clock without success, and it looks like we hadn't configured the microprocessor properly, that is, we weren't using it to its full ability. It's starting to work now, so more info this evening.
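
As for what "pre-processing" means here, one typical step (not necessarily the exact ones we used) is normalizing each axis of a recorded series to zero mean and unit variance, so that DTW compares the shape of the gestures rather than their offsets and amplitudes; a minimal sketch:

    #include <math.h>

    /* Normalize one axis of a recorded series in place: remove the mean,
       then scale to unit variance, so only the shape of the motion stays. */
    static void normalize_series(float *s, int n)
    {
        float mean = 0.0f, var = 0.0f;

        for (int i = 0; i < n; i++)
            mean += s[i];
        mean /= (float)n;

        for (int i = 0; i < n; i++) {
            s[i] -= mean;
            var += s[i] * s[i];
        }
        var /= (float)n;

        float sd = sqrtf(var);
        if (sd > 1e-6f)   /* avoid dividing by ~0 for a flat series */
            for (int i = 0; i < n; i++)
                s[i] /= sd;
    }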

Hugues

Motion recognition !!

Hi,

Great news: today we performed our first motion recognition with the wand's PCB. We had some disappointing results at first; they were either not conclusive or plain wrong!! With our implementation of the DTW algorithm, a horizontal move could come out closer to a vertical move than to another horizontal one, but now it's OK. Provided the move is big enough (if you just move your thumb vertically it won't recognize it 🙂), we can tell the difference between a horizontal and a vertical move. I know it's not much, but it's enough for two spells for our wand, attack and protection, so we can start playing!!
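
For readers who don't know DTW (dynamic time warping): it finds the cheapest alignment between two time series while letting one stretch locally relative to the other, so two gestures performed at different speeds can still match. Here is a minimal sketch of the classic dynamic-programming formulation on a single axis; our real input has three acceleration axes, and the names are mine:

    #include <math.h>
    #include <float.h>

    #define DTW_MAX 64   /* longest series we accept (m <= DTW_MAX) */

    /* Dynamic time warping distance between 1-D series a (length n) and
       b (length m): the accumulated cost of the cheapest alignment.
       Small for similar shapes, large for dissimilar ones. */
    static float dtw_distance(const float *a, int n, const float *b, int m)
    {
        float prev[DTW_MAX + 1], curr[DTW_MAX + 1];

        for (int j = 0; j <= m; j++)
            prev[j] = FLT_MAX;
        prev[0] = 0.0f;

        for (int i = 1; i <= n; i++) {
            curr[0] = FLT_MAX;
            for (int j = 1; j <= m; j++) {
                float cost = fabsf(a[i - 1] - b[j - 1]);
                float best = prev[j];                       /* insertion */
                if (curr[j - 1] < best) best = curr[j - 1]; /* deletion  */
                if (prev[j - 1] < best) best = prev[j - 1]; /* match     */
                curr[j] = cost + best;
            }
            for (int j = 0; j <= m; j++)
                prev[j] = curr[j];
        }
        return prev[m];
    }

Recognition is then just computing this distance against each recorded template gesture and picking the template with the smallest one.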

Apart from that, I also tried to compensate for the orientation of the wand in the user's hand, but I didn't get good results, so it's quite likely we will have to give up on that, at least for now. I had a look at MIT-GRT, but apparently the orientation is only computed for static postures, whereas we need it during a movement.

Bye,

Hugues

Processing the IMU’s data

Hi,

I have been working on the IMU during the last few days. First, we ran some tests with data coming from the IMU and our implementation of the DTW algorithm, and the results bode well. By this I mean that if we compare two series of acceleration data from two more or less similar gestures, we get a much lower value than if we compare two different gestures. Not a big surprise, but at least for well-defined gestures there will be no doubt about which one the user has made.

Nonetheless, I have some trouble: the data given by the IMU are expressed along the IMU's axes, and we would like to have them along the user's axes. The aim is that even if the user doesn't hold the wand quite right, the computation will compensate for the wand's bad orientation. From a mathematical viewpoint it's quite easy, just a plain projection between three-dimensional bases, yet my results in that area are a bit disappointing. I use the quaternion data given by the IMU; from it I can deduce the coordinates of each axis of the IMU's basis in the user's basis (the rotation matrix derived from the quaternion). Since I know how to express the IMU's basis in the user's basis, and I also know the coordinates of the acceleration vector in the IMU's basis, I can compute the coordinates of that same vector in the user's basis… But with all the multiplications and divisions that implies, the result isn't quite what it should be: the values move a lot even when the IMU is still, and the gravity vector is sometimes more horizontal than vertical.
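
For reference, here is roughly what that projection looks like in code; a minimal sketch assuming the IMU reports a unit quaternion (w, x, y, z), with names of mine. Conventions differ between IMUs, so the quaternion may need to be conjugated (negate x, y, z) to rotate in the intended direction:

    /* Rotate a vector v, expressed in the IMU frame, into the user frame,
       by applying the rotation matrix built from the unit quaternion
       q = (w, x, y, z). If the IMU encodes the opposite convention,
       conjugate q first (negate x, y and z). */
    typedef struct { float w, x, y, z; } quat_t;

    static void rotate_to_user_frame(quat_t q, const float v[3], float out[3])
    {
        float w = q.w, x = q.x, y = q.y, z = q.z;

        out[0] = (1 - 2*(y*y + z*z)) * v[0] + 2*(x*y - w*z) * v[1] + 2*(x*z + w*y) * v[2];
        out[1] = 2*(x*y + w*z) * v[0] + (1 - 2*(x*x + z*z)) * v[1] + 2*(y*z - w*x) * v[2];
        out[2] = 2*(x*z - w*y) * v[0] + 2*(y*z + w*x) * v[1] + (1 - 2*(x*x + y*y)) * v[2];
    }

One thing I still need to rule out is the quaternion's norm drifting away from 1: the matrix above is only a rotation for a unit quaternion, and a drifting norm would show up exactly as a wandering gravity vector.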

So, more about it tomorrow; I hope I'll have a solution by then, and I am open to suggestions 🙂

Hugues

Visual over the IMU

Hi,

here is what I have been busy with during the last few days. I spent a few hours trying to enable the I2C communication with our capacitive sensor, but my main concern has been configuring the IMU and getting a visual of its rotations, to check the axes of its reference frame. A movie is better than a long post, so check out my video!! I have to say it wasn't easy: I used pyqtgraph for the graphics and the quaternion data given by the IMU in fusion mode for the rotations (much easier to use than the Euler angles I tried to calculate and represent before, wasting quite a few hours!!).

Hugues

Communication with the IMU

A short post to say that I can now access the IMU's registers for reading and writing from the shell. So it's all going well; I now need to learn how to get the right configuration, and also to find something to visualize the movements of the IMU. Apparently OpenGL would be a good idea.
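
To give an idea, a register access over I2C with the ChibiOS HAL looks roughly like the sketch below; the slave address and timeout are placeholders, and it assumes the driver (here I2CD1) has already been configured and started:

    #include "ch.h"
    #include "hal.h"

    #define IMU_I2C_ADDR 0x28   /* placeholder 7-bit slave address */

    /* Read one byte from an IMU register: write the register address,
       then read the value back in the same transaction. */
    static uint8_t imu_read_reg(uint8_t reg)
    {
        uint8_t value = 0;
        i2cAcquireBus(&I2CD1);
        i2cMasterTransmitTimeout(&I2CD1, IMU_I2C_ADDR,
                                 &reg, 1, &value, 1, MS2ST(10));
        i2cReleaseBus(&I2CD1);
        return value;
    }

    /* Write one byte to an IMU register: register address, then value. */
    static void imu_write_reg(uint8_t reg, uint8_t value)
    {
        uint8_t buf[2] = {reg, value};
        i2cAcquireBus(&I2CD1);
        i2cMasterTransmitTimeout(&I2CD1, IMU_I2C_ADDR,
                                 buf, 2, NULL, 0, MS2ST(10));
        i2cReleaseBus(&I2CD1);
    }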

Hugues 

USB working

Hi,

Today I worked mainly on the USB connection with our board. It appeared we had the same problem as stabiloRose, with VBUS not being on the right pin, so we had to write an extra #define in board.h. Now everything works fine for the USB, and I also implemented a shell to communicate with the processor (most of the work had already been done by ChibiOS).
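
For the record, on the STM32 OTG peripheral ChibiOS lets the board file disable hardware VBUS sensing, which is the kind of workaround I mean (I am quoting the define from memory, so check it against your ChibiOS version):

    /* board.h: tell the STM32 OTG driver not to rely on the VBUS
       sensing pin, since VBUS is not wired to it on our board. */
    #define BOARD_OTG_NOVBUSSENS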

I intend to test the IMU tomorrow so more news then.

Hugues

PCB ready !

Hi,

Today was a great day: we received our PCBs!!! I must say, they certainly are the prettiest of all of this year's ROSE PCBs ;).

[photo of our PCB]

OK, for now it does nothing but light a LED when the battery is charging, but that's a start. That is also why I started implementing the USB communication; it should be done quite quickly, probably by tomorrow. I also carried on with my coding for the IMU and the capacitive sensor, and at last I'll be able to test it fully on the hardware, so more news tomorrow.

Hugues

Coding coding

Hi,
today it was coding again. I finished the IMU's functions, I think I have got all of them now, plus some others to communicate through a shell between our PCB and the computer; we will see when we get the device. I have also begun the capacitive sensor: as it also communicates over an I2C bus, I will be able to reuse part of the code I already wrote for the IMU. Unfortunately, I couldn't find code on the internet for its registers like I did for the IMU (all the lines of the form #define reg_name reg_addr), so I had to write it myself 🙂 No hard work, but it's so easy to make a mistake… Anyway, it is done now, so I can begin the real coding.
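
To illustrate, this is the kind of register map header I mean; the names and addresses below are placeholders for the example, the real ones come straight from the sensor's datasheet:

    /* Register map transcribed from the datasheet. The names and
       addresses here are placeholders, not the real sensor's map. */
    #define CAP_REG_CHIP_ID      0x00   /* device identification        */
    #define CAP_REG_STATUS       0x01   /* touch status flags           */
    #define CAP_REG_CONFIG       0x02   /* sampling configuration       */
    #define CAP_REG_THRESHOLD_0  0x10   /* touch threshold, electrode 0 */
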
Hugues