We have had bad luck for the past three days: first we lost two days because our microcontroller (burnt out?) stopped responding. After we confirmed the problem, Alexis soldered a brand new one for us. We had just enough time to confirm that our code so far works before facing another hardware problem: a broken pin on our codec (we were wondering why, despite all our efforts, we couldn't get a response from it). Once again Alexis fixed it, and we are ready to go!
While I couldn't work on the microcontroller, I thought a bit about a way to localize our modules with "sound pings": we enable both codecs in full-duplex mode; the first one emits a sound (at a fixed frequency, for example with the sine test available in the codec) while the other listens. As soon as the second module hears the sound, it repeats it back to the first one. When the first module gets the answer, it checks the time and obtains a delay. This delay should be something like 2*distance/sound_speed + T, with T a constant covering the time to generate the first sound, then to detect it and generate an answer, and finally to detect the answer. The problem is whether this T really is constant (I searched the datasheet but couldn't find this kind of timing). I fear we might run short of time to implement this feature, mainly because of the delay caused by our hardware problems. If we can fix the codec soon enough, I think we should give it a try. Otherwise, let's focus on the main features we have already planned (we are already able to localize at close range, below one meter, with RSSI, which is already pretty cool).
Today I was trying to determine whether we can transmit music reliably over our ZigBees (will the bitrate be good enough when many MAESTROSEs are talking at the same time? Will we lose any data?). I couldn't conclude yet because of some trouble I am having with ChibiOS's timers, but I should fix that tomorrow. In the meantime, Bertrand and Aurelien have diagnosed and then repaired (with Alexis's help) our codec, and have begun implementing FRAM support.
Updates of the project:
- SmartFusion A2F200 and OpenOCD
After preparing the patch for OpenOCD to support writing to the flash memory, we submitted it. It was refused due to the amount of code from Actel we used to generate the embedded binary program. We could provide cleaner code by removing the unused parts of Actel's driver, but it works and we don't have much time; after Rose we can keep working on the patch.
If you are still interested in this patch, you can find it here, or send us an email.
Our PCB has arrived! \o/ It seems we are going to solder the components tomorrow.
- Libero (FPGA and project configuration)
The routing of the FPGA is done: a mapping between the MSS and the FPGA, showing which peripherals can be connected directly to a GPIO and which cannot. So once we have our robot, we will be able to drive the peripherals that don't need complex FPGA support.
The necessary HDL modules will be written together with each peripheral's driver.
We integrated FreeRTOS into our project. There are just a few minor things left to configure, as the code works in RAM but not in ROM. =/
The SPI driver is done and tested on the SmartFusion Evaluation Board; we will need to adjust some aspects of the driver for our SPI flash memory.
After the OpenOCD patch, we tested DirectC on the flash: it works but is not yet fully integrated into the project. Tomorrow we hope to have the FPGA programming environment working and tested with our robot.
More updates tomorrow.
Following yesterday's presentation and the teachers' comments, we have defined our new Project-Specific Success Criteria.
Here they are:
- Kalman filter working
- Complementary filter working
- ADC to measure battery level working
- Ball-E is standing
- Ball-E moves the way we want
- We can pilot Ball-E with an Android smartphone
Yesterday, I also managed to make the LEDs blink on our Ball-E board; let's begin the Wi-Fi!
Saturday our PCB was completed; here are some screenshots:
Gabriel is working on DirectC, a code base that implements in-application programming (IAP) for the Cortex and the FPGA. The main goal is to program the FPGA from an image.dat file generated by the Libero IDE, as we won't be using a FlashPro chip and we are not sure the FlashPro software can be used without it.
We also studied how to work with Libero, route the FPGA pins and generate an image.dat file. The procedure is not complicated, and the Libero project for Tutobot should be ready soon.
Thalita and I are now working on a patch to OpenOCD to enable writing to the SmartFusion's flash memory through OpenOCD. The idea is to load code into RAM which then writes to the flash, as we didn't find any documentation about driving the flash memory directly through JTAG.
More updates tomorrow.
With Alexis's great help, we finished soldering the PCBs. Tomorrow we are ready for some real tests, and we will give you some news.
Three stages of the PCBs:
The first PCB is the station, and the second is an adapter for remotes that use the XpressNet protocol, like the Roco MultiMaus.
As you may have noticed, we don't have a Wi-Fi module. We think Wi-Fi is very much optional: if someone needs it, they can always buy a Wi-Fi router and plug it in (same price as a Wi-Fi module, and more configurable).
Erratum: it's JTAG, not JATG.
Great news: Ball-E is due to be born soon!
Until Wednesday, I worked on the PCB with Pierre-Hugues, and it has been sent off for production (we even have pictures of the PCB's birth). Soldering might be next week's work!
As we have already received our stepper motors and omniwheels, I have been working on the mechanical architecture of the robot, which will soon be done; just take a look at the base of the robot! I made it so that the angle of the omniwheels relative to the ball can be changed.
Scott is working on the simulation part, and Otilia is working hard to make the Kalman filter work before Easter vacation.
We finally finished our PCBs. I took care of the pinout and the placement, then started the routing, trying to fit all the logic, the power supplies and the LEDs onto one PCB. But the routing in the center was very complicated and dense (with the LEDs and the FPGA in the same area, plus constraints from some bulky components), which would have forced us to use a 4-layer PCB, and with a 50 cm long PCB that would have cost an outrageous price.
So we split it into two PCBs: one with the logic and the power supplies, which Sylvain has dealt with, and one with only the LEDs and the LED drivers, which Jeremy has dealt with.
Meanwhile, I started working on the Gumstix and installed a Debian armhf on it. I am currently setting up the Wi-Fi so it works properly as an access point, with a DHCP server and a web server on board.
Yesterday we worked on the design and routing of the propeller PCB. The problem was that our PCB was too expensive (because we needed a 4-layer PCB), so we decided to use two PCBs: one for the LEDs and the other for the FPGA, the Gumstix, …
I personally worked on the LED PCB, which is almost finished…
These two days we are a bit slow because most of the group has humanities classes from 8:30 am to 6:30 pm, so we are only working at night.
The bootstrap for programming the MSS (Microcontroller SubSystem, the non-FPGA part of the SmartFusion) from Linux is done; we are now able to embed a bootloader and an OS, but unfortunately only in RAM.
Programming the flash will come later, as we first need to find out how to program the SmartFusion's FPGA part from Linux, since most of our peripherals live in the FPGA. We will start on this tomorrow.
For the PCB, Thalita is making progress on our design, placing the components and defining the shape of our robot. She now knows how to make holes in the board for the wheels. Models of the motors and the wheels will soon be added to the Expedition library to help achieve a good placement and avoid errors.
Today I finished the last details of the PCB and started placing the components. There are a lot of components: our LCD is kind of big, and the SmartFusion chip isn't that tiny either. I started with a circular board 11 cm in diameter, but increased it to 15 cm after seeing all the components.
One thing that is decided is that the motors will be on top of the board, so that the board (and thus the line sensors) stays closer to the ground. I drew some rectangles to represent the space taken by the motors and wheels. I still need to learn how to make the holes for the wheels and screws.
I also have to be careful with the ground planes. I divided the board in two: one part has the regular ground plane, the other the analog ground. This is a recommendation from Microsemi, to get a less noisy analog ground. The two will be joined at a single point by a zero-ohm resistor.
Tomorrow night I will keep working on this.
Gabriel is working on our Makefile structure for compiling projects easily.
Helen is making progress on testing interrupts. She managed to activate them, but the vector table is in the flash memory (which we are not yet able to program through OpenOCD). She is looking for a way to tell the core that the vector table should be placed in RAM.