ELECINF344/381

Interactive web site of Télécom ParisTech's ELECINF344/ELECINF381 Robotics and Embedded Systems classes (a.k.a. ROSE, 2012 session).

[MAESTROSE] Codec, RSSI and first tests on the board

On Friday Alexis soldered our first PCB, which you can admire below:

She is now waiting for her 12 little sisters :)

Thanks to that, Guillaume was able to port our existing code to the module; after the traditional blinking LED, he managed to communicate with the module over ZigBee. Unfortunately, all of a sudden OpenOCD started reporting JTAG-DP sticky errors and claiming that the flash memory is in protected mode. All our attempts to flash new code over JTAG or through the integrated bootloader have been unsuccessful, and Guillaume has spent the day working on it.
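
If the read-out protection bit really is set, one way out (assuming the module's MCU is an STM32F1-class part like our F103 boards) could be OpenOCD's stm32f1x unlock command, which clears the protection at the cost of a mass erase. Something like this in an OpenOCD telnet session, where main.elf is a placeholder for the actual image:

reset halt
stm32f1x unlock 0
reset halt
flash write_image erase main.elf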

In the meantime, Aurélien has been writing code to drive the codec. It is now ready to be tested on the board, but we have to fix the above-mentioned problem first.

For my part, I was in charge of evaluating the RSSI-based location we will be able to do on the project. Unfortunately, and without much surprise, we ran into the same problems as before. Contrary to what physics would lead us to expect, the RSSI does not decrease monotonically with distance: it goes up and down, and the value is almost the same at 7 and 12 meters, for example, while it is much higher at 10 meters. The following histogram shows these disappointing results.

Nevertheless, we have been able to make use of the RSSI over very short, fixed distances. We placed five F103 boards in a line, 30 centimeters apart. Each board used the RSSI to locate its closest neighbor, and we were able to make them number themselves from 0 to 4, from one end of the line to the other.
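
For the curious, here is a rough sketch in C of how such a numbering can work; the names, the threshold and the protocol details are ours, not the exact code we ran:

#include <stdint.h>

#define NBOARDS  5
#define NEAR_DBM 45   /* assumed -dBm cutoff for a 30 cm neighbor */

/* hypothetical: blocks until our closest neighbor broadcasts its index */
int wait_for_neighbor_index(void);

/* count the boards whose averaged RSSI marks them as direct neighbors */
static int count_near(const uint8_t rssi[NBOARDS], int self)
{
    int n = 0;
    for (int i = 0; i < NBOARDS; i++)
        if (i != self && rssi[i] < NEAR_DBM)
            n++;
    return n;
}

/* A board with exactly one close neighbor is an extremity and claims
   index 0 (both ends qualify, so a tie-break such as the lowest serial
   number decides which one starts).  Every other board waits for its
   closest neighbor to announce an index k and takes k + 1. */
int pick_index(const uint8_t rssi[NBOARDS], int self)
{
    if (count_near(rssi, self) == 1)
        return 0;
    return wait_for_neighbor_index() + 1;
}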

At the moment we plan on using RSSI in this way only; it’s quite restrictive, but we have an idea of a nice use that we can make of it :)

Bertrand Mermet

Some tests on ZigBee

In order to locate themselves and to stream content between each other, our modules will make extensive use of ZigBee, so it is a critical parameter in our project. In order to determine what we will be able to achieve, I started doing some tests with the STM32 boards that we used in class (thanks to Phh and Aurélien for lending me theirs).

The localisation part will be done by measuring the signal attenuation between the modules. To access this data, the ZigBee module has to be switched to API mode through the ATAP command. The module then no longer acts as a transparent serial line: it receives and sends structured frames. Among the information contained in the RX frames, we can access the Received Signal Strength Indicator, in -dBm.
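
For reference, here is a minimal sketch in C of pulling the RSSI out of such a frame; it assumes the 0x81 "RX with 16-bit address" frame of the 802.15.4 XBee modules, and the function name is ours:

#include <stdint.h>

/* buf points at a complete frame starting with the 0x7E delimiter;
   returns the RSSI in -dBm, or -1 on a malformed or unexpected frame. */
int xbee_rx16_rssi(const uint8_t *buf, int buflen)
{
    if (buflen < 9 || buf[0] != 0x7E)
        return -1;
    int len = (buf[1] << 8) | buf[2];   /* length of the frame data        */
    if (buflen < len + 4)               /* delimiter + len + data + cksum  */
        return -1;

    uint8_t sum = 0;                    /* frame data + checksum must sum  */
    for (int i = 3; i < 3 + len + 1; i++)
        sum += buf[i];                  /* to 0xFF for a valid frame       */
    if (sum != 0xFF)
        return -1;

    if (buf[3] != 0x81)                 /* not an RX-16 frame              */
        return -1;
    /* frame data: API ID, source MSB, source LSB, RSSI, options, data... */
    return buf[6];                      /* the RSSI byte, in -dBm          */
}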

It turned out that this value does not remain constant even when the two ZigBee modules stay at the same distance from each other. In order to see the dispersion of these values, I used a ZigBee module from the robotics club, connected to the serial port of a computer, to log them, and I obtained the following histograms.

The green one shows the attenuation values when the two modules are in opposite corners of the room, and the blue one when they are almost touching each other.

The attenuation value is not constant, but taking the median or the average of the last values should give us a relevant estimate.
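
For example, a simple moving average over a small ring buffer of the last samples (the window size is arbitrary):

#include <stdint.h>

#define N 16   /* arbitrary window size */

static uint8_t samples[N];
static int pos, count;

/* record one RSSI sample (in -dBm) */
void rssi_push(uint8_t rssi_dbm)
{
    samples[pos] = rssi_dbm;
    pos = (pos + 1) % N;
    if (count < N)
        count++;
}

/* average of the samples seen so far */
uint8_t rssi_smoothed(void)
{
    unsigned sum = 0;
    for (int i = 0; i < count; i++)
        sum += samples[i];
    return count ? sum / count : 0;
}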

I’m concerned because the range of values is not very wide: -23 to -50 dBm in this experiment. I’ve also observed that it drops very fast with distance at first and then much more slowly. I hope we’ll be able to get a decent resolution in spite of this, and I’ll try to plot an attenuation/distance curve in the next few days to have precise data.

We will also use ZigBee to stream audio content between the modules, and in this perspective we need to evaluate the maximum bitrate we can rely on. The theoretical value for ZigBee is 250 kbps; in reality we will certainly not reach it. To run some tests I used two STM32 boards broadcasting frames containing 100 bytes of data (the maximum in API mode). On the software side, I simply used a timer that prints, every second, the number of frames received during the last second.
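
The measurement itself is nothing more than a counter and a once-a-second printout; here is a sketch, with on_frame_received() and the 1 Hz hook left as hypothetical glue to the XBee driver and a hardware timer:

#include <stdio.h>

static volatile unsigned frames;

/* called by the XBee/UART driver for each valid RX frame (hypothetical hook) */
void on_frame_received(void)
{
    frames++;
}

/* called once per second from a timer (hypothetical hook) */
void every_second(void)
{
    unsigned f = frames;
    frames = 0;
    /* 100 payload bytes per frame -> f * 800 bits per second */
    printf("%u frames/s = %u.%u kbps\r\n",
           f, f * 800 / 1000, (f * 800 % 1000) / 100);
}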

The results were very disappointing. When the two modules were broadcasting and listening simultaneously, I hardly reached 27 frames/s, i.e. 21.6 kbps (27 × 100 bytes × 8 bits), and when only one was broadcasting and the other one listening, 68 frames/s, i.e. 54.4 kbps, which is still pretty far from the theoretical 250 kbps. I’ll try to improve these results in the next few days, so any ideas on what I could change are very welcome.

Bertrand

Bertrand Mermet

DHEXTROSE update

After our adventures with the STM32 board and last week's communication challenge, we spent our weekend trying to catch up on the project. Here are our conclusions:

Simulation:

In order to start working on the project before we get the components, we have been looking for a way to simulate our robot's behavior. After some research we finally decided to use V-Rep. The reasons for this choice are that it is free for students, that it has all the features we need for the simulation (it can even simulate proximity sensors or a camera), and, last but not least, that it offers a 3D model of a hexapod (you can see it on the screenshot below) very close to what we are planning to build. The scripting of movements can be done in Lua, and as soon as we are familiar with the API we will be able to start simulations of our own.

Servos:

Choosing the servos for the project took us a huge amount of time this weekend, and we are not yet totally sure about our results. We went back over our old mechanics courses and did some calculations. To get a result we made a few approximations, so our figures are probably a bit over-estimated.
Here is a quick diagram of one leg of the robot:

If we call C1, C2 and C3 the maximum torques that servos 1, 2 and 3 must handle, we obtain:

Horizontal:
C2 = (m/3) * g * (l1*cos(beta - alpha) - l2*cos(beta))
C1 = (m/3) * g * l1*cos(beta - alpha)
Vertical:
C3 = (m/3) * g * (l1*cos(beta - alpha) - l2*cos(beta) + l1)
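
For illustration, here is a quick numerical check of these formulas in C. All the values below (mass, segment lengths, angles) are placeholders, not our robot's actual dimensions:

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double pi = 3.14159265358979;
    const double g  = 9.81;                /* m/s^2                        */
    const double m  = 3.0;                 /* robot mass, kg (assumed)     */
    const double l1 = 0.10, l2 = 0.08;     /* segment lengths, m (assumed) */
    const double alpha = 30 * pi / 180;    /* joint angles, rad (assumed)  */
    const double beta  = 45 * pi / 180;

    double w  = m / 3.0 * g;               /* weight carried per leg, N    */
    double C1 = w * l1 * cos(beta - alpha);
    double C2 = w * (l1 * cos(beta - alpha) - l2 * cos(beta));
    double C3 = w * (l1 * cos(beta - alpha) - l2 * cos(beta) + l1);

    /* convert N.m to kg.cm (1 N.m = 100/9.81 kg.cm) to match servo specs */
    printf("C1 = %.1f  C2 = %.1f  C3 = %.1f kg.cm\n",
           C1 * 100 / g, C2 * 100 / g, C3 * 100 / g);
    return 0;
}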

The torque will mainly depend on the amplitude of our movements. In order to get an idea of what we need, we used the simulator to measure the angles during a planar movement already implemented with an actual gait. We obtained the graph below, with time on the X axis and angles in degrees on the Y axis:

With these measurements and our calculations, we estimate that for a 3 kg robot we will need at least 12 kg.cm for servo 3, 8.9 kg.cm for servo 1, and a bit less for servo 2.

Vacuum cups:

The vacuum cups should be able to hold the robot when it is not walking on the ground. The robot will always have at least three legs in contact with the support, so each vacuum cup should handle up to ⅓ of the total weight of the robot. This document gives a table of the force a suction cup of a given diameter can withstand under a given vacuum. It also explains that this force should be divided by four if it is applied laterally. So for a 3 kg robot, each cup carries about 1 kg, i.e. roughly 10 N, and with the factor of four for lateral loading each vacuum cup should be rated for a theoretical force of 40 N.
We have found some vacuum cups that seem to fit our needs. We plan on choosing a flat, round suction cup a bit more than 4 cm in diameter: here is the link to those suction cups.
For now, our main choice would be this one.

Vacuum generator:

Considering the above-mentioned table, we have settled on Alldoo's CMP30-3P, whose characteristics (flow rate 12 L/min, max vacuum -800 mbar) would allow us to perform 3 steps in 5 s (one step involving 3 legs), though that does not take the legs' movement time into account. We do not know its weight, but this one, from the same manufacturer and with similar dimensions and structure (even a bit wider), weighs no more than 240 g, which is still acceptable. We have contacted Alldoo to get prices and weights.

Valves:

They should be able to withstand a vacuum of -300 to -500 mbar. This one is rated from 0 up to 140 psi (9652.66 mbar), so it should work just fine. The downside is that we have no indication of its dimensions, except a 6 mm diameter for the aperture, which is large. Note that the larger the valves and pipes, the faster the whole air system will react (especially when a cup needs to stick).

Proximity sensors:

So that our robot is not blind, we plan on adding two proximity sensors, one infrared and one ultrasonic. The first one will cover short distances, between 20 cm and 150 cm, while the other can detect up to 4 m ahead. In order to cover the full 360° around the robot, we will mount the sensors on a platform controlled by a servo (much weaker than the others). We will then be able to rotate it fully, thus enabling our robot to walk and sense in any direction.
Here are two sensors that might do the trick: here for the ultrasonic sensor, and here for the infrared one. We should only need one of each.

Battery:

We need 6 V for the servos and 12 V for the pneumatics (pump and valves). Having two different batteries would allow us to avoid converting between voltages at the high currents involved. For the 12 V, we could take these, which give us 40 min of autonomy, or perhaps something a bit heavier if we can find a light enough camera. For the 6 V, these would feed the servos for about 30 min, not counting the boards and sensors, and they are already a bit too heavy. We are looking for something better.

Aurélien, Bertrand and Guillaume.

Bertrand Mermet