The Android application localizes!

Over the last day, we’ve rebuilt the Android application completely from scratch, and it works! This screenshot was taken in the foyer of Telecom. In red are the tweeters, in black our position.

The application will be better tomorrow, stay tuned 😉

2013-05-03 00.20.48

Structure of the Android application

Just a little post to describe the current architecture of our Android application 🙂

As we are not 200% sure about the algorithms to use for detecting the time differences of arrival, the localization, and the beep planning, I had to design a flexible architecture. The idea was to cut the processing path into blocks of different natures, and use Dependency Injection to make replacing each block easy.




  • AudioThread/AudioFifo acquires the audio data from the soundcard as 44.1 kHz / 16-bit PCM, stores it in a FIFO, and hands it to the next layer. We only deal with raw samples here.
  • PeakDetector takes the raw samples and deduces from them the presence of an audio peak. It has no knowledge of the number of tweeters, of the planning of the beeps, and so on. It’s only a smart threshold with different techniques (basic FFT, Goertzel filter, cross-correlation, …).
  • PeakProcessor takes the timestamps of the detected beeps, identifies which tweeter made each one, and maps them against the beep planning to deduce which tweeter beeped when. If it obtains a valid vector (emitter 1 beeped at 20 µs, emitter 2 at 40 µs, …), it hands this vector to the next layer.
  • Localizer takes the different beep times and uses them to localize the smartphone. It then gives the position to a custom MapView for display.
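As a rough illustration of that block decomposition, here is a minimal Python sketch (the real app is in Java; the class and method names below are simplified stand-ins, not the actual code):

```python
# Minimal sketch of the block pipeline: each stage only knows the next
# one, injected at construction time, so a block can be swapped without
# touching the rest (same idea as the Dependency Injection in the app).

class ThresholdDetector:
    """Simplest PeakDetector strategy: a fixed amplitude threshold."""
    def __init__(self, threshold=10000):
        self.threshold = threshold

    def detect(self, samples):
        return any(abs(s) > self.threshold for s in samples)

class AudioPipeline:
    """Feeds raw PCM blocks to an injected detector strategy."""
    def __init__(self, detector, on_peak):
        self.detector = detector   # swap in an FFT/Goertzel detector here
        self.on_peak = on_peak     # next layer (PeakProcessor in the app)

    def feed(self, samples):
        if self.detector.detect(samples):
            self.on_peak(samples)

peaks = []
pipe = AudioPipeline(ThresholdDetector(), peaks.append)
pipe.feed([0, 120, -80])        # quiet block: nothing happens
pipe.feed([0, 15000, -15000])   # loud block: the peak is forwarded
```

Replacing the detection strategy is then a one-line change at construction time, which is exactly what we need while the algorithms are still up in the air.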

The Android smartphone also needs to know which tweeter is where, how many tweeters there are, and so on, so we needed a simple way to fetch this data.

To do so, we’ve developed a server using Node.js which acts as an XML-RPC server: it can give the current state to the smartphone, and store the current state coming from an external source. It also has a web console on port 9091, so we can monitor how many emitters are alive on the network, and so on. The server is hosted in a dedicated OpenVZ container at OVH, so we don’t have to worry about whether the server’s port is reachable from the smartphone.


The Android application now fetches this data so the PeakProcessor / Localizer can work.

On the other side, a Python script will take the data given on the USB port of the PCB which syncs all the clocks, and send it via XML-RPC to the server, so we can tell the Android client how many tweeters are alive, which ones are alive, and their beeping order.
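For the curious, the Python side roughly boils down to something like this (the server URL and the `set_state` method name are made up for the example; only `xmlrpc.client` from the standard library is needed):

```python
# Sketch of the Python script's job: pack what the sync PCB reports over
# USB and push it to the Node.js XML-RPC server. The endpoint and remote
# method name below are illustrative, not the real API.
import xmlrpc.client

SERVER_URL = "http://example.org:9091/RPC2"   # hypothetical endpoint

def build_state(alive_ids, beep_order):
    """Shape the data the Android client needs into one dict."""
    return {
        "emitters": sorted(alive_ids),   # which tweeters are alive
        "beep_order": beep_order,        # their beeping order in a cycle
    }

def push_state(state, url=SERVER_URL):
    """Send the current state to the server (does a network call)."""
    proxy = xmlrpc.client.ServerProxy(url)
    return proxy.set_state(state)        # assumed remote method name

state = build_state({3, 1, 2}, [1, 2, 3])
```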

WAMI – some more success – H-bridge responds correctly


Tell you what? WAMI is fine. How do we know that, you may ask?
The H-bridge has just been tested, and the sound on our tweeters is emitted correctly.
Here are some screenshots of the logic analyzer with the appropriate legend:

Channel 0/1: output voltage on the emitter
Channel 2: enable pin voltage (input)
Channel 6/7: voltage of the two input pins that drive the H-bridge

Explanation of the captures:
The two input pins are driven by a PWM timer and have opposed phases, while the enable signal is raised for 1 second every 10 seconds for the moment.
The two output pins follow the two input ones (with an amplification ratio) when the enable signal is raised, which allows the tweeter to beep at the same frequency as the input signal (21 kHz, by the way).
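As a side note, the timer arithmetic behind the 21 kHz drive is simple. Assuming a typical 84 MHz STM32F4 timer clock (an assumption for the example, not a value from our code), it looks like this:

```python
# Back-of-the-envelope PWM timer values for the 21 kHz drive.
# The 84 MHz timer clock is an assumption, not a value from our firmware.
TIMER_CLK_HZ = 84_000_000
TARGET_HZ = 21_000

period_ticks = TIMER_CLK_HZ // TARGET_HZ   # auto-reload value
duty_ticks = period_ticks // 2             # 50% duty square wave

# The two H-bridge inputs carry this PWM in opposed phases, so the
# tweeter always sees the full supply in one direction or the other.
print(period_ticks, duty_ticks)   # 4000 2000
```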

Here are the screenshots:

Capture du 2013-04-16 00:49:39

Capture du 2013-04-16 00:54:09

Capture du 2013-04-16 00:55:53



A little success



That was a beep at 440 Hz detected by an Android phone, using cross-correlation 🙂
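The idea behind the detection can be sketched in a few lines of plain Python (the real code runs in Java on the phone; the sample rate and signal here are toys):

```python
# Minimal cross-correlation peak finder: slide a template of the expected
# beep over the recording and keep the offset where it matches best.
import math

FS = 8000   # sample rate (kept low so the sketch stays tiny)
F = 440     # test tone frequency

template = [math.sin(2 * math.pi * F * n / FS) for n in range(80)]

# Fake recording: silence, then the beep starting at sample 200
signal = [0.0] * 200 + template + [0.0] * 120

def xcorr_argmax(sig, tpl):
    """Return the offset where the template matches the signal best."""
    best_off, best_score = 0, float("-inf")
    for off in range(len(sig) - len(tpl) + 1):
        score = sum(sig[off + i] * tpl[i] for i in range(len(tpl)))
        if score > best_score:
            best_off, best_score = off, score
    return best_off

print(xcorr_argmax(signal, template))   # 200
```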

The whole structure of the Android application is in place; I’ll come back to it in a future post.

As soon as the localization algorithm has been fully transcribed from Python/SymPy to Java/Commons Math, we should have a basic usable Android application, able to localize the smartphone and show the position dot on a map.

That awkward moment when… your LED blinks and you’re the happiest man in the world

Besides the fact that the title of this post is too long and that you should punish me for it, today I started developing on the WAMI project, trying to do some basic tasks. You may tell me that the purpose of the project is not to make LEDs blink, but it took me a while to accomplish this task, and since the development is done using ChibiOS/RT, it allowed me to check whether the configuration files we’re using are correct or not.

Proof? OK!

WAMI led blink

WAMI PCB soldered -> check!

Hello everyone, today I’m glad to introduce the WAMI PCB, which was soldered yesterday night.

We’ve got 5 buddies that I would like to show you (see the photo below).

I’m not a pro at editing photos, but I would like to tell you more about the components you can see below.

We’ll take the top board and go from left to right, then from top to bottom, through what we can find there:

A 5.5V supply plug, connected to a switching regulator which supplies a 12V voltage for the H-bridge. The H-bridge is used to drive the ultrasonic emitters.

A second, 3.3V switching regulator is used to supply the rest of the board: mainly the MCU (STM32F405RG), the micro-USB plug, the XBee transceiver, and the JTAG + USART ports for debugging purposes.

The other, non-soldered part is related to an audio codec and in/out plugs that we may try to use if we have time by the end of the project.

There are other things yet to come, so stay tuned! (Wait, there’s a pun in there! If you didn’t catch it, don’t blame yourself.)


D-21: XBEE communication protocol

Here is a post about the XBEE communication protocol (hence the title) that we will use to synchronize the different slave tweeters.

We named the different messages so that we can identify them quickly if you have any questions about them.

Protocol:

The master alternately broadcasts two kinds of messages:

  • synchronisation messages “SYNCHRO”
  • configuration messages “CONFIG” detailing:
    • the frequency of the internal timer
    • the length of a cycle
    • ID with beeping configuration of each tweeter


When the master is turned ON:

  • It waits one cycle length to get the current configuration of the active tweeters
    • If all the configurations are the same
      -> it recovers this configuration and re-activates the synchronisation signal
    • If the configurations differ or no configuration is received
      -> it sends a new configuration and re-activates the synchronisation signal


The slaves do the following things:

  • when receiving a synchronisation message “SYNCHRO” -> they reset their timer
  • when they receive a new configuration, they apply the changes concerning the general timer configuration (frequency and cycle length)
    • If their ID is present in the configuration
      -> they apply the changes concerning the beeping times
    • If their ID is not present in the configuration
      -> they send their ID to the master (to be included in the next configuration)
      in a message called “LOG”
  • during their “beeping interval”, they send their ID and the configuration they currently hold to the master in an “OK” message



When the master shuts down, the current configuration is still applied and used for a given time.

When the master turns on, it tries to recover the configuration of each tweeter, and if it is consistent (equal for every tweeter), it saves it and re-synchronises the tweeters.

When a tweeter shuts down, the master will adjust the configuration accordingly. The master knows a tweeter has shut down because it will no longer receive its OK messages.

When a tweeter turns on, it waits for a configuration message containing its ID followed by a synchronisation message to start beeping.

If no configuration messages are sent, the master is most likely dead and the tweeter cannot resynchronise.
If the configuration message does not contain its ID, the tweeter sends a “LOG” message and will be included in the next configuration.
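To make the message types concrete, here is a toy encoding of the four messages in Python (the actual wire format over XBEE is not described here, so the layout below is purely illustrative):

```python
# Toy encodings for the four protocol messages. Field order, separators
# and units are assumptions for the example, not the real wire format.

def synchro():
    """Broadcast by the master; slaves reset their timer on reception."""
    return "SYNCHRO"

def config(timer_hz, cycle_ms, slots):
    """Master -> slaves: timer frequency, cycle length, beep slot per ID.
    slots: {tweeter_id: beep_time_within_cycle_ms}"""
    body = ";".join(f"{tid}:{t}" for tid, t in sorted(slots.items()))
    return f"CONFIG|{timer_hz}|{cycle_ms}|{body}"

def log(tweeter_id):
    """Slave -> master: my ID is missing from the current CONFIG."""
    return f"LOG|{tweeter_id}"

def ok(tweeter_id, cycle_ms):
    """Slave -> master during its beeping interval: I'm alive, and here
    is the cycle length I currently hold."""
    return f"OK|{tweeter_id}|{cycle_ms}"

msg = config(1_000_000, 500, {2: 40, 1: 20})
print(msg)   # CONFIG|1000000|500|1:20;2:40
```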


Choice and implementation of the localisation algorithm

The goal: finding the position of the device thanks to the TDOA (time difference of arrival) of the beeps, knowing the position of each tweeter.

Technically, the problem consists in solving 3 equations of the second degree with 3 unknowns (the coordinates of the position of the device).

One major advantage of the algorithm is to be able to use the redundant information available when extra tweeters are used.

In reality, the uncertainties come from two elements:

  • detecting the exact time of arrival of a beep (this may be difficult because of noise and air flow)
  • knowing the speed of ultrasound (which varies a lot as a function of temperature, humidity, pressure, …)

Simulations show that the main source of uncertainty is the speed of the ultrasound.

The algorithm we chose to implement takes all the information from the emitters and applies a least-squares algorithm to find a good position. The solution obtained is re-injected into the equation system to deduce an error parameter. This process is repeated to minimise the error as a function of the speed of the ultrasound.
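A toy version of the least-squares idea in plain Python: the sketch below fits a 2D position to exact simulated TDOAs with a naive pattern search instead of the real solver, and without the speed-of-sound refinement loop (the speed is assumed known here):

```python
# Toy 2D version of the localisation step: least-squares fit of the
# position to simulated TDOAs. A naive pattern search stands in for the
# real solver, and the speed of sound is fixed instead of refined.
import math

C = 340.0                                    # assumed speed of sound, m/s
EMITTERS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
TRUE_POS = (3.0, 4.0)                        # where the phone really is

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# TDOAs relative to emitter 0, as the phone would measure them (exact)
tdoa = [(dist(TRUE_POS, e) - dist(TRUE_POS, EMITTERS[0])) / C
        for e in EMITTERS]

def residual(p):
    """Sum of squared differences between predicted and measured TDOAs."""
    pred = [(dist(p, e) - dist(p, EMITTERS[0])) / C for e in EMITTERS]
    return sum((a - b) ** 2 for a, b in zip(pred, tdoa))

# Naive pattern search: try the 8 neighbours, shrink the step when stuck
best, step = (5.0, 5.0), 2.5
for _ in range(60):
    candidates = [(best[0] + dx * step, best[1] + dy * step)
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    nxt = min(candidates, key=residual)
    if nxt == best:
        step /= 2
    best = nxt

# best should now be very close to TRUE_POS
```

With noisy measurements and an unknown sound speed, the residual at the fitted position no longer goes to zero; that leftover error is precisely what gets minimised over the speed parameter in the real algorithm.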

We are eager to test it in real life to see how good the obtained precision is.


I am aware that I have spent a great deal of time on this problem.

I am now switching to the embedded part concerning the synchronisation of the emitters.

Please feel free to leave any comments.

WAMI: Where AM I


The aim of the project is to implement an indoor localisation system using ultrasound.

The localisation system shall be able to locate a given cell phone in a 3D space of approximately 50 m × 50 m × 20 m (let’s say the Amphitheatre Thevenin, for instance).

The system is composed of:

  • emitters which send ultrasound beeps
  • a receiver which computes its position thanks to the time differences of arrival of the beeps

Our receiver will be a cell phone running an Android application. This choice has been made so that the project is easily portable and does not require any additional equipment.



We designed our own PCB to be rather generic.

They have:

  • LEDs and buttons for control
  • an H-bridge to connect the tweeter and increase the emission power (×4 compared to a direct connection)
  • an XBee module for radio communication
  • USB and a Micromatch connector as interfaces

We also have a codec connected in case we need to record the ambient noise or the beeps coming from the other emitters.

Communication protocol

In order for the system to work, we need to synchronise our emitters.

This is the purpose of the XBee modules.

One of the devices we made will be considered the master and will send synchronisation information to the other emitters. Since the time of flight of the radio waves to the different emitters is much smaller than the timing precision we need for beeping (which is true in practice), all the “slave tweeters” are thereby synchronised.
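A quick order-of-magnitude check of that claim (the numbers are illustrative, not measured):

```python
# Order-of-magnitude check: radio propagation delay vs the timing
# precision ultrasound localisation needs (values are illustrative).
RADIO_C = 3.0e8      # speed of light, m/s
SOUND_C = 340.0      # speed of sound, m/s
ROOM = 50.0          # m, the largest dimension of the target space

radio_tof_us = ROOM / RADIO_C * 1e6     # ~0.17 µs across the whole room
sound_1cm_us = 0.01 / SOUND_C * 1e6     # ~29 µs for sound to travel 1 cm

# The radio delay sits orders of magnitude below the useful resolution
print(radio_tof_us < sound_1cm_us / 100)   # True
```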



The device we use for reception is a regular cell phone able to run an Android application.
The microphones on modern cell phones sample the sound at 44.1 kHz, which is enough to recognise ultrasounds (from 20 kHz to 22 kHz).

Detecting a pulse

We are currently working on several methods to detect the time of arrival of a pulse. To do that, we need to get the energy associated with the 20 kHz frequency.
We thought of implementing:

  • an FFT
  • the Fourier coefficient associated with the 20 kHz frequency
  • a Goertzel filter

We shall meet a signal-processing teacher tomorrow, whose advice will be precious.
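Among the candidates, the Goertzel filter is easy to sketch: it measures the energy at a single target frequency much more cheaply than a full FFT. A plain-Python version (the parameters below are toy values, not our tuned ones):

```python
# Minimal Goertzel filter: energy at one target frequency in one block.
import math

def goertzel_power(samples, fs, f_target):
    """Return the squared DFT magnitude at f_target for one block."""
    n = len(samples)
    k = round(n * f_target / fs)       # nearest DFT bin
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

FS = 44100
N = 441                                # one 10 ms block of samples
tone20 = [math.sin(2 * math.pi * 20000 * t / FS) for t in range(N)]
tone15 = [math.sin(2 * math.pi * 15000 * t / FS) for t in range(N)]

p_hit = goertzel_power(tone20, FS, 20000)
p_miss = goertzel_power(tone15, FS, 20000)
print(p_hit > 100 * p_miss)   # True: the filter singles out 20 kHz
```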


Finding the position

To compute its position from the measured time differences of arrival, the receiver needs to associate each beep with a tweeter and to know the positions of all of them.
The main uncertainty in the algorithm is that we don’t know the speed of the ultrasounds (it varies a lot with temperature, humidity, altitude). For instance, a 5°C shift results in a 3 m/s difference in the ultrasound speed.
We are currently using a least-squares algorithm with some adjustments to be able to find the proper position.
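The quoted figure matches the standard linear approximation for the speed of sound in dry air, c ≈ 331.3 + 0.606·T m/s:

```python
# Speed of sound vs temperature (standard linear approximation, dry air).
def sound_speed(temp_c):
    return 331.3 + 0.606 * temp_c   # m/s

# A 5 degC shift indeed gives about 3 m/s:
shift = sound_speed(25.0) - sound_speed(20.0)
print(round(shift, 2))   # 3.03
```

Over a 50 m path, that 3 m/s error alone already shifts a distance estimate by roughly 40 cm, which is why the algorithm treats the sound speed as a parameter to refine.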

The architecture of our PCB

Some days ago, we discussed the architecture of our PCB. I will be short and just give you the overall design.





This PCB will be used both for the emitters and for the card which will emit the master clock (but we can also use the course’s card for this).

It is designed to be relatively generic: some parts will not be soldered, so we can add them later if we need them. An example is the audio codec. We don’t think we’ll need it, but it could serve two purposes: one would be measuring the speed of sound (which varies with the humidity, for instance), and the other would be using one of the cards as a receiver.