
Structure of the Android application

Just a little post to describe the current architecture of our Android application 🙂

As we are not 200% sure about the algorithms to use for detecting the time-of-arrival differences, the localization and the beep scheduling, I had to design a flexible architecture. The idea was to cut the processing path into blocks of different natures, and to use Dependency Injection to make replacing each block easy.
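
To make this concrete, here is a minimal Java sketch (illustrative names only, not the project's actual code) of how the blocks described below can be hidden behind interfaces and wired together by constructor injection, so that any of them can be swapped without touching the rest of the chain:

// Minimal sketch of the processing chain behind interfaces; all names are illustrative.
interface PeakDetector {
    /** Returns the timestamp (in samples) of a detected peak, or -1 if none. */
    long detectPeak(short[] pcmWindow, long windowStartSample);
}

interface PeakProcessor {
    /** Matches a detected peak against the beep schedule; returns true once a
        complete vector of arrival times (one per emitter) is available. */
    boolean onPeak(long peakSample);
    long[] arrivalTimes();
}

interface Localizer {
    /** Computes a 2D position from the per-emitter arrival times. */
    double[] localize(long[] arrivalTimes);
}

/** The pipeline only depends on the interfaces, so e.g. a Goertzel-based
    detector can be replaced by a cross-correlation one without touching
    anything else: the concrete blocks are injected through the constructor. */
class ProcessingPipeline {
    private final PeakDetector detector;
    private final PeakProcessor processor;
    private final Localizer localizer;

    ProcessingPipeline(PeakDetector detector, PeakProcessor processor, Localizer localizer) {
        this.detector = detector;
        this.processor = processor;
        this.localizer = localizer;
    }

    /** Called for each PCM window pulled from the audio FIFO; returns a new
        position when one could be computed, null otherwise. */
    double[] onAudioWindow(short[] window, long windowStartSample) {
        long peak = detector.detectPeak(window, windowStartSample);
        if (peak >= 0 && processor.onPeak(peak)) {
            return localizer.localize(processor.arrivalTimes());
        }
        return null;
    }
}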

 

[Image: archi-android — block diagram of the Android processing chain]

 

  • AudioThread/AudioFifo acquires the audio data from the soundcard as 44.1 kHz / 16-bit PCM samples, stores it in a FIFO, and forwards it to the next layer. At this stage we only have raw signal data, with no interpretation yet (a small acquisition sketch follows this list).
  • PeakDetector takes the raw audio data and deduces from it the presence of an audio peak. It has no knowledge of the number of tweeters, of the beep schedule, and so on: it is only a smart threshold built with different techniques (basic FFT, Goertzel filter, cross-correlation, …); a Goertzel-based sketch is given after this list.
  • PeakProcessor takes the information about when we have heard a beep, identifies which tweeter made the noise, and matches it against the beep schedule to deduce which tweeter beeped when. Once it has a valid vector (emitter 1 => beeped at 20 µs, emitter 2 => beeped at 40 µs, …), it passes this vector to the next layer.
  • Localizer takes the different beep times and uses them to localize the smartphone, then hands the computed position to a custom MapView for display (a rough localization sketch closes this list).
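
For the acquisition block, this is a minimal sketch of what an AudioThread/AudioFifo pair could look like using the standard Android AudioRecord API; class names, window size and FIFO capacity are assumptions, not the project's actual values:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Captures 44.1 kHz / 16-bit mono PCM and pushes fixed-size windows into a FIFO.
// Requires the RECORD_AUDIO permission in the manifest.
class AudioCapture implements Runnable {
    static final int SAMPLE_RATE = 44100;
    static final int WINDOW_SIZE = 2048;              // samples handed downstream at a time

    final BlockingQueue<short[]> fifo = new ArrayBlockingQueue<>(64);
    private volatile boolean running = true;

    @Override
    public void run() {
        int minBufBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                Math.max(minBufBytes, 4 * WINDOW_SIZE * 2));   // buffer size is in bytes
        recorder.startRecording();
        try {
            while (running) {
                short[] window = new short[WINDOW_SIZE];
                int read = 0;
                while (running && read < WINDOW_SIZE) {        // fill a complete window
                    int n = recorder.read(window, read, WINDOW_SIZE - read);
                    if (n < 0) { running = false; break; }     // read error: stop capturing
                    read += n;
                }
                if (read == WINDOW_SIZE) {
                    fifo.put(window);                          // blocks if the consumer is late
                }
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            recorder.stop();
            recorder.release();
        }
    }

    void stop() { running = false; }
}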
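
For the PeakDetector, here is a rough sketch of the Goertzel variant: it only measures the energy of the known beep frequency in each window and applies a threshold. The frequency, threshold and class name are placeholders, and the real detector may combine several techniques:

// Goertzel-based tone detector: a "smart threshold" on the energy of one frequency.
class GoertzelPeakDetector {
    private final double coeff;       // pre-computed Goertzel coefficient for the target tone
    private final double threshold;   // energy above which we declare a peak

    GoertzelPeakDetector(double targetFreqHz, int sampleRate, double threshold) {
        this.coeff = 2.0 * Math.cos(2.0 * Math.PI * targetFreqHz / sampleRate);
        this.threshold = threshold;
    }

    /** Energy of the target frequency in the window (Goertzel recurrence). */
    double tonePower(short[] window) {
        double s1 = 0.0, s2 = 0.0;
        for (short sample : window) {
            double s0 = sample + coeff * s1 - s2;
            s2 = s1;
            s1 = s0;
        }
        return s1 * s1 + s2 * s2 - coeff * s1 * s2;
    }

    /** True when the beep tone is loud enough in this window. */
    boolean isPeak(short[] window) {
        return tonePower(window) > threshold;
    }
}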
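
And for the Localizer, a deliberately naive sketch of the underlying idea: once we know when each tweeter's beep arrived (corrected for its scheduled emission time), a brute-force grid search can pick the position that best explains the measured time differences. The real algorithm is still an open question, so this only shows the shape of the problem; positions, grid step and speed of sound are assumptions:

// Turns per-emitter arrival times into a 2D position by minimizing the mismatch
// between measured and predicted time-differences-of-arrival over a grid.
class GridTdoaLocalizer {
    static final double SPEED_OF_SOUND = 343.0;   // m/s, roughly, at room temperature

    private final double[][] emitters;            // known tweeter positions {x, y}, in meters

    GridTdoaLocalizer(double[][] emitterPositions) {
        this.emitters = emitterPositions;
    }

    /**
     * @param arrivalSeconds arrival time of each tweeter's beep, in seconds,
     *                       already corrected for its scheduled emission offset
     * @return the best {x, y} found on a coarse grid over an areaW x areaH rectangle
     */
    double[] localize(double[] arrivalSeconds, double areaW, double areaH, double step) {
        double[] best = null;
        double bestErr = Double.MAX_VALUE;
        for (double x = 0; x <= areaW; x += step) {
            for (double y = 0; y <= areaH; y += step) {
                double d0 = dist(x, y, emitters[0]);
                double err = 0;
                for (int i = 1; i < emitters.length; i++) {
                    // Extra travel distance vs. emitter 0, as measured...
                    double measured = (arrivalSeconds[i] - arrivalSeconds[0]) * SPEED_OF_SOUND;
                    // ...and as implied by this candidate position.
                    double predicted = dist(x, y, emitters[i]) - d0;
                    err += (measured - predicted) * (measured - predicted);
                }
                if (err < bestErr) {
                    bestErr = err;
                    best = new double[]{x, y};
                }
            }
        }
        return best;
    }

    private static double dist(double x, double y, double[] p) {
        return Math.hypot(x - p[0], y - p[1]);
    }
}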

The Android smartphone also needs to know which tweeter is where, how many tweeters there are, and so on, so we needed a simple way to fetch this data.

To do so, we've developed a Node.js server that acts as an XML-RPC server: it can give the current state to the smartphone, and store a new state coming from an external source. It also has a web console on port 9091 so we can monitor how many emitters are alive on the network, and so on. The server is hosted in a dedicated OpenVZ container at OVH, so we don't have to worry about whether the server's port is reachable from the smartphone.

[Image: nodejs-console — web console of the Node.js server]

The Android application now fetches this data so that the PeakProcessor / Localizer can work.
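
As an illustration, the fetch could look like the sketch below, written against the Apache XML-RPC client library; the library choice, the server URL and the getState method name are assumptions, not the project's real API:

import java.net.URL;
import org.apache.xmlrpc.client.XmlRpcClient;
import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;

// Hypothetical client used by the Android app to pull the emitter configuration
// (positions, liveness, beep order) from the Node.js XML-RPC server.
class EmitterStateClient {
    private final XmlRpcClient client = new XmlRpcClient();

    EmitterStateClient(String serverUrl) throws Exception {
        XmlRpcClientConfigImpl config = new XmlRpcClientConfigImpl();
        config.setServerURL(new URL(serverUrl));
        client.setConfig(config);
    }

    /** Fetches the current emitter state; "getState" is a placeholder method name. */
    Object fetchState() throws Exception {
        return client.execute("getState", new Object[]{});
    }
}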

On the other side, a Python script takes the data available on the USB port of the PCB that syncs all the clocks, and then sends it via XML-RPC to the server, so we can tell the Android client how many tweeters are alive, which ones are alive, and their beeping order.
