Writing my previous post, I realized that I hadn’t tested the reception of data through the Wi-Fi module, only the emission. So on Thursday morning, I ran some tests to check this part and it worked immediately, which was good news. Then I worked (again) on the touch detection part of our project, as it’s quite an important part of it and, for the moment, it’s unreliable. Despite many tests and attempted solutions, it’s still unreliable: it depends a lot on how the wire is attached to the leaf and on the leaf itself… We’re now able to detect touches almost every time, but we also get many false positives. The variability of the results makes working on this task very painful. It often happens that you think you’ve reached an acceptable solution, you do 2 or 3 tests that work very well, and when you want to show it to someone else, it no longer works. And if it works one evening, it doesn’t work the next morning. More work is thus required to make it reliable, even though it’s not the most interesting part.
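One classic way to fight false positives like these is to require several consecutive samples above a slowly-adapting baseline before declaring a touch. Here is a minimal sketch of that idea; the names, thresholds and sample values are all invented for illustration, not taken from our firmware:

```python
# Hypothetical sketch: debounced touch detection to cut false positives.
# Assumes periodic capacitance-like samples; all constants are invented.

def make_touch_detector(threshold, required_samples=3, baseline_alpha=0.01):
    """Return a function fed one sample at a time; it reports a touch only
    after `required_samples` consecutive samples exceed baseline + threshold."""
    state = {"baseline": None, "count": 0}

    def feed(sample):
        if state["baseline"] is None:
            state["baseline"] = sample
            return False
        if sample - state["baseline"] > threshold:
            state["count"] += 1
        else:
            # Only track the slow baseline while no touch is suspected,
            # so a long touch does not get absorbed into the baseline.
            state["baseline"] += baseline_alpha * (sample - state["baseline"])
            state["count"] = 0
        return state["count"] >= required_samples

    return feed

detect = make_touch_detector(threshold=50, required_samples=3)
samples = [1000, 1002, 1080, 1001, 1003, 1090, 1095, 1092, 1004]
events = [detect(s) for s in samples]
# The single noisy spike (1080) is ignored; only three consecutive
# high samples trigger a detection.
```

The trade-off is latency: requiring 3 samples delays detection by 3 sampling periods, which is why the tuning depends so much on the sensor’s data rate.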
I also worked on reacting to messages sent by the server. It’s now possible to select the color of the LEDs on the server and to send it to the plant. Tanguy and I faced several problems while doing this part, most of them coming from the database, which sometimes crashes when flooded with requests. But now it seems to work well.
On Friday, I spent most of my time working on the box that will contain our board. I decided to put the PCB and the servomotors in the same box, and the LEDs in another one. The PCB will be placed below the pot while the LEDs will be at the base of the trunk. I 3D-printed a first version of the PCB box (without the lid) and the dimensions are good (to be honest, I left some margin, which I’ll remove in the next version). This weekend, Alexis is going to buy a new plant that we’ll use for the final presentation. So tomorrow, I’ll start mounting all our hardware on it.
On Friday evening, we finally received the audio jack plug. We were thus able to test the sound and, after some debugging, it worked well. However, the music download doesn’t work for now, so I’ll look into it to find where the problem comes from and see if I can solve it.
During the next few days, I’ll work on the mechanics (containers for the PCB and the LEDs, placement of the wires and of the optical fiber), on the music download and maybe on the touch detection. We also have to think about what we want to show during the final presentation.
On Monday, I started to review and clean my code. I found that the touch detection algorithm was not suited to 2 sensors, so I modified it (doubling the size of all the buffers, adding a parameter to each function, …). And that day, we finally received our PCB. With Alexis, we made the first tests and discovered 2 mistakes: the power jack is wired the wrong way round and the SWD pins aren’t mapped properly on the connector. For the first problem, Alexis swapped the wires coming from the power block, and for the second he made a patch.
After that, we flashed our program and began to test the components. The Bluetooth worked immediately, and the communication with the capacitive sensor was also functional right away. Tanguy had to make some corrections for the LEDs, and I spent some time trying to understand why I wasn’t able to communicate with the Wi-Fi module. The problem was desperately stupid: I had set the pins to the wrong alternate function (the alternate function table was organized differently from the others I had seen, but I hadn’t noticed it before yesterday). Now the Wi-Fi works well (I forgot to test the reception of messages over websockets, so I hope I won’t have any problems with that).
The next component I tested was the capacitive sensor. The communication worked well, but I wasn’t able to detect any touch. I investigated a bit, reviewed the configuration values and decided not simply to copy the ones that worked for the demo board but to compute appropriate ones for our board. First problem: we use the internal oscillator of these sensors and its frequency appears in the formulas, but the datasheet only says that it is between 35 and 55 MHz, which is a rather large interval. The second problem I faced was: how do you measure the capacitance of a banana leaf? Tanguy and I went to see a professor who might have been able to help us, but he didn’t find any relevant way to perform this measurement. So I decided to redo some tests with the demo board and my own configuration to see if it worked. Then I did some new tests with our board, and this time the values read were much more consistent, which was good news. The problem now is that the algorithm I wrote some months ago to detect touches doesn’t work with these values, so I’ll have to work a little on it.
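To give an idea of why the 35–55 MHz interval matters, here is a rough sketch of the conversion chain: the sensor reports a raw code proportional to the LC tank frequency, scaled by the reference clock, and the capacitance then comes from the resonance formula. All values below are illustrative (not our real configuration), and the exact datasheet formula also involves dividers that I omit here:

```python
import math

# Rough, simplified sketch of estimating the LC tank capacitance from a
# raw FDC2214-style reading. Constants are illustrative, not our setup.

L_SENSOR = 18e-6     # tank inductor (the TI evaluation board uses 18 uH)

def sensor_capacitance(data_code, f_ref):
    """Estimate tank capacitance (farads) from a raw 28-bit DATA code."""
    f_sensor = data_code * f_ref / 2 ** 28        # DATA ~ proportional to f_sensor
    # LC resonance: f = 1 / (2*pi*sqrt(L*C))  =>  C = 1 / (L * (2*pi*f)^2)
    return 1.0 / (L_SENSOR * (2 * math.pi * f_sensor) ** 2)

code = 0x0C00000                                  # some arbitrary raw reading
c_at_35 = sensor_capacitance(code, 35e6)
c_at_55 = sensor_capacitance(code, 55e6)
# The oscillator spread alone changes the estimate by (55/35)^2, about 2.5x,
# which shows why the 35-55 MHz interval makes absolute tuning so hard.
```

This is also why relative changes (touch vs. no touch) are much easier to exploit than absolute capacitance values.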
During the end of the week, I will work on the touch detection algorithm, on the “mechanics” (containers for the boards, the servomotors and the LEDs, …) and on the code that will run on the board during the presentation.
Since my last post, I’ve worked on the capacitive sensor and on alarms.
For the capacitive sensors, I integrated the algorithm developed previously. It works pretty well with the sensor pads of the evaluation board: I’m able to detect touches. I didn’t encounter any specific problem during this integration. The next step will be to finely tune the algorithm settings with the sensors glued to the leaves, but I figured it’s better to do that directly with the PCB, as I have other tasks and problems to solve first.
The other topic that occupied my time is the addition of alarms. Indeed, we want to be able to use our plant as an alarm clock, with both sound and light. First, I tried to implement this mechanism on the server side (the commands would have been sent to the plant at wake-up time). After several hours of fruitless attempts, I took a break and realized that it would actually be more reliable to have an alarm clock that doesn’t depend on the internet connection. It was thus no longer necessary to implement the timer on the server. So I implemented only the settings selection on the server, and the alarm is created directly on the plant side. When you create your alarm, you send a list of commands to the plant that will be executed at the wake-up time. These commands belong to 3 categories: sounds to play (for the moment, you have the choice between 3 ringtones), lights to turn on and servomotors to move. Implementing and testing this functionality showed me that I wasn’t able to play several pieces of music sequentially, so I had to correct that point (it was simple).
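The plant-side representation can be pictured like this: the server sends the wake-up time plus a command list once, and afterwards the board no longer depends on the connection. This is only a sketch of the idea; the names and the exact command payloads are hypothetical, based on the three categories described above:

```python
# Hypothetical sketch of the plant-side alarm: a wake-up time plus a list
# of (category, payload) commands, executed once the time is reached.

from dataclasses import dataclass, field

@dataclass
class Alarm:
    wake_time: float                               # e.g. seconds since epoch
    commands: list = field(default_factory=list)   # (category, payload) pairs

def due_commands(alarm, now):
    """Return the commands to execute once `now` has reached the alarm."""
    return alarm.commands if now >= alarm.wake_time else []

alarm = Alarm(
    wake_time=1000.0,
    commands=[
        ("sound", "ringtone_1"),                   # one of the 3 ringtones
        ("light", {"r": 255, "g": 120, "b": 0}),
        ("servo", {"leaf": 2, "angle": 40}),
    ],
)

pending = due_commands(alarm, 999.0)    # before the alarm: nothing to do
fired = due_commands(alarm, 1000.5)     # at/after the alarm: run everything
```

Executing the list in order is what exposed the play-several-musics bug mentioned above.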
During the end of the week, I want to solve the last major remaining problem: the connection to the websocket, which doesn’t work perfectly. It works sometimes, and sometimes not. So I want to find the cause of this strange behaviour and correct it.
The beginning of this week has been quite productive. Indeed, I managed to bring several tasks to a functional state.
First, I worked on the server. I added several pages to provide an interface that allows the user to trigger all the actions that the plant can perform. I also improved the login/logout functionality. For the front-end, after some discussion with my classmates, I chose to use Bootstrap. It allowed me to have an acceptable-looking website in a short time.
On Tuesday, we received the new FDC evaluation board (for the capacitive sensor). This time, I connected the pins very carefully, entirely reviewed my code and tested it. After some minor corrections, I was able to configure the sensor and to read data by polling it. After that, since polling is wasteful and can be avoided in this case, I implemented the part of the driver that waits for an interrupt (which signals that a new set of data is available). With this, the driver seems operational to me. The next step will be to connect it to the detection algorithm I wrote a few months ago.
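The polling-to-interrupt change can be pictured with a small host-side analogy, where the sensor’s data-ready line is modelled by a `threading.Event` (on the real board this would be an external interrupt pin, and the details below are invented for illustration):

```python
import threading

# Host-side analogy of an interrupt-driven read: instead of busy-polling a
# status register, the reader sleeps until the "interrupt" fires. All names
# are illustrative; the real driver talks to hardware, not to this Event.

data_ready = threading.Event()
latest_sample = []

def isr(sample):
    """Plays the role of the interrupt handler: store data, signal readiness."""
    latest_sample.append(sample)
    data_ready.set()

def read_blocking(timeout=1.0):
    """Driver read: wait for the 'interrupt' instead of polling in a loop."""
    if not data_ready.wait(timeout):
        raise TimeoutError("no sample within timeout")
    data_ready.clear()
    return latest_sample.pop()

isr(42)                    # simulate the sensor raising its interrupt line
value = read_blocking()    # returns the stored sample without ever polling
```

The benefit is the same as on the board: the CPU does nothing between samples instead of hammering the bus with status reads.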
Finally, I also worked on the Wi-Fi module. I greatly improved the websocket part of the driver and, after the corrections Tanguy made on the server side of the websockets, it worked very well. As with the capacitive sensor, this was done by polling the module, and here too it was possible to avoid that by using an interrupt, so I tried to implement this part of the driver. I had trouble finding a pin on which the pull-down resistor worked (on the first 2 I tried, there was a pull-up somewhere on the board…). I also had some (stupid) mistakes in my code that made me lose some time, but they forced me to review my code carefully and thus to improve it, so it’s been useful. So now, websockets are fully operational. I also improved the error handling a lot and started to implement the decoding of the commands that will be received through the websocket. As one of these commands is to play music downloaded from the server, I tried to re-use the code I’d already written for this. Unfortunately, the last error re-appeared: the Wi-Fi module takes too much time to answer my requests, and I have no idea where this problem could come from. I will try tomorrow with another Wi-Fi network to see if it changes anything, but I’m not too hopeful.
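Command decoding usually boils down to a small dispatch table: parse the incoming message, look up a handler by command name, and hand it the arguments. Here is a sketch of that pattern; the wire format (JSON) and the command names are assumptions for illustration, not necessarily what our project uses:

```python
import json

# Hypothetical sketch of decoding commands received over the websocket:
# one handler per command type, selected from a dispatch table. The JSON
# format and command names are invented for this example.

def handle_set_color(args):
    return ("leds", args["r"], args["g"], args["b"])

def handle_play_music(args):
    return ("music", args["url"])

HANDLERS = {
    "set_color": handle_set_color,
    "play_music": handle_play_music,
}

def decode_command(raw):
    msg = json.loads(raw)
    handler = HANDLERS.get(msg.get("cmd"))
    if handler is None:
        raise ValueError("unknown command: %r" % msg.get("cmd"))
    return handler(msg.get("args", {}))

result = decode_command('{"cmd": "set_color", "args": {"r": 0, "g": 128, "b": 255}}')
```

Keeping unknown commands as explicit errors (rather than silently ignoring them) makes protocol mismatches between server and plant much easier to spot.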
The end of the week will be dedicated to solving the problem with the music and to connecting the detection algorithm to the capacitive sensor driver: assembling several parts together, in fact. It’s the beginning of the integration phase, and this is good news.
During this second part of the week, we had the opportunity to test several components of our project.
First, on Thursday, I tested the capacitive proximity sensor with a coaxial cable (~50 cm long) and an aluminium plate. Luckily, it worked immediately: no parameter tuning was needed in order to detect the proximity of my hand, in all directions and even through a 3 cm wooden table. On Friday evening, I decided to do the same test, but with a real plant this time (the Calathea I’ve already mentioned). I attached the coaxial cable to a leaf with some tape and, once again, it worked perfectly! I was able to see how close my hand was to the leaf. And if I moved my hand towards another leaf, it was not detected, which is good if we want to turn each leaf into a separate button (this is probably due to the fact that, on the Calathea, each leaf has a separate stem). We can thus definitely validate the choice of this component.
On Friday, we received some of the components we ordered: the LEDs and all the parts required to test them (resistors & transistors). We first tested the 2 types of LEDs with a laboratory power supply: they are really powerful, which is good. Then we decided to try to control them with PWM. With a lot of help from Alexis, I soldered the required components on the breadboard area of the Olimex P407 board while Tanguy wrote the code. At first it didn’t work, and after some exploration Alexis found a short circuit between 2 pins. Once it had been removed, it worked perfectly well. We had time to test only one of the 2 LEDs. Before taking the final decision about the LEDs, we want to test them with the optical fiber. This will be done as soon as we receive it.
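The PWM idea itself is simple: each colour channel’s brightness is the duty cycle of one timer channel. A tiny sketch of the mapping, with an illustrative timer period (not our actual timer configuration):

```python
# Sketch of PWM colour control: map 8-bit RGB components to timer compare
# (duty-cycle) values. The period value is illustrative, not our setup.

TIMER_PERIOD = 1000  # timer counts per PWM period

def rgb_to_compare(r, g, b):
    """Map 8-bit colour components (0-255) to timer compare values."""
    def duty(component):
        return component * TIMER_PERIOD // 255
    return duty(r), duty(g), duty(b)

# Full red, roughly half green, blue off:
compares = rgb_to_compare(255, 128, 0)
```

In practice a gamma correction is often added on top of this linear mapping, because perceived LED brightness is far from linear in duty cycle.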
Next week, we have to draw the schematic of our board. And I would like to write some code to detect how close my hand is to the sensor (up to now, I’ve just analysed the graph visually).
PS: I forgot to say, I bought a new plant yesterday: a banana tree. I hope this species will move less than the Calathea. We’ll have the answer on Wednesday.
First of all, I want to give you some news about the plant I bought last Saturday. It’s pretty nice and would have been perfectly suited to our project, except for one detail: unfortunately, it moves its leaves on its own; from horizontal during the day, they go vertical for the night. This video shows it very well. So this plant will remain in my living room, and we will have to find another one for our project. Maybe another Calathea, or a banana tree (provided we manage to find one in Paris or nearby).
From a more technical point of view, this week has been dedicated to the choice of the components, because nothing else can be done before that. I searched for an appropriate capacitive proximity sensor and found 3 possible ones. Alexis helped us make the final choice by quickly eliminating the first 2 because of the sensor design they required. So, we’ll use one from TI (the FDC2214). In order to check that it can do the job, we ordered an evaluation board with this component. We received it Friday morning and I tested it Friday evening. The result is quite impressive, even though I haven’t done any tuning yet (see the image on the right)!
For the lighting, I called 2 companies (MidlightSun & Luxeri) to ask them what they could offer us. MidlightSun is too expensive but Luxeri isn’t, so we’ll order diffusing optical fiber from them. We don’t know what diameter to choose, as they range from 0.25 to 2 mm. If possible, we’ll order a 1 mm sample and a 2 mm one to test which is the most suitable. To illuminate them, we have to find a proper RGB LED, but we have no idea of the required power. I’ve found this one and this one, which seem rather powerful. A test will be needed to choose.
For the other components, we’ve chosen:
- for moving the leaves, we’ll use Flexinol (also called muscle wire), a wire that shortens when heated.
- for communication, a Wi-Fi module (Zentri AMW007)
- a portable speaker (many exist, so we’ll take the easiest to find nearby)
- an audio codec (TLV320), found by Tanguy
Next week work:
- test different sensor shapes for the capacitive sensor and tune the settings
- order samples to test (optic fiber and LEDs)
- select a processor
- decide if we add BT (or BLE)
- select a battery (if possible, because it depends a lot on the LEDs we’ll choose)
- start the PCB design
As Tanguy said, this week has seen many changes in the definition of our project. Indeed, Alexis suggested using a real living plant. Obviously, this choice adds several difficulties: a real plant needs water to live, so we’ll have to design a waterproof product. Moreover, a real plant grows, which can be a problem since we want to put wires on the leaves. Together with Alexis, we did some research to find an appropriate plant: one that keeps its leaves all year long, grows as slowly as possible and has fairly big leaves. Tanguy has already written about stabilized plants, so I won’t explain that again. Yesterday afternoon, I went to Truffaut to get a better view of the possibilities. I explained our needs to the salespeople and they showed me several plants. A yucca is too rigid for our project, and another plant whose name I don’t remember has too many small leaves. Unfortunately, they didn’t have banana trees in this shop, which was one of the suitable plants we had found. I finally decided to take a Calathea. It will at least help us size our project.
This week, we have also begun to look at the components we could use. We are almost certain to use Flexinol to make the leaves move. It will probably be the main factor affecting the battery size. For proximity detection, Lélio (from Expelliarose) lent us their magic wands so we could see how they used capacitive sensors. For now, the most suitable chip I’ve found is the PCF8885, made by NXP. It can have up to 8 inputs, and 2 chips can be combined to get up to 16 inputs. Moreover, it’s clearly stated that remote sensing plates can be used. We still have to check the minimum and maximum dimensions of the sensing plates before validating this choice.
Next week will be mainly dedicated to components selection.