Less than 36 hours remain before the final presentation and there are still a few things to do, but I'm confident that we'll be ready on time.
During the last few days, on the software side, I worked on the final debugging and improvements of our code. This morning, I finally found why the server crashed so often: I never closed the SQL sessions I opened, because I didn't know it was necessary. Now the server doesn't crash anymore. We also added a timeout on the websocket connections, with a keep-alive message, in order to detect when the plant has been disconnected, and that solved many problems.
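The fix boils down to guaranteeing that a session is closed on every code path, even when a query raises. A minimal sketch of the pattern, using the stdlib's sqlite3 in place of whatever database layer our server actually uses (table and function names are made up for illustration):

```python
import sqlite3
from contextlib import closing

def fetch_plant_state(db_path, plant_id):
    # closing() guarantees conn.close() runs even if the query raises,
    # so connections never pile up and eventually crash the server.
    with closing(sqlite3.connect(db_path)) as conn:
        row = conn.execute(
            "SELECT state FROM plants WHERE id = ?", (plant_id,)
        ).fetchone()
        return row[0] if row else None
```

The same idea applies to ORM sessions: wrap every open in a context manager so the close is never forgotten.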
For the hardware part, I soldered several LEDs, tested several heatsinks and tried to find a good solution for the servomotors. It was hard to find a good place for them, one that lets the wires move freely up and down. The problem was that the leaves don't have enough strength to pull the wire back if it runs through the soil or through a pipe with bends. I finally decided to put the servos at the top of the pot, so that the wires only cross air.
Finally, I also prepared a first version of the slides that we will show on Friday during the presentation.
Tomorrow, we will prepare the show for Friday and finish equipping the plants.
NB: Tanguy has just implemented the code that allows the plant to tweet when a leaf is touched.
The presentation will take place on Friday and we are working to prepare it. During the last days, I continued to track bugs and finalize the code for the server and the board. I also worked on the mechanical aspects, and I did some soldering for the first time in my life; the result wasn't very pretty.
We tested everything on a real plant and it works quite well; we can even interface it with the outside world. I created a Twitter account where the plant can tweet when a leaf is touched (the account is @Makahiya1).
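The tweeting hook itself can stay tiny if the Twitter client is passed in as a parameter, which also makes it testable without hitting the network. A sketch, where the client call is modeled on tweepy's classic `update_status` method and every name is illustrative rather than our actual code:

```python
def tweet_leaf_touch(client, leaf_id):
    """Post a short status when a leaf is touched.

    `client` is any object exposing update_status(text), for example
    a tweepy API instance authenticated as @Makahiya1.
    """
    status = "Someone just touched my leaf #%d!" % leaf_id
    client.update_status(status)
    return status
```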
Between now and Friday, we will finish preparing the plants (we will use two) and write some scripts for the presentation.
During the last days, I continued testing the capacitive sensors. It appears that when they are far from my laptop's power supply, four of them don't saturate, and it was quite easy to detect touches with a threshold that didn't seem to depend on the location; the other four saturate, and we need a metallic contact to detect anything.
I also made some tweaks and bug fixes to the server and merged everything into the master branch.
Now we will have to equip a plant, decide on the scenario for the presentation and figure out what to do about touch detection (use only the four working sensors, or try something to get all eight working).
The last 4 days have been dedicated to integrating all the components of our project, binding them together and making the final corrections.
I soldered several LEDs and, with Tanguy, ran some tests with the LED container I had printed. The LEDs are really powerful, so powerful in fact that they overheat and destroy themselves if we don't add a heatsink. I tried a small one, which improved the dissipation a bit, but a bigger one is likely to be required if the LEDs stay on for more than a few seconds.
Yesterday, I also added the mechanism that lets the user program the plant's reaction when he/she touches one of the leaves. It's possible to control the LEDs, the servos and the alarm ringtone, and to add loops combining all these elements.
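One simple way to represent such programmable reactions is a flat list of steps plus a repeat construct, interpreted against the device drivers. A sketch of the idea (step format and driver callbacks are hypothetical, not our actual protocol):

```python
def run_reaction(steps, drivers):
    """Execute a user-programmed reaction.

    steps:   list of tuples such as ("led", "red"), ("servo", 2),
             ("ring", "alarm1"), or ("loop", count, substeps).
    drivers: dict mapping an action name to a callable taking one argument.
    """
    for step in steps:
        if step[0] == "loop":
            _, count, substeps = step
            for _ in range(count):
                # loops may nest, so recurse on the sub-program
                run_reaction(substeps, drivers)
        else:
            action, arg = step
            drivers[action](arg)
```

Keeping the interpreter separate from the drivers means the same user program can run against real hardware or against stubs during testing.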
Now, all the features that we wanted to have in our project seem to work, even though the touch detection doesn’t work as well as expected.
Next week, we will finalize the mechanical pieces, equip a plant with all our stuff and prepare Friday’s presentation.
Since last time, I worked on the touch detection because the sensors began to behave strangely. First, they saturated even when nothing was touched. That was easy to adapt to in the code, since we only have to detect whether the sensor is saturating. But then it became harder to get the sensor out of saturation, so the touch has to be quite strong, and it seems it can't work underneath the leaf as intended. The weirdest thing is that if I unplug my laptop's power supply, even though the board isn't connected to it, the sensors always saturate, with some exceptions when touching the coaxial cable with something metallic.
Now, the board has every coaxial cable soldered to it, and the behavior is the same with every cable: with a cable connected to one of the two sensors, the other sensor saturates most of the time, but occasionally stops saturating for a short while.
The problems probably have something to do with ground planes around the plant, so we will have to figure out a solution to make the sensors usable.
The end of the project is close now, so we have to make as many things work as possible. At the beginning of the week, we still had two problems to solve: the touch detection, which didn't work well, and the music download, which was of very poor quality (the music "freezes" very frequently, which sounds very bad). The goal of this week was to solve these two problems, and we are very close to achieving it.
On Monday morning, I worked on the touch detection, trying to understand where the problem comes from. I performed many tests with different settings, without any results. When Tanguy arrived, he tried once with the original configuration and it worked (he was able to detect a touch, at least). Then I worked on the mechanics: Alexis had brought us three new plants, so we had to choose our favorite. We ran some tests with the servomotors to see what a movement of the leaves looks like. It doesn't turn the plant into a fan, but the movement is large enough to be clearly visible. After that, I worked on the roof of the container for the PCB and the servomotors, and printed it.
Yesterday, I worked on the music download and on the container for the LEDs. The container looks nice (I will post photos next time).
For the music download, I finally found why the music "freezes": I had put my buffers in flash, and a flash sector has to be erased before it can be written. This erase operation is so long (and blocking) that it can be heard. Using flash for the buffers thus can't be a solution, so I moved the buffers to RAM, where there is no such constraint. In the evening, Samuel suggested some ways to improve my code. I implemented his ideas and tested them this morning. Unfortunately, it didn't work immediately, and I spent several hours chasing crazy problems with the FIFO. I finally decided to test the code at each commit written yesterday evening, to see how the results evolved. At two of them, it worked perfectly. Samuel reviewed the changes made in the following commit and found a mistake (one variable used instead of another). Once this mistake was corrected, everything I had implemented yesterday evening worked. So now, the music download seems to work (I will run new tests tomorrow morning to check that it really works every time).
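The timing mismatch explains why the freeze is audible. A back-of-the-envelope check, assuming CD-quality stereo audio, a 32 KiB buffer, and a worst-case large-sector flash erase of about 2 s (all three figures are assumptions for illustration, not measurements on our board):

```python
SAMPLE_RATE = 44100       # Hz, assumed CD quality
BYTES_PER_SAMPLE = 4      # 16-bit stereo, assumed
BUFFER_BYTES = 32 * 1024  # an assumed 32 KiB audio buffer

# how long the buffer can feed the DAC before it runs dry
buffer_duration_s = BUFFER_BYTES / (SAMPLE_RATE * BYTES_PER_SAMPLE)

erase_duration_s = 2.0    # assumed worst-case blocking sector erase

# The buffer holds under 0.2 s of audio, far less than one blocking
# erase, so playback necessarily starves while the erase runs.
starves = erase_duration_s > buffer_duration_s
```

Under these assumptions the buffer covers roughly 0.19 s, an order of magnitude less than the erase, which is why moving the buffers to RAM (no erase needed before a write) removes the freeze.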
During the next days, I will merge the code written over the past few days into the main branch and prepare our code for the presentation. I also have to add content to the server.
We wanted to be able to send commands from the server to the plant. We didn't expect to face many problems, but there were some: I sent booleans as True or False while Sylvain expected 1 or 0 when reading the commands; there were some problems with database requests; and we decided to add a synchronization between the database on the server and the plant when the plant is switched on. I had to make the corresponding modifications on the server side.
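The True/False vs 1/0 mismatch is easy to guard against by normalizing on both sides of the link. A sketch of the idea (the command format and names are made up, not our actual protocol):

```python
def encode_command(name, enabled):
    # Always serialize booleans as 0/1 so the board-side parser,
    # which expects integers, never sees the strings "True"/"False".
    return "%s:%d" % (name, int(bool(enabled)))

def decode_flag(raw):
    # Be liberal on reception: accept both the integer encoding
    # and the historical Python-style strings.
    return raw.strip() in ("1", "True", "true")
```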
I also tried to improve the touch detection algorithm: I decided not to use the raw data directly, but 1/(data)², since according to the datasheet this should be proportional to the measured capacitance. I compare new values with the mean of the last values, and if the difference is too big, I conclude that there is a touch. It seems to work quite well after fixing the coaxial cable to the plant, but after a while it works quite badly, then improves again when the cable is re-fixed, so we will see how it behaves once the cables are on the real plant used for the demo.
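In code, the idea above (transform each raw reading into 1/data², then compare against a running mean of recent samples) can be sketched like this; the window size and relative threshold are arbitrary placeholders, not our tuned values:

```python
from collections import deque

class TouchDetector:
    def __init__(self, window=16, threshold=0.25):
        self.history = deque(maxlen=window)
        self.threshold = threshold  # relative change that counts as a touch

    def update(self, raw):
        """Feed one raw sensor reading; return True if a touch is detected."""
        value = 1.0 / (raw * raw)   # proportional to capacitance per datasheet
        touched = False
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            touched = abs(value - mean) > self.threshold * mean
        if not touched:
            # only untouched samples feed the baseline, so a long press
            # does not drag the running mean toward the touched value
            self.history.append(value)
        return touched
```

Comparing against a running mean rather than a fixed threshold is what makes the detector tolerant to the slow baseline drift we see when the cable fixation changes.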
I also prepared the servomotors; they are now ready to be used to shake the leaves.
Now, we will have to mount every mechanical part of the project on the real plant and finish interfacing everything together.
While writing my previous post, I realized that I had only tested transmission through the WiFi module, not reception. So on Thursday morning, I ran some tests to check this part, and it worked immediately, which was good news. Then, I worked (again) on the touch detection part of our project, as it's quite important and, for the moment, unreliable. Despite many tests and attempted solutions, it's still unreliable. It depends a lot on how the wire is fixed to the leaf, and on the leaf itself. We're now able to detect touches almost every time, but we also get many false positives. The variability of the results makes the work on this task very painful. It often happens that you think you've reached an acceptable solution, you run 2 or 3 tests that work very well, and when you want to show it to someone else, it doesn't work anymore. And if it works one evening, it doesn't work the next morning. More work is thus required to make it reliable, even though it's not the most interesting part.
I also worked on reacting to messages sent by the server. It's now possible to select the color of the LEDs on the server and send it to the plant. Tanguy and I faced several problems with this part, most of them coming from the database, which sometimes crashes under a flood of requests. But now, it seems to work well.
On Friday, I spent most of my time working on the box that will contain our board. I decided to put the PCB and the servomotors in the same box, and the LEDs in another one. The PCB will be placed below the pot, while the LEDs will sit at the base of the trunk. I 3D-printed a first version of the PCB box (without the roof) and the dimensions are good (I took generous margins to be safe, and I'll remove them in the next version). This weekend, Alexis is going to buy a new plant that we'll use for the final presentation, so tomorrow I'll start mounting all our hardware on it.
On Friday evening, we finally received the audio jack plug, so we could test the sound; after some debugging, it worked well. However, the music download doesn't work for now, so I'll look into it to find where the problem comes from and see whether I can solve it.
During the next few days, I'll work on the mechanics (containers for the PCB and the LEDs, placement of the wires and of the optical fiber), on the music download, and maybe on the touch detection. We also have to think about what we want to show during the final presentation.
On Monday, we received the PCBs and began using them. I had no problem getting the Bluetooth to work on our boards, but I had to do some debugging with the LEDs and the timer (I hadn't tested it before). I burnt an LED in the process, but now it works. I then also managed to move the servomotors, so we are now able to shake a leaf of the plant.
Now we have to bind the commands sent over Wi-Fi to those actions, but that shouldn't be much of a problem.
On Monday, I started to review and clean my code. I found that the touch detection algorithm wasn't adapted for two sensors, so I modified it (doubled the size of all the buffers, added one parameter to each function, ...). That same day, we finally received our PCB. With Alexis, we made the first tests and discovered two mistakes: the power jack is connected the wrong way round, and the SWD pins aren't mapped properly on the connector. For the first problem, Alexis inverted the wires coming from the power supply, and for the second he made a patch.
After that, we flashed our program and began testing the components. The Bluetooth worked immediately, and the communication with the capacitive sensor was also immediately functional. Tanguy had to make some corrections for the LEDs, and I spent some time trying to understand why I wasn't able to communicate with the WiFi module. The problem was desperately stupid: I had set the pins to the wrong alternate function (the alternate-function table was organized differently from the others I had seen, and I hadn't noticed it before yesterday). Now the Wi-Fi works well (I forgot to test the reception of messages over websockets, so I hope I won't have any problems with that).
The next component I tested was the capacitive sensor. The communication worked well, but I wasn't able to detect any touch. I investigated a bit, reviewed the configuration values, and decided not to simply copy the ones that worked for the demo board, but to work out the appropriate values for our board. First problem: we use the internal oscillator of these sensors and its frequency appears in the formulas, but the datasheet only says that it lies between 35 and 55 MHz, which is a rather wide interval. The second problem: how do you measure the capacitance of a banana leaf? Tanguy and I went to see a professor who might have helped us, but he couldn't find any relevant way to perform this measurement. So I redid some tests with the demo board and my own configuration to see whether it worked. Then I ran new tests with our board, and this time the values read were much more coherent, which was good news. The problem now is that the touch detection algorithm I wrote some months ago doesn't work with these values, so I will have to work on it a little.
During the end of the week, I will work on the touch detection algorithm, on the mechanics (containers for the boards, the servomotors and the LEDs, ...), and on the code that will run on the board during the presentation.