The last four days have been dedicated to integrating all the components of our project, binding them together and making the final corrections.
I soldered several LEDs and, with Tanguy, ran some tests with the LED holder I had printed. The LEDs are really powerful, so powerful in fact that they overheat and destroy themselves if we don’t add a heatsink. I tried adding a small one, which improved the dissipation a bit, but a bigger one is likely to be required if the LEDs stay on for more than a few seconds.
Yesterday I also added the mechanism that lets the user program how the plant reacts when one of its leaves is touched. It is possible to control the LEDs, the servos and the alarm ringtone, and to build loops combining all these elements.
Now all the features we wanted in our project seem to work, even though the touch detection doesn’t perform as well as expected.
Next week, we will finalize the mechanical parts, equip a plant with all our hardware and prepare Friday’s presentation.
Contrary to what I said in my last post, I didn’t work on AES at the end of the week.
First, I worked on the qualification of the breathing and heart signals: once they are filtered, we need to count the breathing cycles and heartbeats. I therefore wrote a C program that works pretty well on good signals. I still need to test it on our database, but since most of our data was acquired with different configurations, the results might not be as relevant as I would like.
Then I worked on the tests, which now run on each commit once it is pushed. Configuring Docker to create a PostgreSQL database for our server wasn’t easy, but it is done!
At the start of the week, I will continue my work on testing the qualification of the signals, and I might work on the front end of the server.
Have a nice week,
As you may have read in my teammates’ posts, we met Laurent this weekend to show him our work on the acquisition part and the algorithms of the project. During this meeting, Laurent confirmed that after the digital filters we get a good electromyography (EMG) signal. That is very good news.
After that, Alexandre and I ran some tests to check whether we could use a unipolar power supply and whether the ADC’s internal clock could be used for the measurements. The acquisition looked good, but to be sure we need to run more tests and process the results with the digital filters. After another reading of the datasheets of the ADC and the acquisition boards, I identified the different signals and register bits to modify. However, there is one signal (the clock select) whose value cannot be set from the testing software. I think its value is 1 by default, because there is a jumper on the board that either routes this signal to the ADC input or ties that input to ground. However, I did not find anything that confirms this.
I also did a small review of the analog part of the schematic.
For next week, I don’t know whether Alexandre will work alone on those tests or whether I will help him. I will finish the analog section of the schematic as soon as the analog power supply and the oscillator are confirmed. I have anticipated both cases, so I will only need to make some small modifications. I will also begin working on the smartphone application.
That is all for today,
Have a nice week.
Benoît for BreathUp.
During this long weekend, I had time to work on the first lines of code of our project: the touch/slide detection algorithm. On Sunday, I recorded a lot of data with different configurations in order to be able to test my code. For the tests, I chose Criterion, a C unit testing framework that is well documented and easy to use. The more I use automated tests, the more I like them: it’s so nice to be able to quickly check that what you’ve just done hasn’t broken hours of work. And if something has been broken, you can fix it immediately. So I’m happy to have found time to learn how to use Criterion. For now, the algorithm is able to detect the touches and the slides on the recordings I’ve fed it. As Tanguy mentioned, he suggested another way to detect slides and touches that visually seems efficient. I’ll integrate it into the code as soon as possible (tomorrow, I hope) to see whether it’s really as efficient as it looks.
Today, with Samuel’s help, I set up continuous integration on GitLab, using Docker. Combined with Criterion, this means the tests now run automatically on each push to GitLab. This is a good starting point for clean code development in the future.
As Tanguy said, we hoped we could test the optical fiber today, but unfortunately it hadn’t arrived… We now hope to test it on Friday. Actually, no, not on Friday: we’re almost certain that Friday will bring a special event well known to everyone who has followed the ROSE lessons: the challenge (called the communication challenge in previous years). Indeed, our lab code must work before Friday 07:00 am, and we are strongly encouraged to watch our laptops from 07:30 am… So on Friday it’s very likely we won’t have time to test the optical fiber. We’ll do it afterwards, as soon as possible.
For the rest of this week, I will continue working on the detection algorithm, and if possible we’ll run some tests with the optical fiber in order to make a final choice of LEDs and finish the electrical schematic of our project.
Over the last three days, I could not get much done. I did run some tests on the oscillator of the testing board. To run them, I need to use my Windows virtual machine, but for some reason the board keeps connecting to and disconnecting from the Windows machine, so I can’t use the board’s software. Sometimes, however, the board stays connected and I can run a few tests. I tried different configurations by moving some jumpers. I tried to use the internal clock of the ADC, but when I disabled the external oscillator, the acquisitions stopped working. If my computer lets me use the software, I am going to run some tests to find out whether the ADC needs a bipolar or a unipolar power supply.
I also reviewed the analog part of the schematics, and I will review it again after the tests.
Benoît for BreathUp
At the end of this week, the result of our work is not as impressive as what we announced in the previous post. Let’s start with the goals I set in my last post. The first one was: “reduce the number of electrodes from 4 to 3” and the second was: “get an EMG”. On the first point, I studied with our teacher Alexis the usefulness of the RLD (right leg drive) electrode. After a while, we concluded that we could not afford to use the system without the RLD. As for the usefulness of the three other electrodes, I will look into it directly with Laurent when I try to get a true EMG with him on Wednesday (if he can come). To conclude, my goals for Wednesday are pretty much the same as they were for today.
As for the other members of the group, Benoît and Xavier have done a pretty nice job on the PCB and the components. It is almost finished; I just have to run a small test on the acquisition board to see whether they can simplify the PCB a little.
Concerning the algorithm, compared to the acquisition, it is pretty much the same problem, which seems normal since the two are linked. This is why Benjamin will try to improve it on Wednesday.
I hope Laurent will come on Wednesday.
The first part of the week was quite productive. In my last post I said that I wanted to work on a fully integrated processing chain and on the PCB schematics.
And here we are, on Wednesday:
I’ve started designing the schematics. They’re not finished yet, but I will spend my evening on them because I really want to draw up a list of questions and get some feedback from Alexis before the long weekend.
One piece of bad news is that we might have real difficulties with battery life: after some datasheet reading and calculation, I found that the maximum current draw will be about 700 mA… (as a rough example, a 2000 mAh battery would last less than three hours at that rate).
Secondly, I managed to create an automated test of our algorithms, combining Simulink and C, over a whole database. Here are the main difficulties I had to overcome:
- The main part of our algorithms is still in a Simulink model. The model stores the input file, the sample time and the simulation time in its configuration, and we want to change those parameters for each file in our database. The best solution I found is to edit the model file (.mdl) with a Python script to change these values.
- The next step was to find a way to launch the simulation from the command line, and I finally found a way to pass Matlab the instructions to do so.
- Finally, we want to run some C algorithms on the outputs to qualify them, so we need to convert the Matlab output file into a format that is easy to parse in C.
And here are the difficulties that still remain:
- I want to get the sample time and the simulation time from the input files rather than from a hard-coded parameter in the script.
- It is really slow: one minute per file, so about half an hour for our current database… I think I can improve this, but I need to run tests and spend time on it before I can confirm it.
Now that I am getting used to driving Matlab and Simulink from the command line, I want to create some scripts that would improve our efficiency, for example one that would, in a few seconds, plot the filtered curves from the CSV data exported from the ADC testing board.
All in all, I made a good start on the PCB and the test automation, and I will continue this work over the short remainder of the week.
During this second part of the week, we had the opportunity to test several components of our project.
First, on Thursday, I tested the capacitive proximity sensor with a coaxial cable (~50 cm long) and an aluminium plate. Luckily, it worked immediately: no parameter tuning was needed in order to detect the proximity of my hand, in all directions and even through a 3 cm wooden table. On Friday evening, I decided to run the same test, but with a real plant this time (the calathea I’ve already written about). I attached the coaxial cable to a leaf with some tape and, once again, it worked perfectly! I was able to see how close my hand was to the leaf. And when I brought my hand close to another leaf, it was not detected, which is good if we want to turn each leaf into a separate button (this is probably due to the fact that on the calathea, each leaf has a separate stem). We can thus definitely validate the choice of this component.
On Friday, we received some of the components we ordered: the LEDs and everything required to test them (resistors and transistors). We first tested the two types of LEDs with a laboratory power supply: they are really powerful, which is good. Then we decided to try controlling them with PWM. With a lot of help from Alexis, I soldered the required components onto the breadboard area of the Olimex P407 board while Tanguy wrote the code. At first it didn’t work; after some investigation, Alexis found a short circuit between two pins. Once it was removed, everything worked perfectly. We only had time to test one of the two LEDs. Before making the final decision about the LEDs, we want to test them with the optical fiber, which we will do as soon as we receive it.
Next week, we have to draw the schematic of our board, and I would like to write some code to estimate how close my hand is to the sensor (up to now, I’ve only analysed the graph visually).
PS: I forgot to mention that I bought a new plant yesterday: a banana tree. I hope this species will move less than the calathea. Answer on Wednesday.