[Kudly] Last run!

Hi,

Today, we solved a lot of problems.

First, I finished the streaming part of my project. I faced problems with communication speed, queues and data loss, but I finally found a way to configure all these parameters. Now I need to integrate it into the rest of the program. It would be nice to have it ready for Friday's presentation.

Next, we had some problems with the wifi. They had been there for a long time but had never shown up before. To create events and mailboxes with ChibiOS, we used the *_DECL macros. The problem is that this seems to declare static variables, so the objects did not always work from other files: sometimes it worked, sometimes it failed… A way around this is to define each object once in a single .c file and only expose an extern declaration in the header, as in the sketch below.
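
For reference, here is a minimal sketch of that pattern (the wifi_evt_src and wifi_mb names are placeholders, and the exact types depend on the ChibiOS version):

/* wifi_events.h -- only extern declarations, safe to include anywhere */
#ifndef WIFI_EVENTS_H
#define WIFI_EVENTS_H

#include "ch.h"

extern event_source_t wifi_evt_src;
extern mailbox_t      wifi_mb;

#endif

/* wifi_events.c -- the single place where the objects are defined */
#include "ch.h"
#include "wifi_events.h"

EVENTSOURCE_DECL(wifi_evt_src);

static msg_t wifi_mb_buffer[8];
MAILBOX_DECL(wifi_mb, wifi_mb_buffer, 8);

Every file that includes wifi_events.h then refers to the same objects instead of silently getting its own copy.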

Ciao,

 

Julien

[Kudly] Presentation day

Hi,

Today, we presented our project to Sam and Alexis. Our Kudly bear has several features.

First, we made a game that teaches the child the colors: he has to recognize the color shown in one of the ears and press the paw on the side of that color.

Next, we made a program that teaches him the alphabet. It says the letters and asks the child to repeat them. The sound is recorded and sent to the server, and the parents can listen to it later.

Then, you can hug the bear: it plays a sound and turns on the LEDs.

Finally, you can take a photo and send it to the server.

We still have some bugs to fix: no data was sent to the server while we were presenting our demo. Once again, Murphy was there 🙂

Ciao,

 

Julien

[Kudly] Server, websockets, streaming and sewing

Hi,

Today I did a lot of different things.

First, I worked on the server and the web application. For now, there are only a few HTML pages. However, we would like to send care data such as the temperature to the server, so I looked for a way to display this data: I made a chart with amCharts. It was quite easy and the result is decent. Next, I added a streaming function. As I am developing the app with the Play framework and Scala, I used Iteratees and Enumerators. I still need to test this function more thoroughly because I am not sure of the result. The streaming uses websockets, and since we are sending bytes and not strings, the header is different, which caused some problems.

I also helped some people fix bugs. File downloading and uploading now work. We are still waiting for the websockets to try the streaming.

I also did some sewing: I attached the sensors inside the bear, and we found a way to open and close the bear quickly. It is good enough for what we want to do.

Ciao,

 

Julien

[Kudly] Camera and websockets

Hi,

I improved the quality of my image once again: I used the link given by Eric and changed all my registers. I am using double buffering and I had some problems with my images. It seems that I am not writing to the SD card fast enough, so the DMA overwrites what was in memory before it has been saved. I therefore have to use a buffer big enough (currently 50 kB in my case) to obtain a correct image; one way to decouple the capture from the SD writes is sketched below.
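
Just to illustrate the idea, here is a rough sketch in which a capture callback hands filled buffers to a writer thread through a mailbox, so the SD write never stalls the capture. The callback name, the buffer sizes and the file handling are made up for the example; only the ChibiOS and FatFS calls are real.

#include "ch.h"
#include "ff.h"

#define IMG_BUF_SIZE (50 * 1024)

static uint8_t img_buf[2][IMG_BUF_SIZE];

static msg_t img_mb_buffer[2];
static MAILBOX_DECL(img_mb, img_mb_buffer, 2);

/* Hypothetical hook called from the DCMI/DMA "buffer full" interrupt. */
static void capture_done_cb(int filled_idx) {
  chSysLockFromISR();
  (void)chMBPostI(&img_mb, (msg_t)filled_idx);  /* frame dropped if queue full */
  chSysUnlockFromISR();
}

/* Writer thread: fetches a filled buffer index and appends it to the SD card. */
static THD_WORKING_AREA(waWriter, 1024);
static THD_FUNCTION(writerThread, arg) {
  FIL *file = (FIL *)arg;
  msg_t idx;

  while (true) {
    if (chMBFetch(&img_mb, &idx, TIME_INFINITE) == MSG_OK) {
      UINT written;
      f_write(file, img_buf[idx], IMG_BUF_SIZE, &written);
    }
  }
}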

I am working on the server too: I wrote a simple POST echo handler and a websocket handler that echoes messages. We are trying to use them from our wifi card (AW006). The POST was easy: we pass the arguments in the URL. The websockets were a bit more difficult, as they do not seem to be directly supported. If we open a websocket connection through the http_get method, we get a 101 response (which is the websocket upgrade response), but afterwards it is impossible to write to the stream.

So we are opening a raw TCP connection and writing our own frames. After reading the websocket RFC (RFC 6455), we understood how to encode and decode them. We tried with a basic example and it seems to work 😉
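
For reference, building a client-to-server frame by hand looks roughly like the sketch below (payloads shorter than 64 kB only; the 4-byte mask should come from whatever random source you have). Server-to-client frames are the same, except that they are not masked.

#include <stdint.h>
#include <stddef.h>

/* Build a masked client-to-server frame (RFC 6455), for payloads shorter
 * than 64 kB. opcode: 0x1 = text, 0x2 = binary. Returns the frame length. */
static size_t ws_build_frame(uint8_t *out, uint8_t opcode,
                             const uint8_t *payload, size_t len,
                             const uint8_t mask[4]) {
  size_t pos = 0;

  out[pos++] = 0x80 | (opcode & 0x0F);        /* FIN bit + opcode           */

  if (len < 126) {
    out[pos++] = 0x80 | (uint8_t)len;         /* MASK bit + 7-bit length    */
  } else {
    out[pos++] = 0x80 | 126;                  /* MASK bit + 16-bit length   */
    out[pos++] = (uint8_t)(len >> 8);
    out[pos++] = (uint8_t)(len & 0xFF);
  }

  for (int i = 0; i < 4; i++)                 /* 4-byte masking key         */
    out[pos++] = mask[i];

  for (size_t i = 0; i < len; i++)            /* payload XORed with the key */
    out[pos++] = payload[i] ^ mask[i % 4];

  return pos;
}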

Here is a French fry for the short post 🙂

See you,

 

Julien

[Kudly] High quality

Hi,

I have just taken a better-quality photo. Here is a family photo:

(photo: famille)

See you 🙂

Julien

[Kudly] Cheese V2

Hey,

Yesterday, I finally got an image without any problem 😉 I will explain what I did to use the DCMI and the OV2640 on an STM32F4xx with ChibiOS.

First, the DCMI is not supported by ChibiOS, but someone wrote a driver that more or less works and I adapted it to make it work. You can find it here: https://github.com/EwanColdicott/ChibiOS . There were some problems. First, dcmi_lld.h uses DCMI_CR_CRE instead of DCMI_CR_CAPTURE; without changing that, I obtained strange results. Next, in dcmi_lld.c, the receive function defines the DMA mode again even though it was already defined in the init function, but this time the interrupt flags are forgotten. So I simply kept the value set during init.

The VSYNC interrupt is also enabled by default and it causes problems: the DCMI interrupt is called multiple times, whereas in one-shot mode it is supposed to fire only once, otherwise the DCMI driver thinks it is in continuous mode. So I changed the interrupt-enable line:

/* enable only the frame-complete interrupt, VSYNC_IE stays disabled */
dcmip->dcmi->IER |= STM32_DCMI_IER_FRAME_IE;

To communicate with the camera, an SCCB interface is needed. I wrote my own driver and Eric made it better; you can find it in this post: http://rose.eu.org/2015/04/09/heros-faster-camera-configuration . It looks like I2C.

Now, the configuration of the camera. Open your grimoire at page 404, because we are going to do dark magic: not all the registers are defined in the datasheet, so I found configurations on the internet. Here is my main source of inspiration: http://tech.munts.com/MCU/Frameworks/ARM/stm32f4/libs/STM32F4xx_DSP_StdPeriph_Lib_V1.1.0/Project/STM32F4xx_StdPeriph_Examples/DCMI/DCMI_CameraExample/dcmi_ov2640.c . I initialized my registers with those configurations and it works. At the beginning, I was not able to get a pixel clock: it had to be activated in a hidden register…
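
In practice, applying such a configuration just means replaying a table of {register, value} pairs over SCCB, switching between the two OV2640 register banks through register 0xFF along the way. The sketch below only shows the mechanism; the values are placeholders, and the real list has to come from a configuration like the one linked above.

#include <stdint.h>
#include <stddef.h>

/* sccb_write() is the SCCB driver mentioned above: it writes one byte
 * into one camera register. */
extern void sccb_write(uint8_t reg, uint8_t val);

struct reg_val {
  uint8_t reg;
  uint8_t val;
};

static const struct reg_val ov2640_init[] = {
  {0xFF, 0x01},   /* select the sensor register bank */
  {0x12, 0x80},   /* COM7: software reset            */
  /* ... */
  {0xFF, 0x00},   /* select the DSP register bank    */
  /* ... */
};

static void ov2640_apply(const struct reg_val *table, size_t count) {
  for (size_t i = 0; i < count; i++)
    sccb_write(table[i].reg, table[i].val);
}

Calling ov2640_apply(ov2640_init, sizeof(ov2640_init) / sizeof(ov2640_init[0])) then replays the whole table.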

I also modified the clock speed register because taking a picture took too long (I changed a prescaler). After changing a few more parameters, such as the clock polarity (sampling on the rising edge), the camera finally worked.

Cheese !

(photo: test)

 

Julien

[Kudly] Cheeeese

Hi,

The camera is finally working! After (a lot of) black magic, I finally found a way to output an image in JPEG. In fact, not all the registers are documented, so I compared existing OV2640 code to find a register configuration that works. Here is the result:

(photos: kudlycamera, imagecamera)

The colors are not really good, but I hope I will be able to tune them 🙂

See you,

 

Julien

[Kudly] Camera Stuff

Hi,

Last weekend, I started on the camera, an OV2640. First, I tried to communicate with it: I implemented the SCCB protocol, which looks like I2C. SCCB is a protocol developed by OmniVision. After some tries with different timings, I was finally able to control the registers of my camera; a bit-banged write sequence is sketched below.
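
Here is a rough sketch of the kind of bit-banged write I ended up with. The pins and the delay are placeholders for the real board configuration, and both lines are assumed to be configured as open-drain outputs; each of the three phases is 8 bits followed by a ninth don't-care bit.

#include "ch.h"
#include "hal.h"

#define SCCB_PORT  GPIOB
#define SCCB_SCL   8
#define SCCB_SDA   9

static void sccb_delay(void) {
  chThdSleepMicroseconds(10);          /* keep the bus well below 100 kHz    */
}

static void sccb_start(void) {
  palSetPad(SCCB_PORT, SCCB_SDA);
  palSetPad(SCCB_PORT, SCCB_SCL);
  sccb_delay();
  palClearPad(SCCB_PORT, SCCB_SDA);    /* SDA falls while SCL is high        */
  sccb_delay();
  palClearPad(SCCB_PORT, SCCB_SCL);
}

static void sccb_stop(void) {
  palClearPad(SCCB_PORT, SCCB_SDA);
  palSetPad(SCCB_PORT, SCCB_SCL);
  sccb_delay();
  palSetPad(SCCB_PORT, SCCB_SDA);      /* SDA rises while SCL is high        */
  sccb_delay();
}

/* One 9-bit phase: 8 data bits MSB first, then a don't-care 9th bit. */
static void sccb_phase(uint8_t byte) {
  for (int i = 7; i >= 0; i--) {
    if (byte & (1 << i))
      palSetPad(SCCB_PORT, SCCB_SDA);
    else
      palClearPad(SCCB_PORT, SCCB_SDA);
    sccb_delay();
    palSetPad(SCCB_PORT, SCCB_SCL);
    sccb_delay();
    palClearPad(SCCB_PORT, SCCB_SCL);
  }
  palSetPad(SCCB_PORT, SCCB_SDA);      /* release SDA for the 9th bit        */
  sccb_delay();
  palSetPad(SCCB_PORT, SCCB_SCL);
  sccb_delay();
  palClearPad(SCCB_PORT, SCCB_SCL);
}

/* 3-phase write: device address, register address, register value. */
static void sccb_write(uint8_t reg, uint8_t val) {
  sccb_start();
  sccb_phase(0x60);                    /* OV2640 write address               */
  sccb_phase(reg);
  sccb_phase(val);
  sccb_stop();
}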

Next, I tried to get a picture and my problems began. I feed a clock to the camera on the XCLK pin (the clock is MCO1), and the camera is then supposed to give me back a clock on the PIXCLK pin, but I received nothing. I checked in the registers whether it was enabled but found no obvious problem. In fact, I found an initialization sequence in the openmv project: you have to write to registers that are not in the datasheet! So I finally obtained a PIXCLK… Thank you, OmniVision.
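
For the XCLK side, the MCO1 output (PA8 on the STM32F4) can be set up more or less like this; the HSE source and the divide-by-4 prescaler are just an example and should be checked against the RCC chapter of the reference manual and your own clock tree.

#include "hal.h"

/* Route a clock to the camera's XCLK input through MCO1 on PA8. */
static void xclk_start(void) {
  palSetPadMode(GPIOA, 8, PAL_MODE_ALTERNATE(0) |       /* AF0 = MCO        */
                          PAL_STM32_OSPEED_HIGHEST);

  RCC->CFGR &= ~(RCC_CFGR_MCO1 | RCC_CFGR_MCO1PRE);
  RCC->CFGR |= RCC_CFGR_MCO1_1                          /* 10: HSE source   */
             | RCC_CFGR_MCO1PRE_2 | RCC_CFGR_MCO1PRE_1; /* 110: divide by 4 */
}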

I also started a server for Kudly. Yesterday, I got Ogg streaming working on a page with chunked packets.

See you,

 

Julien

[Kudly] First demonstration

Hi,

Yesterday, we gave the first demonstration of our work. As we do not have our board yet, we used the lab board. We showed how the SD card works and the functions we can use from the shell. We also showed the beginning of our server, which simply answers a few requests.

I started to write code for the board even though I do not have it yet. I hope most of the code we wrote will work, because we adapted what we did during the lab.

I also started on the camera and wrote an SCCB driver. SCCB is an “I2C-like” bus, but as it is not handled by a hardware peripheral on the board, we have to generate the signals in software.

See you 🙂

 

Julien

Challenge day

Hi,

Yesterday was a tough day: I spent most of my time coding for the communication challenge, a challenge designed by our teachers. The goal was to display a maze on a screen through the network and to solve it with an on-board joystick.

The first part was dedicated to network communication: I had to send requests to a server in several different ways. First, a simple request. Next, I had to send a request very quickly after receiving a URL in a packet. Then, I had to answer questions that were sent to me in a request. For this part I reused the code from the last lab, so it went quite fast. However, my solution was sometimes a bit of a cheat: for example, I was not detecting the end of the HTTP header, and I had to add that later (see the small sketch below).
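
Detecting the end of the header simply means spotting the empty line, i.e. the \r\n\r\n sequence, in the incoming bytes. A minimal way to do it, fed one byte at a time:

#include <stdint.h>
#include <stdbool.h>

/* Returns true once the terminating CRLF CRLF of an HTTP header has been
 * seen. Feed it the received bytes one by one; *state must start at 0. */
static bool http_header_complete(uint8_t byte, int *state) {
  static const char pattern[] = "\r\n\r\n";

  if (byte == (uint8_t)pattern[*state])
    (*state)++;
  else
    *state = (byte == '\r') ? 1 : 0;

  return *state == 4;
}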

Then came the fun part: the maze! I began with the screen initialization: it uses SPI, and I displayed a test picture. Next, I received a picture through the network and displayed it on the screen; it was a check code needed to move on to the next step. Then I implemented the joystick. It was quite easy because the code is similar to the button handling I wrote during the lab. The joystick wakes up a thread that sends a request to the server to obtain the next position in the maze. Once the maze was completed, I had finished the challenge.

I had some difficulties in this last part. First, I do not understand why, but if I do not set a timeout on reception, my program tries to read forever; as I wanted to make fast requests, that was a bit of a problem. Next, I had not understood at first that the framebuffer is wider than what is actually visible, just like VGA, so my picture was shifted. Finally, when I used the joystick, I first disabled the interrupts because I did not want a joystick update to happen while I was updating the screen. Sam advised me to instead wait for the event with an immediate timeout, which in effect just clears the pending event flags (see the sketch below).
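
In ChibiOS terms, the trick looks roughly like this (JOYSTICK_EVENT is a placeholder for the actual event mask):

#include "ch.h"

#define JOYSTICK_EVENT EVENT_MASK(0)   /* placeholder event mask */

/* Instead of disabling interrupts around the screen update, drop any
 * joystick event that arrived in the meantime: waiting with an immediate
 * timeout just clears the pending flags without blocking. */
static void discard_pending_joystick_events(void) {
  (void)chEvtWaitAnyTimeout(JOYSTICK_EVENT, TIME_IMMEDIATE);
}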

Well, it was really great, but I was very tired because I had to concentrate a lot.

See you !

Julien