Sunday, November 10, 2013

RS4 - Self balancing Raspberry Pi OpenCV image processing Robot

Here is the robot I'm working on. You can see the latest video here, although it has undergone some modifications since then.

I'll divide this description into topics, as it's easier for me to describe it this way. The idea to build this robot came from buying a Raspberry Pi; when I saw it I said "I've got to build a robot with this" :) . I have built other robots in the past, but this one is the most complex and the first with image processing.


The robot chassis was designed by me; I used a 3D tool to generate some previews, mainly because I needed an idea of the size and component distribution before building it. Here you can see the model of the robot:

After this I began the building process. I bought a carbon fiber plate (more or less the size of an A4 sheet) and cut all the pieces by hand with a mini drill machine (unfortunately I don't have a CNC machine to do this job). I bought some aluminum profiles to make spacers and fixing parts, as you can see in the next photo. The result is a very light and strong chassis.

Motors and wheels

I'm using stepper motors in this robot, for no special reason. I bought them as NEMA 17 motors; the motors' reference is LDO-42STH38-1684A 121121 LDO MOTORS. This type of motor has a nice robust look and is usually used in CNC and RepRap machines.
The wheels are from an RC 1/8 buggy; you can find them easily in any RC store as they are a standard size. What I like most about these wheels is their soft touch: they work as a damper for small obstacles, allowing a smooth run.

To connect the wheels to the motors I used Traxxas Revo hubs and nuts, as shown in the photo. These are the only ones I found with a 5 mm hole, the same as the motor shaft, so it's more or less plug and play.


For pan and tilt I use 2 micro servos (Tower Pro MG90S), very cheap and easy to get. The head has a holder for the Raspberry Pi camera module, an ultrasonic sensor, and 2 RGB LEDs.
You can see some details of the robot in the next photos.

Balancing and motor control Board

This robot uses a dedicated board for balancing and motor control (I want to use the Raspberry Pi only for high-level tasks). This board is my own design and uses the following components:
 - 2 L298 + 2 L297 stepper motor drivers (yes, I know they are old, but they are cheap and easy to find too; in a future revision I'll use something from this century :) );
 - Murata ENC-03 gyroscope, an analog single-axis gyro, very easy to use;
 - MMA7361L accelerometer, a 3-axis analog accelerometer (I use a module; this chip is too small for hand soldering);
 - PIC24FJ64GA002 microcontroller.
It allows I2C and serial communication. Photo of the board and the motors here:

Servo control board

I'm using a modified motor board to control the two servos and to read the ultrasonic sensor (not yet in use). This is a temporary solution; I intend to design a dedicated servo control board or buy one.


The robot is powered by a 2000 mAh 3S LiPo battery. To generate the required voltages I'm using one 3.3 V regulator and two 5 V switching regulators. I want to design a dedicated power board in a future revision.

Balancing control 


Balancing control is performed by a PID cascade, as shown in the next picture. This way it is possible to balance the robot even if you move its center of mass or run it on a ramp: it will find a new angle that allows it to stay balanced and stopped. In fact both controllers are PI only; the derivative gain is set to 0 because it causes the robot to shake, even with a small gain.

  PID implementation is as simple as this:
    pTerm = Kp * error;
    sum += error;
    iTerm = Ki * sum*Ts;
    dTerm = Kd * (error - lastError) / Ts;
    Cn = pTerm + iTerm + dTerm;
    lastError = error;
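To make the snippet self-contained, here is the same loop wrapped in a reusable function; the struct and function names are mine, not the original firmware's:

```c
/* Discrete PID controller, matching the snippet above: the integral
   accumulates raw error (scaled by Ts when applied) and the derivative
   is a backward difference over one sample period. */
typedef struct {
    double Kp, Ki, Kd;   /* controller gains */
    double Ts;           /* sample period in seconds (10 ms on this robot) */
    double sum;          /* accumulated error for the integral term */
    double lastError;    /* previous error for the derivative term */
} Pid;

/* One control step: returns the actuation value Cn for the current error. */
double pid_step(Pid *p, double error)
{
    double pTerm = p->Kp * error;
    p->sum += error;
    double iTerm = p->Ki * p->sum * p->Ts;
    double dTerm = p->Kd * (error - p->lastError) / p->Ts;
    p->lastError = error;
    return pTerm + iTerm + dTerm;
}
```

With Kd set to 0 this reduces to the PI controllers the robot actually runs.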
For PID tuning I used a Bluetooth module which allows me to adjust Kp, Ki, and Kd for both controllers in real time. This way you can immediately see the effects and reach the desired behavior. In this video you can see it successfully balanced for the first time.

Sensor fusion

Sensor fusion (gyroscope + accelerometer to get the leaning angle) is performed by a Kalman filter. Not much to say about it: it works really well. Follow this fantastic tutorial; it has everything you need to know, including explanation and implementation.
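The robot itself runs a Kalman filter (see the tutorial above), but the core idea of fusing the two sensors can be illustrated with the simpler complementary filter: integrate the gyro for short-term accuracy and lean slowly toward the accelerometer angle to cancel drift. A minimal sketch, not the robot's actual filter:

```c
/* One fusion step. alpha close to 1 trusts the gyro integration;
   (1 - alpha) slowly pulls the estimate toward the accelerometer angle. */
double fuse_angle(double angle, double gyroRate, double accelAngle,
                  double dt, double alpha)
{
    return alpha * (angle + gyroRate * dt) + (1.0 - alpha) * accelAngle;
}
```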
OK, the robot is balanced, but now it is necessary to move it. Moving forward and backward is quite easy with this PID cascade setup: you just give a set point to the first controller and it will calculate the appropriate leaning angle to reach that speed. Something like this:

  setAngle = calcCn1(instSpeed - setSpeed);
  instSpeed= calcCn2(angle - setAngle);
To turn the robot I attenuate the speed of one wheel, depending on the side it needs to turn to. This way the robot keeps its balance, as both wheels still reflect the control system speed. The implementation looks like this:
 instSpeedL = instSpeedR = instSpeed;
 motorSpeedL(instSpeedL * factorL);
 motorSpeedR(instSpeedR * factorR);
 0 ≤ factorL ≤ 1,     0 ≤ factorR ≤ 1  
To perform spins, rotating in place, I give an opposite speed offset to each wheel. With the wheels rotating at symmetric speeds the robot performs a spin and stays balanced. Completing the implementation, it looks like this:
 motorSpeedL(instSpeedL * factorL + spinSpeed);
 motorSpeedR(instSpeedR * factorR - spinSpeed);
 If spinSpeed is positive the robot will spin clockwise; otherwise it will spin counterclockwise.
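Putting the turn factors and the spin offset together, the wheel mixing might be sketched as below (the function name and signature are mine):

```c
/* Mix the balance controller's output into per-wheel commands.
   factorL/factorR in [0, 1] attenuate one wheel to steer;
   spinSpeed adds opposite offsets so the robot rotates in place. */
void mix_wheels(double instSpeed, double factorL, double factorR,
                double spinSpeed, double *speedL, double *speedR)
{
    *speedL = instSpeed * factorL + spinSpeed;
    *speedR = instSpeed * factorR - spinSpeed;
}
```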
That's the way I found to control the robot's motion; there are possibly other methods. Another important thing: with stepper motors you shouldn't apply big speed changes abruptly or they will slip. This can be solved with a low-pass filter applied to factorL/R and spinSpeed, and it works well on my robot. In this video you can see a run with Bluetooth control; it can run faster than this, but will easily fall if it finds small bumps on the road.
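One simple form of that low-pass behavior is a slew-rate limiter that only lets the commanded value move a bounded step toward its target each control cycle. A sketch under that assumption (not the firmware's exact filter):

```c
/* Advance current toward target by at most maxStep per call, capping
   how abruptly the stepper speed command can change. */
double slew_limit(double current, double target, double maxStep)
{
    double diff = target - current;
    if (diff > maxStep)  return current + maxStep;
    if (diff < -maxStep) return current - maxStep;
    return target;
}
```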

Raspberry Pi

I'm using a Raspberry Pi model B 256 MB with a micro SD adapter because of the limited space on the robot. I have a small WiFi adapter, but the robot is not using it yet. The installed operating system is Raspbian. I managed to get OpenCV working with the camera module thanks to this tutorial; great stuff here:
At the moment I'm using serial communication between the Raspberry Pi and the motor and servo control boards, but I intend to use I2C as it is a more appropriate method. The reason I'm using serial now is that the interface code was already done for the Bluetooth module (it is a cheap serial Bluetooth module). I still have to spend some time working on the I2C interface.
Serial interfacing with the Raspberry Pi is quite easy; you just have to disable terminal emulation on the port. I'm using the WiringPi library for serial communication and to control the Pi's GPIOs, without any issues.
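The wire protocol isn't documented here, but on the microcontroller side a serial link like this is typically handled with a small byte-at-a-time parser. The 4-byte frame below (start byte, command, value, XOR checksum) is purely hypothetical, just to show the pattern:

```c
#include <stdint.h>

/* Parser state for a hypothetical frame: 0xA5, cmd, value, cmd ^ value. */
typedef struct { int state; uint8_t cmd, value; } FrameParser;

/* Feed one received byte; returns 1 and fills *cmd / *value when a
   complete, checksum-valid frame has been seen. */
int parse_byte(FrameParser *p, uint8_t b, uint8_t *cmd, uint8_t *value)
{
    switch (p->state) {
    case 0: if (b == 0xA5) p->state = 1; break;  /* wait for start byte */
    case 1: p->cmd = b; p->state = 2; break;
    case 2: p->value = b; p->state = 3; break;
    case 3:
        p->state = 0;
        if (b == (uint8_t)(p->cmd ^ p->value)) { /* checksum matches */
            *cmd = p->cmd;
            *value = p->value;
            return 1;
        }
        break;
    }
    return 0;
}
```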

Image processing

I have very little experience with image processing; this is the first time I'm using OpenCV and I'm still learning how to use it. My first example is object tracking (a ball) by color filtering, as in this tutorial:
It works well but is sensitive to lighting changes. At the moment I'm using the YCrCb color space instead of HSV, but the results are similar. With the object's screen coordinates I control the servos to point the camera at the object, and the robot's direction is controlled based on the head angle.
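The real code gets the object coordinates from OpenCV, but the underlying step, turning a thresholded binary mask into a target position for the servos, reduces to a centroid computation. A plain-C sketch of just that step:

```c
/* Centroid of the nonzero pixels in a w x h binary mask (row-major).
   Returns 1 if any pixel is set; 0 if the object is not visible. */
int mask_centroid(const unsigned char *mask, int w, int h,
                  double *cx, double *cy)
{
    long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            if (mask[y * w + x]) { sumX += x; sumY += y; count++; }
    if (count == 0) return 0;
    *cx = (double)sumX / count;   /* pan error would be cx - w / 2.0 */
    *cy = (double)sumY / count;   /* tilt error would be cy - h / 2.0 */
    return 1;
}
```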
Ball following was the first simple example that integrated all the parts of the robot; its behavior was funny and I decided to publish the video on YouTube.

Final remarks

This robot is an ongoing project: I'm continuously building new parts and modifying others. I don't have a defined goal for it, but I would like to give it some autonomous navigation capabilities. It has real potential; I just have to work on the image processing and learn some more techniques. I intend to add a speaker too.
The initial robot sketch had 2 arms. They would look cool, but they are a lot of work to build, and I'm aware it's hard to give them a useful function like grabbing objects. I could use arms to get the robot back upright after a fall; maybe in a future update.
I have implemented odometry in this robot, although at the moment I'm not using it. A 3-axis gyro would be very useful to correct odometry angle errors, a point to review in a future revision.


  1. How do you measure the robot's speed?

  2. You don't need to measure speed with steppers. They just do the steps you command, so your input is the real speed, unless they slip.

  3. Hi!
    I wonder, are the 2 PID controllers for angle and speed similar?
    I think output += pTerm + iTerm + dTerm for the speed PID controller
    and output = pTerm + iTerm + dTerm for the angle PID controller.
    What do you think about that?

    1. Hello Kim

      Yes, you are right. Both controllers are the same in implementation but with different gains (Kp,Ki).

  4. Hello!
    I've been trying to use OpenCV for a robot project as well, but it turned out to be a daunting task.
    How did you cross-compile for the RPi? Did you program and debug in VS?

  5. Hello André (are you Portuguese?)
    I'm not cross-compiling for OpenCV; I'm programming directly on the Pi.
    Sometimes it takes a long time to compile.

  6. Hello!
    Thanks for your reply. Yes, I am :P I'm in Póvoa de Varzim, what about you?
    I tried to do all the programming on the RPi with Geany, but I gave up on that idea because of the speed. I thought I could do a lot better with a familiar interface like VS along with the speed of a few-GHz CPU, but I failed to compile the OpenCV libraries for Windows, with errors I knew nothing about and whose solutions were difficult to find on the web.
    So I tried Linux on a VM, and it worked, apart from some hardware problems related to passing the USB camera from the main OS to the VM OS.
    I finally decided to grab an old computer with Linux and compiled the libraries there. It is working fine (with some adjustments), but the best I can do is write and debug the code there, then grab the *.cpp, place it on the Pi, and compile it there. So far all the code that worked on the computer has also worked on the RPi, which shows good compatibility between OpenCV versions.
    Your work is very impressive and it's very nice to see that robot doing its thing.
    Please keep posting new stuff!

    1. I asked because your name looked Portuguese. I'm living in Porto.
      I'm doing something similar to what you're doing: I'm using a Linux virtual machine with OpenCV on the Eclipse IDE. I do all my coding first on the computer, and when I get something I like, I compile it on the RPi. I've never had a problem with compatibility between OpenCV versions.

      From now on it will be hard to work on the robot; I have a new job and not much time for it. If you need something, just ask.

    2. We're actually close to each other. I wanted to master OpenCV as quickly as possible because I wanted the robot I'm working on to be ready by January, but I'm a bit lost with the OpenCV workflow. I'm really not very good with C++, so it's being difficult to write the code for what I want. The arrow functions you have developed are exactly one of the features I was thinking of putting in the robot. How were you able to master OpenCV? Did you follow the tutorials or something? Sad to know you will have little time for the self-balancing robot, but I hope to see more amazing projects from you.

  7. This comment has been removed by the author.

  8. Hi!
    I was wondering, is the Raspberry Pi self-contained? What I'm trying to say is: did you run OpenCV on a monitor and then disconnect the Pi, or is it communicating with a monitor in some way, maybe wirelessly?

    1. Hello,

      I'm using an external monitor, mouse, and keyboard. To test it on the floor I unplug everything (with the application running). You can see it in the photo.

  9. Do you recommend a place to start with openCV for robotics ?

    1. Hello

      I didn't follow any particular tutorial to begin with OpenCV. What you can do is install it on your PC and start playing with it. There are many examples on the web of object tracking that you can try. Even on YouTube you can easily find some tutorials on that.

  10. Hi, I have just got my first Raspberry Pi, saw your project, and thought it was cool and something I would like to build. Could I use a USB webcam instead of the Pi camera? How are the servos for the head driven? It wasn't clear from the info whether they have a separate driver or run directly from the Raspberry Pi. Thanks for any help you can give.

    1. Hello Paul
      Yes, you can use a USB camera, but it will be slower than the Pi camera board (see the thinkrpi website). The head servos are not controlled directly from the Pi; a microcontroller generates the signals. Good luck with your work :)

  11. Hi Samuel. Appreciated! Cool job! :)
    I also tried to build my first self-balancing draft a week ago. The physics of the "Brushless Gimbal Controller" project aren't so appropriate for a self-balancing robot, but anyway I got it standing :) :) Next step: I'm going to adapt the MultiWii project (I'm an old fan and was a MultiWii developer in the past) with soft drivers for brushless gimbal motors...

    Also, is it possible to look through the code of your project? Is it open source?


    1. At the moment my code is not open source. I have no time for this project right now. I'll try to update the blog with more information soon.

  12. Good morning Samuel, excellent work. We have a small project called Jabutino, using an Arduino; look it up on YouTube when you can. I was working on a hexapod when I came across your project... I stopped everything... dusted off my Raspberry... Fantastic. I would really like to learn from your project; I have some ideas for the social inclusion of people with disabilities... I imagine a mute person interacting with a robot like yours through gestures and sign language, or a child learning geometric shapes and colors with the robot... But here is what I would like to know, if you can tell me, please feel free... Can you give a tip on how to make the robot chase the ball or recognize an instruction from small signs? Many thanks, Luis Oliveira (Brazil, Salvador-Bahia)

    1. Hello Luís,

      There is some information here on the blog about how the computer vision strategies were implemented with OpenCV. I want to add more information, including code that can be tested, but for now I haven't had much free time. When I can, I will update the blog with more information.

  13. Hello Samuel, how have you been? Well, I hope!
    It was quite a lot of work to get the Raspberry's native camera running, even following the tutorial you recommended; fortunately, after a lot of struggling, everything worked out.
    I'm following the tutorial and it's not easy. I'm having problems running the ObjectTrackingTut.cpp program. It compiles without any error, but when I run it the following error appears: Assertion failed ((scn == 3 || scn == 4) && (depth == CV_8U || depth == CV_32F)) in cvtColor, file color.cpp, line 2957. Sorry to abuse your patience, but do you have any idea what it could be? The camera is working perfectly with the test programs. I'm using OpenCV 2.3.8.

  14. Hi, well done! I am doing something similar for a school project: we are tracking a color using the Raspberry Pi camera board. I find that when running OpenCV on the Raspberry Pi, my image tracking runs very, very slowly (about 2-5 fps), which is entirely too slow for real-time object tracking. Your robot seems to respond very fast. How many fps do you get? How did you achieve this? Any tips would be great!

  15. Hello

    Maybe you are using a higher resolution. I'm using 320x240 and it works well. With color tracking I think I get around 15 fps.

  16. Hi
    A big thank you for your explanations.
    I could not figure out how to limit the speed!
    Your method works too well.
    Sorry for my bad English ...

  17. Hi Samuel, compliments for your bot. Is the inner PID (angle) executed the same number of times per second as the outer PID (velocity), or is the inner PID executed more often than the outer?


    1. Hi :)

      Both are executed at the same time interval, every 10 ms.

  18. Hi Samuel.
    Why did you decide to use a Kalman filter and not a complementary filter?

    What advantages does the Kalman filter have over the complementary filter? More accurate? Faster?

    To use I2C gyroscope + accelerometer? Bitwise? Or SMBUS?

    Thank you.

    1. Hello Juan,

      I have simulated both filters and the Kalman is a little better; it is also more complicated and needs more processing. You can try both and see what's best for you.

      I don't understand your last question...

    2. Sorry for my English.

      How do you acquire the gyroscope data? Using I2C? Is the speed enough?

      What resolution do you use on the stepper motors? Does it change according to the state? For example: to go forward = full step; to stay balanced = 1/16?

      Thanks in advance

    3. My gyro has an analog output; I have to use an ADC to convert it. I've used an I2C gyro in other projects and it is fast.

      I'm using half step on the motors; unfortunately the drivers don't allow better resolution. I have a redesigned motor board with better drivers, but it is only on paper...

  19. This comment has been removed by a blog administrator.

  20. How did you make it so your robot doesn't turn back to its spot when you push it? I made a similar balancing robot with a cascade PID, but mine always comes back to its spot, and it is difficult to make it move because after stopping it always turns back. In your video I can see that your robot stays in the new place after being pushed.
    I used DC motors instead of steppers and I measure the speed with encoders, but I think it doesn't matter. The PID looks and works the same...

    1. Hi Hubert,

      In fact my robot has the same behavior you describe. In the video where you see me pushing it with my hand, it is working with only one PID stage; that way it doesn't return to the starting point.
      I don't see why this is a problem for you, can you explain?

    2. I wanted to make it move forward and backward (spins and steering I want to add later) with my TV remote control and an IR receiver. I did it this way:
      I use three buttons: "forward", "backward" and "stop".
      When I press the "forward" button I set the setpoint for the speed PID to a particular positive value. Then when I press the "stop" button I set this setpoint to 0, as it is by default when the robot doesn't move. "Backward" works the same as "forward" but I set a negative value as the setpoint.

      The speed PID calculates the leaning angle for the angle PID to start moving, but it doesn't move smoothly. It moves, for example, forward, but after a while it looks like it is trying to stop (it slows down) and then it accelerates again with a greater speed. After several such oscillations it falls down.

      Also, when I press "stop" while the robot moves forward or backward, it usually turns back to the place where it was before the movement. For instance it goes 1 meter forward and after stopping it goes 1 meter backward, so effectively it stays in the same place.

      I don't know what I am doing wrong. I just change the setpoint for the speed PID just like you wrote in your post. Maybe I should also make some stopping routine: not just immediately change the setpoint to 0, but, for instance, decrease it gradually until it reaches 0?

    3. This comment has been removed by the author.

    4. This video shows the situation.

      As you can see it moves with oscillations and it also doesn't brake smoothly. It looks like my PID, especially the speed loop, isn't tuned well. When it tries to achieve a setpoint speed it overshoots, and in effect it slows down and undershoots, and this process goes on forever. The same happens during braking: when I change the speed setpoint to 0 and the current speed is much greater than 0, it tries to reach 0 very quickly, so the robot brakes very sharply and becomes unstable.

      I have no idea how to tune it to work well. When it stands on the spot it looks good. It's very difficult to tune a PID while the robot moves... :(
      Did you have such problems while tuning your PID, including the outer speed loop? Maybe you did it in some special way...

    5. It looks like a PID tuning problem; I've seen similar behavior in my robot but with a lower oscillation frequency. I don't have any special way, just trial and error...
      In my case the speed set point is increased/decreased gradually, mainly because of the steppers: they slip if the speed increase/decrease is too high. Try it in your case.
      I'll try to make a video of the push behavior with the 2 PID stages for you to compare; it is very distinct.

    6. You mean you don't just give the set point as a particular value, but you increase or decrease it gradually until this value is reached? I mean it doesn't jump from 0 to 100 but is incremented in each loop as long as it isn't 100. (100 is only an example value.)

    7. Yes, just like that. See this video:

      See how the robot leans to counteract the disturbance; I don't see this behavior in yours.

    8. This comment has been removed by the author.

  21. Thank you very much for the video. Your robot turns back to its spot smoothly, without any oscillations. My robot also turns back to its spot after a disturbance, but with oscillations. Now I know how it should look. Maybe I will get this PID tuned some day :) Thank you :)
    I would like to ask you only one more question. In the situation where you don't control your robot (like in the video) and it just stays on the spot, is the setpoint for the speed PID loop always 0?

    1. Yes, if it is stationary the speed setpoint is 0.
      You are going in the right direction; it just needs small adjustments to work properly.

    2. Hi. I succeeded in making my robot move smoothly. It was just a matter of PID constants. Now I have my own PCB and a Raspberry Pi connected to it via serial, so I can tune it from my PC's keyboard (VNC server). I've spent some hours and achieved the following result:

      I think it works quite well. Now I can move on and add some vision algorithms with OpenCV. Thank you very much for your help :)

  22. This comment has been removed by the author.

  23. Hello Samuel, I'm building a robot; I already have the PID working with the MPU6050 and a Kalman filter.
    I'm having trouble getting smooth movement out of the stepper motors with the motion-control PID; I use two L298 H-bridges.
    What can I do on the Arduino to improve the motors' movement?

    1. Hello Nuno,

      What I do to make the motors' movement smoother is vary the current according to the motion requirements. A lower current makes the steps smooth but loses torque; I use it when the robot is stopped or at low speed. If high speed is needed, I increase the motor current so it can keep up. I hope this helps.

  24. Hi Samuel, I don't know how to change the raspicam resolution from Python, could you help me?

  25. Hi,

    your project is very impressive!
    Where can I get help with the schematics and how to connect things?
    For example, how do I wire the accelerometer and gyro sensors to the Raspberry?


  26. Would it be a good idea to use your project, with the Pi camera, OpenCV, and Python, for pan-tilt control on my drone?
    Could you help me?


  28. Hello Samuel,
    my name is Safri, from Indonesia.
    I'm very interested in your project,
    but I have a question: can I use this method for number detection?
    Thank you for your answer.

  29. Hello, my name is Henrique.

    I'm doing a self-balancing robot project.
    I saw that you tuned the PID gains over Bluetooth.
    In a PID cascade you have 6 gains. How did you tune them? Which did you tune first (the angle gains or the speed gains)?
    Can you show me your technique? I'm having problems tuning my cascade PID.


  30. Hello Samuel, I'm making a dedicated board to control the steppers using an ATmega328P, and I'm having trouble connecting the L297s to the microcontroller. Is there any chance you could give me the schematics of your board?

  31. can you please post a parts list?

  32. Hello, can you post or send me the schematics for the balancing and motor control board, or an example that could help?

  33. This comment has been removed by the author.

  34. This comment has been removed by the author.

    1. Great project !!!!
      Which versions of Python and OpenCV do you use? And is it possible to get the code for it?

  35. Can you please elaborate on how you tuned the PID using Bluetooth?

  36. Hey! Your earliest video of the balancing robot tracking and following the ball really impressed the heck out of me! Pretty soon afterwards I set about building a similar version of your bot, sans the self-balancing part. I seem to have hit a bit of a roadblock, though, and was wondering if you could help me out. I've used an RPi 2 + OpenCV + Python to build the project, and have gotten to the stage where the pan-tilt servos can accurately track a color-filtered object. However, upon integrating the ultrasonic sensor for distance measurement, the RPi simply cannot handle the processing requirements and the setup lags very badly. Do you have any thoughts?

  37. Hi Samuel,

    I find your blog very inspiring and I would really like to get more updates. What plate did you use for the motor board?

    1. Hello Janina,
      I'm planning to return to this project, so you can expect some updates in the next months.
      What do you mean by plate?

    2. Cool! I'm very excited :) I mean the motor board, which you designed yourself. Did you buy a special board/plate (I don't know the English word for the green board) or how did you make it?

    3. Now I get it. I didn't produce it myself (though that is possible). I designed it and sent it for production in China; I don't remember the website, but I can check if you want.

  38. This comment has been removed by the author.