Project tutorial
Smart Face Tracking Robot Car

Smart Face Tracking Robot Car © MIT

This project uses an Arduino as the interface to control the motors, and a Creator Ci20 as the image processor running a Python script.



About this project

It is a car that tracks your face; if I move away, it comes after me!

You need to have the following installed on the Ci20:

  • Opencv
  • Python
  • Serial port permissions fix

I used the Debian 8 distro (install & download).

I used:

  • Old laptop camera (recycled)
  • Ultrasonic ranging module HC-SR04
  • 2x 3.7 V 4000 mAh batteries (recycled from an old laptop)
  • 2x servos
  • DC-DC converter
  • Arduino Nano
  • Robot base kit
  • 2x L298D

For installation use the following commands in the terminal.

To update Linux:

sudo apt-get update        # Fetches the list of available updates
sudo apt-get upgrade       # Strictly upgrades the current packages
sudo apt-get dist-upgrade  # Installs updates (new ones)

OpenCV: the packaged version is 2.4.9.1, which is not up to date, but it works fine.

sudo apt-get install libopencv-dev python-opencv
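
To check that the Python bindings work, you can print the version from Python. This is just a quick sanity check; the exact version string may differ:

# Quick check that the OpenCV Python bindings are installed
import cv2
import cv2.cv as cv  # old-style API used by faceser.py below

print "OpenCV version:", cv2.__version__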

If you want to compile OpenCV from source, I found this tutorial.

Install PySerial.

sudo apt-get install python python-serial
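
You can confirm PySerial is available the same way (a minimal check):

# Confirm that the pyserial module is importable
import serial

print "pySerial version:", serial.VERSION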

We change the serial port permissions, depending on whether you use the USB adapter or the Ci20's native port (note that chown only lasts until the next reboot or re-plug; adding your user to the dialout group is usually a more permanent fix).

sudo chown ci20:ci20 /dev/ttyUSB0 #Serial port USB from Arduino
sudo chown ci20:ci20 /dev/ttyS1   #Serial port native in Ci20
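
Before running the full tracking script, you can test the link to the Arduino with a short snippet that speaks the same three-byte protocol used by faceser.py (start byte 255, then servo ID, then angle). This is only a sketch; it assumes the Car.ino sketch below is already flashed and the board shows up as /dev/ttyUSB0:

# Minimal test of the pan/tilt serial protocol used by faceser.py
# Packet format: 0xFF (start byte), servo ID (1 = tilt, 2 = pan), angle 0-180
import time
import serial

ser = serial.Serial(port='/dev/ttyUSB0', baudrate=115200, timeout=1)
time.sleep(2)  # give the Arduino time to reset after the port is opened

def move(servo_id, angle):
    ser.write(chr(255))       # start byte
    ser.write(chr(servo_id))  # which servo
    ser.write(chr(angle))     # target angle in degrees

move(2, 90)   # center the pan servo
move(1, 45)   # set the tilt servo to its initial position
ser.close()

Note that servo ID 2 also triggers the distance/drive logic in Car.ino, so the wheels may move during this test.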

Run the Python script (change the zero if you have more than one video source; this argument selects the source).

sudo python facetrackingcar.py 0

To improve performance slightly, the captured image is not displayed. If you want to see it, search for and uncomment these lines:

#cv.ShowImage("result", img)
#cv.NamedWindow("result", 1)
#cv.DestroyWindow("result")

You need to have the haarcascade_frontalface_alt.xml file in the same directory as the script.
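
On Debian the cascade files are usually installed under /usr/share/opencv/haarcascades/, so you can copy haarcascade_frontalface_alt.xml from there. A quick way to confirm the file is in place is to load it with the same old-style API the script uses (a minimal sketch; loading fails loudly if the file is missing):

# Check that the Haar cascade is in the script's directory and loads
import cv2.cv as cv

cascade = cv.Load("./haarcascade_frontalface_alt.xml")
print "Cascade loaded:", cascade is not None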

Video of the camera tracking only, running on Windows

Demo video on Ci20

Feedback and improvements are welcome.

Code

faceser.py (Python)
Python script: OpenCV face tracking
#!/usr/bin/python
"""
This program is demonstration for face and object detection using haar-like features.
The program finds faces in a camera image or video stream and displays a red box around them,
then centers the webcam via two servos so the face is at the center of the screen
Based on facedetect.py in the OpenCV samples directory
"""
import sys
from optparse import OptionParser
import serial
import cv2.cv as cv
import time

# Parameters for haar detection
# From the API:
# The default parameters (scale_factor=2, min_neighbors=3, flags=0) are tuned
# for accurate yet slow object detection. For a faster operation on real video
# images the settings are:
# scale_factor=1.2, min_neighbors=2, flags=CV_HAAR_DO_CANNY_PRUNING,
# min_size=<minimum possible face size

min_size = (20, 20)
image_scale = 2
haar_scale = 1.2
min_neighbors = 2
haar_flags = cv.CV_HAAR_DO_CANNY_PRUNING

max_pwm = 180
min_pwm = 1
midScreenWindow = 20  # acceptable 'error' for the center of the screen.
panStepSize = 1 # degree of change for each pan update
tiltStepSize = 1 # degree of change for each tilt update
servoPanPosition = 90 # initial pan position x
servoTiltPosition = 45 # initial tilt position y
panGpioPin = 2  # arduino pan servo ID (NOT pin number)
tiltGpioPin = 1  # arduino tilt servo ID (NOT pin number)
var = 0
def detect_and_draw(img, cascade):
    gray = cv.CreateImage((img.width,img.height), 8, 1)
    small_img = cv.CreateImage((cv.Round(img.width / image_scale),
                   cv.Round (img.height / image_scale)), 8, 1)

    # convert color input image to grayscale
    cv.CvtColor(img, gray, cv.CV_BGR2GRAY)

    # scale input image for faster processing
    cv.Resize(gray, small_img, cv.CV_INTER_LINEAR)

    cv.EqualizeHist(small_img, small_img)

    midFace = None

    if(cascade):
        t = cv.GetTickCount()
        # HaarDetectObjects takes 0.02s
        faces = cv.HaarDetectObjects(small_img, cascade, cv.CreateMemStorage(0),
                                     haar_scale, min_neighbors, haar_flags, min_size)
        t = cv.GetTickCount() - t
        if faces:
            for ((x, y, w, h), n) in faces:
                # the input to cv.HaarDetectObjects was resized, so scale the
                # bounding box of each face and convert it to two CvPoints
                pt1 = (int(x * image_scale), int(y * image_scale))
                pt2 = (int((x + w) * image_scale), int((y + h) * image_scale))
                cv.Rectangle(img, pt1, pt2, cv.RGB(255, 0, 0), 3, 8, 0)
                # get the xy corner co-ords, calc the midFace location
                x1 = pt1[0]
                x2 = pt2[0]
                y1 = pt1[1]
                y2 = pt2[1]
                midFaceX = x1+((x2-x1)/2)
                midFaceY = y1+((y2-y1)/2)
                midFace = (midFaceX, midFaceY)

    #cv.ShowImage("result", img)
    return midFace

def move(servo, angle):
    '''Moves the specified servo to the supplied angle.
    Arguments:
        servo
          the servo number to command, an integer from 1-4
        angle
          the desired servo angle, an integer from 0 to 180
    (e.g.) >>> servo.move(2, 90)
           ... # "move servo #2 to 90 degrees"'''

    if (min_pwm <= angle <= max_pwm):
        ser.write(chr(255))
        ser.write(chr(servo))
        ser.write(chr(angle))
    else:
        print "Servo angle must be an integer between 0 and 180.\n"

if __name__ == '__main__':
    ser=serial.Serial(port='/dev/ttyUSB0',baudrate=115200,timeout=1)
    #ser=serial.Serial(port='/dev/ttyS0',baudrate=115200,timeout=1)
    #ser=serial.Serial(port='COM14',baudrate=115200,timeout=1)

    # parse cmd line options, setup Haar classifier
    parser = OptionParser(usage = "usage: %prog [options] [camera_index]")
    parser.add_option("-c", "--cascade", action="store", dest="cascade", type="str", help="Haar cascade file, default %default", default = "./haarcascade_frontalface_alt.xml")
    (options, args) = parser.parse_args()
    
    cascade = cv.Load(options.cascade)

    if len(args) != 1:
        parser.print_help()
        sys.exit(1)

    input_name = args[0]
    if input_name.isdigit():
        capture = cv.CreateCameraCapture(int(input_name))
        cv.SetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_WIDTH, 320)
        cv.SetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_HEIGHT, 240)
    else:
        print "We need a camera input! Specify camera index e.g. 0"
        sys.exit(0)

   # cv.NamedWindow("result", 1)

    if capture:
        frame_copy = None
        move(panGpioPin, servoPanPosition)
        move(tiltGpioPin, servoTiltPosition)
        
        while True:
            start = time.time()
            frame = cv.QueryFrame(capture)
            if not frame:
                cv.WaitKey(0)
                break
            if not frame_copy:
                frame_copy = cv.CreateImage((frame.width,frame.height),
                                            cv.IPL_DEPTH_8U, frame.nChannels)
            if frame.origin == cv.IPL_ORIGIN_TL:
                cv.Copy(frame, frame_copy)
            else:
                cv.Flip(frame, frame_copy, 0)

            midScreenX = (frame.width/2)
            midScreenY = (frame.height/2)

            midFace = detect_and_draw(frame_copy, cascade)

            if midFace is not None:
                midFaceX = midFace[0]
                midFaceY = midFace[1]

                #Find out if the X component of the face is to the left of the middle of the screen.
                if(midFaceX < (midScreenX - midScreenWindow)):
                    #Update the pan position variable to move the servo to the right.
                    servoPanPosition += panStepSize
                    print str(midFaceX) + " > " + str(midScreenX) + " : Pan Right : " + str(servoPanPosition)
                #Find out if the X component of the face is to the right of the middle of the screen.
                elif(midFaceX > (midScreenX + midScreenWindow)):
                    #Update the pan position variable to move the servo to the left.
                    servoPanPosition -= panStepSize
                    print str(midFaceX) + " < " + str(midScreenX) + " : Pan Left : " + str(servoPanPosition)
                else:
                    print str(midFaceX) + " ~ " + str(midScreenX) + " : " + str(servoPanPosition)

                servoPanPosition = min(servoPanPosition, max_pwm)
                servoPanPosition = max(servoPanPosition, min_pwm)
                move(panGpioPin, servoPanPosition)

                #Find out if the Y component of the face is below the middle of the screen.
                if(midFaceY < (midScreenY - midScreenWindow)):
                    if(servoTiltPosition <= 90):
                        #Update the tilt position variable to lower the tilt servo.
                        servoTiltPosition -= tiltStepSize
                        print str(midFaceY) + " > " + str(midScreenY) + " : Tilt Down : " + str(servoTiltPosition)
                #Find out if the Y component of the face is above the middle of the screen.
                elif(midFaceY > (midScreenY + midScreenWindow)):
                    if(servoTiltPosition >= 1):
                        #Update the tilt position variable to raise the tilt servo.
                        servoTiltPosition += tiltStepSize
                        print str(midFaceY) + " < " + str(midScreenY) + " : Tilt Up : " + str(servoTiltPosition)
                        start = 1;
                        end = 1;
                else:
                    print str(midFaceY) + " ~ " + str(midScreenY) + " : " + str(servoTiltPosition)

                servoTiltPosition = min(servoTiltPosition, max_pwm)
                servoTiltPosition = max(servoTiltPosition, min_pwm)
                move(tiltGpioPin, servoTiltPosition)
            else:
                # end measuring time
                end = time.time()+0.1
                var += 0.1
                # getting elapsed time
                time_elapsed = int(end - start + var)
                # printing information
                print 'time elapsed:\t{}'.format(time_elapsed)
                print 'var:\t{}'.format(var)
                if time_elapsed == 20:
                    move(3, servoTiltPosition)
                    servoPanPosition = 90
                    servoTiltPosition = 45 
                    var = 0

            if cv.WaitKey(1) >= 0: # 1ms delay
                break
    #cv.DestroyWindow("result")
Car.ino (Arduino)
#include <Servo.h>
#define MA_1 2
#define MA_2 3
#define MB_1 4
#define MB_2 5
#define MC_1 6
#define MC_2 7
#define MD_1 8
#define MD_2 9
#define SERVOX_PIN    11
#define SERVOY_PIN    10
#define trigPin 13
#define echoPin 12
// User input for servo and position
Servo servoy;
Servo servox;
int x = 90, prevX;
int y = 45, prevY;
int userInput[3];    // raw input from serial buffer, 3 bytes
int startbyte;       // start byte, begin reading input
int servo;           // which servo to pulse?
int pos;             // servo angle 0-180
int i;               // iterator

int State = LOW;
unsigned long previousMillis = 0, val = 100;
const long interval = 100;
bool scana = false;
unsigned long currentMillis;
void setup() {
  inicializate();
  currentMillis = millis();
}
void loop() {
  // Wait for serial input (min 3 bytes in buffer)
  if (Serial.available() > 2) {
    // Read the first byte
    startbyte = Serial.read();
    // If it's really the startbyte (255) ...
    if (startbyte == 255) {
      // ... then get the next two bytes
      for (i = 0; i < 2; i++) {
        userInput[i] = Serial.read();
      }
      // First byte = servo to move?
      servo = userInput[0];
      // Second byte = which position?
      pos = userInput[1];
      // Packet error checking and recovery
      if (pos == 255) {
        servo = 255;
      }
      // Assign new position to appropriate servo
      switch (servo) {
        case 1:
          servoy.write(pos);    // move servo1 to 'pos'
          break;
        case 2:
          servox.write(pos);
          range(pos);
          break;
        case 3:
          scan(45);
          scan(65);

          break;
        default:
          break;
      }
    }
  }
}
void scan(int val) {
  while (Serial.available() == 0) {
    servoy.write(val);
    unsigned long currentMillis = millis();
    if (currentMillis - previousMillis >= interval) {
      previousMillis = currentMillis;
      servox.write(pos);
      if (State == LOW ) {
        pos += 1;
        if (pos == 130) {
          State = HIGH;
        }
      }
      else {
        pos -= 1;
        if (pos == 40) {
          State = LOW;
          break;
        }
      }
    }
  }
}
long distancia() {
  long duration, distance;
  digitalWrite(trigPin, LOW);   // make sure the trigger line starts clean
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);  // 10 us pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);  // echo pulse width in microseconds
  distance = (duration / 2) / 29.1;   // sound travels ~29.1 us per cm; halve for the round trip
  return distance;
}
void range(int pos) {
  if ((pos >= 80) && (pos <= 100)) {
    moves();
  }
  else if ((pos >= 100) && (pos <= 180)) {
    left();
    delay(10);
    Stop();
  }
  else if ((pos >= 1) && (pos <= 80)) {
    right();
    delay(10);
    Stop();
  }
}
void moves() {
  long distance;
  static long previousDistance = 0;  // persists between calls so the change can be tracked
  distance = distancia();
  if (val <= 80) {
    reverse();
    delay(50);
    Stop();
  }
  else if (val >= 140) {
    forward();
    delay(50);
    Stop();
  }
  // store the absolute change in distance for the next pass
  if (distance >= previousDistance) {
    val = (distance - previousDistance);
  }
  else {
    val = (previousDistance - distance);
  }
  previousDistance = distance;
}
void inicializate() {
  Serial.begin(115200);
  pinMode(MA_1, OUTPUT);
  pinMode(MA_2, OUTPUT);
  pinMode(MB_1, OUTPUT);
  pinMode(MB_2, OUTPUT);
  pinMode(MC_1, OUTPUT);
  pinMode(MC_2, OUTPUT);
  pinMode(MD_1, OUTPUT);
  pinMode(MD_2, OUTPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  servox.attach(SERVOX_PIN);
  servoy.attach(SERVOY_PIN);
}
void forward() {
  digitalWrite(MA_1, HIGH);
  digitalWrite(MA_2, LOW);

  digitalWrite(MB_1, HIGH);
  digitalWrite(MB_2, LOW);

  digitalWrite(MC_1, HIGH);
  digitalWrite(MC_2, LOW);

  digitalWrite(MD_1, HIGH);
  digitalWrite(MD_2, LOW);
}
void reverse () {
  digitalWrite(MA_1, LOW);
  digitalWrite(MA_2, HIGH);

  digitalWrite(MB_1, LOW);
  digitalWrite(MB_2, HIGH);

  digitalWrite(MC_1, LOW);
  digitalWrite(MC_2, HIGH);

  digitalWrite(MD_1, LOW);
  digitalWrite(MD_2, HIGH);
}
void right() {
  digitalWrite(MA_1, LOW);
  digitalWrite(MA_2, HIGH);

  digitalWrite(MB_1, LOW);
  digitalWrite(MB_2, HIGH);

  digitalWrite(MC_1, HIGH);
  digitalWrite(MC_2, LOW);

  digitalWrite(MD_1, HIGH);
  digitalWrite(MD_2, LOW);
}
void left() {
  digitalWrite(MA_1, HIGH);
  digitalWrite(MA_2, LOW);

  digitalWrite(MB_1, HIGH);
  digitalWrite(MB_2, LOW);

  digitalWrite(MC_1, LOW);
  digitalWrite(MC_2, HIGH);

  digitalWrite(MD_1, LOW);
  digitalWrite(MD_2, HIGH);
}
void Stop() {
  digitalWrite(MA_1, LOW);
  digitalWrite(MA_2, LOW);

  digitalWrite(MB_1, LOW);
  digitalWrite(MB_2, LOW);

  digitalWrite(MC_1, LOW);
  digitalWrite(MC_2, LOW);

  digitalWrite(MD_1, LOW);
  digitalWrite(MD_2, LOW);
}
void StopH() {
  digitalWrite(MA_1, HIGH);
  digitalWrite(MA_2, HIGH);

  digitalWrite(MB_1, HIGH);
  digitalWrite(MB_2, HIGH);

  digitalWrite(MC_1, HIGH);
  digitalWrite(MC_2, HIGH);

  digitalWrite(MD_1, HIGH);
  digitalWrite(MD_2, HIGH);
}
scanlinux.py (Python)
Scan Linux serial ports
#! /usr/bin/env python
"""\
Scan for serial ports. Linux specific variant that also includes USB/Serial
adapters.

Part of pySerial (http://pyserial.sf.net)
(C) 2009 <cliechti@gmx.net>
"""

import serial
import glob



def scan():
    """scan for available ports. return a list of device names.

    pvergain@houx:~/PDEV1V160_CodesRousseau/Soft/PC/test_boost/serialport/pyserial$ python scanlinux.py
    Found ports:
    /dev/ttyS0
    /dev/ttyS3
    /dev/ttyS2
    /dev/ttyS1
    /dev/ttyACM0
    /dev/serial/by-id/usb-id3_semiconductors_MEABOARD_00000000-if00
    """
    return glob.glob('/dev/ttyS*') + glob.glob('/dev/ttyUSB*') + glob.glob('/dev/ttyACM*') + glob.glob('/dev/serial/by-id/*')

if __name__=='__main__':
    print "Found ports:"
    for name in scan():
        print name

Custom parts and enclosures

Python script: OpenCV face tracking and Arduino code
FaceTrakingCar.rar

Schematics

Using the Arduino to control the motors and servos; the pin-out may vary.
