Project tutorial
Child Assistant

Child Assistant © CC BY

A robot for children that plays with them and helps reduce smartphone addiction.

  • 2,166 views
  • 0 comments
  • 20 respects

Components and supplies

Necessary tools and machines

Apps and online services

About this project

Introduction

The Problem: Children are becoming more addicted to smartphones day by day; they usually watch cartoons and YouTube and play games. In Bangladesh, kids spend three to four hours a day on screens. A study found that screen time affects the physical structure of children's brains, as well as their emotional development and mental health. Unless we get control of the screens that now absorb so much of our kids' time, they will indirectly harm our future generation.

The Solution: Robots always attract children, and with voice interactivity a robot can be a dream machine for a child — certainly more attractive than a smartphone. To draw children away from screen addiction, I made a robot with the following features:

  • It can be driven by voice commands.
  • Children can ask the robot to make different sounds, such as a fire truck, an ambulance, a police car, or even different animals.
  • A child can ask the robot to tell stories, poems, and jokes, and to play music.
  • The robot can report surrounding environmental parameters such as temperature, humidity, air pressure, and UV light.
  • The robot can point out the directions east, west, north, and south, and can align itself in a specific direction.
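The direction feature relies on the Matrix Creator's magnetometer. As a minimal sketch (function names are my own, not from the project code), a heading in degrees can be mapped to the cardinal direction the robot announces:

```python
import math

def heading_to_direction(heading_deg):
    """Map a compass heading in degrees (0-360) to one of the four
    cardinal directions the robot announces."""
    directions = ["north", "east", "south", "west"]
    # Each cardinal direction covers a 90-degree sector centered on it.
    index = int(((heading_deg + 45) % 360) // 90)
    return directions[index]

def heading_from_magnetometer(mag_x, mag_y):
    """Compute a heading in degrees from the horizontal magnetometer
    components (simplified: assumes the robot sits level)."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360

print(heading_to_direction(10))   # north
print(heading_to_direction(95))   # east
```

On real hardware the magnetometer components would come from the Matrix Creator's IMU readings; aligning the robot then means turning until the computed heading matches the requested direction.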

How to Make It

Making the Chassis

Different types of ready-made chassis are available on the market, and you can buy one. I made mine from an ebonite board and added some pictures of how I built it. I used one ball caster and two small DC gear motors for the chassis. The geared motors are fixed to the bottom plate with super glue.

Installing Motor Driver and Arduino

To drive the robot's motors I used an Adafruit Motor Shield v1.0 and an Arduino Uno.

The shield can drive four DC motors. I used the M3 and M4 connectors for the two motors of my chassis. The Arduino receives power from the Raspberry Pi, but the motor driver requires an external power supply, so I added two 18650 Li-ion cells to power the motors.

Installing Speaker and Amplifier Board

To talk with children the robot needs a speaker. I fitted two 3 W speakers with a mini amplifier board to the robot. The speakers connect to the Raspberry Pi through its 3.5 mm audio jack. We will connect the Raspberry Pi in the next step.

Installing Matrix Creator and Raspberry Pi

The Matrix Creator and Raspberry Pi are the two main hardware units of the project. The Raspberry Pi hosts the Snips voice platform and communicates with the Arduino and the Matrix Creator. The Matrix Creator acts as the Raspberry Pi's microphone, provides sensor data, and drives the LED ring.
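In the action code below, the sensor replies for the robot_sense intent are left as placeholder comments. As one possible way to phrase them (a sketch of my own, with the sensor value passed in directly rather than read from matrix_lite), the spoken sentence could be built like this:

```python
def format_reading(sense, value, unit):
    """Build the sentence the robot would speak for a sensor query.
    On real hardware the value would come from the matrix_lite sensors
    module (e.g. sensors.humidity.read()); here it is passed in."""
    return "The {} is {:.1f} {}".format(sense.lower(), value, unit)

print(format_reading("Temperature", 25.34, "degrees Celsius"))
# The temperature is 25.3 degrees Celsius
```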

Matrix Creator case is 3D printed and the 3D file can be downloaded here: https://www.thingiverse.com/thing:2872527

After placing the Raspberry Pi and Matrix Creator inside the case, it looks as follows:

Connecting Arduino with Raspberry Pi

Although the Arduino drives the motors, it depends on the Raspberry Pi for direction commands. The Raspberry Pi sends commands over the serial port, and the Arduino forwards the corresponding signals to the motor driver. The Arduino also receives its power from the Raspberry Pi over the same serial (USB) connection.
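The protocol between the two boards is simple: each command is plain ASCII text terminated by a newline, which the Arduino sketch accumulates and matches against "Forward", "Backward", "Left", "Right", or "Stop". A minimal sketch of the framing on the Pi side (the helper name is mine, not from the project code):

```python
def frame_command(command):
    """Encode a drive command the way the Arduino sketch expects it:
    ASCII text followed by a newline terminator."""
    return command.encode("ascii") + b"\n"

# On the Pi this frame would be written to the serial port with pyserial:
#   import serial
#   ser = serial.Serial("/dev/ttyAMA0", 9600)
#   ser.write(frame_command("Forward"))
print(frame_command("Forward"))  # b'Forward\n'
```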

Providing Power to Raspberry Pi

The Raspberry Pi requires a stable 5 V supply able to deliver at least 1.5 A. I used a power bank rated for 2 A output.

The power bank has a capacity of 10000 mAh, and it also powers the audio amplifier.
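A rough runtime estimate for the bank, under assumed numbers (3.7 V Li-ion cells behind the bank's rating, about 85% boost-converter efficiency, and a combined Pi-plus-amplifier load near 1.5 A — none of these were measured on the actual robot):

```python
def runtime_hours(capacity_mah, cell_v, out_v, load_ma, efficiency=0.85):
    """Estimate power-bank runtime: convert the mAh rating (given at
    cell voltage) to energy, apply conversion losses, divide by the
    power drawn at the output voltage."""
    energy_mwh = capacity_mah * cell_v
    return energy_mwh * efficiency / (out_v * load_ma)

# 10000 mAh bank, 5 V output, ~1.5 A load:
print(round(runtime_hours(10000, 3.7, 5.0, 1500), 1))  # 4.2 (hours)
```

So a full charge should keep the robot talking for roughly four hours under these assumptions.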

Putting all Together

After completing all the connections, the robot looks like the following pictures.

From the front side it looks like:

Side view of the robot:

Developing Snips App

To enable voice interaction I developed an application on the Snips platform. To make your own app, go to https://console.snips.ai/ and develop one; it's free.

I made five intents for the app.

Every intent has a dedicated slot and several training examples.
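Each intent maps to one action method in the Python code below, routed by the intent name. The routing can be sketched as a dispatch table (the handler bodies here are illustrative stand-ins for the real action methods, which also write to the serial port and publish a session notification):

```python
# Hypothetical handlers standing in for the app's action methods.
def handle_direction(slot): return "Car is going {}".format(slot)
def handle_sound(slot):     return "Car is playing {}".format(slot)
def handle_color(slot):     return "Changed color to {}".format(slot)
def handle_voice(slot):     return "Car is saying {}".format(slot)
def handle_sense(slot):     return "Car is reading {}".format(slot)

HANDLERS = {
    "taifur:robot_direction": handle_direction,
    "taifur:robot_sound": handle_sound,
    "taifur:robot_color": handle_color,
    "taifur:robot_voice": handle_voice,
    "taifur:robot_sense": handle_sense,
}

def dispatch(intent_name, slot_value):
    """Route a recognized intent to its handler; unknown intents are ignored."""
    handler = HANDLERS.get(intent_name)
    return handler(slot_value) if handler else None

print(dispatch("taifur:robot_direction", "Forward"))  # Car is going Forward
```

A table like this does the same job as the chain of `if` checks in `master_intent_callback`, and makes adding a sixth intent a one-line change.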

You can install the app from the Snips app store using the following link:

https://console.snips.ai/store/en/skill_KpPYQxAwApk

For the Raspberry Pi image I used the MATRIX Kit image from this link: https://drive.google.com/file/d/1cTpKiyAIc9RxfF1shW3F1oBBJjAdkZu_/view?usp=sharing

Matrix Core is preinstalled in the image.

For the action code I used Python. All the sources are included in the code section.

To learn how to make an App using Snips follow the link: https://docs.snips.ai/getting-started/quick-start-console

To configure Snips for Raspberry Pi follow the link:

https://docs.snips.ai/getting-started/quick-start-raspberry-pi

Python Template of the action code is here:

https://docs.snips.ai/articles/platform/create-an-app/python-template

Thanks for reading this tutorial, and thanks to MATRIX and Hackster for providing me with the super cool Matrix Creator.

Code

Arduino Code (Arduino)
This code will be uploaded to Arduino. It receives serial commands from Raspberry Pi and drives the motors accordingly.
/*************************************
 * Author: Md. Khairul Alam
 * This code controls the robot's two drive motors
 * based on serial commands from the Raspberry Pi.
 */
#include <AFMotor.h>

AF_DCMotor motorLeft(3);
AF_DCMotor motorRight(4);

String inputString = "";         // a string to hold incoming data
boolean stringComplete = false;  // whether the string is complete

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
 
  inputString.reserve(200);

   // turn on motor
  motorLeft.setSpeed(200);
  motorRight.setSpeed(200);
 
  motorLeft.run(RELEASE);
  motorRight.run(RELEASE);

}

void loop() {
  
  if (stringComplete) {
    if(inputString == "Forward"){//if received message = pos1
        forward();
        delay(20);
      }
    else if(inputString == "Backward"){
        backward();
        delay(20);
      } 
    else if(inputString == "Right"){
        turnRight();
        delay(20);
      } 
    else if(inputString == "Left"){
        turnLeft();
        delay(20);
      } 
    else if(inputString == "Stop"){
        stop();
        delay(20);
      }   
    // clear the string:
    inputString = "";
    stringComplete = false;
  }

}


void serialEvent() {
  while (Serial.available()) {
    // get the new byte:
    char inChar = (char)Serial.read();
    // if the incoming character is a newline, set a flag
    // so the main loop can do something about it:
    if (inChar == '\n') {
      stringComplete = true;
    }
    else {
      // add it to the inputString:
      inputString += inChar;
    }
  }
}


void forward(){
  motorLeft.run(FORWARD);
  motorRight.run(FORWARD);
  }

void backward(){
  motorLeft.run(BACKWARD);
  motorRight.run(BACKWARD);
  }

void turnRight(){
  motorLeft.run(FORWARD);
  motorRight.run(BACKWARD);
  }

void turnLeft(){
  motorLeft.run(BACKWARD);
  motorRight.run(FORWARD);
  }

void stop(){
  motorLeft.run(RELEASE);
  motorRight.run(RELEASE);
  }
Action Code for Snips (Python)
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from snipsTools import SnipsConfigParser
from hermes_python.hermes import Hermes
from hermes_python.ontology import *


import io
import sys
import logging
import time
import serial
from imp import load_source
from matrix_lite import led
from matrix_lite import sensors

ser = serial.Serial('/dev/ttyAMA0',9600)

CONFIG_INI = "config.ini"
logging.basicConfig()

_LOGGER = logging.getLogger(__name__)
_LOGGER.setLevel(logging.ERROR)


# If this skill is supposed to run on the satellite,
# please get this mqtt connection info from <config.ini>
# Hint: MQTT server is always running on the master device
MQTT_IP_ADDR = "localhost"
MQTT_PORT = 1883
MQTT_ADDR = "{}:{}".format(MQTT_IP_ADDR, str(MQTT_PORT))


class Child_Assistant_app(object):
    """Class used to wrap action code with mqtt connection        
    This app dispatch the intents to the corresponding actions
    """

    def __init__(self):
        """Initialize our app 
        - read the config file
        - initialize our API and Multilanguage Text handler class with 
          correct language 
        """

        # get the configuration if needed
        try:
            self.config = SnipsConfigParser.read_configuration_file(CONFIG_INI)

            # set log level according to config.ini
            if self.config["global"]["log_level"] == "DEBUG":
                _LOGGER.setLevel(logging.DEBUG)

            _LOGGER.debug(u"[__init__] - reading the config file {}".format(self.config))
            _LOGGER.debug(u"[__init__] - MQTT address is {}".format(MQTT_ADDR))

        except:
            self.config = None
            _LOGGER.error(u"[__init__] - not able to read config file!")


        # start listening to MQTT
        self.start_blocking()

    # -------------------------------------------------------------------------
    # --> Sub callback function, one per intent
    # -------------------------------------------------------------------------

    # ===train_schedule_to intent action ======================================
    def robot_direction(self, hermes, intent_message):
        """Action for direction slot 
        """
        # terminate the session first if not continue
        hermes.publish_end_session(intent_message.session_id, "")
        
        direction = intent_message.slots.Direction.first().value
        ser.write(direction.encode())  # pyserial expects bytes in Python 3
        ser.write(b'\n')
        # terminate the session first if not continue
        #hermes.publish_end_session(intent_message.session_id, text_to_speak)
        hermes.publish_start_session_notification(intent_message.site_id, "Car is going {}".format(str(direction)), "")

    # ===train_schedule_from_to intent action =================================
    def robot_color(self, hermes, intent_message):
        """Action for color slot
        """
        # terminate the session first if not continue
        hermes.publish_end_session(intent_message.session_id, "")
        
        color = intent_message.slots.Color.first().value
        ser.write(color.encode())  # pyserial expects bytes in Python 3
        ser.write(b'\n')
        if color == 'Red':
            led.set('red') # color name
        elif color == 'Green':
            led.set('green') # color name
        elif color == 'Blue':
            led.set('blue') # color name
        elif color == 'White':
            led.set('white') # color name
        elif color == 'Black':
            led.set('black') # color name
        
        # terminate the session first if not continue
        #hermes.publish_end_session(intent_message.session_id, text_to_speak)
        hermes.publish_start_session_notification(intent_message.site_id, "Changed color to {}".format(str(color)), "")

    # ===station_timetable intent action ======================================
    def robot_sound(self, hermes, intent_message):
        """fulfill the intent 
        """
        # terminate the session first if not continue
        hermes.publish_end_session(intent_message.session_id, "")
        
        sound = intent_message.slots.Sound.first().value
        ser.write(sound.encode())  # pyserial expects bytes in Python 3
        ser.write(b'\n')
      
        
        # terminate the session first if not continue
        #hermes.publish_end_session(intent_message.session_id, text_to_speak)
        hermes.publish_start_session_notification(intent_message.site_id, "Car is playing {}".format(str(sound)), "")
        
    def robot_voice(self, hermes, intent_message):
        """fulfill the intent 
        """
        # terminate the session first if not continue
        hermes.publish_end_session(intent_message.session_id, "")
        
        voice = intent_message.slots.Voice.first().value
        ser.write(voice.encode())  # pyserial expects bytes in Python 3
        ser.write(b'\n')
        
        # terminate the session first if not continue
        #hermes.publish_end_session(intent_message.session_id, text_to_speak)
        hermes.publish_start_session_notification(intent_message.site_id, "Car is saying {}".format(str(voice)), "")
        
    def robot_sense(self, hermes, intent_message):
        """fulfill the intent 
        """
        # terminate the session first if not continue
        hermes.publish_end_session(intent_message.session_id, "")
        
        sense = intent_message.slots.Sense.first().value
        #if sense == 'Temperature':
        #    sensors.humidity.read()
        #elif sense == 'Humidity':
        #    sensors.humidity.read()
        #elif sense == 'Pressure':
        #    sensors.pressure.read()
        #elif sense == 'Backward':
        #    self.relay13.off()
        #elif sense == 'Stop':
        #    self.relay13.off()
        #sensors.uv.read()
        # terminate the session first if not continue
        #hermes.publish_end_session(intent_message.session_id, text_to_speak)
        hermes.publish_start_session_notification(intent_message.site_id, "Car is reading {}".format(str(sense)), "")

    # -------------------------------------------------------------------------
    # --> Master callback function, triggered everytime an intent is recognized
    # -------------------------------------------------------------------------

    def master_intent_callback(self, hermes, intent_message):
        coming_intent = intent_message.intent.intent_name
        _LOGGER.debug(u"[master_intent_callback] - Intent: {}".format(coming_intent))
        if coming_intent == "taifur:robot_direction":
            self.robot_direction(hermes, intent_message)
        if coming_intent == "taifur:robot_sound":
            self.robot_sound(hermes, intent_message)
        if coming_intent == "taifur:robot_color":
            self.robot_color(hermes, intent_message)
        if coming_intent == "taifur:robot_voice":
            self.robot_voice(hermes, intent_message)
        if coming_intent == "taifur:robot_sense":
            self.robot_sense(hermes, intent_message)

    # --> Register callback function and start MQTT
    def start_blocking(self):
        with Hermes(MQTT_ADDR) as h:
            h.subscribe_intents(self.master_intent_callback).start()


if __name__ == "__main__":
    Child_Assistant_app()

Custom parts and enclosures

Case Bottom
Case Top

Schematics

Connection of Motors
