Gizmo is a robotic vehicle with a camera that can be remote-controlled over WLAN or the Internet by an Android app. The camera's video stream can be watched live through the app. When within hearing distance, the vehicle can also be controlled by voice.
The Stepper Motors
The vehicle is driven by two NEMA 17 stepper motors. I chose steppers instead of DC motors because they don't need any regulators and allow positioning the vehicle with an accuracy of better than one centimeter per meter. Another advantage is their high torque, especially at low rotation speeds. The disadvantages are the higher weight and higher power consumption, but in my opinion, the advantages make up for these weak points.
The RC car wheels are mounted directly onto the motor shafts, simply with a piece of aluminum and brass tube.
A fork light barrier in combination with a perforated disc allows detecting when a motor loses steps. This happens whenever there is an overload (e.g. when the vehicle gets stuck somewhere). In this case, the motors stop immediately.
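The detection logic is easy to sketch: compare the commanded microsteps against the pulses counted from the light barrier. The following Python sketch is my illustration of the idea, not the actual Arduino code; the number of holes in the disc and the tolerance are assumed values.

```python
def steps_lost(commanded_microsteps, barrier_pulses,
               microsteps_per_rev=3200, holes_per_disc=20, tolerance=1):
    """Return True if the motor appears to have lost steps.

    The perforated disc produces holes_per_disc pulses per revolution,
    so each pulse corresponds to microsteps_per_rev / holes_per_disc
    commanded microsteps. (holes_per_disc=20 is an assumed value.)
    """
    expected_pulses = commanded_microsteps * holes_per_disc // microsteps_per_rev
    return (expected_pulses - barrier_pulses) > tolerance
```

If the barrier reports noticeably fewer pulses than expected, the motor has stalled and the drive is shut off.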
Nowadays, controlling stepper motors is no longer rocket science. Stamp-sized motor controller modules like the Pololu A4988, the DRV8825 and others can be connected directly to the I/O pins of an Arduino. Although prefabricated Arduino shields are available, I soldered my own.
Comparing prices, it turned out that a LiPo power bank is the cheapest, lightest and least space-consuming option to power the vehicle. Here, I use a 55 Wh power bank, which stores enough energy for approximately one day; driving around a lot will of course shorten this time. The problem is that a power bank is designed to charge mobile phones, not to power 5 V devices, so its voltage is only roughly regulated and drops below 4 V when high currents are drawn. Especially a Raspberry Pi 3B+ feels offended by such a low voltage and reboots. Another point is that the stepper motor driver modules need an input voltage of at least 7 volts, while the Raspberry Pi, Arduino and other gadgets need 5 volts.
So I used a power bank that supports Quick Charge and vdeconinck's QC3Control library for Arduino (https://github.com/vdeconinck/QC3Control) to get 12 volts out of it. The 12 V are fed directly into the stepper motor drivers, and an additional LM2576-based switching regulator provides perfectly stable 5 volts for the electronic devices.
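A quick sanity check of the "approximately one day" figure: runtime is simply the bank's energy capacity divided by the average power draw. The 2.3 W idle draw used below is my assumption for illustration, not a measurement.

```python
CAPACITY_WH = 55.0  # energy capacity of the power bank

def runtime_hours(avg_power_w):
    """Estimated runtime from the power bank's energy capacity."""
    return CAPACITY_WH / avg_power_w

print(runtime_hours(2.3))   # roughly one day at an assumed ~2.3 W idle draw
print(runtime_hours(10.0))  # much shorter when the motors run a lot
```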
The A4988 motor controllers can be driven directly by the 3.3 V I/O pins of a Raspberry Pi. Nevertheless, I decided to let an Arduino Uno do this work. One reason is that I already had one and no idea what to do with it; the other is that the Raspberry Pi has only one accessible PWM channel to drive the "step" input of the A4988. OK, I could fake the second channel in software, but my concern was that other tasks like image processing and speech recognition could slow down the software PWM by taking too many CPU resources.
My Arduino Uno has an ATmega328P controller with two 8-bit counters and one 16-bit counter, which can be configured as variable frequency generators. The first counter is already used by the Arduino library itself, so I used the other 8-bit counter and the 16-bit counter. The A4988 is wired for a microstep resolution of 1/16 steps. Multiplied by the 200 full steps per revolution of a common NEMA 17 stepper motor, that gives 3200 microsteps per turn, which means that 3200 rising edges applied to the "step" pin of the A4988 rotate the motor shaft by one full turn. Consequently, an output frequency of 3.2 kHz makes the stepper motors rotate once per second. The directly attached wheels have a circumference of 27 cm, so the vehicle runs at a speed of 27 cm/s at that frequency. I defined 30 cm/s as the maximum speed, which is quite fast when driving around in a normal living room. Another task is collision prevention by an ultrasonic distance sensor: when the sensor detects an obstacle, the motors are stopped immediately by the Arduino sketch itself.
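The numbers above can be checked with a few lines of Python, using the step settings and wheel circumference as given in the text:

```python
MICROSTEPS = 16            # A4988 wired for 1/16 microstepping
FULL_STEPS_PER_REV = 200   # standard NEMA 17 stepper motor
WHEEL_CIRCUMFERENCE_CM = 27.0

microsteps_per_rev = MICROSTEPS * FULL_STEPS_PER_REV  # 3200

def step_frequency_hz(speed_cm_s):
    """Step frequency the Arduino must generate for a given vehicle speed."""
    revs_per_s = speed_cm_s / WHEEL_CIRCUMFERENCE_CM
    return revs_per_s * microsteps_per_rev

print(step_frequency_hz(27.0))  # 3200.0 Hz -> exactly one revolution per second
print(step_frequency_hz(30.0))  # ~3556 Hz for the 30 cm/s maximum speed
```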
The Arduino also controls the output voltage of the QC power bank, using the QC3Control library and some resistors soldered on top of the motor controller HAT.
The Raspberry Pi sends the distance to travel and the speed of each motor to the Arduino over I2C. With the Raspberry Pi as I2C master, the bus has a high level of 3.3 V, which the Arduino also clearly identifies as high, so no level converter is needed.
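A sketch of how such a command could be packed into bytes on the Pi side. The message layout (signed 16-bit distances in millimeters, unsigned 8-bit speeds, little-endian to match the AVR) is my illustration, not Gizmo's actual wire format:

```python
import struct

def pack_motor_command(dist_left_mm, dist_right_mm, speed_left, speed_right):
    """Pack distances (signed 16-bit, mm) and speeds (unsigned 8-bit)
    into a 6-byte little-endian message for the Arduino I2C slave."""
    return struct.pack('<hhBB', dist_left_mm, dist_right_mm,
                       speed_left, speed_right)

msg = pack_motor_command(500, 500, 100, 100)
# on the Pi this buffer would be written out over I2C (e.g. with smbus2);
# the Arduino unpacks the same fields in its receive handler
dist_l, dist_r, spd_l, spd_r = struct.unpack('<hhBB', msg)
```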
The vehicle has a front-mounted Raspberry Pi NoIR camera. The Raspberry Pi streams the camera picture as an MJPEG stream over HTTP. I modified the sources of jacksonliam's mjpg-streamer (https://github.com/jacksonliam/mjpg-streamer) a little so that it is compiled directly into my code and streams OpenCV matrices directly, without having to save them to a file before streaming.
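An MJPEG-over-HTTP stream is just a multipart response in which each part is one JPEG-encoded frame. A minimal sketch of that framing (the boundary name is arbitrary and chosen by me):

```python
BOUNDARY = b"frameboundary"  # arbitrary multipart boundary string

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG-encoded frame as a multipart part, the way an
    MJPEG-over-HTTP streamer emits consecutive frames."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n"
            + jpeg_bytes + b"\r\n")

# with OpenCV, a matrix would be encoded in memory instead of saved to a file:
#   ok, buf = cv2.imencode(".jpg", frame)
#   part = mjpeg_part(buf.tobytes())
```

The overall response carries `Content-Type: multipart/x-mixed-replace; boundary=frameboundary`, so the browser replaces each frame with the next one.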
The camera is mounted on a tiltable 3D-printed camera mount. A servo motor will allow setting the viewing angle by software; this feature is not yet implemented, but the camera mount can already be tilted by hand. The camera has a switchable IR-cut filter, so it can capture detailed images even at dawn or in candlelight. The image below shows photos of a candle taken with and without the IR-cut filter. With the optional IR LEDs that are shipped with the camera, it is even possible to get perfect B/W images in total darkness.
I also coded an Android app that sends commands to and receives status codes from the vehicle over MQTT. Tapping on the camera image makes the vehicle move forward or backward, or lets it turn left or right. The triangles on the image turn orange when the ultrasonic distance sensor finds an obstacle and stops the motors, and red when the stepper motors lose steps due to an overload (e.g. when the vehicle gets stuck somewhere).
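The command channel itself is simple to sketch. Topic name and payload layout below are my illustration, not Gizmo's actual MQTT protocol; on the sending side, a client such as paho-mqtt would publish the payload with `client.publish(topic, payload)`.

```python
import json

COMMAND_TOPIC = "gizmo/command"  # assumed topic name, for illustration

def make_command(direction, value):
    """Build the JSON payload a tap in the app would translate into:
    a direction plus a distance in cm (or an angle in degrees for turns)."""
    assert direction in ("forward", "back", "left", "right")
    return json.dumps({"direction": direction, "value": value})

payload = make_command("forward", 50)
```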
Using the 360-degree LIDAR scanner on top, the vehicle can scan surrounding objects and walls with a laser beam. The resulting map can be used for convenient navigation.
Snips is a software platform that allows users to add powerful voice assistants to their Raspberry Pi devices without compromising on privacy. It runs 100% on-device and does not require an Internet connection. It features hotword detection, Automatic Speech Recognition (ASR), Natural Language Understanding (NLU) and dialog management. When Snips detects the hotword ("Gizmo", in this case), the ASR becomes active and transcribes the recorded sound to text. Then the NLU extracts the meaning from the text and publishes so-called "intents" and "slots" over MQTT. An intent contains the voice command as a JSON string.
The command-processing software running on the Raspberry Pi parses the JSON string and extracts the direction (forward, left, right, back) and the distance to travel or the angle in degrees to turn. For example, "forward 50" makes the robot move 50 cm forward, and "right 90" rotates it 90 degrees clockwise.
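A minimal sketch of that parsing step. The slot layout here is simplified and assumed; real Snips intent messages nest the slot values more deeply.

```python
import json

def parse_intent(intent_json):
    """Extract direction and amount from a Snips-style intent message.
    (Simplified slot structure, for illustration only.)"""
    msg = json.loads(intent_json)
    slots = {s["slotName"]: s["value"] for s in msg["slots"]}
    return slots["direction"], slots["amount"]

# example message as the NLU might publish it over MQTT
example = json.dumps({
    "intent": "gizmo:move",
    "slots": [{"slotName": "direction", "value": "forward"},
              {"slotName": "amount", "value": 50}],
})
direction, amount = parse_intent(example)  # -> ("forward", 50)
```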
Gizmo receives voice commands through an array of four microphones. Like a human or animal brain, a small Python script computes the direction of the incoming sound by measuring the phase differences between the four mics. An LED ring with 12 programmable RGB LEDs then indicates the direction of the hotword as well as the current operating state. See in the next video how Gizmo reacts to the voice command "come here".
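The principle behind the direction estimate can be sketched for a single mic pair: sound arriving at an angle reaches one mic slightly later than the other, and the time delay (found via cross-correlation) gives the angle of incidence. Sample rate and mic spacing below are assumed values, not Gizmo's actual hardware parameters, and the real script combines all four mics.

```python
import numpy as np

SAMPLE_RATE = 16000     # Hz, assumed
MIC_SPACING_M = 0.06    # distance between the two mics, assumed
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def arrival_angle(sig_a, sig_b):
    """Estimate the sound's angle of incidence (degrees, 0 = broadside)
    from the time delay between two microphones via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    delay_samples = np.argmax(corr) - (len(sig_b) - 1)
    delay_s = delay_samples / SAMPLE_RATE
    # clamp to the physically possible range before taking the arcsine
    x = np.clip(delay_s * SPEED_OF_SOUND / MIC_SPACING_M, -1.0, 1.0)
    return np.degrees(np.arcsin(x))

# synthetic check: the same pulse, shifted by two samples between the mics
pulse = np.zeros(64)
pulse[20] = 1.0
shifted = np.roll(pulse, 2)
print(arrival_angle(pulse, shifted))
```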
The following picture shows how all the components work together:
All in all, it took me one and a half years to develop this project to this stage, eating up most of my evenings after regular work, weekends and holidays. Not to mention the pain of spending money on the parts (including three Raspberry Pis killed by overvoltage) without being paid for anything. But it was a great pleasure and I really learned a lot, and that makes up for all the effort and pain.