Project in progress

OpenCat © GPL3+

A programmable and highly maneuverable robotic cat for STEM education and AI-enhanced services.

  • 194,721 views
  • 154 comments
  • 1,358 respects

Components and supplies

SparkFun Arduino Pro Mini 328 - 5V/16MHz
×1
Raspberry Pi 3 Model B
×1
Adafruit PCA9685 PWM & servo driver
×1
Micro SD card
×1
Heat sink
×2
Compression spring
×13
Torque spring
×4
Extension spring
×1
Flat self-tapping screws (various)
×1
Rivets (various)
×1
Infrared sensor and remote
×1
Buzzer
×1
Amplifier
×1
Cellphone speaker
×1
USB microphone
×1
Capacitor
×1
Resistor (various)
×1
18650/18500 batteries
×2
Battery holder
×1
MG92B servo
×13
MG91 servo
×1
Longer servo screw
×14
Pi noir fisheye camera with lights
×1
ToF lidar
×3
Adafruit Capacitive Touch Sensor Breakout - MPR121
×1
GY-521 MPU-6050 3 Axis Gyroscope + Accelerometer Module For Arduino
×1
Slide Switch
×1
Male/female pin connector (various)
×1
Flat washer
×14
Lock washer
×14
Right angle connector
×1
Rainbow wires
×1
USB to micro USB cable
×1
SparkFun FTDI Basic Breakout - 5V
×1
Pan/tilt holder
×1
Heat shrink tubing
×1
Electrical tape
×1

Necessary tools and machines

3D Printer (generic)
3D filament (ABS, rubber, conductive, nylon)
Soldering iron (generic)
Solder sucker/wick
Screw drivers (various)
File (various)
Acetone
Brush
Mini drill

About this project

Demo Video

Readers from mainland China can watch the video on Youku:

http://v.youku.com/v_show/id_XMzQxMzA1NjM0OA==.html?spm=a2h3j.8428770.3416059.1

I'm going to launch an Indiegogo campaign this fall with a brand-new kit. You can subscribe at Petoi.com or follow my Twitter https://twitter.com/PetoiCamp for updates. The code will be posted on GitHub: https://github.com/PetoiCamp/OpenCat.

Thank you!

Mentioned on IEEE Spectrum:

https://spectrum.ieee.org/automaton/robotics/humanoids/video-friday-boston-dynamics-spotmini-opencat-robot-engineered-arts-mesmer-uncanny-valley

The Next Web:

https://thenextweb.com/creativity/2018/03/05/forget-aibo-heres-opencat-a-3d-printable-pi-powered-open-source-cat-robot/

------

Hi,

You may have seen Boston Dynamics' robot dogs and the recently released Sony Aibo. They are super cool, but too expensive for most people to enjoy. I hope to provide some affordable alternatives that have most of their motion capabilities. I'm not claiming I can reproduce the precise motions of those robotics giants; I'm just bringing the barrier down from millions of dollars to hundreds. I don't expect to send it to a battlefield or other challenging environments. I just want to fit this naughty buddy into a clean, smart, yet too-quiet house.

With very limited resources and knowledge, I started small. A smaller structure avoids many of the engineering challenges of larger models. It also allows faster iteration and optimization, just as rats adapt faster than elephants. Regardless of the hardware, the main control algorithm can be shared once an accurate mapping of the DoFs is achieved. I derived a motion algorithm (with a dozen parameters) for multiple gaits. The fastest recorded speed was over 3 body lengths/sec, achieved with trotting (two legs in the air at a time). Since I constantly add new components and shift the CoM, and the adaptive part is not yet good enough, I reserve fine tuning for finalized models.
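As a hedged sketch of what such a parametric gait can look like (this is my own illustration, not the author's actual algorithm; the phase table, frequency, and amplitude are assumptions), a trot keeps diagonal leg pairs in phase so that two legs are airborne at any moment:

```python
import math

# Phase offset of each leg as a fraction of the gait cycle.
# In a trot, diagonal pairs (left-front/right-hind) move together.
TROT_PHASE = {"LF": 0.0, "RH": 0.0, "RF": 0.5, "LH": 0.5}

def leg_angle(leg, t, freq=2.0, amplitude=30.0):
    """Swing angle (degrees) of one leg at time t (seconds) for a trot."""
    phase = TROT_PHASE[leg]
    return amplitude * math.sin(2 * math.pi * (freq * t + phase))
```

Diagonal legs track each other exactly, while the opposite pair stays half a cycle behind; changing only the phase table would yield a different gait.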

The motion algorithm currently runs on a 32 KB, 16 MHz Arduino board, using up nearly all of its system resources despite algorithmic optimization almost everywhere. I'm going to switch to a 256 KB, 48 MHz board to boost the performance of active adaptation, as well as to allow additional code from future users. The motion is actuated by hobby-grade (but still robust, digital, metal-gear) servos to keep the price down. Some elastic structures were introduced to dampen shocks and protect the hardware.

On top of the motion module is a RasPi. The Pi takes no responsibility for controlling detailed limb movements. It focuses on more serious questions, such as "Who am I? Where do I come from? Where am I going?" It generates the "mind" and sends string commands to the Arduino slave. Motion instructions can still be sent to the Arduino, in a slower manner. A human remote sits in the middle and can intercept the robot's control of its own body. The robot still holds certain instincts, like refusing to jump off a cliff.
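A minimal sketch of what the Pi-side string-command interface could look like (the token names and argument format here are my own assumptions, not the actual OpenCat command set):

```python
# Hypothetical command tokens: a single letter selects a skill,
# optional integer arguments follow, separated by spaces.
SKILLS = {"k": "kinematic gait", "m": "move one joint", "d": "rest"}

def parse_command(line):
    """Split a serial line like 'm 8 45' into (token, args)."""
    parts = line.strip().split()
    token, args = parts[0], [int(a) for a in parts[1:]]
    if token not in SKILLS:
        raise ValueError("unknown token: " + token)
    return token, args
```

On the real robot the Pi would write such lines over the serial link and the Arduino firmware would tokenize them the same way; the point is that the "mind" and the motion module only ever exchange short strings.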

Currently I have two functional prototypes:

* The mini model is a stand-alone 8-DoF (expandable to 16-DoF) Arduino motion module that holds all the skills for multiple gaits and real-time adaptation. The code is compatible with the full version; only one parameter needs to change. Its mounting dimensions match those of a RasPi board, so it can also serve as a "leg hat" for your existing project. With an enhanced "carrier" configuration, it can carry about 1 kg of additional weight (though it walks slower, of course). It's targeted at STEM education and the maker community. The price will be similar to some robotic car kits.

* The full version uses a Pi for more AI-enhanced perception and instructs an upgraded 16-DoF motion module. Besides the Pi's WiFi and Bluetooth, it also carries ground-contact, touch, infrared, distance, voice, and night-vision interfaces. All modules have been tested on its lightweight body. It also adopts some bionic skeleton designs so that it morphologically resembles a cat. It's targeted at the consumer market with less technical background. You can imagine it as a legged Android phone or Alexa with an app store for third-party extensions. It can run continuously at about 2.6 body lengths/sec for 60 minutes, or sit streaming video for several hours. I also reserved some space below the spine for additional boards (such as a GPS). I have a regular routine for duplicating the model, but need better industrialization to reduce the labor. I expect the price to be close to that of a smartphone.

* I also have an obsolete version that uses only the Pi to control both AI and motion. All the code was written in Python. The movement is not as good when it's running intensive AI tasks.

I bought my first RasPi in June 2016 to learn to code hardware. To express my joy when I first lit an LED with the spark from the Pi, I wrote a Chinese quatrain:

启蒙 (Enlightenment)

夜闻禽兽想炊烟,斑白胼胝枉瘦田。

仙界私厨失圣火,人间春事已燎原。

which literally means: "The roar of nocturnal animals reminds me of the beginning of ancient civilization. White-haired people with calluses worked hard only to grow a few crops on the barren land. One day a kindling that used to serve God's private kitchen fell from the heavens. Soon slash-and-burn agriculture prospered on the earth."

This robot served as a playground for learning all the components in a regular RasPi beginner kit. I started with craft sticks, then switched to 3D-printed frames for optimized performance and morphology. An Arduino was introduced in July 2017 for better movement. Seven major iterations were made between July 2016 and September 2017. No significant progress has been made since last September, when I got distracted by increasing job duties, company formation, and patent writing.

I'm now teaching a university robotics class using the mini kit. I hope to sell more kits to validate the market and bring in some revenue to keep the project going. The full version is yet to be polished. I'm also applying to several accelerators and will probably try Indiegogo. Depending on where I get the best support, I may start a business or make the project completely open-source. Even as a commercial product, most of the code will be open-sourced with the sale of kits.

I believe in the power of open source, if everyone could grab a robot and start to solder and code. Rather than a final product, it shows the potential backed by a growing maker community. Users can focus on coding either the motion (Arduino with C) or AI (Pi with Python) part, and the two communicate through string tokens. It's also easy to teach new body languages and behaviors with a couple of lines of code. Anyone with prior knowledge of Arduino or RasPi can imagine its possible applications. I've been bored with animal-shaped cars for years. Now kids can learn physics and coding on a new type of toy. Robotics experts can focus on their walking algorithms on a much cheaper platform. Software developers can write AI-enhanced applications on a pet-like robot, rather than a "wheeled iPad."
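To illustrate the "couple of lines of code" claim, here is a hedged sketch of teaching a new behavior (the frame format, joint ordering, and function names are invented for this example, not the project's real API): a behavior is just a named sequence of joint-angle key frames that the motion module steps through.

```python
behaviors = {}

def teach(name, frames):
    """Register a named behavior as a list of joint-angle key frames."""
    behaviors[name] = frames

def play(name):
    """Yield the frames of a behavior, as the motion module would consume them."""
    for frame in behaviors[name]:
        yield frame

# Two key frames (8 joint angles each) are enough to define a simple stretch.
teach("stretch", [[30, -10, 30, -10, 45, 0, 45, 0],
                  [0, 0, 0, 0, 0, 0, 0, 0]])
```

In this framing, a user's entire contribution to a new body language is one `teach()` call with a list of poses.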

If you are interested in the cat and want one in hand, please like and share the video. I'd also love to see your comments to make it better. Your support will determine how soon it will be available on the market.

------------

I was amazed by your warm feedback. It's really encouraging!

I may forget to reply to some of your messages, so here I want to answer a few common questions:

----

* Sharing STL?

- The full version of the cat needs multiple precisely printed structures in various filaments. Printing and post-processing (which involves acetone) takes about two days, and the parts have to be assembled with specific accessories and tools. Some mechanisms are designed to <0.2 mm precision, and I'm currently tuning them by careful filing. Even an alternative way of soldering or wiring may cause trouble in assembly.

I think the most economical (and safe) way is to invest in an expensive injection mold and go to mass production, at least for the key mechanical parts. Once I release the files and specifications, you would probably agree.

And I need time to put together good documentation. The mini version should come out much earlier.

----

* Sharing Code?

- The project is built upon some open-source libraries, so I'm supposed to inherit their licenses. From another point of view, it's impossible for me to hide my code once it's released. So I will share the code.

However, I do hope to organize my code and plan the project better. The cat is my baby, and I want it to be stronger before it leaves home. I'm hosting a private GitHub repository with a compact team of volunteers, and the contents will be migrated gradually to https://github.com/PetoiCamp/OpenCat.

----

* Open-source?

- Open source also needs some commercial operations to stay healthy. I'll try my best to balance everything. These things are outside my expertise, and I have to learn. I do have to settle down and support my family, rather than sleeping alone in a foreign land. Sorry guys... I'm busy teaching 5 credits of university classes just to pay my bills!

I hope to make the project support itself rather than beg for donations every year. For such a complicated system, I'm confident that mass production will bring the unit cost below individual DIY efforts. Once that goal is achieved, hobbyists can still build this challenging project from source code (and dozens of accessories from online stores), while the general public can simply enjoy the product at a lower price.

In conclusion, the project will be open-sourced once a minimal maintenance team can live on it.

----

* A last question, for you:

- Could you suggest a semi-public platform (like a forum or BBS) where we can discuss the project in a better organized way? I'm receiving emails about collaborations, and I hope everyone can get credit and keep track of others' contributions.
