We are a group of sophomore engineers from Olin College of Engineering. This project was part of our Principles of Integrated Engineering (PIE) course. Our objective was to make a cute, interactive, and advanced robot that would act as the dorm pet we are all forbidden from having. This website showcases and documents all of our work on the Fox-Bot over the course of the semester.
Cute
Inspired by the cat cakes from Honkai: Star Rail, we decided to make a fun and whimsical fox cake.
Interactive
In order to make the robot feel like a real pet, we wanted to be able to interact with it in ways such as petting it or playing music, and have it respond.
Advanced
We wanted to create an integrated system with mechanical, electrical, and software components.
Default
Eyes and ears neutral, no movement
Chase tail
Tail to the right and spin to the right, eyes get dizzy, blink
Look Around
Turn slightly right, turn slightly left, move forward
Petted
Ears move, eyes happy (triggered by button)
Follow Treat
Heart eyes, detect and follow berry, get angry after a period of time if not petted (triggered by melody)
Sleep
Closed eyes, ears inward (triggered by darkness)
Electrical design process and justification
The electrical system of the Fox-Bot was designed to be both functional and clean. It needs to facilitate all necessary software tasks and power all of the systems without interfering with the cute looks of the bot. It should also be well organized and easy to follow for debugging.
All external interaction with the robot was originally to occur via audio, a decision we pivoted on later in the project.
Going into the project, the team set the following electrical goals: facilitate every software task, power all subsystems reliably, and keep the wiring organized and easy to debug.
The first big choices on the table were how to do computing, how to control motors and servos, and how to power the system.
Running both machine learning processes and audio processing at once requires substantial computing power and mature software support. Many single-board computers (SBCs) exist on the market, but many are either overpriced or underperforming for our budget and needs. To compare our options we made the following decision matrix:
| Criteria | Weight (%) | Raspberry Pi 5 | NVIDIA Jetson | Raspberry Pi 4 | Orange Pi 5 |
|---|---|---|---|---|---|
| Cost | 20 | 7 | 4 | 8 | 4 |
| Performance | 20 | 8 | 10 | 4 | 10 |
| Support | 5 | 6 | 6 | 6 | 3 |
| Community | 15 | 10 | 8 | 10 | 6 |
| Size | 5 | 10 | 5 | 10 | 8 |
| Power Draw | 15 | 7 | 4 | 7 | 6 |
| Ease of Use | 20 | 8 | 7 | 8 | 7 |
| Final Score | 100 | 7.95 | 6.55 | 7.35 | 6.55 |
Note: The Arduino Uno Q would have been an absolutely perfect option, but it would not have arrived in time.
The Raspberry Pi 5 cannot directly control and power servos or motors, so we needed an intermediary to facilitate this control. In earlier PIE projects, we learned how to use Arduinos and Adafruit Motor Shields, and since fancy motor control was not in the scope of the project or in our learning goals, we opted to continue with this method of control. PIE also had a spare motor shield we could borrow, which made it an easy choice, given that our team also had multiple Arduino R4s at our disposal (foreshadowing).
All of the components we selected now had to be powered. Altogether, the maximum power draw of the system is as follows:
Raspberry Pi 5 = 15W (3A @ 5V)
Arduino Uno R4 Minima = 0.5W (90mA @ 5V)
Three Servos = 3W (600mA @ 5V)
Two 8x8 Matrix Displays = 3W (600mA @ 5V)
Two 12V Drive Motors = 72W (6A @ 12V)
Total 5V = 4.29A (needs to be stepped down from 12V)
Total 12V = 2.4A (including the 12V converter's theoretical draw)
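As a quick sanity check on the 5V numbers above, here is the arithmetic in Python (the 90% converter efficiency is an assumed, illustrative figure):

# 5V rail loads from the list above, in amps.
loads_5v = {
    "Raspberry Pi 5": 3.00,
    "Arduino Uno R4 Minima": 0.09,
    "Three servos": 0.60,
    "Two 8x8 matrix displays": 0.60,
}
total_5v_amps = sum(loads_5v.values())   # 4.29 A
total_5v_watts = total_5v_amps * 5       # 21.45 W
# Reflected back to the 12V battery through the buck converters,
# assuming ~90% converter efficiency (an illustrative figure):
amps_at_12v = total_5v_watts / 0.9 / 12  # ~1.99 A
print(total_5v_amps, total_5v_watts, round(amps_at_12v, 2))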
We had a 12V battery easily at our disposal from PIE stock: a 12V 2.2Ah Li-Po battery that had seen better days (foreshadowing #2). We also needed a 12V system anyway to power the motors.
We found buck converters in the electrical stock room that could only supply 3A max, meaning that at full draw our robot would be power limited. Also, the Raspberry Pi 5 is incredibly sensitive to power fluctuations (foreshadowing #3), so having a dedicated buck converter makes the system more reliable and cleaner. So we ended up with a 12V battery and two buck converters to power the entire robot.
With all of our components selected, we could start working! Our original expectation was to use three microphones for all of the software functions. To test this, we connected three microphones to the ADC ports on an Arduino and transmitted the audio data over WiFi, testing both audio collection and remote data transmission. We were only able to send low-fidelity audio, which did not meet software's requirements for tracking a person or melody. Relatively late into the project we switched to a camera, which we will discuss more later.
Just as we were figuring out audio, our Raspberry Pi 5 finally arrived. We purchased the Pi 5 due to its reliability, community, and generally available knowledge on the internet. However, we managed to discover many of the possible downfalls of the Raspberry Pi 5, all in succession.
After flashing a 64-bit version of Debian Trixie, the Raspberry Pi would start to boot, get caught, and start to boot again endlessly. It gave no error code, would not connect to the internet so we could SSH into it and check for errors, and gave no consistent signal as to the problem. We eventually identified a weak power supply, wires that were too thin, a slow SD card, and funky network protocols as all interfering with the boot process.
After soldering thicker wires to the GPIO pins, setting a static IP and connecting to OLIN-DEVICES, finding a faster SD card, and changing power supplies, the Raspberry Pi booted! This was absolutely huge for our team and happened after over ten hours of fiddling with a supposedly plug-and-play product. Note: we did not have the official Raspberry Pi power supply, which would have solved some of these issues.
With our base systems set up with the Arduino data communication and Raspberry Pi up and running, we started to integrate the electrical system. We were able to find two buck converters in the electrical stock room that perfectly fit our power requirements and dimensions.
Within a couple days we had a fully wired, albeit messy, system for testing. Our battery was happily powering our buck converters and the Arduinos and motors.
Then one of our GPIO pins on one Arduino shorted and produced magic smoke, and a 12V wire became quite friendly with the USB port on the other Arduino… leaving us with one final Arduino for the project.
Besides this, the two buck converter system kept the Raspberry Pi happy while providing enough power to the servos and displays.
It was incredibly important to our electrical team member that the system be quite neat. So after we had a working prototype, it was time to refine the wiring. Before refinement the wiring was quite a rat's nest: it all worked, but some wires were too short, some were too long, and it all made working on the robot really hard.
To fix this issue we unsoldered all of the wires and made all of them the correct length, as well as switching to silicone sleeved wire instead of PVC for ease of routing. We then added wire wrap and heat shrink to all necessary connections and routed the wires for minimal length on power wires.

Here is what the final electrical system looks like with all of the wire management.
Firmware also fell under the electrical umbrella. Please see the firmware page on the website for details on the development.
The battery powering our dear furry friend was a 12V Li-Po battery from PIE stock. We found it with a slight bulge, but it was within safe levels. This battery served us for a couple weeks, until we left the robot plugged in all night, completely draining the battery. In the morning we charged the battery again, causing it to inflate a great deal more, to highly unsafe levels (enough that the entire PIE teaching team was concerned). We unfortunately had to dispose of our trusty battery with the shop.
The PIE team offered to buy our team a new battery, since safety comes above budget, so huge shout out to all of them. Thank you for making our project possible. But for a while, we were relegated to power supply testing for our software/hardware integration.

Our design process laid bare
The main goals for the mechanical design of the Fox-Bot were to be easy to take apart and put back together, and to be visually fox-like and cute. To achieve the first goal we made a modular skeleton out of lasercut plywood. All joints are press fit or secured with bolts, to ease disassembly and access to the inside electronics. Plywood was also chosen for being cost effective and sustainable. To achieve the second goal we built moving ears and a tail, hid the camera in the nose, and covered the entire Fox-Bot in faux fur. We've broken our design process into six main aspects, which we'll cover here.
The chassis is designed to be easily taken apart in order to access the interior workings. The Fox-Bot is driven by two 12V DC motors mounted on the baseplate, with two caster wheels on the front and back. We originally planned to design and manufacture our own caster wheels, but decided against it because of how much time it would take. The caster wheels are embedded higher into the baseplate to match the height of the main wheels. We also had to sand the motor mounts a bit to reach the desired height. The design of the lasercut chassis stayed mostly the same throughout our design process, and we were able to add attachment points where necessary.

The design of the face had to include mounting for the two displays we used for eyes, and eventually a mount for the camera. Continuing the goal of ease of assembly, the eyes are simply press fit into holes cut into the plywood. Once we pivoted to using a camera, the camera mount was added to the center of the face and disguised as a nose/snout.
Near the beginning of the project we experimented with housing the electrical components on a unified platform holding the Arduino, motor shield, and battery, but we quickly realized that would not work because the DC motors for the wheels cut right through the center of the body. This led to a two-platform design, with a simple plywood housing for the battery on one side of the Fox-Bot and a simple 3D-printed platform on the other.

Two screws are tightened up through the baseplate and holes in the Raspberry Pi to screw into the poles of the PCB platform. The Arduino is then secured to the platform with a separate set of bolts. This setup had the added benefit of balancing the chassis.
We also implemented wiring notches in the top and bottom of the chassis supports to help with the internal wire management of the Fox-Bot.

The ears were designed to be rotated above the head by two servo motors, and to be simple enough to cover in fur. A structure to support the servos is press fit into the top. We used small servos provided by our class because they were free and we didn't need much torque to move the ears; we were more constrained by space.

The goal of the button is to allow the user to directly interact with the Fox-Bot. It was designed with ease of activation and ease of integration in mind. Electrically, the Fox-Bot was already doing a lot, which is why there is no pressure sensor or similar sensor to detect someone petting the Fox-Bot's head. The solution is two buttons.


Above are the CAD rendering of the top plate and large button (both laser-cut plywood) and the real-life implementation. The small blue button is a real electrical component, but on its own it doesn't meet the requirement that a user can pet anywhere on the top of the Fox-Bot and cue a response. We chose this component because we found it for free within our classroom's excess materials. The large button and the top plate have suspended dowels running through both, with springs around those dowels. This guides the button directly down when it is pressed and keeps it from moving in the XY plane. The springs, wood-glued directly to the button and top plate, keep the large button from dislodging. Whenever the large button is pressed, even slightly, it applies pressure to the small button and cues the Fox-Bot to respond.
This design makes integration easier and relies on a very simple and robust electric component, versus a potentially finickier sensor.
The tail of the Fox-Bot was 3D printed from PLA. This manufacturing process was chosen for its affordability and ease of prototyping. The tail went through two major phases. The first was an attempt to install a cable drive system that would use a gear to simultaneously swing and curl the tail. These were the first iterations of the tail with that design in mind (note the holes in the sides of each component).

This idea was quickly abandoned as too complicated for such a small subsystem, since it would have required springs integrated into the cable system to keep it taut. We opted for a simpler modular design that curls slightly when swung back and forth via the momentum of the tail. The components were also made cylindrical, and filling was added to remove the clanking noise created when the PLA collided with itself as the tail swung. This significantly reduced the tail's ability to curl, but considering the size of the fur that would later cover it, this trade-off was worth it.

The servo's motor supports (which press fit into the Fox-Bot for even more modularity) remained relatively unchanged through the two phases, although the whole platform was raised to reduce dragging on the ground.
The entire Fox-Bot is covered in faux fur, cut into four main pieces for the body, ears, and tail (pictured in the system diagrams below).

The other purely aesthetic part we created was a 3D rendition of the Minecraft Sweet Berry.

We made it out of plywood and painted it. The color of the berries is a brighter and more consistent red than that of the actual videogame artifact, but that decision was made with the computer vision in mind: a single, brighter red makes tracking easier.

Software design process and justification
We chose to use Python because of its ease of use and many libraries; the one exception is the C++ firmware on the Arduino, which controls all of the hardware.
The core module that controls the robot is the Behaviors class, which stores the robot's state as a set of parameters:
self.left_speed = 0      # -128 to 127
self.right_speed = 0     # -128 to 127
self.ear = 90            # 0 to 180
self.tail = 60           # 0 to 120
self.eye_brightness = 1  # 0 to 1
self.left_eye = eye_display.EyeDisplay()
self.right_eye = eye_display.EyeDisplay()
In a separate EyeDisplay class, we define each 8x8-pixel eye with a list of 0s and 1s and import it here.
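For illustration, here is a minimal sketch of what such a class could look like (the pattern shown and the to_bytes() helper are hypothetical, not our exact artwork or API):

class EyeDisplay:
    """Each eye is eight rows of eight 0/1 pixels."""

    HAPPY = [
        [0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 1, 0, 0, 1, 1, 0],
        [1, 0, 0, 1, 1, 0, 0, 1],
        [1, 0, 0, 0, 0, 0, 0, 1],
        [0, 1, 0, 0, 0, 0, 1, 0],
        [0, 0, 1, 0, 0, 1, 0, 0],
        [0, 0, 0, 1, 1, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 0, 0],
    ]

    def to_bytes(self, pattern):
        # Pack each row of 0s and 1s into one byte per row,
        # matching the 8 bytes per eye in the serial message.
        return bytes(int("".join(map(str, row)), 2) for row in pattern)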
We began with some basic idle functions like sleep() and chase_tail() that set these parameters to certain values or states according to the time and phase of the behavior. For example, in chase_tail(), the tail first moves to one side, then the robot spins toward its tail, and finally it gets dizzy eyes, as sketched below.
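Here is a minimal sketch of that phase structure (the timing thresholds, speeds, and the set_pattern() eye call are illustrative placeholders, not our exact values):

def chase_tail(self, elapsed):
    if elapsed < 1.0:        # Phase 1: swing the tail to one side
        self.tail = 120
    elif elapsed < 4.0:      # Phase 2: spin toward the tail
        self.left_speed = 100
        self.right_speed = -100
    else:                    # Phase 3: stop and show dizzy eyes
        self.left_speed = 0
        self.right_speed = 0
        self.left_eye.set_pattern("dizzy")   # hypothetical EyeDisplay call
        self.right_eye.set_pattern("dizzy")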
At an early stage, since the hardware was not yet in place, we tested our code on a simulation script that printed out all robot parameters and the raw bytes being sent. This let us test and debug our code as we were writing it, avoiding the risk of large-scale code reconstruction or underlying logic bugs that couldn't have been found without running.
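The simulation boiled down to a stand-in for the serial port, along these lines (a sketch; the class name and example frame are illustrative):

class FakeSerial:
    """Drop-in replacement for the real serial port during early testing:
    prints the outgoing packet instead of writing it to hardware."""

    def write(self, packet: bytes):
        print(f"sending {len(packet)} bytes: {list(packet)}")

port = FakeSerial()
port.write(bytes([128, 128, 90, 60, 255] + [0] * 16))  # a neutral 21-byte frame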

In fact, this helped a lot with our integration. Our script worked almost instantly when we first tested it on hardware, apart from some easily fixed Arduino reading issues.
As we started to add more idle states, more complex behaviors, and behaviors that only trigger on certain input signals, we realized that one class controlling both the behavior actions and the robot state would become really long and messy. To make our code more readable and extendable, we split it into two classes: Behaviors and BehaviorManager.
BehaviorManager, as its name suggests, manages robot states and behavior signals based on inputs such as a button press (petted), melody detection, and darkness. It also handles prioritization of behaviors, the idle behavior loop, and time tracking, roughly as sketched below.
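Here is a rough sketch of that prioritization (the input flag names and timing values are hypothetical, not our exact implementation):

import random
import time

class BehaviorManager:
    IDLE_STATES = ["blink", "chase_tail", "wag_tail", "look_around"]

    def __init__(self):
        self.now = time.time()
        self.state = "default"
        self.idle_start = self.now
        self.look_for_treat_start = self.now

    def update_state(self, petted, melody_heard, is_dark):
        self.now = time.time()
        if petted:                            # direct interaction wins
            self.state = "run_petted"
        elif melody_heard:                    # melody starts the treat hunt
            self.state = "run_look_for_treat"
            self.look_for_treat_start = self.now
        elif is_dark:                         # darkness sends the robot to sleep
            self.state = "run_sleep"
        elif self.now - self.idle_start > 5:  # otherwise rotate idle behaviors
            self.state = random.choice(self.IDLE_STATES)
            self.idle_start = self.now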
The Behaviors class receives a string indicating the state the robot should currently be in, and updates the robot parameters accordingly:
def update_behavior(self):
    self.state = self.manager.state
    match self.state:
        case "run_petted":
            self.behavior = lambda: self.petted(self.manager.eye_state)
        case "run_look_for_treat":
            elapsed = self.manager.now - self.manager.look_for_treat_start
            self.behavior = lambda: self.look_for_treat(elapsed)
        case "run_sleep":
            self.behavior = self.sleep
        case "blink":
            elapsed = self.manager.now - self.manager.idle_start
            self.behavior = lambda: self.blink(elapsed)
        case "chase_tail":
            elapsed = self.manager.now - self.manager.idle_start
            self.behavior = lambda: self.chase_tail(elapsed)
        case "wag_tail":
            self.behavior = self.wag_tail
        case "look_around":
            elapsed = self.manager.now - self.manager.idle_start
            self.behavior = lambda: self.look_around(elapsed)
        case _:
            self.behavior = self.default
There are three main tasks for audio processing:
Pitch finder
We used the librosa library to find the frequency and pitch of notes in the audio. It worked pretty well on saved .wav audio files, so we decided that our main task was getting it to run on audio collected in real time.
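On a saved file, the pitch finder amounted to something like this sketch (librosa.pyin, note_to_hz, and hz_to_note are real librosa functions; the filename and note range are placeholders):

import librosa

y, sr = librosa.load("melody.wav", sr=None)  # load the saved clip
f0, voiced, _ = librosa.pyin(                # pYIN pitch tracking
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)
# Keep only frames the tracker considers voiced, then name the notes.
notes = [librosa.hz_to_note(f) for f, v in zip(f0, voiced) if v]
print(notes)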
Real-time audio
One challenge of real-time audio processing is that while collecting data is fast and easy, processing it and detecting notes takes longer and would cause a noticeable pause in the robot's operation. To address this, we continuously stream sound from the microphone and store small chunks of audio in memory, which doesn't affect normal robot operation. Every few seconds, we save the recorded audio to a temporary file, process it to analyze the frequencies, and save the notes detected. We can then compare the notes to the given melody and send that signal to other modules.
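A sketch of this chunked pipeline, using the sounddevice and soundfile libraries as stand-ins (the sample rate, clip length, and filename are illustrative):

import queue
import numpy as np
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 22050
CLIP_SECONDS = 3
audio_q = queue.Queue()

def enqueue(indata, frames, time_info, status):
    audio_q.put(indata.copy())  # only enqueue; never block the audio thread

with sd.InputStream(samplerate=SAMPLE_RATE, channels=1, callback=enqueue):
    while True:
        chunks, samples = [], 0
        while samples < CLIP_SECONDS * SAMPLE_RATE:  # drain ~3 s of chunks
            chunk = audio_q.get()
            chunks.append(chunk)
            samples += len(chunk)
        # Save to a temporary file, then hand it to the pitch finder above.
        sf.write("clip_tmp.wav", np.concatenate(chunks), SAMPLE_RATE)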
Word & Speaker recognition
Our stretch goal was to make the robot react to specific word commands, such as "spin" or "come here," and to train a model on audio data from different people's voices. We explored both but didn't end up applying either to the robot.
We wanted the robot to come to us when it hears a specific melody. Our initial thought was to use three microphones on the sides of the robot, convert the amplitude of each microphone's reading to a distance, and calculate the position of the sound source from those distances. However, the microphone quality was worse than we expected, and the differences between the three microphones' readings were not significant enough for an accurate calculation.
To address that, we decided to instead use a camera so the robot can see what's around it and follow a target: in our case, a red berry. We used OpenCV to detect the color; when it sees a blob of red, it returns the position of the blob's center, and the robot moves toward it.
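A minimal sketch of that detection step (the HSV thresholds and minimum blob area shown are illustrative, not our tuned values):

import cv2

def find_berry(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue ranges.
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 200:  # ignore small flecks of red
        return None
    m = cv2.moments(largest)
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))  # blob center (x, y)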

In the final stage, when all the hardware was assembled, we tested the code on the robot and fine-tuned the parameters. For example, the tail servo cannot move to its theoretical maximum angle because it would hit the mounting, and the motor speeds for wiggling also needed careful adjustment so the robot didn't just go straight or turn to one side.

System Diagram/Descriptions
Diagrams/descriptions of multiple high and low level systems.
A mechanical diagram of the Fox-Bot.
Below is a full illustration and description of the mechanical side of the Fox-Bot, followed by subsystem-level views. We chose Onshape as our CAD software because of its collaboration features. If you are looking for more details, or wish to recreate some or all of our Fox-Bot, here is the link to view our Onshape document. Most real-life images are on the mechanical design process page.

This is a full CAD rendering of the Fox-Bot. The whole design is modular, and almost all joints are either press fits or bolts. Only the caster wheels, whose depth needed to exactly match the motorized wheels, are wood glued.


CAD rendering and drawings of the chassis. All of these parts are 1/4 in plywood machined with a laser cutter.


CAD rendering and drawings of the battery container and PCB housing. The battery container is 1/4 in plywood machined with a laser cutter. The PCB housing was 3D printed from PLA.


CAD rendering and drawings of the eye display mounts, nose, and camera mount. All drawn parts are 1/4 in plywood machined with a laser cutter. Included in the CAD rendering are approximations of the eye displays and the camera and its board.


CAD rendering and drawings of the ear scaffolding, ear motor mounts, and ear motor horn. All drawn parts are 1/4 in plywood machined with a laser cutter. Included in the CAD rendering are approximations of the small servos used to power the ears.



CAD rendering and drawings of the tail pieces, tail horn, and tail servo mounts. All drawn parts were printed from PLA. Included in the CAD rendering is an approximation of the servo used to power the tail. The live image shows the final integrated tail.



Images of the four main pieces of fur that cover the body, ears, and tail of the Fox-Bot; CAD for the Minecraft Sweet Berry; and a physical rendition of the sweet berry made of stacked, wood-glued layers of 1/4 in plywood machined with a laser cutter. This prop is what the camera tracks after the Beastling Call melody is recognized by the microphone (built into the board the camera is on).
A full list of parts and costs for the Fox-Bot.
| Part | Quantity | Price | Link |
|---|---|---|---|
| DC12V Encoder Gear Motor | 2 | $19.00 | Amazon Link |
| Electret Microphone (x3) | 1 | $6.99 | Amazon Link |
| 8x8 Matrix Display (x3) | 1 | $6.99 | Amazon Link |
| 100 1/4 in Chrome Plated Balls | 1 | $6.56 | Amazon Link |
| Raspi 5 | 1 | $66.00 | Adafruit Link |
| Box of springs | 1 | $6.36 | Amazon Link |
| 8 Adhesive Castor Wheels | 1 | $5.32 | Amazon Link |
| Faux fur | 1 | $53.61 | Amazon Link |
| Camera/Microphone | 1 | $54.90 | DF Robot Link |
| 1/4 in plywood | 2 | $6.00 | Shop Stock |
| 12V to 5V 3A buck converter | 2 | $10.00 | Found |
| XT60 Connectors | 1 | $0.66 | Found |
| 20 AWG Wire | 3 ft | $1.73 | Found |
| 24 AWG Wire | 2 ft | $0.70 | Found |
| Button | 1 | $5.00 | Found |
| Shrink Wrap | N/A | $0.24 | Found |
A circuit diagram of the Fox-Bot and electrical components list.
A graphic model of the data and energy flow inside the robot.
This diagram shows the energy and data flow throughout the Fox-Bot robot. It is oriented in a bird's-eye view with the tail towards the top and the eyes towards the bottom.
A firmware description of the Fox-Bot.
Our system uses an Arduino R4 Minima to actuate the servos and motors and to receive signals. The Arduino receives all of its instructions on how to actuate those devices via serial messages from the Raspberry Pi 5. Since we need the robot to react to the user in real time, the firmware needs to be efficient and loop quickly. The firmware also does very minimal calculation, as most of the grunt work is done by the much more powerful Raspberry Pi 5.
Since everything the Arduino does is based on the data it receives, the main loop starts with reading and unpacking a serial message.
// ----------------------------------------------------------
// Serial Data format:
// Byte 0 = Left motor speed -128-127(byte)
// Byte 1 = Right motor speed -128-127(byte)
// Byte 2 = Servo angle for ears 0-180(byte)
// Byte 3 = Servo angle for the tail 0-180(byte)
// Byte 4 = Eye brightness (0-1 -> scaled to 0-255)
// Bytes 5-12 = Array for left eye (8 bytes)
// Bytes 13-20 = Array for right eye (8 bytes)
// ----------------------------------------------------------
if (Serial.readBytes(buffer, SERIAL_PACKET_SIZE) == SERIAL_PACKET_SIZE) { // Unpack message data
int receivedLeftSpeed = (int)buffer[0];
int receivedRightSpeed = (int)buffer[1];
int leftSpeed = (receivedLeftSpeed - 128)*2; // Convert unsigned 0-255 to signed -255-255
int rightSpeed = (receivedRightSpeed - 128)*2; // Convert unsigned 0-255 to signed -255-255
int receivedEar = (int)buffer[2];
int receivedTail = (int)buffer[3];
int receivedBrightness = (int)buffer[4];
These data points are then used as inputs to various functions that change the state of the robot.
for (int i = 0; i < 8; i++) { // Unpack array for left eye matrix
leftArray[i] = buffer[5 + i];
}
for (int i = 0; i < 8; i++) { // Unpack array for right eye matrix
rightArray[i] = buffer[13 + i];
}
setDisplays(receivedBrightness, leftArray, rightArray); // Set displays based on arrays unpacked above
setMotors(leftSpeed, rightSpeed); // Set motors to speeds unpacked above
if (l_receivedEar != receivedEar) { // Set ear angles
setEars(receivedEar);
}
if (l_receivedTail != receivedTail) { // Set tail angle
setTail(receivedTail);
}
The serial communication and function calls make up the entire loop. This simple loop accomplishes our goals of simplicity and speed. It runs fast enough to react in real time (imperceptibly to our eyes) to the Raspberry Pi's inputs, even while changing motor, servo, and display states.
A huge part of the firmware's speed comes from non-blocking function calls. None of our functions wait for the action to be completed, which greatly increases speed.
void loop() {
if (Serial.available() >= SERIAL_PACKET_SIZE) {
This can lead to issues where actions do not reach completion before a new state is requested. We solve this by reducing or increasing the message rate of the Raspberry Pi, since the Arduino only acts when a serial message is available. Therefore, the loop runs only as often as the Raspberry Pi requests (up to the Arduino's computing limit).
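For reference, the Raspberry Pi side of this protocol could look like the following sketch (assuming pyserial; the port name is a placeholder):

import serial

ser = serial.Serial("/dev/ttyACM0", 115200)  # matches the firmware's baud rate

def send_state(left, right, ear, tail, brightness, left_eye, right_eye):
    # Pack one 21-byte frame in the format the firmware unpacks above.
    packet = bytes(
        [left + 128, right + 128,    # signed -128..127 -> unsigned bytes
         ear, tail,                  # servo angles
         int(brightness * 255)]      # 0..1 -> 0..255
        + list(left_eye)             # 8 row bytes for the left eye
        + list(right_eye)            # 8 row bytes for the right eye
    )
    assert len(packet) == 21
    ser.write(packet)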
The Arduino is connected to an Adafruit v3 motor shield to allow the Arduino to control the two DC drive motors. We are also using matrix LED displays along with servos and thus we need multiple external libraries to run these complex tasks. The libraries we used are:
#include <Adafruit_MotorShield.h>
#include <Servo.h>
#include <LedControl.h>
Note: the Arduino IDE is also required to efficiently edit and flash the Arduino code. The full firmware listing follows:
#include <Adafruit_MotorShield.h>
#include <Servo.h>
#include <LedControl.h>
// All pin defs
const int DIN_PIN = 12; // Data In (MOSI) -> Connect to DIN on the first display (orange wire)
const int CLK_PIN = 7; // Clock -> Connect to CLK on the first display (green wire)
const int CS_PIN = 8; // Load/Chip Select (CS) -> Connect to CS on the first display (yellow wire)
const int ENCODER_PIN_R = 2; // Encoder for right motor (must be 2 or 3 for ISR)
const int ENCODER_PIN_L = 3; // Encoder for left motor (must be 2 or 3 for ISR)
const int SERVO_EAR_R = 10; // Right ear servo
const int SERVO_EAR_L = 9; // Left ear servo
const int SERVO_TAIL = 6; // Tail servo
const int NUM_DEVICES = 2; // Number of displays daisy-chained together
byte SERIAL_PACKET_SIZE = 21; // Serial Comm Constants
// Motor shield setup
Adafruit_MotorShield MS1 = Adafruit_MotorShield(); // Motor shield (0x60 address)
// Attach motors
Adafruit_DCMotor *m1 = MS1.getMotor(4); // Left Drive Motor
Adafruit_DCMotor *m2 = MS1.getMotor(3); // Right Drive Motor
// Servo objects
Servo earR_servo;
Servo earL_servo;
Servo tail_servo;
// Set up LED controller
LedControl lc = LedControl(DIN_PIN, CLK_PIN, CS_PIN, NUM_DEVICES);
int receivedLeftSpeed = 0;
int receivedRightSpeed = 0;
int receivedEar = 90;
int receivedTail = 90;
int receivedBrightness = 5;
int l_receivedLeftSpeed = 0;
int l_receivedRightSpeed = 0;
int l_receivedEar = 90;
int l_receivedTail = 90;
int l_receivedBrightness = 5;
byte leftArray[] = {
B00000000, // Row 0
B00000000, // Row 1
B00000000, // Row 2
B00000000, // Row 3
B00000000, // Row 4
B00000000, // Row 5
B00000000, // Row 6
B00000000 // Row 7
};
byte rightArray[] = {
B00000000, // Row 0
B00000000, // Row 1
B00000000, // Row 2
B00000000, // Row 3
B00000000, // Row 4
B00000000, // Row 5
B00000000, // Row 6
B00000000 // Row 7
};
const int PATTERN_ROWS = 8; // Max number of rows in the pattern
void setDisplays(int brightness, byte leftArray[], byte rightArray[]) {
lc.setIntensity(0, brightness); // Set the brightness from 0-15
lc.setIntensity(1, brightness); // Set the brightness from 0-15
for (int row = 0; row < PATTERN_ROWS; row++) { // Set patterns
lc.setRow(0, row, leftArray[row]);
lc.setRow(1, row, rightArray[row]);
}
}
void setMotors(int speedL, int speedR) {
// Clamp both speeds to the -255 to 255 range
if (speedL > 255) {
speedL = 255;
}
if (speedL < -255) {
speedL = -255;
}
if (speedR > 255) {
speedR = 255;
}
if (speedR < -255) {
speedR = -255;
}
// Set speed and direction of left motor
if (speedL==0){
m1 -> run(RELEASE);
} else if(speedL<0){
m1 -> run(BACKWARD);
m1->setSpeed(-speedL);
} else{
m1 -> run(FORWARD);
m1->setSpeed(speedL);
}
// Set speed and direction of right motor
if (speedR==0){
m2 -> run(RELEASE);
} else if(speedR<0){
m2 -> run(BACKWARD);
m2->setSpeed(-speedR);
} else{
m2 -> run(FORWARD);
m2->setSpeed(speedR);
}
}
void setEars(byte ear_angle) { // Set ears angle
earR_servo.write(ear_angle);
earL_servo.write(ear_angle);
}
void setTail(byte tail_angle) { // Set tail angle
tail_servo.write(tail_angle);
}
void setup() {
MS1.begin();
//------------------------------------------------------------------------------------------------
// The Arduino code needs to receive a 21-byte message to run
// 21 bytes = 168 bits
// The robot should have a refresh rate of around 200-500 times a second for seamless operation
// Realistically 60 Hz or so would probably be fine but where is the fun in that
// 168 * 500 = 84,000 bps, which means 9600 baud is too slow, so we need 115200 bps or faster
//------------------------------------------------------------------------------------------------
Serial.begin(115200); // Bitrate justified above
delay(100); // Give time for Raspi to connect
// Attach all necessary servos
earR_servo.attach(SERVO_EAR_R);
earL_servo.attach(SERVO_EAR_L);
tail_servo.attach(SERVO_TAIL);
lc.shutdown(0, false); // Wake up display 1
lc.setIntensity(0, 1); // Set the brightness from 0-15
lc.clearDisplay(0); // Clear display 1
lc.shutdown(1, false); // Wake up the display 2
lc.setIntensity(1, 1); // Set the brightness from 0-15
lc.clearDisplay(1); // Clear display 2
}
void loop() {
if (Serial.available() >= SERIAL_PACKET_SIZE) {
byte buffer[SERIAL_PACKET_SIZE];
// ----------------------------------------------------------
// Serial Data format:
// Byte 0 = Left motor speed -128-127(byte)
// Byte 1 = Right motor speed -128-127(byte)
// Byte 2 = Servo angle for ears 0-180(byte)
// Byte 3 = Servo angle for the tail 0-180(byte)
// Byte 4 = Eye brightness (0-1 -> scaled to 0-255)
// Bytes 5-12 = Array for left eye (8 bytes)
// Bytes 13-20 = Array for right eye (8 bytes)
// ----------------------------------------------------------
if (Serial.readBytes(buffer, SERIAL_PACKET_SIZE) == SERIAL_PACKET_SIZE) { // Unpack message data
int receivedLeftSpeed = (int)buffer[0];
int receivedRightSpeed = (int)buffer[1];
int leftSpeed = (receivedLeftSpeed - 128)*2; // Convert unsigned 0-255 to signed -255-255
int rightSpeed = (receivedRightSpeed - 128)*2; // Convert unsigned 0-255 to signed -255-255
int receivedEar = (int)buffer[2];
int receivedTail = (int)buffer[3];
int receivedBrightness = (int)buffer[4];
for (int i = 0; i < 8; i++) { // Unpack array for left eye
leftArray[i] = buffer[5 + i];
}
for (int i = 0; i < 8; i++) { // Unpack array for right eye
rightArray[i] = buffer[13 + i];
}
setDisplays(receivedBrightness, leftArray, rightArray); // Set displays based on arrays unpacked above
setMotors(leftSpeed, rightSpeed); // Set motors to speeds unpacked above
if (l_receivedEar != receivedEar) { // Set ear angles
setEars(receivedEar);
}
if (l_receivedTail != receivedTail) { // Set tail angle
setTail(receivedTail);
}
}
}
}
A full system diagram of the Fox-Bot.
A software diagram of the Fox-Bot.
This is a diagram of the structure of the code at a high level, showing the most important classes and functions and how they connect.