Laser Cat Toy

Project Overview

For a graduate level course entitled Prototyping Interactive Interfaces II, I developed and refined the skills necessary to prototype interactive systems comprising both hardware and software components. The semester's goal was to build a network-connected hardware system, also referred to as an IoT (Internet of Things) device.

I decided to design and build a network-controlled laser toy to keep my two cats occupied while my partner and I are away from home. The device was built using an Arduino UNO WiFi Rev2, a Reolink E1 Pro camera, as well as a laser diode and mini pan-tilt kit from Adafruit.

This writeup intends to serve as documentation of my design process as well as a tutorial on building a telepresence cat toy.

Role

Solo Software Developer/Hardware Engineer

Topics

Physical Interface, Internet of Things, Telepresence

Timeline

10 weeks

Methods

General research, software development, hardware build, prototyping

Tools

Arduino, p5.js, Adafruit IO, Photoshop, laser diode, micro servo motors

Getting Started

As someone brand new to the world of networking and physical computing, there was a lot I had to learn before diving into a build. We spent the first several weeks of class on readings and assignments focused on topics such as data markup formats, the I2C communication protocol, smart home sensors, and MQTT versus RESTful schemas.

After developing a foundational understanding of IoT devices and how they work, I began ideating about my project. I knew right away that I wanted to create something engaging for my cats, pictured below.
Two beautiful cats laying on a gray carpet gazing at the camera.
Indigo Mountain's Majesty AKA Little Girl AKA Osita (left) and Sapphire Gizelle (right).

Project Inspiration

First, I would like to thank the people at Adafruit for writing and bringing together such an amazing collection of resources. I relied heavily on Adafruit for code references and project inspiration, purchased the majority of the hardware from them, and even used their Adafruit IO server to interface with my device.

I would also like to thank Tony DiCola and M. LeBlanc Williams for their innovative projects which inspired my own.

I began looking around online for IoT pet toys that other enterprising individuals have created and came across two unique remote-control laser toy projects.
Tony DiCola's project involved a Raspberry Pi hooked up to a network camera, two micro servos, and a laser diode. His interface moves the laser to the desired location when you click on the video stream.
M. LeBlanc Williams's project cleverly employs a Wii Nunchuck connected to a microcontroller to control a laser rigged to Adafruit's mini pan-tilt kit.

I lightly drew from each of these projects for my own. I got the idea of using a laser diode and pan-tilt kit from the nunchuck build, and I originally intended to hook up a camera to my microcontroller like Tony did. However, I quickly realized that the Arduino board I chose would not have the processing power to support video streaming.

Hardware Build

With all that in mind, I identified the major hardware components I would need for my build: an Arduino WiFi microcontroller, a laser diode, and a mini pan-tilt kit, which includes two micro servos.

The Circuit

With my professor's help, I created an early draft of the circuit diagram using FigJam.
First, we power the Arduino with a 2.4 amp wall adapter plugged into its USB-B port.

The mini pan-tilt kit consists of two micro servos, one that moves horizontally and one vertically, and some plastic assembly. The servo hookup is pretty straightforward: each is wired to 5V power, ground, and a digital control pin on the Arduino. I chose pin 3 for the horizontal and 6 for vertical.

The laser diode circuit is a little more complicated. It gets 5V power straight from the microcontroller, but the ground has to go through a few more hoops. We need a transistor in the circuit to control the power that flows through. Simply put, the transistor will make it so that we can precisely control when the diode turns on and off.

The transistor I used, an IRF520 MOSFET, has three pins: gate, drain, and source. Ground from the board goes to the source, and ground from the diode goes to the drain. Then we run a wire from another digital pin of the Arduino (I chose 12) to the gate, which is also tied to ground through a 10k resistor.
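Before bringing the cloud into the picture, it can help to sanity-check the wiring with a quick standalone sketch. The snippet below is just such a smoke test, not part of the final firmware, and only assumes the pin choices above: 3 and 6 for the servos, 12 for the transistor gate.

#include <Servo.h>

const byte laserPin = 12;   // digital pin driving the IRF520 gate
const byte xServoPin = 3;   // horizontal servo signal
const byte yServoPin = 6;   // vertical servo signal

Servo xServo;
Servo yServo;

void setup() {
  pinMode(laserPin, OUTPUT);
  xServo.attach(xServoPin);
  yServo.attach(yServoPin);
}

void loop() {
  digitalWrite(laserPin, HIGH);              // laser on while sweeping
  for (int angle = 0; angle <= 90; angle += 10) {
    xServo.write(angle);                     // sweep the horizontal servo
    yServo.write(angle / 2);                 // sweep the vertical servo over a smaller range
    delay(300);
  }
  digitalWrite(laserPin, LOW);               // laser off, pause, then repeat
  delay(2000);
}

If the laser blinks and both servos sweep, the circuit is wired correctly.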

Below is a picture of the assembled circuit:

The Build

A note on the camera: I knew that a camera was an essential component of my project, as I aspire to operate the toy from far away and therefore need to see how the cats interact with it. However, the Arduino UNO simply does not have the processing power to run a live video stream. Therefore, for this prototype, I purchased a network security camera which will be viewed and operated separately from the control interface of the toy. I opted for the Reolink E1 Pro, but there are plenty of great WiFi cameras available on the market.

All that aside, now that the circuit was assembled, it was time to start putting together my build. This prototype build is brought to you courtesy of cardboard packaging, duct tape, and prayers.
Here's my first pass at the build. The mini pan-tilt kit is on a slightly elevated platform affixed to a piece of cardboard packaging with push pins.
Next I added in the laser pointer, very securely attached to the pan-tilt kit with duct tape, and also the camera on a lower platform.
Here I slightly switched the orientation so that the wires of the laser and pan-tilt kit don't have to reach quite as far. I also elevated the camera and secured the build further with more duct tape.

I also played around with incorporating a passive infrared (PIR) motion sensor and/or a microphone to notify me when the cats were moving nearby, but neither of the ones I have for Arduino were sensitive enough to pick up the necessary stimuli.
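For reference, reading a PIR sensor on an Arduino is simple; a minimal sketch looks something like the following (this snippet is not part of the final build, and pin 2 is just an illustrative choice):

const byte pirPin = 2;  // hypothetical digital pin wired to the PIR output

void setup() {
  pinMode(pirPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(pirPin) == HIGH) {  // output goes HIGH when motion is detected
    Serial.println("motion detected");
  }
  delay(200);
}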

Arduino Code

Now that we have the circuit and build all set, it's time to start on the software side. The first task is to connect the Arduino to the cloud. For this project, I've chosen to connect to the cloud services of Adafruit IO.

Connecting to the Cloud

Let's take a look at the code to get us connected and online. All of the following takes place in the Arduino IDE.
#define AIO_USERNAME "YOUR AIO USERNAME"
#define AIO_KEY "YOUR AIO KEY"
#define WIFI_SSID "YOUR WIFI NAME"
#define WIFI_PASS "YOUR WIFI PASSWORD"
#define USE_AIRLIFT // required for Arduino Uno WiFi R2 board compatibility
#define AIO_LASER_FEED "laser"
#define AIO_XSERVO_FEED "xservo"
#define AIO_YSERVO_FEED "yservo"

// Libraries
#include <AdafruitIO_WiFi.h>
#include <Servo.h>

// Pin Mapping
const byte laserPin = 12;
const byte xServoPin = 3;
const byte yServoPin = 6;

//define servos
Servo xServo;
Servo yServo;
int xServoAngle = 0;
int yServoAngle = 0;

bool laserOn = false;

// Constructors
AdafruitIO_WiFi aio(AIO_USERNAME, AIO_KEY, WIFI_SSID, WIFI_PASS, SPIWIFI_SS, SPIWIFI_ACK, SPIWIFI_RESET, NINA_GPIO0, &SPI);
AdafruitIO_Feed *laserFeed = aio.feed(AIO_LASER_FEED);
AdafruitIO_Feed *xServoFeed = aio.feed(AIO_XSERVO_FEED);
AdafruitIO_Feed *yServoFeed = aio.feed(AIO_YSERVO_FEED);
If this is your first time using Adafruit IO, I would highly recommend following this tutorial to get started on the platform. It will walk you through how to set up your account and your feeds which populate with data from the microcontroller.

The first lines define what you need to connect to the cloud: your Adafruit IO username and key, and your WiFi credentials. After that, I define the three feeds I previously set up on Adafruit IO, which will enable me to push and receive data for my laser, horizontal servo, and vertical servo.

Next we import the libraries for the Adafruit IO and servo connections, and define which pins we wired up to control the laser and servos. We also declare a few global variables to use later on. Following that are some constructors necessary to initiate our connection with the Adafruit server and feeds.

Setup Function

void setup() {
  Serial.begin(9600);
  while(!Serial); // wait for serial connection

  // Adafruit IO connection and configuration
  Serial.print("Connecting to Adafruit IO");
  aio.connect(); // connect to Adafruit IO service
  while(aio.status() < AIO_CONNECTED) {
    Serial.print(".");
    delay(1000);
  }

  // set up Arduino outputs
  pinMode(laserPin, OUTPUT);
  xServo.attach(xServoPin);
  yServo.attach(yServoPin);

  Serial.println(aio.statusText()); // print AIO connection status

  laserFeed->onMessage(laserMessageHandler);
  laserFeed->get();

  xServoFeed->onMessage(xServoHandler);
  xServoFeed->get();

  yServoFeed->onMessage(yServoHandler);
  yServoFeed->get();
}
Next, in setup() we start the Serial monitor and open the connection to Adafruit IO. This code prints a period for each second it takes to establish that connection. Then we get the Arduino set up by indicating the appropriate pins for the laser and servos.

Following that, we set up message handlers for each of our 3 feeds, which will call a specific function each time a new message is received for the associated feed. The get() function requests an update for each of those feeds and then calls the handler function.

Loop Function

void loop() {
  aio.run();  // keep client connected to AIO service

  if(laserOn){
    xServo.write(xServoAngle + random(-2,2));
    yServo.write(yServoAngle);
  }
  delay(100);
}
The loop function is quite slim. On each iteration, we execute aio.run(), which keeps us connected to the Adafruit IO service. Then we write the servo angles to the appropriate servos. We only move the servos if the laser is on, which acts as a proxy power control for the whole system. These servo values are filled in through the handler functions. For the horizontal servo, I add a small random offset with random(-2,2) (which in Arduino yields -2 through 1) to create a slight jitter in the movement of the laser and keep the cats engaged.

Handler Functions

void laserMessageHandler(AdafruitIO_Data *data) {
  String value = data->toString(); // capture feed value from AIO
  Serial.print("laser feed received -> "); Serial.println(value);

  if(value == "ON"){
    laserOn = true;
  }
  else{
    laserOn = false;
  }
  digitalWrite(laserPin, value == "ON" ? HIGH : LOW);
}

void xServoHandler(AdafruitIO_Data *data) {
  int value = data->toInt();  // capture feed value from AIO
  Serial.print("xservo feed received -> ");  Serial.println(value);
  xServoAngle = value;
}

void yServoHandler(AdafruitIO_Data *data) {
  int value = data->toInt();  // capture feed value from AIO
  Serial.print("yservo feed received -> ");  Serial.println(value);
  yServoAngle = value;
}
As mentioned before, these functions are called each time there is an update to the associated Adafruit IO feed. Each function follows the same format. First it takes the data received from the cloud feed and stores it in a variable called value. Then it prints it to the Serial monitor so we can confirm values are being received correctly. Finally, it executes an action using that new value.

For the laser, the handler writes the received state directly to the pin: if the feed value is ON, the laser turns on; otherwise it turns off. (The toggling itself happens on the front end, which posts the opposite of the current value whenever the power button is pressed.) For the servos, the handlers simply store the received values in the appropriate variables, and the servos move based on those values on each iteration of the loop.

That's it for the Arduino side! Load this code onto your Arduino and let's continue.

Front-End Code: p5.js

After getting the Arduino up and connected to the cloud, we can start working on the front-end interface in p5.js, a JavaScript library for creative coding and interactive experiences. First, be sure to include the following line in your HTML file to import the p5.js library:
<script src="https://cdn.jsdelivr.net/npm/p5@1.4.0/lib/p5.min.js"></script>

Variable Declarations

let IO_USERNAME = "YOUR AIO USERNAME";
let IO_KEY = "YOUR AIO KEY";
let allfeeds;

let laser_value;

let xServoMax = 180;
let yServoMax = 72;

let xservo_value;
let xServoAngle = 0;

let yservo_value;
let yservo_updated;
let yServoAngle = 0;

let lastCheckedTime = 0;

let pwrImg;
let pwrHover = false;
let pwrX = 150;
let pwrY = 15;
let pwrD = 100;

let joystick_ballImg;
let joystick_ballX = 146;
let joystick_ballY = 276;
let joystick_ballD = 110;

let joystick_backImg;
let joystick_backX = 50;
let joystick_backY = 180;
let joystick_backD = 300;

let joystickHover = false;
let joystickPressed = false;
let joystick_xoffset;
let joystick_yoffset;

let joystick_dragMinX = 110;
let joystick_dragMaxX = 190;
let joystick_dragMinY = 210;
let joystick_dragMaxY = 310;

let textX = 150;
let textY = 500;
let errorBool = false;
There are quite a few lines here, but all we're doing is declaring and initializing all of the global variables we'll be using later on. Some of the notable hard-coded values here are the horizontal and vertical servo max, which indicate how much the servos will actually be moving. Both servos can hypothetically move from 0 to 180 degrees, but from my testing I determined a max of 180 for horizontal and 72 for vertical to be ideal.

The rest of these values are the pixel locations at which we'll place the visual elements on the canvas, and the bounds we use to constrain the movement of the joystick.

Preload and Setup Functions

function preload(){
  pwrImg = loadImage('YOUR POWER IMAGE FILE PATH');
  joystick_ballImg = loadImage('YOUR BALL IMAGE FILE PATH');
  joystick_backImg = loadImage('YOUR JOYSTICK IMAGE FILE PATH');
}

function setup() {
  createCanvas(400, 600);
  frameRate(30);
  noStroke();

  pwrImg.resize(pwrD, pwrD);
  joystick_backImg.resize(joystick_backD, joystick_backD);
  joystick_ballImg.resize(joystick_ballD, joystick_ballD);

  getData();
}
In preload, we load in the images that will make up our control interface. These can just be native p5 circles, but I decided to lightly design my interface in Photoshop. If anyone is interested in using these images, please feel free to contact me.

In setup, we create the p5 canvas, which I sized to match my phone's screen. Then we set the frame rate to 30. This is an important step, because Adafruit IO limits server requests to 30 per minute, so we want to limit the number of times the data refreshes. This throttling creates a difficult challenge, which I'll address further later on.

Beyond that, we just need to resize the interface images to the desired dimensions. In setup I also call the getData function once to populate the feed values right off the bat.

Draw Function

function draw() {
  clear();
  checkButtons();

  //update every second
  if(millis() > lastCheckedTime + 1000){
    getData();
    lastCheckedTime = millis();
  }

  if(!allfeeds){
    return;
  }

  //populate values from IO feed
  laser_value = allfeeds.feeds[0].last_value;
  xservo_value = allfeeds.feeds[1].last_value;
  yservo_value = allfeeds.feeds[2].last_value;

  //set color for power button
  if(laser_value == "OFF"){
    tint(255, 0, 0);
  }
  else{
    tint(0, 255, 0);
  }

  //draw images for buttons
  image(pwrImg, pwrX, pwrY);

  noTint();
  image(joystick_backImg, joystick_backX, joystick_backY);
  image(joystick_ballImg, joystick_ballX, joystick_ballY);

  //draw error text
  if(errorBool){
    fill(255, 0, 0);
    textSize(14);
    text("too many requests", textX, textY);
  }
}
At the start of each loop, we want to clear the canvas to make sure the drag events operate properly. Then we call the checkButtons function to see if we're hovering over anything. Next, we want to update the data values every second to make sure our data still matches what's on the cloud. We also have a quick check here which will end the loop if the data hasn't populated yet.

After receiving the data through getData, we move that data into the appropriate variables of the laser and servos. Next we set the color of the power button based on the status of the laser - naturally, green is on and red is off. We draw all the interface elements on the screen.

Finally, I added error text which will show up if the data requests are being throttled (which happens quite frequently in testing, given the limit of 30 requests per minute).

Mouse Interactions

function buttonHover(x, y, width, height){
  if(mouseX >= x && mouseX <= x + width && mouseY >= y && mouseY <= y + height){
    return true;
  }
  else{
    return false;
  }
}

function checkButtons(){
  //check if mouse is hovering over any buttons
  pwrHover = buttonHover(pwrX, pwrY, pwrD, pwrD);
  joystickHover = buttonHover(joystick_ballX, joystick_ballY, joystick_ballD, joystick_ballD);
}

function mousePressed(){
  let data;

  //process laser logic
  if(pwrHover){
    if(laser_value == "ON"){
      data = {"value":"OFF"};
    }
    else{
      data = {"value":"ON"};
    }
    postData(data, "laser");
  }

  joystickPressed = joystickHover;
  joystick_xoffset = mouseX - joystick_ballX;
  joystick_yoffset = mouseY - joystick_ballY;
}

function mouseDragged(){
  //drag interaction for joystick
  let xOffset = mouseX - joystick_xoffset;
  let yOffset = mouseY - joystick_yoffset;

  if(joystickPressed){
    //constrain joystick movement within x radius
    if(xOffset < joystick_dragMinX){
      xOffset = joystick_dragMinX;
    }
    else if(xOffset > joystick_dragMaxX){
      xOffset = joystick_dragMaxX;
    }

    //constrain joystick movement within y radius
    if(yOffset < joystick_dragMinY){
      yOffset = joystick_dragMinY;
    }
    else if(yOffset > joystick_dragMaxY){
      yOffset = joystick_dragMaxY;
    }

    joystick_ballX = xOffset;
    joystick_ballY = yOffset;

    //translate into 0 - 180 range with intervals of 4
    xServoAngle = map(joystick_ballX, joystick_dragMinX, joystick_dragMaxX, xServoMax, 0);
    xServoAngle = Math.ceil(xServoAngle/4) * 4;

    //translate into 0 - 72 range with intervals of 3
    yServoAngle = map(joystick_ballY, joystick_dragMinY, joystick_dragMaxY, yServoMax, 0);
    yServoAngle = Math.ceil(yServoAngle/3) * 3;
  }
}

function mouseReleased(){
  joystickPressed = false;

  //send values to servo
  if(xServoAngle != xservo_value){
    console.log("X Servo Angle: " + xServoAngle);
    postData({"value":xServoAngle}, "xservo");
  }

  if(yServoAngle != yservo_value){
    console.log("Y Servo Angle: " + yServoAngle);
    postData({"value":yServoAngle}, "yservo");
  }
}
The bulk of the front-end code revolves around translating mouse movements on the screen into actions to send to the Arduino. First, buttonHover checks whether the mouse is currently over either of the interface elements (the power button or the joystick).

Next, we define mousePressed, a built-in function called any time the mouse is clicked. In there we have some logic to check if the mouse is pressed while over the power button; if so, we send data to the cloud telling it to switch the status of the laser.

The following logic in mousePressed and mouseDragged serves to track the user's movement while clicking on the joystick and to constrain that movement within a certain radius. All of that logic resolves into the variables joystick_ballX and joystick_ballY, which store where the joystick ball currently sits within its allowed range.

We then map these variables into our defined range of servo movement. This means we take the furthest positions the user can move the joystick to and translate them into a range the servos understand. I also added a bit of logic to snap these values to intervals of 3 or 4 so the movement is more discrete and won't overwhelm the cloud with requests.

Finally, releasing the mouse is what prompts us to send the data to the cloud. Originally I was sending this request as the mouse was dragged, but the server limit was hit very quickly. So now, when the user stops dragging the joystick, the script checks whether the current cloud servo value already matches and, if not, posts the servo data to the appropriate feed.

Touch Interactions

function touchingButton(touch, x, y, d){
  if(touch.x >= x && touch.x <= x + d && touch.y >= y && touch.y <= y + d){
    return true;
  }
  else{
    return false;
  }
}

function touchStarted(){
  let touch = touches[0];
  let data;

  if(touchingButton(touch, pwrX, pwrY, pwrD)){
    if(laser_value == "ON"){
      data = {"value":"OFF"};
    }
    else{
      data = {"value":"ON"};
    }
    postData(data, "laser");
  }

  if(touchingButton(touch, joystick_ballX, joystick_ballY, joystick_ballD)){
    joystickPressed = true;
    joystick_xoffset = touch.x - joystick_ballX;
    joystick_yoffset = touch.y - joystick_ballY;
    console.log("joystick pressed");
  }
}

function touchMoved(){
  let touch = touches[0];

  //drag interaction for joystick
  let xOffset = touch.x - joystick_xoffset;
  let yOffset = touch.y - joystick_yoffset;

  if(joystickPressed){
    //constrain joystick movement within x radius
    if(xOffset < joystick_dragMinX){
      xOffset = joystick_dragMinX;
    }
    else if(xOffset > joystick_dragMaxX){
      xOffset = joystick_dragMaxX;
    }

    //constrain joystick movement within y radius
    if(yOffset < joystick_dragMinY){
      yOffset = joystick_dragMinY;
    }
    else if(yOffset > joystick_dragMaxY){
      yOffset = joystick_dragMaxY;
    }

    joystick_ballX = xOffset;
    joystick_ballY = yOffset;

    //translate into 0 - 180 range with intervals of 3
    xServoAngle = map(joystick_ballX, joystick_dragMinX, joystick_dragMaxX, xServoMax, 0);
    xServoAngle = Math.ceil(xServoAngle/3) * 3;

    //translate into 0 - 72 range with intervals of 3
    yServoAngle = map(joystick_ballY, joystick_dragMinY, joystick_dragMaxY, yServoMax, 0);
    yServoAngle = Math.ceil(yServoAngle/3) * 3;
  }
}

function touchEnded(){
  joystickPressed = false;

  //send values to servo
  if(xServoAngle != xservo_value){
    console.log("X Servo Angle: " + xServoAngle);
    postData({"value":xServoAngle}, "xservo");
  }

  if(yServoAngle != yservo_value){
    console.log("Y Servo Angle: " + yServoAngle);
    postData({"value":yServoAngle}, "yservo");
  }
}
Next, we make the joystick movement touch responsive with built-in functions touchStarted, touchMoved, and touchEnded.

Network Communication

function getData(){
  let feedurl = "https://io.adafruit.com/api/v2/YOUR USERNAME/YOUR FEED URL?x-aio-key=" + IO_KEY;
  httpGet(feedurl, false, function(response){
    allfeeds = JSON.parse(response);
  });
}

function postData(data, feedKey){
  let url = "https://io.adafruit.com/api/v2/YOUR USERNAME/feeds/" + feedKey + "/data?x-aio-key=" + IO_KEY;
  httpPost(url, data, function(data){}, function(response){
    if(response == "Error: [object ReadableStream]"){
      errorBool = true;
    }
  });
}
Last but certainly not least, we have the functions getData and postData to communicate with the cloud. getData sends a request to the server and comes back with a string of all the values in the cat toy group I created on Adafruit IO. This data is parsed as JSON and stored in the variable allfeeds.

postData sends the desired data as a request to whatever feed is passed into the function: laser, xservo, or yservo. If it returns with an error, the variable errorBool is set to true, which tells the draw function to display the text "too many requests." Again, depending on the user's joystick movement, this does happen occasionally.

User Interface Design

I opted for a very simple skeuomorphic-style interface to control the laser toy. I didn't want to overcomplicate the problem, and all I really needed was something to power the laser on and off, and something to control the servos in horizontal and vertical movements. I aimed for a sort of old-school video game control aesthetic.

Finalizing Build

Now that I had the Arduino side and the front-end coded and ready to go, I could put it all together and see how it worked. I began by testing if the laser was hooked up properly and receiving commands from the cloud.
Here the laser is up and running, and I was able to turn it on and off by using the power button in the p5.js interface.

Next I began testing each of the servos separately. The horizontal servo was working perfectly, and I was able to control its movement through the interface as well. However, once I tried to use the vertical servo, things began to go awry. The prototype would work for a couple of seconds and then stop listening to all commands.

I couldn't resolve this issue through code and at a certain point the vertical servo stopped running altogether, so I determined it to be a hardware problem.

Emergency Surgery

I had no choice but to try and take the pan-tilt kit apart so I could replace the vertical servo with another I had lying around.
This turned out to be less difficult than I anticipated, and I had an opportunity to use my teeny tiny screwdriver. Look how cute!

After replacing the servo, the prototype started working better, but I made sure to limit the maximum vertical angle to 72 degrees so as not to overload it or pinch any wires.

Creating a Housing

Next, I decided I needed a housing to protect the fragile electrical components from any prying cat paws. I found a transparent plastic container at the dollar store with nearly perfect dimensions and drilled a hole into the side so the power cable could sneak through.
Not even half an hour after relocating the build into its new housing, the kitties decided they needed a closer look at what was going on with all these moving parts. I was very grateful for my foresight at this moment.

I also decided the camera needed an extra layer of protection, so I added an extra tupperware to keep everything contained. Also, I'll admit the camera looked quite lonely duct-taped to the top of the container.
Here's a look at the final build with everything working together!

User Testing

As any good User Experience Designer knows, no project is complete without rounds of user testing. In this case, I had the opportunity to test with two real-life users: Indigo Mountain's Majesty (IMM) and Sapphire Gizelle.

User 1: IMM

The little kitty seems to be far more interested in cuddles from me than playing with the laser. How can we make this toy more engaging? Perhaps adding an element of sound?

User 2: Sapphire Gizelle

Sapphire seems to be pretty excited about the new toy. At first, he was a lot more interested in the sound and movement of the servos but now he's enjoying it as a regular laser toy. Success!

Unfortunately no other cats responded to my requests on LinkedIn, so we'll have to settle for the two user tests for now.

Future Steps

This project was a whole lot of fun despite the major challenge of working on it at the same time as my thesis. I'm satisfied with the final product but I didn't have quite as much time as I would have liked to work on it.

I have several ideas for future improvements and iterations, and I hope to continue fleshing out this prototype beyond graduate school.
Incorporate motion detection to notify when cats are nearby
Combine camera feed with interface control
Program automated mode in addition to manual control (a rough sketch of this idea follows below)
Find a way to bypass server throttling to allow more precise control
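For the automated mode, one rough, untested idea would be a helper that nudges the servo angles by a small random step each time through the Arduino loop, staying within the limits established earlier (0 to 180 horizontally, 0 to 72 vertically). The function name and step sizes below are purely illustrative, not part of the current firmware.

// Hypothetical automated mode: a slow random walk within the servo limits.
// This could be called from loop() in place of the manual angles while the laser is on.
void autoMove() {
  xServoAngle = constrain(xServoAngle + random(-10, 11), 0, 180);
  yServoAngle = constrain(yServoAngle + random(-5, 6), 0, 72);
  xServo.write(xServoAngle);
  yServo.write(yServoAngle);
}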
That's all for now! Thanks for taking the time to look through this case study. I hope you enjoyed reading it as much as I enjoyed making it.