Making a Controller for My P5 Game

(Note: all the code for these projects can be viewed on my Github here, though I will make specific references to scripts below.)

One of my P5 projects that I am the happiest with is an interactive poetry/anti-game game that involves corralling a ball with the cursor to reveal a narrative. I decided I wanted to replace the mouse and cursor with a joystick, and that I wanted to use node, express, and a websocket to interface between P5 and the arduino.

I started out by adapting the SerialInOut example code for Arduino to work with a joystick, since the example uses a potentiometer and a joystick is just two potentiometers and a switch. With those changes I wrote this code for outputting the joystick readings to serial. I had some trouble wiring the joystick initially, but Aiden helped me out. I also read on a now-forgotten Stack Exchange that INPUT_PULLUP is necessary when setting the pinMode on digital ins that are going over serial, but I didn’t understand why that is the case.

Before trying to figure out how to integrate the serial readings into my game/poem script I connected it to a very quick example here which you can see in the first gif above. I’d used node and websockets before but I didn’t really remember them very well so it took me a pretty long time to get off the ground. I made use of some code that I had from Mark Hellar from when I was at Grey Area, which is why I ended up using express.
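At its core the bridge just reads lines from the serial port in Node and forwards them to the browser over the websocket. As a sketch, the parsing step might look like this, assuming a comma-separated “x,y,switch” message format (the format and names here are illustrative, not necessarily what my actual scripts use):

```javascript
// Hypothetical parser for the lines the Arduino prints over serial,
// assuming a "x,y,switch" format like "512,498,1".
function parseJoystickLine(line) {
  const [x, y, sw] = line.trim().split(',').map(Number);
  return {
    x,                  // raw analog reading, 0-1023
    y,                  // raw analog reading, 0-1023
    pressed: sw === 0,  // with INPUT_PULLUP the switch reads LOW (0) when pressed
  };
}
```

In the server, each parsed reading would then be serialized and pushed to the p5 page over the websocket, e.g. something like `ws.send(JSON.stringify(reading))`.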

Once I had the example working I went about redoing my P5 project to take the serial input. It didn’t take very long; I just replaced the mouseX and mousePressed variables with the respective inputs from the joystick. I intentionally made it so that you can’t see the player’s “cursor”: I wanted the relationship between what the player is doing and the motion of the ball to be discovered rather than immediately clear (so that you have to play around with the joystick before you figure out what is going on).
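The swap amounts to rescaling the raw analog readings (0–1023) onto canvas coordinates. A sketch of that mapping in plain JavaScript (the canvas size and function names are placeholders; in P5 itself the built-in map() does this rescaling):

```javascript
// Rescale a value from one range to another (same behavior as p5's map()).
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Placeholder canvas dimensions for illustration.
const CANVAS_W = 600;
const CANVAS_H = 400;

// Convert raw 10-bit joystick readings into canvas coordinates,
// standing in for what mouseX/mouseY used to provide.
function joystickToCanvas(joyX, joyY) {
  return {
    x: map(joyX, 0, 1023, 0, CANVAS_W),
    y: map(joyY, 0, 1023, 0, CANVAS_H),
  };
}
```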

Business Card

Side A of the business card:

I really liked the hands and eyes that I had going in my show poster, so I decided that I wanted the same kind of thing for my business card. I wasn’t really sure what to put for my “title” so that is just a placeholder for now. The font I used was “Gill Sans” and I took my Twitter and Github icons from their media packages.

Side B of the business card:

Circle Game

I wanted to make a game-as-interactive-poetry without clear goals but with player progress still mediated by “points”.

“Points” are displayed constantly in the upper left corner. Doing nothing makes the points go down; clicking and holding the circle in the center makes them go up. I was thinking of “cookie clicker” type games (like the currently buzzing paper clip AI game) when I made this, and how much I hate them for creating uninteresting artificial rewards that my brain still loves.
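The scoring loop is simple enough to sketch (the rates here are invented for illustration, not the exact values in my code):

```javascript
// Circle Game scoring rule: points drain while the player does nothing
// and climb while they click and hold the circle. Rates are made up.
const DECAY_PER_FRAME = 1;
const GAIN_PER_FRAME = 3;

function updatePoints(points, holdingCircle) {
  const next = holdingCircle ? points + GAIN_PER_FRAME : points - DECAY_PER_FRAME;
  return Math.max(0, next); // don't let the score go negative
}
```

Called once per draw() frame, this gives the constant gentle pressure that makes the pointless points feel like they matter.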

Code is on my github here.

ITP Winter Show Card

Initially I wanted to make something inspired by Tadanori Yokoo. I spent a lot of time futzing around with different elements, but I ultimately decided that what I liked best was the simplest: a little line drawing reaching out to the show information, giving a sense of playfulness. I really like this color contrast, but I think it might be a little distracting, so I put a lower-contrast version at the bottom of the post.



I had some trouble drawing hands, but even once I got the hang of more realistic hands, I thought cartoonish ones fit the aesthetic better.

Adding more face elements kind of got ugly.

Here it is with fewer color elements.

You Suck

Katya and I started out just sitting next to each other, noodling around and sharing what we were doing. I ended up making this moon pattern with a surprise:

You can see the code for that here.

Katya then had the idea to make something that prompted the user to write on the screen, but whatever they tried to write would just spell out “You Suck”. I thought that was pretty funny, but we weren’t sure how to get the actual drawing mechanism to work, i.e. how to make it look like what was being printed onto the screen was happening in real time, but only when the user clicked and moved. We started doing research and thinking about it, and eventually we came up with this (click through for interactivity):
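One way to fake that real-time writing effect is to pre-plot the phrase as a list of points and reveal them in proportion to how far the user has dragged, so any mouse movement “writes” the next bit of “You Suck”. A simplified sketch of that idea (the point list and the pixels-per-point rate are invented, not our actual values):

```javascript
// The phrase is stored as a pre-plotted list of stroke points; each
// chunk of accumulated drag distance reveals one more point, so the
// user appears to be writing but always produces the same text.
const PHRASE_POINTS = [
  { x: 10, y: 20 }, { x: 12, y: 24 }, { x: 14, y: 28 }, // ...and so on
];
const PIXELS_PER_POINT = 5; // drag distance needed to reveal one point

function pointsRevealed(totalDragDistance) {
  const n = Math.floor(totalDragDistance / PIXELS_PER_POINT);
  return Math.min(n, PHRASE_POINTS.length);
}
```

Each frame, the sketch would draw the first `pointsRevealed(distanceDraggedSoFar)` points of the phrase, and nothing moves unless the user clicks and drags.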

You can see the code here.

Breaking A Servo?

I wanted to make a project that would display a dubiously motivational message each time I sat at my home office desk. To start off I used a HC-SR04 ping distance sensor and attached a pin with a message (in this case a pin I got at the art book fair) to a little SG90 micro servo motor. I initially tried to map the distance readings of the HC-SR04 to the angle of the servo motor, but it was too noisy and the servo jittered around quickly (you can see that code commented out below).
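One common way to tame that kind of jitter is to smooth the readings before mapping them to an angle, for instance with an exponential moving average. A sketch in JavaScript for clarity (the same few lines port straight to an Arduino sketch; the ALPHA value is invented, and lower values smooth more but respond slower):

```javascript
// Exponential smoothing for noisy analog readings: each new reading
// only nudges the smoothed value part of the way toward itself.
const ALPHA = 0.2;

function makeSmoother(initial = 0) {
  let smoothed = initial;
  return function smooth(reading) {
    smoothed = ALPHA * reading + (1 - ALPHA) * smoothed;
    return smoothed;
  };
}
```

Feeding the smoothed distance (rather than the raw one) into the map() call would likely have kept the servo from twitching on every noisy sample.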

After that I tried running it as an if/then statement, where the servo would change positions if the distance sensor recorded a short distance. However, the servo didn’t change position even as I watched the distance go below the threshold on the serial monitor. Instead it stayed in one place, making a really awful noise and getting hotter until it stopped moving entirely. Afterwards it wouldn’t move at all, even for simple programs like the servo sweep example, leading me to think it was broken. However, after it had cooled down I tried again and it worked fine, just still not for the application I wanted.

You can see the code here:

#include <Servo.h>

Servo myServo;

const int trigPin = 13;
const int echoPin = 12;
const int servoPin = 6;

long duration;
int distance;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  myServo.attach(servoPin);
}

void loop() {
  // trigger the HC-SR04 with a 10 microsecond pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // time the echo and convert to centimeters
  // (sound travels ~0.034 cm per microsecond; divide by 2 for the round trip)
  duration = pulseIn(echoPin, HIGH);
  distance = duration * 0.034 / 2;
  Serial.print("Distance: ");
  Serial.println(distance);

  // snap the servo to a new position when something is within 10 cm
  if (distance < 10) {
    myServo.write(179);
  } else {
    myServo.write(0);
  }

  // the original attempt, mapping distance directly to the angle, was too noisy:
  // int servoAngle = map(distance, 0, 30, 0, 179);
  // myServo.write(servoAngle);

  delay(100);
}


Infomercial Brainstorming and Storyboard

Initial Ideas

As you can see from the above picture, Amitabh, Martin, and I had a few different ideas, but we settled on a device that takes the way online human communication is mediated by AI algorithms and replicates it in the real world. We debated for a while what form we wanted this to take: either a consumer-level device that does facial recognition for emotion, or an AI that determines political leanings. We tried to think of some funny scenarios in which this kind of device would be marketable but subtly ominous. I was reminded of a critical design piece created by Lark VCR, an artist I met at a residency.

A first draft of the storyboard:


In the end we decided on an emotion detector device. The infomercial is framed by a spokesperson demonstrating the effectiveness of the device – [Empathy Lense, working title] – by walking the viewer through several scenarios in which it is applicable. Each scenario will feature a before and after scene with voiceover by the spokesperson. No audio from the scenarios themselves will be audible; they will all be narrated by the spokesperson, and the acting will be heavily exaggerated. The infomercial ends with several banal user testimonials.