Exploring interactions

Introduction

A new medium?

I wanted to experiment with creating a new medium that has no tactile interaction, so I created the experience of drawing without touching anything. It mimics what drawing would be, has a physical output, and the only tool is your hand. I am testing how people react and adapt to this. It uses computer vision in Python and an Arduino for the matrix; all of the code is written by me.

Drawing mode

Drawing uses the most dominant finger, the pointer. Raj R, Marquis C. Finger dominance. 1999.

Erasing mode

Erasing uses the two most dominant fingers, the middle + pointer. Raj R, Marquis C. Finger dominance. 1999.
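
Telling the two modes apart comes down to which fingers are raised. Below is a rough sketch of one way to read that from MediaPipe hand landmarks; the tip-above-joint test and the exact landmark indices are my assumptions, not necessarily the project's check.

# Hypothetical mode check from MediaPipe hand landmarks
# (8 = index tip, 6 = index PIP joint, 12 = middle tip, 10 = middle PIP joint).
def finger_up(landmarks, tip, pip):
    # Image y grows downward, so a raised finger has its tip above the joint.
    return landmarks[tip].y < landmarks[pip].y

def current_mode(landmarks):
    index_up = finger_up(landmarks, 8, 6)
    middle_up = finger_up(landmarks, 12, 10)
    if index_up and middle_up:
        return "erase"   # pointer + middle raised: erasing mode
    if index_up:
        return "draw"    # pointer alone raised: drawing mode
    return "idle"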

Code breakdown

Serial connection.

By sending $1, or any value $x, the specific LED tied to that value turns on with my method. This confirmed that the Arduino was ready to receive serial data.
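
A minimal sketch of that serial test, assuming cvzone's SerialModule on the Python side (the arduino.sendData([1]) call shown later matches its SerialObject API); this is my reading of the setup, not necessarily the exact script.

# Python-side serial test (assumes cvzone's SerialModule).
from cvzone.SerialModule import SerialObject

# With default settings the object tries to auto-detect the Arduino port and
# sends each value as "$" plus the digits, so sendData([1]) goes out as "$1".
arduino = SerialObject()
arduino.sendData([1])  # lights the LED mapped to value 1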

Python

The "Tool."

To read the landmarks of each finger, I used Google's MediaPipe to develop a hand tracking script. After that, I used OpenCV to establish a grid roughly mirroring the 8x8 matrix. This gave each "pixel" in the camera grid a unique location tied to the matrix. Once a landmark passed into one of those locations, data was sent to the Arduino. This works for both drawing and erasing.

if y1 < 90:                    # looking for the Y coordinate of this cell
    if 1 < x1 < 150:           # looking for the X coordinate of this cell
        arduino.sendData([1])  # send the data for this LED
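
The snippet above hard-codes the coordinate range for one cell. For comparison, here is a rough sketch of a more general mapping from the index fingertip to a cell index using raw MediaPipe Hands; the grid size matches the 8x8 matrix, but the structure, thresholds, and the commented send call are my assumptions rather than the project's exact script.

import cv2
import mediapipe as mp

GRID = 8  # matches the 8x8 LED matrix
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip, normalized coords
        col = min(int(tip.x * GRID), GRID - 1)             # which grid column the tip is in
        row = min(int(tip.y * GRID), GRID - 1)             # which grid row the tip is in
        cell = row * GRID + col                            # unique cell id, 0..63
        # arduino.sendData([cell + 1])  # one value per LED, in the spirit of the $x scheme
    cv2.imshow("grid", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()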

Arduino

The "Medium."

Once the Arduino receives the data from Python, it needs to turn on the LED associated with the cell of the camera grid the landmark passed through.

void loop() {
  int pyval;                                        // declaration
  serialData.Get(valsRec);                          // value from the Python code
  pyval = valsRec[0];                               // take the data out of the buffer
  if (pyval == 1) { lc.setLed(0, 0, 0, true); }     // if the data received is 1, turn on this LED
}

For turning off:

if (pyval == 100) { lc.setLed(0, 0, 0, false); }    // if the data received is 100, turn off this LED
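
For context, the pieces the loop relies on live outside it. Here is a minimal sketch of that surrounding setup, assuming the cvzone Arduino library (which provides SerialData and valsRec) and the LedControl library (which provides lc); the pin numbers and digits-per-value count are placeholders and must match the Python sender.

#include <cvzone.h>      // SerialData, for receiving "$x" packets from Python
#include <LedControl.h>  // driver for the MAX7219 8x8 LED matrix

SerialData serialData(1, 3);               // expect 1 value per packet, up to 3 digits (assumed)
int valsRec[1];                            // buffer that serialData.Get() fills
LedControl lc = LedControl(12, 11, 10, 1); // DIN, CLK, CS pins (placeholders), one matrix

void setup() {
  serialData.begin();      // open the serial link to the Python script
  lc.shutdown(0, false);   // wake the MAX7219 from power-saving mode
  lc.setIntensity(0, 8);   // medium brightness
  lc.clearDisplay(0);      // start with every LED off
}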

Field testing

What's next?

When I observed a new artist's initial interaction, how quickly they learned to control the LEDs was interesting. There was no guidance on where the fingers should go in the air in relation to the matrix, but it seems like none is necessary. I was happy to see that people could pick up the task easily. Control comes quickly, and I'd like to explore that further.

This project taught me how to pass serial data between different programming languages and have it reflected physically. I think it is a good step toward thinking about how humans might use gestural interaction and have it reflected in the real world, which seems to be an up-and-coming trend.

I don't want to stop here. I am interested in exploring a 16x16 matrix, and I can do so because the script I made is modular: I can take input from a finger, turn it into data, and use that data to control whatever I want. I also want to experiment with having sound play when the G.E.M. is in use. More testing will show me where people get caught up and where the strengths are. This was a great experiment; I failed a lot, and that only allowed me to learn more.

Designed by Justin Catalano

DuneSystems

2024 ©