For this week’s assignment, our class was tasked with integrating sound into a p5 animation. This was an interesting challenge, as I aimed to make a drum pad, usable on both desktop and touch-screen devices, that played different lyric sections of Daft Punk’s “Harder, Better, Faster, Stronger.”
I tried three different methods to reach my end result, accessible via the buttons below.
Users may interact with my final sketch on touch screens using the preview link.
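At its core, the pad is a lookup from an input (a key press or a touch zone) to a lyric clip. A minimal sketch of that lookup in plain JavaScript, with hypothetical key bindings and a 2×2 touch grid as assumptions — in the real p5 sketch each index would correspond to a loaded p5.SoundFile:

```javascript
// Hypothetical mapping from keyboard keys to lyric-clip indices.
// In the p5 sketch, each index would correspond to a loaded sound file.
const KEY_TO_CLIP = { a: 0, s: 1, d: 2, f: 3 };

// Resolve a desktop key press to a clip index, or null if unmapped.
function clipForKey(key) {
  const idx = KEY_TO_CLIP[key.toLowerCase()];
  return idx === undefined ? null : idx;
}

// Resolve a touch to a clip index by dividing the canvas into a
// 2x2 grid of pads (the grid layout is an assumption).
function clipForTouch(x, y, width, height) {
  const col = x < width / 2 ? 0 : 1;
  const row = y < height / 2 ? 0 : 1;
  return row * 2 + col;
}
```

Handling both `keyPressed()` and `touchStarted()` with one shared lookup is what keeps the sketch usable on desktop and touch screens alike.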
If I had more time, I would have added an introductory epilepsy-warning popup, with the song’s chorus playing quietly in the background. The drum pad would become accessible once the user clicked the warning, which would be superimposed over the helmet of one of Daft Punk’s members. I have stored all of these media files in my sketch and intend to add this feature in the near future.
This week our group re-evaluated our project concept based on user feedback from our previous class. Our users made two points clear:
1. The metaphorical connection between fireflies, fire, and personal/psychological connection was not entirely clear
2. When handed a jar, it is not universally intuitive to try to catch fireflies
Based on this feedback, we decided to refrain from integrating a Microsoft Kinect to track our users’ body motions. We made progress on our animation, figured out how to integrate our Muse into it, and took steps toward building our physical jar trigger.
This week, my group was tasked with developing a project description, a system diagram, a timeline, and a bill of materials in preparation for our playtesting.
As a group, we constructed our final narrative and drew out the necessary components provided in the images below:
Essentially, we will have three components linked to a p5 sketch – a NeuroSky EEG reader, a jar that can light up and vibrate, and an Xbox Kinect camera. Our animation will begin as a flame against a dark background. When our user twists off the top of the jar, the fire will extinguish and turn into smoke. Depending on our user feedback, the user will be prompted to focus on the smoke, which will turn into an avatar for the user as fireflies begin to emerge from the darkness. Using the jar, with focus registered by the NeuroSky, our user may capture up to 10 fireflies. For every firefly caught, the jar will become brighter and vibrate. Once the user has caught 10 fireflies, the animation will revert to a fire, and the user will reset the animation by replacing the cap on the jar.
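The narrative above amounts to a small state machine. A sketch of the transitions in plain JavaScript, with the event names (cap off/on, focus, catch) invented as assumptions — the real triggers would come from the jar’s cap sensor, the NeuroSky reading, and the Kinect:

```javascript
// Installation states, following the narrative: fire -> smoke -> avatar
// with fireflies -> all caught -> reset to fire.
function createInstallation() {
  return { state: "FIRE", fireflies: 0 };
}

// Event names are assumptions standing in for the physical triggers.
function handleEvent(inst, event) {
  switch (event) {
    case "CAP_OFF": // twisting off the cap extinguishes the fire
      if (inst.state === "FIRE") inst.state = "SMOKE";
      break;
    case "FOCUS": // focusing on the smoke turns it into the avatar
      if (inst.state === "SMOKE") inst.state = "AVATAR";
      break;
    case "CATCH": // each capture brightens and vibrates the jar
      if (inst.state === "AVATAR" && inst.fireflies < 10) {
        inst.fireflies += 1;
        if (inst.fireflies === 10) inst.state = "CAUGHT_ALL";
      }
      break;
    case "CAP_ON": // replacing the cap resets the animation to fire
      if (inst.state === "CAUGHT_ALL") {
        inst.state = "FIRE";
        inst.fireflies = 0;
      }
      break;
  }
  return inst;
}
```

Keeping the flow in one explicit state machine should make it easier to add or drop stages (such as the focus prompt) in response to playtesting feedback.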
We decided that we would determine our floor plan based on user feedback.
Our group has been working on dynamic timeline and bill-of-materials documents, which we have made accessible on this page.
The work documented in this post makes gratuitous use of the middle finger. Discretion advised, I guess.
With my partner, Sarah Liriano, I shot a stop-motion animation set to Oliver Nelson’s Reuben’s Rondo. In it, my character struggles with his anthropomorphized middle finger after being jostled by a colleague. Though the tone of the animation is light, I intended the piece as a physical representation of the struggles individuals with obsessive-compulsive, mood, and intrusive-thought disorders may face in their day-to-day lives.
We ended up using Dragonframe software and a Canon EOS Mark 3 camera to capture and stitch our stop-motion images. I had anticipated how time-intensive shooting the animation would be: approximately five hours in total. We planned well ahead, writing our storyboard before the day of the shoot. However, because we neglected to mark up my finger during the shooting phase, overlaying the middle-finger face (which Sarah created in Adobe Illustrator) required tedious, manual, frame-by-frame adjustment.
For my Physical Computing final, I have two potential ideas:
1. “One Track Mind”
One-Track Mind is a turntable with an RC car set atop it. The car and turntable would move in opposite directions at the same speed, so that the car stays in the same place while the turntable spins. The car and turntable would communicate with one another over Bluetooth. The interactive, and most difficult, part of this project would be a potentiometer that changes the speed of the turntable and car simultaneously. I am also considering integrating some sort of LED into the turntable to indicate what speed the turntable/car has reached. I may have the RC car look like a brain.
2. Title TBD
This project would use the portable NeuroSky EEG reader, or a more affordable model (pending feedback), to halt the perpetual motion of an RC car. For this project, I would definitely try to make the car look like a brain.
With my partner, Simone Salvo, I created a “talking” skull that would play a Donald Trump quotation sound bite, display a corresponding fake tweet, and flash red, white, and blue eyes – all activated by a hand press on Dostoyevsky’s The Idiot.
In creating this project, we used an Arduino Uno because it supplies a higher operating voltage than our standard Arduino Nano (5V vs. 3.3V). Our concept went through two iterations: one where we planned to use simple red LEDs for the eyes, and a later one where we used multicolor NeoPixel LEDs. We had worked with many of the components in one way or another in our labs, so much of our code had essentially been pre-written on the Arduino side. However, we found that to make our button trigger serial input, we had to go back and “uncouple” the code we had written, separating each individual action into its own function. In doing so, we realized we could have reconfigured our button-press feedback to rely on the buttonPress() function instead of the newState()/oldState() functions. Simple refactors like these could have removed multiple instances of delay() from our Arduino code – blocking calls which, we have learned, can be avoided with millis().
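The millis() pattern mentioned above replaces blocking delay() calls with timestamp comparisons, so the loop stays free to poll buttons. A sketch of that pattern in plain JavaScript with an injected clock so it is testable — the Arduino version would compare millis() against a stored timestamp in exactly the same way (the blinker names are assumptions, not our actual code):

```javascript
// Non-blocking timing in the style of Arduino's millis() pattern.
// Instead of delay(interval), we store the last toggle time and check
// elapsed time on every loop pass.
function createBlinker(intervalMs) {
  return { last: 0, on: false, interval: intervalMs };
}

// Call on every pass of the loop with the current time; the state
// toggles only when the interval has elapsed, and the loop is never
// blocked, leaving it free to read buttons and serial input.
function update(blinker, nowMs) {
  if (nowMs - blinker.last >= blinker.interval) {
    blinker.last = nowMs;
    blinker.on = !blinker.on;
  }
  return blinker.on;
}
```

This is the same uncoupling idea we arrived at: each action becomes its own function driven by the loop, rather than a sequence of delays.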
In all, this project worked exactly as we wanted for our presentation. Moving forward, we could do more to “clean up” our code and follow best practices. That is not to say our code was poorly written; in fact, this project was an opportunity to appreciate the many ways solutions to programming problems can present themselves. Still, I have a new appreciation for writing code in a style that separates different functions distinctly.
Without realizing it, I went ahead in the week we were supposed to learn the basics of asynchronous communication (week 6) and tried to set up a feedback system with two types of serial input (a week 7 concept). With the help of my instructor, I discovered that (unsurprisingly) more work was needed to read feedback from both of my potentiometers, which I intended to use to determine the positions of my pong paddles. I found the Arduino code relatively straightforward to write, though I struggled (and still struggle) to animate two elements of a p5 sketch creatively. To realize my vision, I kept my Arduino code but interfaced it with a Java sketch written in Processing, available here.
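Reading two potentiometers comes down to parsing both values out of one serial line and scaling each 10-bit reading to a paddle position. A sketch of that scaling in plain JavaScript — the comma-separated line format and the canvas dimensions are assumptions, not necessarily what my sketch used:

```javascript
// Map a 10-bit analog reading (0-1023) from a potentiometer to a
// paddle's y position, in the spirit of p5's map(). The result is
// clamped so the paddle never leaves the canvas.
function potToPaddleY(reading, canvasHeight, paddleHeight) {
  const clamped = Math.min(1023, Math.max(0, reading));
  return (clamped / 1023) * (canvasHeight - paddleHeight);
}

// Parse one serial line carrying both pot readings, e.g. "512,1023"
// (the comma-delimited format is an assumption).
function parsePots(line) {
  const [a, b] = line.trim().split(",").map(Number);
  return { left: a, right: b };
}
```

Sending both readings on one line, then splitting on the delimiter, is what makes the two-input feedback loop workable over a single serial connection.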
Two images of my first serial circuit
Though I may not have succeeded in creating the animation I wanted in p5, this project was a great entry point into troubleshooting serial input and navigating the complexities of math-based objects in p5. It certainly solidified my ability to drive basic objects with serial input. However, when I wanted those serial-driven objects to also react to the ball element in my sketch, I struggled to make the paddles respond.
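The paddle–ball reaction I was after is, at bottom, a circle–rectangle overlap test run each frame. A sketch of that check in plain JavaScript, with the object shapes as assumptions — on a hit, the p5 draw loop would reverse the ball’s x velocity:

```javascript
// Check whether a circular ball overlaps a rectangular paddle by
// finding the closest point on the rectangle to the ball's center
// and comparing that distance to the ball's radius.
function ballHitsPaddle(ball, paddle) {
  const cx = Math.max(paddle.x, Math.min(ball.x, paddle.x + paddle.w));
  const cy = Math.max(paddle.y, Math.min(ball.y, paddle.y + paddle.h));
  const dx = ball.x - cx;
  const dy = ball.y - cy;
  return dx * dx + dy * dy <= ball.r * ball.r;
}
```

Because the paddle position already comes from the serial input, this one function is the missing link between the serial-driven paddles and the ball.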
This week we were tasked with cleaning up the code from last week’s assignment. Frankly, I found my code clean and satisfactory, without redundancies, so I instead added an outro level to my animation exercise that prevents users from surpassing 100 cat clicks.
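The outro guard boils down to capping the counter and switching screens once it hits 100. A minimal sketch of that logic in plain JavaScript, with the state names invented as assumptions:

```javascript
// Game state with a click counter and a current screen.
function createGame() {
  return { clicks: 0, screen: "play" };
}

// Register one cat click; once 100 is reached, switch to the outro
// screen and ignore further clicks so the count can never exceed 100.
function registerClick(game) {
  if (game.screen !== "play") return game;
  game.clicks += 1;
  if (game.clicks >= 100) game.screen = "outro";
  return game;
}
```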
This week, our class was expected to create a sketch with constructor-based objects and explain it. I decided to continue an original idea of mine, a balloon-popping game, using new and old skills from my past projects. I used arrays to generate my cats and the transparent “button” ellipses that let my counter increase and the cats’ positions reset. I used for loops to populate my arrays and move my cloud animations (as I had done in the past). I also uploaded a new font to match the 8-bit aesthetic of my animation, and a loading screen that reflects the same aesthetic.
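The constructor-plus-array pattern described above can be sketched in a few lines of plain JavaScript. The Cat shape, positions, and counts here are illustrative assumptions, not my actual sketch:

```javascript
// Constructor-based object: each cat remembers its home position so
// the whole array can be reset after a pop.
function Cat(x, y) {
  this.homeX = x;
  this.homeY = y;
  this.x = x;
  this.y = y;
}

Cat.prototype.reset = function () {
  this.x = this.homeX;
  this.y = this.homeY;
};

// Populate the array with a for loop, as the post describes.
const cats = [];
for (let i = 0; i < 5; i++) {
  cats.push(new Cat(i * 80, 100));
}
```

In the p5 sketch, the draw loop would iterate over the array each frame, and the transparent button ellipses would bump the counter and call reset() on each cat.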
The only issue I ran into was trying to draw from a randomized “meow” array, which would play when I pressed my ellipse balloon buttons. I assume the issue came from the syntax I used after establishing the variable in the constructor, or the syntax linking it to the “clicked” event.
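One likely fix, sketched here as an assumption since I have not confirmed the bug: pick a random element from the array at click time, rather than binding a single element inside the constructor. In p5 each entry would be a loaded p5.SoundFile played with .play(); the placeholder strings below stand in for those:

```javascript
// Placeholder entries standing in for loaded sound files.
const meows = ["meow1", "meow2", "meow3"];

// Select a random sound at the moment of the click, so each press can
// yield a different meow instead of whichever one the constructor saved.
function randomMeow(sounds) {
  const i = Math.floor(Math.random() * sounds.length);
  return sounds[i];
}
```

Calling randomMeow(meows) inside the clicked handler would then re-roll the sound on every press.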