Re-Tratos (Portraying || Dealing With) is an interactive work that establishes a temporary connection between audiences in two contexts, which are divided but also interdependent: Cubans living on the island and Cubans living in Miami.
Description
An interactive two-way mirror, created by my classmate Felipe Castelblanco, that projects recorded faces onto the viewer's face. I created the software for the project using video capture/playback techniques and face tracking.
Presented at the Camagüey International Video Art Festival in Cuba, April 2013.
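The core trick is simple: track a face in the camera image, then scale and position a recorded portrait so it covers the detected face. A minimal sketch of that placement step, assuming a cover-fit rule (the function names and the fit rule are my illustration, not the original project code):

```python
# Hypothetical sketch: fit a recorded portrait over a detected face box.
# face_box is (x, y, width, height) from a face tracker; portrait_size
# is the (width, height) of the recorded video frame.

def fit_overlay(face_box, portrait_size):
    """Scale and center the portrait so it fully covers the face box."""
    fx, fy, fw, fh = face_box
    pw, ph = portrait_size
    scale = max(fw / pw, fh / ph)   # "cover" fit: no face left uncovered
    ow, oh = pw * scale, ph * scale
    # Center the scaled portrait on the face box.
    return (fx + (fw - ow) / 2, fy + (fh - oh) / 2, ow, oh)

# A 100x200 portrait drawn over a 50x50 face box at (100, 100):
overlay = fit_overlay((100, 100, 50, 50), (100, 200))
```

Run every frame, this keeps the projected portrait locked to the viewer's face as they move.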
Background
Felipe stopped in Miami on his way to Cuba and recorded video portraits of Cubans living there, asking each of them: “Do you have a message for Cuba?” At the festival, these portraits were then projected onto the faces of the Cubans looking into the mirror.
Mars Society Mars Desert Research Station, Crew 119, Dec 1-14 2013
In this Martian analog on Earth, the crew spent 2 weeks living and working in a remote, simulated habitat: planning Extra Vehicular Activities, wearing space suits, exploring the terrain on foot or via rovers, maintaining/upgrading systems, and experiencing a tin-can existence. Through this research, they’ll be able to better understand how people will live and work effectively on the Red Planet.
Dan’s mission was to document what life will be like for the first humans on Mars from “a feet on the ground” perspective. This work was research for his MFA Thesis project: a live musical performance and concept album around the theme of humanity crossing the sea of space and touching down on a familiar new world.
The Cyborg Cabaret explores human, robot, and cyborg relationships in a variety show format featuring everything from cutting-edge metal machines to cardboard-suited meat bags. Expect tear-jerking vignettes, frequent non sequiturs, and lots of humor through avant art-meets-science theater.
What if today’s robotic technology could be put to the test in a one-on-one tiered wrestling tournament? How would a Roomba stack up against a UAV? Could a Google self-driving car defeat the cuteness of Keepon? Uh oh, Geminoid was talking smack about the CMU CRUSHER! It’s on!
Opening Opening Opening! Come on down to see who will win the Colossal Weight championship of the world!
Description
Robot Rumble is a live multimedia performance where actors portray real-life robots in one-on-one bouts in the style of backyard and WWF wrestling. Major themes of robots and society will be explored through cardboard-crushing, masculine soap operatic action.
The event will occur on the opening night of the first- and second-year MFA show at Bakery Square at the end of March, and a subsequent showing will take place at the end of April as part of the upcoming Cyborg Cabaret show.
words become the shape of my mouth as I read Geometry and Non
Description
I was given an assignment to create a screen print based on the city of Venice and inspired by the poem Geometry and Non by Jennifer Scappettone. (Actually, the text is an erroneous amalgamation of various poems attributed to Scappettone; this collage was given to me as the *real* thing. Apologies.) As more of a performer than a visual artist, I decided to create a piece of software that could turn a reading of the poem into its visual analog. This singular performance generated both the print as a PDF and video documentation of its creation.
On Monday, September 5th, I took part in Netrooms, performed at the University of Nottingham Ningbo China campus.
Netrooms: The Long Feedback is a participative network piece which invites the public to contribute to an extended feedback loop and delay line across the internet. The work explores the juxtaposition of multiple spaces as the acoustic, the social and the personal environment becomes permanently networked. The performance consists of live manipulation of multiple real-time streams from different locations which receive a common sound source. Netrooms celebrates the private acoustic environment as defined by the space between one audio input (microphone) and output (loudspeaker). The performance of the piece consists of live mixing a feedback loop with the signals from each stream.
Experiments in balloon motion and sound using an MS Kinect depth sensing camera.
Created for the Carnegie Mellon 1st & 2nd year MFA Graduate show entitled “Fresh Baked Goods” at Bakery Square, April 2011.
Description
A machine stands in a room surrounded by balloons. Circulating fans blow the balloons over the machine which creates sound based on their movements.
Mode 1: Tones
Balloon height and x/y position control the pitch and panning of a treble and bass voice. The tones can be quantized into a certain key, or a glissando can be employed for a theremin-style effect.
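A rough sketch of that mapping, assuming normalized balloon coordinates and a MIDI-style pitch range (the ranges, scale, and function names are my guesses, not the original Open Frameworks code):

```python
# Hypothetical Mode 1 mapping: balloon height -> pitch, x position -> pan.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets within one octave

def height_to_midi(height, lo=40, hi=88):
    """Map normalized balloon height (0..1) to a continuous MIDI note."""
    return lo + height * (hi - lo)

def quantize_to_key(midi_note, scale=C_MAJOR):
    """Snap a continuous MIDI value to the nearest note in the scale."""
    octave, step = divmod(round(midi_note), 12)
    nearest = min(scale, key=lambda s: abs(s - step))
    return octave * 12 + nearest

def x_to_pan(x):
    """Map normalized x position (0..1) to stereo pan (-1 left .. +1 right)."""
    return 2.0 * x - 1.0

# Quantized ("in key") vs. glissando (continuous, theremin-style) modes:
note_quantized = quantize_to_key(height_to_midi(0.5))
note_glissando = height_to_midi(0.5)
```

The glissando mode simply skips the quantize step and feeds the continuous value straight to the oscillator.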
Mode 2: 99 Luftballons
The playback speed of Nena’s 99 Luftballons is controlled by balloon height. The balloons must be kept in the air for the song to play. Feed the machine.
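The height-controlled playback amounts to varispeed reading of the song's sample buffer: the higher the balloons, the faster the playhead advances. A minimal sketch of that core loop, with linear interpolation between samples (this is an illustration of the technique, not the actual code used in the piece):

```python
# Hypothetical varispeed playback core: step through the sample buffer
# at a rate set by balloon height (0.0 = song stalls, 1.0 = full speed).

def varispeed_read(samples, playhead, rate, n_out):
    """Read n_out output samples starting at `playhead`, stepping by
    `rate`. Linearly interpolates between neighboring samples and wraps
    at the end of the buffer. Returns (block, new_playhead)."""
    out = []
    pos = playhead
    for _ in range(n_out):
        i = int(pos)
        frac = pos - i
        a = samples[i % len(samples)]
        b = samples[(i + 1) % len(samples)]
        out.append(a + frac * (b - a))   # linear interpolation
        pos += rate
    return out, pos
```

When the balloons sit on the floor the rate hits zero, the playhead stops advancing, and the song freezes until someone feeds the machine again.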
A simple Open Frameworks application using the MS Kinect depth sensing camera via libfreenect and ofxKinect.
The computer searches for my manboobs and draws a bra or pasties on top. Music is played when titties are detected.
I’m using OpenCV on the depth image. I look for a person-sized blob and use its centroid to approximate a search box wherein to detect 2 boobs. The bra or pasties are drawn using the centroids of these boob blobs. A third blob detector is used to look for the hand, which switches between bras.
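The pipeline can be sketched in a simplified form, with plain Python lists standing in for the OpenCV depth image and blob tracker: threshold the depth frame, take the centroid of the person-sized foreground blob, and derive a chest-height search box from it. All thresholds and offsets below are guesses for illustration, not values from the piece:

```python
# Hypothetical sketch of the depth-thresholding + centroid step.

def foreground_centroid(depth, near_mm=1500):
    """Centroid (x, y) of all pixels closer than near_mm."""
    xs, ys = [], []
    for y, row in enumerate(depth):
        for x, d in enumerate(row):
            if d < near_mm:
                xs.append(x)
                ys.append(y)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def chest_search_box(cx, cy, w=120, h=60, y_offset=-40):
    """Box above the body centroid in which to hunt for the two chest
    blobs. Dimensions and offset are made-up example values."""
    return (cx - w / 2, cy + y_offset - h / 2, w, h)

# Fake 240x320 depth frame: background ~2 m away, a "person" at ~0.9 m.
depth = [[2000] * 320 for _ in range(240)]
for y in range(60, 200):
    for x in range(120, 200):
        depth[y][x] = 900

cx, cy = foreground_centroid(depth)
box = chest_search_box(cx, cy)
```

In the real piece OpenCV does the blob labeling; the point here is just that the depth image makes foreground segmentation a trivial threshold, even in bad light.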
The second half of the video shows a projection mapping of the bras/pasties onto my chest. This is all running in realtime and in low light conditions with a bad background. Yes, the Kinect sensor is pretty awesome!
Edit4: Now on Australian news site The Age: “… there’s been a spate of videos using the technology, such as this man putting a BRA on himself” (thanks for the link, Tim)
Fellow student Luke Loeffler and I both presented Richard Serra and Nancy Holt’s 1974 “Boomerang” audio/video piece to separate classes at CMU. A delay line is used to throw Holt’s voice back at just the right speed, to the point where her brain becomes confused and her speech and comprehension slow. She is filmed on live television as she explains the experience.
Pure Data
We made a simple Pure Data patch which replicates the experience. Plug in a microphone and headphones (preferably closed ear types) and give it a shot.
EDIT: The original patch was using a 100 ms delay. Later on I was told the original piece used 283 ms, calculated from research into cognitive perception, etc. I have updated the speed and, indeed, the effect is much more pronounced.
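The delayed-auditory-feedback idea behind the patch is just a ring buffer read 283 ms behind the write head. A minimal sketch of the arithmetic, assuming 44.1 kHz audio (the Pd patch itself does this with `delwrite~`/`delread~`; this Python version is only an illustration):

```python
# Hypothetical sketch: delayed auditory feedback at 283 ms.

SAMPLE_RATE = 44100
DELAY_MS = 283
DELAY_SAMPLES = SAMPLE_RATE * DELAY_MS // 1000   # samples of delay

def delayed(signal, delay=DELAY_SAMPLES):
    """out[n] = signal[n - delay]; silence before the delayed sound
    arrives. This is what the speaker hears in their headphones."""
    return [signal[n - delay] if n >= delay else 0.0
            for n in range(len(signal))]
```

Closed-ear headphones matter because the live (undelayed) voice must be masked; otherwise the brain can anchor on it and the confusion effect weakens.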
I took the Pd patch and then made an RjDj scene which can be run on an iPhone or iPod Touch. RjDj is a “reactive music” environment that runs live, interactive songs.
Install RjDj on the App Store and download Boomerang from the Soundtrips section. The effect works really well with earbuds, just make sure to turn up the volume. You can also make recordings and upload them to RjDj if you create an account.
This is a quick video documentation of the Ars Electronica Center Facade Terminal which I helped implement while working for the Ars Electronica Futurelab in Linz, Austria.
The Facade Terminal is a touchscreen PC mounted in a concrete pillar near the Danube. Visitors can use 3 different applications to control the building: Pulse, Music, and Cam.
Pulse: have your pulse read and visualized on the facade
Music: plug in a music device and control visualizations on the facade
Cam: place a cellphone on the camera and play your videos on the facade
I wrote most of the software and designed the audio visualizations (with pointers from Lia).