Collaborative Faculty Projects
Below are descriptions of some group projects that faculty and students have worked on at the Ammerman Center.
OBSTACLE/FLOW: Interactive Presence Project
Baird, B., Izmirli, O. and Wollensak, A. (2012)
Obstacle/Flow's primary inspiration is the natural phenomenon of the jökulhlaup, the Icelandic term for a glacier burst: a sudden flood-release of meltwater from glaciers and ice sheets. This is realized within the work by generative elements that include audio that sonifies obstacles, small particles that aid in visualizing flow data, and exploding images of ice governed by random bursts of particles.
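The generative behavior described above — particles following flow data, interrupted by random bursts — can be sketched minimally as follows. The flow field, time step, and burst parameters here are illustrative stand-ins, not values from the actual piece:

```python
import math
import random

def flow(x, y):
    """A hypothetical meltwater flow field: steady rightward drift
    with a sinusoidal cross-current (a stand-in for real flow data)."""
    return (1.0, 0.4 * math.sin(x))

def step_particles(particles, dt=0.1, burst_chance=0.01):
    """Advect each particle along the field; occasionally 'burst' one
    by adding a random velocity kick, echoing the exploding-ice imagery."""
    out = []
    for (x, y) in particles:
        vx, vy = flow(x, y)
        if random.random() < burst_chance:
            vx += random.uniform(-5.0, 5.0)
            vy += random.uniform(-5.0, 5.0)
        out.append((x + vx * dt, y + vy * dt))
    return out

particles = [(random.random(), random.random()) for _ in range(100)]
for _ in range(50):
    particles = step_particles(particles)
```

Rendering and sonification would hang off each particle's position per frame; only the advection/burst loop is shown here.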
DEEP/PLACE: site-based immersive history - Harkness Chapel
Baird, B., Izmirli, O. and Wollensak, A. (2012)
We have joined together as an interdisciplinary team of collaborators to explore, through technology, experiences of place and history that are engaging and interactive. DEEP/PLACE features an expanded interactive audiovisual space consisting of diverse media elements. This multidisciplinary collaborative artwork merges materials from discrete domains—such as architecture, cultural geography and geology—in an immersive site-specific experience. Participants explore the multifaceted information by navigating a rich media landscape through an intuitive gestural interface. The media landscape is represented by a system of interconnected nodes of site-based information that include spatial and geological information, archival blueprints and images, 3D models, video and recorded audio material. A live camera feed of the site, brought into the virtual space, connects the built architectural space to the digital multi-modal history. The overall experience provides an important interplay between the real site and the virtual experience, allowing the past to infuse the present.
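The "system of interconnected nodes of site-based information" can be sketched as a small graph structure. The node names, media filenames, and fields below are purely illustrative assumptions, not assets from the actual installation:

```python
from dataclasses import dataclass, field

@dataclass
class SiteNode:
    """One node of site-based information (fields are illustrative)."""
    name: str
    media: list = field(default_factory=list)   # e.g. blueprints, audio clips
    links: list = field(default_factory=list)   # names of connected nodes

class MediaLandscape:
    """A navigable graph of site-based information nodes."""
    def __init__(self):
        self.nodes = {}

    def add(self, node):
        self.nodes[node.name] = node

    def connect(self, a, b):
        """Link two nodes bidirectionally."""
        self.nodes[a].links.append(b)
        self.nodes[b].links.append(a)

    def neighbors(self, name):
        """Nodes reachable in one navigation gesture from `name`."""
        return [self.nodes[n] for n in self.nodes[name].links]

land = MediaLandscape()
land.add(SiteNode("chapel", media=["blueprint_1927.png"]))
land.add(SiteNode("geology", media=["bedrock_survey.wav"]))
land.connect("chapel", "geology")
```

A gestural interface would then map hand movements onto `neighbors()` traversals, surfacing each node's media as the participant moves through the landscape.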
Using Motion Capture to Synthesize Dance Movements
Baird, B., Izmirli, O., Ajjen Joshi '12 (2011)
Motion capture presents an interesting opportunity for the analysis and synthesis of movements in dance. We have created a tool that uses concatenative synthesis of dance movement based on a library of prerecorded basic movements. Dance movements are first broken into discrete, small movements following the guidelines of Laban dance notation. Then these movements can be performed by dancers and recorded using motion capture. Finally, these (edited) sequences are placed in a 3D virtual environment where the user can synthesize movements to form a choreographed composition. Such a pedagogical tool provides a creative way to understand and study dance movements.
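The concatenative step — stitching prerecorded movement segments into a choreographed sequence — can be sketched as below. Real motion-capture clips are sequences of multi-joint poses; here each clip is reduced to a list of floats, and the short linear cross-fade at each join is an assumed smoothing strategy, not necessarily the tool's:

```python
def concatenate(clips, blend=2):
    """Concatenate movement clips (lists of pose values, simplified
    to floats) with a short linear cross-fade at each join so the
    synthesized sequence does not jump between segments."""
    if not clips:
        return []
    out = list(clips[0])
    for clip in clips[1:]:
        n = min(blend, len(out), len(clip))
        for i in range(n):
            # Blend the tail of the running sequence into the new clip.
            w = (i + 1) / (n + 1)
            out[-n + i] = (1 - w) * out[-n + i] + w * clip[i]
        out.extend(clip[n:])
    return out

# Placeholder library of Laban-segmented basic movements (names invented).
library = {
    "reach": [0.0, 0.5, 1.0],
    "sink":  [1.0, 0.4, 0.0],
}
phrase = concatenate([library["reach"], library["sink"]])
```

In the actual tool the user would assemble such phrases interactively in the 3D environment rather than in code.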
One Thing Leads to Another
Baird, B., Hartman, C., Izmirli, O., Kreiger, A. and Wollensak, A. (2008)
The syntactical construction of an ordered binary pair—and in particular the antecedent/consequent relationship—occurs in many communication idioms including spoken language, poetry, music, and the visual arts. As a basis for group collaboration, we have chosen to explore these constructions creatively by building an environment where selected binary examples from our respective disciplines can coexist and interact. The environment was designed to select and simultaneously play several examples at any given time. The selections within each medium are drawn from a larger pool constructed within the disciplines. The three realms of antecedent/consequent selections are the poetic (in the form of proverbs), visual (in the form of pairs of images), and sound (in the form of paired musical gestures). In each medium the relation of the first part to the second reflects both the disciplinary and individual-authorial conditions of choice, intention, and balance.
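The selection mechanism — drawing one antecedent/consequent pair per medium from a larger pool and playing them simultaneously — can be sketched as follows. The pool contents are invented placeholders, not material from the piece:

```python
import random

# Illustrative pools; the actual work draws from larger pools
# constructed within each discipline.
POOLS = {
    "proverb": [("A stitch in time", "saves nine"),
                ("Look before", "you leap")],
    "image":   [("door_closed.jpg", "door_open.jpg")],
    "sound":   [("call.wav", "response.wav")],
}

def select_scene(pools, rng=random):
    """Pick one antecedent/consequent pair from each medium,
    forming one simultaneous 'scene' of the environment."""
    return {medium: rng.choice(pairs) for medium, pairs in pools.items()}

scene = select_scene(POOLS)
```

Each call yields a new co-presentation of poetic, visual, and sonic pairs, which is the environment's basic generative move.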
Red Ball: A Collaboration to Develop an Interdisciplinary Interactive Space
Baird, B., Hartman, C., Izmirli, O., Kreiger, A., Wollensak, A. (2006)
The initial impetus for this project came from meetings of a group of faculty in different disciplines who were interested in the idea of developing materials from a common inspirational source and creating a virtual space where those materials could interact. The disciplines included poetry, music composition, visual art, video and computer science. The concept of a "red ball" was chosen as the unifying basis for the work, and three of the artists involved — a poet, a visual artist, and a composer — independently developed materials in their own preferred media so that they could stand as artistic "bits" that could be combined. The audience experiences a performance, created by a user, through computer projection and a sound system. The user's interface was developed as a glove-triggered coordinate grid that could be used to spawn spoken text, written and visual text, video fragments, and musical phrases.
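The glove-triggered coordinate grid can be sketched as a quantization of hand position into cells, each mapped to a media "bit." The grid size, filenames, and media types below are hypothetical, chosen only to illustrate the mapping:

```python
# Hypothetical mapping from grid cells to media "bits";
# a 2x2 grid and these filenames are illustrative, not from the project.
GRID = {
    (0, 0): ("poem", "red_ball_line1.txt"),
    (0, 1): ("video", "bounce.mov"),
    (1, 0): ("music", "phrase_a.mid"),
    (1, 1): ("image", "red_circle.png"),
}

def trigger(x, y, cell=0.5):
    """Quantize a glove position in [0, 1) x [0, 1) to a grid cell
    and return the media bit to spawn there, if any."""
    key = (int(x // cell), int(y // cell))
    return GRID.get(key)

bit = trigger(0.7, 0.2)   # a glove gesture in the lower-right cell
```

In performance, each returned bit would be handed off to the projection and sound systems, so overlapping gestures layer spoken text, video fragments, and musical phrases.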
Using Haptics and Sound in a Virtual Gallery
Baird, B., Izmirli, O. and Smalley, D.
Galleries are traditionally places for visual exploration of objects; concert halls provide auditory exploration. The tools of virtual reality allow for a new kind of gallery: one that encompasses features of a traditional visual museum, means for auditory discovery, and in addition, haptic exploration. The user is invited to browse through this virtual gallery, interacting with the objects, feeling their textures, listening to their audio properties, moving around and inside them. All of this takes place in an interactive, 3D environment where the user navigates and explores with her eyes, ears, and hands.
Conducting a Virtual Ensemble
Baird, B., Izmirli, O.
Experienced conductors of music ensembles are not metronomes: their hand movements and the speed at which the players perform exhibit a complex, time- and context-dependent means of communication. The aim of this project is to analyze this relationship between the conductor's movements and the actual tempo as performed by the players, and apply the results from this analysis to construct a computer-based system that will mimic the salient behavior of a real ensemble. Models for different conductors are obtained by first having a conductor lead a live ensemble. Data from the conductor is obtained using 3-D position sensors; the performance is also recorded digitally in a sound file. Velocity, acceleration, direction, and position data of the movements are used to extract features and determine the location of beats. By synchronizing this information with the audio data and by using information about music performance and the score, a model is constructed that produces, at each point in time, the implied tempo for the ensemble. Thus this model ultimately deduces implied tempo from hand movements. Once a model has been formed, the system can be put into "perform" mode in which the user can "conduct" in real time by controlling the playback speed of a MIDI sequencer. The conductor uses 3-D trackers to conduct; data from hand movements is fed into the model, processed in real time, and then used to control the tempo of the virtual ensemble. This computer system enables conducting students to experience the complex coupling between movements and actual tempo of the ensemble and to conduct a "virtual" ensemble or a mixed virtual/real ensemble.
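The pipeline from hand-movement data to implied tempo can be sketched in miniature. Here beats are marked at local minima of vertical hand position (a common simplification of the gesture's ictus), and inter-beat intervals are exponentially smoothed into a BPM estimate; the actual system's feature extraction and conductor-specific models are richer than this:

```python
import math

def detect_beats(ys, times):
    """Mark a beat at each local minimum of vertical hand position --
    a simplified stand-in for the feature-based beat detector."""
    beats = []
    for i in range(1, len(ys) - 1):
        if ys[i] < ys[i - 1] and ys[i] <= ys[i + 1]:
            beats.append(times[i])
    return beats

def implied_tempo(beat_times, smoothing=0.5, prior=120.0):
    """Exponentially smooth inter-beat intervals into a BPM estimate,
    mimicking an ensemble that follows the conductor with inertia
    rather than snapping to each beat."""
    tempo = prior
    for prev, cur in zip(beat_times, beat_times[1:]):
        instant = 60.0 / (cur - prev)
        tempo = smoothing * tempo + (1 - smoothing) * instant
    return tempo

# Synthetic tracker data: a sinusoidal conducting gesture at 120 BPM,
# sampled every 50 ms (values are illustrative, not recorded data).
times = [i * 0.05 for i in range(40)]
ys = [math.cos(2 * math.pi * t / 0.5) for t in times]
tempo = implied_tempo(detect_beats(ys, times))
```

In "perform" mode, the smoothed tempo estimate would be recomputed as each new beat arrives and used to drive the playback rate of the MIDI sequencer.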