Enhancing the Musical Experience - From the Acoustic to the Digital... and Back.

Published September 6, 2016, 17:31
Over the last decade, inspired and motivated by the prospect of innovating the core of the musical experience, I have explored a number of research directions in which digital technology bears the promise of revolutionizing the medium. The research directions identified – gestural expression, collaborative networks, and constructionist learning – aimed at creating musical experiences that cannot be facilitated by acoustic means. The first direction builds on the notion that through novel sensing and mapping techniques, new expressive musical gestures can be discovered that are not supported by current acoustic instruments. Such gestures, unconstrained by the physical limitations of acoustic sound production, can provide endless possibilities for expressive and creative musical experiences for novice as well as trained musicians. The second research direction utilizes the digital network in an effort to create new collaborative experiences, allowing players to take an active role in determining and influencing not only their own musical output but also that of their co-performers. By using the network to interdependently share and control musical materials in a group, musicians can combine their musical ideas into a constantly evolving collaborative musical activity that is novel and inspiring. The third research direction draws on constructionist learning, which bears the promise of revolutionizing music education by providing hands-on access to programmable music making. While these research directions facilitated novel musical experiences that cannot be achieved by traditional means, their digital nature often led to flat and inanimate speaker-generated sound, lacking the physical richness and visual expressiveness of acoustic music.
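To make the sensing-and-mapping idea concrete, here is a minimal sketch – not any specific instrument from my work, and with all sensor ranges and parameter names invented for illustration – of how readings from a hypothetical tilt/acceleration sensor could be mapped continuously onto pitch and loudness, a mapping with no acoustic counterpart precisely because it is unconstrained by physical sound production:

```python
# Hypothetical gesture-to-sound mapping sketch. The sensor ranges,
# MIDI bounds, and function names are assumptions for illustration.

def tilt_to_pitch(tilt_deg, low_midi=48, high_midi=84):
    """Map a tilt angle in [-90, 90] degrees onto a MIDI pitch range."""
    t = (tilt_deg + 90.0) / 180.0          # normalize to [0, 1]
    t = max(0.0, min(1.0, t))              # clamp noisy out-of-range readings
    return round(low_midi + t * (high_midi - low_midi))

def midi_to_hz(midi_note):
    """Standard equal-temperament conversion (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

def acceleration_to_amplitude(accel_g, max_g=4.0):
    """Map gesture energy (acceleration in g) to an amplitude in [0, 1]."""
    return min(abs(accel_g) / max_g, 1.0)

# Example: a level hand played with moderate energy.
note = tilt_to_pitch(0.0)              # midpoint of the range -> MIDI 66
freq = midi_to_hz(note)
amp = acceleration_to_amplitude(2.0)   # -> 0.5
```

Because the mapping is software, it can be redefined per player or per piece – the same tilt could drive timbre or tempo instead – which is the sense in which such gestures escape the fixed physics of an acoustic instrument.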
In my current work, therefore, I attempt to combine the benefits of digital computation and acoustic richness by exploring the concept of “robotic musicianship.” I define this concept as a combination of musical, perceptual, and social skills with the capacity to produce rich acoustic responses in a physical and visual manner. The robotic musicianship project aims to combine human creativity, emotion, and aesthetic judgment with algorithmic computational capabilities, allowing human and robotic players to cooperate and build off one another’s ideas. A perceptual and improvisatory robot can best facilitate such interactions by bringing the computer into the physical world both acoustically and visually. The first robot to demonstrate these capabilities is Haile – a perceptual and interactive robotic percussionist designed to “listen like a human and improvise like a machine.” Haile listens to live human players, analyzes perceptual aspects of their playing in real time, and uses the product of this analysis to play along in a collaborative and improvisatory manner. Its perceptual modules include the detection and analysis of low-level musical percepts such as onsets, pitch, and velocity, as well as higher-level percepts such as beat, similarity, stability, and tension. Haile’s interaction and improvisation modules utilize mathematical constructs that are unlikely to be used by humans, such as genetic algorithms and fractal functions, embedded in a variety of collaborative interaction schemes. When playing with human musicians, the robot’s improvisational techniques are designed to inspire players to interact with it in novel manners that may revolutionize the musical experience and, perhaps in the future, music itself.
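The “listen like a human, improvise like a machine” idea can be sketched with a toy genetic algorithm. This is not Haile’s actual implementation – the grid size, population parameters, and fitness design below are all assumptions for illustration – but it shows the general shape of the technique: evolve rhythmic patterns whose fitness rewards being *related to*, yet deliberately not identical to, a pattern the robot has just heard:

```python
import random

STEPS = 16               # sixteenth-note grid, one bar in 4/4 (assumption)
POP_SIZE = 40
GENERATIONS = 60
MUTATION_RATE = 0.05
TARGET_SIMILARITY = 0.7  # aim to agree with ~70% of the heard grid steps

def random_pattern():
    """A random onset pattern: 1 = hit, 0 = rest, on each grid step."""
    return [random.randint(0, 1) for _ in range(STEPS)]

def similarity(a, b):
    """Fraction of grid steps on which two patterns agree."""
    return sum(x == y for x, y in zip(a, b)) / STEPS

def fitness(pattern, heard):
    # Peak fitness when similarity equals TARGET_SIMILARITY, falling
    # off on either side: close to the human, but never a copy.
    return 1.0 - abs(similarity(pattern, heard) - TARGET_SIMILARITY)

def crossover(a, b):
    cut = random.randrange(1, STEPS)
    return a[:cut] + b[cut:]

def mutate(pattern):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in pattern]

def evolve_response(heard, seed=None):
    """Evolve a rhythmic answer to a heard pattern."""
    random.seed(seed)
    population = [random_pattern() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=lambda p: fitness(p, heard), reverse=True)
        parents = population[:POP_SIZE // 2]       # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children
    return max(population, key=lambda p: fitness(p, heard))

# Example: the human plays a straight four-on-the-floor bar.
human = [1, 0, 0, 0] * 4
robot = evolve_response(human, seed=42)
```

The point of the fitness design is the collaborative stance described above: a human improviser tends toward familiar idioms, while the search process freely explores pattern combinations no human would think to try, so each answer is recognizably connected to the input without echoing it.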