== About ==
 
The Real-time Applications Team conducts research and development in real-time computer technology for digital signal processing and machine learning applied to music, sound and gesture.
 
Over the years, [http://www.ircam.fr/ Ircam] has developed numerous hardware (4X, ISPW) and software environments (Max, FTS, jMax) for real-time music processing. Today the team's principal concern remains live performance with computers, which represents a large field of research and development in human-machine interaction.
 
The applications addressed by the team cover the direct interaction of a performer with the computer as well as the mutation of techniques traditionally present in music, dance and theater (musical instruments, stage, accessories). The use of digital techniques can correspond to the augmentation or transformation of the performer's expression (the sound of their instrument, their voice, their gestures, their memory) as well as the creation of a dialogue between the artist and the machine.
 
In addition to topics regarding live performance itself, such as gestural interfaces or real-time audio analysis and synthesis, the team is also concerned with tools for the composition and representation of works involving real-time processes and interaction. Further fields of application are found in the larger context of the audio and multimedia industry as well as education.
 
 
 
== Navigation ==
* [[Projects]]
* [[FTM]]
* [[Software]]
* [[Gabor]]
* [[Publications]]
* [[MnM]]
* [[People]]
* [[Suivi]]
