<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://ftm2.ircam.fr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Bevilacq</id>
		<title>ftm - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://ftm2.ircam.fr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Bevilacq"/>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php/Special:Contributions/Bevilacq"/>
		<updated>2026-04-27T23:53:43Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.28.0</generator>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2491</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2491"/>
				<updated>2008-11-04T11:29:59Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Credits and Aknowledgements */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules that performs gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see Download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to each of the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway ? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load your computer can afford (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' is taken from a list, which can contain, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did someone say multimodal?)&lt;br /&gt;
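As an illustration of such a flat input list, here is a minimal Python sketch (Python is used purely for illustration; the function name and modalities are hypothetical) that merges several modalities into one observation vector per sampling step:

```python
def observation_frame(mouse_xy, audio_amplitude, accel_xyz):
    """Flatten several input modalities into a single observation
    vector, analogous to the flat Max list fed to the follower at
    each sampling step. All names here are hypothetical examples."""
    x, y = mouse_xy
    ax, ay, az = accel_xyz
    return [x, y, audio_amplitude, ax, ay, az]

# one 6-dimensional observation: mouse + audio level + accelerometer
frame = observation_frame((0.25, 0.75), 0.5, (0.0, -1.0, 9.8))
```

The follower itself only sees the flat list, so any combination of channels works as long as the dimension stays constant across frames.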
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Feedback, problem reports, bug reports and feature requests of any kind are welcome, and we will do our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We assume no responsibility for any problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example===&lt;br /&gt;
&lt;br /&gt;
==== 1st step : Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====2nd step : Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you execute your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
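The velocity curves shown above can be approximated by simple finite differences over the sampled mouse positions. A minimal Python sketch (illustrative only; the follower itself runs as Max/MSP modules, and the 10 ms period is just the typical value mentioned earlier):

```python
def velocities(samples, dt_ms=10.0):
    """Finite-difference velocity along the X and Y axes of a mouse
    trajectory sampled every dt_ms milliseconds. Returns two lists
    of per-interval velocities in position units per millisecond."""
    vx = [(x1 - x0) / dt_ms for (x0, _), (x1, _) in zip(samples, samples[1:])]
    vy = [(y1 - y0) / dt_ms for (_, y0), (_, y1) in zip(samples, samples[1:])]
    return vx, vy

# three samples 10 ms apart: move right, then right and up
traj = [(0.0, 0.0), (10.0, 0.0), (20.0, 10.0)]
vx, vy = velocities(traj)
```

Feeding such velocity channels (instead of, or alongside, raw positions) is what makes the comparison sensitive to ''how fast'' each part of the drawing is executed, not only to its shape.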
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====  Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
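For illustration, a minimal OSC message can be assembled by hand with nothing but the Python standard library: address pattern and type-tag string, each NUL-padded to a 4-byte boundary, followed by big-endian float32 arguments, per the OSC 1.0 specification. The address /gesture/data is a made-up example, not an EyesWeb or follower convention:

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message with float arguments only:
    padded address string, padded type-tag string (one 'f' per
    argument), then big-endian float32 values."""
    def pad(b):
        # OSC strings are NUL-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

packet = osc_message("/gesture/data", 0.25, 0.75)
# the packet could then be sent with socket.sendto(packet, (host, port))
```

In practice one would use a ready-made OSC implementation on both sides (Max/MSP ships OSC objects, and EyesWeb has OSC blocks); the sketch only shows what travels on the wire.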
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
In the version [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip  v0.3], the following examples can be found:&lt;br /&gt;
* writing&lt;br /&gt;
* Wii&lt;br /&gt;
* audio parameters (pitch, periodicity, energy)&lt;br /&gt;
* voice (mfcc)&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Frédéric Bevilacqua, Rémy Muller, Norbert Schnell, Fabrice Guédy, Jean-Philippe Lambert, Aymeric Devergié, Anthony Sypniewski, Bruno Zamborlin, Donald Glowinski (thanks for the screen captures!)&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Download&amp;diff=2468</id>
		<title>Download</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Download&amp;diff=2468"/>
				<updated>2008-06-30T10:33:33Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Current version */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Current version ==&lt;br /&gt;
&lt;br /&gt;
FTM &amp;amp; Co 2.3&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/ftm2.release.txt FTM 2 Release Notes]&lt;br /&gt;
&lt;br /&gt;
Max 4.6 on Mac OS X&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.5-Max46ub.dmg FTM 2.3.5 binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
&lt;br /&gt;
Max 4.6 on Windows&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.4.BETA-Max46.exe FTM 2.3.4.BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
''Known issue:''&lt;br /&gt;
&lt;br /&gt;
Attention: FTM is still not entirely thread-safe. Messages or expressions that are executed in different threads (alarm, GUI/main, UDP input, etc.) and simultaneously modify FTM data structures (objects) can cause crashes. The easiest way to avoid this is to reschedule concurrent accesses to an FTM object into the same thread. The externals &amp;quot;pipe&amp;quot; or &amp;quot;ftm.schedule&amp;quot; (which is now thread-safe) with a 0 delay time will always output in the alarm (&amp;quot;high priority&amp;quot;) thread, while &amp;quot;defer&amp;quot; and &amp;quot;deferlow&amp;quot; can be used to reschedule messages into the Max main application (GUI) thread.&lt;br /&gt;
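The same serialization idea, transposed to Python purely for illustration (this is not FTM code): every mutation of a shared object is funneled through a single worker thread via a queue, so code running in any other thread never touches the data directly:

```python
import queue
import threading

shared = {"count": 0}          # stand-in for a shared FTM object
jobs = queue.Queue()           # all mutations pass through this queue

def worker():
    """Only this thread ever mutates 'shared', so no two
    mutations can run concurrently."""
    while True:
        fn = jobs.get()
        if fn is None:         # sentinel: shut the worker down
            break
        fn(shared)
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
for _ in range(100):           # these puts could come from any thread
    jobs.put(lambda s: s.__setitem__("count", s["count"] + 1))
jobs.join()                    # wait until all queued mutations ran
jobs.put(None)
t.join()
```

This mirrors what rescheduling through &amp;quot;ftm.schedule&amp;quot; or &amp;quot;defer&amp;quot; achieves in Max: all accesses end up executed sequentially in one thread.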
&lt;br /&gt;
== Older versions ==&lt;br /&gt;
&lt;br /&gt;
Max 4.6 on Mac OS X&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.4.BETA-Max46ub.dmg FTM 2.3.4.BETA binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.2.BETA-Max46ub.dmg FTM 2.3.2.BETA binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.1.BETA-Max46ub.dmg FTM 2.3.1.BETA binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.0.BETA-Max46ub.dmg FTM 2.3.0.BETA binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.4-Max46ub.dmg FTM 2.2.4 binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.3.BETA-Max46ub.dmg FTM 2.2.3 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.2.BETA-Max46ub.dmg FTM 2.2.2 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.1.BETA-Max46ub.dmg FTM 2.2.1 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.0.BETA-Max46ub.dmg FTM 2.2.0 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.4.BETA-Max46ub.dmg FTM 2.1.4 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.3.BETA-Max46ub.dmg FTM 2.1.3 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.2.BETA-MAX46ub.dmg FTM 2.1.2 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.1.BETA-Max46ub.dmg FTM 2.1.1 BETA binaries for Max/MSP 4.6 (UB) on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.1.7.13-Max46ub.dmg FTM 1.7.13 binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.1.7.12-Max46ub.dmg FTM 1.7.12 binaries for Max/MSP 4.6 (UB)  on Mac OS X]&lt;br /&gt;
&lt;br /&gt;
Max 4.5 on Mac OS X&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.0.BETA-Max45.dmg FTM 2.2.0 BETA binaries for Max/MSP 4.5 (ppc) on Mac OS X] (last release for Max 4.5, works on Mac OS X 10.3)&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.1.7.11-Max45ppc.dmg FTM 1.7.11 binaries for Max/MSP 4.5 (PPC only) on Mac OS X]&lt;br /&gt;
&lt;br /&gt;
Max 4.6 on Windows&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.2.BETA-Max46.exe FTM 2.3.2.BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.1.BETA-Max46.exe FTM 2.3.1.BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.4-Max46.exe FTM 2.2.4 binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.2.BETA-Max46.exe FTM 2.2.2 BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.1.BETA-Max46.exe FTM 2.2.1 BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.2.BETA-Max46.exe FTM 2.1.2 BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.1.7.12.BETA-Max46.exe FTM 1.7.12 BETA binaries for Max/MSP 4.6 on Windows]&lt;br /&gt;
&lt;br /&gt;
Max 4.5 on Windows&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.2.BETA-Max45.exe FTM 2.3.2.BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.3.1.BETA-Max45.exe FTM 2.3.1.BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.4-Max45.exe FTM 2.2.4 binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.2.BETA-Max45.exe FTM 2.2.2 BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.2.1.BETA-Max45.exe FTM 2.2.1 BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.2.1.2.BETA-Max45.exe FTM 2.1.2 BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/FTM.1.7.12.BETA-Max45.exe FTM 1.7.12 BETA binaries for Max/MSP 4.5 on Windows]&lt;br /&gt;
&lt;br /&gt;
* [http://ftm.ircam.fr/downloads/ftm.release.txt FTM 1.7 Release Notes]&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2454</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2454"/>
				<updated>2008-06-11T12:48:36Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Getting started */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules that performs gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see Download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to each of the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway ? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load your computer can afford (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' is taken from a list, which can contain, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did someone say multimodal?)&lt;br /&gt;
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Feedback, problem reports, bug reports and feature requests of any kind are welcome, and we will do our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We assume no responsibility for any problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example===&lt;br /&gt;
&lt;br /&gt;
==== 1st step : Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====2nd step : Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you execute your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====  Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
In the version [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip  v0.3], the following examples can be found:&lt;br /&gt;
* writing&lt;br /&gt;
* Wii&lt;br /&gt;
* audio parameters (pitch, periodicity, energy)&lt;br /&gt;
* voice (mfcc)&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Frédéric Bevilacqua, Rémy Muller, Norbert Schnell, Fabrice Guédy, Jean-Philippe Lambert, Aymeric Devergié, Anthony Sypniewski, Bruno Zamborlin&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2453</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2453"/>
				<updated>2008-06-11T12:18:53Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules that performs gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see Download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to each of the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway ? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load your computer can afford (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' is taken from a list, which can contain, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did someone say multimodal?)&lt;br /&gt;
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Feedback, problem reports, bug reports and feature requests of any kind are welcome, and we will do our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We assume no responsibility for any problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example===&lt;br /&gt;
====Overview ====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes&lt;br /&gt;
in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st step : Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====2nd step : Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you execute your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====  Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
In the version [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip  v0.3], the following examples can be found:&lt;br /&gt;
* writing&lt;br /&gt;
* Wii&lt;br /&gt;
* audio parameters (pitch, periodicity, energy)&lt;br /&gt;
* voice (mfcc)&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Frédéric Bevilacqua, Rémy Muller, Norbert Schnell, Fabrice Guédy, Jean-Philippe Lambert, Aymeric Devergié, Anthony Sypniewski, Bruno Zamborlin&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2452</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2452"/>
				<updated>2008-06-11T12:17:41Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules that performs gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see Download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to each of the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway ? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load your computer can afford (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' is taken from a list, which can contain, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did someone say multimodal?)&lt;br /&gt;
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Feedback, problem reports, bug reports and feature requests of any kind are welcome, and we will do our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We assume no responsibility for any problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example===&lt;br /&gt;
====Overview ====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes&lt;br /&gt;
in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st step : Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====2nd step : Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you execute your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====  Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
In the version [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3], the following examples can be found:&lt;br /&gt;
* writing&lt;br /&gt;
* Wii&lt;br /&gt;
* audio parameters (pitch, periodicity, energy)&lt;br /&gt;
* voice (mfcc)&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Frédéric Bevilacqua, Rémy Muller, Norbert Schnell, Fabrice Guédy, Jean-Philippe Lambert, Aymeric Devergié, Anthony Sypniewski, Bruno Zamborlin&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2451</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2451"/>
				<updated>2008-06-11T12:15:17Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Credits and Aknowledgements */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules that performs gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see Download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to each of the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway ? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at relatively low frequency compared to sound. With the current implementation in Max/MSP the frequency sampling period must be at least 1 milisecond, but typically, 10-20 milisecond is recommended. There are no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit for the dimension of the gesture data (or number of sensor channel) other than what your computer can afford as a CPU load (for example 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
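Such a combination is simply one flat list per time step. As a rough Python sketch of what one sample of the multi-dimensional temporal curve contains (this is not part of the gesture follower, and all sensor names and values below are hypothetical; in Max/MSP this is just a list):&lt;br /&gt;

```python
# Sketch only: assembling one multimodal frame for the follower.
# In Max/MSP this is just a flat list; names and values are hypothetical.

def make_frame(mouse_xy, accel_xyz, pitch_hz):
    """Concatenate heterogeneous sensor readings into one flat,
    multi-dimensional observation (one sample of the gesture curve)."""
    return list(mouse_xy) + list(accel_xyz) + [pitch_hz]

frame = make_frame((0.25, 0.75), (0.01, -0.98, 0.12), 440.0)
# frame == [0.25, 0.75, 0.01, -0.98, 0.12, 440.0], a 6-dimensional sample
```

Each such frame, sent at a regular period (e.g. every 10-20 ms), forms one point of the gesture curve.&lt;br /&gt;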
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Any feedback, problems, bug reports, or feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless... this is a work in progress!!! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example ===&lt;br /&gt;
==== Overview ====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st step: Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd step: Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step: Observe ====&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you realize your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
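The velocity curves shown above can be recovered from the raw mouse samples by a simple finite difference. A minimal Python sketch, assuming a fixed sampling period in milliseconds (the trajectory values below are hypothetical):&lt;br /&gt;

```python
# Sketch: per-axis velocity of a sampled 2D trajectory, like the
# X/Y velocity curves displayed by the example patch.

def velocities(samples, dt_ms):
    """Finite-difference velocity (units per second) along each axis,
    for samples taken every dt_ms milliseconds."""
    out = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        out.append(((x1 - x0) * 1000.0 / dt_ms, (y1 - y0) * 1000.0 / dt_ms))
    return out

traj = [(0.0, 0.0), (2.0, 1.0), (5.0, 1.0)]  # one sample every 20 ms
velocities(traj, 20)  # [(100.0, 50.0), (150.0, 0.0)]
```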
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ====&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
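To give an idea of how simple the message format is, here is a minimal Python sketch that encodes an OSC 1.0 message by hand; the address /gf/data is made up, and in practice an existing OSC library (or the EyesWeb and Max/MSP OSC objects) does this for you:&lt;br /&gt;

```python
# Sketch: hand-encoding an OSC 1.0 message (address + type tags + args).
# Strings are null-terminated and padded to 4 bytes; floats are 32-bit
# big-endian. The address "/gf/data" is hypothetical.
import struct

def osc_pad(raw):
    """Null-terminate and pad a byte string to a multiple of 4 bytes."""
    raw += b"\x00"
    return raw + b"\x00" * (-len(raw) % 4)

def osc_message(address, *floats):
    msg = osc_pad(address.encode("ascii"))                     # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for value in floats:
        msg += struct.pack(">f", value)                        # 32-bit big-endian
    return msg

packet = osc_message("/gf/data", 0.5, 1.0)  # ready to send over UDP
```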
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements ==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Frédéric Bevilacqua, Rémy Muller, Norbert Schnell, Fabrice Guédy, Jean-Philippe Lambert, Aymeric Devergié, Anthony Sypniewski, Bruno Zamborlin&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2450</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2450"/>
				<updated>2008-06-11T12:14:03Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Credits and Aknowledgements */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules to perform gesture recognition and following in real-time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, a period of 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or the number of sensor channels) other than what your computer can afford in CPU load (for example, 20 dimensions is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Any feedback, problems, bug reports, or feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless... this is a work in progress!!! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example ===&lt;br /&gt;
==== Overview ====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st step: Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd step: Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step: Observe ====&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you realize your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ====&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements ==&lt;br /&gt;
Real Time Musical Interaction - Ircam - CNRS STMS&lt;br /&gt;
Frédéric Bevilacqua&lt;br /&gt;
Norbert Schnell&lt;br /&gt;
Fabrice Guédy &lt;br /&gt;
Jean-Philippe Lambert&lt;br /&gt;
Rémy Muller&lt;br /&gt;
Bruno Zamborlin&lt;br /&gt;
Anthony Sypniewski &lt;br /&gt;
Aymeric Devergié&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2449</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2449"/>
				<updated>2008-06-11T12:09:17Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== What is the ''gesture follower'' ? ==&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules to perform gesture recognition and following in real-time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
== What is a ''gesture'' anyway? ==&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, a period of 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or the number of sensor channels) other than what your computer can afford in CPU load (for example, 20 dimensions is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
== Download, license and referencing==&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
== Getting help and disclaimer ==&lt;br /&gt;
Any feedback, problems, bug reports, or feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless... this is a work in progress!!! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Getting started ==&lt;br /&gt;
&lt;br /&gt;
=== LCD example ===&lt;br /&gt;
==== Overview ====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st step: Record gestures ====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd step: Compare ====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd step: Observe ====&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you realize your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ====&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
== Credits and Acknowledgements ==&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=FTM%26Co&amp;diff=2444</id>
		<title>FTM&amp;Co</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=FTM%26Co&amp;diff=2444"/>
				<updated>2008-05-28T15:29:22Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__NOTOC__&lt;br /&gt;
FTM is an extension for [http://www.cycling74.com/ Max/MSP] and [http://puredata.info PureData] providing a small and simple real-time object system and a set of optimized services.&lt;br /&gt;
&lt;br /&gt;
The [http://puredata.info/ PureData] version of FTM is under development.&lt;br /&gt;
&lt;br /&gt;
* [[About | Read more about FTM]]&lt;br /&gt;
* [[Download | Download FTM]]&lt;br /&gt;
* [http://listes.ircam.fr/wws/info/ftm Join the FTM mailing-list]&lt;br /&gt;
&lt;br /&gt;
* [[Credits]]&lt;br /&gt;
&lt;br /&gt;
== Documentation ==&lt;br /&gt;
* [[Documentation]]&lt;br /&gt;
* [[Packages]]&lt;br /&gt;
* [[FAQ]]&lt;br /&gt;
* [[Examples]]&lt;br /&gt;
&lt;br /&gt;
== [[Applications]] ==&lt;br /&gt;
* [http://imtr.ircam.fr/index.php?CataRT CataRT]&lt;br /&gt;
* [[Gesture Follower]]&lt;br /&gt;
&lt;br /&gt;
== Development ==&lt;br /&gt;
* [http://trac.ircam.fr/projects/ftm-and-co Bug reports, feature requests]&lt;br /&gt;
* [http://sourceforge.net/projects/ftm/ FTM @ SourceForge.net]&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=MnM&amp;diff=2337</id>
		<title>MnM</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=MnM&amp;diff=2337"/>
				<updated>2008-02-10T21:36:40Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Mapping is not Music */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Mapping is not Music ==&lt;br /&gt;
[[MnM]] is a set of Max/MSP externals based on [[FTMlib]] providing a unified framework for various techniques of classification, recognition and mapping for motion capture data, sound and music.&lt;br /&gt;
&lt;br /&gt;
Features currently implemented in MnM include:&lt;br /&gt;
* Hidden Markov Models, see for example [[Gesture Follower]]&lt;br /&gt;
* Principal Components Analysis&lt;br /&gt;
* Singular Value Decomposition, LU and QR decompositions&lt;br /&gt;
* Non-negative Matrix Factorization and sparse decomposition&lt;br /&gt;
* multi-dimensional M-to-N mapping based on examples&lt;br /&gt;
* Multi-dimensional autocorrelation&lt;br /&gt;
* Matrix/Vector Statistics (min, max, mean, std, histogram, Mahalanobis distance)&lt;br /&gt;
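As an illustration of the matrix/vector statistics listed above, here is a minimal Python sketch of the Mahalanobis distance for the special case of a diagonal covariance (the data are hypothetical; the actual MnM externals operate on FTM matrices):&lt;br /&gt;

```python
# Sketch: Mahalanobis distance with a diagonal covariance matrix,
# i.e. each dimension has its own mean and variance. Hypothetical data.
import math

def mahalanobis_diag(x, mean, var):
    """Distance of vector x from a distribution described by
    per-dimension means and variances (diagonal covariance)."""
    return math.sqrt(sum((xi - mi) ** 2 / vi
                         for xi, mi, vi in zip(x, mean, var)))

mahalanobis_diag([2.0, 0.0], [0.0, 0.0], [4.0, 1.0])  # 1.0
```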
&lt;br /&gt;
The MnM package is released within the [[Download | FTM distributions]].&lt;br /&gt;
&lt;br /&gt;
== Publications ==&lt;br /&gt;
{{:MnM Publications}}&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2197</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2197"/>
				<updated>2007-11-09T10:12:11Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Download, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules to perform gesture recognition and following in real-time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, a period of 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or the number of sensor channels) other than what your computer can afford in CPU load (for example, 20 dimensions is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesture_follower_v0.3.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Any feedback, problems, bug reports, or feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless... this is a work in progress!!! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example ====&lt;br /&gt;
===== Overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step: Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 2nd step: Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step: Observe =====&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you realize your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Connection with EyesWeb XMI =====&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements ===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2196</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2196"/>
				<updated>2007-11-09T00:12:32Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules to perform gesture recognition and following in real-time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, a period of 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or the number of sensor channels) other than what your computer can afford in CPU load (for example, 20 dimensions is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/GestureFollower.v03.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Any feedback, problems, bug reports, or feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless... this is a work in progress!!! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example ====&lt;br /&gt;
===== Overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step: Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 2nd step: Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step: Observe =====&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you realize your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Connection with EyesWeb XMI =====&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements ===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2195</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2195"/>
				<updated>2007-11-09T00:12:11Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules to perform gesture recognition and following in real-time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, a period of 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or the number of sensor channels) other than what your computer can afford in CPU load (for example, 20 dimensions is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensors data, etc...&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
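As a rough sketch of this input format (plain Python, not the actual Max/MSP patch; the names record_gesture and circle are hypothetical, for illustration only), a gesture is simply a sequence of fixed-size frames, one list of channel values per sampling period:&lt;br /&gt;

```python
import math

def record_gesture(sample_fn, n_samples, period_ms=10):
    """Sample a multi-channel signal into a gesture: a list of frames.

    sample_fn(t_ms) returns one frame (a list of channel values),
    mirroring the flat lists fed to the gesture follower in Max/MSP."""
    return [sample_fn(i * period_ms) for i in range(n_samples)]

# Hypothetical 2-channel source: a circular mouse trajectory.
def circle(t_ms):
    return [math.cos(t_ms / 100.0), math.sin(t_ms / 100.0)]

# Half a second of data at a 10 ms sampling period: 50 frames of 2 channels.
gesture = record_gesture(circle, n_samples=50, period_ms=10)
```

Each frame here has 2 channels, but the same shape holds for any number of sensor channels.&lt;br /&gt;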
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/GestureFollower.v03.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd step : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
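To make the message format concrete, here is a minimal sketch (Python standard library only; the address /gesture and the port number are hypothetical examples, not part of the gesture follower's documented interface) of how an OSC message carrying float arguments is laid out on the wire:&lt;br /&gt;

```python
import struct

def osc_message(address, *floats):
    """Encode an OSC 1.0 message whose arguments are all float32.

    Layout: null-padded address string, null-padded type-tag string
    (',' plus one 'f' per argument), then big-endian float32 values."""
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)  # pad to a multiple of 4 bytes
    tags = "," + "f" * len(floats)
    return pad(address) + pad(tags) + b"".join(struct.pack(">f", v) for v in floats)

# A 2-channel sample packed as one OSC message.
msg = osc_message("/gesture", 0.5, -0.25)

# In practice this datagram would be sent over UDP, e.g. to a
# [udpreceive 8000] object in Max/MSP on the local machine:
#   import socket
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 8000))
```

In real patches a ready-made OSC library or Max's own OSC objects would be used instead; the sketch only illustrates the message-based nature of the protocol.&lt;br /&gt;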
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
The messages used by the various modules are listed here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/gesturefollower-reference.01.pdf ref]&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=File:Gesturefollower-doc01.pdf&amp;diff=2194</id>
		<title>File:Gesturefollower-doc01.pdf</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=File:Gesturefollower-doc01.pdf&amp;diff=2194"/>
				<updated>2007-11-09T00:08:18Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2193</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2193"/>
				<updated>2007-11-08T23:22:43Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Download, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://recherche.ircam.fr/equipes/temps-reel/gesturefollower/GestureFollower.v03.zip v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd step : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2192</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2192"/>
				<updated>2007-11-08T23:08:08Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Download, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
The latest version, v0.3, can be downloaded here: [http://  v0.3]&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd step : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2191</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2191"/>
				<updated>2007-11-08T23:05:57Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====overview =====&lt;br /&gt;
(Note: the pictures relate to a previous release of the follower; there have been some changes in the graphic design.)&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st step : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd step : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd step : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2181</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2181"/>
				<updated>2007-11-07T11:19:37Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====Tutorial Workspace : overview =====&lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st STEP : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd STEP : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd STEP : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Credits and Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2180</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2180"/>
				<updated>2007-11-07T11:18:53Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any type, problems, bug reports, and feature requests are welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
=====Tutorial Workspace : overview =====&lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st STEP : Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings : a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====2nd STEP : Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd STEP : Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory. This gives useful temporal information on how you realize your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=====  Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2179</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2179"/>
				<updated>2007-11-07T11:18:28Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
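To make the numbers above concrete, here is a small illustrative sketch in plain Python (hypothetical names, not part of the gesture follower itself) of what such input data looks like: a two-dimensional mouse-like trajectory sampled every 20 milliseconds, i.e. one frame of values per sample period.&lt;br /&gt;

```python
import math

SAMPLE_PERIOD_MS = 20  # within the recommended 10-20 ms range

def circle_gesture(duration_ms: int) -> list:
    """Synthesize a circular 2-D trajectory, one [x, y] frame per sample period."""
    frames = []
    for t in range(0, duration_ms, SAMPLE_PERIOD_MS):
        phase = 2 * math.pi * t / duration_ms
        frames.append([math.cos(phase), math.sin(phase)])
    return frames

gesture = circle_gesture(1000)  # 1 second of data -> 50 frames of dimension 2
```

A 20-channel sensor rig would simply produce frames of length 20 instead of 2; the temporal structure is the same.&lt;br /&gt;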
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
==== LCD example====&lt;br /&gt;
===== Tutorial Workspace: overview =====&lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 1st STEP: Record gestures =====&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 2nd STEP: Compare =====&lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== 3rd STEP: Observe ===== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Connection with EyesWeb XMI ===== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2178</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2178"/>
				<updated>2007-11-06T21:53:46Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Download, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. Copyright 2004-2007 IRCAM - Centre Pompidou.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2177</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2177"/>
				<updated>2007-11-06T21:52:23Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Getting help and disclaimer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the [http://listes.ircam.fr/wws/info/ftm FTM list]. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2176</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2176"/>
				<updated>2007-11-06T21:51:06Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is the ''gesture follower'' ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the FTM list. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2175</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2175"/>
				<updated>2007-11-06T21:50:34Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Getting help and disclaimer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the FTM list. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, this is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2174</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2174"/>
				<updated>2007-11-06T21:49:48Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Getting help and Disclaimer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond, but typically 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc.)&lt;br /&gt;
* mouse or joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc.)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that FTM must be installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and disclaimer ===&lt;br /&gt;
Feedback of any kind (problems, bug reports, feature requests) is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the FTM list. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, ...&lt;br /&gt;
==== Disclaimer ====&lt;br /&gt;
This is a work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2173</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2173"/>
				<updated>2007-11-06T21:49:15Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Disclaimer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at relatively low frequency compared to sound. With the current implementation in Max/MSP the frequency sampling period must be at least 1 milisecond, but typically, 10-20 milisecond is recommended. There are no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit for the dimension of the gesture data (or number of sensor channel) other than what your computer can afford as a CPU load (for example 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Getting help and Disclaimer ===&lt;br /&gt;
Any type of feedback, problem, bug report, or feature request is welcome, and we will try our best to help you.&lt;br /&gt;
Please post any messages/questions directly to the FTM list. &lt;br /&gt;
&lt;br /&gt;
Nevertheless, ...&lt;br /&gt;
==== Disclaimer ====&lt;br /&gt;
This is work in progress! Use this software at your own risk. We do not assume any responsibility for possible problems caused by the use of this software.&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2172</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2172"/>
				<updated>2007-11-06T21:42:39Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2171</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2171"/>
				<updated>2007-11-06T21:41:26Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is the ''gesture follower'' ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is an ensemble of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to derive parameters from the comparison between a performance and a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2170</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2170"/>
				<updated>2007-11-06T21:39:42Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to be able to get parameters by comparing a performance with an ensemble of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Acknowledgements ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2169</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2169"/>
				<updated>2007-11-06T21:38:05Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to be able to get parameters by comparing a performance with an ensemble of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built in order to modify it ====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2168</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2168"/>
				<updated>2007-11-06T21:37:50Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to be able to get parameters by comparing a performance with an ensemble of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
Thanks!&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Getting started ===&lt;br /&gt;
&lt;br /&gt;
==== The simplest patch ====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built in order to modify it ====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace: overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP: Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP: Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP: Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectories. This gives useful temporal information on how you perform your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol which was originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2167</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2167"/>
				<updated>2007-11-06T21:36:51Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Disclaimer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower''? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to be able to get parameters by comparing a performance with an ensemble of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the two following questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (or number of sensor channels) other than what your computer can afford as CPU load (for example, 20 is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' is included free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. All commercial rights are reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate, please cite the Real Time Interaction Team, IRCAM - Centre Pompidou, or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
Thanks!&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2166</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2166"/>
				<updated>2007-11-06T21:36:08Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate please cite the Real Time Interaction Team, IRCAM - Centre Pompidou or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
Thanks!&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is a work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is primarily aut&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2165</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2165"/>
				<updated>2007-11-06T21:35:40Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of [http://ftm.ircam.fr/index.php/Download FTM], in the folder .../MnM.BETA/examples/gesture_follower/. Note that you must have FTM installed.&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate please cite the Real Time Interaction Team, IRCAM - Centre Pompidou or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
Thanks!&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is a work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is primarily aut&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2164</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2164"/>
				<updated>2007-11-06T21:33:11Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of FTM, in the folder /MnM.BETA/examples/gesture_follower/&lt;br /&gt;
Note that you must have FTM installed; see [http://ftm.ircam.fr/index.php/Download FTM].&lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved.&lt;br /&gt;
&lt;br /&gt;
If appropriate please cite the Real Time Interaction Team, IRCAM - Centre Pompidou or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proc. of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is a work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is primarily aut&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2163</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2163"/>
				<updated>2007-11-06T21:30:55Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of FTM; see [http://ftm.ircam.fr/index.php/Download FTM].&lt;br /&gt;
Note that you must have FTM installed. &lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. &lt;br /&gt;
If appropriate please cite the Real Time Interaction Team, IRCAM - Centre Pompidou or reference the following article:&lt;br /&gt;
F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, [http://mediatheque.ircam.fr/articles/textes/Bevilacqua07a/ Wireless sensor interface and gesture-follower for music pedagogy], Proc. of the International Conference on New Interfaces for Musical Expression (NIME 07), New York, NY, USA, pp. 124-12, 2007.&lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is a work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is primarily aut&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2162</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2162"/>
				<updated>2007-11-06T21:27:36Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Dowload, license and referencing&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The current version of the ''gesture follower'' comes free with the download of FTM; see [http://ftm.ircam.fr/index.php/Download FTM].&lt;br /&gt;
Note that you must have FTM installed. &lt;br /&gt;
&lt;br /&gt;
This software is intended for artistic work and/or scientific research. Any commercial use is reserved. &lt;br /&gt;
If appropriate please cite the Real Time Interaction Team, IRCAM - Centre Pompidou or reference the following article:&lt;br /&gt;
F. Bevilacqua, &lt;br /&gt;
&lt;br /&gt;
=== Disclaimer ===&lt;br /&gt;
This is a work in progress. Use this software at your own risk. We do not assume any responsibility for possible problems.&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is primarily aut&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2161</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2161"/>
				<updated>2007-11-06T21:09:12Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is the ''gesture follower'' ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a ''gesture'' anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than what your computer can afford as CPU load (for example, 20 channels is generally no problem). &lt;br /&gt;
&lt;br /&gt;
In Max/MSP the data feeding the ''gesture follower'' can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data&lt;br /&gt;
* any combination of the above (did you say multimodal?)&lt;br /&gt;
&lt;br /&gt;
=== Download, license and referencing ===&lt;br /&gt;
The gesture follower&lt;br /&gt;
&lt;br /&gt;
First of all, you need to have FTM installed; see [http://ftm.ircam.fr/index.php/Download FTM].&lt;br /&gt;
Then two options&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. &lt;br /&gt;
2.&lt;br /&gt;
=== References ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information on how you execute your drawing.&lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2160</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2160"/>
				<updated>2007-11-06T20:59:51Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* I just want to know how to use the patch */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated into the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures?)&lt;br /&gt;
* where are we? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load (for example, 20 dimensions is no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. You need to have FTM installed, see [http://ftm.ircam.fr/index.php/Download FTM]&lt;br /&gt;
2.&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
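The velocity curves shown in this step can be sketched conceptually in plain Python (a hypothetical illustration: the names velocities, samples and dt_ms are made up for this sketch and are not part of the actual MnM patch):

```python
# Hypothetical sketch of the velocity curves shown in step 3:
# finite differences of sampled mouse coordinates, per axis.
# "samples" and "dt_ms" are illustrative names, not part of the MnM patch.

def velocities(samples, dt_ms=20.0):
    """samples: list of (x, y) mouse positions taken every dt_ms milliseconds.
    Returns a list of (vx, vy) pairs in units per second."""
    out = []
    for prev, cur in zip(samples, samples[1:]):
        vx = (cur[0] - prev[0]) * 1000.0 / dt_ms
        vy = (cur[1] - prev[1]) * 1000.0 / dt_ms
        out.append((vx, vy))
    return out

print(velocities([(0, 0), (2, 1), (5, 1)]))  # [(100.0, 50.0), (150.0, 0.0)]
```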
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2159</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2159"/>
				<updated>2007-11-06T20:56:48Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* I just want to know how to use the patch */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load (for example, 20 dimensions is no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
1. You need to have FTM installed, see [http://ftm.ircam.fr/index.php/Download FTM]&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2158</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2158"/>
				<updated>2007-11-06T20:54:51Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* I want to know how it is buit to modifyit */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load (for example, 20 dimensions is no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2157</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2157"/>
				<updated>2007-11-06T20:54:06Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Documentation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load (for example, 20 dimensions is no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
==== I just want to know how to use the patch====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== I want to know how it is built to modify it====&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2156</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2156"/>
				<updated>2007-11-06T20:52:53Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is a &amp;quot;gesture&amp;quot; anyway ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end of the gesture)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be at least 1 millisecond; typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the gesture data (i.e. the number of sensor channels) other than the CPU load (for example, 20 dimensions is no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2155</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2155"/>
				<updated>2007-11-06T18:06:41Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is &amp;quot;the gesture follower&amp;quot; ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end ...)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be larger than 1 millisecond. Typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the data other than the CPU load, which could increase dangerously (we did more than 20 dimensions with no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2154</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2154"/>
				<updated>2007-11-06T18:05:40Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is a &amp;quot;gesture&amp;quot; anyway ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end ...)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be larger than 1 millisecond. Typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
There is no technical limit on the dimension of the data other than the CPU load, which could increase dangerously (we did more than 20 dimensions with no problem). &lt;br /&gt;
In Max/MSP the data can be taken from a list, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* any sensor data, etc.&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2153</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2153"/>
				<updated>2007-11-06T18:00:38Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* What is a &amp;quot;gesture&amp;quot; anyway ? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end ...)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''multi-dimensional temporal curve'', sampled at a relatively low frequency compared to sound. With the current implementation in Max/MSP the sampling period must be larger than 1 millisecond. Typically, 10-20 milliseconds is recommended. There is no upper limit (if you have time...).&lt;br /&gt;
&lt;br /&gt;
Technically, in Max/MSP the data can be taken from any stream of lists, for example:&lt;br /&gt;
&lt;br /&gt;
* sound parameters (pitch, amplitude, etc)&lt;br /&gt;
* mouse, joystick coordinates&lt;br /&gt;
* parameters from video tracking (EyesWeb, Jitter, etc)&lt;br /&gt;
* Wiimote&lt;br /&gt;
* MIDI&lt;br /&gt;
* etc...&lt;br /&gt;
* any combination of the above (you said multimodal ?)&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2152</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2152"/>
				<updated>2007-11-06T17:52:02Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== What is &amp;quot;the gesture follower&amp;quot; ? ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules for performing gesture recognition and following in real time. It is integrated in the MnM toolbox of the FTM library (see download). The general idea behind it is to get parameters by comparing a performance with a set of prerecorded examples. &lt;br /&gt;
&lt;br /&gt;
The gesture follower can answer the following two questions:&lt;br /&gt;
* which gesture is it ? (if you don't like black-and-white answers, you can get &amp;quot;greyscale&amp;quot; answers: how close are you to the recorded gestures ? )&lt;br /&gt;
* where are we ? (beginning, middle or end ...)&lt;br /&gt;
&lt;br /&gt;
=== What is a &amp;quot;gesture&amp;quot; anyway ? ===&lt;br /&gt;
&lt;br /&gt;
A gesture here can be any ''temporal curve'', sampled at relatively low frequency compared to sound. Typically, from 1000 Hz to 0.001 Hz.&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity of the mouse trajectory along the X and Y axes, which gives useful temporal information about how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC protocol (Open Sound Control). OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2151</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=2151"/>
				<updated>2007-11-06T17:12:16Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== Description ===&lt;br /&gt;
&lt;br /&gt;
The ''gesture follower'' is a set of Max/MSP modules integrated in the MnM toolbox of the FTM library. The gesture follower is developed with the general goal of comparing a gesture in real time with prerecorded examples. The comparison mechanisms we implemented, following and recognition, are further explained in the next subsections.&lt;br /&gt;
&lt;br /&gt;
=== Documentation ===&lt;br /&gt;
&lt;br /&gt;
=== Examples ===&lt;br /&gt;
&lt;br /&gt;
=== Links ===&lt;br /&gt;
&lt;br /&gt;
=== Download ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Tutorial Workspace : overview ==== &lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 1st STEP : Record gestures ==== &lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 2nd STEP : Compare ==== &lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== 3rd STEP : Observe ==== &lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
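Such velocity curves can be approximated from the sampled mouse positions by finite differences. A minimal sketch, assuming uniformly spaced samples and a hypothetical 10 ms sampling period (the sample values are made up):

```python
def velocity(positions, dt):
    """Central-difference velocity estimate for a uniformly sampled
    1-D coordinate track (e.g. mouse X positions), dt in seconds."""
    v = []
    for i in range(1, len(positions) - 1):
        v.append((positions[i + 1] - positions[i - 1]) / (2.0 * dt))
    return v

xs = [0.0, 1.0, 4.0, 9.0, 16.0]  # hypothetical mouse X samples
vx = velocity(xs, dt=0.01)       # 10 ms sampling period
```

Applying the same function to the Y samples yields the second curve; an accelerating stroke shows up as a rising velocity curve.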
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====  Connection with EyesWeb XMI ==== &lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=1818</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=1818"/>
				<updated>2007-05-09T12:46:40Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Test */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Gesture Follower tutorial ==&lt;br /&gt;
&lt;br /&gt;
The gesture follower is a set of Max/MSP modules integrated in the MnM toolbox of the FTM library. The gesture follower is developed with the general goal of comparing a gesture in real time with prerecorded examples. The comparison mechanisms we implemented, following and recognition, are further explained in the next subsections.&lt;br /&gt;
&lt;br /&gt;
== Workspace : overview ==&lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1st STEP : Record gestures ==&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2nd STEP : Compare ==&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 3rd STEP : Observe ==&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connection with EyesWeb XMI ==&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	<entry>
		<id>https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=1817</id>
		<title>Gesture Follower</title>
		<link rel="alternate" type="text/html" href="https://ftm2.ircam.fr/index.php?title=Gesture_Follower&amp;diff=1817"/>
				<updated>2007-05-09T12:45:21Z</updated>
		
		<summary type="html">&lt;p&gt;Bevilacq: /* Test */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Gesture Follower tutorial ==&lt;br /&gt;
&lt;br /&gt;
The gesture follower is a set of Max/MSP modules integrated in the MnM toolbox of the FTM library. The gesture follower is developed with the general goal of comparing a gesture in real time with prerecorded examples. The comparison mechanisms we implemented, following and recognition, are further explained in the next subsections.&lt;br /&gt;
&lt;br /&gt;
== Workspace : overview ==&lt;br /&gt;
&lt;br /&gt;
Get an overview of the interface functions.&lt;br /&gt;
[[Image:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1st STEP : Record gestures ==&lt;br /&gt;
&lt;br /&gt;
Let’s start with two simple drawings: a triangle and a circle.&lt;br /&gt;
[[Image:Example2.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2nd STEP : Compare ==&lt;br /&gt;
&lt;br /&gt;
Draw a figure and then see how similar it is to your two reference drawings.&lt;br /&gt;
[[Image:Example3.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 3rd STEP : Observe ==&lt;br /&gt;
&lt;br /&gt;
Pay attention to the curves below. They represent the velocity along the X and Y axes of the mouse trajectory, which gives useful temporal information on how you performed your drawing. &lt;br /&gt;
[[Image:Example4.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connection with EyesWeb XMI ==&lt;br /&gt;
&lt;br /&gt;
EyesWeb XMI, the open platform for real-time analysis of multimodal interaction, can be connected to Max/MSP through the OSC (Open Sound Control) protocol. OSC is an open, message-based protocol originally developed for communication between computers and synthesizers (cf. wiki).&lt;br /&gt;
&lt;br /&gt;
== Test ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;html&amp;gt;&lt;br /&gt;
&amp;lt;head&amp;gt;&lt;br /&gt;
&amp;lt;title&amp;gt;jit.matrix&amp;lt;/title&amp;gt;&lt;br /&gt;
&amp;lt;/head&amp;gt;&lt;br /&gt;
&amp;lt;body bgcolor=#F7F7F7 link=#000000 vlink=#000000&amp;gt;&lt;br /&gt;
&amp;lt;table&amp;gt;&amp;lt;tr&amp;gt;&amp;lt;td&amp;gt;&amp;lt;a href=&amp;quot;index.html&amp;quot;&amp;gt;&amp;lt;img src=&amp;quot;images/jitter_smallest.gif&amp;quot; border=0&amp;gt;&amp;lt;/a&amp;gt;&amp;lt;br&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC width=30% valign=bottom&amp;gt;&lt;br /&gt;
&amp;lt;font size=+3 face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.matrix&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD width=70% align=right&amp;gt;&amp;lt;font size=+1 face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The Jitter Matrix!&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The &amp;lt;b&amp;gt;jit.matrix&amp;lt;/b&amp;gt; object is a named matrix which may be used for &lt;br /&gt;
matrix data storage and retrieval, resampling, and &lt;br /&gt;
matrix type and planecount conversion operations.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCDD&amp;gt;&amp;lt;font size=+1 face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;a href=&amp;quot;group-attributes.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Attributes:&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC width=10%&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Name&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC width=5%&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Type&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC width=85%&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Description&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;adapt&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Matrix adaptation flag (default = 0 if matrix arguments are present, otherwise 1)&lt;br /&gt;
When the flag is set, the &amp;lt;b&amp;gt;jit.matrix&amp;lt;/b&amp;gt; object will adapt to the incoming matrix planecount, type, and dimensions.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dimstride&amp;lt;/font&amp;gt; &amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;(get)&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The byte stride per dimension&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dim&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The dimensions of matrix data (default = 1 1)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dstdimend&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The destination dimension end position (default = all &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dim&amp;lt;/font&amp;gt; values minus 1)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dstdimstart&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The source dimension start position (default = all 0)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;interp&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Matrix interpolation flag (default = 0)&lt;br /&gt;
When the flag is set, the input matrix will be interpolated &lt;br /&gt;
when copied to the internal matrix.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;name&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The name of the matrix (default = UID)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;planecount&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The number of planes in matrix data (default = 4)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;planemap&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Maps input planes to output planes (default = 0 1 2 3 ...)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;size&amp;lt;/font&amp;gt; &amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;(get)&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Total byte size of matrix&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;srcdimend&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The source dimension end position (default = all &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dim&amp;lt;/font&amp;gt; values minus 1)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;srcdimstart&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int list[32]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The source dimension start position (default = all 0)&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;thru&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Thru mode flag (default = 1) When the flag is set, a matrix is output&lt;br /&gt;
when another one is received.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;type&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The matrix data type (default = &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;char&amp;lt;/font&amp;gt;)&amp;lt;br&amp;gt;&lt;br /&gt;
Supported data types are &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;char&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;long&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;float32&amp;lt;/font&amp;gt;, or &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;float64&amp;lt;/font&amp;gt;.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;usedstdim&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Destdim use flag (default = 0)&lt;br /&gt;
When the flag is set, the destination dimension's attributes are used when copying an input matrix &lt;br /&gt;
to an internal matrix.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7CCCC valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;usesrcdim&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#DDDDDD valign=top&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC valign=top&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Srcdim use flag (default = 0)&lt;br /&gt;
When the flag is set, the source dimension's attributes are used when copying an input matrix &lt;br /&gt;
to an internal matrix.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCDD&amp;gt;&amp;lt;font size=+1 face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Messages:&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bang &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;outputmatrix&amp;lt;/font&amp;gt; message.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;clear &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sets all matrix values to zero.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;exportimage &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;filename&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] { [&amp;lt;i&amp;gt;file-type&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] } { [&amp;lt;i&amp;gt;use-dialog&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;)] }&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Export the current frame as an image file with the name specified by the first argument. The&lt;br /&gt;
optional second argument sets the file type (default = png). Available file types are&lt;br /&gt;
&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;png&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bmp&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;jpeg&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;macpaint&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;photoshop&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;pict&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;qtimage&amp;lt;/font&amp;gt;, &lt;br /&gt;
&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;sgi&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;tga&amp;lt;/font&amp;gt; and &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;tiff&amp;lt;/font&amp;gt;. An optional &amp;lt;i&amp;gt;use-dialog&amp;lt;/i&amp;gt; argument of &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;1&amp;lt;/font&amp;gt; will open a&lt;br /&gt;
file dialog to allow you to enter the image file settings.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;exportmovie &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; { [&amp;lt;i&amp;gt;filename&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] } [&amp;lt;i&amp;gt;FPS&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;float&amp;lt;/font&amp;gt;)] [&amp;lt;i&amp;gt;codec&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] [&amp;lt;i&amp;gt;quality&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] [&amp;lt;i&amp;gt;timescale&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Exports a matrix as a QuickTime movie. The &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;exportmovie&amp;lt;/font&amp;gt; message takes an optional argument&lt;br /&gt;
to specify a file name. If no filename is specified, a file dialog will open to let you choose a file.&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
The default &amp;lt;i&amp;gt;FPS&amp;lt;/i&amp;gt; is 30 frames per second.&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
The default &amp;lt;i&amp;gt;codec&amp;lt;/i&amp;gt; is &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;raw&amp;lt;/font&amp;gt;. Supported codecs are&lt;br /&gt;
&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;raw&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;cinepak&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;graphics&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;animation&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;video&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;componentvideo&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;jpeg&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;mjpega&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;mjpegb&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;sgi&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;planarrgb&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;macpaint&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;gif&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;photocd&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;qdgx&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;avrjpeg&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;opendmljpeg&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bmp&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;winraw&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;vector&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;qd&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;h261&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;h263&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dvntsc&amp;lt;/font&amp;gt;, &amp;lt;font 
face=&amp;quot;Courier&amp;quot;&amp;gt;dvpal&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dvprontsc&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dvpropal&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;flc&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;targa&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;png&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;tiff&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;componentvideosigned&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;componentvideounsigned&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;cmyk&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;microsoft&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;sorenson&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;indeo4&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;argb64&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;rgb48&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;alphagrey32&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;grey16&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;mpegyuv420&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;yuv420&amp;lt;/font&amp;gt;, and &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;sorensonyuv9&amp;lt;/font&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
The default &amp;lt;i&amp;gt;quality&amp;lt;/i&amp;gt; is &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;max&amp;lt;/font&amp;gt;. Supported quality settings are&lt;br /&gt;
&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;lossless&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;max&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;min&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;low&amp;lt;/font&amp;gt;, &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;normal&amp;lt;/font&amp;gt;, and &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;high&amp;lt;/font&amp;gt;.&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
Note that minimum quality is, in many cases, the codec's default quality. Use &amp;quot;low&amp;quot; quality for consistent results.&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
The default &amp;lt;i&amp;gt;timescale&amp;lt;/i&amp;gt; is 600 units per second.&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;float &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setall [float]&amp;lt;/font&amp;gt; message, followed by the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bang&amp;lt;/font&amp;gt; message.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;getcell &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;position&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sends the value(s) in the cell specified by &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;position&amp;lt;/font&amp;gt; &lt;br /&gt;
out the right outlet of the object as a &lt;br /&gt;
list in the form &lt;br /&gt;
&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;cell&amp;lt;/font&amp;gt; &amp;lt;i&amp;gt;cell-position0&amp;lt;/i&amp;gt; ... &amp;lt;i&amp;gt;cell-positionN&amp;lt;/i&amp;gt; &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;val&amp;lt;/font&amp;gt; &amp;lt;i&amp;gt;plane0-value&amp;lt;/i&amp;gt; ... &amp;lt;i&amp;gt;planeN-value&amp;lt;/i&amp;gt;.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
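&amp;lt;p&amp;gt;&lt;br /&gt;
For example, in a hypothetical two-dimensional, two-plane &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;char&amp;lt;/font&amp;gt; matrix, the message &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;getcell 10 5&amp;lt;/font&amp;gt; might send a list such as &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;cell 10 5 val 255 0&amp;lt;/font&amp;gt; out the right outlet.&amp;lt;/p&amp;gt;&lt;br /&gt;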
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;importmovie &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; { [&amp;lt;i&amp;gt;filename&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] } [&amp;lt;i&amp;gt;time-offset&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Imports a QuickTime movie into the matrix. If no filename is specified, a file dialog will open to let you choose a file.&lt;br /&gt;
The &amp;lt;i&amp;gt;time-offset&amp;lt;/i&amp;gt; argument may be used to set a time offset for the QuickTime movie being imported (the default is 0). &lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setall [int]&amp;lt;/font&amp;gt; message, followed by the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bang&amp;lt;/font&amp;gt; message.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;exprfill &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;expression&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Evaluates &amp;lt;i&amp;gt;expression&amp;lt;/i&amp;gt; to fill the matrix. See &amp;lt;b&amp;gt;&amp;lt;a href=&amp;quot;jit.expr.html&amp;quot;&amp;gt;jit.expr&amp;lt;/a&amp;gt;&amp;lt;/b&amp;gt; for more information on expressions.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
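&amp;lt;p&amp;gt;&lt;br /&gt;
For example, the message &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;exprfill &amp;quot;snorm[0]&amp;quot;&amp;lt;/font&amp;gt; fills each cell with its signed-normalized horizontal position (-1 to 1); see &amp;lt;b&amp;gt;jit.expr&amp;lt;/b&amp;gt; for the full expression syntax.&amp;lt;/p&amp;gt;&lt;br /&gt;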
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;jit_gl_texture &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;texture-name&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Copies the texture specified by &amp;lt;i&amp;gt;texture-name&amp;lt;/i&amp;gt; to the matrix.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;jit_matrix &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;matrix-name&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Copies a matrix specified by &amp;lt;i&amp;gt;matrix-name&amp;lt;/i&amp;gt; to the matrix.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setall [list]&amp;lt;/font&amp;gt; message, followed by the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bang&amp;lt;/font&amp;gt; message.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;outputmatrix &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sends the matrix out the left outlet.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;read &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; { [&amp;lt;i&amp;gt;filename&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] }&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Reads Jitter binary data files (.jxf) into a matrix set.&lt;br /&gt;
If no filename is specified, a file dialog will open to let you choose a file.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;register &amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to setting the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;name&amp;lt;/font&amp;gt; attribute.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setall &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;value&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sets all cells to the value specified by &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;value(s)&amp;lt;/font&amp;gt;. If the matrix has multiple planes, &amp;lt;i&amp;gt;value&amp;lt;/i&amp;gt; may be a list with one value per plane. &lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setcell &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;position&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list&amp;lt;/font&amp;gt;)] { &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;plane&amp;lt;/font&amp;gt; [&amp;lt;i&amp;gt;plane-number&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;int&amp;lt;/font&amp;gt;)] }  &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;val&amp;lt;/font&amp;gt; [&amp;lt;i&amp;gt;value&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sets the cell specified by &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;position&amp;lt;/font&amp;gt; to the value specified by &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;value&amp;lt;/font&amp;gt;.&lt;br /&gt;
Position is specified as a list whose length is equal to the number of dimensions (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;dimcount&amp;lt;/font&amp;gt;). The &lt;br /&gt;
optional arguments &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;plane&amp;lt;/font&amp;gt; &amp;lt;i&amp;gt;plane-number&amp;lt;/i&amp;gt; can be used to specify a plane. If a plane is specified, &amp;lt;i&amp;gt;value&amp;lt;/i&amp;gt; should be a single number; otherwise it should be a list of numbers, one per plane (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;planecount&amp;lt;/font&amp;gt;). &lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
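&amp;lt;p&amp;gt;&lt;br /&gt;
For example, in a hypothetical two-dimensional, four-plane matrix, the message &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setcell 10 5 val 255 0 0 255&amp;lt;/font&amp;gt; sets all four planes of the cell at position (10, 5), while &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setcell 10 5 plane 2 val 255&amp;lt;/font&amp;gt; sets only plane 2 of that cell.&amp;lt;/p&amp;gt;&lt;br /&gt;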
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;val &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; [&amp;lt;i&amp;gt;value&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;list&amp;lt;/font&amp;gt;)]&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Equivalent to the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;setall&amp;lt;/font&amp;gt; message, followed by the &amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;bang&amp;lt;/font&amp;gt; message.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;write &amp;lt;/font&amp;gt;&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt; { [&amp;lt;i&amp;gt;filename&amp;lt;/i&amp;gt; (&amp;lt;font face=&amp;quot;Courier&amp;quot;&amp;gt;symbol&amp;lt;/font&amp;gt;)] }&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#D7D7CC&amp;gt;&lt;br /&gt;
&amp;lt;font face=&amp;quot;Times&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Writes the matrix set as a Jitter binary data file (.jxf). &lt;br /&gt;
If no filename is specified, a file dialog will open to let you choose a file.&lt;br /&gt;
&amp;lt;/font&amp;gt;&lt;br /&gt;
&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCDD&amp;gt;&amp;lt;font size=+1 face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;Example:&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
&amp;lt;img src=&amp;quot;images/jit.matrix.gif&amp;quot; border=0&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;&lt;br /&gt;
&amp;lt;table border=0 width=100%&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCDD&amp;gt;&amp;lt;font size=+1 face=&amp;quot;Times&amp;quot;&amp;gt;&amp;lt;b&amp;gt;See Also:&amp;lt;/b&amp;gt;&amp;lt;/font&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;tr&amp;gt;&lt;br /&gt;
&amp;lt;td bgcolor=#CCCCCC&amp;gt;&lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.coerce.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.coerce&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.fill.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.fill&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.matrixset.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.matrixset&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.matrixinfo.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.matrixinfo&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.peek~.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.peek~&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.poke~.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.poke~&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.spill.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.spill&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;, &lt;br /&gt;
&amp;lt;a href=&amp;quot;jit.submatrix.html&amp;quot;&amp;gt;&amp;lt;b&amp;gt;jit.submatrix&amp;lt;/b&amp;gt;&amp;lt;/a&amp;gt;&amp;lt;/td&amp;gt;&lt;br /&gt;
&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/body&amp;gt;&lt;br /&gt;
&amp;lt;/html&amp;gt;&lt;/div&gt;</summary>
		<author><name>Bevilacq</name></author>	</entry>

	</feed>