Examples of dance movement analysis from the ICI project (2016)

The ten cases below highlight tools developed for the analysis of dance in the ICI project, part of LABEX Arts-H2H at University Paris 8.

Case 1 exemplifies how motion tracking of two freely improvising dancers can be reduced to a two-dimensional space permitting quantification of their relationship. Case 2 demonstrates the development of movement complexity measures based on acceleration data. Case 3 presents first attempts at segmenting dance improvisation. Case 4 provides an example of our method for identifying moments of group joint attention during performance. Case 5 exemplifies the combination of 1st- and 3rd-person measures of attention. Case 6 is a short documentation of a grid game protocol that was developed to study joint decision making. Case 7 is a novel attempt to produce a quantitative analogue of Stern's (1984) concept of affective attunement. Case 8 concerns EEG measures of the quality of attention, Case 9 the four-person mirror game, and Case 10 a two-group rhythm synchronization battle.

Case 1: Style-based relationships between dancers

This video exemplifies how motion tracking of two freely improvising dancers can be reduced to a two-dimensional space permitting quantification of their relationship. The recorded motion (animated with the stick figures) was numerically described with features such as positions, velocities, and accelerations. The dimensionality of the feature set was then reduced with Principal Component Analysis (PCA) to produce the two components shown on the right in the video. The circles in the plot effectively represent the movement styles of the dancers: when the dancers move similarly in similar poses (see time 3:40), the circles are close to each other, while different movement styles, such as at time 2:25, place the circles at a large distance from each other.
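As a rough sketch of this pipeline, the following Python derives velocity and acceleration features from positions by finite differences and projects the stacked feature set onto two principal components via SVD. The feature layout, frame rate, and plain SVD-based PCA are our illustrative assumptions, not the project's exact implementation.

```python
import numpy as np

def motion_features_to_2d(positions, dt=1.0 / 120.0):
    """Reduce a motion-capture trajectory to two PCA components.

    positions: array of shape (frames, n_coordinates), a hypothetical
    marker layout; velocities and accelerations are approximated here
    by finite differences of the positions.
    """
    # Derive velocity and acceleration features by finite differences.
    vel = np.gradient(positions, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    features = np.hstack([positions, vel, acc])

    # Centre the features and project onto the two leading
    # principal components (computed via SVD).
    centred = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:2].T  # shape: (frames, 2)

# Toy example: nine sinusoidal coordinate channels standing in for
# tracked markers of one dancer.
t = np.linspace(0, 4 * np.pi, 500)
demo = np.column_stack([np.sin(t + k) for k in range(9)])
coords = motion_features_to_2d(demo)
print(coords.shape)  # (500, 2)
```

Plotting the two columns of `coords` for each dancer would give points analogous to the circles in the video.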

Case 2: Estimating complexity of dance movements

The video below demonstrates an acceleration-based estimate of movement complexity. Here the complexity attempts to capture whether the movements are simple and predictable or complex and unpredictable. The estimate is a combination of the variance in the acceleration signals and the size of the unexplained part of the signal in Principal Component Analysis (PCA). The blue and black lines in the video are the complexity estimates for the dancers wearing the same colors. It can be seen from the video that when the dancers perform similar actions, such as standing still or walking around, they have similar levels of complexity. Unexpected events also cause peaks in the complexity, such as at time 01:03 when a dancer falls to the floor.
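A minimal sketch of such an estimate is given below: per window, it combines the total variance of the acceleration channels with the fraction of variance not explained by the leading principal components. The window length, number of retained components, and the exact combination rule are our assumptions; the source only states that the two ingredients are combined.

```python
import numpy as np

def movement_complexity(acc, window=120, n_pc=3):
    """Windowed complexity estimate from accelerometer signals.

    acc: array of shape (frames, channels). Returns one complexity
    value per non-overlapping window.
    """
    scores = []
    for start in range(0, len(acc) - window + 1, window):
        seg = acc[start:start + window]
        seg = seg - seg.mean(axis=0)
        # Ingredient 1: total variance of the acceleration signals.
        var = seg.var(axis=0).sum()
        # Ingredient 2: fraction of variance NOT explained by the
        # first n_pc principal components (the "unexplained part").
        s = np.linalg.svd(seg, compute_uv=False)
        total = (s ** 2).sum()
        unexplained = 1.0 - (s[:n_pc] ** 2).sum() / total if total > 0 else 0.0
        # Illustrative combination: variance scaled up by the
        # unexplained fraction.
        scores.append(var * (1.0 + unexplained))
    return np.array(scores)
```

A sudden fall, as at 01:03 in the video, would raise both ingredients at once and thus produce a peak in the estimate.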

Case 3: Segmenting dance by combining subjective annotations and kinetic data

This video demonstrates the possibility of combining real-time subjective annotations by spectators with accelerometer data to create a segmentation of an improvised dance sequence. During the live performance, spectators were asked to indicate with a tablet interface when they felt that there was a start or an ending in the dance. The blue line in the video shows the annotations of the starts; for example, at time 00:12 there is a peak of 5 persons indicating a start within the next 5 seconds. During the performance, the movements of the dancers were recorded with accelerometers. The red line is a predictor for the starts that takes high values when there is a rise in the level of overall acceleration. The same approach can also be used for predicting perceived endings by finding drops in the level of acceleration.
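The start predictor can be sketched as follows: take the overall acceleration level and, at each frame, compare its mean just after the frame with its mean just before. The window length and the simple difference-of-means detector are our assumptions, not the project's exact predictor.

```python
import numpy as np

def start_predictor(acc, fs=100, window_s=2.0):
    """Predict perceived 'starts': high when overall acceleration rises.

    acc: array (frames, channels) of accelerometer signals.
    Negating the difference would instead look for drops, i.e.
    perceived endings.
    """
    level = np.linalg.norm(acc, axis=1)  # overall acceleration level
    w = int(window_s * fs)
    pred = np.zeros_like(level)
    for i in range(w, len(level) - w):
        # Rise = mean level just after frame i minus mean just before.
        pred[i] = level[i:i + w].mean() - level[i - w:i].mean()
    return np.clip(pred, 0, None)  # keep only rises

# Toy example: the dancers are still, then all start moving at once.
acc = np.zeros((1000, 3))
acc[500:] = 5.0
pred = start_predictor(acc)
print(int(np.argmax(pred)))  # 500: the predictor peaks at the onset
```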

Case 4: Visualization of spectators' (joint) attention

This video shows a solo dance (on the top), a reconstruction of annotations drawn by spectators with tablets (on the bottom left), and a heat map created from the annotations (on the bottom right). The task given to the spectators was to draw the targets of their attention over the background image. In the image, the blue and black figures stand for the dancers, the large figure at the bottom is meant for attention on self, and the smaller figures at the bottom represent the other persons in the audience. In the heat map, presses on single points have been expanded to small circles, drawn ellipses have been filled, and drawn lines have been made thicker. The heat map identifies moments of joint or common attention by the spectators. For example, at time 00:30, the heat map lights up as 10 persons are making annotations following a sharp motion from the dancer in black. The heat map enables precise analysis of the annotations, as regions corresponding, for example, to one dancer or one limb can be extracted from the map.
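The point-press part of the heat map can be sketched in a few lines: each press is expanded to a small filled circle on a grid, and overlapping circles from different spectators accumulate into hot spots. Grid size and circle radius are illustrative, and the filling of ellipses and thickening of lines are omitted here.

```python
import numpy as np

def annotation_heatmap(points, shape=(120, 160), radius=5):
    """Build a heat map from spectators' point annotations.

    points: list of (row, col) tablet presses; each press is expanded
    to a small filled circle, as in the video.
    """
    heat = np.zeros(shape)
    rr, cc = np.mgrid[0:shape[0], 0:shape[1]]
    for r, c in points:
        # Add 1 inside a disc of the given radius around the press.
        heat += ((rr - r) ** 2 + (cc - c) ** 2) <= radius ** 2
    return heat

# Two spectators pressing near the same spot produce overlapping
# circles, i.e. a joint-attention hot spot; a third press elsewhere
# stays at intensity 1.
hm = annotation_heatmap([(60, 80), (62, 82), (20, 20)])
print(hm.max())  # 2.0 where the first two circles overlap
```

Summing the map over a region of interest (one dancer, one limb) then gives the attention that region received at each moment.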

Case 5: Combining 1st- and 3rd-person measures of attention during performance

Eye-tracking pilot from Tommi Himberg on Vimeo.

Case 6: The grid game

Case 7: Estimating attunement from dance

This case is a novel attempt to produce a quantitative analogue of Stern's (1984) concept of affective attunement (here between a dancer and a spectator). We attempted to capture attunement by two measures that are based on recorded accelerations.

The first measure, shown in the upper graph, is a correspondence between acceleration histograms in a 5-second window. An example of a high value for the measure starts at time 1:35 in the video, where the dancer slows down, and thus both the dancer and the observer have low accelerations. While the behaviors have similarities in this case, we cannot be entirely sure whether this is a true indication of attunement or only a coincidence.

The second measure, shown in the lower graph, is a windowed correlation between accelerations in a 5-second window. An example of a high value can be found in the video starting at time 4:20, where both persons start an action from stillness with only a short delay. As both the dancer and the observer participate actively in the behaviour, it is a more likely indication of attunement than the behavior of the first example.
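The two measures can be sketched as below, using histogram intersection for the "correspondence" and Pearson correlation for the windowed correlation. The intersection measure, bin count, and sampling rate are our assumptions; the source only specifies that acceleration histograms and windowed correlations are compared over 5-second windows.

```python
import numpy as np

def attunement_measures(acc_a, acc_b, fs=100, window_s=5.0):
    """Two windowed attunement measures between dancer and observer.

    acc_a, acc_b: 1-D acceleration magnitude signals.
    Returns (correspondence, correlation), one value per window.
    """
    w = int(window_s * fs)
    corresp, correl = [], []
    for start in range(0, min(len(acc_a), len(acc_b)) - w + 1, w):
        a, b = acc_a[start:start + w], acc_b[start:start + w]
        # Measure 1: overlap (intersection) of the two acceleration
        # histograms, normalized to [0, 1].
        lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
        if hi <= lo:
            hi = lo + 1e-9  # guard against constant signals
        ha, _ = np.histogram(a, bins=20, range=(lo, hi))
        hb, _ = np.histogram(b, bins=20, range=(lo, hi))
        corresp.append(np.minimum(ha, hb).sum() / w)
        # Measure 2: Pearson correlation of the raw accelerations.
        if a.std() > 0 and b.std() > 0:
            correl.append(np.corrcoef(a, b)[0, 1])
        else:
            correl.append(0.0)
    return np.array(corresp), np.array(correl)
```

The histograms can match even when the movements are not time-locked (as at 1:35), whereas the correlation also requires the accelerations to rise and fall together (as at 4:20), which is why the second measure is the stronger indication.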


Case 8: EEG measures of quality of attention


Case 9: The four-person mirror game

The four-person mirror game protocol builds on the two-person mirror game research by our group members and collaborators. In this demo (http://bit.ly/fourmirror) the game is used before and after short JI training to assess how JI impacts the quality and dynamics of group interaction. Modeling and further testing of this protocol is currently underway. The protocol will be used to quantify individual-level and group-level changes in joint attention (estimated by the extent of temporal synchronization), joint decision making (by identifying the dynamics and strategies leading to changes in movement and coordination patterns), and affective dynamics (by providing a measure of social bonding and group affiliation after training).
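One simple way to estimate the extent of temporal synchronization in a group is the mean pairwise correlation of the players' movement traces; the sketch below assumes velocity traces as input, and the project's actual estimator may differ.

```python
import numpy as np

def group_synchronization(velocities):
    """Group-level synchronization index for a mirror-game round.

    velocities: array (frames, n_players) of hand-velocity traces.
    Returns the mean Pearson correlation over all player pairs:
    1.0 for perfectly synchronized movement, near 0 for unrelated
    movement.
    """
    n = velocities.shape[1]
    corr = np.corrcoef(velocities.T)  # (n, n) correlation matrix
    iu = np.triu_indices(n, k=1)      # upper-triangle player pairs
    return corr[iu].mean()
```

Comparing this index before and after the JI training would give one number per round to quantify the hypothesized change in joint attention.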

Case 10: Two-group rhythm synchronization battle

The novel two-group rhythm synchronization task developed in our project and described here: http://bit.ly/groupsync will also be used to estimate changes in joint attention (how well or how fast the group syncs together), joint decision making (what strategies are used to enhance within-group synchronisation under the stress of the other group?), and affective dynamics (by providing a measure of group cohesion and resilience to the influence of the other group).
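"How fast the group syncs together" could be operationalized, for example, as the first beat at which the spread of tap times across players falls below a tolerance; both the spread measure and the threshold in this sketch are illustrative, not the task's defined metric.

```python
import numpy as np

def sync_speed(onsets, tolerance=0.05):
    """First beat at which a group is 'in sync'.

    onsets: array (beats, n_players) of tap times in seconds.
    Returns the index of the first beat whose across-player spread
    (max minus min tap time) is below `tolerance`, or -1 if the
    group never converges.
    """
    spread = onsets.max(axis=1) - onsets.min(axis=1)
    below = np.nonzero(spread < tolerance)[0]
    return int(below[0]) if below.size else -1

# Toy example: three players whose tap-time offsets shrink linearly
# to zero over ten beats of a 0.5 s pulse.
base = np.arange(10) * 0.5
onsets = base[:, None] + np.linspace(0.2, 0.0, 10)[:, None] * np.array([0.0, 0.5, 1.0])
print(sync_speed(onsets))  # 7: the spread first drops below 50 ms at beat 7
```

A lower index for one group than the other would then indicate faster convergence under the pressure of the competing rhythm.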