Degree Project in Media Technology, Second Cycle, 30 credits
Stockholm, Sweden 2020

SENSITIV

Designing for Interactive Dance and the Experience of Control

THELMA SVENNS

KTH

Thelma Svenns

KTH Royal Institute of Technology
Media Technology and Interaction Design
Stockholm, Sweden
thelmas@kth.se

In the last decade, many studies and performances within the field of interactive dance have been made. Interactive dance means involving technology in the dance, which opens opportunities to execute a dance in a different way than usual. Past studies have often involved manipulation of music, but few seem to involve manipulating real-time music produced by a live musician. Hence, this study, carried out in a project called SENSITIV, involved a musician and a dancer and investigated the co-play between the two artists through sensory technology. More specifically, the investigation focused on the input design, i.e. the placement of the motion sensors and the processing of their signals, for an interactive system, and on how the involvement of the sensory technology affects the dance. Inertia-based motion sensors were worn by the dancer, through which the real-time sound produced by the musician was manipulated by the movements of the dancer. This in turn created an interaction within the intermediate connection, where the dancer came to act as a co-musician. Two studies were conducted: in the first study, a prototype was developed and designed from a first-person perspective, and in the second study, the developed prototype was tested on a larger group of dancers. The results showed that placement on the outer body parts, such as the wrists and ankles, was the most suitable. It was further found that reaching a positive experience, in terms of feeling in control when dancing with sensors involved, requires some time, as an understanding and knowledge of the system is needed.

Keywords: Interactive dance; sound and movements; real-time; co-play; input design.

1 https://sunhou.se/

Dance and music have been a part of human history for centuries. In general, music - playback or real-time music - is played and the dance is applied to the music, in the sense that the dance follows the music. W. Siegel [29] states that dance and music are in many cultures inseparable, and asks whether, through today’s technology, one could instead “dance the music”. During the last decade, technology has developed rapidly, and new ways of dancing interactively with other media with the help of technology have been explored. Some studies or performances have investigated dance controlling visuals [25] or controlling drones [9], and some controlling music [12, 14]. However, few previous studies or projects seem to involve interacting with a live performing musician, where the dancer acts as a co-musician manipulating the sound produced by the musician. This kind of interaction could create a greater co-play between two artists from different areas, where both contribute to the intermediate connection - the music. Co-play here means that the musician and the dancer meet, in the sense of influencing and affecting each other, which makes them create something together. This thesis is therefore based on the real-time interaction between a dancer and a musician, who both control the music.

In the present study, a project called SENSITIV, which is the Swedish word for sensitive, was developed. A drummer produced sound with drums that were connected to percussive sensors and the drum software program Sensory Percussion1, which makes the drums work as a synthesizer. A dancer simultaneously wore inertia-based motion sensors called New Generation Inertial Measurement Unit (NGIMU) sensors2, which in turn were connected to the drum software through Max/MSP, and manipulated the sound through movements. In this way, the dancer became an added musician, as the dancer processed the sound produced by the drummer. This thesis therefore investigates the input design, i.e. where the sensors should be placed and how the provided signals should be processed, to make the user feel in control, and how the involvement of the technology affects the dancer. The specific research question is therefore:

Where should sensors that influence real-time music be placed on dancers, and how should their signals be processed, to make the experience of dancing modern dance positive in terms of giving a feeling of control?

With a further research question:

How does technical involvement in modern dance affect the dance, and does it cause limitations in the dance movements?

The investigation will be conducted through two studies: a development study with a first-person design perspective, where a dancer, through iterative testings, gets to shape the final design of the prototype developed for this study, and an evaluation study, where seven dancers get to test the developed prototype. In the development study, different placements of the sensors and the movements to use are tested together with the sound parameters they produce, whilst in the evaluation study, the overall experience of using the sensors and involving the technology will be investigated with a larger group of dancers.

This thesis will first go through previous studies and related work within the areas of interactive dance and New Interfaces for Musical Expression (NIME). In Section 3, the method is described, followed by the results in Section 4. In Section 5, a discussion of the results is presented, and the thesis is completed with a conclusion in Section 6.

Music and dance have been a part of human history for a long time [4, 16, 32]. Music, or rhythm, affects parts of the brain that control motor areas, which makes us move to music [1]. Other studies [20, 26] have also shown that music affects audio-visual mirror neurons, which makes us connect sound to actions, or vice versa. That can in turn make us connect the production of music or sound to specific movements, and therefore makes us move or dance to music. In his text about interactive dance [29], W. Siegel states that music and dance have a strong relationship in many cultures and are often inseparable. For example, in western culture, salsa, samba, jive and tango are terms in both dance and music. Dancing to music implies taking musical parameters into account, such as tempo, musical form, rhythm and the structure of the melody, and it is common to dance to played-back music, which makes the dancers follow, or dance to, the music. Based on these statements, Siegel asks whether dancers could instead “dance the music”. As new technology enables dance to take new forms and become interactive, in terms of letting the dancers control e.g. sound [21, 29], Siegel’s suggestion could be implemented.

Based on the Encarta World English Dictionary 2007, W. Siegel [29] defines interaction as “the combined or reciprocal action of two or more things that have an effect on each other and work together”. Siegel states that one purpose of interactive dance is that it creates a precise synchronization between the movements and the music, but that the strongest reason is that it gives the dancer a feeling of freedom, as the strongest motivation for interactive dance is “giving the dancer a feeling of being free in time or even free from time”. Mullis [21] defines interactive dance performance as “performances in which a dancer’s movement, gesture, and action are read by sensory devices, translated into digital information, processed by a computer program, and rendered into output that shapes the performance environment in real time”. Mullis continues by saying that dancing on interactive platforms creates an awareness of the manner in which the technology affects the performance, “with some suggesting that the platform can be experienced as a partner that responds to and influences their movement as the performance unfolds”. The statements of Siegel and Mullis are good descriptions of what the following text will be about.


An early example of interactive dance is Merce Cunningham’s performance “Variations V” [27]. Cunningham is seen as one of the greatest choreographers and had a big impact on the dance field [3]. Together with composer John Cage, he created a performance based on theremin technology, where antennas detected the dancers’ movements, which in turn affected the sound [18, 27]. Three studies have given a basis for the project presented in this thesis. The first one is Eriksson et al.’s study “Dancing with drones” [9], where a dancer controls drones - which have loudspeakers that make them “sing” - with sensors in an optical motion capture system. What was interesting in this study was the design study, which was conducted from a first-person perspective. This led the dancer, who was iteratively testing the setup, to feel a sort of intercorporeality with the drones. Intercorporeality means fully understanding another’s mind through a perceptual process. However, as drones have a different corporeality compared to humans and do not have minds, a kinesthetic engagement and awareness had to be achieved for the dancer to reach the feeling of intercorporeality with the drones. The intercorporeality made the movements, or the choreography, become dependent on the drones. The study also explores the concept of somaesthetics, as the participating dancer, in addition to dancing with the drones, also acted as if she were a drone, which gave the dancer a richer expressivity. Designing from a first-person perspective creates an opportunity to fully live through and experience one’s own ideas, as design and experience development form a closed loop. This is necessary when designing for engaging different bodywork practices [13].

The second study is “VIBRA” [5], which was explored during two workshops with dancers, who got to test two different types of sensors: Myo armbands3 and NGIMU sensors. The difference between these sensors is that the NGIMU measures orientation with an Inertial Measurement Unit (IMU) and sends Open Sound Control4 (OSC) messages over Wi-Fi, whilst the Myo armband has both an IMU and electromyography (EMG), which measures muscle tension, and sends data over Bluetooth that is later converted into OSC messages. These sensors controlled audio and visuals, and the aim of the project was to explore interactive dance as an artistic medium. The authors could see that the movements were affected and changed depending on the type of sensor, the placement of the sensors, and the instruments used. One user with a background as a musician made movements as if playing an instrument.

3 https://support.getmyo.com/hc/en-us

The third study is a study by L. Jap [14]. In her study, 12 participants could control the tempo of electronic dance music with NGIMU sensors, which were mapped to and controlled in Max/MSP. The aim was to investigate what mapping methods of the sensors on the body could lead to a positive experience compared to dancing without them, and the study showed that enjoyment increased when controlling the beat, as it contributed with more engagement compared to dancing without the sensors. The final placement of the sensors is presented in Section 2.3.

In interactive dance with music, the dancer takes the role of an instrumentalist or conductor [28]; hence, the field of New Interfaces for Musical Expression (NIME) is related to the present study.

When technology develops, it opens opportunities to develop NIME [15]. The piano, for example, could during the 18th century be developed into the modern piano we are used to today thanks to advances in metallurgy, which in turn gave a wide piano repertoire. Another example is the development of electronic instruments during the 50s and 60s, which created new musical forms and genres [24]. New types of musical forms can therefore be expected in the future.

S. Fels [11] discusses intimacy in designing for NIME. Fels notes, though, that there is no wide agreement on the meaning of expression, which A. R. Jensenius and M. J. Lyons [15] take further, arguing that New Interfaces for Musical Expression can reach more areas than what the name stands for. The M in NIME could for example stand for Multimodal, or the E could stand for Exploration. Research in NIME thus seems to touch a wide area. In [11], however, the meaning of expression, specifically musical expression, is described as occurring when a player intentionally expresses themselves through the “medium of sound”. Fels continues by saying that a well-designed instrument should give the user “enough control freedom to explore sound space and make music while being sufficiently constrained to allow the user to learn to play the instrument”. This means the interface should be simple for creating sound, but in a manner that does not limit the “expressive capacity of the player”, and still be musical. To create this kind of interface, Fels argued that there should be an intimacy between the user and the instrument, meaning that the instrument should be an extension of the user that creates a “transparent relationship between control and sound”. According to Fels, there are four types of this relationship: Object disembodied from Self, Self embodies Object, Self disembodied from Object and Object embodies Self. The relation Self embodies Object seems most appropriate for the project presented in this thesis: the object, i.e. the sound, is embodied within the user (the self) and creates an extension of the user. Intimacy is also discussed by R. Moore in terms of control intimacy [19], which is the relationship between the musically desirable sounds and the psychophysiological capabilities of the player. He considered the human voice to be the instrument with the greatest control intimacy.

The visual movements of a musician also give the audience cues about the expression [22]. When it comes to expression in dance, J. C. Schacher [28] states that dancers have a greater awareness of and sensitivity to the expression in movements, as a dancer is trained to generate expression either through an imaginary stage or in a dramaturgy. However, M. Chion [2] argued that audio content and visual content cannot be separated into two independent elements in multimedia, as a relationship called “synchresis” arises between them. As in dance, these two aspects will influence each other in terms of perceptual entanglement [29].

However, the aim of reaching an interface for expression may not always be the priority. Siegel [29] describes a project called “Very Nervous System” by David Rokeby, where the user controls the sound with movements. Rokeby was, however, not interested in the control of the sound in terms of instrumental performance or dance; instead, he was interested in the social interaction between the interactor and the system.

Discussing design for NIME opens up questions regarding the placement of sensors in interactive dance. As shown in L. Jap’s initial study [14], the sensors were preferred to be placed both on the wrist, since that placement made it easier to control the sound, and on the ankle joint, as the sensor was less noticeable there. However, in the main study the sensors were placed on the wrists, since the data was most pronounced there. In another study, made with circus artists [8], the placement of the sensors proved to be more complex, as the artists need different body parts to be free for the circus movements. When the placement and use of the sensors were considered well suited for their movements, it let the artists be more spontaneous and expressive in their movements, and thus explore new expressive potentials.

Just as the placement of the sensors should be well suited for the intended movements, the attached movement and the related sound should also feel natural together. If the movement involves a kick, the sound may for example be a loud percussive sound [29]. As a dancer also changes position while dancing, the audience might perceive changes in the music and the related movement as coming from an external source, such as a technician, or from musical cues [31]. This means that the audience’s perception should be taken into consideration when designing the input design and the sound mapping. Siegel [29] argues that since interactive dance is not as widely experienced as e.g. a piano concert - where the audience assume, without seeing the fingers, that the pianist is really generating the sound - the interactive relationship should be clear and explained to the audience. However, he continues, from another artistic perspective the explanation of the interactive relationship may not be needed. It is therefore mostly a matter of taste and perspective.

3. METHOD

A prototype was developed for this study together with a team of four people: the author of this thesis; Lisa Andersson López, who was the main collaborator in this study, i.e. collaborated in developing the technical parts and investigated the mapping design [17]; Jakob Klang, who is a drummer at dance classes where he uses the drumming tool Sensory Percussion (SP) used in this study, and who developed the artistic parts and accompanied the testings; and Isabell Hertzberg, who is a professional dancer working as a choreographer and dance teacher, with previous experience of working with Jakob and SP, and who took part in iterative think-aloud testings when developing the prototype. They will hereafter be referred to by their first names. As the input design depends on the mapping design, e.g. which sound parameters are mapped to different movements, a close and continuous discussion with Lisa was held during the whole method process.

The method was divided into two studies: a development study, in which the prototype was designed and iteratively refined, and an evaluation study, to evaluate the outcomes and experiences of the prototype on a larger group of participants. The structure of the method can be seen in Figure 1.

Figure 1. Method process.

The prototype was designed from a first-person perspective with Isabell, as it was developed through iterative testings with her. This design perspective was chosen to create a prototype that was satisfying from a professional’s perspective. Details of the equipment and system design are presented in the next sections, but in summary the prototype consisted of wearable sensors worn on different body parts of the dancer, through which the sound was manipulated by her movements. The sound was produced by the drummer, who through the drumming software program Sensory Percussion changed the sounds of the drums. This created an interaction between the drummer and the dancer, where both control the sound and the performance, but in different ways.

3.1.1 Equipment

The equipment used was the wearable NGIMU sensors from x-io Technologies (Figure 2), the visual programming software for music and multimedia Max/MSP 8 from Cycling ’745, the drum software program Sensory Percussion from Sunhouse with its drum sensors, and an acoustic mesh drum set consisting of four drums: floor tom, rack tom, bass drum and snare drum (Figure 3). Additionally, a MOTU 8pre sound card was used. As the NGIMU sensors send OSC messages via Wi-Fi, a portable TP-Link AC750 router was used together with a MacBook Air computer. From now on, when no specific sensor is referred to, “the sensors” will always mean the NGIMU sensors, while the Sensory Percussion sensors will be called the drum sensors.

The NGIMU sensors were chosen because they are easy to wear on different body parts, such as the wrist or ankle, because they measure three-dimensional movement - rotation, acceleration and magnetic field - and because they send OSC messages, which lets them communicate with other software tools and thus be connected to Max/MSP. Max/MSP was in turn chosen because it allows interactive projects to be built and can process the OSC messages between the NGIMU sensors and Sensory Percussion.

5 https://cycling74.com/

Figure 2. An NGIMU sensor.

Figure 3. The drum set used and its drum sensors.
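To illustrate the kind of OSC communication the NGIMU sensors rely on, the sketch below (Python with the python-osc package, not part of the thesis implementation) listens for incoming sensor messages over Wi-Fi. The OSC address "/sensors" and the UDP port 9000 are assumptions and depend on how the sensor is configured.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_sensors(address, *values):
    # Each NGIMU packet carries several floats (e.g. gyroscope and
    # accelerometer axes); here we simply print them to verify reception.
    print(address, values)

dispatcher = Dispatcher()
dispatcher.map("/sensors", handle_sensors)                # assumed OSC address
dispatcher.set_default_handler(lambda addr, *args: None)  # ignore other messages

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)  # assumed port
print("Listening for NGIMU OSC messages on UDP port 9000 ...")
server.serve_forever()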

3.1.2 System design


Figure 4. The system design of the prototype.

For each sensor, the value of each of the three axes was converted to its magnitude with the square root, and the three magnitudes were then summed up to get a total value. The NGIMU provides both three-dimensional rotation measurements from a gyroscope and three-dimensional acceleration data, and the sum of the absolute axis values of each of these signals was considered in this study.
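As a concrete illustration of this processing step, the following minimal Python sketch (an approximation for illustration, not the actual Max/MSP patch) reduces a three-axis reading to the single total value described above; the example numbers are made up.

import math

def total_magnitude(axes):
    """Sum of per-axis magnitudes of a three-axis reading, e.g. gyroscope data in deg/s."""
    x, y, z = axes
    # The square root of each squared component is its absolute value,
    # so the total is the sum of the absolute axis values.
    return math.sqrt(x * x) + math.sqrt(y * y) + math.sqrt(z * z)

# A hypothetical gyroscope reading of (-120.0, 45.0, -10.0) deg/s gives 175.0,
# which can then be scaled to a sound-parameter range in Max/MSP.
print(total_magnitude((-120.0, 45.0, -10.0)))  # 175.0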

In Max/MSP, a Sensory Percussion VST (Virtual Studio Technology) plugin was used to take the sound from the playing drums and connect it to the movements of the NGIMUs (see Appendix C). That made it possible to manipulate the sound from the drums with movements. The whole system design is depicted in Figure 4.

A development study was conducted once a week during the first five weeks, with the whole team participating, in which iterative testings of and discussions on the prototype developed so far were held with Isabell. The aim was to explore the usage of the sensors and how to process their output data to create a satisfying prototype from Isabell’s perspective. Every meeting lasted for 1-2 hours.

3.2.1 Procedures

In the first two meetings, discussions on which sound effects and movements to use were held. The motion signals from the sensors that were tested were rotation and acceleration. After that, she tested the most recently developed prototype and provided feedback in think-aloud sessions about the input design and the mapping of the sound parameters. In the earlier meetings she used only one sensor and danced mainly the same dance, using her whole body, in every test session, in order to more easily explore the differences from week to week. In the beginning, she placed the sensor mainly on her right wrist, as she is right-handed, but later tried placing the sensor on the left wrist to see the difference. She also tried placing the sensor on one of her ankles instead of the wrist. In the later meetings, she explored dancing with two sensors placed both diagonally, i.e. one on the wrist and the other on the ankle on the opposite side, and in a row, i.e. sensors on the wrist and ankle on the same side. She also tried dancing with four sensors placed on the wrists and ankles.

Figure 5. Isabell dancing with NGIMU sensors.

Based on her feedback each week, the prototype was developed further together with Lisa and Jakob into the final prototype used in the evaluation study. Figure 5 shows a test session with Isabell dancing with the sensors.

When the final prototype was developed, an evaluation study was conducted with seven participants together with the whole team except Isabell. In this study, a dance session with the prototype, divided into three five-minute sections, was held first, after which the participants evaluated the prototype and the experience in a questionnaire and an interview held right after the dance session. All participants were female professional dancers aged 20-32, with experience in, among other genres, modern dance and jazz. None of them had previous experience of dancing with Jakob or with the drum tool Sensory Percussion.

3.3.1 Procedures


The evaluation study was held in a dance hall at SKH, Stockholm University of the Arts. Every test was executed individually.

3.3.2.1 Dance session

The dance session was divided into three sections of five minutes each and was accompanied by Jakob on the drums. This method setup was discussed and agreed on within the team, as Jakob compared the dance session to learning a new instrument and Isabell compared it to how she would have worked in a dance class. Every dance session was video recorded, with the purpose of showing the recording to the participant in the interview as a help for analyzing their dance with the sensors; the participants had to give their consent before recording. Jakob played similarly in all dance sessions and the participants were asked to improvise but to keep a similar theme in every dance session.

The setup of the three dance sessions was: 1. a session dancing with the drums, wearing the sensors but having them switched off; 2. a session dancing with the sensors switched on, but without any information about how they worked or what they did; 3. a session dancing with the sensors switched on, knowing what the sensors did, how they were mapped to the drums and where the related sound parameters were positioned. Figure 6 shows a participant dancing in the third session.

In the first session, the aim was to get to know what it was like to dance with Sensory Percussion and to get used to the weight and physical placement of the sensors, as well as to be able to compare dancing with and without sensor control. The second session was a warm-up session, with the aim of letting the participants explore and get to know the sensors without any information about them. The purpose of not giving any information was to let them explore as much as possible on their own and to make the exploration and improvisation as authentic as possible. In the last session, the aim was to make them explore the dance again, but with the knowledge of how the sensors worked. Only the first and last sessions were later discussed and compared in the questionnaire and interviews.

Figure 6. Participant during the evaluation study.

3.3.2.2 Questionnaire

The questionnaire consisted of 16 qualitative and quantitative questions (see Appendix A) that covered the input design as well as the mapping design aimed for Lisa’s study [17]. As answers can be affected when given verbally, the aim of the questionnaire was to avoid this and to collect data more objectively.

3.3.2.3 Interview

Directly after the questionnaire was filled in, the last step was an interview with the participant. The participant was asked to think aloud while watching a part of the recorded video from the third dance session and was then asked follow-up questions (see Appendix B). These follow-up questions were shaped specifically for the interviews, with the aim of creating a discussion with the project team about the experience and of collecting data in a more subjective way. As Jakob also participated, discussions about the co-play could be held as well, where Jakob could add his thoughts. Afterwards, the participants were asked to give general feedback on the prototype.

The results from the development study and evaluation study will be presented here.


The acceleration signal turned out to be too sensitive to movements and was hard to stabilise. The gyroscope worked better, and as it in the end gave similar feedback in the dance as the accelerometer, it was decided to use only the gyroscope.

4.1.1 Placement of sensors

As placements on the wrists and ankles were tested one at a time, Isabell could explore how each limb responded to specific sound parameters. The sensors are mainly designed to be worn on these body parts, which made it natural to test them there, but Isabell also wanted to use these body parts because she wanted possibilities for making “led movements”, i.e. movements that she felt she carried from one side to another. She thought that since the arms are often in movement, the wrist placement offered many possibilities for this. Additionally, she thought that since the feet can in contrast be more still, and are more anchored to the pulse, the placement of sensors on the ankles gave a clearer contrast to the movements of the arms. She continued by saying that as the legs give a “force” to the movements, she also felt that they gave a larger effect and clearer feedback from the technology. Using the torso was discussed but not tested, partly because the design of the sensors made it hard to place them elsewhere than on the wrists and ankles, and partly because Isabell became more involved and explorative in using her outer body parts, i.e. arms and legs, for creating led movements. She never expressed a wish to test other placements. Placement of the sensors on the wrists and ankles was therefore chosen.

4.1.2 Number of sensors

When using only one sensor, Isabell became more aware of the specific body part it was placed on and thereby expressed most of the movements with that body part. She did, however, often place the sensor on the right wrist, and as she is right-handed, she normally uses her right arm slightly more. She therefore thought that the enlargement of the movements could be due to that, in combination with the greater awareness of that body part. When she was asked to place the sensor on the left wrist instead, she was still conscious of the sensor and the movements on the left side, but felt a greater compensation in the movements, as she started to use her left arm more than usual.

When she tested dancing with two sensors, placed on the left wrist and left ankle respectively, it made her conscious of yet another body part and made her feel like her whole body was joining in. That made it more evident to her that the whole body started to act as an instrument. She wanted, however, the placement to be diagonal, i.e. one sensor on the wrist and the other on the ankle on the opposite side, as it made her movements more even.

When Isabell later tested four sensors, on the ankles and wrists, with the sound parameters mapped diagonally, she expressed positively that this was what she wanted it to feel like, as it “became more natural to dance inside from the body than to force a movement as I’m having a sensor on every body part”. She also said that she found a more natural approach to the sensors, which made her improvisation more authentic, and she felt that she “owned” the sound. The feeling of control became more detailed with four sensors, but she was not as aware of what movement did what as when using only one sensor, except for her right foot. Every sensor was mapped to one of the four drums in the drum set, as both Isabell and Jakob thought it made the connection between them better in a musical way, and the right foot was anchored to the bass drum, which produced a long-lasting tone. In addition, she felt that having more than one sensor was better from a scenic or performance perspective, which this project is aimed at, as using only one sensor would have suited a dance class better.

However, Isabell did not feel much difference in control between having two or four sensors. As each sensor was mapped to one drum, she did not know which body part affected which sound, but the knowledge that the sensors did affect something, and that there was an “expectation”, in turn affected her movements. Thereby, having four sensors made the dance experience more positive than having only two, as it affected the movements of her whole body.


Based on these results, the placement of the sensors in the final prototype6 came to be with the sound parameters positioned diagonally, i.e. delay on the right wrist and left ankle, and pitch on the left wrist and right ankle (see Figure 7).

Figure 7. The diagonal mapping of the sound parameters.
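To make the final mapping concrete, the sketch below expresses it as a simple lookup table in Python; this is an illustration rather than the actual implementation. The delay/pitch assignment follows the description above, but the thesis only states explicitly that the right foot was anchored to the bass drum, so the other sensor-to-drum assignments are placeholders.

FINAL_MAPPING = {
    "right_wrist": {"parameter": "delay", "drum": "snare"},      # drum assumed
    "left_ankle":  {"parameter": "delay", "drum": "floor_tom"},  # drum assumed
    "left_wrist":  {"parameter": "pitch", "drum": "rack_tom"},   # drum assumed
    "right_ankle": {"parameter": "pitch", "drum": "bass_drum"},  # stated in the text
}

def route(sensor_id, rotation_total):
    """Return which sound parameter on which drum a sensor reading should affect."""
    target = FINAL_MAPPING[sensor_id]
    return target["drum"], target["parameter"], rotation_total

print(route("right_wrist", 175.0))  # ('snare', 'delay', 175.0)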

The final prototype was tested by seven participants, who got to explore and give feedback on it. While the development study focused more on exploring the input design, such as the placement and number of sensors and the motion signals to use from the sensors, in order to create a satisfying prototype for Isabell, the results presented in this section will mainly focus on a larger group of dancers’ experiences of using the prototype, the feeling of being in control with sensory technology involved, and the technology’s influence on the dance.

4.2.3 Input design

A majority (five out of seven) were aware of the sensors in the third dance session (Figure 8). The placement of the sensors and their attached sounds did, however, feel natural for the majority (Figure 9). Regarding the general placement, one participant (P6) stated in the interview that movements from the arms and legs come easily and that the sensors were therefore well placed, but it became clear both in the questionnaire and in the interviews that the majority would have wanted to explore sensors positioned around the torso, such as on the chest, hips and back, as “that would have made the whole body become important in the music making” (P2). One participant (P4) expressed in the interview that she consciously had to remind herself that she had a torso to use as well and that impulses can come from other places than the arms and legs, and another (P5) thought the sensors could have been placed higher up on the arm, as they constrained her from “throwing” herself onto the floor. Another suggested placement was the head. One participant (P1) stated in the discussion that the placement of the sensors depends on what genre is danced. Since the torso is used a lot in jazz, she thought that sensors placed on the torso would be well suited for the jazz genre, which was also a genre all of the participants had experience in and some still practiced. Apart from the aspect that the placement of sensors depends on the genre, the individuality of dancing showed to matter as well. One participant (P1), who knew Isabell well and had been dancing with her for a long time, said that Isabell uses her outer body parts more, while she herself uses her torso more, and therefore suggested placements around the torso.

6 Video of Isabell dancing with the sensors: https://vimeo.com/414692921

Figure 8. Degree of awareness of the sensors.

Figure 9. Results of how the placement anchored to the sound parameters felt. Scale: 1: not at all natural, 3: very natural.

4.2.1 Control


One participant who did not feel in control over the sound stated in the questionnaire that it was hard to understand in what way she affected the sound due to the short amount of time. The mapping of one sensor to one drum in the drum set might also have contributed to the lack of a feeling of control, as one participant (P2) commented in the questionnaire: “to understand that I had control, I had to move one body part at a time. Often it felt that it was played on another drum than the one I focused on, so then I had the experience of not having control over the sound.”

Figure 10. Results of the feeling of having control of the sound.

When asked what movements made them feel in control over the sound, four out of seven expressed in the questionnaire that big or fast movements gave a feeling of control, where one of the participants (P2) commented that “I felt that I had control over the intensity in the sound if I changed the intensity in my body”. She also felt that movements with her right leg gave her a feeling of control. Another participant (P6) sometimes noticed that it was her arm or leg that did something, but could in general not pinpoint what exactly did what, and continued: “I was more in the music, I let my body follow the music and in the middle of it, I started to feel that sometimes I was in control, and sometimes the music was in control. It became like a dance between me and the music, instead of that I am just following what I hear”. In the interviews, all participants stated, though, that their dance became more explorative, trying to see what movements did what.

However, when the recorded video was watched in the interview, one participant (P5) said that it became clearer which of the dancer and the musician did what, and another participant (P2) expressed, already five seconds into the video, that “Wow… it looks like I have more control in the video than I felt”.

One aspect that came up in the interviews was that the reason for not feeling in control was the lack of experience of dancing with sensors. One participant (P1) said that she is normally used to listening and then reacting, but regarding exploring dance in a new way she later said that “it is something beautiful to be very focused on something but more scary though since I want to have control”. In contrast, another participant (P5) posed the question “if one even need to feel control”, but continued that it can nevertheless be frustrating, as one has to drop all ideas of how it should sound. As the usage of sensors takes away some control from the musician, Jakob added that neither is he used to playing in this type of setup. The aspect of having more time came up in this discussion as well, as one participant (P3) thought that the feeling of control would probably come after a while, when one gets more used to the sensors and their effects.

4.2.2 Technical involvement and its influence


Figure 11. Results of the feeling of being as one with the music. The scale in the lower picture is 1: negative, 5: positive.

Some participants compared dancing with sensors to learning to play a new instrument, to which one participant (P2) stated in the interview: “[…] and I did not know anything about this instrument but I thought I would be able to manage playing it at once”. The same participant also thought that the sensors make the dancer take over a part of the musician’s role and that they together create some type of common “blob” that they both need to find to reach the intermediate intersection and co-play. Another participant (P1) stated in the interview that she felt that her instrument, the dance, was played worse, as she focused more on what happened in the dance and the music while wearing the sensors.

The involvement of the sensors made a marginal majority (four out of seven) feel limited at times (Figure 12). Some participants stated both in the questionnaire and in the interview that they were a bit worried about dancing in low positions on the floor, as they did not know how robust the sensors were, which in turn affected their movements slightly. As already mentioned, one participant (P5) felt that the placements on the wrists constrained her from throwing herself onto the floor the way she usually does. As this question was interpreted differently, the aspect of feeling control should be considered as well. One participant (P2), who felt limited sometimes, stated in the interview that she usually wants to sense control over what she does, which she did not feel during the test, and that in turn made her frustrated. Another participant (P3), who also felt limited sometimes, stated in the interview that she wanted to have the same control as without the sensors.

Figure 12. Results of if the participants felt limited in any way.

The questionnaire showed that the experience of the dance became better in the third dance session for the majority (five out of seven, see Figure 13), as the sensory technology contributed to a more explorative use of their body, but for two it became worse, as they felt more in the here and now without the sensors. One participant (P1) motivated this in the questionnaire as “I experienced myself being more in the present and in contact with the musician in the first session. The last session became more tentative. But it was interesting”. However, every participant stated in the interviews that they could imagine dancing with sensors in the right situation, such as a performance, and with sufficient knowledge about the system.

Figure 13. Results of the experience between with or without sensors.

4.2.4 Summary


Watching the video recordings gave the participants more of an audience perspective, as it became clearer what the sensors did and who affected what. A change in the dancing was experienced with the involvement of the technology, as it made the participants more aware of the body and its movements. It made the participants begin to explore dance in another way, and they felt it became more like learning and playing an instrument. For many, this created a change in their artistic expression.

Through the development study and the evaluation study, it was possible to investigate how the input design should be designed to create a positive experience in terms of achieving control, and how involving sensory technology influences the dance. The input design was mainly studied in the development study, whereas the involvement of technology had its emphasis in the evaluation study.

From the development study, it became clear that Isabell enjoyed dancing with sensors placed on the outer parts of her body and with four sensors instead of only one. However, as she had been exploring the sensors and the system over a five-week period, successively adding sensors to her body, she gained a deeper understanding and knowledge of the usage of the sensory technology. She also had previous experience of dancing with Jakob and the drum tool Sensory Percussion, which could have helped her, as it allowed her to focus more on exploring the sensors rather than additionally having to find an approach to Jakob and Sensory Percussion.

Compressing this five-week period into a 15-minute dance session may explain why all of the participants in the evaluation study wanted more time for exploration. The limited amount of time was the strongest reason for not feeling in control, as the participants did not feel they got to know the system sufficiently, which caused frustration for some. Some thought that having more time would let them practice with the sensors and thus improve the experience of having control. Hence, the exploration time should have been longer. However, as the prototype developed in this study was designed from Isabell’s perspective, and thus in a first-person perspective, this design choice may have affected the feeling of limited control as well. As mentioned in Section 2.1, this is an appropriate design method when designing for bodywork practices [13]. A majority of participants suggested placing the sensors around the torso, since some are used to having their movements originate from the torso, depending on dance genre and individual style.


One aspect that probably contributed to the lack of a feeling of control was the mapping of the sensors to the drums. As each sensor was mapped to only one drum, Jakob had to play the specific drum that the dancer intended to manipulate for an effect to be achieved. As this was shown to cause some frustration and a weaker feeling of control, another mapping method to the drums could have been chosen, even though the present mapping created a richer musical connection between Isabell and Jakob. One aspect that should be taken into account regarding the co-play, however, is that a mapping of one sensor to one drum forces the musician to be fully concentrated on the dancer, and vice versa, which could create a deeper co-play in terms of awareness and receptiveness. Apart from the sensor-to-drum mapping, other drum sounds could also have been chosen. As shown, both Isabell and one participant noticed their right leg and felt some control with it. The mapped sound was a long tone, and it could therefore be investigated whether long tones would work better in this setup. That is, however, a study in itself.
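To clarify the design choice being discussed, the following hedged Python sketch contrasts the one-sensor-to-one-drum routing used in the prototype with a hypothetical alternative where each sensor affects all drums; drum names and function names are illustrative only.

DRUMS = ["snare", "rack_tom", "floor_tom", "bass_drum"]

# One-to-one routing, as in the prototype: a sensor only has an audible effect
# when the drummer plays the one drum it is mapped to.
ONE_TO_ONE = {
    "right_wrist": ["snare"],       # assumed
    "left_wrist":  ["rack_tom"],    # assumed
    "left_ankle":  ["floor_tom"],   # assumed
    "right_ankle": ["bass_drum"],   # per the text
}

# A hypothetical alternative: every sensor affects every drum, so the dancer
# always hears an effect, at the cost of a less focused co-play.
ONE_TO_ALL = {sensor: list(DRUMS) for sensor in ONE_TO_ONE}

def affected_drums(mapping, sensor_id):
    """Return the drums whose sound a given sensor would manipulate."""
    return mapping[sensor_id]

print(affected_drums(ONE_TO_ONE, "right_wrist"))  # ['snare']
print(affected_drums(ONE_TO_ALL, "right_wrist"))  # all four drums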

The involvement of sensory technology clearly made the dancers more explorative with their bodies, as it raised their awareness, and despite the feeling of low control and a weak understanding of the system, all of the participants were positive about using sensory technology. All of them were curious to explore more, not least the effects of the co-play with a real-time musician. As not many studies within interactive dance seem to involve a real-time musician, there are many reasons to investigate and develop this study further. The number of participating dancers was, however, too small to draw any solid conclusions about the technical involvement. The results from the evaluation study could therefore be investigated further with a larger number of participants, and the developed prototype can be considered a contribution to the field of interactive dance and NIME. Another aspect is that only female dancers participated. The prototype was admittedly developed from a female dancer’s perspective, but it would nonetheless have been interesting if male dancers had participated as well.

In this thesis, an evaluation of technical involvement in modern dance has been conducted, particularly investigating the input design, i.e. how NGIMU sensors should be placed on the dancer and how their signals should be processed, and the involvement of sensory technology in dance. The evaluation has been conducted in two steps: firstly in a development study, done together with the dancer Isabell, who got to shape the prototype from a first-person perspective; secondly in an evaluation study, where seven dancers got to test the developed prototype and evaluate the experience of dancing with sensors, considering feelings of limitation and the difference between dancing with and without sensors. For Isabell, using four sensors, placed on the outer body parts such as the wrists and ankles, made the experience of the dance most positive, as she wanted to make “led movements”. The motion signals from the NGIMU sensors that were considered well suited in a dance context were rotation and acceleration. However, only rotation was used, as acceleration did not work as the team wanted and rotation gave similar effects in the dance. From the evaluation study, it became clear that a loss of control caused a feeling of limitation in the dance. This was mainly due to the limited amount of time for exploring the sensors and their setup. It also became clear that reaching a greater feeling of control requires a deeper understanding of the technology. The usage of sensors in dance together with a real-time musician did, however, seem to be of interest for all participants, as it created new possibilities during the tests to explore dance and interact with another artistic performer.

First of all, I want to thank my supervisor André Holzapfel for his great guidance throughout the study.

I would also like to thank all the participants who wanted to offer some time contributing to this study and for creating such interesting discussions in the interviews. Your participation was highly appreciated.

REFERENCES

[1] Chen, J. L., Penhune, V. B., & Zatorre, R. J. (2008). Listening to musical rhythms recruits motor regions of the brain. Cerebral cortex, 18(12), 2844-2854.

[2] Chion, M. (1990). Audio-Vision. New York: Columbia University Press.

[3] Copeland, R. (2004). Merce Cunningham: the modernizing of modern dance. Routledge.

[4] Barras, C. (2014). Did early humans, or even animals, invent music? BBC Earth. Retrieved from: http://www.bbc.com/earth/story/20140907-does-music-pre-date-modern-man [Retrieved 2020-02-29]

[5] Bergsland, A., Saue, S., & Stokke, P. (2019). VIBRA - Technical and Artistic Issues in an Interactive Dance Project. In Proceedings of the 16th Sound and Music Computing Conference, 39-46.

[6] Bisig, D., Palacio, P., & Romero, M. (2016). Piano & Dancer. In Generative Art Conference GA2016.

[7] Bresin, R., Elblaus, L., Frid, E., Favero, F., Annersten, L., Berner, D., & Morreale, F. (2016). Sound forest/ljudskogen: A large-scale string-based interactive musical instrument. In Sound and Music Computing 2016 (pp. 79-84). SMC Sound & Music Computing Network.

[8] Elblaus, L., Goina, M., Robitaille, M. A., & Bresin, R. (2014). Modes of sonic interaction in circus: Three proofs of concept. In ICMC.

[9] Eriksson, S., Unander-Scharin, Å., Trichon, V., Unander-Scharin, C., Kjellström, H., & Höök, K. (2019, May). Dancing with drones: Crafting novel artistic expressions through intercorporeality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-12).

[10] Feldmeier, M., & Paradiso, J. A. (2007). An interactive music environment for large groups with giveaway wireless motion sensors. Computer Music Journal, 31(1), 50- 67.

[11] Fels, S. (2004). Designing for intimacy: Creating new interfaces for musical expression. Proceedings of the IEEE, 92(4), 672-685.

[12] Hsu, A., & Kemper, S. (2015, August). Kinesonic approaches to mapping movement and music with the remote electroacoustic kinesthetic sensing (RAKS) system. In Proceedings of the 2nd International Workshop on Movement and Computing (pp. 45-47).

[13] Höök, K., Caramiaux, B., Erkut, C., Forlizzi, J., Hajinejad, N., Haller, M., ... & Loke, L. (2018, March). Embracing first-person perspectives in soma-based design. In Informatics (Vol. 5, No. 1, p. 8). Multidisciplinary Digital Publishing Institute.

[14] Jap, L., & Holzapfel, A. (2019). Real-time Mapping of Periodic Dance Movements to Control Tempo in Electronic Dance Music. In Sound & Music Computing Conference.

[15] Jensenius, A. R., & Lyons, M. J. (Eds.). (2017). A NIME Reader: Fifteen Years of New Interfaces for Musical Expression (Vol. 3). Springer.

[16] Laland, K., Wilkins, C., & Clayton, N. (2016). The evolution of dance. Current Biology, 26(1), R5-R9.

[17] López, A. L. (2020). SENSITIV: Mapping Design of Movement Data to Sound Parameters when Creating a Sonic Interaction Tool for Interactive Dance. Master’s thesis. Royal Institute of Technology (KTH), Stockholm, Sweden.

[18] Merce Cunningham Trust. Variations V. Retrieved from: https://www.mercecunningham.org/the-work/choreography/variations-v/ [Retrieved 2020-05-26]

[19] Moore, F. R. (1988). The dysfunctions of MIDI. Computer music journal, 12(1), 19-28.

[20] Molnar-Szakacs, I., & Overy, K. (2006). Music and mirror neurons: from motion to ’e’motion. Social Cognitive and Affective Neuroscience, 1(3), 235-241.

[21] Mullis, E. (2013). Dance, interactive Technology, and the device Paradigm. Dance Research Journal, 45(3), 111-123.

[22] Paine, G. (2009). Towards unified design guidelines for new interfaces for musical expression. Organised Sound, 14(2), 142-155.

[23] Park, C., Chou, P. H., & Sun, Y. (2006, March). A wearable wireless sensor platform for interactive dance performances. In Fourth Annual IEEE International Conference on Pervasive Computing and Communications (PERCOM'06) (6 pp.). IEEE.

[24] Poupyrev, I., Lyons, M. J., Fels, S., & Blaine, T. (2001, March). New interfaces for musical expression. In CHI'01 Extended Abstracts on Human Factors in Computing Systems (pp. 491-492).

[25] Qian, G., Guo, F., Ingalls, T., Olson, L., James, J., & Rikakis, T. (2004, June). A gesture-driven multimodal interactive dance system. In 2004 IEEE International Conference on Multimedia and Expo (ICME) (IEEE Cat. No. 04TH8763) (Vol. 3, pp. 1579-1582). IEEE.

[26] Rizzolatti, G., & Craighero, L. (2004). The mirror-neuron system. Annu. Rev. Neurosci., 27, 169-192.

[27] Rovan, J., Wechsler, R., & Weiss, F. (2001). Artistic Collaboration in an Interactive Dance and Music Performance Environment: Seine Hohle Form, a Project Report. In Proceedings of COSIGN.

[28] Schacher, J. C. (2010). Motion To Gesture To Sound: Mapping For Interactive Dance. In NIME (pp. 250-254).

[29] Siegel, W. (2009). Dancing the music: Interactive dance and music. The Oxford Handbook of Computer Music

[30] Todoroff, T. (2011). Wireless Digital/Analog Sensors for Music and Dance Performances. In NIME (pp. 515-518).

[31] Wechsler, R. (2006). Artistic considerations in the use of motion tracking with live performers: A practical guide. In Performance and Technology (pp. 60-77). Palgrave Macmillan, London.

[32] Wikipedia. History of dance. Retrieved from: https://en.wikipedia.org/wiki/History_of_dance [Retrieved 2020-02-29]

[33] Woolford, K. A., & Guedes, C. (2007, September). Particulate matters: generating particle flows from human movement. In Proceedings of the 15th ACM international conference on Multimedia (pp. 691-696) .


Appendix A

QUESTIONNAIRE (Google Forms)

The questions relevant for this thesis are marked in bold.

Gender:

Age:

Dance experience (genre, years, in what way):

1. Did you feel in control of the sound?

(Scale: Not at all – Barely – Neutral (neither nor) – Little – Very much)

a. Explain why you felt like that

2. What sound effects did you feel in control of?

(None – Delay – Pitch – Both)

3. What movements made you feel that you had control over the sound?

4. How satisfying was it to control the sound?

(Scale: Delay: Not at all satisfying – Not satisfying – Neutral – Little satisfying – Very satisfying)

(Scale: Pitch: Not at all satisfying – Not satisfying – Neutral – Little satisfying – Very satisfying)

5. What was your overall experience of the sound effects?

(Scale: Delay: Negative – Positive ; Pitch: Negative – Positive)

a. Explain why you felt like that

6. What was your overall experience of controlling the sound?

(Scale: Very negative – Negative – Neutral – Positive – Very positive)

7. What was your experience of controlling the specific sound effects?

(Scale: Delay: Very negative – Negative – Neutral – Positive – Very positive)

(Scale: Pitch: Very negative – Negative – Neutral – Positive – Very positive)

8. Did you feel as one with the music?

(Scale: Not at all - Barely - Sometimes - A lot)

a. Was that a positive or negative experience? (Negative – Positive)

9. In what way did the experience of the dance change between the first and the last session?

(Scale: Much worse – A little worse – Not at all – A little better – Much better)

a. Explain why you felt like that

10. Did the sensors affect you to dance in a different way?

11. Were you aware of the sensors in session 3?

(Scale: Not at all aware – Barely aware – Little aware – Very much)

12. Did you feel limited in any way?

(Scale: Very limited – A little limited – No difference – Not limited – Not at all limited)

13. How did the placement of the sensors feel in relation to the specific sound effect it made?

(Scale: Not at all natural/obvious - Natural/obvious - Very natural/obvious)

14. Are there any other sound effects that you think would have been more satisfying? Which?


Appendix B

QUESTIONS FOR INTERVIEW

The questions relevant for this thesis are marked in bold.

The questions are in the order they generally were asked in.

• What was your general experience of dancing with the sensors?

• What was the difference in your experience between the first and the last session?

• Was your experience of the dance positively or negatively influenced by controlling the sound?

• Which sound effect did you feel you liked the most? Are there any other sound effects that you would have wanted to try out?

• What did you think about the placement of the sensors?

• Would you prefer dancing with or without the sensors?

• Were you aware of the sensors while dancing?

• Would you have liked to have the sensors in other placements?

• How did you experience the co-play between the dance and music?

• Was the co-play any different between the sessions?

• Did you experience that you changed/modified your artistic expression when you used the sensors?

• If you could choose, how would you want a prototype like this to work?


Appendix C

The resulting Max/MSP patches.
