MASTER'S THESIS

The Future of Musical Instruments

The Extended Clarinet

Harald Andersson

2016

Master of Science in Engineering Technology, Computer Science and Engineering

Luleå University of Technology


Dept. of Computer Science, Electrical and Space Engineering

ABSTRACT

In this thesis, a clarinet is augmented with inexpensive hardware in order to introduce new, motion-based means of interacting with the instrument. Methods for audio sampling using inexpensive hardware, and for visualizing the collected data to enhance the overall audience experience, are investigated.

A prototype, based on an earlier iteration, was built, using 3D printing technology to produce a new bell for the clarinet which would hold the hardware. A microcontroller and sensors were fitted on the printed bell, and the sensor data as well as the clarinet's estimated orientation were sent over a wireless network to a computer running the software MAX 7, where it was used to influence the music and for visualizations of the motions. The audio sampling implemented was removed from the final version of the prototype, due to limitations introduced by the selected hardware which rendered it unsatisfactory, but it showed a working principle, and alternative hardware and methods to achieve satisfactory audio sampling are discussed.

It was shown that augmentation of a musical instrument, in this case a clarinet, using sensors to allow for interaction through motion, and visualization of the interactions, adds a new dimension to the process of music creation and the performance, allowing musicians to express themselves in more dynamic and creative ways.

PREFACE

First I would like to thank my supervisor, Prof. Peter Parnes at Luleå University of Technology, for his support as well as inspiration and ideas for this thesis work.

Thanks to Jörgen Normark, Luleå University of Technology, who designed the body and shell for the prototype.

Thanks to clarinetist Robert Ek, Norrbotten Neo, who constantly tested the system and provided helpful feedback and inspiration.

Finally I would like to thank Pia Löfberg for support and motivation during this thesis work, as well as providing useful feedback from the perspective of someone who is not familiar with the subject.

Harald Andersson

CONTENTS

Chapter 1 – Introduction
1.1 Introduction
1.2 Purpose
1.3 Project delimitations

Chapter 2 – Background
2.1 MAX
2.2 Open Sound Control
2.3 3D printing
2.4 Open Hardware
2.5 Philips HUE
2.6 NeoPixels
2.7 Earlier work
2.7.1 Matters for further development
2.8 Related work

Chapter 3 – Theory
3.1 IMU
3.2 Estimating orientation
3.3 Data filtering
3.4 Audio sampling

Chapter 4 – Methodology
4.1 Hardware
4.1.1 Control unit
4.1.2 IMU
4.1.3 Power supply
4.1.4 Display
4.1.5 Buttons
4.1.6 LED
4.1.7 Microphone
4.2 Software
4.2.1 Data transfer
4.2.2 Calibration
4.2.3 Audio sampling
4.2.4 Configuration
4.2.5 Filtering
4.2.6 Visualization

Chapter 5 – Evaluation
5.1 Hardware and design
5.2 Audio sampling
5.3 Interaction
5.4 Visualization and audience experience
5.5 Comparison with first prototype
5.6 Artist evaluation

Chapter 6 – Discussion
6.1 Goals and purpose
6.2 Future work
6.2.1 General
6.2.2 Controller
6.2.3 Power consumption
6.2.4 Independent filtering
6.2.5 Gesture detection
6.2.6 Audio sampling
6.2.7 Custom hardware and design
6.3 Conclusions

Appendix A – Tables
Appendix B – Figures

CHAPTER 1

Introduction

1.1 Introduction

While some instruments have been augmented using new technology to influence the sound and provide additional ways of interaction, e.g. the electric guitar and the keyboard, many instruments have stood still in their development, limiting the development of music creation. The creation of music is limited by the instruments, wherefore an evolution of the acoustic instruments is needed in order to create new types of music. Meanwhile, the evolution of electronic components has moved rapidly forward, leading to very small sensors and a trend of open hardware where many different components are available to consumers at low prices. Sensors introduce a more physical way of interacting with technology, compared to selecting options in a menu, i.e. tangible interaction. By using 3D printing to create new parts for instruments and using sensors for tangible interaction, a new way for musicians to interact with their instruments is created, allowing the musician to influence the music using motion.

With the use of sensor based interaction, new visualization opportunities are introduced, where the musician can control the visualizations in real time by interacting with the instrument, e.g. by changing lighting effects in the room or projections based on the motion of the instrument.

1.2 Purpose

The purpose of this project was split into two parts: augmentation and visualization. The augmentation part concerned how to augment a clarinet using inexpensive open hardware in order to measure movements of the instrument, digitize the music, and send all the collected data wirelessly to a computer. One of the main goals for the augmentation was to create a system which, from the user's perspective, is relatively simple and easy to use. The visualization part concerned how the data collected from the augmentation could be visualized and used to create a better experience for the audience. One of the main goals for the visualization was to create simple and clear visualizations of the represented data.

The following questions were to be answered more specifically:

• How can a clarinet be augmented to...
  – give the possibility to influence the music through motion?
  – be able to digitize the music using inexpensive hardware?
  – transfer the collected data wirelessly to a computer?

• How can the collected data be visualized?

• How can the collected data be used to create a better experience for the audience?

Additional goals for the augmentation were to aim for the creation of a compact and concealed system, and to resolve the shortcomings which were identified during an earlier iteration of the project (see Section 2.7). The creation of a concealed system allows for the hardware to be hidden from direct view in order to minimize the impact on the aesthetic appearance of the instrument, while a compact system, with all hardware contained within a single entity rather than spread out across the body of the instrument, allows for the system to be easily integrated with the instrument.

1.3 Project delimitations

One of the goals for the visualization was to create clear and simple visualizations, allowing the audience to understand what is visualized by looking at the performance and the visualizations, rather than having to read or hear about what the visualizations represent. Therefore it was decided to only visualize basic movement data in this project, e.g. the current orientation of the clarinet, and to do so using simple methods such as changing the lighting in the room and using simple 3D graphics that vary with the orientation. With only basic motion, which itself has a clear visual display, being represented in the visualization, the process of detecting the correlation between the visualization and the represented data should be facilitated, where the audience should be able to realize, for example, how the lighting depends on the motion by looking at the performance and the change in lighting.

Audio sampling with good quality requires a stable and high sample rate, to be able to perfectly reconstruct the audio signal, and a good microphone and amplifier, to minimize noise. Achieving that with hardware which is inexpensive and small enough to comply with the goal of keeping the prototype compact, as well as compatible with the other selected hardware, is out of the scope of this project. It was therefore decided to only demonstrate a working principle for the audio sampling, for which a lower sampling frequency suffices, a perfectly stable sample rate is not required, and a cheaper microphone with a built-in amplifier could be used.

Simplicity and user friendliness were important for this project. By implementing all control on the computer side in MAX (see Section 2.1), a more compact system is created which allows the user to connect all parts of the system in the same program, removing the need for external programs for communication and visualization. It was also decided to, as far as possible, use only built-in modules in MAX to build the system, in order to minimize the number of external plugins the user needs to install.

By integrating sensors in the instrument it is possible to detect different gestures and motions. However, advanced gesture detection, such as detecting sweeping or circular motions, was, due to the limited time of the project, deemed out of scope, and it was decided to only implement functionality to estimate the current orientation of the instrument, in addition to providing raw sensor data to the user.

CHAPTER 2

Background

2.1 MAX

Max 7, in this thesis referred to as MAX, is a visual programming tool for music and multimedia developed by Cycling ’74[1]. It allows for fast and relatively easy building of modules which can be used by musicians and music composers to influence the music using external parameters. It is used in this project as the main node on the computer to communicate with the instrument, control visualizations, and to influence the music using the received data from the instrument.

2.2 Open Sound Control

Open Sound Control, henceforth referred to as OSC, is a protocol for communication between networked multimedia devices, such as computers and sound synthesizers[2]. MAX uses OSC for UDP communication.

OSC messages are divided into three main parts: an address pattern, a type tag string, and arguments. Each part must have a length, in bytes, divisible by four, and must end with at least one byte equal to zero. The address is a pattern used by the application to determine what data is sent, i.e. where to route it when it is unpacked. By specification[3] the address pattern should start with the character '/', although MAX does not follow this standard, ignoring the '/' when sending MAX messages using OSC.

The type tag string is a string representation of the types of the following arguments, if any, and starts, by specification, with the character ','. The arguments are a list of arguments of the types specified in the type tag string. The standard argument types supported are 32-bit integers (type tag 'i'), 32-bit floats (type tag 'f'), OSC-strings (type tag 's'), and OSC-blobs (type tag 'b').
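As a concrete illustration of this packet layout, the following C++ sketch assembles a minimal OSC message with one integer argument. It is a simplified example written for this text, not firmware from the project; the address "/buttons" and the helper names are chosen for illustration only.

#include <cstring>
#include <cstdint>
#include <vector>

// Append a string to the buffer, zero-padded to a multiple of four bytes
// (every OSC field must end with at least one zero byte).
static void appendPadded(std::vector<uint8_t>& buf, const char* s) {
    size_t len = std::strlen(s);
    buf.insert(buf.end(), s, s + len);
    size_t padded = (len / 4 + 1) * 4;   // always at least one trailing '\0'
    buf.insert(buf.end(), padded - len, 0);
}

// Append a 32-bit integer in big-endian order (OSC uses network byte order).
static void appendInt32(std::vector<uint8_t>& buf, int32_t v) {
    for (int shift = 24; shift >= 0; shift -= 8)
        buf.push_back(static_cast<uint8_t>((v >> shift) & 0xFF));
}

// Build an OSC message such as "/buttons" carrying one integer argument.
std::vector<uint8_t> buildOscMessage(const char* address, int32_t arg) {
    std::vector<uint8_t> buf;
    appendPadded(buf, address);  // address pattern, e.g. "/buttons"
    appendPadded(buf, ",i");     // type tag string: one 32-bit integer
    appendInt32(buf, arg);       // the argument itself
    return buf;
}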

2.3 3D printing

3D printing is an additive manufacturing process in which machines, so called 3D printers, build real physical objects based on a 3D model from a computer. There exists a variety of different types of 3D printers, using different materials and technologies to build the models, though almost all 3D printers currently use the same basic technique of building the model in layers, one at a time, where each new layer fuses together with the previous one. While the layer-by-layer technique gives a ragged surface, and thus not the same quality as injection molding, which gives a smooth surface, 3D printing is well suited and commonly used for rapid prototyping, where models can be created, printed in order to see the end result, and thereafter, if needed, altered and printed again within a relatively short period of time.

2.4 Open Hardware

Open Hardware is hardware released under an Open Source license, allowing anyone to study, modify, and distribute it[4]. This allows a community to build around the hardware, providing plenty of examples and help, and facilitates the hardware selection process since the manufacturer must release a full specification of the hardware.

Open hardware has spread widely, leading to a trend of very small and inexpensive hardware components with good documentation and examples of use with microcontrollers.

2.5 Philips HUE

Philips HUE is a product line for smart wireless lighting. It consists of a bridge, which is connected to a network, and one or more light sources which are connected wirelessly to the bridge. Through communication with the bridge the user can control the brightness and colors of all connected light sources over a network. Philips HUE is well established on the market and widely supported through APIs written in multiple programming languages, making the communication with the bridge, and control of the light sources, easy to integrate in custom applications.

2.6 NeoPixels

NeoPixels is a product line from Adafruit with individually addressable RGB LEDs, allowing the user to control all lights in an array independently[5]. NeoPixels come in multiple variations, such as flexible LED strips with different LED densities, LED rings, and separate LEDs for through-hole mounting. They are easily controlled using a microcontroller and are widely used and supported, with many examples of wiring and code for creating and controlling custom animations.
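As an illustration of how little code individually addressable LEDs require, the following is a minimal Arduino-style sketch using Adafruit's NeoPixel library; the pin number and pixel count are assumptions for the example, not the wiring used in this project.

#include <Adafruit_NeoPixel.h>

#define LED_PIN    6    // data pin (assumption; depends on the wiring)
#define LED_COUNT  28   // number of pixels on the strip

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
    strip.begin();   // initialize the strip
    strip.show();    // turn all pixels off
}

void loop() {
    // Light each pixel red one at a time; each is individually addressable.
    for (int i = 0; i < LED_COUNT; i++) {
        strip.setPixelColor(i, strip.Color(255, 0, 0));
        strip.show();
        delay(50);
    }
}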

2.7 Earlier work

This thesis work was the second part of The Extended Clarinet project at Luleå University of Technology. The first part of the project took place during the end of 2014 and the spring of 2015, during which a first prototype was designed, built, and evaluated. Multiple shortcomings were identified, most of which depended on the chosen main controller, which resulted in the iteration not reaching as far as planned. All work during the first iteration was documented in a technical report which was used as a reference during this thesis work.

The first prototype used a Spark Core as the main controller, a battery with a capacity of 450 mAh with a SparkFun Power Cell as power supply, a NeoPixel strip with a pixel density of 144 LEDs per meter, an accelerometer (SparkFun MMA8452Q), a magnetometer (SparkFun MAG3110), and a touch button controller (Adafruit AT42QT1070).

During the development of the first prototype a clarinet bell, the lowest part of the clarinet (see Figure 2.1), was scanned using a 3D scanner in order to get a 3D model with correct dimensions. The 3D model was then edited and printed to create an inner body for the prototype, on which all hardware could be mounted. With all hardware mounted on the body, an outer shell was designed by the design faculty at LTU and printed to cover all hardware and provide areas for the touch buttons.


Figure 2.1: Illustration of a clarinet's anatomy.

Figure 2.2: The first prototype with the outer shell on the left side removed.

2.7.1 Matters for further development

The main issue of the first prototype was the main controller, the Spark Core. While providing all necessary features, it had a very unreliable WiFi module, which had problems both with connecting to WiFi networks and with maintaining a stable connection. Since a reliable stream of data is required for this project, a stable connection is important.

When using the magnetometer in the first prototype the transmission of button states failed, probably due to a bug with the Spark pre-processor, wherefore the magnetometer was disabled and only accelerometer data was used for motion detection. This issue should be resolved in order to be able to estimate an accurate orientation around all three axes.

While requiring little space and being easy to use firmware wise, the touch buttons and controller introduced other issues. In order for a button to react in a satisfactory fashion, the outer shell had to be very thin at the point of the button, and substantially indented in order to let the end user feel where the button area was. While the sensitivity of the touch sensors was mainly a design issue, the lack of feedback prevented the user from knowing whether the button had been pressed or not. Another issue was that the end user could not feel the way to a button without risking activating it, forcing the user to look for the locations of the buttons, which is not desirable for such a system. While not a primary issue, the touch controller used in the first prototype only supported one button being pressed at a time, which could be an undesirable limitation.

Another hardware related issue was the battery life. Power consumption and battery life were secondary priorities during the first iteration, wherefore only a small, single cell, LiPo battery was used and no focus was put on optimizing for lower power consumption. While a battery with higher capacity would reduce this issue, focus should be put on optimizing the firmware for lower power consumption, such as limiting transmission rate and operations, and putting the main controller in sleep mode when appropriate.

The firmware developed for the first prototype was very static, using hard coded variables for IP addresses and ports for data transmission, and the only option for configuring these was via the cloud service, requiring an internet connection. The firmware should be developed in a more general fashion, allowing the end user to configure all connection and transmission settings from a computer, and the system should be able to operate without an internet connection.

2.8 Related work

While others have performed similar projects, using open hardware and sensors to augment instruments, none seem to have focused on combining this with 3D printing to create custom parts for the instruments, thus hiding the components and maintaining a relatively natural appearance of the instrument.

Schiesser and Schacher augmented a bass clarinet, SABRe[6], by adding motion sensors, additional input buttons, a mouth pressure sensor, and key sensors to detect the current state of the native buttons of the clarinet. Their clarinet had two pre-processing and sending units, and the augmentation hardware was spread out over the clarinet's body, thus visible and not easily removable. This thesis project focuses on creating a more compact and concealed system, and on adding new interaction methods rather than adding extra functionality to native inputs. The SABRe communicates using an IEEE 802.15.4 network, requiring a specific receiver on the computer side.

The Electrumpet[7] is a trumpet augmentation which is easily attached and detached, and uses two Arduino devices to interpret and send the data. As with the SABRe, all hardware is fully visible, significantly altering the visual appearance of the instrument. The Electrumpet uses Bluetooth for communication, allowing it to connect to most modern laptops without the need for specific receiver devices on the computer side. This thesis project aims to use WiFi in order to support communication with most computers without the need for extra devices.

CHAPTER 3

Theory

In this chapter the theory used during this thesis work is described. First, the theory for using sensors to measure movements is presented, followed by the theory for estimating the sensor's current direction using the measured values. After that, the theoretical methods used to provide stable sensor data are described, and lastly the theory of recording audio by converting it from analog sound waves to digital values.

3.1 IMU

An IMU (Inertial Measurement Unit) is a device that uses multiple sensors to measure specific force, angular rate, and in some cases the earth’s magnetic field. The sensors used are normally an accelerometer, a gyroscope, and in cases where the earth’s magnetic field is to be measured, a magnetometer. By having all sensors on the same device it is ensured that the axes of the different sensors are parallel and have the same origin, which is critical for reliable calculations when using data from multiple sensors at the same time.

An accelerometer is a sensor that measures the accelerations that affect the sensor, such as the gravitational force. An accelerometer that is perfectly still reports a value from −1g to 1g for each axis, where an axis will report 0g when the sensor axis is perpendicular to the gravity vector, as illustrated in Figure 3.1. By using an accelerometer with three axes, the sensor's orientation can be estimated around the global X- and Z-axes[8] (see Figure 3.4).


Figure 3.1: A dual-axis accelerometer with its axes perpendicular to the gravity vector.

A gyroscope is a sensor which measures the current rotational motion, more specifically the angular velocity, of the sensor. By measuring continuous rotational velocity around the global axes, rather than rate of change along the axes as with an accelerometer, a gyroscope provides useful data for motion sensing and gesture detection[9]. While basic gesture recognition can be performed with only an accelerometer[10][11], gyroscope data allows for more advanced gesture detection in 3D space by, among other things, allowing for reliable orientation estimation during motion[12]. Gesture detection is however out of the scope of this project, wherefore no advanced motion and gesture detection is covered in this thesis.

A magnetometer is a sensor that measures the strength of magnetic fields, and can be used as a digital compass to estimate a heading around the gravitational axis. Since many objects have a magnetic field, or can influence magnetic fields, the magnetometer data has to be calibrated in order to produce correct readings[13]. There are two types of distortion for magnetometers: hard iron and soft iron distortion.

Hard iron distortions are constant, additive distortions created by objects with a magnetic field that is added to the earth's magnetic field, which offsets the magnetometer readings[14][15] (illustrated in Figure 3.2b). Hard iron offsets are independent of the orientation of the sensor.


Figure 3.2: Illustrations of distortions to a magnetic field where the blue square represents a distortion source and the arrows represent the direction of the magnetic field. a) no distortions, b) hard iron distortion, c) soft iron distortion.

Figure 3.3: Example data for magnetometer readings with: a) no distortions, b) hard iron distortion, c) soft iron distortion, d) both hard iron and soft iron distortions.

Soft iron distortions are caused by objects which distort, or redirect, the surrounding magnetic field without necessarily generating a magnetic field themselves[14][15] (as illustrated in Figure 3.2c). Such distortions are, unlike hard iron distortions, dependent on the orientation of the object relative to the sensor and the earth's magnetic field.

In the ideal case, the values sampled from a magnetometer's axes perpendicular to the gravity vector, when rotated 360 degrees around the vertical axis, would produce a perfect circle centered at (0, 0) when plotted in a coordinate system[15]. As illustrated in Figure 3.3, the hard and soft iron distortions can be seen as offsets and deformations of this circle. To compensate for these offsets and deformations the sensor data has to be calibrated, thus restoring the shape and origin of the circle, which is essential in order to estimate a direction.

To compensate for hard iron interference the sensor is rotated, with the axis of calibration perpendicular to the gravitational vector, and the maximum and minimum measurements are saved. These measurements are then used to determine the center of the circle[15] (as illustrated in Figure 3.3), which equals the calibration offsets to be used for restoring the origin of the circle, thus compensating for the interference.
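A minimal sketch of this min/max calibration approach is given below; it illustrates the principle rather than the firmware used in the prototype, and the struct and method names are invented for the example.

// Hard iron compensation sketch: track the extreme readings while the
// sensor is rotated around the vertical axis, then use the midpoints
// of each axis as offsets.
struct HardIronCalibration {
    float minX =  1e9f, maxX = -1e9f;
    float minZ =  1e9f, maxZ = -1e9f;

    // Feed raw magnetometer samples during the rotation.
    void addSample(float mx, float mz) {
        if (mx < minX) minX = mx;
        if (mx > maxX) maxX = mx;
        if (mz < minZ) minZ = mz;
        if (mz > maxZ) maxZ = mz;
    }

    // The center of the measured circle equals the hard iron offset.
    float offsetX() const { return (maxX + minX) / 2.0f; }
    float offsetZ() const { return (maxZ + minZ) / 2.0f; }
};

// Applied to every subsequent reading: corrected = raw - offset.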

The soft iron interference is, in this case, negligible, wherefore the methods for soft iron compensation are not covered in this thesis.

3.2 Estimating orientation

By using an accelerometer in combination with a magnetometer the current orientation of the instrument can be estimated in terms of the rotation angles roll (φ), pitch (θ), and yaw (ψ). The system uses a global right handed coordinate system with the x-axis pointing right, the y-axis pointing upwards and the z-axis pointing backwards, from the point of view of the musician.

Figure 3.4: Right handed coordinate system.

To calculate the roll and pitch angles the accelerometer is used, and the magnetometer is used to determine the yaw value. Each rotation angle is calculated using two components, which ideally should be perpendicular to the rotation axis. This cannot, however, be achieved by using fixed sensor axes for each component, since the sensor can have an arbitrary rotation. To overcome this problem, tilt compensation can be used. Tilt compensation uses all sensor axes, in addition to the previously calculated rotation angles, to create new components which are perpendicular to the rotation axis. The roll angle is calculated first, wherefore it cannot be compensated; the pitch angle is compensated for the roll, and the yaw angle is compensated for both roll and pitch.

The equations for tilt compensation[8], using the coordinate system described above, are as follows:

tan φ = a_x / a_y                                                  (3.1)

tan θ = −a_z / (a_x sin φ + a_y cos φ)                             (3.2)

tan ψ = (m_x cos φ − m_y sin φ) /
        (−m_x sin φ sin θ − m_y cos φ sin θ − m_z cos θ)           (3.3)

where (a_x, a_y, a_z) are the accelerometer components and (m_x, m_y, m_z) are the magnetometer components.

Since equations (3.1–3.3) have an infinite number of solutions for multiples of 2π, the angles are calculated using the software functions ATAN and ATAN2, where ATAN outputs a value in the interval (−π/2, π/2) and ATAN2 outputs a value in the interval [−π, π], thus giving

φ = ATAN2(a_x, a_y)                                                (3.4)

θ = ATAN(−a_z / (a_x sin φ + a_y cos φ))                           (3.5)

ψ = ATAN2(m_x cos φ − m_y sin φ,
          −m_x sin φ sin θ − m_y cos φ sin θ − m_z cos θ)          (3.6)
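As an illustration, the following C++ function is a direct translation of equations (3.4)–(3.6); it is a sketch written for this text, not code from the prototype firmware.

#include <cmath>

struct Orientation { float roll, pitch, yaw; };  // angles in radians

// (ax, ay, az) are accelerometer components and (mx, my, mz) are
// calibrated magnetometer components, in the coordinate system above.
Orientation estimateOrientation(float ax, float ay, float az,
                                float mx, float my, float mz) {
    Orientation o;
    o.roll  = std::atan2(ax, ay);                              // (3.4)
    o.pitch = std::atan(-az / (ax * std::sin(o.roll)
                             + ay * std::cos(o.roll)));        // (3.5)
    o.yaw   = std::atan2(mx * std::cos(o.roll) - my * std::sin(o.roll),
                        -mx * std::sin(o.roll) * std::sin(o.pitch)
                        - my * std::cos(o.roll) * std::sin(o.pitch)
                        - mz * std::cos(o.pitch));             // (3.6)
    return o;
}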

3.3 Data filtering

The values from sensors such as accelerometers usually fluctuate with a high frequency. Filtering of the measured values is performed in order to smooth out the data, thus removing the fluctuations.

A simple moving average filter is a filter which outputs the average of a series of numbers. The series which is averaged is the N latest measured values, thus giving the filter equation[16]

X_i = (1/N) Σ_{n=0}^{N−1} V_{i−n},   N > 0                         (3.7)

where X_i is the filtered value, V_i is the measured value, and N is the filter constant.

A low-pass filter is a filter which strongly attenuates high frequency fluctuations, and passes lower frequencies. A simple low-pass filter can be implemented using linear interpolation[17], leading to the filter equation

X_i = X_{i−1} C + V_i (1.0 − C),   0 ≤ C < 1                       (3.8)

where X_i is the filtered value, C is the filter coefficient, and V_i is the measured value.
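For illustration, minimal C++ implementations of these two filters might look as follows. The class names are invented for the example; as in the prototype (see Section 4.2.5), the low-pass filter starts from the first raw reading to avoid an initial error.

#include <cstddef>

// Simple moving average over the N latest samples, equation (3.7).
template <size_t N>
class MovingAverage {
    float buf[N] = {0};
    size_t idx = 0, count = 0;
public:
    float update(float v) {
        buf[idx] = v;
        idx = (idx + 1) % N;          // circular buffer of recent samples
        if (count < N) count++;
        float sum = 0;
        for (size_t i = 0; i < count; i++) sum += buf[i];
        return sum / count;
    }
};

// Low-pass filter by linear interpolation, equation (3.8).
class LowPass {
    float c;            // filter coefficient, 0 <= c < 1
    float x = 0;        // previous output
    bool first = true;
public:
    explicit LowPass(float coefficient) : c(coefficient) {}
    float update(float v) {
        if (first) { x = v; first = false; }  // initialize from raw reading
        x = x * c + v * (1.0f - c);
        return x;
    }
};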

The Kalman filter is a two-step filter, which predicts a state for the system and updates it using real measurements. A standard Kalman filter is complicated and includes many matrix operations. However, for filtering sensor data a one dimensional Kalman filter, which can be applied once for each sensor axis, is sufficient. Using the equations in [18], and replacing the covariance matrices W_t and V_t with constants, Q and R, in order to allow static configuration of the filter, leads to

P_predicted = P_{i−1} + Q                                          (3.9)

K_i = P_predicted / (P_predicted + R)                              (3.10)

X_i = X_{i−1} + K_i (V_i − X_{i−1})                                (3.11)

P_i = (1.0 − K_i) P_predicted                                      (3.12)

where X_i is the filtered value, V_i is the measured value, P is the estimation error covariance, Q is the process noise covariance, and R is the measurement noise covariance.
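A one dimensional Kalman filter following equations (3.9)–(3.12) can be written in a few lines; the sketch below is illustrative only, with Q and R passed in as the static configuration constants.

// One-dimensional Kalman filter, equations (3.9)-(3.12), applied
// independently per sensor axis.
class Kalman1D {
    float q, r;       // process and measurement noise covariances
    float x = 0.0f;   // filtered value
    float p = 1.0f;   // estimation error covariance
public:
    Kalman1D(float processNoise, float measurementNoise)
        : q(processNoise), r(measurementNoise) {}

    float update(float v) {
        float pPred = p + q;             // (3.9)  predict covariance
        float k = pPred / (pPred + r);   // (3.10) Kalman gain
        x = x + k * (v - x);             // (3.11) update estimate
        p = (1.0f - k) * pPred;          // (3.12) update covariance
        return x;
    }
};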

3.4 Audio sampling

Audio sampling, or digitizing, is the process used to record audio waves by converting them from an analog (continuous) signal to a digital (discrete-time) signal. When sampling, samples of the analog signal are taken at regular intervals, determined by the sampling frequency, and stored as digital values. Since the ADC (Analog-to-Digital Converter) has a finite range, the samples will be quantized to a restricted set of values[19], thus introducing errors to the sampled signal (see Figure 3.5). A theoretical ideal sampler introduces no quantization.

Figure 3.5: Representation of sampling and quantization. The continuous signal (red) is sampled at regular intervals and the sample values (black) are matched to the closest discrete value.

Figure 3.6: Example of aliasing, where two distinct signals share sampling points.

According to the sampling theorem[20], in order to allow perfect reconstruction of the sampled signal back to an analog signal, the sampling frequency f_s must be at least 2·f_max, where f_max is the highest frequency in the original signal. Lower sampling frequencies introduce aliasing, where different signals become indistinguishable when sampled[21] (as illustrated in Figure 3.6).

CHAPTER 4

Methodology

In this chapter the methodology for the development process during this thesis work is described. First the hardware is presented, discussing the selection of the hardware components and how they are connected. Following that, the developed software solutions for the system are presented.

4.1 Hardware

This section explains the hardware chosen for this project, starting with an overview of the system as a whole followed by detailed explanations of the individual parts.

The main part of the system is the main controller, which controls all hardware, performs necessary computations, and communicates with MAX on the computer in order to provide the sensor data to the user. Connected to the controller is a small display for easier setup and configuration, an IMU for motion detection, an LED strip for visual effects, six tactile buttons, a LiPo charger and booster to allow for power supply from a LiPo battery, and a LiPo fuel gauge to measure the current battery level. The display, the IMU, and the fuel gauge communicate with the main controller using I2C while the buttons and LEDs are connected directly to the digital I/O on the controller.

All hardware was put on the printed body, as illustrated in Figure 4.1, using hot glue, and was connected as illustrated in the circuit diagram (see Figure B.2).


Figure 4.1: Illustration of the location of the hardware on the 3D printed body.

4.1.1 Control unit

The main controller used during the first part of the project, the Spark Core, had multiple shortcomings, wherefore a new controller had to be chosen in order to get a stable and more user friendly prototype.

The controller chosen for this project was the Particle Photon, the successor to the Spark Core. It was chosen because it is code and pin compatible with the Core while offering better hardware specifications, such as a faster CPU, more RAM, and a new WiFi module. The WiFi module in the Photon supports acting as a soft access point (AP), allowing it to create its own WiFi network for other nodes to connect to, which would give a more self-contained system and a more user friendly setup; however, an official API for developers to utilize this functionality has not yet been released. The AP mode currently only serves the function of letting the user connect to the Photon, if it could not connect to a WiFi network, in order to wirelessly add configurations for WiFi networks. Thus, while not usable to its full potential, the AP mode still serves a role in making the system setup more user friendly.

Another controller that was considered was the Intel Edison. The Edison offers much better hardware specifications, has both WiFi and Bluetooth built in, and is smaller. It was not chosen because it requires all extension blocks (modules connected to the Edison to provide additional features, such as I2C communication or a battery connector) to be stacked, which takes up more space vertically and makes the prototype more bulky, unlike a setup where all parts are connected through wires, which allows for arbitrary positioning of the hardware components. It would also have required more work to bring the setup to the same state as the first prototype, while the Photon could be used directly with all existing hardware and software.

4.1.2 IMU

For motion sensing, the accelerometer and magnetometer used in the first prototype were exchanged for an IMU (SparkFun LSM9DS1) with nine degrees of freedom, including an accelerometer, a magnetometer, and a gyroscope, all with three axes. The sensor board was placed flat on the top side of the bell (see Figure 4.1) with the Y-axis pointing forwards (from the musician's point of view), and the axes were remapped in the software to match a global right handed coordinate system (see Figure 3.4).

Using an IMU instead of three separate sensors gives a more uniform and compact system in addition to ensuring that all axes of the different sensors are parallel, which improves the accuracy and facilitates the estimation of the current orientation using data from multiple sensors since no additional calibration to compensate for sensor alignment is needed.

4.1.3 Power supply

For power supply, a SparkFun Power Cell was chosen together with a 3.7 V LiPo battery with a capacity of 2000 mAh. The Power Cell was chosen because it converts the battery voltage to 5 V, provides charging capability using Micro USB, and has the ability to disable the power supply using a switch, allowing the user to turn the system off without disconnecting the battery. All components that require 5 V were connected directly to the Power Cell's 5 V output, along with a decoupling capacitor of 1000 µF in accordance with the best practices for NeoPixels[5]. The components requiring 3.3 V were connected to the 3.3 V supply on the Photon.

In order to minimize the effects of soft iron distortions introduced by the battery[22], the battery was placed on the opposite side, as far away as possible, from the IMU (see Figure 4.1). To give the user an indication of the current battery level a SparkFun LiPo Fuel Gauge was connected directly to the battery connectors on the Power Cell and connected to the Photon via I2C.

4.1.4 Display

For easier configuration and setup a small display was added. The display chosen was a SparkFun MicroView, a small microcontroller with a built-in display. The MicroView was connected to the Photon via I2C and coded in a passive fashion, where the Photon keeps track of the current display mode and formats all data for the display, and the MicroView only displays the received data. The display was put on top of the IMU, facing directly outwards from the body (see Figure 4.1). In a first test the display was angled towards the user, but it was after testing put flat on the body to minimize the amount of visible hardware since, due to the size and position of the display, the musician could not see the display clearly when using the instrument.

4.1.5 Buttons

Due to the lack of physical feedback and inconsistent response, the touch buttons and controller used in the first prototype were replaced with physical momentary buttons connected between ground and digital inputs on the Photon.

A total of six momentary buttons were placed on the bell: two on each side for extra inputs to the computer, one on the bottom side for calibration of the sensors, and one on the display to cycle through the different display modes.

4.1.6 LED

For lighting effects in the clarinet, a NeoPixel strip of 28 LEDs with a pixel density of 144 LEDs/meter was put on the inside of the bottom end of the bell, facing inwards. The visibility of the LEDs through the plastic depends on the material used for the bell. An alternative that was considered was to use individually addressable LEDs mounted in holes around the bottom ring of the bell, facing forwards. By mounting them in holes, the material of the bell would not have any influence on the visibility of the light, but such a setup would not allow as dense a concentration of LEDs, and with a bell printed in half translucent plastic a smoothing effect was added to the light, making the light from the individual LEDs blend together and light up more of the bell itself.


Figure 4.2: Picture of a NeoPixel strip and two separate NeoPixels made for through-hole mounting.

4.1.7 Microphone

To add the ability to record audio, an electret microphone with amplifier (Adafruit 1063) was connected to an analog input on the Photon. Other similar components were investigated, but the Adafruit was chosen because it has adjustable gain, while the others had static gain, and its microphone has a wider frequency range (20–20,000 Hz).

4.2 Software

This section explains the software solutions implemented in this project.

4.2.1 Data transfer

The sampled and calculated data from the Photon was sent to MAX over UDP using the MAX OSC package format. The data was divided into four main categories: button states, raw sensor data, calculated sensor data, and orientation, where the sensor data categories were divided into three parts, one for each sensor. The button states were sent as an integer, where the least significant byte represents the current state of all buttons as a bit array with button one at the least significant bit, and a value of 1 represents the button being pressed. The raw sensor data consists of the raw sensor readings, after eventual filtering, represented as integers. The calculated sensor data consists of the measured values for each sensor represented as floating point numbers.

The measured values have units of g-force for the accelerometer, degrees per second for the gyroscope, and gauss for the magnetometer. The orientation consists of three floating point numbers, representing the estimated orientation of the sensor in terms of roll, pitch, and yaw, respectively.

Only the requested data is sent, thus altering the transmitted list of data. The data is sorted as seen in Table 4.1, with only the selected parts present. The request message is an OSC message with two integer arguments, where the least significant byte of the first argument represents the requested data, and the second argument specifies which UDP port the data should be transmitted to.

Data     Button states | Raw sensor data         | Calculated sensor data  | Orientation
         Buttons       | Acc.   Mag.   Gyro.     | Acc.   Mag.   Gyro.     | Roll, Pitch, Yaw
Format   i             | i i i  i i i  i i i     | f f f  f f f  f f f     | f f f

Table 4.1: Format of the received data. i indicates an integer value, f indicates a float value.

     Request bits   Request integer   Received list format
a)   00000000       0                 None
b)   01100000       96                i i i i i i
c)   10000011       131               i f f f f f f
d)   11111111       255               i i i i i i i i i i f f f f f f f f f f f f

Table 4.2: Examples of data requests and received list formats. i indicates an integer value, f indicates a float value. a) Request no data, stops transmission. b) Request raw sensor readings for the accelerometer and the magnetometer. c) Request button states, calculated gyro data (degrees per second), and estimated orientation (roll, pitch, and yaw). d) Request all data.
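The exact bit assignment is not spelled out above, but a layout consistent with the examples in Table 4.2 would place the button states at the most significant bit, the raw sensor flags below it, then the calculated sensor flags, and the orientation at the least significant bit. The following C++ sketch decodes such a hypothetical request byte:

#include <cstdint>

// Hypothetical bit layout, inferred from the examples in Table 4.2.
enum RequestBits : uint8_t {
    REQ_BUTTONS   = 1 << 7,   // button states
    REQ_RAW_ACC   = 1 << 6,   // raw accelerometer readings
    REQ_RAW_MAG   = 1 << 5,   // raw magnetometer readings
    REQ_RAW_GYRO  = 1 << 4,   // raw gyroscope readings
    REQ_CALC_ACC  = 1 << 3,   // calculated accelerometer data (g)
    REQ_CALC_MAG  = 1 << 2,   // calculated magnetometer data (gauss)
    REQ_CALC_GYRO = 1 << 1,   // calculated gyroscope data (deg/s)
    REQ_ORIENT    = 1 << 0,   // estimated roll, pitch, yaw
};

bool wantsData(uint8_t request, RequestBits bit) {
    return (request & bit) != 0;   // request == 0 stops all transmission
}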

4.2.2 Calibration

To make the calibration easy to use for the end user, a calibration button was added to the prototype to allow for calibration of the gyroscope, calibration of the magnetometer, and setting an orientation offset, thus allowing the user to measure the orientation relative to the set orientation instead of the sensor's absolute orientation. The calibration button is used for all three settings: a single press sets the initial orientation to the current absolute orientation of the sensor, holding the button down between 1.5 and 3 seconds before release enters the gyroscope calibration mode, and holding it down for over three seconds before release enters the magnetometer calibration mode.

The gyroscope calibration mode sets offsets for the gyroscope in order to minimize sensor drift. When entering the mode the user will, through instructions on the display, be prompted to hold the sensor perfectly still and press the calibration button. When pressed, the Photon collects 128 consecutive samples for each gyro axis, calculates the average for each axis, and sets it as the offset to be used for future sensor readings.
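A sketch of the averaging step might look as follows, with readGyro() standing in as a hypothetical placeholder for the actual sensor driver call:

// Gyroscope offset calibration sketch: average 128 consecutive samples
// per axis while the sensor is held still.
struct Vec3 { float x, y, z; };

Vec3 readGyro();   // hypothetical: returns one raw gyro sample (deg/s)

Vec3 calibrateGyroOffsets() {
    Vec3 sum = {0, 0, 0};
    const int samples = 128;
    for (int i = 0; i < samples; i++) {
        Vec3 g = readGyro();
        sum.x += g.x; sum.y += g.y; sum.z += g.z;
    }
    // The averages are stored as offsets and subtracted from all
    // future gyroscope readings.
    return { sum.x / samples, sum.y / samples, sum.z / samples };
}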

When entering the magnetometer calibration mode the user will be instructed to hold the sensor horizontally and rotate around the vertical axis before pressing the calibration button again, which prompts the user to hold the sensor vertically and rotate around the vertical axis again before pressing the calibration button to exit the calibration mode.

Figure 4.3: Illustration of the two steps of the magnetometer calibration.

The calibration is performed in two steps, where the first step calibrates the X- and Z-axes and the second step calibrates the Y-axis of the sensor. When sampling values for the calibration, the sensor axis needs to be perpendicular to the gravitational vector, as illustrated in Figure 4.3, in order to minimize calibration errors. To ensure good sampling data, the measurement was paused if the sensor was tilted by more than approximately five degrees. In order to provide feedback to the user the LEDs were used, lighting green when the sensor orientation was good and red if the sensor was tilted.

4.2.3 Audio sampling

The audio sampling was implemented by connecting the chosen microphone and amplifier to an analog input on the Photon and using a timer interrupt on the Photon to read the value at regular intervals. For simplicity when testing, the sampled data was sent to MAX using OSC over UDP, since this required no external packages in MAX and the OSC format was already used for sending other data, allowing for quick adaptation. It was planned, once a satisfactory sampling rate and sound quality had been achieved, to exchange the OSC sending for compact packages over TCP to ensure packet delivery and minimize overhead. However, due to problems with the sound quality and with achieving a reliable sampling rate (see Section 5.2), the TCP transmission was never implemented and the audio sampling was removed.
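For illustration, a simplified polling variant of such a sampling loop is sketched below in Wiring/Particle style; the prototype used a timer interrupt instead, and the pin, sample rate, and bufferSample() helper are assumptions for the example.

// Simplified polling variant of a regular-interval sampling loop.
const int           MIC_PIN          = A0;    // assumed analog input pin
const unsigned long SAMPLE_PERIOD_US = 125;   // 8 kHz sampling rate

void bufferSample(int sample);   // hypothetical: queue sample for sending

unsigned long nextSample = 0;

void loop() {
    unsigned long now = micros();
    if ((long)(now - nextSample) >= 0) {       // handles micros() wrap-around
        nextSample = now + SAMPLE_PERIOD_US;
        int sample = analogRead(MIC_PIN);      // 12-bit value on the Photon
        bufferSample(sample);
    }
}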

4.2.4 Configuration

In contrast to the first prototype, where all configurations were hard coded in the firmware, configuration of this prototype was performed via MAX. A configuration patch was created, which establishes a TCP connection to the instrument in order to send and receive configuration packets formatted according to the MAX OSC packet standard (see Section 2.2). The built-in objects for TCP communication in MAX only support the sending of MAX matrices, which was not suitable for configuration messages. Therefore a separate TCP module was developed, based on the mxj tcpClient created by Arvid Tomayko[23], which supports MAX messages as input and converts these to the MAX OSC format before sending, in a similar fashion to the built-in udpsend object[24]. The configuration patch was designed to present a simple interface to the user for easy setup and configuration.

While mainly designed for usage via the provided interface (see Figure 4.4), the configuration patch was built to allow control from other patches via an inlet object as well as a receive object (for receiving MAX messages without patch cords). All messages sent to the inlet are routed to the corresponding configuration. A special receive object was also added, connected directly to the TCP object, allowing advanced users to send messages directly to the instrument (see Table A.1 for supported messages) without setting the parameters in the configuration patch. All messages sent from, and received by, the configuration patch are sent to an outlet as well as a send object (for sending MAX messages without patch cords), thus allowing the user and other patches to study, save, and keep track of the messages and the current configuration state of the instrument.


Figure 4.4: The configuration patch created for MAX. Settings from top to bottom: 1) TCP connection to instrument. IP address and port for the instrument. 2) Data transfer. Which data should be sent to the computer and which port it should be sent to. 3) Latching mode. Which buttons should be in latching mode (on) or momentary mode (off). 4) Filter. Configure how the sensor data should be filtered. Filter parameters are shown for the chosen filter. 5) Magnetometer offset. Set offset values or get the current offsets from the instrument.

4.2.5 Filtering

In order to get smoother and more stable sensor data without flickering, filtering was implemented on the Photon. Three types of filters were implemented: a one dimensional Kalman filter, a low-pass filter, and a simple moving average filter (see Section 3.3). The filters were implemented to, when active, filter the raw sensor data directly when it was updated, and, when initialized, to start with the raw sensor readings rather than previously filtered data in order to start at a correct value and thus minimize potential initial errors.

Figure 4.5: Example of data filtering using a simple moving average filter with different filter constants.

Figure 4.6: Example of delay introduced by a simple moving average filter using different values of the filter constant.

The activation of filters and the setting of filter parameters are possible via the configuration patch in MAX (see Section 4.2.4), allowing user friendly setup and configuration. The option to set custom filters and parameters makes it possible to achieve optimal filtering for the current application. While heavy filtering might be good for slower movements, providing very smooth data, it introduces noticeable delay and faulty readings for faster changing data (see Figure 4.5 and Figure 4.6).

4.2.6 Visualization

The visualization of the collected data was divided into three parts: visualization using the LEDs in the instrument, using Philips HUE to alter the lighting in the room, and 3D graphics in MAX on the computer.

All visualizations are based on the estimated orientation of the instrument, mainly mapping the values to different colors to visualize movement. Common to all is that the current orientation around the Z-axis (see Figure 3.4), in degrees, is directly mapped to the corresponding color on the color wheel (see Figure 4.7) to use as a base for the visualization. All visualizations are controlled via different patches in MAX on the computer.
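A sketch of such an angle-to-color mapping, converting a 0–360 degree angle to RGB at full saturation and brightness on the HSV color wheel, might look as follows; the function is illustrative only, not the MAX patch used in the project.

#include <cmath>

struct RGB { float r, g, b; };   // components in [0, 1]

// Map an angle in degrees onto the HSV color wheel (S = V = 1),
// giving the base color for the visualizations.
RGB angleToColor(float degrees) {
    // Normalize to [0, 360) and scale into six 60-degree sectors.
    float h = std::fmod(std::fmod(degrees, 360.0f) + 360.0f, 360.0f) / 60.0f;
    float x = 1.0f - std::fabs(std::fmod(h, 2.0f) - 1.0f);
    switch (static_cast<int>(h)) {
        case 0:  return {1, x, 0};   // red   -> yellow
        case 1:  return {x, 1, 0};   // yellow-> green
        case 2:  return {0, 1, x};   // green -> cyan
        case 3:  return {0, x, 1};   // cyan  -> blue
        case 4:  return {x, 0, 1};   // blue  -> magenta
        default: return {1, 0, x};   // magenta -> red
    }
}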


Figure 4.8: The NeoPixel LEDs lighting up the bell when rotated 180 degrees from its initial direction with no pitch.

The visualization using the LEDs in the instrument uses the pitch value (see Section 3.2) to offset the base color, thus giving a visualization of movement around two axes. To achieve smooth transitions and avoid flickering, a transition time of 0.5 seconds was used for the LEDs and new values were sent at the same interval.

The HUE visualization consisted of three HUE light bulbs. Only the base color was used, but it was sent to all lights with different delays, where the first light got the new value directly, the second after 0.9 seconds, and the third after 1.8 seconds. This was done to let the lights act as a flowing timeline. As with the LEDs, a transition time, in this case one second, was used to avoid flickering and give smooth transitions, and the sending interval was 0.95 seconds. By only using the base color for the HUE lights, the audience is given a reference for the pitch offset used for the LEDs.

The 3D visualization provides a simple augmented reality based view. It contains a simple 3D scene with a floor and a back wall with wooden textures (see Figure 4.9). On the back wall, the video feed from a camera directed towards the musician was projected in order to add the reality element. The floor in front of the video plane had eight stretched spheres, which change their material's emission parameter depending on the current orientation around the Z-axis, giving the illusion that the light directly in front of the musician lights up. The base color was added as scene lighting to match the reality.


Figure 4.9: Screenshot from the 3D visualization.

The pitch and yaw angles were used in the 3D visualization to control the position of the camera, allowing the musician to control the 3D view. The yaw value was used, as with the lights, to make the camera always appear to be positioned in front of the musician, while the pitch value was used to control the elevation of the camera.

CHAPTER 5

Evaluation

5.1 Hardware and design

For easy removal, the Photon and MicroView were connected to the system via socket headers, adding height to the position of these components. The use of socket headers allowed easier prototyping, since the MicroView had to be disconnected in order to be reprogrammed, but the added height affected the design of the outer shell adversely. With the final software flashed on the controllers, the socket headers should have been removed and all wires soldered directly to the connectors on the controllers to allow for a slimmer design of the shell.

The weight of the prototype was not a matter for consideration during this project, but due to the increased weight of the developed prototype, 211 grams compared to the first prototype's 109 grams, the weight started to become an issue by affecting the physical impression when playing, adding an unwieldy feeling to the instrument. The added weight was mainly due to the addition of a display, the new battery, and, due to the larger battery and the inconvenient position of the Photon and the display, the bigger outer shell.

Overall the hardware met the expectations, allowing the system to be built in a relatively easy manner. The use of physical momentary push buttons instead of the previous touch buttons improved the user experience and allowed multiple buttons to be used at the same time, a feature which was limited by the hardware for the touch buttons. The display, while not providing any useful features when playing, significantly increased the ease of setup and configuration. The upgrade from the Core to the Photon was the main improvement hardware wise, since it resolved the issues with unstable WiFi connection, allowing the system to work effortlessly as intended.

Figure 5.1: Close-up pictures of the Photon and MicroView connected to the socket headers.

Figure 5.2: Comparison of the outer shells for the first (silver) and second (white) prototype.

5.2 Audio sampling

The implemented audio sampling was satisfactory in the sense of showing a working principle, but due to limitations with the chosen components it did not fulfill the expectations for this project, wherefore it was removed from the final version of the prototype.

The first problem was to get a steady sampling frequency. While the Photon has the capability of sampling at regular intervals at a very high frequency, its system loop, which maintains the WiFi connection, disables interrupts and therefore creates a small window without samples each loop.

The second problem was the amount of noise and scratch introduced with the sampling. Most of the noise was introduced by the amplifier, and was constant regardless of the gain set for the amplification. While sufficient for showing a working principle, small microphones with built-in amplifiers connected directly to an analog input do not provide the quality needed for a functional prototype. For satisfactory sampling quality, an external sound card with a connector for an arbitrary microphone should be used.

5.3 Interaction

The interaction methods used to interact with the developed prototype are very elementary. The implemented motion detection provides a new way of interacting with a clarinet in a simple fashion, but does not add a higher level of interaction. The estimation of the current direction of the instrument is sufficiently accurate for smaller rotations of the instrument and slower motions, but has compromised reliability for more extreme rotations, such as roll angles outside the interval of ±45 degrees or pitch values outside the interval of ±80 degrees, where the estimated rotation angles are prone to fluctuations and estimation errors due to the implemented tilt compensation. Estimation errors are also introduced by faster movements due to the increased acceleration, something which could largely be resolved by using a more advanced model for the estimation, using the gyroscope measurements to compensate.

5.4 Visualization and audience experience

The implemented methods for visualization of the interactions, NeoPixels, Philips HUE, and the 3D environment, all, individually and together, add a second level to the audience experience, allowing the artist to increase the visual impression and thus add a more dynamic feeling to the performance.

While implementing all visualizations in MAX increases ease of use and allows for a compact system, the HUE and 3D visualizations would benefit from being implemented as separate applications. With the visualizations running, MAX was prone to crash frequently. The HUE visualization had the problem that MAX, after a certain period of time, failed to create a connection to make the HTTP requests controlling the lights, thus stopping the visualization. With the 3D visualization, MAX had a tendency to quit unexpectedly. The 3D visualization also suffered from a noticeable amount of lag at times, and in order to provide smooth animations and minimize flickering it needed heavy filtering, which added a noticeable delay to the animations. The delay due to filtering could be resolved by using different filters and tweaking the parameters, something which can be done in MAX but would be easier to customize in a desirable fashion in a stand-alone application.

More advanced means of visualization, which could allow for visualization of other data or allow audience interaction through augmented reality on mobile devices, could be implemented, as long as the main principle of the visualization, if its purpose is to mediate data such as the artist's movements, remains relatively simple and clear. For a clear mediating visualization, the audience should be able to understand how it is influenced.

5.5 Comparison with first prototype

To easily evaluate the new prototype developed during this thesis work, it was compared with the first version in order to get a clear overview of the improvements made.

The battery life was tested on both prototypes starting with a fully charged battery, the controller configured to transmit all available data to MAX, and the LED strip on at half brightness. The second prototype (developed during this project) had the display turned off, the filtering set to a Kalman filter, and the LED visualization active with the orientation set to match a color of full red. The first prototype had the LED color set statically to full red. The settings were chosen to imitate typical settings for a performance rather than a worst case scenario. The first prototype had a battery life of 28 minutes, while the second prototype had a battery life of 4 hours and 8 minutes.

The first prototype had no filtering implemented on the main controller, forcing the user to use the built-in slide object in MAX or to implement their own filters in MAX. While not a major issue, it was not as user friendly as the system aims to be. The second prototype has three types of filters implemented, allowing the user to choose an appropriate filter for the current application, although it could be done in a more general manner (see Section 6.2.4).


5.6 Artist evaluation

The clarinetist who tested the prototype during the development process, Robert Ek, thought the new prototype was a significant improvement compared to the earlier version. The system is more stable and reliable, and easier to use and set up. The orientation estimation combined with the ability to set a direction offset as well as built in filtering allows for fast and easy setup as well as a simple yet effective way of interacting with the instrument. The improved battery life also allows for the prototype to be used for longer periods of time, which is essential for live performances.

The disadvantages of the new prototype were its size and weight. The new battery is significantly larger than the previous one and adds considerable weight as well as takes up a lot of space on the body. The size is more of an aesthetic issue, while the weight is more critical as it affects the movements, making the clarinet feel somewhat unwieldy.


Chapter 6 – Discussion

6.1 Goals and purpose

All goals for the augmentation part of the project were met, excluding the implementation of satisfactory audio sampling, although a working proof of concept was shown.

In the aspect of interaction through motion, the expected results were achieved. The simple model used in this project to detect motion, estimating the direction in which the clarinet points in terms of three rotation angles, was perceived as easy to integrate with MAX to influence the music, easy to use while playing, and an intuitive way of using movement to influence the music and visualizations. While not fully utilized in this thesis work, the use of multiple different sensors allows for many new possibilities to influence the music through motion.

It was found that in order to digitize the music within the extension of the instrument, inexpensive and open hardware can be used, but in order to achieve satisfactory quality, hardware specifically made for such operations, such as external sound cards, should be used. The sampled audio should be sent using TCP in order to ensure delivery and correct packet order.

The proposed method of transferring the collected data to a computer is to use WiFi and send the data as OSC packets over UDP for fast delivery. While not as reliable as TCP, UDP allows for a fast continuous stream of data from the controller. The OSC format allows for direct communication with multiple modern multimedia applications, including MAX. For applications which do not natively support the OSC format, the use of an open standard with a well defined specification allows for relatively easy adaptation to add support for the protocol.
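For reference, the sketch below shows how a single OSC message with three float arguments could be packed and streamed over UDP from Wiring-style firmware such as the Photon's. The address pattern, host address, ports, and transmission rate are illustrative assumptions.

    // Packing and sending one OSC message ("/clarinet/orientation", ",fff")
    // over UDP. OSC requires null-padded, 4-byte-aligned strings and
    // big-endian arguments.
    #include "Particle.h"

    UDP udp;
    IPAddress host(192, 168, 1, 10);  // receiving computer (assumed address)
    const int PORT = 8000;

    // Append a null-terminated, 4-byte-padded OSC string; return new offset.
    size_t oscString(uint8_t* buf, size_t off, const char* s) {
        size_t len = strlen(s) + 1;                  // include terminating null
        memcpy(buf + off, s, len);
        while (len % 4 != 0) buf[off + len++] = 0;   // pad to 4-byte boundary
        return off + len;
    }

    // Append one big-endian float; return new offset.
    size_t oscFloat(uint8_t* buf, size_t off, float f) {
        uint32_t bits;
        memcpy(&bits, &f, 4);
        for (int i = 3; i >= 0; --i) buf[off++] = (bits >> (8 * i)) & 0xFF;
        return off;
    }

    void setup() {
        udp.begin(8888);  // local port
    }

    void loop() {
        uint8_t packet[64];
        size_t n = 0;
        n = oscString(packet, n, "/clarinet/orientation");  // address pattern
        n = oscString(packet, n, ",fff");                   // type tag string
        n = oscFloat(packet, n, 10.0f);   // roll  (placeholder values)
        n = oscFloat(packet, n, -5.0f);   // pitch
        n = oscFloat(packet, n, 90.0f);   // yaw
        udp.beginPacket(host, PORT);
        udp.write(packet, n);
        udp.endPacket();
        delay(20);  // ~50 Hz stream; UDP favors rate over guaranteed delivery
    }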

The use of the collected data for visualization and for improving the audience experience flowed together, with the visualization being used to provide a new experience for the audience. The end results were satisfactory, though matters for improvement exist. While this project focused mainly on extending the visual experience for the audience, other methods of increasing the overall audience experience were found, such as using motion to send the music to different speakers in the room, adding a more dynamic and interactive level to the auditory experience, or using the sensor data to let motions interact with the decor on the stage.

6.2 Future work

While the prototype developed during this project has many improvements compared to the first prototype, there are still many things that could be improved in order to reach all goals in a satisfactory manner and produce a finished product.

6.2.1 General

Two issues regarding user friendliness with the developed prototype are that the user has to remove one side of the outer shell in order to charge the battery, and that the Photon has to be turned on and connected to a WiFi network in order to show the current battery level. The first could easily be solved by adding a separate charging connector which could be accessed without removing the shell. The second could be solved either by adding a small battery indicator, or by editing the firmware to show the battery level while the system is starting and trying to connect to a WiFi network. For good user friendliness, hardware wise, the end user should never have to remove the outer shell.

6.2.2 Controller

For future iterations the Particle Photon should be exchanged for a unit which is more suitable for audio sampling and preferably allows full use and configuration of a soft access point for WiFi. Although the Photon was chosen over the Intel Edison for this project, the Edison, or a controller with similar specifications, would be a better choice for future generations. With the ability for the main controller to act as an access point, i.e. creating its own WiFi network for computers to connect to, the system would be more independent and could offer a more user friendly setup.

6.2.3 Power consumption

As during the first iteration of the project, no primary focus was placed on power consumption and battery life. The battery used for the first prototype was exchanged for a larger one, and the display has an off mode in order to increase the battery life, but much can still be done to save power. Transmission rates should be analyzed in order to send data at as low a frequency as possible while still not introducing noticeable latency. By allowing the user to put the controller into sleep mode, which could be useful if the transmission of data is only needed during certain parts of a performance, the power consumption could be greatly reduced.
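As a sketch of what such a sleep mode could look like on the Photon, the controller could be put into stop mode between active parts of a performance and woken by one of the existing buttons; the pin assignment below is an assumption.

    // Entering stop mode on a Photon: CPU and WiFi are powered down, RAM is
    // retained, and execution resumes on a rising edge on the wake pin.
    #include "Particle.h"

    const int WAKE_PIN = D1;  // assumed wiring: one of the existing buttons

    void enterLowPower() {
        System.sleep(WAKE_PIN, RISING);  // blocks here until the button is pressed
        // WiFi reconnects after wake-up, before data transmission resumes.
    }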

The most power consuming component is the LED strip. By adding a second battery to the prototype only for the LED strip, the battery life could be greatly improved for the other components. Another way to reduce power consumption would be to exchange the strip used for another with fewer LEDs, or for fewer separate LEDs (as mentioned in Section 4.1.6).

6.2.4 Independent filtering

The filtering implemented in this project is applied directly to the sensor data when read. While this is acceptable for most applications, the ability to set filters independently for each host should be added, since different patches could require different filters or filter parameters in order to work properly. With independent filtering, such patches could be used simultaneously, placing fewer restrictions on the end user. Another solution would be to turn the filtering off on the instrument and implement filtering in MAX instead, but that would give a less user friendly system.
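One way to structure such independent filtering is to keep one filter instance per subscribed host, as in the sketch below, where a one-pole low-pass filter stands in for the implemented filter types and the structure itself is a hypothetical illustration.

    // Per-host filtering: each host keeps its own filter state, so patches
    // with different requirements can run at the same time.
    #include <map>
    #include <string>

    struct LowPass {
        float coeff = 0.2f;     // per-host filter coefficient
        float state = 0.0f;
        float apply(float x) {  // y[n] = a*x[n] + (1 - a)*y[n-1]
            state = coeff * x + (1.0f - coeff) * state;
            return state;
        }
    };

    // One filter per host (keyed by "ip:port"); in practice one per sensor axis.
    std::map<std::string, LowPass> hostFilters;

    float filteredFor(const std::string& host, float raw) {
        return hostFilters[host].apply(raw);  // default filter created on first use
    }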


6.2.5 Gesture detection

The current prototype only allows for more advanced gesture detection, such as detecting sweeping motions or specific motion patterns, on the computer side, using the raw sensor data, while the built-in calculations on the Photon only estimate the current orientation of the instrument. For future iterations, methods for advanced gesture detection should be investigated and, if applicable, implemented in a manner where the main controller can send notifications when specific gestures are detected. This would allow for a more dynamic, personal, and creative interaction with the instrument.
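A starting point for on-controller detection could be as simple as the sketch below, which reports a sweep when the yaw rate from the gyroscope stays above a threshold for a minimum duration; the thresholds and the notification hook are illustrative assumptions.

    // Simple sweep detector on gyroscope yaw rate.
    #include <math.h>

    const float RATE_THRESHOLD = 120.0f;     // deg/s, tuning assumption
    const unsigned long MIN_DURATION = 150;  // ms, tuning assumption

    unsigned long sweepStart = 0;
    bool inSweep = false;
    bool reported = false;

    // Call once per sensor reading with the yaw rate and the time in ms.
    void detectSweep(float yawRate, unsigned long nowMs) {
        if (fabsf(yawRate) > RATE_THRESHOLD) {
            if (!inSweep) { inSweep = true; reported = false; sweepStart = nowMs; }
            if (!reported && nowMs - sweepStart > MIN_DURATION) {
                // notifySweep(yawRate > 0);  // e.g. send one OSC notification
                reported = true;              // one notification per sweep
            }
        } else {
            inSweep = false;  // rate dropped: the sweep has ended
        }
    }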

6.2.6 Audio sampling

With a new controller (see Section 6.2.2) the audio sampling could be implemented in a satisfactory manner. If the chosen controller supports USB (like the suggested Intel Edison), a small USB sound card could be connected, allowing an arbitrary microphone with a 3.5 mm connector to be connected and used. Using a separate sound card also ensures relatively good quality of the sampled audio. Since the Edison can run Linux, the sampling could be done by a separate application running in another thread, thus resolving the issue with unstable sampling frequency.
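On such a Linux-capable controller, the capture thread could be built on ALSA along the lines of the sketch below; the device name, sample format, and rate are assumptions, and forwarding of the captured frames is left as a comment.

    // Audio capture in a dedicated thread using ALSA.
    // Build: g++ capture.cpp -lasound -lpthread
    #include <alsa/asoundlib.h>
    #include <atomic>
    #include <thread>

    std::atomic<bool> running{true};

    void captureLoop() {
        snd_pcm_t* pcm;
        if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_CAPTURE, 0) < 0) return;
        // 16-bit mono at 22.05 kHz, interleaved access, 0.5 s maximum latency.
        snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED, 1, 22050, 1, 500000);
        short buf[512];
        while (running) {
            snd_pcm_sframes_t n = snd_pcm_readi(pcm, buf, 512);  // blocks for data
            if (n < 0) snd_pcm_recover(pcm, n, 0);  // recover from overruns
            // else: forward n frames over TCP to the computer running MAX
        }
        snd_pcm_close(pcm);
    }

    int main() {
        std::thread capture(captureLoop);  // sampling isolated in its own thread
        // ... rest of the application ...
        running = false;
        capture.join();
        return 0;
    }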

6.2.7 Custom hardware and design

While good for fast development of prototypes, the use of multiple hardware components results in an unnecessarily large and cumbersome prototype, with many wires and components. Since all hardware chosen for this project is released under open source licenses, it is fully possible to, in a relatively simple manner, design a custom PCB containing the hardware for all components, making it smaller and letting all sensors share data lines and power supply. This could significantly reduce the size needed inside the prototype, allowing for a slimmer shell around it, creating a more natural looking prototype.

During this project all hardware was placed on a generic, 3D printed bell, and the outer shell was then designed to fit around the hardware. This made it easy to build a prototype fast, but increased the difficulty of placing the hardware symmetrically and parallel with the axes of the bell. By modeling a bell specific to the chosen hardware, with flat bases for the sensors and mounts for the buttons, a more symmetric prototype could be built, in addition to providing more accurate readings by ensuring that the sensor axes are parallel with the instrument's axes.


The prototype could easily be converted to fit other instruments, since no hardware is specific to the clarinet. Only the physical layout of the hardware would have to change, in addition to designing and printing a new body and shell which fit the other instrument.

6.3 Conclusions

Although there still exist multiple matters for further development, the prototype developed during this thesis project shows a distinguishable evolution compared to the first prototype. The change of control unit from the Spark Core to the Particle Photon significantly improved the reliability of the system, mainly the WiFi connection, and allowed for easier initial setup, although another controller, like the Intel Edison, would have allowed more of the desired features to be implemented.

By allowing all configuration to be done via MAX instead of forcing the user to use the cloud service, in addition to providing more options for configuration, the second prototype gives a more user friendly system and removes the need for an internet connection, making the overall system more independent.

The lack of audio sampling is not critical, as other, independent solutions exist, although having all necessary electronics integrated within the same system would increase the ease of use. The integration of satisfactory yet inexpensive audio sampling in the prototype would most likely increase the interest in such a product on a future market.

The use of novel techniques for creating and augmenting musical instruments in this fashion, using inexpensive open hardware components, allows individuals to express themselves musically in a new way, adding a level of embodiment to the creation of music and empowering dynamic creativity.

Through motion and visualization a new way of conveying emotions linked to the music is introduced, where the artist can, for example, direct the instrument downwards and move slowly in order to create a more dismal visualization when playing gloomy music, or change the lighting to animate with bright colorful light when directing the instrument upwards and moving around to convey happiness and activity.

The added level of embodiment allows for a redefinition of musical creation and processes involving music. For example, where dance moves would normally be choreographed to match the music, the process could be altered so that the music is created partially through the dance moves.


Overall, although some of the initial goals were excluded, the system works in a satisfactory manner, providing new methods of interaction with the instrument in order to influence the music through motion, and providing visual feedback of the collected data to the audience, thus extending the overall experience.


Appendix A – Tables

Address | Type tag string | Arguments | Description
dat | i i | data, port | Request data.
con | - | - | Connect to cloud.
lat | i | latch | Turn latching mode on or off for the buttons.
lsc | i | rgb | Set one color for all LEDs. Red, green, and blue values are packed in one integer, in respective order, with the most significant byte empty.
lsb | i | brightness | Set brightness for the LEDs; values between 0 and 255 are accepted.
lst | i | transition | Set the transition time for the LEDs, in milliseconds.
rec | i | rects | Light up quadrants of the display when in the corresponding display mode. The last four bits of the integer control the states of the quadrants.
fil | i i/f f f | filter, c/p/n, q, r | Set filtering. The first integer selects the filter type: 0 = off, 1 = low pass, 2 = Kalman, 3 = average. The remaining arguments set parameters for the filters. For no filtering, all are ignored. For low-pass filtering, the first float is the filter coefficient and the others are ignored. For Kalman filtering, the floats are the estimated error, process noise, and sensor noise, in respective order. For moving average filtering, the first argument (integer) sets the filter constant.
cre | s s | ssid, passphrase | Add new WiFi credentials.
cal | i i i | x, y, z | Set magnetometer calibration values (hard iron offsets).
gca | - | - | Request the current magnetometer calibration values.

Table A.1: Configuration messages supported by the main controller
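As an example of the lsc argument encoding, the packed color integer can be formed by shifting the byte values into place; packColor below is a hypothetical helper, not part of the firmware.

    // Pack red, green, and blue (0-255 each) into one integer with the most
    // significant byte empty, as expected by the lsc message.
    int packColor(int r, int g, int b) {
        return (r << 16) | (g << 8) | b;  // packColor(255, 0, 0) == 0xFF0000 (full red)
    }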


Appendix B – Figures

Figure B.1: Size comparison of the battery used in the second prototype (left) with the battery used in the first prototype (right).



