Refining sensor readings

After achieving the first goal from Blog #3, turning an effect on above a certain fret and off again below it, I have to admit that I struggled a little to figure out my next steps. Luckily, my supervisor and I were able to schedule and hold a meeting last week, and it really helped me work out what to do next. This weekend was all about writing and refining the code I had so far.

One of my problems was that the sensor data I received in Pure Data fluctuated in a quite volatile manner, which made it far from ideal for controlling parameters like delay time or feedback amount. My supervisor suggested applying a filter that smooths the readings, and we decided on the moving average method. Based on this method, I added a few lines to my code so that the Arduino now takes ten samples of the ultrasonic sensor’s distance readings, sums them up and calculates the average by dividing the sum by ten. I then use this average of ten incoming samples, instead of individual values, for further computation and MIDI conversion. I tried it out, and at least in the serial monitor of the Arduino IDE the readings are less volatile. However, I still have to test it in conjunction with Pure Data and actual audio effect applications. The readings are especially smooth with integer values, which is what I used at first. Since I fear that I might need float values in my further computations, I changed the average filter to work with floating-point numbers. Then again, integer values may turn out to be enough – at this point I simply cannot tell.
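
To illustrate the averaging step, here is a minimal sketch of how it can look on the Arduino, assuming an HC-SR04-style sensor read via pulseIn(); the pin numbers and the centimetre conversion are illustrative rather than the exact values from my sketch:

```cpp
// Moving-average step: take ten raw readings, output their mean.
// Assumptions: HC-SR04-style sensor; pins and conversion are illustrative.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int NUM_SAMPLES = 10;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

// One raw distance reading in centimetres.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);   // 10 µs trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // Echo time in microseconds; ~58 µs per centimetre (round trip).
  return pulseIn(ECHO_PIN, HIGH) / 58.0;
}

void loop() {
  // Sum ten samples and divide by ten: the moving-average filter.
  float sum = 0.0;
  for (int i = 0; i < NUM_SAMPLES; i++) {
    sum += readDistanceCm();
  }
  float average = sum / NUM_SAMPLES;
  Serial.println(average);  // this smoothed value feeds the MIDI conversion
}
```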

Besides the moving average filter, another measure to smooth the incoming MIDI data was to send it less frequently than the sensor outputs its readings. Using the millis() function, I programmed the Arduino to keep track of the time since the program started and to wait 20 ms between MIDI messages sent to my laptop. Of course, the interval must not be too long, so that the parameter changes of the audio effects controlled by the incoming data still feel smooth. I think it works, but as with the moving average filter, I still have to confirm the measure’s positive impact in conjunction with Pure Data and actual audio effect applications.
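
In code, this throttling is just a non-blocking timer check; here is a sketch of the idea, with the sensor reading and the actual MIDI call stubbed out since they depend on the rest of the program:

```cpp
// Rate limiting: only send MIDI once at least 20 ms have passed since
// the last message, without blocking the sensor-reading loop.
const unsigned long SEND_INTERVAL_MS = 20;
unsigned long lastSendTime = 0;

void setup() {
  // sensor and MIDI initialisation would go here
}

void loop() {
  // ... read the sensor and update the averaged value on every pass ...

  unsigned long now = millis();
  if (now - lastSendTime >= SEND_INTERVAL_MS) {  // unsigned math survives rollover
    lastSendTime = now;
    // sendMidiValue(average);  // hypothetical helper that sends one message
  }
}
```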

Additionally, my supervisor and I talked about ways to address some problems I was experiencing in Pure Data. The first problem was massive latency, so we discussed ways to reduce it. However, as I focused on the coding aspects this weekend, I have not gotten around to that part yet. There were also constant clicks whenever I used the MIDI data from the ultrasonic sensor to alter effect parameters in Pure Data. Luckily, my supervisor could help me out: apparently, I had (unintentionally) implemented a kind of true-bypass solution, which comes with its characteristic popping noise when switched on or off. Using the sig~ object and the lowpass filter lop~, I now first convert the control (numeric) value to an audio signal and then filter it, which should remove the pops.
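
For anyone not working in Pd: the sig~ → lop~ chain effectively runs the stepped control value through a one-pole lowpass, so that abrupt jumps become short, click-free glides. A purely illustrative C++ version of the same idea (in the project itself this happens inside Pure Data):

```cpp
// One-pole lowpass smoothing of a stepped control value – the same idea
// as Pd's sig~ -> lop~ chain (illustrative values throughout).
#include <cmath>
#include <cstdio>

int main() {
  const double PI = 3.14159265358979;
  const double sampleRate = 44100.0;
  const double cutoffHz = 20.0;  // low cutoff = slow, pop-free transitions
  // Feedback coefficient of a standard one-pole lowpass.
  const double a = std::exp(-2.0 * PI * cutoffHz / sampleRate);

  double smoothed = 0.0;
  const double target = 1.0;  // the control value jumps from 0 to 1

  // y[n] = (1 - a) * x[n] + a * y[n-1]: each sample creeps towards the target.
  for (int n = 0; n < 8; n++) {
    smoothed = (1.0 - a) * target + a * smoothed;
    std::printf("%.6f\n", smoothed);
  }
  return 0;
}
```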

Working with Pure Data

I initially planned to use the MIDI data coming in from the sensor + Arduino directly in my DAW of choice, Cubase. This, however, proved difficult: either Cubase is not really suited for this kind of external input, or I do not know Cubase well enough to make it work. Although I could select the Arduino as a MIDI input in Cubase, I only managed to assign a handful of parameters to be controlled, and these were not the parameters I needed. Consequently, I abandoned the idea of using my DAW (for now) and decided to try Pure Data instead, since my goal at the moment is only to try things out and I do not need a perfect setup yet – only a working one XD.

Luckily, I managed to write a patch in Pure Data that receives the MIDI input from the sensor + Arduino and lets me use it further. With no DAW plug-ins at hand, I had to make my own, and with some help from YouTube I built a delay. I then assigned the incoming MIDI data to the delay parameters and played the guitar to see how my playing could change them. The parameters controlled by the sensor data were “Delay Time”, “Feedback Amount” and “Delay Volume (dry/wet)”. I made the following observations:

The sensor measurements, and consequently the MIDI values, fluctuate quite a bit even when I try not to move my fretting hand at all. This is not ideal, particularly for controlling the “Delay Time” and “Feedback Amount” parameters. Additionally, dropouts occur from time to time, with the MIDI value falling to zero before jumping back up to the initial value, which of course ruins the delay settings. I think this problem could be solved by defining certain MIDI value ranges, e.g. from 97 to 103, within which the effect parameter stays the same. This way one could achieve more stable settings.
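
A quick C++ sketch of how such a “dead band” could look; the window of ±3 is just a guess, and in the end this logic would live in the Pd patch or on the Arduino:

```cpp
// Dead-band idea: ignore incoming MIDI values that stay within +/-3 of
// the last accepted value, so jitter does not disturb the effect setting.
#include <cstdio>
#include <cstdlib>

const int DEAD_BAND = 3;
int acceptedValue = 100;  // last value actually applied to the effect

// Returns the value the effect should use for an incoming MIDI value.
int applyDeadBand(int incoming) {
  if (std::abs(incoming - acceptedValue) > DEAD_BAND) {
    acceptedValue = incoming;  // change is big enough: follow it
  }
  return acceptedValue;        // small jitter: keep the old setting
}

int main() {
  const int test[] = {100, 98, 103, 104, 60, 62};
  for (int v : test) {
    std::printf("in=%3d -> out=%3d\n", v, applyDeadBand(v));
  }
  return 0;
}
```

A dropout all the way to zero would still slip through such a window, since that jump is large; it would probably need separate treatment, e.g. discarding isolated zero readings.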

There are annoying clicks when the delay parameters change. I tried to smooth the transitions with a PD “line” object, but the clicking did not really improve. At the moment I do not know what causes this problem or how to deal with it.

I had the greatest success with using the MIDI data to switch the delay on and off depending on whether one plays above or below a certain point of the fretboard. I discovered the PD “moses” object and used it to turn the delay off for basic chords in the lower region of the fretboard and on when I reach up for the higher frets. This setup is not harmed by the fluctuation problem since there is more room for error, so to speak: if MIDI values over 50 mean ON, it does not matter whether the value is 56 or 60 – the effect stays ON.
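
In code, the moses-style split is a plain threshold comparison. It could even be hardened with two thresholds (hysteresis), so that values hovering right around the split point cannot re-toggle the effect; note that this goes beyond what moses itself does and is only a sketch:

```cpp
// Threshold switching with hysteresis: rise above 55 to switch ON, fall
// below 45 to switch OFF. Thresholds follow the "over 50 means ON"
// example above; the +/-5 guard band is an invented refinement.
#include <cstdio>

const int ON_THRESHOLD  = 55;
const int OFF_THRESHOLD = 45;
bool effectOn = false;

void update(int midiValue) {
  if (!effectOn && midiValue > ON_THRESHOLD)  effectOn = true;
  if (effectOn  && midiValue < OFF_THRESHOLD) effectOn = false;
}

int main() {
  const int test[] = {40, 52, 56, 48, 44, 58};
  for (int v : test) {
    update(v);
    std::printf("value=%2d -> %s\n", v, effectOn ? "ON" : "OFF");
  }
  return 0;
}
```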

To complement this discovery, I made a custom overdrive/distortion effect in Pure Data using the “clip” object. It is no Tube Screamer, but it serves my purpose. I also used the MIDI data to turn this effect on and off, and it works really well, apart from some latency issues.
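
At its core, the clip-based overdrive just boosts the signal and clamps it to a fixed range; a rough C++ equivalent, with gain and threshold values invented for illustration:

```cpp
// Hard-clipping overdrive in one line: boost, then clamp to [-t, t].
// Gain and threshold are illustrative, not the values from my patch.
#include <algorithm>
#include <cstdio>

float overdrive(float sample, float gain = 8.0f, float threshold = 0.5f) {
  return std::clamp(sample * gain, -threshold, threshold);
}

int main() {
  const float samples[] = {0.01f, 0.05f, 0.2f, -0.3f};
  for (float s : samples) {
    std::printf("%6.3f -> %6.3f\n", s, overdrive(s));
  }
  return 0;
}
```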

Being able to turn an effect on above a certain fret and off below it is an important step towards my envisioned “Solo Mode”, so I am pretty happy that I made it work and achieved the first goal of my project.

First Steps in the Right Direction

Last week I tested positive for COVID-19 and actually felt ill as well, so I focused on recovering and put the project on the back burner. With no new developments to write about, I also skipped the weekly blog post. This week, however, I already felt much better and continued with the project. I have to admit that the project had not been as fruitful as hoped until now, with little or no progress even after hours of work. This week, though, things worked out to some extent, and I made a few steps in the right direction.

The first milestone was writing an Arduino IDE sketch that lets me use the Arduino as a MIDI output device. Initially, I had borrowed an Arduino UNO from the FH, only to find out that an UNO is not really capable of acting as a MIDI device. I then received an Arduino Pro Micro from my supervisor, but it required a specific set-up procedure that appeared quite daunting to a programming noob (= me). Luckily, the FH also had an Arduino Leonardo, which works with MIDI AND is easy to set up, so I decided to use that one. With the right Arduino available, I started working on the code and managed to write a sketch that takes the distance measurements of the ultrasonic sensor and converts them into MIDI messages, which I can subsequently use to control parameters in my DAW or Pure Data.
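
On the Leonardo, the MIDIUSB library is one common way to build such a sketch. Below is a hedged reconstruction of the idea; the pins, the usable distance range and the CC number are chosen for illustration and are not necessarily the values from my actual code:

```cpp
// Distance -> MIDI CC on an Arduino Leonardo via the MIDIUSB library.
// Pins, distance range and CC number are illustrative assumptions.
#include <MIDIUSB.h>

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH) / 58;  // echo time (µs) -> cm
}

void sendControlChange(byte channel, byte control, byte value) {
  midiEventPacket_t event = {0x0B, (byte)(0xB0 | channel), control, value};
  MidiUSB.sendMIDI(event);
  MidiUSB.flush();
}

void loop() {
  long cm = readDistanceCm();
  // Map the usable range along the neck (assumed 5-60 cm here) to 0-127.
  long midiValue = constrain(map(cm, 5, 60, 0, 127), 0, 127);
  sendControlChange(0, 1, (byte)midiValue);  // CC#1 on MIDI channel 1
  delay(20);
}
```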

The next challenge was to attach the ultrasonic sensor (+ the Arduino) to the neck of the guitar in a way that gives me reliable and useful readings of my fretting hand’s distance, so that I can pinpoint its position along the neck. The problem is that the posture of the fretting hand varies a lot depending on what one plays: the hand’s posture when playing barre chords, for example, is very different from when playing single notes. Therefore, I tried quite a few positions before settling on the one seen in the picture. I even considered dropping the ultrasonic sensor in favor of a different sensor that may be better suited for this kind of measurement. For now, the ultrasonic sensor seems to work, but I will keep other options in mind.

With the ideal position identified, I still had to find a way to attach the sensor to the neck. This proved quite difficult because the required position, while suitable for measuring the fretting hand’s distance, is not at all ideal for mounting. Furthermore, I am no DIY genius, and I was still in quarantine, so I had to make do with the materials I had in my apartment. In the end, I “built” a small box for the ultrasonic sensor out of a Bepanthen package and used a single chopstick to attach it to the guitar’s headstock. I very much doubted that I had managed to put the sensor in the right position, but to my great astonishment it works well enough to allow for some tone testing.

Welcome to Phase 2 of the project!

Welcome back to my blog series covering the progress of my guitar project! After enjoying the semester break, I decided it was time to dedicate myself to my project again. As specified in my exposé, the second phase of the project is concerned with developing working setups that are sufficiently reliable and allow for further practical research regarding suitable effects, playability and performability. The second phase includes tasks like:

  • Determine and acquire the necessary equipment, including an IMU sensor, an ultrasonic sensor and two Arduino microcontrollers
  • Determine the ideal placement of the sensors and microcontrollers on the guitar neck and pick/right hand and install them accordingly
  • Program the microcontrollers to pick up the movements of the fretting and picking hands
  • Determine suitable effects and parameters to be triggered by the movements of the fretting and picking hands

On Tuesday, I rented an Arduino UNO starter kit from the FH JOANNEUM and bought a pack of ultrasonic sensors, which I will need for my left-hand guitar setup. I decided to start with the fretting-hand setup using the ultrasonic sensor, since I think it will be easier to implement than the picking-hand setup involving the IMU sensor. Thanks to YouTube university, I was quickly able to code a sketch that lets me measure distances with the sensor. As outlined in my exposé, the plan is to use the distance data to pinpoint the hand along the neck; the position of the guitarist’s hand then serves as input data to modulate effects.
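
For reference, here is a minimal distance-measurement sketch of the kind I put together; the pins and timing follow the usual HC-SR04 recipe, and my exact code may differ:

```cpp
// Basic ultrasonic distance measurement: trigger a pulse, time the echo,
// print the distance in centimetres (pins are illustrative).
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);   // 10 µs trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long duration = pulseIn(ECHO_PIN, HIGH);  // echo time in microseconds
  long cm = duration / 58;                  // ~58 µs per cm (round trip)
  Serial.println(cm);
  delay(60);  // let stray echoes die down between measurements
}
```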

On Thursday I met my new supervisor, DI Marian Weger from the KUG Graz. Although Mr. Ciciliani guided me very well through my ideation and research phase, we decided in the previous semester to part ways, since my current project focus requires a technical, coding-oriented approach more than an artistic one. Consequently, I will now be working with Marian. During the kick-off meeting last Thursday, we talked about organizational aspects as well as the project itself. One very interesting issue was raised: I had always envisioned controlling digital effect plug-ins in a DAW. Marian, however, mentioned that it would also be interesting to him if the setups controlled the settings of “analogue” effect pedals. While I think the modulation possibilities are more limited with actual effect pedals, it is certainly a noteworthy perspective, since my goal is to extend the range of sonic possibilities of an electric guitar in a non-invasive manner. While I primarily focus on not interrupting the player’s natural hand movements, “non-invasive” could also refer to rig compatibility. The setups would therefore fulfill the working hypotheses even better if they could be used in conjunction with a conventional guitar rig consisting of effect pedals and did not require an additional laptop or other “invasive” measures.

During the meeting, I also received my first “homework”: to come up with a way to convert sensor data to MIDI data and thus turn the Arduino UNO I am working with into a kind of MIDI controller. Unfortunately, after several hours of watching tutorials and sifting through online forums, it turned out that the Arduino UNO is not the best type of Arduino for working with MIDI: it lacks the native USB support that lets boards based on the ATmega32U4 chip, such as the Pro Micro or the Leonardo, appear to the computer as USB MIDI devices. Luckily, I can borrow an Arduino Pro Micro from the IEM on Monday, with which I will hopefully be able to accomplish the task. Once the coding part is done, the real fun begins: mounting the ultrasonic sensor on my guitar neck and experimenting with different sounds and control parameters. The goal is to find out which hand movements are suited to triggering which effects or effect parameters.

Writing the Exposé

Checking in for my 13th blog entry this semester.

I used the Christmas holidays to compile the research done in November and December and subsequently wrote my Exposé. I am excited (and also a little bit anxious) to see how it will be received by the staff of the FH JOANNEUM and the KUG. This week, I will also start to work on my final presentation – here the greatest challenge will probably be to reduce the whole project idea to a mere five minutes of presentation time.

Initially, I also wanted to start with some practical work and thus rented an Arduino board and bought some ultrasonic sensors. Unfortunately, January is pretty much packed with tests and final submissions, so I had to put experimenting with the Arduino on the back burner. Nevertheless, I will have the Arduino until the end of the semester, so maybe I will still get a chance to try it out…

Official cancellation of Idea #1

This week I concluded that idea #1, the “Guitar Drum Map”, will not be pursued any further. If you have followed my previous posts, this should not come as a surprise, as I almost immediately focused on my other idea, the Extended Guitar Performance. However, I still wanted to keep the first idea as a back-up – at least until now. With the Extended Guitar Performance going quite well, I decided to officially stop pursuing the Guitar Drum Map. The reasons are various.

First of all, researching two ideas simultaneously is very time-consuming and, considering the workload from other subjects this semester, not reasonable.

Furthermore, I wanted my project to be an opportunity to learn and work with Pure Data and possibly other kinds of programming, since I do not have much experience in this field yet. My research so far suggests, however, that working with Pure Data will probably not be necessary for idea #1, the “Guitar Drum Map”.

There are already several commercial products available that enable guitar audio-to-MIDI conversion, offer DAW integration and can be used to trigger VST drum instruments. Consequently, it would be unnecessary to develop a custom Pure Data solution.

Said products include:

  • Roland GK-3 hexaphonic pick-up
    • May be used with Roland and BOSS guitar synthesizers only
  • Fishman TriplePlay hexaphonic pick-up
    • Direct MIDI output
  • Jam Origin MIDI Guitar 2 software
    • I tested the trial version of the software. Unfortunately, it was not compatible with Cubase’s Groove Agent, my go-to drum VST. I tested other sounds instead and was not totally convinced: there is still some latency, especially when playing chords.

Moreover, I spoke to a fellow guitarist about the idea, and he raised additional doubts concerning the actual practicability and use of such a guitar audio-to-MIDI solution. This further discouraged me, since one of my goals with this project is to make something that other musicians/guitarists can benefit from and find applications for.

My final reason for abandoning this idea is that I find it really difficult to identify a “unique selling proposition”, so to speak, that would make it stand out and sufficiently innovative.

As already mentioned, I will instead focus on my second idea, “Extended Guitar Performance”, for the remainder of this semester (and hopefully my master’s).

Nevertheless, the time invested was not completely in vain: the Fishman TriplePlay pick-up I found during my research may still come in handy for my other idea, the Extended Guitar Performance, and I will keep it in the back of my head.

Sources:

https://www.thomann.de/at/fishman_triple_play.htm

https://www.thomann.de/at/roland_gk3_midipickup.htm

https://www.jamorigin.com/

Reference works for Extended Guitar Performance #3

During my research on Google Scholar, I came upon a journal article called “The Electric Guitar: An Augmented Instrument and a Tool for Musical Composition”. I found its content highly inspiring, since it deals with the history of the sonic augmentation of the electric guitar and is thus highly relevant for my project (which, of course, also seeks to extend the range of sound possibilities of an electric guitar).

According to Lähdeoja, Navarret, Quintans and Sedes, the electric guitar can be considered a highly augmented instrument in itself, since it is basically an acoustic guitar whose sonic possibilities were, and still are, extended by electromechanical and digital means. At the origin of the electric guitar stands, of course, the acoustic guitar: as far as the basic physical qualities are concerned, the electric guitar inherited features such as metallic strings, a glued or screwed neck strengthened with a truss rod, tone woods (maple or mahogany necks with a maple, rosewood or ebony fingerboard), metallic frets and a chromatic scale from its acoustic predecessor. With the invention of the solid body, however, a new era began, introducing a major extension of sonic capabilities through electric amplification technologies. In an electric guitar, the actual sound source of the acoustic guitar becomes part of an electro-acoustic chain that additionally comprises a range of analog and/or digital modules necessary for producing sound, which gives the electric guitar a modular character. Of course, the electrification of the guitar sparked a whole new way of thinking about and creating sounds, leading to the development of new playing techniques.

While “the electrification of the guitar is probably the most important modification the instrument has undergone in the twentieth century”, the instrument and its sound possibilities continued to be intensely advanced in the following decades, with the developments driven by guitarists, instrument makers and the requirements of different musical styles. The aforementioned modularity of the electric guitar, which includes the selection of tone woods, pick-ups, amplifiers, effect chains, etc., provides the guitarist with a lot of freedom as far as customizability is concerned. However, as with every instrument, the growth of sonic possibilities and corresponding control interfaces also makes playing more and more complex, potentially overburdening the musician’s (physical) capabilities. To avoid the latter, most control interfaces for the electric guitar have been conceived with simplicity in mind, featuring relatively simple controls that are traditionally placed conveniently within the guitarist’s reach (e.g. an effect pedal that is activated or deactivated by stepping on it). While serving its purpose of not overburdening the player, this approach, as Lähdeoja, Navarret, Quintans and Sedes observe, limits the possibilities for dynamic, real-time interaction with the controls, leading to a status they describe as the “sonic stasis common in electric guitar playing: the player chooses a specific sound for a musical part with “on/off” effect switches, playing with the same timbre until the next “monolithic” modification”.

I think this latter notion is very interesting because it justifies my quest to find ways of extending the sonic range of an electric guitar using only the natural movements of the player as an input source for effects. While the article states that research is currently being done on this topic, I think my project and its outcomes can still add to it.

Source:

Lähdeoja, O., Navarret, B., Quintans, S., & Sedes, A. (2010). The Electric Guitar: An Augmented Instrument and a Tool for Musical Composition. Journal of Interdisciplinary Music Studies, 4(2), 37-54. https://doi.org/10.4407/jims.2010.11.003

Idea Update

Good news: conducting thorough research on extended guitar sound possibilities during November allowed me to further specify my project idea “Extended Guitar Performance”. By first discerning what has already been done in this area of musical research, I identified several commercial products (see the MIDI rings in Blog #6) as well as custom-made ones (e.g. Blog #9 + Blog #10) that all deal with movement- and/or gesture-controlled extension of guitar sounds. However, I also found that the bulk of these “gadgets” require the guitar player to move his body and hands in specific ways just to trigger the sensors built into the gadget. Consequently, the guitarist often needs to interrupt his natural playing movements, for example to wave his hand. I saw this for myself when I tried to keep playing guitar in the usual way while incorporating special hand movements to trigger hypothetical sensors (having not built anything yet) – it proved almost impossible to maintain a typical rock groove, for instance, let alone play lead.

That is why I thought it would be even better if one could extend a guitar’s sonic possibilities without compromising the usual way people play guitar. Based on this notion, I decided to look for a non-invasive approach, restricting my idea to using only the natural movements of a guitarist to trigger sensors and thereby extend the guitar’s sound possibilities.

Consequently, some changes to my previous idea will be made:

I will abandon the idea of attaching (accelerometer) sensors to the guitar body to pick up its movements, because moving the guitar in a certain way is, again, not part of a guitarist’s natural movements.

Instead, I will focus on the “Solo Mode” effect described in Blog #8, and on effects based on the same concept, since this kind of system allows the guitar player to move up and down the neck with the effects adapting automatically to the fret position. During this week’s consultation session, I brought this idea before my supervisor, and he recommended looking into ultrasonic or infrared sensors to pinpoint the left hand along the neck. This will, of course, be done. Furthermore, my supervisor also saw possibilities for this system beyond switching from rhythm to lead sound: one could define certain areas/positions on the fretboard that each produce a slightly different sound, allowing the guitarist to change his sound according to fret position.
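
To make that last idea concrete, here is a hypothetical sketch of how distance readings could select between such fretboard areas; the ranges and the sounds assigned to them are invented:

```cpp
// Fretboard zones: divide the measured hand distance into areas that
// each select a different sound preset (all values are invented).
#include <cstdio>

int zoneForDistance(float cm) {
  if (cm < 15.0f) return 0;  // near the nut: clean rhythm sound
  if (cm < 30.0f) return 1;  // mid neck: light crunch
  return 2;                  // high frets: lead sound
}

int main() {
  const float readings[] = {8.0f, 22.5f, 41.0f};
  for (float r : readings) {
    std::printf("%.1f cm -> zone %d\n", r, zoneForDistance(r));
  }
  return 0;
}
```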

I will also further pursue the idea of placing an IMU-type sensor on the pick to capture the guitarist’s natural strumming and picking movements. Here, it would have been practical to resort to one of the MIDI rings, which are, however, quite expensive. At the moment, a custom setup is therefore favoured.

Naturally, I will also continue to analyse a guitarist’s natural body and hand movements in order to come up with additional ways to harness them as an input source for guitar effects.

Lastly, upon the recommendation of my supervisor, I will not yet fully discard the possibility of using MIDI pick-ups. During my research I stumbled upon the Fishman TriplePlay, a hexaphonic pick-up capable of picking up the pitch of each string separately and converting it to MIDI data. Albeit coming with a huge price tag (300 € on Thomann), the pick-up constitutes another possibility of adding to a guitar’s sound potential without compromising the player’s usual movements. This will be given further thought.

Sources:

https://www.thomann.de/at/fishman_triple_play.htm

Reference works for Extended Guitar Performance #2

Although I used Google Scholar extensively to find scientific articles while writing my Bachelor’s thesis, I never considered using it for my Extended Guitar Performance project – until today. It was a good decision, for I discovered some great articles dealing with electric guitars and possibilities to further extend or evolve their sonic capabilities. One of these articles is briefly summarised below.

MIDI Pick

The first article I found documents the development of the so-called MIDI Pick. This special pick serves a dual purpose: on the one hand, it can be used as a conventional pick to pluck the strings of an electric guitar; on the other, it functions as a pressure trigger, interpreting finger pressure exerted on it as analog or digital values. The pick itself is made of wood, rubber and double-sided tape, with a force-sensing resistor mounted on it. The sensor is connected to an Arduino microcontroller, and a Bluetooth module transmits the data wirelessly; these two components are attached to a strap worn around the wrist. As already mentioned, the MIDI Pick needs to be squeezed, and the harder the pressure, the higher the number it outputs. The output is received by a Max/MSP patch that relays the data to other patches. Furthermore, the MIDI Pick can operate in serial or switch mode, with the mode selected by a switch on the wrist. In serial mode, values between 0 and 127 are transmitted. In switch mode, the pick sends a 1 when the pressure exceeds a certain threshold and a 0 the next time the threshold is exceeded, essentially making the pick a toggle switch. The developer successfully tested the MIDI Pick in a live performance, using it as a controller for a white-noise-generating patch. In this context, he also noted that using the MIDI Pick adequately takes time and practice. The article also gave an outlook on an updated version of the MIDI Pick; however, I did not find another article documenting its further development.
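
As I understand the switch mode, it behaves like a latching toggle driven by a pressure threshold. A hypothetical Arduino reconstruction of that behaviour – not the author’s original code; the pin and threshold are guesses:

```cpp
// Latching "switch mode": toggle the output state each time the pressure
// on a force-sensing resistor crosses the threshold (all values guessed).
const int FSR_PIN = A0;
const int THRESHOLD = 600;   // raw ADC reading that counts as a squeeze
bool state = false;          // the toggled 0/1 the pick would send
bool wasPressed = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool pressed = analogRead(FSR_PIN) > THRESHOLD;
  if (pressed && !wasPressed) {  // react only to the rising edge
    state = !state;
    Serial.println(state ? 1 : 0);
  }
  wasPressed = pressed;
  delay(10);
}
```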

Personal thoughts

This article is definitely interesting for my project, because the latter will also involve using the pick in one way or another to add to the sonic capabilities of the guitar. In fact, the notion of placing a pressure sensor on a pick opened up a whole new world of possibilities for me as far as sensors are concerned. Let me explain: until now, I had only thought about mounting an accelerometer/gyroscope/IMU kind of sensor on the pick or the back of the hand to register, e.g., the strumming movements of the hand. Now I see that I need not restrict my thinking to these sensors alone. While the idea of using a pressure sensor is evidently taken (XD), I immediately thought of a touch sensor, more precisely a capacitive touch sensor. A capacitive touch sensor registers touch based on the electrical disturbance caused by a change in capacitance, not based on applied pressure (in contrast to a resistive touch sensor). As far as applications in a guitar context are concerned, such a touch sensor could be used to trigger or activate an effect by double-tapping on the pick, for example. Admittedly, double-tapping would not be possible with a conventional pick that has to be held between thumb and forefinger at all times. However, with a so-called thumb pick, a pick strapped to the thumb, the forefinger would be free to tap the underside of the pick to trigger a certain value. This idea will certainly find its way into my final project concept. Beyond that, the article also shows that it is possible to place a sensor on a pick without compromising playability.
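
A first, entirely hypothetical sketch of how such a double tap could be detected with a digital touch module (for example a TTP223-style breakout; the pin and the timing window are guesses):

```cpp
// Double-tap detection: fire a trigger when two touches on the pick's
// underside land within a short window (values are guesses).
const int TOUCH_PIN = 2;                  // digital output of the touch module
const unsigned long DOUBLE_TAP_MS = 300;  // maximum gap between the two taps

unsigned long lastTapTime = 0;
bool wasTouched = false;

void setup() {
  pinMode(TOUCH_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  bool touched = digitalRead(TOUCH_PIN) == HIGH;
  if (touched && !wasTouched) {  // rising edge = one tap
    unsigned long now = millis();
    if (now - lastTapTime <= DOUBLE_TAP_MS) {
      Serial.println("double tap -> trigger effect");
    }
    lastTapTime = now;
  }
  wasTouched = touched;
  delay(5);
}
```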

Sources:

Vanegas, R. (2007, June 6-10). The MIDI Pick – Trigger Serial Data, Samples, and MIDI from a Guitar Pick. Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07), New York, NY, USA. https://dl.acm.org/doi/10.1145/1279740.1279812

http://roy.vanegas.org/itp/nime/the_midi_pick/

Reference works for Extended Guitar Performance #1

I dedicated this weekend to a first round of finding similar reference works and publications since this is also one of the tasks due for the Exposé.

Imogen Heap’s Mi.Mu gloves

One of the artists I stumbled upon during my research who uses hand movements and gestures to perform and compose is Imogen Heap. Considered a pioneer of pop and electropop music, she is a co-developer of the so-called Mi.Mu gloves: gesture controllers in glove form that Heap uses to control and manipulate recorded music and/or her musical equipment during (live) performances.

As she explained in an interview with Dezeen, Heap found the conventional way of playing keyboards or computers on stage very restrictive, since most of her actions, like pressing buttons or moving a fader, were hidden from the audience and thus not very expressive, even though they may constitute musically important acts. Her goal was to find a way of playing her instruments and technology that better represents the qualities of the sounds produced and allows the audience to understand what is going on on stage.

Inspired by a similar MIT project in 2010, the gloves underwent eight years of research and development, with the development team consisting of NASA and MIT scientists alongside Heap. While Heap has used prototypes in her live performances for several years, other artists have occasionally tried them out as well; Ariana Grande, for example, used the gloves on her 2015 tour. In July 2019, the Mi.Mu gloves became commercially available for the first time, promising to be “the world’s most advanced wearable musical instrument, for expressive creation, composition and performance”.

The Mi.Mu gloves contain several sensors including:

  • an accelerometer, a magnetometer and a gyroscope in the form of an IMU motion tracker located at the wrist, which provides information about the hand’s position, rotation and speed
  • a flex sensor over the knuckles to identify the hand’s posture and interpret certain gestures
  • a haptic motor that gives the wearer haptic feedback: it vibrates, for example, when a certain note sequence is played

To send the signals to the computer, the gloves use WLAN and Open Sound Control (OSC) data instead of MIDI. The gloves themselves are made of e-textiles, a special kind of fabric that conducts information. Furthermore, they come with the company’s own Glover software for mapping custom movements and gestures, which can be integrated into DAWs such as Ableton Live or Logic Pro X.

Unfortunately, the Mi.Mu gloves still cost about £2,500 (roughly 3,000 €) and are, on top of that, currently sold out due to the worldwide chip shortage. A limited number of gloves is expected to become available in early 2022.

Key take-aways for own project

First of all, Heap’s Mi.Mu gloves confirm the feasibility of my project, since the technology involved is quite similar: the gloves use an IMU sensor, which is also my current go-to sensor for tracking the movements of a guitar player’s hands. Although Heap mostly uses the gloves to manipulate her voice, I found a video that also shows her playing the keyboard with them on. This shows that wearing sensors on one’s hands does not necessarily interfere with the playability of an instrument, which is a very important requirement for my project.

Interestingly, the gloves rely on WLAN and OSC instead of MIDI, which is definitely a factor that calls for additional research on my side. OSC has some advantages over MIDI, especially as far as latency and accuracy are concerned, which makes it well suited for real-time musical performance. Furthermore, the data travels over LAN or WLAN, which eliminates the need for cables. Moreover, OSC is supported by open-source software such as SuperCollider or Pure Data, which could make it even more attractive for my project.

Finally, I want to use Imogen Heap and her glove-supported performances as a source of inspiration in order to come up with playing techniques or effect possibilities for my own project.

Sources:

https://www.dezeen.com/2014/03/20/imogen-heap-funding-drive-for-gloves-that-turn-gestures-into-music/

https://www.mimugloves.com/gloves/

https://www.engadget.com/2019-04-26-mi-mu-imogen-heap-musical-gloves-price-launch-date.html

https://www.popularmechanics.com/technology/gadgets/reviews/a10461/power-glove-makes-music-with-the-wave-of-a-hand-16686382/

https://en.wikipedia.org/wiki/Imogen_Heap

https://www.uni-weimar.de/kunst-und-gestaltung/wiki/OSC

https://opensoundcontrol.stanford.edu/