New Hardware and left hand setup progress

With BLE MIDI transmission working rather well, it was decided to move on and finish the project's programming work. As stated in the goals set for the third semester, the left hand setup will be complemented by a knob for flexibly setting the fret threshold, while the right hand setup will have a reset/calibration button added to it. Thus, a potentiometer with a matching knob and a button were purchased at Neuhold Elektronik Graz.

As adding the threshold knob seemed to be the easier of the two tasks, it was decided to start with the additions to the left hand setup. Firstly, the basic workings of a potentiometer were researched and a simple sketch to test the potentiometer and its functionality was written. Read via the ESP32's 12-bit ADC, the potentiometer yields values ranging from 0 to 4095 when operated at 3.3 volts. These values were divided into nine equal ranges that were matched to nine fret numbers, with range 1 going from 0 to 455 for the 1st fret, for example. These calculations were then put into code, which was also successfully tested. Subsequently, the additional code was added to the main ToF sensor sketch.
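
A minimal sketch of this reading-and-bucketing step, assuming the potentiometer's wiper is connected to GPIO 34 (a hypothetical pin choice):

```cpp
// Minimal test sketch for the threshold knob.
// Assumed wiring: potentiometer wiper on GPIO 34 (an ADC1 pin; hypothetical choice).
const int POT_PIN = 34;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(POT_PIN);                       // 0..4095 on the ESP32's 12-bit ADC
  int thresholdFret = constrain(raw / 456 + 1, 1, 9);  // nine equal ranges -> frets 1..9
  Serial.println(thresholdFret);                       // e.g. raw 0..455 -> fret 1
  delay(100);
}
```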

Then another challenge arose: the ESP32 only has one 3.3 volt pin, but both the ToF sensor and the potentiometer need power. The first solution found was to connect everything via a breadboard, which makes it possible to split the output of the 3.3 volt pin. This works but is not very sleek and is thus not suitable for the final product; some other way will have to be found here. Nevertheless, with sensor and potentiometer supplied with power, the code was tested.

Here, an old problem resurfaced: the values fluctuated quite a bit, so it was decided to revisit the idea of implementing code that switches states instead of sending continuous values.

In other words, if the hand moves above the set threshold, a MIDI message turning an effect ON should be sent just ONCE, and if the hand moves below the set threshold, a MIDI message turning the effect OFF should be sent just ONCE. If the hand moves but stays above the threshold, no MIDI message is to be sent, and the same holds if it moves but stays below the threshold.

To make such code possible, not only the current incoming fret number had to be known but also the previous one, so that the two could be compared.

In order to establish the basic logical structure before actually writing the code, four cases were discerned:

Prerequisite: Effect is usually ON

Current fret >= threshold && previous fret >= threshold -> do nothing (leave ON)

Current fret >= threshold && previous fret < threshold -> send ON once (turn ON)

Current fret < threshold && previous fret >= threshold -> send OFF once (turn OFF)

Current fret < threshold && previous fret < threshold -> do nothing (leave OFF)

This logical structure was then put into code, but it failed to work at first. As it was hard to debug the code using MIDI messages, it was decided to fall back on the trusty serial monitor and print everything out. This way, the influence of different conditional operators on the actual output could be monitored very accurately, and the code was gradually increased in complexity. Finally, the logical structure described above worked in conjunction with serial printing. It was then changed to work via MIDI messages, where it also worked successfully. While the code was tested together with Cubase, it was not yet tested while actually playing guitar.
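
For illustration, a minimal sketch of that edge-triggered logic; the sending helpers are hypothetical stand-ins for the actual MIDI calls:

```cpp
// Edge-triggered switching: a message is sent only when the fret number
// crosses the threshold, never while it stays on one side of it.
int previousFret = 0;

void sendEffectOn()  { Serial.println("ON");  }   // stand-in for a single MIDI ON message
void sendEffectOff() { Serial.println("OFF"); }   // stand-in for a single MIDI OFF message

void handleFret(int currentFret, int thresholdFret) {
  bool nowAbove = (currentFret  >= thresholdFret);
  bool wasAbove = (previousFret >= thresholdFret);

  if (nowAbove && !wasAbove)      sendEffectOn();   // crossed upwards: send ON once
  else if (!nowAbove && wasAbove) sendEffectOff();  // crossed downwards: send OFF once
  // nowAbove == wasAbove: no crossing, send nothing

  previousFret = currentFret;
}

void setup() { Serial.begin(115200); }

void loop() {
  // In the real sketch, currentFret comes from the ToF reading
  // and thresholdFret from the potentiometer.
  handleFret(/*currentFret=*/6, /*thresholdFret=*/5);
  delay(50);
}
```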

MIDI BLE – the best way to transmit data?

After setting the main objectives for the third semester, the first issue tackled was the question of how to transmit the data. Over the course of the project, quite a few types of data transmission have been tried out, including USB MIDI, plain MIDI and, most recently, OSC.

As already mentioned in blog post #3, data transmission via OSC could be achieved using the ESP32’s WiFi capabilities. This means of communication comes with the definite advantage of being wireless, which was not possible with the MIDI setup from last semester.

However, my supervisor suggested trying out yet another possibility: MIDI via BLE (Bluetooth Low Energy). So basically, it sends MIDI over Bluetooth. Luckily, the ESP32 also has Bluetooth capabilities, so it was possible to start experimenting immediately.

Using the MIDI-BLE library, a sketch was written that transmits sensor data as MIDI over Bluetooth. Unfortunately, two additional (but free) programs are necessary for MIDI BLE to work with Windows. One is called loopMIDI, which creates a virtual MIDI port, and the other, MIDIberry, routes the incoming Bluetooth MIDI signals to that port. The port can then be selected as a MIDI input source in a DAW (I tested it with Cubase).
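
For reference, a minimal sending sketch; this assumes the Arduino-BLE-MIDI library by lathoub (the includes, macro and calls below belong to that library and are just one possible option), with the device name and CC number chosen arbitrarily:

```cpp
// Minimal BLE MIDI sending sketch, assuming the Arduino-BLE-MIDI library (lathoub).
// Device name, CC number and the fret source are example choices.
#include <BLEMIDI_Transport.h>
#include <hardware/BLEMIDI_ESP32.h>

BLEMIDI_CREATE_INSTANCE("GPRO-LeftHand", MIDI)  // advertises the ESP32 as a BLE MIDI device

int readFretNumber() {
  return 1;  // placeholder; the real sketch derives this from the ToF sensor
}

void setup() {
  MIDI.begin();
}

void loop() {
  MIDI.sendControlChange(20, readFretNumber(), 1);  // CC 20, value = fret, channel 1
  delay(50);
}
```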

Apart from likewise being wireless, MIDI BLE transmission comes with the additional advantage of sending plain MIDI signals, which work in almost every DAW straightaway, whereas OSC support is not always given or as easy to use. For this very reason, it was decided to drop the OSC option for now and focus on MIDI BLE.

In Cubase, one can fully customize the parameter controls via MIDI. So, using Cubase, the “Solo Mode” that was previously written in a Pure Data sketch now works completely without the need for Pure Data since the effects that are turned on or off can now be directly selected in the DAW. This brings the project one more step closer to its completion.

However, while the current solution works in Cubase, it does not (yet) work in Reaper, for example. This is because Reaper relies solely on its Learn MIDI function to map incoming MIDI CC messages to parameter controls. If two different CC messages meant for different effects are sent at the same time, it only recognizes one, and it is not possible to manually enter another CC value to control a second effect parameter, as it is in the fully customizable Cubase.

As the two setups should be accessible without too much hassle for most guitarists, it must be established whether other DAWs allow MIDI CC values to be mapped to parameter controls manually. This is necessary to ensure that the MIDI BLE method works at least in major DAWs like Pro Tools, Ableton, Logic Pro, etc., if not in Reaper.

Objectives for the third semester

On the 9th of October, the first meeting of the third semester with my supervisor took place. We met to set the objectives to be reached during the third semester and to determine how the project and the two setups should be developed further.

The first question that arose was whether the focus of the development efforts should be placed on creative advancement of the two setups' tonal possibilities (for example, developing more effects to be controlled by the sensors' data) or on fine-tuning the current setups and developing them into final products.

After some deliberation, it was decided to focus on further developing the existing setups into final products and to pass on new tonal possibilities for now. As one of the goals of this project is to test the final setups on guitarists, it was deemed more important that they look the part and function properly than that they offer endless tonal possibilities while lacking in reliability and looks.

With the main focus of the final product phase set, it was decided to concentrate the development effort on three areas:

  1. Design of setup attachment
  2. Type of data transmission
  3. Software integration

These areas involve the following tasks:

  1. Design of setup attachment

Left hand setup:

  • 3D print cases for ToF sensor and microcontroller ESP32
  • Develop attachment device that is flexible enough to work on Fender AND Gibson headstock shapes
  • Extra knob for fret threshold setting
  • Extra knob for fine-tuning fret detection (?)
  • Determine power supply -> battery

Right hand setup:

  • 3D print cases for IMU sensor and microcontroller ESP32
  • Extra calibration button
  • Develop attachment -> use snap fasteners to clip sensor + microcontroller on
  • Determine power supply -> battery
  • Sturdily sheathed cable to connect sensor and microcontroller
  2. Type of data transmission

Via OSC (current solution)

Pros: wireless, already works

Cons: integration in DAW more complicated (maybe different for every DAW)

Via Bluetooth + MIDI (BLE) (probably best solution)

Pros: integration in DAW probably easier with “Learn MIDI”, ESP32 has Bluetooth

Cons: does not work yet, fast enough? USB dongle necessary?

Via radio transmission (?)

  3. Software integration

Goal: make the setups usable within a DAW

As already touched upon in previous blog posts, some experiments were done using the Camomile VST software. Despite some success, it was decided to abandon the idea of working with Camomile. On the one hand, it was not possible to solve the issue of the effect not affecting the recorded signal. On the other hand, the Camomile environment was deemed impractical for the left hand setup.

Instead, the development efforts will focus on OSC or BLE + MIDI. The latter solution in particular appears promising because the MIDI learn function, which comes with practically any DAW, would let the user employ any plugin and customize the controllable parameters within the DAW.

With these goals and focus areas set, the third project phase is really about to start. The first issue that will be tackled is determining the type of data transmission as well as the software integration.

Software integration test feat. Camomile VST

With both setups working via OSC, it was time to focus on another goal of the third project phase: the final software implementation. As stated before, the aim of this project is to further advance the sonic range of the electric guitar without impairing the guitarist's natural playing. Although this statement mainly concerns the natural playing style and the associated hand movements, its meaning can also be extended to usability and to how well the two setups integrate into the usual signal chain of the average guitar player.

The standard guitar signal chain is probably as follows:

Guitar into effect pedals into amp to microphone or interface

As both setups require a computer, integrating them into this signal chain will not be possible. However, a lot of guitarists nowadays play straight into an interface + DAW or go from an amp into an interface + DAW. Hence, if the software of the project could be integrated into a DAW, it could be considered quite user-friendly for guitarists.

And here Camomile comes into play. Camomile is an audio plugin with Pure Data embedded, used to control patches inside a large set of digital audio workstations – as long as they support the VST3 or Audio Unit formats. (1) It is free, open-source and cross-platform and was developed by Pierre Guillot.

The plan is to convert the current Pure Data patches into VST plugins that can then be used in any DAW, making them more accessible and more likely to be used by guitarists.

The left hand setup, with its use of third-party plugins, seems less suited to this at the moment; however, ways will be found to implement it as well. Nevertheless, for reasons of simplicity, it was decided to start with the right hand setup and try to make it work together with Camomile.

Luckily, Camomile comes with a wiki page that explains how to generate plugins and how to create new ones. As far as I understood it, Camomile is a sort of platform that facilitates communication between the Pure Data patch and the DAW. In order to generate a VST plugin, one needs a folder containing the Pure Data patch and its sub-patches, the Camomile VST plugin file (instrument or effect), an info file in txt format and specific Pure Data patches that handle the communication between patch and DAW.

Within the text file, the parameters of the plugin, the in- and outputs and other factors need to be specified. For example, for the Wah-Wah effect patch, the following text was written:

  • param -name Frequency -label Hz -min 280 -max 3000 -default 1500;
  • param -name OSC -min 0 -max 1 -default 0 -nsteps 2;
  • iolayout 2 2;
  • code Bbbb;
  • type effect;
  • manufacturer Bjoern Bernreiter;
  • compatibility 1.0.7;

The first parameter is the center frequency of the Wah-Wah and the second one is the ON/OFF switch for the OSC reception.

Additionally, one must add the specially written param.get, param.set and param.change patches to one's own patch in order to ensure communication between patch and DAW.

Without going too much into detail, it was possible to generate a Wah-Wah plugin that can be opened in the DAW Cubase. The center frequency can be controlled via OSC with data coming from the IMU sensor. Furthermore, the parameters center frequency and OSC ON/OFF can be automated manually and in Write-Mode within the DAW.

However, there is one problem: while the center frequency can be automated by moving the slider in Write-Mode or by drawing in automation curves, it cannot be automated from the OSC data alone. Similarly, the Wah-Wah affects the tone of a guitar when played live but no longer affects it after the guitar has been recorded. These two issues are, of course, linked, but so far no solution could be found. On the upside, while searching for one, the Camomile plugin and its workings came to be understood much better.

Nevertheless, I am confident that this problem will be solved in the near future.

New attachment device for right hand setup

As stated in blog post #1, basic OSC transmission capabilities were achieved, and thus the IMU sensor's data could now be transmitted via OSC to control the parameters of a Wah-Wah effect. The next step was to test this setup in conjunction with real guitar playing. However, it was decided that a new attachment device for the ESP32 microcontroller was needed first.

Prior to using OSC, the right hand setup used MIDI to transmit the sensor data from sensor + microcontroller to Pure Data. The IMU sensor was attached to the microcontroller (an Arduino Leonardo) via a long cable and the microcontroller itself was fitted onto a small housing with a MIDI jack attached to it.

With the ESP32 microcontroller now working wirelessly over OSC, there was no further need for the housing and the long cable. It was instead decided to fit the ESP32 onto the guitar player's right forearm, with only a short cable connecting it to the IMU sensor, which was still fitted onto the wristwatch. However, it proved difficult to find a way of attaching the board to the uneven surface of a human forearm. After some pondering, the idea was born to wrap an old sock around the arm, onto which the ESP32, wrapped in a small fabric bag, could subsequently be sewn. It must be stated that this means of attachment is not yet worthy of the name “final product” and another solution must be found; nevertheless, it sufficed for testing purposes. The Wah-Wah effect could be tested using OSC as the means of transmission, and it worked as expected. It can thus be concluded that the right hand setup now works wirelessly, which brings it closer to its ultimate goal of not impairing the guitarist's natural playing style while still adding another dimension of sound to the guitar.

With the right hand setup working so well, it was time to transform the left hand setup in the same fashion and make it work wirelessly. Having established the basic OSC principles while working on the right hand setup, it was simply a matter of combining the original ToF sensor code with the OSC code. The “new” left hand setup was then immediately tested using the previously made “Solo Mode” Pure Data patch and a guitar. As was the case with the right hand setup, the wireless left hand setup seemed to work just as well as the MIDI-based one.

While the transmission of data now works wirelessly via OSC, both setups still require their respective sensors (IMU and ToF) to be connected to the ESP32 microcontroller by wire. With the two sensors positioned relatively far apart, it is likely that two microcontrollers, one per setup, will have to be used for the final product.

Start of Product Phase & OSC Control

Hello and welcome back to my GPRO project, as I like to call it. We are now entering the third semester, during which the prototype setups developed during the previous semester will be completed into final products. Instead of overthinking my next steps and planning too far ahead, I decided to jump right back in and continue where I stood at the end of the second semester.

As noted in the documentation and presentation of the second phase, both setups, in particular the right hand setup, would benefit from being wireless, with no cables obstructing the guitar player's movements. Hence, this was the first issue tackled. Although several possibilities exist for sending data wirelessly, it was quickly decided to try out OSC (Open Sound Control), which works over WiFi.

Open Sound Control (OSC) is an open, transport-independent, message-based protocol developed for communication among computers, sound synthesizers, and other multimedia devices. OSC messages are transported across the internet and within local subnets using UDP/IP and Ethernet. Additionally, OSC is sometimes used as an alternative to the 1983 MIDI standard.

This decision was taken, among other reasons, because the author's supervisor had provided him with an ESP32 microcontroller at the end of the second semester, a board which already has WiFi capabilities on board. As the name suggests, the ESP32 is not an Arduino product; however, it can still be programmed using the Arduino IDE once the board support has been installed manually. With only a few difficulties, said process was completed and the ESP32 was ready to be programmed.

The first step was to write an Arduino sketch that connects the ESP32 to the author's local WiFi network in order to be able to send OSC messages. Using the OSC library for Arduino, such a sketch was achieved in a rather short time. Furthermore, the OSC library contains a very useful example sketch for sending OSC messages, which served as a great learning tool for understanding how OSC messaging works. To test the OSC messaging, a Pure Data patch was made that uses the “netreceive” and “oscparse” objects, among others, to receive OSC messages arriving at a predefined port. The “netreceive” object allows Pure Data to listen on a specific port, in this case 8000, and in the Arduino sketch port 8000 was defined as the destination for outgoing OSC messages. As a result, OSC messages could now be sent from the ESP32 via WiFi to Pure Data.
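
A minimal version of such a sketch, assuming the CNMAT OSC library for Arduino; SSID, password, target IP and the address pattern are placeholders:

```cpp
// Minimal OSC test sketch, assuming the CNMAT OSC library for Arduino.
#include <WiFi.h>
#include <WiFiUdp.h>
#include <OSCMessage.h>

WiFiUDP Udp;
const IPAddress targetIp(192, 168, 1, 100);  // computer running Pure Data (placeholder)
const unsigned int targetPort = 8000;        // matches [netreceive 8000] in the patch

void setup() {
  WiFi.begin("MY_SSID", "MY_PASSWORD");      // placeholders for the local network
  while (WiFi.status() != WL_CONNECTED) delay(500);
  Udp.begin(9000);                           // local port; any free port works for sending
}

void loop() {
  OSCMessage msg("/sensor/value");           // address pattern is an example choice
  msg.add(42);                               // payload, e.g. a sensor reading
  Udp.beginPacket(targetIp, targetPort);
  msg.send(Udp);
  Udp.endPacket();
  msg.empty();                               // free the message's allocated memory
  delay(100);
}
```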

After achieving basic OSC messaging capabilities, the subsequent step was to merge the original sketch for the right hand setup, containing the code to access the IMU data readings, with the new OSC messaging method. In parallel, the Pure Data patch from the second semester, which receives the y-value of the IMU sensor's orientation data to control a Wah-Wah effect, was updated as well. Instead of receiving the orientation data as MIDI input, it now receives it via OSC, using the objects mentioned above.

After establishing the pin layout of the ESP32, the IMU sensor (still attached to the wristwatch device) could be connected to the microcontroller, and data transmission from the ESP32 to Pure Data was successful. Latency seemed comparable to that of MIDI. With basic OSC transmission working, it is time to test the new setup in conjunction with real guitar playing.

Source:

OSC spec 1_0 (stanford.edu)

Evaluation of prototype setups

In this chapter, the project's progress during the second semester will be evaluated, and successful as well as failed outcomes will be discussed. The evaluation is based on the goals set out for the second phase of the project.

Goal 1: Determine and acquire necessary equipment
Goal 2: Determine ideal placement of sensors and microcontrollers on guitar neck and pick/right hand and install them accordingly
Goal 3: Program microcontrollers to pick up the movements of the fretting and picking hands using the Integrated Development Environment (IDE) by Arduino
Goal 4: Program a Pure Data patch that handles the incoming data and transforms it into usable parameters to trigger effects
Goal 5: Either program custom effects in Pure Data or integrate Pure Data in a DAW to trigger commercial effect plug-ins
Goal 6: Determine suitable effects and parameters to be triggered by the movements of the fretting and picking hands

1.1. Goal 1 – necessary equipment

As far as the first goal is concerned, it can be stated that this task was accomplished. As outlined above, the necessary and suitable equipment for both the left and right hand setups was determined. This was achieved by thoroughly researching potentially suitable components and subjecting them to tests in order to identify the best possible solution.

With regard to the choice of microcontroller, the Arduino Leonardo was chosen over the Arduino UNO due to its built-in USB communication and USB MIDI device capabilities. Albeit rather big and unwieldy, the Leonardo was kept throughout the experimental phase for simplicity's sake, since it could be borrowed from the FH JOANNEUM. However, a smaller microcontroller of similar performance, such as an Arduino Micro, will likely be used for the final setups in project phase three.

Regarding the sensor for the left hand setup, diligent research and a direct performance comparison led to the decision to use a time-of-flight sensor to pinpoint the position of the guitarist's hand along the neck instead of the initially planned ultrasonic sensor. After evaluating the specifications of several ToF sensors, a sensor of the type VL53L1X was finally chosen.

For the right hand setup, on the other hand, a suitable IMU sensor was found rather quickly. Although an MPU9250 was briefly considered and used during an initial Arduino library test, it was dropped in favor of a BNO055 sensor, following the recommendation of the author's supervisor.

Of course, next to these main components, other equipment including cables, a breadboard, electronic components, etc. was acquired.

1.2. Goal 2 – sensor position and placement

All in all, this goal has been largely fulfilled with some improvements possible in the third phase of the project.

As outlined above, the usual positions of the fretting hand were determined and subsequently analyzed. This analysis led to the conclusion that, unfortunately, the posture and the exposed reflection area of the fretting hand vary a lot depending on what is played, with major differences between playing barre chords and single notes, for instance. These inconsistencies in hand posture were and still are a major constraint on the left hand setup and its flawless integration into the natural playing style of a guitarist.

As far as the ideal placement and installation of the sensors is concerned, a lot of progress was made with the left hand setup. Based on the aforementioned analysis of the hand posture, several attachment devices were made for the ultrasonic as well as the ToF sensors and subsequently compared. One position in particular (IMAGE) proved better than the others, albeit not perfect. The chosen position works best for barre chords as well as for the Solo Mode application.

Regarding the attachment device of the right hand setup, there is definite room for improvement. The wristwatch solution was sufficient for the experimental phase and proved that placing the sensor on top of the back of the hand serves to get useful sensor data. However, it is unsuitable for the final product. While a wireless solution is optional for the left hand setup, the right hand setup would certainly benefit from the lack of cables. It would enable an even more natural playing of the guitar.

1.3. Goal 3 – programming microcontrollers

Here, definite progress was made, especially with the left hand setup. With no prior programming experience on the author's part, the learning curve was quite steep, and a lot of time had to be dedicated just to learning basic coding techniques.

Regarding the left hand setup, Arduino sketches were written first for the ultrasonic sensor and subsequently for the time-of-flight sensor when the former proved unsuitable. In addition to the code needed to access the basic sensor data, the mathematical relationship of the guitar fret spacings was established, fret ranges were determined and finally implemented in the code. Next to absolute distance measurements, detecting fret numbers is possible up to the ninth fret, which already enables applications such as the Solo Mode.
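
The relationship in question is the equal-temperament fret formula: the distance from the nut to fret n is d(n) = L * (1 - 2^(-n/12)) for scale length L. A sketch of how fret ranges can be derived from it follows; the scale length and the bucketing rule are illustrative assumptions, not necessarily the exact code used:

```cpp
#include <math.h>

const float SCALE_LENGTH_MM = 648.0f;   // placeholder: a typical 25.5" scale length

// Distance from the nut to fret n on an equal-tempered fretboard.
float fretDistance(int n) {
  return SCALE_LENGTH_MM * (1.0f - powf(2.0f, -n / 12.0f));
}

// Bucket a measured nut-to-hand distance (mm) into a fret number 1..9.
int fretFromDistance(float mm) {
  for (int n = 9; n >= 2; --n) {
    if (mm >= fretDistance(n - 1)) return n;   // hand lies between fret n-1 and fret n
  }
  return 1;                                    // everything below fret 1's boundary
}
```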

The code for the right hand setup, on the other hand, is not yet as advanced, mainly due to time constraints. Using a library, orientation, acceleration and calibration data could be obtained from the IMU sensor and transmitted to Pure Data via MIDI. The y-value of the orientation data is so far the only reading suitable for controlling the parameters of a Wah-Wah effect; more variety in data usage would be desirable.

Lastly, different data transmission techniques were tested and evaluated. A lot of time was spent first with USB MIDI libraries and then with MIDI-only libraries, with transmission suffering from a lot of lag initially. Serial port communication proved to be the first viable solution fast enough to control effect parameters. Finally, the latency problems of MIDI could be eliminated, and the current setups work via MIDI communication using a MIDI cable. For the third phase, a wireless means of data transmission would be desirable – especially for the right hand setup.
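
On the Arduino side, the serial-port approach boils down to writing one value per loop iteration; a minimal sketch (the baud rate and the one-byte format are illustrative choices, and Pure Data can read such bytes via the comport external):

```cpp
// Serial-port transmission of a single value per loop, as in the interim solution.
void setup() {
  Serial.begin(9600);          // baud rate is an example choice; must match the receiver
}

void loop() {
  int fret = 3;                // placeholder for the computed fret number (0..127 fits one byte)
  Serial.write((byte)fret);    // one raw byte per reading
  delay(50);
}
```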

1.4. Goal 4 – programming of Pure Data patches

This goal was achieved to a large extent. Albeit initially plagued by latency issues, a patch that receives MIDI data and uses it for further processing was achieved rather easily. The interim solution, serial port communication, took some more research, but once the basic method was discovered, its application was straightforward. For the left hand setup, the incoming fret numbers can either be used directly to control effect parameters or, using the “moses” object for instance, a fret threshold can be set to create an ON/OFF switch. The patch for the right hand setup is very similar to that of its counterpart, and effect parameters can be controlled.

1.5. Goal 5 – effects programming

As far as goal 5 is concerned, mixed results were achieved. Working with the digital audio workstation (DAW) Steinberg Cubase Pro 11 proved more difficult than anticipated, and consequently it was decided to work with Pure Data only during the experimental phase. Nevertheless, to ensure a seamless integration into the guitarist's natural workflow, DAW integration of the final product is desirable and will be attempted in the third phase of the project.

Regarding the decision between self-made effects and third-party plugins, both approaches were tested. The first patch contained a self-made delay and overdrive/distortion effect, which proved useful for the first test but had definite shortcomings in tonal quality. The patches from then on therefore used the “vstplugin~” object to host third-party plugins in the Pure Data environment. The exception is the Wah-Wah effect for the right hand setup, which is self-made and works well.

1.6. Goal 6 – determine suitable effects

With a stable data transmission between Arduino and Pure Data achieved only at a relatively late stage of the semester, this goal could not be fully achieved. Tonal experiments involving several effects and their parameters were conducted. For the Solo Mode application of the left hand setup, an amplifier, reverb and delay were tested. Less tonal experimenting was done with the right hand setup: three effects (tremolo, phaser and Wah-Wah) were tested, with only the latter representing a reasonable effect to be controlled by the right hand setup.

It is evident that only the surface has been scratched so far and that much more in-depth research and experimenting with both setups will be needed to really provide practical applications for extending the range of possible guitar sounds.

1.7. Conclusion

In conclusion, it can be stated that all tasks set for the second phase of the project have been approached and tackled, with the majority of goals at least partially achieved. Additionally, compatibility with working hypotheses 1 and 2 was consistently ensured, with none of the effects so far interfering with the usual way people play guitar. As stated in the Exposé and in chapter 4 of this documentation, the overall aim of the second phase was to develop working setups that are sufficiently reliable and allow for further practical research regarding suitable effects, playability and performability. While the left and right hand setups are far from being final products or ready to be tested by other guitarists, the second, experimental phase yielded a lot of progress. Overall, it can be affirmed that the left and right hand setups, albeit having shortcomings in some areas, are advanced enough to serve as a base for further practical research in the third semester.

The right data for the right hand setup

Here, the first goal was to get basic sensor readings at all, with the next step being to figure out what kind of readings are suitable for controlling effect parameters.

In order to establish which data is needed and how it should be used, the x, y and z values and their changes were analyzed while performing strumming movements with the sensor strapped to the hand.

With the IMU + Arduino outputting orientation data, it became clear that the y-value could prove useful for controlling effect parameters while strumming the guitar. The range of y-values for up and down strumming movements was analyzed and subsequently constrained. The range was then remapped to MIDI values from 0 to 127 and, using the same data transmission techniques as the left hand setup (first serial port communication, subsequently MIDI), sent to a Pure Data patch similar to the one made for the left hand setup.
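
A minimal sketch of that constrain-and-remap step; the plus/minus 45 degree input range is a placeholder for the measured strumming range:

```cpp
// Constrain and remap the IMU's y orientation value to the MIDI range 0..127.
// The -45..45 degree input range is a placeholder for the measured strumming range.
int yToMidi(float y) {
  float clamped = constrain(y, -45.0f, 45.0f);
  return (int)((clamped + 45.0f) / 90.0f * 127.0f);   // -45..45 -> 0..127
}
```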

Next to orientation data, experiments were conducted with accelerometer as well as linear accelerometer data in the same manner.

Using IMU data in Pure Data

In the Pure Data patch, the incoming orientation and (linear) acceleration data was used to control several effects and their parameters. Using the aforementioned “vstplugin~” object, the following effects were tested:

  • MTremolo by Melda Production
  • MPhaser by Melda Production

However, the incoming IMU data proved unsuitable for these effects. The linear and normal acceleration data could not be used at all, and while the orientation data worked to some extent, no practical application of its effects was immediately discovered.

The first success using the IMU data was the Wah-Wah effect. Using the “vcf~” object, a bandpass filter was made with an adjustable center frequency ranging from 280 Hz to 3000 Hz (the normal operating range of a Wah-Wah pedal) and a Q-factor of 4. Using the y-value from the orientation data, the center frequency was controlled through strumming movements of the right hand. The resulting sound was similar to that of a “real” Wah-Wah pedal and could be achieved solely by the natural strumming performed during playing.

The right hand setup

With my left hand setup kind of working, I decided to start on my right hand setup, which, unfortunately, I had totally neglected so far. Short recap: the right hand setup is planned to consist of an IMU sensor that picks up the natural strumming patterns of the right hand and uses the movement parameters to modulate the guitar sound. First of all, what is an IMU sensor? According to Wikipedia, an inertial measurement unit (IMU) is an electronic device that measures and reports a body's specific force, angular rate, and sometimes the orientation of the body, using a combination of accelerometers, gyroscopes, and sometimes magnetometers. As the definition suggests, it is quite a complex device and is really on another level coding-wise compared to the left hand setup with its time-of-flight sensor.

At the very beginning of the semester, my supervisor gave me one of his IMU sensors, namely an MPU-92/65. However, when I approached him last week concerning the IMU sensor business for my right hand setup, he recommended using another kind of IMU sensor, the BNO055 from Bosch. Apparently, there are better and easier-to-use Arduino libraries for the BNO055, and it is capable of sensor fusion – something I will get into below. Luckily, he also had one of those and gave it to me for experimenting.

Additionally, my supervisor told me the basics of IMU sensors which I will relay to you now:

As already mentioned in the definition, an IMU sensor basically combines an accelerometer, a gyroscope and a magnetometer, which can be used alone or in combination to obtain information about the position and/or movement of an object. When used in combination (= sensor fusion), one can determine the pitch, roll and yaw movements of said object, which is what I think I need. Since I have to actually wear the sensor on my wrist while playing the guitar, I cannot yet say what kind of information I need from the IMU sensor. Of course, the pitch, roll and yaw movements make sense, but I could also try acceleration values, for example. My goal for now is to get sensor readings at all; in the next step, I will try to figure out what kind of readings work best for my cause.

I found an Arduino library that lets me calculate the orientation of the sensor, giving me readings for the x, y and z axes. My supervisor also highlighted the need to calibrate the sensor each time, as the readings are otherwise inaccurate. Luckily, the library also has a function that reads out the calibration status of each of the sensors in the IMU (accelerometer, gyroscope, magnetometer) – 0 means not calibrated at all; 3 means the sensor is fully calibrated. I watched a YouTube video that explains how to calibrate each of the three sensors: to calibrate the gyro, the sensor just needs to sit still for 1-2 seconds (easy!). To calibrate the magnetometer, one needs to tilt and move the sensor a bit in all directions, which also works quite well. Calibrating the accelerometer is the most complex of the three procedures: one must tilt the sensor to different angles and hold each position for about five seconds. It takes a little time and experimenting, but it works.
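
For illustration, reading the calibration status could look like this with the Adafruit BNO055 library (the post does not name the library used, so this is just one possible option):

```cpp
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);   // default I2C address 0x28

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("BNO055 not found");
    while (true) delay(10);
  }
}

void loop() {
  uint8_t sys, gyro, accel, mag;
  bno.getCalibration(&sys, &gyro, &accel, &mag);  // each status: 0 (not) .. 3 (fully calibrated)
  Serial.print("Sys: ");    Serial.print(sys);
  Serial.print(" Gyro: ");  Serial.print(gyro);
  Serial.print(" Accel: "); Serial.print(accel);
  Serial.print(" Mag: ");   Serial.println(mag);
  delay(500);
}
```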

With the calibration and the orientation readings going, I decided to test it by putting the sensor on my wrist – easier said than done! After some tinkering, I came up with the very rough solution of using my (seldom worn) wristwatch and sticking the IMU sensor onto it using double-sided tape. Now I should be able to strap the watch over my hand and start playing.

New Pure Data patch and VST plugins tests

With a new means of communication in place (see blog #12), a new Pure Data patch was made. After having created custom effects (the delay and the overdrive/distortion effect) for the first Pure Data patch, it was now deemed better to use third-party effect plugins. Luckily, with the help of the “vstplugin~” object, such third-party plugins can be used inside Pure Data. Furthermore, it is possible to access a plugin's list of parameters and control them by sending control messages.

The Pure Data patch uses the following three plugins:

  • L12X Solid State Amplifier
  • TAL Dub Delay
  • TAL Reverb 4

The plugins were chosen due to personal constraints: the author primarily uses a desktop PC for audio-related tasks, and hence most of his plugins are only available there. As the author's laptop is used for this project, free plugins were chosen.

At first, a prototype setting for the solo mode was made. Here, the parameter mapping is as follows:

The values coming in from the Arduino are the fret numbers calculated in the Arduino sketch from the distance to the ToF sensor. The incoming fret number determines whether the amplifier plugin is turned on or off: using a “moses” object with a threshold of five, the incoming fret number is compared to the number five. If the fret number is below five, the amplifier is or stays turned off; if it is five or above, the amplifier is turned on. Additionally, fret numbers above the threshold increase the delay's and the reverb's wet parameters. As a result, one has a rhythm tone with reduced effect settings when playing below the fifth fret and a more overdriven, effect-laden lead tone when reaching for the fifth fret and above. The threshold of five is, of course, variable. It was chosen here because, when playing in A, one can easily begin a solo at the fifth fret position and above. The test was successful, with the rhythm/lead tone switching happening quite reliably and sufficiently fast.

Next, instead of using fixed settings that were switched either on or off, it was attempted to control certain effect parameters more fluidly according to the fret number. For instance, the reverb's wet knob was set to increase in value as the fret number increases. Consequently, a note played on the first fret had much less reverb than a note played at the 12th fret, with the values in between increasing at a steady rate.
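
Sketched in Arduino-style code for clarity (the actual scaling happens in the Pure Data patch, and the ranges are placeholders), the mapping is a simple linear interpolation:

```cpp
// Linear fret-to-wet mapping: fret 1 -> 0 % wet, fret 12 -> 100 % wet (placeholder range).
float wetFromFret(int fret) {
  fret = constrain(fret, 1, 12);
  return (fret - 1) * (100.0f / 11.0f);   // steady increase per fret
}
```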

With steady communication between Arduino and Pure Data working, a whole lot of new tonal possibilities were opened up, and I am sure there are many more to discover as far as mapping effect parameters to the Arduino data is concerned. This, however, will most probably take place during the third phase of the project in the third semester, when suitable effects and playing styles that exploit the two setups' capabilities will be explored further.