Leaving speakers aside, the hardware requirements mainly consist of wireless wristbands with accelerometers and gyroscopes, a computer that can run Pure Data and Ableton, and everything it takes to get the wristbands and the computer to communicate.
As I had planned last semester, I got to try out the 2.4SINK Kit by Instruments of Things, a set of wireless wristband sensors and a receiver. The receiver is made for modular Eurorack setups, meaning that wristband movements are translated into control voltages (CVs) that can in turn be used to modulate synthesizers or effects. This is of course not necessarily optimal for my application, where I need all the wristband data on my laptop anyway. Thankfully, I was not the first one at my university who wanted to use the 2.4SINK Kit in that way: the module was already built into a small Doepfer box, together with an Expert Sleepers USB audio interface, which allowed me to receive up to 10 CV inputs (including the input expansion module) via USB on my laptop.
Wristband sensors
The sensors look sturdy and well made, yet thin and unobtrusive. On the back, each sensor has two metal pins that switch it on (when both pins are touched at the same time) and also serve as the mounting mechanism to fit it onto belt clips or fabric wristbands (“click-mount”). As soon as a sensor is switched on, it goes into calibration mode, which means it is of utmost importance to have the sensor lying still on a flat surface during this process (otherwise the values will drift constantly).
Receiver
The receiver consists of an antenna, a little toggle button that switches between “configuration” and “active” mode, and 16 dynamically configurable outputs. When switched on, the 2.4SINK receiver creates a wireless LAN network; connecting to it gives access to the kit’s configuration page. There, it is possible to change a few global settings as well as individual settings for all 16 outputs (while the switch on the receiver is set to “configuration” mode). For each output, one can choose which parameter of which sensor (up to seven sensors can be connected) should modulate it. Six parameters are available per sensor:
Rotation X
Rotation Y
Rotation Z
Acceleration X
Acceleration Y
Acceleration Z
Furthermore, it is possible to output LFO signals that are modulated by the sensor’s parameters. However, this is much more interesting for a modular setup and not very suitable for my use case.
USB Interface
The Expert Sleepers USB interface (an ES-8 plus an ES-6 input expansion module) works just like most other USB audio interfaces. The CVs arrive as normalized input signals ranging from -1 or 0 (depending on whether the input is bipolar or unipolar) to 1.
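To give a rough idea of what happens with these signals on the software side, here is a minimal sketch (purely illustrative, since the actual processing happens in Pure Data) of how a bipolar CV arriving in the -1 to 1 range can be rescaled to a 0 to 1 control range before mapping it onto a sound parameter:

```cpp
// Illustrative only: rescale a bipolar CV sample (-1.0 ... 1.0) to a unipolar
// control range (0.0 ... 1.0); unipolar inputs can be used as-is.
float bipolarToUnipolar(float cv) {
  return (cv + 1.0f) * 0.5f;
}
```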
Although only two sensors were available, the 2.4SINK receiver would still have supported reading out all parameters from both of them (six plus six parameters against 16 outputs). The true bottleneck was the Expert Sleepers audio interface, which only has 10 inputs, meaning that not even all parameters of both sensors could be read simultaneously. This led me to the decision that the outcome for this semester would only be a proof-of-concept demonstration with a single sensor.
Last semester I spent a lot of time conceptualizing the installation as such. I had a very holistic approach, in which the sound was clearly the main attraction, but all the other details seemed just as important. While developing this concept and contemplating what I wanted it to convey (and whether I wanted it to convey something at all), I also came up with the name “The Emotional Space”. As a quick recap, I framed my concept in one sentence as follows: “The Emotional Space describes a room that reacts at least as much to the mood of its visitors as the other way around”. While this might not be easy to understand immediately, it conveys my artistic approach quite well: I wanted to create an installation that not only emphasizes, but downright lives through, the individual ways a visitor might want to experience it.
When this semester arrived, a lot of feedback and contemplation followed, and I realized that I was no longer fully standing behind the concept. I received confused comments about the word “emotional” in the title and started questioning how appropriate it is: do I dare to claim to change people’s emotions with my installation? How fitting is such a term to describe a room? With those doubts in mind, my whole concept slightly lost its grounding, and I had to clear my mind and define what I really wanted to spend my time on during this project. Then again, being a sound design student, the choice was clear: I wanted to work with sound.
When I stopped thinking of the project as an installation, I quickly realized that what I wanted to achieve is easy to describe: I would like to create an explorative, interactive composition. Funnily enough, that description does not contradict anything I worked out last semester; it is rather a slightly more focused phrasing. While knowing exactly what I wanted to work on gave the project its direction back, I was still not happy with the word “emotional” in the title. Over the course of this semester, I therefore moved to “The Fluid Space”, which has a more neutral standing and might not create too many expectations in its visitors. Some might disregard these topics as superficial, but I have a strong sense that this process was crucial for knowing exactly what I am working towards. What I am now striving to create is a format rather than an installation. Of course, the presentation of this format will still be embedded in an installation that may follow the holistic values I had in mind last semester. However, I want to further explore the possibilities of the format of explorative, interactive compositions, aided by the sensors I have in place. My modular approach should make it possible to create entirely different kinds of compositions, also in their arrangement and interactivity, while working with the same core logic.
The game engine Unity was chosen as the functional basis for the project, since it is freely available for academic purposes and offers a professional working environment. In addition, thanks to the wide availability of free assets (extension packages that can be added individually per project), Unity makes it possible to create visually aesthetic worlds with comparatively little effort. It also allows for a comparatively simple integration of XR.
Fmod is an audio middleware that can load and play audio files across programs and formats. Fmod was chosen because it is likewise freely available for academic purposes and offers deeper, DAW-like editing options than Unity.
Resonance Audio is a cross-platform SDK for spatial audio that is used in XR, games and videos. Resonance Audio encodes sound sources into an Ambisonics stream (up to third order) and finally decodes it into a binaural output.
In this chapter, the project’s progress during the second semester will be evaluated, and successful as well as failed outcomes will be discussed. The evaluation is based on the goals that were set out for the second phase of the project.
Goal 1: Determine and acquire the necessary equipment
Goal 2: Determine the ideal placement of the sensors and microcontrollers on the guitar neck and pick/right hand and install them accordingly
Goal 3: Program the microcontrollers to pick up the movements of the fretting and picking hands using the Arduino Integrated Development Environment (IDE)
Goal 4: Program a Pure Data patch that handles the incoming data and transforms it into usable parameters to trigger effects
Goal 5: Either program custom effects in Pure Data or integrate Pure Data into a DAW to trigger commercial effect plug-ins
Goal 6: Determine suitable effects and parameters to be triggered by the movements of the fretting and picking hands
As far as the first goal is concerned, it can be stated that this task was accomplished. As outlined above, the necessary and suitable equipment for both the left and right hand setups was determined. This was achieved by thoroughly researching potentially suitable components and subjecting them to tests in order to identify the best possible solution.
With regard to the choice of microcontroller, the Arduino Leonardo was chosen over the Arduino UNO due to its built-in USB communication and USB MIDI device capabilities. Although the Leonardo is rather big and unwieldy, it was decided to stick with it during the experimental phase for the sake of simplicity, since it could be borrowed from FH JOANNEUM. However, it is likely that a smaller microcontroller with similar performance, such as an Arduino Micro, will be used for the final setups in project phase three.
Regarding the sensor for the left hand setup, diligent research and a direct performance comparison led to the decision to use a time-of-flight (ToF) sensor to pinpoint the position of the guitarist’s hand along the neck instead of the initially planned ultrasonic sensor. After evaluating the specifications of several ToF sensors, the VL53L1X was finally chosen.
For the right hand setup, on the other hand, a suitable IMU sensor was found rather quickly. Although an MPU9250 was briefly considered and used during an initial Arduino library test, it was dropped in favour of a BNO055 sensor, following the recommendation of the author’s supervisor.
Of course, next to these main components, other equipment, including cables, a breadboard and further electronic components, was acquired.
All in all, this goal has been largely fulfilled with some improvements possible in the third phase of the project.
As outlined above, the usual positions of the fretting hand were determined and subsequently analyzed. This analysis led to the conclusion that, unfortunately, the posture and the exposed reflection area of the fretting hand vary greatly depending on what is played, with major differences between playing barre chords and single notes, for instance. These inconsistencies in hand posture were and still are a major constraint on the left hand setup and its flawless integration into the natural playing style of a guitarist.
As far as the ideal placement and installation of the sensors is concerned, a lot of progress was made with regard to the left hand setup. Based on the aforementioned analysis of the hand posture, several attachment devices were made for the ultrasonic as well as the ToF sensors and subsequently compared. One position in particular (IMAGE) proved to be better than the others, albeit not perfect. The chosen position works best for barre chords as well as for the Solo Mode application.
Regarding the attachment device of the right hand setup, there is definite room for improvement. The wristwatch solution was sufficient for the experimental phase and proved that placing the sensor on top of the back of the hand yields useful sensor data. However, it is unsuitable for the final product. While a wireless solution is optional for the left hand setup, the right hand setup would certainly benefit from the absence of cables, as it would enable an even more natural way of playing the guitar.
Here, definite progress was made, especially with the left hand setup. With no prior experience in programming, the learning curve was quite steep, and a lot of time had to be dedicated just to learning basic coding techniques.
Regarding the left hand setup, Arduino sketches were written first for the ultrasonic sensor and subsequently for the time-of-flight sensor when the former proved to be unsuitable. In addition to the code needed to access the basic sensor data, the mathematical relationship of the guitar’s fret spacing was established, and fret ranges were determined and implemented in the code. Next to absolute distance measurements, fret numbers can be detected up to the ninth fret, which already enables applications such as the Solo Mode.
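As an illustration of that relationship, the following sketch shows how a measured distance along the neck could be turned into a fret number using the equal-tempered fret-spacing formula. This is a hedged example, not the project’s actual code: the scale length, the sensor offset and the upper fret limit are assumed values.

```cpp
// Illustrative sketch (not the project's actual Arduino code). The fret-spacing
// formula d(n) = L * (1 - 2^(-n/12)) gives the distance of fret n from the nut
// for a scale length L; comparing a measured distance against these positions
// yields a fret number.
const float SCALE_LENGTH_MM  = 648.0;  // assumption: typical 25.5" scale
const float SENSOR_OFFSET_MM = 0.0;    // assumption: sensor mounted at the nut

float fretPositionMm(int n) {
  return SCALE_LENGTH_MM * (1.0 - pow(2.0, -n / 12.0));
}

int fretFromDistance(float distanceMm) {
  float d = distanceMm - SENSOR_OFFSET_MM;
  for (int n = 1; n <= 9; n++) {     // detection currently works up to fret 9
    if (d < fretPositionMm(n)) return n;
  }
  return 0;                          // 0 = outside the detected range
}
```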
The code for the right hand setup, on the other hand, is not yet as advanced, mainly due to time constraints. Using a library, orientation, acceleration and calibration data could be obtained from the IMU sensor and transmitted to Pure Data via MIDI. So far, the y value of the orientation data is the only value suitable for controlling effect parameters, namely those of a Wah-Wah effect. Here, more variety in data usage would be desirable.
Lastly, different data transmission techniques were tested and evaluated. A lot of time was spent first with MIDI USB libraries and then with MIDI-only libraries, with transmission initially suffering from a lot of lag. Serial port communication proved to be the first viable solution fast enough to control effect parameters. Finally, the latency problems of MIDI could be eliminated, and the current setups work via MIDI communication using a MIDI cable. For the third phase, a wireless means of data transmission would be desirable, especially for the right hand setup.
This goal was achieved to a large extent. Although it initially only worked with great latency issues, making a patch that receives MIDI data and can use it for further processing was achieved rather easily. The interim solution, serial port communication, took some more research, but once the basic method was discovered, its application was straightforward. For the left hand setup, the incoming fret numbers can either be used directly to control effect parameters or, using the “moses” object for instance, a fret threshold can be set to create an ON/OFF switch. The patch for the right hand setup is very similar to that of its counterpart, and effect parameters can be controlled.
As far as goal 5 is concerned, mixed results were achieved. Working with the digital audio workstation (DAW) Steinberg Cubase Pro 11 proved to be more difficult than anticipated and, consequently, it was decided to work with Pure Data only during the experimental phase. Nevertheless, in order to ensure a seamless integration into the guitarist’s natural workflow, a DAW integration of the final product is desirable. Thus, this will be attempted in the third phase of the project.
Regarding the decision between using self-made effects or third-party plugins, both approaches were tested. The first patch contained a self-made delay and overdrive/distortion effect, which proved useful for the first test but had definite shortcomings in terms of tonal quality. Thus, the patches from then on used the object “vstplugin~” to implement third-party plugins in the Pure Data environment. The exception is the Wah-Wah effect for the right hand setup, which is self-made and works well.
With stable data transmission between Arduino and Pure Data only established at a relatively late stage of the semester, this goal could not be fully achieved. Tone experiments involving several effects and their parameters were conducted. For the Solo Mode application of the left hand setup, an amplifier, a reverb and a delay were tested. Less tonal experimenting was done with the right hand setup: three effects (tremolo, phaser and Wah-Wah) were tested, with only the latter proving to be a reasonable effect to control with the right hand setup.
It is evident that only the surface has been scratched so far and that much more in-depth research and experimenting with both setups will be needed to really provide practical applications for extending the range of possible guitar sounds.
In conclusion, it can be stated that all tasks set for the second phase of the project have been approached and tackled, with the majority of goals at least partially achieved. Additionally, compatibility with working hypotheses 1 and 2 was consistently ensured, with none of the effects implemented so far intruding on the usual way people play the guitar. As stated in the Exposé and in chapter 4 of this documentation, the overall aim of the second phase was to develop working setups that are sufficiently reliable and allow for further practical research regarding suitable effects, playability and performability. While the left and right hand setups are far from being a final product or ready to be tested by other guitarists, the second, experimental phase yielded a lot of progress. Overall, it can be affirmed that the left and right hand setups, albeit having shortcomings in some areas, are advanced enough to serve as a basis for further practical research in the third semester.
Here, the first goal was to get basic sensor readings in general, with the next step being to figure out what kind of readings are suitable for controlling effect parameters.
In order to establish which data is needed and how it should be used, the data values of x, y and z and their changes were analyzed while performing strumming movements with the sensor strapped to the hand.
With the IMU and Arduino outputting orientation data, it became clear that the y value could prove useful for controlling effect parameters while strumming the guitar. The range of y values was analyzed for the up and down strumming movements, and the established range was subsequently constrained. The range was then remapped to MIDI values from 0 to 127 and, using the same data transmission techniques as the left hand setup (first serial port communication and subsequently MIDI), sent to a Pure Data patch similar to the one for the left hand setup.
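A hedged sketch of that remapping step is shown below; the y range bounds are placeholders standing in for the empirically established values, not the numbers actually used in the project.

```cpp
// Illustrative sketch: clamp the y orientation value to the observed strumming
// range and rescale it to 0-127 so it can be sent as a MIDI value.
const float Y_MIN = -60.0;  // assumption: lower bound observed while strumming
const float Y_MAX =  60.0;  // assumption: upper bound observed while strumming

int yToMidi(float y) {
  float clamped = constrain(y, Y_MIN, Y_MAX);
  return (int)((clamped - Y_MIN) / (Y_MAX - Y_MIN) * 127.0);
}
```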
Next to orientation data, experiments were conducted with accelerometer as well as linear accelerometer data in the same manner.
Using IMU data in Pure Data
In the Pure Data patch, the incoming orientation and (linear) acceleration data was used to control several effects and their parameters. Using the aforementioned “vstplugin~” object, the following effects were tested:
MTremolo by Melda Production
MPhaser by Melda Production
However, the incoming IMU data proved to be unsuitable for these effects. The linear and normal acceleration data could not be used at all. The orientation data worked to some extent, but no practical application of its effects was immediately discovered.
The first success using the IMU data was the Wah-Wah effect. Using the “vcf~” object, a bandpass filter was built, with an adjustable fader controlling the center frequency from 280 Hz to 3000 Hz (the normal operating range of a Wah-Wah pedal) and a Q factor of 4. Using the y value from the orientation data, the center frequency was controlled through the strumming movements of the right hand. The resulting sound was similar to that of a “real” Wah-Wah pedal and could be achieved solely through the natural strumming performed while playing.
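To make the mapping concrete, here is a small illustrative function; the real mapping lives in the Pure Data patch, and the exponential curve is an assumption (a linear mapping would work just as well). It turns the incoming 0-127 value into a center frequency between 280 Hz and 3000 Hz:

```cpp
// Illustrative only: map a 0-127 control value to a wah center frequency.
// The exponential sweep is an assumption that roughly matches how a real
// wah pedal feels; the project's patch may simply map linearly.
float wahCenterFrequency(int controlValue) {
  const float F_MIN = 280.0;
  const float F_MAX = 3000.0;
  float t = constrain(controlValue, 0, 127) / 127.0;
  return F_MIN * pow(F_MAX / F_MIN, t);
}
```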
With my left hand setup kind of working, I decided to start on my right hand setup, which I have unfortunately neglected completely so far. A short recap: the right hand setup is planned to consist of an IMU sensor that picks up the natural strumming patterns of the right hand and uses the movement parameters to modulate the guitar sound. First of all, what is an IMU sensor? According to Wikipedia, an inertial measurement unit (IMU) is an electronic device that measures and reports a body’s specific force, angular rate, and sometimes the orientation of the body, using a combination of accelerometers, gyroscopes, and sometimes magnetometers. As the definition suggests, it is quite a complex device and, coding-wise, on another level compared to the left hand setup featuring the time-of-flight sensor.
At the very beginning of the semester, my supervisor gave me one of his IMU sensors, an MPU-92/65. However, when I approached him last week about the IMU sensor business for my right hand setup, he recommended using another kind of IMU sensor, the BNO055 from Bosch. Apparently, there are better and easier-to-use Arduino libraries for the BNO055, and it is capable of sensor fusion, something I will get into below. Luckily, he also had one of those and gave it to me for experimenting.
Additionally, my supervisor told me the basics of IMU sensors which I will relay to you now:
As already mentioned in the definition, an IMU sensor basically combines an accelerometer, a gyroscope and a magnetometer, which can be used alone or in combination to obtain information about the position and/or movement of an object. When used in conjunction (= sensor fusion), one can determine the pitch, roll and yaw movements of said object, which is what I think I need. Since I have to actually wear the sensor on my wrist while playing the guitar, I cannot yet say what kind of information I need from the IMU sensor. Of course, the pitch, roll and yaw movements make sense, but I could also try acceleration values, for example. My goal for now is to get sensor readings in general, and in the next step I will try to figure out what kind of readings work best for my cause.
I found an Arduino library that lets me calculate the orientation of the sensor, giving me readings for the x, y and z axes. My supervisor also highlighted the need to calibrate the sensor each time, otherwise the readings are inaccurate. Luckily, the library also has a function that gives me the calibration status of each of the sensors in the IMU (accelerometer, gyroscope, magnetometer): 0 means not calibrated at all, 3 means the sensor is fully calibrated. I watched a YouTube video that explains how to calibrate each of the three sensors: to calibrate the gyroscope, the sensor just needs to sit still for one to two seconds (easy!). To calibrate the magnetometer, one needs to tilt and move the sensor in all directions for a bit, which also works quite well. Calibrating the accelerometer is the most complex of the three: one must tilt the sensor at different angles and hold each position for about five seconds. It takes a little time and experimenting, but it works.
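Since the post does not name the library, here is a hedged sketch assuming the widely used Adafruit BNO055 Arduino library: it reads the orientation (x, y, z) and prints the calibration status (0 = uncalibrated, 3 = fully calibrated) of each internal sensor, as described above.

```cpp
// Assumption: Adafruit BNO055 library (the library actually used is not named).
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);  // sensor ID, I2C address

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected, check wiring!");
    while (1);
  }
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);  // Euler angles in degrees

  uint8_t sys, gyro, accel, mag;
  bno.getCalibration(&sys, &gyro, &accel, &mag);  // each 0 (uncalibrated) to 3

  Serial.print("x: ");   Serial.print(event.orientation.x);
  Serial.print("  y: "); Serial.print(event.orientation.y);
  Serial.print("  z: "); Serial.print(event.orientation.z);
  Serial.print("  calib sys/gyro/acc/mag: ");
  Serial.print(sys); Serial.print(gyro); Serial.print(accel); Serial.println(mag);

  delay(100);
}
```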
With the calibration and the orientation readings working, I decided to test it by putting the sensor on my wrist, which was easier said than done! After some tinkering I came up with the very rough solution of using my (seldom-worn) wristwatch and sticking the IMU sensor onto it with double-sided tape. Now I should be able to strap the watch over my hand and start playing.
With a new means of communication in place (see blog post #12), a new Pure Data patch was made. After custom effects (the delay and the overdrive/distortion effect) had been created for the first Pure Data patch, it was now deemed better to use third-party effect plugins. Luckily, with the help of the object “vstplugin~”, such third-party plugins can be used inside Pure Data. Furthermore, it is possible to access a plugin’s list of parameters and control them by sending control messages.
The Pure Data patch uses the following three plugins:
L12X Solid State Amplifier
TAL Dub Delay
TAL Reverb 4
The plugins were chosen due to practical constraints: the author primarily uses a desktop PC for audio-related tasks and, hence, most of his plugins are only available on that machine. As the author’s laptop is used for this project, free plugins were chosen.
At first, a prototype setting for the solo mode was made. Here, the parameter mapping is as follows:
The values coming in from the Arduino are the fret numbers calculated in the Arduino sketch from the distance to the ToF sensor. The incoming fret number determines whether the amplifier plugin is turned on or off. Using a “moses” object set to a threshold of five, the incoming fret number is compared against that threshold. If the fret number is below five, the amplifier is or stays turned off; if it is five or above, it is turned on. Additionally, fret numbers of five and above increase the delay’s and the reverb’s wet parameters. As a result, one has a rhythm tone with reduced effect settings when playing below the fifth fret and a more overdriven, effect-laden lead tone when reaching for frets five and above. The threshold of five is, of course, variable. It was chosen in this case because, when playing in A, one can easily begin to play a solo at the fifth fret position and above. The test was successful, with the rhythm/lead tone switching happening quite reliably and sufficiently fast.
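The following C-style sketch only mirrors that behaviour for clarity; the actual logic is implemented in the Pure Data patch with the “moses” object, and the wet amounts used here are assumptions.

```cpp
// Illustrative mirror of the solo mode mapping (not the actual Pure Data patch).
struct SoloModeState {
  bool  ampOn;
  float delayWet;   // 0.0 - 1.0
  float reverbWet;  // 0.0 - 1.0
};

SoloModeState soloModeFor(int fretNumber) {
  const int threshold = 5;            // variable threshold, here set to fret 5
  SoloModeState s;
  if (fretNumber >= threshold) {
    s.ampOn = true;                                 // lead tone: amp plugin on
    float extra = 0.02 * (fretNumber - threshold);  // grows with higher frets
    s.delayWet  = 0.3 + extra;                      // assumed base wet values
    s.reverbWet = 0.3 + extra;
  } else {
    s.ampOn = false;                                // rhythm tone: amp plugin off
    s.delayWet  = 0.1;
    s.reverbWet = 0.1;
  }
  return s;
}
```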
Next, instead of using fixed settings that were switched either on or off, an attempt was made to control certain effect parameters more fluidly according to the fret number. For instance, the reverb wet knob was set to increase in value as the fret number increases. Consequently, a note played on the first fret had much less reverb than a note played at the 12th fret, with the values in between increasing at a steady rate.
With the steady communication between Arduino and Pure Data working, a whole lot of new tonal possibilities opened up, and I am sure that there are many more to discover as far as mapping certain effect parameters to the Arduino data is concerned. This, however, will most probably take place in the third semester during the third phase of the project, where suitable effects and playing styles that exploit the two setups’ capabilities will be explored further.
As mentioned in the previous blog post #11, it was initially planned to start using the IMU sensor. However, my supervisor and I decided that my current MIDI sending setup was not cutting it anymore. The problem is that I currently use control change messages to send MIDI data, which are limited to 128 steps (0-127). Consequently, if I want to implement a pitch shift, for example, one would definitely hear the pitch jumping from step to step instead of a smooth transition. My supervisor hence recommended sending MIDI pitchbend values instead. Rather than 128 steps, a pitchbend message covers the range 0-16,383, which means many more steps and smoother transitions. Unfortunately, the Arduino MIDI USB library I was using at the time could not send pitchbend messages. My supervisor and I did some research, and the only library we found that could send pitchbend messages was not USB compatible.
Additionally, the general way in which both setups were transmitting sensor data via the Arduino to the laptop for further use was not ideal. The latency was high, with new MIDI data sometimes coming in only every five to eight seconds according to the MIDI analyzer “MIDI View”. This, of course, makes the data practically unusable for fluently controlling effect parameters in near real time. In an effort to mitigate the issue, it was decided to abandon the idea of using the USB serial port and a MIDI USB library for transmitting the MIDI data. Instead, the idea was to use an actual MIDI cable and a “pure” MIDI library. As a result, I was tasked with getting a MIDI (DIN) jack and wiring it to my Arduino so it could send the MIDI messages directly over a MIDI cable and not via the USB cable that connects the Arduino to the laptop.
Consequently, the necessary parts, including a MIDI jack, were bought. The MIDI jack was soldered and connected via a breadboard to the Arduino, and a MIDI cable was used to connect the Arduino/breadboard to the MIDI In jack of a Steinberg audio interface. Additionally, the Arduino sketch was altered to accommodate the new MIDI library and the new means of transmitting the MIDI data. Unfortunately, all these efforts were apparently in vain, for the MIDI data transmission did not get any faster. At that point in time, the reason for this was unknown and no explanation was found. With no reliable (and especially no fast) way to transmit the sensor data for further use in Pure Data, the whole project began to stall, since a stable transmission was THE prerequisite for further testing of the attachment device, the sensor position, the Arduino code and the quest to find suitable effects and uses for the setups.
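For reference, a minimal sketch of the DIN-based approach is shown below. It assumes the Arduino MIDI Library by FortySevenEffects (the post does not name the library) and an analog input standing in for the sensor data; the library’s integer pitchbend overload takes a signed value from -8192 to 8191, corresponding to the 14-bit range of 0-16,383 mentioned above.

```cpp
// Hedged sketch: sending pitchbend over a hardware serial port wired to a MIDI
// DIN jack, assuming the FortySevenEffects Arduino MIDI Library.
#include <MIDI.h>

MIDI_CREATE_DEFAULT_INSTANCE();  // on a Leonardo this typically uses Serial1

void setup() {
  MIDI.begin(MIDI_CHANNEL_OMNI);
}

void loop() {
  int sensorValue = analogRead(A0);                   // placeholder sensor input
  int bend = map(sensorValue, 0, 1023, -8192, 8191);  // scale to pitchbend range
  MIDI.sendPitchBend(bend, 1);                        // MIDI channel 1 assumed
  delay(10);
}
```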
With the final presentation approaching fast, it was decided in June to try a radically different method. During an internet research session, a YouTube video was discovered that uses serial port communication to transmit data from an Arduino to Pure Data via the “comport” object in Pure Data. After specifying the baud rate, one can open and close the data stream coming from the Arduino and, using additional objects, convert the stream back into the original values sent from the Arduino. Using this method, faster data transmission could be achieved.
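The Arduino side of this serial approach can be as simple as the hedged sketch below; the baud rate and the newline delimiter are assumptions and simply have to match what the “comport” object and the parsing objects in Pure Data expect.

```cpp
// Illustrative Arduino side of the serial connection read by [comport] in Pd.
void setup() {
  Serial.begin(9600);          // must match the baud rate given to [comport]
}

void loop() {
  int fretNumber = 5;          // placeholder for the value computed from the sensor
  Serial.println(fretNumber);  // sent as ASCII digits terminated by CR/LF
  delay(10);
}
```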
At this stage of the project, I decided that it was time to determine which sensor will be used for the left, fretting hand setup. In order to find out which sensor works best in which position, I directly compared the sensors one after the other in different positions and took notes on what works and what does not. The good news is that we have a clear winner; the bad news is that some issues still remain.
Which sensors were tested? I tested the HC-SR04 ultrasonic sensor against the VL53L1X time-of-flight sensor. Both sensors were tested using two different rigs: rig #1 is attached to the bottom of the neck and rig #2 is placed horizontally on top of the neck.
I started with the ultrasonic sensor. Having worked with the time-of-flight sensor for a while now, I was surprised at how bad the results were. As I have mentioned a couple of times (most notably in blog post #6), I always had problems with the ultrasonic sensor when it came to accurately pinpointing the fret location with a real hand as a reference. In conjunction with rig #1, the ultrasonic sensor gives very accurate distance and fret number readings if a flat surface is used as the object whose distance is to be measured. If this object is a hand, however, the readings become very inaccurate. Don’t quote me on this, but I think the problem arises because the ultrasonic sound spreads out a little as it travels through the air and, as a result, bounces off multiple points of the fretting hand, which in turn leads to fluctuating readings. Using rig #2, the ultrasonic sensor did not work at all during the test. Here, the sound apparently bounces off the frets instead of my hand. Depending on the angle of the sensor, it only detects a range of two to five frets.
These sobering results led me to the time-of-flight sensor which, luckily, fared better than its ultrasonic relative. No matter which rig, the distance and fret number readings are very exact up to fret number eight, and I think that with further coding I can extend the range to at least the 10th fret. I think the issue at hand is that the library I am currently using calculates only integer values. However, since the fret ranges and their lower and upper limits are defined as decimal numbers, the integer values may become too inaccurate to calculate the fret numbers beyond the 7th fret. I have already asked my supervisor for advice regarding this matter. Another major concession is that the fret detection works best when playing barre chords. When playing a solo, I am confronted with the problem that the forefinger is always registered one fret before the actual fret where the solo is played. This, unfortunately, is the result of the usual hand and finger positioning when soloing. Since my goal is to extend the guitar’s range of sounds in a way that is not invasive, I must find a solution that does not require the player to alter his or her hand position…
Nevertheless, we have a clear winner: the time-of-flight sensor, especially in conjunction with rig #2. It does not work optimally yet, but I will spend no more time on the ultrasonic sensor and will focus on the ToF. However, as far as learning to code is concerned, my time spent with the ultrasonic sensor was not in vain: I learnt a lot and could transfer a lot of code to the ToF sensor.
Hello and welcome back! We were in the middle of deciding which sensor fits my project best – the VL53L1X or the VL53L0X?
They are pretty similar and both have enough range; however, there is one difference that made me favor the VL53L1X: the possibility to (de)activate SPADs. The VL53L1X has a 16 x 16 SPAD array for detection. By default, all SPADs are enabled, which creates a conical field of view (FoV) covering 27°. By disabling some of the SPADs, the cone can be narrowed down to 15° with 4 x 4 activated SPADs, and the direction (ROI = region of interest) can be specified. The VL53L0X does not offer these options. Considering my problems with detecting the appropriate finger to get the most accurate location of the fretting hand possible, I figured that the possibility to further restrict the region of interest would lead to more accurate results.
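As a hedged illustration of how such an ROI restriction might look in code (assuming the Pololu VL53L1X Arduino library, whose more recent versions expose ROI configuration; the 4 x 4 size and the timing values are example settings, not those of the project):

```cpp
#include <Wire.h>
#include <VL53L1X.h>   // Pololu VL53L1X library (assumption)

VL53L1X sensor;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.setClock(400000);                    // the sensor supports 400 kHz I2C

  sensor.setTimeout(500);
  if (!sensor.init()) {
    Serial.println("Failed to detect and initialize VL53L1X!");
    while (1);
  }

  sensor.setDistanceMode(VL53L1X::Short);   // short range suffices along a neck
  sensor.setMeasurementTimingBudget(50000); // 50 ms per measurement (example)
  sensor.setROISize(4, 4);                  // narrow the 16 x 16 SPAD array to 4 x 4
  sensor.startContinuous(50);               // new reading every 50 ms
}

void loop() {
  Serial.println(sensor.read());            // distance in millimeters
}
```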
Consequently, I bought a VL53L1X on Amazon; it was quite expensive in comparison: I could have bought three VL53L0Xs for the price of one VL53L1X. I was used to having more and cheaper sensors at my disposal (I had five ultrasonic sensors, for example), and I have to admit this put some pressure on me while working with the sensor, because I had no second (or even third) chance in case I messed it up somehow.
This pressure intensified when I saw that the sensor came without the necessary pins soldered on. I am SO bad at soldering. I once took part in a soldering course as a child where we had to build a radio from a kit, and I just couldn’t get it to work because of my horrible soldering. That short anecdote aside, I had no choice but to try. The result is quite appalling (even my supervisor said so), but it sort of works. Here is a picture that illustrates my shortcomings:
I now had to build rigs to attach it to my guitar. The design of the rigs is closely based on the rigs already built for the ultrasonic sensors, using the same material and technique. I first built a rig that places the sensor on the underside of the neck. When trying out the VL53L1X sensor with rig #1, I noticed that when playing a solo my forefinger is placed a little before the actual fret it is fretting, and I figured that I might need to adjust the sensor in this regard. However, when playing a barre chord, for example, the finger position is right on the correct fret, which could evolve into a real problem.
I decided to build a second rig that places it horizontally on the neck itself in order to see if this position solves this problem. I will keep you updated on the results.
I tried rig #1 with quite good results but still have to try out rig #2. After dropping it for the third time, I also gave in to the pressure of working with a single expensive VL53L1X sensor and bought three VL53L0Xs, just in case. This project is getting quite expensive, and I wonder if there is a possibility to receive some funds from the KUG or FH…