Here comes the ToF sensor Part 1

The proceedings described in this blog post took place quite some time ago, before other projects forced me to put my guitar project on the backburner. I also did not have time to put my progress into writing – until now. Today is a nice sunny day, so I decided to sit on my balcony and write this post, which is long overdue anyway.

Short recap: In my last post I mentioned problems with detecting finger placement when playing the guitar, since finger placement on the fretboard is:

  1. Different according to how the guitar is played. The hand and finger position differs, for example, when barre chords are played compared to when single notes/a solo is played.
  2. Not like a flat surface (which is ideal for the ultrasonic sensor) but instead quite inconsistent, and I think that sometimes the sensor does not “know” which finger or other part of the hand is the relevant point from which to measure the distance.

I had the growing concern that the ultrasonic sensor is not the right tool to pinpoint the location of the fretting hand along the neck, and after consulting with my project supervisor we decided to try our luck with a so-called time-of-flight (or ToF) sensor that measures distance using light instead of ultrasonic sound. My supervisor recommended several types, and I started comparing their specs.

But first of all, what are “time-of-flight” sensors? According to a trusty and very helpful blog called “Wolles Elektronikkiste”, these kinds of sensors work (as the name suggests) on the time-of-flight (ToF) principle. They emit light rays and determine the distance from the time that passes until the reflected rays reach the sensor again. This makes them very similar to the ultrasonic sensor, except that they work optically. The rays I mentioned are in fact infrared (IR) rays emitted from a Vertical Cavity Surface-Emitting Laser (VCSEL) with a wavelength of 940 nanometers (nm) – a wavelength outside the spectrum of visible light. Furthermore, the sensors use so-called Single Photon Avalanche Diodes (SPADs) (this detail will become more useful down the (blog) road).
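To get a feel for the numbers (my own back-of-the-envelope calculation, not from the blog): light travels at roughly 3 × 10^8 m/s, so a target 1 m away produces a round trip of only about 2 m / (3 × 10^8 m/s) ≈ 6.7 nanoseconds. Timing such tiny intervals is why these chips rely on extremely sensitive detectors like SPADs and do the time measurement on-chip instead of leaving it to the microcontroller.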

On my quest to find the ideal ToF sensor, I looked more closely at the VL6180X, the VL53L0X and the very similar VL53L1X.

A quick search verified that the VL6180X is unsuitable for my project because of its limited range of 20 cm. As mentioned in one of my previous blog posts, I am working with a Charvel guitar whose scale length is 64.8 cm, with the neck measuring approximately 40 cm, so 20 cm of range is simply too short.

The next choice between the VL53L0X and the VL53L1X proved to be more difficult, and I have not reached a final decision yet. Again, “Wolles Elektronikkiste” provided me with a useful article comparing these two sensors. They have a lot in common as far as voltage range and pinout are concerned, and both communicate via I2C. The article describes the range as the most obvious difference between the VL53L0X and VL53L1X, with the former reaching two meters and the latter up to four meters. Both ranges are more than enough for my application, so I still had no obvious favorite.
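Reading either sensor from an Arduino should look roughly like the sketch below. This is only an illustration assuming the Pololu VL53L0X Arduino library; I have not wired up either sensor yet, so all details (pins, timing, units) are placeholders rather than my actual code.

```cpp
#include <Wire.h>
#include <VL53L0X.h>   // Pololu VL53L0X library (assumed)

VL53L0X sensor;

void setup() {
  Serial.begin(9600);
  Wire.begin();                       // ToF sensors talk over I2C
  sensor.setTimeout(500);
  if (!sensor.init()) {
    Serial.println("Failed to detect VL53L0X");
    while (true) {}
  }
  sensor.startContinuous();           // free-running ranging
}

void loop() {
  uint16_t mm = sensor.readRangeContinuousMillimeters();
  if (sensor.timeoutOccurred()) {
    Serial.println("Sensor timeout");
  } else {
    Serial.println(mm / 10.0);        // print the distance in cm
  }
  delay(50);
}
```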

I will insert a cliffhanger here to keep the suspense alive: Which sensor will I choose? Read my next blog post to find out!

Sources:

VL6180X – ToF proximity and ambient light sensor • Wolles Elektronikkiste (wolles-elektronikkiste.de)

VL53L0X and VL53L1X – ToF distance sensors • Wolles Elektronikkiste (wolles-elektronikkiste.de)

My Poster of the Future Journey

On Tuesday this week, one of the major projects of the semester came to an end: the poster of the future. Over the last two weeks this project kept me quite busy, and I had little time for something as complex as my guitar project and the related blog posts. To make up for at least one missed post, I decided to dedicate this one to my poster-of-the-future journey.

It all started back in January, still in the first semester. We were tasked with coming up with an idea for a “poster of the future”, thinking about how a poster could look in the future and which themes it could possibly address. The finished results would then be presented in an upcoming exhibition in September. The project aimed to foster interaction between the majors, but at first only the majors Sound Design and Communication Design were involved.

I teamed up with a colleague who studies Communication Design, and we thought about problems or issues that may arise in the future. We did not want to go with obvious choices such as climate change or too much screen time (which would have been the easier and better (?) choices in retrospect). After some time (and partly inspired by me watching Blade Runner 2049), we came up with the topic of advanced, intelligent robots and how they will be treated in the future – will they be enslaved or used for work deemed unworthy of humans? In terms of poster and sound, we wanted to show a human hand that turns out to be a robot’s hand underneath, accompanied by sound that changes from analog to digital sounds and instruments.

With this idea, we started into the second semester, where the whole project and its importance were blown up immensely. Suddenly, all majors were to take part, and every student was not only to do his or her own project but also had to partner up with another student and help them out.

Furthermore, the second semester also marked the beginning of my colleague and me wandering from project presentation to project presentation, desperately trying to come up with a consistent concept that satisfied the demands of our two main lecturers. At first, we were pretty happy with our idea, but the lecturer responsible for Communication Design changed from the first to the second semester, and our idea had to change as well.

We followed the suggestion of the COD lecturer to pursue the concept of eroticism and how it might be perceived in the future. We had several ideas that were rejected sooner or later because they did not meet the lecturers’ expectations. Finally, we stumbled upon the term erobotics, describing the possibility of intimate human-robot relationships in the future. We clung to that topic and (with the help of the SD lecturer) came up with the idea of three people telling their friends about their intimate experience with a robot. To ensure the authenticity of the narrations, we hired three aspiring actors whom I knew from a film project we did during the first semester. The actors proved to be quite costly, but they did a great job, and at the end of the day I had three authentic tales to work with. In the meantime, my colleague had to let go of illustrations and was told by her lecturer to use photos instead. She photographed close-ups of machines, and I installed a mechanism that allowed the actors’ stories to be triggered by touching the photos.

The last hurdle was to come up with a sound design concept underlining the actors’ recordings. I was already pretty happy with the recordings and the messages they conveyed, so I had a hard time coming up with additional sound design just for the sake of it. My lecturer suggested using abstract sounds. The thing is, I cannot relate to abstract design, be it visual or auditory. I have never stood before an abstract painting and been overwhelmed with emotions. However, I have seen my fair share of realistic paintings all over Europe and was always in awe. Maybe it is because I do not understand it, but I just cannot relate to abstract art and am never impressed by it.

But here we go. I recorded some ASMR sounds that admittedly turned out pretty good. However, I tried to go for something realistic one more time and modeled acoustic backgrounds that placed the three actors in three different but plausible environments where they would tell their stories in real life as well. I was then again reminded that the abstract way is the right way, so I changed the whole thing once more and came up with an abstract concept I could halfway relate to. By then, I was already hoping that our poster would not be chosen for the exhibition.

Not counting a one-hour delay, the final presentation went quite well, with the touch function working like a charm.

I am now glad that the project is over. Although it was a real struggle, I am reasonably happy with the results, and I think that we did our best. I do not see my future in sound installations and exhibitions, so I will not be extraordinarily sad if our project does not get chosen. In retrospect, we should have decided on something more straightforward than erobotics; it would have saved us some nerves. Additionally, I would have wished for more acceptance of our own ideas. Somewhere in the middle of the project I began thinking about what kind of sound design would please my lecturers instead of what kind of sound design I would like our poster to have.

However, I also have to mention that I really enjoyed doing the sound design for another student’s project. It involved composing four songs of four different genres (jazz, rock, retro pop, classical) and this is where I feel truly comfortable.

Establishing ranges and sensor type musings

With the correct formula identified, I started to work on the Arduino implementation. As outlined in Blog #5, my first sketch could only compute the right fret number after manually inputting the corresponding distances. The next task was thus to write a sketch that allows the setup to read the distance and then automatically output the correct fret number. As the hand is always nearer to the sensor than the fret, certain fret ranges must be defined. The Arduino then must be able to recognize which distances coming from the sensor fall into which fret range and output the correct fret. From my Excel calculation sheet (and, in retrospect, just from looking at the guitar), I noticed that the fret deltas (the difference between the distance to fret n and the distance to fret n+1) gradually decrease and that each delta corresponds to the width of that fret. Since my fingers are always placed in the area between two frets, I can define the range of fret n with the lower limit being the distance to fret n-1 and the upper limit being the distance to fret n.

Following this principle, I defined the following fret ranges:

| Fret number | Range lower limit [cm] | Range upper limit [cm] |
| --- | --- | --- |
| 0 | – | – |
| 1 | 0.00 | 3.64 |
| 2 | 3.64 | 7.07 |
| 3 | 7.07 | 10.31 |
| 4 | 10.31 | 13.37 |
| 5 | 13.37 | 16.25 |
| 6 | 16.25 | 18.98 |
| 7 | 18.98 | 21.55 |
| 8 | 21.55 | 23.98 |
| 9 | 23.98 | 26.27 |
| 10 | 26.27 | 28.43 |
| 11 | 28.43 | 30.47 |
| 12 | 30.47 | 32.40 |
| 13 | 32.40 | 34.22 |
| 14 | 34.22 | 35.93 |
| 15 | 35.93 | 37.55 |
| 16 | 37.55 | 39.08 |
| 17 | 39.08 | 40.53 |
| 18 | 40.53 | 41.89 |
| 19 | 41.89 | 43.18 |
| 20 | 43.18 | 44.39 |
| 21 | 44.39 | 45.53 |
| 22 | 45.53 | 46.62 |

I then implemented these ranges in my Arduino code using a chain of if()/else if() statements; a simplified version of that mapping is sketched below.
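This is not my exact code, just an illustrative excerpt showing the idea (only the first few frets are written out):

```cpp
// Map a measured distance (in cm from the sensor/nut) to a fret number,
// using the range boundaries from the table above.
int distanceToFret(float distanceCm) {
  if (distanceCm < 3.64)       return 1;
  else if (distanceCm < 7.07)  return 2;
  else if (distanceCm < 10.31) return 3;
  else if (distanceCm < 13.37) return 4;
  else if (distanceCm < 16.25) return 5;
  // ... the remaining boundaries continue down to fret 22 (46.62 cm)
  else                         return 22;
}
```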

I then put the code to the test, and it works as planned – at least when using a flat surface (in my case a ruler) as the object whose distance is measured. Unfortunately, it does not work that well or consistently with my actual hand playing the guitar. Finger placement on the fretboard is:

  1. Different according to how the guitar is played. The hand and finger position differs, for example, when barre chords are played compared to when single notes/a solo is played.
  2. Not like a flat surface (which is ideal for the ultrasonic sensor) but instead quite inconsistent, and I think that sometimes the sensor does not “know” which finger or other part of the hand is the relevant point from which to measure the distance.

I decided to tackle this problem by first improving the rig that secures the ultrasonic sensor to the guitar neck/headstock, figuring that my first version was too unstable and placed the sensor in a non-optimal position. I therefore bought a hot glue gun and went to my mum, whose cellar contains plenty of other tools and materials. With these combined powers at my disposal, I crafted two rigs that constitute a definite improvement over my first version as far as stability is concerned. The first rig places the sensor below the fret and the other rig places the sensor right on the fretboard itself.

Rig 1

Rig 2

The first rig, however, gives similarly inconsistent results as the very first version. The second version looks more promising, but I am currently dealing with a reflection issue: the ultrasonic beam is reflected too early by the frets.

Unfortunately, this problem keeps recurring no matter what rig I try, and I cannot seem to get on top of it. I am thinking about changing the type of sensor from an ultrasonic to a time-of-flight sensor, which functions optically, measuring the distance using light and its travel time. Another possibility would be to use more than one sensor and combine the measurements to obtain a more accurate overall measurement.

Figuring out the fretboard scale

One of the next tasks I tackled was figuring out the scale of the fretboard. What do I mean by that? At first, I had the ultrasonic sensor measure the distance in a straightforward, linear way. However, the frets on a guitar neck become progressively narrower and, accordingly, a linear scale becomes progressively more inaccurate. For my product, it would be great if the sensor output not the absolute distance but the actual fret of the current hand position. The first step in achieving this goal was to establish the mathematical relationship between the distances of the frets. Luckily, I found a YouTube video series that thoroughly explains the issue.

Apparently, the basis of it all is the (Western) chromatic scale and 12-tone equal temperament. The chromatic scale is a musical scale with twelve pitches, each a semitone, also known as a half-step, above or below its adjacent pitches. 12-tone equal temperament, on the other hand, is the musical system that divides the octave into 12 parts, all of which are equally tempered (equally spaced) on a logarithmic scale, with a ratio equal to the 12th root of 2 (2^(1/12) ≈ 1.05946). That resulting smallest interval, 1/12 the width of an octave, is called a semitone or half step.

As a result, in 12-tone equal temperament (the most common tuning in Western music), the chromatic scale covers all 12 of the available pitches.

After watching the videos and making some experimental calculations, I came up with the following formula to calculate the remaining string length depending on the fret:

Lk = L0 / 2^(k/12)

where:

k = fret number

Lk = string length at fret k

L0 = scale length

Using this formula, I then calculated the fret positions for my Charvel with a scale length of 648 mm or 64.8 cm in an Excel sheet. I double-checked the calculations by manually measuring several fret distances and comparing them with the calculated ones – the math seems to be correct. As this formula gives me the string length AFTER the finger, i.e. from the finger/fret all the way to the bridge of the guitar, I subtracted it from the total scale length of 64.8 cm to get the distance from the sensor at the headstock to the hand.
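A quick sanity check with my own numbers (not taken from the Excel sheet): for fret 1, L1 = 64.8 cm / 2^(1/12) ≈ 61.16 cm of remaining string length, so the first fret sits about 64.8 − 61.16 ≈ 3.64 cm from the nut; for fret 12, L12 = 64.8 / 2 = 32.4 cm, exactly half the scale length, as expected for the octave.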

I quickly realized, however, that I needed the formula the other way round, so to speak – I had to express the fret k depending on the distance measured. Consequently, I summoned all my mathematical skills to transform the formula accordingly and came up with:

k = log_(2^(1/12)) ( L0 / (L0 − Dk) ), i.e. a logarithm to the base 2^(1/12)

where:

k = fret number

Dk = measured distance from the nut/sensor to the fretting finger

L0 = scale length

Now I had (or still have) to find a way to implement this formula in my Arduino code and figure out how to use it best. Firstly, I had to find another way to get distance measurements from the ultrasonic sensor, because I had used a library that only outputs integer values. The fret distances, however, are not integers at all, so I programmed the ultrasonic sensor differently so that it delivers floating-point values. With that done, I had to implement the formula in Arduino. The Arduino language only offers logarithms to fixed bases, not to my base of 2^(1/12), but my supervisor gave me the following change-of-base formula to express it using log10:

k = log10( L0 / (L0 − Dk) ) / log10( 2^(1/12) ), which is the same as 12 · log10( L0 / (L0 − Dk) ) / log10(2)

where:

k = fret number

Dk = measured distance from the nut/sensor to the fretting finger

L0 = scale length

I then started a new, separate sketch to try out the formula, and it works! I can input a certain distance and it returns the corresponding fret.
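The core of such a test sketch could look like this. This is only an illustration under my assumptions (hard-coded scale length, distances typed in by hand), not my exact code:

```cpp
// Compute the fret number from a distance measured from the nut/sensor.
const float L0 = 64.8;  // scale length in cm

int fretFromDistance(float d) {
  // change-of-base version: log10 divided by log10 of the 12th root of 2
  float k = log10(L0 / (L0 - d)) / log10(pow(2, 1.0 / 12.0));
  return (int)round(k);
}

void setup() {
  Serial.begin(9600);
  Serial.println(fretFromDistance(3.64));   // expected: 1
  Serial.println(fretFromDistance(32.40));  // expected: 12
}

void loop() {}
```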

However, I still have some issues to solve:

  1. The hand is always nearer to the sensor than the fret, so I need to establish certain fret ranges.
  2. Arduino then must be able to recognize which distances coming from the sensor fall into which fret range and then output the correct fret.  
  3. It would be cool if one could calibrate the sensor by first giving it the distance to the first fret and then to the 12th, and having it calculate the frets in between based on the above-mentioned formula (a rough sketch of this idea follows below). That way, the fret detection mechanism would be flexible and adaptable to any guitar scale length.
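Such a calibration could be solved with two measurements, since fret 12 always halves the string and fret 1 shortens it by a fixed fraction. The following is only my own rough sketch of the idea – the variable names and the constant sensor-offset model are assumptions, and none of this is implemented yet:

```cpp
// Model: measuredDistance(k) = sensorOffset + L0 * (1 - pow(2, -k / 12.0))
// Two measurements (finger at fret 1 and at fret 12) give two unknowns:
// the scale length L0 and a constant sensor offset.
float L0 = 64.8;           // scale length in cm, overwritten by calibrate()
float sensorOffset = 0.0;  // distance from sensor to nut in cm

void calibrate(float measuredFret1, float measuredFret12) {
  const float c1  = 1.0 - pow(2, -1.0 / 12.0);  // ≈ 0.0561 (fret 1 fraction)
  const float c12 = 0.5;                        // fret 12 halves the string
  L0 = (measuredFret12 - measuredFret1) / (c12 - c1);
  sensorOffset = measuredFret1 - L0 * c1;
}
```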

Sources:

Math of the Guitar Fretboard (Part 1) – YouTube

As well as the following parts

Chromatic scale – Wikipedia

12 equal temperament – Wikipedia

The Emotional Space | #12 | Signal Flow

Getting from a sensor movement to a change in the soundscape takes quite a number of steps. To make the process and my tasks for building a prototype a little more tangible, I created a small sketch that includes some signal flow descriptions and other details laying out the logical sequence of the interactive capabilities of this project. I will use this blog entry to display and explain the aforementioned sketch.

The complete signal flow sketch

Refining sensor readings

After achieving the first goal of turning an effect on above a certain fret and off below it, as outlined in Blog #3, I have to admit that I struggled a little bit figuring out my next steps. Luckily, my supervisor and I were able to schedule and hold a meeting last week, and it really helped me figure out what to do next. This weekend was all about coding and refining the code that I had up until then.

One of my problems was that the sensor data I received in Pure Data was fluctuating in a quite volatile manner, which made it far from ideal for controlling parameters like delay time or feedback amount. My supervisor suggested applying a kind of filter that smooths the readings. We decided on the moving average method. Based on this method, I added some lines to my code so that the Arduino now takes ten samples of the ultrasonic sensor’s distance readings, sums them up and calculates the average by dividing the sum by ten. I then use this average of ten incoming samples for further computation and MIDI conversion instead of the individual values. I tried it out, and at least in the serial monitor of the Arduino IDE the readings are less volatile. However, I still have to try it out in conjunction with Pure Data and actual audio effect applications. The readings are especially smooth using integer values, which is what I did at first. Since I fear that I might need float values in my further computations, I changed the averaging filter to work with floating-point numbers. Then again, integer values may turn out to be enough – so far I cannot tell.
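The averaging step itself is only a few lines. This is a simplified excerpt of what that addition can look like, not my exact code; readDistanceCm() is just a stub standing in for the actual ultrasonic measurement:

```cpp
const int NUM_SAMPLES = 10;

// Stub standing in for the actual ultrasonic distance measurement.
float readDistanceCm() {
  return 30.0;
}

// Take ten readings, sum them up and return their average.
float readAveragedDistanceCm() {
  float sum = 0.0;
  for (int i = 0; i < NUM_SAMPLES; i++) {
    sum += readDistanceCm();
  }
  return sum / NUM_SAMPLES;   // smoothed value used for the MIDI conversion
}
```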

Next to the moving average filter, another measure taken to smooth the incoming MIDI data was to send the MIDI data less frequently than the sensor outputs its readings. Using the millis() function, I programmed the Arduino to keep track of the time since the program started and to wait 20 ms between MIDI messages sent to my laptop. Of course, the delay should not be too long, to ensure that the parameter changes of audio effects controlled by the incoming data still sound smooth. I think it works, but as with the moving average filter, I still have to confirm the measure’s positive impact in conjunction with Pure Data and actual audio effect applications.
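In code, this rate limiting boils down to a non-blocking timer. Again just a sketch with assumed helper names (the stubs stand in for the smoothed reading and the MIDI conversion):

```cpp
unsigned long lastMidiSendMs = 0;
const unsigned long MIDI_INTERVAL_MS = 20;   // minimum time between MIDI messages

// Stubs standing in for the smoothed sensor reading and the MIDI conversion.
float readAveragedDistanceCm() { return 30.0; }
void sendMidiFromDistance(float distanceCm) { /* MIDI conversion and send */ }

void setup() {}

void loop() {
  float distanceCm = readAveragedDistanceCm();

  // Only send MIDI if at least 20 ms have passed since the last message.
  if (millis() - lastMidiSendMs >= MIDI_INTERVAL_MS) {
    lastMidiSendMs = millis();
    sendMidiFromDistance(distanceCm);
  }
}
```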

Additionally, my supervisor and I also talked about ways to address some problems that I was experiencing in Pure Data. The first problem was massive latency, so we discussed ways to reduce the latency in Pure Data. However, as I focused on the coding aspects this weekend, I have not got round to that part yet. Additionally, there were constant clicks when I used the MIDI data from the ultrasonic sensor to alter parameters of audio effects in Pure Data. Luckily, my supervisor could help me out: apparently, I had (unintentionally) implemented a kind of true-bypass solution that comes with its characteristic popping noise when turned on or off. With the sig~ object and a lowpass filter lop~, I now first convert the control (numeric) signal to an audio signal and then filter it, which should remove the pops.

The Emotional Space | #11 | First Tests

With a little delay, I finally managed to get my hands on some 2.4SINK sensors by Instruments of Things – two of them, to be more exact, together with one receiver, the 2.4SINK Eurorack Module. Unlike the SOMI series that is to be released this summer, the 2.4SINK kit natively only works with control voltages (CVs), to be used in a modular setup (or literally any setup that works). The kit that my university kindly provided, which is currently standing on my desk, sits in a Doepfer casing, where it is connected to a power supply and an Expert Sleepers ES-8 USB audio interface. Furthermore, there is an input expansion module on this rack, the Expert Sleepers ES-6, increasing the interface inputs from just four to ten.

Instruments of Things 2.4SINK Eurorack Module in a Doepfer casing attached to an Expert Sleepers audio interface

Working with Pure Data

I initially planned to use the MIDI data coming in from the sensor + Arduino directly in my DAW of choice, Cubase. This, however, proved to be difficult because either Cubase is not really suited for this kind of external input, or I do not know Cubase well enough to make it work. Although I could select the Arduino as a MIDI input in Cubase, I only managed to assign some parameters to be controlled, and these were not the parameters I needed. Consequently, I abandoned the idea of using my DAW (for now) and decided to try it with Pure Data, since my goal for now is only to try things out and I do not need a perfect setup yet – only a working one XD.

Luckily, I managed to write a patch in Pure Data that receives the MIDI input from the sensor + Arduino and allows me to use it further. With no DAW plug-ins at hand, I had to build my own effects, and with some help from YouTube I made a delay. I subsequently assigned the incoming MIDI data to the delay parameters and played the guitar to see how I could change them through my playing. The parameters controlled by the sensor data were “Delay Time”, “Feedback Amount” and “Delay Volume (dry/wet)” respectively. I made the following observations:

The sensor measurements and consequently the MIDI values fluctuate quite a bit, even when I try not to move my fretting hand at all. This is not ideal, particularly for controlling the “Delay Time” and “Feedback Amount” parameters. Additionally, dropouts occur from time to time, with the MIDI value dropping to zero before going back up to the initial value, which of course ruins the delay settings. I think this problem could be solved by defining certain MIDI value ranges, e.g. from 97 to 103, within which the effect parameter stays the same. This way one could achieve more constant settings.
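One way such a tolerance band could be implemented on the Arduino side – purely my own sketch of the idea, not something I have built yet – is to only pass on a new MIDI value when it differs from the last sent value by more than a few steps:

```cpp
int lastSentValue = -1;          // -1 means nothing has been sent yet
const int JITTER_TOLERANCE = 3;  // e.g. 97..103 keeps the same setting

// Stub standing in for the actual MIDI control change send.
void sendControlChangeValue(int value) {
  Serial.println(value);
}

void sendIfOutsideTolerance(int midiValue) {
  // Ignore small fluctuations around the value that was last sent.
  if (lastSentValue < 0 || abs(midiValue - lastSentValue) > JITTER_TOLERANCE) {
    lastSentValue = midiValue;
    sendControlChangeValue(midiValue);
  }
}
```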

There are annoying clicks when the delay parameters change. I tried to smooth the transitions with a PD “line” object, but the clicking did not really improve. At the moment I do not know what causes this problem or how to deal with it.

I had the greatest success with using the MIDI data to switch the delay on and off depending on whether one is above or below a certain point of the fretboard. I discovered and used the PD “moses” object to turn the delay off for basic chords in the lower region of the fretboard and to turn the delay on when I reach up for higher frets. This setup is not affected by the fluctuation problem since there is more room for error, so to speak: if MIDI values over 50 mean ON, then it does not matter whether the value is 56 or 60 – the effect will stay ON.

To complement this discovery, I made a custom overdrive/distortion effect in Pure Data using the “clip” object. It is not a Tubescreamer but it serves my purpose. I also used the MIDI data as a means to turn the effect on and off and it works really well. There are only some latency issues.

The possibility to turn an effect on above a certain fret and turn it off when below this fret is one important step towards my envisioned “Solo Mode” so I am pretty happy that I made it work and achieved the first goal of my project.

First Steps in the Right Direction

Last week I tested positive for COVID-19 and actually felt ill as well. Therefore, I focused on recovering and put the project on the backburner. With no new developments to write about, I also skipped the weekly blog post. This week, however, I felt much better and continued with the project. I have to admit that the project had not been as fruitful as hoped until now, with little or no progress made even after hours of work. This week, though, things worked out to some extent, and I made a few steps in the right direction.

The first milestone I achieved was writing an Arduino IDE sketch that allows me to use the Arduino as a MIDI output device. Initially, I had borrowed an Arduino UNO from the FH, only to find out that a UNO is not really capable of being used as a MIDI device. I received an Arduino Pro Micro from my supervisor, but the Pro Micro required a specific set-up procedure that appeared quite daunting to a programming noob (= me). Luckily, the FH also had an Arduino Leonardo, which works with MIDI AND is easy to set up, so I decided to use this one. With the right Arduino available, I started working on the code and managed to write a sketch that takes the distance measurements of the ultrasonic sensor and converts them into MIDI messages that I can subsequently use to control parameters in my DAW or Pure Data.
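The essence of such a sketch, assuming the MIDIUSB library that is commonly used with the Leonardo, could look like the excerpt below. The distance reading is stubbed out and the scaling values are just examples, so treat this as a sketch rather than my actual code:

```cpp
#include <MIDIUSB.h>

// Stub standing in for the actual ultrasonic distance measurement (in cm).
int readDistanceCm() {
  return 30;
}

// Send a MIDI control change message over USB.
void sendControlChange(byte channel, byte control, byte value) {
  midiEventPacket_t event = {0x0B, (byte)(0xB0 | channel), control, value};
  MidiUSB.sendMIDI(event);
  MidiUSB.flush();
}

void setup() {}

void loop() {
  int distanceCm = readDistanceCm();
  // Scale an assumed 0..50 cm range to the 0..127 MIDI value range.
  byte midiValue = (byte)constrain(map(distanceCm, 0, 50, 0, 127), 0, 127);
  sendControlChange(0, 1, midiValue);   // CC #1 on MIDI channel 1
  delay(20);
}
```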

The next challenge was to attach the ultrasonic sensor (plus the Arduino) to the neck of the guitar in a way that gives me reliable and useful distance readings of my fretting hand, in order to pinpoint its position along the neck. The problem is that the posture of the fretting hand varies a lot depending on how one plays. The hand’s posture when playing barre chords, for example, is very different from when playing single notes. Therefore, I tried quite a few positions before settling on the one seen in the picture. I even considered dropping the ultrasonic sensor in favor of a different sensor that might be better suited for this kind of measurement. For now, the ultrasonic sensor seems to work, but I will keep other options in mind.

With the ideal position identified, I still had to find a way to attach the sensor to the neck. This proved to be quite difficult because the required position, while suitable for measuring the fretting hand’s distance, is not at all ideal to attach a sensor to. Furthermore, I am no DIY genius, and I was still in quarantine, so I had to make do with the materials I had in my apartment. In the end, I “built” a small box for the ultrasonic sensor out of a Bepanthen package and used a single chopstick to attach it to the guitar’s headstock. I was very much in doubt whether I had really managed to put the sensor in the right position, but to my great astonishment it works well enough to allow for some tone testing.

Welcome to Phase 2 of the project!

Welcome back to my blog series covering the proceedings of my guitar project! After enjoying the semester break, I decided it was time to dedicate myself to my project again. As specified in my exposé, the second phase of the project is concerned with developing working setups that are sufficiently reliable and allow for further practical research regarding suitable effects, playability and performability. The second phase includes tasks like:

  • Determine and acquire the necessary equipment, including an IMU sensor, an ultrasonic sensor and two Arduino microcontrollers
  • Determine ideal placement of sensors and microcontrollers on guitar neck and pick/right hand and install them accordingly
  • Program microcontrollers to pick up the movements of the fretting and picking hands
  • Determine suitable effects and parameters to be triggered by the movements of the fretting and picking hands

On Tuesday, I rented an Arduino UNO starter kit from the FH JOANNEUM and bought a package of ultrasonic sensors, which will be needed for my left-hand guitar setup. I decided to start with the fretting-hand setup using the ultrasonic sensor, since I think it will be easier to implement than the picking-hand setup involving the IMU sensor. Thanks to YouTube university, I was quickly able to code a sketch that allows me to measure distances with the sensor. As outlined in my exposé, the plan is to use the distance data to pinpoint the hand along the neck. The position of the guitarist’s hand is then used as input data to modulate effects.
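For the record, a basic distance sketch for this kind of HC-SR04-style ultrasonic sensor typically looks like the following. The pin numbers are just examples and this is a generic illustration, not necessarily the exact code I ended up with:

```cpp
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Trigger a 10 µs pulse, then measure how long the echo takes to return.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long durationUs = pulseIn(ECHO_PIN, HIGH);     // round-trip time in microseconds
  float distanceCm = durationUs * 0.0343 / 2.0;  // speed of sound ≈ 343 m/s

  Serial.println(distanceCm);
  delay(50);
}
```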

On Thursday I met my new supervisor, DI Marian Weger from the KUG Graz. Although Mr. Ciciliani guided me very well through my ideation and research phase, we decided in the previous semester to part ways, since my current project focus requires a technical, coding approach more than an artistic one. Consequently, I will now be working with Marian. During the kick-off meeting last Thursday, we talked about organizational aspects as well as the project itself. One very interesting issue was raised: I had always envisioned controlling digital effect plugins in a DAW. Marian, however, mentioned that it would also be interesting to him if the setups controlled settings of “analogue” effect pedals. While I think that the modulation possibilities are more limited with actual effect pedals, it is certainly a noteworthy perspective, since my goal is to further extend the range of sonic possibilities of an electric guitar in a non-invasive manner. While I primarily focus on not interrupting the player’s natural hand movements, “non-invasive” could also refer to rig compatibility. The setups would therefore fulfill the working hypotheses even better if they could be used in conjunction with a conventional guitar rig consisting of effect pedals and did not require an additional laptop or other “invasive” measures.

During the meeting, I also received my first “homework”: namely, to come up with a way to convert sensor data to MIDI data and thus turn the Arduino UNO I am working with into a kind of MIDI controller. Unfortunately, after several hours of watching tutorials and sifting through online forums, it turned out that the Arduino UNO is not necessarily the best type of Arduino to work with MIDI. Luckily, I have the possibility to borrow an Arduino Pro Micro from the IEM on Monday, with which I will hopefully be able to achieve the above-mentioned task. With the coding part done, the real fun begins: mounting the ultrasonic sensor to my guitar neck and starting to experiment with different sounds and control parameters. The goal is to find out which hand movements are suitable to trigger which effect or which effect parameter.