Music synchronization

Music synchronization is a feature of the Controller Core that makes effects truly dynamic and alive: actions can be triggered by different elements of the music. When a music synchronization starts, actions execute in three new scenarios:

  • Beat
  • Track segment
  • Section

Currently the Controller Core only supports Spotify's audio analysis output. This is a JSON document that can be queried from the Spotify Web API.
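As a rough sketch of how that query looks, the analysis for a track can be fetched with a single authenticated GET request. The endpoint path is the Spotify Web API's audio-analysis endpoint; the track ID and access token below are placeholders you must supply yourself, and the helper names are illustrative, not part of the Controller Core.

```python
# Sketch: querying the audio analysis from the Spotify Web API.
# The access token and track ID are placeholders.
import json
import urllib.request

API_BASE = "https://api.spotify.com/v1"

def analysis_url(track_id: str) -> str:
    # Audio-analysis endpoint for a single track.
    return f"{API_BASE}/audio-analysis/{track_id}"

def fetch_audio_analysis(track_id: str, access_token: str) -> dict:
    req = urllib.request.Request(
        analysis_url(track_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```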

The output looks like this but with a lot more data:

{
  "bars": [
    {
      "start": 251.98282,
      "duration": 0.29765,
      "confidence": 0.652
    }
  ],
  "beats": [
    {
      "start": 251.98282,
      "duration": 0.29765,
      "confidence": 0.652
    }
  ],
  "sections": [
    {
      "start": 237.02356,
      "duration": 18.32542,
      "confidence": 1,
      "loudness": -20.074,
      "tempo": 98.253,
      "tempo_confidence": 0.767,
      "key": 5,
      "key_confidence": 0.327,
      "mode": 1,
      "mode_confidence": 0.566,
      "time_signature": 4,
      "time_signature_confidence": 1
    }
  ],
  "segments": [
    {
      "start": 252.15601,
      "duration": 3.19297,
      "confidence": 0.522,
      "loudness_start": -23.356,
      "loudness_max_time": 0.06971,
      "loudness_max": -18.121,
      "loudness_end": -60,
      "pitches": [
        0.709, 0.092, 0.196, 0.084, 0.352, 0.134, 0.161, 1, 0.17, 0.161, 0.211, 0.15
      ],
      "timbre": [
        23.312, -7.374, -45.719, 294.874, 51.869, -79.384, -89.048, 143.322, -4.676, -51.303, -33.274, -19.037
      ]
    }
  ],
  "tatums": [
    {
      "start": 251.98282,
      "duration": 0.29765,
      "confidence": 0.652
    }
  ]
}
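A minimal sketch of how such a payload can be flattened into one time-ordered queue of rhythm events, using the sample values above. The event-type names ("beat", "section", "segment") and the function name are illustrative assumptions; the Controller Core's internal representation may differ.

```python
# Sketch: flatten a Spotify audio-analysis payload into a single
# queue of rhythm events, sorted by start time.
from collections import deque

def build_event_queue(analysis: dict) -> deque:
    events = []
    for kind in ("beats", "sections", "segments"):
        for item in analysis.get(kind, []):
            # kind[:-1] turns "beats" into "beat", and so on.
            events.append({"type": kind[:-1], "start": item["start"]})
    events.sort(key=lambda e: e["start"])
    return deque(events)

# Trimmed-down version of the sample payload above.
analysis = {
    "beats": [{"start": 251.98282, "duration": 0.29765, "confidence": 0.652}],
    "sections": [{"start": 237.02356, "duration": 18.32542}],
    "segments": [{"start": 252.15601, "duration": 3.19297}],
}
queue = build_event_queue(analysis)
# The section at 237.02 s comes first, then the beat, then the segment.
```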

The Controller Core takes this analysis as input, parses all the beat, segment, and section information, and stores it in a queue. Besides the audio analysis, the Core needs one more piece of information: the current track progress (the elapsed time since the beginning of the track). It removes all rhythm events that have already passed in the music and starts a timer for the next one. When the timer fires, it calls a callback function that adds an action to a queue; that action is then executed on the next Tick.
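The scheduling step described above can be sketched as follows. Events whose start time is before the current progress are dropped, and a timer is armed for the next one. This is a simplified sketch: the `threading.Timer` and the `schedule_next` helper are assumptions standing in for the Core's own timer and Tick loop.

```python
# Sketch: drop already-passed rhythm events and arm a timer for the
# next one. The callback stands in for "add an action to the queue
# executed on the next Tick".
import threading
from collections import deque

def schedule_next(queue: deque, progress_s: float, callback):
    # Discard every event whose start time has already passed.
    while queue and queue[0]["start"] <= progress_s:
        queue.popleft()
    if not queue:
        return None  # nothing left to synchronize to
    event = queue.popleft()
    delay = event["start"] - progress_s
    threading.Timer(delay, callback, args=(event,)).start()
    return delay
```

When the callback fires, the Core would re-arm the timer for the following event, so the queue drains in step with the music.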

The result is that actions are executed at the same moment these rhythm events occur in the music. With this feature the effects can be truly spectacular.

Known Issues

  • The Core does not react to changes in playback. For example, if the track is paused or seeked, this cannot be signaled to the Core. The main reason is that the Spotify Web API currently offers no way for the application to receive such events. There is a community request for this, but until it is addressed, the only workaround would be polling, which does not scale because of rate limiting. As a result, the Core receives the audio analysis and the track progress once, at the start of the synchronization, and keeps emitting actions until the queue is empty, even if the music is no longer playing.