Google Creative Lab recently released A.I. Duet, an interactive AI music experiment that lets you make music with your laptop keyboard or a MIDI keyboard (via Chrome's Web MIDI support) and experiment with artificial intelligence.
Duet was built by Yotam Mann and the Magenta and Creative Lab teams at Google using TensorFlow, Tone.js, and open-source tools from the Magenta project.
The cool thing about this project is that you can not only play music with it; it's also entirely open source, so if you are into coding you can download the actual code and experiment with it yourself. TensorFlow is an open-source software library for numerical computation using data flow graphs.
Tone.js is a Web Audio framework for creating interactive music in the browser. As Yotam Mann describes it, the architecture of Tone.js aims to be familiar to both musicians and audio programmers looking to create web-based audio applications. At the high level, Tone offers common DAW (digital audio workstation) features like a global transport for scheduling events, plus prebuilt synths and effects. For signal-processing programmers (coming from languages like Max/MSP), Tone provides a wealth of high-performance, low-latency building blocks and DSP modules for building your own synthesizers, effects, and complex control signals.
Yotam also worked on another really interesting AI musical experiment called the Infinite Drum Machine.
Last year at MoogFest, Google announced their plans for Magenta. Doug Eck explained that one of Magenta's goals is to create an open-source tool to bring together artists and coders looking to make art and music in a collaborative space. As part of the initiative, Google will provide audio and video support, tools for MIDI users and platforms that will make it easier for artists to connect with machine learning models.
The Magenta project generated its first song (available below) after being fed only a few notes of input.
Here is a link to all of the Google AI experiments.
Google is not the only company that has created musical artificial intelligence experiments. The goal of Sony's Flow Machines project is to "research and develop Artificial Intelligence systems able to generate music autonomously or in collaboration with human artists."
Here is an example of Bach harmonization generated using deep learning.
By turning music style into a computational object, Sony's research project funded by the European Research Council (ERC) can create songs in different styles. Here is a song generated by Flow Machines in the style of the Beatles.
So what does all this musical artificial intelligence have to do with MIDI? Most of these systems are fed MIDI as their input, because MIDI is the Musical Instrument Digital Interface. For example, the Magenta artificial intelligence engine was fed 8,000 MIDI files that the neural network analyzed for patterns.
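To give a sense of why MIDI is such a convenient input format for machine learning, here is a minimal sketch (this is an illustration, not Magenta's actual pipeline) showing how MIDI note numbers, which encode pitch as integers from 0 to 127, can be turned into a readable token sequence of the kind a model might be trained on:

```python
# MIDI encodes pitch as a number from 0 to 127 (60 = middle C),
# which makes note data easy to feed into a learning system.
# Hypothetical helper names for illustration only.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_note_to_name(note_number):
    """Convert a MIDI note number (0-127) to a pitch name with octave."""
    octave = note_number // 12 - 1      # MIDI convention: note 60 -> C4
    name = NOTE_NAMES[note_number % 12]
    return f"{name}{octave}"

def notes_to_sequence(note_numbers):
    """Turn a list of MIDI note numbers into a list of pitch tokens."""
    return [midi_note_to_name(n) for n in note_numbers]

# The opening notes of "Twinkle Twinkle Little Star" as MIDI note numbers:
print(notes_to_sequence([60, 60, 67, 67, 69, 69, 67]))
# -> ['C4', 'C4', 'G4', 'G4', 'A4', 'A4', 'G4']
```

Because every note is just a small integer plus timing information, thousands of MIDI files can be converted into uniform sequences like this and analyzed for patterns.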
For even more information about musical artificial intelligence, check out this excellent article from Hazel Cills (@hazelcills) on the MTV website.