CSAIL Deep-Learning System can Isolate Instruments in Videos #MusicMonday

This is crazy cool news! From MIT News:

Amateur and professional musicians alike may spend hours poring over YouTube clips to figure out exactly how to play certain parts of their favorite songs. But what if there were a way to play a video and isolate only the instrument you wanted to hear?

That’s the outcome of a new AI project out of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL): a deep-learning system that can look at a video of a musical performance, isolate the sounds of specific instruments, and make them louder or softer.

The system, which is “self-supervised,” doesn’t require any human annotations on what the instruments are or what they sound like.

Trained on over 60 hours of videos, the “PixelPlayer” system can view a never-before-seen musical performance, identify specific instruments at the pixel level, and extract the sounds that are associated with those instruments.
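The core audio trick in systems like this is spectrogram masking: a network predicts, for each visual source, a soft mask that is multiplied against the mixture spectrogram to pull out that instrument's sound. The sketch below is a toy illustration of that masking step only, with made-up synthetic data in place of a real network and real audio; none of the names or shapes come from the PixelPlayer paper.

```python
import numpy as np

# Hypothetical sketch of the separation step used by mask-based systems:
# a per-source soft mask applied to the mixture spectrogram.
# All names, shapes, and data here are illustrative assumptions.

rng = np.random.default_rng(0)

# Toy "mixture" spectrogram: sum of two synthetic sources
# (frequency bins x time frames).
source_a = np.abs(rng.normal(size=(64, 100)))   # e.g. violin energy
source_b = np.abs(rng.normal(size=(64, 100)))   # e.g. flute energy
mixture = source_a + source_b

# In the real system, a network predicts this mask from video pixels;
# here we compute the ideal ratio mask for source A directly.
mask_a = source_a / (mixture + 1e-8)

# Multiplying the mask into the mixture (approximately) isolates source A.
isolated_a = mask_a * mixture

print(np.allclose(isolated_a, source_a, atol=1e-4))  # True
```

In practice the isolated spectrogram would then be inverted back to a waveform (e.g. via an inverse STFT), and turning an instrument "louder or softer" amounts to scaling its masked component before re-mixing.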

Read more and see more from MITCSAIL on YouTube



from Adafruit Industries – Makers, hackers, artists, designers and engineers! https://ift.tt/2KVq8wp
via IFTTT