Google can now guess the song you hum!

Nikita Gawde
3 min read · Nov 7, 2020
You have a tune stuck in your head, but you just can't remember the song's name, the artist or any of the lyrics. Does this sound familiar to you?

Google has answered this prayer for sooo many of us. On October 15, 2020, a new interactive feature called "Hum to Search" was announced at Google's Search On event. It is an evolution of the previous Sound Search feature, which could recognize a song playing in the background on Google Pixel phones and pop up the result. Now, using AI and convolutional neural networks, you can simply HUM whatever part of the song you remember for about 15 seconds and similar song results will show up. How cool is that?

So how do you use it?

Here's how you can use this feature (it's currently available only on Android phones): use Google's search widget, or fire up the Assistant and simply say "Hey Google, what's this song?" Then hum your tune. After you finish humming, the songs most relevant to the tune you hummed pop up.

How does it work?

After you're finished humming, the machine learning algorithm helps identify potential song matches. You can then select the best match and explore information about the song and artist, view any accompanying music videos, listen to the song on your favorite music app, find the lyrics, read analysis, and even check out other recordings of the song when available.

So how does it work? An easy way to explain it is that a song's melody is like its fingerprint: each one has its own unique identity. Google has built machine learning models that can match your hum, whistle or singing to the right "fingerprint."

(Image: a song's melodic "fingerprint," from The New York Times)

When you hum a melody into Search, the machine learning models transform the audio into a number-based sequence representing the song’s melody. The models are trained to identify songs based on a variety of sources, including humans singing, whistling or humming, as well as studio recordings. The algorithms also take away all the other details, like accompanying instruments and the voice’s timbre and tone. What we’re left with is the song’s number-based sequence, or the fingerprint.
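
To make that "number-based sequence" idea a bit more concrete, here's a minimal sketch of how a hummed recording could be boiled down to a pitch contour. This is purely illustrative and is not Google's actual model: it assumes the librosa library, a hypothetical file called hum.wav, and a classical pitch tracker standing in for the trained neural networks.

```python
# Toy illustration of turning a hummed recording into a "number-based
# sequence" (a pitch contour). Google's real system uses trained neural
# networks; this sketch just uses librosa's pitch tracker to show the idea.
import numpy as np
import librosa

def hum_to_sequence(path, hop_length=512):
    # Load the recording as mono audio (hypothetical file name).
    y, sr = librosa.load(path, sr=22050, mono=True)

    # Estimate the fundamental frequency (the melody) frame by frame,
    # ignoring timbre, instruments and loudness.
    f0, voiced, _ = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),
        fmax=librosa.note_to_hz("C6"),
        sr=sr,
        hop_length=hop_length,
    )

    # Keep only voiced frames and convert Hz to semitones, then center,
    # so the key you hummed in doesn't matter, only the melody's shape.
    f0 = f0[voiced]
    semitones = 12 * np.log2(f0 / 440.0)
    return semitones - np.nanmean(semitones)

# sequence = hum_to_sequence("hum.wav")  # hypothetical 15-second hum
```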

Google compares these sequences to thousands of songs from around the world and identifies potential matches in real time. For example, if you listen to Tones and I's "Dance Monkey," you'll recognize the song whether it was sung, whistled, or hummed. Similarly, the machine learning models recognize the melody of the studio-recorded version of the song, which can then be matched against a person's hummed audio.
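
Here's an equally rough sketch of the matching step: the hummed contour is compared against a tiny, made-up catalog of reference contours using dynamic time warping (DTW), so differences in tempo between your hum and the studio version are tolerated. Google's real system matches against a catalog of millions of songs with learned models; this only shows the intuition.

```python
# Toy matcher: compare the hummed contour against a small, invented
# "catalog" of reference contours with dynamic time warping (DTW).
import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a) * len(b)) dynamic time warping on 1-D sequences.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # normalize by sequence lengths

def best_matches(hum_seq, catalog, top_k=3):
    # catalog: dict mapping song title -> reference pitch sequence,
    # built from studio recordings processed the same way as the hum.
    scores = {title: dtw_distance(hum_seq, ref) for title, ref in catalog.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])[:top_k]

# Example with fake reference contours (names here are made up):
# catalog = {"Dance Monkey": ref_seq_1, "Some Other Song": ref_seq_2}
# print(best_matches(sequence, catalog))
```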

This builds on the work of Google's AI Research team on music recognition technology. Google launched Now Playing on the Pixel 2 in 2017, using deep neural networks to bring low-power music recognition to mobile devices. In 2018, the same technology was brought to the Sound Search feature in the Google app, expanding its reach to a catalog of millions of songs. Google spent most of 2017 and 2018 building the technology to give songs a unique fingerprint. This new experience takes it a step further, because songs can now be recognized without the lyrics or the original recording. All it needs is a hum!

References

  1. https://blog.google/products/search/hum-to-search/
  2. https://www.cnet.com/news/google-assistant-just-got-way-better-at-recognizing-which-songs-are-playing/
  3. NY Times article on song fingerprint
