You can now find a song with Google Search

Google has announced updates to its search function, adding a new feature and applying artificial intelligence and machine learning to improve the user experience.

Users can now whistle, hum, or sing a melody to Google via the mobile app by tapping the mic icon and saying "what's this song?", or by tapping the 'Search a song' button.

Humming for 10-15 seconds gives Google's machine-learning algorithm enough audio to match the melody to a song.
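Google has not published the details of how this matching works, but conceptually it amounts to turning the hummed audio into a compact melody "fingerprint" and comparing it against fingerprints of known songs. The sketch below is a minimal illustration of that comparison step only, using random vectors and cosine similarity; the function names and 128-dimensional fingerprints are assumptions for the example, not Google's actual pipeline.

```python
import numpy as np

# Hypothetical illustration: rank catalog songs by how closely their melody
# fingerprints match the fingerprint of a hummed query. The real feature's
# models and features are not public; this only sketches the matching step.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two fingerprint vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_hum(hum_fingerprint: np.ndarray,
              catalog: dict[str, np.ndarray],
              top_k: int = 3) -> list[tuple[str, float]]:
    """Return the top_k catalog songs most similar to the hummed fingerprint."""
    scores = [(title, cosine_similarity(hum_fingerprint, fp))
              for title, fp in catalog.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A toy catalog of 1,000 songs with random 128-dimensional fingerprints.
    catalog = {f"song_{i}": rng.normal(size=128) for i in range(1000)}
    # Simulate a noisy hum of song_42 and look it up.
    hum = catalog["song_42"] + 0.1 * rng.normal(size=128)
    print(match_hum(hum, catalog))
```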

The new feature is currently available in English on iOS, and in approximately 20 languages on Android, with more languages coming to both platforms in the future.

Google's artificial intelligence updates also extend to spelling and general search queries, including a new spelling algorithm that uses a deep neural net, which Google claims significantly improves its ability to decipher misspellings.

The search giant said it wanted to focus on spelling because one in 10 queries is misspelt. Google claims this single change improves spelling more than all of its progress over the previous five years. The new algorithm makes it easier to understand the context of misspelt words, so Google can find the right results in under three milliseconds.
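For contrast, the snippet below is not Google's neural speller (which is proprietary) but a tiny dictionary-lookup baseline built on edit-distance-style matching from Python's standard library. It shows the kind of word-by-word correction that a context-aware neural model improves upon; the example vocabulary and query are invented for illustration.

```python
import difflib

# A naive spelling baseline: snap each word to its closest dictionary entry,
# with no awareness of the surrounding context. A neural speller, by contrast,
# can use the whole query to choose between plausible corrections.

vocabulary = ["dinner", "ideas", "with", "chicken", "recipe", "restaurant"]

def naive_correct(word: str) -> str:
    """Return the closest vocabulary word, or the word itself if nothing is close."""
    matches = difflib.get_close_matches(word.lower(), vocabulary, n=1, cutoff=0.6)
    return matches[0] if matches else word

query = "dinnner ideaas with chickn"
corrected = " ".join(naive_correct(word) for word in query.split())
print(corrected)  # dinner ideas with chicken
```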

Google also reported that its BERT language understanding system is now used in almost every English-language query, helping users receive higher-quality results for their questions.
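Google has not disclosed how BERT is wired into its ranking systems. As a rough illustration of what "language understanding" means here, the sketch below uses the open-source bert-base-uncased model from Hugging Face's transformers library to turn a query into contextual embeddings that a downstream ranker could consume; the model choice and mean-pooling step are assumptions for the example, not Google's production setup.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative only: encode a search query with an open-source BERT model.
# Each token gets a context-dependent vector, so words like "to" or "for"
# are represented differently depending on the rest of the query.

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

query = "can you get medicine for someone pharmacy"
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings into a single query vector (a simple,
# common pooling choice for downstream retrieval or ranking).
query_vector = outputs.last_hidden_state.mean(dim=1).squeeze(0)
print(query_vector.shape)  # torch.Size([768])
```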

Google also said that, with the help of artificial intelligence, it can now understand the deep semantics of a video and automatically identify key moments.

Google Lens can now identify 15 billion things

The company also announced new ways to use Google Lens and augmented reality while learning and shopping.

Searching by taking a photo or screenshot was already possible, but now users can tap and hold an image in the Google app or Chrome on Android, and Lens will find the exact or related items. The feature is coming soon to the Google app on iOS.

Google Lens can now identify 15 billion things, helping it recognise plants, animals, landmarks, and more. It can also translate more than 100 languages.

Google also added a Live View feature to help you keep your distance: it gives you essential information about a business before you even step inside.

Coronavirus safety information will now be placed on business profiles across Google Search and Maps.

The new feature will tell you whether a business requires you to wear a mask, whether you need to make a reservation, and whether it is taking additional safety precautions, such as temperature checks.
