Google introduces Daily Listen, AI audio summaries for your news feed

Updated on 10-Jan-2025
HIGHLIGHTS

Daily Listen will give users a detailed AI-generated transcript and accompanying cover art.

Right now, this feature is available only in the US for both Android and iOS users.

In addition to this, Google also upgraded Audio Overviews.

Google is working on a new AI feature called Daily Listen, currently available in Search Labs. Once it rolls out to the general public, it will deliver five-minute audio summaries of stories and topics that interest the individual user, drawing on data from users’ Discover feeds and search histories to curate content that resonates with them. Each episode also comes with a detailed AI-generated transcript and accompanying cover art.

If you want to give it a try, here’s how to use it.

How to use Google’s Daily Listen

Right now, the Daily Listen feature is available only in the US for both Android and iOS users. You can access it via the Search Labs section of the Google app by tapping the triangular beaker icon in the top-left corner.

Once you enable the feature, a ‘Made for you’ Daily Listen card appears under the Space carousel beneath the Google Search bar. Tapping the card launches a full-screen audio player.

You then get options to play, pause, rewind, or mute the audio. You can also give the content a thumbs up or down; this feedback helps refine the personalisation.

Google previously introduced NotebookLM’s Audio Overviews in September 2024, which can transform documents into roughly 10-minute podcasts hosted by AI. The hosts summarise uploaded content, establish thematic connections, and engage in conversational exchanges.

In addition to this, Google also upgraded Audio Overviews by enabling users to actively participate in the discussions between AI hosts. Users can “call in” to the podcast, asking for clarifications, alternative explanations, or additional details. “It’s like having a personal tutor or guide who listens attentively and then responds directly, drawing from the knowledge in your sources,” Google explained.

These features work on AI models like Gemini 1.5 Pro and the experimental Gemini 2.0 Flash.

Mustafa Khan

Mustafa is new on the block and is a tech geek currently working with Digit as a News Writer. He tests the new gadgets that come on board and writes for the news desk. He has a way with words, and you can count on him when in need of tech advice. No judgement. Based out of Delhi, he’s your person for good photos, good food recommendations, and anything Gen Z.
