Google has begun rolling out new AI features to Gemini Live that allow it to “see” your screen or through your smartphone’s camera and answer questions about either in real time, Google spokesperson Alex Joseph confirmed in an email to The Verge. The features arrive nearly a year after Google first demonstrated the “Project Astra” work that powers them.
A Reddit user said the feature appeared on their Xiaomi phone, as spotted by 9to5Google. Today, the same user posted a video demonstrating Gemini’s new screen-reading ability. It’s one of two features Google said in early March “would begin rolling out to Gemini Advanced subscribers as part of the Google One AI Premium plan” later in the month.
The other Astra capability now rolling out is live video, which lets Gemini interpret the feed from your smartphone’s camera in real time and answer questions about it. In a demonstration video Google published this month, a person uses the feature to ask Gemini for help choosing a paint color for their freshly glazed pottery.
