Meanwhile I just tried to have Gemini AI on my Android read the screen to add an event to my calendar: it can't do it. It could, a year or so ago, as several articles noted at the time. It no longer can.

God this is so annoying. The actual functionality we need is not there or is half-assed.

Someone on HN a few months ago said he gave up and decided to try Copilot in Outlook, which Outlook kept nagging him to do. He tried the example prompt the nag screen gave him, whatever it was, and Copilot said "sorry, I don't have that functionality" or something.

Not only is the actual functionality people want missing, but even the functionality they're nagging us to use is missing.

I'm not sure whether it's more frustrating or just laughably absurd how often I have experiences like this: an LLM chatbot (mostly Gemini) or other AI tool gives me sample prompts to click and test (to show off its capabilities, offer inspiration, etc.), and it fails right off the bat.

Of all things, you'd think they'd at least invest some time in quality control on the demo options lol.

It's the new assistant on my phone, but it can't even set a timer or alarm when I ask it to. Gave up on it after that.

Yes, the first thing I asked an "AI Phone Assistant" to do was set an alarm. It didn't even try and fail, it rejected the request entirely.

Same here! It's even worse than Siri was!

You probably have a custom domain Google account? They have Gemini locked down and barely able to do anything.

Switch to a consumer Gmail account and loads of Google features start working.

I do, indeed. It used to work, so while I get why they would strip some functionality out of Workspace accounts, they should at least communicate that properly.

I dropped a screenshot and it worked great. Like a screenshot of sports practices.

This worked for me just a few weeks ago.

Try now. I tried several times with different types of content/apps displayed, to no avail. It analyzes the screen and tells me whatever Gemini would say, instead of actually doing it.

I just tried it now, it worked. Specifically, I long-pressed the home button to pull up Gemini, pressed the "+" to add screen content, and said "add this event to my Google calendar". Confirmed it worked by opening my Google calendar.

Someone said it no longer works for Workspace accounts, which is what I use. It used to work, though, and it wasn't communicated when it stopped.

Plus the random decision to split Google Assistant functions off from the bottom search bar. I still reflexively tap that bar with its mic button to ask the assistant to do something, only to have it run a Google search. That's leaving aside all the random things that worked rather well in Assistant until they started pushing Gemini; can't think of a reason those should correlate (/s).

I bet that bottom mic is a different team...

Also, the homepage search widget, the app drawer search, and the Chrome address bar search are three nearly identical experiences, yet with enough differences to be painful. Either unify them or make them distinct!

It acts like a separate team now, but it used to behave as if it were one team: the bar was an alternative assistant trigger, and you could even type to the assistant if you couldn't or didn't want to speak. Now that's basically only available via the "Hey/Ok Google" wake words, and at best the bottom search bar uses the Google homepage AI.

I think this is precisely what made it not work anymore.