GoToRO 2 days ago

They do. Recently I was pleasantly surprised by Gemini telling me that what I wanted to do would NOT work. I was in disbelief.

sgtnoodle 2 days ago

I asked Gemini to format some URLs into an XML format. It got halfway through and gave up. I asked if it truncated the output, and it said yes and then told _me_ to write a python script to do it.

GoToRO 2 days ago

That's a different kind of push back.

walls 1 day ago

This is my most common experience with Gemini. Ask it to do something, it'll tell you how you can do it yourself and then stop.

edoloughlin 1 day ago

Given that Gemini seems to have frequent availability issues, I wonder if this is a strategy to offload low-hanging fruit (from a human-effort pov) to the user. If it is, I think that's still kinda impressive.

ASalazarMX 1 day ago

Somehow I like this. I hate that current LLMs act like yes-men, you can't trust them to give unbiased results. If it told me my approach is stupid, and why, I would appreciate it.

danielbln 2 days ago

I've noticed Gemini pushing back more as well, whereas Claude will just butter me up and happily march on unless I specifically request a critical evaluation.

kelvinjps10 2 days ago

My experience as well.

captainkrtek 2 days ago

Interesting, can you share more context on the topic you were asking it about?

GoToRO 2 days ago

coding in a stack I didn't bother to learn first (android)