Back in the 2010s, a website called Let Me Google That For You gained notable popularity for serving a single purpose: snark.
The site lets you generate a custom link that you can send to somebody who asks you a question. When they click the link, it plays an animation of the question being typed into Google. The point is to show the asker how easy it would have been to just look up the answer themselves.
It’s an insult, basically. It’s funny and rude.
Now, there’s nothing wrong with a little rudeness in the right context. If an openly hostile person is wasting your time on social media by asking easily researched questions, I think you should go ahead and enjoy a little passive aggression (as a treat).
In more personal contexts, though, sending a Let Me Google That For You link signals clearly that you don't respect the person you sent it to, and that their question is a waste of your time. If someone from your workplace or your personal life asks you a question, it's because they want your specific input, so it's better to give the answer, ideally with context only you can provide, than to send a link to a Google search results page.
Now, this being 2025, the people behind Let Me Google That For You also offer Let Me ChatGPT That For You, which works exactly the way you think it does. And its existence points to something new: how rude it is to respond to a question with AI output, especially in a more professional context.
Wasting Time
Telling someone to Google something can be funny and satisfying, but it’s not helpful. I’d put copy-pasting or screenshotting a conversation with ChatGPT, Claude, or any other AI agent in the same category: not helpful and kind of rude.
Developer Alex Martsinovich touched on this a while ago in a blog post called it’s rude to show AI output to people: “Be polite, and don’t send humans AI text,” he writes. “My own take on AI etiquette is that AI output can only be relayed if it’s either adopted as your own or there is explicit consent from the receiving party.” I think this is a pretty good framework for AI etiquette.
If someone asks you a question, when they could have asked the machine instead, it’s because they wanted your perspective. The internet exists, at least in theory, so that humans can connect with each other, and so that we can benefit from each other’s knowledge. Responding to a question with AI output ignores this dynamic, especially if you don’t say that’s what you’re doing.
Again: There are situations where you may want to be rude. That doesn’t make it less rude.
No Hallucinations
But there’s another reason sharing AI output is kind of rude: it could be flat-out wrong.
While models are getting better all the time, they still make (occasionally hilarious) mistakes. Sharing the output of a large language model in conversation, without checking it for accuracy yourself, means you could be spreading misinformation.
Even worse, if you don't say that the answer came from AI, you're giving the person you share it with the impression that you can vouch for the information yourself.
There’s Nothing Wrong With Using Tools
None of this is to say that AI isn’t useful, or that you can’t use it to answer questions people ask. But as with Googling, using AI to answer questions isn’t the end of the job—it’s the beginning.
When I use AI as a research tool, it’s a way to find primary sources. I ask for an overview, sure, but I also ask for articles and studies to read on a subject. Then I read those sources myself in order to decide whether the AI’s summary of the subject matter is accurate or not. I’ll likely even follow up with a few of the people involved.
As a journalist, I consider this doing my due diligence. If an editor asks me a question about a statement in an article, I don't just ask ChatGPT. I dig down to primary sources, or as close as I can get, and point to them. I'm sure your profession has a similar standard of due diligence.
AI can be an amazing starting point for research, but you have more to offer than a starting point. Use these tools to show what you can do, not as an alternative to doing things.




