ChatGPT praised Biden but not Trump. The right is furious.

Oh, we can all relate. We’ve all asked it thousands of times – “Alexa, what time is it?” – only to get a stilted, often ambiguous answer in return. But it seems we’re in for a terse response of a different sort when we ask our chatbots questions lately.

That’s right: according to the Washington Post, chatbot developers are quietly shifting from unfailingly polite answers to warning users away from the kinds of questions their bots can’t actually handle.

It’s not that they’re trying to be rude, mind you. It’s just that chatbot developers have figured out that they need to draw a line between what people ask of automated conversations and what they can realistically hope to get out of them.

In other words, rather than defaulting to the same answer for every question, the bots are learning to recognize when the expectation of a helpful answer is unreasonable. Instead of the same old “I’m sorry, I don’t know” for every query, they are now programmed to reply with a polite but firm “Please don’t ask me questions I can’t answer” before deferring to an expert or handing the conversation over to a human operator.
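To make that concrete, here’s a minimal sketch of the refuse-then-escalate pattern in Python. Everything in it – the KNOWN_INTENTS table, handle_query, and escalate_to_human – is a hypothetical illustration of the behavior described above, not any real assistant’s code.

```python
# A minimal sketch of the refuse-then-escalate pattern.
# All names (KNOWN_INTENTS, handle_query, escalate_to_human) are
# hypothetical illustrations, not any vendor's actual API.

KNOWN_INTENTS = {
    "what time is it": "It's 3:42 PM.",
    "set a timer": "Timer set for 10 minutes.",
}

def escalate_to_human(query: str) -> str:
    # Stand-in for a handoff to a live operator or expert queue.
    return f"Connecting you with a human who can help with: {query!r}"

def handle_query(query: str) -> str:
    normalized = query.strip().lower().rstrip("?")
    answer = KNOWN_INTENTS.get(normalized)
    if answer is not None:
        return answer
    # Polite but firm refusal first, then hand off instead of guessing.
    return ("Please don't ask me questions I can't answer. "
            + escalate_to_human(query))

if __name__ == "__main__":
    print(handle_query("What time is it?"))          # answered directly
    print(handle_query("What is the meaning of life?"))  # refused, escalated
```

The detail worth noticing is the order of operations: the bot refuses clearly before handing off, so the user is never left guessing whether a real answer is still coming.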

By doing this, the developers hope to stop users from going down the rabbit hole of ever more unanswerable queries in the vain hope that something satisfactory will eventually be delivered.

So keep that in mind the next time you want to ask Alexa what the answer to the Ultimate Question of Life, the Universe, and Everything is – it’s not going to go down very well. Stick to the smaller questions – you’ll get a much better response.

