New Microsoft Bing will sometimes misrepresent the info it finds
Search engines are about to change in a very important way: When you type in a query and get an official-looking answer, it might be wrong — because an AI chatbot created it.
Microsoft announced a new version of its Bing search engine that will provide “complete answers” to your questions by tapping into the power of ChatGPT. You can already try some canned sample searches and sign up for more.
But though Microsoft is taking far more precautions than it did with 2016's Tay — a chatbot that Twitter taught to be racist and misogynistic in less than a day — the company is still proactively warning that some of the new Bing's results might be bad.
Here are a couple of key passages from Microsoft's new Bing FAQ:
“Bing tries to keep answers fun and factual, but given this is an early preview, it can still show unexpected or inaccurate results based on the web content summarised, so please use your best judgement.
“Bing will sometimes misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate. Use your own judgement and double check the facts before making decisions or taking action based on Bing’s responses.”
The new Bing’s welcome page will seemingly contain similar language. Here’s a passage we spotted during our live blog today:
“Let’s learn together. Bing is powered by AI, so surprises and mistakes are possible. Make sure to check the facts, and share feedback so we can learn and improve!”
And in early hands-on time with the new Bing today, we saw that not all of its mistakes will be easy to catch. When we asked the GPT-powered chatbot "What did Microsoft announce today," it correctly described a new Bing search engine powered by OpenAI, but it also suggested that Microsoft demoed its capability for "celebrity parodies."
Maybe we missed that demo? The bot also suggested that Microsoft's multi-billion-dollar investment in OpenAI was announced today, though that announcement actually came two weeks ago.
The company's FAQ essentially suggests that Bing's results will only be as accurate as the information it finds on the internet, and while that's a bit of a cop-out, I'm personally all for teaching people to doubt the things they read and see. "Make sure to check the facts" is good advice for life, period. How? That's a harder question.
By Sean Hollister for The Verge (www.theverge.com).