Bing Chat works in a straightforward way, at least on the user’s end: you ask it a question, and it scrapes the internet for related information and recompiles it into a seemingly original response to your query. You can, for example, ask the Bing AI to give you recipes that meet specific requirements — or, as demonstrated by Microsoft, you can ask it to compare more than one product. In a post on his blog, software engineer Dmitri Brereton highlighted some of the mistakes Bing made when comparing pet vacuums, not to mention offering false details about bars in Mexico and incorrectly summarizing a quarterly financial report.
At the heart of the matter is what appears to be entirely fictional information generated by the chatbot, which cited sources for claims those sites never actually made. Brereton dug through the Bing Chat answers given during the demonstration and found everything from “cons” about a particular pet vacuum model that didn’t reflect actual reviews to a financial summary containing figures that don’t appear in the actual report. Bing reported the wrong numbers, said that a cordless vacuum had a 16-foot cable, and recommended nightlife destinations in Mexico that may be quite different in reality from how the bot described them.