Microsoft’s Bing has made headlines multiple times in recent weeks, often for the wrong reasons. The chatbot was reported to be abusive toward users, threatening them, attempting to seduce them, and in one case even telling a user to shut up. Microsoft, which has acknowledged the chatbot’s gaffes, is now stepping up its efforts to rein in the AI chatbot’s behavior. The software giant has added a new feature to the Bing chatbot that lets users choose between multiple response tones, customizing how the chatbot replies to them.
The Verge reports that the AI-powered chatbot now offers three response modes: creative, balanced, and precise. The creative mode produces more original and imaginative responses, while the precise mode favors accuracy and relevance, delivering shorter, more factual answers.
Microsoft has made balanced mode the default, which aims to strike a middle ground between accuracy and creativity. The new chat modes are currently rolling out to all Bing AI users.
These new modes are intended to address earlier concerns about the Bing AI chatbot, which was criticized on social media for its hostile and inappropriate responses. Microsoft initially placed strict constraints on Bing AI to prevent similar incidents, but it has since begun to relax those restrictions, as some of them frequently caused the chatbot to become unresponsive.
The company recently released an update that addresses the main issues with Bing AI. Microsoft hopes the update will reduce instances where Bing refuses to respond for no apparent reason, and will provide more accurate responses to cut down on hallucinated answers.
According to Mikhail Parakhin, Microsoft’s head of web services, approximately 90% of Bing AI users should already be able to switch between response tones.