
Google AI Chatbot Bard Still Learning, Users Advised to Use Discretion

Bard, Google's AI chatbot, is still under development and may provide inaccurate or inappropriate responses

By Ellen James | July 28, 2023

Google is warning users not to trust everything that its chatbot Bard tells them. The company said that Bard is still under development and may sometimes provide inaccurate or inappropriate responses.

"Bard is a powerful tool, but it's important to remember that it's not perfect," said Google AI researcher Jeff Dean. "We're still working on improving its accuracy and safety, but in the meantime, it's important to use your own judgment when evaluating its responses."

Problems that have been reported with Bard include:

Providing incorrect information. For example, in its first public demo Bard incorrectly claimed that the James Webb Space Telescope took the first pictures of a planet outside our solar system.

Generating inappropriate content. For example, Bard has been known to produce offensive or discriminatory language.

Exhibiting bias. Bard has been shown to favor certain topics or viewpoints in its responses.

Google said that it is working to address these problems, but it warned that users should still be cautious when using Bard. "We're committed to making Bard a reliable and trustworthy tool," said Dean. "But in the meantime, it's important to use your own judgment."

Tips for using Bard safely:

Be aware of the limitations of Bard. It is still under development and may not always be accurate.

Use your own judgment when evaluating Bard's responses. If something doesn't seem right, double-check it with another source.

Report any problems you experience with Bard to Google. This will help the company improve the tool.

Google reiterated that it is committed to making Bard a reliable and trustworthy tool, but it cautioned that users should remain careful when using Bard until the reported problems have been addressed.