Explaining Responsible AI: Why AI sometimes gets it wrong

Microsoft · Published August 1, 2024, 16:00
AI models learn from vast amounts of information, but they sometimes produce hallucinations, also known as “ungrounded content,” by altering the data or adding to it.

Learn more about the tools we have put in place to measure, detect, and reduce inaccuracies and ungrounded content: news.microsoft.com/source/feat...

Subscribe to Microsoft on YouTube here: aka.ms/SubscribeToYouTube

Follow us on social:
LinkedIn: linkedin.com/company/microsoft...
Twitter: twitter.com/Microsoft
Facebook: facebook.com/Microsoft
Instagram: instagram.com/microsoft

For more about Microsoft, our technology, and our mission, visit aka.ms/microsoftstories