Microsoft Introduces Red Teaming Tool for Generative AI

Microsoft-Introduces-Red-Teaming-Tool-for-Generative-AI_

Microsoft has announced the release of PyRIT, an open-source red teaming tool for security professionals and machine learning engineers to identify risks in generative AI.

PyRIT, according to Microsoft, improves audit efficiency by automating tasks and flagging areas that require further investigation, effectively supplementing manual red teaming. The tech giant notes that red-teaming generative AI differs from probing classical or traditional AI systems.

Primarily because it requires identifying both security risks and responsible AI risks, generative AI is more probabilistic, and there are wide variations in generative AI system architecture. PyRIT was created in response to our belief that sharing AI red teaming resources across industries benefits everyone.

Read More: Microsoft Releases Red Teaming Tool for Generative AI

Check Out The New ITsecuritywire Podcast. For more such updates follow us on Google News ITsecuritywire News.