NIST releases open-source platform for AI safety testing

The National Institute of Standards and Technology (NIST) has released a new open-source software tool for testing the resilience of machine learning (ML) models to various types of attacks. The tool, known as Dioptra, was released Friday alongside new AI guidance from NIST, marking the 270th day since President Joe Biden’s Executive Order on the Safe, Secure and Trustworthy Development of AI was signed.

Source: SC Magazine
