Cybersecurity Snapshot: NIST Program Assesses How AI Systems Will Behave in the Real World, While FBI Has Troves of Decryption Keys for LockBit Victims

From tenable.com

Check out the new ARIA program from NIST, designed to evaluate whether an AI system will be safe and fair once it’s launched. Plus, the FBI offers to help LockBit victims with thousands of decryption keys. In addition, Deloitte finds that boosting cybersecurity is key to generative AI deployment success. Also, why identity security is getting harder. And much more!

1 – NIST program will test safety, fairness of AI systems

Will that artificial intelligence (AI) system now in development behave as intended once it’s released, or will it go off the rails?

It’s a critical question for vendors, enterprises and individuals developing AI systems. To help answer it, the U.S. government has launched an AI testing and evaluation program.

Called Assessing Risks and Impacts of AI (ARIA), the program from the National Institute of Standards and Technology (NIST) will make a “sociotechnical” assessment of AI systems and models.

That means ARIA will determine whether an AI system will be valid, reliable, safe, secure, private and fair once it’s live in the real world.

“In order to fully understand the impacts AI is having and will have on our society, we need to test how AI functions in realistic scenarios – and that’s exactly what we’re doing with this program,” U.S. Commerce Secretary Gina Raimondo said in a statement.

Read more…