- Validity and reliability
- Security
- Safety and resiliency
- Accountability and transparency
- Explainability and interpretability
- Privacy
- Fairness with mitigation of harmful bias
To analyze the current landscape of responsible AI across the enterprise, MIT Technology Review Insights surveyed 250 business leaders about how they're implementing principles that ensure AI trustworthiness. The poll found that responsible AI matters to executives, with 87% of respondents rating it a high or medium priority for their organization.
A majority of respondents (76%) also say that responsible AI is a high or medium priority specifically for creating a competitive advantage. But relatively few have figured out how to turn these ideas into reality. We found that only 15% of those surveyed felt highly prepared to adopt effective responsible AI practices, despite the importance they placed on them.

Putting responsible AI into practice in the age of generative AI requires a series of best practices that leading companies are adopting. These practices can include cataloging AI models and data and implementing governance controls. Companies may benefit from conducting rigorous assessments, testing, and audits for risk, security, and regulatory compliance. At the same time, they should also empower employees with training at scale and ultimately make responsible AI a leadership priority to ensure their change efforts stick.
“We all know AI is the most influential change in technology that we’ve seen, but there’s a huge disconnect,” says Steven Hall, chief AI officer and president of EMEA at ISG, a global technology research and IT advisory firm. “Everybody understands how transformative AI is going to be and wants strong governance, but the operating model and the funding allocated to responsible AI are well below where they need to be given its criticality to the organization.”
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.