Dr. Alondra Nelson is a leading expert on artificial intelligence and its profound influence on culture, society, and politics. She is the Harold F. Linder Professor at the Institute for Advanced Study and a former Deputy Assistant to President Joe Biden and Acting Director of the White House Office of Science and Technology Policy.
Dr. Nelson visited Brown on Thursday to speak on the future of AI governance and policy, in conversation with Dr. Suresh Venkatasubramanian of the Brown Data Science Institute and Center for Technological Responsibility, Re-imagination and Redesign and Dr. Prudence Carter of the Brown Center for the Study of Race and Ethnicity in America.

Dr. Nelson explored how society can embrace cutting-edge technologies like AI while safeguarding rights and encouraging responsible innovation. Drawing on current news and data, she provided actionable insights for navigating this dynamic landscape and fostering the public good.

Dr. Nelson argued for the necessity of policy innovation in AI governance, which she described as "the assemblage of laws, rules, policies, norms and standards that we can leverage to mitigate the current and future harms of AI development and deployment to steward potential," across the board, not just in governmental institutions.
She shared her six principles for governing the future of AI:
- Remember that AI is not magic; it is a collection of science, statistics and math with limitations and the ability to be shaped by human hands.
- Return to first principles for technology and governance, including a vision of the public good.
- Recognize that existing laws, rules, norms, and standards apply to AI, and that we should use them.
- Recognize that new laws, rules, norms, and standards may be needed to govern AI, and we should craft them.
- Embrace iteration; our first attempts may not work well, and we should revisit these principles and processes until they do.
- Pry open the black box; continue to question and probe these technologies until we understand them and can shape them to the needs of the public.
