BLOOMINGTON, Ind. – Through their extensive and interdisciplinary research, two business law professors at the Indiana University Kelley School of Business are continuing the legacy of the late Elinor “Lin” Ostrom, who was the first woman to win the Nobel Prize in economics.
Much of Ostrom’s work focused on ways that people organized themselves to manage resources. Today, Scott Shackelford, Provost Professor and professor of business law and ethics at the Kelley School, serves as executive director of the internationally acclaimed Ostrom Workshop, which she founded with her husband Vincent. Angie Raymond, also a professor of business law and ethics, directs the Program on Data Management and Information Governance at the Ostrom Workshop.
Jointly and individually, Shackelford and Raymond are involved in numerous projects focused on knowledge and data, vital resources for the 21st century that encompass everything from personal information to trade secrets, and on how people use and share them safely, including through artificial intelligence.
“Without data, there is no AI. Everyone wants all the data because they want to be smarter,” Raymond said. “The world right now is very siloed, and the result of the many regulatory structures that are coming from many different places is to continue to silo that data. There are key areas, in my opinion, where we need to overcome the regulatory silos and let smart people share data among themselves so we can all benefit as a society.”
Raymond is involved in many collaborations on this topic. She serves as the United States’ national consultant to the United Nations Commission on International Trade Law, reporting on electronic commerce issues, and previously attended the commission’s Online Dispute Resolution Working Group. She is an identified expert in online dispute resolution (ODR) at the Asia-Pacific Economic Cooperation, where she is one of the academic leads on the Collaborative Framework project on cross-border ODR.
Shackelford has written more than 100 articles, book chapters, essays, and op-eds for diverse publications on cybersecurity and the governance of AI. He is the author or co-author of nine books, including “The Internet of Things: What Everyone Needs to Know” (Oxford University Press, 2020), “Governing New Frontiers in the Information Age: Toward Cyber Peace” (Cambridge University Press, 2020), and the forthcoming “Forks in the Digital Road: Key Decisions in the History of the Internet” (Oxford University Press, 2024).
Together at the Ostrom Workshop, and with support from Cambridge University Press and the National Science Foundation, they are creating “communities” where scholars who work with data can talk and begin building structures that allow the exchange of information in ways that benefit everyone. They also are editing a new book series on the management and governance of the “knowledge commons,” where some of this research will first appear.
“It’s really about diagnosing problems and finding governance gaps,” added Shackelford, who also directs the Ostrom Workshop Program on Cybersecurity & Internet Governance and is executive director of IU’s Center for Applied Cybersecurity Research.
Each of them also is involved in research on privacy and how it relates to so-called smart cities, which use electronic methods and sensors to collect data that is then used to manage assets, resources, and services in an urban area. Raymond also has been a consultant to the Midwest Big Data Hub, and Shackelford has studied the governance of AI as it relates to autonomous vehicle regulation in the U.S., Europe and China.
It also was recently announced that Shackelford and Raymond are co-leads, as IU’s representatives, in the newly formed U.S. AI Safety Institute Consortium, established by the U.S. Department of Commerce’s National Institute of Standards and Technology. (Another Kelley professor, Sagar Samtani, also is a co-lead.) They will help develop science-based and empirically backed guidelines and standards for AI measurement and policy, with the goal of improving AI safety around the world.
Their other recent and current activities include:
- A new book about the Metaverse, coming out later this year from Oxford University Press, that Shackelford wrote with Jeff Prince, professor of business economics, the Harold A. Poling Chair of Strategic Management and chairperson of the Department of Business Economics and Public Policy at Kelley. It includes a chapter on AI and the Metaverse.
- Shackelford’s students in IU’s M.S. in Cybersecurity Risk Management program will be working with the Responsible AI Team at Microsoft on a survey of state laws and regulations governing AI that are being proposed nationwide.
- This fall, they are launching a “tech governance” visiting fellows program through which scholars and practitioners will come to IU Bloomington and be embedded at the Ostrom Workshop and at Kelley.
- Raymond served as a consultant to the National Science Foundation for a major conference on AI trustworthiness and sat on its planning committee.
- Shackelford and Raymond recently co-authored an influential article, “Should We Trust a Black Box to Safeguard Human Rights? A Comparative Analysis of AI Governance,” published in the UCLA Journal of International Law and Foreign Affairs.
While Shackelford and Raymond are actively pursuing dialogue on and understanding of how machine learning and the AI it produces affect our world, they also acknowledge the challenge that higher education institutions face in keeping up. But the basic pedagogical principles remain the same.
“We are giving our students the basic skill sets to make decisions as businesses need to implement technology in a broader global world,” Raymond said. “If we don’t think critically about how we are using technology, we will lose the ability to think critically about certain decisions that we still need human beings in charge of.
“Our research allows us to understand the way that human beings are using and engaging with technology, so that we can bring that into the classroom, so it doesn’t surprise the student,” she added.