Dan Patterson | News
Why we must embrace a global approach to AI governance
Generative AI could endanger democratic institutions, Indigenous languages, and marginalized voices.
Big tech firms developing generative artificial intelligence are ignoring marginalized and disenfranchised voices, people who have little say in how these applications are built but are deeply affected by them, said Ivana Bartoletti, Global Privacy Officer at Wipro and author of "An Artificial Revolution: On Power, Politics and AI," in a recent interview.
The international landscape of AI governance is inherently complex. Bartoletti worked with UNESCO to create guidelines for responsible AI. Drawing on international perspectives, the guidelines highlight the value of education, culture, and the sciences; the role of AI in education and its potential implications for democratic structures were also priorities.
The framework attempted to include a diverse spectrum of voices, including Indigenous rights organizations, and shed light on AI's deeper complexities in the global economy. The challenges don't just lie in the fundamental implications of AI, like data privacy or the proliferation of deepfakes. They also touch on the very essence of culture: What happens to Indigenous languages or marginalized narratives in the realm of generative AI?
The path ahead is not straightforward. Ideally, it would involve like-minded nations collaborating to create a governance structure that respects democratic values while integrating technological advancements. The fear, however, is that global cooperation will remain elusive, leaving the technology ungoverned and potentially harmful.
As we stand at the precipice of an era defined by machine intelligence, said Bartoletti, it's imperative to consider global participation. The stakes are high, and the consequences of getting it wrong could reshape our societies in unimaginable ways.
NOTE: This interview was recorded using Microsoft Teams, which pinned the speaker and excluded the interviewer.