Building creative restrictions to curb AI abuse
Summary
Along with all the positive, revolutionary aspects of AI comes a more sinister side. Joining us today to discuss ethics in AI from the developer's point of view is David Gray Widder. David is currently doing his Ph.D. at the School of Computer Science at Carnegie Mellon University and is investigating AI from an ethical perspective, homing in specifically on the ethics-related challenges faced by AI software engineers. His research has been conducted at Intel Labs, Microsoft, and NASA's Jet Propulsion Lab. In this episode, we discuss the harmful uses of deep fakes and their ethical ramifications in proprietary versus open source contexts. Widder breaks down the notions of technological inevitability and technological neutrality and explains the importance of challenging these ideas. He has identified a continuum between implementation-based harms and use-based harms and fills us in on how each is affected in the open source development space. Tune in to find out more about the importance of curbing AI abuse and the creativity required to do so, as well as the strengths and weaknesses of open source in terms of AI ethics. Full transcript.

Key points from this episode:
- Introducing David Gray Widder, a Ph.D. student researching AI ethics.
- Why he chose to focus his research on ethics in AI, and how he drives his research.
- Widder explains deep fakes and gives examples of their uses.
- Sinister uses of deep fakes and the danger thereof.
- The ethical ramifications of deep fake tech in proprietary versus open source contexts.
- The kinds of harms that can be prevented in open source versus proprietary contexts.
- The licensing issues that result in developers relinquishing control (and responsibility) over the uses of their tech.
- Why Widder is critical of the notions of both technological inevitability and neutrality.
- Why it's important to challenge the idea of technological neutrality.
- The potential to build restrictions, even within the dictates of open source.
- The continuum between implementation-based harms and use-based harms.
- How open source allows for increased scrutiny of implementation harms but decreased accountability for use-based harms.
- The insight Widder gleaned from observing NASA's use of AI, pertaining to the deep fake case.
- Widder voices his legal concerns around Copilot.
- The difference between laws and norms.
- How we've been unsuspectingly providing data by uploading photos online.
- Why it's important to include open source and public sector organizations in the ethical AI conversation.
- Open source strengths and weaknesses in terms of the ethical use of AI.

Links mentioned in today's episode:
- David Gray Widder
- David Gray Widder on Twitter
- Limits and Possibilities of "Ethical AI" in Open Source: A Study of Deep Fakes
- What is Deepfake
- Copilot

Credits
Special thanks to volunteer producer, Nicole Martinelli. Music by Jason Shaw, Audionautix. This podcast is sponsored by GitHub, DataStax and Google. No sponsor had any right or opportunity to approve or disapprove the content of this podcast.
Similar Episodes
AI Deep Dive
06/21/2023
Today is a technical deep dive into a practical use of AI: how could we use AI to generate transcripts for this podcast? #ai #chatgpt #whisper #python #kaggle
Welcome to Deep Dive: AI
07/26/2022
Welcome to Deep Dive: AI, an online event from the Open Source Initiative. We'll be exploring how Artificial Intelligence impacts open source software, from developers to businesses to the rest of us.
Episode notes: An introduction to Deep Dive: AI, an event in three parts organized by the Open Source Initiative. With AI systems being so complex, concepts like "program" or "source code" in the Open Source Definition are challenged in new and surprising ways. The topic of AI is huge. For the Open Source Initiative's Deep Dive, we'll be looking at how AI could affect the future of Open Source. This trailer episode is produced by the Open Source Initiative with the help of Nicole Martinelli. Music by Jason Shaw on Audionautix.com, Creative Commons BY 4.0 International license. Deep Dive: AI is made possible by the generous support of OSI individual members and sponsors. Donate or become a member of the OSI today.
AI Deep Dive • Dorothy Chou (Google DeepMind)
04/20/2023
Head of Public Affairs for DeepMind, Dorothy Chou, is in Washington from London and joins Niki in the Tech'ed Up Studio to talk about the massive impact AI is already having on science, work, and the future of our economy. They cover the hard questions we all should be asking as this tech develops at breakneck speed, discuss the role of government in shaping this emerging tech, and highlight the importance of wrangling all kinds of people to shape the data and norms that underpin AI. "There needs to be a real renegotiation of what all of our roles are in this process as technology makes it into society. And I think that the dialogue between scientists and policymakers, it's never been more important." - Dorothy Chou
- Listen to the DeepMind Podcast
- Read "ChatGPT Is a Blurry JPEG of the Web"
- Learn more about The Obsidian Collection
- Follow Dorothy on Twitter
- Follow Niki on LinkedIn
- Learn more at www.techedup.com
- Follow us on Instagram
- Check out video on YouTube
Innovations in AI Technology: A Deep Dive
06/04/2024
AI Daily Podcast: Innovations in Artificial Intelligence Technology
Links:
- HeartFlow AI Plaque Analysis Achieves Major Milestone Towards Medicare Coverage
- NVDA Stock Alert: 7 Things to Know About the New Nvidia Rubin Chip
Similar Podcasts
Foul Play: Crime Series
08/12/2020
Shane L. Waters, Wendy Cee, Gemma Hoskins
Welcome to Foul Play: Crime Series, the riveting true crime podcast that takes you on a deep dive into the world's most gripping cases. Each season unravels a unique case—choose one that intrigues you and dive in.
Experts On The Wire (An SEO Podcast!)
08/12/2020
Dan Shure (SEO)
Downloaded over 500,000 times! Experts On The Wire is a monthly SEO podcast hosted by Dan Shure. Discover new trends, tactics, tools, people, and businesses doing remarkable work in the world of Search Engine Optimization. Past guests include Rand Fishkin, Brian Dean, Annie Cushing, Noah Kagan & Vanessa Fox. I'm an SEO consultant myself, so we dig deep into SEO, technical challenges, growth stories, mobile SEO, eCommerce, crawling, content marketing & more.
Lifestyle Architecture Lab
08/12/2020
Himanshu Sachdeva
Lifestyle Architecture Lab is a show hosted by Himanshu Sachdeva. Himanshu Sachdeva is a writer, lifestyle architect, and self-experimenter. In this show, he talks about lifestyle design and financial freedom, and converses with personalities who have transformed their passion into a profession. These guests range from artists and musicians to entrepreneurs, coaches, investors, and professional athletes. The conversations dig deep into their stories to find out the thought processes, tools, strategies, and tricks that make them tick. For show notes and to learn more, go to https://himanshusachdeva.com