Google Design · 198K subscribers
Published May 22, 2023, 16:41
This video presents a series of case studies showing how product and design choices can have an impact at the personal, community, and systemic levels. Designs can be affirming and inclusive, or harmful and exclusionary, to different people and communities. These examples highlight how important inclusion and equity considerations are when making product decisions.
Chapters
00:00:10 - Introduction
00:01:12 - Personal Level
00:02:39 - Community Level
00:03:49 - Systemic Level
00:04:40 - Examples of Inclusion
00:05:10 - Closing
Resources
- Will Douglas Heaven, 2020. Predictive Policing Algorithms Are Racist. They Need to Be Dismantled. → goo.gle/vpedxj
- How Activity Tracking Can Reduce Life Insurance Premiums → goo.gle/drtezb
- Fred Lewsey, 2023. Cinema Has Helped 'Entrench' Gender Inequality in AI. → goo.gle/loenhw
- Dorothy R. Santos, 2022. The Cultural Baggage Behind Feminized A.I. → goo.gle/hrihip
- Caitlin Chin and Mishaela Robison, 2020. How AI Bots and Voice Assistants Reinforce Gender Bias → goo.gle/izydbw
- Madhusree Goswami, 2022. Why Are Most Voice Assistants Female & How Do They Reinforce Gender Stereotypes? → goo.gle/oyziaf
- Annie Brown, 2021. Brilliance Knows No Gender: Eliminating Bias In Chatbot Development → goo.gle/eghzxh
Presenters
- Mallory, UX Researcher
- Emma, Software Engineer