Identity + Algorithms
Characteristics that make up human identity have become increasingly embedded in technological systems. Human characteristics like age, gender, race, and sexuality are being folded into the categorical structures of automated systems, such as algorithmic computer vision methods. However, these characteristics are often complex, nuanced, and fluid, and they are tied to social and historical patterns of bias and discrimination. Reducing them to simple, discrete categories creates tensions that clash with human values and identity, and poses serious risks for already marginalized populations.
To mitigate these risks, we are researching how to develop algorithms that are sensitive to the nuanced identities held and expressed by the people they classify. Our aim is to inform design approaches that are empowering and safe for all users.
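The tension between discrete categories and lived identity is visible even at the data-model level. The following is a minimal, hypothetical sketch in Python (our own illustration, not drawn from any system we study) contrasting a fixed category schema with a self-described, revisable representation:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

# A common pattern in automated systems: identity reduced to a fixed,
# discrete category set. Anyone outside the enum cannot be represented.
class BinaryGender(Enum):
    MALE = "male"
    FEMALE = "female"

# A representation closer to how identity is held and expressed:
# self-described, optional, and open to revision over time.
@dataclass
class SelfDescribedIdentity:
    gender: Optional[str] = None                       # free text, chosen by the person
    pronouns: List[str] = field(default_factory=list)  # e.g. ["they", "them"]
    last_updated: Optional[str] = None                 # identities are fluid; allow revision
```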
Researchers
Jed Brubaker, Katie Gach, Aaron Jiang, Anthony Pinter, Morgan Klaus Scheuerman
Publications
- From Human to Data to Dataset: Mapping the Traceability of Human Subjects in Computer Vision Datasets
Morgan Klaus Scheuerman, Katy Weathington, Tarun Mugunthan, Emily Denton, and Casey Fiesler
Proc. ACM Hum.-Comput. Interact. 7, CSCW1
Morgan Klaus Scheuerman, Katy Weathington, Tarun Mugunthan, Emily Denton, and Casey Fiesler. 2023. From Human to Data to Dataset: Mapping the Traceability of Human Subjects in Computer Vision Datasets. Proc. ACM Hum.-Comput. Interact. 7, CSCW1. https://doi.org/10.1145/3579488
@article{Scheuerman2023a,
author = {Scheuerman, Morgan Klaus and Weathington, Katy and Mugunthan, Tarun and Denton, Emily and Fiesler, Casey},
title = {From Human to Data to Dataset: Mapping the Traceability of Human Subjects in Computer Vision Datasets},
year = {2023},
issue_date = {April 2023},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {7},
number = {CSCW1},
url = {https://doi.org/10.1145/3579488},
doi = {10.1145/3579488},
journal = {Proc. ACM Hum.-Comput. Interact.},
month = apr,
articleno = {55},
numpages = {33},
keywords = {machine learning, datasets, data subjects, data ethics, computer vision},
tags = {identity-and-algorithms}
}
Computer vision is a "data hungry" field. Researchers and practitioners who work on human-centric computer vision, like facial recognition, emphasize the necessity of vast amounts of data for more robust and accurate models. Humans are seen as a data resource which can be converted into datasets. The necessity of data has led to a proliferation of gathering data from easily available sources, including "public" data from the web. Yet the use of public data has significant ethical implications for the human subjects in datasets. We bridge academic conversations on the ethics of using publicly obtained data with concerns about privacy and agency associated with computer vision applications. Specifically, we examine how practices of dataset construction from public data, not only from websites but also from public settings and public records, make it extremely difficult for human subjects to trace their images as they are collected, converted into datasets, distributed for use, and, in some cases, retracted. We discuss two interconnected barriers current data practices present to providing an ethics of traceability for human subjects: awareness and control. We conclude with key intervention points for enabling traceability for data subjects. We also offer suggestions for an improved ethics of traceability to enable both awareness and control for individual subjects in dataset curation practices.
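One way to read these intervention points is as missing bookkeeping. Below is a hypothetical provenance record in Python (the field names are our own, not taken from the paper or from any dataset) sketching the metadata that awareness and control for data subjects would require:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Hypothetical provenance record for one image of one person. The last
# three fields map onto the paper's two barriers: awareness (does the
# subject know?) and control (can the subject act, and does it stick?).
@dataclass
class SubjectTraceRecord:
    image_id: str
    source: str                        # web page, public setting, or public record
    collected_on: date
    dataset_releases: List[str] = field(default_factory=list)  # every release containing the image
    subject_notified: bool = False     # awareness: the person knows their image was collected
    removal_requested: bool = False    # control: the person has asked for retraction
    fully_retracted: bool = False      # retraction rarely propagates to downstream copies
```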
- Taxonomizing and Measuring Representational Harms: A Look at Image Tagging
Jared Katzman, Angelina Wang, Morgan Klaus Scheuerman, Su Lin Blodgett, Kristen Laird, Hanna Wallach, and Solon Barocas
AAAI
Jared Katzman, Angelina Wang, Morgan Klaus Scheuerman, Su Lin Blodgett, Kristen Laird, Hanna Wallach, and Solon Barocas. 2023. Taxonomizing and Measuring Representational Harms: A Look at Image Tagging. In AAAI.
@inproceedings{Katzman2023,
author = {Katzman, Jared and Wang, Angelina and Scheuerman, Morgan Klaus and Blodgett, Su Lin and Laird, Kristen and Wallach, Hanna and Barocas, Solon},
booktitle = {AAAI},
title = {{Taxonomizing and Measuring Representational Harms: A Look at Image Tagging}},
year = {2023},
tags = {identity-and-algorithms}
}
- Auto-essentialization: Gender in automated facial analysis as extended colonial project
Morgan Klaus Scheuerman, Madeleine Pape, and Alex Hanna
Big Data & Society 8, 2: 20539517211053712
Morgan Klaus Scheuerman, Madeleine Pape, and Alex Hanna. 2021. Auto-essentialization: Gender in automated facial analysis as extended colonial project. Big Data & Society 8, 2: 20539517211053712.
@article{Scheuerman2021-bigdata-autoessentalization,
title = {Auto-essentialization: Gender in automated facial analysis as extended colonial project},
author = {Scheuerman, Morgan Klaus and Pape, Madeleine and Hanna, Alex},
journal = {Big Data \& Society},
volume = {8},
number = {2},
pages = {20539517211053712},
year = {2021},
publisher = {SAGE Publications Sage UK: London, England},
tags = {identity-and-algorithms}
}
- How We’ve Taught Algorithms to See Identity: Constructing Race and Gender in Image Databases for Facial Analysis
Morgan Klaus Scheuerman, Kandrea Wade, Caitlin Lustig, and Jed R. Brubaker
Proc. ACM Hum.-Comput. Interact. 4, CSCW1: Article 58
Best Paper Honorable Mention
Morgan Klaus Scheuerman, Kandrea Wade, Caitlin Lustig, and Jed R. Brubaker. 2020. How We’ve Taught Algorithms to See Identity: Constructing Race and Gender in Image Databases for Facial Analysis. Proc. ACM Hum.-Comput. Interact. 4, CSCW1: Article 58. https://doi.org/10.1145/3392866
@article{Scheuerman2020-cscw-databaseidentity,
title = {How {We’ve} {Taught} {Algorithms} to {See} {Identity}: {Constructing} {Race} and {Gender} in {Image} {Databases} for {Facial} {Analysis}},
author = {Scheuerman, Morgan Klaus and Wade, Kandrea and Lustig, Caitlin and Brubaker, Jed R.},
doi = {10.1145/3392866},
journal = {Proc. ACM Hum.-Comput. Interact.},
number = {CSCW1},
pages = {Article 58},
volume = {4},
year = {2020},
tags = {identity-and-algorithms, marginalization-and-safety, positionalML},
note = {Best Paper Honorable Mention}
}
- How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis and Image Labeling Services
Morgan Klaus Scheuerman, Jacob M. Paul, and Jed R. Brubaker
Proc. ACM Hum.-Comput. Interact. 3, CSCW: Article 144
Morgan Klaus Scheuerman, Jacob M Paul, and Jed R. Brubaker. 2019. How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis and Image Labeling Services. Proc. ACM Hum.-Comput. Interact. 3, CSCW: Article 144. https://doi.org/10.1145/3359246
@article{Scheuerman2019-cscw-gender,
title = {How {Computers} {See} {Gender}: {An} {Evaluation} of {Gender} {Classification} in {Commercial} {Facial} {Analysis} and {Image} {Labeling} {Services}},
author = {Scheuerman, Morgan Klaus and Paul, Jacob M and Brubaker, Jed R.},
doi = {10.1145/3359246},
journal = {Proc. ACM Hum.-Comput. Interact.},
number = {CSCW},
pages = {Article 144},
volume = {3},
year = {2019},
tags = {marginalization-and-safety, identity-and-algorithms}
}
- "Am I Never Going to Be Free of All This Crap?": Upsetting Encounters With Algorithmically Curated Content About Ex-Partners
Anthony T. Pinter, Jialun "Aaron" Jiang, Katie Z. Gach, Melanie M. Sidwell, James E. Dykes, and Jed R. Brubaker
Proc. ACM Hum.-Comput. Interact. 3, CSCW: Article 70
Anthony T. Pinter, Jialun "Aaron" Jiang, Katie Z. Gach, Melanie M. Sidwell, James E. Dykes, and Jed R. Brubaker. 2019. "Am I Never Going to Be Free of All This Crap?": Upsetting Encounters With Algorithmically Curated Content About Ex-Partners. Proc. ACM Hum.-Comput. Interact. 3, CSCW: Article 70. https://doi.org/10.1145/3359172
@article{Pinter2019-upsetcon,
title = {"Am {I} {Never} {Going} to {Be} {Free} of {All} {This} {Crap}?": {Upsetting} {Encounters} {With} {Algorithmically} {Curated} {Content} {About} {Ex-Partners}},
volume = {3},
number = {CSCW},
journal = {Proc. ACM Hum.-Comput. Interact.},
author = {Pinter, Anthony T. and Jiang, Jialun "Aaron" and Gach, Katie Z. and Sidwell, Melanie M. and Dykes, James E. and Brubaker, Jed R.},
year = {2019},
pages = {Article 70},
doi = {10.1145/3359172},
tags = {identity-and-algorithms}
}
- Gender is not a Boolean: Towards Designing Algorithms to Understand Complex Human Identities
Morgan Klaus Scheuerman and Jed R. Brubaker
Participation+Algorithms Workshop at CSCW 2018
Morgan Klaus Scheuerman and Jed R. Brubaker. 2018. Gender is not a Boolean: Towards Designing Algorithms to Understand Complex Human Identities. In Participation+Algorithms Workshop at CSCW 2018.
@inproceedings{Scheuerman2018a,
title = {Gender is not a {Boolean}: {Towards} {Designing} {Algorithms} to {Understand} {Complex} {Human} {Identities}},
author = {Scheuerman, Morgan Klaus and Brubaker, Jed R.},
booktitle = {Participation+Algorithms Workshop at CSCW 2018},
year = {2018},
tags = {identity-and-algorithms}
}
Algorithmic methods are increasingly used to identify and categorize human characteristics. A range of human identities, such as gender, race, and sexual orientation, are becoming interwoven with these systems. We discuss the case of automatic gender recognition technologies that algorithmically assign binary gender categories. Based on our previous work with transgender participants, we discuss the ways current gender recognition systems misrepresent complex gender identities and undermine safety. We describe plans to build on this work by conducting participatory design workshops with designers and potential users to develop improved methods for conceptualizing gender identity in algorithms.
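The structural problem described above can be shown in a few lines. In this purely illustrative sketch (the logits are hard-coded stand-ins, not output from any real system), a two-class model has no way to abstain or to represent identities outside its label set:

```python
import numpy as np

LABELS = ["male", "female"]  # the entire space of possible outcomes

def classify(logits: np.ndarray) -> str:
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over exactly two classes
    return LABELS[int(np.argmax(probs))]           # a binary label is forced, even at p ~ 0.5

# Near-chance confidence still yields a definite, possibly misgendering, label.
print(classify(np.array([0.51, 0.49])))  # -> "male"
```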
- Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems
Foad Hamidi, Morgan Klaus Scheuerman, and Stacy M. Branham
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
Foad Hamidi, Morgan Klaus Scheuerman, and Stacy M. Branham. 2018. Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18), 8:1–8:13. https://doi.org/10.1145/3173574.3173582
@inproceedings{Hamidi2018,
author = {Hamidi, Foad and Scheuerman, Morgan Klaus and Branham, Stacy M.},
title = {Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems},
booktitle = {Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems},
series = {CHI '18},
year = {2018},
isbn = {978-1-4503-5620-6},
location = {Montreal QC, Canada},
pages = {8:1--8:13},
articleno = {8},
numpages = {13},
url = {http://doi.acm.org/10.1145/3173574.3173582},
doi = {10.1145/3173574.3173582},
acmid = {3173582},
publisher = {ACM},
address = {New York, NY, USA},
tags = {identity-and-algorithms}
}
Automatic Gender Recognition (AGR) refers to various computational methods that aim to identify an individual’s gender by extracting and analyzing features from images, video, and/or audio. Applications of AGR are increasingly being explored in domains such as security, marketing, and social robotics. However, little is known about stakeholders’ perceptions and attitudes towards AGR and how this technology might disproportionately affect vulnerable communities. To begin to address these gaps, we interviewed 13 transgender individuals, including three transgender technology designers, about their perceptions and attitudes towards AGR. We found that transgender individuals have overwhelmingly negative attitudes towards AGR and fundamentally question whether it can accurately recognize such a subjective aspect of their identity. They raised concerns about privacy and potential harms that can result from being incorrectly gendered, or misgendered, by technology. We present a series of recommendations on how to accommodate gender diversity when designing new digital systems.
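In the spirit of those recommendations, one concrete design move is to treat gender as self-reported and consent-gated rather than inferred. This hypothetical Python sketch (field and method names are our own) illustrates the idea:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    gender: Optional[str] = None   # self-reported free text; never predicted from images or audio
    share_gender: bool = False     # disclosure is an explicit, revocable user choice

    def visible_gender(self) -> Optional[str]:
        # Downstream features see gender only with the user's consent.
        return self.gender if self.share_gender else None
```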
Blog
How We’ve Taught Algorithms to See Identity
Break-ups Suck. They Could Suck Less.
Press
Even after blocking an ex on Facebook, the platform promotes painful reminders
How social media makes breakups that much worse
The Problem With Putting Social Media in Charge of Our Memories