Identity + Algorithms

Characteristics that make up human identity are increasingly embedded in technological systems. Human characteristics like age, gender, race, and sexuality are being folded into the categorical structures of automated systems, such as algorithmic computer vision methods. However, these characteristics are often complex, nuanced, and fluid, and they are linked to social and historical patterns of bias and discrimination. Reducing them to simple, discrete categories creates tensions that can clash with human values and identity, with harmful ramifications for already marginalized populations.

To mitigate the potential risks of these technologies, we are researching how to develop algorithms that are sensitive to the nuanced identities held and expressed by the people they classify. Our aim is to inform design approaches that are empowering and safe for all users.


  1. Gender is not a Boolean: Towards Designing Algorithms to Understand Gender. Scheuerman, Morgan Klaus and Brubaker, Jed R.
    In the Participation+Algorithms Workshop at CSCW 2018.
  2. Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems. Hamidi, Foad and Scheuerman, Morgan Klaus and Branham, Stacy M.
    In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems.