
Goetz, Teddy G. 2021. “Swapping Gender Is a Snap(chat): Limitations of (Trans) Gendered Legibility within Binary Digital and Human Filters.” Catalyst: Feminism, Theory, Technoscience 7 (2): 1–31.

http://www.catalystjournal.org | ISSN: 2380-3312

© Teddy G. Goetz, 2021 | Licensed to the Catalyst Project under a Creative Commons Attribution Non-Commercial No Derivatives license

Swapping Gender is a Snap(chat): Limitations of (Trans) Gendered Legibility within Binary Digital and Human Filters

Teddy G. Goetz

University of Pennsylvania, Department of Psychiatry [email protected]

Abstract In May 2019 the photographic cellphone application Snapchat released two company-generated image filters that were officially dubbed “My Twin” and “My Other Twin,” though users and media labeled them as feminine and masculine, respectively. While touted in most commentary as a “gender swap” feature, these digital imaginaries represent a unique opportunity to consider what features contribute to classification of faces into binary gender buckets. After all, the commonly considered “male” filter makes various modifications—including a broader jaw and addition of facial hair—to whichever face is selected in the photograph. It does not ask and cannot detect if that face belongs to a man or woman (cis- or transgender) or to a non-binary individual. Instead, the augmented reality that it offers is a preprogrammed algorithmic reinscription of reductive gendered norms. When interacting with a novel face, humans similarly implement algorithms to assign a gender to that face. The Snapchat “My Twin” filters—which are not neutral, but rather human-designed—offer an analyzable projection of one such binarization, which is otherwise rarely articulated or visually recreated. Here I pair an ethnographic exploration of twenty-eight transgender, non-binary, and/or gender diverse individuals’ embodied experiences of facial gender legibility throughout life and with digital distortion, with a quantitative analysis of the “My Twin” filter facial distortions, to better understand the role of technology in reimaginations of who and what we see in the mirror.

Special Section: Probing the System: Feminist Complications of Automated Technologies, Flows, and Practices of Everyday Life


Keywords

transgender, non-binary, identity formation, technology, legibility

No Such Thing as (Digital) Neutrality In May 2019 the photographic cellphone application Snapchat released two company-generated image filters used to modify selfies. While these particular Snapchat image filters were officially dubbed “My Twin” and “My Other Twin,” users and media quickly labeled them feminine and masculine, respectively. When one selects a filter within the Snapchat application, a circle-shaped simplified cartoon face icon is visible at the screen base. The feminine filter icon sports eyeliner, large doe eyes, cheek makeup, and full lips with red lipstick, while the masculine filter icon displays smaller unadorned eyes, a full beard and mustache, and a single black line for a smile. No further description or explanation of what the filters aim to execute is provided.1 As opposed to user-generated filters, which must be intentionally downloaded, these two come preset with the application. While touted in most commentary as a “gender swap” feature, these digital imaginaries represent a unique opportunity to consider what features humans use to classify faces into binary gender buckets. A May 17, 2019, article in the technology news site Pocket-Lint asserted, “the new Snapchat lenses that turn men into women and vice versa…are huge fun to make and share” (Henderson, 2019). This language assumes gender is simply a (binarized) facial performance and that such “swaps” are entertaining. The filters received broad criticisms from trans individuals for both reinforcing damaging binarized ideals and stereotypes, and presenting gender transition as a “joke” (Andersen, 2019). Considering the algorithm behind this “swap,” a May 14, 2019, Newsweek commentary representatively distilled that the “male” filter makes the user’s face “boxy, with a chiseled jaw. It widens your neck and adds facial hair,” while “for a boy to girl change, the filter softens the entire appearance, slims and contours the face, giving you a pointy chin. It makes the eyes larger, applies makeup, elongates the eyelashes and shapes the eyebrows” (Harbison, 2019). That Newsweek article’s title, “Snapchat's New Gender Swap Filter Will Make You Question Your Identity: How to Get the Male to Female Filter,” took essentialist gender rhetoric a step further with the clickbait, fear-mongering suggestion that not only might one be able to swap genders, but further could be seduced to such real-life deviance by the digitally generated product. These filters, the author boasts, “will literally transform the manliest of men into a beautiful princess. Meanwhile, prom queens may be dismayed to discover, they'd actually make a pretty handsome dude” (Harbison, 2019). Such language is reminiscent of the early US media discourse around gender-affirming surgery for transgender individuals, surrounding Christine Jorgensen’s publicly affirming her gender,


heralded by the December 1, 1952, New York Daily News front-page headline “Ex-GI Becomes Blonde Beauty” (Stryker, 2017), inspiring widespread public fascination that someone representing peak masculinity (a soldier) could become the pinnacle of female desirability (a blonde beauty). Indeed, such reporting—more focused on efficacious highly binarized social transition than post-operative results—continues now, with moments such as Caitlyn Jenner’s June 25, 2015, “Call Me Caitlyn” cover splash in Vanity Fair, which included an Annie Liebowitz cover page photograph of Jenner in white lingerie, with highly styled hair and makeup (Bissinger, 2015). Of note, the Newsweek Snapchat article subtitle only references the “Male to Female Filter,” highlighting media preference for sensationalizing transfemininity. Thus, these Snapchat filters’ mesmerizing gimmick seems to be projecting a convincing gender performance. Such selfie filters mark a specific, historically unprecedented societal moment in which this form of self-portraiture and selective, digitally edited image portrayal (Dussault Staff, 2015; Walker Rettberg, 2014) is pervasive. This moment is also notable for increased personal “control” over one’s own images, which in this format are often “playful” and function as an attention grab on social media (Cambre & Lavrence, 2019). As filter popularity grows, their invocation paradoxically becomes increasingly invisible, as the hegemonic white, Western “beautification”—including skin smoothing, largening of eyes, adding the perception of makeup, and more—is even present in filters that add dog ears or star decorations and fail to mention how they change the user’s face. These supposed “norms,” just like Western understandings of the gender binary, are societally constructed and rooted in white supremacy (e.g., Bederman, 2008; Carby, 1987; Carter, 2007; Cooky & Dworkin, 2013; Driskill, 2011; Herzig, 2015; Morgensen, 2011; Oyěwùmí, 1997; Schuller, 2018). Thus, distortions become expected and our social media-consumer eyes become recalibrated to an unnatural “norm.” Our gaze is a product of our environment and our visual diet. In her now classic article “Situated Knowledges,” feminist and science and technologies studies scholar Donna Haraway acknowledges the biases we each bring to our work and argues for the value of those partial perspectives. She ponders, “With whose blood were my eyes crafted?” (Haraway, 1988, p. 585). With this question, Haraway calls for introspection into (and subsequent use and celebration of) our intersectional positionality and lived experiences, rather than feign a neutral gaze. Intersectional(ity) here refers to legal scholar and critical race theorist Kimberlé Crenshaw’s word coined to describe the more than additive experiences of distinct axes of oppression—originally employed to describe the experiences of Black women who are subject to both racism and misogyny (Crenshaw, 1991; Crenshaw, Gotanda, Peller, & Thomas, 1995; Crenshaw, 2017), which we can apply to analyze trans experiences. For example, foundational transgender studies theorist and historian Susan Stryker offers examples in her article “Transgender Feminism” of how, from her experience as a trans woman,


she “can personally articulate [experiences with]: misogyny, homophobia, racism, looksism, disability, medical colonisation, coercive psychiatrisation, undocumented labour, border control, state surveillance, population profiling, the prison-industrial complex, employment discrimination, housing discrimination, lack of health care, denial of access to social services, and violent hate crimes” (Stryker, 2007, p. 66), among other prejudices. Since so many of these marginalized experiences factor into the ways in which we use our societally informed gaze to critique selfies and assign gender to a stranger on the street, we cannot consider gender alone in analyzing the “My Twin” filters. This non-neutral gaze has been demonstrated by implicit association testing, a proxy for implicit bias, which demonstrates the non-neutrality of rapid assessments regarding binary gender generally (Schmid Mast, 2004; White & White, 2006), as well as for transgender individuals specifically (Axt, Conway, Westgate, & Buttrick, 2020; Wang-Jones, Alhassoon, Hattrup, Ferdman, & Lowman, 2017). Societal gendered norms color our gaze. Thus, employing algorithms to separate faces by assumed binary gender cannot be accepted as a neutral practice. Such a construct depends upon the coder’s life experience and socialization; we can take this literally for digital gender filters. As legal scholar Lawrence Lessig asserts, “Code is never found; it is only ever made, and only ever made by us” (Lessig, 2009, p. 6). For this reason, digital literature scholar Mark Marino notes, “lines of code are not value-neutral” (Marino, 2006).

Critical race and information scholar Safiya Umoja Noble offers a framework for considering technological interventions as non-neutral entities in her book Algorithms of Oppression: the phenomenon of “technological redlining” (2018). Historically, “redlining” referred to systematic oppression (usually on the basis of race, as well as socioeconomic class) through denying access to governmental and private sector services—directly or by selectively raising prices—a practice that did not become illegal until 1968 (Rothstein, 2017). Noble’s term acknowledges how “digital decisions reinforce oppressive social relationships and enact new modes of racial profiling” (Noble, 2018, p. 1); human prejudice often follows inscribed processes algorithmically. This use traces historical, structural, and policy-driven racism to present manifestations of those same forces in technological platforms.

Such algorithmic oppression is also manifest in social media technologies. In her article “The Gender Binary Will Not Be Reprogrammed,” sociologist and communications scholar Rena Bivens reports how Facebook insidiously operationalizes the gender binary: “deep in the database, users who select custom gender options are re-coded—without their knowledge—back into a binary/other classification system…a more marketable and ‘authentic’ (yet, paradoxically, misrepresented) data set is produced” (Bivens, 2017, pp. 885-886). Thus, by using invisible algorithms to bin users into binary genders (for the purpose of advertising targeting), the social network intentionally corrupted its own data set. Here I will


be arguing that the aforementioned Snapchat filters exercise similar enforcement of cisheteronormative gendered hegemonies. Notably, Snapchat as a platform allows users to create and share ephemeral images and videos. The social media site, unlike the Facebook “timeline” (see Haimson, 2018), does not build an archive of life milestones, but much to the contrary erases creations and messages just seconds after revealing them. Thus, Snapchat might be particularly well suited for considering gender and gendered performance as mutable constructs, since users may engage in documenting and sharing their gender and gendered exploration over time, without worrying about leaving a trace. Algorithmic oppression also operates insidiously in facial recognition technology aiming to assign gender, which has a striking demonstrable preference for faces that look like stereotypical coders. Light-skinned men were shown to be most consistently classified correctly by gender (with a maximum error rate of 0.8%), versus darker-skinned women, who were the group most often misclassified by gender (subject to error rates up to 34.7%) (Buolamwini & Gebru, 2018; Ngan, Grother, & Ngan, 2015). Such biases are often implicit, due to facial gender classifying systems not reporting the racial/ethnic demographics of the individuals in the sample photos used for algorithm development (Ballihi, Amor, Daoudi, Srivastava, & Aboutajdine, 2012; Kekre, Thepade, & Chopra, 2010; Moghaddam & Yang, 2002, 2006; Perez, Tapia, Estévez, & Held, 2012; Wang, Li, Yau, & Sung, 2010). Such algorithms have also been critiqued for operationalizing a gender binary that excludes transgender bodies and experiences (Hamidi, Scheuerman, & Branham, 2018; Keyes, 2018); indeed, facial analysis technology was shown to consistently perform worse on binary transgender persons than on cisgender individuals, while entirely unable to assign non-binary genders to images (Scheuerman, Paul, & Brubaker, 2019). Just as computers use digital algorithms to assign genders to faces, people use non-digital algorithms to assign genders to faces (Reddy, Wilken, & Koch, 2004). Like Noble, “algorithms” here refers to reductive, iterative processes used to assign binary gender from distilling evaluations of individual facial features quickly. This step-by-step sum of individual parts is notably distinct from a schema, which functions more as an internal representation of a theoretical face belonging to a “man” or “woman.” The “My Twin” Snapchat filters offer a unique opportunity to manifest such internal processes by projecting a digital gendered filter—programmed by individuals who remain hidden behind the company mantle and figureheads, with limited accountability—onto an unknown, unlabeled face, and providing the result as visual output. With this paper I will contextualize and quantitatively dissect a case study of gendered digital projections—human-coded and visually presented.
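To make concrete how a binary classification scheme forecloses any output other than “man” or “woman,” consider the following minimal sketch. It is purely illustrative and assumes nothing about Snapchat's or any other real system's implementation: the feature names, toy values, and the choice of a logistic regression are my own hypothetical stand-ins.

```python
# Minimal, hypothetical sketch of a forced-binary facial "gender" classifier.
# Not any real product's code; features and values are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy "features" extracted from face images: [jaw_width, brow_height, lip_fullness].
# The training labels admit only two classes -- the binary is baked in right here,
# by whoever assembled and labeled the sample.
X_train = np.array([
    [1.10, 1.4, 0.9],   # faces the developers labeled "male"
    [1.15, 1.5, 0.8],
    [0.90, 0.6, 1.2],   # faces the developers labeled "female"
    [0.85, 0.5, 1.3],
])
y_train = np.array([0, 0, 1, 1])  # 0 = "male", 1 = "female"; no third option exists

clf = LogisticRegression().fit(X_train, y_train)

# Any new face -- cis, trans, or non-binary -- is forced into one of the two bins.
new_face = np.array([[1.00, 1.0, 1.0]])
print(clf.predict(new_face))        # outputs 0 or 1, never "neither" or "both"
print(clf.predict_proba(new_face))  # "confidence" is distributed only over the two buckets
```

Whatever face is presented, the only possible outputs are the two labels the developers chose at training time, so the composition and labeling of that training sample is inherited by every subsequent prediction.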


My main methodology in this endeavor will be “bio-ethnography”—as proposed, but not yet implemented, by anthropologist Elizabeth Roberts (Roberts, 2015)—signifying a mixed-methods intervention reconstituting the relationship between aspects of embodied human experience, following recent examples of utilizing scientific data to support feminist and queer scholarship by Deboleena Roy (Roy, 2018), Elizabeth Wilson (Wilson, 2015a, 2015b), Angela Willey (Willey, 2016), and Anne Pollock (Pollock, 2015).

Gendered Faces

Recognizing some of the biases and sociocultural influences that impact facial digitization—both accuracy and outcomes—it is important to consider if and how gender is ascribed to unfiltered faces. There is biological anthropology literature on facial sexual dimorphism of both overlying tissue (Carré & McCormick, 2008; Penton-Voak et al., 2001; Toma, Zhurov, Playle, & Richmond, 2008) and underlying bone structure (Bulygina, Mitteroecker, & Aiello, 2006; Weston, Friday, & Liò, 2007), including broader, squarer faces and more prominent noses and brows in males, compared to females with more prominent eyes and cheeks. Testosterone has been shown to drive “masculine” craniofacial development during puberty, while estrogens and progestins have only been linked to alterations in skin collagen (Verdonck, Gaethofs, Carels, & de Zegher, 1999). Such studies are limited by small sample sizes and racial/ethnic homogeneity of participants, and any tie to behavioral stereotypes remains controversial. These data are also supported by generation of digital facial amalgamations, which can facilitate comparing intergroup differences between composite “female” and “male” faces—generated by “averaging” hundreds of female faces and hundreds of male faces to yield visually distinct facial prototypes. This has been done for East Asian persons (Perrett et al., 1998; Perrett, May, & Yoshikawa, 1994; Tiddeman, Burt, & Perrett, 2001) and for white individuals (Behar, 2014; Pollock, 2015; Rowland & Perrett, 1995; Tiddeman et al., 2001). The observed differences are likely a combination of biological (e.g., jaw and chin shape) and cultural (e.g., hair length) origins (Rowland & Perrett, 1995). Notably, the only published research on large-scale blended images does not specifically include or exclude trans individuals’ images; thus, the conclusions drawn from those studies may not be generalizable. Facial attractiveness has been associated with degree of sexual dimorphism, but this effect is stronger within racial and ethnic groups, suggesting that attractiveness cues are learned socially (Perrett et al., 1998). One could then extrapolate that the gendered legibility of faces itself is educable as well—which is a foundational hypothesis of this work.
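As a rough illustration of the composite-face approach described above, the sketch below naively averages pixel values across a set of face photographs. This is an assumption-laden simplification: the cited prototype studies align faces via facial landmarks and warp them before blending, and the file paths here are hypothetical.

```python
# Illustrative sketch only: naive pixel averaging of roughly aligned face photos,
# in the spirit of the composite-face studies cited above. Requires numpy and Pillow.
import numpy as np
from PIL import Image

def composite(paths, size=(256, 256)):
    """Average a set of face images (assumed roughly aligned) into one composite."""
    stack = np.stack([
        np.asarray(Image.open(p).convert("RGB").resize(size), dtype=np.float64)
        for p in paths
    ])
    return Image.fromarray(stack.mean(axis=0).astype(np.uint8))

# Hypothetical inputs: two sets of labeled face photographs.
female_proto = composite(["f_001.jpg", "f_002.jpg", "f_003.jpg"])
male_proto = composite(["m_001.jpg", "m_002.jpg", "m_003.jpg"])
female_proto.save("composite_female.png")
male_proto.save("composite_male.png")
```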

Methodology and Participants

To better understand the complex embodiment and digitization of gendered faces, I spoke with twenty-eight transgender, non-binary, and/or gender diverse individuals (hereafter referred to as trans) about their experiences of and


perspectives on gendered legibility of faces, including their own. Participants were additionally asked to take three selfies: (1) unfiltered, (2) using the “My Twin” feminizing filter, (3) using the “My Other Twin” masculinizing filter. In keeping with prior work altering the gendered legibility of facial images, participants were asked to take all three images using identical frontal views, neutral facial expression, and no makeup (Rowland & Perrett, 1995). By having participants examine and discuss these filtered and unfiltered selfies, we invoked the “digital-forensic gaze,” as described by cultural sociologists Christine Lavrence and Carolina Cambre (2020), signifying the “habituated practice of scrutinizing selfies,” which includes “decoding the filtration effects applied (or not) in the image” (Lavrence & Cambre, 2020). By engaging with more intentionally distorting—less subtle—image filters, I aimed to engage with heavy-handedly projected hegemonic gendered expectations with that same digital-forensic gaze in order to dissect the features that the Snapchat developers—as a proxy for societal gaze—deemed the aspirational gendered “norm.” In my study, twenty-eight participants who currently reside in the United States or Canada were convenience sample recruited from posts on a variety of Facebook pages for trans (+/- queer) communities. Participation was voluntary and compensated with a twenty-dollar gift card. Interviews were conducted via Zoom video chat and recorded and transcribed with permission. This protocol was approved by the Columbia University Irving Medical Center Institutional Review Board. In the ethical design of the study, I paid particular attention to creating an environment that would adequately support participants in discussion of potentially distressing topics, such as their gendered legibility and least favorite facial features. In the recruitment text, the study elements were explicit (interview and taking three selfies on Snapchat, along with the specific two filters that would be used), and the topic of questions was described. Participants were given the choice to decline to answer any questions and were able to pause or terminate the interview at any point if they desired (though none did). I shared my name, gender identity, pronouns, and current occupation as a medical student, but no other demographic information with participants. Considering my own gendered legibility, for each interview I appeared on Zoom as a white assigned-female-at-birth (AFAB) person in my twenties, with short red wavy hair and large glasses, who had not yet undergone gender-affirming medical care. I declined to answer further questions about my opinions on the research questions and promised to send the resultant paper(s) when published. Acknowledging that no individual can be distilled to their identities, nor can any individual speak on behalf of a community, it is important to first attempt to describe the diversity of voices and faces included in this work. For demographics, participants were asked to describe themselves in the terms that felt most


authentic and were not prompted to clarify. Thus, some may resonate with additional terms that they did not employ, but the following descriptive breakdown is based on their chosen label(s). Participants held a broad spectrum of gender identities (Table 1). Those who identified as transgender men (n=6) used he/him pronouns (n=5) or they/them pronouns (n=1). Transgender women (n=4) used she/her (n=3) or she/they (n=1). Other (non-binary) gender labels included non-binary (12), genderqueer (2), agender/genderfluid (2), and gender non-conforming/androgynous (1). Of the non-binary individuals, some identified with additional labels specifying directionality (transfeminine with they/them or she/they pronouns, or transmasculine with they/them or he/they pronouns) or fluidity (genderfluid or genderflux). No participants solely used the pronouns typically corresponding with their assigned-sex-at-birth (i.e., AFAB using she/her).

Table 1. Gender/Sex Demographics

Gender Identity Descriptor Terms/Labels Used for Self | n (%)
Non-Binary | 18 (64%)
Femme/Feminine/Female | 8 (29%)
Masc/Masculine/Male | 6 (21%)
Fluid/Flux | 6 (21%)

Pronouns | n (%)
He/him | 5 (18%)
She/her | 3 (11%)
They/them | 13 (46%)
He/they | 2 (7%)
She/they | 5 (18%)

Assigned-Sex-at-Birth | n (%)
Female/AFAB | 23 (82%)
Male/AMAB | 5 (18%)

The average age of participants was 27.5 years, with a standard deviation of 5.9 years. Both the median and the most common age were 26. The range was 20 to 52 years old. Racially and ethnically, five participants identified as Asian (two Korean; one each Chinese, Filipino, Singaporean), four as Latinx, two as Black, one as Brown/multiracial, and one as Middle Eastern/Arab. Sixteen identified as white, including one who additionally self-described as Jewish.


Binary, Filtered (Il)legibility While researchers have digitally “feminized” and “masculinized” facial images (Perrett et al., 1998; Perrett et al., 1994; Rowland & Perrett, 1995), no formal analysis of publicly available gender-altering digital filters has been published to date. Here I present quantitative measurements of different facial features in twenty-eight individuals’ photographs with the feminizing and masculinizing “My Twin” filters (Table 2) within the context of both their subjective experience of gendered legibility of faces at large and their subjective observations of how “My Twin” digitally filters their faces.2 Only three participants regularly used selfie filters. Overall, few participants wanted to look like (n=2) or felt affirmed by the way that they looked with (n=3) either filter, while many more stated that they did not want to look like either (n=18) or found them unattractive (n=12). Two reported dysphoria from the filter associated with their assigned-gender-at-birth. First, considering the “My Twin” filtered selfies holistically, nine individuals (both AFAB and AMAB) saw a likeness to male family members (uncles, fathers, and brothers) in their masculine-filtered image; two remarked upon similarities with a female relative with the feminine filter (both sisters). One participant felt affirmed by this likeness, as if such a masculinization might be “accessible” for them. All other participants who noted resemblance expressed discomfort with the familial resemblance in the filtered photograph and disliked the resultant image. Paradoxically, when observed in daily life without any filters, familial traits were a common contributing factor to both favorite and least favorite facial features. P2, a fifty-two-year-old trans man, remarked, “I like my face better when it looks like my dad’s.” He specifically identified feeling confident and masculine since his lower face and jaw began to resemble his father’s face after a few years of gender-affirming hormone therapy (GAHT). P1, a twenty-six-year-old trans man, expressed the same sentiment about his father and his lower face, in contrast to his pain revisiting old photographs of himself, prior to GAHT, in which he resembles his sister. P24, a twenty-four-year-old non-binary trans man, described his lifelong joy about sharing his grandfather’s sharp, angular nose, which he feels fits his face increasingly well, the longer he takes testosterone. Those participants felt affirmed by embodying familial masculine-coded forms.


Measurable features (lengths normalized to No Filter): [mean +/- S.D.]

Feature | No Filter (X) | Masculine (M) | Effect size; p (M:X) | Feminine (F) | Effect size; p (F:X) | Effect size; p (F:M)
Forehead Width | 1 +/- 0.09 | 1.09 +/- 0.05 | +9%; 0.0468 | 0.84 +/- 0.06 | -16%; 0.0001 | -23%; <0.0001
Forehead Height | 1 +/- 0.11 | 1.10 +/- 0.12 | +10%; 0.0311 | 0.93 +/- 0.10 | -7%; 0.2175 | -15%; <0.0001
Jaw Width | 1 +/- 0.08 | 1.09 +/- 0.07 | +9%; 0.0439 | 0.95 +/- 0.06 | -5%; 0.4648 | -13%; 0.0010
Lateral Mandibular Angle | 1 +/- 0.03 | 0.93 +/- 0.04 | -7%; 0.2287 | 1.01 +/- 0.04 | +1%; 0.9548 | +9%; 0.1304
Central Mandibular Angle | 1 +/- 0.06 | 1.06 +/- 0.07 | +6%; 0.2418 | 0.90 +/- 0.08 | -10%; 0.0233 | -15%; <0.0001
Neck Width | 1 +/- 0.09 | 1.18 +/- 0.09 | +18%; <0.0001 | 0.81 +/- 0.12 | -19%; <0.0001 | -31%; <0.0001
Nose Width | 1 +/- 0.16 | 1.12 +/- 0.18 | +12%; 0.0081 | 0.76 +/- 0.21 | -24%; <0.0001 | -32%; <0.0001
Nose Height | 1 +/- 0.15 | 1.03 +/- 0.15 | +3%; 0.5881 | 1.0 +/- 0.14 | 0%; 0.9919 | -3%; 0.5117
Eye Width | 1 +/- 0.12 | 0.99 +/- 0.10 | -1%; 0.9418 | 1.09 +/- 0.11 | +9%; 0.0395 | +10%; 0.0158
Eye Height | 1 +/- 0.16 | 0.91 +/- 0.24 | -9%; 0.0421 | 1.44 +/- 0.29 | +44%; <0.0001 | +58%; <0.0001
Eyebrow Width | 1 +/- 0.14 | 1.03 +/- 0.15 | +3%; 0.6983 | 0.92 +/- 0.14 | -8%; 0.0963 | -11%; 0.0113
Eyebrow Height | 1 +/- 0.20 | 1.51 +/- 0.29 | +51%; <0.0001 | 0.52 +/- 0.17 | -48%; <0.0001 | -76%; <0.0001
Lip Width | 1 +/- 0.12 | 0.97 +/- 0.10 | -3%; 0.7056 | 1.03 +/- 0.10 | +3%; 0.7496 | +6%; 0.2822
Lip Height | 1 +/- 0.21 | 0.97 +/- 0.25 | -3%; 0.7979 | 1.18 +/- 0.26 | +18%; <0.0001 | +22%; <0.0001

Other features: [n = number with presence (% with presence)]

Feature | No Filter | Masculine | Feminine
Eyelashes (+) | 0 (0%) | 0 (0%) | 28 (100%)
Eye Shadow (+) | 0 (0%) | 0 (0%) | 28 (100%)
Eye Liner (+) | 0 (0%) | 0 (0%) | 28 (100%)
Bright Lip Color (+) | 0 (0%) | 0 (0%) | 28 (100%)
Long Hair (+) | 11 (39%) | 0 (0%) | 28 (100%)
Facial Hair (+) | 6 (21%) | 28 (100%) | 0 (0%)
Blemish Airbrush (+) | 0 (0%) | 0 (0%) | 28 (100%)

Table 2: Snapchat Image Analysis: Quantitative Data. The top section of this table lists facial features that could be quantitatively assessed using the ImageJ measuring tool, with their corresponding normalized measurements—for the unfiltered (X) image, the “masculine” (M) filtered image, and the “feminine” (F) filtered image. To normalize each filtered image measurement, a participant’s filtered feature measurement was divided by that participant’s unfiltered feature measurement. In this table, these measurements are reported as the mean of the participants’ normalized measurements +/- the standard deviation of the group’s normalized measurements. Additionally, I report the effect size and p-value of each filtered image compared to unfiltered (M vs. X; F vs. X) and the filtered images compared to each other (F vs. M). In the lower part of the table, presence of “gendered” features that are not easily quantitatively measured was reported without statistical analysis.
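For readers who want to retrace the arithmetic summarized in this caption, the sketch below walks through the normalization for a single feature. The raw values are invented, and the use of a paired t-test is my assumption; the article does not state which significance test was used.

```python
# Minimal sketch of the per-participant normalization described in the Table 2 caption.
# Measurement values are invented; the paired t-test is an assumed choice of test.
import numpy as np
from scipy import stats

# Hypothetical raw pixel measurements of one feature (e.g., jaw width) for each
# participant: unfiltered (X), masculine filter (M), feminine filter (F).
unfiltered = np.array([210.0, 198.0, 225.0, 205.0])
masculine  = np.array([229.0, 215.0, 246.0, 224.0])
feminine   = np.array([199.0, 189.0, 212.0, 196.0])

# Normalize each filtered measurement to that participant's own unfiltered value.
m_norm = masculine / unfiltered
f_norm = feminine / unfiltered

# Report as mean +/- SD of the normalized values, as in the table.
print(f"Masculine: {m_norm.mean():.2f} +/- {m_norm.std(ddof=1):.2f}")
print(f"Feminine:  {f_norm.mean():.2f} +/- {f_norm.std(ddof=1):.2f}")

# The "effect size" columns read as the mean percent change from unfiltered.
print(f"M:X effect size: {100 * (m_norm.mean() - 1):+.0f}%")

# Paired comparison of filtered vs. unfiltered measurements (assumed test).
t, p = stats.ttest_rel(masculine, unfiltered)
print(f"M vs. X: p = {p:.4f}")
```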


P16, a twenty-seven-year-old non-binary AFAB individual reflected, with gratitude, that they have continued to share their grandmother’s smile—their favorite facial feature—“no matter how much testosterone I took or how much I didn’t take.” They felt smiles are ungendered and did not want gender-affirmation to cost ceding their family legacy. Two participants expressed frustration about filters’ false advertising. P22, a twenty-eight-year-old genderqueer individual stated, “It's funny it's called ‘My Twin’…This guy person doesn't look like me….[laughs] Like, it's not accurate at all…I have, like, masculine-presenting siblings—they are My Twins, they look just like me [laughs]. Like, I have this feature [in real life].” They were frustrated by the filter yielding facial distortion instead of more subtle feature masculinization, which they would have desired. Sixteen participants criticized the filters for projecting essentialist conceptions of gender. Eight believed the filters presented unachievable images. Twenty-two participants felt that the filters were extreme or exaggerated (feminine: n=20, masculine: n=9). The feminine filter was particularly perceived as appearing unrealistic and subhuman by twenty-three participants, who used words like “fake,” “plastic,” “weird,” “artificial,” “eerie,” “surreal,” “alien,” “doll,” “cartoon,” or “caricature” to describe the filtered image. As P22 added,

It's so hyperfemme, it's so intense about it. Make a more interesting choice, Snapchat. Give me antlers and fangs, not like, weirdly airbrushing my face. I'm not a supermodel and I didn't want to be, so what the fuck? It's just kind of a nightmare. Essentially what it does is take away all the personality from my face. It takes away everything that makes me see myself, and the fact that I can look kind of interesting. Cuz the way it just flattens your face into, like, a Pixar character. Like literally these people look like they're out of Frozen. They're so generic. It's so sad… It's so weird. It makes me self-conscious in a way, and it makes me really reject that image, cuz it's not what I look like, it's not what I want to look like, and it's not what I'm asking anybody to do to my face.

Some struggled with this hyperfemininity because of dysphoria from being reminded of years spent performing an inauthentic gender presentation. P9, a thirty-two-year-old trans man, described, “I know that person, but that's a person I used to be. It feels like seeing a ghost, I guess. I killed and buried that person.” For others, complex feelings originated from seeing how far their attempts at gender conformity landed from this “ideal.” P1, who, again, is a twenty-six-year-old trans man, recalled,

When I used the female filter, the thing that most stuck out to me was that I never looked like that, and I was never going to look like that. I don't think it should have made me feel as disheartened as it did—I think it should have made me feel affirmed in who I am—but it was a little disheartening because I spent a long time trying to fit into that idea of what being a woman for me would look like, and some of that is because of societal pressure and family pressure. At least I thought that I had, and realizing that I was so far from that mark was just disheartening. But on the other hand, it's sort of a relief, because I think if I had fit that mold better, I think I would have spent a much larger part of my life trying to be a woman, while at seventeen I was like “this is over for me.” I think it would have taken me much longer to do that.

Seven participants described their masculine-filtered face as looking like a “jerk,” “douche,” “dirty,” “stupid,” or “on steroids.” Even those who expressed wanting a more masculine appearance had no desire to embody their digitally masculinized self. As P23, a twenty-four-year-old non-binary participant, reflected,

The masc one looks like a different person. Like, I wouldn't want to be that person, but I'd respect him [laughs]. Reads as very masc. One of the reasons I chose [my name] is that it makes me imagine, like, somebody sitting in a window seat, petting cats, in like a sunset. Very, like, a gentle boy? A gentle queer boy. And the masc filter is very much not a gentle look, so it's very much not the aesthetic that I want.

P24, a twenty-six-year-old non-binary transmasculine person, reflected, “you have to make a lot more facial movement [in the male filter] to get any semblance of a smile out of it. Which is funny cuz…an audible smile is actually part of an aural effect to feminize the voice.” This apparent masculine legibility afforded to projection of an uninviting facial expression (like more authoritative vocal inflection) was consistent with a performative tactic that six AFAB individuals described using to enhance their perception as male. Three participants felt that the feminine filter reflected hyperbolic Asian beauty standards, while eight believed that the filters had a bias for white people due to the hair texture and the presence of facial hair, which many cisgender Asian and Indigenous North American men are unable to grow (n=6). Fifteen participants explicitly preferred their own faces without the filter, though nine of those had previously stated that they disliked their own faces. As P19, a twenty-six-year-old non-binary participant, described, the filter giving them two features they had previously described wanting, squarer jaw and heavier eyebrows, was “exciting, but I think it also made me realize that it's also wrong, you know? Like it's sort of cool, this is what I would look like, like if I got some stubble or whatever, and it's fun to think about, but it's still wrong. It didn't look like me…at least I know that now. My face is my face and I don't need to change


it.” Even though they coveted more stereotypically masculine features, the digital distortion was not satisfying because it felt legibly male at the cost of registering as their own face. The specific changes observed with the filters coalesced and are described as follows, organized into the lower face (cheeks, facial hair, jaw, neck, lips, and nose), upper face (forehead, eyebrows, and eyes), and adornment (head hair and makeup).

Lower Face

Sixteen participants cited cheeks and/or cheekbones as a major factor contributing to gendered legibility of faces, with masculine faces having more “sharp,” “angular” cheekbones, and feminine faces having more “soft,” “round” cheeks. Similar descriptive terms were used to differentiate male and female face shape (n=18), as well as jaw lines (n=19) and chin (n=8). Eleven AFAB participants specifically mentioned the lack of angularity of their lower facial shape as their least favorite feature. Gaining such angularity was a leading goal for the nine who had taken GAHT—all of whom reported improvement with GAHT—and for three of the four who desired GAHT. AMAB participants taking GAHT reported joy and affirmation with their rounder, softer lower facial structure (5/5), and the two participants interested in gender-affirming facial surgery are looking to reduce their jaws and cheekbones for softer, more delicate-appearing faces. P8, a twenty-six-year-old trans woman, described her jaw and chin—her least favorite features—as “very blocky, very masculine, and not pretty or elegant,” and stated that she would like to surgically make them “less prominent.” GAHT had given her a “dramatically…softer, rounder face,” which partially alleviated her dysphoria.

Many participants reported racialization of facial gendered legibility. Thirteen noted feminine coding of facial softness to particularly plague AFAB individuals of certain races/ethnicities (Asian, Mexican) who have rounder face shape and facial features at baseline, impeding their legibility as male (n=9). This softness was also compared to “baby” faces, which plagued more masculine- or androgynous-presenting AFAB individuals, who were generally assumed to be younger than they were (n=6). In contrast, Black individuals were reported to be more frequently masculinized (n=9). Cultural examples of said racialized masculinization include the political demonization of First Lady Michelle Obama’s muscular arms, and the booming electrolysis industry largely used to remove visible body hair from women of color—who naturally grow facial and body hair dark and coarse enough to be considered masculine or mannish by Western binarized beauty standards, as discussed in queer theorist and trans individual Jack Halberstam’s Female Masculinity (Halberstam, 2019).


These observations were verified quantitatively. The masculine-filtered jaw width was significantly greater than that of both the unfiltered image and the feminine filter (Figure 1A). The lateral mandibular angle trended lower with the masculine filter, as observed by participants discussing the “square” or “boxy” appearance, and trended higher with the feminine filter, consistent with participants’ descriptions of “sloping” jaw lines; yet, neither difference was statistically significant (Figure 1B). The feminine filter yielded a significantly narrower central mandibular angle than both other filters (Figure 1C). As observed by two participants, the masculine filter generated a significantly wider neck and the feminine filter generated a significantly narrower neck, compared to both unfiltered and masculine (Figure 1D). This was consistent with two participants’ use of neck thickness as a cue in reading gender.

Figure 1: Quantitative Filter Analysis of Jaws and Necks. A) Jaw Width, and Mandibular angles B) Lateral and C) Central, and D) Neck Width. Significance compared to No Filter is indicated with *p<0.05, ****p<0.0001. Significance compared to Masculine Filter is indicated with ##p<0.01, ####p<0.0001.

Facial hair impacted gendered legibility (n=16). While six participants sported visible facial hair in their unfiltered photo, all twenty-eight had cheek stubble with masculine filters and zero did with feminine.


Six AFAB participants voiced ambivalence about their fuller lips, which contributed to their being misgendered as a woman. This may have a racialized component, as P10, a twenty-five-year-old Latinx trans woman, observed, “people who are Black or have African heritage and have larger lips, and larger lips are usually associated with more femininity, but that's not necessarily true.” Her observation echoes a cultural hypersexualization of Black women, which can be seen anywhere from Google search results to strip clubs (see Brooks, 2010; Noble, 2013). This is notably divergent from a more general masculinization of Black individuals, which underscores how Black women can be both hypersexualized and masculinized, depending on the context. Indeed, both a Filipino American participant and a Black participant discussed being perceived more femininely for their fuller lips, and seeing men in their communities emasculated for this feature. The feminine filter yielded significantly greater lip height (a two-dimensional proxy for fullness) than both other selfies; the masculine filter had no difference (Figure 2A). Filters did not alter lip width (Figure 2B).

Figure 2: Quantitative Filter Analysis of Lips. Lip A) Width, and B) Height. Significance compared to No Filter is indicated with ****p<0.0001. Significance compared to Masculine Filter is indicated with ####p<0.0001.

Noses were frequently least favorite (n=9) or favorite (n=5) facial features, and six participants described nasal sharpness/prominence as signaling male, versus a smaller/delicate nose as coding feminine. Similarly, P3, a twenty-year-old Asian trans man, noted that his nasal bridge was “wider and stronger” with the masculine filter; no participants observed changes with the feminine filter. Quantitatively, the width of the nose bridge (measured at the lowest perpendicular cross-section above the upper bound of the nostrils) was significantly broader with the masculine filter and significantly narrower with the feminine filter, compared to both unfiltered and masculine (Figure 3A). Nose height was comparable across the filters (Figure 3B).


Figure 3: Quantitative Filter Analysis of Noses. Nose A) Width, and B) Height. Significance compared to No Filter is indicated with **p<0.01, ****p<0.0001. Significance compared to Masculine Filter is indicated with ####p<0.0001.

Upper Face

Figure 4: Quantitative Filter Analysis of Foreheads. Forehead A) Width, and B) Height. Significance compared to No Filter is indicated with *p<0.05, ***p<0.001. Significance compared to Masculine Filter is indicated with ####p<0.0001.

Five participants expressed using foreheads/hairlines as facial cues for assigning gender. Two non-binary trans women described dysphoria about their larger foreheads, and one planned to move her hairline lower in her upcoming facial gender-affirming surgery. P10 mentioned that “hiding” her longer forehead—a least favorite feature—behind bangs is essential to her being perceived as a woman. P6, a thirty-year-old trans man, perceives faces with a farther recessed, angular hairline as more masculine, and was disappointed that the masculinizing filter did not execute this. Nine participants also noted the “prominence,” “strength,” or “angularity” of the brow bone as a male facial cue. One non-binary AFAB participant remarked that the masculine filter made them appear to have a “huge forehead”; no others commented on forehead or hairline alterations with either filter. Quantitatively, the Snapchat filter increased both the width


(measured temple to temple) and height (measured hairline to top eyebrow border at the center of the face) in the masculine filter, and reduced both for the feminine filter compared to both others (Figure 4A, Figure 4B). Fifteen participants also reflected that thicker eyebrows are societally coded as masculine, while feminine eyebrow ideals fluctuate with fashion. Two trans men lamented their minimal eyebrow growth, which they attributed to societal pressure to over-pluck them as teenagers, when still presenting as girls. Two non-binary AFAB individuals disliked their light eyebrows, while five cited their thick, bushy eyebrows as a current favorite feature on their own faces, though they had disliked them earlier in life. Eight AFAB participants described masculinizing their faces by using makeup to increase prominence of their eyebrows; two participants described apparent eyebrow plucking/grooming as a method of feminizing one's appearance. In the Snapchat filters, eyebrow width was significantly reduced in the feminine filter compared to masculine; neither significantly differed from the unfiltered images (Figure 5A). Eyebrow height (representing thickness) was significantly higher with the masculine filter compared to unfiltered, and significantly reduced in the feminine-filtered images compared to both others (Figure 5B). Six participants remarked upon how the masculine filter “strengthened” eyebrows, while the feminine filter “shaved” them down. Those who noted this (all AFAB) universally disliked the feminizing reduction and enjoyed the masculine heavier brows.

Figure 5: Quantitative Filter Analysis of Eyebrows. Eyebrow A) Width, and B) Height. Significance compared to No Filter is indicated with ****p<0.0001. Significance compared to Masculine Filter is indicated with #p<0.05, ####p<0.0001.
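The widths and heights reported above can be understood as pixel distances between pairs of facial landmarks, normalized to the unfiltered selfie. Below is a minimal sketch of one such measurement, with invented landmark coordinates; it is illustrative of the approach rather than the exact procedure used.

```python
# Hedged sketch: a facial width as the Euclidean distance between two manually
# placed landmarks (in pixels), then normalized to the unfiltered image.
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) landmark coordinates, in pixels."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Hypothetical temple-to-temple landmarks on the same participant's selfies.
forehead_width_unfiltered = distance((102, 240), (398, 244))
forehead_width_masculine = distance((88, 238), (412, 246))

# Normalized value of the kind summarized in Table 2 (> 1 means wider than unfiltered).
print(forehead_width_masculine / forehead_width_unfiltered)
```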

Eyes were controversial. Six participants reported eyes as their favorite feature on their own faces. Six AFAB individuals described eyes as a feature commonly appreciated by others, four of whom believed that their eyes outed them as trans or got them read as more feminine. Indeed, P2, whose favorite feature was his eyes, specifically described “clocking” a man as trans whenever “he has pretty eyes and he’s short.” In contrast, P1 explained that his eyes never felt gendered: “For me, it never changed. Like, your jaw line changes, the way things fit on your face changes, but if I were to call something gender neutral...I always recognize myself in my eyes.” In fact, he remarked, triumphantly, that though the feminine Snapchat filter attempted to change his eyes, “They can’t really—you’re still there.” Seven reported eye enlargement with the feminine filter; no one mentioned the masculine filter. Indeed, eye width was measured to be significantly greater with the feminine filter compared to both unfiltered and masculine-filtered images (Figure 6A). The same was true for eye height being greater with the feminine filter, while the masculine filter conversely generated shorter eyes (Figure 6B).

Figure 6: Quantitative Filter Analysis of Eyes. Eye A) Width, and B) Height. Significance compared to No Filter is indicated with *p<0.05, ****p<0.0001. Significance compared to Masculine Filter is indicated with #p<0.05, ####p<0.0001.

Adornment Makeup and hair length were other frequently cited tools for ascribing gender to faces. Ten participants reported makeup as a key for facial gender legibility as a woman. Three AMAB individuals utilized makeup to prevent misgendering, and the other two AMAB participants reported not using it daily because they did not “need” makeup to “pass” (be legible) as a woman. In contrast, three AFAB individuals reported minimizing their use of makeup in order to avoid misgendering. P12, a twenty-four-year-old non-binary participant, reflected on the social dysphoria created by binarized gender expectations that impeded her presenting authentically: “I feel really good when I wear makeup, but it's really hard, because people assign makeup to a gender, and that makes me feel really dysphoric.” Eight AMAB participants regularly use makeup to feminize their appearance, which includes mascara, lipstick, blush, eye liner, eye shadow, foundation, and contouring. Two trans women described crying with joy whenever they put on a


full face of makeup. As P7, a twenty-seven-year-old trans woman, explained, using makeup facilitates

masking certain elements, emphasizing others, creating shapes from where there are none…The rest is in order to conform to gender stereotypes. Honestly, I did not think that the personal aspect would be as big. I was mostly focused on the social aspect, but no: the euphoria is actually pretty great. The lipstick and eyeshadow in my experience are the ones that have the greatest effect both societally and on my general euphoria. Beyond that, the full face of makeup, whenever that happens really makes me cry [laughs] every time. So, I only do it when I have stuff to help with that [laughs]. Cry from joy.

Makeup helped her to read herself as a woman when she looked at her reflection, and that gendered legibility was profoundly euphoric. Eight AFAB participants report regularly using masculinizing makeup, including darkening and/or thickening their eyebrows and using contouring to add angularity to the jaw and cheeks. P14, a twenty-four-year-old agender, genderfluid AFAB participant, used “goth” makeup, which they did not feel was gendered; however, they have reduced this form of self-expression due to wanting to “pass” as a man at work. With the Snapchat feminizing filter, nine participants mentioned use of makeup, including enhancing eyelashes (n=2), and addition of blush (n=3), eye liner (n=1), and lip color (n=1). Seven participants noted the use of blemish airbrush in the feminine filter, and absence of such smoothing in the masculine filter. Eleven participants had long hair at the time of the interview (at or below chin length). The feminine filter always projected long hair (at least shoulder length) and the masculine filter always yielded short hair. Twelve participants explicitly disliked the digital hair changes, and no participant liked them. Two Black participants remarked that the filters were made for “white” people because feminine filter had given them long, straight hair that they would never have, regardless of gender presentation. Three participants believed head hair serves as a factor in the racialized gendering of faces more broadly. P5, a thirty-eight-year-old trans woman of Middle Eastern/Arab descent, described being read as more feminine due to her (inherited) wavy hair texture. P27, a twenty-five-year-old Black non-binary participant, reflected that different races have specific internal beauty standards, and amongst Black women, hair is one of the most important ways of signaling feminine beauty. P29, a twenty-five-year-old genderqueer individual, found that the filters confirmed their belief that brightly colored hair signals femininity. Their hair was


their favorite facial feature because, “I can do whatever I want with it very easily [laughs]. It feels like something that is way more versatile and I have way more control over than the rest of my face…Even when I was in cis girl mode, because I could mess around with it like: ponytail, braid, longer, shorter, whatever I wanted.” Yet this tendency to try different styles as forms of self-expression can contribute to their being misgendered as a woman “as soon as my hair gets remotely long or it gets brightly colored.” The masculine “My Twin” filter reinforced this prior inclination, “[It] really takes it home to me that my haircut is really not masculine, because it really doesn't match at all. And same with the shirt…[which] to me doesn't scream feminine…But that shirt fits way better on the feminine photo than the masculine photo. So, I was like, ‘oh fuck.’ A lot of the things I'm doing are already gender parsed in a way that I didn't even process were that gender parsed.” Thus, digital distortion yielded improved insight into how their real-life presentation was legibly gendered.

(De?)Gendered Gaze
All of the features that participants identified as contributing to reading gender on faces were reflected in the changes adopted by the “My Twin” Snapchat filters, and three participants explicitly remarked upon that. Those clues to legibility included both hormonally driven and styled or performed features. Indeed, fifteen participants described how being trans intensified their own scrutiny of others’ genders, due to hyperawareness of the elements of their own faces that induce dysphoric distress. That hypocrisy underscores the built-in bind for those who attempt to buck gendered expectations, as understanding what one must (de)construct in order to read as desired within a binary societal structure requires investing attention into the very hegemonic doctrine that they are otherwise critiquing. Despite this replicability of societal gender legibility, eleven participants, owing to their own negative experiences of being misgendered, reported actively trying not to gender strangers. P16 explained, “I feel bad, as someone who is non-binary, about having myself categorized in a certain way. So, I have stopped doing it to other people.” Five conjectured that cisgender individuals would be much better at describing the gendering “algorithms” they personally implemented than trans persons would be. Yet not one participant struggled to articulate which of their facial features strangers use to assign them a gender, which suggests that even when actively resisting their learned gaze, participants remained fluent in societal gendered legibility. For trans persons, navigating gendered legibility requires balancing authentic expression, societal legibility, and the reinforcement of hegemonic norms. Speaking to these internal tensions, P5 described, “I perceive it as a sort of triangle where I have to negotiate between three things: my own desire that confining or oppressive gender norms be eliminated; my own aesthetic preferences for style
and self-presentation, which would incorporate some but not all aspects of conventional gender norms, and, last but not least, the need to embody or comply with established norms in order to be read and accepted at least minimally by people at large.” The “My Twin” Snapchat filters—as digital algorithms displaying human gendering algorithms—visually reveal the way that one can distill and present prescriptive gender norms. Yet the reaction to those filters underscores the failure of such hegemony; “legibility” is unsatisfying when it comes at the cost of authenticity.

Zooming Out
My work with trans participants here offers key insights into external gendered legibility, Western binary hegemonic gendered ideals, and how digital distortion with facial filters may allow exploration of self-recognition (or personal intelligibility). Notably, the “My Twin” filters’ distortions matched participants’ descriptions of the facial features they used to assign genders to others. While all participants reported wanting features they associated with a sex other than their own sex assigned at birth, they found the filters unsatisfying due to inauthenticity and inscription of racialized beauty ideals. Thus, there was a disjunction between trans persons’ fantasy for their gendered features and legibility and the discomfort associated with filters accentuating such stereotyped features. This hypothesis is supported by recent work (with cisgender men, cisgender women, and non-binary individuals) that found that participants tried to avoid conspicuously applied selfie filters, which appeared “gauche” and lacked the desirable “natural” aesthetic of less perceptible enhancement that one might believe was reality (Lavrence & Cambre, 2020). Part of the appeal of the illusion was making it believable. Indeed, communication and digital culture scholar Lisa Silvestri described selfies as a “sociocultural revolution” about “identity affirmation” (Silvestri, 2014, p. 114). Selfies with digital facial filters offer a potentially powerful component of a gender-affirming journey, yet—per my participants in this study—the examined “My Twin” filters did not deliver, likely partially because they were not coded for that purpose. What would it mean to design gendered filters more explicitly catering to trans persons’ desires? Per my quantitative analysis above, one might surmise that the Snapchat “My Twin” filters produce and accentuate dominant Western, white cisnormative gendered images. Some gender studies academics, such as Bernice L. Hausman (e.g., Hausman, 2001), have spent the past three decades controversially arguing that trans individuals are inherently anti-feminist and damaging to the feminist cause due to seeking to reinforce biological essentialism (i.e., seeking to alter one’s embodiment in order to better fit within societal expectations, signifying an equation of anatomy with behavior—masculinity or femininity). If such were the case for these participants, they would have found the “My Twin” filtered images
appealing and representative of their transition goals. This could not have been further from the truth. Instead, in my data, participants were able to read gender through the same features that the Snapchat filters imposed for gendered performativity, but were not satisfied by seeing their own morphology reconstrued as such. Rather, they sought a sense of self-recognition and agency in their embodiment—which is directly in keeping with feminist scholarship’s ideals of bodily autonomy and self-discovery, rather than in contradiction with them. As Susan Stryker reasons, “transgender phenomena ask us to follow basic feminist insights to their logical conclusion (biology is not destiny, and one is not born a woman, right?)” (Stryker, 2007, p. 59). Instead—in keeping with Stryker’s understanding of trans theory and trans experience—these data represent how participants were able to regurgitate what they had been socialized to consider embodied masculinity and femininity, and often sought some aspect of gender-affirming features in their own faces, but were not interested in claiming a fully reductive societal picture of gender. Our results speak to the tensions between external legibility and internal authenticity, echoing the concept of “cultural intelligibility,” as first coined by queer theorist Judith Butler in the context of butch/femme queer couples in her landmark “Gender Trouble” (Butler, 1990) and then tied to trans experience by transgender studies scholar Sandy Stone in “The ‘Empire’ Strikes Back: A Posttranssexual Manifesto” (Stone, 1992). In sum, if one were to design filters for trans individuals specifically, the goal would likely be subtler changes that allow for specific tuning of gendered features, like selective addition of different forms of facial hair or altering one’s hair length. Such interventions would allow trans selfie subjects to envision where they might like to take their gendered legibility in the future, genetics and luck willing. Such a digital intervention would only be appropriate if it were truly gender-neutral—a single filter system that would allow mixing and matching all sorts of “gendered” features interchangeably—a system that facilitated unbounded exploration. Such an unbounded experience would allow individuals to play with gender non-conformity, could help reduce internalized transphobia for those whose features are not celebrated for their gender, and would avoid reinscribing binary hegemonic gender norms rooted in white supremacy. The problematic roots of contemporary hegemonic gendered legibility set us up for the ambivalence and tension our participants voiced between cultural intelligibility and personal intelligibility. To be culturally legible means at least partially buying into hegemonic gender ideals—the falsely constructed “norms,” a deceptive misnomer. Such a performance—if convincing—offers safety and external affirmation. In contrast, while personal intelligibility can be influenced by
socially informed understandings of what makes a gendered body, it creates more space for individuality, and the ultimate goal is not fitting a specific mold, but rather self-recognition in the mirror and feeling aligned with one’s internal truth. Neither priority is right or wrong, and these forms of (in)visibility and distortion are not mutually exclusive, but the risks and benefits are distinct and their relative value is individual and situational. Without societal oppression, trans persons would theoretically be able to prioritize self-recognition—at risk of external unintelligibility—but we do not (yet) live in such a world. Even if we did, the tendrils of societal dogma about masculinity and femininity would likely continue to impact individual understandings about gender and subsequent choices about gendered presentation through internalized transphobia and internalized ideas about “real” men and women (read: binary hegemonic gendered ideals). For this reason, we return to Haraway’s query that I posed in the introduction: “With whose blood were my eyes crafted?” (Haraway, 1988, p. 585). Our ability to read “gender” visually on faces is crafted by societal context and personal experience. We cannot disentangle the Snapchat filters we analyzed or the participants’ observations we reported from Western hegemony, rooted in binary gender constructs, neoliberalism, and white supremacy. Yet these results contribute to a better understanding of how such forces impact trans individuals’ self-concepts, personal and external intelligibilities, and goals for their transitions. These results are also limited by multiple factors of positionality. For the sake of cohort experience consistency, I limited the sample to individuals who were currently residing in the United States or Canada. These experiences and opinions are those of my participants alone and cannot speak for whole populations. I was limited to the individuals who responded to the call for research participants, introducing self-selection (volunteer) bias into the sample. The group was limited to twenty-eight individuals by temporal and financial constraints, which is not small for this kind of study, but larger samples inherently offer greater breadth of experiences and perspectives. Additionally, my results suggest that self-concept improvement and patient satisfaction may be higher with gender-affirming care that results in more subtle enhancement of gendered facial features. Further research will be needed to characterize that, as I did not have the technical capacity to evaluate how participants would modify their own images if they were able to selectively edit specific features. That work with trans individuals will be essential to better understanding whether more subtle changes are deemed more desirable by participants, or whether participants only seek to change certain features, as opposed to the aforementioned list of distortions executed by the filters studied here. Some scholars are already working on such research generally, and I hope that they will build on this work by considering trans persons specifically (e.g., Leclercq, 2016).

Conclusion
Our results underscore the disjunction between the fantasy that transgender and/or non-binary individuals hold for their gendered features and legibility and the discomfort associated with filters accentuating such stereotyped features, which suggests that self-concept improvement and patient satisfaction may be higher with gender-affirming care that results in more subtle enhancement of gendered facial features. This work can inform mental health providers’ understanding of the role of technology in reimaginations of who and what we see in the mirror, and specifically informs treatment of transgender, non-binary, and/or gender diverse individuals.

Acknowledgments
This project was funded by the Columbia University Vagelos College of Physicians and Surgeons Scholarly Project Fund. Thank you to Drs. Michael J. Devlin and Adele Tutter for your mentorship for this research project, to Amanda C. Arcomano for your editorial assistance, and to Drs. Vanessa Agard-Jones and Joanna Radin for providing the foundational training that made this work possible.

Notes
1. Snapchat neglected to reply to the queries sent about the origins of or digital effects induced by the “My Twin” filter, so all information presented here is generated from reading news articles in which the authors were able to speak to the application’s employees (e.g., Andersen, 2019) or from “reverse engineering” the filters by comparing normalized quantitative measurements of specific facial features.

2. Photographic analyses were done using ImageJ software. The images were technically blinded, though the effects of the filters were often apparent. Length measurements for each photograph were normalized to the square root of the area of the smallest rectangle that could fully enclose the entire face and neck, from the apparent upper limit of the skull down to the sternal notch. I used this technique to account for the different degrees of zooming in on the photographs. This area scaling was not applied to angle measurements, as zoom level does not influence angles. Specific measurement boundaries are described in the relevant sections.
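
To make the scaling described in note 2 concrete, the following is a minimal sketch of that normalization, assuming all measurements are taken in pixels; the function name and example values are hypothetical illustrations, not the ImageJ workflow actually used in this study.

```python
# Minimal sketch (not the study's actual ImageJ script): normalizing a length
# measurement to the square root of the area of the smallest rectangle that
# encloses the face and neck, so that photographs taken at different zoom
# levels become comparable. Angle measurements are left unscaled.
import math

def normalize_length(length_px: float, box_width_px: float, box_height_px: float) -> float:
    """Return the raw length divided by the square root of the bounding-box area."""
    bounding_area = box_width_px * box_height_px
    return length_px / math.sqrt(bounding_area)

# Hypothetical example: a 120 px jaw width inside a 400 x 550 px bounding box.
print(round(normalize_length(120, 400, 550), 3))  # ~0.256 (dimensionless)
```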

References
Andersen, Sage. 2019. “Snapchat's 'Gender-Swap' Filter Exposes the Internet's Casual Transphobia.” Mashable, May 16, 2019. https://mashable.com/article/snapchat-gender-swap-filter/.

Axt, Jordan, Morgan Conway, Erin Westgate, and Nick Buttrick. 2020. “Implicit Transgender Attitudes Independently Predict Beliefs about Gender and Transgender People.” Personality and Social Psychology Bulletin 47 (2): 257–74. https://doi.org/10.1177/0146167220921065.

Ballihi, Lahoucine, Boulbaba Ben Amor, Mohamed Daoudi, Anuj Srivastava, and Driss Aboutajdine. 2012. “Boosting 3-D-Geometric Features for Efficient Face Recognition and Gender Classification.” IEEE Transactions on Information Forensics and Security 7 (6): 1766–79. https://doi.org/10.1109/TIFS.2012.2209876.

Bederman, Gail. 2008. Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880–1917. Chicago: University of Chicago Press.

Behar, Ruth. 2014. The Vulnerable Observer: Anthropology That Breaks Your Heart. Boston: Beacon Press.

Bissinger, Buzz. 2015. “Caitlyn Jenner: The Full Story.” Vanity Fair, June 25, 2015. https://www.vanityfair.com/hollywood/2015/06/caitlyn-jenner-bruce-cover-annie-leibovitz.

Bivens, Rena. 2017. “The Gender Binary Will Not Be Deprogrammed: Ten Years of Coding Gender on Facebook.” New Media & Society 19 (6): 880–98. https://doi.org/10.1177/1461444815621527.

Brooks, Siobhan. 2010. “Hypersexualization and the Dark Body: Race and Inequality among Black and Latina Women in the Exotic Dance Industry.” Sexuality Research and Social Policy 7 (2): 70–80. https://doi.org/10.1007/s13178-010-0010-5.

Bulygina, Ekaterina, Philipp Mitteroecker, and Leslie Aiello. 2006. “Ontogeny of Facial Dimorphism and Patterns of Individual Development within One Human Population.” American Journal of Physical Anthropology 131 (3): 432–43. https://doi.org/10.1002/ajpa.20317.

Buolamwini, Joy, and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification." Proceedings of the 1st Conference on Fairness, Accountability and Transparency. PMLR, no. 81, 77–91. https://proceedings.mlr.press/v81/buolamwini18a.html.

Butler, Judith. 1990. “Gender Trouble, Feminist Theory, and Psychoanalytic Discourse.” In Feminism/Postmodernism, edited by Linda J. Nicholson, 324–40. New York: Routledge.

Cambre, Maria-Carolina, and Christine Lavrence. 2019. “How Else Would You Take a Photo? #SelfieAmbivalence.” Cultural Sociology 13 (4): 503–24. https://doi.org/10.1177/1749975519855502.

Carby, Hazel V. 1987. Reconstructing Womanhood: The Emergence of the Afro-American Woman Novelist. New York: Oxford University Press.

Carré, Justin M., and Cheryl M. McCormick. 2008. “In Your Face: Facial Metrics Predict Aggressive Behaviour in the Laboratory and in Varsity and Professional Hockey Players.” Proceedings of the Royal Society B: Biological Sciences 275 (1651): 2651–56. https://doi.org/10.1098/rspb.2008.0873.

Carter, Julian B. 2007. The Heart of Whiteness: Normal Sexuality and Race in America, 1880–1940. Durham, NC: Duke University Press.

Cooky, Cheryl, and Shari L. Dworkin. 2013. “Policing the Boundaries of Sex: A Critical Examination of Gender Verification and the Caster Semenya Controversy.” Journal of Sex Research 50 (2): 103–11. https://doi.org/10.1080/00224499.2012.725488.

Crenshaw, Kimberlé. 1991. “Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color." Stanford Law Review 43 (6): 1241–99. https://doi.org/10.2307/1229039.

———. 2017. On Intersectionality: Essential Writings. New York: The New Press.

Crenshaw, Kimberlé, Neil Gotanda, Gary Peller, and Kendall Thomas, eds. 1995. Critical Race Theory: The Key Writings That Formed the Movement. New York: The New Press.

Driskill, Qwo-Li. 2011. Queer Indigenous Studies: Critical Interventions in Theory, Politics, and Literature. Tucson: University of Arizona Press.

Dussault, J. 2016. “Why Snapchat Pulled Its ‘Anime’ Filter.” The Christian Science Monitor, August 12, 2016. https://www.csmonitor.com/Technology/2016/0812/Why-Snapchat-pulled-its-anime-filter.

Haimson, Oliver. 2018. “Social Media as Social Transition Machinery.” Proceedings of the ACM on Human-Computer Interaction 2 (CSCW): 1–21. https://doi.org/10.1145/3274332.

Halberstam, Jack. 2019. Female Masculinity. Durham, NC: Duke University Press.

Hamidi, Foad, Morgan Klaus Scheuerman, and Stacy M. Branham. 2018. “Gender Recognition or Gender Reductionism? The Social Implications of Embedded Gender Recognition Systems.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Paper no. 8, 1–13. https://doi.org/10.1145/3173574.3173582.

Haraway, Donna. 1988. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14 (3): 575–99. https://doi.org/10.2307/3178066.

Harbison, Cammy. 2019. “Snapchat's New Gender Swap Filter Will Make You Question Your Identity: How to Get the Male to Female Filter.” Newsweek, May 14, 2019. https://www.newsweek.com/snapchat-gender-swap-filter-how-get-girl-boy-change-male-female-how-use-not-1425014.

Hausman, Bernice L. 2001. “Recent Transgender Theory.” Feminist Studies 27 (2): 465–90. https://doi.org/10.2307/3178770.

Henderson, Rik. 2019. “Snapchat Gender Swap, How to Do It Yourself and Some Hilarious Celebrity Results.” Pocket-Lint, May 17, 2019. https://www.pocket-lint.com/apps/news/snapchat/148095-snapchat-gender-swap-how-to-do-it-and-check-out-these-crazy-celebrity-results.

Herzig, Rebecca M. 2015. Plucked: A History of Hair Removal. New York: NYU Press.

Kekre, H.B., Sudeep D. Thepade, and Tejas Chopra. 2010. “Face and Gender Recognition Using Principal Component Analysis.” International Journal on Computer Science and Engineering 2 (4): 959–64. http://www.enggjournals.com/ijcse/doc/IJCSE10-02-04-26.pdf.

Keyes, Os. 2018. “The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition.” Proceedings of the ACM on Human-Computer Interaction 2 (CSCW): 1–22. https://doi.org/10.1145/3274357.

Lavrence, Christine, and Carolina Cambre. 2020. “‘Do I Look Like My Selfie?’: Filters and the Digital-Forensic Gaze.” Social Media + Society 6 (4). https://doi.org/10.1177/2056305120955182.

Leclercq, Charlotte. 2016. “Do You ‘Lensit’? A Call for Research on Modified Selfies.” Masters of Media (blog), October 20, 2016. http://mastersofmedia.hum.uva.nl/blog/2016/10/20/do-you-lensit-a-call-for-research-on-modified-selfies/.

Lessig, Lawrence. 2009. Code: And Other Laws of Cyberspace. ReadHowYouWant.com.

Marino, Mark C. 2006. "Critical Code Studies.” Electronic Book Review. Cambridge, MA: MIT Press.

Mast, Marianne Schmid. 2004. “Men Are Hierarchical, Women Are Egalitarian: An Implicit Gender Stereotype.” Swiss Journal of Psychology / Schweizerische Zeitschrift für Psychologie / Revue suisse de psychologie 63 (2): 107-111. https://doi.org/10.1024/1421-0185.63.2.107.

Moghaddam, Baback, and Ming-Hsuan Yang. 2002. “Learning Gender with Support Faces.” IEEE Transactions on Pattern Analysis and Machine Intelligence 24 (5): 707–11. https://doi.org/10.1109/34.1000244.

———. 2006. United States Patent No. US6990217B1. Google Patents.

Morgensen, Scott Lauria. 2011. Spaces between Us: Queer Settler Colonialism and Indigenous Decolonization. Minneapolis: University of Minnesota Press.

Ngan, Mei, and Patrick J. Grother. 2015. Face Recognition Vendor Test (FRVT) Performance of Automated Gender Classification Algorithms. US Department of Commerce, National Institute of Standards and Technology.

Noble, Safiya Umoja. 2013. “Google Search: Hyper-Visibility as a Means of Rendering Black Women and Girls Invisible.” InVisible Culture 19 (October 29). https://ivc.lib.rochester.edu/google-search-hyper-visibility-as-a-means-of-rendering-black-women-and-girls-invisible/.

———. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

Oyěwùmí, Oyèrónkẹ́. 1997. The Invention of Women: Making an African Sense of Western Gender Discourses. Minneapolis: University of Minnesota Press.

Penton-Voak, Ian S., Benedict C. Jones, A.C. Little, S. Baker, B. Tiddeman, D.M. Burt, and David I. Perrett. 2001. “Symmetry, Sexual Dimorphism in Facial Proportions and Male Facial Attractiveness.” Proceedings of the Royal Society of London. Series B: Biological Sciences 268 (1476): 1617–23. https://doi.org/10.1098/rspb.2001.1703.

Perez, Claudio, Juan Tapia, Pablo Estévez, and Claudio Held. 2012. “Gender Classification from Face Images Using Mutual Information and Feature Fusion.” International Journal of Optomechatronics 6 (1): 92–119. https://doi.org/10.1080/15599612.2012.663463.

Perrett, David I., Kieran J. Lee, Ian Penton-Voak, D. Rowland, Sakiko Yoshikawa, D. Michael Burt, S.P. Henzi, Duncan L. Castles, and Shigeru Akamatsu. 1998. “Effects of Sexual Dimorphism on Facial Attractiveness.” Nature 394 (6696): 884–87. https://doi.org/10.1038/29772.

Perrett, David I., Karen A. May, and Sakiko Yoshikawa. 1994. “Facial Shape and Judgements of Female Attractiveness.” Nature 368 (6468): 239–42. https://doi.org/10.1038/368239a0.

Pollock, Anne. 2015. “Heart Feminism.” Catalyst: Feminism, Theory, Technoscience 1 (1): 2–30. https://doi.org/10.28968/cftt.v1i1.28811.

Reddy, Leila, Patrick Wilken, and Christof Koch. 2004. “Face-Gender Discrimination Is Possible in the Near-Absence of Attention.” Journal of Vision 4 (2): 4. https://doi.org/10.1167/4.2.4.

Rettberg, Jill Walker. 2014. Seeing Ourselves through Technology: How We Use Selfies, Blogs and Wearable Devices to See and Shape Ourselves. Springer Nature. E-book.

Roberts, Elizabeth F.S. 2015. “Bio-Ethnography: A Collaborative, Methodological Experiment in Mexico City.” Somatosphere. http://somatosphere.net/2015/bio-ethnography.html/.

Rothstein, Richard. 2017. The Color of Law: A Forgotten History of How Our Government Segregated America. New York, NY: Liveright Publishing.

Rowland, Duncan A., and David I. Perrett. 1995. “Manipulating Facial Appearance through Shape and Color.” IEEE Computer Graphics and Applications 15 (5): 70–76. https://doi.org/10.1109/38.403830.

Roy, Deboleena. 2018. Molecular Feminisms: Biology, Becomings, and Life in the Lab. Seattle: University of Washington Press.

Scheuerman, Morgan Klaus, Jacob M. Paul, and Jed R. Brubaker. 2019. “How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis Services.” Proceedings of the ACM on Human-Computer Interaction 3 (CSCW): 1–33. https://doi.org/10.1145/3359246.

Schuller, Kyla. 2018. The Biopolitics of Feeling: Race, Sex, and Science in the Nineteenth Century. Durham, NC: Duke University Press.

Silvestri, Lisa. 2014. “Shiny Happy People Holding Guns: 21st-Century Images of War.” Visual Communication Quarterly 21 (2): 106–18. https://doi.org/10.1080/15551393.2014.928159.

Stevenson, Angus, ed. 2010. Oxford Dictionary of English. Oxford University Press.

Stone, Sandy. 1992. “The Empire Strikes Back: A Posttranssexual Manifesto.” Camera Obscura 10 (2): 150–76. https://doi.org/10.1215/02705346-10-2_29-150.

Stryker, Susan. 2007. “Transgender Feminism.” In Third Wave Feminism, edited by Stacy Gillis, Gillian Howie, and Rebecca Munford, 59–70. London: Springer. https://doi.org/10.1057/9780230593664_5.

———. 2017. Transgender History: The Roots of Today's Revolution. New York, NY: Seal Press.

Tiddeman, Bernard, Michael Burt, and David Perrett. 2001. “Prototyping and Transforming Facial Textures for Perception Research.” IEEE Computer Graphics and Applications 21 (5): 42–50. https://doi.org/10.1109/38.946630.

Toma, Arshed M., Alexei Zhurov, R. Playle, and Stephen Richmond. 2008. “A Three‐Dimensional Look for Facial Differences between Males and Females in a British‐Caucasian Sample Aged 15½ Years Old.” Orthodontics & Craniofacial Research 11 (3): 180–85. https://doi.org/10.1111/j.1601-6343.2008.00428.x.

Verdonck, Anna, M. Gaethofs, Carine Carels, and Francis de Zegher. 1999. “Effect of Low-Dose Testosterone Treatment on Craniofacial Growth in Boys with Delayed Puberty.” The European Journal of Orthodontics 21 (2): 137–43. https://doi.org/10.1093/ejo/21.2.137.

Wang, Jian-Gang, Jun Li, Wei-Yun Yau, and Eric Sung. 2010. “Boosting Dense SIFT Descriptors and Shape Contexts of Face Images for Gender Recognition.” 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops. https://doi.org/10.1109/CVPRW.2010.5543238.

Wang-Jones, Tiffani, Omar M. Alhassoon, Kate Hattrup, Bernardo M. Ferdman, and Rodney L. Lowman. 2017. “Development of Gender Identity Implicit Association Tests to Assess Attitudes toward Transmen and Transwomen.” Psychology of Sexual Orientation and Gender Diversity 4 (2): 169–83. https://doi.org/10.1037/sgd0000218.

Weston, Eleanor M., Adrian E. Friday, and Pietro Liò. 2007. “Biometric Evidence That Sexual Selection Has Shaped the Hominin Face.” PLoS One 2 (8). https://doi.org/10.1371/journal.pone.0000710.

White, Michael J., and Gwendolen B. White. 2006. “Implicit and Explicit Occupational Gender Stereotypes.” Sex Roles 55 (3–4): 259–66. https://doi.org/10.1007/s11199-006-9078-z.

Willey, Angela. 2016. Undoing Monogamy: The Politics of Science and the Possibilities of Biology. Durham, NC: Duke University Press.

Wilson, Elizabeth A. 2015a. “Depression, Biology, Aggression: Introduction to Gut Feminism (Duke University Press, 2015).” Catalyst: Feminism, Theory, Technoscience 1 (1): 1–10. https://doi.org/10.28968/cftt.v1i1.28812.

———. 2015b. Gut Feminism. Durham, NC: Duke University Press.

Author Bio
Teddy G. Goetz (he/him or they/them) is a psychiatry resident at the University of Pennsylvania. Prior to earning his M.D. at Columbia, he studied biochemistry and gender studies at Yale, conducting research on a wide spectrum of biologically and socially determined aspects of gender-based health disparities, including earning his M.S. by developing the first animal model of gender-affirming hormone therapy. His current focuses include mixed-methods research on LGBTQ mental health, as well as narrative medicine and physician advocacy. More about his scholarly and artistic work can be found at teddygoetz.com.