Future Tense

Male-Order Brides

Why smart home devices and feminized A.I. need a rethink.

Photo illustration by Slate: an Amazon Echo device wearing a skirt and matching bow. Photos by Amazon and Getty Images Plus.

This article is an adapted excerpt from The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot by Yolande Strengers and Jenny Kennedy © 2020 Massachusetts Institute of Technology.

We are witnessing the slow death of the wife in contemporary society—at least the wife we’ve known as the longtime backbone of patriarchal society. But she’s having an enthusiastic comeback, with a few critical upgrades. It’s not wives themselves who are being asked to come back into the kitchen, but rather feminized artificial intelligence built into robots, digital voice assistants (also known as smart speakers, conversational agents, virtual helpers, and chatbots), and other smart devices.

The smart wife comes in many forms. In fact, chances are you’re already living with her. Most obvious are assistants such as Amazon’s Alexa, Apple’s Siri, or Google Home, which have default female voices in most markets. Other smart wives are anthropomorphic, zoomorphic, or automated (such as home appliances or domestic robots)—most of which carry out domestic responsibilities that have traditionally fallen to wives. Smart wives can also be found in the bodies of overtly feminized and sometimes “pornified” sex robots or gynoids.

So who wants a smart wife? Potentially everyone. In 2016, the research firm Gartner predicted that people would soon be having more conversations with bots than with their spouses. More than a quarter of the adult population in the United States now owns at least one smart speaker like Alexa; that’s more than 66 million people.

When Siri made her debut in 2011 as “a sassy young woman who deflected insults and liked to flirt and serve users with playful obedience,” as a UNESCO report on closing gender divides in digital skills put it, her “coming-out party” reached nearly 150 million iPhones in her first year. This single technology—developed behind closed doors by one company in one corner of the world with little input from women—shaped global expectations for smart wives and A.I. assistants more broadly in a little over 12 months.

In terms of gendered interest and uptake, industry sales figures show that consumers of smart home devices are more likely to be male, and “smart home obsessives” are invariably men. Men are also more often the instigators for bringing smart home technologies into the home and managing their operation. However, women (and the significant percentage of the world’s population that is not heterosexual men) need (smart) wives too. Millennial women in the United States, ages 18 to 35, are particularly excited about smart home technology, and the occasional report finds that women are actually more interested than men in some devices, such as voice assistants and certain smart appliances.

Narrowing down to specific markets reveals other gender differences in interest, uptake, and benefits. The vast majority of people currently interested in or buying sex robots (and dolls) are men. Women are understandably less enthusiastic about the penetration-oriented characteristics of most current offerings. By contrast, in the present social robot market, women stand to benefit most: they live longer than men and are therefore more likely to suffer from debilitating conditions such as dementia, the care of which is one of the emerging applications for care robots.

When it comes to the creation of smart wives, men are clearly in the lead. Men vastly outnumber women in computer programming jobs, making up over 75 percent of the total pool of programmers in the United States in 2017. In the field of robotics and A.I., men outnumber women as well. Men make up between 77 and 83 percent of the technical positions at Apple, Facebook, Microsoft, Google, and General Electric, and just over 63 percent at Amazon. Men make up 85 percent of the A.I. research staff at Facebook and 90 percent at Google. Likewise, in academic environments, more than 80 percent of A.I. professors are men, and only 12 percent of leading A.I. researchers are women.

Indeed, computer science has gone backward on gender diversity in the past 30 to 40 years, with female participation in computer science degrees in the U.S. dropping from 37 percent in the early 1980s to 18 percent in 2016, despite a number of active campaigns and initiatives to try to turn this around. As the same UNESCO report depressingly puts it, “The digital space is becoming more male-dominated, not less so.”

So potent is the gendered imbalance in computing that the journalist, producer, and author Emily Chang labeled the coding culture of Silicon Valley a “Brotopia.” In addition, the A.I. industry has been called out by leading academics and commentators like Kate Crawford and Jack Clark for having a “white guy problem” in an industry characterized by a “sea of dudes.” Indeed, the A.I. Now Institute has identified a “diversity crisis” perpetuated by harassment, discrimination, unfair compensation, and lack of promotion for women and ethnic minorities. The institute recommends that “the A.I. industry needs to make significant structural changes to address systemic racism, misogyny, and lack of diversity.”

This gender and racial imbalance filters down to the ways in which technologies are imagined and created. Scholars such as Safiya Umoja Noble have written about the “algorithms of oppression” that characterize search engines like Google, which reinforce racism and sexism. There has also been considerable criticism leveled at digital voice assistants like Alexa, Siri, and Google Home, and other types of smart wives, for their sexist overtones, diminishing of women in traditional feminized roles, and inability to assertively rebuke sexual advances.

For example, Microsoft’s Cortana and the open-source Mycroft assistant take their names, as well as their identities, from gamer and sci-fi culture, respectively; both domains have been widely critiqued as highly sexist. Likewise, assistants like Microsoft’s Ms. Dewey and Facebook’s Moneypenny (both now retired from service) were sexually suggestive and flirtatious by reputation (Moneypenny is named for the coy secretary and romantic interest in the James Bond novels and films) or through their coded behavior. All of this gendering of smart tech bakes a particular set of (mostly) men’s ideas about home, wives, domestic responsibilities, and sexual desires into the products themselves. And it potentially excludes a lot of other people for whom these ideas don’t resonate—including, we should note, many men.

An eerie example of the feminization of smart technologies is found in the Japanese digital voice assistant named Azuma Hikari, developed by the company Vinclu as part of its Gatebox device, which entered mass production in 2019. This cutesy smart wife is targeted toward the country’s singles; one-person households are now the most common household type in Japan. Described by her creators as a “bride character,” Hikari is a virtual anime hologram, with blue hair and matching outfits, who lives in a glass tube about 30 centimeters high and 10 centimeters wide. She is depicted as a 20-year-old woman with a schoolgirl-ish and upbeat personality. Hikari wears a short skirt and over-the-knee socks and has a high-pitched voice supplemented with coquettish giggles. In several promotional videos, she takes care of a lonely, hardworking young Japanese man. Her key role is to greet her “master” with excitement when he comes home and to check on him during the day by sending helpful messages such as “come home early.” She also provides timekeeping services and weather advice, turns off the lights when her master leaves the house, adjusts the home’s heating and cooling, and remembers their anniversary.

Hikari is an ideal smart wife (or girlfriend), doting on her man’s needs. She is also useful and efficient, helping men keep their schedules on track. But if she does her job too well, Japan’s singles may see even less need to look for a human companion—potentially exacerbating the country’s falling birth rate. On this point, the Japanese government has bigger plans for smart wives (as do other nations, like China). Professor of anthropology Jennifer Robertson suggests that there is a push to position social and care robots like Pepper as an opportunity to redirect Japanese women’s time back to the task of having children. Smart wives are thus entangled in social and political agendas about the role of women, wives, and heteronormative relationships in contemporary societies.

Sex robots and virtual pornography take these ideas in other tantalizing and potentially troubling directions. U.S. company RealDoll’s Harmony sexbot has 18 customizable feminized personality traits (including jealous, shy, moody, thrilled, and insecure), 42 different nipple options, and different voice selections (including a “saucy Scottish accent”), and she remembers her user’s favorite food (like any good smart wife should). But her true stroke of genius is this: Harmony’s removable and self-lubricating vagina is dishwasher safe. She is smart (with controllable parts and efficient cleaning!) and wifely (devoted to her man’s intimate needs). She is a woman with all the sexy bits, without all the mess or fuss. Harmony is customizable yet uniform, deeply feminine but with masculine efficiency, and there to be enjoyed, consumed, and penetrated. To be clear, this isn’t creepy because we’re talking about a robot (we’re not here to vilify anyone’s kinks) but rather because it embodies a pornographic idea of female sexiness that—in some cases—celebrates nonconsensual sex.

This gets to the heart of the strange paradox that characterizes the smart wife: She is simultaneously a dutiful, feminine wife and sexual muse while adeptly solving household problems with technological tools. She is docile and efficient. Compliant and in control. Seductive yet shrewd. Intimate yet distant. She is ready to be played, ready to serve, and able to optimize her domain.

On the one hand, the smart wife represents an ingenious solution to the ongoing disputes over the division of domestic labor that plague contemporary households in gender-progressive societies. On the other hand, there is something downright worrying about the smart home and robotic industries’ subtle characterization of their products as nostalgic, sometimes porn-inspired wifely figures. This is particularly so because most contemporary societies are trying to move on from these representations of women.

What’s the problem exactly? For a start, these depictions shape how we treat our devices, robots, and A.I., and that treatment is in turn reflected in how we treat people in general—and women in particular. Friendly and helpful feminized devices often take a great deal of abuse when their owners swear or yell at them. They are also commonly glitchy, leading owners and technology commentators to characterize them as ditzy—a trait that reinforces outdated (and unfounded) stereotypes of women as intellectually inferior.

Relatedly, a smart wife gone wrong is the central plotline of many sci-fi movies featuring feminized A.I., such as Joanna in The Stepford Wives, Vanessa in Austin Powers, and Ava in Ex Machina. These women are typically sexualized, demure, and slightly dysfunctional, yet ready to retaliate against their male makers, owners, and enslavers. The plotlines of these highly entertaining films consistently reinforce the cliché that the perfect woman is an artificial one—as long as she doesn’t have too much control or power, as then she will rebel against, kill, or enslave her makers. Ironically, it’s often the lonely techies in these films and stories who fall for these femme fatales and suffer the consequences. Yet despite their problems (or perhaps because of them), these imperfect on-screen smart wives are often identified as the design inspiration and source code for those now entering our homes.

We know from research carried out in the fields of robotics, human-computer interaction, and psychology that humans assign emotional as well as personal traits to computers. A smart wife precedent for this was set in 1966, when the pioneering computer scientist Joseph Weizenbaum created the first chatbot, named ELIZA. This fembot, which performed rudimentary natural language processing, was cast in the role of a psychotherapist and worked by turning the statements of her “clients” back on them as questions in the style of Rogerian therapy (such as “And how does that make you feel?”). Weizenbaum was surprised, and later dismayed, to discover how intimately his colleagues related to ELIZA, and how quickly they formed emotional connections with this artificial therapist.

Indeed, according to the late Clifford Nass and his collaborator Corina Yen, experts in the fields of human-computer interaction and user experience design, the success or failure of interactive computer systems depends on whether we like them, and on how well they treat us.

This is partly because people have a tendency to humanize devices and assign them genders, even when they don’t have one. Sherry Turkle, a professor of the social studies of science and technology, has pioneered research on people’s relationships with technology—especially mobile technology, social networking, and sociable robotics. She has found that the boundary between humans and machines is weakening, affecting how we understand and relate to one another, and leading to some troubling outcomes, such as reducing our communication with other people and making us feel lonelier than ever.

Other examples demonstrate how our connections and interactions with devices, animate or not, are gendered. Consider satellite navigation systems. Most of us prefer the female voice because we consider it warmer and more pleasant than a male one (the same holds true for smart wives). But we also distrust a female-voiced sat-nav (because women are notoriously bad at giving directions, or so the stereotype goes) and are quick to dismiss her helpfulness as badgering. When a new female voice command system was introduced into jet planes in 2012, U.S. fighter pilots referred to her as “Bitchin’ Betty” for getting louder and sterner when they ignored the system’s commands. These kinds of humanlike assistants provide us with an opportunity to perform and reinforce exaggerated gender stereotypes.

Likewise, on the homefront we prefer female voice assistants when their purpose is to discuss love and relationships, or help us with the housework (by adding groceries to the shopping list, for example, or, better still, restocking our fridges and pantries). In other words, we like our assistants to conform to gendered stereotypes. But designing gendered devices can also reinscribe those same stereotypes. When those devices start behaving erratically, or we perceive them to be annoying or acting “dumb,” we associate those characteristics with common gender typecasts.

In short, smart wives hark back to nostalgic stereotypes that people are now being told (through smart tech marketing) they deserve and should desire. Sure, overtly gendered smart wives are familiar, cute, sexy, friendly, and “easy to use”—but at what cost to society?

Left as they are, smart wives by and large serve a patriarchal capitalist system, which positions women as useful and efficient commodities, upholds (and promotes) gendered and sexual stereotypes, and paints men as boys who enjoy playing with toys. Of course, we don’t think that every man who is interested in designing or using smart technology is a misogynist or misogamist. We mean that the smart wife works with a narrow range of stereotypes that are potentially damaging for all genders. We aren’t man haters or tech haters, nor are we anti-technosexual (technosexuals being those who include the mechanical within their boundaries of sexuality). But we are killjoys of the smart wife as she is currently programmed, and our agenda is simple. We’re here to give her a reboot.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.