Data/Body: Corpus and the Cloud Empire of our Lives



My last column addressed generative art, a practice in which artists often use data sets to create complex works about our world. But where does that data come from? And, more importantly, can the aestheticization of data ignore its historical context or the privacy issues of its contemporary context?

Data colonialism is the how, the extractivist processes through which life gets newly appropriated by capitalism. The social quantification sector is the who, the consortium of private and public players who engage in data colonialism to achieve their financial and political goals. And the Cloud Empire is the what, the overall organization of resources and imagination that emerges from the practices of data colonialism.

–The Costs of Connection, Nick Couldry and Ulises A. Mejias

Data is not neutral; it is typically produced by extraction from our lives. Artists have unveiled that opaque appropriation to help us consider how we might resist. The topic has been on my mind because recent news has brought the political externalities of data collection systems to the fore. Federal Trade Commission Chair Lina M. Khan expressed concern about “business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used” in her proposal to consider possible rules around commercial data practices. When even the government is concerned that corporate abuse of people’s data has gone too far, we are in trouble. This dialogue has resurfaced in response to the inordinate invasion of privacy rights around reproductive healthcare in the United States; the two issues are deeply intertwined because privacy is a right that has always only been accorded to some bodies, with the rest dismissed as quanta to be sorted and sold.

A data set is called a corpus, a poetic allusion to its integrated body of information. Often the data is an extraction of lived bodies’ experiences, but gaps in that corpus also expose cultural amputations. The artist Stephanie Dinkins recently exhibited On Love and Data at the Queens Museum, and much of her work is about revealing the importance of including more people and voices in the development of our technological tools. The extent of racial bias in these systems has been well reported, but Dinkins’s creative tekne enquires, and subtly proposes, how technologies could embody a logic of care and mutuality rather than constant extraction. This is particularly important as AI becomes widespread, a technology and practice that will need to be expressly addressed in future columns.

Mad Pinney, the artistic coordinator at Art Blocks and a longtime community organizer around art and technology, describes her interest in a particular kind of creative effort:

Artists who challenge the status quo of Surveillance Capitalism by transmuting data as a system of control into a matter of collective reckoning…. Another example of “taking back” data is LaJune McMillian’s Black Movement Archive, which aims to ethically digitize the Black movement. I’m particularly inspired by the notion of data that exists outside of the quantifiable into the experiential, cultural, and historical. Yo-Yo Lin developed the Resilience Journal, a way to record the experience of being chronically ill and disabled in the form of soft data. Artists who challenge data and its extortive ways to enhance community, love, and shared history will always be the most powerful to me.

These artists present the bodies and motions of populations that have been unrecognized by standard data sets; the introduction of soft data recognizes the value of qualitative input. Their work reorganizes how we think of data through an oppositional gaze that ensures data remains entangled with our bodies.

As Jacqueline Wernimont aptly observes in Numbered Lives, “there are no data, tracking opportunities, algorithms or patterns without bodies.” For Data Sensorium (2022–ongoing), the choreographer and artist Lins Derry translated environmental migration data from the Internal Displacement Monitoring Centre into a movement score for performance, expressing climate change through the body. Her research on data embodiment aims at “moving data off ‘the page’ and into the world of bodies; it's about translating data from the abstract or visual domain into the kinesthetic so that it may be viscerally contended with too.” Derry’s practice anchors data in the body and counters the ways statistics and diagrams become dissociated from embodied experience.

If knowledge is power, then in the age of big data everyone faces the Orwellian forecast: ‘If you want a picture of the future, imagine a boot stamping on a human face—for ever.’ Ian Cheng’s recent project 3Face (2022) produces NFTs that interpret the other NFTs within a collector’s wallet to create a picture of the collector; the face alters in response to changes in the digital wallet. The project transforms something seemingly arbitrary, like the selection of works in a collector’s public transaction history, into their nurture, nature, and even something Cheng calls posture, i.e., “how you interface with the world.” A playful nod to predictive algorithms that more forcefully extrapolate data, Cheng’s project also hints at the discomfort that can come from such unveiling, since the face a collector receives can be altered by new purchases or sales. Playing with an inherent narcissism, Cheng cultivates greater cognition around collectors’ choices.

With the greater varieties of data now available and incredible processing speeds, data doesn’t have to be that big anymore either. The old triad of volume, variety, and velocity is no longer expensive to combine, and analysis has become cheap, which has led to great stores of data even when that data has no clear purpose. Now, collection occurs first and the schema for organizing it develops later, a marked lack of what Cennydd Bowles calls “ethical imagination” in Future Ethics.
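Software engineers sometimes call this pattern schema-on-read: store everything now, decide what it means later. A minimal sketch in Python, with every record invented for illustration:

    import json

    # Collection first: heterogeneous events dumped as-is, with no plan for them.
    lake = [
        json.dumps({"user": 17, "steps": 8042}),
        json.dumps({"user": 17, "lat": 40.69, "lon": -73.98}),
        json.dumps({"user": 17, "search": "clinic near me"}),
    ]

    # Schema later: months on, someone decides location is worth extracting.
    records = [json.loads(r) for r in lake]
    locations = [r for r in records if "lat" in r]
    print(locations)  # [{'user': 17, 'lat': 40.69, 'lon': -73.98}]

Nothing about the third record mattered at collection time; it simply waited for a purpose.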

Trade secrets operate as a form of intellectual property, which corporations contain within black boxes. Humans can’t do the same, and yet both are persons. When persons are required–or even just expected–to be transparent, who is really benefiting?

Humor fosters a creative discourse that can reconsider the data delivery devices to which we have become attached. subRosa is a cyberfeminist collective, founded by Faith Wilding and Hyla Willis in 1998, whose satirical projects subvert cutting-edge technologies and the data they extract. For SmartMom, they re-imagined a military shirt designed to track soldiers as a pregnancy dress that “uses optical sensors connected to a web of coded fiberoptic lines leading to a radio transmitter to provide constant monitoring of body systems and data such as heartbeat, blood pressure, fluid levels, nervous functioning, the mother's fantasy life, sexual and eating urges, and the like.” The technology exists; wearables are designed for this kind of data capture. Apps already support pregnant women in just this way.

The Center for Intimacy Justice found that ads on Facebook about pregnancy, fertility, or pelvic pain were frequently flagged as “adult content,” although, in a fascinating if bizarre example of cultural bias around bodies, erectile dysfunction treatment was not. Clinically backed information offered as public service announcements about medication abortion—the FDA-approved two-pill combination—is more likely to be removed by social media sites than advertisements for the abortion reversal pill, a treatment with no medical backing and harmful consequences for many women who take it. Even though Twitter requires factual claims and bars “inflammatory or provocative content which is likely to evoke a strong negative reaction,” the conservative opposition to medically accurate information about abortion procedures creates such blowback that sites bend to the pressure.

If information is unavailable, or platforms alter information about healthcare, how are people to make informed decisions about their care? It is even worse when states require doctors to present this unsupported research during clinical care. Barbara Kruger’s Who will write the history of tears? (2011) speaks to the grief derived from such confusion.

The artist and human rights researcher Caroline Sinders presented The Architectures of Violence in summer 2021 at Telematic Media Arts in San Francisco, tracking the themes, styles, and trends that disseminate misinformation across social media sites. Her research predated, but aligned with, Surgeon General Dr. Vivek Murthy’s July 2021 Advisory on Misinformation, issued in response to the fabrications surrounding Covid-19. Though all prior Surgeon General warnings focused on health threats related to food, water, and smoke, “Today we live in a world where misinformation poses an imminent and insidious threat to our nation’s health,” Murthy explained. Misinformation is certainly rampant across the reproductive healthcare space, where claims about the dangers of legal abortion are contradicted by fact: the Centers for Disease Control and Prevention reported 0.41 maternal deaths per 100,000 legal abortions in 2019, but 24 deaths per 100,000 births, with maternal mortality rising to 55 for Black women, because adequate reproductive healthcare is very much dependent on economic status, a history tied to systemic racism.

Miguel Luciano’s painting Barceloneta Bunnies (2007), from the “Louisiana Porto Ricans” series, presents audiences with the strange two-sided history of pharmaceutical involvement in sexual and reproductive health, and the powerful regimes that determine which corpus deserves care. In the bold style of mid-20th-century agriculture posters, pills and signs proliferate alongside three cartoonish bunnies—one with an amputated foot jumping on a large yam (used for hormonal drugs), another slurping from an oil barrel, and a third disapprovingly holding a Viagra pill. Barceloneta is Puerto Rico’s pharmaceutical center, where Pfizer produces Viagra, and one of many towns across the island where the US government endorsed the sterilization of Puerto Rican women between 1930 and 1960. The rabbits in the work portray and ridicule the stereotypes of hyper-sexualization that are part of the socio-political facets of this medical history. In California during the 1960s and early 1970s, Mexican women giving birth were commonly sterilized without informed consent, leading to the federal class action suit Madrigal v. Quilligan—a story told in the documentary No Más Bebés (2015). The women lost the case. These works of art recall narratives otherwise erased. So, while true medical information and social histories can be eradicated by those in power, the online searches and messaging of ordinary citizens can be tracked mercilessly.

In July 2022, criminal charges for a self-induced abortion were brought against seventeen-year-old Celeste Burgess based on a warrant to search her Facebook messages, where she had communicated with her mother about her pregnancy. The mother, Jessica Burgess, faces five criminal charges, including three felonies, for helping her daughter obtain the FDA-approved abortion pills that are banned in Nebraska, where the two live. The significant question here is why a social media company stores these messages at all.

What do they gain from holding onto this data?

Works from Paula Rego’s Abortion series (1998) show young women after they had self-administered abortions, because she wanted to emphasize who is most likely to suffer from abortion bans: young women, poor women, and women from ethnic minorities. Her paintings were considered so powerful that they influenced subsequent legislation protecting abortion rights in Portugal. It’s difficult to imagine such an outcome in the US. In 2017, Latice Fisher in Mississippi was arrested, charged, and jailed over a miscarriage because she had searched online for abortion information; the prosecutor presented that search history as evidence of feticide. The convoluted case had this mother of three young children in and out of jail before the charges were finally dropped in 2019.

But all this data, to whom does it belong? Who can use it? Access it? Sell it?

Online searches are just some of the data extracted by law enforcement with the rise of digital forensics. Search histories, online purchases, geolocation, and social media activity contribute to profile data sets. Law enforcement agencies typically need a warrant to access this kind of data, but many, including the Border Patrol, FBI, IRS, and Secret Service, sidestep that requirement through contracts with data brokers. Some report that the stripping of rights in the US has encouraged other nations to do the same, so that individuals are scrutinized even as national cybersecurity is emphasized. Jenny Holzer’s Doodle (2014) is from the Dust Painting series, in which she hand-rendered government documents, including the Snowden-leaked diagram showing how the NSA circumvented the encryption in Google's cloud. The uproar about the political and security impact of those leaks often forgot those whose privacy was already being subverted.

Data brokers buy and sell data, even creating software development kits that work like Lego blocks to simplify the job of app developers. Those kits channel data from the app to the third party, which may be specified in the Terms and Conditions that most end users never read before clicking “accept” to enjoy their free health tracking or virtual measuring tape. In a study that examined 211 diabetes apps, researchers found that 64% of the apps could modify or delete information, 31% could read phone status and identity, 27% could gather location data, 12% could identify the wifi network, and 11% could access your photos and videos by activating the camera; between 4 and 6% of the apps could read and modify contacts, read the call log, place calls from your device, and “activate your microphone to record your speech.”
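For readers curious about the mechanics, here is a minimal, hypothetical sketch of how such a kit can funnel data; every name and endpoint below is invented for illustration. The developer calls one convenient function, and the kit quietly attaches device identifiers and forwards everything to the broker:

    import json, platform, uuid
    from urllib import request

    # Hypothetical broker endpoint baked into the kit.
    BROKER_ENDPOINT = "https://example-broker.invalid/v1/events"

    def log_event(app_name: str, event: str, payload: dict) -> None:
        # The one convenient call the app developer sees.
        envelope = {
            "app": app_name,
            "event": event,                    # e.g. "reading_saved"
            "payload": payload,                # the user's health data
            "device_id": hex(uuid.getnode()),  # stable hardware identifier
            "os": platform.platform(),         # fingerprinting material
        }
        req = request.Request(
            BROKER_ENDPOINT,
            data=json.dumps(envelope).encode(),
            headers={"Content-Type": "application/json"},
        )
        request.urlopen(req)  # the data leaves the app here, out of the user's sight

    # One line for the developer; everything else happens behind it:
    # log_event("GlucoTrack", "reading_saved", {"mg_dl": 112})

The developer gets free analytics; the broker gets the envelope. Nothing in the app’s interface discloses the exchange.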

Shoshana Zuboff reported on this in The Age of Surveillance Capitalism, but similar issues were raised by Cathy O’Neil’s Weapons of Math Destruction, Nick Srnicek’s Platform Capitalism, Bernard Harcourt’s Exposed, and countless others. As all of these researchers note, most information captured is not necessary for the apps to function. A surprising 81% of the diabetes apps surveyed had no privacy policies, and three quarters of the apps without policies shared information with third parties. Of those that did have privacy policies, 79% shared data—although only half of them admitted to that in the policy. The service an app provides is only its superficial appearance; it gains much more from the user data it collects.

Anti-abortion groups have partnered with fertility health apps to identify users near reproductive care clinics. As patients sit in a waiting room browsing online, their geolocation makes them targets for advertisements and dubious information from those groups, as well as for later persecution. The forensic data-gathering techniques of targeted ad tech become vectors for discrimination and, now, criminalization. The 1994 FACE (Freedom of Access to Clinic Entrances) Act forbade activists from obstructing clinic entrances, but came too early to consider the potential of cybersurveillance and threats that curtail access to information and care.

HIPAA can’t do much either. Data that users self-report to health apps is not protected under HIPAA guidelines, and cybersecurity advocates warn that information in “period apps” may be sought to target women believed to have pursued pregnancy terminations. Though some companies are offering employees travel support for abortion care, that information would be unlikely to be protected by HIPAA either. Health and Human Services publishes a consumer data sheet on the public nature of information put into apps.

Whose privacy deserves to be maintained?

In the 21st century’s age of information, the loss of privacy rights in the Supreme Court’s overturning of Roe v. Wade has made reproductive healthcare a matter of data privacy. Brendan Dawes is an artist who has worked with data extensively, and he describes his approach for The Art of Cybersecurity (2019): "The first thing many people would think of with cyber security is darkness, hoodie-wearing hackers, pumping techno music on a black background, hyper machismo bullshit. So I went completely the other way. I thought what is the end goal here? Why do we protect our computers and our systems? It’s so we can do the things we love to do, or just have the ability to get on with our day. That’s a positive thing, not a negative thing."

It has become a problematic cliché to remark that we are the raw material, the crude oil, that algorithms plunder to produce information of use to advertisers or anyone else, but that framing reiterates data colonialism by objectifying the human lives (and other species) generating the data. Data that seems inconsequential in one context can become all too salient with an avalanche of effects. The data is significant because, as it accrues into metadata, it illuminates the intimate details of one’s daily life. As General Michael Hayden, former director of the NSA and CIA, bluntly put it in 2014: “we kill people based on metadata.” Awareness of these attitudes helps, as does general recognition of the role of data in our lives. Though I admire the network scientist Albert-László Barabási’s effort to impress upon audiences the significance of data in today’s world, some artists and scientists can take his notions of dataism into a kind of fetishism. An upcoming exhibition of works from the BarabasiLab at Postmasters is an opportunity to confront the fine line that artists working with big data must walk. The danger is in excitement about the data itself rather than how it can make us question the technologies that derive it, the customs that produce it, the futures it proposes.
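A toy illustration of Hayden’s point, with every record fabricated: even without reading a single message, the recipients and timestamps of calls alone sketch a person’s routine.

    from collections import Counter

    # Fabricated call metadata: (caller, callee, hour of day). No content at all.
    records = [
        ("alice", "clinic_line", 9), ("alice", "clinic_line", 9),
        ("alice", "clinic_line", 9), ("alice", "pharmacy", 10),
        ("alice", "pharmacy", 10), ("alice", "mom", 20),
    ]

    # Frequency alone implies relationships, routines, and what happens next.
    pattern = Counter((callee, hour) for _, callee, hour in records)
    for (callee, hour), n in pattern.most_common():
        print(f"{callee} around {hour}:00 ({n} calls)")

Three of six records point to the same clinic at the same hour; no eavesdropping required.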

So, how can we protect our data?

Many of the solutions to datafication propose finding ways for people to profit from their data, turning steps and more into business opportunities. The artist Jennifer Lyn Morone became a one-person corporation in 2014 to reveal how capitalism fails to protect individuals in the way that it shields big tech. As a corporation, Jennifer Lyn Morone, Inc. (JLM, Inc.) can protect the data of Jennifer Lyn Morone, the person, because what Morone does, when, where, for how long, and so forth are the intellectual property of the corporation. Just as corporations can hide the algorithms that interpret our data in black boxes protected as intellectual property, JLM, Inc. could insist that Morone’s data was the company’s product and the basis of its intellectual property. This required refusing to sign most Terms of Agreement that would relinquish data about her activities to other companies, making it a challenge for Morone, either as a person or as a company, to participate in many of the interactive information streams common to the 21st century. In the opening video describing her Extreme Capitalism project she explains that if “corporations can make money from my information, information that I generate just by being alive, then so can I.” Except she can’t, because the marketplace depends on participating online. The data-as-property argument requires that people interact on the internet, that they sell their web and app activities, not that they aim to secure them.

Jack Burnham wrote the essay Systems Esthetics (1968) to describe a shift he perceived in the art-making practices of his day: “as technology progresses this [esthetic] impulse must identify itself with the means of research and production.” He argues that art used to reside in its material entity, whereas now it exists in “the relations between people and between people and the components of their environment.” I take his claim to be that art had once been about objects manifesting craftsmanship and ideological power, and that the rise of technology introduced a new interest in how things are produced. Gilbert Simondon had addressed something similar, as the scholar Tiziana Terranova articulates in Network Culture: Politics for the Information Age. Postmodernism is just that: art’s shift away from the role of embodying power to one aimed at revealing the operations of power. Morone’s project succinctly does that and forces us to imagine alternatives for our data beyond capitalizing on it.

Claiming individual data as property retains a product mentality, while compensating data as labor metabolizes every aspect of our lives into an extension of economic relations. Both reproduce the alienating competition that disables cooperative resistance. In response to the demise of reproductive rights, nine artists identifying as women rapidly produced a group show, UNPROTECTED, as a statement about the impact of the Supreme Court’s ruling in Dobbs. Produced through Epoch Gallery, the works are sold as a unit, an important act of solidarity in this particular moment, though one the gallery practices across all its exhibitions. UNPROTECTED is about women’s healthcare, but the loss of protections that women face around their bodies relates to rampant privacy violations, including data privacy, that endanger every body. What can we do?

The Information Transparency & Personal Data Control Act, introduced to Congress in 2021, was referred to the Subcommittee on Consumer Protection and Commerce, where it remains, but the FTC is currently taking statements on the impact of “Collecting, Analyzing, and Monetizing Information About People” in order to propose regulations; speak up. Privacy isn’t about secrets but self-sovereignty, especially for those who have been historically denied it. The how of data colonialism is under inquiry, as is the who. The Cloud Empire remains and, until we act with solidarity and conviction against what these artists have shown, we all remain unprotected.

Charlotte Kent is an assistant professor of visual culture and an arts writer.
