SXSW: Covering Panels on Data Art and Privacy

ACCENT Reporter Marisela Perez Maita shares her experience at two panels that addressed the same topic from very different perspectives.

by Marisela Perez Maita

First panel: Data Art: Processes and Perspectives 

Three data artists presented their work during the session, explaining how they approach data and what data art is all about. 

From left to right: Jane Adams, Laurie Frick, and Sarah Miller. 
What is Data Art? 

The first speaker, Laurie Frick, is a freelance artist who has worked with different companies and museums. For her, “Data art is more about art than data. The latter is more about the paint.” She sees data art as a medium through which all that information can be expressed, and she tries to take the data and make it feel ambient in her work. 

Laurie Frick’s slide presentation

The second presenter, Sarah Miller, is a data visualization designer. She has worked with multiple clients, such as the Bill and Melinda Gates Foundation, the University of Chicago, and the Museum of the City of New York. For her, data art can range from AI to making things with your own hands. Data art can visualize and give insight into something and, in particular, into who you are.

Sarah Miller’s slide presentation

Lastly was Jane Adams, a researcher at Northeastern University with a BFA in Graphic Design and Digital Media who also builds robots and hydroponic sculptural works. Adams emphasized that data art treats data as a medium instead of a subject. She explained that there is a parallel between data science and data art: each can teach the other about process, but data art will always seek meaningful ways to find the human connection in the data. 

Jane Adams’ slide presentation
Their Work

All three artists approach data art differently, whether in the materials, the resources, the intentions behind their art, or, of course, the data they use.  

Laurie explained that much of the work is just research. For example, she recounted a commission she did for the Houston Federal Reserve, where all the cash for the state of Texas is processed. She described the place as a fortress holding billions of dollars in cash, and she was completely mesmerized by the quantity of it. 

Once home, she started “the hunting process.” Where does the cash go? How often does it pass through people’s hands? These questions led her to the data. During her research she found a government survey of individual spending, a data set with information on how much money individuals made and spent, divided into categories. 

Laurie explained that there is something organic in people’s data: “There is usually a rhythm in human data. A symphony of actions and behaviors.” Having all this information in front of her and comparing it, such as someone making $250,000 against someone making $10,000, she realized that the answer to “where does the cash go” was about income and inequality.

Her piece holds 60 squares that detail how people spend their money in a year, and what she likes about it is that it lets the viewer infer the same thing she did: the more someone makes and the more they have to spend, the more “squares” they have in their section.

“Where does the cash go” (2020) By Laurie Frick. https://www.lauriefrick.com/

Sarah Miller discussed her project “The Digital in Architecture.” She and her team worked from an essay detailing the history of the discipline and the digital tools architects have used throughout it. They decided to visualize the data as buildings and their different aspects: How tall are they? What materials are they made of? The whole data set becomes a timeline of description and design. 

“The Digital in Architecture” (2019) by Sarah Kay Miller. https://sarahkaymiller.com/

Jane Adams talked about her most recent work. It is a sculpture of a latent walk video from an IRL model she trained, a technology that, in this case, Adams used to extract stock images from aerial drone photographs. From this, Adams printed over 17,000 images on transparency films, which appear in the picture as the layers of the map. 

At the bottom is something she added after contemplating how to credit every photographer who contributed to her training data: a 120-foot-long roll of all the names and titles of the people who provided the data she used in her machine-learning artwork. 

Sculpture titled “Latent Walk Prims” (2023), by @artistjaneadams on Instagram

As we can see, data art is diverse, broad, and open to interpretation. It is about joy, utility, creativity, and, most importantly, making data more memorable. 

Second panel: It’s Time to Stop Denying Privacy as a Civil Right 

In this panel, data takes a 180-degree turn. In the Civic Engagement category, four panelists discussed the implications of having no privacy rights in an era when governments, companies, and applications can sell and use our data as they please. 

From left to right: Christopher Wood, Nicole Turner-Lee, Koustubh “K.J.” Bagchi, and Amy Hinojosa

The discussion was led by Christopher Wood, founder of LGBT Tech and three non-profit organizations focused on the LGBTQ+ community. He started the discussion by asking the other three panelists, “What is at the top of your mind when it comes to civil rights and privacy?”

Amy Hinojosa, the founder of MANA, A National Latina Organization, answered that over her 16 years working with the organization and fighting on behalf of women, she has become deeply concerned about the weaponization of women’s health care data, especially around reproductive care. She brought up the Dobbs v. Jackson decision, explaining that it was detrimental not only to abortion access but to the privacy of our health care decisions, since it allows states to look into users’ health histories and take action according to their definition of morality.

For example, if someone uses an app to track their period, this information can be documented and used against them. States now have the power to make health care less available and to look into people’s health care decisions. Hinojosa explained that in Texas, this information could be used to criminalize women who have had abortions. A woman has no control over her personal health care data, so any entity can use it against her.

Koustubh “K.J.” Bagchi has focused for over 10 years on marginalized communities, including work with the Washington, D.C. Council on issues regarding consumer protection. He has since worked at the Chamber of Progress and at New America’s Open Technology Institute, focusing on platform accountability and privacy issues. Regarding civil rights and privacy, he answered that the conversation about privacy usually centers on the big companies, Google, Amazon, Apple, and so on, when in fact the government has found ways to collect data on civilians around the country, and there is no legislation to protect us from government surveillance that we are not even aware of. 

Nicole Turner-Lee has a PhD in sociology and is the Director of Technology Innovation at the Brookings Institution. She is concerned about the social implications of technology: we have not realized that we are in a system of technological surveillance that lends itself to “discriminatory, racist, gender and homophobic violence,” carried out in a way the user does not notice. Turner-Lee outlined that without legislation, our data can compound the existing system of inequality and the oppression of people of color, women, and immigrants. 

The discussion unpacked how data is a tool that can be used against communities. Local and federal governments collect information through what Bagchi called “data brokers”: businesses that collect information across the web. It can be public data, like what you have in your LinkedIn profile, or, in many cases, these brokers make deals with third-party apps to aggregate data for private use. 

He illustrated this with a case covered in the news: an app made for Muslim users to find the right time and direction to pray. The app’s location data, revealing where thousands of Muslim users were, was sold to the US military and defense contractors. People in Muslim communities, thinking they were just using an app to practice their religion, were actually being monitored by the government. We assume the information we put into apps is something no one but us can see, when behind our backs, companies are actively collecting our data and selling it. 

Turner-Lee asked the audience, “How many of you accept the cookies?” and most of us raised our hands. Half of the people don’t read privacy policies because these policies are written to be read by lawyers, not by people navigating the internet. Users are unaware of what they are giving up when they decide, “I don’t want that information to be collected about me.” 

Most of the time, not accepting cookies keeps one from accessing the website, which tells users that they don’t really have the option to refuse what technology offers. Can we really call these “options”? We are in a position where you can’t update your phone if you don’t accept its terms and conditions. We accept everything because there is no other choice, and we don’t realize the implications of our decisions. 

Our needs, religion, location, political views, nationality, and all of our identity are beyond our control when privacy is not considered a civil right. What can we do against all these big companies and the government? Hinojosa said that people must show some outrage to make companies understand the consequences this has for them and their communities, that consequence being a loss of support for their products. 

According to the panel, advocacy means acting now. We have to ask companies how they are using our information and what they are doing with it. “We have to keep pushing,” Hinojosa said.

As Bagchi said, the issue of data privacy is challenging, but there are attempts to make companies more transparent. Apple and Google have policies regarding privacy rights, though these were created internally; some companies have such rules while others don’t. Turner-Lee said it is very hard to rely on companies to do the right thing, and that’s why we need regulation: legislation would give us, as consumers, a way to enforce our rights.