Wednesday, October 5, 2022

Pratik Nyaupane Core Post #2


The more we progress through this course, the more we find that each week's topics overlap, intersect, and interconnect with the others. We cannot think about code without thinking about labor, software, and algorithms. We cannot fully grasp AI without thinking about code, waste, and surveillance. This week's readings do a great job of emphasizing these connections, allowing us to view artificial intelligence holistically: not just as the mysterious robot that is always listening to us, but through an understanding of how we got here and where we are headed.

The Amazon Echo is built to blend into our social and lived environments while standing out just enough to be noticeable for its convenience and utility. These AI tools are meant to blend in and quilt themselves into the fabric of our everyday lives. For Amazon, ubiquity is the goal, and the least convenient thing about the Echo is that it cannot physically be in multiple places at once.

"Put simply: each small moment of convenience – be it answering a question, turning on a light, or playing a song – requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data" (Crawford).

Three things are central to the function of this AI ecosystem: material, labor, and data. These are analogous to food, water, and air for human life. Crawford reminds us of the materiality of the digital. Oftentimes, discourse around the digital is accompanied by phrases like "cashless," "paperless," "green," "efficient," "waste-free," and "cloud," words that frame digital algorithmic processes as progress, a social change toward good. However, much of AI and technological development exists and is developed within a capitalistic framework, dependent on the exploitation of labor and goods. Like many other profitable technologies that fill a void, whatever that void or issue may be, these ventures rely on a certain dysfunctionality and non-autonomy of individuals and social entities in order to continue and expand.


Wendy Chun highlights the sheer power these tools have not only over our lives and behaviors but also over significant democratic processes such as the US and UK elections. We have seen the role of social media and advertising in campaigning for the past decade or so, but Cambridge Analytica's targeting of individuals to sway their values, beliefs, and politics was a terrifying, groundbreaking moment. Through personal, cultural, and other identifiable characteristics, these pattern-based tools are able to predict a lot about us. Sometimes we discuss how error-prone and faulty these machine learning algorithms are, but in line with the violent histories of statistics and science, the verification of these tools will only produce and maintain racist and sexist predictions (47).


Many of us have probably heard the phrase "truth is power." With the understanding of data, numbers, and quantification comes an ontological notion that "data/numbers don't lie." We hear things like "facts don't care about feelings" or "I believe in science" (often in the liberal purview) as people cling to a sense of objectivity that cannot be challenged. As Cheney-Lippold alludes to, and as the video we watched in class a few weeks ago showed, this quantified algorithmic processing of the body is extremely violent, and many times deadly. My brilliant colleague Will Orr also expands on how death and violence, situated in colonial, racial, and capitalist logics, are built into algorithms and platforms.


