“So today, we still find that most people prefer to resort to blame and assume there is human intentionality behind the negative side of these digital coins…digital technology is dialectical and intrinsically contradictory; often what adjudicate as its good and bad implications are inseparable consequences of the same developments (24)”

I wanted to point to an example of what this quote is trying to illustrate in our lives (I was almost going to say "real lives" but then realized that this was the authenticity trap). I recently watched The Social Dilemma, a movie critiquing the ways in which the algorithms behind huge tech platforms (e.g. Facebook, Google) tweak and change in response to our interactions with our screens. For example, two people using Facebook will see very different home screens catered to their specific interests, and entering the same word (e.g. "election") into the Google search bar will yield different suggested autocompletions based on each person's past search history. Here is a clear example of digital technology being dialectical and intrinsically contradictory: using the same development (algorithms), these platforms have personalized and tailored our technologies to reflect our needs and wants (the positive), but at the same time they reinforce our existing concepts and understandings of the world, creating an echo chamber that has consequently led to a lot of polarization (e.g. around elections) and potential disinformation (the negative). It might be easy to blame the technologies themselves, but perhaps the blame really lies with ourselves and the ways that we have built and reinforced our own cultural systems into these technologies.

The idea that digital technologies are merely shaped into reflections of ourselves, and thus of our cultures, reminds me of Geertz: "man is an animal suspended in webs of significance he himself has spun." The digital lives we have created are very much webs that we have spun ourselves through the ways that we have interacted with our technologies. Thus, I would agree that (1) ethnography will still play a significant role in understanding what it is that makes us human, and (2) perhaps the interpretation of the symbols in our digital lives will be made easier, given that digital technology forces us to reduce many qualitative aspects of non-digital life into quantifiable symbols (e.g. hearts, the search bar).

  1. Jeffrey Himpele says:

    Emily – this is an important reminder that we ought to avoid fetishizing technologies in our analysis – so we all should read it. That is, we have to keep in mind the social relationships in which they are embedded, or suspended. (Indeed, Ginsburg, Abu-Lughod and Larkin make a parallel argument in their introduction to Media Worlds.) Further, as your post points out, there is a necessary place for ethnography in understanding technologies, and algorithms more specifically. Algorithms alone aren't leading to polarization; companies like Facebook are. And companies are webs of relationships with cultures and values.