Taking ANT347 and my statistics course (SML201) at the same time has raised questions about data and its relationship to objectivity. In my SML201 class, data means Excel-like datasets, where narratives, events, and contexts are squashed into a single integer, a string of characters, ones and zeros. My professor explained that statistics, and the practice of extrapolating meaning from datasets, allows more objective insights to drive our decisions (versus our subjective experience and intuition). She broke down the extrapolation process of datasets for us:
- Contextualize the dataset (where is this data coming from, what was going on historically)
- What do the variables in the dataset mean?
- What questions do you want to ask of this dataset?
- Use programming to gain insights from your variables to answer the question at hand
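The four steps above could be sketched in a few lines of code. This is a hypothetical example of my own (the respondents, variables, and question here are invented for illustration, not taken from SML201):

```python
# Steps 1-2: a tiny dataset whose context and variables are documented.
# Each row is one (invented) survey respondent; "hours_online" is
# self-reported daily screen time, "age" is in years.
rows = [
    {"age": 19, "hours_online": 6.5},
    {"age": 21, "hours_online": 4.0},
    {"age": 20, "hours_online": 7.2},
    {"age": 22, "hours_online": 3.8},
]

# Step 3: pose a question -- what is the average reported screen time?
# Step 4: use programming to extract the insight.
mean_hours = sum(r["hours_online"] for r in rows) / len(rows)
print(round(mean_hours, 2))  # -> 5.38
```

Even this tiny sketch already encodes choices (which variable to summarize, which statistic to compute), which is exactly the selective attention discussed below.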
What I found interesting is that a certain level of objectiveness is attributed to my statistics class that doesn’t attach to the data we’ve discussed in this class (i.e., the Rodney King video; Colonialism and Culture). But what exactly makes for these more “objective” insights that my professor mentioned? In both of these classes, contextualizing is critical because it imbues meaning into our data, whether that is the video of Rodney King or the 0s and 1s in my dataset. It not only gives meaning, but also helps us selectively focus on what is deemed “important” within our data; for Rodney King that was his movements, while in SML201 it is which variables I choose to engage with. Yet as seen through the Rodney King trial, the very act of imbuing meaning into data and attending to it selectively already steers us away from objectiveness and “thin” descriptions. By engaging in contextualizing, then, datasets are far more subjective than I first thought.
My conclusion was that statistics seems more objective because of its ability to remove oneself, the human touch, from the data and to use mathematical calculations, which makes my final conclusions from a dataset feel more “correct” and therefore more valuable. Furthermore, there is no rigorous process of incorporating multiple perspectives and interpretations as there was during the Rodney King trial (though perhaps there will be in higher-level statistics courses or real-world projects), which allows us to view the data from a distance, creating a sense of objectiveness.
To add more questions: are my datasets really more objective than the Rodney King video or the insights from Colonialism and Culture if (1) I’m only contextualizing from my own point of view and understanding of the situation, rather than gathering a “thick” description of the context, and (2) I’m engaging with a dataset that was already created, shaped, and changed by another human being before it reached my hands? I’m left wondering whether there is a way to combine ethnography with statistical projects to create a more accurate portrayal of reality through thick descriptions.
This is a fascinating connection, Emily. These four questions about extrapolating data do seem to resemble the work of contextualization that defines cultural interpretation, or thick description, for Geertz. If that is the case, then what might make data appear “objective” would be analogous to a convincing interpretation. Think about how the Simi Valley jury believed the defense’s case – that the meanings they extrapolated from the video made “objective” sense.
We will be asking your questions about data when we get to the second part of the course!