Over a year ago, I left Facebook after a seven-year relationship with the social media space. I wrote about my reasons in an article published by Hybrid Pedagogy titled, “Breaking Up with Facebook: Untethering from the Ideological Freight of Online Surveillance.” Essentially, Facebook uses complex algorithms to track and monitor users’ movements and actions throughout its ecosystem.
I first noticed these algorithms at work when I saw personalized advertisements alongside the Facebook newsfeed. And while I ultimately deleted my account because of Facebook’s graph search feature, I was also uncomfortable with Facebook’s algorithms deciding what content I would see in my newsfeed by promoting some posts over others.
About a week after publication, a new scandal erupted on social media networks, in mainstream media, and in academic circles. A Facebook employee, a university faculty member, and a graduate student published a study, conducted in 2012, in which they manipulated users’ newsfeeds to learn whether emotional contagion could occur. Responses to the study ranged from ethical concerns (Albergotti & Dwoskin, 2014; Arthur, 2014; Junco, 2014) to questions about methodology (Albergotti, 2014; Grohol, 2014; Hill, 2014) to commentary about the experiment itself (Auerbach, 2014; Boyd, 2014; Crawford, 2014). The tools that allowed the researchers to manipulate the newsfeeds were the same algorithms Facebook uses to control how users experience content on their screens.
Facebook is back in the news this week because of its algorithms, this time for the lack of content displayed about the ongoing political and social events in Ferguson, Missouri. Many users of Facebook and Twitter have reported that while Twitter shows real-time events in their streams, their Facebook newsfeeds are decidedly quiet about the events.
If algorithms control what users experience on Facebook, then what, really, is the benefit of being a Facebook user if users cannot experience what they want in the space?
I ask this question because recently I was encouraged to rejoin Facebook for professional reasons. The person who suggested this is someone I respect a great deal, and whose advice, wisdom, and experience in several areas I trust. This person is also aware of, and a supporter of, my research.
And here’s the rub: I know this person is right, right about rejoining a social media space that can provide professional benefits through online social networking.
But I also can’t shake the feeling that rejoining this space calls into question my ethos as a researcher and private citizen who is aware of Facebook’s surveillance and algorithmic practices. This isn’t necessarily because I wrote an article about leaving (well, in small part it is), but because to rejoin means I am again subject to surveillance and algorithmic manipulation, and that I become a commodity to Facebook once more, all in the service of the benefits of online networking.
I spoke with a dear friend and colleague about this earlier, and she advised me to consider rejoining, but to do so in connection with my research. Perhaps rejoining (if I decide to do so) will foster a new research project.
In the meantime, I find myself in a dilemma. Even though I have officially cut ties with Facebook, it seems that breaking up is really hard to do.