Another reminder: Facebook is not an open forum. Facebook actively monitors, adjusts, and (possibly) censors your Timeline as it sees fit. Your information, likes, wants, and interests are goods to be traded without your benefit. There is a distinction between a social network, where information is disseminated bottom-up, and a newspaper, where it flows top-down. That distinction can (and does) shape how the network's users perceive the world (the "Echo Chamber" effect), so, whatever the politics implied, basic ethics dictate that Facebook should leave the Timeline meddling to certified Gallifreyans, whether politics are involved or not.
This week, Facebook CEO Mark Zuckerberg appeared to publicly denounce the political positions of Donald Trump’s presidential campaign during the keynote speech of the company’s annual F8 developer conference.
This was the business model of Compuserve. And AOL. And then a little thing called The Internet got popular for a minute in the mid 1990s, and that plan suddenly didn’t work out so well for those captains of industry.
I know a few people in my timeline who fell victim to Facebook's latest ill-conceived attempt to shoehorn itself into our lives. I saw these popping up in December and decided against opting in; then I saw that Facebook had built one for me anyway and shoved it in my face with a prompt to post.
Bad form, Facebook. “Opt-In” is ALWAYS a better marketing strategy when dealing with people’s private lives and personal data, especially in this era of increased concerns over privacy. This is just another reason to flee the walled gardens for the freedom of the World Wide Web.
I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so—a lot of people have used it to share the highlights of their years. Knowing what kind of year I’d had, though, I avoided making one of my own. I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.” Which was, by itself, jarring enough: the idea that any year I was part of could be described as great.
People are starting to figure out just how much of our souls we sold for “free” access to social media.
Facebook will have to face a class action lawsuit alleging it violated users’ privacy when it scanned the content of messages for advertising purposes, U.S. District Judge Phyllis Hamilton in Oakland ruled on Tuesday.
Sterling Crispin’s “Data Masks” are haunting portraits that don’t actually depict any one person. Instead, they use raw data to show how technology perceives humanity. Reverse-engineered from surveillance face-recognition algorithms and then fed through Facebook’s face-detection software, the Data Masks “confront viewers with the realization that they’re being seen and watched basically all the time,” Crispin says.
The Googles and Facebooks (GoogleBooks?) of the world want to aggregate all of these personas into a single identity. They want to do this, not because they think this is good for users or because this is how they think society works, but rather because it helps them monetize user interactions. However, this type of aggregation is a very bad deal for users.
Facebook has a long history of pushing the envelope when it comes to data privacy and user rights, and one of the most egregious examples was its research on “emotional contagion,” where users’ news feeds were manipulated to study their reactions to negative posts. It may have sounded innocent enough to the data scientists in Menlo Park, CA, but the reality was that this Web overlord distorted its customers’ emotional states in order to better understand how to profit from them.
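To make the mechanics of that experiment concrete, here is a minimal sketch of how such an A/B test could be structured. Everything here is invented for illustration: the helper names, the hash-based bucketing, and the sentiment scores are assumptions, not Facebook's actual implementation.

```python
import hashlib

def bucket(user_id: str) -> str:
    """Deterministically assign a user to 'control' or 'treatment'
    by hashing their ID. The same user always lands in the same group."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "treatment" if digest % 2 else "control"

def filter_feed(posts: list[dict], user_id: str) -> list[dict]:
    """Treatment-group users silently have negative posts suppressed;
    control-group users see the unaltered feed. Researchers then compare
    the two groups' subsequent posting behavior."""
    if bucket(user_id) == "treatment":
        return [p for p in posts if p["sentiment"] >= 0]
    return posts

# Hypothetical posts with precomputed sentiment scores (assumed, not real data).
posts = [
    {"id": 1, "sentiment": 0.8},
    {"id": 2, "sentiment": -0.5},
    {"id": 3, "sentiment": 0.1},
]
```

The unsettling part is how little machinery this takes: a hash, a filter, and no consent prompt anywhere in the pipeline.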
The idea that Facebook isn’t a content-neutral communication medium like the phone or email seems to generate constant surprise and outrage.
Facebook has every reason to manipulate the News Feed to optimize for whatever user engagement metrics correspond to the best returns for advertisers, which in turn correspond to the best returns for Facebook.
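In toy form, engagement-optimized ranking is just a weighted sort. The feature names and weights below are invented for illustration; Facebook's real signals and their weights are proprietary and far more numerous.

```python
# Hypothetical engagement weights: interactions that keep users on the
# site (and seeing ads) longer are worth more to the ranker.
WEIGHTS = {"clicks": 1.0, "comments": 3.0, "shares": 5.0}

def engagement_score(post: dict) -> float:
    """Score a post by its weighted interaction counts."""
    return sum(w * post.get(feature, 0) for feature, w in WEIGHTS.items())

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order the feed so the most engagement-generating posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "clicks": 10},               # score 10
    {"id": "b", "shares": 4},                # score 20
    {"id": "c", "comments": 2},              # score 6
]
ranked = rank_feed(posts)  # → b, a, c
```

Note that nothing in this objective rewards accuracy, balance, or the user's well-being; the feed is sorted purely by what generates interactions, which is the crux of the complaint above.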
Most conspicuously absent from Schroepfer’s post is any suggestion that users can opt into or out of experiments like the emotional contagion study.