Oh goodie, another wonderful news story about #fakenews and how corporate America is continuing to invade our every thought. According to The Verge, over the past year, Facebook has been rolling out a system that “[rates] its users’ trustworthiness in order to help the social network know how much to value user reports that a certain news story might be fake.” If a user posts a story that enough people report and that is deemed fake (Facebook relies on fact-checkers like Snopes and PolitiFact to judge the accuracy of these articles), the user’s individual trust rating goes down. If the user regularly posts “fake news,” they will have a lower trust rating.
Facebook has not been clear about how this “trustiness” score is being used. It’s hard to say whether less trusted users’ shared articles will be deleted, hidden from their friends, or something along those lines. Facebook will only say that the system is used in relation to reports about news articles. But we have all seen this episode of Community (“Meow Meow Beenz”). This is the beginning of Facebook instituting its own cyber watchdog system.
It starts with a system of trustiness under the guise of rooting out “fake news.” But remember, Facebook also owns Instagram. So what’s to stop them from investigating your posts and hashtags to see whether you’re actually a happy-go-lucky THOT who jet-sets across the globe wearing nothing but her bikini, or whether you’re a corporate shill for big bubble tea and facial creams?
“Well that’s great!” you might say. “They should be getting rid of all those fake Insta-models and all that bullshit faux-advertising.”
But then the slippery slope shall continue, my friends. Sure, they can get rid of the fake Insta-models and bot accounts, but what’s next? Maybe they’ll start using digital scanning to see whether that diamond engagement ring you’re boasting about is actually cubic zirconia. The modeling industry may boom once all the Insta-models are deleted, but what about when Facebook starts scanning all the photos to show whose pics have been photoshopped? The Kardashian girls will suddenly start to look like the busted, misshapen, plastic-surgery Frankenstein monsters they really are (except Kendall, that girl is a flawless angel).
Sure, Facebook knowing who is posting reliable news stories is innocuous enough at the start. But then maybe they start letting the dating apps know who is trustworthy and who isn’t. Maybe they start evaluating the posts and statuses of those untrustworthy folks.
Oh, John says he’s having a night out with the boys? Well, his YouTube history and Tinder profile show he’s right at home on his couch. Maybe we should correct that and let everyone know he’s a big fat liar!
Jane posted a photo on Instagram of her with Alice at Mick’s Bar? Funny, Alice’s geotag shows that she’s not at Mick’s. But you know who is there? That co-worker Peter, whom she hasn’t added but whose pics she’s commented on a good amount. Maybe we should let Jane’s husband know about this.
Ryan is looking at Michelle’s “Spring Break 2011” photo album again. Hm, that’s the fifteenth time this month. I wonder what has him so fascinated with her seven-year-old college photos? Maybe we should let Michelle and everyone else know.
Okay, that last one actually doesn’t sound too bad, but you get my drift. If Facebook begins to grade how trustworthy we are with the news articles we share, how long until they’re scrutinizing every bit of information we have online? And who knows who they might pass that information on to.
Before we even get there, though, there’s the issue of Facebook deciding what is or is not “fake news.” I appreciate that they are using reputable fact-checking sites and are being transparent about that, but we cannot pretend there won’t be the potential for back-room deals to censor or not censor certain content. Right now, like it or not, companies like Facebook, Apple, and Google are distributors of information both to and about their consumers, and they often work alongside our government and other corporations to use that information in ways that may not be in our best interests.
There are a lot of complicated arguments to be made as to whether the vile filth spewed by people like Alex Jones would be protected from government censorship under the First Amendment, and whether the government’s relationship with those providers could extend those protections to sites like Facebook. It’s not an argument I necessarily agree with, but there is a clear lesson to be learned from the 2016 election: we cannot put blind faith in any company to put our best interests ahead of its financial interests when disseminating or restricting our access to information.
Reviewing the information we share for accuracy and truth sounds great…as long as we can trust that the people doing the reviewing consider only accuracy and truth. And if the prospect of any company, site, or even our own government doing this sort of scrutiny makes you uneasy, think about them doing it to the people you interact with. Not to get all conspiracy theory, but what happens when those trust ratings become visible to everyone? The moment anyone creates a comprehensive system that tells you which of the people around you can or cannot be believed is the moment that truly insidious levels of control can develop. And ol’ Josh won’t be around to scream “wake up, Sheeple!” because they will likely have silenced us nutjobs long ago.
So, let’s tread lightly. And remember that old adage. First, they came for the fake news, and I said nothing, because I block everyone who posts stuff from Infowars. Then they came for the Instagram models, and I said nothing, for my girlfriend will kill me if she sees any hot girls on my timeline. But then they came for me, because I posted a meme on my wall and cats cannot actually do chemistry experiments, and there was no one left to speak for me. They were all locked in those cages that Bryce Dallas Howard was left in at the end of “Nosedive.”
[via The Verge]