If you aren’t at least a little bit technologically savvy, the latest scandal involving social media giant Facebook can make your head hurt.

In essence, Facebook is under fire after admitting that it gave away the personal information of tens of millions of its users.

The more in-depth explanation is that Facebook allowed a psychology professor at the University of Cambridge to harvest data from Facebook users who downloaded his app. Those users unknowingly gave the professor — Aleksandr Kogan — permission to harvest data from all their Facebook friends, as well. Kogan then violated Facebook’s rules at the time by handing over that data to Cambridge Analytica, a data analysis firm with ties to conservative stalwarts that helped elect President Donald J. Trump.

It’s anyone’s guess how much of a role that data played in the 2016 presidential election, if any, but it could theoretically have been used to shape campaign strategy by micro-targeting specific users with tailored political advertising.

It isn’t a new strategy. As social media has revolutionized the way humans communicate, we’ve been leaving a data trail that defines us, from the sports we enjoy to our latte preferences. All of those things, along with our age, gender, geographic location, religion and so on, can be plugged into complicated computer formulas to predict our political leanings and voting tendencies.

Some are dismayed that the data may have been used to help elect Trump, but the truth is that Hillary Clinton’s campaign used data mining as well, as did Barack Obama’s during his 2012 re-election effort. The only difference with Cambridge Analytica is how the firm obtained the data: essentially, without Facebook’s permission.

It remains to be seen what the fallout will be. Cambridge Analytica insists that it did nothing wrong, but it probably bears liability for receiving the data in violation of Facebook’s policies. And Facebook could find itself running afoul of a 2011 agreement with the Federal Trade Commission that required the social networking giant to better protect its users’ data. At the very least, Facebook’s public image has taken a giant knot on the forehead, and it’s one that won’t soon subside.

This latest episode of a technological giant’s breach of trust has reignited a debate that has existed since the internet became a household institution. We use apps, websites and the world wide web itself with no assumption of privacy, or at least we shouldn’t have one, because even when major players like Google and Facebook promise to let us protect our privacy, they probably aren’t playing as nicely as they’d have us believe. That’s part of the cost of this digital era in which we live. There’s a tradeoff for the “free” online services and connectivity we enjoy. It’s a different argument for a different day, but it goes back to the number one rule of marketing: if the product is free, the user is the product.

And yet, as we bash Facebook for not protecting our data and bad actors like Kogan for abusing it, there remains a simple, if inconvenient, truth that won’t go away: if we were among the 50 million Facebook users whose data was used in a purported effort to sway the 2016 election, it’s only because we allowed ourselves to be.

The personal data we share with Facebook is entirely voluntary. There may be some non-personal data gathered through cookies, but unless Mark Zuckerberg is far more nefarious than any of us imagine — certainly there have been reports that he is; take them for what they’re worth — his company should not be collecting non-personal data that could be used to determine how we’re likely to vote in an upcoming election.

Instead, that information is derived by plugging the personal data we share into complicated algorithms. If we “check in” at a Starbucks, we might be presumed more likely to vote one way; if we “check in” at a Bass Pro Shops store, we might be presumed more likely to vote the other way.
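At bottom, those “complicated algorithms” are often just weighted sums of signals. Here is a toy sketch in Python of what that might look like. Every feature name and weight below is invented purely for illustration; this is not Cambridge Analytica’s actual model, only the general idea of adding up weighted signals to produce a lean score.

# A purely illustrative toy model. All features and weights are
# invented for this example; real systems are trained on millions
# of users, but the mechanics are the same kind of weighted sum.
FEATURE_WEIGHTS = {
    "checked_in_starbucks": -0.6,  # assumed, for the sake of example, to lean one way
    "checked_in_bass_pro": 0.8,    # assumed to lean the other way
    "liked_nascar_page": 0.5,
    "liked_npr_page": -0.7,
}

def political_lean_score(user_activity):
    """Sum the weights of every signal this user has exhibited.

    user_activity is a set of feature names, e.g. pages a user has
    liked or places where the user has checked in. Unknown signals
    contribute nothing.
    """
    return sum(FEATURE_WEIGHTS.get(signal, 0.0) for signal in user_activity)

# Two hypothetical users, scored from nothing but voluntary activity.
print(political_lean_score({"checked_in_starbucks", "liked_npr_page"}))    # roughly -1.3
print(political_lean_score({"checked_in_bass_pro", "liked_nascar_page"}))  # roughly 1.3

A real model would be trained on thousands of signals rather than four hand-picked ones, but the point stands: the inputs are things we volunteered.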

And we’ve known this data mining to be taking place for years. We’ve just chosen to turn a blind eye to it. For example, have you ever wondered how that ad for cheap shoes on Amazon just happened to present itself on your Facebook news feed a half-hour after you searched for stilettos on the Amazon app?

The bottom line is that Facebook doesn’t know what we don’t tell it. That sounds much simpler than it really is, because over time, everything we say and do can be used to predict far more about us than even our best friends know.

But, in truth, does it matter? After heavily perusing Facebook throughout the 2016 election cycle, I’d venture a guess that Cambridge Analytica didn’t have to mine too deeply. Never have American voters been so eager or so bold in proclaiming their voting preferences as in the last presidential election, and that seems likely to set the trend for elections to come. Whether you were pro-Trump or pro-Hillary, there’s a pretty good chance you left no doubt in the minds of your Facebook friends as to how you intended to vote. The result was a lot of bickering and disdain among “friends” as the election played out, and by the time the final ballots had been counted, a lot of us had fewer friends than we started with.

The same now applies on a local scale. As the 2018 general election draws near, many of us are boldly using Facebook to tell our friends how we intend to vote, because, of course, we see it as a way to influence how they’ll vote in turn.

As for me, my intent is to leave you, and Cambridge Analytica, if that were possible, guessing. I’ve always seen voting preferences as a deeply personal thing. Sure, whether we’re registered to vote and whether we voted in a specific election are matters of public record, but it’s really none of your business who I’m voting for, just as it’s none of my business who you’re voting for.

Not that I’m faulting anyone who uses Facebook to campaign for their candidates of choice. To each his own, and I’m quite sure that I’ve said a lot of things on Facebook, even if in jest, that have left some of my friends scratching their heads and wondering if I’ll ever learn to keep my mouth shut.

But I can honestly say that no one could scroll through my Facebook posts and determine who I voted for in the 2016 presidential election (not even, dare I say, some fancy computer formula), and no one will be able to scroll through my posts and determine who I’m going to vote for in the upcoming county mayor’s election. I have friends who are running against one another in the local election, and I have friends on both the conservative and liberal ends of national politics. I enjoy the fact that we can have conversations about political matters without falling out with one another over our voting preferences. I’m not going to allow Facebook to change that, especially considering that I’m unlikely to sway your vote, no matter how hard I try.

The lesson here is just this: each person’s Facebook feed is his own. And Facebook only has access to the data that we’re providing it as we punch our iPhone screen as quickly as our thumbs will move. Data mining isn’t new and isn’t going to stop. The only way to remove ourselves from the equation is to unplug completely. In the meantime, we make it a lot easier every time our thumbs move. If we choose to be an open book, there’s a good chance we’re gonna get read.

Ben Garrett is Independent Herald editor. Contact him at bgarrett@ihoneida.com. Follow him on Twitter, @benwgarrett.