
Facebook has a much bigger problem than Russian ads

BY JACK MORSE From Mashable

Much noise has rightly been made about the role Facebook played in the 2016 presidential election. Critics have pointed to a targeted ad campaign by Russian groups as proof that the Menlo Park-based company wasn’t minding the store — and alleged that disaster followed as a result.

But that argument overlooks one key point: In showing microtargeted “dark ads” to users, Facebook was doing exactly what it was designed to do. The larger problem is not these specific Russian ads (which Facebook refuses to disclose to the public) — or even that Donald Trump was elected president — but the very system upon which the company is built.

Mark Zuckerberg’s plan to increase transparency on political advertisements, while welcome, falls into the same trap. Yes, more disclosure is good, but what is the remedy when the underlying architecture itself is gangrenous?

Zeynep Tufekci, author of Twitter and Tear Gas and associate professor at the University of North Carolina at Chapel Hill, made this point painfully clear in a September TED Talk that examined how the same algorithms designed to serve us better-targeted ads on platforms like Facebook can also be deployed for much darker purposes.

“So Facebook’s market capitalization is approaching half a trillion dollars,” Tufekci told the gathered crowd. “It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change.”

Tufekci further argued that when machine learning comes into play, humans can lose track of exactly how algorithms work their magic. And, she continued, not fully understanding how the system works has potentially scary consequences — like advertising Vegas trips to people about to enter a manic phase.

This concern is real. Facebook can now infer all kinds of data about its users, from their political views to their religious affiliations to their intelligence, and much more. What happens when that power is made available to anyone with a small advertising budget? Or, worse, to an oppressive government?

“Imagine what a state can do with the immense amount of data it has on its citizens,” noted Tufekci. “China is already using face detection technology to identify and arrest people. And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads.”

Facebook bills itself as a company striving to bring “the world closer together,” but the truth of the matter is far different. It is, of course, a system designed to collect an endless amount of data on its users with the goal of nudging us toward whatever behavior the company believes is in its best interest — be that purchasing an advertised item, voting, or being in a particular mood.

That’s a fundamental problem that cuts to Facebook’s very core, and it’s not one that a new political ad disclosure policy will fix.


For more on this story go to: http://mashable.com/2017/10/27/facebook-ads-algorithm-problem/
