Losing Face(book)? Part 2


by Alma Anonas Carpio

Facebook CEO Mark Zuckerberg (AP Photo/Andrew Harnik)

Facebook began notifying users whose data had been used by Cambridge Analytica on April 10, the same day Zuckerberg began his two-day testimony before the US Congress.

On the second day of the hearing, Zuckerberg said, “Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring.”

He also spoke of how much Facebook has grown from its origins in a college dorm.

“People everywhere have gotten a powerful new tool for staying connected to the people they care about most, for making their voices heard and for building community and businesses,” he said.

He cited the “Me Too” movement and the recent March for Our Lives, which were “organized, at least in part, on Facebook.”

Disaster response, such as that mounted after Hurricane Harvey, has also become one of the most important uses of Facebook’s tools for connecting people. In the Philippines, that was true when Typhoons Ondoy, Sendong and Yolanda hit, and hit hard. Social media used for disaster response has proven to be a swiftly and easily deployed lifeline for people in the calamity zone.

Zuckerberg also said “there are more than 70 million small businesses around the world that use our tools to grow and create jobs. But it’s clear now that we didn’t do enough to prevent these tools from being used for harm, as well. And that goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake.”

“It was my mistake, and I am sorry. I started Facebook, I run it, and, at the end of the day, I am responsible for what happens here. So, now, we have to go through every part of our relationship with people to make sure that we’re taking a broad enough view of our responsibility,” he added in his opening remarks. “It’s not enough to just connect people. We have to make sure those connections are positive. It’s not enough to just give people a voice. We need to make sure that voice isn’t used to harm other people or spread misinformation. And it’s not enough to just give people control of their information. We need to make sure that the developers that they share it with protect their information too.”

It is refreshing to see a corporate chief take responsibility for his company and its actions so boldly. It is even more refreshing to see Facebook reaching out to users and doing what it can, as a technology company, to make right what it got wrong. But transparency from the very beginning would have been so much better, and we are not talking about the tl;dr (too long, didn’t read) legalese technology companies are so fond of putting up as “end-user agreements” before allowing you to use their platforms, be those for email, social networking or online shopping.

We all know that even those who know better don’t want to wade through that long-winded “cover my behind” text, even if, yes, as in the case of Facebook’s data breach, we are told “you agreed to that.”

Inviolable rights are just that, and no legalese should ever sanction violating those rights: If I lend Ben a pen, that pen still belongs to me. Ben does not have any right to pass that pen on to another person who will use it, not without my permission. The same thing goes for data you permit any business organization or government agency to gather as proof of your identity. What you choose to share with the world must serve the purpose you determine, not some other faceless person’s agenda. And definitely not the political agenda of an organization or person you may or may not elect to support in any way.

Now, Zuckerberg assures the questioning panel that “we’re working with governments in the US, the [United Kingdom] and around the world to do a full audit of what [Cambridge Analytica has] done and to make sure that they get rid of any data that they still have.”

The data was obtained through a personality-quiz app developed by Aleksandr Kogan, a researcher at Cambridge University, who collected a dataset covering approximately 50 million Facebook users four years ago.

Kogan passed the information he’d gathered on to Cambridge Analytica, which claimed then, but now denies, that it used this data to craft political ads for Donald Trump’s 2016 presidential campaign.

“[T]o make sure that no other app developers are out there misusing data, we’re now investigating every single app that had access to a large amount of people’s information on Facebook in the past,” Zuckerberg said. “[I]f we find someone that improperly used data, we’re going to ban them from our platform and tell everyone affected.”

“[T]o prevent this from ever happening again, we’re making sure developers can’t access as much information, going forward,” he added. “The good news here is that we made some big changes to our platform in 2014 that would prevent this specific instance with Cambridge Analytica from happening again today.”

“My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that for as long as I am running Facebook,” he said. “We now serve more than 2 billion people around the world, and, every day, people use our services to stay connected with the people that matter to them most. I believe deeply in what we’re doing, and I know that, when we address these challenges, we’ll look back and view helping people connect and giving more people a voice as a positive force in the world.”
