Facebook's Mark Zuckerberg breaks silence over Christchurch shootings


Facebook CEO Mark Zuckerberg finally broke his silence on the shooting in Christchurch in an interview with ABC News in the US.

Zuckerberg called the mosque massacres, which killed 50 people and were livestreamed to Facebook by the shooter, "a really terrible event."

But the Facebook founder was ambivalent when interviewer George Stephanopoulos suggested that a broadcast delay, similar to the short delay used in live TV, might help.

"You know, you can, in this case," Zuckerberg replied.

"But it would also fundamentally break what livestreaming is for people. Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together. And it's one of the magical things about livestreaming. So you're not just broadcasting, you're communicating, and people are commenting back. So if you had a delay, that would break it."

Earlier this week, privacy commissioner John Edwards criticised Facebook for failing to introduce new safeguards that would prevent a repeat of the March 15 livestream.

In an open letter on March 30, Facebook's chief operating officer Sheryl Sandberg said the social network was "exploring" the idea of livestreaming restrictions for users who had violated its community standards.

Yesterday, the Australian Parliament passed a tough new law under which social media companies can be fined up to 10 percent of their revenue, and executives jailed for up to three years, if they do not take "swift" action to remove violent material. Attorney-General Christian Porter said: "There are platforms like YouTube, Twitter and Facebook that do not seem to take seriously their responsibility not to show the most abhorrently violent material."

A partial transcript of the interview is below. Read the full transcript here.

STEPHANOPOULOS: Do you think social media has made acts of extreme violence more prevalent?

ZUCKERBERG: It's hard to say. I think this is going to be something that's studied for a long time. I certainly haven't seen data that suggests it has. And I think the hope is that, by giving everyone a voice, you're creating a diversity of views that people can have, and even though sometimes ugly views surface, I think the democratic tradition we have is that you want to get these issues on the table so you can deal with them. Of course, that's why I care so much about issues like policing harmful content and hate speech, right? I don't want our work to be something that amplifies really negative stereotypes or promotes hatred. That's why we're investing so much in building up these artificial intelligence systems. We now have 30,000 people doing content and safety review to do the best job we can of proactively identifying and removing that kind of harmful content.

STEPHANOPOULOS: What did you learn from the New Zealand experience a few weeks ago? It took about an hour to take down that live video. Clearly, it seems like something that shouldn't be able to happen on social media. What did you learn from it? What more can be done to stop it?

ZUCKERBERG: Yes, I mean, this was a really terrible event. And we worked with the police in New Zealand, and we still are. There were a couple of different parts to what I think we learned. The first was about the live video itself. But really, I think most of it was the second part, which is all the copies that were uploaded afterwards.


ZUCKERBERG: So the live video was seen about 200 times while it was live. Most of those, it seems, were people from a different online community, off Facebook, where this terrorist basically announced what he was about to do. So they went to watch. And many of those viewers copied the video so it could be uploaded multiple times. So one of the big takeaways is that we need to build our systems to be more advanced, so we can more quickly identify a terrorist event being livestreamed as it is happening, which is a terrible thing.

STEPHANOPOULOS: Would a delay help, any delay of livestreaming?

ZUCKERBERG: You know, you can, in this case. But it would also fundamentally break what livestreaming is for people. Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together. And one of the things that's magical about livestreaming is that it's bidirectional, right? So you're not just broadcasting. You're communicating. And people are commenting back. So if you had a delay, that would break it.

STEPHANOPOULOS: Even 10, 15, 20 – we have seven-second TV delays.

ZUCKERBERG: But you're not getting comments that are …

STEPHANOPOULOS: No, we get a lot of comments. But you're right.

ZUCKERBERG: (Laughter) Well, yes. But getting back to the point – one of the things we saw was that, of the roughly 200 people who watched this video while it was live, many of them copied it and made different versions that they tried to upload. In the first 24 hours, our systems automatically took down 1.2 million copies of that video that people were trying to upload. We took down another 300,000 that people flagged to us and that our systems didn't catch proactively. But one of the things this signalled to me, overall, was how hard bad actors will try to circumvent our systems. It wasn't just one copy of the video; there were 800 different versions of it that people tried to upload. And they often made slightly different versions to try to get around the systems that would catch the video as people tried to upload it. So it comes back to some of these issues around policing harmful content and preventing election interference. These aren't things you solve completely, right? They're arms races, where we need to make sure our systems stay ahead of sophisticated bad actors who are always trying to beat them. And that's just part of the dynamic we're in. And we always need to keep investing more to stay ahead of it.


