Mark Zuckerberg would like you to know that, despite a scathing report in The New York Times portraying Facebook as a ruthless and self-interested corporation, things are getting better, at least as he sees it.
In a lengthy call with reporters on Thursday, and an equally lengthy note published on Facebook, the company's CEO introduced a range of changes Facebook is making, designed to crack down on toxic content on the platform and bring greater transparency to content decisions. Perhaps the most notable update: Facebook's News Feed algorithm will now try to limit the spread of sensational content on the platform, a major shift from how the social network has traditionally approached moderation. All of it is in service of restoring trust in a company whose reputation, and that of its leaders, has taken near-constant blows over the past two years.
"When you have setbacks like the ones we've had this year, that's a big issue, and it erodes trust, and it takes time to build that back," Zuckerberg said on the call. "Certainly, our job is not just to have this stuff at a good level and to constantly improve, but to get ahead of new issues. I think over the last couple of years, one of the areas where we've been most behind is on the election issues."
Those words come one day after the Times published a damning report depicting Facebook as not merely behind on questions of election interference, as Zuckerberg suggested, but actively working to downplay what it knew about the interference. The report argued that Facebook's executives, wary of picking sides in the partisan dispute over Russian involvement in the 2016 election, sought to minimize Russia's role in spreading propaganda on the platform. According to the story, Facebook's former chief security officer, Alex Stamos, was chastised by chief operating officer Sheryl Sandberg for investigating Russian activity without the company's approval, and again for revealing too much about it to members of Facebook's board.
In his remarks, Zuckerberg flatly denied that claim. "We've certainly stumbled along the way, but to suggest that we weren't interested in knowing the truth, or that we wanted to hide what we knew, or that we tried to prevent investigations, is simply untrue," he said. (Stamos, for his part, tweeted earlier on Thursday that he "never told Mark, Sheryl or any other executives not to investigate.")
The Times also reported that Facebook waged a smear campaign against its critics through an opposition research firm called Definers Public Relations. The firm worked repeatedly to link Facebook's opponents, including groups like the Open Markets Institute and Freedom from Facebook, to the billionaire George Soros. Critics say that in doing so, Facebook engaged in the same anti-Semitic tropes used by white nationalists and other hate groups that regularly target Soros.
Zuckerberg denied any personal knowledge of Definers' work for Facebook, adding that he and Sandberg only heard about the relationship on Wednesday. That is despite the fact that Definers frequently coordinated calls with the press on Facebook's behalf and, in at least one case, sat in on meetings between Facebook and the media.
After reading the story in the Times, Zuckerberg said, Facebook immediately ended its relationship with the firm. "This type of firm might be normal in Washington, but it's not the kind of thing I want Facebook associated with, which is why we're no longer working with them."
While maintaining that he had no knowledge of Definers' work or its messaging, the CEO defended Facebook's criticism of activist groups such as Freedom from Facebook. The intent was not to attack Soros, for whom Zuckerberg said he had "tremendous respect," but to show that Freedom from Facebook "was not a spontaneous grassroots effort."
Zuckerberg declined to assign blame for the tactics Definers deployed, or to comment on broader personnel questions at Facebook itself. He would only say that Sandberg, who oversees Facebook's lobbying operation and is portrayed unflatteringly throughout the Times story, is "doing a great job for the company." "She has been an important partner to me, and continues to be, and will continue to be," Zuckerberg added. (Sandberg was not on the call.)
Not for the first time this year, Zuckerberg found himself working overtime to clean up Facebook's messes, even as he was eager to tout the company's progress. In Myanmar, where fake news on Facebook has fueled a brutal ethnic cleansing campaign against the Rohingya people, the company has hired 100 Burmese speakers to moderate content and now automatically flags 63 percent of the hate speech it takes down, up from just 13 percent at the end of last year. Facebook has expanded its safety and security team to 30,000 people globally, more than the 20,000 it originally planned to hire this year. It has also changed its content moderation process, allowing people to appeal the company's decisions about content they post or report. And on Thursday, Facebook announced that within the next year it will establish an independent oversight body to handle content appeals.
But by far the biggest news to come out of Thursday's announcements is the change coming to Facebook's News Feed algorithm. Zuckerberg acknowledged what most observers already know to be one of Facebook's most fundamental problems: sensational and provocative posts, even ones that don't explicitly violate Facebook's policies, tend to get the most engagement on the platform. "As content gets closer to the line of what is prohibited by our community standards, we see people tend to engage with it more," he said. "This seems to be true no matter where we set our policy lines."
That problem arguably underlies most of Facebook's troubles over the past few years. It's why divisive political propaganda was so successful during the 2016 campaign, and why fake news has been able to flourish. Until now, Facebook has operated a binary system: content either violates the rules or it doesn't, and if it doesn't, it's free to rack up millions of clicks, even when the poster's intent is to mislead and stoke outrage. Now Facebook says that even content that doesn't explicitly violate its rules may see its reach reduced. According to Zuckerberg's post, that includes "photos close to the line of nudity" and "posts that don't come within our definition of hate speech but are still offensive."
Zuckerberg called the shift "a big part of the solution for making sure that polarizing or sensational content isn't spreading in the system, and we're having a positive effect on the world."
With this step, Facebook is taking a risk. Limiting engagement with its most popular content will likely cost the company money. And such a dramatic change undoubtedly opens Facebook up to even more accusations of censorship, at a time when the platform faces constant criticism from all sides.
Nevertheless, Facebook is betting big on the change. If outrage is no longer rewarded with ever more clicks, the thinking goes, maybe people will behave better. That Facebook is prepared to take such a chance says a lot about the public pressure the company has faced over the past two years. After all, what does Facebook have to lose?