By Deepa Seetharaman
After a year of controversy over Facebook Inc.'s role in spreading misinformation and its handling of violent images, Chief Executive Mark Zuckerberg now is positioning the company as the backbone of what he hopes will be a new "social infrastructure" addressing some of humanity's biggest problems.
In a nearly 6,000-word manifesto Thursday, Mr. Zuckerberg outlined ambitions for the 13-year-old social network to play a larger role in tackling issues including terrorism, disease and climate change, alongside the work of governments, nonprofit organizations and other companies.
Facebook is investing more in building products that can alleviate some of these issues, Mr. Zuckerberg wrote, though he outlined few concrete steps. He also said Facebook was developing tools to fight the spread of misinformation more effectively, to detect terrorist propaganda through artificial intelligence, and to promote political engagement, both nationally and globally.
"Today's threats are increasingly global, but the infrastructure to protect us is not," Mr. Zuckerberg wrote. "Humanity's current systems are insufficient to address these issues."
The post underscores how much Facebook has evolved since its founding in 2004 in a Harvard dorm room. From its start as a social hub for college students, it has become a powerful shaper of views and an essential hub of information, social ties and communication for its nearly two billion monthly users -- roughly a quarter of the world's population.
"For the past decade, Facebook has focused on connecting friends and families," the 32-year-old CEO wrote in the note, published on Facebook. "With that foundation, our next focus will be developing the social infrastructure for community -- for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all."
The post also comes after a trying year at Facebook, which faced criticism for, among other things, the design of its news feed, which put legitimate news sites on equal footing with those peddling misinformation during the U.S. presidential campaign. The company also drew fire for failing to catch violent live videos and for inconsistently applying its content standards, such as when it deleted posts containing a famous Vietnam War photo of a naked girl fleeing napalm bombs last fall. After considerable public uproar, Facebook reversed that decision.
Mr. Zuckerberg has, at times, struggled to articulate a coherent response to these concerns, sparking both public and internal backlash. Speaking at a conference two days after the U.S. election, he dismissed accusations that fake news on Facebook tipped the election as a "crazy idea." This didn't go over well with many employees, who argued that the social network should be doing more to confront fake news as well as the "filter bubble" in which many users see few ideas or little information different from their own, current and former employees said at the time.
Just over a week later, Mr. Zuckerberg changed his stance, saying that while Facebook didn't want to be in the position of determining the truth, the company takes hoaxes very seriously. Since then, Facebook has taken steps to curb fake news through partnerships with fact-checking organizations and tweaks to its algorithm to demote news items that are deemed "disputed." It also is investing in news literacy and working more closely with publishers.
Thursday, Mr. Zuckerberg said fake news and filter bubbles worried him but that a greater concern was "polarization."
Facebook wants to show users a wider range of perspectives and demote sensationalized news, but has to be careful to do so without deepening divisions, Mr. Zuckerberg wrote, citing research showing that people hold tighter to their beliefs when confronted with an opposing view. "Our goal must be to help people see a more complete picture, not just alternate perspectives," he wrote.
Additionally, Facebook is revamping the way it handles objectionable content to give users the ability to set parameters on how much nudity, violence, profanity and graphic content they are willing to see, Mr. Zuckerberg said. For users who don't take those steps, their settings will default to the content choices made by the majority of users in their region.
Longer term, Mr. Zuckerberg wants to build artificial intelligence that can detect violent content and terror-recruiting networks. Some of that work can be done now, he said, but major advances are still needed to build effective systems that can catch hate speech, graphic violence or sex.
Write to Deepa Seetharaman at Deepa.Seetharaman@wsj.com