Zuckerberg implements changes to Facebook news

Lynnette Hardaway and Rochelle Richardson, otherwise known as Diamond and Silk, testified on Capitol Hill last week before the House Judiciary Committee. The sisters, social media sensations, have racked up 1.5 million Facebook followers and millions of hits on YouTube. The duo testified that they were being censored on Facebook.

Social advertising is the top communications channel for apartment communities to rent units and retain residents, particularly Millennials. According to one study, Facebook’s reach is most heavily weighted toward the 22-37 age group, at roughly 70 percent. Those using the platform are there for many reasons: to stay current socially, catch up on news and more. Just how concerned should apartment owners and operators be about the platform’s recent changes, if at all?

Three weeks after Mark Zuckerberg’s testimony on Capitol Hill, in which he neither confirmed nor denied the remarks, suggestions or admonishments of the senators questioning him, Facebook’s market value continues to slowly recover from the whole affair.

One apparent result of the unfortunate chain of events that began with Cambridge Analytica’s controversial use of Facebook data in a political campaign is the company’s launch of a news ranking system. While Facebook has always implemented such a system, it will now do so with… uh, verve.

Zuckerberg announced at Facebook’s F8 developer conference yesterday that the company has implemented a new system that ranks news organizations by trustworthiness and will promote or suppress those organizations’ content on the platform based on that metric.

The metric, he said, is built from Facebook users’ familiarity with those news organizations and their perceptions of, and level of trust in, those outlets.

“We put that data into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time,” he said. “We feel like we have a responsibility to further break down polarization and find common ground.”
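For readers curious what a “boost or suppression” might look like mechanically, here is a minimal, hypothetical sketch. The weights, survey fields and scoring formula below are illustrative assumptions only; Facebook has not published how its system actually computes or applies trust scores.

```python
# Hypothetical sketch of a trust-weighted ranking adjustment.
# All weights and thresholds here are assumptions for illustration,
# not Facebook's actual formula.

def publisher_trust_score(familiarity: float, trust: float) -> float:
    """Combine survey-derived familiarity (0-1) and trust (0-1) into one score."""
    # Weighting trust more heavily than familiarity is an assumption.
    return 0.3 * familiarity + 0.7 * trust

def adjusted_rank(base_score: float, familiarity: float, trust: float) -> float:
    """Boost or suppress a story's baseline feed score based on its publisher's trust."""
    score = publisher_trust_score(familiarity, trust)
    # Scores above 0.5 act as a boost, below 0.5 as a suppression.
    multiplier = 0.5 + score  # ranges from 0.5x (suppress) to 1.5x (boost)
    return base_score * multiplier

# Two stories with identical engagement but different publisher survey results:
print(adjusted_rank(100.0, familiarity=0.9, trust=0.8))  # widely trusted outlet -> boosted
print(adjusted_rank(100.0, familiarity=0.4, trust=0.2))  # little-known, distrusted outlet -> suppressed
```

The point of the sketch is simply that the same piece of content can be ranked higher or lower depending on survey data about its publisher, which is the dial Zuckerberg says Facebook intends to turn up over time.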

The day prior to the F8 event, an option suddenly appeared below Facebook posts that read: “Does this post contain hate speech?” A public firestorm ensued, and the feature was deactivated after 20 minutes. Facebook responded that it was an internal test that had accidentally gone live.

Some have expressed skepticism about Facebook’s latest initiative. New York Times editor Joseph Kahn said that public feedback is an unreliable metric. Adrienne LaFrance, editor of The Atlantic, believes that Zuckerberg either doesn’t understand journalism or doesn’t care.

The media after-party: OTR

The bigger news was found at the after-party down the street at the Rosewood Sand Hill hotel in Menlo Park. Dubbed OTR (off the record), this is the second year that news executives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic and the New York Post met with Zuckerberg.

During the meeting, Zuckerberg admitted that Facebook took down tens of thousands of accounts in prior elections, but that the company had missed “coordinated information operations” such as Cambridge Analytica. He’s now committed to investing “billions” of dollars in both artificial intelligence and human moderators to address such threats in the future, according to BuzzFeed News.

The CEO says the company was more focused on encrypting WhatsApp, an earlier acquisition that made the tech giant the world’s largest fully encrypted communications network. He also acknowledged changes in the market. Market share in the collection of personal data, he says, is becoming more fragmented as users are afforded more choices and advertisers, in turn, are offered more hyper-targeted audiences. Zuckerberg admitted that people are sharing less personal information on News Feed, moving instead to other publishing surfaces such as Groups or Messenger.

Facebook’s business goals for the future? To fund more investigative reporting. Does that begin with Facebook becoming more operationally transparent? Not so much. “If I feel like I can’t communicate stuff openly,” Zuckerberg said, “then that breaks down our internal trust.”

Section 230

Finally, recent interpretations of Section 230 of the Communications Decency Act that hold tech companies responsible for the content on their servers (such as child pornography or libelous commentary uploaded by users) have had no impact on Facebook, according to Zuckerberg. He believes that AI and human moderators have allowed, and will continue to allow, the company to control what’s published on its platform.

A platform that exercises subjective judgment over content, however, could place Facebook on a collision course with Section 230, which some lawmakers argue requires platforms to act as “neutral public forums” if they wish to avoid liability for the content they publish.