(CNN Business) —  

Facebook is doing a lot of little things to try to address its bigger problems.

On Wednesday, the company announced more than a dozen updates about how it is addressing misinformation and other problematic content on Facebook, Instagram and Messenger. To promote the various efforts, the company held a four-hour event at its Menlo Park headquarters for around 20 reporters, where employees from various Facebook products recapped changes and answered questions.

For years, Facebook has grappled with the spread of controversial content on its platform, such as misinformation about elections, anti-vaccination stories, violence and hate speech.

Facebook has been trying to remove content that breaks its rules more quickly, and to "reduce" the spread of content that doesn't explicitly violate its policies but is still troublesome, such as clickbait and misinformation.

“We don’t remove information from Facebook just because it’s false. We believe we have to strike a balance,” Facebook’s VP of integrity Guy Rosen said at the event. “When it comes to false information by real people, we aim to reduce distribution and provide context.”

For example, Facebook said it will lessen the reach of groups that often share misinformation. When users in a group frequently share content that has been deemed false by Facebook’s third-party fact checkers, that group’s content will be pushed lower in News Feed so fewer people see it.

There will also be a “click-gap” signal, which will affect a link’s position in the News Feed. With this feature, Facebook hopes to reduce the spread of websites that are disproportionately popular on Facebook compared to other parts of the web.

It is working with experts to identify new ways to combat fake news on the platform. The Associated Press is expanding the work it does for Facebook’s independent fact-checking program, too.

The company has frequently described its issues with problematic content as “adversarial.” In the company’s framing, it is fighting an enemy that learns and changes tactics. The bundle of changes it announced on Wednesday are its newest weapons.

Facebook policy bans content that it determines can result in "imminent physical violence." Employees on Wednesday defended the company's decision not to ban all misinformation or anti-vaccination content on its products.

“When it comes to thinking about harm, it is really hard … to draw a line between a piece of content and something that happens to people offline,” said Tessa Lyons, Facebook’s head of News Feed integrity.

She said some of the posts that appeared to be anti-vaccination involved people asking questions, seeking information and having conversations around the topic.

“There is a tension between enabling expression and discourse and conversation, and ensuring that people are seeing authentic and accurate information. We don’t think that one private company should be making decisions about what information can or cannot be shared online,” she said.

Renee Murphy, a principal analyst at research firm Forrester who covers security and risk, said that while Facebook's steps are positive, they don't do nearly enough to address some of its larger problems.

"Part of me says, 'Awesome, [this content] won't go as far as it used to,'" she said. "The other part says, 'I have no trust in any of this.' At the end of the day, what is any of this going to do? How will they manage it?"

Facebook is also trying to be more transparent with users about how and why it makes decisions. As part of the effort, the company is adding a new section to its Community Standards website where users can see the updates Facebook makes to its policies every month.

Another update lets users remove comments and other content they posted to a Facebook Group after they leave it.

Meanwhile, Facebook-owned Instagram is trying to squash the spread of inappropriate posts that don’t violate its policies. For example, a sexually suggestive photo would still pop up in a feed if a user follows that account, but it may no longer be recommended for the Explore Page or in pages for hashtags.

Facebook also announced a few updates to its chat service Messenger, including a Facebook verified badge that would show up in chats to help fight scammers who impersonate public figures.

Another tool, called the Forward Indicator, will pop up in Messenger when a message has been forwarded. WhatsApp, another Facebook-owned app, has a similar feature as part of an effort to stop the spread of misinformation. WhatsApp has had major issues with viral hoax messages spreading on the platform, which have resulted in more than a dozen lynchings in India.

Forrester’s Murphy believes the company should do more to address major issues such as violence being livestreamed and going viral on the platform. Last month, a suspected terrorist was able to stream live video to Facebook of a mass murder in New Zealand. The company said its AI systems failed to catch the video, and it took down 1.5 million videos of the attack in the first 24 hours.

"They have bigger problems. I'm sure [these updates] will help sometimes, but there are bigger problems afoot," she said. "Facebook has a lot more to do."