
Facebook's safety head tells UK lawmakers it does not amplify hate

Thu, 28th Oct 2021 20:08

By Paul Sandle

LONDON, Oct 28 (Reuters) - Facebook Inc's algorithms demote
rather than promote polarising content, its global head of
safety told British lawmakers on Thursday, adding that the U.S.
company would welcome effective government regulation.

Governments in Europe and the United States are grappling
with regulating social media platforms to reduce the spread of
harmful content, particularly for young users.

Britain is leading the charge by bringing forward laws that
could fine social media companies up to 10% of their turnover if
they fail to remove or limit the spread of illegal content.

Secondary legislation that would make company directors
liable could be proposed if the measures do not work.

Facebook whistleblower Frances Haugen told the same committee
of lawmakers on Monday that Facebook's algorithms pushed
extreme and divisive content to users.

Facebook's Antigone Davis denied the charge.

"I don't agree that we are amplifying hate," Davis told the
committee on Thursday, adding: "I think we try to take in
signals to ensure that we demote content that is divisive for
example, or polarising."

She said she could not guarantee a user would not be
recommended hateful content, but Facebook was using AI to reduce
its prevalence to 0.05%.

"We have zero interest in amplifying hate on our platform
and creating a bad experience for people, they won't come back,"
she said. "Our advertisers won't let it happen either."

Davis said Facebook, which announced on Thursday it would
rebrand as Meta, wanted regulators to contribute to making
social media platforms safer, for example in research into
eating disorders or body image.

"Many of these are societal issues and we would like a
regulator to play a role," she said, adding Facebook would
welcome a regulator with "proportionate and effective
enforcement powers".

"I think criminal liability for directors is a pretty
serious step and I'm not sure we need it to take action."
(Reporting by Paul Sandle; Editing by Alexander Smith)
