Facebook Whistleblower Frances Haugen testifies at a Senate Commerce, Science and Transportation Committee hearing on protecting kids online.


Facebook saw teens creating secret accounts

 

that are often hidden from their parents

 

as a unique value proposition, in their words,

 

a unique value proposition,

 

a way to drive up numbers for advertisers

 

and shareholders at the expense of safety.

 

And it doubled down on targeting children,

 

pushing products on preteens.

 

Not just teens, but preteens.

 

That it knows are harmful to our kids’

 

mental health and well-being.

 

Instead of telling parents

 

Facebook concealed the facts,

 

it sought to stonewall and block this

 

information from becoming public,

 

including to this committee when Senator

 

Blackburn and I specifically asked

 

the company.

 

And still even now.

 

As of just last Thursday when a Facebook

 

witness came before this committee,

 

it has refused disclosure or even to

 

tell us when it might decide whether

 

to disclose additional documents.

 

And they continued their tactics even

 

after they knew the destruction it caused.

 

It isn’t just that they made money

 

from these practices,

 

but they continued to profit from them.

 

Their profit was more important

 

than the pain that they caused.

 

Last Thursday the message

 

from Ms. Antigone Davis,

 

Facebook’s global head of safety, was simple.

 

Quote, “This research is not

 

a bombshell,” end quote.

 

And she repeated the line, not a bombshell.

 

Well, this research is

 

the very definition of a bombshell.

 

Facebook and Big Tech are

 

facing a big tobacco moment.

 

A moment of reckoning.

 

The parallel is striking.

 

I sued Big Tobacco as Connecticut’s

 

Attorney general.

 

I helped to lead.

 

The states in that legal action,

 

and I remember very, very well the moment,

 

in the course of our litigation,

 

when we learned of those files that showed

 

not only that Big Tobacco knew

 

that its product caused cancer,

 

but that they had done the research.

 

They concealed the files.

 

And

 

now we knew, and the world knew. And

 

Big Tech now faces that Big Tobacco

 

jaw-dropping moment of truth. It is

 

documented proof that Facebook

 

knows its products can be addictive

 

and toxic to children and it’s not

 

just that they made money again,

 

it’s that they valued their profit

 

more than the pain that they caused

 

to children and their families.

 

The damage to self-interest and self-worth

 

inflicted by Facebook today

 

will haunt a generation.

 

Feelings of inadequacy

 

and insecurity, rejection,

 

and self-hatred will impact

 

this generation for years to come.

 

Our children are the ones who

 

are victims. Teens today, looking

 

at themselves in the mirror,

 

feel doubt and insecurity.

 

Mark Zuckerberg ought to be looking

 

at himself in the mirror today.

 

And yet, rather than taking responsibility

 

and showing leadership, Mr.

 

Zuckerberg is going sailing.

 

His new modus operandi, no apologies,

 

no admission, no action, nothing to see here.

 

Mark Zuckerberg,

 

you need to come before this committee.

 

You need to explain

 

to Frances Haugen, to us, to the world,

 

And to the parents of America.

 

What you were doing and why you did it.

 

Instagram’s business model is pretty

 

straightforward: more eyeballs,

 

more dollars. Everything Facebook does

 

is to add more users and keep them on

 

their apps for longer. In order to hook us,

 

Instagram uses our private information

 

to precisely target us with

 

content and recommendations,

 

assessing what

 

will provoke a reaction,

 

what will keep us scrolling.

 

Far too often these recommendations

 

encourage our most destructive and dangerous

 

behaviors as we showed on Thursday.

 

We created a fake account,

 

my office and I did, as a teen interested

 

in extreme dieting and eating disorders.

 

Instagram latched onto that teenager’s

 

initial insecurities, then pushed

 

more content and recommendations

 

glorifying eating disorders.

 

That’s how Instagram’s algorithms

 

can push teens into darker and darker places.

 

Facebook’s own researchers

 

called it Instagram’s, quote, “perfect storm,”

 

exacerbating downward spirals.

 

Facebook, as you have put it

 

so powerfully, maximizes

 

profits and ignores pain.

 

Facebook’s failure to acknowledge and

 

to act makes it morally bankrupt.

 

Again and again,

 

Facebook rejected reforms recommended

 

by its own researchers.

 

Last week, Ms. Davis said, quote,

 

“we’re looking at,” end quote.

 

No specific plans, no commitments,

 

only vague platitudes.

 

These documents that you have revealed

 

provided this company with a blueprint for

 

reform and provided specific recommendations

 

that could have made Facebook and

 

Instagram safer. The company repeatedly

 

ignored those recommendations from its own

 

researchers that would have made

 

Facebook and Instagram safer.

 

Facebook researchers have suggested

 

changing their recommendations

 

to stop promoting accounts

 

known to encourage

 

dangerous body

 

comparisons.

 

Instead of making meaningful changes,

 

Facebook simply pays lip service.

 

And if they won’t act,

 

and if Big Tech won’t act,

 

Congress has to intervene.

 

Privacy protection is long overdue.

 

Senator Markey and I have introduced the

 

Kids Act which would ban addictive tactics

 

that Facebook uses to exploit children.

 

Parents deserve better tools

 

to protect their children.

 

I’m also a firm supporter

 

of reforming Section 230.

 

We should consider narrowing this sweeping

 

immunity when platforms’ algorithms

 

amplify illegal conduct.

 

You’ve commented on this in your testimony,

 

and perhaps you’ll expand on it.

 

We have also heard compelling recommendations

 

about requiring disclosures of research

 

and independent reviews of these platforms’

 

algorithms,

 

and I plan to pursue these ideas.

 

The Securities and Exchange Commission should

 

investigate your contentions and claims,

 

Ms. Haugen, and so should the

 

Federal Trade Commission.

 

Facebook appears to have misled the public

 

and investors and if that’s correct,

 

it ought to face real penalties as a

 

result of that misleading and deceptive

 

misrepresentation.

 

I want to thank all my colleagues

 

who are here today, because

 

What we have is a bipartisan congressional

 

road map for reform that will safeguard

 

and protect children from big tech.

 

That will be a focus of our subcommittee

 

moving forward, and it will continue

 

to be bipartisan.

 

And finally, I’ll just end on this note.

 

In the past weeks and days,

 

parents have contacted me

 

with their stories, heartbreaking

 

and spine-chilling stories about

 

children pushed into eating disorders,

 

bullying online, self-injury of the most

 

disturbing kind,

 

and sometimes even taking their

 

lives because of social media.

 

Parents are holding Facebook accountable

 

because of your bravery.

 

And we need to hold accountable

 

Facebook and all big tech as well.

 

Again, my thanks to you.

 

I am going to enter into the record a

 

letter from 52 state attorneys general

 

and from two members of the Youth

 

Advisory Board of Sandy Hook Promise.

 

As long as there’s no objection,

 

I will now turn to the

 

ranking member, Senator Blackburn.

 

Thank you Mr.

 

Chairman and thank you for entering

 

that letter in the record that we have

 

from our state attorneys general.

 

Good morning to everyone.

 

It is nice to see people in this hearing

 

room and to be here for the hearing today.

 

Ms. Haugen,

 

We thank you for your appearance

 

before us today and for giving the

 

opportunity not only for Congress

 

but for the American people

 

to hear from you in this setting.

 

And we appreciate that, Mr.

 

Chairman.

 

Thanks also to you and your

 

staff, who have worked with our team to

 

make certain that we had this hearing

 

and this opportunity today so that

 

we can get more insights into what

 

Facebook is actually doing as they

 

invade the privacy not only of adults,

 

but of children, and look at the

 

ways that they are in violation of the

 

Children’s Online Privacy Protection Act,

 

which is federal law,

 

and looking at how they are evading

 

that law and working around it,

 

and as the chairman said,

 

privacy, online privacy,

 

passing a federal privacy standard,

 

has been long in the works.

 

I filed my first privacy bill when

 

I was in the House back in 2012,

 

and I think that it will be this Congress

 

and this subcommittee that is going

 

to lead the way to online privacy.

 

data security, Section 230 reforms and,

 

of course,

 

Senator Klobuchar always wants to talk

 

about antitrust. And I have to give a nod:

 

Senator Markey is down there.

 

When we were in the House,

 

we were probably two of the only

 

ones who were talking about the need

 

to have a federal privacy standard.

 

Now, as the chairman mentioned last week,

 

we heard from Ms. Davis,

 

who heads global safety for Facebook,

 

and it was surprising to us that

 

what she tried to do was to minimize

 

the information that was in these

 

documents, to minimize the research,

 

and to minimize the knowledge

 

that Facebook had.

 

At one point I even reminded her that the

 

research was not third-party research.

 

The research was Facebook’s own

 

internal research, so they knew

 

What they were doing,

 

they knew where the violations were,

 

and they know they are guilty.

 

They know this.

 

Their research tells them this.

 

Last week, in advance of our hearing,

 

Facebook released two studies and said

 

that the Wall Street Journal was all wrong.

 

They had just gotten it wrong,

 

as if the Wall Street Journal did not

 

know how to read these documents and

 

how to work through this research.

 

Having seen the data that you’ve

 

presented and the other studies that

 

Facebook did not publicly share,

 

I feel pretty confident that

 

it’s Facebook who has done the

 

misrepresenting to this committee.

 

Here are some of the numbers that

 

Facebook chose not to share, and Mr.

 

Chairman,

 

I think it’s important that we look

 

at these as we talk about the setting

 

for this hearing: what we learned

 

Last week,

 

what you and I have been learning

 

over the past three years about

 

big Tech and Facebook.

 

And here you go,

 

66% of teen girls on Instagram

 

and 40% of teen boys experience

 

negative social comparisons.

 

This is Facebook’s research,

 

52% of teen girls who experienced

 

negative social comparison on

 

Instagram said it was caused

 

by images related to beauty.

 

Social comparison is worse on Instagram

 

because it is perceived as real life,

 

but based on celebrity standards.

 

Social comparison mimics the grief

 

cycle and includes a downward

 

emotional spiral encompassing a

 

range of emotions from jealousy to

 

self-proclaimed body dysmorphia.

 

Facebook addiction,

 

which Facebook conveniently calls

 

“problematic use,”

 

is most severe in teens, peaking at age 14.

 

Here’s what else we know.

 

Facebook is not interested in making

 

significant changes to improve

 

kids safety on their platforms,

 

at least not when that would result

 

in losing eyeballs on posts or

 

decreasing their ad revenues. In fact,

 

Facebook is running scared as they

 

know that in their own words,

 

young adults are less active and

 

less engaged on Facebook and that

 

they are running out of teens

 

to add to Instagram.

 

So teens are looking at other platforms

 

like TikTok, and Facebook is only

 

making those changes that add to its

 

users and numbers and ultimately

 

its profits. Follow the money.

 

So what are these changes? Allowing

 

users to create multiple accounts

 

that Facebook does not delete,

 

and encouraging teens to create second

 

accounts they can hide from their parents.

 

They are also studying younger and

 

younger children as young as eight.

 

so that they can market to them. And while

 

Ms. Davis says that kids below 13

 

are not allowed on Facebook or Instagram,

 

we know that they are because she told

 

us that they recently had deleted

 

600,000 accounts from children under age 13.

 

So how do you get that many underage

 

accounts if you aren’t turning a

 

blind eye to them in the first place?

 

And then in order to try to clean it up,

 

you go to delete it and then

 

you say, oh, by the way,

 

we just in the last month deleted

 

600,000 underage accounts.

 

And Speaking of turning a blind eye,

 

Facebook turns a blind eye to user privacy.

 

News broke yesterday that the

 

private data of over 1.5 billion.

 

That’s right.

 

1.5 billion Facebook users is

 

being sold on a hacking forum.

 

That’s its biggest data breach to date.

 

Examples like this underscore my

 

strong concerns about Facebook

 

collecting the data of kids and teens

 

and what they’re doing with it.

 

Facebook also turns a

 

blind eye toward blatant

 

human exploitation taking place

 

on its platform, trafficking,

 

forced labor, cartels,

 

the worst possible things one can imagine.

 

Big tech companies have gotten away

 

with abusing consumers for too long.

 

It is clear that Facebook

 

prioritizes profit over the well

 

being of children and all users.

 

So as a mother and a grandmother,

 

this is an issue

 

that is of particular concern to me.

 

So we thank you for being here today.

 

Ms. Haugen, and we look forward to

 

getting to the truth about what

 

Facebook is doing with users data.

 

And how they are abusing their privacy

 

and how they show a lack of respect for

 

the individuals that are on their network.

 

We look forward to the testimony.

 

Thank you Mr.

 

Chairman. Thanks,

 

Senator Blackburn.

 

Thank you Senator Blackburn.

 

I don’t know whether the ranking

 

member would like to make an opening statement.

 

If you don’t mind.

 

Thank you.

 

Chairman,

 

Blumenthal, and I will just

 

take a moment or two and I do

 

appreciate being able to speak.

 

As ranking member of the full

 

committee. Ms. Haugen, this

 

is a subcommittee hearing;

 

you see some vacant seats.

 

This is pretty good attendance

 

for our subcommittee.

 

There are also a lot of things going on,

 

so people will be coming and going.

 

But I’m willing to predict that

 

this will have almost 100% attendance

 

by members of the Subcommittee,

 

because of the importance

 

of this subject matter.

 

So thanks for coming forward to share

 

concerns about Facebook’s business practices,

 

particularly with respect to

 

children and teens. And of course,

 

that is the main topic of our,

 

it’s the title of our hearing today:

 

protecting kids online.

 

The recent revelations about

 

Facebook’s mental health effects on children.

 

And its plans to target younger

 

audiences are indeed disturbing.

 

And I think you’re going to see a

 

lot of bipartisan concern about this

 

today and in future hearings.

 

They show how urgent

 

it is for Congress to act against

 

powerful tech companies on behalf

 

of children and the broader public.

 

And I say powerful tech companies.

 

They are

 

possessed of immense,

 

immense power.

 

Their product is addictive and

 

people on both sides of this

 

dais are concerned about this.

 

I talked to an opinion maker just

 

down the hall a few moments before

 

this hearing. This person said,

 

the tech gods

 

have been demystified now,

 

and I think this hearing today, Mr.

 

Chair,

 

is part of the process of

 

demystifying big tech.

 

The Children of America are

 

hooked on their product.

 

It is often destructive and harmful,

 

and there is a cynical knowledge on

 

behalf of the leadership of these big tech

 

companies that that is true. Ms. Haugen,

 

I hope you will have a chance

 

to talk about your work

 

experience at Facebook.

 

And perhaps compare it to other

 

social media companies.

 

Also look forward to hearing your

 

thoughts on how this committee and

 

how this Congress can ensure greater

 

accountability and transparency,

 

especially with regard to children.

 

So thank you, Mr.

 

Chairman and thank you,

 

Ms. Haugen, for being here today.

 

Thanks Senator Wicker,

 

our witness this morning is Frances Haugen.

 

She was the lead product manager on

 

Facebook’s civic misinformation team.

 

She holds a degree in electrical

 

and computer engineering from Olin

 

College and an MBA from Harvard.

 

She made the courageous decision

 

as all of us here and many others

 

around the world know, to leave

 

Facebook and reveal the terrible

 

truths about the company

 

she learned during her tenure there.

 

And I think we are all in agreement

 

here in expressing our gratitude

 

and our admiration for your bravery

 

in coming forward. Thank you.

 

Ms. Haugen, please proceed.

 

Good afternoon, Chairman

 

Blumenthal, Ranking Member Blackburn,

 

and members of the subcommittee.

 

Thank you for the opportunity

 

to appear before you.

 

My name is Frances Haugen.

 

I used to work at Facebook.

 

I joined Facebook because I

 

think Facebook has the potential

 

to bring out the best in us.

 

But I’m here today because I believe

 

Facebook’s products harm children,

 

stoke division and weaken our democracy.

 

The company’s leadership knows how to

 

make Facebook and Instagram safer,

 

but won’t make the necessary

 

changes because they have put their

 

astronomical profits before people.

 

Congressional action is needed.

 

They won’t solve this crisis

 

without your help.

 

Yesterday we saw Facebook

 

get taken off the Internet.

 

I don’t know why it went down,

 

but I know that for more than five hours,

 

Facebook wasn’t used to deepen

 

divides, destabilize democracies,

 

and make young girls and women

 

feel bad about their bodies.

 

It also means that millions of

 

small businesses weren’t able

 

to reach potential customers and

 

countless photos of new babies

 

weren’t joyously celebrated by

 

family and friends around the world.

 

I believe in the potential of Facebook.

 

We can have social media we enjoy

 

that connects us without tearing

 

our democracy apart,

 

putting our children in danger, and

 

sowing ethnic violence around the world.

 

We can do better.

 

I have worked as a product manager

 

at large tech companies since

 

2006, including Google, Pinterest,

 

Yelp and Facebook.

 

My job has largely focused on

 

algorithmic products like Google

 

Plus search and recommendation

 

systems like the one that powers

 

the Facebook newsfeed.

 

Having worked on four different

 

types of social networks,

 

I understand how complex and

 

nuanced these problems are.

 

However,

 

the choices being made inside

 

of Facebook are disastrous.

 

For our children. For our public safety.

 

For our privacy and for our democracy.

 

And that is why we must demand

 

Facebook make changes.

 

During my time at Facebook,

 

first working as the lead product

 

manager for civic misinformation

 

and later on counterespionage,

 

I saw Facebook repeatedly encounter

 

conflicts between its own profits

 

and our safety.

 

Facebook consistently resolved these

 

conflicts in favor of its own profits.

 

The result has been more division,

 

more harm, more lies,

 

more threats and more combat.

 

In some cases, this dangerous

 

online talk has led to actual violence

 

that harms and even kills people.

 

This is not simply a matter of certain

 

social media users being angry or unstable,

 

or about one side being

 

radicalized against the other.

 

It is about Facebook choosing

 

to grow at all costs,

 

becoming an almost trillion-dollar company

 

by buying its profits with our safety.

 

During my time at Facebook,

 

I came to realize the devastating truth.

 

Almost no one outside of Facebook

 

knows what happens inside of Facebook.

 

The company intentionally hides

 

vital information from the public

 

from the U.S. government, and from

 

governments around the world.

 

The documents I have provided to

 

Congress prove that Facebook has

 

repeatedly misled the public about

 

what its own research reveals

 

about the safety of

 

children, the efficacy of its artificial

 

intelligence systems and its role in

 

spreading divisive and extreme messages.

 

I came forward because I believe

 

that every human being deserves

 

the dignity of the truth.

 

The severity of this crisis demands that we

 

break out of our previous regulatory frames.

 

Facebook wants to trick you into thinking

 

that privacy protections or changes to

 

section 230 alone will be sufficient.

 

While important,

 

these will not get to the core of the issue,

 

which is that no one truly

 

understands the destructive choices

 

made by Facebook except Facebook.

 

We can afford nothing less

 

than full transparency.

 

As long as Facebook is operating

 

in the shadows,

 

hiding its research from public scrutiny,

 

it is unaccountable.

 

Until the incentives change.

 

Facebook will not change.

 

Left alone, Facebook will continue

 

to make choices that go against

 

the common good our common good.

 

When we realized Big Tobacco

 

was hiding the harms

 

it caused, the government took action.

 

When we figured out cars were safer with

 

seatbelts, the government took action.

 

And when our government learned

 

that opioids were taking lives,

 

the government took action.

 

I implore you to do the same here.

 

Today, Facebook shapes our perception of the

 

world by choosing the information we see,

 

even those who don’t use Facebook

 

are impacted by the majority who do.

 

A company with such frightening

 

influence over so many people over

 

their deepest thoughts, feelings,

 

and behavior needs real oversight.

 

But Facebook’s closed design

 

means it has no real oversight.

 

Only Facebook knows how it

 

personalizes your feed for you.

 

At other large tech companies like Google.

 

Any independent researcher can

 

download from the Internet

 

the company’s search results and write

 

papers about what they find, and they do.

 

But Facebook hides behind walls that

 

keep researchers and regulators

 

from understanding the true dynamics

 

of their system.

 

Facebook will tell you privacy

 

means they can’t give you data.

 

This is not true.

 

When tobacco companies claimed that

 

filtered cigarettes were safer for consumers,

 

scientists could independently

 

invalidate these marketing messages

 

and confirm that, in fact, they posed

 

a greater threat to human health.

 

The public cannot do the same with Facebook.

 

We are given no other option than to

 

take their marketing messages on blind faith.

 

Not only does the company hide

 

most of its own data,

 

my disclosure has proved that when

 

Facebook is directly asked questions

 

as important as how do you impact the

 

health and safety of our children?

 

they mislead;

 

they choose to mislead and misdirect.

 

Facebook has not earned our blind faith.

 

This inability to see into Facebook’s

 

actual systems and confirm

 

that they work as

 

communicated is like the Department of

 

Transportation regulating cars by only

 

watching them drive down the highway.

 

Today, no regulator has a menu of

 

solutions for how to fix Facebook,

 

because Facebook didn’t want them to know

 

enough about what’s causing the problems.

 

Otherwise, there wouldn’t have been

 

need for a whistleblower.

 

How is the public supposed to

 

assess if Facebook is resolving

 

conflicts of interest in a way that

 

is aligned with the public good?

 

If the public has no visibility

 

into how Facebook operates?

 

This must change.

 

Facebook wants you to believe that the

 

problems we’re talking about are unsolvable.

 

They want you to believe in false choices.

 

They want you to believe that you

 

must choose between a Facebook full

 

of divisive and extreme content,

 

or losing one of the most important values

 

our country was founded upon free speech.

 

That you must choose between

 

public oversight of Facebook

 

choices and your personal privacy.

 

That to be able to share fun photos

 

of your kids with old friends,

 

you must also be inundated

 

with anger driven virality.

 

They want you to believe that

 

this is just part of the deal.

 

I am here today to tell you that’s not true.

 

These problems are solvable.

 

A safer, free-speech-respecting, more

 

enjoyable social media is possible,

 

but there is one thing that I hope

 

everyone takes away from these disclosures.

 

It is that Facebook can change but is

 

clearly not going to do so on its own.

 

My fear is that without action

 

divisive and extremist behaviors we

 

see today are only the beginning.

 

What we saw in Myanmar and are now

 

seeing in Ethiopia are only the opening

 

chapters of a story so terrifying

 

no one wants to read the end of it.

 

Congress can change the rules

 

that Facebook plays by and stop

 

the many harms it is now causing.

 

We now know the truth about

 

Facebook’s destructive impact.

 

I really appreciate the seriousness

 

with which the members of Congress and the

 

Securities and Exchange Commission

 

are approaching these issues.

 

I came forward at great personal risk.

 

Because I believe we still have time to act.

 

But we must act now.

 

I’m asking you our elected

 

representatives to act.

 

Thank you.

 

Thank you, Ms. Haugen.

 

Thank you for taking that personal risk

 

and we will do anything and everything

 

to protect and stop any retaliation

 

against you and any legal action

 

that the company may bring

 

to bear against you or anyone else,

 

and we made that, I think,

 

very clear in the course

 

of these proceedings.

 

I want to ask you about this idea

 

of disclosure. You’ve talked about

 

looking, in effect, at a car going

 

down the road and we’re going to have

 

five-minute rounds of questions.

 

Maybe a second round if you’re

 

willing to do it.

 

We’re here today to look under the hood.

 

And that’s what we need to do more of.

 

In August, Senator Blackburn and I wrote to

 

Mark Zuckerberg and we asked

 

him pretty straightforward

 

questions about how the company

 

works and safeguards children

 

and teens on Instagram.

 

Facebook dodged,

 

deflected, sidetracked, in effect misled us.

 

So I’m going to ask you a few

 

straightforward questions to

 

break down some of what you have

 

said and if you can answer them

 

yes or no that would be great.

 

Has Facebook’s research,

 

its own research, ever found that

 

its platforms can have a negative

 

effect on children and teens’

 

mental health or well-being?

 

Many of Facebook’s internal research

 

reports indicate that Facebook has a

 

serious negative harm on a significant

 

portion of

 

teenagers and younger children.

 

And has Facebook ever offered features

 

that it knew had a negative effect

 

on children and teens’ mental health?

 

Facebook knows that its amplification

 

algorithms, things like engagement-based

 

ranking on Instagram can lead children from

 

very innocuous topics like healthy recipes.

 

I think all of us could eat a little more

 

healthy, all the way from just something

 

innocent like healthy recipes to

 

anorexia-promoting content, over a

 

very short period of time.

 

And has Facebook ever found, again

 

in its research, that kids show

 

signs of addiction on Instagram?

 

Facebook has studied a pattern

 

that they call problematic use.

 

What we might more commonly call addiction,

 

it has a very high bar for

 

what it believes it is.

 

It says you self-identify that you

 

don’t have control over your usage and that

 

it is materially harming your health,

 

your schoolwork or your physical health.

 

5 to 6% of 14-year-olds have the

 

self-awareness to admit to both those questions.

 

It is likely that far more than 5

 

to 6% of 14-year-olds are

 

addicted to Instagram.

 

Last Thursday my colleagues

 

and I asked Ms. Davis,

 

who was representing Facebook,

 

about how the decision would be

 

made whether to permanently pause

 

Instagram for kids.

 

And she said,

 

quote,

 

there’s no one person who makes

 

a decision like that.

 

We think about it collaboratively.

 

It’s as though she couldn’t

 

mention Mark Zuckerberg’s name.

 

Isn’t he the one who will be making this

 

decision from your experience in the company?

 

Mark holds a very unique role in the

 

tech industry in that he holds over 55%

 

of all the voting shares for Facebook.

 

There are no similarly powerful companies

 

that are as unilaterally controlled and

 

and in the end the buck stops with Mark.

 

There is no one currently

 

holding Mark accountable but himself.

 

And Mark Zuckerberg, in effect,

 

is the algorithm designer in chief, correct?

 

I received an MBA from Harvard, and they

 

emphasized to us that we are responsible

 

for the organizations that we build.

 

Mark has built an organization

 

that is very metrics driven.

 

It is intended to be flat.

 

There is no unilateral responsibility.

 

The metrics make the decision.

 

Unfortunately,

 

that itself is a decision and in the end,

 

if he is the CEO and the

 

chairman of Facebook,

 

he is responsible for those decisions.

 

The buck stops with him.

 

So the buck stops with him.

 

And speaking of the buck stopping,

 

you have said that

 

Facebook should declare moral bankruptcy.

 

I agree.

 

I think its actions and its failure

 

to acknowledge its responsibility

 

indicate moral bankruptcy.

 

There is a cycle occurring inside

 

the company where Facebook has

 

struggled for a long time to

 

recruit and retain the number of

 

employees it needs to tackle

 

the large scope of projects it has chosen to

 

take on. Facebook is stuck in

 

a cycle where it struggles,

 

struggles to hire, which causes

 

it to understaff projects,

 

which causes scandals,

 

which then makes it harder to hire.

 

Facebook needs to come

 

out and say, “we did something wrong,

 

we made some choices that we

 

regret.” The only way we can move

 

forward and heal Facebook is we

 

first have to admit the truth.

 

The way we’ll have reconciliation

 

and can move forward is by first being

 

honest and declaring moral bankruptcy.

 

Being honest and acknowledging

 

that Facebook has caused

 

and aggravated a lot of pain

 

to simply make more money.

 

And it has profited off spreading

 

disinformation and misinformation

 

and sowing hate.

 

Facebook's answers to Facebook's destructive impact always seem to be more Facebook.

 

We need more Facebook, which means more pain.

 

And more money for Facebook.

 

Would you agree?

 

I don’t think at any point Facebook set

 

out to make a destructive platform.

 

I think the challenge is that Facebook has set up an organization

 

where the parts of the organization

 

responsible for growing and expanding

 

the organization are separate from, and not regularly cross-pollinated with, the

 

parts of the company that focus on the

 

harms that the company is causing. As a result, integrity actions, projects that were hard fought for by the teams trying to keep us safe, are regularly undone by new growth projects that counteract those same remedies.

 

So I do think these are organizational problems that need

 

oversight and Facebook needs help

 

in order to move forward to a more

 

healthy place.

 

And whether it’s teens bullied

 

into suicidal thoughts.

 

or the genocide of ethnic minorities in Myanmar, or fanning the flames of division within our own country or in Europe, they are ultimately responsible for the immorality of the pain that is caused.

 

Facebook needs to take responsibility

 

for the consequences of its choices

 

and needs to be willing to accept small trade-offs on profit.

 

And I think just that act of being able to admit that it's a

 

mixed bag is important and I think

 

that what we saw from Antigone last

 

week is an example of the kind of

 

behavior we need to support Facebook in growing out of, which is, instead of just focusing on all the good they do, admitting they have responsibilities to also remedy the harm.

Mark Zuckerberg's new policy is

 

No apologies.

 

No admissions, no acknowledgement.

 

Nothing to see here.

 

We’re going to deflect it and go sailing.

 

I turn to the ranking member.

 

Thank you Mr.

 

Chairman,

 

thank you for your testimony.

 

I want to stay with Ms. Davis and some

 

of her comments because I had asked

 

her last week about the underage

 

users and she had made the comment.

 

I’m going to quote from her testimony if

 

we find an account of someone who’s under 13,

 

we remove them, and in the last three months we removed 600,000 accounts of under-13-year-olds, end quote. And

 

I have to tell you,

 

it seems to me that there is a

 

problem if you have 600,000 accounts

 

from children who ought not to be

 

there in the first place.

 

So what did Mark Zuckerberg know about

 

Facebook’s plans to bring kids on

 

as new users and advertise to them?

 

There are reports within Facebook

 

that show cohort analyses, where they examine at what ages people join Facebook and Instagram.

 

And based on those cohort analyses, so, Facebook likes to say children lie about their ages to get onto the platform. The reality is that enough kids tell the truth that you can work backwards to figure out the approximate real ages of anyone who is on the platform.

 

When Facebook does cohort analyses and looks back retrospectively, it discovers things like, you know, up to 10 to 15% of even 10-year-olds in a given cohort may be on Facebook or Instagram.
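The back-casting described here, estimating a cohort's real ages from birthdays users later correct, can be sketched as a toy model. Everything below (the data, field layout, and function names) is invented for illustration; it is not Facebook's actual pipeline:

```python
# Hypothetical sketch: estimating how many signups were really under 13.
# Idea from the testimony: enough users eventually report a truthful
# birthday that you can work backwards to a cohort's real ages.
from collections import Counter

# (signup_year, age_claimed_at_signup, birth_year_later_corrected_to)
signups = [
    (2015, 13, 2002),  # really 13 at signup
    (2015, 13, 2005),  # really 10 at signup
    (2015, 14, 2001),
    (2015, 13, 2004),  # really 11 at signup
    (2015, 15, 2000),
]

def real_age_at_signup(signup_year, birth_year):
    # Work backwards from the corrected birth year to the signup-time age.
    return signup_year - birth_year

underage = Counter()
total = 0
for year, claimed, birth in signups:
    total += 1
    if real_age_at_signup(year, birth) < 13:
        underage[year] += 1

share = underage[2015] / total
print(f"Estimated share of the 2015 cohort that was under 13: {share:.0%}")
```

Extrapolating that share from the users who eventually tell the truth is what lets a retrospective cohort analysis flag underage signups the age field alone would miss.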

 

OK, so this is why Adam Mosseri,

 

who’s the CEO of Instagram,

 

would have replied to JoJo

 

Siwa when she said to him, oh,

 

I’ve been on Instagram since I was eight.

 

He said he didn’t want to know that.

 

Ah, so it would be for this reason, correct?

 

A pattern of behavior that I saw at

 

Facebook was that often problems

 

were so understaffed that there was

 

a kind of an implicit discouragement

 

from having better detection systems.

 

So, for example, my last team at Facebook was the counterespionage team within the Threat Intelligence org,

 

and at any given time our team

 

could only handle a third of the

 

cases that we knew about.

 

We knew that if we built

 

even a basic detector,

 

we would likely have many more cases.

 

OK. So let me ask you this.

 

So you look at the way that they have the data, but they're choosing to keep that data and advertise from it, right? To sell it to third parties.

 

So what does Facebook do?

 

You’ve got these 600,000 accounts that

 

ought not to be on there anymore, right?

 

But then you delete those accounts.

 

But what happens to that data?

 

Does Facebook keep that data?

 

Do they keep it until those

 

children go to age 13?

 

Since, as you’re saying,

 

they can work backward and figure

 

out the true age of a user.

 

So what do they do with it?

 

Do they delete it?

 

Do they store it? Do they keep it?

 

How do they process that?

 

My understanding of Facebook's data retention policies (and I want to be really clear, I didn't work directly on that) is that when they delete an account, they delete all the data within, I believe, 90 days, in compliance with GDPR. With regard to underage children on the platform,

 

Facebook could do substantially more to detect more of those children, and they should have to publish those processes for Congress, because there are lots of subtleties in those things, and they could be much more effective than what they are probably doing today.

Got it.

 

Now, staying with these underage children, since this hearing is all about kids and about online privacy: I want you to tell me how Facebook is able to do market research on these children who are under age 13, because Ms. Davis really didn't deny this last week. So how are they doing this?

 

Do they bring kids into focus

 

groups with their parents?

 

How do they get that permission?

 

She said they got permission from parents.

 

Is there a permission slip or a form?

 

That gets signed and then how do

 

they know which kids to target?

 

Um, there's a bunch to unpack there. I will start with maybe how they recruit children for focus groups, or recruit teenagers.

 

Most tech companies have systems

 

where they can analyze the data

 

that is on their servers,

 

so most of the focus groups I read

 

or that I saw analysis of were around

 

Messenger Kids which has children on it.

 

And those focus groups appear to

 

be children interacting in person.

 

Often large tech companies use

 

either sourcing agencies that will

 

go and identify people who meet

 

certain demographic criteria,

 

or they will reach out directly based on criteria and data on the platform.

 

So for example,

 

in the case of Messenger kids,

 

maybe you would want to study a child who was an active user and one who was a less active user.

 

You might reach out to some that

 

came from each population and so

 

these are children that are underage.

 

Yeah. And they know it.

 

For some of these studies, I assume they get permission, but I don't work on that.

 

OK, well we’re still waiting to

 

get a copy of that parental consent

 

form that would involve children.

 

My time has expired, Mr. Chairman.

 

I’ll save my other questions for our

 

second round if we’re able to get those.

 

Thank you. Thank you.

 

Senator Blackburn. Senator Klobuchar.

 

Thank you very much. Mr.

 

Chairman,

 

thank you so much.

 

Ms. Haugen, for shedding a light

 

on how Facebook time and time

 

again has put profit over people.

 

When their own research found that

 

more than 13% of teen girls say

 

that Instagram made their thoughts

 

of suicide worse, what did they do?

 

They proposed Instagram for kids,

 

which has now been put on pause

 

because of public pressure.

 

When they found out that their

 

algorithms are fostering polarization

 

misinformation and hate, that they allowed 99% of their violent content to remain unchecked on their platform, including in the lead-up to the January 6th insurrection, what did they do?

 

Now, as we know, Mark Zuckerberg is going sailing and saying no apologies.

 

I think the time has come for

 

action and I think you are the

 

catalyst for that action.

 

You have said privacy

 

legislation is not enough.

 

I completely agree with you,

 

but I think you know we have

 

not done anything to update our

 

privacy laws in this country.

 

Our federal privacy laws.

 

Nothing, zilch, in any major way.

 

Why?

 

Because there are lobbyists around

 

every single corner of this building

 

that have been hired by the tech industry.

 

We have done nothing when it comes to

 

making the algorithms more transparent.

 

Allowing for the university

 

research that you referred to.

 

Why?

 

Because Facebook and the other

 

tech companies are throwing a

 

bunch of money around this town

 

and people are listening to that.

 

We have passed nothing significant, although we are, on a bipartisan basis,

 

Working in the Antitrust subcommittee

 

to get something done on consolidation,

 

which you understand,

 

allows the dominant platforms to

 

control all this like the bullies

 

in the neighborhood.

 

buying out the companies that maybe could have competed with them and added the bells and whistles. So

 

the time for action is now, so I’ll start.

 

I’ll start with something that I

 

asked Facebook’s head of safety when

 

she testified before us last week.

 

I asked her how they estimate the

 

lifetime value of a user for kids

 

who start using their products

 

before they turned 13.

 

She evaded the question and said

 

that’s not the way we think about it.

 

Is that right or is it your experience

 

that Facebook estimates and puts a value on how much money they get from users in general; we'll get to kids in a second.

 

Is that a motivating force for them?

 

Based on what I saw in terms of

 

allocation of integrity spending,

 

so one of the things disclosed in the

 

Wall Street Journal was that I believe

 

it’s like 87% of all the misinformation

 

spending is spent on English,

 

but only about like 9% of the

 

users are English speakers.

 

It seems that Facebook invests more in users who make them more money, even

 

though the danger may not be evenly

 

distributed based on profitability.

 

Does it make sense that having a

 

younger person get hooked on social

 

media at a young age makes them

 

more profitable over the long term

 

as they have a life ahead of them?

 

Facebook’s internal documents talk about

 

the importance of getting younger users,

 

for example,

 

tweens, onto Instagram, like Instagram Kids, because they need to have them. They know that children bring their parents online and things like that,

 

and so they understand the value

 

of younger users for the long-term success of Facebook.

Facebook reported advertising revenue to be

 

$51.58 per user last quarter

 

in the US and Canada.

 

When I asked Miss Davis how much of that

 

came from Instagram users under 18,

 

she wouldn't say. Do you think that

 

teens are profitable for their company?

 

I would assume so based on advertising

 

for things like television.

 

You get substantially higher advertising rates for customers who don't yet have preferences or habits, and so I'm sure they are some of the more profitable users on Facebook, but I do not work directly on that.

Now, there is a major issue that has come out of this: eating disorders. Studies have found

 

that eating disorders actually

 

have the highest mortality rate

 

of any mental illness for women,

 

and I led a bill on this with Senators Capito and Baldwin that we passed into law,

 

and I'm concerned that these algorithms that they have push outrageous content, promoting anorexia and the like.

 

I know it’s personal to you.

 

Do you think that their algorithms push

 

some of this content to young girls?

 

Facebook knows that their engagement

 

based ranking the way that they

 

picked the content in Instagram

 

for young users for all users

 

amplifies preferences and they have

 

done something called proactive incident response, where they take things that they've heard, for example, like:

 

Can you be led by the algorithms

 

to anorexia content and they have

 

literally recreated that experiment themselves and confirmed: yes, this happens to people.

 

So Facebook knows that they are leading young users

 

to anorexia content.

 

Do you think they are deliberately

 

designing their product to be

 

addictive beyond even that content?

 

Facebook has a long history of having

 

a successful and very effective growth

 

division where they take little

 

tiny tweaks and then constantly, constantly, constantly are trying to optimize it to grow. Those kinds of stickiness could be construed as things that facilitate addiction.

Right.

 

Last thing I’ll ask is we’ve seen this

 

same kind of content in the political world.

 

You brought up other countries

 

and what’s been happening there.

 

On 60 minutes you said that Facebook

 

implemented safeguards to reduce

 

misinformation ahead of the 2020 election,

 

but turned off those safeguards

 

right after the election.

 

And you know that the insurrection

 

occurred January 6.

 

Do you think that Facebook turned

 

off the safeguards because they

 

were costing the company money

 

because it was reducing profits?

 

Facebook has been emphasizing a false choice.

 

They've said the safeguards that were in place before the election implicated free speech.

 

The choices that were happening on

 

the platform were really about how

 

reactive and twitchy was the platform,

 

right?

 

Like how viral was the platform and

 

Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous, and because they wanted that growth back, they wanted the acceleration of the platform back, after the election they returned to their original defaults.

 

And the fact that they had to break the glass on January 6th and turn them back on, I think that's deeply problematic.

I agree. Thank you very much for your bravery in coming forward.

 

Senator Thune.

Thank you, Mr. Chair and ranking member Blackburn.

 

I’ve been arguing for some time that it

 

is time for Congress to act and I think

 

the question is always what is the correct

 

way to do it the right way to do it?

 

Consistent with our First

 

Amendment right to free speech?

 

This committee doesn’t have

 

jurisdiction over the antitrust issue.

 

That’s the Judiciary Committee,

 

and I’m not averse to looking at the

 

monopolistic nature of Facebook.

 

Honestly, I think that’s a real

 

issue that needs to be examined

 

and perhaps addressed as well,

 

but at least under this

 

committee’s jurisdiction,

 

there are a couple of things

 

I think we can do,

 

and I have a piece of legislation

 

and Senators Blackburn and

 

Blumenthal are both Co sponsors.

 

Called the filter Bubble Transparency

 

Act and essentially what it would do

 

is give users the options to engage

 

with social media platforms without

 

being manipulated by these secret

 

formulas that essentially dictate

 

the content that you see when you

 

open up an app or log onto a website.

 

We also,

 

I think need to hold big tech

 

accountable by reforming Section 230

 

and one of the best opportunities,

 

I think to do that,

 

at least for in a bipartisan way,

 

is the platform accountability

 

and Consumer Transparency Act, or the PACT Act.

 

And that’s legislation that I’ve

 

cosponsored with Senator Schatz,

 

which in addition to stripping section

 

230 protections for content that

 

a court determines to be illegal,

 

the Pact act would also increase

 

transparency and due process for users

 

around the content moderation process,

 

and, importantly,

 

in the context we’re talking about today.

 

With this hearing with a

 

major big tech whistleblower,

 

the PACT Act would explore the

 

viability of a federal program

 

for big Tech employees to blow

 

the whistle on wrongdoing inside the companies where they work.

In my view,

 

we should encourage employees in

 

the tech sector like you to speak

 

up about questionable practices

 

of big tech companies so we can,

 

among other things,

 

ensure that Americans are fully

 

aware of how social media platforms

 

are using artificial intelligence

 

and opaque algorithms to keep

 

them hooked on the platform.

 

So let me, Ms. Haugen, just ask you.

 

We’ve learned from the information that

 

you provided that Facebook conducts

 

what’s called engagement based ranking.

 

Which you've described as very dangerous.

 

Could you talk more about why

 

engagement based ranking is dangerous

 

and do you think Congress should

 

seek to pass legislation like the

 

filter Bubble Transparency Act?

 

That would give users the ability to

 

avoid engagement based ranking altogether?

 

Facebook is going to say you don’t want

 

to give up engagement based ranking.

 

You’re not gonna like Facebook as much if

 

we’re not picking out the content for you.

 

That's just not true.

 

Facebook likes to present things as false choices, like, you have to choose between having lots of spam. Let's say, imagine we ordered our feeds by time, like on iMessage or other forms of social media that are chronologically based.

 

They're going to say, you're going to get spammed, you're not going to enjoy your feed.

 

The reality is that those experiences

 

have a lot of permutations.

 

There are ways that we can make

 

those experiences where computers

 

don’t regulate what we see.

 

We together socially regulate what we see,

 

but they don’t want us to have that

 

conversation because Facebook knows

 

that when they pick out the content that we focus on, using computers,

 

we spend more time on their platform.

 

They make more money.

 

Uhm, the dangers of engagement based

 

ranking are that Facebook knows.

 

That content that elicits an

 

extreme reaction from you is more

 

likely to get a click, a comment,

 

or re share.

 

And it’s interesting because those

 

clicks and comments and shares aren’t

 

even necessarily for your benefit.

 

It’s because they know that other people

 

will produce more content if they get

 

the likes and comments and re shares.

 

They prioritize content in your

 

feed so that you will give little

 

hits of dopamine to your friends,

 

so they will create more content and

 

they have run experiments on people,

 

producers side experiments

 

where they have confirmed this.
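The contrast drawn here between engagement-based ranking and a chronological feed can be sketched as a toy model; the posts, field names, and prediction scores below are invented for illustration, and real ranking systems use learned models rather than a single predicted count:

```python
# Toy contrast between the two feed orderings discussed in the testimony.
posts = [
    {"id": "calm_update",  "posted_at": 3, "predicted_reactions": 2},
    {"id": "outrage_bait", "posted_at": 1, "predicted_reactions": 90},
    {"id": "family_photo", "posted_at": 2, "predicted_reactions": 10},
]

# Chronological: newest first; the platform makes no content choice.
chronological = sorted(posts, key=lambda p: p["posted_at"], reverse=True)

# Engagement-based: whatever is predicted to draw the most clicks,
# comments, and reshares rises to the top, regardless of recency or
# how extreme the content is.
engagement_ranked = sorted(
    posts, key=lambda p: p["predicted_reactions"], reverse=True
)

print([p["id"] for p in chronological])
print([p["id"] for p in engagement_ranked])
```

In this toy model the same inventory of posts yields two different feeds: the chronological ordering surfaces the newest post first, while the engagement ordering promotes the post predicted to provoke the strongest reaction.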

 

So, in part of the information you provided the Wall Street Journal, it's been found that Facebook altered its algorithm in an attempt to boost these meaningful social interactions, or MSI.

 

But rather than strengthening bonds between

 

family and friends on the platform,

 

the algorithm instead rewarded

 

more outrage and sensationalism.

 

And I think Facebook would say that

 

its algorithms are used to connect

 

individuals with other friends and

 

family that are largely positive.

 

Do you believe that Facebook’s

 

algorithms make its platform a better place for most users, and

 

should consumers have the option to use

 

Facebook and Instagram without being

 

manipulated by algorithms designed to

 

keep them engaged on that platform?

 

I strongly believe like I’ve spent

 

most of my career working on systems

 

like engagement based ranking like when

 

I come to you and say these things, I'm basically damning 10 years of my own work, right?

 

Engagement based ranking.

 

Facebook says we can do it safely because we have AI; you know, the artificial intelligence will find the bad content that we know our engagement-based ranking is promoting.

 

They've written blog posts on how they know engagement-based ranking is dangerous,

 

Facebook’s own research says

 

they cannot adequately identify

 

dangerous content and as a result,

 

those dangerous algorithms, they admit, are picking up the extreme sentiments, the division.

 

They can’t protect us from the harms

 

that they know exist in their own system,

 

and so I, I don’t think it’s just

 

a question of saying should people

 

have the option of choosing to not

 

be manipulated by their algorithms.

 

I think if we had appropriate oversight, or if we reformed 230 to make Facebook responsible for the consequences of their intentional ranking decisions,

 

I think they would.

 

They would get rid of engagement based

 

ranking because it is causing teenagers

 

to be exposed to more anorexia content.

 

It is pulling families apart and

 

in places like Ethiopia it’s

 

literally fanning ethnic violence.

 

Uhm, I encourage reform of these platforms.

 

Not not picking and choosing

 

individual ideas, but instead making

 

the platforms themselves safer,

 

less twitchy, less reactive, less viral.

 

’cause that’s how we scalably

 

solve these problems.

 

Thank you, Mr. Chair. I would simply say let's get to work; we've got some things we can do here. Thanks.

I agree. Thank you, Senator.

 

Thank you, Mr.

 

Chairman, ranking member.

 

Thank you for your courage in coming forward.

 

Was there a particular moment when

 

you came to the conclusion that reform

 

from the inside was impossible and

 

that you decided to be a whistleblower?

 

There was a long series of moments

 

where I became aware that Facebook,

 

when faced with conflicts of interest

 

between its own profits and the common good, public safety, Facebook consistently

 

chose to prioritize its profits.

 

I think the moment which I realized we

 

needed to get help from the outside,

 

that the only way these problems

 

would be solved is by solving them

 

together and not solving them

 

alone was when civic integrity was

 

dissolved following the 2020 election.

 

It really felt like a betrayal of the promises that Facebook had made to people who had sacrificed a great deal to keep the election safe, by basically dissolving our community and integrating it into just other parts of the company.

 

And I know their response is that they've sort of distributed the duties.

 

Yeah, that’s an excuse, right?

 

Um, I cannot see into the hearts of other men, and I don't know what they... let me say it this way: it won't work, right?

 

And I can tell you that when I left the company, the people who I worked with, disproportionately, maybe 75% of my pod of seven people, our product managers and program managers, mostly had come from Civic Integrity.

 

All of us left the inauthentic behavior

 

pod either for other parts of the

 

company or the company entirely over

 

the same six week period of time,

 

so six months after the reorganization,

 

we had clearly lost faith that

 

those changes were coming.

 

You said in your opening statement that

 

they know how to make Facebook and

 

Instagram safer, so thought experiment.

 

You are now the chief executive officer and chairman of the company.

 

What changes would you immediately institute?

 

I would immediately establish a

 

policy of how to share information and

 

research from inside the company with

 

appropriate oversight bodies like Congress.

 

I would give proposed legislation to Congress saying, here's what an

 

effective oversight agency would look like.

 

I would actively engage with academics to make sure that the people who are confirming whether Facebook's marketing messages are true have the information they need to confirm these things.

 

And I would immediately implement the, quote, soft interventions that were identified to protect the 2020 election.

 

So that’s things like requiring someone

 

to click on a link before re sharing it,

 

because other companies like Twitter

 

have found that that significantly

 

reduces misinformation.

 

No one is censored by being forced to

 

click on a link before re sharing it.
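The soft intervention described, gating the reshare action on the user having opened the link, can be sketched in a few lines; the function names and in-memory state below are hypothetical, not any platform's actual API:

```python
# Minimal sketch of a click-before-reshare gate. Nothing is censored;
# the reshare is merely deferred until the user has opened the link.
opened_links = set()

def open_link(user, url):
    # Record that this user actually clicked through to the content.
    opened_links.add((user, url))

def can_reshare(user, url):
    # Allow the reshare only after the link has been opened.
    return (user, url) in opened_links

assert not can_reshare("alice", "https://example.com/story")
open_link("alice", "https://example.com/story")
assert can_reshare("alice", "https://example.com/story")
```

The design point matches the testimony: the friction applies uniformly to the act of resharing, not to any particular content, which is why it reduces the spread of misinformation without censoring anyone.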

 

Thank you,

 

I want to pivot back to Instagram’s

 

targeting of kids.

 

We all know that they announced a pause,

 

but that reminds me of what they announced.

 

When they were going to issue a

 

digital currency, and they got beat up by the U.S. Senate Banking Committee

 

and they said never mind and now

 

they’re coming back around,

 

hoping that nobody notices that they

 

are going to try to issue a currency.

 

Now let’s set aside for the moment.

 

this sort of business model, which appears to be: gobble up everything, do everything. That's the growth strategy.

 

Do you believe that they’re actually

 

going to discontinue Instagram kids?

 

Or are they just waiting for the dust to settle?

I would be sincerely

 

surprised if they do not continue

 

working on Instagram kids and I would

 

be amazed if a year from now we

 

don’t have this conversation again.

 

Why? Facebook understands that if

 

they want to continue to grow,

 

they have to find new users.

 

They have to make sure that the next generation is just as engaged

 

with Instagram as the current one.

 

And the way they’ll do that is by making

 

sure that children establish habits

 

before they have good self-regulation, by hooking kids.

 

I would like to emphasize one of the

 

documents that we sent in on problematic use examined the rates of problematic use by age, and that peaked with 14-year-olds.

 

It’s it’s just like cigarettes.

 

Teenagers don’t have good self regulation.

 

They say explicitly, I feel bad when

 

I use Instagram and yet I can’t stop.

 

Uhm, we need to protect the kids.

 

Just my final question.

 

I have a long list of misstatements, misdirections, and outright lies from the company.

 

I don’t have the time to read them,

 

but you’re as intimate with

 

all these deceptions as I am,

 

so I will just jump to the end.

 

If you were a member of this panel,

 

would you believe what Facebook is saying?

 

I would not. Facebook has not earned our right to just have blind trust in them. Last week, one of the most beautiful things that I heard in the committee was that trust is earned, and Facebook has not earned our trust.

 

Thank you.

 

Thanks, Senator Schatz. Senator Moran, and then we've been joined by the chair, Senator Cantwell; she'll be next.

 

We’re going to break at

 

about 11:30 if that’s OK.

 

’cause we have a vote.

 

And then we’ll reconvene.

 

Mr. Chairman, thank you.

 

The conversation so far reminds me

 

that you and I ought to resolve our

 

differences and introduce legislation.

 

So as Senator Thune said, let’s go to work.

 

Our differences are very minor.

 

Or they seem very minor in the face of

 

the revelations that we’ve now seen,

 

so I’m hoping we can move forward.

 

Senator, I share that view, Mr.

 

Chairman, thank you.

 

Thank you very much for your testimony.

 

What examples do you know of? We've talked about children particularly, teenage girls specifically.

 

But what other examples do you know about where Facebook or Instagram knew its decisions would be harmful to its users, but still proceeded with the plan and executed that harmful behavior?

 

Facebook's internal research is aware that there are a variety of problems facing children on Instagram. They know that severe harm is happening to children.

 

For example, in the case of bullying,

 

Facebook knows that Instagram dramatically

 

changes the experience of high school.

 

So when I was in high school, you know, most kids had positive home lives; it doesn't matter how bad it is at school, kids can go home and reset for 16 hours. But kids who are bullied on Instagram:

 

The bullying follows them home.

 

It follows them into their bedrooms.

 

The last thing they see before

 

they go to bed at night is someone

 

being cruel to them or the first

 

thing they see in the morning is

 

someone being cruel to them.

 

Kids are learning that their own friends, people who care about them, are cruel to them.

 

Like,

 

think about how that’s going to

 

impact their domestic relationships

 

when they become 20 somethings or 30

 

somethings to believe that people

 

who care about you are mean to you.

 

Facebook knows that parents today,

 

because they didn’t experience these things.

 

They’ve never experienced this addictive

 

experience with a piece of technology.

 

They give their children bad advice.

 

They say things like,

 

why don't you just stop using it? And so Facebook's own research is aware that children express feelings of loneliness and of struggling with these things, because they can't even get support from their own parents.

 

I don't understand how Facebook can know all these things and not escalate it to someone like Congress for help and support in navigating these problems.

 

Let me ask the question in a broader way: besides teenagers, or besides girls, or besides youth,

 

are there other practices at Facebook

 

or Instagram that are known to

 

be harmful but yet are pursued?

 

Um, Facebook is aware of choices it made in establishing meaningful social interactions. So, engagement-based ranking that didn't care if you bullied someone or committed hate speech in the comments, that was "meaningful."

 

They know that that change directly changed publishers' behavior, that companies like BuzzFeed wrote in and said, the content that is most successful on our platform is some of the content we're most ashamed of; you have a problem with your ranking. And they did nothing.

 

They know that politicians are being forced to take positions they know their own constituents don't like or approve of, because those are the ones that get distributed on Facebook.

 

That's a huge, huge negative impact.

 

They also know, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but they have not rolled out those integrity and security systems to most of the languages in the world. And that's what is causing things like ethnic violence in Ethiopia.

 

Thank you for your answer.

 

What is the magnitude of Facebook’s

 

revenues or profits that come

 

from the sale of user data?

 

Oh I’m, I’m sorry I’ve never worked on that.

 

I’m not aware.

 

Thank you.

 

What regulations or legal actions by

 

Congress or by administrative action do

 

you think would have the most consequence

 

or would be feared most by Facebook,

 

Instagram or Allied companies?

 

I strongly encourage

 

reforming Section 230 to exempt

 

decisions about algorithms, right?

 

So?

 

Modifying 230 around content I think,

 

is very complicated because

 

user generated content is something

 

that companies have less control over.

 

They have 100% control over their algorithms.

 

And Facebook should not get

 

a free pass on the choices

 

it makes to prioritize growth and virality

 

and reactiveness over public safety.

 

They shouldn’t get a free pass on

 

that because they’re paying for their

 

profits right now with our safety,

 

so I strongly encourage reform of 230

 

In that way.

 

I also believe there needs to

 

be a dedicated oversight body,

 

because right now the only people

 

in the world who are trained

 

to analyze these experiments to

 

understand what’s happening inside

 

of Facebook are people who,

 

you know,

 

grew up inside of Facebook or Pinterest

 

or another social media company, and

 

there needs to be a regulatory home where

 

someone like me could do a tour of duty.

 

After working at a place like this

 

and have a place to work on

 

things like regulation to bring that

 

information out to the oversight

 

boards that have the right

 

to do oversight. A regulatory agency

 

within the federal government? Yes.

 

Thank you very much. Thank you, chairman.

 

Senator Cantwell thank you Mr.

 

Chairman, thank you for holding this hearing.

 

And I think my colleagues have brought

 

up a lot of important issues and so I

 

think I just want to continue on that vein.

 

First of all,

 

the Privacy act that I introduced,

 

along with several of my

 

colleagues actually does have FTC

 

oversight of algorithm transparency.

 

In some instances, I’d hope you take

 

a look at that and tell us what other

 

areas you think we should add to that

 

level of transparency. But clearly,

 

that’s the issue at hand here.

 

I think in your coming forward.

 

So thank you again for your

 

willingness to do that.

 

The documentation that you say

 

now exists is the level of

 

transparency about what’s going on

 

that people haven’t been able to see,

 

and so your information that you say

 

is going up to the highest levels at

 

Facebook is that they purposely knew

 

that their algorithms were continuing to

 

have misinformation and hate information.

 

And that, when presented with

 

information about this terminology,

 

you know, downstream MSI, meaningful social

 

interactions, knowing that it was this choice.

 

You could continue this

 

wrongheaded information.

 

Hate information about the

 

Rohingya or you could continue

 

to get higher clickthrough rates.

 

And I know you said you

 

don’t know about profits,

 

but I’m pretty sure you know that on a

 

page if you click through that next page,

 

I’m pretty sure there’s a lot more ad

 

revenue than if you didn’t click through.

 

So you’re saying the documents exist

 

that at the highest level at Facebook

 

you had information discussing

 

these two choices and that people

 

chose even though they knew that it

 

was misinformation and hurtful and

 

maybe even costing people’s lives,

 

they continued to choose profit.

 

We have submitted documents to

 

Congress outlining Mark Zuckerberg

 

was directly presented with a

 

list of quote soft interventions.

 

So a hard intervention is like taking

 

a piece of content off Facebook,

 

taking a user off Facebook. Soft

 

interventions are about making

 

slightly different choices to make

 

the platform less viral, less twitchy.

 

Mark was presented with these options

 

and chose to not remove downstream MSI.

 

In April of 2020,

 

even just isolated

 

to at-risk countries,

 

that’s countries at risk of violence.

 

If it had any impact on the overall

 

MSI metric.

 

So he chose, which in translation

 

means less money.

 

Did he say right there another reason

 

why they would do it, other than they

 

thought it would really affect their numbers?

 

I don’t know for certain.

 

Like Jeff Horwitz,

 

the reporter for the Wall Street Journal,

 

I struggled with this.

 

We sat there and read these minutes.

 

And we’re like, how is this possible?

 

Like we’ve just read 100 pages

 

on how downstream MSI expands,

 

hate speech, misinformation,

 

violence, inciting content,

 

graphic violent content?

 

Why won’t you get rid of this?

 

And the best theory that we’ve

 

come up with, and I want to emphasize

 

this is just our interpretation of it,

 

is that people’s bonuses are tied to MSI,

 

right?

 

Like people,

 

people stay or leave the company

 

based on what they get paid, and like

 

if you hurt MSI a bunch people

 

weren’t going to get their bonuses,

 

so you’re saying that this practice

 

even still continues today

 

like we’re still in this environment.

 

I’m personally very frustrated by this

 

because we presented information to

 

Facebook from one of my own constituents

 

in 2018 talking about this issue with

 

Rohingya pleading with the company.

 

We pleaded with the company and they

 

continue to not address this issue.

 

Now you’re pointing out that these

 

same algorithms are being used,

 

and they know darn well in Ethiopia

 

that it’s causing and inciting violence.

 

And again they are still today choosing

 

profit over taking this information down.

 

Is that correct?

 

When rioting began in the United

 

States in the summer of last year,

 

they turned off downstream MSI only for when

 

they detected content was health content,

 

which is probably COVID and civic content.

 

But Facebook’s own algorithms

 

are bad at finding this content.

 

It’s still in the raw form

 

for 80 to 90% of even that sensitive content

 

in countries where they don’t have

 

integrity systems in the language,

 

local language,

 

and in the case of Ethiopia,

 

there are 100 million people

 

in Ethiopia and six languages,

 

Facebook only supports two of those

 

languages for integrity systems.

 

This strategy of focusing on language

 

specific, content-specific systems,

 

AI to save us is doomed to fail.

 

I need to get to one more thing. First of all,

 

I’m sending a letter to Facebook today.

 

They better not delete any information

 

as it relates to the Rohingya.

 

or investigations about how

 

they proceeded on this,

 

particularly in light of your

 

information or the documents?

 

But aren’t we also now talking

 

about advertising fraud?

 

Aren’t you selling something to advertisers?

 

That’s not really what they’re getting.

 

We know about this because of the newspaper

 

issues. We’re trying to save

 

journalism that basically has

 

to meet a different standard,

 

a public interest standard

 

that basically is out there

 

proving itself every day, or

 

they can be sued.

 

These guys are a social media

 

platform that doesn’t have to live

 

with that and then the consequences.

 

They’re telling their advertisers

 

that this was safe.

 

We see it. People are coming back to

 

local journalism because they’re like,

 

we want to be next to a trusted brand.

 

We don’t want to be in,

 

you know your website.

 

So I think your filing with the SEC

 

is an interesting one,

 

but I think that we also have to look

 

at what are the other issues here

 

and one of them is did you defraud?

 

Did they defraud advertisers

 

in telling them this was the

 

content that their advertising was

 

going to be advertised against,

 

when in reality it was something different,

 

based on a different model.

 

We have multiple examples of question

 

and answers for the advertising staff,

 

the sales staff where advertisers

 

asked, after the riots last summer

 

or after the insurrection, should

 

we come back to Facebook?

 

Like should we come back to Facebook and

 

Facebook said in their talking points?

 

That they gave to advertisers.

 

We’re doing everything in

 

our power to make this safer

 

or we take down all the hate

 

speech when we find it.

 

But Facebook’s own research shows

 

that was not true.

 

That was not true.

 

They get 3 to 5% of hate speech.

 

Thank you. Thank you Mr.

 

Chairman.

 

Thanks, Senator Cantwell.

 

And if you wanna make your letter

 

available to other members of the committee,

 

I’d be glad to join you

 

myself and thank you.

 

Thank you for suggesting it.

 

Senator Lee thank you. Mr.

 

Chairman and thank you,

 

Miss Haugen, for joining us this week.

 

It’s very, very helpful. We’re grateful that

 

you’re willing to make yourself available.

 

Last week we had another witness

 

from Facebook, Miss Davis.

 

She came and she testified before

 

this committee and she focused on,

 

among other things,

 

the extent to which Facebook

 

targets ads to children,

 

including ads that are either sexually

 

suggestive or geared toward adult

 

themed products or themes in general.

 

Now I didn’t, I I well I appreciated

 

her willingness to be here.

 

I didn’t get the clearest answers in

 

response to some of those questions,

 

and so I’m hoping that you can

 

help shed some light on some of

 

those issues related to Facebook’s

 

advertising processes here.

 

Today, as we get into this,

 

I want to first read you a quote that

 

I got from Miss Davis last week.

 

Here’s what she said during

 

her questioning quote.

 

When we do ads to young people,

 

there are only three things that an

 

advertiser can target around age,

 

gender, location.

 

We also prohibit certain ads to young people,

 

including weight loss ads.

 

We don’t allow tobacco ads at all,

 

meaning to young people.

 

We don’t allow them to children.

 

We don’t allow them to

 

minors, close quote.

 

Now since that exchange happened last week,

 

there are a number of individuals and groups,

 

including a group called the

 

Technology Transparency Project,

 

or TTP, that have indicated that

 

part of her testimony was inaccurate,

 

that it was false.

 

TTP noted that it had conducted

 

an experiment.

 

Just last month. Their

 

goal was to run a series of ads

 

that would be targeted to children

 

ages 13 to 17 to users in the United States.

 

Now I want to emphasize that TTP

 

didn’t end up running these ads.

 

They stopped them from being

 

distributed to the users,

 

but Facebook did in fact approve them,

 

and as I understand it,

 

Facebook approved them for an

 

audience of up to 9.1 million users,

 

all of whom were teens.

 

So I brought a few of these to show you.

 

Today this is.

 

This is the first one I wanted

 

to showcase. This first one is

 

a colorful graphic

 

encouraging kids to quote throw

 

a Skittles party like no other.

 

Which you know, as the graphic indicates,

 

and as the slang jargon also

 

independently suggests, this involves

 

kids getting together

 

randomly to abuse prescription drugs.

 

The second graphic displays an “ana” tip.

 

That is.

 

A tip specifically designed to

 

encourage and promote anorexia.

 

And it’s on there now.

 

The language,

 

the “ana” tip itself,

 

independently promotes that. The

 

ad also promotes it insofar as it

 

was suggesting these are images

 

you ought to look at when you need

 

motivation to be more anorexic,

 

I guess you could say.

 

Now the third one invites children to

 

find their partner online and to make

 

a love connection, you look lonely.

 

Find your partner now to make a

 

love connection. Now, it would

 

be an entirely different kettle of fish

 

if this were targeted to an adult audience.

 

It is not. It is targeted to 13 to 17 year olds.

 

Now obviously I don’t support, and

 

TTP does not support, these messages,

 

particularly when targeted to

 

impressionable children and and again,

 

just to be clear,

 

TTP did not end up pushing the ads

 

Out after receiving Facebook’s approval,

 

but it did in fact receive

 

Facebook’s approval.

 

So I think this says something one could

 

argue that it proves that Facebook is

 

allowing and and perhaps facilitating the

 

targeting of harmful adult themed ads.

 

To our nation’s children.

 

So could you please explain to

 

me, Miss Haugen, how these ads,

 

with a target audience

 

of 13 to 17 year old children,

 

how would they possibly be approved by

 

Facebook, and is AI involved in that?

 

Uhm?

 

I did not work directly on the

 

ad approval system.

 

Uhm, what was resonant for me

 

about your testimony is Facebook

 

has a deep focus on scale. So scale

 

is can we do things very cheaply

 

for a huge number of people,

 

which is part of why they rely on AI so much.

 

It is very possible that none of

 

those ads were seen by a human and

 

the reality is that we’ve seen from

 

repeated documents within my disclosures

 

Is that Facebook’s AI systems only

 

catch a very tiny minority of offending

 

content and best case scenario in the

 

case of something like hate speech,

 

at most they will ever get 10 to 20%.

 

In the case of children,

 

that means drug paraphernalia

 

ads like that, and it’s likely, if they

 

rely on computers and not humans,

 

they will also likely never get more

 

than 10 to 20% of those ads. Understood.

 

Mr. Chairman.

 

I’ve got one minor follow-up question

 

that should be easy to answer.

 

So while Facebook may claim that

 

it only targets ads based on age,

 

gender and location, even though

 

these things seem to counteract that,

 

let’s set that aside for a minute

 

and assume they’re not basing ads

 

on specific interest categories.

 

Does Facebook still collect interest

 

category data on teenagers even

 

if they aren’t at that moment?

 

Targeting ADS at teens based

 

on those interest categories,

 

I think it’s very important to

 

differentiate between what targeting our

 

advertisers allowed to specify and what

 

targeting Facebook may learn for an ad.

 

Let’s imagine you had some text on an ad.

 

It would likely extract out features

 

that it thought were relevant for that ad.

 

For example,

 

in the case of something about partying,

 

it would learn partying as a concept.

 

I’m very suspicious that personalized

 

ads are still being delivered

 

to teenagers on Instagram because

 

the algorithms learn correlations.

 

They learn interactions where your party

 

ad may still go to kids interested in

 

partying, because Facebook almost

 

certainly has a ranking model in the

 

background that it says this person

 

wants more party related content.

 

Interesting, thank you,

 

that’s very helpful.

 

And what that suggests to me is that while

 

they’re saying they’re not targeting

 

teens with those ads,

 

The algorithm might do some

 

of that work for them,

 

which might explain why they

 

collect that data,

 

even while claiming that they’re

 

not targeting those ads in that way.

 

I can’t speak to whether or

 

not that’s the intention,

 

but the reality is it’s very,

 

very very difficult to understand

 

these algorithms today and

 

over and over and over again.

 

We saw these biases,

 

the algorithms unintentionally learn,

 

and so yeah,

 

it’s very hard to disentangle

 

out these factors as long as you

 

have engagement based ranking.

 

Thank you very much. Senator Lee.

 

Senator Markey. Thank you,

 

Mr. Chairman, very much.

 

Thank you, Miss Haugen. You

 

are a 21st century American hero

 

warning our country of the danger

 

for young people

 

and our democracy, and our nation owes you

 

Just a huge debt of gratitude for the courage

 

you’re showing here today, so thank you.

 

Miss Haugen, do you agree that Facebook

 

actively seeks to attract children

 

and teens onto its platforms?

 

Facebook actively markets to children,

 

or marketed to children, under the age of 18

 

to get on Instagram, and definitely targets

 

children as young as eight to be on

 

Messenger Kids. And internal Facebook

 

documents from 2020 that you revealed read,

 

Why do we care about tweens?

 

They are a valuable but untapped audience,

 

so Facebook only cares about

 

children to the extent that they are

 

of monetary value. Last week,

 

Facebook’s global head of safety,

 

Antigone Davis told me that

 

Facebook does not allow targeting

 

of certain harmful content to teens.

 

Miss Davis stated,

 

we don’t allow weight loss ads to be

 

shown to people under the age of 18,

 

yet a recent study found that Facebook

 

permitted targeting of teens as

 

young as 13 with ads that showed a

 

young woman’s thin waist promoting

 

websites that glorify anorexia.

 

Miss Haugen.

 

Based on your time at Facebook,

 

do you think Facebook is telling the truth?

 

I think Facebook has focused on scale

 

over safety and it is likely that they

 

are using artificial intelligence to

 

try to identify harmful ads without

 

allowing the public oversight to see

 

what is the actual effectiveness

 

of those safety systems.

 

You unearthed Facebook’s research about

 

its harm to teens.

 

Did you raise this issue with

 

your supervisors?

 

I did not work directly on anything

 

involving teen mental health. This research

 

is freely available to anyone in the company.

 

Miss

 

Davis testified last week, quote, we

 

don’t allow tobacco ads at all.

 

We don’t allow them to children either.

 

We don’t allow alcohol ads to minors,

 

however.

 

Researchers also found that Facebook does

 

allow targeting of teens with ads on vaping.

 

Miss Haugen, based on your time at Facebook,

 

do you think Facebook is telling the truth?

 

I do not have context on that issue.

 

I assume that if they are using artificial

 

intelligence to catch those vape ads,

 

unquestionably ads are

 

making their way through.

 

So from my perspective,

 

listening to you and your

 

incredibly courageous revelations,

 

time and time again,

 

Facebook says one thing and does another.

 

Time and time again,

 

Facebook fails to abide by the

 

commitments that they had made.

 

Time and time again.

 

Facebook lies about what they

 

are doing yesterday.

 

Facebook had a platform outage,

 

but for years it has had a principles outage.

 

It’s only real principle is profit.

 

Facebook’s platforms are not safe

 

for young people.

 

As you said,

 

Facebook is like big tobacco

 

enticing young kids with that

 

first cigarette, that first

 

social media account designed to

 

hook kids as users for life.

 

Your whistleblowing

 

shows that Facebook uses harmful

 

features that quantify popularity,

 

push manipulative influencer marketing, and

 

amplify harmful content to teens.

 

And last week in this committee Facebook.

 

Wouldn’t even commit to not

 

using these features on 10

 

year olds. Facebook is built on

 

computer codes of misconduct.

 

Senator Blumenthal and I.

 

Have introduced the kids

 

Internet design and Safety Act.

 

The kids act.

 

You have asked us to act as a

 

committee and Facebook has scores

 

of lobbyists in the city right now.

 

Coming in right after this

 

hearing to tell us we can’t act.

 

And they’ve been successful for a decade

 

in blocking this committee from acting.

 

So let me ask you a question.

 

The kids Internet design and

 

Safety Act or the Kids act.

 

Here’s what the legislation

 

does. It includes outright bans

 

on children’s app features

 

that, one, quantify popularity

 

with likes and follower counts,

 

two, promote influencer marketing,

 

and three, amplify toxic posts,

 

and it would prohibit Facebook

 

from using its algorithms to promote

 

toxic posts. Should we pass that legislation?

 

I strongly encourage reforms that push

 

us towards human scale social media

 

and not computer driven social media.

 

Those amplification harms are caused by

 

computers choosing what’s important to us,

 

not our friends and family,

 

and I encourage any system

 

that children are exposed to to

 

not use amplification systems.

 

So you agree that Congress has to

 

enact these special protections for

 

children and teens that stop social

 

media companies from manipulating young

 

users and threatening their well being.

 

and to stop using their algorithms to harm kids.

 

You agree with that?

 

I do believe Congress must act to

 

protect children, and children and

 

teens also need a privacy online

 

bill of rights. I’m the author of

 

the Children’s Online Privacy

 

Protection Act of 1998,

 

but it’s only for kids under 13

 

because the industry stopped me from

 

making it age 16 in 1998, because it

 

was already their business model.

 

But we need to update that law

 

for the 21st century.

 

Tell me if this should pass. One,

 

Create an online eraser button so

 

that young users can tell websites to

 

delete the data they have collected

 

about them. Two, give young teens under

 

the age of 16 and their parents

 

control of their information. And

 

three, ban targeted ads to children.

 

I support all those actions,

 

thank you and and finally I’ve also

 

introduced the algorithmic justice

 

and online Platform Transparency

 

Act which would one open the hood

 

on Facebook and Big Tech’s algorithms

 

so we know how Facebook is using our

 

data to decide what content we see

 

and to ban discriminatory algorithms.

 

That harm vulnerable populations

 

online like showing employment and

 

housing ads to white people but

 

not to black people in our country

 

should Congress pass that bill.

 

Algorithmic bias issues are a

 

major issue for our democracy.

 

During my time at Pinterest,

 

I became very aware of the challenges

 

of like I mentioned before,

 

it’s difficult for us to understand how

 

these algorithms actually act and perform.

 

Facebook is aware of complaints today

 

by people like African Americans

 

saying that Reels doesn’t give African

 

Americans the same distribution

 

as white people. And until we

 

have transparency

 

and the ability to confirm ourselves

 

that Facebook’s marketing messages are true,

 

we will not have a system that

 

is compatible with democracy.

 

And I thank Senator Lee.

 

I agree with you and your line of questions.

 

I wrote Facebook asking them to explain

 

that discrepancy because Facebook I think

 

is lying about targeting 13 to 15 year olds.

 

So here’s my message for Mark Zuckerberg.

 

Your time of invading our privacy,

 

promoting toxic content

 

and preying on children

 

and teens is over.

 

Congress will be taking action.

 

You can work with us or not work with us,

 

but we will not allow your company

 

to harm our children and our families

 

and our democracy any longer.

 

Thank you, Miss Haugen. We will act.

 

Thanks, Senator Markey,

 

we’re going to turn to Senator Blackburn

 

and then we will take a break.

 

I know that.

 

There is some interest in another

 

round of questions maybe.

 

Well.

 

Maybe we’ll turn to Senator Lujan

 

for his questions before Cruz

 

and Scott, and we have others,

 

so we’ll come back after that. Mr.

 

Chairman,

 

I have to go to sit in the chair

 

starting at noon today.

 

I do. I have one question.

 

This relates to what

 

Senator Markey was asking. Does Facebook

 

ever employ child psychologists

 

or mental health professionals

 

to deal with these children’s

 

online issues that we’re discussing?

 

Facebook has many researchers with

 

PhDs. I assume some of them are.

 

I know that some have psychology degrees.

 

I’m not sure if they are child specialists.

 

Facebook also works with external

 

agencies that are specialists

 

at children’s rights online.

 

Senator Lujan and then,

 

at the conclusion of Senator

 

Lujan’s questions,

 

we’ll take a break.

 

We’ll come back.

 

At noon thank you, Mr.

 

Chairman and I appreciate the

 

indulgence of the committee.

 

Miss Haugen. Last week,

 

the committee heard directly from Miss Davis,

 

the global head of safety for Facebook.

 

During the hearing,

 

the company contested its own

 

internal research as if it does not exist.

 

Yes or no.

 

Does Facebook have internal research

 

indicating that Instagram harms teens,

 

particularly harming perceptions of

 

body image which disproportionately

 

affects young women?

 

Yes, Facebook has extensive research on

 

the impacts of its products on teenagers,

 

including young women.

 

Thank you for confirming these reports.

 

Last week I requested Facebook

 

make the basis of this research,

 

the data set, minus any personally

 

identifiable information

 

available to this committee.

 

Do you believe it is important for

 

transparency and safety that Facebook

 

release the basis of this internal research,

 

the core data set, to allow

 

for independent analysis?

 

I,

 

I believe it is vitally important

 

for our democracy that we establish

 

mechanisms where Facebook’s internal

 

research must be disclosed to the

 

public on a regular basis and that we

 

need to have privacy sensitive data

 

sets that allow independent researchers

 

to confirm whether or not Facebook’s

 

marketing messages are actually true.

 

Beyond this particular research,

 

should Facebook make its internal

 

primary research, not just secondary

 

slide decks of cherry-picked data,

 

but the underlying data, public by default?

 

Can this be done

 

in a way that respects user privacy?

 

I believe in collaboration with

 

academics and other researchers that

 

we can develop privacy conscious

 

ways of exposing radically more

 

data than is available today.

 

It is important for our ability to

 

understand how algorithms work,

 

how Facebook shapes the information

 

we get to see that we have these data

 

sets be publicly available for scrutiny.

 

Is Facebook capable of making the

 

right decision here on its own,

 

or is regulation needed to create

 

real transparency at Facebook? Until

 

incentives change at Facebook,

 

we should not expect Facebook to change.

 

We need action from Congress.

 

Last week I asked Miss Davis about

 

shadow profiles for children on

 

the site and she answered that no

 

data is ever collected on children

 

under 13 because they are not

 

allowed to make accounts.

 

This tactically ignores the issue.

 

Facebook

 

knows children use their platform.

 

However,

 

instead of seeing this as

 

a problem to be solved,

 

Facebook views this as

 

a business opportunity.

 

Yes or no.

 

Does Facebook conduct research

 

on children under 13 examining

 

the business opportunities?

 

Of connecting these young

 

children to Facebook’s products,

 

I want to emphasize how vital it

 

is that Facebook has to publish

 

the mechanisms by which it tries

 

to detect these children because

 

they are on the platform in far

 

greater numbers than anyone is aware.

 

I do believe, or I am aware,

 

that Facebook is doing research on

 

children under the age of 13, and they have.

 

Those studies are included in my disclosure.

 

You have shared your concerns about

 

how senior management Facebook has

 

continuously prioritized revenue

 

over potential user harm and safety.

 

And I have a few questions on

 

Facebook’s decision making.

 

Last week I asked Miss Davis, quote,

 

Has Facebook ever found a

 

change to its platform that

 

would potentially inflict harm on users,

 

but Facebook moved forward because the change

 

would also grow users or increase revenue.

 

Miss Davis said in response quote.

 

It’s not been my experience

 

at all at Facebook.

 

That’s just not how we would approach it.

 

Yes or no? Has Facebook ever found a feature

 

on its platform that harmed its users,

 

but the feature moved forward because it

 

would also grow users or increase revenue.

 

Facebook likes to paint that these

 

issues are really complicated.

 

There are lots of simple issues,

 

for example,

 

requiring someone to click through

 

on a link before you reshare it.

 

That’s not a large imposition,

 

but it does decrease growth

 

that tiny little amount,

 

because in some countries reshares

 

make up 35% of all the content

 

that people see. Facebook prioritized

 

that content on the system,

 

the reshares, over the impacts to

 

misinformation, hate speech or violence

 

incitement.

 

Did these decisions ever come

 

from Mark Zuckerberg directly or

 

from other senior management?

 

At Facebook?

 

We have a few choice documents that contain

 

notes from briefings with Mark Zuckerberg,

 

where he chose metrics defined by

 

Facebook like meaningful social

 

interactions over changes that would

 

have significantly decreased misinformation,

 

hate speech,

 

and other inciting content.

 

And this is the reference you shared

 

earlier to Senator Cantwell, April of 2020.

 

Facebook appears to be able to count on

 

the silence of its workforce for a long time,

 

even as it knowingly continued

 

practices and policies that

 

continue to cause and amplify harm.

 

Facebook content moderators have

 

called out quote a culture of fear

 

and secrecy within the company that

 

prevented them from speaking out.

 

Is there a culture of fear at

 

Facebook around whistle blowing

 

and external accountability?

 

Facebook has a culture that

 

emphasizes that insularity is

 

the path forward, that if information

 

is shared with the public,

 

it will just be misunderstood.

 

And I believe that relationship

 

has to change.

 

The only way that we will solve

 

these problems is by solving them

 

together and we’ll have much better,

 

more democratic solutions if we do

 

it collaboratively than in isolation.

 

And my final question,

 

is there a senior level executive

 

at Facebook,

 

like an Inspector General who’s

 

responsible for ensuring complaints

 

from Facebook employees are taken

 

seriously and that employees’ legal,

 

ethical,

 

and moral concerns received consideration

 

with the real possibility of

 

instigating change to company policies.

 

I’m not aware of that role,

 

but the company is large.

 

I appreciate that.

 

It’s my understanding that there is a

 

gentleman by the name of Roy Austin,

 

who is the Vice President of Civil Rights,

 

who’s described himself as

 

an Inspector General,

 

but he does not have the authority to

 

make these internal conflicts public.

 

The Oversight Board was created

 

by Facebook to review moderation

 

policies related to public content.

 

Specifically, it was not created to

 

allow employees to raise concerns,

 

so again, another area of interest.

 

I believe that we have to act on.

 

I thank you. For coming forward today.

 

My pleasure. Happy to serve.

 

The committee is in recess.

 


 

Welcome back, Miss Haugen.

 

Thank you for your patience.

 

We’re going to reconvene and

 

we’ll go to Senator Hickenlooper.

 

Thank you, Mr. Chair. Thank you,

 

Miss Haugen, for your direct answers

 

and for being willing to come out and,

 

you know, provide such clarity

 

on so many of these issues.

 

Obviously Facebook can manipulate

 

its algorithms to attract users.

 

And I guess my question would be:

 

do you feel, in your humble opinion, that simply maximizing profits, no matter the societal impact, is justified? That’s the short question, to which I think I know the answer.

 

What impact on Facebook’s bottom line would it have if the algorithm was changed to promote safety, and to save the lives of young women rather than putting them at risk?

 


 

Facebook today makes approximately $40 billion a year in profit.

 

A lot of the changes that I’m talking

 

about are not going to make

 

Facebook an unprofitable company.

 

It just won’t be a ludicrously

 

profitable company like it is today.

 

If engagement-based ranking, which causes those amplification problems that lead young women from, you know, innocuous topics like healthy recipes to anorexia content, were removed, people would consume less content on Facebook.

 

But Facebook would still be profitable,

 

and so I encourage oversight and public

 

scrutiny into how these algorithms work,

 

and the consequences of them.

 

Right, well, and I appreciate that. I’m a former small business owner; I started a brewpub back in 1988. And we really always worked very hard to look.

 

Again,

 

we weren’t doing investigations,

 

but we were very sensitive to whether

 

someone had too much to drink,

 

whether we had a frequent customer

 

who was frequently putting himself and others at risk.

 

Obviously,

 

I think that the Facebook

 

business model puts, well, poses a risk to youth and to our teens. You compared it to cigarette companies, which I thought was rightfully so.

 

I guess the question is,

 

is this level of risk appropriate

 

or is there a level of risk

 

that would be appropriate?

 

I think there is an opportunity to

 

reframe some of these oversight actions.

 

So when we think of them as

 

these trade offs of like it’s

 

either profitability or safety.

 

I think that’s a false choice

 

and in reality,

 

The thing I’m asking for is

 

a move from short-termism,

 

which is what Facebook is run under today.

 

Right?

 

It is being led by metrics and not by

 

people and that with appropriate oversight,

 

and some of these constraints,

 

it’s possible that Facebook is

 

actually a much more profitable

 

company five or ten years down the

 

road because it wasn’t as toxic.

 

Not as many people quit it,

 

but that’s one of those counterfactuals

 

that we can’t actually test,

 

so regulation might actually make Facebook

 

more profitable over the long term.

 

That’s often the case.

 

I think the same could be said for

 

automobiles and go down the list

 

of all those things that there’s

 

so much pushback in the beginning.

 

I also thought about the question of how do we assess the impact to their bottom line?

 

We had a representative of Facebook in

 

here recently who talked about that

 

eight out of 10 Facebook users feel

 

their life is better and that their

 

job is to get to 10 out of 10.

 

Maybe this is the 2 out of 10, the 20%,

 

that they’re missing.

 

I don’t know how large the demographic is of people that are

 

caught back up into this circuitous, you know, sense of really taking them down in the wrong direction?

 

How many people that is, do you have any idea?

 

That quote last week was

 

really shocking to me because I

 

don’t know if you’re aware of this,

 

but in the case of cigarettes only,

 

about 10% of people who smoke

 

ever get lung cancer, right?

 

So the idea that 20% of your users could

 

be facing serious mental health issues,

 

and that’s not a problem is shocking.

 

I also want to emphasize for people that

 

eating disorders are serious, right?

 

There are going to be women walking

 

around this planet in 60 years with

 

brittle bones because of choices that Facebook made around emphasizing profit today. Or there are going to be women

 

in 20 years who want to have

 

babies who can’t because they’re infertile.

 

As a result of eating disorders today. They’re serious,

 

and I think there’s an opportunity

 

here for having public oversight

 

and public involvement,

 

especially in matters that impact children.

 

Thank you for being so direct on

 

this and for stepping forward.

 

I yield back the floor, Mr. Chair.

 

Thanks Senator Hickenlooper.

 

Senator Cruz. Thank you, Mr.

 

Chairman. Miss Haugen, welcome.

 

Thank you for your testimony.

 

When it concerns Facebook:

 

There are a number of concerns that this

 

committee and Congress has been focused on.

 

Two of the biggest have been Facebook’s

 

intentional targeting of kids with

 

content that is harmful to the children.

 

And then, secondly, in a discrete issue,

 

is the pattern of Facebook and social

 

media engaging in political censorship?

 

I want to start with the first

 

issue targeting kids.

 


 

As you’re aware, indeed, the documents that you provided, and the public reporting on them, indicated that

 

Facebook’s internal reports

 

found that Instagram makes

 

quote body image issues worse.

 

For one in three teen girls,

 

and additionally,

 

it showed that quote 13% of British users

 

and 6% of American users traced their

 

desire to kill themselves to Instagram.

 

Uh, is that a fair and accurate

 

characterization of what

 

Facebook’s research concluded?

 

I only know what I read in the documents

 

that were included in my disclosure,

 

that is, that is an accurate description

 

of the ones that I have read.

 

Because Facebook is not coming forward with

 

the total corpus of their known research.

 

I don’t know what their other things say.

 

But yes,

 

there are documents that say those things,

 

so we had testimony last week in the

 

Senate with a witness from Facebook who

 

claimed that that information was not accurate and needed to be in context.

 

Now, of course she wasn’t

 

willing to provide the context.

 

The alleged mysterious context.

 

Do you know of any context that would

 

make those data anything other than

 

horrifying and deeply disturbing?

 

Engagement-based ranking and

 

these processes of amplification.

 

They impact all users of Facebook.

 

The algorithms are very smart in the

 

sense that they latch onto things that

 

people want to continue to engage with.

 

And unfortunately in the case of

 

teen girls and things like self harm,

 

they develop these feedback cycles

 

where children are using Instagram

 

to self-soothe, but then are

 

exposed to more and more content

 

that makes them hate themselves.

 

This is a thing where we can’t

 

say 80% of kids are OK.

 

We need to say how do we save all the kids?

 

The Wall Street Journal reported

 

that Mark Zuckerberg was personally

 

aware of this research.

 

Do you have any information one way

 

or the other as to Mr Zuckerberg’s

 

awareness of the research,

 

We have, uh, I mean, one of the documents included in the disclosures.

 

It details something called Project Daisy,

 

which is an initiative to remove

 

likes off of Instagram.

 

The internal research showed that removing likes off Instagram is not effective as long as you

 

leave comments on those posts.

 

And yet the research directly

 

presented to Mark

 

Zuckerberg said we should still

 

pursue this as a feature to launch

 

even though it’s not effective

 

because the government,

 

journalists and academics want us

 

to do this like it would get us

 

positive points with the public.

 

That kind of duplicity is why we

 

need to have more transparency and why.

 

If we want to have a system that

 

is coherent with democracy,

 

we must have public oversight from Congress.

 

Do you know if Facebook,

 

any of the research it conducted,

 

attempted to quantify how many

 

teenage girls may have taken their

 

lives because of Facebook’s products?

 

I am not aware of that research.

 

Do you know if Facebook made

 

any changes when they got back that 13% of British users and 6%

 

of American users traced their

 

desire to kill themselves to

 

Instagram? Do you know if they made any

 

changes in response to that research

 

to try to correct or mitigate that?

 

I found it very surprising that

 

when Antigone Davis was confronted

 

with this research last week,

 

she couldn’t enumerate a five point plan.

 

A 10 point plan of the

 

actions that they took.

 

I also find it shocking that

 

once Facebook had this research,

 

it didn’t disclose it to the public.

 

Because this is the kind of thing that

 

should have oversight from Congress.

 

So when you were at Facebook,

 

were there discussions

 

about how to respond to this,

 

this research? I did not work directly

 

on issues concerning children.

 

These are just documents that were

 

freely available in the company,

 

so I’m not aware of that.

 

Do you have thoughts as to what

 

kind of changes Facebook could

 

make to reduce or eliminate these

 

harms you mentioned earlier?

 

Concerns around free speech?

 

A lot of the things that I

 

advocate for are around changing

 

the mechanisms of amplification,

 

not around picking winners and

 

losers in the marketplace of ideas.

 


 

So like I mentioned before, you know,

 

like how on Twitter if you have to

 

click on a link before you re share it.

 

Small actions like that friction.

 

Don’t require picking

 

good ideas and bad ideas.

 

They just make the platform less twitchy,

 

less reactive,

 

and Facebook’s internal research

 

says that each one of those small

 

actions dramatically reduces misinformation,

 

hate speech, and violence-inciting content on the platform.

 

So, and we’re running out of time,

 

but on the second major

 

topic of concern of Facebook,

 

which is censorship based on

 

what you’ve seen, are you

 

concerned about political censorship

 

at Facebook and in Big Tech?

 

I believe you cannot have a system that has as big an impact on

 

society as Facebook does today,

 

with as little transparency as it does.

 

I’m a strong proponent of chronological

 

ranking or ordering by time,

 

with a little bit of spam demotion, because I think

 

We don’t want computers deciding

 

what we focus on.

 

We should have software that is human-scaled, where humans have conversations together, not computers facilitating who we get to hear from.

 

So how could we get more transparency?

 

What would produce that?

 

I strongly encourage the development

 

of some kind of regulatory body

 

that could work with academics,

 

work with researchers,

 

or with other government agencies

 

to synthesize requests for data

 

that are privacy conscious.

 

This is an area that I’m

 

really passionate about.

 

Because right now no one can

 

force Facebook to disclose data and

 

Facebook has been stonewalling us.

 

Or even worse, they gave inaccurate data to researchers, as the scandal recently showed. What data should they turn over? And my time is expired, so.

 

For example,

 

even data as simple as what

 

integrity systems exist today and

 

how well do they perform. Like, there are lots and lots of people to whom Facebook is conveying, around the world, that Facebook’s safety systems apply to their language,

 

and those people aren’t aware that

 

they’re using a raw, original,

 

dangerous version of Facebook.

 

Just basic actions like transparency

 

would make a huge difference.

 

Thank you.

 

Thanks, Senator Cruz. Senator Lummis. Thank you, Mr.

 

Chairman and thank you for your testimony.

 

If you were in my seat

 

today instead of your seat,

 

what documents or unanswered questions

 

would you seek from Facebook,

 

especially as it relates to children?

 

But even generally speaking?

 

I think any research regarding what Facebook calls problematic use, i.e., the addictiveness of the product,

 

is of vital importance and anything

 

around what Facebook knows about parents

 

lack of knowledge about the platform.

 

I only know about the documents

 

that I have seen, right?

 

I did not work on teens or

 

child safety myself,

 

but in the documents that I read,

 

Facebook articulates the idea that

 

parents today are not aware of how

 

dangerous Instagram is, because they themselves did not live through these experiences.

 

They can’t coach their kids on

 

basic safety things and so at a

 

minimum Facebook has to disclose

 

what it knows in that context.

 

OK, so we’re trying to protect individuals’ data that they’re gathering, have data privacy, but have transparency in the manner in which the data is used.

 

Can we bridge that gap?

 

I think reasonable people

 

can have a conversation on how many

 

people need to see a piece of content

 

before it’s not really private.

 

Like if 100,000 people see something,

 

is it private?

 

If 25,000 people see it, is it private?

 

Just disclosing the most

 

popular content on the platform,

 

including statistics around what

 

factors went into the promotion of that

 

content would cause radically more

 

transparency than we have today on

 

how Facebook chooses what we get to focus on,

 

how they shape our reality.

 

OK,

 

if our focus is protecting the

 

First Amendment and our rights

 

to free speech while very carefully regulating data privacy.

 


 

I’ve heard there are a number of

 

things that are being discussed in Congress.

 

Everything from antitrust laws

 

to calling Facebook a utility to

 

the idea that you just raised of

 

a regulatory board of some sort

 

that has authority to gain an understanding of the

 

algorithms and how they’re used,

 

and other mechanisms that create what we see

 

the face of Facebook, so to speak.

 

So tell me a little more about

 

how you envision that board working?

 

What is, in your mind, based on your understanding of the company and the ill consequences, the best approach to bridging the gap between keeping speech free and protecting individual privacy with regard to data?

 

So I think those issues are two independent issues,

 

so we can talk about free speech first,

 

which is having more transparency

 

like Facebook has solutions today

 

that are not content based and I am

 

a strong advocate for non content

 

based solutions because those

 

solutions will also then protect

 

the most vulnerable people in the

 

world in a place like Ethiopia

 

where they speak six languages.

 

If you have something that focuses

 

on good ideas and bad ideas,

 

those systems don’t work in diverse places.

 

So investing in non content based

 

ways to slow the platform down not

 

only protects our freedom of speech,

 

it protects people’s lives.

 

The second question is around privacy

 

and this question of how can we

 

have oversight and have privacy?

 

There is lots and lots of research

 

on how to abstract data sets so

 

you’re not showing people’s names.

 

You might not even be showing

 

the content of their post.

 

You might be showing data that

 

is about the content of their

 

post but not the post itself.

 

There are many ways to structure these

 

data sets that are privacy conscious.

 

And the fact that Facebook has

 

walled off the ability to see

 

even basic things about how the

 

platform performs or in the case

 

of their past academic research,

 

releasing inaccurate data.

 

Or not being clear about how they pull

 

that data is just part of a pattern

 

of behavior of Facebook hiding behind

 

walls and operating in the shadows,

 

and they have far too much power

 

in our society to be allowed to

 

continue to operate that way.

 

Well,

 

I had heard you make the analogy

 

earlier to the tobacco industry and I

 

think that that’s an appropriate analogy.

 

I really believe we’re searching for the best way to address the problem, and I’m not sure that it is the heavy hands like breaking up companies

 

or calling them a utility.

 

Which is why your approach of integrating

 

people who understand the math and the

 

uses of the math with protecting privacy is intriguing to me, so the more

 

information that you can provide to us.

 

About how that might work to

 

actually address the problem,

 

I think would be helpful.

 

So in my case this is an invitation to you

 

to provide to my office or the committee.

 

Information about how we can get at

 

the root of the problem that you’ve

 

identified and can document, and save people’s privacy.

 

So I extend that invitation to you

 

and I thank you for your testimony.

 

Mr. Chairman. I yield back.

 

Thanks Senator Lummis Senator Sullivan.

 

Thank you, Mr. Chairman.

 

I want to thank our witness here.

 

It’s been a good hearing,

 

a lot of information has been learned.

 

Particularly on the issue of how

 

this is impacting our kids, I think.

 

We’re going to look back 20 years from now, and all of us are going to be like, what in the hell were we thinking, when we recognize the damage that it’s done to a generation of kids. Do you agree with that?

 

Let’s see. Facebook has made statements in the past about how

 

much benefit Instagram is providing

 

to kids’ mental health, like kids are

 

connecting who were once alone.

 

Well, what I’m so surprised about is: if Instagram is such a positive force, have we seen a golden age of teenage mental health in the last ten years? No, we’ve seen the opposite.

 

We’ve seen escalating rates of suicide

 

and depression amongst teenagers.

 

It’s, at least in part, driven by the social media phenomenon. There is a broad swath of research that supports the idea that usage of social media amplifies the risk for these mental health harms. Right now, this hearing is helping illuminate it; we are seeing Facebook’s own research shows that, right? Say that again? That Facebook’s own research shows that.

 

Right,

 

the kids are saying, I am unhappy when I use

 

Instagram and I can’t stop,

 

but if I leave I’m afraid I’ll be ostracized.

 

And that’s so sad.

 

So they know that that’s

 

what their research shows.

 

So what do you think drives them to?

 

I had this discussion with the

 

witness last week and I said, well.

 

You know,

 

I think they called it their

 

time out or stop, I said,

 

but isn’t that incompatible

 

with your business model?

 

Because your business model

 

is more time online.

 

More eyeballs online.

 

Isn’t that the fundamental

 

elements of their business model?

 

Facebook has had both an interesting

 

opportunity and a hard challenge

 

from being a closed system,

 

so they have had the opportunity

 

to hide their problems.

 

And like often people do when

 

they can hide their problems.

 

They get in over their heads and I

 

think Facebook needs an opportunity

 

to have Congress step in and say,

 

guess what you don’t have to

 

struggle by yourself anymore.

 

You don’t have to hide these things from us.

 

You don’t have to pretend they’re not problems.

 

You can declare moral bankruptcy and we

 

can figure out how to fix these things

 

together because we solve problems together.

 

We don’t solve them alone.

 

And by moral bankruptcy.

 

One of the things that I appreciate is the phrase that the chairman and you have been using; that is one of those elements.

 

Which is,

 

they know this is a problem.

 

They know it’s actually impacting

 

negatively the mental health of the

 

most precious assets we have in America,

 

our youth, our kids.

 

I have three daughters.

 

They know that that is happening, and yet the moral bankruptcy, from your perspective, is the continuation of this,

 

simply because that’s how they make money.

 

I phrase it slightly differently:

 

we have financial bankruptcy

 

because we value people’s lives

 

more than we value money, right?

 

People get in over their heads

 

and they need a process where they

 

admit they did something wrong.

 

But we have a mechanism where

 

we forgive them and we have

 

a way for them to move forward.

 

Facebook is stuck in a feedback

 

loop that they cannot get out of.

 

They have been hiding this information

 

because they feel trapped, right?

 

Like they would have come forward if

 

they had solutions to these things.

 

They need to admit they did

 

something wrong and they need help

 

to solve these problems.

 

And that’s what moral bankruptcy is.

 

Let me ask,

 

I’m going to switch gears here. What was your position in terms of disinformation and counterespionage?

 

My last role at Facebook was in counterespionage. Your last role?

 

So, this is a very different topic, and I’ve only got a minute or so left.

 

I know Facebook is not allowed

 

in countries like China,

 

but do they provide platforms for authoritarian or terrorist leaders, like the ayatollahs in Iran, the largest state sponsor of terrorism in the world, or the Taliban, or Xi Jinping, certainly in my view our biggest rival for this century?

 

A Communist Party dictator who’s

 

trying to export his authoritarian

 

model around the world.

 

Do they provide a platform for those kinds of leaders who, in my view,

 

clearly don’t hold America’s

 

interests in mind.

 

Yes, Facebook provided that. During my time working with the threat intelligence org, as a product manager supporting the counterespionage team, my team directly worked on tracking

 

Chinese participation on the platform

 

surveilling say Uighur populations

 

in places around the world that you

 

could actually find the Chinese based

 

on them doing these kinds of things.

 

So Facebook,

 

I’m sorry,

 

we also saw active participation

 

of the Iran government doing

 

espionage on other state actors,

 

so this is definitely a

 

thing that is happening.

 

And I believe Facebook’s consistent understaffing of the counterespionage, information operations, and counterterrorism

 

teams is a national security

 

issue and I’m speaking to other

 

parts of Congress about that.

 

So you are saying, in essence,

 

that the the platform,

 

whether Facebook knows it or not,

 

is being utilized by some of our

 

adversaries in a way that helps push

 

and promote their interests at the expense of America’s? I am very aware that this is happening on the platform.

 

And I believe the fact that Congress

 

doesn’t get a report of exactly how

 

many people are working on these

 

things internally is unacceptable, because you have a right to keep the American people safe.

 

Great, thank you very much.

 

Thanks, Senator Sullivan.

 

You may have just opened.

 

An area for another hearing.

 

Yeah,

 

I’ve strong national security

 

concerns about how Facebook operates today.

 

Well, Mr.

 

Chairman.

 

Maybe we should write.

 

I mean, it’s, uh,

 

I’m not being at all facetious.

 

Thank you for your questions on this topic,

 

and I know you have a busy schedule,

 

but we may want to.

 

Discuss this issue with you

 

members of our committee,

 

at least informally,

 

and if you’d be willing to

 

come back for another hearing.

 

That certainly is within

 

the realm of possibility.

 

I haven’t consulted the ranking member or the chairwoman.

 

Thank you for your honesty

 

and your candor on that topic.

 

Senator Scott

 

Thank you, Chairman. First off,

 

thanks for coming forward and thanks

 

for coming forward in a manner that you

 

want to have positive change.

 

That’s not always what happens. Earlier this year I sent a letter to Facebook

 

and other social media platforms asking

 

them to detail the harmful impacts, or effects on mental health, their platforms have on children and teens.

 

So your reports revealed that Facebook

 

has been fully aware of this for a while, and the harmful impacts,

 

especially on young women,

 

so I think we all agree that’s

 

completely unacceptable and we’ve got

 

to figure out how we protect the people

 

that are vulnerable in this country

 

from the harmful impacts of Facebook

 

and other social media platforms.

 

So first off, do you think there should be greater consideration for age when

 

it comes to using any social media.

 

I strongly encourage raising age

 

limits to 16 or 18 years old based on

 

looking at the data around problematic

 

use or addiction on the platform and

 

children’s self regulation issues.

 

So, I think you addressed this

 

a little bit,

 

but why do you think Facebook didn’t

 

address this publicly when they figured out internally that they were having an adverse impact on young people, especially young women?

 

Why didn’t they come forward and

 

say, we’ve got a problem.

 

We gotta figure this out.

 

I have a huge amount of empathy

 

for Facebook.

 

These are really,

 

really hard questions, and part of why I’m saying I think they feel a little trapped and isolated is the problems that are driving negative social comparison on Instagram.

 

Facebook’s own research says Instagram

 

is actually distinctly worse than

 

say, TikTok or Snapchat or Reddit, because TikTok is about doing fun things with your friends.

 

Snapchat is about faces and

 

augmented reality.

 

Reddit is vaguely about ideas,

 

but Instagram is about bodies

 

and about comparing lifestyles.

 

And so I think there are real questions

 

where like Instagram would have to come

 

in and think hard about their product.

 

Or about like what is their product about.

 

And I don’t think those

 

answers are immediately obvious.

 

But that’s why I believe we need to

 

solve problems together and not alone.

 

Because collaborating with the

 

public will give us better solutions.

 

So do you think Facebook was trying

 

to mitigate the problem?

 

I think within the set of incentives

 

that they were working within,

 

they did the best they could.

 

Unfortunately, those incentives

 

are not sustainable and they are

 

not acceptable in our society.

 

Do you think Facebook and other social

 

media platforms ought to

 

be required to report any harmful

 

effects they have on young people?

 

One of the things that I found very

 

interesting after the report in the

 

Wall Street Journal on Teen Mental

 

Health was that a former executive

 

at the company said Facebook needs

 

to be able to have private research

 

and the part that I was offended by

 

was that Facebook has had some of

 

this research on the negative effects

 

of Instagram on teenagers for years.

 

I strongly support the idea that Facebook

 

should have a year, maybe 18 months,

 

to have private research,

 

but given that they are the only people

 

in the world who can do this kind of research, which the public never gets to do,

 

They shouldn’t be allowed to keep secrets

 

when people’s lives are on the line.

 

Because, to be clear,

 

if they make $40 billion a year,

 

they have the resources

 

to solve these problems,

 

they’re choosing not to solve them.

 

Yeah, did that surprise you?

 

They wouldn’t put more effort into this.

 

No, you know it’s going to catch

 

up with them eventually, right?

 

Yeah, like I mentioned earlier,

 

right, coming in and having

 

oversight might actually make

 

Facebook a more profitable company

 

five or ten years from now,

 

because toxicity,

 

Facebook’s own research shows they have

 

something called an integrity holdout.

 

These are people who don’t get

 

protections from integrity systems to

 

see what happens to them and those

 

people who deal with a more toxic,

 

painful version of Facebook.

 

Use Facebook less,

 

and so one could reason that a kinder,

 

friendlier,

 

more collaborative Facebook might actually

 

have more users five years from now,

 

so it’s in everyone’s interest.

 

Do you think I’ve got a bill

 

and there’s a lot of bills that

 

I think we’ve all talked about,

 

but mine is called the Data Act.

 

It’s going to require express consent

 

from users for large platforms

 

to use algorithms on somebody.

 

You agree with that.

 

I mean,

 

shouldn’t we consent before they

 

get to take everything about

 

us and go sell it?

 

As to consenting to them selling personal data, that is an issue.

 

I believe people should have substantially more control over their data.

 

Most people are not well informed

 

on what the

 

personal costs of having their data sold are,

 

and so I worry about pushing that

 

choice back on individual consumers

 

in terms of should people consent

 

to working with algorithms.

 

I worry that if Facebook is allowed

 

to give users the choice of do you

 

want an engagement based news feed

 

or do you want a chronological

 

newsfeed like ordered by time,

 

maybe with a little spam demotion, that

 

people will choose the more addictive

 

option that engagement based ranking

 

even if it is leading their daughters to eating disorders.

 

Right, thank you.

 

Thanks Senator Scott.

 

I think we have concluded the first round.

 

Unless we’re missing someone who is online. Not hearing anyone.

 

Let’s go to the second round.

 

Thank you again for your patience.

 

I know you have a hard stop.

 

I think at 1:30 so will be respectful

 

of that limitation and I’ll

 

begin by asking a few questions.

 

First, let me say.

 

Senator Klobuchar very aptly

raised with you the principal

 

obstacle to our achieving legislative

 

reform in the past,

 

which is the tons of money spent

 

on lobbyists and other kinds of

influence peddling,

to use a pejorative word, that is so

 

evident here in the United States Congress.

 

Some of it is dark money,

 

some of it is very overt.

 

But I guess the point I’d like to

 

make to you personally is that

 

your being here

really sends a profound message to

 

Our nation,

 

that one person can really make a difference.

 

One person standing up, speaking out,

can overcome a lot of

 

those obstacles for us,

 

and you have crystallized

in a way our consciousness

 

here you have been a catalyst,

 

I think for change in a way that

 

we haven’t seen and I’ve been

 

working on these issues for

10, 15 years, and you have raised awareness

 

in a way that I think is very unique.

 

So thank you not only for your

 

risk taking and your courage

 

and strength and standing up,

 

but also for the effect that it has had.

 

And I also want to make another point and

 

you can tell me whether I’m correct or not.

 

I think there are other

 

whistleblowers out there.

 

I think there are other truth tellers.

 

in the tech world who

want to come forward, and I think you are

 

Leading by example. I think you are

 

showing them that there is a path.

 

To make this industry more responsible.

 

and more caring about kids,

 

And about the nature of our public

 

discourse generally, or about the strength

 

of our democracy, and I think you.

 

have given a boost to those whistleblowers

out there potentially coming forward.

 

I think that’s tremendously

 

important I think also.

 

And again, you can tell me if I’m wrong.

 

There are a lot of people on

 

Facebook who are cheering for you.

 

Because.

 

There are public reports

 

and I know of some of my

friends in this world who tell me that

 

There are people working for

 

Facebook who wish they had.

 

The opportunity and the courage to come

 

forward as you have done because they feel.

 

A lot of reservations about the way

 

that Facebook has used the platform,

 

used algorithms, used content,

and pushed it on

 

kids in this way. So those are

sort of hypotheses

that I hope you can confirm.

 

And I also would like to ask you.

 

’cause a lot of parents

 

are watching right now.

 

So you’ve advised us on what you

 

think we should do the reforms.

 

Some of them that you think we

 

should adopt stronger oversight.

 

Authorized by Congress.

 

Better disclosure because right now.

 

Facebook essentially is a black box.

 

Yes, for most of America,

 

Facebook is a black box.

 

That’s designed by Mark Zuckerberg,

 

incorporated.

 

Mark Zuckerberg and his immediate coterie.

 

And the buck stops with him.

 

And reform of Section 230,

 

So there’s some legal responsibility.

 

So people have a day in court,

 

some kind of recourse.

 

Legally, when they’re harmed by Facebook

 

because right now it has this broad immunity,

 

most of America has no idea.

 

Essentially, you can’t sue Facebook.

 

You have no recourse.

 

Most of America doesn’t know about Section 230.

 

And if you.

 

Pushed a lot of members of Congress,

 

they wouldn’t know either.

 

It’s actually slightly worse than that.

 

Facebook made a statement in a

 

legal proceeding recently where they

 

said they had the right to mislead

 

the court because they had immunity.

 

Right, that Section 230 gives them immunity,

 

so why should they have to tell the

 

truth about what they’re showing?

 

Which is kind of shocking.

 

Well, it is shocking to a lawyer.

To all of us, it’s also utter

disregard and contempt for the rule of law,

 

and for the very legal

 

structure that gives them.

 

That kind of protection,

 

so it’s kind of a new low.

 

In corporate conduct,

 

at least in court.

 

So you’ve you’ve provided us with some of

 

the reforms that you think are important,

 

and I think that the oversight goes a long

 

way because it in turn would make public a

 

lot of what is going on in this black box.

 

But for now. Since a lot of teens and

 

tweens will be going home tonight,

 

as you’ve said, to endure the bullying,

 

The eating disorders.

 

the invitations to

feel insecure about themselves,

the heightened anxiety. They have to live

 

with the real world as it exists right

 

now and they will be haunted for

 

their lifetimes by these experiences.

 

What would you tell parents right now?

 

What would you advise them?

 

About what they can do because they

 

need more tools and some of the

 

proposals that have been mentioned

 

here would give parents more tools

 

to protect their children.

 

Right now, a lot of parents tell me.

 

They feel powerless.

 

They need more information. They’re way behind

their kids in their adeptness online.

 

And they feel that they need

 

to be empowered in some way to

 

protect their kids in the real

 

world right now in real time.

 

So I offer you that open-ended

 

opportunity to

talk to us a little bit about your thoughts.

 

Very rarely do you have one of

 

these generational shifts where the

 

generation that leads like parents who

 

who guide their children have

 

such a different set of experiences

 

that they don’t have the context to

 

support their children in a safe way.

 

There is an active need for schools

 

or maybe the National Institutes

of Health, to establish

information where, if parents want to

learn how they can support their kids.

 

It should be easy for them to know what

 

is constructive and non constructive

 

because Facebook’s own research

 

says kids today feel like they are

 

struggling alone with all these

 

issues because their parents can’t

 

guide them. And one of the things that

makes me sad when I look on Twitter

 

is when people blame the parents

 

for these problems with Facebook.

 

They say just take your kids phone away.

 

And the reality is,

 

those issues are a lot more complicated, and so we need

 

to support parents because right now

 

if Facebook won’t protect the kids,

 

we at least need to help the

 

parents to protect the kids.

 

Parents are anguished,

they are, about this issue. Parents are hardly

uncaring;

 

they need the tools they need to

 

be empowered. And I think that the

major encouragement for reforms is

 

going to come from those parents.

And you have pointed out,

 

I think in general,

 

but I’d like you to just confirm for me.

 

This research and the documents

 

containing that research.

 

contain not only findings and conclusions but also

recommendations for changes.

 

What I hear you saying is that, again,

 

and again and again these recommendations

 

were just rejected or disregarded, correct?

 

There is a pattern of behavior that I

 

saw at Facebook, of Facebook choosing

 

to prioritize its profits over people.

 

And anytime Facebook faced even tiny

hits to growth, like 0.1% of sessions,

1% of views, it chose its profits

 

over safety. And you mentioned, I think,

bonuses tied to downstream MSI. Could you

explain what you meant?

 

so MSI is meaningful social interaction.

 

Facebook’s internal governance is

 

very much based around metrics,

 

so Facebook is incredibly flat to

 

the point where they have the largest

 

open floorplan office in the world.

 

It’s a quarter of a mile long in one room,

 

right?

 

They believe in flat.

 

And instead of having internal governance,

 

they have metrics that people try

 

to move in a world like that,

 

it doesn’t matter that we now

 

have multiple years of data.

 

Saying MSI may be encouraging bad content

 

might be making spaces where people

 

are scared, where they are shown

 

information that puts them at risk.

 

It’s so hard to dislodge a ruler like that,

a yardstick,

 

That you end up in this situation

 

where because no one is taking

 

leadership like no one is intentionally

 

designing these systems.

 

It’s just many,

 

many people running in parallel,

 

all moving the metric that these problems

 

get amplified and amplified and amplified,

 

and no one steps in to bring the solutions.

 

And I just want to finish and then I

 

think we’ve been joined by Senator Young.

 

And then we’ll go to Senator

 

Blackburn and Senator Klobuchar.

 

You know I spent a number of years

 

as an attorney general, helping to

 

lead litigation against Big Tobacco.

 

And I came to hear from a lot of

smokers how grateful they were,

 

Ironically and unexpectedly that someone

 

was fighting big tobacco because they

 

felt they had been victimized as children.

 

They started smoking when they were 7, 8,

 

12 years old because Big

 

Tobacco was hooking them.

 

And as we developed the research,

Big Tobacco was very methodically and purposefully

addicting them at that early age,

when they believed

that they would make themselves more popular,

that they would be cool and hip if they

began smoking. And then nicotine

hooked them. Now, physiologically, nicotine

has addictive properties.

 

What is it about Facebook’s tactics

of hooking young people that makes it

 

similar to what big Tobacco has done.

 

Facebook’s own research about Instagram

contains quotes from kids saying,

 

I feel bad when I use Instagram,

 

but I also feel like I can’t stop, right?

 

I know that the more time I spend on this,

 

the worse I feel.

 

But, like, I just can’t stop.

They want the next click,

they want the next like,

the dopamine,

you know, the little hits all the time. And

 

I feel a lot of pain for those kids, right?

 

They say they

 

fear being ostracized if they

 

step away from the platform.

 

So imagine you’re in this situation in

 

this relationship where every time you

 

open the app it makes you feel worse.

 

But you also fear isolation.

 

If you don’t,

 

I think there’s a huge opportunity here to

 

make social media that makes kids feel good,

 

not feel bad and that we have an

 

obligation to our youth to make

 

sure that they’re safe online.

 

Thank you,

 

Senator Young.

 

Ms. Haugen, thank you for your

 

compelling testimony.

 

In that testimony, you discuss how Facebook

 

generates self-harm and self-hate,

 

especially among vulnerable

 

groups like teenage girls.

 

I happen to be a father of four kids,

 

three daughters, two of whom are teenagers,

 

and as you just alluded to,

most adults, myself included,

have never been teenagers during

 

the age of Facebook, Instagram and

 

these other social media platforms.

 

And therefore I think it can be really

 

hard for many of us to fully appreciate

 

the impact that certain posts may have.

 

including, I would add, on

these teens’ mental health.

 

So can you discuss the short and long

 

term consequences of body image issues?

 

On these platforms, please.

 

The patterns that children establish

 

in their teenage years live with

 

them for the rest of their lives.

 

The way they conceptualize who they are,

 

how they conceptualize,

 

how they interact with other people

 

are patterns and habits that they will

 

take with them as they become adults

 

as they themselves raise children.

 

I’m very scared about the

 

upcoming generation, because

 

When you and I interact in person

 

and I say something mean to you and

 

I see you wince or I see you cry,

 

That makes me less likely to

 

do it the next time, right?

 

That’s a feedback cycle.

 

Online kids don’t get those cues and

 

they learn to be incredibly cruel to

 

each other and they normalize it.

 

And I’m scared of what

their lives will look like,

 

where they grow up with the idea that

 

it’s OK to be treated badly by people

 

who allegedly care about them.

 

That’s a scary future.

 

Very scary future.

 

And I see some evidence of

 

that as I talk to so many parents

on a regular basis. Are there other

specific issues of

significant consequence that the

general public may not be fully aware

of that are impacting vulnerable

groups that you’d just like to elevate

during this testimony?

 

One of the things that’s hard is that for

people who don’t look at the data

of social networks every day, it

can be hard to conceptualize the

 

distribution patterns of harms

 

or just of usage that there are

 

these things called power laws.

 

It means that a small number of

 

users are extremely intensely

 

engaged on any given topic,

 

and most people are just lightly engaged.

 

When you look at things like misinformation,

 

Facebook knows that the people who are

 

exposed to the most misinformation

 

are people who are recently widowed,

 

divorced,

 

moved to a new city,

 

are isolated in some other way.

 

When I worked on civic misinformation,

 

we discussed the idea of the

 

misinformation burden,

 

like the idea that when people

 

are exposed to ideas that are

 

not true over and over again,

 

it erodes their ability to to

 

connect with the community at large

 

because they no longer adhere to

 

facts that are consensus reality.

 

The fact that Facebook knows that

its most vulnerable users, people who were

recently widowed, are isolated,

and that the systems that are

 

meant to keep them safe like

 

demoting misinformation,

stop working when people look

at 2,000 posts a day.

 

Right, and it just breaks my heart,

 

The idea that these rabbit holes

 

would suck people down and then

 

make it hard to connect with others.

 

So, Ms. Haugen, I desperately want,

which is the American impulse,

to solve this problem. And I

 

I very much believe that

 

Congress not only has a role,

 

but has a responsibility to figure this out.

 

I don’t pretend to have all the answers.

 

I would value your opinion though as

 

to whether you believe that breaking

 

up Facebook would solve any of the

 

problems that you’ve discussed today.

 

Do you think it would?

 

So as an algorithmic specialist,

 

so this is someone who designs

 

algorithmic experiences,

 

I’m actually against the breaking

 

up of Facebook because even looking

 

inside of just Facebook itself,

 

so not even Facebook and Instagram

 

you see the problems of engagement

 

based ranking repeat themselves.

 

So the problems here are about the

 

design of algorithms of AI and the

 

idea that AI is not intelligent.

 

And if you break up Instagram

 

and Facebook from each other,

 

it’s likely... So, I used to work on

 

Pinterest and a thing that we faced

 

from a business model perspective

 

was that advertisers didn’t want to

 

learn multiple advertising platforms.

 

They wanted to learn

just one platform, for Instagram

and Facebook and whatever,

and rather than learning a second one for Pinterest,

 

Pinterest made radically fewer

 

dollars per user.

 

And what I’m scared of is right now,

 

Facebook is the Internet for

 

lots of the world.

 

If you go to Africa,

 

the Internet is Facebook.

 

If you split Facebook and Instagram apart,

 

it’s likely that most advertising

 

dollars will go to Instagram and

 

Facebook will continue to be this

 

Frankenstein that

is endangering lives around the world.

 

Only now there won’t be money to fund it.

 

So I think oversight and regulatory

 

oversight and finding collaborative

 

solutions with Congress is going

 

to be key because these systems

 

are going to continue to exist and

 

be dangerous even if broken up.

 

Thank you. Thanks, Senator Young.

Senator Blackburn? Thank you, Mr. Chairman.

 

I have a tweet that was just put

up by a Facebook spokesperson.

 

It says just pointing out the fact that.

 

Frances Haugen did not work on child

 

safety or Instagram or research these

 

issues and has no direct knowledge of

 

the topic from her work at Facebook.

 

So I will simply say this to Mr.

 

Stone. If Facebook wants to discuss

 

their targeting of children.

 

If they want to discuss their practices.

 

privacy invasion, or violations of

the Children’s Online Privacy Protection Act,

 

I am extending to you an invitation

 

to step forward, be sworn in and

 

testify before this committee.

 

We would be pleased to hear from

 

you and welcome your testimony.

 

One quick question for you.

 

What’s the biggest threat to

Facebook’s existence?

Is it regulators?

 

Is it becoming extinct or obsolete

 

for teenage users?

 

What is the biggest threat

 

to their existence?

 

I think the fact that Facebook is driven

 

so much by metrics and that these

 

lead to a very heavy emphasis on

short-termism, that every little individual

 

decision may seem like it helps with growth.

 

But if it makes a more and more toxic

 

platform that people don’t actually enjoy,

 

like when they rolled out

meaningful social interactions

back in 2018,

 

Facebook’s own research said that users

 

said it made it less meaningful, right?

 

I think this aggregated set

 

of short term decisions.

 

Endangers Facebook’s future,

 

but sometimes we need to pull

 

it away from business as usual.

 

and help it write new rules if we want

 

it to be successful in the future,

 

So they can’t see the forest for the trees.

 

Yes, very well, thank you.

 

And I know Senator Klobuchar is waiting,

 

so I’ll yield my time back.

 

And I thank you.

 

Thank you very much and thank you to

 

both of you for leadership and all three

 

of us are on the Judiciary Committee,

 

so we’re also working on a host of other

 

issues, including the App Store issues,

 

which is unrelated to Facebook actually

 

including issues relating to dominant

 

platforms when they promote their own

 

content or engage in exclusionary conduct,

 

which I know is not our topic.

 

Today I see the thumbs

 

up from you, Ms. Haugen,

 

which I appreciate. And I think

this idea of establishing some rules

 

of the road for these tech platforms

 

goes beyond the kid protection

 

that we so dearly need to do.

 

And I just want to make sure

 

you agree with me on that.

 

I was shocked when I saw the New York

 

Times story a couple weeks ago about

 

Facebook using its own platform to

 

promote positive news about itself.

 

I was like wow.

 

I knew you shaped our reality.

 

I wasn’t aware of that one right,

 

and that’s a lot of the work

 

that we’re doing over there,

 

so I want to get to something Senator

Young was talking about,

misinformation. Senator Luján

and I have put together an exception,

actually, to the Section 230 immunity when

 

it comes to vaccine misinformation in

 

the middle of a public health crisis.

 

Last week YouTube announced it was

swiftly banning all anti-vaccine

 

misinformation and I have long called on

 

Facebook to take similar steps. They’ve

taken some steps,

 

but do you think they can remove this

content, and have they put sufficient resources toward it?

 

We know the effect of this.

 

We know that for over half the people

that haven’t gotten the vaccines,

 

it’s because of something that

 

they’ve seen on social media.

 

I know a guy; I walked into a cafe

and he said his mother-in-law wouldn’t

 

get a vaccine because she thought a

 

microchip would be planted in her arm.

 

Which is false,

I’m just saying that for the record

here, in case it gets

put on social media.

Could you talk about:

 

Are there enough resources

 

to stop this from happening?

 

I do not believe Facebook as currently

 

structured has the capability to

 

stop vaccine misinformation because

 

they’re overly reliant on artificial

 

intelligence systems that they

 

themselves say will likely never

 

get more than 10 to 20% of content.

 

There you go.

 

And yet it’s a company with

a market cap over a trillion dollars,

one of the world’s biggest companies

that we’ve ever known.

 

And that’s what really bothers me here.

 

Senator Lujan and I also have pointed

 

out the issue with content moderators.

 

Does Facebook have enough content

 

moderators for content in Spanish

 

and other languages besides English?

 

One of the things that was disclosed.

 

We have documentation that

 

shows how much operational investment

 

there was by different languages

 

and it showed a consistent pattern

 

of underinvestment in languages

 

that are not English.

 

I am deeply concerned about

 

Facebook’s ability to operate in

 

a safe way in languages beyond

 

Maybe the top 20 in the world.

 

OK, thank you.

 

Let’s go back to eating disorders today.

 

You said that you have documents indicating

 

Facebook is doing studies on kids under 13,

 

even though technically no kids under

 

13 are permitted on the platform.

 

The potential for eating disorder

 

content to be shown to these

 

children raises serious concerns.

 

Senator Blumenthal has been working on this.

 

I’ve long been focused

 

on this eating disorder

issue, given the mortality rates.

 

are you aware of studies Facebook has

 

conducted about whether kids

under 13 on the platform are nudged

 

towards content related to eating

 

disorders or unhealthy diet practices?

 

CNN also did an investigation on this front.

 

I have not seen specific studies regarding

 

eating disorders in kids under the age of 13,

 

but I have seen research that indicates

 

that they are aware that teenagers

 

coach tweens who are on the platform

 

to not reveal too much, to not post too

 

often and that they have categorized

 

that as a myth that you can’t be

 

authentic on the platform and that

 

the marketing team should try

to advertise to teenagers to stop

coaching tweens that way.

 

So I believe we’ve shared that

document with Congress.

 

Thank you and we’ll be looking more.

 

Speaking of the research issue:

Facebook has tried to downplay the

 

internal research that was done,

 

saying it was unreliable.

 

It seems to me that they’re trying

 

to mislead us there.

 

The research was extensive,

 

surveying hundreds of thousands

 

of people traveling around the

 

world to interview users.

 

In your view,

 

are the internal researchers at

 

Facebook who examined how users

 

are affected by the platform:

is their work thorough?

Are they experienced?

 

Is it fair for Facebook to

 

throw them under the bus?

 

Facebook has one of the top ranked research

 

programs in the tech industry.

They’ve invested more in it than,

I believe, any other social

media platform, and some of the

 

biggest heroes inside the company

 

are the researchers because they are

 

boldly asking real questions and

 

being willing to say awkward truths.

 

The fact that Facebook is

 

throwing them under the bus,

 

I think is unacceptable and I just

 

want the researchers to know that I

 

stand with them and that I see them.

 

Or maybe we should say, as the

name of one book puts it, “the ugly truth.”

 

What about Facebook blocking

researchers at NYU from accessing

 

the platform does that concern you?

 

These are outside researchers.

 

I am deeply concerned.

 

So for context,

 

for those who are not familiar

 

with this research,

 

there are researchers at NYU who,

because Facebook does not publish

 

enough data on political advertisements

 

or how they are distributed.

 

These are advertisements that influence

 

our democracy and how it operates.

 

They created a plugin that allowed

 

people to opt in to volunteer to

 

help collect this data collectively,

 

and Facebook lashed out at them and even

 

banned some of their individual accounts.

 

The fact that Facebook is so scared of

 

even basic transparency that it goes out

 

of its way to block researchers who are

 

asking awkward questions shows you the

 

need for congressional oversight and

 

why we need to do federal research and

 

federal regulations on this.

Very good, thank you. Thank you for your work.

 

Thanks, Senator Klobuchar. Senator Markey?

 

Thank you, thank you Mr.

 

Chairman, thank you for your

 

incredible leadership on this issue.

 

As early as 2012,

Facebook has wanted to allow

children under the age of 12

to use its platform.

 

At that time in 2012,

 

I wrote a letter to Facebook

 

asking questions about what

 

data it planned to collect and

 

whether the company intended to

 

serve targeted ads at children.

 

Now here we are nine years later.

 

Debating the very same issues today.

 

Ms. Haugen,

 

You’ve made it abundantly clear

 

why Facebook wants to bring

 

more children onto the platform.

 

It’s to hook them early,

just like cigarettes, so that

 

they become lifelong users,

 

so Facebook’s profits increase.

 

Yet we should also ask why

 

in the last nine years,

 

has the company not launched Facebook for

Kids or Instagram for Kids? After all,

 

from the testimony here today,

 

Facebook appears to act without regard

 

to any moral code or any conscience,

and instead puts profit above people.

Profit above all else.

 

The reason why Facebook

 

hasn’t officially permitted

kids 12 and under to use its

platform is because the Children’s

Online Privacy Protection Act

of 1998, which I authored, exists.

 

Because there is a privacy law on

 

the books which I authored that

 

gives the Federal Trade Commission

 

regulatory power to stop websites and

 

social media companies from invading

 

the privacy of our children 12 and under.

 

That’s why we need to expand the

 

Children’s Online Privacy Protection Act.

 

That’s why we need to pass the KIDS

Act that Senator Blumenthal and I

 

have introduced and why we need an

 

Algorithmic Justice Act to pass.

 

Because the absence of regulation

 

leads to harming teens, stoking

division, damaging our democracy.

 

that’s what you’ve told us today,

 

So, Ms. Haugen,

 

I want you to come back to the protections

 

that you are calling on us to enact.

 

This isn’t complicated.

 

We’re going to be told online

all day by these paid

Facebook people: oh, Congress can’t act.

 

They’re not experts.

 

It’s too complicated for Congress.

 

Just get out of the way.

 

You’re not experts.

 

Well,

 

this isn’t complicated.

 

Facebook and its Big Tech

lobbyists are blocking my bills to protect

 

kids because it would cost them money.

 

That’s how complicated it is.

 

So let’s start with the KIDS Act that

Senator Blumenthal and I introduced, which would

ban influencer marketing to kids.

 

Today’s popular influencers

peddle products while they flaunt their

 

lavish lifestyles to young users.

 

Can you explain how allowing

 

influencer marketing to teens and

 

children makes Facebook more money?

 

The business model that provides

 

a great deal of the

 

content on Instagram is one where

 

people produce content for free.

 

They put it on Instagram for free.

 

No one is charged for it,

 

but many of those content creators

 

have sponsorships from brands

 

or from other affiliate programs.

 

Facebook needs those content

 

creators to continue to make content

 

so that we will view content and

 

in the process view more ads.

 

Facebook provides tools to support

 

influencers and who do influencer

 

marketing because it gives them the

 

supply of content that allows them

 

to keep people on the platform.

 

Viewing more ads,

 

making more money for them.

So, I am actually the author of the 1990

Children’s Television Act.

 

What does that do?

 

Well,

 

it says to all the television networks

 

in America: stop preying upon children.

 

Stop using all of your power in

 

order to try to get young children

 

in our country hooked on the

 

products that are going to be sold,

 

we had to pass a law that banned

 

television stations from doing this.

 

That’s why I knew that after my law

 

passed in 1996 to break up the monopolies

of the telecommunications industry

 

and allow in the Googles and the

 

Facebooks and all the other companies.

 

You name it that we would need a

 

child privacy protection there

 

because everyone would just

 

move over to that new venue.

 

It was pretty obvious.

 

And of course the industry said no

 

way we’re going to have privacy laws

 

for adults and they blocked me from

 

putting that on the books in 1996.

 

But at least for children I got up to age 12.

 

That’s all I could get out of the industry.

 

But we also know that as time

 

has moved on,

they’ve become even more

sophisticated, so that

 

The KIDS Act is necessary to stop

children’s and teen apps from having

features such as likes and follower

 

counts that quantify popularity.

 

Ms. Haugen,

 

Can you explain how allowing these

 

features that create an online popularity

 

contest makes Facebook more money?

 

Just to make sure, I am only

familiar with issues regarding teens

 

From the research I have read on Facebook,

 

so I want to put that caveat in there.

 

The research I’ve seen with regard

 

to quantifiable popularity is that

 

as long as comments are allowed,

 

so this is not a quantitative thing,

 

which is just comments.

 

As long as comments are

 

still on posts on Instagram,

 

just taking likes off Instagram

doesn’t fix the social

comparison problem, because, you know,

 

teenage girls are smart.

 

They see that Sally is prettier than them.

 

Her pictures are really good.

 

She gets tons of comments.

 

They don’t get many comments right.

 

And so I do think we need

 

larger interventions than just

 

removing quantitative measures.

 

Facebook has a product that

 

is very attractive.

 

The reason why they have the study of

 

problematic use is ’cause it is kind of

 

addictive and those kinds of things

 

like having lots of little feedback loops,

 

keeps kids engaged.

 

And like I mentioned earlier,

 

part of why Facebook switched over to

 

meaningful social interactions was

 

it found that if you got more likes,

 

more comments, more reshares,

 

you produced more content.

 

And so having those systems

 

of little rewards makes people

 

produce more content,

 

which means we view more

 

content and we view more ads,

 

which makes them more money.

 

OK. And the KIDS Act

Senator Blumenthal and I are

advocating for also prohibits

 

amplification of dangerous and

 

violent content to children and teens.

 

Can you explain how algorithms pushing that

 

dangerous content makes Facebook more money?

 

I don’t think Facebook ever

 

set out to intentionally promote.

 

Divisive, extreme,

 

polarizing content.

 

I do think though that they are aware

 

of the side effects of the choices

 

they have made around amplification,

 

and they know that algorithmic based ranking,

 

so engagement based ranking

 

keeps you on their sites longer.

 

You have long, you have longer sessions,

 

you show up more often and

 

that makes them more money.

 

So do you believe we have to ban all features that quantify popularity as a starting point in legislation?
Um, as I covered before, the internal research I’ve seen is that removing things like likes alone, if you don’t remove things like comments, doesn’t have a huge impact on social comparison. So I do believe we need to have a more integrated solution for these issues.
Should we ban targeted advertisements to children?
I strongly encourage banning targeted advertisements to children, and we need to have oversight, because I think the algorithms will likely still learn the interests of kids and match ads to those kids even if the advertiser can’t articulate, or doesn’t want to target on, those interests.
How much money does Facebook make from targeting children?
I don’t know what fraction of their revenue comes from children.
So, ultimately: children are not commodities. They have historically always been given special protections. That’s what the Children’s Television Act of 1990 is all about. They’ve always been given this special safety zone so that children can grow up without being preyed upon by marketers. When I was a boy and a salesman would knock on the front door, my mother would just say, “Tell him I’m not home. That man is not getting into our living room.” “Well,” I would say to my mother, “but you are home.” “Not to him,” she would say. Well, we need to give parents the ability just to say: no one is home for you and your company and your attempts to prey upon children to get into our living room. That is the moment in history we are in, and we have to make sure that we respond to the challenge. Thank you, Mr. Chairman.
Thank you, Senator Markey, and my thanks to Senator Markey for his leadership over many years on protecting children. As you’ve heard, he was a champion in the House of Representatives well before I was in the United States Senate, around the time I was elected Attorney General, and I’ve been very pleased and honored to work with him on legislation now going forward. I join him in thanking you. I have just a few concluding questions, and I seem to be the last one left standing here, so the good news is I don’t think we’ll have others. But as you may know.
You do know: my office created an Instagram user identified as a 13-year-old girl. She followed a few easily identifiable accounts on weight loss, dieting, and eating disorders, and she was deluged, literally within a day, with content pushed to her by algorithms that in effect promoted self-injury and eating disorders. Are you surprised by that fact?
I’m not surprised by that fact. Facebook has internal research where they have done even more gentle versions of that experiment, starting from things like an interest in healthy recipes, so not even extreme dieting. And because of the nature of engagement-based ranking and amplification of interests, that imaginary user, that real account, was pushed towards extreme dieting and pro-anorexia content very rapidly.
And that’s the algorithm?

That’s the algorithm.
That algorithm could be changed?

The algorithm definitely could be changed. 
I have first-hand experience from having worked at Pinterest. Pinterest used to be an application that was heavily based just on: you follow certain people’s pins, and those are put into your feed. And over time it grew to be much, much more heavily based on recommendations, where the algorithm would figure out what you are interested in. You can have wonderful experiences that are based on human interactions, so these are human-scale technologies, not computers choosing what we focus on.
So the average parent listening here, worried about their daughter or son being deluged with these kinds of content, would want that kind of algorithm changed, I would think, and would welcome the oversight that you’re recommending.

I believe parents deserve more options and more choices, and today they don’t even know what they could be asking for.
I just received by text, literally about 15 minutes ago, a message from someone in Connecticut, and I’m going to read it to you. It’s from a dad: “I’m in tears right now watching your interaction with Frances Haugen. My 15-year-old daughter loved her body. At 14 she was on Instagram constantly, and maybe posting too much. Suddenly she started hating her body. Her body dysmorphia, now anorexia, and she was in deep, deep trouble before we found treatment. I fear she’ll never be the same. A broken heart.”
I think people tend to lose sight of the real-world impact.

Yeah. And I think that is the reason that you’re here. I’d just like to invite you, if you have any words for those other employees at Big Tech, the workers who may be troubled by the misconduct or unethical conduct that they see: what would you tell them?
There is a pattern that we have seen throughout time with regard to technologies: humans are very crafty, we find interesting solutions, but we often get out over our skis. We develop things that are of a larger scale than we really know how to handle. And what we have done in the past, when we see this happen, is take a step back and find institutions and frameworks for doing these things in a safe way.
We live in a moment where whistleblowers are very important, because these technological systems are walled off and very complicated. There are things that you need to be a specialist to really understand the consequences of. And the fact that we’ve been having these exact same kinds of false-choice discussions about what to do about Facebook (is it about privacy or oversight? is it about censorship or safety?), the fact that we’re being asked these false choices, is just an illustration of what happens when the real solutions are hidden inside of companies. We need more tech employees to come forward through legitimate channels like the SEC or Congress to make sure that the public has the information they need in order to have technologies be human-centric, not computer-centric. Thank you.
On that note, we’ll conclude. Thank you for extraordinary testimony. I think that anybody watching would be impressed and much better informed, and you’ve done America a real public service. Thank you.

The record will remain open for a week. I’m sorry, for two weeks. Any senators who want to submit questions for the record should do so by October 4th... October 19th. This hearing is adjourned.