Joint Committee on the Draft Online Safety Bill

Facebook and Twitter explaining why, and it would be really good to
know how many are in that group of the silenced.
Q64 Lord Black of Brentwood: I have one very brief follow-up to that. Do
you think Ofcom needs much stronger powers under the Bill to be able to
go into the platforms, audit them and demand these figures? At the
moment I suspect the Bill is not strong enough in that direction.
Matt Harrison: I would say yes, because I guess the question is how
Ofcom can be an effective enforcer if it does not understand where the
problems are, what the problems are and the scale of those problems.
Data is key to that, and we know, as has been referred to, that data-mining companies are social media platforms. There is definitely
something to be said about Ofcom having access to that data for it to be
as efficient a regulator as the Bill is intending it to be.
Ian Russell: Ofcom should have all the powers that it needs to
understand the situation as completely as possible. It is an ever-changing
situation, so unless it has regular access to such things it is very difficult
for it to regulate. It can provide a screen around commercially sensitive
information. That is important. It is also important and incumbent on
everyone that, if something is discovered that makes the online world
safer, that becomes a commonly held best practice, so that that
information, algorithm or whatever it may be is shared widely.
Izzy Wick: I would absolutely agree with Ian on that last point. In the
Bill, Ofcom has the powers to request this information. It is quite strong
in that respect. What it does not have is the obligation and the
responsibility, and there needs to be a duty in the Bill for Ofcom to
investigate the algorithmic oversight on behalf of children in particular. As
Matt picked up on, Ofcom needs the requisite capacity and resources to
understand those algorithms and interrogate what is going on.
Lord Black of Brentwood: We welcome any suggestions you have
about how the Bill might be amended in that direction. Clare, did you
have anything to add?
Clare Pelham: I would only draw an analogy with terrestrial broadcast.
On the existing television channels, we regulate very effectively the
dissemination of flashing images through a technology that is absolutely
available to all broadcasters. That duty to share would be a very
powerful instrument, because we are not very far away from an algorithm
that could be developed to stop this material being broadcast at source
by social media companies.
The Chair: To the point you made, Nina, that Facebook says that it
removes 90% of the harmful content it finds, that is about as meaningful
as me saying that I have removed 90% of the weeds in my garden. I find
it impossible to know what that number actually means or even how hard
they are looking.
Q65 Lord Stevenson of Balmacara: This has been a very informative
session. Thank you very much. We have all learned a lot. Most of the
points I was going to make have been covered, so I have quite a narrow
question, but it bears on what we have just been talking about. The
tension here is that, possibly more by accident than design, we have a
situation where private companies are offering so-called public spaces or
public squares for commercial reasons, which do not fit what most users
would expect out of that.
You have all said what you would like to see and we all agree that higher
standards are desperately required, but they would not fit the business
model, would they? If we do not get the pile-ons and the numbers of
people there, the data mining, which Ian said was really what they were
after, falls away and they will lose money, so we have a bit of a problem
here.
It has always intrigued me why we ever allowed companies like this to
operate in a publishing role, but without knowledge of what they are
publishing. To narrow it down to a particular question, would a lot of what
you want—and many would join you in a lot of these asks—be advanced
if there was a requirement on the companies to moderate what they are
doing in an effective way through knowledge, not simply through
systems?
In other words, it is not the regulator. It has to be the companies. They
are taking on the responsibility of opening up these public squares, but
without having any apparent knowledge of what is going on, even though
we think they probably do know, compounded by the fact that they are
placing encryption on a number of services. Is that the way to go, or will
that end in failure as well?
Ian Russell: It is really important to find a way to engage with the
companies as best as possible and, at the heart of it, it probably is
knowledge. Sharing of that knowledge is a two-way street. They would
share with us, so the world becomes better informed about the
knowledge and the way they run their companies, and we would share
with them problems that people who are using their platforms are having,
which they seem resistant to accept. If that two-way street and that
dialogue is encouraged, change could come about much more quickly. It
should not be so adversarial.
Izzy Wick: There are certainly opportunities to strengthen the
requirements for better content labelling and tagging, and for more
investment in things like pre-moderation, in the hinterland after
something is uploaded and before it makes its way to the shop window.
That is required for things like CSAM, and there is technology developing
to address that.
Just last week, there was an investigation by the Wall Street Journal into
the algorithms that TikTok uses, which lead to children under 18 being
sold drugs from unknown accounts and being offered sex by unknown
adults. A TikTok spokeswoman said, “We don’t differentiate between
content that is labelled as adult only and content that is child friendly”,
and that just spells out the problem.
Lord Stevenson of Balmacara: That says it all, really.
Nina Jankowicz: There is a little bit of danger when we talk about
increasing the liability on platforms. This might just be my American
perspective, in talking about Section 230 of the Communications Decency
Act and intermediary liability, but in the States our worry about
increasing liability for the platforms is that the platforms will then use this
as a cudgel to eliminate speech. We have seen that happen in NetzDG, as
I mentioned before.
In cases where platforms are encouraged to moderate, they often
moderate the abused, not the abusers. We see take-downs or
suspensions against people with marginalised identities for fighting back
against the abuse they are receiving, so we would need to be really
careful about that. There is a way forward there if, as you mentioned, it
is a knowledge-based moderation that has humans involved and not just
AI systems.
One thing that bears mentioning, which we have not really talked about
today, is that this is not only about taking down content. It can be about
demoting content too and saying, “You can shout into the black void, but
you don’t get a huge audience to do that”. That allows us to get around
some of the free speech concerns that come up when we are introducing
broader liability for the platforms and that enforcement.
Matt Harrison: It is an interesting one. There is an opportunity for social
media companies to see the legislation not as an attack on them, but
rather as providing them with the clarity about how they should be
operating, what their practices should be and what their responsibilities
are, because we have tried self-moderation and that has not worked.
Companies are trying different things all the time, but this Bill should be
seen as a positive for them in terms of that clarification. For companies
that do not see it in that light, the enforcement is equally important to
bring those who do not want to see the positives or take action back
towards where we need them to be.
I hope it would start those different discussions, rather than not spending
money because of a business model. It encourages a rethink. There are
lots of people in these companies who want to do the right thing,
encourage positive behaviour and get rid of harmful content, but the
legislation is needed just to help people along and give power to those
who want to do that, as well as to bring the stick down on those who just
do not want to take any action.
Clare Pelham: I would be concerned at any move that might permit
social media companies to act essentially as an enforcement agency for
what are criminal offences. We need to be very clear that those are for
the enforcement authorities in this country, the police, the CPS and so
on. What might be an interesting opportunity to explore with them is
perhaps the use of a yellow card for somebody perpetrating behaviour
that is short of criminal but not something that anybody feels is advisable
or to be recommended. Then we would be able perhaps to eliminate from
our feeds those who have yellow cards. That would be a way in which
parents could help keep their children that little bit safer.
Q66 Suzanne Webb MP: It is of deep concern that we need this Online
Harms Bill in the first instance, but we are seeing content that causes so
much pain and it should be taken down in the first instance or just never,
ever happen. I am guessing that we are coming to some sort of
conclusion. Do you think there has been sufficient consultation? Have you
been sufficiently consulted on this draft Bill in the first instance? When
you leave here today, are you happy that you have furnished us with all
the information that we need as a committee to take away and consider
the draft Bill?
Ian Russell: I personally feel that the information has been provided
and we have been involved as well as can be expected. It is a hugely
complicated Bill. I am not an expert in these things, but Sarah Connolly
recommended that I got the charity lawyers to pore through it and help
me through it. She said that it is complicated even by parliamentary
standards.
New technology needs new regulation, but it also needs a new style of
regulation, and maybe that is not for this committee or for now. It is
really important that this piece of regulation is considered and
gets on the books quickly, but it is also important to take the knowledge
that you have gained as a committee and use that maybe to form some
new style of legislation. Maybe, in the same way in which we protect our
computers by installing the latest antivirus software, this needs to be
constantly topped up, because tech moves at such a speed. Maybe a
committee needs to be ongoing in order to inform and feed into the
regulatory process, which must be fleet of foot enough to react quickly
when tech moves and develops quickly.
I would just like to add some statistics about suicide. Papyrus UK, the
suicide prevention charity for young people under the age of 35, states
that 200 schoolchildren—this is backed by the ONS—are lost to suicide
every year in the UK. Samaritans and University of Bristol research states
that 26% of young people who had presented to hospital with self-harm
or suicide attempts had used the internet in relation to that. If you
combine those two figures alone, 26% of 200 people a year, roughly one
person a week of school age in this country, are taking their life and
being affected by what they have seen online. That is just a guesstimate.
It might be very different from that, which is why the research is so
important.
Time is of the essence and it is really important that something is
achieved, even if it is only the first step. It is a great comfort to be here
and see how much effort is being devoted to try to make the UK the
safest place to be online in the world.
Izzy Wick: I would like to echo that. We are huge supporters of the Bill
and particularly the ambition to put children front and centre. We
recognise that the bar has been set very high for these companies. We
have engaged very productively with officials over the drafting period and
there are many elements of the Bill that reflect the good conversations
that we have had, but there are two major elements of the Bill that have
changed since the full government response to the White Paper back in
December that are of great concern to us.
The first is the general duty of care. What we have now is a laundry list
of duties and we seem to have lost this overarching duty of care to
address reasonably foreseeable harms. The issue with this is that it will
not necessarily futureproof the Bill and will not account for emerging
technologies and associated risks, which will mean that the regulator is
constantly behind the curve. A much more straightforward approach
would be to reintroduce this general duty of care to fulfil the safety
objectives of the Bill, which are excellent.
The second is the scope of the harms that the Bill will address. The
current draft has been a rebadging, so what we have is a content Bill. It
focuses only on harmful user-generated content, when actually we need
to be thinking about the whole range of risks that children and adults face
online. To do that, we need to bring back in the word “activity”. It needs
to be looking at content and activity, which would bring in all the things
that we have addressed today—the systems and processes of these
platforms, and the other types of risks that children face online, as well
as content risks.
Nina Jankowicz: I am not a UK citizen, but I am delighted to have been
asked here today to talk about these issues. There are very few fora
internationally where they are being spoken about, and the work that you
all are doing is incredibly important. I hope that it will protect freedom of
expression for women around the world and not just here in the UK.
There is one thing that I would compel you to do, as you continue your
inquiry. I am cognisant that the five witnesses are all white. I know that
you talked about racist abuse last week, but I encourage you to continue
to pursue a really diverse panel of witnesses. I have not spoken about
this as much today, but in our research we found it to be far and away
the women with intersectional identities and marginalised communities
who received orders of magnitude worse abuse, and often intersectional
abuse as well, so it is not just gendered abuse. It is also transphobic
abuse, or racist or racialised abuse. It is incredibly important to have that
perspective, and I know you will do your due diligence as you move
forward with this inquiry.
Matt Harrison: So far, we have been fairly happy with the level of
engagement we have had. We are very happy to give evidence today. I
could not agree more with the suggestions from Izzy about the Bill. I also
have something else to say about the application of the Bill and the day-to-day impact it will have on users of social media, and ensuring that
legislation and regulation actually impact people and reflect those
experiences.
I could not say strongly enough how important it is that at every level—
government, this committee, if it continues beyond the Bill’s passage,
Ofcom or social media companies—users are engaged, particularly groups
where there are disproportionate online harms, to understand what is
happening and how their changes of policy and regulation are impacting
their experiences, and to have that experience reflected back in any
changes or amendments that are required by you, government or Ofcom.
I could not stress highly enough how much we need to have those lived
experiences reflected in policy.
Clare Pelham: I wonder whether I could make two points in response to
your excellent question. First, I would like to pay tribute to the
Government for this Bill, because it is a world-leading piece of draft
legislation. At the Epilepsy Society, we feel that we have had really good
engagement from DCMS, the Law Commission and parliamentarians
across all parties.
Secondly, I would not be a bit surprised if those of you who are on social
media are targeted, later today or tomorrow, as a consequence of this
conversation. If you are, I would be really interested if you could raise
the targeted messages you receive with the social media platform,
without indicating that you are a parliamentarian, and see what response
you get.
Q67 Suzanne Webb MP: I have a quick follow-up to that. I am conscious
that we have taken up so much of your time. When this Bill is in place, in
an ideal world what will online look like? What would you want it to look
like in terms of the percentage of the behaviour going on, or not? I would
like to see everything that you have described taken off completely,
within a second of it ever happening. What do you think it is actually
going to look like?
Ian Russell: The online world after the period of self-regulation, which
patently has not worked, is more dangerous. As I have said, that is a
problem for the online world, because it needs to do good. It is here for
us to use to do good. The online world needs to be a better reflection of
the offline world, in which dangers are controlled. The platforms using
digital technology have an advantage. Make it safe, particularly for young
and vulnerable people. In my mind, it will be a return to the world of the
internet that I used 10 years ago, when it seemed to be a much safer
place than it is now. The algorithms of the platforms seem to have
propelled it towards a much darker and more dangerous place.
Izzy Wick: The digital world that we want to see is one in which
children’s rights and needs are respected and upheld. It is as simple as
that. There is some way to go to get there, but the Bill is a crucial part of
that journey. If it is going to deliver for children, it needs to really focus,
as Ian said, on the algorithms and some of the more fundamental
problems that sit at the heart of the online ecosystem that we have
today.
Nina Jankowicz: I would like to see a world in which women do not
have to second-guess everything they write, say, tweet or put out online.
We are pretty far away from that. We still cannot walk home through a
park in the dark without fearing for our lives in many cases, and that
environment is replicated online. It would be nice to feel not only that the
social media platforms are doing their due diligence and upholding their
duty of care, but that we have the backing of the Governments who
represent us.
Matt Harrison: It is about people with learning disabilities using social
media platforms with the confidence that the platforms have their backs
should something go wrong, as is reflected in society when something
goes wrong. People have confidence that the police, the law and the
courts are behind them. It is very much about that confidence and, from
that, you get the enabling aspects of social media, the breaking down of
social barriers and the challenging of negative attitudes and stigma. It is
just about removing those aspects, giving people confidence and making
them feel included and empowered to take part in that community.
Clare Pelham: At the Epilepsy Society, we always try to be positive. My
hope, following the Bill, would be that the online world is like the real
world, but just that bit better, actually. That would be a lovely thing to
happen. That is a great vision to have.
Q68 Baroness Kidron: I am sorry to bring you down, but my question is a
supplementary to Tim’s inquiry when we were talking about the structure
of the Bill. I really understood what you said about risk assessments
having to be broader and relevant to the companies, and then I heard
you on minimum standards, but a very big part of the Bill is about terms
and conditions—companies setting their terms and conditions and those
being upheld. We are in a situation where nearly 100% of people do not
read terms and conditions, so I just wanted to have a very quick round
on what you thought about that structural issue in the Bill.
Nina Jankowicz: I like the emphasis on accessibility. I have always
advocated for informed consent regarding terms and conditions of
platforms. They should be written in plain English and should not be
presented in a way that is a click-through, where you need to get your fix
of posting dog and baby pictures, so you just accept the cookie notices
that come up, as many of us do with the GDPR. It needs to be a much
more involved and informed process. Any emphasis you can put on that
is going to benefit the users in the long term.
Izzy Wick: I agree with Nina that the focus on accessibility is very
welcome, but I would stress that it is not just about the presentation of
published terms; it is about their content. We need some indication from
Ofcom about what constitutes that floor of protection and what needs to
be in those terms and conditions, particularly if Ofcom is expected to
assess compliance with the regulation against the company’s ability to
uphold those terms and conditions.
Ian Russell: There is a veneer of usability about the platforms. They
have to be easy to use, for people to want to use them. As soon as you
get beneath that veneer to the 22,000 pages of the average terms and
conditions, or whatever it is, it is a mystery to most people. They are a
great example of how the user experience needs to be simplified for all,
so that it is more readily understood by those who need to understand it.
Matt Harrison: The focus on accessibility is definitely the key point in
this for people with learning disabilities. Understanding the current terms
of service is mind-boggling, actually. I would be surprised if anyone at social
media platforms understands what half of it requires of them in their
roles, but then you open the whole can of worms with accessibility. I
know the Bill talks about clear and accessible terms, but what that means
is also its own debate and conversation. There is a big piece of work to be
done there. Perhaps the drafters did not quite envisage what is meant by
those terms, so we would like some discussion about what you mean by
accessibility. Are you talking about length, the duties, or the content?
That requires more exploration.
Clare Pelham: We would all agree that having voluminous terms and
conditions is the same as not providing any terms and conditions,
because none of us sits down and reads 54 pages of closely typed script.
There is plenty of expertise in producing easy-read documents around the
place, if they were to avail themselves of it. There must be a requirement
to produce a synopsis of the key provisions on one side of A4, in a format
that is accessible to the average reader. That is not impossible to do.
The Chair: Thank you very much. I appreciate the time that all the
witnesses have made available to us today. It has been extremely helpful
to our inquiry.
