Amnesty’s Latest Research Into Online Abuse Confirms What Black Women Have Known For Over A Decade

Yesterday, Amnesty International further proved that online abuse is a violation of our human rights with the launch of its latest report on women in politics and journalism. Results from the global crowdsourcing project, named Troll Patrol, support what women, particularly black women, have been reporting for years. The research revealed that women in the US and UK face a staggering level of abuse on Twitter – an abusive tweet every 30 seconds – and that black women are 84% more likely than white women to face abusive or problematic tweets. Organisations like Take Back the Tech, the Women’s Media Center and the National Democratic Institute’s Not the Cost campaign were among the first to address this pandemic and we pay tribute to their dedication. Now that we have intersectional data, it’s time we all help fix the glitch and end online abuse.

Last year I experienced a tidal wave of misogynoir, abuse that is both misogynistic and racist, after a speech I made went viral. As a young black woman in politics observing how other women were publicly treated on social media platforms, experiencing abuse first hand was the final straw. This is why I founded Glitch, a not-for-profit organisation that exists to end online abuse. Our workshops centre on digital self-care, digital self-defence and digital citizenship, and were recommended as best practice in a recent European Parliament report.

According to the report, 1 in 10 tweets mentioning black women is either abusive or problematic, compared with 1 in 15 directed at white women. We strongly agree with Amnesty: social media companies such as Twitter must be more transparent, but they must also engage with and support many more diverse activist groups using their platforms. For too long, research, policies and Government interventions to address online abuse have focused on children and women, as if the two were homogenous groups. By doing so, we ignore the real drivers behind much of the online abuse and bullying affecting women and girls on a daily basis. Amnesty’s research is a welcome step towards a more nuanced, intersectional critique of online harms.

It’s important to understand that while women experience many different kinds of online abuse, the overall impact has a silencing effect that represents a potent threat to gender equality, human rights and democracy. It causes anxiety and, in very sad situations, has resulted in girls self-harming and taking their own lives. Amnesty’s Write for Rights campaign has inspired thousands of people around the world to write to Jack Dorsey, asking him to take serious action. I’ve also received so many messages from people sharing their own experiences of online abuse and of losing loved ones.

So, how do we begin to address online abuse?

The current online/offline dichotomy is unhelpful and hides much of the violence that women and girls face online. Misogynation, by Everyday Sexism founder Laura Bates, is a collection of essays on the importance of joining the dots between the different forms of violence that women and girls face. We can already see patterns emerging between domestic violence and terrorist attacks, and individuals who are violent offline also send abusive messages to women online. Even in these cases we again see social media platforms failing to enforce their own rules.

Having presented at the United Nations Human Rights Council this summer, it was very clear to me that online abuse towards women in politics is far too common. On a personal level, I strongly urge all political parties and membership organisations to develop a code of conduct for their members and to be crystal clear about the support they will provide to their candidates and members when standing, campaigning or representing the organisation.

Moving forward, we need to amplify the experiences of women in public life: campaigners, councillors, candidates, activists, founders of charities, YouTubers, artists and bloggers. We also need to amplify the everyday woman who may use #blackgirlmagic, #blackhistory or #metoo and face abuse from those who hijack these hashtags and derail them for their own racist, sexist (or both) agendas. We must have more frequent and visible conversations about online abuse and the diversity of experiences women face.

I hope we continue to see responsible data-gathering on women with intersectional identities, such as black disabled women and black Muslim women. However, civil society groups cannot combat online abuse alone; tech companies and governments must also be involved. We therefore need new money, resources and training to understand online abuse and to appropriately educate, enforce and empower society against it. We also must see sufficient resources to support diverse media groups such as Black Ballad, Media Diversified and Gal-dem Magazine, which are mentioned in the research, as well as diverse academics, technologists, law enforcement and civil society groups, so they can continue their vital work.

Finally, and arguably most importantly, we need to see more men be effective allies to women online, and we certainly must see more white women be effective allies to black women. Last year, the BNP created a racist and sexist Christmas card, which well-meaning Twitter users then forwarded on to Diane Abbott MP when sharing their outrage. Did Abbott, who receives almost half of all online abuse directed at female MPs, need to see the offensive card over and over again? Glitch’s digital citizenship workshop, aimed at all online users but specifically young people, encourages participants to understand that with digital rights come digital responsibilities, and to adopt an ‘active bystander’ attitude when engaging in activities online. Black women have been talking about their negative experiences online for years and, thanks to Amnesty, we now have the data to prove it. But it’s up to all of us to make lasting change.

 

First appeared in the Huffington Post

Channel 5 News Interview

Our Executive Director, Seyi Akiwowo, responds to Amnesty International’s new report revealing that abusive tweets are sent to female politicians and journalists every 30 seconds. The study also found that black and minority ethnic women were a third more likely to be targeted than white women. It’s time to fix the glitch and end online abuse.

‘Tis the season to fix the Glitch

Make the most of this season of goodwill by donating to help us continue tackling online abuse. As we count down to Christmas we’ll be asking for your support with some of our most important work. You can get started now with a donation of just £5 via our PayPal.

Glitch’s Response to the New Crown Prosecution Service Social Media Guidelines

Read the new guidelines here: https://www.cps.gov.uk/legal-guidance/social-media-guidelines-prosecuting-cases-involving-communications-sent-social-media

What’s new

Glitch celebrates the adoption of new social media guidelines by the CPS. It is a step forward to fix the glitch and end online abuse. These guidelines show the commitment of the UK to United Nations treaties and resolutions, adopting real measures to protect human rights in our national legislation. We are pleased to see our recommendations being heard and acted upon.

Our Concerns

Glitch would like to know what steps will be taken to effectively implement these guidelines. The police are currently under-resourced and for these guidelines to be effective there must be investment in and transparency around the training and capacity of prosecutors and law enforcement teams.

Civil society groups also need to see how data is being collected, and the CPS should strengthen sex- and age-disaggregated data collection and publication, as recommended in the Charlevoix Commitment. We hope that reports will be publicly shared so that everyone can analyse the impact of these guidelines, and that the CPS will work with civil society groups to ensure these guidelines are effective.

For more information

The UN Resolutions to eliminate violence against women and girls: preventing and responding to violence against women and girls in digital contexts

Charlevoix Commitment to end sexual and gender-based violence, abuse and harassment in digital contexts

Glitch’s Response to Twitter’s Updated Hateful Conduct Policy

What’s new

Last September, Twitter expanded its Hateful Conduct Policy with the introduction of a Dehumanisation Policy. This new set of rules aims to prohibit “language that treats others as less than human”, denying their human nature or their human qualities, and it includes an extensive list of the identifiable groups that could be victimised. However, the policy only protects individuals when they have been dehumanised based on “membership in an identifiable group”. While we welcome Twitter’s decision to seek feedback from global perspectives and to acknowledge the different impacts on cultures and communities around the world, an issue arises when a person’s membership of a group is not clear from the nature of the dehumanising insult, for example when someone directly compares another person to an animal.

Twitter’s consultation period was also short, just 14 days, which made it inaccessible for many and limited its scope. By not giving people an adequate chance to participate, Twitter undermined the purpose of the survey. While we believe the policy is a step forward towards ending online abusive behaviour, it is not enough.

Our recommendations

    • Publish more information on the implementation of Twitter’s current hateful conduct and new dehumanisation policies
    • Greater engagement with the community, and communication of results and statistics
    • Clarity on the definition of “abusive tweet”
    • Transparency in enforcement, e.g. sharing the guidance given to moderators
    • Implementation of an accountability mechanism to secure enforcement
    • Expansion of the regulation for a more consistent policy and higher protection of individuals

Final Thoughts

There is still a large policy gap that social media companies must address. We can all help to fix the glitch and therefore we want to see real efforts to tackle online abuse. Read more of our recommendations here: https://seyiakiwowo.com/our-recommendations/.

Glitch’s response to Ch4 adverts featuring online abuse

Tonight, during a commercial break in Gogglebox, Channel 4 will broadcast adverts overlaid with abuse that has been directed, via social media platforms, at the people who feature in them.

The online abuse received by those featured in these ads, including death threats and all manner of terrible things, is shameful, and we deeply empathise with those having to deal with it. Glitch!UK believes each person is incredibly brave for telling their story and allowing the ads to be re-played in the hope it will bring awareness and change.

Social media companies must do better to enforce their own rules and take down inappropriate, abusive and illegal content. We believe they have a duty to ensure all their users feel safe on their platforms.

Companies that run ads like these also have a duty to protect those they work with. This should include clear anti-hate and safe online community policies accessible via their social media accounts.

Brands and companies can be leaders in helping #fixtheglitch. They hold people’s trust and people have relationships with them; take Nike’s new campaign with Colin Kaepernick, for example. Companies can help lead social change, but to do so they need to vocalise their rebuke of this abuse.

We can all be active bystanders. Call out trolls when you see them, report them to the platforms they’re being abusive on and drown out their negative comments with positive ones. We can all be digital citizens and help #fixtheglitch.

 

You can contact our Communications Manager and Director via info[AT]fixglitch.org for further comment.

World Suicide Prevention Day 2018

Glitch!UK is proud to announce we are taking part in World Suicide Prevention Day. A study found that victims of cyberbullying under the age of 25 are more than twice as likely to self-harm and display suicidal behaviour. This is why we are getting involved. Our goal is to raise awareness and prevent more young people from feeling this way in the future, and to get more young people to take part in our Digital Citizenship workshops, which cover digital safety and help young people navigate the online world confidently, critically and positively. In the lead-up to World Suicide Prevention Day we will be signposting organisations and resources. You can find out more about our Digital Citizenship programme here.


Helplines: 

  • Samaritans – for everyone, call 116 123 or email jo@samaritans.org
  • Campaign Against Living Miserably (CALM) – for men, call 0800585858 (5pm to 12am)
  • Papyrus – for people under 35, call 08000684141 (Mon-Fri 10am-10pm, weekends 2pm-10pm, bank holidays 2pm-5pm), text 07786209697, email pat@papyrus-uk.org
  • Childline – for young people under 19, call 08001111 (it’s anonymous on your phone bill)
  • The Silver Line – for seniors, call 08004708090

Glitch!UK is a new, thriving and internationally known not-for-profit organisation with a mission to end online abuse in the UK. ‘Glitch’ is defined as ‘a temporary malfunction’. When we look back on this period of time, we want to be able to say that the rise in online abuse was only a ‘glitch’ in our history.

Glitch!UK began as an online initiative in April 2017 after the Founder and Director, Seyi Akiwowo, faced horrendous online abuse. A video of a speech Seyi gave at the European Parliament went viral and months later Seyi was a target of abuse across multiple social media platforms. The Glitch!UK initiative has been shared in the media, at conferences, mentioned in Parliament and supported by organisations such as Feminist Internet, Plan International and the Anne Frank Trust UK.

Glitch!UK is not about imposing restrictions on how we use social media or censoring our right to free speech and freedom of expression. This submission outlines forms of online abuse, including hate speech, and their impacts and consequences through an intersectional lens. It also outlines key recommendations, such as digital citizenship education, and best practice.

Although the UK does not recognise misogyny as a hate crime, we define misogyny as the hatred of, contempt for, or prejudice against women or girls. We believe it is important to be intersectional when talking about forms of online abuse and hate crime, particularly misogynoir. Glitch!UK believes that online abuse can become a dangerous vehicle for movements to further divide society and spread fear, so we must work together to #fixtheglitch.

Introduction

In today’s digital age, the Internet is rapidly creating new social digital spaces and transforming how individuals meet, communicate and interact, and as a result is reshaping society as a whole. More recently, digital spaces like social media platforms have become unfriendly, unsafe and toxic, particularly for people of colour (1), women and historically underrepresented groups. Online abuse takes many forms and we have seen a particular increase in online gender-based violence (2) and online hate speech (3). We believe this has been a consequence of, and enabled by, a number of events:

  • Rise of far right movements across Europe
  • Increased polarisation within societies
  • Rhetoric used during the UK’s European Union membership referendum (4)
  • Rhetoric used during recent political elections including the 2016 United States Presidential Election
  • The failure of social media companies to adequately and consistently address online abuse and to ensure the safety of all users
  • Lack of diversity within the technology and digital engineering sectors
  • Inconsistency within UK laws, which requires policymakers to play catch-up with crimes in digital spaces, and a gap between international legislation and its implementation
  • Underreporting of incidents to the police and lack of robust commitment from the criminal justice system to prosecute

The Situation: The Glitch

Digital spaces have been used directly as a tool for making threats of physical and/or sexual violence, rape, killing, hate speech, unwanted and harassing online communications, or even the encouragement of others to harm women and people of colour physically. It may also involve the dissemination of reputation-harming lies, electronic sabotage in the form of spam and malignant viruses, impersonation of the victim online and the sending of abusive emails or spam, blog posts, tweets or other online communications in the victim’s name (5).

Forms of online abuse

Doxing refers to the publication of private information, such as contact details, on the Internet with malicious intent. It includes situations where personal information and data retrieved by a perpetrator are made public with malicious intent, clearly violating the right to privacy.

Trolling is the posting of messages, uploading of images or videos and creation of hashtags for the purpose of annoying or provoking women and girls, or inciting violence against them. Many “trolls” are anonymous and use false accounts to generate hate speech.

Online mobbing and harassment refer to the online equivalents of mobbing or harassment on social platforms, the Internet, in chat rooms, instant messaging and mobile communications.

Online stalking is the repeated harassment of individuals, perpetrated by means of mobile phones or messaging applications, in the form of crank calls or private conversations on online applications (such as WhatsApp) or in online chat groups.

Online hate speech is a type of speech that takes place online (e.g. the Internet, social media platforms) with the purpose to attack a person or a group on the basis of attributes such as race, religion, ethnic origin, sexual orientation, disability, or gender.

All the above-mentioned forms of online abuse create a permanent digital record that can be distributed worldwide and cannot be easily deleted, which may result in further victimisation of the individual(s) targeted.

It is important to distinguish heated debate online and freedom of expression from online abuse and hate speech. Online abuse is not about robust debate; it is about the intentional harassment of individuals to force them to leave digital spaces, particularly social media, to modify their behaviour and to self-censor. Sending racist abuse, sending rape threats and sharing a video without someone’s consent are clear red lines. Once we make this distinction, we can turn our attention to the remarks that are not so clear cut.

It is also important to say that social media companies must respect and do more to protect the right of women and diverse groups to express themselves online. We have outlined recommendations in our Fix the Glitch Report: How Social Media Companies can better address online abuse (6).

The Impact of Online Abuse and Online Hate Speech

Online abuse and online hate speech have a significant impact on individuals, communities and our societal values. These consequences can be broken into two key themes: health and wellbeing, and democracy and human rights.

Health and Wellbeing

Online abuse has a serious psychological impact, with victims reporting stress, anxiety or panic attacks as well as lower self-esteem as a result of the abuse. Amnesty International’s research showed that 67% of women who had experienced abuse or harassment online in the UK felt apprehension when thinking about using the internet or social media. Around 1 in 8 young people have been bullied on social media (7), with 57% of young people believing they were bullied because of their appearance, 9% because of their race and 9% because of their sexuality (8). We are also concerned by the increase in reports of young suicide and the increase in NHS treatment for self-harm cases. Obtaining a breakdown of NHS figures by demographic would provide further insight and clarity.

Democracy and Human Rights

The online world can be seen as either an extension or a mirror of offline realities, and therefore violations of human rights and threats to our democracy also happen online. Over a third (34%) of Black, Asian and minority ethnic (BAME) people witnessed or experienced racial abuse in the seven months following the Brexit vote in June 2016, a TUC poll found (9). In 2017, a Metropolitan Police sergeant said, “Every time there is a terrorist atrocity, we record a peak in hate crimes reported” (10).

Online abuse and online hate speech not only violate an individual’s right to live free from violence and to participate online, but also undermine democratic exercise and good governance, and as such create a democratic deficit. Diane Abbott, the UK’s first black woman MP and current Shadow Home Secretary, not only tops the list of MPs receiving the largest number of abusive tweets, she received ten times more abuse than any other woman MP. Former East London politician Seyi Akiwowo had a similar experience of unsolicited abuse in response to an online video of her speech at the European Parliament. She explains the emotional impact of the misogynistic and racial abuse:

“I was so overwhelmed by it all. Looking back, even though I went into fighter mode, wellbeing wise – I wasn’t okay. It was obvious that the harassment affected me which is surprising because I have always been a big believer in the saying ‘sticks and stones may break my bones but words will never hurt me.’ This is so not true. Words hurt and hateful words lead to hateful action,” she says.

Black people report far more incidents of being harassed online simply for being black, rather than in response to any particular view or comment (11). Many women have contacted Seyi Akiwowo and Diane Abbott to say they are seriously re-thinking a career in politics because they see the abuse that politicians who look like them receive.

Democracy only works when representatives reflect their communities, and online abuse is becoming an additional barrier for women and people of colour standing for public office.

A Tell MAMA report identified that 45% of anti-Muslim hate crime took place online, and the organisation is seeing up to 80% of its resources used in monitoring online hate and supporting victims. The Community Security Trust reports that 17% of anti-Semitic incidents took place on social media (12). Again, democracy rests on the engagement of all citizens, including via online platforms. The reported use of bots by foreign governments and extreme right-wing groups will not only further exacerbate human rights violations and threats to our democracy but also cause further divisions and echo chambers.

Our Recommendations

Digital Citizenship Education Provision

We cannot afford for our generation and the next to become desensitised to any form of hate crime. We want to cultivate the agency of young people and to start a conversation about the importance of our generation being responsible citizens online. Educational institutions should take cyber-bullying more seriously and be supported to respond robustly to bullying driven by any form of hate.

Digital citizenship needs to be central to education, taught universally and from a young age. The need for more intensive delivery of digital citizenship education is now recognised around the world, from UNESCO to the House of Lords in the UK. Programmes like Internet Citizens by the Institute for Strategic Dialogue and Glitch!UK’s Digital Citizenship aim to raise the agency of young people to use digital technology confidently, respectfully and positively online. Digital citizenship education gives young people an understanding of the forms of online abuse, including online hate speech, and their impacts and consequences. It also prepares young people to navigate a constantly changing digital space and to build a positive online community.

Elements of Glitch!UK’s Digital Citizenship programme include: Digital etiquette, law and security; digital rights and responsibilities; digital health, wellbeing and critical thinking.

National and Local Governments

Ahead of the UN Elimination of Violence Against Women Day on 25th November, all states and political parties should acknowledge online violence as a form of violence against women (13).

We also recommend that the UK Government and the criminal justice system capture all evidence on online abuse and online hate speech and produce annual reports. The UK Government should commit to collecting data, on a regular basis, on different forms of online violence against women, people of colour and other diverse groups. This can provide evidence for the development of policy responses and action on the ground.

We recommend that the UK Government ensure social media companies pull in additional resources to moderate their platforms when there has been a major terrorist attack and in the days following.

National and local government can help increase community cohesion in the face of rising hate crime and hate speech by raising awareness of what constitutes a hate crime, including online hate crime, its impact and its consequences.

Regional governments can join the calls for social media companies to consistently enforce their terms and conditions, as well as learn from the Mayor of London’s Online Hate Crime programme, which enables detectives to investigate all forms of online hate crime (14).

In September 2017, the former Home Secretary, Amber Rudd, announced a UK-wide online hate crime hub, but it has been given very few resources and there has been no further announcement. We recommend that the current Home Secretary honours the commitment of his predecessor and prioritises a UK-wide online hate crime hub.

Finally, devolved governments can create, and in some instances enforce, an online code of conduct for their staff, schools and community groups, as well as commission the delivery of digital citizenship programmes.

Law Enforcement

One of the London Online Hate Crime Hub programme’s objectives is to train other police officers to deal better with online hate crime. This is vital. Anecdotal evidence suggests that law enforcement is not routinely taking allegations of stalking or coercive control seriously, particularly in relation to online behaviour.

We recommend that all police officers are trained to understand online hate crime, follow recent Crown Prosecution Service guidelines changes, and treat online hate crime as seriously as hate crime committed face to face. New Crown Prosecution Service guidance means stronger penalties for abuse on all social media platforms and hopes to offer more support and protection to victims than ever before.

Organisations, charities, unions and places of work

We can all significantly change the nature, scale and effect of the intimidation of diverse groups in digital spaces. We recommend that community organisations and charities seek training to better understand online abuse and online hate speech, and ensure they have a strong online code of conduct for all their staff, particularly if they run a social media group.

Additional Recommendations and Best Practice

We recommend that hate based on gender, including misogynistic speech, be considered a hate crime. Internet intermediaries can be more transparent, more diverse and follow a code of conduct to high standards (15).

The General Policy Recommendation No. 15 on combating hate speech of the European Commission Against Racism and Intolerance (ECRI) recommends a coherent and comprehensive approach to combating hate speech, covering legal and administrative measures; self-regulatory mechanisms; effective monitoring; victim support; awareness raising and educational measures.

The “Network Enforcement Bill” (19/13013), adopted in Germany on 30 June 2017, calls on Internet providers to assess and remove hate speech content within 24 hours of it being reported. Review of complex cases may take one week and can be referred to an independent self-regulatory body. It is the first example of a national authority enforcing legislation on combating illegal hate speech online, and many of its modalities are still being shaped. Glitch!UK recommends that the UK Government involves individuals with experience and expertise from groups with protected characteristics to inform policy, to ensure freedom of speech for all is protected.

Recommendations 6 and 7 of ECRI’s GPR No. 15 on combating hate speech provide general principles for a self-regulatory body, which should adopt a comprehensive code of conduct that can be enforced, be transparent and well known, include monitoring and complaints mechanisms with the possibility of appeal, and ensure sufficient training of staff.

In 2015 the Austrian Government amended the Criminal Code to include online offences such as cyber-bullying, cyber-mobbing, online stalking, insults, hate speech and personal defamation, which are now punishable by law (16).

The Portuguese Government has strengthened its cooperation with other countries to fight the use of new technologies to commit crimes (17).

The French Government has launched PHAROS, a reporting platform that allows citizens to report abuse suffered online. Reports are processed by police officers assigned to the platform (18).

The UK Government can adopt the recommendations in the UN Special Rapporteur’s report, particularly the call to improve gender-disaggregated data on the prevalence and harms of online abuse; this data collection should also be intersectional (19).

Both governments and internet intermediaries should fully resource and support civil society organisations raising awareness, providing support, training and capacity building to women and other historically underrepresented groups.

Source List

(1) Pew Research; 1 in 4 black Americans have faced online harassment because of their race or ethnicity: http://www.pewresearch.org/fact-tank/2017/07/25/1-in-4-black-americans-have-faced-online-harassment-because-of-their-race-or-ethnicity/

(2) Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective: https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx

(3) In 2015/16 it completed 15,442 hate crime prosecutions, the highest number it has ever recorded http://www.report-it.org.uk/files/hate_crime_report_1.pdf

(4) 1 in 3 BAME people have witnessed or experienced racist abuse since Brexit vote, finds TUC poll: https://www.tuc.org.uk/news/1-3-bame-people-have-witnessed-or-experienced-racist-abuse-brexit-vote-finds-tuc-poll

(5) Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective: https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx

(6) Fix the Glitch Report: How Social Media Companies can better address online abuse: http://www.bit.ly/GlitchUKRecommendations

(7) NSPCC Online abuse Facts and statistics  https://www.nspcc.org.uk/preventing-abuse/child-abuse-and-neglect/online-abuse/facts-statistics/

(8) Ditch the Label Annual Bullying Survey 2018: https://www.ditchthelabel.org/research-papers/the-annual-bullying-survey-2018/

(9) 1 in 3 BAME people have witnessed or experienced racist abuse since Brexit vote, finds TUC poll: https://www.tuc.org.uk/news/1-3-bame-people-have-witnessed-or-experienced-racist-abuse-brexit-vote-finds-tuc-poll

(10) The Guardian, Hate crime surged in England and Wales after terrorist attacks: https://www.theguardian.com/uk-news/2017/oct/17/hate-soars-in-england-and-wales

(11) Pew Research; 1 in 4 black Americans have faced online harassment because of their race or ethnicity: http://www.pewresearch.org/fact-tank/2017/07/25/1-in-4-black-americans-have-faced-online-harassment-because-of-their-race-or-ethnicity/

(12) https://www.london.gov.uk/what-we-do/mayors-office-policing-and-crime-mopac/governance-and-decision-making/mopac-decisions-206

(13) Glitch!UK News + update: https://seyiakiwowo.com/2017/11/30/my-piece-in-labourlist-today-is-the-day-labour-must-renew-its-determination-to-end-online-violence-against-women-and-girls/

(14) Mayor launches new unit to tackle online hate crime 24th April 2017, Press Release https://www.london.gov.uk/press-releases/mayoral/mayor-launches-unit-to-tackle-online-hate-crime

(15) Glitch!UK, Our Recommendations April 2017, https://seyiakiwowo.com/2017/04/13/twitter-and-youtube-do-more-to-deal-with-your-trolls/

(16) European Women’s Lobby Her Net Her Rights Report 2017 https://www.womenlobby.org/IMG/pdf/hernetherrights_report_2017_for_web.pdf

(17) Ibid

(18) Ibid

(19) Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective 2018 https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx

Glitch’s Founder + Director Intervention Notes at the UN Human Rights Council

38th session of the Human Rights Council
Annual full-day discussion on the human rights of women
Panel 1: The impact of violence against women human rights defenders and
women’s organizations in digital spaces
Founder and Director, Glitch!UK’s Seyi Akiwowo’s Intervention Notes 

First, I am honoured to have been invited by the President of the United Nations Human Rights Council, Vojislav Šuc, to participate in the 38th Human Rights Council’s annual full-day discussion on the human rights of women. I am extremely pleased that the theme is the impact of violence against women human rights defenders and women’s organisations in digital spaces. I also must thank and praise the United Nations Special Rapporteur on Violence Against Women, Dubravka Šimonović, for her strong and comprehensive report on online violence against women and girls, its causes and consequences, from a human rights perspective.

I will use my intervention today to help debunk 5 myths commonly used to dispute, disrupt and downgrade online violence and its harmful impact.

 

In 2017, after facing horrendous online violence when a video of a speech I made at the European Parliament, as one of the UK’s young British Nigerian politicians, went viral, I founded Glitch!UK, a not-for-profit online abuse advocacy, campaigning and training organisation. Glitch!UK aims to end online abuse and harassment, including online violence against politically active women. ‘Glitch’ means a temporary malfunction of equipment, and I chose it for my organisation’s name because when we look back on this period in time I want us all to be able to say that the rise in online abuse was only a ‘glitch’ in our history.

 

In the last year, I have been fortunate to have met and/or worked with fantastic, hardworking women human rights defenders and women’s organisations, such as the National Democratic Institute and its #NotTheCost campaign, Amnesty International’s #ToxicTwitter research, and the Association for Progressive Communications’ Take Back The Tech initiative. I have heard many heart-wrenching testimonies of the online violence that both politically active and non-politically active women have experienced. This tells me very loudly and very clearly that online violence against women is a multi-faceted and global problem. Therefore the solutions, many of which are outlined in the UN Special Rapporteur on Violence Against Women’s report, must be multi-faceted too and include digital technology companies and governments as well as civil society organisations.

However, there are many people, most likely including some in this very room, who do not believe online violence exists, let alone believe in its impact on women.

 


  1. “‘Online violence’ actually doesn’t exist.”

It very much does. In Europe, 9 million girls have experienced some kind of online violence by the time they are 15 years old. Globally, women are 27 times more likely than men to be harassed online.[1]

Online violence manifests in a multitude of forms, including harassment, harassment across multiple social media platforms, online stalking, sharing of private information, trolling, non-consensual online dissemination of intimate images and sextortion. In some member states, some of these forms are already crimes, including online hate crime.

When talking about the online abuse that women, and politically active women in particular, face, we must be intersectional and look at women with multiple identities. I not only face misogyny, I also face racism, or, as the academic Moya Bailey terms it, misogynoir.[2]

  2. “Addressing ‘online violence’ is infringing on individuals’ rights to freedom of expression.”

Online violence is not robust debate.[3] It is the intentional harassment of women to silence them and force them to leave digital spaces. It is an attempt to modify women’s behaviour to conform to patriarchy and to push women into self-censorship.

There are some words and acts that are just clearly hateful and do not belong in robust debate. Sending racist abuse, rape threats and sharing an intimate video without someone’s consent are clear red lines. Once we address this, then we can turn our attention to remarks that are not so clear cut.

  3. “‘Online violence’ has no harmful impact”

Online violence has an impact on health and wellbeing, progress towards gender equality and is a threat to democracy.

Online violence against politically active women represents a direct barrier to women’s free speech and political participation. The anti-democratic impact of psychological abuse and other forms of violence through digital technology undermines a woman’s sense of personal security, leading to self-censorship and withdrawal from public discourse and correspondence.[4]

Evidence from around the world suggests that women in politics have experienced such violence and intimidation, and that their experiences have implications for their ability and willingness to participate actively in public life. [5] Around two-thirds of women who had experienced abuse or harassment online in the UK (67%) stated a feeling of apprehension when thinking about using the internet or social media.[6]

  4. “There are no solutions”

We can all significantly change the nature, scale, and effect of the intimidation of politically active and non-politically active women in digital spaces.

Ahead of the UN Elimination of Violence Against Women Day on 25th November all states and political parties can acknowledge online violence as a form of violence against women.[7]

Internet intermediaries can be more transparent, more diverse and follow a code of conduct of high standards.[8]

The German Government now has powers to fine social media companies up to €50m for failing to remove illegal content within 24 hours.

Anecdotal evidence suggests that law enforcement is not routinely taking allegations of stalking or coercive control seriously, particularly in relation to online behaviour.[9] There is a role for regional governments here too. Last year, the Mayor of London, Sadiq Khan, launched the Online Hate Crime programme, which investigates all forms of online hate crime.[10] The United Kingdom now treats offences committed online as if they happened in a public space.[11]

In 2015 the Austrian Government amended the Criminal Code to include online offences such as cyber-bullying, cyber-mobbing, online stalking, insults, hate speech and personal defamation, which are now punishable by law.[12]

The Portuguese Government has strengthened its cooperation with other countries to fight the use of new technologies to commit crime.[13]

The French Government has launched PHAROS, a reporting platform that allows citizens to report abuse suffered online. Reports are processed by police officers assigned to the platform.[14]

Member States today can adopt the recommendations[15] in the UN Special Rapporteur’s report, particularly the call to improve gender-disaggregated data on the prevalence and harms of online abuse.

Finally, both governments and internet intermediaries should fully resource and support civil society organisations raising awareness, providing support, training and capacity building to women and other historically underrepresented groups.

  5. “Citizenship cannot be extended to digital spaces”

Digital citizenship needs to be central to education, taught universally and from a young age.

The need for more intensive delivery of digital citizenship education is now recognised around the world from UNESCO to the House of Lords in the UK.

Programmes like Internet Citizens by the Institute for Strategic Dialogue and Glitch!UK’s Digital Citizenship aim to raise the agency of young people to use digital technology confidently, respectfully and positively online.

Digital citizenship education provides young people with an understanding of the forms of online abuse (including online bullying), its impacts and consequences, and prepares them to navigate a constantly changing digital space.

 

Driving women out of public space is nothing new. Online violence in public digital spaces is merely an extension of a reality lived by millions of women around the world. Nevertheless, by working together comprehensively we can #fixtheglitch. I hope this 38th Human Rights Council will be another marker in international action, and that we will see significant commitments to ending all forms of online violence against all women, including politically active women and women’s organisations.

Bibliography 

[1] UN Broadband Commission for Digital Development, “Cyber Violence Against Women and Girls: A World-Wide Wake-Up Call”, 2015. Available online at: http://www.unwomen.org/-/media/headquarters/attachments/sections/library/publications/2015/cyber_violence_gender%20report.pdf?vs=4259

[2] Maya Goodfellow 2017, Misogynoir: How social media abuse exposes longstanding prejudices against black women, https://www.newstatesman.com/politics/uk/2017/02/misogynoir-how-social-media-abuse-exposes-longstanding-prejudices-against-black

[3] Glitch!UK News + update, Human Rights Day 2017 https://seyiakiwowo.com/2017/12/10/humanrightday2017/

[4] National Democratic Institute, Evidence Paper, Review of the Committee on Standards in Public Life into the Intimidation of Parliamentary Candidates: https://www.ndi.org/sites/default/files/NDI%20UK%20Report.pdf

[5] Ibid

[6] Amnesty International Report, Online abuse of women widespread in UK: https://www.amnesty.org.uk/online-abuse-women-widespread

[7] Glitch!UK News + update,  https://seyiakiwowo.com/2017/11/30/my-piece-in-labourlist-today-is-the-day-labour-must-renew-its-determination-to-end-online-violence-against-women-and-girls/

[8] Glitch!UK, Our Recommendations April 2017, https://seyiakiwowo.com/2017/04/13/twitter-and-youtube-do-more-to-deal-with-your-trolls/

[9] Westminster Foundation for Democracy, Case Study: The United Kingdom Report 2018 http://www.wfd.org/wp-content/uploads/2018/06/UK-Case-Study-1.pdf

[10] Mayor launches new unit to tackle online hate crime 24th April 2017, Press Release https://www.london.gov.uk/press-releases/mayoral/mayor-launches-unit-to-tackle-online-hate-crime

[11] Alison Saunders, Hate is hate. Online abusers must be dealt with harshly, 21st August 2017: https://www.theguardian.com/commentisfree/2017/aug/20/hate-crimes-online-abusers-prosecutors-serious-crackdown-internet-face-to-face

[12] European Women’s Lobby Her Net Her Rights Report 2017 https://www.womenlobby.org/IMG/pdf/hernetherrights_report_2017_for_web.pdf

[13] Ibid

[14] Ibid

[15] Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective 2018 https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx

Our response to the US leaving the UN Human Rights Council the day before an important session

President Trump has announced the United States will leave the United Nations Human Rights Council a day before the 38th Human Rights Council session on the human rights of women.

We agree with Salil Shetty, Amnesty International’s Secretary General: “while the Human Rights Council is by no means perfect and its membership is frequently under scrutiny, it remains an important force for accountability and justice.”

Two panel discussions have been scheduled for the 38th Human Rights Council session: The impact of violence against women human rights defenders and women’s organizations in digital spaces, and Advancing women’s rights through access and participation in information and communication technologies.

We agree with the statements shared by the United Nations Human Rights Council, diplomats and national leaders around the world. However, we hope this announcement does not dominate, overshadow or derail the important work to be discussed this week. We stand in solidarity with all US civil society groups that will be affected by this announcement and with those attending the 38th Human Rights Council session.