2018 was a year of growth for us and we are so thankful for your continued support. Below we've shared some of our proudest moments of 2018 and our exciting plans for 2019.
Our Proudest Moments of 2018
Our key focus last year was to encourage as many people as possible to take some form of action to help fix the glitch and end online abuse.
Over 100 people pledged to help #fixtheglitch and be an active bystander online.
Over 30 public figures have received our bespoke Digital Resilience Training.
We’ve been working closely with policy makers and Members of Parliament across parties to highlight the intersectionality of online abuse, outline its impact and harms, and explore effective ways to fix the glitch. Our presentation at the 38th United Nations Human Rights Council on Online Violence Against Women was extremely well received.
We were also invited to Number 10 Downing Street thanks to new Glitch Partners Antisemitism Policy Trust.
We are proud of our submissions to two All-Party Parliamentary Groups and our joint response to Government consultation with Centenary Action Group.
We recently developed working relationships with Facebook and Twitter, who have agreed for us to host a listening meeting for British Black women to share their experiences of online abuse in Spring 2019.
Have you noticed our rebrand? Huge thanks to Double Noire; the new look has supported our online and offline campaigns. We’re excited to see more people learning about different forms of online abuse, the scale of the challenge and effective ways to end it.
We are so proud that our work and organisation have been recommended as best practice in a recent European Parliament Report on Cyberviolence.
Our Hopes for 2019
- Deliver online resources to empower more people to help end online abuse in their local communities.
- Deliver more successful campaigns for change.
- Deliver workshops to 5000 young people.
To achieve this we need your support. Here are five ways you can help:
- Donate: support our Christmas Fundraiser by making a financial donation via PayPal.
- Book: our Digital Resilience training or our Digital Citizenship workshop.
- Volunteer: we’ve tripled in size but are also looking for volunteers with skills and expertise to help us grow. If you have finance, legal, fundraising or marketing skills we would love to hear from you! Please complete this short interest form.
- Partner with us: our mission to fix this current glitch in our online world can only be achieved through collaboration and partnership. If you are an individual or organisation working to make the online world safer we would love to work with you! Please get in contact.
- Spread the word: we want to build an inclusive movement to end online abuse to fix this glitch. Follow us online and share Glitch with your Facebook friends and your Twitter and Instagram followers.
We wish you a wonderful 2019 and we look forward to working with you to end online abuse.
Yesterday, Amnesty International further proved that online abuse is a violation of our human rights with the launch of its latest report on women in politics and journalism. Results from the global crowdsourcing project, Troll Patrol, support what women, particularly black women, have been reporting for years. The research revealed that women in the US and UK face a staggering level of abuse – an abusive tweet every 30 seconds on Twitter – and that black women are 84% more likely than white women to face abusive or problematic tweets. Organisations like Take Back the Tech, Women’s Media Center and The National Democratic Institute’s Not the Cost Campaign were among the first to address this epidemic and we pay tribute to their dedication. Now that we have intersectional data, it’s time we all help fix the glitch and end online abuse.
Last year I experienced a tidal wave of misogynoir, which is abuse that’s both misogynistic and racist, after a speech I made went viral. As a young black woman in politics observing how other women were publicly treated on social media platforms, experiencing abuse first hand was the final straw. This is why I founded Glitch, a not-for-profit organisation that exists to end online abuse. Our workshops centre on digital self-care, self-defence and digital citizenship and were recently recommended as best practice in a European Parliament Report.
According to the report, 1 in 10 tweets mentioning black women is either abusive or problematic, compared with 1 in 15 directed at white women. We strongly agree with Amnesty: social media companies such as Twitter must be more transparent, but they must also engage with and support many more diverse activist groups using their platforms. For too long, research, policies and Government interventions to address online abuse have focused on children and women as if the two are homogenous groups. By doing so, we ignore the real drivers behind much of the online abuse and bullying affecting women and girls on a daily basis. Amnesty’s research is a welcome step towards a more nuanced, intersectional critique of online harms.
It’s important to understand that while women experience all different kinds of online abuse, the overall impact has a silencing effect that represents a potent threat to gender equality, human rights and democracy. It causes anxiety and, in very sad situations, has resulted in girls self harming and taking their own lives. Amnesty’s Write for Rights Campaign has inspired thousands of people around the world to write to Jack Dorsey, asking him to take serious action. I’ve also received so many messages from people sharing their own experiences of online abuse and losing loved ones.
So, how do we begin to address online abuse?
The current online/offline dichotomy is unhelpful and hides much of the violence that women and girls face online. Misogynation, a book by Everyday Sexism founder Laura Bates, is a collection of essays on the importance of joining the dots between the different forms of violence that women and girls face. We can already see patterns emerging between domestic violence and terrorist attacks, and individuals who are violent offline also send abusive messages to women online. Even in these cases, we again see social media platforms failing to enforce their own rules.
Having presented at the United Nations Human Rights Council this summer, it was very clear to me that online abuse towards women in politics is far too common. On a personal level, I strongly urge all political parties and membership organisations to develop a code of conduct for their members and be crystal clear about the support they will provide to their candidates and members when standing, campaigning or representing the organisation.
Moving forward, we need to amplify the experiences of women in public life, campaigners, councillors, candidates, activists, founders of charities, Youtubers, artists and bloggers. We also need to amplify the everyday woman who may use #blackgirlmagic, #blackhistory or #metoo and face abuse from those who are hijacking hashtags and derailing them for their own racist, sexist (or both) agendas. We must have more, frequent and visible conversations on online abuse and the diversity of experiences women face.
I hope we continue to see responsible data-gathering on women with intersectional identities, such as black disabled women and black Muslim women. However, civil society groups cannot combat online abuse alone; tech companies and governments must also be involved. We therefore need new money, resources and training to understand online abuse and to appropriately educate, enforce and empower society against it. We must also see sufficient resources to support diverse media groups such as Black Ballad, Media Diversified and Gal-dem Magazine, who are mentioned in the research, as well as diverse academics, technologists, law enforcement and civil society groups, so they can continue their vital work.
Finally, and arguably most importantly, we need to see more men be effective allies to women online, and we certainly must see more white women be effective allies to black women. Last year, the BNP created a racist and sexist Christmas card, which well-meaning Twitter users then forwarded to Diane Abbott MP when sharing their outrage. Did Abbott, who receives almost half of all online abuse directed at female MPs, really need to see the offensive card over and over again? Glitch’s digital citizenship workshop, aimed at all online users but especially young people, encourages participants to understand that with digital rights come digital responsibilities, and to adopt an ‘active bystander’ attitude when engaging in activities online. Black women have been talking about their negative experiences online for years and, thanks to Amnesty, we now have the data to prove it. But it’s up to all of us to make lasting change.
First appeared in the Huffington Post
Our Executive Director, Seyi Akiwowo, responding to Amnesty International’s new report revealing that abusive tweets are sent to female politicians and journalists every 30 seconds. The study also found that black and minority ethnic women were a third more likely to be targeted than white women. It’s time to fix the glitch and end online abuse.
Make the most of this season of goodwill by donating to help us continue tackling online abuse. As we count down to Christmas we’ll be asking for your support with some of our most important work. You can get started now with a donation of just £5 via our PayPal.
Glitch celebrates the adoption of new social media guidelines by the CPS. It is a step forward to fix the glitch and end online abuse. These guidelines show the UK’s commitment to United Nations treaties and resolutions, adopting real measures to protect human rights in our national legislation. We are pleased to see our recommendations being heard and acted upon.
Glitch would like to know what steps will be taken to effectively implement these guidelines. The police are currently under-resourced and for these guidelines to be effective there must be investment in and transparency around the training and capacity of prosecutors and law enforcement teams.
Civil society groups also need to see how data is being collected, and the CPS should strengthen sex- and age-disaggregated data collection and publication as recommended in the Charlevoix Commitment. We hope that reports will be publicly shared so that everyone can analyse the impact of these guidelines, and that the CPS will work with civil society groups to ensure the guidelines are effective.
For more information
Last September, Twitter expanded its Hateful Conduct Policy with the introduction of a Dehumanisation Policy. This new set of rules aims to prohibit “language that treats others as less than human”, denying their human nature or their human qualities, and includes an extensive list of the identifiable groups that could be victimised. However, the policy only protects individuals when they have been dehumanised based on “a membership in an identifiable group”. While we welcome Twitter’s effort to seek feedback from global perspectives and acknowledge the different impacts on cultures and communities around the world, an issue arises when a person’s membership of a group is not clear from the nature of the dehumanising insult – for example, when someone directly compares another person to an animal.
Twitter’s consultation was short – just 14 days – which made it inaccessible for many and limited its scope. By not giving people an adequate chance to participate, the purpose of the survey is undermined. While we believe the policy is a step forward in ending online abusive behaviour, it is not enough. We recommend:
- Publish more information on the implementation of Twitter’s current hateful conduct and new dehumanisation policies
- Higher engagement with the community and communication of results and statistics
- Clarity on the definition of “abusive tweet”
- Transparency in enforcement, e.g. sharing the guidance given to moderators
- Implementation of an accountability mechanism to secure enforcement
- Expansion of the regulation for a more consistent policy and higher protection of individuals
There is still a large policy gap that social media companies must address. We can all help to fix the glitch, and we want to see real efforts to tackle online abuse. Read more of our recommendations here: https://seyiakiwowo.com/our-recommendations/.
Tonight, during a commercial break on GoggleBox, Channel 4 will broadcast adverts overlaid with abuse that has been directed at the people who feature in them via social media platforms.
The online abuse received by those featured in these ads, including death threats, is shameful, and we deeply empathise with those having to deal with it. Glitch!UK believes each person is incredibly brave for telling their story and allowing the ads to be replayed in the hope it will bring awareness and change.
Social media companies must do better to enforce their own rules and take down inappropriate, abusive and illegal content. We believe they have a duty to ensure all their users feel safe on their platforms.
Companies that run ads like these also have a duty to protect those they work with. This should include clear anti-hate and safe online community policies accessible via their social media accounts.
Brands and companies can be leaders in helping #fixtheglitch. They have people’s trust, and people have relationships with them – take Nike’s new campaign with Colin Kaepernick, for example. Companies can help lead social change, but to do so they need to vocalise their rebuke of this abuse.
We can all be active bystanders. Call out trolls when you see them, report them to the platforms they’re being abusive on and drown out their negative comments with positive ones. We can all be digital citizens and help #fixtheglitch.
You can contact our Communications Manager and Director via info[AT]fixglitch.org for further comment.
Glitch!UK is proud to announce we are taking part in World Suicide Prevention Day. A study found that victims of cyberbullying under the age of 25 are more than twice as likely to self-harm and enact suicidal behaviour. This is why we are getting involved: our goal is to raise awareness and prevent more young people from feeling this way in the future. We aim to get more young people taking part in our Digital Citizenship workshops, which cover digital safety and help young people navigate the online world confidently, critically and positively. In the lead-up to World Suicide Prevention Day we will be signposting organisations and resources. You can find out more about our Digital Citizenship programme here.
- Samaritans – for everyone, call 116123 or email email@example.com
- Campaign Against Living Miserably (CALM) – for men, call 0800585858 (5pm to 12am)
- Papyrus – for people under 35, call 08000684141 (Mon-Fri 10am-10pm, weekends 2pm-10pm, bank holidays 2pm-5pm), text 07786209697, email email@example.com
- Childline – for young people under 19, call 08001111 (it’s anonymous on your phone bill)
- The Silver Line – for seniors, call 08004708090
Glitch!UK is a new, thriving and internationally known not-for-profit organisation with a mission to end online abuse in the UK. ‘Glitch’ is defined as ‘a temporary malfunction’. When we look back on this period of time, we want to be able to say that the rise in online abuse was only a ‘glitch’ in our history.
Glitch!UK began as an online initiative in April 2017 after the Founder and Director, Seyi Akiwowo, faced horrendous online abuse. A video of a speech Seyi gave at the European Parliament went viral and months later Seyi was a target of abuse across multiple social media platforms. The Glitch!UK initiative has been shared in the media, at conferences, mentioned in Parliament and supported by organisations such as Feminist Internet, Plan International and the Anne Frank Trust UK.
Glitch!UK is not about imposing restrictions on how we use social media or censoring our right to free speech and freedom of expression. This submission outlines forms of online abuse including hate speech, its impact and consequences through an intersectional lens. This submission also outlines key recommendations such as digital citizenship education and best practice.
Although the UK does not recognise misogyny as a hate crime, we define misogyny as the hatred of, contempt for, or prejudice against women or girls. We believe it is important to be intersectional when talking about forms of online abuse and hate crime, particularly misogynoir. Glitch!UK believes that online abuse can become a dangerous vehicle for movements to further divide society and spread fear, so we must work together to #fixtheglitch.
In today’s digital age, the Internet is rapidly creating new social digital spaces and transforming how individuals meet, communicate and interact, reshaping society as a whole. In recent years, digital spaces like social media platforms have become unfriendly, unsafe and toxic, particularly for people of colour (1), women and historically underrepresented groups. Online abuse takes many forms and we have seen a particular increase in online gender-based violence (2) and online hate speech (3). We believe this has been a consequence of, and enabled by, a number of events:
- Rise of far right movements across Europe
- Increased polarisation within societies
- Rhetoric used during the UK’s referendum on European Union membership (4)
- Rhetoric used during recent political elections including the 2016 United States Presidential Election
- The failure of social media companies to adequately and consistently address online abuse as well as ensuring safety of all users
- Lack of diversity within the technology and digital engineering sectors
- Inconsistency within UK laws, requiring policy makers to play catch-up with crimes in digital spaces, and a gap between international legislation and its implementation
- Underreporting of incidents to the police and lack of robust commitment from the criminal justice system to prosecute
The Situation: The Glitch
Digital spaces have been used directly as a tool for making threats of physical and/or sexual violence, rape, killing, hate speech, unwanted and harassing online communications, or even the encouragement of others to harm women and people of colour physically. It may also involve the dissemination of reputation-harming lies, electronic sabotage in the form of spam and malignant viruses, impersonation of the victim online and the sending of abusive emails or spam, blog posts, tweets or other online communications in the victim’s name (5).
Forms of online abuse
Doxing refers to the publication of private information, such as contact details, on the Internet with malicious intent. It includes situations where personal information and data retrieved by a perpetrator is made public with malicious intent, clearly violating the right to privacy.
Trolling is the posting of messages, uploading of images or videos and creation of hashtags for the purpose of annoying, provoking or inciting violence against women and girls. Many “trolls” are anonymous and use false accounts to generate hate speech.
Online mobbing and harassment refer to the online equivalents of mobbing or harassment on social platforms, the Internet, in chat rooms, instant messaging and mobile communications.
Online stalking is the repeated harassment of individuals, perpetrated by means of mobile phones or messaging applications, in the form of crank calls or private conversations on online applications (such as WhatsApp) or in online chat groups.
Online hate speech is a type of speech that takes place online (e.g. the Internet, social media platforms) with the purpose to attack a person or a group on the basis of attributes such as race, religion, ethnic origin, sexual orientation, disability, or gender.
All the above-mentioned forms of online abuse create a permanent digital record that can be distributed worldwide and cannot be easily deleted, which may result in further victimisation of the individual(s) targeted.
It is important to distinguish heated debate online and freedom of expression from online abuse and hate speech. Online abuse is not about robust debate; it is about the intentional harassment of individuals to force them to leave digital spaces, particularly social media, modify their behaviour and self-censor. Sending racist abuse, making rape threats and sharing a video without someone’s consent are clear red lines. Once we tackle this distinction, we can turn our attention to the remarks that are not so clear cut.
It is also important to say that social media companies must respect and do more to protect the right of women and diverse groups to express themselves online. We have outlined recommendations in our Fix the Glitch Report: How Social Media Companies can better address online abuse (6).
The Impact of Online Abuse and Online Hate Speech
Online abuse and online hate speech have a significant impact on individuals, communities and our societal values. These consequences can be grouped into two key themes: health and wellbeing, and democracy and human rights.
Health and Wellbeing
Online abuse has a serious psychological impact, with victims reporting stress, anxiety or panic attacks as well as lower self-esteem as a result of the abuse. Amnesty International’s research showed that 67% of women who had experienced abuse or harassment online in the UK felt apprehension when thinking about using the internet or social media. Around 1 in 8 young people have been bullied on social media (7), with 57% of young people believing they were bullied because of their appearance, 9% because of their race and 9% because of their sexuality (8). We are also concerned by the increase in reports of young suicide and the increase in NHS treatment for self-harm cases. Obtaining a breakdown of NHS figures by demographic would provide further insight and clarity.
Democracy and Human Rights
The online world can be seen as either an extension or a mirror of offline realities, and therefore violations of human rights and threats to our democracy also happen online. Over a third (34%) of Black, Asian or minority ethnic (BAME) people witnessed or experienced racial abuse in the seven months following the Brexit vote in June 2016, a TUC poll found (9). In 2017, a Metropolitan Police Sergeant said, “Every time there is a terrorist atrocity, we record a peak in hate crimes reported” (10).
Online abuse and online hate speech not only violate an individual’s right to live free from violence and to participate online, but also undermine democratic exercise and good governance, and as such create a democratic deficit. Diane Abbott, the UK’s first black woman MP and current Shadow Home Secretary, not only tops the list of MPs receiving the largest number of abusive tweets but received ten times more abuse than any other woman MP. Former East London politician Seyi Akiwowo had a similar experience of unsolicited abuse in response to an online video of her speech at the European Parliament. She explains the emotional impact of the misogynistic and racial abuse:
“I was so overwhelmed by it all. Looking back, even though I went into fighter mode, wellbeing wise – I wasn’t okay. It was obvious that the harassment affected me which is surprising because I have always been a big believer in the saying ‘sticks and stones may break my bones but words will never hurt me.’ This is so not true. Words hurt and hateful words lead to hateful action,” she says.
Black people reported far more incidents of being harassed online simply for being black, rather than in response to any particular view or comment (11). Many women have contacted Seyi Akiwowo and Diane Abbott telling them they are seriously re-thinking a career in politics because they see the abuse received by politicians who look like them.
Democracy only works when representatives reflect their communities and online abuse is becoming an additional barrier for women and people of colour standing for public office positions.
A Tell MAMA report identified 45% of anti-Muslim hate crime took place online, and the organisation is seeing up to 80% of its resources used in monitoring online hate and supporting the victims. Community Security Trust reports 17% of anti-Semitic incidents took place on social media (12). Again, democracy rests on the engagement of all citizens including via online platforms. The reported use of bots by foreign governments and extreme right-wing groups will not only further exacerbate human rights violations and threats to our democracy but cause further divisions and echo chambers.
Digital Citizenship Education Provision
We cannot afford for our generation and the next to become desensitised to any form of hate crime. We want to cultivate the agency of young people and we want to start a conversation about the importance of our generation being responsible citizens online. Educational institutions should take cyber-bullying more seriously and be supported to respond robustly to bullying driven by any form of hate.
Digital citizenship needs to be central to education, taught universally and from a young age. The need for more intensive delivery of digital citizenship education is now recognised around the world from UNESCO to the House of Lords in the UK. Programmes like Internet Citizens by Institute of Structured Dialogue and Glitch!UK’s Digital Citizenship aim to raise the agency of young people to use digital technology online confidently, respectfully and positively online. Digital citizenship education provides young people with an understanding of the forms of online abuse, including online hate speech, its impacts and consequences. It also prepares young people to navigate a constantly changing digital space as well as build a positive online community.
Elements of Glitch!UK’s Digital Citizenship programme include: Digital etiquette, law and security; digital rights and responsibilities; digital health, wellbeing and critical thinking.
National and Local Governments
Ahead of the UN Elimination of Violence Against Women Day on 25th November, all states and political parties should acknowledge online violence as a form of violence against women (13).
We also recommend that the UK Government and the Criminal Justice System capture all evidence on online abuse and online hate speech and produce annual reports. The UK Government should make a commitment to the regular collection of data on different forms of online violence against women, people of colour and other diverse groups. This can provide evidence for the development of policy responses and action on the ground.
We recommend that the UK Government ensure social media companies pull in additional resources to moderate their platforms when there has been a major terrorist attack and in the days following.
National and local government can help increase community cohesion in the face of rising hate crime and hate speech by raising awareness of what constitutes a hate crime, including online hate crime, its impact and the consequences.
Regional governments can join the calls for social media companies to consistently enforce their terms and conditions as well as learn from The Mayor of London’s Online Hate Crime programme. This programme enables detectives to investigate all forms of online hate crime. (14).
In September 2017, former Home Secretary Amber Rudd announced a UK-wide online hate crime hub, but it has been given very few resources and there has been no further announcement. We recommend that the current Home Secretary honours the commitment of his predecessor and prioritises a UK-wide online hate crime hub.
Finally, devolved governments can create, and in some instances enforce, an online code of conduct for their staff, schools and community groups, as well as commission the delivery of digital citizenship programmes.
Part of the London Online Hate Crime Hub Programme’s objectives is to train other police officers to better deal with online hate crime. This is vital. Anecdotal evidence suggests that law enforcement is not routinely taking allegations of stalking or coercive control seriously, particularly in relation to online behaviour.
We recommend that all police officers are trained to understand online hate crime, follow recent Crown Prosecution Service guidelines changes, and treat online hate crime as seriously as hate crime committed face to face. New Crown Prosecution Service guidance means stronger penalties for abuse on all social media platforms and hopes to offer more support and protection to victims than ever before.
Organisations, charities, unions and places of work
We can all significantly change the nature, scale and effect of the intimidation of diverse groups in digital spaces. We recommend that community organisations and charities seek training to better understand online abuse and online hate speech, as well as ensure their organisations have a strong online code of conduct for all staff, particularly if the organisation has a presence on social media.
Additional Recommendations and Best Practice
We recommend that hate based on gender, including misogynistic speech, should be considered a hate crime. Internet intermediaries can be more transparent, more diverse and follow a code of conduct to high standards (15).
The General Policy Recommendation No. 15 on combating hate speech of the European Commission Against Racism and Intolerance (ECRI) recommends a coherent and comprehensive approach to combating hate speech, covering legal and administrative measures; self-regulatory mechanisms; effective monitoring; victim support; awareness raising and educational measures.
The “Network Enforcement Bill” (19/13013), adopted in Germany on 30 June 2017, requires Internet providers to assess and remove hate speech content within 24 hours of it being reported. Review of complex cases may take one week and can be referred to an independent self-regulation body. It is the first example of a national authority enforcing legislation on combating illegal hate speech online, and many of its modalities are still being shaped. Glitch!UK recommends that the UK Government involves individuals with experience and expertise from protected-characteristic groups to inform policy, so that freedom of speech for all is protected.
In ECRI’s GPR 15 on Combating Hate Speech recommendations 6 and 7 provide general principles for a self-regulatory body, which should adopt comprehensive code of conduct that can be enforced, be transparent and known, include monitoring and complaints mechanisms with possibility for appeal and ensure sufficient training of staff.
In 2015 the Austrian Government amended the Criminal Code to include online offences such as cyber-bullying, cyber-mobbing, online stalking, insults, hate speech and personal defamation, which are now punishable by law (16).
The Portuguese Government has strengthened cross-border cooperation to fight the use of new technologies to commit crimes (17).
The French Government has launched PHAROS, a reporting platform that allows citizens to report abuse suffered online. Reports are processed by police officers assigned to the platform (18).
The UK Government can adopt the recommendations in the UN Special Rapporteur’s report, particularly the call to improve gender-disaggregated data on the prevalence and harms of online abuse; this data collection should also be intersectional (19).
Both governments and internet intermediaries should fully resource and support civil society organisations that raise awareness and provide support, training and capacity building to women and other historically underrepresented groups.
(1) Pew Research; 1 in 4 black Americans have faced online harassment because of their race or ethnicity: http://www.pewresearch.org/fact-tank/2017/07/25/1-in-4-black-americans-have-faced-online-harassment-because-of-their-race-or-ethnicity/
(2) Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective: https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx
(3) In 2015/16 the Crown Prosecution Service completed 15,442 hate crime prosecutions, the highest number it has ever recorded: http://www.report-it.org.uk/files/hate_crime_report_1.pdf
(4) 1 in 3 BAME people have witnessed or experienced racist abuse since Brexit vote, finds TUC poll: https://www.tuc.org.uk/news/1-3-bame-people-have-witnessed-or-experienced-racist-abuse-brexit-vote-finds-tuc-poll
(5) Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective: https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx
(6) Fix the Glitch Report: How Social Media Companies can better address online abuse: http://www.bit.ly/GlitchUKRecommendations
(7) NSPCC Online abuse Facts and statistics https://www.nspcc.org.uk/preventing-abuse/child-abuse-and-neglect/online-abuse/facts-statistics/
(8) Ditch the Label Annual Bullying Survey 2018: https://www.ditchthelabel.org/research-papers/the-annual-bullying-survey-2018/
(9) 1 in 3 BAME people have witnessed or experienced racist abuse since Brexit vote, finds TUC poll: https://www.tuc.org.uk/news/1-3-bame-people-have-witnessed-or-experienced-racist-abuse-brexit-vote-finds-tuc-poll
(10) The Guardian, Hate crime surged in England and Wales after terrorist attacks https://www.theguardian.com/uk-news/2017/oct/17/hate-soars-in-england-and-wales
(11) Pew Research; 1 in 4 black Americans have faced online harassment because of their race or ethnicity: http://www.pewresearch.org/fact-tank/2017/07/25/1-in-4-black-americans-have-faced-online-harassment-because-of-their-race-or-ethnicity/
(13) Glitch!UK News + update: https://seyiakiwowo.com/2017/11/30/my-piece-in-labourlist-today-is-the-day-labour-must-renew-its-determination-to-end-online-violence-against-women-and-girls/
(14) Mayor launches new unit to tackle online hate crime, 24 April 2017, Press Release: https://www.london.gov.uk/press-releases/mayoral/mayor-launches-unit-to-tackle-online-hate-crime
(15) Glitch!UK, Our Recommendations, April 2017: https://seyiakiwowo.com/2017/04/13/twitter-and-youtube-do-more-to-deal-with-your-trolls/
(16) European Women’s Lobby, Her Net Her Rights Report 2017: https://www.womenlobby.org/IMG/pdf/hernetherrights_report_2017_for_web.pdf
(19) Report of the Special Rapporteur on violence against women, its causes and consequences, on online violence against women and girls from a human rights perspective, 2018: https://www.ohchr.org/EN/Issues/Women/SRWomen/Pages/AnnualReports.aspx