Last month saw the launch of the All-Party Parliamentary Group’s (APPG) report on Hate Crime.
The group, created in 2018, connects researchers, civil society organisations, law enforcement and politicians across parties to work together to better understand and address hate crime. This report summarises findings from the group’s first enquiry, launched a year ago, which asked:
‘How can we build community cohesion when hate crime is on the rise?’
This question is both challenging and complex, particularly when thinking about hate crimes perpetrated online; the report describes the online world as a ‘fertile breeding ground for hate crime’ and emphasises that harm perpetrated online has serious consequences offline.
What is a hate crime?
A hate crime is an offence that is:
‘…perceived by the victim or any other person, to be motivated by hostility or prejudice based on a person’s race or perceived race; religion or perceived religion; sexual orientation or perceived sexual orientation; disability or perceived disability; and any crime motivated by hostility or prejudice against a person who is transgender or perceived to be transgender.’
What did the enquiry find?
The report confirms that hate crime is on the rise.
As of October 2018, official data showed a 17% increase on the previous year. Race and ethnicity account for the majority of reported hate crimes, while religious hate crime saw the largest increase, at 40%. The enquiry illuminates the current prevalence of hate crime and takes an intersectional approach to examine how harm is directed at, and experienced by, different groups. Importantly, the report acknowledges that children and young people face significant risks from online hate, and that women face disproportionate harassment too.
To gather this detail, the enquiry drew on expertise and insights from organisations working across different forms of hate crime, including evidence submitted by Glitch that underlines the need for greater safeguarding in digital spaces.
What happens next?
The current picture is concerning, but evidence from the enquiry will help policymakers explore solutions that address the complexities of hate crime.
The report also explores best practice and highlights education as key to tackling the problem. It notes the positive impact of existing resources available to schools that raise awareness of the risks and dangers around online bullying, with Glitch’s Digital Citizenship Toolkit included as an example. Ultimately, the report confirms a need to expand the provision of resources and training, especially for young people, to build resilience.
The report offers recommendations, echoing Glitch’s call for greater governance of online spaces and enforcement of standards that protect people’s rights and dignity online.
Consent is simple.
It means giving permission for something to happen and also – this is key – being comfortable giving that permission.
This year’s Safer Internet Day invites us to think about how consent works in digital spaces. Since, according to Ofcom research, we check our phones every 12 minutes and spend, on average, more than a full day a week online, how we relate through the Internet is not insignificant. With a series of taps we are able to connect, interact and share with millions around the world. But the terms on which we interact are not always consensual.
Being coerced into sharing, engaging and communicating in ways we find uncomfortable can be emotionally devastating. Online, this can involve experiencing unwanted sharing of intimate images and videos, having personal information published without our consent, being shamed or deadnamed.
Safer Internet Day sheds light on the range of approaches people take to hurt others online, often strangers but not always. Research published by Project deSHAME found that 6% of 13-17 year olds across the UK, Denmark and Hungary have had a nude or nearly nude image of themselves shared with others without permission in the last year, while 2 in 5 have witnessed this happening. Alarmingly, 25% reported witnessing young people secretly taking images of others and sharing them online, while 10% admitted having done this themselves in the last year. Non-consensual sharing of images might be done flippantly (perhaps you have seen others doing it for a laugh), but it is one hurtful way in which people are intimidated, disrespected and shamed online.
Girls and women are most at risk; research published by Amnesty International revealed that women in the UK and US receive abuse online every thirty seconds, and that’s just on Twitter. Online abuse, whether doxing (the publication of private information with malicious intent), online stalking, hate speech, harassment or trolling, always involves treating people in ways no one would consent to be treated.
Conversations about consent – particularly in thinking about healthy relationships – have become more open over the past five years. The viral Thames Valley Police Tea and Consent video from 2015 helped clarify the concept’s simplicity. It explains consent with an analogy of (not) forcing others to drink tea and starkly shows the discomfort and inappropriateness of disrespecting people’s boundaries and preferences. Online, this simple principle of respect is regularly violated.
Last year’s #PlaneBae incident saw a woman on a flight secretly photographed and ultimately harassed when a fictional romance thread captured the public’s imagination on social media. The thread shared on Twitter gained more than half a million likes. The woman was doxed and targeted with hate messages, despite repeatedly stating she was not comfortable with the public attention.
This Safer Internet Day, let’s consider a broader understanding of consent: an understanding that covers more than our individual decisions on how much data to share, or whether to accept all the cookies and agree to Terms and Conditions on a particular website. Let’s reflect on what we can do, as online bystanders and digital citizens, to challenge abusive tactics such as doxing and deadnaming.
Understanding different forms of online abuse and how different people are affected is the first step. Let’s challenge wider patterns of misogyny, misogynoir, racial discrimination and anonymous hate that violate consent online. Join the conversation with Glitch’s upcoming TweetChat on Wednesday 13th February at 12:30pm (GMT) to learn more about different forms of online gender-based violence and ways it can be addressed.
We can challenge our friends to think twice before sharing photos of others, be mindful about how we engage with others, and ask whether we would comfortably consent to our friends and family being treated in the way we treat others, or see others treated, online. Consent, while simple in theory, is revolutionary. It provides language to defend dignity, which can seem abstract and distant online.
Written by: Sussie Anie – Creative Agent
Editor: Seyi Akiwowo – Executive Director
On 13th February we will launch the first edition of our brand new Fix The Glitch Toolkit. This toolkit is designed to support individuals who want to help end online abuse but might not know where to begin. We created it not only to raise awareness of online gender-based violence (OGBV) but also to begin addressing the problem by equipping communities with tools to take action. This is an exclusive edition of only 100 copies. Find out more here.
To officially launch the Fix The Glitch Toolkit we will host a lunchtime TweetChat at 12:30pm (GMT) on 13th February on how we can end online gender-based violence. A TweetChat is a Twitter-based conversation around a specific topic, using a designated hashtag (#FixTheGlitch) for each tweet. During the chat, our chair will post questions from the Toolkit, and you are welcome to share your own posts and views on the issue.
Our expert friends joining us for the discussion are:
- Laura Bates, Author and Founder of Everyday Sexism
- Sandra Pepera, Director of Gender, Women and Democracy, National Democratic Institute #NotTheCost
- Soraya Chemaly, Director of Women’s Media Center
- Nighat Dad, Founder of Digital Rights Foundation Pakistan
- Catherine Anderson, Chief Executive, Jo Cox Foundation
- Gina Martin, Writer and Lead Campaigner on the new UK Upskirting Law
- Azmina Dhrodia, Expert on Technology and Gender Based Violence
- Gabby Edlin, CEO and Founder of Bloody Good Period
- Alice Skinner, Feminist Illustrator and Visual Artist
- Chair: Seyi Akiwowo, Founder and Executive Director, Glitch
Partners and expert organisations joining us for the discussion are:
- Amnesty International
- Feminist Internet
- Antisemitism Policy Trust
- The Parliament Project
- Gender Policy and Insights
HOW TO PARTICIPATE IN GLITCH TWEETCHAT
- Mark 13th February, 12:30pm (GMT) in your calendar so you don’t forget to join us
- Use the hashtag #FixTheGlitch so that your tweets are visible to those following the chat
- When answering a specific question or replying to a comment from another tweeter, use the question number (e.g. Q2) and Twitter handles to identify who you’re speaking to
- Remember, this is a good opportunity to learn new perspectives and engage more people in a much-needed debate on how to end OGBV, so interact in a positive, respectful way
We look forward to hearing from you on 13th February!
2018 was a year of growth for us and we are so thankful for your continued support. We’ve shared some of our proudest moments of 2018 and exciting plans for 2019.
Our Proudest Moments of 2018
Our key focus last year was to encourage as many people as possible to take some form of action to help fix the glitch and end online abuse.
Over 100 people pledged to help #fixtheglitch and be an active bystander online.
Over 30 public figures have received our bespoke Digital Resilience Training.
We’ve been working closely with policymakers and Members of Parliament across parties to highlight the intersectionality of online abuse, outline its impacts and harms, and explore effective ways to fix the glitch. Our presentation at the 38th United Nations Human Rights Council on online violence against women was extremely well received.
We were also invited to Number 10 Downing Street thanks to new Glitch partner the Antisemitism Policy Trust.
We are proud of our submissions to two All-Party Parliamentary Groups and our joint response to Government consultation with Centenary Action Group.
We recently developed working relationships with Facebook and Twitter, who have agreed for us to host a listening meeting for British Black women to share their experiences of online abuse in Spring 2019.
Have you noticed our rebrand? Huge thanks to Double Noire; the new look has supported our online and offline campaigns. We’re excited to see more people learning about different forms of online abuse, the scale of the challenge, and effective ways to end it.
We are so proud that our work and organisation were recommended as best practice in a recent European Parliament report on cyberviolence.
Our Hopes for 2019
- Deliver online resources to empower more people to help end online abuse in their local communities.
- Deliver more successful campaigns for change.
- Deliver workshops to 5000 young people.
To achieve this we need your support. Here are five ways you can help:
- Donate: support our Christmas Fundraiser by making a financial donation via PayPal.
- Book: our Digital Resilience training or our Digital Citizenship workshop.
- Volunteer: we’ve tripled in size but are also looking for volunteers with skills and expertise to help us grow. If you have finance, legal, fundraising or marketing skills we would love to hear from you! Please complete this short interest form.
- Partner with us: our mission to fix this current glitch in our online world can only be achieved through collaboration and partnership. If you are an individual or organisation working to make the online world safer we would love to work with you! Please get in contact.
- Spread the word: we want to build an inclusive movement to end online abuse to fix this glitch. Follow us online and share Glitch with your Facebook friends and your Twitter and Instagram followers.
We wish you a wonderful 2019 and we look forward to working with you to end online abuse.
Yesterday, Amnesty International further proved that online abuse is a violation of our human rights with the launch of its latest report on women in politics and journalism. Results from the global crowdsourcing project, named Troll Patrol, support what women, particularly black women, have been reporting for years. The research revealed that women in the US and UK face a staggering level of abuse (an abusive tweet every 30 seconds on Twitter) and that black women are 84% more likely than white women to face abusive or problematic tweets. Organisations like Take Back the Tech, the Women’s Media Center and the National Democratic Institute’s Not the Cost campaign were among the first to address this problem, and we pay tribute to their dedication. Now that we have intersectional data, it’s time we all help fix the glitch and end online abuse.
Last year I experienced a tidal wave of misogynoir, abuse that is both misogynistic and racist, after a speech I made went viral. As a young black woman in politics observing how other women were publicly treated on social media platforms, experiencing abuse first hand was the final straw. This is why I founded Glitch, a not-for-profit organisation that exists to end online abuse. Our workshops centre on digital self-care, self-defence and digital citizenship, and were recently recommended as best practice in a European Parliament report.
According to the report, 1 in 10 tweets mentioning black women is either abusive or problematic, compared with 1 in 15 directed at white women. We strongly agree with Amnesty: social media companies such as Twitter must be more transparent, but they must also engage and support many more diverse activist groups using their platforms. For too long, research, policies and government interventions to address online abuse have focused on children and women as if the two were homogeneous groups. By doing so, we ignore the real drivers behind much of the online abuse and bullying affecting women and girls on a daily basis. Amnesty’s research is a welcome step towards a more nuanced, intersectional critique of online harms.
It’s important to understand that while women experience all different kinds of online abuse, the overall impact has a silencing effect that represents a potent threat to gender equality, human rights and democracy. It causes anxiety and, in very sad situations, has resulted in girls self harming and taking their own lives. Amnesty’s Write for Rights Campaign has inspired thousands of people around the world to write to Jack Dorsey, asking him to take serious action. I’ve also received so many messages from people sharing their own experiences of online abuse and losing loved ones.
So, how do we begin to address online abuse?
The current online/offline dichotomy is unhelpful and hides much of the violence that women and girls face online. Misogynation, the book by Everyday Sexism founder Laura Bates, is a collection of essays on the importance of joining the dots between the different forms of violence that women and girls face. We can already see patterns emerging between domestic violence and terrorist attacks, and individuals who are violent offline also send abusive messages to women online. Even in these cases, we again see social media platforms failing to enforce their own rules.
Having presented at the United Nations Human Rights Council this summer, it was clear to me that online abuse towards women in politics is far too common. On a personal level, I strongly urge all political parties and membership organisations to develop a code of conduct for their members and to be crystal clear about the support they will provide to their candidates and members when standing, campaigning or representing the organisation.
Moving forward, we need to amplify the experiences of women in public life, campaigners, councillors, candidates, activists, founders of charities, YouTubers, artists and bloggers. We also need to amplify the everyday woman who may use #blackgirlmagic, #blackhistory or #metoo and face abuse from those who hijack hashtags and derail them for their own racist, sexist (or both) agendas. We must have more frequent and visible conversations on online abuse and the diversity of experiences women face.
I hope we continue to see responsible data-gathering on women with intersectional identities, like black disabled women and black Muslim women. However, civil society groups cannot combat online abuse alone; tech companies and governments must also be involved. We therefore need new money, resources and training to understand online abuse and to appropriately educate, enforce and empower society against it. We must also see sufficient resources to support diverse media groups such as Black Ballad, Media Diversified and Gal-dem Magazine, which are mentioned in the research, as well as diverse academics, technologists, law enforcement and civil society groups, so they can continue their vital work.
Finally, and arguably most importantly, we need to see more men be effective allies to women online, and we certainly must see more white women be effective allies to black women. Last year, the BNP created a racist and sexist Christmas card, which well-meaning Twitter users then forwarded on to Diane Abbott MP when sharing their outrage. Why did Abbott, who receives almost half of all online abuse directed at female MPs, need to see the offensive card over and over again? Glitch’s digital citizenship workshop, aimed at all online users but especially young people, encourages participants to understand that with digital rights come digital responsibilities, and to adopt an ‘active bystander’ attitude when engaging in activities online. Black women have been talking about their negative experiences online for years, and thanks to Amnesty, we now have the data to prove it. But it’s up to all of us to make lasting change.
First appeared in the Huffington Post
Our Executive Director, Seyi Akiwowo, responds to Amnesty International’s new report revealing that abusive tweets are sent to female politicians and journalists every 30 seconds. The study also found that black and minority ethnic women were a third more likely to be targeted than white women. It’s time to fix the glitch and end online abuse.
Make the most of this season of goodwill by donating to help us continue tackling online abuse. As we count down to Christmas we’ll be asking for your support with some of our most important work. You can get started now with a donation of just £5 via our PayPal.
Glitch celebrates the adoption of new social media guidelines by the CPS. It is a step forward in fixing the glitch and ending online abuse. These guidelines show the UK’s commitment to United Nations treaties and resolutions, adopting real measures to protect human rights in our national legislation. We are pleased to see our recommendations being heard and acted upon.
Glitch would like to know what steps will be taken to effectively implement these guidelines. The police are currently under-resourced and for these guidelines to be effective there must be investment in and transparency around the training and capacity of prosecutors and law enforcement teams.
Civil society groups also need to see how data is being collected, and to see sex- and age-disaggregated data collection and publication strengthened, as recommended in the Charlevoix Commitment. We hope that reports will be publicly shared so that everyone can analyse the impact of these guidelines, and that the CPS will work with civil society groups to ensure the guidelines are effective.
For more information
Last September, Twitter expanded its Hateful Conduct Policy with the introduction of a dehumanisation policy. This new set of rules aims to prohibit “language that treats others as less than human”, denying their human nature or human qualities, and includes an extensive list of the identifiable groups that could be victimised. However, the policy only protects individuals when they have been dehumanised based on “a membership in an identifiable group”. While we welcome Twitter’s effort to seek feedback from global perspectives and to acknowledge the different impacts on cultures and communities around the world, an issue arises when a person’s membership is not clear from the nature of the dehumanising insult, for example when someone directly compares another person to an animal.
Twitter’s consultation was short: just 14 days, which made it inaccessible to many and limited its scope. By not giving people an adequate chance to participate, the purpose of the survey was undermined. While we believe the policy is a step forward in ending online abusive behaviour, it is not enough. Our recommendations are:
- Publish more information on the implementation of Twitter’s current hateful conduct and new dehumanisation policies
- Greater engagement with the community, and communication of results and statistics
- Clarity on the definition of “abusive tweet”
- Transparency in enforcement, e.g. sharing the guidance given to moderators
- Implementation of an accountability mechanism to secure enforcement
- Expansion of the regulation for a more consistent policy and higher protection of individuals
There is still a large policy gap that social media companies must address. We can all help to fix the glitch and therefore we want to see real efforts to tackle online abuse. Read more of our recommendations here: https://seyiakiwowo.com/our-recommendations/.