Hate Crime

WHAT IS A HATE CRIME?

A hate crime is any criminal act committed against a person because of hostility or prejudice towards that person's disability, ethnicity or race, religion or belief, sexual orientation, or transgender identity. Hate crime has a severe effect on victims, and it is intolerable that the present levels of protection are so uneven. We need a duty of care imposed on social media platforms.

The Law Commission's proposals would increase safeguards for victims online while also ensuring that the right to freedom of speech is preserved. The Law Commission has published its recommendations for reforming hate crime laws so that disabled and LGBT+ victims receive the same level of protection as victims targeted over other protected characteristics such as race and religion.

Additionally, the Law Commission has recommended that more be done to safeguard women and girls by expanding the offence of stirring up hatred to cover sex or gender, in an attempt to criminalise extreme sexist hate propaganda. It has also made a number of proposals to defend freedom of expression and to ensure that criminal legislation remains focused on hate speech.

THE LAW AND ONLINE HATE CRIME

Historically, there has been little regulation of online services, owing in large part to a belief, dating from the internet's earliest days, that it should remain unfiltered. Individuals and the wider public have become increasingly aware of the potential harms of online services and their content, and public concern has grown accordingly. As a result, even large social media platforms such as Facebook have called for further regulation. Increased scrutiny, together with the financial and commercial consequences of non-compliance, has pushed online services to invest in technology and in terms of use, acceptable use policies, and privacy policies that reflect public and political opinion.

There is no simple solution for regulating internet content. Numerous agencies, however, are responsible for supervising specific online activities, for example Ofcom, the Competition and Markets Authority (CMA), the Advertising Standards Authority (ASA), the Information Commissioner's Office (ICO), and the Financial Conduct Authority (FCA). The law applies to online activity in the same way as it does to acts committed offline, and criminal offences can be prosecuted by the Crown Prosecution Service (CPS) in the same way (see below). Some organisations that operate online, such as broadcasters, video-sharing platforms and financial services companies, are subject to additional regulation.

Under the EU E-Commerce Directive 2000/31/EC, operators of social networking platforms established in an EEA state are exempt from liability for unlawful material they host provided they play a 'neutral, merely technical and passive role' and delete or restrict access to the content upon becoming aware of it. In the United Kingdom, the Electronic Commerce (EC Directive) Regulations 2002 (E-Commerce Regulations 2002), SI 2002/2013, implement the EU E-Commerce Directive and are preserved, subject to amendment by the Electronic Commerce (Amendment etc) (EU Exit) Regulations 2019, SI 2019/87.

The E-Commerce Regulations 2002 do not exempt hosts from liability for hosted material outright; rather, they provide defences to claims for damages and to criminal prosecution. Social media platforms and search engines self-regulate through their terms and conditions of use, policies, certification requirements in certain limited circumstances, fact checking, Rights Manager software (a tool available to Facebook users to help them identify copyright infringements), and 'strike systems' (such as YouTube's strike system for policy violations, which can result in the temporary or permanent removal of a YouTube channel). Many of the larger organisations have also signed up to a variety of codes, industry alliances, and other initiatives.
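To make the 'strike system' idea concrete, the sketch below models a simplified three-strike policy in Python. It is purely illustrative: the 90-day expiry window, the thresholds and the sanction labels are assumptions for the example, not YouTube's actual rules.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List

# Hypothetical parameters -- real platforms publish their own thresholds.
STRIKE_EXPIRY = timedelta(days=90)   # assumption: a strike lapses after 90 days
SUSPENSION_THRESHOLD = 2             # assumption: second active strike => suspension
TERMINATION_THRESHOLD = 3            # assumption: third active strike => removal

@dataclass
class Channel:
    name: str
    strikes: List[datetime] = field(default_factory=list)

    def active_strikes(self, now: datetime) -> int:
        """Count only strikes that have not yet expired."""
        return sum(1 for s in self.strikes if now - s < STRIKE_EXPIRY)

    def record_violation(self, now: datetime) -> str:
        """Record a policy violation and return the resulting sanction."""
        self.strikes.append(now)
        count = self.active_strikes(now)
        if count >= TERMINATION_THRESHOLD:
            return "channel permanently removed"
        if count >= SUSPENSION_THRESHOLD:
            return "channel temporarily suspended"
        return "warning issued"

# Usage: three violations in quick succession escalate from a warning to removal.
channel = Channel("example-channel")
for _ in range(3):
    print(channel.record_violation(datetime.utcnow()))
```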

Under section 103 of the Digital Economy Act 2017 (DEA 2017), the UK government has produced a Code of Practice for providers of online social media platforms (the Social Media Code of Practice). It is guidance asking social media providers to maintain systems for reporting harmful content and to be transparent about the action taken in response to it. The code's principles mirror many of the larger platforms' pre-existing standards, and are likely to be most useful in offering direction to smaller platforms or new entrants to the market. Examples of good practice set out in the code include the following (a simplified sketch of such a reporting workflow follows the list):

  • allowing non-users to report harmful content;

  • revising reporting procedures in response to user feedback;

  • promptly acknowledging and assessing complaints against the platform's terms and conditions;

  • informing users of the penalties of violating terms and conditions; and

  • explaining why material was/was not deleted. 
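As a rough illustration of those principles, the Python sketch below models a minimal report-handling workflow: a report can come from a non-user, it is assessed against the terms and conditions, and the reporter is told why the material was or was not removed. The class names, fields and messages are assumptions made for the example, not the code's wording or any platform's real policy.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Tuple

# Hypothetical illustration of the principles above; the decision rule
# and messages are assumptions, not any platform's real policy.

class Outcome(Enum):
    REMOVED = auto()
    RETAINED = auto()

@dataclass
class Report:
    content_id: str
    reason: str
    reporter_is_user: bool  # non-users may also report harmful content

def assess_report(report: Report, breaches_terms: bool) -> Tuple[Outcome, str]:
    """Assess a complaint against the terms and conditions and return
    both the decision and an explanation to send back to the reporter."""
    if breaches_terms:
        return Outcome.REMOVED, (
            f"Content {report.content_id} was removed because it breached "
            f"our terms (reported for: {report.reason})."
        )
    return Outcome.RETAINED, (
        f"Content {report.content_id} was not removed because it did not "
        f"breach our terms (reported for: {report.reason})."
    )

# Usage: a report from a non-user is accepted, and the reporter is told
# why the material was or was not deleted.
outcome, explanation = assess_report(
    Report(content_id="abc123", reason="harassment", reporter_is_user=False),
    breaches_terms=True,
)
print(outcome.name, "-", explanation)
```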

The government's full response to the consultation on its April 2019 Online Harms White Paper lays out the basic principles of a legislative framework for promoting online safety. The White Paper set out the government's intention to make the UK the safest place in the world to go online, in part by imposing a duty of care on businesses to improve their users' online safety. The Online Safety Bill will provide the basis for the regulatory framework, which will be overseen and enforced by Ofcom.

Additionally, the EU has adopted a revised version of the Audiovisual Media Services Directive, Directive 2010/13/EU (EU AVMS Directive), which now also governs video-sharing services such as YouTube, TikTok, and Vimeo. Directive (EU) 2018/1808, the Revised EU AVMS Directive, broadens the scope of the original EU AVMS Directive beyond broadcast and on-demand services to video-sharing platforms whose 'essential functionality' is to provide programmes and/or user-generated content to the public in order to inform, entertain or educate.

In the UK, the Revised EU AVMS Directive is implemented by the Audiovisual Media Services Regulations 2020 (AVMS Regulations 2020), SI 2020/1062, which came into effect in phases from 1 November 2020 and in full on 6 April 2021. Part 4B of the Communications Act 2003 (CA 2003) establishes the legislative framework, which requires video-sharing platform providers to take 'necessary steps' to achieve the following:

  1. the protection of minors from content that can impair their physical, mental or moral development;

  2. the protection of the general public from content and advertising that incites violence or hatred against people on the basis of protected characteristics;

  3. the protection of the general public from content and advertising the circulation of which is a criminal offence under EU law (e.g. terrorist content).

Schedule 15A to the CA 2003 lists the measures that providers must consider. Appropriate measures may include establishing and enforcing specific terms and conditions of service for users; establishing and operating mechanisms for flagging and reporting, age verification, content rating, and easy-to-access complaints; providing parental control systems; providing media literacy measures and tools; and providing an appropriate redress mechanism for disputes with users.

Providers of video-sharing platforms may not need to implement all of these measures to comply, but they must consider the nature of the content, the potential harm, and the characteristics of the individuals to be protected, as well as their rights and legitimate interests (including the rights of the video-sharing platform provider and of the person uploading the material, and the general public interest, e.g. privacy rights and the right to freedom of expression). They must also consider whether the proposed measures are practicable and proportionate. Ofcom will initially regulate video-sharing platforms, and providers will be required to notify Ofcom that they are offering such a service. Ofcom will have enforcement powers, including the ability to impose substantial penalties.
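By way of illustration, the Python sketch below shows how a provider might record a simple self-assessment of this kind, weighing each candidate measure against the factors just described (the harm addressed, the users to be protected, and proportionality). The measure names, fields and scoring scale are assumptions made for the example, not the statutory wording or Ofcom's methodology.

```python
from dataclasses import dataclass
from typing import List

# Illustrative only: the measure names paraphrase the kinds of measures
# listed in Schedule 15A, and the 1-5 scoring is an assumed device for
# weighing benefit against burden, not a legal test.

@dataclass
class MeasureAssessment:
    measure: str         # e.g. "age verification", "content rating"
    risk_addressed: str  # nature of the content / potential harm
    user_group: str      # characteristics of the individuals to be protected
    burden: int          # assumed 1-5 scale: impact on provider, uploaders, public
    benefit: int         # assumed 1-5 scale: level of protection achieved

    def proportionate(self) -> bool:
        """Crude proportionality check: adopt where the benefit is not outweighed by the burden."""
        return self.benefit >= self.burden

def compliance_summary(assessments: List[MeasureAssessment]) -> None:
    for a in assessments:
        verdict = "adopt" if a.proportionate() else "consider alternative measures"
        print(f"{a.measure}: protects {a.user_group} from {a.risk_addressed} -> {verdict}")

# Usage with two illustrative measures.
compliance_summary([
    MeasureAssessment("age verification", "material impairing development", "under-18s", 4, 5),
    MeasureAssessment("content rating", "content inciting hatred", "general public", 2, 4),
])
```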

In March 2021, Ofcom published guidance on the self-assessment criteria that businesses should use to determine whether they meet the new rules' legal definition of a video-sharing platform and fall within UK jurisdiction. Between 6 April and 6 May 2021, video-sharing platforms that met the criteria were required to notify Ofcom. After 6 May 2021, any new video-sharing platform in the UK must notify Ofcom at least ten days before launch.

In October 2021, Ofcom published further guidance for video-sharing platform providers on the measures required to protect users from defined kinds of harmful content, and on how those measures should be implemented.

In May 2021, the government published its draft Online Safety Bill, confirming its intention to repeal Part 4B of the CA 2003 and to bring the regulation of UK-established video-sharing platforms within the online harms regulatory framework.

In the European Union, the proposed Regulation on a Single Market for Digital Services (the Digital Services Act) seeks to strengthen and harmonise the obligations of online platforms and information service providers, as well as to strengthen oversight of platforms' content policies. It applies to all intermediary service providers offering services in the EU, regardless of where they are established or resident (as determined by the 'substantial link to the EU' criterion).

Final Remarks

Thanks for reading this article. Send me an email if you need help with law school dissertations, time management, course selection, or anything else, and I'll reply as quickly as I can. I'm also a law tutor offering lessons by video call, and I can help with subject content and exam technique. Just send me a WhatsApp message using the link below.