The UK’s landmark Online Safety Bill has been introduced to Parliament today.
The legislation was first drafted in May last year and contains measures to tackle a range of digital harms, including child sexual abuse, terrorist material, fraud and online abuse.
New obligations will be placed on social media firms and other services hosting user-generated content to prevent and remove harmful content on their platforms. These rules will be enforced by the UK’s communications regulator, Ofcom, which will have the power to issue fines of up to £18m or 10% of annual global turnover, whichever is higher, on firms that fail in their duty of care.
The UK government also announced today that executives whose companies fail to cooperate with Ofcom’s information requests could face prosecution and even a prison sentence within two months of the bill passing into law, rather than after two years as set out in the original draft. In addition, new offenses have been added to the bill making senior managers of such firms criminally liable for destroying evidence, failing to attend interviews with Ofcom or providing false information in them, and obstructing the regulator when it enters company offices.
The measures come amid rising internet usage in the UK, accelerated by the COVID-19 pandemic, which has led to an increase in online harms, including grooming and abuse.
Numerous new provisions have been added to the draft legislation in the past few months. These include new offenses relating to abusive and offensive online communications, a requirement for social media firms to create tools giving UK users greater control over what they see and who can interact with them on their platforms, new duties to tackle fraudulent adverts and a new criminal offense of cyber flashing.
The government argued that the law would strengthen people’s ability to express themselves freely online by ensuring social media firms are not removing legal free speech and are protecting journalism and democratic political debate on their platforms.
Digital Secretary Nadine Dorries said: “The internet has transformed our lives for the better. It’s connected us and empowered us. But on the other side, tech firms haven’t been held to account when harm, abuse and criminal behavior have run riot on their platforms. Instead, they have been left to mark their own homework.
“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving. Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age. If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.
“Since taking on the job I have listened to people in politics, wider society and industry and strengthened the Bill so that we can achieve our central aim: to make the UK the safest place to go online.”
However, Jake Moore, global cybersecurity advisor at ESET, expressed concerns about the lack of guidance offered to social media companies on how to comply with their new duties. “Although this is a great start and more safety measures to protect all users on the internet are desperately needed, many parts of the Online Safety Bill have the potential to fail from the outset without further guidance or directive. There is a lack of solutions on offer, but there remains a level of responsibility from companies in collaboration with the government,” he commented.
“Social media platforms and corresponding technology firms want to make their platforms a safe environment for all users, but threatening them with sole accountability without extra support highlights a lack of understanding of how the internet works.”