By Albert Toth
Technology firms must do more to tackle illegal content on their platforms as Ofcom gains new powers. From Monday, the regulator will start enforcing the Online Safety Act’s illegal content codes, requiring social media companies to find and remove content such as child sexual abuse material.
The illegal content codes relate to material such as child sexual exploitation and abuse, terrorism, hate crimes, content encouraging or assisting suicide, and fraud. New duties on social media firms require them to detect and remove the content, using advanced tools such as automated hash-matching and robust moderation and reporting mechanisms.
Strict new enforcement rules will also give Ofcom the power to impose hefty fines for non-compliance and, in the most serious cases, to ban services.
Technology secretary Peter Kyle said the changes ‘represent a major step forward in creating a safer online world’. He added that ‘for too long’ child abuse material, terrorist content and ...