The UK's landmark Online Safety Act is now in effect, introducing stricter content moderation requirements and holding tech giants such as Meta-owned Facebook and Instagram, Google, TikTok, and other social media platforms accountable. The legislation grants Ofcom, the UK's media regulator, the authority to enforce tougher rules on how these companies handle harmful and illegal content on their platforms.
The Online Safety Act, passed into law last year, is designed to create a safer online environment, especially for children. Ofcom has released its first codes of practice, which set out the actions tech companies must take to tackle illegal activities such as terrorism, hate speech, fraud, and child sexual abuse.
Under this Act, tech companies are required to uphold a 'duty of care' to shield users from harmful content. They have until March 16, 2025, to evaluate the risks associated with illegal content on their platforms and to put in place measures to mitigate those risks.
This involves enhancing content moderation, streamlining reporting processes, and integrating safety features directly into platforms.
Ofcom Chief Executive Melanie Dawes said, “This marks a major advancement in online safety. We will be actively monitoring the industry to ensure adherence to these stringent safety standards.”
Companies found violating the rules could incur hefty fines of up to 10% of their global annual revenue. For repeated or serious infractions, senior managers may face imprisonment, and Ofcom can seek court orders to block access in the UK to services that fail to comply.
The new codes say that reporting and complaint mechanisms must be easy to access, and they require high-risk platforms to deploy hash-matching technology to detect and remove child sexual abuse material.
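For readers curious about the mechanics, the sketch below shows the basic idea of hash matching in Python. It is a simplified illustration, not any platform's actual system: the hash value and function names are hypothetical, and real deployments rely on perceptual hashing (such as Microsoft's PhotoDNA), which survives resizing and re-encoding, rather than the plain cryptographic hash used here.

```python
# Illustrative sketch of hash matching: hash known illegal images once,
# then compare each uploaded image's hash against that list.
# NOTE: real systems use *perceptual* hashes that tolerate re-encoding;
# SHA-256 only matches byte-identical files and is used here purely
# to illustrate the lookup step.
import hashlib

# Hypothetical set of hashes of known abuse material, of the kind
# typically supplied to platforms by child-safety organisations.
known_hashes: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the upload's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# Example: screen an upload before it is published.
if matches_known_material(b"uploaded file contents"):
    print("Block the upload and report it to the relevant authority.")
```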
Meanwhile, Ofcom said further regulations are expected in 2025, including measures to block accounts that share child sexual abuse material and the use of AI to tackle illegal content.
Technology Minister Peter Kyle said, "These codes connect the protections we have in the offline world with those online. Platforms must enhance their efforts, or Ofcom has my full backing to utilize its powers, including imposing fines and blocking websites."