Sites must assess content for sexual abuse and suicide risk or face fines of up to £17m.
Technology companies will be required to assess their sites for sexual abuse risks, prevent self-harm and pro-suicide content, and block children from broadcasting their location, after the publication of new rules for “age-appropriate design” in the sector.
The UK Information Commissioner’s Office, which was tasked with creating regulations to protect children online, will enforce the new rules from autumn 2021, after a one-year transition period. After that, companies that break the rules can face sanctions comparable to those under the GDPR, including fines of up to £17m or 4% of global turnover.
Companies that make services likely to be accessed by a child will have to take account of 15 principles designed to ensure their services do not cause harm by default.
https://www.theguardian.com/technology/2020/jan/22/tech-firms-fail-protect-children-sexual-abuse-suicide-safety-privacy