TikTok May Be Fined For Failing to Protect Children's Privacy


A recent report claims that TikTok may be fined for failing to protect the privacy of children on its platform. According to the report, the app did not enforce its own age limit, in breach of UK data protection law, meaning it violated the General Data Protection Regulation.

TikTok fined for failing to protect children’s privacy

TikTok is facing a £27 million fine after being provisionally found in breach of data protection laws in the United Kingdom. The regulator, the Information Commissioner's Office, said the social network collected and processed sensitive details about children without their consent. It also failed to provide users with adequate information about its data practices.

A notice of intent has been issued by the Information Commissioner's Office (ICO) indicating that the social media platform is in breach of UK data protection law. The ICO says that the app processed the data of children under 13 without the consent of their parents and without legal grounds. Furthermore, the company failed to give users the information they needed about the processing of children's personal data.

Platform violated UK data protection laws

The ICO, the UK's data protection regulator, has issued a notice of intent to TikTok and its UK subsidiary stating that the company failed to properly protect children's privacy. The ICO believes TikTok processed the personal data of children under 13 without consent, and did not provide clear and easily understandable information to users about how their information was processed.

The ICO (Information Commissioner's Office) is investigating TikTok and has issued a notice of intent to impose a fine of £27 million. TikTok has disputed the ICO's provisional findings and says it will respond soon.

Platform did not monitor its self-imposed age limit

A report issued by the UK's Information Commissioner's Office suggests TikTok may be fined up to £27 million for failing to protect children's privacy on its platform. The video app, which is popular among children and teenagers, is the subject of the investigation.

Specifically, the investigation found that TikTok processed data on children under 13 without their consent. The company also failed to provide clear and easy-to-understand information about the data it collected about children.

Platform violated General Data Protection Regulation

The Information Commissioner's Office (ICO) is currently investigating TikTok's practices, including how the social media company collects and uses children's private information. The investigation has provisionally found that TikTok may have processed children's data without their parents' consent, and failed to present information in an easily understandable way. The ICO will now consider the company's representations and take action if it concludes the law has been breached.

According to the ICO, TikTok could face a £27 million fine in the UK over its failure to protect the privacy of children using its service. The company has been accused of processing the data of children under 13 without obtaining consent from their parents. The provisional notice of intent followed the ICO's investigation, in which the regulator found that TikTok failed to provide users with the information they needed about its use of children's personal data. ICO Commissioner John Edwards urged the company to comply with the law.

Platform violated Children’s Online Privacy Protection Act

In the United States, the Children's Online Privacy Protection Act requires online companies to obtain parental consent before collecting and using personal information from children. In one enforcement action, OpenX's Ad Exchange platform, which connects advertisers with publishers looking to serve targeted ads, was found to have collected personal information from children under the age of thirteen without parental consent and used it to target ads in hundreds of child-directed apps. This practice violated COPPA and the FTC Act.

The Children's Online Privacy Protection Act (COPPA) is the United States law protecting the personal information of children under 13. It was enacted in 1998, and the Federal Trade Commission's implementing rule took effect in 2000; the FTC has revised the rule since then. The act requires commercial websites to obtain parental consent before collecting and using the personal information of children.
