Privacy experts who spoke to WIRED described Rumble, Quora, and WeChat as unusual suspects but declined to speculate on why they were included in the investigation. Josh Golin, executive director of the nonprofit Fairplay, which advocates for children’s digital safety, says concerns aren’t always obvious. For example, few advocacy groups were concerned about Pinterest until the case of a U.K. teenager who died by self-harm after being exposed to sensitive content on the platform, he says.
In a press release last month, Paxton called his latest investigation “a critical step toward ensuring that social media and artificial intelligence companies comply with our laws designed to protect children from exploitation and harm.”
The U.S. Congress has never passed a comprehensive privacy law and has not significantly updated children’s online safety rules in a quarter of a century. That has left state lawmakers and regulators with a substantial role to play.
Paxton’s investigation focuses on compliance with the Texas Securing Children Online through Parental Empowerment (SCOPE) Act, which came into force in September. It applies to any website or application with social media or chat features that registers users under the age of 18, making it more wide-ranging than federal law, which covers only services directed at users under 13.
The SCOPE Act requires services to ask users’ ages and give parents or guardians control over children’s account settings and user data. Companies are also prohibited from selling information collected about minors without parental consent. In October, Paxton sued TikTok for allegedly violating the law by providing insufficient parental controls and disclosing data without consent. TikTok denied the allegations.
The investigation announced last month also cited the Texas Data Privacy and Security Act (TDPSA), which came into force in July and requires parental consent before processing data relating to users under 13 years of age. Paxton’s office has asked the companies under investigation for detailed information on their compliance with both the SCOPE Act and the TDPSA, according to legal demands obtained through a public records request.
In total, the companies must answer eight questions by next week, including how many minors in Texas they count as users and how they prevent minors from registering with a false date of birth. They must also provide lists of the parties to whom minors’ data is sold or shared. It could not be learned whether any of the companies have already responded to the demands.
Tech company lobbying groups are challenging the constitutionality of the SCOPE Act in court. In August, they secured a preliminary and partial victory when a federal judge in Austin, Texas, ruled that a provision requiring companies to take steps to prevent minors from seeing self-harm and other harmful content was too vague.
However, even a total victory in court may not spare tech companies from scrutiny. States including Maryland and New York are expected to bring similar legislation into force later this year, says Ariel Fox Johnson, an attorney and principal at the consulting firm Digital Smarts Law & Policy. State attorneys general could also pursue narrower cases under their existing laws prohibiting deceptive business practices. “We have observed that information is often shared, sold or disclosed in ways that families did not expect or understand,” Johnson says. “As more and more laws are passed that set stringent requirements, it becomes increasingly clear that not everyone is complying with them.”