On June 13, 2023, Texas Governor Greg Abbott signed H.B. 18, the Securing Children Online through Parental Empowerment (“SCOPE”) Act, which will require digital service providers to obtain parental consent before entering into an agreement to create an account with a minor younger than 18 years of age.
The SCOPE Act applies to any digital service provider that provides a digital service that: (1) connects users in a manner that allows users to socially interact with other users on the digital service; (2) allows a user to create a public or semi-public profile for purposes of signing into and using the digital service; and (3) allows a user to create or post content that can be viewed by other users of the digital service, including sharing content on a message board, in a chat room, or on a landing page, video channel or main feed that presents to a user content created and posted by other users. A minor is a person younger than 18 years of age who has not had the disabilities of minority removed.
The SCOPE Act contains a number of exemptions, including financial institutions or data subject to the Gramm-Leach-Bliley Act, covered entities or business associates under HIPAA, small businesses as defined by the United States Small Business Administration and institutions of higher education.
Digital Service Providers General Duties
A digital service provider must obtain parental consent before entering into an agreement to create an account with a minor, register the person’s age with the digital service provider, and prevent the person from altering the registered age unless the alteration process involves a commercially reasonable review process. Further, a digital service provider must use a commercially reasonable age verification method to verify that any person seeking to access content on or through the provider’s digital service is 18 years of age or older.
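For illustration only, the ordering these duties impose — consent before a minor’s account is created, an age registered at sign-up, and alteration of that age gated behind a review step — could be modeled as a minimal sketch. All names, fields and the exception type below are illustrative assumptions, not statutory text:

```python
from dataclasses import dataclass

MINOR_AGE_LIMIT = 18  # the Act's threshold: a minor is younger than 18

@dataclass
class Account:
    user_id: str
    registered_age: int

def create_account(user_id: str, age: int, parental_consent: bool) -> Account:
    """Refuse to create a minor's account absent parental consent."""
    if age < MINOR_AGE_LIMIT and not parental_consent:
        raise PermissionError("parental consent required for a minor's account")
    return Account(user_id=user_id, registered_age=age)

def change_registered_age(account: Account, new_age: int,
                          review_approved: bool) -> None:
    """Permit altering a registered age only through a review process."""
    if not review_approved:
        raise PermissionError("age change requires a review process")
    account.registered_age = new_age
```

The sketch captures only the sequencing; what counts as a “commercially reasonable” review process is left to the provider under the statute.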
A digital service provider that enters into an agreement with a minor for access to a digital service must: (1) limit collection of the minor’s personal identifying information to information reasonably necessary to provide the digital service; and (2) limit use of the minor’s personal identifying information to the purpose for which the information was collected. Further, the digital service provider may not (1) allow the minor to make purchases or engage in other financial transactions through the digital service; (2) share, disclose or sell the minor’s personal identifying information; (3) use the digital service to collect the minor’s precise geolocation data; or (4) use the digital service to display targeted advertising to the minor.
Digital Service Provider Duty to Prevent Harm
A digital service provider must, in relation to a minor’s use of a digital service, develop and implement a strategy to prevent the minor’s exposure to harmful material and other content that promotes, glorifies or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography or other sexual exploitation or abuse. This strategy must include, among other requirements: creating and maintaining a comprehensive list of harmful material or other content to block from display to a minor; using filtering technology and other protocols to enforce the blocking of harmful material or content; using hash-sharing technology and other protocols to identify recurring harmful material or other content; and creating and maintaining a database of keywords used for filter evasion (e.g., identifiable misspellings, hashtags or identifiable homoglyphs). The strategy may include engaging a third party to rigorously review the digital service provider’s content filtering technology, participating in industry-specific partnerships to share best practices in preventing access to harmful material or other content, or conducting periodic independent audits to ensure continued compliance with the strategy and the efficacy of the filtering technology and protocols the digital service provider uses.
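To make the filtering and hash-sharing requirements concrete, the following is a minimal sketch of two of the mechanisms the strategy must include: a keyword database that accounts for filter evasion (misspellings and homoglyphs) and content hashing to flag recurring harmful material. The keyword list, homoglyph table and function names are illustrative assumptions, not part of the statute:

```python
import hashlib

# Illustrative homoglyph table: fold common look-alike characters back to
# their ASCII letters so "$elf-h@rm" matches the same blocked keyword.
HOMOGLYPHS = str.maketrans({"@": "a", "0": "o", "1": "l", "3": "e", "$": "s"})

BLOCKED_KEYWORDS = {"selfharm", "grooming"}  # illustrative entries only

def normalize(text: str) -> str:
    """Fold case, undo homoglyphs, and strip separators used for evasion."""
    folded = text.lower().translate(HOMOGLYPHS)
    return folded.replace("-", "").replace(" ", "")

def contains_blocked_keyword(post: str) -> bool:
    """Check a post against the keyword database after normalization."""
    folded = normalize(post)
    return any(kw in folded for kw in BLOCKED_KEYWORDS)

# Hash-sharing sketch: fingerprint known harmful media once; an identical
# re-upload yields the same digest and can be blocked without re-review.
known_harmful_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def is_recurring(content: bytes) -> bool:
    return fingerprint(content) in known_harmful_hashes
```

Production filtering systems use perceptual hashes and far richer normalization than this sketch, but the structure is the same: a normalized keyword database for text and a shared digest set for media.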
Digital Service Provider Duty to Create Parental Tools
A digital service provider must create and provide parental tools that allow a parent to supervise their minor’s use of a digital service. The parental tools must allow a parent to control the minor’s privacy and account settings, alter the duties of the digital service provider with regard to the minor, restrict the minor’s ability to make purchases or engage in financial transactions, and monitor and limit the amount of time the minor spends using the digital service.
Digital Service Provider Duties Regarding Advertising and Marketing
A digital service provider must make a commercially reasonable effort to prevent advertisers on the digital service provider’s digital service from targeting a minor with advertisements that facilitate, promote or offer a product, service or activity that is unlawful for a minor in Texas to use or engage in.
Enforcement
H.B. 18 does not contain a private right of action and will be enforced by the consumer protection division of the Texas attorney general’s office. H.B. 18 also does not permit a court to certify an action under the statute as a class action. If a digital service provider violates H.B. 18, the parent or guardian of a minor affected by the violation may seek a declaratory judgment or an injunction against the digital service provider.