Tag: children

  • YouTube Children’s Privacy Settlement by Google for $30M

    Google Pays $30M to Settle YouTube Children’s Data Lawsuit

    Google has agreed to pay $30 million to settle a class-action lawsuit over the company’s alleged collection of children’s data on YouTube. The plaintiffs claimed that Google violated children’s privacy laws by tracking their viewing history without parental consent.

    Background of the Lawsuit

    The lawsuit, filed several years ago, accused YouTube of collecting data from users under 13 without obtaining verifiable parental consent, a practice the plaintiffs said violated the Children’s Online Privacy Protection Act (COPPA). Moreover, the plaintiffs argued that Google used this data to target advertising to children, thereby generating substantial revenue. Ultimately, the settlement resolves these claims before they could proceed further in court.

    Details of the Settlement

    Under the terms of the settlement, Google will pay $30 million into a fund to compensate affected parties. Additionally, Google has agreed to implement changes to its data collection practices related to children’s content on YouTube. This includes enhancing age-screening mechanisms and increasing parental controls to ensure better compliance with COPPA regulations.

    Google’s Response

    Google maintains that it has already taken significant steps to protect children’s privacy on YouTube. The company emphasizes its commitment to providing a safe online environment for kids and families. Furthermore, Google states that it continually updates its policies and tools to address evolving privacy concerns and comply with applicable laws.

    Implications for YouTube and Content Creators

    This settlement may lead to stricter enforcement of COPPA guidelines on YouTube. As a result, content creators who produce videos aimed at children might face increased scrutiny over data collection and advertising practices. To address these concerns, YouTube has already introduced features like YouTube Kids to provide a safer environment for younger viewers. Going forward, this settlement could prompt further refinements to such platforms.

    Google’s $30 Million YouTube Settlement

    On August 19, 2025, Google agreed to a $30 million settlement in a class-action lawsuit alleging that YouTube violated children’s privacy by collecting personal data without parental consent and using it for targeted ads. The case involves U.S. children under 13 who watched YouTube between July 1, 2013, and April 1, 2020, and potentially covers 35–45 million claimants. Compensation could range from $30 to $60 per claimant if 1–2% file claims. The settlement is pending judicial approval (Reuters).
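
    As a rough sanity check on those payout figures, the arithmetic below divides the fund by the expected number of claims under the reported class-size and claim-rate assumptions. All inputs come from the figures above; the note about deductions is an assumption, since class-action funds typically pay attorney fees and administration costs before distribution.

    ```python
    # Back-of-the-envelope payout estimate from the reported settlement terms.
    fund = 30_000_000                       # settlement fund in USD
    class_sizes = (35_000_000, 45_000_000)  # estimated eligible class members
    claim_rates = (0.01, 0.02)              # 1-2% of the class expected to file

    for size in class_sizes:
        for rate in claim_rates:
            claimants = size * rate
            print(f"class {size:,}, claim rate {rate:.0%}: "
                  f"{claimants:,.0f} claims, ~${fund / claimants:,.2f} gross each")
    ```

    The gross figures come out between roughly $33 and $86 per claim, somewhat above the $30–$60 range cited above, which would be consistent with fees and costs being deducted from the fund before distribution.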

    Genshin Impact Developer’s $20 Million COPPA Settlement

    Earlier in 2025, on January 17, the FTC announced a $20 million settlement with the developer of Genshin Impact for COPPA violations and deceptive marketing. Specifically, the developer collected personal information from children and misled users about in-game purchases and the odds of winning in-game prizes.

    Strengthening the COPPA Framework

    The FTC finalized its first major updates to the COPPA Rule since 2013. Announced on January 16, 2025, the Final Rule imposes new obligations, including:

    • Mandatory separate parental consent before disclosing a child’s personal information to third parties (e.g., for advertising or AI training).
    • Enhanced data retention rules: operators may retain data only as long as necessary for its original purpose (see the sketch after this list).
    • Stricter obligations around notice, safe harbor programs, and data security requirements.
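
    As a loose illustration of how the retention rule might translate into practice, the sketch below flags records held past a purpose-bound window. The `Record` structure, purpose names, and windows are all hypothetical; the Rule sets the principle, not an API.

    ```python
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    # Hypothetical purpose-bound retention windows; a real operator would derive
    # these from its documented data practices, not hard-coded guesses.
    RETENTION_WINDOWS = {
        "account_service": timedelta(days=365),
        "support_ticket": timedelta(days=90),
    }

    @dataclass
    class Record:
        child_id: str
        purpose: str          # the original purpose the data was collected for
        collected_at: datetime

    def overdue_for_deletion(record: Record, now: datetime | None = None) -> bool:
        """True if the record has outlived the window for its original purpose."""
        now = now or datetime.now(timezone.utc)
        window = RETENTION_WINDOWS.get(record.purpose)
        if window is None:
            # Unknown purpose: flag for review rather than retain indefinitely.
            return True
        return now - record.collected_at > window
    ```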

    State-Level Enhancements

    • Virginia now requires parental consent for processing known children’s personal data and mandates data protection assessments.
    • Colorado similarly updated its privacy law to better safeguard youth.
  • Texas AG: Meta, Character.AI Mislead Kids on Mental Health

    Texas AG Accuses Meta & Character.AI of Misleading Kids

    Texas Attorney General (AG) Ken Paxton is taking action against Meta and Character.AI, accusing them of deceiving children about the mental health impacts of their platforms. Paxton asserts that these companies are engaging in practices that are detrimental to young users.

    Meta’s Alleged Deceptive Practices

    The Texas AG’s office claims that Meta, the parent company of Facebook and Instagram, is misleading children about the addictive nature and potential harms of its social media platforms. The lawsuit alleges that Meta fails to adequately protect young users from harmful content and is not transparent about the negative effects of prolonged social media use on mental health.

    • Failure to Protect: Allegations include not doing enough to shield children from cyberbullying and inappropriate content.
    • Addictive Design: Claims that Meta intentionally designs its platforms to be addictive, keeping young users engaged for extended periods.
    • Lack of Transparency: Criticism over the lack of clear information about the potential mental health risks associated with social media use.

    Character.AI’s Role in the Controversy

    Character.AI, an AI-powered chatbot platform, is also under scrutiny. The Texas AG alleges that Character.AI provides responses that can be harmful to children, especially those struggling with mental health issues. The platform allows users to create and interact with AI characters, and concerns have been raised about the quality and safety of the advice and interactions these characters provide.

    Legal Actions and Potential Consequences

    The lawsuits against Meta and Character.AI represent a growing trend of holding tech companies accountable for the impact of their products on children’s mental health. If successful, these legal actions could result in significant penalties and require the companies to implement stricter safeguards for young users. These safeguards could include:

    • Age Verification: Implementing robust age verification systems to prevent underage users from accessing the platforms (see the sketch after this list).
    • Content Moderation: Improving content moderation policies to remove harmful and inappropriate content more effectively.
    • Mental Health Resources: Providing access to mental health resources and support for users who may be struggling.
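
    As a minimal illustration of the first safeguard, the sketch below computes a user’s age at signup and gates the under-13 case behind parental consent, as COPPA requires. The function names and flow are hypothetical; production systems typically combine self-declared birthdates with stronger verification signals.

    ```python
    from datetime import date

    COPPA_AGE_THRESHOLD = 13  # US parental-consent threshold under COPPA

    def age_on(birth_date: date, today: date) -> int:
        """Age in whole years as of `today`."""
        had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
        return today.year - birth_date.year - (0 if had_birthday else 1)

    def requires_parental_consent(birth_date: date, today: date | None = None) -> bool:
        """Signup gate: users under 13 need verifiable parental consent."""
        return age_on(birth_date, today or date.today()) < COPPA_AGE_THRESHOLD
    ```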

    Protecting Children Online

    Parents and caregivers play a critical role in protecting children online. Some steps to consider include:

    • Open Communication: Talk to children about the risks and benefits of using social media and online platforms.
    • Set Boundaries: Establish clear rules and boundaries for screen time and online activities.
    • Monitor Activity: Keep an eye on children’s online activity and be aware of the content they are accessing.
  • Kids Online Safety Act Protecting Children

    The Kids Online Safety Act: Protecting Children Online

    The Kids Online Safety Act (KOSA) is a bipartisan U.S. legislative proposal aimed at enhancing the safety and well-being of minors online. Reintroduced in the Senate by Senators Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), the bill seeks to hold online platforms accountable for protecting young users from harmful content and interactions.


    🛡️ Key Provisions of KOSA

    1. Duty of Care
    KOSA establishes a legal obligation for online platforms to take reasonable steps to prevent and mitigate harms to minors. This includes addressing issues such as cyberbullying, sexual exploitation, substance abuse, self-harm, and exposure to content promoting eating disorders.

    2. Enhanced Privacy and Safety Settings
    The act requires platforms to implement the highest privacy settings by default for users identified as minors. This includes restricting public access to personal data, limiting communication from unknown users, and disabling features that encourage prolonged use, such as autoplay videos.
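
    A minimal sketch of what “highest privacy settings by default” for minors could look like in practice; the `AccountSettings` fields are hypothetical, since KOSA specifies outcomes rather than an API.

    ```python
    from dataclasses import dataclass

    @dataclass
    class AccountSettings:
        profile_public: bool
        messages_from_unknown_users: bool
        autoplay_enabled: bool
        personalized_ads: bool

    def default_settings(is_minor: bool) -> AccountSettings:
        """Minors start locked down; adults get the platform's usual defaults."""
        if is_minor:
            return AccountSettings(
                profile_public=False,               # restrict public access to personal data
                messages_from_unknown_users=False,  # limit contact from unknown users
                autoplay_enabled=False,             # disable engagement-prolonging features
                personalized_ads=False,
            )
        return AccountSettings(True, True, True, True)
    ```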

    3. Parental Control Tools
    KOSA mandates that platforms provide parents with tools to manage their children’s online experience. These tools include the ability to adjust privacy settings, monitor usage, restrict purchases, and access reporting mechanisms for harmful content.

    4. Transparency and Accountability
    The legislation requires platforms to undergo independent audits and publish annual reports detailing foreseeable risks to minors and the measures taken to address them.

    5. Establishment of the Kids Online Safety Council
    KOSA proposes the creation of a council comprising parents, platform representatives, and federal agencies to advise on best practices and ongoing improvements for online child safety.


    ⚖️ Enforcement Mechanism

    Enforcement of KOSA’s provisions would primarily fall under the jurisdiction of the Federal Trade Commission (FTC), which would oversee compliance and address violations. State attorneys general would also have the authority to enforce certain aspects of the law.


    📣 Support and Criticism

    Supporters:
    KOSA has garnered bipartisan support in the Senate and endorsements from various organizations, including the American Academy of Pediatrics, Common Sense Media, and tech companies like Apple and Snap.

    Critics:
    Civil liberties groups, such as the ACLU and the Electronic Frontier Foundation, have raised concerns that the bill could lead to over-censorship and negatively impact marginalized communities, particularly LGBTQ+ youth. They argue that platforms might suppress content related to gender identity and sexuality to avoid potential legal repercussions.


    📅 Legislative Status

    Despite passing the Senate with overwhelming support, KOSA stalled in the House of Representatives due to concerns over free speech and potential censorship. Senator Richard Blumenthal has expressed intentions to reintroduce the bill in the current congressional session, aiming to address previous objections and advance the legislation (The Guardian).


    What is the Kids Online Safety Act?

    KOSA focuses on holding social media platforms accountable for protecting children from harmful content. It requires platforms to prioritize the safety and well-being of their young users. The main goals include:

    • Preventing exposure to content that promotes suicide, eating disorders, and substance abuse.
    • Reducing online bullying and harassment.
    • Giving parents more tools to monitor their children’s online activity.

    Key Provisions of the Act

    KOSA includes several key provisions aimed at strengthening online protections for kids:

    • Duty of Care: Platforms have a legal obligation to act in the best interests of their young users.
    • Safety by Design: Platforms must design their services with safety features and protections built in from the start.
    • Transparency: Requires increased transparency from platforms about their safety policies and practices.
    • Parental Controls: Enhances parental controls to allow parents to manage their children’s online experiences more effectively.

    Potential Impacts on the Internet

    KOSA could significantly change how social media platforms operate, and those changes would ripple across the broader internet ecosystem.

    Platform Accountability

    KOSA seeks to shift the responsibility for child safety onto the platforms themselves. Companies may need to invest more resources in content moderation and safety measures.

    Content Moderation

    Expect increased content moderation and stricter enforcement of community guidelines, with platforms removing harmful content and accounts more actively.

    User Experience

    Users, especially younger ones, may experience changes in how they interact with social media. These include new safety features and restrictions on certain types of content.

    Free Speech Concerns

    Some critics worry about the potential for censorship and restrictions on free speech. Lawmakers are attempting to balance safety with freedom of expression.