SPD proposes social media ban for children under 14 with mandatory age verification

(de-news.net) – The Social Democratic Party of Germany (SPD) has proposed draft legislation that would bar children under the age of 14 from social media platforms and require mandatory age verification through a dedicated application-based system. The initiative, set out in a formal party policy paper, envisages a regulatory framework built around three distinct age categories. Operators would also be required to create and maintain youth-adapted platform environments for users under 16, specifically designed to exclude algorithmically driven content feeds, personalized recommendations, and other forms of automated behavioral targeting. Design features commonly used to maximize user engagement, including infinite scrolling, automatic playback of content, push notifications, and gamification elements, would likewise be prohibited for this age group, reflecting the party's assessment that unrestricted exposure poses disproportionate developmental and psychological risks.

For users between the ages of 14 and 16, access would be limited to a compulsory youth-specific platform version designed to restrict potentially harmful engagement features and reduce exposure to manipulative or addictive content dynamics. For all users aged 16 and above—including adults—algorithmic recommendation systems would be deactivated by default, thereby shifting control over algorithmically curated content from platform operators to individual users. Those wishing to receive personalized or algorithmically generated recommendations would be required to actively enable such functions, reinforcing the principle of informed and deliberate consent within digital environments.

Vice Chancellor and Finance Minister Lars Klingbeil expressed clear support for the SPD parliamentary group’s initiative, arguing that the structural risks associated with largely unregulated social media ecosystems had made explicit legal intervention increasingly necessary. He indicated that legislative action should prioritize shielding children and adolescents from exposure to harmful content, including hate speech, violent material, and psychologically manipulative platform mechanisms. Klingbeil further maintained that the prevailing economic logic of major digital platforms was centered on maximizing user engagement, often through mechanisms that amplify emotional intensity, polarization, and prolonged screen time, while simultaneously facilitating extensive data collection. In his assessment, this business model had frequently resulted in insufficient safeguards for younger users, whose vulnerability to such design features was particularly pronounced. Consequently, he emphasized that companies operating within the European market bore a corresponding responsibility to ensure user safety and to align platform design with broader societal interests, rather than prioritizing commercial incentives alone. He framed the proposed regulatory intervention as a necessary corrective measure intended to rebalance the relationship between platform operators and their users.

Hubig backs binding protections as Data Commissioner warns on privacy

Justice Minister Stefanie Hubig, who co-signed the SPD position paper, acknowledged that social media had become an integral component of everyday social interaction and identity formation among younger generations. However, she emphasized that the developmental and psychological risks associated with sustained and unregulated exposure necessitated more clearly defined and robust legal protections. Hubig explained that the objective of the proposed framework was not to impose an indiscriminate or absolute prohibition, but rather to establish a differentiated regulatory structure capable of reconciling continued digital participation with age-appropriate safeguards. She further argued that many digital platforms deliberately incorporated behavioral reinforcement mechanisms and persuasive design features intended to prolong engagement and foster habitual usage patterns, thereby increasing users’ dependency on platform interaction. According to her assessment, growing numbers of children and adolescents had themselves expressed concern regarding the psychological pressures associated with constant connectivity, including social comparison, performance anxiety, and exposure to harmful content. Hubig therefore maintained that the establishment of binding regulatory standards was essential not only for protecting minors from specific harms such as cyberbullying and unrealistic social expectations, but also for strengthening long-term public trust in digital infrastructures and ensuring that technological environments remained compatible with broader social and developmental needs.

At the same time, Federal Data Protection Commissioner Louisa Specht-Riemenschneider expressed reservations regarding the adoption of a blanket prohibition, cautioning that such measures could inadvertently impose disproportionate compliance burdens on smaller platforms and services specifically designed for younger audiences. A spokesperson for her office indicated that decisions concerning age-based access restrictions must be understood within a broader policy context encompassing child protection, media governance, and digital regulatory strategy. From a data protection perspective, the decisive issue was not solely the establishment of age thresholds, but the manner in which age verification mechanisms were implemented, particularly with regard to preserving user privacy and preventing excessive data collection. The commissioner’s office therefore emphasized that any verification system must adhere strictly to the principles of proportionality and data minimization, ensuring that only the minimum necessary information was processed. As a potential technical solution, the authority referenced zero-knowledge verification technologies under discussion in connection with the planned European Digital Identity Wallet. Such systems would enable platforms to confirm that a user meets a specified age requirement without transmitting sensitive personal information, such as full birthdates or identity credentials, thereby reconciling regulatory enforcement with fundamental privacy protections.
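The verification model referenced above can be illustrated with a simplified sketch: a trusted issuer (such as a government wallet app) derives only the boolean "meets the age threshold" attribute from the birthdate and attests to it, so the platform verifying the token never receives the birthdate itself. This is a minimal selective-disclosure illustration, not an actual zero-knowledge proof; all function names are hypothetical, and the HMAC with a shared key is a stand-in for the public-key signatures (or ZK credential schemes) a real system like the European Digital Identity Wallet would use.

```python
import hmac
import hashlib
import json
from datetime import date

# Stand-in for the issuer's signing key; a real deployment would use
# public-key signatures so verifiers cannot forge tokens.
ISSUER_KEY = b"demo-issuer-key"

def issue_age_token(birthdate: date, threshold: int, today: date) -> dict:
    """Wallet side: derive ONLY the boolean attribute and attest to it.

    The birthdate never leaves this function; the token carries just
    the threshold and a yes/no result.
    """
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    claim = {"age_over": threshold, "result": age >= threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_age_token(token: dict, threshold: int) -> bool:
    """Platform side: check the attestation; the birthdate is never seen."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # tampered or forged token
    return token["claim"]["age_over"] == threshold and token["claim"]["result"]

# Usage: a 15-year-old passes the over-14 check without disclosing a birthdate.
token = issue_age_token(date(2010, 5, 1), 14, date(2025, 6, 1))
print(verify_age_token(token, 14))  # True
```

The design point the commissioner's office raises maps onto the interface here: the platform's `verify_age_token` input contains no personal data beyond the single boolean, which is what the principles of proportionality and data minimization require.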
