State lawmakers across the country, including in Florida, are trying to tackle an issue that hits close to home for many of us: protecting children online. As a mother of two and an expert who has spent over a decade in big data and analytics, I am deeply invested in keeping harmful content away from children. But I also know we can't protect our children by putting their personal information at greater risk.
Some states, like Utah, are pushing heavy-handed laws that aim to solve one problem by creating another. The recently proposed App Store Accountability Act, for example, would require app stores to verify the age of every user and to centrally store that sensitive data. Not only does this raise large-scale privacy concerns, it also places an unreasonable burden on platforms like the App Store and Google Play.
Let’s be clear: this is not internet safety. It’s a data privacy disaster waiting to happen.
The irony stings. The same lawmakers who want to protect children from online manipulation are proposing a system that collects more data on minors. As someone who understands the data that underpins artificial intelligence and digital ecosystems, I know that gathering detailed personal information about children in order to “protect” them is a slippery slope. That data can be used to sell them products, track them across platforms or worse.
And these laws don’t even reach the platforms where many kids actually spend their time: TikTok, Snapchat, Discord, Steam, Roblox. These apps operate with little oversight and constantly evolve in how they deliver content and connect users. Expecting the app store alone to police this landscape is like blaming the grocery store for the sugar content of every product on its shelves. They sell candy, so should Publix be accountable for my child’s tooth decay? Of course not.
There’s a smarter way. Companies like Google are proposing a more balanced framework built on collaboration between app stores and the developers behind the apps. Under these proposals, the app store would share encrypted age signals only when needed, minimizing data collection while still allowing for age-appropriate experiences. This lets developers build safer in-app environments and gives parents more control, without turning every user into a data point in a centralized registry.
We need policy that works the way the internet actually works. A one-size-fits-all mandate like Utah’s is not just unrealistic, it is dangerous. It strips away parental rights, increases surveillance and gives users little say in how their data is used.
We all want the same thing: a safer internet for kids. But the solution is not blanket age verification or centralized data storage. It is targeted safety measures, smarter regulation and a real commitment to protecting both our children’s minds and their personal information.
We need smart policy that reflects how the digital world actually works. That means protecting children from harmful content without creating new privacy risks. It means giving parents better tools and holding platforms accountable for the experiences they design.
As lawmakers consider how to shape the future of online safety for children, I hope they choose balance over blunt force. Let’s build something that actually works for our kids, our families and our future.
Carolyn Eagen is a mother of two, founder of Kinstak AI and a senior technology leader with experience in data analytics and digital strategy. She lives in the Tampa Bay area and advocates for responsible technology policy that protects both children and data privacy.