When I had my twins, my instinct was simple: protect them. That hasn’t changed as they’ve grown. Whether it’s keeping them safe in sports, helping them navigate friendships at school, or thinking ahead about the digital world, I’m constantly trying to stay one step ahead so I can guide them as they grow.
Even though my kids are still young, technology is already part of their everyday lives: in classrooms, in games, and in how they connect with friends and family. I know it's only a matter of time before social media becomes part of that mix, and I want to be prepared well before that day comes. As a parent, and as a blogger and content creator who works online, I feel a responsibility to truly understand the digital space my kids are growing up in.
***
Like most parents, I want my children to feel confident using technology and to benefit from all the good it offers: creativity, learning, and connection. But I also know that access without guardrails can create real challenges. Parents everywhere are trying to strike the right balance between encouraging exploration and setting healthy boundaries, often without clear guidance or consistent tools. Too often, families are left to figure it all out on their own.
Right now, digital safety largely depends on individual apps, each with its own rules, settings, and standards. Some take child safety seriously, while others make protections hard to find or easy to bypass. By the time kids reach their teen years, many already know how to move seamlessly between platforms and work around age limits. My kids are almost nine, and they already know how to use my phone with ease. Something as simple as checking a box or entering a different birthdate can unlock content that was never meant for them.
***
That’s why federal legislation like the App Store Accountability Act matters so much to parents like me who are trying to plan ahead. Instead of placing the burden on every individual app, this proposal makes the app store the central place for age verification and parental approval before downloads happen. No matter how attentive we are, parents can’t be everywhere at once. When every app has different standards, things inevitably slip through the cracks.
A national standard would replace today’s fragmented system with a clear, consistent process for families everywhere. Parents in South Carolina should have the same protections as parents across the country. One centralized checkpoint would allow families to manage permissions in one place, instead of chasing settings across dozens of platforms. App stores already control which apps are available. It makes sense for them to also be the place where age verification happens.
This bill isn’t about limiting choice or monitoring every move our kids make. It’s about strengthening parental guidance at key moments. I’m already thinking ahead to my kids’ teenage years, and I want to know that when they’re ready for social media, meaningful protections will already be in place. Built-in guardrails allow parents to focus on open, intentional conversations about digital use, rather than constant, helicopter-style monitoring.
***
Some alternative proposals simply don’t go far enough. Systems that rely on kids to self-report their age or only apply to certain types of apps leave too many gaps. If we’re serious about real solutions, automatic protections and safety settings should be the priority, not a confusing, state-by-state patchwork.
To lawmakers: I ask you to consider what South Carolina parents truly need. We aren't asking for perfection, just clarity and consistency. Parents deserve confidence that every app their child downloads includes meaningful age checks and meets a nationwide standard. The App Store Accountability Act offers a clear, federal solution that reflects today's family dynamics and gives parents practical tools to raise kids in an increasingly digital world.
***
ABOUT THE AUTHOR…
Brianna Steele is a South Carolina-based parent, blogger, and digital content creator.
***