33.3: Making the Invisible Side of AI Visible

"The technology we have isn't the technology we're stuck with. It's just the technology we've chosen so far." — Saranyan
The Question That Started Everything
My sons are 13 and 10. They care deeply about global issues—ending war, protecting artists' welfare, safeguarding the environment. They have the kind of idealism that makes me hopeful about the future.
They also trust technology implicitly.
This paradox kept me up at night. How do you help young people who care about justice understand that the technology they use might undermine the very values they hold? How do you reveal the invisible costs without crushing their idealism?
The answer became 33.3—33 interactive experiences designed to make visible what's typically invisible in our digital systems.
What 33.3 Is
Think of it as a radio station for understanding AI and tech. Not through lectures or textbooks, but through guided exploration of real-world cases and documented investigations. It's designed for parents, educators, and critical thinkers who want age-appropriate conversations about technology's hidden impacts.
The platform offers 11 curated learning journeys:
- A 15-minute Quick Start for beginners
- Paths for parents, educators, and students (middle school through college)
- Specialized journeys for designers, engineers, and professionals
- A comprehensive 3-hour complete journey for those ready to go deep
Each journey explores what's typically invisible: psychological manipulation in apps, data commodification, AI bias, labor exploitation, environmental costs, surveillance, and corporate power structures.
Why This Matters Now
Consider these realities:
Labor exploitation: Content moderators earn $2 per hour viewing traumatic material so your social media feed stays "clean." Who knew? Who asked them to do this work? Who decided this was an acceptable cost?
Environmental impact: Training GPT-3 produced 552 tons of CO2. Meta's data centers are projected to use 905 million gallons of water. The cloud isn't weightless—it's heavy with consequences.
Accessibility gaps: Over a billion people with vision loss are excluded from AI systems. Not because it's technically impossible to include them, but because someone chose not to prioritize it.
Psychological manipulation: Social platforms are deliberately engineered for addiction. The infinite scroll, the variable rewards, the fear of missing out—these aren't bugs, they're features designed to keep you scrolling.
AI-enabled fraud: Deepfake fraud stole $25.6 million from Arup. The technology we're told makes us safer creates entirely new vulnerabilities.
These aren't hypothetical. These are documented cases, real people, actual costs that someone is paying while we swipe, click, and scroll.
Making Invisible Systems Visible
As a teacher of AI and Ethics to middle schoolers, I've learned that young people are ready for these conversations. They just need the right entry points.
The best way to engage children, I've found, is to ask questions:
- Who benefits from this system?
- Who pays the hidden costs?
- What alternatives exist?
- What choices are we making, often without knowing we're making them?
Each of the 33 experiences is grounded in documented reality—human research, fact-checking, real-world investigations. The coding assistance came from Claude, but the content comes from the messy, complicated truth of how technology actually works in the world.
Explore for Yourself
33.3 is freely available at 33point3.org. Whether you spend 15 minutes or 3 hours, whether you're a parent, educator, or curious individual—the experiences are designed to meet you where you are.
Start with what sparks your curiosity. Follow the questions that matter to you. Share what you discover with the young people in your life.
Because in a world increasingly shaped by invisible systems, making those systems visible isn't just educational—it's essential.
Join the Discussion
I look forward to hearing your thoughts! Share your perspective, ask questions, or add to the conversation.