I used to love the idea of the internet as an “equalizer.”
A place where anyone, anywhere, could learn anything, share their story, build community, or create something that mattered.

But the more time I spend online, the more I realize that dream was never fully true.

Because it was never built that way.


đŸ‘ïž Who Benefits from Being Watched?

You know that feeling—when you’re typing something into a search bar and pause, not because you don’t know what to say, but because you’re wondering who’s watching?

It’s subtle. And it’s everywhere.

Smartphones track our steps. Browsers track our behavior. Educational platforms track how long we spend on a quiz, whether our eyes leave the screen during a test, whether we copy-paste something too quickly. Not because they care—but because they can.

We’re told it’s for “better insights” or “security.”
But it rarely feels like safety—it feels like surveillance.

Surveillance isn’t neutral. It tends to fall harder on certain bodies: Black and Indigenous students, students with disabilities, those who need more time or ask more questions.

What does it mean when our digital spaces are designed to observe rather than to trust?


♿ The Myth of One-Size-Fits-All

Here’s a truth I’ve learned slowly: accessibility is not a feature. It’s a value.

Most websites and tools treat accessibility like an afterthought. An optional upgrade. A patch.

But what if we flipped the question?

What if we designed from the margins inward, instead of from the center out?

People like Alice Wong have been reminding us for years that access is not just about ramps and captions—it’s about recognizing whose bodies and minds tech was built for. If a classroom video has no captions, the message is: “This wasn’t made with you in mind.”

That goes deeper than tech. That’s pedagogy. That’s design. That’s care.


đŸŒ± Data Is More Than a Resource. It’s a Story.

When I first heard the term “Indigenous digital literacies,” I thought it just meant bringing tech into Indigenous communities.

But it’s so much more than that.

It’s about using digital tools to preserve language, protect land-based knowledge, and reclaim stories that colonial systems tried to erase. It’s about sovereignty in a datafied world.

The OCAPÂź principles—Ownership, Control, Access, Possession—completely reframe how we think about data. In most Western systems, data is a commodity. In Indigenous frameworks, it’s a responsibility.

If you collect data on a community, you don’t own it—you steward it. You’re accountable to the people behind it. And you ask permission before you touch it.

Imagine if all tech worked like that.


✹ What If We Started Again?

What if we stopped designing systems to monitor, and started building systems to support?

What if platforms prioritized trust over tracking?
What if accessibility were assumed, not requested?
What if Indigenous knowledge systems were treated as frameworks, not footnotes?

The future of digital space shouldn’t be about efficiency. It should be about relationship.

And maybe that starts with how we write, how we share, how we learn—and how we listen.



💬 Your Turn

  • Who do you think the internet was built for?
  • Have you ever felt excluded, watched, or erased in a digital space?
  • What does a more just, caring, accessible internet look like for you?

Feel free to comment, tag me, or respond on your own blog or platform.
Let’s imagine something better—together.