Breaching Cybersecurity: My Digital Homeschool Crash Course in Offense and Defense

by Curtis Lancaster | Jan 29, 2026 | Biography

TL;DR
After 9/11, my family’s move to Virginia and a stretch of homeschooling dropped me into long, unsupervised hours with a computer, dial-up internet, and a growing obsession with how software really works. Watching early “techno” films and hearing a family member talk shop from an AOL-adjacent world pushed me toward the mythology of hacking—but what I actually learned was deeper: trust is fragile, systems fail in predictable ways, and if you don’t understand attacks, you can’t build real defenses. By the time I was 11, I’d internalized what I’d later learn professionals call a “purple team” mindset: offense informs defense, and discipline beats luck.


When people ask how I became so focused on security and reliability, they usually expect a clean, modern origin story—certifications, labs, mentorship, a linear path.

That’s not how it happened.

My first real venture into cybersecurity wasn’t a course, a bootcamp, or a formal program. It was a weird mix of circumstances that, at the time, felt like chaos—and in hindsight, became a crash course in how computers break and what it takes to keep them stable.

It started in the early 2000s, right after 9/11, when my mom moved us to Virginia. For a period, we were homeschooled by my sister and her husband while my mom handled the Texas side of the move and got us situated for a new life. That in-between time mattered more than I realized.

Because in the middle of that transition, I encountered three ingredients that changed everything:

  1. a passionate tech adult in my orbit

  2. unlimited curiosity and time

  3. a computer + dial-up internet at just the wrong (or right) age


The “Tech Lingo” That Hooked Me Before I Understood It

Bill worked for AOL, and he had a passion for programming. I didn’t understand most of the words he used when he talked about work—but I understood how he said them.

He’d emphasize certain ideas like they mattered in ways I couldn’t yet articulate. You know that feeling when someone speaks a language you don’t fully understand, but you can tell the concepts are important because of the precision and seriousness in their tone? That was it.

I’d already had my first “tinkering” experience years earlier—taking apart SkiFree at age seven, breaking it, documenting what changed, and learning that experimentation without notes is expensive.

Now there was this adult with real-world tech gravity, and I could feel there was a bigger world behind the lingo. I wanted in. I just didn’t have the vocabulary yet.

So I did what I always did:

I went back to the library.


Tech Films, Hacker Mythology, and What Didn’t Add Up

Around the same time, I started watching every tech-driven movie I could find—from early-2000s techno films back into the ’80s. Those stories are entertaining, but they also do something subtle: they make “hacking” look like a single dramatic moment instead of what it really is—methodical exploration of systems.

Even as a kid, parts of the movie mythology didn’t make sense to me.

Not because I was some genius—because I was already familiar with cause and effect from tinkering. I could tell there were missing steps. Things that should have been true if the story were accurate… but weren’t.

That mismatch created a kind of itch:

  • What’s real?

  • What’s exaggerated?

  • How do these systems actually work?

  • If someone can break it, what did they exploit? What did they understand that others didn’t?

So I dove deeper—again, without the perfect tools, without modern internet access, and without someone handing me a clean learning path.

Just books, time, and stubborn curiosity.


Friends Who Spoke the Language (and Me Trying to Catch Up)

At church, I met friends who were into programming and hardware—Sam, Richie, and their brother Ben. They spoke “computer” naturally. They’d talk about parts, builds, software, and ideas with a fluency I didn’t have yet.

I still didn’t always understand their lingo, but I loved sitting there listening, watching, asking questions, and trying to connect the dots. That dynamic shaped me in a big way: I learned early how to learn from people without pretending I already knew.

That humility—paired with relentless curiosity—is still a core part of how I operate today.


The Second Family Computer and a Homeschool “Lab Environment”

Eventually, my mom came back from Texas and we moved into a townhouse. She bought another computer—the first we’d had since that original Windows dinosaur years earlier.

She also bought homeschool CD-ROM courses, and then she went to work. We were latchkey kids, which meant lots of time at home and lots of time on the computer.

That unsupervised time is important context, not because it’s glamorous, but because it created something that security professionals recognize instantly:

A learning environment where experimentation happens fast… and consequences show up faster.

I started exploring software the way I always had—trying to understand how it worked under the hood. The “homeschool computer time” unintentionally became a hands-on lab.

And if you used the internet on dial-up back then, you know exactly what that meant.


Dial-Up Internet, Bad Search Results, and Learning Distrust the Hard Way

When you’re a kid on dial-up trying to search for technical answers, you don’t just find information.

You find traps.

You find shady downloads, fake tools, infected files, sketchy popups, and the early version of what we now call social engineering. And when you don’t yet have strong security instincts, you learn quickly that the internet is not a safe library—it’s a hostile environment with helpful corners.

I learned the hard lesson: you can’t trust what you click.

And that lesson didn’t stay abstract for me. It became practical because I saw what happens when systems get compromised:

  • instability

  • broken functionality

  • lost work

  • unpredictable behavior

  • time-consuming recovery

That phase taught me the foundations of defensive thinking:

  • take backups seriously

  • understand recovery steps

  • know how to rebuild and restore when things go wrong

  • treat “unknown software” as dangerous until proven safe
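That last habit can be made concrete. One simple version of “unknown until proven safe” is refusing to run a download unless its digest matches a checksum published out-of-band by the vendor. A minimal sketch—the file name, contents, and “published” digest here are all hypothetical stand-ins:

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Digest of the file's bytes; any tampering changes this value."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def safe_to_run(path: Path, published_digest: str) -> bool:
    """'Dangerous until proven safe': trust only if the digest matches."""
    return sha256_of(path) == published_digest.lower()

# Demo with a throwaway file standing in for a download:
with tempfile.TemporaryDirectory() as d:
    installer = Path(d) / "tool-setup.exe"   # hypothetical download
    installer.write_bytes(b"totally legit installer")
    good = sha256_of(installer)              # what the vendor would publish
    print(safe_to_run(installer, good))      # True: digests match
    print(safe_to_run(installer, "0" * 64))  # False: digest mismatch
```

The dial-up era version of this was mostly instinct and scar tissue; the checksum habit is just that instinct written down.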

I’m being careful with how I describe this because I’m not interested in glorifying “hacking.” The point isn’t that I did something edgy. The point is that I learned, early, what every security-minded engineer learns eventually:

The web is adversarial, and reliability requires defensive habits.


The Real Lesson: Offense and Defense Are Two Sides of the Same Coin

At some point in that era, I crossed a line in my learning that I didn’t fully understand at the time. I was exploring boundaries. Testing assumptions. Trying to understand how “locked doors” worked in software environments.

And the most important outcome wasn’t that I “got past” anything.

The most important outcome was the realization that:

  • systems have rules

  • those rules exist for a reason

  • and if you don’t understand how rules are bypassed, you can’t design defenses that hold up

Security isn’t fear. Security is realism.

If you want a strong lock, you need to know how locks fail.
If you want a safer building, you need to know how people break in.
If you want a stable platform, you need to understand the failure modes.

That’s the mindset people now call purple teaming: offense informs defense.

Back then, I didn’t have the vocabulary. I just had the instinct.


Why This Shaped My Work Today

If you look at my modern work—whether for clients or employers—you’ll see this early lesson everywhere:

1) I prioritize attack surface reduction

I’m constantly asking: what exposure is unnecessary? What can be tightened, restricted, minimized, or removed?

2) I prefer layered protection

Edge + origin + application hardening, not “one plugin fixes it.”

3) I treat reliability as part of security

A platform that fails under stress is vulnerable—whether the stress comes from traffic spikes, misconfigurations, or malicious behavior.

4) I build with rollback and recovery in mind

Backups that restore, change control, update governance, monitoring—these aren’t “extras.” They’re what makes the system survivable.

5) I don’t trust “it should be fine”

I validate. I monitor. I measure. I verify.

Because I learned early that “should” doesn’t protect you when reality hits.
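“Measure, don’t assume” can be sketched in a few lines: instead of trusting that a page “should be fine,” compare measured tail latency against an explicit budget. The 1.5-second budget and the sample timings below are made-up numbers for illustration:

```python
import math

def p95(samples: list[float]) -> float:
    """95th-percentile (nearest-rank): the value 95% of samples sit at or below."""
    ordered = sorted(samples)
    return ordered[math.ceil(0.95 * len(ordered)) - 1]

def within_budget(samples: list[float], budget_s: float) -> bool:
    """Pass only when measured tail latency beats the budget; no 'should be fine'."""
    return p95(samples) <= budget_s

page_loads = [0.8, 0.9, 1.1, 1.0, 2.4, 0.7, 0.9, 1.2, 0.8, 1.0]  # seconds
print(round(p95(page_loads), 2), within_budget(page_loads, 1.5))  # → 2.4 False
```

Note that the average of those samples looks healthy; it’s the tail that fails the budget. That’s exactly the kind of thing “should be fine” hides and measurement exposes.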


The Practical Analogy I Still Use

I think about security the way I think about physical safety tools.

If you don’t understand what you’re defending against—how attacks work, how systems get bypassed, how failures propagate—you can’t build real protection. You end up with security theater: things that look safe until they’re tested.

That’s true in cybersecurity and it’s true in WordPress operations.

A WordPress site isn’t just pages. It’s:

  • authentication and admin exposure

  • plugin supply chain risk

  • hosting misconfiguration risk

  • third-party scripts that expand attack surface

  • uptime and incident response discipline

  • backup and restore readiness

If you don’t design defensively, you don’t have “a website.” You have an accident waiting for the right conditions.
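One small, checkable slice of that defensive design is auditing HTTP response headers: a handful of standard hardening headers either constrain the attack surface or they’re missing. A minimal sketch—the sample response dict is hypothetical, and a real audit would fetch live headers rather than hard-code them:

```python
# Standard defensive response headers and what each one buys you.
HARDENING_HEADERS = {
    "Strict-Transport-Security",  # force HTTPS on return visits
    "X-Content-Type-Options",     # block MIME-type sniffing
    "Content-Security-Policy",    # constrain scripts and other sources
    "X-Frame-Options",            # resist clickjacking via framing
}

def missing_hardening(headers: dict[str, str]) -> set[str]:
    """Return which defensive headers the response lacks (case-insensitive)."""
    present = {name.lower() for name in headers}
    return {h for h in HARDENING_HEADERS if h.lower() not in present}

# Example response from a hypothetical, partially hardened site:
resp = {"Content-Type": "text/html", "X-Frame-Options": "SAMEORIGIN"}
print(sorted(missing_hardening(resp)))
```

A check like this doesn’t make a site secure by itself—it’s one layer among many—but it turns “I think we’re hardened” into something you can verify on every deploy.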


Why I’m Sharing This Story

I’m not sharing this to romanticize hacking or make myself sound edgy.

I’m sharing it because it explains why I’m so serious about:

  • hardened hosting

  • update governance

  • monitoring

  • documentation

  • disciplined scope control

Those habits aren’t trends to me. They’re earned. They come from learning early that systems fail in predictable ways—and that recovery is painful when you don’t prepare for it.


Next in the Biography Series

Next, I’ll write about how this early security mindset evolved into professional discipline: turning curiosity into structured workflows, and how I learned to ship projects with accountability (time tracking, scope enforcement, documentation-as-you-build) instead of chaos.


Want to work together?

If you need someone who treats your website or platform like a production system—secure, monitored, stable, and fast—book a call and tell me what environment you’re running (WordPress, hosting, Cloudflare/CDN, and what’s currently breaking). I’ll explain what I’d fix first, how I’d measure success, and how I’d protect it long-term.

Ready to Keep WordPress Fast Long-Term?

If you want performance that doesn’t regress after the next plugin install, I can implement a performance protection layer: monitoring, update governance, backup validation, rollback readiness, and performance budgets—so your WordPress site stays fast, stable, and resilient.

Written By Curtis Lancaster
