Fear is stealing our jobs. According to a 2023 Pew Research Center survey reported by Reuters, over half of U.S. workers whose jobs are exposed to AI are at least somewhat concerned about being replaced.
That fear is real. It is tangible. Visceral, alive, and burning inside us. Artificial Intelligence represents a kind of change that feels both invisible and uncontrollable. It doesn’t just threaten to change how we work; it challenges who does the work at all. It’s not replacing muscle, but judgment. Creativity. Voice. That’s far more personal. For someone who’s built their career around language, design, teaching, coding, or, let’s face it, any career, it’s unnerving to watch a machine mimic those abilities in seconds. The fear isn’t just about losing income. It’s about losing identity, relevance, and purpose. There’s a psychological weight to the idea that a machine will outperform you at something you once considered uniquely human. And that fear reflects real gaps in reskilling support, economic stability, and our cultural reverence for productivity over people.
I have felt that fear myself. As a software architect, designer, and developer, I view the future with anxiety. Where will my meals come from? How will I provide for my family?
Furthermore, as a creative, I have published nine original stories. From blank page to published book, I’ve been involved in every step: writing, editing, formatting, and uploading my manuscripts to KDP for both print and Kindle. I create my own cover images, approve them for publication, and distribute them through Amazon.
But that’s not where it ends.
I’m also a voice actor. I narrate my own audiobooks. I record, edit, scrub, master the audio, and upload everything to ACX, where it gets distributed through Audible. I’m a one-person creative studio, and I’ve built this ecosystem with discipline, passion, and grit.
I put myself on a ten-year plan to write a million words, not just to publish, but to learn. To understand every step of this industry from the inside out. That mindset, part artist, part craftsman, part researcher, led me to artificial intelligence.
At first, it was pure curiosity. How does it work? What can it do? Where does it break? I wanted to know how these models were trained. Why they hallucinate. Where they fall short in logic, in storytelling, in nuance. I was skeptical, sometimes disturbed, but always intrigued.
I’ve watched the big players: OpenAI, Google, Meta, Anthropic, and others. I’ve studied how they race to solve massive challenges: contextual reasoning, ethical safeguards, malicious misuse, age-appropriate outputs, and the never-ending complexity of human language itself.
But as I dug deeper, something shifted.
I stopped seeing AI as a threat and started seeing it as a tool. Not a finished product. Not a rival artist. A collaborator. One with limitations, yes, but also with astonishing potential.
A mindset for governing artificial intelligence and automation already exists: DoD Directive 3000.09 – Autonomy in Weapon Systems, first issued in 2012 and updated in 2023 to reflect advances in AI. Its core principle is that autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.
In practice, it means that weapons must not engage targets without human oversight. There must be clear accountability, and AI must be predictable, understandable, and controllable.
Even where speed and automation can save lives, this principle codifies guardrails and suggests boundaries: people remain accountable, apply their moral convictions, and consciously decide the outcome.
Because judgment matters.
Accountability matters.
If you repeat the fear-driven refrain without doing your own due diligence, you won’t just echo a concern; you’ll ensure your own obsolescence. Now is the time to grasp the future with both hands. Not next year. Not when it feels safer. Now.
Those who adapt and adopt will thrive.
Those who experiment, who learn the tools, who collaborate instead of resisting, will compose the works of art of the future.
But those who stand still, beating their chests, wailing at the unknown, will fulfill their own prophecy. Fear has a way of making itself true when we let it guide our choices. Psychologists call this a self-fulfilling prophecy: if you believe the future is stacked against you, you’ll stop trying. If you stop trying, you’ll get left behind.
And here’s the uncomfortable truth: AI isn’t going away. You can either learn to shape it or be shaped by those who do.