Emotionally Engineered: How AI Is Rewriting Free Will

My “aha” moment happened recently. My associate and I were working on a presentation for our clients about modern workforce solutions addressing the challenges of remote work and security. Using Google’s free tool NotebookLM, he created an entirely AI-generated podcast so compelling that it shook me. In that moment, I realized just how much this technology could threaten humanity as we know it. Combined with quantum computing, AI’s growing power demands, and political shifts toward autocracy, this is perhaps the greatest threat to the world order and humanity in modern history.

I know that sounds dramatic, but it’s not. What struck me was how emotionally convincing the fabricated podcast was. It wasn’t real, yet it moved me. That same power can be used to mass-produce content designed to influence thought and behavior. With the right message, format, and messenger, even fiction can feel like truth. Governments and powerful entities can exploit this, weaponizing our emotions to shape our beliefs and actions.

Tom Bartlett of The Atlantic profiled a controversial study by University of Zurich researchers, who found that AI-generated posts on Reddit were more persuasive than human commenters at changing other users’ opinions. You can read more about this experiment in The Atlantic and on NPR.

We like to think we’re rational, but we’re not. Emotions govern nearly everything: whom we love, where we live, what we buy, the careers we pursue. Sales and marketing professionals understand this well—emotions close deals, not logic. If we’re constantly exposed to emotionally manipulative content and lack the tools to critically evaluate it, we lose our agency, our free will, and perhaps even our right to life, liberty, and the pursuit of happiness.

There is strong science around the power of emotions. Ethan Kross, author of the book Shift, demonstrates through scientific studies how our emotions affect us and how we can manage them. If we’re subjected to emotional manipulation by overwhelming and compelling AI-driven content, we are likely to become the very sheep we fear. And if we neither control that content nor learn to recognize its emotional impact, someone else is in control.

Consider this: I love watching motorcycle and car shows. Some of the most seemingly unemotional people are the men who love their cars and bikes. Many of them think of themselves as simple, direct, logical people. The irony is that these are the same people collecting muscle cars and custom builds purely for the feeling those machines give them. They are the very epitome of the emotionally driven art lover, willing to spend money and resources on something whose value is entirely emotional. I watch the Mecum and Barrett-Jackson auctions and see these same people spend tens of thousands, hundreds of thousands, and sometimes millions of dollars on effectively useless equipment. That’s emotion in action.

Still skeptical? Look at the rising crisis of gambling addiction, particularly around sports betting. In Massachusetts alone, the share of people reporting gambling problems jumped from 12.7% in 2014 to 25.6% in 2023. People knowingly destroy their lives, and those of their loved ones, because emotional impulses override rational judgment. The emotional reward is so powerful that it eclipses the consequences. That same principle makes emotional manipulation more powerful than drugs or physical force.

The atomic bomb of this century isn’t nuclear; it’s AI-powered, emotionally manipulative content. Combined with the reach of the internet and social media, it allows curated information to be injected directly into our minds. Making matters worse, trustworthy news sources are shrinking, and access to reliable content is increasingly hidden behind paywalls. For all its productive uses, the race for AI and technological primacy is not about improving our lives; it is a race for global power like no other.

Much political rhetoric, especially around First and Second Amendment rights, is also part of this manipulation. Arguments about freedom of speech and gun rights are often used to distract and divide. Bumper stickers like “If guns kill people, do pens misspell words?” are clever, but they miss the bigger point. Real power doesn’t lie in firearms, it lies in controlling information and influencing emotion. Division is the goal, and many don’t see they’re being played.

Quantum computing will amplify this exponentially. “Q-Day,” the hypothetical moment quantum computers can break current encryption standards, could arrive before 2035, if it hasn’t already. Governments have been stockpiling encrypted data for years, waiting for the day it becomes readable. Once AI and quantum power converge, decades of private communication could be analyzed and exploited to manipulate individuals and populations on an unprecedented scale.

The convergence of AI, emotion, and geopolitics is fundamentally changing the world. The fact that a fake podcast could make me feel something so deeply is proof of how easily we can be influenced. We are emotional beings first, rational beings second, and now there is a reliable, scalable way to exploit that truth. Without critical thinking, education, and emotional literacy, we risk surrendering our thoughts, behaviors, and freedoms to those who can most effectively control the narrative.

This is not just a technical or political crisis, it’s a philosophical and moral one. Even the new Pope Leo recognizes the risks. What kind of future do we want to build? Our ability to protect truth, agency, and societal cohesion is at stake.

Maybe the greatest security risk to your business that no one is talking about 

Do you or your staff use Virtual Assistants (VAs)? VAs can be great, but if you don’t have good security controls in place, you may be putting yourself, your staff, and your organization at serious risk.

My daughter came home with an assignment to explore human nature and different philosophical perspectives. As I reflected on our conversation about it, which in true teenage fashion she quickly declared over, I started thinking about a situation we have addressed with many clients and wondered: do all these business execs believe in the innate goodness of people? That is certainly a kinder way to understand their choices.

A quick Google search reveals many online resources extolling the benefits of overseas VAs, including flexibility, availability, and cost. Most of these resources offer only brief notices about security and privacy, suggesting that proper due diligence and contracts are adequate mitigations. The technical details of how to actually protect data and privacy are never discussed. This article from US News, for example, has a four-sentence expandable blurb at the end that simply suggests choosing a reputable service is the best way to protect yourself.

Using an overseas VA is not necessarily a bad idea. VAs can be a very powerful tool for businesses and individuals. However, in order to avoid a serious data security or financial disaster, it is critical to understand what your risks are and how to mitigate them.   

One day we got an alert for one of our clients, who is in a regulated industry, about improbable geographic access to one of their Microsoft 365 mailboxes. Upon investigation, it turned out that a salesperson had hired an overseas VA to help manage their calendar and sales efforts. The salesperson had shared their credentials and MFA codes with an unvetted foreign agent, potentially providing access to legally protected data and clearly violating the company’s security policy.
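For the curious, here is a minimal sketch of the “impossible travel” heuristic that typically drives this kind of alert. It is an illustration only: the sign-in fields, sample coordinates, and 900 km/h speed threshold are assumptions for the example, not the actual rule our monitoring platform uses.

```python
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt


@dataclass
class SignIn:
    user: str
    time: datetime
    lat: float
    lon: float


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


def improbable_travel(events, max_kmh=900):
    """Flag consecutive sign-ins whose implied travel speed exceeds max_kmh
    (roughly airliner speed) -- the classic 'impossible travel' heuristic."""
    alerts = []
    events = sorted(events, key=lambda e: e.time)
    for prev, cur in zip(events, events[1:]):
        hours = (cur.time - prev.time).total_seconds() / 3600
        if hours <= 0:
            continue  # simultaneous events; skip to avoid division by zero
        km = haversine_km(prev.lat, prev.lon, cur.lat, cur.lon)
        if km / hours > max_kmh:
            alerts.append((prev, cur, round(km), round(km / hours)))
    return alerts


if __name__ == "__main__":
    logins = [
        SignIn("sales@example.com", datetime(2024, 5, 1, 9, 0), 42.36, -71.06),    # Boston office
        SignIn("sales@example.com", datetime(2024, 5, 1, 10, 30), 14.60, 120.98),  # Manila, 90 minutes later
    ]
    for prev, cur, km, kmh in improbable_travel(logins):
        print(f"ALERT: {cur.user} covered ~{km} km in {cur.time - prev.time} (~{kmh} km/h)")
```

Monitoring tools built around Microsoft 365 surface this kind of signal for you; the point of the sketch is simply that two logins too far apart geographically and too close together in time are a reliable red flag worth alerting on.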

Despite having gone through the company’s security policies and regular cybersecurity awareness training, this salesperson seemingly didn’t know that what they were doing was wrong. Perhaps they just didn’t care? Or perhaps the lure of what the VA offered was simply too compelling: a widely used, cheap, effective sales support tool. Or maybe they just believed in the innate goodness of people? Surprisingly, we see this all the time. People are too willing to give up their privacy and security for free or inexpensive products and services. Gmail is a clear example of this!

It isn’t just salespeople; it’s the C-suite too. Outsourcing administrative work to inexpensive overseas staff is very common. We have clients ranging from plumbers to data analytics companies and insurance agencies that have outsourced executive assistant and administrative roles to overseas VAs.

What many people fail to realize is that granting a stranger access to your email is not only against most company policies, but also a very bad idea. In 2023, the FBI reported $12.5 billion in US losses from fraud and cybercrime, of which $2.9 billion was related to Business Email Compromise (BEC). For most businesses, email is a critical system that provides significant access to other systems, files, people, and resources. That is why email is a favorite target for attackers: they stand to gain significant access and can use it to establish authority with other victims.

In another recent example, we advised a client whose use of an overseas VA would have allowed the VA to easily impersonate, defraud, and damage the client’s business. In our discussions, they revealed that they had set up an Apple iPad for their VA, giving the VA complete access to their personal Apple ID, phone records, and text messages so the VA could help respond to emails and texts. Through that Apple ID, a complete stranger on the other side of the world could reach personal photos, financial resources such as Apple Pay and mobile banking, and sensitive data stored throughout the Apple account. They even knew the client’s location! What’s more, the VA knew the device PIN, which Apple uses as a form of identity verification and to protect encrypted data such as iMessages.

Many of these VA services are located outside the US, beyond the jurisdiction of the US legal system. If you don’t have the time and resources to hire someone locally, you probably don’t have the time and resources to chase down an overseas fraudster. So even if your overseas VA was caught doing something illicit or immoral, there is little recourse, and navigating a foreign legal system can be challenging and costly. 

It is astounding to me how people circumvent their own security policies to take advantage of low-cost efficiency tools. Those policies are in place for a reason. If I were North Korea’s Kim Jong Un or Russia’s Vladimir Putin, why would I spend time and resources breaking into systems around the world when all I have to do is ask? I could set up inexpensive overseas VA shops, charge reasonable rates, and wait for victims to open their doors to me. In fact, this is already happening: it has recently been reported that North Korean workers are infiltrating Western companies, a slight twist on the VA angle I’m describing.

So, when you’re ready to engage your VA, you don’t have to solve the age-old question of human nature. You just need to do some planning. Take the time to understand which systems and resources your VA will need to perform their role. Determine whether company policies or laws limit what data and systems they can access. Finally, work with your IT and/or security team to put the necessary controls and monitoring around that access. If you plan appropriately, you can help avoid a costly and disruptive breach.
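To make that planning step concrete, here is a hedged sketch of what writing the access plan down might look like before anything is granted. The system names, access levels, and “forbidden” categories are hypothetical placeholders meant to illustrate least privilege, not a prescribed policy.

```python
from dataclasses import dataclass

# Hypothetical access plan for a VA role: which systems they may touch, and at
# what level. Anything not listed should go back to IT/security for review.
APPROVED_VA_ACCESS = {
    "calendar": "delegate",       # scheduling on the salesperson's behalf
    "crm": "read-write",          # updating sales records under a scoped account
    "shared-mailbox": "send-as",  # a dedicated mailbox, not the employee's own
}

# Access that should never be granted to an outside contractor.
FORBIDDEN = {"shared-credentials", "mfa-codes", "personal-mailbox", "banking"}


@dataclass
class AccessRequest:
    system: str
    level: str


def review(requests):
    """Compare requested access against the documented plan and flag anything
    missing from the plan or explicitly forbidden."""
    findings = []
    for r in requests:
        if r.system in FORBIDDEN or r.level in FORBIDDEN:
            findings.append(f"BLOCK: {r.system}/{r.level} must never be granted")
        elif APPROVED_VA_ACCESS.get(r.system) != r.level:
            findings.append(f"REVIEW: {r.system}/{r.level} is not in the approved plan")
    return findings or ["OK: all requested access matches the plan"]


if __name__ == "__main__":
    requested = [
        AccessRequest("calendar", "delegate"),
        AccessRequest("personal-mailbox", "shared-credentials"),  # what the salesperson above actually granted
    ]
    for finding in review(requested):
        print(finding)
```

The output approves the calendar delegation and blocks the shared-credential, personal-mailbox request, which is exactly the kind of access that should never leave the planning meeting.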