Emotionally Engineered: How AI Is Rewriting Free Will

My “aha” moment happened recently. My associate and I were working on a presentation for our clients about modern workforce solutions addressing the challenges of remote work and security. Using Google’s free tool, NotebookLM, he created a podcast, entirely AI-generated, that was so compelling it shook me. In that moment, I realized just how this technology could threaten humanity as we know it. Combined with quantum computing, AI’s growing power demands, and political shifts toward autocracy, this is perhaps the greatest threat to the world order and humanity in modern history.

I know that sounds dramatic, but it’s not. What struck me was how emotionally convincing the fabricated podcast was. It wasn’t real, yet it moved me. That same power can be used to mass-produce content designed to influence thought and behavior. With the right message, format, and messenger, even fiction can feel like truth. Governments and powerful entities can exploit this, weaponizing our emotions to shape our beliefs and actions.

Tom Bartlett, a reporter for The Atlantic, profiled a controversial study in which University of Zurich researchers found that AI-driven posts on Reddit were highly successful at persuading other human participants to change their opinions. You can read more about this experiment in The Atlantic and on NPR.

We like to think we’re rational, but we’re not. Emotions govern nearly everything: whom we love, where we live, what we buy, the careers we pursue. Sales and marketing professionals understand this well—emotions close deals, not logic. If we’re constantly exposed to emotionally manipulative content and lack the tools to critically evaluate it, we lose our agency, our free will, and perhaps even our right to life, liberty, and the pursuit of happiness.

There is strong science around the power of emotions. Ethan Kross, author of the book Shift, demonstrates through scientific studies the ways in which our emotions affect us and how we can manage them. If we’re subject to the emotional manipulation of overwhelming and compelling AI-driven content, we are likely to become the very sheep we fear. And if we’re not in control of that content, or trained to understand it and its emotional impact, we are not the ones in control.

Consider this: I love watching motorcycle and car shows. Many of the men who love their cars and bikes think of themselves as emotionally simple, direct, logical people. The irony is that these are the same people collecting muscle cars and custom modifications purely for the feeling they provide. They are the very epitome of the emotionally driven art lover, willing to spend money and resources based entirely on emotion. I watch the Mecum and Barrett-Jackson auctions and see these same people spend tens of thousands, hundreds of thousands, and sometimes millions of dollars on effectively useless equipment. That’s emotion in action.

Still skeptical? Look at the rising crisis of gambling addiction, particularly around sports betting. In Massachusetts alone, the share of people reporting gambling problems jumped from 12.7% in 2014 to 25.6% in 2023. People knowingly destroy their lives, and those of their loved ones, because emotional impulses override rational judgment. The emotional reward is so powerful, it eclipses consequences. That same principle makes emotional manipulation more powerful than drugs or physical force.

The atomic bomb of this century isn’t nuclear, it’s AI-powered, emotionally manipulative content. Combined with the reach of the internet and social media, it allows curated information to be injected directly into our minds. Making matters worse, trustworthy news sources are shrinking, and access to reliable content is increasingly hidden behind paywalls. It’s a race for power like no other. For all its productive uses, the race for AI technology and technological primacy is not about improving our lives, but rather a race for global power.

Much political rhetoric, especially around First and Second Amendment rights, is also part of this manipulation. Arguments about freedom of speech and gun rights are often used to distract and divide. Bumper stickers like “If guns kill people, do pens misspell words?” are clever, but they miss the bigger point. Real power doesn’t lie in firearms, it lies in controlling information and influencing emotion. Division is the goal, and many don’t see they’re being played.

Quantum computing will amplify this exponentially. “Q-Day,” the hypothetical moment quantum computers can break current encryption standards, could arrive before 2035, if it hasn’t already. Governments have been stockpiling encrypted data for years, waiting for the day it becomes readable. Once AI and quantum power converge, decades of private communication could be analyzed and exploited to manipulate individuals and populations on an unprecedented scale.

The convergence of AI, emotion, and geopolitics is fundamentally changing the world. The fact that a fake podcast could make me feel something so deeply is proof of how easily we can be influenced. We are emotional beings first, rational beings second, and now there is a reliable, scalable way to exploit that truth. Without critical thinking, education, and emotional literacy, we risk surrendering our thoughts, behaviors, and freedoms to those who can most effectively control the narrative.

This is not just a technical or political crisis, it’s a philosophical and moral one. Even the new Pope Leo recognizes the risks. What kind of future do we want to build? Our ability to protect truth, agency, and societal cohesion is at stake.

My son, the hacker, and the lessons I learned 

Despite the thousands of cybersecurity products on the market today, most business leaders do not understand their true cybersecurity risk or who their potential attackers are. Most think they’re not much of a target at all. They understand that they have to budget for certain protections such as antivirus or firewalls, but once they’ve metaphorically locked the doors and windows, they think they are done. In fact, this lack of understanding of the true risk and who the attackers are is driving complacency, ineffective spending and financial losses. 

2024 is on track to be the costliest year yet for cybersecurity-related incidents. Official reporting from the FBI shows that in the US, losses grew from $3.5 billion in 2019 to $12.5 billion in 2023. This year we saw an Illinois hospital permanently close because of ransomware, along with countless high-profile incidents at MGM Resorts and Caesars Entertainment, as well as AT&T and Change Healthcare. This week we learned the seminal brand Stoli Vodka has filed for bankruptcy, due in part to a ransomware attack. But we also know thousands of small organizations were impacted, like the Reddit account of a small law firm in Connecticut that closed its doors forever after 30+ years in business.

As a leader, if you don’t understand the underlying problem, you’re unlikely to fully address it. I’m reminded of this every day in my own home life. I assumed protecting my kids from the dangers of the Internet would be relatively easy for me compared to my friends and non-technical counterparts. I know all the tools and how to set them up to filter Internet content and control my devices. In much the same way that businesses fail to understand their risks and adversaries, I had failed to account for my son’s determination, ingenuity and resources (knowledge, Internet and time). He consistently and repeatedly circumvented the limits and systems I had put in place. 

In 6th grade we gave my son a smartphone. We wrote up a contract about responsibility, acceptable use and the risks posed by social media and the Internet. Additionally, we set up parental controls to limit apps, inappropriate content and the amount of time he could spend on different apps and sites. Lastly, I put a business-class firewall in my home to filter and control the Internet. I was busy congratulating myself and pitying the fools who weren’t as smart as me. I had secured my home technology kingdom, and I thought I was done!

The first thing he did was realize that if he embedded links in Google Docs, which was allowed because he needed it for school, he could open whatever links he wanted in an embedded browser window that circumvented the parental controls and time limits in place. Next, he realized I had no way of controlling the hotspot on his phone. So, he would connect his computer or our TVs to his hotspot to get around all of my limits. As I ran around scrambling to patch the holes, he continued to find the “bugs.” One day, after checking his screen time reports, I noticed he was spending a lot of time in the Files app, the program used to browse and open documents on the iPhone. Apparently he had figured out that by embedding URLs in the Files app, he could again circumvent my controls. Lastly, he realized that he could cover his tracks by deleting incriminating files on his phone and then restoring them from the trash when he wanted to access them. He could keep doing this as long as he restored them within the 30-day window before permanent deletion.

There is a great talk published on YouTube that I highly recommend for business leaders. It’s only 30 minutes, and if you watch it at 2x speed, like my children would, you can get through it in only 15 minutes. In the video, researcher Selena Larson tries to dispel the misguided focus of businesses and cyber professionals on APT (advanced persistent threat) or government threat actors as the greatest risk. She argues that this focus is a distraction and provides a false sense of security. She describes a criminal ecosystem that supports both government and non-government threat actors, operating like any legitimate industry driven by money and endless opportunity. As we learn about the illicit ransomware industry, we learn that it doesn’t matter what type of business you are, or how small or large: you are a target of equal significance to this criminal industry.

If we take the example of my son, had I not monitored his screen time regularly, I wouldn’t have noticed the unusually high usage of an unlikely program, the Files app. This cued me into the fact that he was doing something unexpected. If this sounds expensive and time consuming for a business whose focus is making widgets, you’re right. But over the last 30 years businesses have been enjoying the productivity and cost savings of automation, computers and cloud computing. Now, with the explosion of AI and large language models, it will only get more efficient. We have to invest some of that efficiency into fully understanding the business risks and developing effective cybersecurity programs.
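The kind of baseline monitoring that caught my son can be sketched in a few lines. This is a toy illustration, not Apple’s Screen Time or any vendor’s actual detection logic; the app names, numbers and three-standard-deviation threshold are assumptions for the example. The idea is simply to flag any app whose usage today is far above its historical baseline:

```python
from statistics import mean, stdev

def flag_unusual_apps(history, today, threshold=3.0):
    """Flag apps whose usage today deviates sharply from their baseline.

    history: dict of app name -> list of past daily minutes
    today:   dict of app name -> minutes used today
    Returns a list of (app, minutes_today, baseline_mean) tuples.
    """
    flagged = []
    for app, minutes in today.items():
        baseline = history.get(app, [0])
        mu = mean(baseline)
        sigma = stdev(baseline) if len(baseline) > 1 else 0.0
        # Anomalous if today's usage exceeds the baseline mean by
        # `threshold` standard deviations (sigma floored at 1 minute so
        # a near-idle app like Files still trips the alarm on a spike).
        if minutes > mu + threshold * max(sigma, 1.0):
            flagged.append((app, minutes, mu))
    return flagged

# A near-idle Files app suddenly jumping to 90 minutes stands out,
# while normal day-to-day variation in Safari does not.
history = {"Files": [1, 0, 2, 1, 0], "Safari": [55, 60, 48, 52, 61]}
today = {"Files": 90, "Safari": 58}
print(flag_unusual_apps(history, today))  # flags only the Files app
```

The same principle, knowing your baseline well enough that the unusual stands out, is what separates monitoring from merely collecting logs.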

Cybersecurity is not just about implementing tools or locking digital doors; it’s about understanding the risks, the attackers, and the ever-evolving threat landscape. My experience with my son highlights how a determined individual, armed with time and ingenuity, can outmaneuver even the most carefully implemented defenses if risks are not fully anticipated. For businesses, the lesson is clear: a static approach to security is insufficient. Success requires continuous monitoring, adaptability, and a deep understanding of both risks and adversaries. By investing in comprehensive cybersecurity strategies, businesses can safeguard themselves from the devastating consequences of cyberattacks and build resilience in an increasingly connected world. 

Maybe the greatest security risk to your business that no one is talking about 

Do you or your staff use Virtual Assistants (VAs)? Using VAs is great, but if you don’t have good security controls, you may be putting yourself, your staff and your organization at serious risk.  

My daughter came home with an assignment to explore human nature and different philosophical perspectives. As I reflected on that conversation, which in true teenage fashion she quickly told me she was done with, I started to think about a common situation we have addressed with many clients and wondered, do all these business execs believe in the innate goodness of people? That certainly is a much kinder way to understand their choices. 

A quick Google search reveals many online resources and people extolling the benefits of overseas VAs, including flexibility, availability and cost. Many of these resources have brief notices about security and privacy considerations. Many suggest that proper due diligence and contracts are good mitigations. However, the technical details of how to properly protect data and privacy are never disclosed or discussed. This article from US News has a small four-sentence expandable blurb at the end that simply suggests choosing a reputable service is the best way to protect yourself.

Using an overseas VA is not necessarily a bad idea. VAs can be a very powerful tool for businesses and individuals. However, in order to avoid a serious data security or financial disaster, it is critical to understand what your risks are and how to mitigate them.   

One day we got an alert for one of our clients, who is in a regulated industry. The alert was about improbable geographic access to one of their Microsoft 365 mailboxes. Upon investigation, it turned out that a salesperson had hired an overseas VA to help manage their calendar and sales efforts. The salesperson had shared their credentials and MFA codes with an unvetted foreign agent and potentially provided access to legally protected data. This was a clear violation of their security policy.
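“Improbable geographic access” alerts of this kind are commonly built on an “impossible travel” check: two sign-ins whose locations and timestamps imply travel faster than any airliner. The sketch below illustrates the idea only; it is not Microsoft 365’s actual detection logic, and the 900 km/h cutoff and the coordinates are assumptions for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def improbable_travel(sign_in_a, sign_in_b, max_kmh=900.0):
    """True when two sign-ins imply travel faster than max_kmh.

    Each sign-in is (timestamp_hours, latitude, longitude);
    900 km/h is roughly the cruising speed of a commercial jet.
    """
    t1, lat1, lon1 = sign_in_a
    t2, lat2, lon2 = sign_in_b
    hours = abs(t2 - t1)
    if hours == 0:
        # Simultaneous sign-ins from two different places are improbable.
        return (lat1, lon1) != (lat2, lon2)
    return haversine_km(lat1, lon1, lat2, lon2) / hours > max_kmh

# A Boston sign-in followed two hours later by one from Manila implies
# thousands of km/h of travel, which triggers the alert.
boston = (0.0, 42.36, -71.06)
manila = (2.0, 14.60, 120.98)
print(improbable_travel(boston, manila))  # → True
```

Real services refine this with IP-geolocation error margins, VPN allow-lists and per-user travel history, but the core signal is just distance over time.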

Despite having gone through training on the company’s security policies and participating in regular cybersecurity awareness training, this salesperson seemingly didn’t know that what they were doing was wrong. Perhaps they just didn’t care? Or perhaps the lure of what the VA offered was just too compelling: a widely used, cheap, effective sales support tool. Or maybe they just believed in the innate goodness of people? Surprisingly, we see this all the time. People are too willing to give up their privacy and security for free or inexpensive products and services. Gmail is a clear example of this!

It isn’t just salespeople, it’s the C-suite too. Outsourcing administrative work to inexpensive overseas staff is very common. We have clients ranging from plumbers to data analytics companies and insurance agencies that have outsourced executive assistant and administrative roles to overseas VAs.

What many people fail to realize is that granting a stranger access to your email is not only against most company policies, but also a very bad idea. In 2023, the FBI reported $12.5 billion in losses from US firms due to fraud and cybercrime. Of that, $2.9 billion was related to Business Email Compromise (BEC). For most businesses, email is a critical system that provides significant access to other systems, files, people and resources. This is why email systems are a favorite target for attackers. They stand to gain significant levels of access and are able to use that access to establish authority with other victims.  

In another recent example, we advised a client whose use of an overseas VA would have allowed the VA to easily impersonate, defraud and damage the client’s business. In our discussions, they revealed that they had set up an Apple iPad for their VA, through which the VA had complete access to their personal Apple ID, phone records and text messages. They were using the VA to help with administrative tasks, including responding to emails and text messages. With access to the person’s Apple ID, this complete stranger on the other side of the world had access to personal photos, financial resources such as Apple Pay and mobile banking, and sensitive data stored throughout their Apple account. They even knew the client’s location! What’s more, the VA’s access included knowledge of the device PIN, which Apple uses as a form of identity verification and to encrypt iMessages.

Many of these VA services are located outside the US, beyond the jurisdiction of the US legal system. If you don’t have the time and resources to hire someone locally, you probably don’t have the time and resources to chase down an overseas fraudster. So even if your overseas VA was caught doing something illicit or immoral, there is little recourse, and navigating a foreign legal system can be challenging and costly. 

It is astounding to me the way in which people circumvent their own security policies to take advantage of low-cost efficiency tools. Those policies are in place for a reason. If I were North Korea’s Kim Jong Un or Russia’s Vladimir Putin, why spend the time and resources breaking into systems around the world when all you have to do is ask? They could set up inexpensive overseas VA shops, charge reasonable rates and wait for their victims to open their doors to them! In fact, this is already happening. It has recently been reported that North Korean employees are infiltrating Western companies, a slight twist on the VA angle I’m describing.

So, when you’re ready to engage your VA, you don’t have to solve the age-old question of human nature. You just need to do some planning. Take the time to understand what systems or resources your VA will need access to in order to perform their role. Determine whether company policies or laws limit what data and systems they can access. Finally, work with your IT and/or security team to put the necessary controls around their access, with appropriate monitoring. If you take the time to plan appropriately, you can help avoid a costly and disruptive breach.

The Roomba Approach to Cybersecurity and Compliance

Imagine that I have a big house with 4 kids, 3 animals, 4 bedrooms and a lot of chaos. It’s a mess. It’s dirty and disorganized. January 1st rolls around and my New Year’s resolution is to get the house organized and clean.

If I were to approach this problem the way so many approach cybersecurity, the first thing I would do is get on the Internet and google “house cleaning technologies.” When I do that, I find solutions like iRobot’s Roomba. Boom!! It turns out all I have to do is buy some autonomous AI cleaning tech. This was the first result in my search: https://www.architecturaldigest.com/story/high-tech-cleaning-devices-for-your-home

I purchase an AI-powered vacuum ($1,000), a phone sanitizer ($120), a smart trash can ($200), a smart litter box ($700) and finally an air cleaner ($250). After spending $2,270, my home is actually more cluttered! Despite all the automated smart tech with AI, my house is no cleaner or more organized than it was before my tech buying spree. In fact, it’s worse now because I have more useless stuff lying around!

Alternatively, if I spend the time organizing the first floor, putting away what’s not in use, labeling things and getting them off the floor, and making sure everyone in the family knows their role in keeping the space clean, then I can leverage the Roomba and other technologies to become more effective and efficient.

As with cybersecurity tech, successful and meaningful implementation of the Roomba requires knowing the areas it can clean, setting up guiding barriers and purchasing two Roombas, because it turns out I have a step in the middle of my first floor.

The point is there is no magic Easy Button (not to mix marketing metaphors) in cybersecurity or compliance, despite what all the marketing vaporware tells us. There are a number of vendors advertising automated SOC 2 compliance solutions with Type 1 reports completed in five days! Like cleaning one’s house, cybersecurity and compliance require actual elbow grease and a disciplined effort to understand what’s going on in an organization. Only then can you leverage the fancy tech to become more effective and efficient.