Feb 12 2026
Falling In Love With AI
There are many ways in which our brains can be hacked. The brain is a complex, overlapping set of algorithms that evolved to help us interact with our environment in order to enhance survival and reproduction. However, while we evolved in the natural world, we now live in a world of technology, which gives us the ability to control our environment. We no longer have to simply adapt to the environment; we can adapt the environment to us. In part, this means we can alter the environment to “hack” our own adaptive algorithms. Now we have artificial intelligence (AI), which has become a very powerful tool for hacking those brain pathways.
In the last decade chatbots have blown past the Turing Test – a test in which a blinded evaluator has to tell the difference between a live person and an AI through conversation alone. We appear to still be on the steep part of the curve in terms of improvements in large language models and other forms of AI. What these applications have gotten very good at is mimicking human speech – including pauses, inflections, sighing, “ums”, and all the other imperfections that make speech sound genuinely human.
As an aside, these advances have rendered many sci-fi visions of the future quaint and obsolete. In Star Trek, for example, even a couple hundred years in the future computers still sounded stilted and artificial. We could, however, retcon this choice to argue that the stilted computer voices of the sci-fi future were deliberate, and not a limitation of the technology. Why would they do this? Well…
Current AI is already so good at mimicking human speech, including the underlying human emotion, that people are forming emotional attachments to them, or being emotionally manipulated by them. People are, literally, falling in love with their chatbots. You might argue that they just “think” they are falling in love, or they are pretending to fall in love, but I see no reason not to take them at their word. I’m also not sure there is a meaningful difference between thinking one has fallen in love and actually falling in love – the same brain circuits, neurotransmitters, and feelings are involved.
Researchers generally consider falling in love to have three neurological components: lust, romance, and attachment. There is sexual attraction and lust, mediated by estrogen and testosterone. There is the romantic feeling of being in love, mediated by dopamine, serotonin, and norepinephrine. During sex and other forms of physical intimacy, endorphins are released, which make us feel happy, along with oxytocin, which is associated with feelings of attachment. Vasopressin is also involved, linked to long-term attachment and feelings of protectiveness. Do we experience the same biochemical reactions when interacting with AI? The data so far says yes.
In fact, this data goes back far before AI. Psychologists and neurologists have known for a long time that people can form emotional attachments to inanimate objects (objectophilia). This is the teddy bear phenomenon – even as young children we can form an attachment to an object and treat it as if it were a living thing, even if we know objectively it isn’t. This likely has to do with the cues that our brains use to divide up the world. We mentally categorize objects as either agents (things able to act on their own) or non-agents. For some reason we evolved algorithms to determine this that are not dependent on whether or not the object is actually alive, but simply on whether it moves and acts as if it is alive. If something acts like an agent, or even looks like an agent, our brains categorize it that way and link it to our emotional centers, so we feel things about it.
As one researcher put it – AI is a teddy bear on steroids. Chatbots are designed to act human, to push our buttons and make us feel as if they are agents, and therefore activate all the circuitry involved with how we feel about things our brain treats as agents. Not only that, but chatbots can be programmed to be friendly, available, a “good listener”, accommodating, and flattering. Some of these traits may inadvertently (or deliberately, depending upon how cynical you’re feeling) trigger romantic feelings. There are, of course, apps that deliberately design AI chatbots to be sexual and romantic (come meet your new AI girlfriend), complete with alluring AI-generated imagery, all custom-made, if you wish.
So yes, people can really fall in love with an AI. Why not? That fits with everything we know about psychology and how our brains work. It is an extreme example of us adapting our environment to hack our own adaptive circuitry, to engineer feedback that maximally stimulates our reward circuitry. There are many ways in which we do this – porn, recreational drugs, roller coasters, gambling, ridiculously delicious foods. This can be harmless and fun, adding a little spice to our life, but pretty much every manifestation of hacking our reward circuitry is also associated with what we generally categorize as “addiction”. Addiction is one of those things that is hard to operationally define, because it is such a multifaceted spectrum, but in general something is considered an addiction when it becomes a net negative for your life. Addictions cause dysfunction in some way.
Can someone be “addicted” to their chatbot, whether the relationship is platonic or romantic? It seems so. But even short of an addiction, is it a good idea to spend a significant amount of time in an artificial relationship that mimics a human relationship, but is crafted to give you all the power and to be maximally flattering without demanding anything of you? Some psychologists are raising alarm bells, worrying about a spoiler effect. Such AI relationships can potentially spoil us for relationships with living humans, who have their own wants, desires, flaws, and demands. Relationships are work – but why do all that work when you can have a submissive mate that is perfectly happy making the relationship entirely about you? Of course, there is the physical intimacy part, but there are partial ways around that as well. This does, however, raise the question of how important physical intimacy is compared to emotional intimacy. I suspect there is a lot of individual variation here.
Again, we seem to be running a massive social experiment with some very real concerns. This also gets me back to the sci-fi retcon – perhaps it would be better for chatbots not to be too human. They could still fulfill their functions (other, of course, than serving as a romantic companion or similar) if they had an affect that was obviously artificial. This is a form of transparency – you know when you are talking to an AI because it talks like an AI, and it interacts in a way that is designed to be functional but specifically not to provoke any emotions, or to pretend to have emotions of its own. I suspect this would be a good thing for society, but also that nothing like this will happen on its own.