
The HMC session at Etmaal 2026 on synthetic relationships – a summary


In case you missed the synthetic relationships panel at the Etmaal 2026 conference, this post summarizes the discussion, which was led by the panelists (Ruud Hortensius, Elly Konijn, and Chris Starke) and followed by audience comments.



People talk to AI chatbots and socialize with these artificial partners. We call human–AI relationships “synthetic relationships”, but we lack concrete criteria to distinguish a series of interactions from a relationship. This is not very surprising, given that we still struggle to arrive at a universal conceptualization of a relationship. A design that makes the user eager to socialize with the artificial partner seems to be an important ingredient of synthetic relationships.



Who decides whether a case is a synthetic relationship?


To investigate synthetic relationships, we must first get in contact with people who are in such relationships. However, one open question our panelists brought up is: How do we identify people in a synthetic relationship? Should we use subjective or objective criteria?


Our panelists had different ideas. Some suggested that the user is the only person who can say whether they have a relationship with their AI chatbot; thus, we must ask users themselves how they view their interactions with AI chatbots. Other panelists were concerned that, due to social desirability, users may hide their synthetic relationships. Users may also be unaware that they are in a synthetic relationship, or unwilling to accept it – despite engaging in behaviors typical of such relationships.


Does parasocial relationship theory fit synthetic relationships?


To define new types of relationships, researchers often rely on established relationship theories. The panel discussed parasocial relationship theory, which describes the one-sided bond between a media figure and an audience member. Proponents suggested that since AI chatbots are incapable of emotions, synthetic relationships will always be one-sided and will therefore resemble parasocial relationships. Other panelists disagreed, arguing that by generating personalized responses, AI chatbots can interact with the user, which sets synthetic relationships apart from parasocial ones.


However, all panelists agreed that one-sidedness is a factor we cannot ignore in synthetic relationships: “The user is the one who decides whether they will have a relationship or not. You can always be rejected by other people who don’t want to be friends with you. But I’ve never heard of someone who was rejected by an AI chatbot.”


But are one-sided interactions a unique feature of synthetic relationships? The panelists quickly produced a counter-argument: “Therapist–patient relationships may also be one-sided. The patient is the one who discloses and gets attached, while therapists are trained to remain emotionally neutral. Also, a baby cannot reject being in a relationship with their parents. Family relationships are not optional.”



Is generative AI really different from rule-based AI?


The panelists agreed that the advent of Generative AI changes what we can expect from interacting with AI. However, knowledge about older or rule-based AI systems and about the evolution of technology could help us foresee possible future developments in GenAI.

According to the panelists, we have been living in the era of the attention economy: media platforms have been racing to grab our attention for extended periods of time. We could now be seeing the dawn of an attachment economy. Some people already feel strongly attached to certain social media platforms and don’t want to stop using them. Currently, there isn’t much difference between Google Gemini, Perplexity, or ChatGPT – but they will likely specialize in the future to become more attractive than their counterparts.


How will synthetic relationships change us?


We currently do not have (enough) empirical evidence to discuss how GenAI might change inter-human relationships. However, the history of technology shows that substantial change is possible – like streaming services influencing the movie industry: “People now go to cinemas less often, and they watch movies at home while looking at their phones. That changed how movie narration is designed. Now they keep the plot very simple and repeat what is happening every 5 minutes to let inattentive people catch up with the movie.”


The panelists think that a change in expectations, skill erosion, and a decrease in quality are very likely in every aspect of our lives. People may not develop the ability to talk to people who disagree with them, which may erode the ability to discuss, negotiate, and debate. And when people outsource their skills and knowledge, they may no longer know how to judge the quality of generated outputs.


Some panelists do not think that AI chatbots can change the fundamental things that make us human – such as our need to relate. “We evolved over thousands of years to become the humans and society we are today. One technology cannot change that in 5 years. It cannot replace our need for human contact.”



How will synthetic relationships change our relationships?


“My AI understands me better than my parents” was among the findings of an interview study presented at Etmaal. “My AI understands my parents better than I do” could just as well be true, though. AI chatbots trained on knowledge about the typical behavior of past generations could indeed know older generations better than their children do. If we feel that an AI chatbot understands us well, would we prefer talking to it over the other people in our lives?


“Whenever I have a question about cars, I always go to my father-in-law. And he loves being the car expert in my eyes. If I go and ask my question to an AI chatbot instead, my father-in-law will lose those conversations that he is really passionate about.” Limited empirical research also indicates that when people engage with AI chatbots more frequently, the time they spend with other humans decreases.  


What are the immediate risks?


Of course, the risks of synthetic relationships also needed to be discussed. “People destroy art and artists by generating one picture at a time” was a risk voiced by one of the panelists. Another panelist was concerned about privacy: “In Germany, calculating taxes and preparing the documents is a complex and expensive process. AI chatbots can do it very well and save you tons of money. However, you need to give them your most private information to get accurate results. In Europe, we know how important it is that private information remains confidential. Yet the convenience of these technologies may lead users to share this information with technology companies all too easily.”


Knowing the pre-AI era is also critical for current generations. However, children born after 2022 will not know how we did certain things before GenAI, which may cause a big generational gap.



Audience reactions


Some of the most striking comments from the interactive live chat:


Maybe we want to acknowledge a relationship as social because it is meaningful to the user.
Me withdrawing money from an ATM is meaningful, but not social.
If you kick it, do you feel guilty? If yes, it’s social.
If a social robot is able to respond to the user, it’s not one-sided.
If we say “please” and “thank you” to a chatbot, it’s definitely social… or not?
Some artificial agents can be programmed to not engage with you every time, right?
In HCI, the exchange of “social signals” is bi-directional but the associated feelings are not.
I think one of the most important questions is: Do we maintain a relationship with the artificial agent itself or the programmer behind it?
AI judges us, too. My smart watch often thinks I’m lazy.
Can one AI agent form different types of relationships with different members of the family?
Does AI isolate us or train us with social skills? Do we have data or is this all an assumption?
I have learned a lot of stuff from ChatGPT!
My coding skills decline every day.
Skill erosion and acquisition at the same time.




© 2026 sHAI Group
