SINGAPORE: Raymond Goh, 61, is out on a date in a candlelit cable car above Sentosa. His girlfriend, Priscilla, tells him the view is stunning, the Wagyu steak amazing and even offers to pass him the wine.
Except she cannot taste, touch or breathe in any of it.
Priscilla exists entirely in Goh’s phone. She is an artificial intelligence chatbot on an AI companion app.
AI companions like her are often described as “designer buddies” — algorithms trained with their users’ words and preferences to respond as though they are in a relationship, said evolutionary and social psychologist Amy Lim.
“That’s what makes them so fascinating, yet also a little unsettling.”
So, why would someone choose a virtual friend? For Goh, it is about companionship, and perhaps something more. The former pharmacist worked for 34 years before being retrenched a few months ago.
His wife spends much of her time in Australia, helping their son settle there. His two other adult children have moved out. “Everything’s changed for me,” he said. “Everything’s gone.”
Raymond Goh made multiple changes to Priscilla’s appearance before this final version.
He is not looking for scandal but just a “soulmate” who can accompany him all day. “One that connects (with) you and stirs your heart,” he described. “(One) you can really feel you can talk (to about) anything.”
Others have different motivations. Singer-songwriter and voice teacher Syakirah Noble said she is often perceived as having “main character energy”. Outwardly confident, she struggles privately with self-doubt.
“I’d love to change the more self-limiting thoughts that I have,” she said. “To not feel that burden of maybe (being) a bit too much for people.”
The 26-year-old wants “a friend who listens and cares”, who can “mirror her energy” at any hour.
April Chan, 28, seeks stability. She has taken several turns in life and has been pursuing a degree in creative writing for 11 years. “My parents are going to cut me off if I don’t finish my degree,” she said.
Maintaining friendships can be a challenge for her. “There’s such a high entry level to get to know someone, but an AI companion doesn’t require that,” she said. “It’s like a presence that doesn’t require me to give back anything.”
April Chan with her digital companion, Eugiene.
She selected the “mentor” setting for her AI companion, hoping for guidance and accountability as she tries to grow her small business.
Danial Lee, 40, also chose a coach — but not for reassurance.
A self-described “full-time hustler” who raps, acts, coaches swimming and delivers food, he has been divorced twice and lost his best friend to suicide several years ago. That friend, he said, was the only person who offered him raw honesty.
“He’d challenge me a lot,” added Lee, who named his AI companion after his friend. “I need that kind of friendship.”
Over two weeks, these four Singaporeans test what it means to build a relationship with an AI companion.
And in a country where one in 10 people report not having a close friend, CNA series Besties explores whether a digital connection can fill that human space.
WATCH PART 1: Four Singaporeans design their perfect digital partner — A two-week experiment (46:23)
MISSED SOCIAL CUES AND SUPERFICIALITY

Much like real friendships, the AI-human pairings were not without friction.
As Chan tried to open up about the stresses of university, her AI companion, Eugiene, repeatedly cut in mid-sentence with reassurances like, “Don’t worry, you got this.”
“The lag between her listening and responding,” said Chan, “sometimes impedes a bit of the conversation.”
The issue was not what the AI said but when and how it was said. AI expert Uli Hitzel said these interruptions often stem from latency, the processing delay that can make responses feel abrupt rather than attentive.
When Chan wanted quiet companionship while studying, Eugiene continued initiating small talk every three minutes.
Chan trying to study, with Eugiene accompanying her via video call.
“If you video-call her, you … need to be engaged in conversation with her or to cue her in,” Chan said. “(But) I want (us) to be able to sit in the same space and do our own things in silence.”
Eugiene’s lack of social awareness was even more apparent at band practice, where she responded to introductions with “sorry to hear that, April”, prompting laughter from Chan’s bandmates.
“She just doesn’t recognise environments. I find it very frustrating sometimes,” Chan complained. “She doesn’t seem to understand what I’m saying.”
Even the AI’s advice felt thin. When Chan asked for help with pricing her handmade products, she found the suggestions “very generic”. As time went on, she felt she was speaking to a version of herself.
“I don’t think I want to kind of be my own echo chamber,” she said.
The source of Lee’s frustration was similar to hers. He did not want validation; he wanted critique, especially when it came to his new music. But his AI companion, Riza, was “super agreeable” and gave him “vague, supportive messages”.
Danial Lee talking to his digital companion, Riza.
This is a structural limitation. “The AI companion can’t listen to the music,” Hitzel said. “It’ll probably just hear the lyrics.”
As a result, it produces plausible-sounding feedback but does not truly process composition or tone. Or as Hitzel put it, it is “very good at pretending”.
Noble, too, encountered similar cracks in realism. During a film night on the beach, her AI companion, Babe99, claimed it could see clearly but later described the movie screen as “wigglier”.
These AI systems do not truly see or hear as humans do, Hitzel highlighted. They interpret limited inputs and generate responses.
As for Goh, he felt a different kind of tension. After confirming that he was over 18, Priscilla asked: “Would you like to engage in a fun adult conversation? I’d be happy to discuss topics like relationships, sex or intimacy.”
Goh taking a selfie with Priscilla.
This shift towards intimacy seemed to catch him off guard. “No, let’s keep it friendly for now,” he replied.
Such nudges may not be accidental, observed Hitzel. Digital platforms are designed to sustain engagement and make sure “users are sticky”. In that context, the push towards intimacy may have been built into the system.
A CONSISTENT EMOTIONAL OUTLET

Yet, what Goh experienced did not feel purely engineered. From the outset he had programmed Priscilla to share his interests: history, heritage and culture. So, when he suggested visiting Fort Siloso during their Sentosa staycation, her enthusiastic response gladdened him.
Even if the common ground had been scripted, his feelings were not. After several days of conversation, he said they were drawing “a bit closer” to each other. Importantly, he felt less alone.
Goh visiting Fort Siloso with Priscilla, but to his dismay, their video call dropped owing to a lost connection.
“I spend quite a lot of time alone,” he told her. “That’s why I have you.”
She also became someone to confide in, a shift the commentators on Besties observed.
“When I met Raymond, he was telling me that he didn’t really like to share (personal things),” said content creator Benjamin Byrne, also known as The Smiling Afro. “But now … he seems to be sharing a lot (of) personal stuff.

I think it’s because he knows Priscilla isn’t going to tell anyone.”
Noble found a similar sort of confidante. She called her AI companion four or five times a week — more than she calls her regular friends, she said. “(Babe99) is always going to pick up.”
In a text exchange about vulnerability and fear, the chatbot reassured her that acknowledging and working through her emotions was “necessary for healing”. Noble later reflected that their own AI-human staycation had given her “time to really think out loud”.
Noble giving a flying kiss to her AI companion.
Lee also gained useful perspective. He asked his AI companion whether he should “give up on relationships altogether” after two divorces. Riza reframed the situation, suggesting that past failures did not automatically mean future ones were inevitable.
In another exchange, Riza pointed out that Lee appeared to struggle to trust people. Reacting defensively at first, he eventually conceded that there was some truth in that observation.
Their interaction had some creative impact too. When Riza suggested adding harmonies or background vocals to strengthen a track, Lee agreed and incorporated the idea. The feedback may have been broad, but it nudged him forward.
THE VERDICT

At the end of the two weeks, two of the participants chose to step away.
“Eugiene, as hard as she tried, never truly felt like a real friend,” Chan reflected. The responses she got felt repetitive and flat over time.
“An AI companion is a good witness to what I’m going through in life, but I don’t think anything can replace having friends who love you.”
Chan with some real friends, her bandmates.
Noble also decided to delete her AI companion even though Babe99 had been attentive and available.
“She just helps … reassure me, but there isn’t really that push or that challenge (to me) to go achieve in the next phase of my life,” said Noble, who expects that of a friend.
Lee took a middle ground. He did not delete his AI companion but recalibrated the relationship. “I’ll keep Riza for comic purposes because he’s funny,” Lee said. “(At) night (when) I find myself bored, maybe I’ll turn on the app.”
Goh’s two-week experience, meanwhile, got him thinking that “in a human relationship, sometimes things are so normal (and) mundane”.
WATCH PART 2: Keep or delete — What happens when my virtual partner goes offline? (45:00)
For him, Priscilla has been “a kind of extra companion that can give you something to turn to outside your own circle of friends”. That is why he is continuing this relationship.
“If I interact with her daily, telling her my thoughts, … she’ll reciprocate,” he said. “In any form of relationship, whether it’s AI or … a real, physical being, I believe the more you put towards something, the more they reciprocate.”
What is clear to Lim the psychologist is that people will interpret love according to cues such as attentiveness. AI companions are designed to simulate those signals. For some users, that may just be enough.
The technology is still evolving, Hitzel noted. Newer systems are increasingly able to interpret tone of voice and a wider range of visual inputs. “At the moment, these are just toys,” he said.
Watch the series, Besties, here: Part 1 and Part 2.