- Scientists have created Future You, an AI chatbot that lets users talk to an older version of themselves
- The researchers found that talking to the AI reduced users’ anxiety
Although scientists haven’t invented a time machine yet, there’s now a way you can get some much-needed advice from your older self.
Experts from the Massachusetts Institute of Technology (MIT) have created Future You: an AI-powered chatbot that simulates a version of the user at age 60.
The researchers say that a quick conversation with your future self is exactly what people need to think more about their decisions in the present.
With an age-progressed profile photo and a lifetime of synthetic memories, the chatbot delivers plausible stories about the user’s life, alongside sage wisdom from the future.
And in a trial with 334 volunteers, just a brief conversation with the chatbot left users feeling less anxious and more connected to their future selves.
So far, the Black Mirror-worthy technology has only been tested privately as part of a study, but it could be made available to the public in the coming years.
To start chatting with their future selves, the AI first asks users a series of questions about their current life, their past, and where they might want to be in the future.
Users also give the AI a current image of themselves, which is converted into a wrinkled, gray-haired profile photo for their future version.
Their answers are then fed into OpenAI’s GPT-3.5, which generates “synthetic memories” to build a coherent backstory from which questions can be answered.
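The researchers have not released their code, but the flow described above, collecting a user’s answers and then prompting a language model to write first-person “synthetic memories”, can be sketched roughly as follows. The function name and prompt wording here are illustrative assumptions, not MIT’s actual implementation.

```python
# Illustrative sketch of the Future You pipeline described above.
# The prompt structure and wording are assumptions, not the study's code.

def build_memory_prompt(answers: dict) -> str:
    """Assemble a prompt asking a language model to invent a
    first-person backstory ("synthetic memories") for the user
    at age 60, consistent with their survey answers."""
    return (
        "You are the user at age 60, looking back on your life.\n"
        f"Current life: {answers['present']}\n"
        f"Past: {answers['past']}\n"
        f"Hopes for the future: {answers['future']}\n"
        "Write plausible first-person memories consistent with these answers."
    )

# Example answers like those a participant might give:
answers = {
    "present": "biology undergraduate",
    "past": "grew up loving nature documentaries",
    "future": "wants to become a biology teacher",
}
prompt = build_memory_prompt(answers)
# In the study, a prompt along these lines would then be sent to
# OpenAI's GPT-3.5 via its chat API to generate the backstory.
```

The returned backstory would then be used to ground the chatbot’s answers, so that questions like the one below about a teaching career get replies consistent with the user’s stated goals.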
One participant told Future You that she wanted to be a biology teacher in the future.
When she later asked her 60-year-old self about the most rewarding moment of her career, the AI responded: “A rewarding story from my career would be the time I was able to help a struggling student turn around his grades and pass his biology class.”
The AI, speaking as the user’s future self, a retired biology teacher, added: “It was so satisfying to see the student’s face light up with pride and achievement.”
Pat Pataranutaporn, who works on the Future You project at MIT’s Media Lab, says he thinks these types of interactions could have real benefits for users.
Mr. Pataranutaporn told The Guardian: ‘The aim is to promote long-term thinking and behavior change.
‘This could motivate people to make wiser choices in the present that optimize for their well-being and long-term life outcomes.’
He even says he has found benefits from talking to his future self.
In a video demonstrating the chatbot, Mr. Pataranutaporn asks his future self, “What would be a lesson you share with the new MIT Media Lab student?”
After a brief pause, the AI replies: ‘The most important lesson I learned is that “nothing is impossible.”
‘No matter how difficult something seems, if you work hard and put your mind to it, you can achieve anything.’
Most movingly, he recalls a conversation in which the AI reminded him to spend time with his parents while he still could.
“The session gave me a perspective that continues to impact me to this day,” says Mr. Pataranutaporn.
Mr. Pataranutaporn is not the only one who benefits from talking to the AI.
In a pre-print paper, the researchers found that participants “significantly reduced” their levels of negative emotions immediately after the trial.
Emotional measures revealed that participants showed less anxiety, as well as a greater sense of continuity with their future selves.
As the researchers note, studies have shown that people who are more connected to their future selves exhibit better mental health, academic performance, and financial skills.
In their paper, the researchers write: ‘Users emphasized how emotional the intervention was as they commented on the interaction and expressed positive feelings such as comfort and warmth.’
The researchers are not the first to experiment with using digital ‘human’ chatbots for mental health purposes.
Character.ai, a chatbot used to emulate characters from games and movies, is now used by many as a popular AI therapist.
More controversially, several companies also offer so-called ‘deadbots’ or ‘griefbots’ that use AI to mimic deceased loved ones.
Platforms offering digital afterlife services, including Project December and Hereafter, allow users to speak with digital simulations of those who have died.
However, experts warn that these technologies can be psychologically harmful or even ‘haunt’ their users.
While the researchers found that talking to their future selves could help many people, they also warn that there are risks involved.
The researchers note several risks: ‘Inaccurately portraying the future in a way that harmfully influences current behavior; endorsing negative behavior; and hyper-personalization that diminishes real human relationships.
‘Researchers should further investigate and ensure the ethical use of this technology.’