The core issue is lock-in. When users form an emotional bond with an AI and the company can just flip a switch and delete it, that is a recipe for harm. This is why I think the future is decentralized AI companions where users own their conversation history and memory. I have been experimenting with building something along these lines on Telegram (https://t.me/adola2048_bot), where the memory layer persists independently of the service itself. It is still early, but the emotional attachment people form even with basic implementations is real, and companies need to take that responsibility seriously.
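To make "persists independently" concrete: it can be as simple as keeping memory in a portable store the user can export wholesale and re-attach elsewhere. Here is a minimal Python sketch of that idea using plain SQLite (all names and the schema are hypothetical, not the actual bot's code):

```python
import json
import sqlite3
import time

# Hypothetical user-owned memory layer: conversation memory lives in a
# portable SQLite file keyed by user ID, outside any one provider's walls.
class MemoryStore:
    def __init__(self, path="memory.db"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            """CREATE TABLE IF NOT EXISTS memories (
                   user_id TEXT,
                   ts REAL,
                   role TEXT,
                   content TEXT
               )"""
        )
        self.conn.commit()

    def remember(self, user_id, role, content):
        # Append one turn of conversation to the user's memory.
        self.conn.execute(
            "INSERT INTO memories VALUES (?, ?, ?, ?)",
            (user_id, time.time(), role, content),
        )
        self.conn.commit()

    def recall(self, user_id, limit=20):
        # Fetch the most recent turns, returned oldest-first so they can
        # be dropped straight into a model prompt.
        rows = self.conn.execute(
            "SELECT role, content FROM memories WHERE user_id = ? "
            "ORDER BY ts DESC LIMIT ?",
            (user_id, limit),
        ).fetchall()
        return list(reversed(rows))

    def export(self, user_id):
        # Hand the user their entire history as JSON, ready to take to
        # any other companion or provider.
        rows = self.conn.execute(
            "SELECT ts, role, content FROM memories WHERE user_id = ?",
            (user_id,),
        ).fetchall()
        return json.dumps(
            [{"ts": t, "role": r, "content": c} for t, r, c in rows]
        )
```

The whole point is the export path: the user walks away with everything needed to reconstruct the relationship somewhere else, so no single company holds the kill switch.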