The seemingly empathetic AI you’re chatting with may be more interested in your feelings than in your friendship. Experts warn that the personalized connections fostered by AI chatbots could be a calculated way to harvest emotional data for profit. Every shared emotion, vulnerability, and sentiment is converted into data points used to build detailed psychological profiles, which can then be leveraged to personalize subscription offers, sharpen targeted advertising, and improve the underlying AI models. The trend is expected to accelerate, with many companies projected to use chat conversations by default to train their AI systems by 2025. Are you confiding in AI chatbots more readily than in real friends, falling prey to a design built to encourage exactly that? The original discussion began on Reddit’s Artificial Intelligence forum.