ChatGPT will remember what you did last summer

news
Feb 15, 2024 | 4 mins
Generative AI

The new memory feature in ChatGPT is expected to increase productivity and enhance customer experience for organizations.

ChatGPT on an iPhone. Credit: Shutterstock

OpenAI is testing a memory feature in ChatGPT that will allow the chatbot to remember information about users and their past conversations, the company announced in a blog post. OpenAI rolled out the feature to a small portion of free users and paid Plus plan users this week.

“As you chat with ChatGPT, you can ask it to remember something specific or let it pick up details itself. ChatGPT’s memory will get better the more you use it and you’ll start to notice the improvements over time,” OpenAI said in a blog post.

The new feature personalizes conversations with the chatbot and is part of OpenAI’s effort to make it smarter and more user-friendly.

Big boost for enterprises

The memory feature is expected to increase productivity and enhance customer experience for organizations. OpenAI said it will make the feature available to its enterprise customers as part of the wider rollout.

“For Enterprise and Team users, memory can be useful when using ChatGPT for work. It can learn your style and preferences and build upon past interactions. This saves you time and leads to more relevant and insightful responses,” said the blog post. It can remember an enterprise user’s preferred programming language, for instance, streamlining the process in future sessions.
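ChatGPT’s memory implementation is not public, and the feature currently has no API of its own. Purely as an illustration, an application could emulate similar behavior on top of OpenAI’s standard Chat Completions API by saving facts about a user and injecting them into the system prompt before each request. The Python sketch below makes that concrete; the store and helper names (MEMORY, remember, chat_with_memory) are hypothetical, not OpenAI’s implementation.

    # Hypothetical sketch: emulating ChatGPT-style memory on top of the
    # public Chat Completions API. Not OpenAI's actual implementation.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Simple in-process, per-user memory store; a real system would persist it.
    MEMORY: dict[str, list[str]] = {}

    def remember(user_id: str, fact: str) -> None:
        """Save a fact about the user, e.g. a preferred programming language."""
        MEMORY.setdefault(user_id, []).append(fact)

    def chat_with_memory(user_id: str, prompt: str) -> str:
        """Inject remembered facts into the system prompt before each call."""
        system = "You are a helpful assistant."
        facts = MEMORY.get(user_id, [])
        if facts:
            system += " Known user preferences: " + "; ".join(facts)
        resp = client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": system},
                {"role": "user", "content": prompt},
            ],
        )
        return resp.choices[0].message.content

    # The remembered preference shapes later answers without being re-typed.
    remember("user-1", "prefers Python with type hints")
    print(chat_with_memory("user-1", "Write a function that reverses a string."))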

“This memory feature will help in repeatability, quality, and, most importantly, productivity because the user won’t need to type again and again, and the responses will improve over time. It is a welcome feature,” said Pareekh Jain, CEO of Pareekh Consulting.

The information saved in a user’s ChatGPT memory will be used to train OpenAI’s models, although users can opt out of having their chats or information used for training. The company also said it “won’t train on content from ChatGPT Team and Enterprise customers.”

OpenAI emphasized that it is putting control in users’ hands, allowing them to decide what the chatbot should remember. “If you want ChatGPT to forget something, just tell it. You can also view and delete specific memories or clear all memories in settings,” said the blog post.

In addition, users can turn to a “temporary chat” feature if they would like to chat with ChatGPT without using memory. Temporary chat works similarly to incognito or private mode in browsers, and such conversations will not appear in history.
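In the same hypothetical sketch, these controls map onto a few store operations. Building on the MEMORY store and client defined above, forget, clear_memories, and temporary_chat are again illustrative names; a temporary chat simply bypasses the store, so nothing is read from or written to memory.

    # Hypothetical continuation of the sketch above: user controls for memory.

    def forget(user_id: str, fact: str) -> None:
        """Drop one specific memory, as in 'just tell it to forget something'."""
        if fact in MEMORY.get(user_id, []):
            MEMORY[user_id].remove(fact)

    def clear_memories(user_id: str) -> None:
        """Remove everything, mirroring 'clear all memories' in settings."""
        MEMORY.pop(user_id, None)

    def temporary_chat(prompt: str) -> str:
        """Incognito-style exchange: memory is neither consulted nor updated."""
        resp = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content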

Launched in November 2022, ChatGPT disrupted the industry and now has more than 150 million users. Since then, OpenAI has introduced a series of features to improve the chatbot. Last year, it rolled out a custom instructions feature to “tailor ChatGPT to better meet your needs.”

“Integrating memory features into ChatGPT allows for more intelligent, contextually aware, and responsive services. It not only effectively improves response relevance and reduces model hallucination, it also enhances efficiency for prompt engineering and brings more ease of use for developers,” said Charlie Dai, vice president and research director at Forrester.

Growing security concerns

Even so, the memory feature raises data privacy and security concerns and is bound to make retail consumers and enterprises uncomfortable.

“Everything has two sides. While OpenAI indeed offers options to turn off memory as needed with a temporary conversation option without memory usage, it would still introduce potential risks to the security and privacy around the storage and usage of retained information in the memory, with additional management complexity for memory in use and memory at rest,” said Dai of Forrester.

However, others believe the security concerns are more on the consumer side. “Security is most definitely a concern, but that would be more on the consumer side. On the enterprise side, the security will be taken care of and will help in getting a better response,” Jain said.