Is your AI assistant as forgetful as a goldfish? Try equipping it with a “memory” and see the difference!


Have you ever had such an experience? You’ve just finished telling an AI something, but when you start another conversation shortly after, it immediately “forgets” as if it never knew you. You have to start from scratch every time you talk, and the only word to describe the experience is: exhausting.

It’s time to equip the AI with some “high-tech” features: let’s give it a memory. Today, let’s talk about how to use Python + DeepSeek to build an AI assistant that has memory, can evolve, and can get things done. That way, it won’t feel like a cold, robotic entity, but rather like a smart companion that chats with you.

Why does AI need memory in the first place?

You don’t need me to elaborate on this. Just imagine:
Today you tell the AI you like Americanos, and tomorrow it can recommend which coffee shop has them on discount; you tell it you’re going to Hangzhou next week, and it remembers to remind you to pack an umbrella. Now *that’s* what I call “smart.”

An AI without memory can only answer questions one by one, forever stuck in the past ten seconds. It can neither understand the context nor help you remember things. We’re not chatting with a disposable cup here; we deserve a “TV series” kind of experience.

How is AI’s memory created?

Simply put, what we need to do is provide AI with a “notebook” that records everything you say. For this “notebook,” we can use the lightweight SQLite, and the code is quite simple. Let’s take a look at an example:

import sqlite3
import datetime

# One shared connection; check_same_thread=False lets a web framework's worker threads reuse it
conn = sqlite3.connect('chat_history.db', check_same_thread=False)
c = conn.cursor()

# One table is enough: who said it, when, and what was said
c.execute('''CREATE TABLE IF NOT EXISTS chat_history
             (user_id TEXT, timestamp TEXT, message TEXT)''')
conn.commit()

def save_chat_history(user_id, message):
    # Append one message to this user's history, stamped with the current time
    c.execute("INSERT INTO chat_history (user_id, timestamp, message) VALUES (?, ?, ?)",
              (user_id, datetime.datetime.now().isoformat(), message))
    conn.commit()

def get_chat_history(user_id):
    # Return the user's five most recent messages, newest first
    c.execute("SELECT message FROM chat_history WHERE user_id = ? ORDER BY timestamp DESC LIMIT 5",
              (user_id,))
    return [row[0] for row in c.fetchall()]

Look, isn’t this basically a memo pad that takes notes for you? You chat with it, and it remembers; you come back later, and it flips through the records before responding. Once you’ve got the hang of it, you can even swap in PostgreSQL or Redis for better performance.
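So how does it “flip through the records before responding”? Below is a minimal sketch of a generate_response_with_history helper (the same name the web service later calls), assuming DeepSeek’s OpenAI-compatible chat API used through the openai SDK; the model name, key handling, and prompt wording are placeholders to adapt to your own setup.

import os
from openai import OpenAI  # DeepSeek exposes an OpenAI-compatible endpoint

# Assumption: the API key is kept in an environment variable
client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"],
                base_url="https://api.deepseek.com")

def generate_response_with_history(user_id, message):
    # Pull the user's recent messages out of SQLite and hand them to the model as context
    history = get_chat_history(user_id)
    context = "\n".join(reversed(history))  # oldest first reads more naturally

    reply = client.chat.completions.create(
        model="deepseek-chat",  # assumption: swap in whichever model you actually use
        messages=[
            {"role": "system",
             "content": "You are a helpful assistant. Earlier messages from this user:\n" + context},
            {"role": "user", "content": message},
        ],
    ).choices[0].message.content

    # Remember the new exchange for next time
    save_chat_history(user_id, message)
    save_chat_history(user_id, reply)
    return reply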

With memory in place, it also needs to know how to “get things done”.

An AI that can chat is certainly great, but it would be even better if it could also help you with practical tasks, such as checking the weather, searching for information… That’s when it becomes truly useful. Let’s equip it with some skill sets.

A multilingual translator, ready to go

Got an English email from your boss to reply to? Don’t panic: the AI can translate it in seconds.

from deep_translator import GoogleTranslator

def translate_text(text, target_lang='en'):
    # Auto-detect the source language and translate into the target language
    return GoogleTranslator(source='auto', target=target_lang).translate(text)

Don’t worry about less common languages either. Japanese, French, Korean, you name it: it handles them all.
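A quick usage sketch (the sample sentences are just placeholders):

# Draft a reply in Chinese, send it out in English; or go the other way into Japanese
print(translate_text("今天的会议改到下午三点", "en"))   # e.g. "Today's meeting is moved to 3 p.m."
print(translate_text("Thanks, see you tomorrow.", "ja"))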

How to make AI “always online” waiting for you?

Now that the functionality is in place, you can’t keep running it on your own machine forever. Wrap it in Flask or FastAPI and deploy it to a server, and it becomes a service you set up once and can call from anywhere.

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/chat', methods=['POST'])
def chat():
    # Expects a JSON body like {"user_id": "...", "message": "..."}
    user_id = request.json['user_id']
    message = request.json['message']
    response = generate_response_with_history(user_id, message)
    return jsonify({'response': response})

if __name__ == '__main__':
    app.run()

If you’re after speed, FastAPI is the better choice; fast response times are the whole point, after all. And once it’s packaged with Docker, deployment isn’t much trouble either.
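For comparison, here is a minimal FastAPI sketch of the same /chat endpoint; it reuses the generate_response_with_history helper from earlier and assumes the file is called main.py and runs under uvicorn.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    user_id: str
    message: str

@app.post("/chat")
def chat(req: ChatRequest):
    # Same logic as the Flask route, with typed request validation thrown in for free
    return {"response": generate_response_with_history(req.user_id, req.message)}

# Run with: uvicorn main:app --host 0.0.0.0 --port 8000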

Pick a platform and hit the road.

Once it’s online, here are a few platforms that can keep it running in the background around the clock.

  • Vercel: Suitable for small projects that deploy both the front-end and back-end together.
  • Heroku: Beginner-friendly (its old free tier has been retired, but entry-level plans are cheap).
  • AWS Lambda: Cost-effective at low traffic; with pay-as-you-go pricing, an idle assistant costs next to nothing.

With any of these, you can reach your AI assistant from anywhere with an Internet connection: your phone, your laptop, even the screen on your refrigerator door.

A few final words.

Creating an AI assistant with memory and skills is a real efficiency boost for developers. For example, when I’m writing code and running out of inspiration, I just chat with the assistant for a while. It remembers the requirements and coding styles I mentioned before, recommends open-source libraries I like, translates documents, debugs errors…

If you’re also someone who’s busy as a bee every working day, it’s worth spending a weekend trying this out. It really does help.

And using an AI you built yourself is just comfortable. You no longer have to worry that the “official AI” will one day forget your name. Besides, as the saying goes, God helps those who help themselves.
