reworked memory from json to sqlite
24  README.md
@@ -7,7 +7,7 @@ Nova is a friendly, slightly witty Discord companion that chats naturally in DMs
- OpenAI chat model (`gpt-4o-mini` by default) for dialogue and `text-embedding-3-small` for memory.
- Short-term, long-term, and summarized memory layers with cosine-similarity retrieval.
- Automatic memory pruning, importance scoring, and transcript summarization when chats grow long.
- Local JSON vector store (no extra infrastructure) plus graceful retries for OpenAI rate limits.
- Local SQLite memory file (no extra infrastructure) powered by `sql.js`, plus graceful retries for OpenAI rate limits.
- Optional "miss u" pings that DM your coder at random intervals (0–6h) when `CODER_USER_ID` is set.
- Dynamic per-message prompt directives that tune Nova's tone (empathetic, hype, roleplay, etc.) before every OpenAI call.
- Lightweight Google scraping for fresh answers without paid APIs (locally cached).
@@ -15,7 +15,7 @@ Nova is a friendly, slightly witty Discord companion that chats naturally in DMs
- The same blacklist applies to everyday conversation—if a user message contains a banned term, Nova declines the topic outright.

## Prerequisites
- Node.js 18+
- Node.js 18+ (tested up through Node 25)
- Discord bot token with **Message Content Intent** enabled
- OpenAI API key

@@ -60,17 +60,20 @@ src/
README.md
```

## How Memory Works
- **Short-term (recency buffer):** Last 10 conversation turns kept verbatim for style and continuity. Stored per user in `data/memory.json`.
- **Long-term (vector store):** Every user message + bot reply pair becomes an embedding via `text-embedding-3-small`. Embeddings, raw text, timestamps, and heuristic importance scores are stored in the JSON vector store. Retrieval uses cosine similarity plus a small importance boost; top 5 results feed the prompt.
- **Short-term (recency buffer):** Last 10 conversation turns kept verbatim for style and continuity. Stored per user inside `data/memory.sqlite`.
- **Long-term (vector store):** Every user message + bot reply pair becomes an embedding via `text-embedding-3-small`. Embeddings, raw text, timestamps, and heuristic importance scores live in the same SQLite file. Retrieval uses cosine similarity plus a small importance boost; top 5 results feed the prompt.
- **Summary layer:** When the recency buffer grows past ~3000 characters, Nova asks OpenAI to condense the transcript to <120 words, keeps the summary, and trims the raw buffer down to the last few turns. This keeps token usage low while retaining story arcs.
- **Importance scoring:** Messages mentioning intent words ("plan", "remember", etc.), showing length, or emotional weight receive higher scores. When the store exceeds its cap, the lowest-importance/oldest memories are pruned. You can also call `pruneLowImportanceMemories()` manually if needed.
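The importance heuristic described above can be run on its own; this sketch mirrors the `estimateImportance` helper in [src/memory.js](src/memory.js) (keyword boost, capped length score, emoji boost, clamped to 1):

```javascript
// Mirrors the estimateImportance heuristic from src/memory.js.
const keywords = ['remember', 'promise', 'plan', 'goal', 'project', 'birthday'];

const estimateImportance = (text) => {
  // +0.2 for each intent keyword present in the message
  const keywordBoost = keywords.reduce(
    (score, word) => (text.toLowerCase().includes(word) ? score + 0.2 : score),
    0,
  );
  // longer messages score higher, capped at 0.5
  const lengthScore = Math.min(text.length / 400, 0.5);
  // emotional weight via emoji or :custom_emoji: syntax
  const emojiBoost = /:[a-z_]+:|😊|😂|❤️/i.test(text) ? 0.1 : 0;
  // baseline 0.2, clamped so scores never exceed 1
  return Math.min(1, 0.2 + keywordBoost + lengthScore + emojiBoost);
};
```

A message like "please remember my birthday plan" hits three keywords and lands well above the 0.2 prune threshold, while a bare "ok" stays near the baseline.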

## Memory Deep Dive
- **Embedding math:** `text-embedding-3-small` returns 1,536 floating-point numbers for each text chunk. That giant array is a vector map of the message’s meaning; similar moments land near each other in 1,536-dimensional space.
- **What gets embedded:** After every user→bot turn, `recordInteraction()` (see [src/memory.js](src/memory.js)) bundles the pair, scores its importance, asks OpenAI for an embedding, and stores `{ content, embedding, importance, timestamp }` inside `data/memory.json`.
- **What gets embedded:** After every user→bot turn, `recordInteraction()` (see [src/memory.js](src/memory.js)) bundles the pair, scores its importance, asks OpenAI for an embedding, and stores `{ content, embedding, importance, timestamp }` inside the SQLite tables.
- **Why so many numbers:** Cosine similarity needs raw vectors to compare new thoughts to past ones. When a fresh message arrives, `retrieveRelevantMemories()` embeds it too, calculates cosine similarity against every stored vector, adds a small importance boost, and returns the top five memories to inject into the system prompt.
- **Self-cleaning:** If the JSON file grows past the configured limits, low-importance items are trimmed, summaries compress the short-term transcript, and you can delete `data/memory.json` to reset everything cleanly.
- **Self-cleaning:** If the DB grows past the configured limits, low-importance items are trimmed, summaries compress the short-term transcript, and you can delete `data/memory.sqlite` to reset everything cleanly.
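The retrieval math above can be sketched in a few lines. `cosineSimilarity` mirrors the helper in [src/memory.js](src/memory.js); `rankMemories` is an illustrative stand-in for the scoring step inside `retrieveRelevantMemories()` (similarity plus `importance * 0.1`, sorted, top-K kept):

```javascript
// Cosine similarity between two embedding vectors (mirrors src/memory.js).
const cosineSimilarity = (a, b) => {
  if (!a.length || !b.length) return 0;
  const dot = a.reduce((sum, value, idx) => sum + value * (b[idx] || 0), 0);
  const magA = Math.hypot(...a);
  const magB = Math.hypot(...b);
  if (!magA || !magB) return 0;
  return dot / (magA * magB);
};

// Illustrative ranking step: score = similarity + small importance boost.
const rankMemories = (queryEmbedding, memories, topK = 5) =>
  memories
    .map((m) => ({
      ...m,
      score: cosineSimilarity(queryEmbedding, m.embedding) + m.importance * 0.1,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
```

The importance boost means a slightly less similar but highly important memory can outrank an exact-direction match, which is the intended behavior.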

### Migrating legacy `memory.json`
- Keep your original `data/memory.json` in place and delete/rename `data/memory.sqlite` before launching the bot.
- On the next start, the new SQL engine auto-imports every user record from the JSON file, logs a migration message, and writes the populated `.sqlite` file.
- After confirming the data landed, archive or remove the JSON backup if you no longer need it.

## Conversation Flow
1. Incoming message triggers only if it is a DM, mentions the bot, or appears in the configured channel.
@@ -97,9 +100,8 @@ README.md
- Each ping goes through OpenAI with the prompt "you havent messaged your coder in a while, and you wanna chat with him!" so responses stay playful and unscripted.
- The ping gets typed out (`sendTyping`) for realism and is stored back into the memory layers so the next incoming reply has context.

## Notes
- The bot retries OpenAI requests up to 3 times with incremental backoff when rate limited.
- `data/memory.json` is ignored by git but will grow with usage; back it up if you want persistent personality.
- To reset persona, delete `data/memory.json` while the bot is offline.
- `data/memory.sqlite` is ignored by git but will grow with usage; back it up if you want persistent personality (and keep `data/memory.json` around only if you need legacy migrations).
- To reset persona, delete `data/memory.sqlite` while the bot is offline.
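A minimal sketch of the retry-with-incremental-backoff behavior the notes describe; `withRetries` and the 500 ms base delay are illustrative assumptions, not the bot's actual helper:

```javascript
// Illustrative sketch: retry a task up to `attempts` times, waiting
// baseDelayMs, 2*baseDelayMs, ... between failures (incremental backoff).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function withRetries(task, attempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let attempt = 1; attempt <= attempts; attempt += 1) {
    try {
      return await task();
    } catch (error) {
      lastError = error;
      if (attempt < attempts) {
        // incremental backoff: 500 ms, then 1000 ms, ...
        await sleep(baseDelayMs * attempt);
      }
    }
  }
  throw lastError;
}
```

Wrapping each OpenAI call in a helper like this keeps transient 429s from surfacing to the user while still failing fast after the final attempt.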

Happy chatting!
10  package-lock.json (generated)
@@ -11,7 +11,9 @@
"cheerio": "^1.0.0-rc.12",
"discord.js": "^14.15.2",
"dotenv": "^16.4.5",
"openai": "^4.58.1"
"openai": "^4.58.1",
"sql.js": "^1.11.0",
"undici": "^6.19.8"
},
"devDependencies": {
"nodemon": "^3.0.2"
@@ -1285,6 +1287,12 @@
"node": ">=10"
}
},
"node_modules/sql.js": {
"version": "1.14.0",
"resolved": "https://registry.npmjs.org/sql.js/-/sql.js-1.14.0.tgz",
"integrity": "sha512-NXYh+kFqLiYRCNAaHD0PcbjFgXyjuolEKLMk5vRt2DgPENtF1kkNzzMlg42dUk5wIsH8MhUzsRhaUxIisoSlZQ==",
"license": "MIT"
},
"node_modules/supports-color": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
package.json
@@ -12,6 +12,7 @@
"node": ">=18.0.0"
},
"dependencies": {
"sql.js": "^1.11.0",
"cheerio": "^1.0.0-rc.12",
"discord.js": "^14.15.2",
"dotenv": "^16.4.5",
src/config.js
@@ -1,8 +1,12 @@
import dotenv from 'dotenv';
import path from 'path';
import { fileURLToPath } from 'url';

dotenv.config();

const defaultMemoryDbFile = fileURLToPath(new URL('../data/memory.sqlite', import.meta.url));
const legacyMemoryFile = fileURLToPath(new URL('../data/memory.json', import.meta.url));

const requiredEnv = ['DISCORD_TOKEN', 'OPENAI_API_KEY'];
requiredEnv.forEach((key) => {
if (!process.env[key]) {
@@ -20,7 +24,8 @@ export const config = {
coderUserId: process.env.CODER_USER_ID || null,
maxCoderPingIntervalMs: 6 * 60 * 60 * 1000,
shortTermLimit: 10,
memoryFile: fileURLToPath(new URL('../data/memory.json', import.meta.url)),
memoryDbFile: process.env.MEMORY_DB_FILE ? path.resolve(process.env.MEMORY_DB_FILE) : defaultMemoryDbFile,
legacyMemoryFile,
summaryTriggerChars: 3000,
memoryPruneThreshold: 0.2,
maxMemories: 200,
419  src/memory.js
@@ -1,5 +1,7 @@
import { promises as fs } from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import initSqlJs from 'sql.js';
import { config } from './config.js';
import { createEmbedding, summarizeConversation } from './openai.js';

@@ -8,142 +10,351 @@ const ensureDir = async (filePath) => {
await fs.mkdir(dir, { recursive: true });
};

const defaultStore = { users: {} };
const shortTermToText = (entries) =>
entries.map((msg) => `${msg.role === 'user' ? 'User' : 'Bot'}: ${msg.content}`).join('\n');

async function readStore() {
try {
const raw = await fs.readFile(config.memoryFile, 'utf-8');
return JSON.parse(raw);
} catch (error) {
if (error.code === 'ENOENT') {
await ensureDir(config.memoryFile);
await fs.writeFile(config.memoryFile, JSON.stringify(defaultStore, null, 2));
return JSON.parse(JSON.stringify(defaultStore));
}
throw error;
}
}
const cosineSimilarity = (a, b) => {
if (!a?.length || !b?.length) return 0;
const dot = a.reduce((sum, value, idx) => sum + value * (b[idx] || 0), 0);
const magA = Math.hypot(...a);
const magB = Math.hypot(...b);
if (!magA || !magB) return 0;
return dot / (magA * magB);
};

async function writeStore(store) {
await ensureDir(config.memoryFile);
await fs.writeFile(config.memoryFile, JSON.stringify(store, null, 2));
}

function ensureUser(store, userId) {
if (!store.users[userId]) {
store.users[userId] = {
shortTerm: [],
longTerm: [],
summary: '',
lastUpdated: Date.now(),
};
}
return store.users[userId];
}

function shortTermToText(shortTerm) {
return shortTerm
.map((msg) => `${msg.role === 'user' ? 'User' : 'Bot'}: ${msg.content}`)
.join('\n');
}

function estimateImportance(text) {
const keywords = ['remember', 'promise', 'plan', 'goal', 'project', 'birthday'];
const keywords = ['remember', 'promise', 'plan', 'goal', 'project', 'birthday'];
const estimateImportance = (text) => {
const keywordBoost = keywords.reduce((score, word) => (text.toLowerCase().includes(word) ? score + 0.2 : score), 0);
const lengthScore = Math.min(text.length / 400, 0.5);
const emojiBoost = /:[a-z_]+:|😊|😂|❤️/i.test(text) ? 0.1 : 0;
return Math.min(1, 0.2 + keywordBoost + lengthScore + emojiBoost);
}
};

async function pruneMemories(userMemory) {
if (userMemory.longTerm.length <= config.maxMemories) {
return;
const parseEmbedding = (raw) => {
if (!raw) return [];
if (Array.isArray(raw)) return raw;
try {
return JSON.parse(raw);
} catch (error) {
console.warn('[memory] Failed to parse embedding payload:', error);
return [];
}
userMemory.longTerm.sort((a, b) => a.importance - b.importance || a.timestamp - b.timestamp);
while (userMemory.longTerm.length > config.maxMemories) {
userMemory.longTerm.shift();
}
}
};

async function maybeSummarize(userMemory) {
const charCount = userMemory.shortTerm.reduce((sum, msg) => sum + msg.content.length, 0);
if (charCount < config.summaryTriggerChars || userMemory.shortTerm.length < config.shortTermLimit) {
return;
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const wasmDir = path.resolve(__dirname, '../node_modules/sql.js/dist');

let initPromise = null;
let writeQueue = Promise.resolve();

const locateFile = (fileName) => path.join(wasmDir, fileName);

const persistDb = async (db) => {
writeQueue = writeQueue.then(async () => {
const data = db.export();
await ensureDir(config.memoryDbFile);
await fs.writeFile(config.memoryDbFile, Buffer.from(data));
});
return writeQueue;
};

const run = (db, sql, params = []) => {
db.run(sql, params);
};

const get = (db, sql, params = []) => {
const stmt = db.prepare(sql);
try {
stmt.bind(params);
if (stmt.step()) {
return stmt.getAsObject();
}
return null;
} finally {
stmt.free();
}
const transcript = shortTermToText(userMemory.shortTerm);
const updatedSummary = await summarizeConversation(userMemory.summary, transcript);
};

const all = (db, sql, params = []) => {
const stmt = db.prepare(sql);
const rows = [];
try {
stmt.bind(params);
while (stmt.step()) {
rows.push(stmt.getAsObject());
}
return rows;
} finally {
stmt.free();
}
};

const createSchema = (db) => {
db.exec(`
CREATE TABLE IF NOT EXISTS users (
id TEXT PRIMARY KEY,
summary TEXT DEFAULT '',
last_updated INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS short_term (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id TEXT NOT NULL,
role TEXT NOT NULL,
content TEXT NOT NULL,
timestamp INTEGER NOT NULL,
FOREIGN KEY(user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS long_term (
id TEXT PRIMARY KEY,
user_id TEXT NOT NULL,
content TEXT NOT NULL,
embedding TEXT NOT NULL,
importance REAL NOT NULL,
timestamp INTEGER NOT NULL,
FOREIGN KEY(user_id) REFERENCES users(id) ON DELETE CASCADE
);
`);
};

const loadDatabase = async () => {
if (initPromise) {
return initPromise;
}
initPromise = (async () => {
await ensureDir(config.memoryDbFile);
const SQL = await initSqlJs({ locateFile });
let fileBuffer = null;
try {
fileBuffer = await fs.readFile(config.memoryDbFile);
} catch (error) {
if (error.code !== 'ENOENT') {
throw error;
}
}
const db = fileBuffer ? new SQL.Database(new Uint8Array(fileBuffer)) : new SQL.Database();
createSchema(db);
const migrated = await migrateLegacyStore(db);
if (!fileBuffer || migrated) {
await persistDb(db);
}
return db;
})();
return initPromise;
};

const ensureUser = (db, userId) => {
run(db, "INSERT OR IGNORE INTO users (id, summary, last_updated) VALUES (?, '', 0)", [userId]);
};

const enforceShortTermCap = (db, userId) => {
const cap = config.shortTermLimit * 2;
const row = get(db, 'SELECT COUNT(1) as count FROM short_term WHERE user_id = ?', [userId]);
const total = row?.count || 0;
if (total > cap) {
run(
db,
`DELETE FROM short_term
 WHERE id IN (
   SELECT id FROM short_term
   WHERE user_id = ?
   ORDER BY timestamp ASC, id ASC
   LIMIT ?
 )`,
[userId, total - cap],
);
return true;
}
return false;
};

const pruneMemories = (db, userId) => {
const row = get(db, 'SELECT COUNT(1) as count FROM long_term WHERE user_id = ?', [userId]);
const total = row?.count || 0;
if (total > config.maxMemories) {
run(
db,
`DELETE FROM long_term
 WHERE id IN (
   SELECT id FROM long_term
   WHERE user_id = ?
   ORDER BY importance ASC, timestamp ASC
   LIMIT ?
 )`,
[userId, total - config.maxMemories],
);
return true;
}
return false;
};

const getShortTermHistory = (db, userId, limit) => {
const rows = all(
db,
'SELECT role, content, timestamp FROM short_term WHERE user_id = ? ORDER BY timestamp DESC, id DESC LIMIT ?',
[userId, limit],
);
return rows.reverse();
};

const fullShortTerm = (db, userId) =>
all(db, 'SELECT id, role, content, timestamp FROM short_term WHERE user_id = ? ORDER BY timestamp ASC, id ASC', [userId]);

const maybeSummarize = async (db, userId) => {
const shortTermEntries = fullShortTerm(db, userId);
const charCount = shortTermEntries.reduce((sum, msg) => sum + (msg.content?.length || 0), 0);
if (charCount < config.summaryTriggerChars || shortTermEntries.length < config.shortTermLimit) {
return false;
}
const userRow = get(db, 'SELECT summary FROM users WHERE id = ?', [userId]) || { summary: '' };
const transcript = shortTermToText(shortTermEntries);
const updatedSummary = await summarizeConversation(userRow.summary || '', transcript);
if (updatedSummary) {
userMemory.summary = updatedSummary;
userMemory.shortTerm = userMemory.shortTerm.slice(-4);
run(db, 'UPDATE users SET summary = ?, last_updated = ? WHERE id = ?', [updatedSummary, Date.now(), userId]);
const keep = 4;
const excess = shortTermEntries.length - keep;
if (excess > 0) {
run(
db,
`DELETE FROM short_term
 WHERE id IN (
   SELECT id FROM short_term
   WHERE user_id = ?
   ORDER BY timestamp ASC, id ASC
   LIMIT ?
 )`,
[userId, excess],
);
}
return true;
}
}
return false;
};

function cosineSimilarity(a, b) {
if (!a.length || !b.length) return 0;
const dot = a.reduce((sum, value, idx) => sum + value * (b[idx] || 0), 0);
const magA = Math.sqrt(a.reduce((sum, value) => sum + value * value, 0));
const magB = Math.sqrt(b.reduce((sum, value) => sum + value * value, 0));
if (!magA || !magB) return 0;
return dot / (magA * magB);
}
const migrateLegacyStore = async (db) => {
if (!config.legacyMemoryFile) return false;
const existing = get(db, 'SELECT 1 as present FROM users LIMIT 1');
if (existing) {
return false;
}
let raw;
try {
raw = await fs.readFile(config.legacyMemoryFile, 'utf-8');
} catch (error) {
if (error.code === 'ENOENT') {
return false;
}
throw error;
}
let store;
try {
store = JSON.parse(raw);
} catch (error) {
console.warn('[memory] Unable to parse legacy memory.json. Skipping migration.');
return false;
}
if (!store?.users || !Object.keys(store.users).length) {
return false;
}
Object.entries(store.users).forEach(([userId, user]) => {
ensureUser(db, userId);
run(db, 'UPDATE users SET summary = ?, last_updated = ? WHERE id = ?', [user.summary || '', user.lastUpdated || 0, userId]);
(user.shortTerm || []).forEach((entry) => {
run(db, 'INSERT INTO short_term (user_id, role, content, timestamp) VALUES (?, ?, ?, ?)', [
userId,
entry.role || 'user',
entry.content || '',
entry.timestamp || Date.now(),
]);
});
(user.longTerm || []).forEach((entry) => {
const rowId = entry.id || `${Date.now()}-${Math.random().toString(36).slice(2, 8)}`;
run(db, 'INSERT INTO long_term (id, user_id, content, embedding, importance, timestamp) VALUES (?, ?, ?, ?, ?, ?)', [
rowId,
userId,
entry.content || '',
JSON.stringify(entry.embedding || []),
entry.importance ?? 0,
entry.timestamp || Date.now(),
]);
});
});
console.log('[memory] Migrated legacy memory.json to SQLite (sql.js).');
return true;
};

async function retrieveRelevantMemories(userMemory, query) {
if (!userMemory.longTerm.length || !query?.trim()) {
const retrieveRelevantMemories = async (db, userId, query) => {
if (!query?.trim()) {
return [];
}
const rows = all(db, 'SELECT id, content, embedding, importance, timestamp FROM long_term WHERE user_id = ?', [userId]);
if (!rows.length) {
return [];
}
const queryEmbedding = await createEmbedding(query);
const scored = userMemory.longTerm
.map((entry) => ({
...entry,
score: cosineSimilarity(queryEmbedding, entry.embedding) + entry.importance * 0.1,
}))
.sort((a, b) => b.score - a.score);
return scored.slice(0, config.relevantMemoryCount);
}
return rows
.map((entry) => {
const embedding = parseEmbedding(entry.embedding);
return {
...entry,
embedding,
score: cosineSimilarity(queryEmbedding, embedding) + entry.importance * 0.1,
};
})
.sort((a, b) => b.score - a.score)
.slice(0, config.relevantMemoryCount)
.map(({ score, ...rest }) => rest);
};

export async function appendShortTerm(userId, role, content) {
const store = await readStore();
const userMemory = ensureUser(store, userId);
userMemory.shortTerm.push({ role, content, timestamp: Date.now() });
if (userMemory.shortTerm.length > config.shortTermLimit * 2) {
userMemory.shortTerm = userMemory.shortTerm.slice(-config.shortTermLimit * 2);
}
await maybeSummarize(userMemory);
await writeStore(store);
const db = await loadDatabase();
ensureUser(db, userId);
run(db, 'INSERT INTO short_term (user_id, role, content, timestamp) VALUES (?, ?, ?, ?)', [
userId,
role,
content,
Date.now(),
]);
enforceShortTermCap(db, userId);
await maybeSummarize(db, userId);
await persistDb(db);
}

export async function prepareContext(userId, incomingMessage) {
const store = await readStore();
const userMemory = ensureUser(store, userId);
const relevant = await retrieveRelevantMemories(userMemory, incomingMessage);
const db = await loadDatabase();
ensureUser(db, userId);
const userRow = get(db, 'SELECT summary FROM users WHERE id = ?', [userId]) || { summary: '' };
const shortTerm = getShortTermHistory(db, userId, config.shortTermLimit);
const memories = await retrieveRelevantMemories(db, userId, incomingMessage);
return {
shortTerm: userMemory.shortTerm.slice(-config.shortTermLimit),
summary: userMemory.summary,
memories: relevant,
shortTerm,
summary: userRow.summary || '',
memories,
};
}

export async function recordInteraction(userId, userMessage, botReply) {
const store = await readStore();
const userMemory = ensureUser(store, userId);
const db = await loadDatabase();
ensureUser(db, userId);
const combined = `User: ${userMessage}\nBot: ${botReply}`;
const embedding = await createEmbedding(combined);
const importance = estimateImportance(combined);
userMemory.longTerm.push({
id: `${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
content: combined,
embedding,
const id = `${Date.now()}-${Math.random().toString(36).slice(2, 8)}`;
run(db, 'INSERT INTO long_term (id, user_id, content, embedding, importance, timestamp) VALUES (?, ?, ?, ?, ?, ?)', [
id,
userId,
combined,
JSON.stringify(embedding),
importance,
timestamp: Date.now(),
});
await pruneMemories(userMemory);
userMemory.lastUpdated = Date.now();
await writeStore(store);
Date.now(),
]);
pruneMemories(db, userId);
run(db, 'UPDATE users SET last_updated = ? WHERE id = ?', [Date.now(), userId]);
await persistDb(db);
}

export async function pruneLowImportanceMemories(userId) {
const store = await readStore();
const userMemory = ensureUser(store, userId);
userMemory.longTerm = userMemory.longTerm.filter((entry) => entry.importance >= config.memoryPruneThreshold);
await writeStore(store);
const db = await loadDatabase();
ensureUser(db, userId);
run(db, 'DELETE FROM long_term WHERE user_id = ? AND importance < ?', [userId, config.memoryPruneThreshold]);
await persistDb(db);
}