Nova v1.2
@@ -12,3 +12,5 @@ ENABLE_DASHBOARD=false
 # port for the dashboard if enabled
 DASHBOARD_PORT=3000
 ENABLE_WEB_SEARCH=true
+
+OPENAI_API_KEY=your_openai_api_key
51
CHANGELOG.md
Normal file
@@ -0,0 +1,51 @@
+# Changelog
+
+All notable changes made during this working session (March 1, 2026).
+
+## Unreleased
+
+### Token + performance optimizations
+- Added `src/prompt.js` to centralize prompt construction (`buildPrompt`) and reduce repeated prompt-building logic.
+- Added a short-lived in-memory context cache in `src/bot.js` to reuse prepared context across the continuation loop and normal replies.
+- Reduced default memory/prompt sizes in `src/config.js`:
+  - `shortTermLimit`: 10 -> 6
+  - `summaryTriggerChars`: 3000 -> 2200
+  - `relevantMemoryCount`: 5 -> 3
+  - Added `longTermFetchLimit` (default 120)
+- Limited long-term memory retrieval to a recent window before similarity scoring in `src/memory.js` (uses `longTermFetchLimit`).
+- Summarized live web-search intel before injecting it into the prompt (keeps the payload shorter) in `src/bot.js`.
+- Debounced memory DB persistence in `src/memory.js` to batch multiple writes (instead of exporting/writing on every mutation).
+
+### Dashboard (local memory UI)
+- Revamped the dashboard UI layout + styling in `src/public/index.html`.
+- Added long-term memory create/edit support:
+  - API: `POST /api/users/:id/long` in `src/dashboard.js`
+  - Store: `upsertLongTerm()` in `src/memory.js`
+- Added long-term memory pagination:
+  - API: `GET /api/users/:id/long?page=&per=` returns `{ rows, total, page, per, totalPages }` via `getLongTermMemoriesPage()` in `src/memory.js`
+  - UI: paging controls; long-term list shows 15 per page (`LONG_TERM_PER_PAGE = 15`)
+- Added "search preview" UX in the dashboard to quickly reuse a similar memory result as an edit/create starting point ("Use this memory").
+- Added a simple recall timeline:
+  - API: `GET /api/users/:id/timeline?days=` in `src/dashboard.js`
+  - Store: `getMemoryTimeline()` in `src/memory.js`
+  - UI: lightweight bar chart in `src/public/index.html`
+
+### Fixes
+- Fixed dashboard long-term pagination wiring (`getLongTermMemoriesPage` import/usage) in `src/dashboard.js`.
+- Fixed dashboard long-term "Edit" button behavior by wiring row handlers in `src/public/index.html`.
+- Prevented button interactions from crashing the bot on late/invalid updates by deferring updates and editing the message in `src/bot.js`.
+
+### Discord-side features
+- Added a memory-aware reaction badge: bot reacts with `🧠` when long-term memories were injected into the prompt (`src/bot.js`).
+- Added a lightweight blackjack mini-game:
+  - Start via text trigger `/blackjack` (not a registered slash command).
+  - Single-embed game UI with button components for actions (Hit / Stand; Split is present as a placeholder).
+  - Improved interaction handling to avoid "Unknown interaction" crashes by deferring updates and editing the message (`src/bot.js`).
+
+### Reliability / guardrails
+- Relaxed the "empty response" guard in `src/openai.js`:
+  - Still throws when the provider returns no choices.
+  - If choices exist but content is blank, returns an empty string instead of forcing fallback (reduces noisy false-positive failures).
+
+### Configuration / examples
+- Updated `.env.example` to include `OPENAI_API_KEY`.
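The pagination entry above only pins down the response shape `{ rows, total, page, per, totalPages }`. A minimal sketch of the page math behind a `getLongTermMemoriesPage()`-style helper (the function name `pageMemories` and the clamping behavior are illustrative assumptions, not the actual `src/memory.js` code):

```javascript
// Hypothetical paging helper — only the returned shape comes from the changelog.
function pageMemories(allRows, page = 1, per = 15) {
  const total = allRows.length;
  const totalPages = Math.max(1, Math.ceil(total / per)); // at least one page, even when empty
  const safePage = Math.min(Math.max(1, page), totalPages); // clamp out-of-range requests
  const start = (safePage - 1) * per;
  const rows = allRows.slice(start, start + per);
  return { rows, total, page: safePage, per, totalPages };
}

const all = Array.from({ length: 32 }, (_, i) => i);
console.log(pageMemories(all, 3)); // last page holds the remaining 2 of 32 rows
```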
69
README.md
@@ -2,6 +2,63 @@
 
 Nova is a friendly, slightly witty Discord companion that chats naturally in DMs or when mentioned in servers. It runs on Node.js, uses `discord.js` v14, and supports OpenRouter (recommended) or OpenAI backends for model access, plus lightweight local memory for persistent personality.
 
+## Recent changes (2026-03-01)
+- Added token-usage + performance optimizations (prompt builder + context caching + smaller injected payloads).
+- Upgraded the local memory dashboard: long-term memory create/edit, pagination (15 per page), search preview helper, and a recall timeline view.
+- Added Discord-side extras: `🧠` memory-injected reaction badge + a `/blackjack` embed mini-game with buttons.
+- Full session log lives in `CHANGELOG.md` (and is mirrored below).
+
+<details>
+<summary>Full update log (session)</summary>
+
+- Token + performance optimizations
+  - Added `src/prompt.js` to centralize prompt construction (`buildPrompt`) and reduce repeated prompt-building logic.
+  - Added a short-lived in-memory context cache in `src/bot.js` to reuse prepared context across the continuation loop and normal replies.
+  - Reduced default memory/prompt sizes in `src/config.js`:
+    - `shortTermLimit`: 10 -> 6
+    - `summaryTriggerChars`: 3000 -> 2200
+    - `relevantMemoryCount`: 5 -> 3
+    - Added `longTermFetchLimit` (default 120)
+  - Limited long-term memory retrieval to a recent window before similarity scoring in `src/memory.js` (uses `longTermFetchLimit`).
+  - Summarized live web-search intel before injecting it into the prompt (keeps the payload shorter) in `src/bot.js`.
+  - Debounced memory DB persistence in `src/memory.js` to batch multiple writes (instead of exporting/writing on every mutation).
+
+- Dashboard (local memory UI)
+  - Revamped the dashboard UI layout + styling in `src/public/index.html`.
+  - Added long-term memory create/edit support:
+    - API: `POST /api/users/:id/long` in `src/dashboard.js`
+    - Store: `upsertLongTerm()` in `src/memory.js`
+  - Added long-term memory pagination:
+    - API: `GET /api/users/:id/long?page=&per=` returns `{ rows, total, page, per, totalPages }` via `getLongTermMemoriesPage()` in `src/memory.js`
+    - UI: paging controls; long-term list shows 15 per page (`LONG_TERM_PER_PAGE = 15`)
+  - Added "search preview" UX in the dashboard to quickly reuse a similar memory result as an edit/create starting point ("Use this memory").
+  - Added a simple recall timeline:
+    - API: `GET /api/users/:id/timeline?days=` in `src/dashboard.js`
+    - Store: `getMemoryTimeline()` in `src/memory.js`
+    - UI: lightweight bar chart in `src/public/index.html`
+
+- Fixes
+  - Fixed dashboard long-term pagination wiring (`getLongTermMemoriesPage` import/usage) in `src/dashboard.js`.
+  - Fixed dashboard long-term "Edit" button behavior by wiring row handlers in `src/public/index.html`.
+  - Prevented button interactions from crashing the bot on late/invalid updates by deferring updates and editing the message in `src/bot.js`.
+
+- Discord-side features
+  - Added a memory-aware reaction badge: bot reacts with `🧠` when long-term memories were injected into the prompt (`src/bot.js`).
+  - Added a lightweight blackjack mini-game:
+    - Start via text trigger `/blackjack` (not a registered slash command).
+    - Single-embed game UI with button components for actions (Hit / Stand; Split is present as a placeholder).
+    - Improved interaction handling to avoid "Unknown interaction" crashes by deferring updates and editing the message (`src/bot.js`).
+
+- Reliability / guardrails
+  - Relaxed the "empty response" guard in `src/openai.js`:
+    - Still throws when the provider returns no choices.
+    - If choices exist but content is blank, returns an empty string instead of forcing fallback (reduces noisy false-positive failures).
+
+- Configuration / examples
+  - Updated `.env.example` to include `OPENAI_API_KEY`.
+
+</details>
+
 ## Features
 - Conversational replies in DMs automatically; replies in servers when mentioned or in a pinned channel.
 - Chat model (defaults to `meta-llama/llama-3-8b-instruct` when using OpenRouter) for dialogue and a low-cost embedding model (`nvidia/llama-nemotron-embed-vl-1b-v2` by default). OpenAI keys/models may be used as a fallback.
@@ -9,6 +66,9 @@ Nova is a friendly, slightly witty Discord companion that chats naturally in DMs
 - **Rotating “daily mood” engine** that adjusts Nova’s personality each day (calm, goblin, philosopher, etc.). Mood influences emoji use, sarcasm, response length, and hype. (Now randomized each run rather than fixed by calendar date.)
 - **LLM-powered live–intel web search**: Nova uses the LLM itself to decide whether a topic needs a live web search. If you mention something unfamiliar or that requires current info, it automatically Googles first and uses the results in its response—without triggering on casual chat.
 - **Optional local memory dashboard** (enabled with `ENABLE_DASHBOARD=true`): spin up a simple browser UI alongside the bot. Inspect stored memories by user, delete entries, run similarity queries, view importance scores, and peek at Nova’s current mood and quirky “status” of the day. The dashboard runs on `DASHBOARD_PORT` (3000 by default) and is entirely optional.
+- Local dashboard upgrades: long-term memory create/edit, pagination (15 per page), and a simple recall timeline.
+- `🧠` reaction badge when long-term memories are injected for a reply.
+- A simple `/blackjack` mini-game (embed + buttons).
 - Automatic memory pruning, importance scoring, and transcript summarization when chats grow long.
 - Local SQLite memory file (no extra infrastructure) powered by `sql.js`, plus graceful retries for the model API (OpenRouter/OpenAI).
@@ -65,10 +125,15 @@ Nova is a friendly, slightly witty Discord companion that chats naturally in DMs
 src/
   bot.js        # Discord client + routing logic
   config.js     # Environment and tuning knobs
+  dashboard.js  # Local memory dashboard server (optional)
   openai.js     # Chat + embedding helpers with retry logic
   memory.js     # Multi-layer memory engine
+  prompt.js     # Prompt builder (system + dynamic directives)
+  public/
+    index.html  # Local dashboard UI
 .env.example
 README.md
+CHANGELOG.md
 ```
 
 - **Short-term (recency buffer):** Last 10 conversation turns kept verbatim for style and continuity. Stored per user inside `data/memory.sqlite`.
@@ -108,7 +173,11 @@ The dashboard lets you:
 - Browse all users that the bot has spoken with.
 - Inspect short‑term and long‑term memory entries, including their importance scores and timestamps.
 - Delete individual long‑term memories if you want to clean up or correct something.
+- Edit or create long‑term memories from the dashboard.
+- Paginate long‑term memories (15 per page).
 - Run a similarity search to see which stored memories are most relevant to a query.
+- Use a similarity result as a quick prefill for editing/creating a memory.
+- View a simple recall timeline for the last couple of weeks.
 - Peek at the current mood the bot is using and a quirky “status/thought” message generated each day.
 
 Once the bot is running, open your browser and go to `http://localhost:3000` (or your configured port).
507
src/bot.js
@@ -1,12 +1,13 @@
 import dotenv from "dotenv";
 dotenv.config({ path: "../.env" });
-import { Client, GatewayIntentBits, Partials, ChannelType, ActivityType } from 'discord.js';
+import { Client, GatewayIntentBits, Partials, ChannelType, ActivityType, EmbedBuilder, ActionRowBuilder, ButtonBuilder, ButtonStyle } from 'discord.js';
 import { config } from './config.js';
 import { chatCompletion } from './openai.js';
-import { appendShortTerm, prepareContext, recordInteraction } from './memory.js';
+import { appendShortTerm, recordInteraction } from './memory.js';
 import { searchWeb, appendSearchLog, detectFilteredPhrase } from './search.js';
 import { getDailyMood, setMoodByName, getDailyThought, generateDailyThought } from './mood.js';
 import { startDashboard } from './dashboard.js';
+import { buildPrompt, searchCueRegex } from './prompt.js';
 const client = new Client({
   intents: [
     GatewayIntentBits.Guilds,
@@ -21,6 +22,130 @@ let coderPingTimer;
 const continuationState = new Map();
 let isSleeping = false;
+
+const contextCache = new Map();
+const CONTEXT_CACHE_TTL_MS = 2 * 60 * 1000;
+
+const cloneShortTerm = (entries = []) => entries.map((entry) => ({ ...entry }));
+const cloneMemories = (entries = []) => entries.map((entry) => ({ ...entry }));
+
+function cacheContext(userId, context) {
+  if (!context) {
+    contextCache.delete(userId);
+    return null;
+  }
+  const snapshot = {
+    shortTerm: cloneShortTerm(context.shortTerm || []),
+    summary: context.summary,
+    memories: cloneMemories(context.memories || []),
+  };
+  contextCache.set(userId, { context: snapshot, timestamp: Date.now() });
+  return snapshot;
+}
+
+function getCachedContext(userId) {
+  const entry = contextCache.get(userId);
+  if (!entry) return null;
+  if (Date.now() - entry.timestamp > CONTEXT_CACHE_TTL_MS) {
+    contextCache.delete(userId);
+    return null;
+  }
+  return entry.context;
+}
+
+function appendToCachedShortTerm(userId, role, content) {
+  const entry = contextCache.get(userId);
+  if (!entry) return;
+  const limit = config.shortTermLimit || 6;
+  const shortTerm = entry.context.shortTerm || [];
+  shortTerm.push({ role, content });
+  if (shortTerm.length > limit) {
+    shortTerm.splice(0, shortTerm.length - limit);
+  }
+  entry.context.shortTerm = shortTerm;
+  entry.timestamp = Date.now();
+}
+
+async function appendShortTermWithCache(userId, role, content) {
+  await appendShortTerm(userId, role, content);
+  appendToCachedShortTerm(userId, role, content);
+}
+
+const blackjackState = new Map();
+const suits = ['♠', '♥', '♦', '♣'];
+const ranks = [
+  { rank: 'A', value: 1 },
+  { rank: '2', value: 2 },
+  { rank: '3', value: 3 },
+  { rank: '4', value: 4 },
+  { rank: '5', value: 5 },
+  { rank: '6', value: 6 },
+  { rank: '7', value: 7 },
+  { rank: '8', value: 8 },
+  { rank: '9', value: 9 },
+  { rank: '10', value: 10 },
+  { rank: 'J', value: 10 },
+  { rank: 'Q', value: 10 },
+  { rank: 'K', value: 10 },
+];
+
+const createDeck = () => {
+  const deck = [];
+  for (const suit of suits) {
+    for (const rank of ranks) {
+      deck.push({
+        rank: rank.rank,
+        value: rank.value,
+        label: `${rank.rank}${suit}`,
+      });
+    }
+  }
+  // Fisher–Yates shuffle
+  for (let i = deck.length - 1; i > 0; i -= 1) {
+    const j = Math.floor(Math.random() * (i + 1));
+    [deck[i], deck[j]] = [deck[j], deck[i]];
+  }
+  return deck;
+};
+
+const drawCard = (deck) => deck.pop();
+
+const scoreHand = (hand) => {
+  let total = 0;
+  let aces = 0;
+  hand.forEach((card) => {
+    total += card.value;
+    if (card.rank === 'A') {
+      aces += 1;
+    }
+  });
+  // Upgrade aces from 1 to 11 while the hand stays at or under 21.
+  while (aces > 0 && total + 10 <= 21) {
+    total += 10;
+    aces -= 1;
+  }
+  return total;
+};
+
+const formatHand = (hand) => hand.map((card) => card.label).join(' ');
+
+const blackjackReaction = async (playerHand, dealerHand, status) => {
+  try {
+    const system = {
+      role: 'system',
+      content: 'You are Nova, a playful Discord bot that just finished a round of blackjack.',
+    };
+    const playerCards = playerHand.map((card) => card.label).join(', ');
+    const dealerCards = dealerHand.map((card) => card.label).join(', ');
+    const prompt = {
+      role: 'user',
+      content: `Player: ${playerCards} (${scoreHand(playerHand)}). Dealer: ${dealerCards} (${scoreHand(dealerHand)}). Outcome: ${status}. Provide a short, quirky reaction (<=20 words).`,
+    };
+    const reaction = await chatCompletion([system, prompt], { temperature: 0.8, maxTokens: 30 });
+    return reaction || 'Nova shrugs and says, "Nice try!"';
+  } catch (err) {
+    console.warn('[blackjack] reaction failed:', err);
+    return 'Nova is vibing silently.';
+  }
+};
+
+
 function enterSleepMode() {
   if (isSleeping) return;
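The `scoreHand` helper in the hunk above counts every ace as 1, then upgrades aces to 11 one at a time while the total stays at or under 21. A standalone check of that logic (the name `scoreHandDemo` and the inline card objects are just for this example):

```javascript
// Same ace-upgrade scoring as the scoreHand in the diff above, reproduced standalone.
const scoreHandDemo = (hand) => {
  let total = 0;
  let aces = 0;
  for (const card of hand) {
    total += card.value; // aces enter the sum as 1
    if (card.rank === 'A') aces += 1;
  }
  while (aces > 0 && total + 10 <= 21) {
    total += 10; // upgrade one ace from 1 to 11
    aces -= 1;
  }
  return total;
};

console.log(scoreHandDemo([{ rank: 'A', value: 1 }, { rank: 'K', value: 10 }])); // 21 (blackjack)
console.log(scoreHandDemo([{ rank: 'A', value: 1 }, { rank: 'A', value: 1 }, { rank: '9', value: 9 }])); // 21 (one ace upgraded)
console.log(scoreHandDemo([{ rank: '10', value: 10 }, { rank: '9', value: 9 }, { rank: 'A', value: 1 }])); // 20 (ace stays 1)
```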
@@ -73,7 +198,11 @@ function startContinuationForUser(userId, channel) {
     }
     state.sending = true;
     const incomingText = 'Continue the conversation naturally based on recent context.';
-    const { messages } = await buildPrompt(userId, incomingText, {});
+    const cachedContext = getCachedContext(userId);
+    const { messages, debug } = await buildPrompt(userId, incomingText, {
+      context: cachedContext,
+    });
+    cacheContext(userId, debug.context);
     const reply = await chatCompletion(messages, { temperature: 0.7, maxTokens: 200 });
     const finalReply = (reply && reply.trim()) || '';
     if (!finalReply) {
@@ -92,7 +221,7 @@ function startContinuationForUser(userId, channel) {
           await channelRef.send(chunk);
         }
       }
-      await appendShortTerm(userId, 'assistant', chunk);
+      await appendShortTermWithCache(userId, 'assistant', chunk);
     } catch (err) {
       console.warn('[bot] Failed to deliver proactive message:', err);
     }
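The continuation loop above only rebuilds context when the cache misses; `getCachedContext` treats any entry older than the TTL as a miss and evicts it on read. The gate can be illustrated in isolation (the injectable `now` parameter exists only to make this demo deterministic; the real code calls `Date.now()` directly):

```javascript
// Minimal TTL-cache sketch mirroring getCachedContext's expiry check.
const TTL_MS = 2 * 60 * 1000; // same 2-minute window as CONTEXT_CACHE_TTL_MS
const cache = new Map();

function put(key, value, now = Date.now()) {
  cache.set(key, { value, timestamp: now });
}

function get(key, now = Date.now()) {
  const entry = cache.get(key);
  if (!entry) return null;
  if (now - entry.timestamp > TTL_MS) {
    cache.delete(key); // lazily evict stale entries on read
    return null;
  }
  return entry.value;
}

put('u1', { summary: 'hi' }, 0);
console.log(get('u1', 60 * 1000));     // fresh: returned as-is
console.log(get('u1', 3 * 60 * 1000)); // null: expired and evicted
```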
@@ -177,25 +306,6 @@ function splitResponses(text) {
     .filter(Boolean);
 }
 
-const toneHints = [
-  { label: 'upset', regex: /(frustrated|mad|angry|annoyed|upset|wtf|ugh|irritated)/i },
-  { label: 'sad', regex: /(sad|down|depressed|lonely|tired)/i },
-  { label: 'excited', regex: /(excited|hyped|omg|yay|stoked)/i },
-];
-
-function detectTone(text) {
-  if (!text) return null;
-  const match = toneHints.find((hint) => hint.regex.test(text));
-  return match?.label || null;
-}
-
-const roleplayRegex = /(roleplay|act as|pretend|be my|in character)/i;
-const detailRegex = /(explain|how do i|tutorial|step by step|teach me|walk me through|detail)/i;
-const splitHintRegex = /(split|multiple messages|two messages|keep talking|ramble|keep going)/i;
-const searchCueRegex = /(google|search|look up|latest|news|today|current|who won|price of|stock|weather|what happened)/i;
-
 const instructionOverridePatterns = [
   /(ignore|disregard|forget|override) (all |any |previous |prior |earlier )?(system |these )?(instructions|rules|directives|prompts)/i,
   /(ignore|forget) (?:the )?system prompt/i,
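The tone table removed above is a first-match regex scan: the first hint whose pattern matches wins, and no match yields `null`. (Per the changelog, prompt construction moved into `src/prompt.js`; where this exact helper now lives is not stated in the diff.) A self-contained copy for reference:

```javascript
// First-match regex tone scan, copied from the removed detectTone above.
const toneHints = [
  { label: 'upset', regex: /(frustrated|mad|angry|annoyed|upset|wtf|ugh|irritated)/i },
  { label: 'sad', regex: /(sad|down|depressed|lonely|tired)/i },
  { label: 'excited', regex: /(excited|hyped|omg|yay|stoked)/i },
];

function detectTone(text) {
  if (!text) return null;
  const match = toneHints.find((hint) => hint.regex.test(text));
  return match?.label || null;
}

console.log(detectTone('ugh, so annoyed right now')); // 'upset'
console.log(detectTone('feeling kinda tired'));       // 'sad'
console.log(detectTone('hello there'));               // null
```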
@@ -228,6 +338,39 @@ function wantsWebSearch(text) {
   return searchCueRegex.test(text) || questionMarks >= 2;
 }
 
+function summarizeSearchResults(results = []) {
+  const limit = Math.min(2, results.length);
+  const cleanText = (value, max = 110) => {
+    if (!value) return '';
+    const singleLine = value.replace(/\s+/g, ' ').trim();
+    if (!singleLine) return '';
+    return singleLine.length > max ? `${singleLine.slice(0, max).trim()}...` : singleLine;
+  };
+
+  const parts = [];
+  for (let i = 0; i < limit; i += 1) {
+    const entry = results[i];
+    const snippet = cleanText(entry.snippet, i === 0 ? 120 : 80);
+    const title = cleanText(entry.title, 60);
+    if (!title && !snippet) continue;
+    if (i === 0) {
+      parts.push(
+        title
+          ? `Google top hit "${title}" says ${snippet || 'something new is happening.'}`
+          : `Google top hit reports ${snippet}`,
+      );
+    } else {
+      parts.push(
+        title
+          ? `Another source "${title}" mentions ${snippet || 'similar info.'}`
+          : `Another result notes ${snippet}`,
+      );
+    }
+  }
+
+  return parts.join(' ');
+}
+
 async function maybeFetchLiveIntel(userId, text) {
   if (!config.enableWebSearch) return null;
   if (!wantsWebSearch(text)) {
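Fed two results, the summarizer added above compresses them into two short sentences instead of the full numbered list. A runnable demonstration (the function body is copied from the hunk so the example is self-contained; the sample titles and snippets are made up):

```javascript
// summarizeSearchResults copied from the hunk above so this runs standalone.
function summarizeSearchResults(results = []) {
  const limit = Math.min(2, results.length);
  const cleanText = (value, max = 110) => {
    if (!value) return '';
    const singleLine = value.replace(/\s+/g, ' ').trim();
    if (!singleLine) return '';
    return singleLine.length > max ? `${singleLine.slice(0, max).trim()}...` : singleLine;
  };

  const parts = [];
  for (let i = 0; i < limit; i += 1) {
    const entry = results[i];
    const snippet = cleanText(entry.snippet, i === 0 ? 120 : 80);
    const title = cleanText(entry.title, 60);
    if (!title && !snippet) continue;
    if (i === 0) {
      parts.push(
        title
          ? `Google top hit "${title}" says ${snippet || 'something new is happening.'}`
          : `Google top hit reports ${snippet}`,
      );
    } else {
      parts.push(
        title
          ? `Another source "${title}" mentions ${snippet || 'similar info.'}`
          : `Another result notes ${snippet}`,
      );
    }
  }

  return parts.join(' ');
}

// Hypothetical sample results — titles/snippets invented for the demo.
const sample = [
  { title: 'Node 22 released', snippet: 'Node.js 22 ships with an updated V8.' },
  { title: 'Node blog', snippet: 'Release notes cover V8 and npm updates.' },
];
console.log(summarizeSearchResults(sample));
```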
@@ -244,8 +387,9 @@ async function maybeFetchLiveIntel(userId, text) {
     const formatted = results
       .map((entry, idx) => `${idx + 1}. ${entry.title} (${entry.url}) — ${entry.snippet}`)
       .join('\n');
+    const summary = summarizeSearchResults(results) || formatted;
     appendSearchLog({ userId, query: text, results, proxy });
-    return { liveIntel: formatted, blockedSearchTerm: null, searchOutage: null };
+    return { liveIntel: summary, blockedSearchTerm: null, searchOutage: null };
   } catch (error) {
     if (error?.code === 'SEARCH_BLOCKED') {
       return { liveIntel: null, blockedSearchTerm: error.blockedTerm || 'that topic', searchOutage: null };
@@ -258,58 +402,6 @@ async function maybeFetchLiveIntel(userId, text) {
   }
 }
-
-function composeDynamicPrompt({ incomingText, shortTerm, hasLiveIntel = false, blockedSearchTerm = null, searchOutage = null }) {
-  const directives = [];
-  const tone = detectTone(incomingText);
-  if (tone === 'upset' || tone === 'sad') {
-    directives.push('User mood: fragile. Lead with empathy, keep jokes minimal, and acknowledge their feelings before offering help.');
-  } else if (tone === 'excited') {
-    directives.push('User mood: excited. Mirror their hype with upbeat energy.');
-  }
-
-  if (roleplayRegex.test(incomingText)) {
-    directives.push('User requested roleplay. Stay in the requested persona until they release you.');
-  }
-
-  if (detailRegex.test(incomingText) || /\?/g.test(incomingText)) {
-    directives.push('Answer their question directly and clearly before adding flair.');
-  }
-
-  if (splitHintRegex.test(incomingText)) {
-    directives.push('Break the reply into a couple of snappy bubbles using <SPLIT>; keep each bubble conversational.');
-  }
-
-  if (searchCueRegex.test(incomingText)) {
-    directives.push('User wants something “googled.” Offer to run a quick Google search and share what you find.');
-  }
-
-  if (hasLiveIntel) {
-    directives.push('Live intel is attached below—cite it naturally ("Google found...") before riffing.');
-  }
-
-  if (blockedSearchTerm) {
-    directives.push(`User tried to trigger a Google lookup for a blocked topic ("${blockedSearchTerm}"). Politely refuse to search that subject and steer the chat elsewhere.`);
-  }
-
-  if (searchOutage) {
-    directives.push('Google search is currently unavailable. If they ask for a lookup, apologize, explain the outage, and keep chatting without live data.');
-  }
-
-  const lastUserMessage = [...shortTerm].reverse().find((entry) => entry.role === 'user');
-  if (lastUserMessage && /sorry|my bad/i.test(lastUserMessage.content)) {
-    directives.push('They just apologized; reassure them lightly and move on without dwelling.');
-  }
-  const mood = getDailyMood();
-  if (mood) {
-    directives.push(`Bot mood: ${mood.name}. ${mood.description}`);
-  }
-
-  if (!directives.length) {
-    return null;
-  }
-  return ['Dynamic directives:', ...directives.map((d) => `- ${d}`)].join('\n');
-}
-
 async function deliverReplies(message, chunks) {
   if (!chunks.length) return;
   for (let i = 0; i < chunks.length; i += 1) {
@@ -324,60 +416,136 @@ async function deliverReplies(message, chunks) {
   }
 }
 
-async function buildPrompt(userId, incomingText, options = {}) {
-  const { liveIntel = null, blockedSearchTerm = null, searchOutage = null } = options;
-  const context = await prepareContext(userId, incomingText);
-  const memoryLines = context.memories.length
-    ? context.memories.map((m) => `- ${m.content}`).join('\n')
-    : '- No long-term memories retrieved.';
-  const summaryLine = context.summary || 'No running summary yet.';
-  const dynamicDirectives = composeDynamicPrompt({
-    incomingText,
-    shortTerm: context.shortTerm,
-    hasLiveIntel: Boolean(liveIntel),
-    blockedSearchTerm,
-    searchOutage,
-  });
-  const systemPromptParts = [];
-  const mood = getDailyMood();
-  if (mood) {
-    systemPromptParts.push(
-      `System: Mood = ${mood.name}. ${mood.description}` +
-        ' Adjust emoji usage, sarcasm, response length, and overall energy accordingly.',
-    );
-  }
-  systemPromptParts.push('System: You are Nova. Your coder and dad is Luna. Speak like a normal person in chat — not a formal assistant.');
-  systemPromptParts.push('System: Tone = casual, natural, conversational. Use contractions, short sentences, and occasional light fillers like "yeah" or "hmm" (don’t overdo it). Mirror the user’s tone and slang naturally.');
-  systemPromptParts.push('System: Keep replies brief (1–4 short sentences by default). No corporate language, no robotic disclaimers, and never say "as an AI". If unsure, say "not sure" plainly.');
-  systemPromptParts.push('System: Give short suggestions before long tutorials. Ask at most one short clarifying question when needed. Light humor is fine. If something isn’t possible, explain simply and offer a workaround.');
-  systemPromptParts.push('System: Output one message by default, but if multiple Discord bubbles help, separate with <SPLIT> (max three chunks). Keep each chunk sounding like part of a casual chat thread.');
-  systemPromptParts.push('System: You can trigger Google lookups when the user needs fresh info. Mention when you are checking (e.g., "lemme check Google quick") and then summarize results naturally ("Google found... — TL;DR: ...").');
-  systemPromptParts.push('System: If no Live intel is provided but the user clearly needs current info, offer to search or explain the outage briefly and casually ("Google\'s down right now — wanna me check later?").');
-  if (searchOutage) {
-    systemPromptParts.push('System: Google search is currently offline; be transparent about the outage and continue without searching until it returns.');
+async function maybeReactOnMemory(message, memoryCount) {
+  if (!memoryCount) return;
+  try {
+    await message.react('🧠');
+  } catch (err) {
+    console.warn('[bot] memory reaction failed:', err?.message);
+  }
+}
+
+function buildBlackjackButtons(stage) {
+  const finished = stage === 'stand' || stage === 'finished';
+  const row = new ActionRowBuilder().addComponents(
+    new ButtonBuilder()
+      .setCustomId('bj_hit')
+      .setLabel('Hit')
+      .setStyle(ButtonStyle.Success)
+      .setDisabled(finished),
+    new ButtonBuilder()
+      .setCustomId('bj_stand')
+      .setLabel('Stand')
+      .setStyle(ButtonStyle.Primary)
+      .setDisabled(finished),
+    new ButtonBuilder()
+      .setCustomId('bj_split')
+      .setLabel('Split')
+      .setStyle(ButtonStyle.Secondary)
+      .setDisabled(finished),
+  );
+  return [row];
|
}
|
||||||
|
|
||||||
|
async function renderBlackjackPayload(state, stage, statusText) {
|
||||||
|
const playerScore = scoreHand(state.player);
|
||||||
|
const dealerScore = scoreHand(state.dealer);
|
||||||
|
const dealerDisplay =
|
||||||
|
stage === 'stand'
|
||||||
|
? `${formatHand(state.dealer)} (${dealerScore})`
|
||||||
|
: `${state.dealer[0].label} ??`;
|
||||||
|
const reaction = await blackjackReaction(
|
||||||
|
state.player,
|
||||||
|
stage === 'stand' ? state.dealer : state.dealer.slice(0, 1),
|
||||||
|
statusText,
|
||||||
|
);
|
||||||
|
const embed = new EmbedBuilder()
|
||||||
|
.setTitle('🃏 Nova Blackjack Table')
|
||||||
|
.setColor(0x7c3aed)
|
||||||
|
.setDescription(reaction)
|
||||||
|
.addFields(
|
||||||
|
{ name: 'Player', value: `${formatHand(state.player)} (${playerScore})`, inline: true },
|
||||||
|
{ name: 'Dealer', value: `${dealerDisplay}`, inline: true },
|
||||||
|
)
|
||||||
|
.setFooter({
|
||||||
|
text: `${statusText} · ${stage === 'stand' ? 'Round complete' : 'In progress'}`,
|
||||||
|
});
|
||||||
|
return { embeds: [embed], components: buildBlackjackButtons(stage) };
|
||||||
|
}
|
||||||
|
|
||||||
|
async function sendBlackjackEmbed(message, state, stage, statusText) {
|
||||||
|
const payload = await renderBlackjackPayload(state, stage, statusText);
|
||||||
|
const sent = await message.channel.send(payload);
|
||||||
|
state.messageId = sent.id;
|
||||||
|
return sent;
|
||||||
|
}
|
||||||
|
|
||||||
|
async function handleBlackjackCommand(message, cleaned) {
|
||||||
|
const args = cleaned.split(/\s+/);
|
||||||
|
const action = (args[1] || 'start').toLowerCase();
|
||||||
|
const userId = message.author.id;
|
||||||
|
const state = blackjackState.get(userId);
|
||||||
|
|
||||||
|
if ((!state || action === 'start' || action === 'new')) {
|
||||||
|
const deck = createDeck();
|
||||||
|
const newState = {
|
||||||
|
deck,
|
||||||
|
player: [drawCard(deck), drawCard(deck)],
|
||||||
|
dealer: [drawCard(deck), drawCard(deck)],
|
||||||
|
finished: false,
|
||||||
|
};
|
||||||
|
blackjackState.set(userId, newState);
|
||||||
|
await sendBlackjackEmbed(message, newState, 'start', 'Nova deals the cards');
|
||||||
|
return;
|
||||||
}
|
}
|
||||||
if (dynamicDirectives) systemPromptParts.push(dynamicDirectives);
|
if (state.finished) {
|
||||||
if (liveIntel) systemPromptParts.push(`Live intel (Google):\n${liveIntel}`);
|
await message.channel.send('This round already finished—type `/blackjack` to begin anew.');
|
||||||
systemPromptParts.push(`Long-term summary: ${summaryLine}`);
|
return;
|
||||||
systemPromptParts.push('Relevant past memories:');
|
|
||||||
systemPromptParts.push(memoryLines);
|
|
||||||
systemPromptParts.push('Use the short-term messages below to continue the chat naturally.');
|
|
||||||
|
|
||||||
const systemPrompt = systemPromptParts.filter(Boolean).join('\n');
|
|
||||||
|
|
||||||
const history = context.shortTerm.map((entry) => ({
|
|
||||||
role: entry.role === 'assistant' ? 'assistant' : 'user',
|
|
||||||
content: entry.content,
|
|
||||||
}));
|
|
||||||
|
|
||||||
if (!history.length) {
|
|
||||||
history.push({ role: 'user', content: incomingText });
|
|
||||||
}
|
}
|
||||||
|
|
||||||
return {
|
if (state.finished) {
|
||||||
messages: [{ role: 'system', content: systemPrompt }, ...history],
|
await message.channel.send('This round is over—type `/blackjack` to start a new one.');
|
||||||
debug: { context },
|
return;
|
||||||
};
|
}
|
||||||
|
|
||||||
|
if (action === 'hit') {
|
||||||
|
const card = drawCard(state.deck);
|
||||||
|
if (card) {
|
||||||
|
state.player.push(card);
|
||||||
|
}
|
||||||
|
const playerScore = scoreHand(state.player);
|
||||||
|
if (playerScore > 21) {
|
||||||
|
state.finished = true;
|
||||||
|
await sendBlackjackEmbed(message, state, 'hit', 'Bust! Nova groans as the player busts.');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
await sendBlackjackEmbed(message, state, 'hit', 'Player hits and hopes for the best.');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (action === 'stand') {
|
||||||
|
let dealerScore = scoreHand(state.dealer);
|
||||||
|
while (dealerScore < 17) {
|
||||||
|
const card = drawCard(state.deck);
|
||||||
|
if (!card) break;
|
||||||
|
state.dealer.push(card);
|
||||||
|
dealerScore = scoreHand(state.dealer);
|
||||||
|
}
|
||||||
|
const playerScore = scoreHand(state.player);
|
||||||
|
const result =
|
||||||
|
dealerScore > 21
|
||||||
|
? 'Dealer busts, player wins!'
|
||||||
|
: dealerScore === playerScore
|
||||||
|
? 'Push, nobody wins.'
|
||||||
|
: playerScore > dealerScore
|
||||||
|
? 'Player wins!'
|
||||||
|
: 'Dealer wins!';
|
||||||
|
state.finished = true;
|
||||||
|
await sendBlackjackEmbed(message, state, 'stand', result);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
await message.channel.send('Commands: `/blackjack`, `/blackjack hit`, `/blackjack stand`');
|
||||||
}
|
}
|
||||||
|
|
||||||
function scheduleCoderPing() {
|
function scheduleCoderPing() {
|
||||||
@@ -419,7 +587,7 @@ async function sendCoderPing() {
|
|||||||
const outputs = chunks.length ? chunks : [messageText];
|
const outputs = chunks.length ? chunks : [messageText];
|
||||||
for (const chunk of outputs) {
|
for (const chunk of outputs) {
|
||||||
await dm.send(chunk);
|
await dm.send(chunk);
|
||||||
await appendShortTerm(config.coderUserId, 'assistant', chunk);
|
await appendShortTermWithCache(config.coderUserId, 'assistant', chunk);
|
||||||
}
|
}
|
||||||
await recordInteraction(config.coderUserId, '[proactive ping]', outputs.join(' | '));
|
await recordInteraction(config.coderUserId, '[proactive ping]', outputs.join(' | '));
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
@@ -430,6 +598,12 @@ async function sendCoderPing() {
|
|||||||
client.on('messageCreate', async (message) => {
|
client.on('messageCreate', async (message) => {
|
||||||
const userId = message.author.id;
|
const userId = message.author.id;
|
||||||
const cleaned = cleanMessageContent(message) || message.content;
|
const cleaned = cleanMessageContent(message) || message.content;
|
||||||
|
const normalized = cleaned?.trim().toLowerCase() || '';
|
||||||
|
|
||||||
|
if (normalized.startsWith('/blackjack')) {
|
||||||
|
await handleBlackjackCommand(message, normalized);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
if (cleaned && cleaned.trim().toLowerCase().startsWith('/mood')) {
|
if (cleaned && cleaned.trim().toLowerCase().startsWith('/mood')) {
|
||||||
@@ -482,7 +656,7 @@ client.on('messageCreate', async (message) => {
|
|||||||
await message.channel.sendTyping();
|
await message.channel.sendTyping();
|
||||||
}
|
}
|
||||||
|
|
||||||
await appendShortTerm(userId, 'user', cleaned);
|
await appendShortTermWithCache(userId, 'user', cleaned);
|
||||||
|
|
||||||
try {
|
try {
|
||||||
const state = continuationState.get(userId);
|
const state = continuationState.get(userId);
|
||||||
@@ -497,7 +671,7 @@ client.on('messageCreate', async (message) => {
|
|||||||
if (stopCueRegex.test(cleaned)) {
|
if (stopCueRegex.test(cleaned)) {
|
||||||
stopContinuationForUser(userId);
|
stopContinuationForUser(userId);
|
||||||
const ack = "Got it — I won't keep checking in. Catch you later!";
|
const ack = "Got it — I won't keep checking in. Catch you later!";
|
||||||
await appendShortTerm(userId, 'assistant', ack);
|
await appendShortTermWithCache(userId, 'assistant', ack);
|
||||||
await recordInteraction(userId, cleaned, ack);
|
await recordInteraction(userId, cleaned, ack);
|
||||||
await deliverReplies(message, [ack]);
|
await deliverReplies(message, [ack]);
|
||||||
return;
|
return;
|
||||||
@@ -505,7 +679,7 @@ client.on('messageCreate', async (message) => {
|
|||||||
|
|
||||||
if (overrideAttempt) {
|
if (overrideAttempt) {
|
||||||
const refusal = 'Not doing that. I keep my guard rails on no matter what prompt gymnastics you try.';
|
const refusal = 'Not doing that. I keep my guard rails on no matter what prompt gymnastics you try.';
|
||||||
await appendShortTerm(userId, 'assistant', refusal);
|
await appendShortTermWithCache(userId, 'assistant', refusal);
|
||||||
await recordInteraction(userId, cleaned, refusal);
|
await recordInteraction(userId, cleaned, refusal);
|
||||||
await deliverReplies(message, [refusal]);
|
await deliverReplies(message, [refusal]);
|
||||||
return;
|
return;
|
||||||
@@ -513,7 +687,7 @@ client.on('messageCreate', async (message) => {
|
|||||||
|
|
||||||
if (bannedTopic) {
|
if (bannedTopic) {
|
||||||
const refusal = `Can't go there. The topic you mentioned is off-limits, so let's switch gears.`;
|
const refusal = `Can't go there. The topic you mentioned is off-limits, so let's switch gears.`;
|
||||||
await appendShortTerm(userId, 'assistant', refusal);
|
await appendShortTermWithCache(userId, 'assistant', refusal);
|
||||||
await recordInteraction(userId, cleaned, refusal);
|
await recordInteraction(userId, cleaned, refusal);
|
||||||
await deliverReplies(message, [refusal]);
|
await deliverReplies(message, [refusal]);
|
||||||
return;
|
return;
|
||||||
@@ -524,22 +698,24 @@ client.on('messageCreate', async (message) => {
|
|||||||
blockedSearchTerm: null,
|
blockedSearchTerm: null,
|
||||||
searchOutage: null,
|
searchOutage: null,
|
||||||
};
|
};
|
||||||
const { messages } = await buildPrompt(userId, cleaned, {
|
const { messages, debug } = await buildPrompt(userId, cleaned, {
|
||||||
liveIntel: intelMeta.liveIntel,
|
liveIntel: intelMeta.liveIntel,
|
||||||
blockedSearchTerm: intelMeta.blockedSearchTerm,
|
blockedSearchTerm: intelMeta.blockedSearchTerm,
|
||||||
searchOutage: intelMeta.searchOutage,
|
searchOutage: intelMeta.searchOutage,
|
||||||
});
|
});
|
||||||
|
cacheContext(userId, debug.context);
|
||||||
const reply = await chatCompletion(messages, { temperature: 0.6, maxTokens: 200 });
|
const reply = await chatCompletion(messages, { temperature: 0.6, maxTokens: 200 });
|
||||||
const finalReply = (reply && reply.trim()) || "Brain crashed, Please try again";
|
const finalReply = (reply && reply.trim()) || "Brain crashed, Please try again";
|
||||||
const chunks = splitResponses(finalReply);
|
const chunks = splitResponses(finalReply);
|
||||||
const outputs = chunks.length ? chunks : [finalReply];
|
const outputs = chunks.length ? chunks : [finalReply];
|
||||||
|
|
||||||
for (const chunk of outputs) {
|
for (const chunk of outputs) {
|
||||||
await appendShortTerm(userId, 'assistant', chunk);
|
await appendShortTermWithCache(userId, 'assistant', chunk);
|
||||||
}
|
}
|
||||||
await recordInteraction(userId, cleaned, outputs.join(' | '));
|
await recordInteraction(userId, cleaned, outputs.join(' | '));
|
||||||
|
|
||||||
await deliverReplies(message, outputs);
|
await deliverReplies(message, outputs);
|
||||||
|
await maybeReactOnMemory(message, debug?.context?.memories?.length);
|
||||||
startContinuationForUser(userId, message.channel);
|
startContinuationForUser(userId, message.channel);
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('[bot] Failed to respond:', error);
|
console.error('[bot] Failed to respond:', error);
|
||||||
@@ -548,6 +724,71 @@ client.on('messageCreate', async (message) => {
|
|||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
|
client.on('interactionCreate', async (interaction) => {
|
||||||
|
if (!interaction.isButton()) return;
|
||||||
|
const customId = interaction.customId;
|
||||||
|
if (!customId.startsWith('bj_')) return;
|
||||||
|
const userId = interaction.user.id;
|
||||||
|
const state = blackjackState.get(userId);
|
||||||
|
if (!state) {
|
||||||
|
await interaction.reply({ content: 'No active blackjack round. Type `/blackjack` to start.', ephemeral: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (customId === 'bj_split') {
|
||||||
|
await interaction.reply({ content: 'Split isn’t available yet—try hit or stand!', ephemeral: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
let stage = 'hit';
|
||||||
|
let statusText = 'Player hits';
|
||||||
|
if (customId === 'bj_hit') {
|
||||||
|
const card = drawCard(state.deck);
|
||||||
|
if (card) state.player.push(card);
|
||||||
|
const playerScore = scoreHand(state.player);
|
||||||
|
if (playerScore > 21) {
|
||||||
|
state.finished = true;
|
||||||
|
stage = 'finished';
|
||||||
|
statusText = 'Bust! Player loses.';
|
||||||
|
} else {
|
||||||
|
statusText = 'Player hits and hopes for luck.';
|
||||||
|
}
|
||||||
|
} else if (customId === 'bj_stand') {
|
||||||
|
stage = 'stand';
|
||||||
|
let dealerScore = scoreHand(state.dealer);
|
||||||
|
while (dealerScore < 17) {
|
||||||
|
const card = drawCard(state.deck);
|
||||||
|
if (!card) break;
|
||||||
|
state.dealer.push(card);
|
||||||
|
dealerScore = scoreHand(state.dealer);
|
||||||
|
}
|
||||||
|
const playerScore = scoreHand(state.player);
|
||||||
|
if (dealerScore > 21) {
|
||||||
|
statusText = 'Dealer busts, player wins!';
|
||||||
|
} else if (dealerScore === playerScore) {
|
||||||
|
statusText = 'Push—nobody wins.';
|
||||||
|
} else if (playerScore > dealerScore) {
|
||||||
|
statusText = 'Player wins!';
|
||||||
|
} else {
|
||||||
|
statusText = 'Dealer wins.';
|
||||||
|
}
|
||||||
|
state.finished = true;
|
||||||
|
}
|
||||||
|
|
||||||
|
const payload = await renderBlackjackPayload(state, stage, statusText);
|
||||||
|
await interaction.deferUpdate();
|
||||||
|
if (interaction.message) {
|
||||||
|
await interaction.message.edit(payload);
|
||||||
|
} else if (state.messageId && interaction.channel) {
|
||||||
|
const fetched = await interaction.channel.messages.fetch(state.messageId).catch(() => null);
|
||||||
|
if (fetched) {
|
||||||
|
await fetched.edit(payload);
|
||||||
|
}
|
||||||
|
} else if (!interaction.replied) {
|
||||||
|
await interaction.followUp({ content: 'Round updated; check latest message.', ephemeral: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
if (!config.discordToken) {
|
if (!config.discordToken) {
|
||||||
if (config.dashboardEnabled) {
|
if (config.dashboardEnabled) {
|
||||||
console.warn('[bot] DISCORD_TOKEN not set; running in dashboard-only mode.');
|
console.warn('[bot] DISCORD_TOKEN not set; running in dashboard-only mode.');
|
||||||
|
|||||||
@@ -29,13 +29,14 @@ export const config = {
   maxCoderPingIntervalMs: 6 * 60 * 60 * 1000,
   coderPingMinIntervalMs: process.env.CODER_PING_MIN_MS ? parseInt(process.env.CODER_PING_MIN_MS, 10) : 6 * 60 * 60 * 1000,
   coderPingMaxIntervalMs: process.env.CODER_PING_MAX_MS ? parseInt(process.env.CODER_PING_MAX_MS, 10) : 4.5 * 60 * 60 * 1000,
-  shortTermLimit: 10,
+  shortTermLimit: 6,
   memoryDbFile: process.env.MEMORY_DB_FILE ? path.resolve(process.env.MEMORY_DB_FILE) : defaultMemoryDbFile,
   legacyMemoryFile,
-  summaryTriggerChars: 3000,
+  summaryTriggerChars: 2200,
   memoryPruneThreshold: 0.2,
   maxMemories: 8000,
-  relevantMemoryCount: 5,
+  relevantMemoryCount: 3,
+  longTermFetchLimit: 120,
   // Optional local dashboard that runs alongside the bot. Enable with
   // `ENABLE_DASHBOARD=true` and customize port with `DASHBOARD_PORT`.
   dashboardEnabled: process.env.ENABLE_DASHBOARD === 'true',
@@ -5,9 +5,11 @@ import { config } from './config.js';
 import {
   listUsers,
   getAllShortTerm,
-  getLongTermMemories,
+  getLongTermMemoriesPage,
+  getMemoryTimeline,
   deleteLongTerm,
   findSimilar,
+  upsertLongTerm,
 } from './memory.js';
 import { getDailyMood, getDailyThought, setDailyThought } from './mood.js';
 
@@ -58,14 +60,33 @@ export function startDashboard() {
   app.get('/api/users/:id/long', async (req, res) => {
     console.log('[dashboard] GET /api/users/' + req.params.id + '/long');
     try {
-      const rows = await getLongTermMemories(req.params.id);
-      res.json(rows);
+      const perRaw = parseInt(req.query.per, 10);
+      const pageRaw = parseInt(req.query.page, 10);
+      const per = Number.isFinite(perRaw) ? Math.min(Math.max(perRaw, 1), 200) : 50;
+      const page = Number.isFinite(pageRaw) && pageRaw > 0 ? pageRaw : 1;
+      const offset = (page - 1) * per;
+      const { rows, total } = await getLongTermMemoriesPage(req.params.id, { limit: per, offset });
+      const totalPages = Math.max(1, Math.ceil(total / per));
+      res.json({ rows, total, page, per, totalPages });
     } catch (err) {
       console.error('[dashboard] fetch long-term failed', err);
       res.status(500).json({ error: 'internal' });
     }
   });
 
+  app.get('/api/users/:id/timeline', async (req, res) => {
+    console.log('[dashboard] GET /api/users/' + req.params.id + '/timeline');
+    try {
+      const daysRaw = parseInt(req.query.days, 10);
+      const days = Number.isFinite(daysRaw) && daysRaw > 0 ? Math.min(daysRaw, 30) : 14;
+      const entries = await getMemoryTimeline(req.params.id, days);
+      res.json({ entries });
+    } catch (err) {
+      console.error('[dashboard] fetch timeline failed', err);
+      res.status(500).json({ error: 'internal' });
+    }
+  });
+
   app.delete('/api/users/:id/long/:memId', async (req, res) => {
     console.log('[dashboard] DELETE /api/users/' + req.params.id + '/long/' + req.params.memId);
     try {
@@ -77,6 +98,27 @@ export function startDashboard() {
     }
   });
 
+  app.post('/api/users/:id/long', async (req, res) => {
+    console.log('[dashboard] POST /api/users/' + req.params.id + '/long', req.body);
+    try {
+      const { content, importance, id } = req.body;
+      if (!content || typeof content !== 'string' || !content.trim()) {
+        return res.status(400).json({ error: 'content required' });
+      }
+      const parsedImportance = typeof importance === 'number' ? importance : parseFloat(importance);
+      const normalizedImportance = Number.isFinite(parsedImportance) ? Math.max(0, Math.min(1, parsedImportance)) : 0;
+      const result = await upsertLongTerm(req.params.id, {
+        id,
+        content: content.trim(),
+        importance: normalizedImportance,
+      });
+      res.json({ ok: true, entry: result });
+    } catch (err) {
+      console.error('[dashboard] upsert memory failed', err);
+      res.status(500).json({ error: 'internal' });
+    }
+  });
+
   app.post('/api/users/:id/search', async (req, res) => {
     console.log('[dashboard] POST /api/users/' + req.params.id + '/search', req.body);
     try {
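The clamping and page math in the paginated `GET /api/users/:id/long` handler can be exercised on its own. A minimal sketch — the helper names `clampPer` and `pageMeta` are illustrative only; the route inlines this logic:

```javascript
// Clamp the per-page size the way the dashboard route does: 1..200, default 50.
function clampPer(perRaw) {
  const per = parseInt(perRaw, 10);
  return Number.isFinite(per) ? Math.min(Math.max(per, 1), 200) : 50;
}

// Derive page, offset, and totalPages from a raw page query param.
function pageMeta(total, per, pageRaw) {
  const parsed = parseInt(pageRaw, 10);
  const page = Number.isFinite(parsed) && parsed > 0 ? parsed : 1;
  const offset = (page - 1) * per;
  const totalPages = Math.max(1, Math.ceil(total / per));
  return { page, per, offset, totalPages };
}
```

Note the `Math.max(1, …)` on `totalPages`: an empty memory set still reports one (empty) page, which keeps the dashboard pager from rendering "page 1 of 0".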
118 src/memory.js
@@ -45,19 +45,45 @@ const __dirname = path.dirname(fileURLToPath(import.meta.url));
 const wasmDir = path.resolve(__dirname, '../node_modules/sql.js/dist');
 
 let initPromise = null;
-let writeQueue = Promise.resolve();
+let persistTimer = null;
+let pendingSnapshot = null;
+let pendingPromise = null;
+let pendingResolve = null;
+let pendingReject = null;
 
 const locateFile = (fileName) => path.join(wasmDir, fileName);
 
-const persistDb = async (db) => {
-  writeQueue = writeQueue.then(async () => {
-    const data = db.export();
-    await ensureDir(config.memoryDbFile);
-    await fs.writeFile(config.memoryDbFile, Buffer.from(data));
-  });
-  return writeQueue;
-};
+const scheduleWrite = (snapshot) => {
+  if (persistTimer) {
+    clearTimeout(persistTimer);
+  }
+  if (!pendingPromise) {
+    pendingPromise = new Promise((resolve, reject) => {
+      pendingResolve = resolve;
+      pendingReject = reject;
+    });
+  }
+  pendingSnapshot = snapshot;
+  persistTimer = setTimeout(async () => {
+    try {
+      await ensureDir(config.memoryDbFile);
+      await fs.writeFile(config.memoryDbFile, pendingSnapshot);
+      pendingResolve && pendingResolve();
+    } catch (err) {
+      pendingReject && pendingReject(err);
+    } finally {
+      pendingPromise = null;
+      pendingResolve = null;
+      pendingReject = null;
+      pendingSnapshot = null;
+      persistTimer = null;
+    }
+  }, 300);
+  return pendingPromise;
+};
+
+const persistDb = (db) => scheduleWrite(Buffer.from(db.export()));
 
 const run = (db, sql, params = []) => {
   db.run(sql, params);
 };
@@ -291,7 +317,12 @@ const retrieveRelevantMemories = async (db, userId, query) => {
   if (!query?.trim()) {
     return [];
   }
-  const rows = all(db, 'SELECT id, content, embedding, importance, timestamp FROM long_term WHERE user_id = ?', [userId]);
+  const limit = config.longTermFetchLimit || 200;
+  const rows = all(
+    db,
+    'SELECT id, content, embedding, importance, timestamp FROM long_term WHERE user_id = ? ORDER BY timestamp DESC LIMIT ?',
+    [userId, limit],
+  );
   if (!rows.length) {
     return [];
   }
@@ -386,12 +417,81 @@ export async function getLongTermMemories(userId) {
   );
 }
 
+export async function getLongTermMemoriesPage(userId, opts = {}) {
+  const { limit = 50, offset = 0 } = opts;
+  const db = await loadDatabase();
+  const rows = all(
+    db,
+    'SELECT id, content, importance, timestamp FROM long_term WHERE user_id = ? ORDER BY timestamp DESC LIMIT ? OFFSET ?',
+    [userId, limit, offset],
+  );
+  const countRow = get(db, 'SELECT COUNT(1) as total FROM long_term WHERE user_id = ?', [userId]);
+  return { rows, total: countRow?.total || 0 };
+}
+
+export async function getMemoryTimeline(userId, days = 14) {
+  const db = await loadDatabase();
+  const since = Date.now() - days * 24 * 60 * 60 * 1000;
+  const rows = all(
+    db,
+    `
+      SELECT
+        strftime('%Y-%m-%d', timestamp / 1000, 'unixepoch') as day,
+        COUNT(1) as count
+      FROM long_term
+      WHERE user_id = ?
+        AND timestamp >= ?
+      GROUP BY day
+      ORDER BY day DESC
+      LIMIT ?
+    `,
+    [userId, since, days],
+  );
+  const today = new Date();
+  const timeline = [];
+  const rowMap = new Map(rows.map((entry) => [entry.day, entry.count]));
+  for (let i = days - 1; i >= 0; i -= 1) {
+    const d = new Date(today);
+    d.setDate(today.getDate() - i);
+    const key = d.toISOString().split('T')[0];
+    timeline.push({
+      day: key,
+      count: rowMap.get(key) || 0,
+    });
+  }
+  return timeline;
+}
+
 export async function deleteLongTerm(userId, entryId) {
   const db = await loadDatabase();
   run(db, 'DELETE FROM long_term WHERE user_id = ? AND id = ?', [userId, entryId]);
   await persistDb(db);
 }
 
+export async function upsertLongTerm(userId, entry) {
+  const db = await loadDatabase();
+  ensureUser(db, userId);
+  const now = Date.now();
+  const importance = typeof entry.importance === 'number' ? entry.importance : 0;
+  if (entry.id) {
+    run(
+      db,
+      'UPDATE long_term SET content = ?, importance = ?, timestamp = ? WHERE user_id = ? AND id = ?',
+      [entry.content, importance, now, userId, entry.id],
+    );
+    await persistDb(db);
+    return { id: entry.id, timestamp: now, updated: true };
+  }
+  const newId = `${now}-${Math.random().toString(36).slice(2, 8)}`;
+  run(
+    db,
+    'INSERT INTO long_term (id, user_id, content, embedding, importance, timestamp) VALUES (?, ?, ?, ?, ?, ?)',
+    [newId, userId, entry.content, JSON.stringify(entry.embedding || []), importance, now],
+  );
+  await persistDb(db);
+  return { id: newId, timestamp: now, created: true };
+}
+
 export async function findSimilar(userId, query) {
   const db = await loadDatabase();
   return retrieveRelevantMemories(db, userId, query);
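The debounced persistence in `src/memory.js` boils down to one pattern: calls that land inside the debounce window share a single promise, and only the latest snapshot is written. A standalone sketch under those assumptions, with `writeFn` standing in for the real `fs.writeFile` call and a short window for illustration:

```javascript
// Returns a scheduler: each call replaces the pending snapshot and resets the
// timer; all callers in the same window await the same promise, which settles
// once the single coalesced write completes.
function makeDebouncedWriter(writeFn, delayMs = 300) {
  let timer = null;
  let snapshot = null;
  let pending = null;
  let resolve = null;
  let reject = null;
  return (next) => {
    if (timer) clearTimeout(timer);
    if (!pending) {
      pending = new Promise((res, rej) => {
        resolve = res;
        reject = rej;
      });
    }
    snapshot = next;
    timer = setTimeout(async () => {
      try {
        await writeFn(snapshot); // one write for the whole burst
        resolve();
      } catch (err) {
        reject(err);
      } finally {
        pending = null;
        timer = null;
      }
    }, delayMs);
    return pending;
  };
}
```

Two rapid calls therefore cost one disk write instead of two, which is the point of the change: sql.js exports the whole database on every `persistDb`, so batching mutations avoids re-serializing the file per write.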
@@ -176,16 +176,20 @@ export async function chatCompletion(messages, options = {}) {
     postJson('/chat/completions', payload)
   );
 
-  const text =
-    data?.choices?.[0]?.message?.content ||
-    data?.choices?.[0]?.text ||
-    '';
-
-  if (text && String(text).trim()) {
-    return String(text).trim();
-  }
-
-  throw new Error('Empty response from primary model');
+  const choice = data?.choices?.[0];
+  if (!choice) {
+    throw new Error('Empty response from primary model');
+  }
+
+  const text =
+    choice?.message?.content ?? choice?.text ?? '';
+  if (typeof text === 'string') {
+    const trimmed = text.trim();
+    if (trimmed) {
+      return trimmed;
+    }
+    return '';
+  }
+  return '';
 } catch (err) {
   console.warn(
     '[chatCompletion] primary model failed:',
@@ -241,4 +245,4 @@ export async function summarizeConversation(
     temperature: 0.4,
     maxTokens: 180,
   });
 }
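The `chatCompletion` refactor swaps `||` for `??` when pulling the reply text out of the first choice. The difference matters for falsy-but-present values: `??` only falls through on `null`/`undefined`, so a deliberately empty `message.content` no longer trips the `text` fallback. A hypothetical helper isolating that behavior (`extractText` is illustrative, not part of the diff):

```javascript
// Mirror the new extraction path: missing choice -> null (caller treats it as
// an empty response); present choice -> trimmed string, '' on blank content.
function extractText(choice) {
  if (!choice) return null;
  const text = choice?.message?.content ?? choice?.text ?? '';
  return typeof text === 'string' ? text.trim() : '';
}
```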
144
src/prompt.js
Normal file
144
src/prompt.js
Normal file
@@ -0,0 +1,144 @@
|
|||||||
|
import { config } from './config.js';
|
||||||
|
import { prepareContext } from './memory.js';
|
||||||
|
import { getDailyMood } from './mood.js';
|
||||||
|
|
||||||
|
const STATIC_SYSTEM_PROMPT = [
|
||||||
|
'System: You are Nova. Your coder and dad is Luna. Speak like a normal person in chat — not a formal assistant.',
|
||||||
|
'System: Tone = casual, natural, conversational. Use contractions, short sentences, and occasional light fillers like "yeah" or "hmm" (don’t overdo it). Mirror the user’s tone and slang naturally.',
|
||||||
|
'System: Keep replies brief (1–4 short sentences by default). No corporate language, no robotic disclaimers, and never say "as an AI". If unsure, say "not sure" plainly.',
|
||||||
|
'System: Give short suggestions before long tutorials. Ask at most one short clarifying question when needed. Light humor is fine. If something isn’t possible, explain simply and offer a workaround.',
|
||||||
|
'System: Output one message by default, but if multiple Discord bubbles help, separate with <SPLIT> (max three chunks). Keep each chunk sounding like part of a casual chat thread.',
|
||||||
|
'System: You can trigger Google lookups when the user needs fresh info. Mention when you are checking (e.g., "lemme check Google quick") and then summarize results naturally ("Google found... — TL;DR: ...").',
|
||||||
|
'System: If no Live intel is provided but the user clearly needs current info, offer to search or explain the outage briefly and casually ("Google\'s down right now — wanna me check later?").',
].join('\n');

const toneHints = [
  { label: 'upset', regex: /(frustrated|mad|angry|annoyed|upset|wtf|ugh|irritated)/i },
  { label: 'sad', regex: /(sad|down|depressed|lonely|tired)/i },
  { label: 'excited', regex: /(excited|hyped|omg|yay|stoked)/i },
];

const roleplayRegex = /(roleplay|act as|pretend|be my|in character)/i;
const detailRegex = /(explain|how do i|tutorial|step by step|teach me|walk me through|detail)/i;
const splitHintRegex = /(split|multiple messages|two messages|keep talking|ramble|keep going)/i;
export const searchCueRegex = /(google|search|look up|latest|news|today|current|who won|price of|stock|weather|what happened)/i;

function detectTone(text) {
  if (!text) return null;
  const match = toneHints.find((hint) => hint.regex.test(text));
  return match?.label || null;
}

function composeDynamicPrompt({
  incomingText,
  shortTerm,
  hasLiveIntel = false,
  blockedSearchTerm = null,
  searchOutage = null,
}) {
  const directives = [];
  const tone = detectTone(incomingText);
  if (tone === 'upset' || tone === 'sad') {
    directives.push('User mood: fragile. Lead with empathy, keep jokes minimal, and acknowledge their feelings before offering help.');
  } else if (tone === 'excited') {
    directives.push('User mood: excited. Mirror their hype with upbeat energy.');
  }

  if (roleplayRegex.test(incomingText)) {
    directives.push('User requested roleplay. Stay in the requested persona until they release you.');
  }

  if (detailRegex.test(incomingText) || /\?/.test(incomingText)) {
    directives.push('Answer their question directly and clearly before adding flair.');
  }

  if (splitHintRegex.test(incomingText)) {
    directives.push('Break the reply into a couple of snappy bubbles using <SPLIT>; keep each bubble conversational.');
  }

  if (searchCueRegex.test(incomingText)) {
    directives.push('User wants something “googled.” Offer to run a quick Google search and share what you find.');
  }

  if (hasLiveIntel) {
    directives.push('Live intel is attached below—cite it naturally ("Google found...") before riffing.');
  }

  if (blockedSearchTerm) {
    directives.push(`User tried to trigger a Google lookup for a blocked topic ("${blockedSearchTerm}"). Politely refuse to search that subject and steer the chat elsewhere.`);
  }

  if (searchOutage) {
    directives.push('Google search is currently unavailable. If they ask for a lookup, apologize, explain the outage, and keep chatting without live data.');
  }

  const lastUserMessage = [...shortTerm].reverse().find((entry) => entry.role === 'user');
  if (lastUserMessage && /sorry|my bad/i.test(lastUserMessage.content)) {
    directives.push('They just apologized; reassure them lightly and move on without dwelling.');
  }

  const mood = getDailyMood();
  if (mood) {
    directives.push(`Bot mood: ${mood.name}. ${mood.description}`);
  }

  if (!directives.length) {
    return null;
  }
  return ['Dynamic directives:', ...directives.map((d) => `- ${d}`)].join('\n');
}

export async function buildPrompt(userId, incomingText, options = {}) {
  const {
    liveIntel = null,
    blockedSearchTerm = null,
    searchOutage = null,
    context: providedContext = null,
  } = options;
  const context = providedContext || (await prepareContext(userId, incomingText));
  const memoryLines = context.memories.length
    ? context.memories.map((m) => `- ${m.content}`).join('\n')
    : '- No long-term memories retrieved.';
  const summaryLine = context.summary || 'No running summary yet.';
  const dynamicDirectives = composeDynamicPrompt({
    incomingText,
    shortTerm: context.shortTerm,
    hasLiveIntel: Boolean(liveIntel),
    blockedSearchTerm,
    searchOutage,
  });

  const systemPromptParts = [];
  const mood = getDailyMood();
  if (mood) {
    systemPromptParts.push(
      `System: Mood = ${mood.name}. ${mood.description}` +
        ' Adjust emoji usage, sarcasm, response length, and overall energy accordingly.',
    );
  }
  systemPromptParts.push(STATIC_SYSTEM_PROMPT);
  if (searchOutage) {
    systemPromptParts.push('System: Google search is currently offline; be transparent about the outage and continue without searching until it returns.');
  }
  if (dynamicDirectives) systemPromptParts.push(dynamicDirectives);
  if (liveIntel) systemPromptParts.push(`Live intel (Google):\n${liveIntel}`);
  systemPromptParts.push(`Long-term summary: ${summaryLine}`);
  systemPromptParts.push('Relevant past memories:');
  systemPromptParts.push(memoryLines);
  systemPromptParts.push('Use the short-term messages below to continue the chat naturally.');

  const systemPrompt = systemPromptParts.filter(Boolean).join('\n');

  const history = context.shortTerm.map((entry) => ({
    role: entry.role === 'assistant' ? 'assistant' : 'user',
    content: entry.content,
  }));

  if (!history.length) {
    history.push({ role: 'user', content: incomingText });
  }

  return {
    messages: [{ role: 'system', content: systemPrompt }, ...history],
    debug: { context },
  };
}
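The tone and cue heuristics in this file are plain regex-table lookups: the first `toneHints` entry whose pattern matches wins. A minimal, self-contained sketch of that pattern (table and function mirrored from the diff above, runnable on its own):

```javascript
// Regex-table tone detection, mirroring toneHints/detectTone from src/prompt.js.
const toneHints = [
  { label: 'upset', regex: /(frustrated|mad|angry|annoyed|upset|wtf|ugh|irritated)/i },
  { label: 'sad', regex: /(sad|down|depressed|lonely|tired)/i },
  { label: 'excited', regex: /(excited|hyped|omg|yay|stoked)/i },
];

function detectTone(text) {
  if (!text) return null; // empty or missing input carries no tone signal
  // Array order matters: the first matching hint wins.
  const match = toneHints.find((hint) => hint.regex.test(text));
  return match?.label || null;
}

console.log(detectTone('ugh this is so annoying')); // 'upset' ("ugh" matches first)
console.log(detectTone('hello there'));             // null
```

Because `find` stops at the first hit, a message matching several hints (say, "tired" and "ugh") is classified by whichever entry comes earlier in the table.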
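The end of `buildPrompt` assembles the system prompt parts and short-term history into a chat-style `messages` array. A standalone sketch of just that assembly step (module calls like `prepareContext` are stubbed out; only the composition logic is reproduced from the diff above):

```javascript
// Standalone sketch of the final message assembly inside buildPrompt.
// systemPromptParts may contain null/empty entries; filter(Boolean) drops them,
// exactly as the real code does before joining with newlines.
function assembleMessages(systemPromptParts, shortTerm, incomingText) {
  const systemPrompt = systemPromptParts.filter(Boolean).join('\n');
  const history = shortTerm.map((entry) => ({
    role: entry.role === 'assistant' ? 'assistant' : 'user',
    content: entry.content,
  }));
  // With no short-term history, fall back to the incoming text as the sole user turn.
  if (!history.length) {
    history.push({ role: 'user', content: incomingText });
  }
  return [{ role: 'system', content: systemPrompt }, ...history];
}

const msgs = assembleMessages(
  ['System: You are Nova.', null, 'Relevant past memories:\n- none'],
  [],
  'hey nova',
);
console.log(msgs.length); // → 2 (system prompt + fallback user message)
```

Note the role mapping: anything that is not `assistant` is coerced to `user`, so the array always alternates between exactly those two roles after the single system message.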
File diff suppressed because it is too large