# ADHDbot

## Quick Start
- Copy the example environment file and fill in your secrets:

  ```shell
  cp .env.example .env  # edit .env to insert your real OPENROUTER_API_KEY, DISCORD_BOT_TOKEN, TARGET_USER_ID, etc.
  ```

- Bring up the stack with Docker Compose (recommended; includes host persistence for logs/notes):

  ```shell
  docker compose up -d --build
  ```

  `./memory` is bind-mounted into the container (`./memory:/app/memory`), so any saved notes appear in the repo directly. `.env` is auto-loaded and the FastAPI service is exposed on `http://localhost:8000`.
- Or build/run manually if you prefer the raw Docker commands:

  ```shell
  docker build -t adhdbot .
  docker run --rm -p 8000:8000 --env-file .env -v "$PWD/memory:/app/memory" adhdbot
  ```
## API usage
Once the container is running, hit the API to trigger a prompt flow:
```shell
curl -X POST http://localhost:8000/run \
  -H "Content-Type: application/json" \
  -d '{
    "userId": "chelsea",
    "category": "general",
    "promptName": "welcome",
    "context": "Take a note that the user is testing the system you are being called from"
  }'
```
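The same call can be made from Python using only the standard library. This is a minimal sketch mirroring the curl example above; the helper names (`build_run_payload`, `trigger_run`) are illustrative, not part of the repo.

```python
import json
import urllib.request

def build_run_payload(user_id, category="general", prompt_name="welcome", context=""):
    """Assemble the JSON body expected by POST /run (field names per the curl example)."""
    return {
        "userId": user_id,
        "category": category,
        "promptName": prompt_name,
        "context": context,
    }

def trigger_run(payload, base_url="http://localhost:8000"):
    """POST the payload to the running container and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{base_url}/run",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With the container up, `trigger_run(build_run_payload("chelsea", context="Take a note..."))` is equivalent to the curl invocation.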
Endpoints:

- `GET /health` – simple liveness check.
- `POST /run` – triggers `Runner.run`; pass `userId`, `category`, `promptName`, and `context` to override defaults from `.env`.
Environment variables of interest (see `.env.example`):

- `OPENROUTER_API_KEY` – OpenRouter key used by `AIInteraction`.
- `DISCORD_BOT_TOKEN` / `TARGET_USER_ID` / `DISCORD_WEBHOOK_URL` – Discord plumbing.
- `PROMPT_CATEGORY`, `PROMPT_NAME`, `PROMPT_CONTEXT` – defaults for the `/run` endpoint.
- `LOG_PROMPTS` (default `1`) – when truthy, every outgoing prompt is logged to stdout so you can audit the final instructions sent to the LLM.
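A sketch of how these variables might be read at startup. The helper name and the fallback values for `PROMPT_CATEGORY`/`PROMPT_NAME` are illustrative assumptions; only the `LOG_PROMPTS` default of `1` comes from the README.

```python
import os

def load_settings():
    """Read the env vars described above. Fallback values here are illustrative
    placeholders, except LOG_PROMPTS which the README says defaults to 1."""
    return {
        "openrouter_api_key": os.getenv("OPENROUTER_API_KEY", ""),
        "discord_bot_token": os.getenv("DISCORD_BOT_TOKEN", ""),
        "target_user_id": os.getenv("TARGET_USER_ID", ""),
        "prompt_category": os.getenv("PROMPT_CATEGORY", "general"),
        "prompt_name": os.getenv("PROMPT_NAME", "welcome"),
        "prompt_context": os.getenv("PROMPT_CONTEXT", ""),
        # Treat "0", "", and "false" as off; anything else is truthy.
        "log_prompts": os.getenv("LOG_PROMPTS", "1").lower() not in ("0", "", "false"),
    }
```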
## Prompt + tooling customization
- All templates live in `prompts/defaultPrompts.json` (and sibling files). Edit them and restart the service to take effect.
- Shared tooling instructions live in `prompts/tool_instructions.md`. `AIInteraction` injects this file both into the system prompt and at the end of every user prompt, so any changes immediately affect how models emit `take_note`, `store_task`, or `schedule_reminder` JSON payloads.
- `PROMPTS.md` documents each category plus examples of the structured JSON outputs that downstream services can parse.
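The injection step above can be sketched roughly as follows. This is an assumption about the flow, not the repo's actual code: the function name, the nested `category → name` layout of `defaultPrompts.json`, and the concatenation order are all illustrative.

```python
import json
from pathlib import Path

def compose_prompt(category, name, context, prompts_dir="prompts"):
    """Illustrative sketch: load a template from defaultPrompts.json and append
    the shared tooling contract so the model always sees the JSON schema for
    take_note / store_task / schedule_reminder. Real AIInteraction code may differ."""
    templates = json.loads(Path(prompts_dir, "defaultPrompts.json").read_text())
    tool_instructions = Path(prompts_dir, "tool_instructions.md").read_text()
    template = templates[category][name]  # assumed nesting: category -> prompt name
    return f"{template}\n\n{context}\n\n{tool_instructions}"
```

Because the instructions file is appended at composition time, editing `tool_instructions.md` changes every subsequent prompt without touching the templates.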
## Memory + notes
- The memory subsystem watches LLM responses for fenced ```json payloads. When it sees `{"action": "take_note", ...}` it writes to `memory/<user>_memory.json` (now persisted on the host via the compose volume).
- Each entry includes the note text, UTC timestamp, and the raw metadata payload, so other services can build summaries or downstream automations from the same file.
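The ingestion step above can be sketched as a small parser. This is a hedged approximation of the behavior the README describes, not the repo's actual implementation; the function name, regex, and entry field names are assumptions.

```python
import json
import re
from datetime import datetime, timezone
from pathlib import Path

# Match a fenced ```json block and capture the JSON object inside it.
FENCED_JSON = re.compile(r"```json\s*(\{.*?\})\s*```", re.DOTALL)

def record_notes(user, response_text, memory_dir="memory"):
    """Scan an LLM response for fenced ```json payloads and append any
    take_note actions to memory/<user>_memory.json. Sketch only; the repo's
    real parser may differ in detail."""
    path = Path(memory_dir) / f"{user}_memory.json"
    entries = json.loads(path.read_text()) if path.exists() else []
    for match in FENCED_JSON.finditer(response_text):
        payload = json.loads(match.group(1))
        if payload.get("action") == "take_note":
            entries.append({
                "note": payload.get("note", ""),
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "metadata": payload,  # raw payload kept for downstream services
            })
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(entries, indent=2))
    return entries
```

Because the file is bind-mounted via the compose volume, entries written this way show up directly under `./memory` in the repo checkout.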
## Debugging tips
- Tail the container logs with `docker compose logs -f adhdbot` to see:
  - The final prompt (with tooling contract) sent to the model.
  - Memory ingestion messages like `[memory] Recorded note for <user>: ...`.
- If you swap models, change `openRouterModel` in `AIInteraction.py` (or surface it via env) and rebuild the container.
## Description

An AI bot to help with ADHD and medication compliance. It reminds the user of medication times and chores in an AI-powered fashion.