
Token tracking #58

Merged
simongdavies merged 5 commits into hyperlight-dev:main from simongdavies:token-tracking
Apr 16, 2026
Conversation

@simongdavies
Member

This pull request adds session-level token usage tracking and reporting to the agent. A new /tokens command displays a summary of cumulative input, output, and cache tokens, plus request and turn counts for the current session. These statistics are tracked throughout the session and printed automatically on exit, improving transparency around LLM usage. It also includes minor improvements to module handling and the system message.

Track cumulative input/output/cache tokens, request count, and turn
count across the session. Display via:

- /tokens command — show session totals at any time
- Exit summary — printed at all exit points (interactive, --prompt, SIGINT)
- Per-turn inline display unchanged (📊 line)

New state fields: totalInputTokens, totalOutputTokens,
totalCacheReadTokens, totalRequests, totalTurns

Accumulation happens in the assistant.usage event handler.
Turn count incremented on each onUserPromptSubmitted.
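
The accumulation described above can be sketched as follows. The state field names (totalInputTokens, etc.) come from this PR, but the usage-event shape and function names are illustrative assumptions, not the actual code:

```typescript
// Session counters tracked in agent state; field names are from the PR.
interface SessionTotals {
  totalInputTokens: number;
  totalOutputTokens: number;
  totalCacheReadTokens: number;
  totalRequests: number; // LLM requests made this session
  totalTurns: number;    // incremented on each user prompt submission
}

// Assumed shape of an assistant.usage event payload.
interface UsageEvent {
  inputTokens?: number;
  outputTokens?: number;
  cacheReadTokens?: number;
}

function createSessionTotals(): SessionTotals {
  return {
    totalInputTokens: 0,
    totalOutputTokens: 0,
    totalCacheReadTokens: 0,
    totalRequests: 0,
    totalTurns: 0,
  };
}

// Runs in the assistant.usage event handler; undefined fields count as 0.
function accumulateUsage(state: SessionTotals, usage: UsageEvent): void {
  state.totalInputTokens += usage.inputTokens ?? 0;
  state.totalOutputTokens += usage.outputTokens ?? 0;
  state.totalCacheReadTokens += usage.cacheReadTokens ?? 0;
  state.totalRequests += 1;
}
```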

Signed-off-by: Simon Davies <simongdavies@users.noreply.github.com>
Add guidance to system message encouraging the LLM to use
register_module when it identifies missing capabilities rather
than just describing gaps. Modules persist across sessions.

Signed-off-by: Simon Davies <simongdavies@users.noreply.github.com>
…rnings

When register_module saves a user module, it now writes the sourceHash
to the .json metadata. Previously only loadModuleAsync set this field,
so re-registering a module with updated code would leave a stale hash
that the validator flagged as a mismatch warning.
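
A minimal sketch of the fix, assuming a SHA-256 hash over the module source; the metadata shape and helper names here are hypothetical, not the PR's actual code:

```typescript
import { createHash } from "node:crypto";

// Hypothetical metadata shape; the real module store may carry more fields.
interface ModuleMetadata {
  name: string;
  sourceHash: string;
}

function computeSourceHash(source: string): string {
  return createHash("sha256").update(source).digest("hex");
}

// Recomputing the hash at save time means re-registering a module with
// updated code refreshes it, so the validator no longer flags a stale
// mismatch against a hash written by an earlier load.
function buildMetadata(name: string, source: string): ModuleMetadata {
  return { name, sourceHash: computeSourceHash(source) };
}
```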

Signed-off-by: Simon Davies <simongdavies@users.noreply.github.com>
@simongdavies simongdavies added the enhancement New feature or request label Apr 16, 2026
Copilot AI review requested due to automatic review settings April 16, 2026 18:48
Contributor

Copilot AI left a comment


Pull request overview

Adds session-level token usage tracking to the agent and exposes it via a new /tokens command, with an automatic summary printed on exit.

Changes:

  • Track cumulative input/output/cache tokens plus request/turn counts in AgentState and update them from assistant.usage events.
  • Add /tokens slash command and an exit-time token summary renderer.
  • Improve module metadata persistence by keeping sourceHash in sync; expand system message guidance around reusable modules.

Reviewed changes

Copilot reviewed 8 out of 8 changed files in this pull request and generated 6 comments.

Summary per file:

  • src/agent/system-message.ts: Adds guidance encouraging creation/import of reusable ha:* modules.
  • src/agent/state.ts: Extends AgentState with token/request/turn counters and initializes them in the factory.
  • src/agent/slash-commands.ts: Adds the /tokens command to print the session token summary.
  • src/agent/module-store.ts: Ensures module JSON metadata includes an updated/truncated sourceHash on save.
  • src/agent/llm-output.ts: Introduces formatTokenSummary() used by /tokens and exit-time reporting.
  • src/agent/index.ts: Prints token summary on exit paths and increments turn count on prompt submission.
  • src/agent/event-handler.ts: Accumulates token/request totals from assistant.usage SDK events.
  • src/agent/commands.ts: Registers /tokens in the help/tab-completion command registry.

Comment threads:

  • src/agent/module-store.ts (outdated)
  • src/agent/state.ts (outdated)
  • src/agent/event-handler.ts (outdated)
  • src/agent/index.ts (outdated)
  • src/agent/llm-output.ts (outdated)
  • src/agent/slash-commands.ts
1. event-handler.ts: guard against undefined token values before accumulating
2. llm-output.ts: format large token numbers with toLocaleString, fix alignment
3. index.ts: use guard function for exit summary (no tokens = no summary)
4. state.ts: add doc comment clarifying totalRequests tracks premium requests
5. module-store.ts: use computeTruncatedHash instead of manual slice
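
Points 2 and 3 above could look roughly like this; formatTokenSummary exists in the PR, but this body is a sketch under assumed names:

```typescript
// Same counter shape as used elsewhere in the agent state.
interface SessionTotals {
  totalInputTokens: number;
  totalOutputTokens: number;
  totalCacheReadTokens: number;
  totalRequests: number;
  totalTurns: number;
}

// Returns null when nothing was tracked, so exit paths can skip the
// summary entirely (no tokens = no summary). Large counts are grouped
// with toLocaleString for readability.
function formatTokenSummary(s: SessionTotals): string | null {
  if (s.totalRequests === 0) return null;
  const fmt = (n: number) => n.toLocaleString("en-US");
  return [
    `Input tokens:  ${fmt(s.totalInputTokens)}`,
    `Output tokens: ${fmt(s.totalOutputTokens)}`,
    `Cache reads:   ${fmt(s.totalCacheReadTokens)}`,
    `Requests: ${s.totalRequests}  Turns: ${s.totalTurns}`,
  ].join("\n");
}
```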

Signed-off-by: Simon Davies <simongdavies@users.noreply.github.com>
@simongdavies simongdavies enabled auto-merge (squash) April 16, 2026 19:50
@simongdavies simongdavies disabled auto-merge April 16, 2026 19:50
@simongdavies simongdavies merged commit de73c25 into hyperlight-dev:main Apr 16, 2026
11 checks passed