docs · v0.1.0

shiplog manual

Every command, every flag, every config file. Companion to the landing page — assumes you've installed shiplog and want the depth.

01 · Install

The canonical install path is a one-liner that downloads a signed, sha256-verified tarball and drops the shiplog binary on PATH.

$ curl -fsSL https://shiplog.life/install.sh | sh

Targets: macOS & Linux, amd64 & arm64. The script is a short POSIX sh script you can read before piping — it's idempotent (re-running upgrades in place) and short enough to audit in 30 seconds.

Environment knobs

| Variable | Default | Effect |
| --- | --- | --- |
| SHIPLOG_VERSION | resolved from https://get.shiplog.life/latest | Pin a specific tag, e.g. v0.1.0. |
| SHIPLOG_PREFIX | ~/.local/bin if writable, else /usr/local/bin | Override install directory. |
| SHIPLOG_NO_VERIFY | 0 | Skip the sha256 verification. Don't. |
| SHIPLOG_DOWNLOAD_HOST | get.shiplog.life | For mirrors or air-gapped environments. |

Set them on the receiving end of the pipe — the env block has to reach sh, not curl:

$ curl -fsSL https://shiplog.life/install.sh | SHIPLOG_VERSION=v0.1.0 sh

What gets installed

One file: shiplog, dropped at $SHIPLOG_PREFIX/shiplog. No daemons, no launch agents, no config — those are created lazily on first setup.

If ~/.local/bin isn't on your PATH, the installer prints a one-line export to add to your shell profile:

note: /Users/you/.local/bin is not on your $PATH. add this to your shell profile: export PATH="/Users/you/.local/bin:$PATH"

macOS code signing

macOS binaries are codesigned with a Developer ID Application certificate and notarized by Apple. Gatekeeper validates them online on first launch — no "unverified developer" prompt, no xattr -d com.apple.quarantine dance.

The binaries are not stapled (Apple's stapler tool only handles .app/.pkg/.dmg, not bare Mach-O). This is the same shape terraform, kubectl, gh, and mise ship in. Practical effect: first launch on an offline machine will show "cannot verify developer" until the machine is online; subsequent launches are fine.

Upgrade

Re-run the same one-liner. The installer overwrites the existing binary in place. Your config in ~/.shiplog/ is untouched — it's never read or written by the installer.

$ curl -fsSL https://shiplog.life/install.sh | sh
shiplog 0.1.1

To pin a downgrade: SHIPLOG_VERSION=v0.1.0 curl … | sh.

Verify the install

$ shiplog --version
shiplog v0.1.0
$ shiplog version
shiplog v0.1.0
commit: ae4f8d9
built: 2026-05-05T11:37:30Z
go: go1.22.3 darwin/arm64

Both work — --version prints the one-liner that install.sh greps for; shiplog version adds commit and build metadata.


02 · First run

One wizard. Pick your identity, the directory shiplog should crawl, and the AI provider it should call.

$ shiplog setup
· identity      you@example.com
· code roots    ~/code
· AI provider   anthropic · claude-sonnet-4-6
· schedule      every 1h (daemon)
· Paste your Anthropic API key. Stored in macOS Keychain, never in config.json.
  **********************************
setup complete. try shiplog scan --dry-run to see what it'll crawl.

What setup actually does

  1. Creates ~/.shiplog/ if it doesn't exist.
  2. Writes config.json with the answers above.
  3. Stores the AI key in the OS keyring (macOS Keychain on darwin, Secret Service on Linux).
  4. Seeds the changelog and state files lazily — they're written on first scan.

Nothing else. No network calls beyond the AI provider's test ping.

Re-running setup

shiplog setup is idempotent — it loads the current config, lets you change any field, then writes back. To start fresh, delete ~/.shiplog/config.json; setup will treat the next run as a first-time wizard.

Individual config fields can be edited without re-running the whole wizard. shiplog config list shows what's settable; get / set / unset mutate one key at a time.

$ shiplog config list
ai.base_url        Override provider endpoint (ollama/openai-compatible)
ai.model           Model name (e.g. gemini-2.5-flash, claude-sonnet-4-6, qwen3:8b)
ai.provider        AI provider: ollama | anthropic | gemini
default_trigger    When to run: any-commit | push | manual
granularity        Bundle shape: per-commit | per-day | per-week
schedule           Background scan interval (e.g. 30m, 1h)
summarize_others   Also summarize commits not authored by you
token_budget       Max tokens to spend per scan
$ shiplog config set granularity per-week
granularity = per-week

Fields not in config list (identities, roots, repos, exclude_repos, sinks) are managed by their dedicated commands — repo add, repo rm, sink add, etc. — or by editing config.json directly with shiplog config edit.

What lives where

| Path | Contains |
| --- | --- |
| ~/.shiplog/config.json | Identities, roots, AI, schedule, sinks, repo overrides. |
| ~/.shiplog/changelog.json | Source of truth — every entry, sorted newest-first. |
| ~/.shiplog/voice.md | Style instructions injected into AI prompts. |
| ~/.shiplog/ignore.conf | Gitignore-style patterns excluded from diffs. |
| ~/.shiplog/projects.json | Project name + public URL + assigned repos. |
| OS keyring | API keys (AI provider, R2/S3 sinks). Never on disk. |

03 · scan

The workhorse. Walks every tracked repo since the last scan, groups new commits into bundles, summarizes each bundle through your configured AI, writes the result to ~/.shiplog/changelog.json, and fans out to any configured sinks.

$ shiplog scan
shiplog scan — 6 repo(s), model=claude-sonnet-4-6, sinks=[local only]
· ~/code/storefront (main, 2 commit(s), 1 bundle(s))
· ~/code/notes-app (main, 7 commit(s), 1 bundle(s))
· ~/code/portfolio (main, 0 commit(s)) — up to date
== ~/code/notes-app ==
-- 2026-05-04 · you@example.com --
Cleaned up the search index so deleted notes drop out of results immediately instead of waiting for the nightly rebuild. Added keyboard shortcuts for the sidebar.
2 entries written. cursor advanced.

Cursor model

Each tracked repo has a cursor in state.json: the SHA of the last commit shiplog summarized. On scan, shiplog walks git log <cursor>..HEAD, groups the new commits into bundles, summarizes them, and advances the cursor only if the AI call succeeds. Failures leave the cursor in place so the next run retries the same range.

This is what makes scans cheap to run on a schedule: a no-op scan is essentially git rev-parse HEAD per repo + a comparison, no AI calls, no diff reads.
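The advance-only-on-success rule can be sketched in a few lines of Python (an illustrative sketch, not shiplog's actual Go implementation; the `commits` list stands in for the `git log <cursor>..HEAD` walk):

```python
def scan_repo(state, repo, commits, summarize):
    """Advance the cursor only when summarization succeeds.

    commits: full history, oldest first (stands in for `git log`).
    state:   {repo_path: last_summarized_sha} -- the cursor.
    """
    cursor = state.get(repo)
    start = commits.index(cursor) + 1 if cursor in commits else 0
    pending = commits[start:]          # the <cursor>..HEAD range
    if not pending:
        return []                      # no-op scan: no AI call, no diff read
    summarize(pending)                 # raises on AI failure -> cursor stays put
    state[repo] = pending[-1]          # advance only after success
    return pending
```

A failed `summarize` leaves `state` untouched, so the next run retries the same range, exactly as described above.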

Bundles & granularity

A bundle is one AI call that produces one changelog entry. Granularity controls how commits group into bundles:

| Mode | One bundle is… | Best for |
| --- | --- | --- |
| per-commit | Each commit standalone | Repos where every commit is meaningful (libraries, infra). |
| per-day | All commits with the same author-date | Active product repos with many small commits. |
| per-week | All commits in the same ISO week | Background repos where week-level cadence is enough. |

Set the default with shiplog config set granularity per-day; override per-repo via shiplog repo set <path> granularity=per-week; override for one run with --group=….
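A rough Python sketch of the grouping rule (illustrative only; shiplog's real bundle keys and file names may differ in detail):

```python
from datetime import date
from itertools import groupby

def bundle_key(mode, commit):
    """Key commits so that one bundle = one AI call = one entry."""
    d = commit["date"]                     # author-date as a datetime.date
    if mode == "per-commit":
        return commit["sha"]
    if mode == "per-day":
        return d.isoformat()
    if mode == "per-week":
        year, week, _ = d.isocalendar()    # ISO week key, e.g. "2026-W19"
        return f"{year}-W{week:02d}"
    raise ValueError(f"unknown granularity: {mode}")

def bundles(mode, commits):
    """Group consecutive commits (already in log order) into bundles."""
    return [list(g) for _, g in groupby(commits, key=lambda c: bundle_key(mode, c))]
```

Three commits spread across two days yield three, two, or one bundle depending on the mode.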

Triggers

Triggers control when shiplog considers a repo's commits ready to summarize:

| Trigger | Meaning |
| --- | --- |
| any-commit | Default. Every commit past the cursor is fair game. |
| push | Only commits that have been pushed to the remote. |
| manual | Never auto-pick; the repo only gets scanned when explicitly named with --repo. |

Set the default with shiplog config set default_trigger push; override per-repo with shiplog repo set <path> trigger=manual.

Flags

| Flag | Purpose |
| --- | --- |
| --since=<cutoff> | Only consider commits at or after this cutoff. Accepts today, Nd (e.g. 7d), or YYYY-MM-DD. |
| --repo=<path> | Limit scan to one repo (path or ~/path). |
| --force | Ignore saved cursors and re-walk the full range. Pair with --since to bound the walk. |
| --group=<mode> | One-run override of granularity (per-commit, per-day, per-week). |
| --dry-run | Show what would be summarized without calling the AI. |
| --dump=<dir> | Write each bundle's full prompt to a file in <dir> instead of calling the AI. Useful for debugging voice drift. |
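The three accepted --since forms are easy to model. This Python sketch is illustrative, not shiplog's actual parser:

```python
import re
from datetime import date, datetime, timedelta

def parse_since(cutoff, today=None):
    """Resolve a --since cutoff to a date: today, Nd, or YYYY-MM-DD."""
    today = today or date.today()
    if cutoff == "today":
        return today
    m = re.fullmatch(r"(\d+)d", cutoff)    # relative form: 7d = seven days back
    if m:
        return today - timedelta(days=int(m.group(1)))
    return datetime.strptime(cutoff, "%Y-%m-%d").date()
```

Anything that fits none of the three forms raises a ValueError from strptime, which is the sensible failure mode for a typo'd flag.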

What the AI sees

For each bundle, the prompt includes the voice.md contents, the commit list with messages, the filtered diff summary, and a few recent entries from the same project for stylistic continuity.

Use --dump to see the exact prompt for a real bundle:

$ shiplog scan --dump=/tmp/prompts --since=7d
wrote 4 prompt(s) to /tmp/prompts/
$ ls /tmp/prompts/
storefront-2026-W18.txt  notes-app-2026-05-04.txt  portfolio-2026-W18.txt  cli-tool-2026-05-03.txt

04 · log

Browse what you've shipped. Two modes, picked automatically: interactive when stdout is a TTY, flat list when piped.

Interactive

Ship log                                                40 total
▸ 1w   Notes App    Search now drops deleted notes immediately.
  2w   Storefront   Password reset flow shipped
  3w   Notes App    Sidebar keyboard shortcuts
  4w   Storefront   Previously, an abrupt checkout error was silent…
  1mo  Portfolio    Refreshed the marketing page above the fold.
  1mo  Portfolio    Site now adapts to your system's light or dark…
↑↓ Move · Enter Open · E Edit · D Delete · / Filter · ? Help · Esc Back
| Key | Action |
| --- | --- |
| ↑↓ / j k | Move highlight. |
| Enter | Open the highlighted entry full-screen (note + tags + media + timestamp). |
| e | Edit the highlighted entry — drops into the same form as shiplog edit. |
| d | Delete the highlighted entry. Confirms with y / n. |
| / | Filter. See "filter syntax" below. |
| ? | Toggle the help overlay. |
| Esc | Close filter / help / drop back to the menu. |

Filter

/ opens a filter prompt. Whatever you type matches as a case-insensitive substring against the project name, the tags, and the note body all at once. Clear the filter with Esc.
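The matching rule is simple enough to state as code. This is an illustrative sketch, not the real implementation:

```python
def matches(entry, query):
    """Case-insensitive substring match against project, tags, and note."""
    q = query.lower()
    fields = [entry["project"], entry["note"], *entry.get("tags", [])]
    return any(q in f.lower() for f in fields)
```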

Piped

Either pipe the command (stdout isn't a TTY) or pass --oneline explicitly. Output is a flat, grep-friendly list.

$ shiplog log --oneline | head -3
2026-05-04T18:33:11+05:30  Notes App   Search now drops deleted notes immediately.
2026-04-27T09:00:00+05:30  Storefront  Password reset flow shipped
2026-04-20T11:14:02+05:30  Notes App   Sidebar keyboard shortcuts

Flags

| Flag | Purpose |
| --- | --- |
| -n, --limit=N | Max entries to show. Default 50; 0 = all. |
| --since=<cutoff> | Only entries from this window: today, Nd, or YYYY-MM-DD. |
| --project=<name> | Filter by project (case-insensitive substring). |
| --oneline | Force the flat one-row-per-entry output even on a TTY. |

05 · add

Manual entry — for shipping things that aren't commits (a launch, a talk, a write-up). Same shape as the AI-generated entries, written to the same changelog.json.

$ shiplog add
· Project  Notes App
· Note     Wrote up the search-index rewrite for the team blog.
· Tags     writing, milestone
· Media    (none)
entry written. published to: local, r2:my-bucket

Fields

| Field | Required | Notes |
| --- | --- | --- |
| Project | Yes | Free-text project name. If it matches an entry in projects.json, the public URL is attached automatically. New names get added to the registry — you'll be prompted for a public URL on first use. |
| Note | Yes | Multi-line. Enter inserts a newline, Ctrl-D finishes input. |
| Tags | No | Comma-separated. Lowercased and deduped on write. |
| Media | No | One or more file paths. See "Media handling" below. |

Media handling

Files passed to the Media field are uploaded to the first sink that supports media (R2, S3) and the public URL is stored in the entry. Local-only setups skip this step.

shiplog add is interactive — there are no flags today. Non-interactive bulk import is coming soon.

Editing

shiplog edit opens an interactive picker (same shape as log) to pick the entry, then drops into the same form as add with the current values pre-filled.

Special inputs in the Media field when editing:


06 · status

What does shiplog know about right now? One row per tracked repo.

$ shiplog status
AGE    REPO                ENTRIES  PENDING  SOURCE
2h     ~/code/notes-app    18       —        roots
2h     ~/code/storefront   7        —        roots
2h     ~/code/portfolio    12       —        roots
1d     ~/code/cli-tool     24       3        roots
never  ~/code/sandbox      0        —        roots

Columns

| Column | Meaning |
| --- | --- |
| AGE | Time since the last successful scan touched this repo. never means scanned but nothing summarized; — means newly discovered, never scanned. |
| REPO | Repo path, with ~ for your home directory. |
| ENTRIES | Total changelog entries for this project. |
| PENDING | Commits past the cursor that would be summarized on next scan. — = nothing pending. |
| SOURCE | Why shiplog tracks it: roots (auto-discovered under a code_roots entry), added (explicit repo add), ignored (would be auto-discovered but excluded). |

shiplog status takes no flags today. JSON output and pending-only filtering are coming soon.


07 · daemon

Run scan on a schedule. Foreground by default; pass --background to detach.

Foreground

$ shiplog daemon
shiplog daemon — every 30m, pid 47213
── 2026-05-05 09:00:01 ──
shiplog scan — 6 repo(s)
· ~/code/notes-app (main, 3 commit(s), 1 bundle(s))
· 5 other repos — up to date
1 entry written.

The interval comes from config.schedule (default 30m). Override per-run with --schedule 1h. Ctrl-C exits cleanly — the in-flight scan finishes first.

Background

$ shiplog daemon --background
shiplog daemon detached, pid 47213
log:    /Users/you/.shiplog/daemon.log
watch:  shiplog daemon logs
status: shiplog daemon status
stop:   shiplog daemon stop

Detaches from the controlling terminal, redirects stdout and stderr to ~/.shiplog/daemon.log, and writes the child pid to ~/.shiplog/daemon.pid. Survives the shell that started it; survives a logout if your platform's session manager doesn't reap orphaned processes (most don't).

Trying to start a second background daemon while one is already running fails with a clear message — the pidfile is the lock.

Watching the log

$ shiplog daemon logs
── 2026-05-05 09:00:01 ──
shiplog scan — 6 repo(s)
· ~/code/notes-app (main, 3 commit(s), 1 bundle(s)) ✓
1 entry written.
# Ctrl-C exits the tail; the daemon keeps running.

logs is a thin wrapper around tail -F — it follows rotations and never exits on its own. Ctrl-C stops the tail, not the daemon.

Snapshot

$ shiplog daemon status
daemon:  running
pid:     47213
started: 2026-05-05 09:00:01 (3h47m ago)
log:     /Users/you/.shiplog/daemon.log

One-shot. Reads the pidfile, validates the process is alive, prints uptime from the pidfile's mtime. Reports not running if there's no pidfile or the recorded pid is stale.
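The pidfile check can be sketched as follows (illustrative Python; shiplog itself is a Go binary, and its exact output format may differ):

```python
import os
import time

def daemon_status(pidfile):
    """Read the pidfile, verify the pid is alive, derive uptime from mtime."""
    try:
        with open(pidfile) as f:
            pid = int(f.read().strip())
        os.kill(pid, 0)                    # signal 0: existence check only
    except (FileNotFoundError, ValueError, ProcessLookupError):
        return "not running"               # no pidfile, or stale pid
    uptime = int(time.time() - os.path.getmtime(pidfile))
    return f"running (pid {pid}, up {uptime}s)"
```

Sending signal 0 is the standard POSIX idiom for "does this pid exist" without affecting the process.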

Stopping

$ shiplog daemon stop
daemon (pid 47213) stopped

Sends SIGTERM, waits up to 10 seconds for the daemon to flush its in-flight scan, then removes the pidfile. If the daemon hangs longer than that, stop reports the still-running pid and exits — at which point you can kill -9 if needed.

"Bring back to foreground"

There isn't one — once the daemon detaches, its stdio is a file and the parent shell is gone. To interact with a running scan, stop the background daemon and run shiplog daemon in your terminal:

$ shiplog daemon stop && shiplog daemon

If you want to truly attach/detach to a running process (interrupt mid-scan, pipe stdin, etc.), don't use --background at all — run shiplog inside tmux or screen and detach the multiplexer instead.

One-shot

--once runs a single scan and exits. Pair with cron when you'd rather not have a long-running process at all:

0 * * * *  /Users/you/.local/bin/shiplog daemon --once >> /Users/you/.shiplog/cron.log 2>&1

Schedule format

A Go duration: 30m, 1h, 4h, etc. Minimum 1 minute. Set the default with shiplog config set schedule 1h; override per-run with --schedule.
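A sketch of validating the documented forms, in Python for illustration (Go's own time.ParseDuration accepts more unit suffixes; this handles only the s/m/h forms the docs mention):

```python
import re

def parse_schedule(s):
    """Validate simple Go-duration forms (30m, 1h, 4h, 1h30m).

    Returns total seconds; enforces the documented 1-minute minimum.
    """
    units = {"s": 1, "m": 60, "h": 3600}
    total, pos = 0, 0
    for m in re.finditer(r"(\d+)([smh])", s):
        if m.start() != pos:               # reject garbage between terms
            raise ValueError(f"bad duration: {s!r}")
        total += int(m.group(1)) * units[m.group(2)]
        pos = m.end()
    if pos != len(s) or total == 0:
        raise ValueError(f"bad duration: {s!r}")
    if total < 60:
        raise ValueError("minimum schedule is 1 minute")
    return total
```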

Survives logout?

The detached daemon is in a fresh session (setsid), so it's not killed when its parent shell exits. It is killed if your OS terminates user processes on logout — most desktop sessions don't, but some Linux distros configured with KillUserProcesses=yes in logind.conf do. If that bites you, write a systemd user unit (a launchd plist on macOS) — that surface is coming soon as a built-in installer; for now, hand-write one with ExecStart=shiplog daemon (no --background needed; the supervisor handles backgrounding).


08 · Configuration files

Everything lives in ~/.shiplog/. Files are JSON or plain text — no binary state, no databases. Edit them by hand if you want; shiplog config validate sanity-checks before any subcommand reads them.

config.json

{
  "identities": ["you@example.com"],
  "summarize_others": false,
  "granularity": "per-day",
  "default_trigger": "any-commit",
  "schedule": "30m",
  "token_budget": 40000,
  "roots": ["~/code"],
  "repos": ["~/work/secret-side-project"],
  "exclude_repos": ["~/code/sandbox"],
  "skip_dirs": ["node_modules", ".venv", "vendor", "target", "build", "dist", ".next", "__pycache__"],
  "ai": {
    "provider": "anthropic",
    "model": "claude-sonnet-4-6",
    "base_url": ""
  },
  "sinks": [
    {
      "type": "r2",
      "bucket": "my-bucket",
      "account_id": "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
      "public_host": "data.example.com"
    }
  ],
  "granularity_overrides": {
    "~/code/cli-tool": "per-week"
  },
  "trigger_overrides": {
    "~/code/cli-tool": "push"
  }
}
| Field | Notes |
| --- | --- |
| identities | Git emails that count as "you." Seeded from git config user.email on first run. |
| summarize_others | If true, summarize commits not authored by anyone in identities. Default false. |
| granularity | per-commit · per-day · per-week. |
| default_trigger | any-commit · push · manual. |
| schedule | Go duration string, used by daemon. |
| token_budget | Soft ceiling on tokens spent per scan run. |
| roots | Directories scanned recursively for git repos. |
| repos | Explicit repos outside any root. |
| exclude_repos | Repos to skip even if they fall under a root. |
| skip_dirs | Directory basenames the root walk skips descending into. |
| ai.api_key | Never written. Keys live in the OS keyring. |
| granularity_overrides | Per-repo path → granularity override. |
| trigger_overrides | Per-repo path → trigger override. |

Tildes in path-valued fields are expanded at read time. Edit by hand with shiplog config edit; the binary parses on next read.

changelog.json

An array of entries, sorted newest-first. One entry per shipped thing.

[
  {
    "id": "f12e34c5d678",
    "timestamp": "2026-05-04T18:33:11+05:30",
    "project_id": "a3b4c5d6",
    "project": "Notes App",
    "url": "https://notes.example.com",
    "note": "Cleaned up the search index so deleted notes drop out…",
    "tags": ["search", "performance"],
    "media": "https://data.example.com/ship/media/20260504T183311-screenshot.png",
    "author": "you@example.com"
  }
]
| Field | Type | Notes |
| --- | --- | --- |
| id | string | Stable entry ID; used for dedup and edit/delete. |
| timestamp | string | ISO 8601 with offset. |
| project_id | string | 8-char hash. See "Project ID derivation" below. |
| project | string | Display name from projects.json. |
| url | string | Public URL for the project. |
| note | string | The summary itself. |
| tags | string[] | Lowercased, deduped. |
| media | string or string[] | String for one file, array for many. Both shapes round-trip cleanly. |
| author | string | The git email associated with the bundle's commits. |

Retention. After every write, the changelog is trimmed to entries from the last 180 days. Older entries are silently dropped — long-term archive is the sink's job, not the source-of-truth file's.
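The trim rule, sketched in Python (illustrative only; field names match the changelog.json shape above):

```python
from datetime import datetime, timedelta, timezone

def trim(entries, now=None, keep_days=180):
    """Drop entries older than the retention window after every write."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=keep_days)
    # timestamps are ISO 8601 with offset, so fromisoformat handles them
    return [e for e in entries
            if datetime.fromisoformat(e["timestamp"]) > cutoff]
```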

state.json

Per-repo cursors and metadata. Don't edit by hand — scan --force is the supported way to re-walk.

{
  "~/code/notes-app": {
    "last_sha": "a3b4c5d6e7f8…",
    "last_scan": "2026-05-05T09:00:00+05:30",
    "last_entry_id": "f12e34c5d678"
  }
}

projects.json

Keyed by project ID, with display name and metadata.

{
  "a3b4c5d6": {
    "name": "Notes App",
    "url": "https://notes.example.com",
    "repo": "git@github.com:you/notes-app.git",
    "repos": ["~/code/notes-app", "~/code/notes-app-mobile"]
  }
}

Project ID derivation

For repos directly under one of your code_roots, the ID is md5(origin_url)[:8]. For nested repos (e.g. ~/code/mailbox/Mbox), the ID is md5(parent_dir_name)[:8] — this lets monorepo-ish layouts group multiple inner repos under a single "mailbox" project. First time a project ID is seen, shiplog prompts for a name and public URL.
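The derivation, sketched in Python (illustrative; assumes POSIX-style paths as they appear in config.json):

```python
import hashlib
from pathlib import PurePosixPath

def project_id(repo_path, origin_url, roots):
    """md5(origin_url)[:8] for repos directly under a root;
    md5(parent_dir_name)[:8] for nested repos."""
    parent = PurePosixPath(repo_path).parent
    seed = origin_url if str(parent) in roots else parent.name
    return hashlib.md5(seed.encode()).hexdigest()[:8]
```

So ~/code/mailbox/Mbox and any sibling repo under ~/code/mailbox/ hash the same "mailbox" seed and share one project ID.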

voice.md

The prompt fragment that shapes how the AI sounds. Plain Markdown, injected verbatim into every summarization prompt before the diff. See voice.md below for examples.

ignore.conf

Gitignore-style patterns applied to diffs before the prompt is built. Patterns must live inside a [section] header — the section name is either an absolute path (literal match) or a glob. Multiple matching sections union their patterns.

# Match one repo by path
[~/code/mbox]
*.sql
migrations/
secrets/

# Match a glob — applies to every repo under ~/code/work/
[~/code/work/*]
*.env
*.lock
vendor/

Add a pattern interactively with shiplog repo ignore <pattern> from inside the repo. Skip-entire-directory globs (like node_modules) are handled separately by the top-level skip_dirs config — those don't even appear in the discovery walk, let alone in diffs.
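Section matching can be sketched with fnmatch (an illustrative sketch; the real matcher may differ on details such as glob behavior across path separators):

```python
import fnmatch

def patterns_for(repo_path, sections):
    """Union the patterns of every [section] whose header matches the repo.

    sections: {header: [patterns]}; a header is a literal path or a glob.
    """
    out = []
    for header, pats in sections.items():
        if header == repo_path or fnmatch.fnmatch(repo_path, header):
            out.extend(pats)
    return out
```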


09 · ai

The provider that writes your summaries. Bring your own key; nothing is proxied through us.

Supported providers

| Provider | Default model | Notes |
| --- | --- | --- |
| anthropic | claude-sonnet-4-6 | Hosted; key required. |
| gemini | gemini-2.5-flash | Hosted; key required. |
| ollama | qwen3:8b | Local; no key. Endpoint defaults to http://127.0.0.1:11434. |

coming soon: OpenAI.

Setup

$ shiplog ai setup
· Provider  [anthropic] gemini openai ollama
· Model     claude-sonnet-4-6
· API key   **********************************
stored in Keychain as shiplog:ai:anthropic
test call returned 27 tokens in 0.9s

Switching providers

shiplog ai setup overwrites the active provider in config.json. The previous provider's key stays in the keyring — switching back doesn't ask for it again. To remove a stored key:

$ shiplog ai rm anthropic
removed Anthropic key from keyring

Run shiplog ai rm with no argument to pick from a list interactively.

Listing

$ shiplog ai list
anthropic  key in keyring
gemini     key in keyring
ollama     local — no key needed

Test

shiplog ai test sends a one-line prompt to the configured provider and prints the response. Useful to validate a key without burning a full scan.

Local Ollama

Set provider to ollama and shiplog talks to http://127.0.0.1:11434 by default. Override with shiplog config set ai.base_url http://gpu-box.local:11434. No data leaves the machine in this configuration.

Spend

shiplog ai budget exists as a placeholder for month-to-date AI spend tracking. Token accounting is coming soon; today the command prints a "not tracked yet" notice.


10 · sink

Sinks publish your changelog elsewhere. ~/.shiplog/changelog.json is always written; sinks are extras. Multi-sink: every configured sink runs on every successful entry write.

Supported types

| Type | Writes | Use for |
| --- | --- | --- |
| file | Mirror copy at a chosen path | Backups, alternate locations. |
| csv | UTF-8 BOM + CRLF Excel-friendly mirror | Spreadsheets. |
| r2 | Object on Cloudflare R2 + media uploads | The /now-style page on your site. |
| s3 | Object on AWS S3 + media uploads | Same, on AWS. |

Planned: webhook, sftp.

Adding a sink

$ shiplog sink add r2
· Cloudflare account id  aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
· Bucket                 my-bucket
· Public host            data.example.com
· R2 access key id       **********
· R2 secret access key   **********
added r2 sink (bucket=my-bucket)
credentials in Keychain as sink:r2:my-bucket:*

R2 layout

The R2 sink writes to two paths in your bucket: changelog.json at the bucket root, and uploaded media files under ship/media/.

Public URLs use the public_host from sink config: https://data.example.com/changelog.json, https://data.example.com/ship/media/…. The r2 sink talks to the Cloudflare API directly — no wrangler dependency at runtime.

S3 layout

Same, but uses standard AWS SDK auth: access_key_id + secret_access_key + region + bucket. Optional endpoint for S3-compatible services (Backblaze B2, MinIO, Wasabi).

CSV sink

One row per entry, columns: timestamp, project, url, note, tags, media. Encoded UTF-8 with a BOM and CRLF line endings so Excel opens it without prompting for encoding. Tags are joined with semicolons; so are multiple media URLs.
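An illustrative Python sketch of the encoding rules (not the actual sink code):

```python
import csv
import io

def to_csv(entries):
    """Excel-friendly mirror: UTF-8 BOM + CRLF, one row per entry."""
    buf = io.StringIO()
    w = csv.writer(buf, lineterminator="\r\n")
    w.writerow(["timestamp", "project", "url", "note", "tags", "media"])
    for e in entries:
        media = e.get("media", "")
        if isinstance(media, list):        # string-or-array round-trip
            media = ";".join(media)
        w.writerow([e["timestamp"], e["project"], e.get("url", ""),
                    e["note"], ";".join(e.get("tags", [])), media])
    return "\ufeff" + buf.getvalue()       # BOM so Excel detects UTF-8
```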

Listing & removing

$ shiplog sink list
#  TYPE  TARGET
0  r2    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/my-bucket (public: data.example.com)
1  csv   ./changelog.csv
$ shiplog sink rm 0
removed sink #0 (r2)
removed credentials from keyring

Run shiplog sink rm with no argument to pick from a list interactively. shiplog sink test picks one sink (interactively or by index) and republishes the current changelog to it — handy for validating credentials after a rotation.

Republishing

shiplog sync republishes the local changelog to every configured sink. Useful after adding a sink to a long-running setup, or when restoring from backup.

$ shiplog sync
· r2 / my-bucket         42 entries written
· csv / ./changelog.csv  42 entries written

11 · repo

Decide what gets crawled. Anything under your code_roots is auto-discovered. Use repo add to track something outside the roots, repo rm to exclude one inside.

List

$ shiplog repo list
REPO               GRAN      TRIGGER     LAST
~/code/storefront  per-week  any-commit  2h
~/code/portfolio   per-week  any-commit  2h
~/code/notes-app   per-day   any-commit  2h
~/code/cli-tool    per-day   push        1d

Per-repo overrides

Override granularity or trigger per-repo. Pass key=value:

$ shiplog repo set ~/code/cli-tool granularity=per-week
$ shiplog repo set ~/code/cli-tool trigger=push

| Key | Values |
| --- | --- |
| granularity | per-commit · per-day · per-week |
| trigger | any-commit · push · manual |

Set the value to the empty string to clear an override: shiplog repo set ~/code/cli-tool granularity=.

Adding & excluding

Anything under your roots is auto-discovered. repo add tracks a path outside the roots; repo ignore appends a pattern to ~/.shiplog/ignore.conf; repo rm adds the path to exclude_repos so it's skipped on the next scan.

$ shiplog repo add ~/work/secret-side-project
tracking ~/work/secret-side-project
$ shiplog repo rm ~/code/sandbox
excluded ~/code/sandbox from future scans

Projects

A project is the display name + URL that show up next to entries on the page. Projects auto-register the first time a new repo gets summarized — shiplog uses the parent directory or the origin URL to derive a project ID and prompts you for the human name on first sight. Most users never touch shiplog project directly.

When you do, the relevant subcommands are:

All of these accept interactive pickers when called with no arguments.


12 · voice.md

The prompt fragment that shapes how the AI sounds. Plain Markdown injected verbatim before each diff. This is where you teach shiplog to sound like you.

Default seed

# Voice

Write in first-person plural ("we shipped…") or impersonal active voice
("Search now drops deleted notes…"). Skip "I added" / "I fixed."

Be specific. Name the user-visible change, not the implementation.
"Added a debounce to search input" → "Search no longer flickers while
you type."

Skip filler: no "we are excited," no "today we are pleased to."

One paragraph per bundle. 2–4 sentences. No headers, no lists, no
markdown.

Drop boring commits — if the bundle is just version bumps and lint
fixes, write a single sentence acknowledging there's nothing
user-visible.

Editing

shiplog voice edit opens voice.md in $EDITOR. shiplog voice show prints the current contents. shiplog voice path prints the file path so you can hand it to other tools.

How it threads into prompts

For each bundle, the prompt assembled by shiplog looks roughly like:

System: You are summarizing a batch of git commits as one
        changelog entry. Follow the voice instructions strictly.

User:   <voice.md verbatim>

        ## Project: Notes App
        ## Branch: main
        ## Commits (3):
        - 1a2b3c4 (you@example.com) Add reset password page
        - 5d6e7f8 (you@example.com) Wire up forgot-password email
        - 9a0b1c2 (you@example.com) Add captcha to auth forms

        ## Diff summary:
        <filtered diff>

        ## Recent entries from this project (for stylistic continuity):
        - "Search now drops deleted notes immediately…"
        - "Sidebar keyboard shortcuts…"

        Write one paragraph following the voice above.

Use shiplog scan --dump=DIR to see the actual fully-assembled prompt for any bundle.

Drift

If the AI starts drifting back to its trained "we are excited to announce" voice after a model update, edit voice.md to add an explicit anti-pattern and re-run the latest few entries with --force.


13 · doctor

One command, every check. If anything's off, doctor tells you exactly what's broken and how to fix it.

$ shiplog doctor
shiplog doctor — 14 check(s) on darwin/arm64
[ok] shiplog dir     /Users/you/.shiplog
[ok] config.json     /Users/you/.shiplog/config.json
[ok] voice.md        /Users/you/.shiplog/voice.md
[ok] ignore.conf     /Users/you/.shiplog/ignore.conf
[ok] changelog.json  /Users/you/.shiplog/changelog.json
[ok] state.json      /Users/you/.shiplog/state.json
[ok] git             git version 2.50.1 · /usr/bin/git
[ok] ffmpeg          /opt/homebrew/bin/ffmpeg
[ok] identities      you@example.com
[ok] ai              provider anthropic · claude-sonnet-4-6
[ok] sink[0]         r2 bucket=my-bucket · public=data.example.com
[ok] discovery       6 repo(s) tracked
summary: 14 ok · 0 warn · 0 fail

What it checks

Pass --ai-ping to also send a tiny prompt to the configured provider and confirm the round-trip. Costs a few tokens.

shiplog doctor --ai-ping

14 · Security & privacy

What leaves the machine

Three things, each opt-in and configurable:

  1. AI provider calls. The diff summary + commit messages + voice + recent entries go to whatever provider you configured. Pick ollama for fully local; pick anthropic or gemini for hosted.
  2. Sink uploads. The changelog (and any media) go to whatever sinks you configured. Local-only setups skip this entirely.
  3. Update checks. Off by default. Re-running install.sh is the supported upgrade path.

That's it. No telemetry, no analytics, no usage pings, no error reporting. The binary doesn't phone home for anything.

What stays local

What shiplog never does

Diff filtering

Before any diff is sent to the AI, ignore.conf patterns are applied. The default seed excludes lockfiles, build output, and minified bundles — but if you have proprietary patterns (generated code with embedded keys, customer data fixtures), add them.

To audit exactly what would be sent for a bundle without making the call:

$ shiplog scan --dump=/tmp/audit --since=1d
wrote 3 prompt(s) to /tmp/audit/

Read the dumped files; if anything sensitive made it through, add a pattern to ignore.conf and re-dump.

Code signing & integrity


15 · Troubleshooting

"shiplog: command not found" after install

The install dir isn't on your PATH. Add it:

echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc
exec zsh

"checksum mismatch" during install

Either Cloudflare's edge cache is mid-flight (try again in 30s) or the tarball was tampered with in transit. Don't bypass with SHIPLOG_NO_VERIFY=1; verify the served checksum matches what's on the GitHub release:

curl -fsS https://get.shiplog.life/v0.1.0/checksums.txt
gh release view v0.1.0 --json assets -q '.assets[] | select(.name=="checksums.txt") | .url' \
  | xargs curl -fsSL

If they differ, file an issue — the R2 upload is corrupted.

"the developer cannot be verified" on first run (macOS)

The machine is offline and Gatekeeper can't reach Apple's notarization service. Connect to a network and re-run; subsequent launches don't need the network. If you genuinely need offline-first, file an issue and we'll look at packaging a stapled .pkg.

AI test call fails

shiplog ai test dumps the full error. Common causes:

Scan summarizes the same commits twice

The cursor failed to advance — usually because the AI call returned an error after the entry was partially written, or the process was killed mid-write. shiplog scan --force walks the full history, re-summarizes via bundle ID, and dedupes. Existing duplicates can be cleaned with shiplog log + d.

Sink upload fails after entry write

The local changelog.json is the source of truth — entry is saved even if the sink upload fails. shiplog sync retries every configured sink against the full local changelog; idempotent.

.mov uploads fail

shiplog transcodes .mov.mp4 via ffmpeg before upload. Install it with your package manager: brew install ffmpeg on macOS, apt install ffmpeg on Debian/Ubuntu. shiplog doctor warns when ffmpeg is missing.

Daemon stopped on its own

Run shiplog daemon status. Three possibilities:

Reset everything and start over

Nuclear option: delete ~/.shiplog/ and re-run setup. Keys in the keyring outlive the directory; remove them with shiplog ai rm per provider and shiplog sink rm per sink, or delete the entries from Keychain Access directly.


16 · Releasing & versioning

Mostly a developer concern; included here so users understand the upgrade and pinning model.

SemVer with intent

shiplog follows semantic versioning, but with a deliberately conservative read:

| Bump | Triggers |
| --- | --- |
| MAJOR | Breaking change to the changelog JSON shape, the config file schema, or any CLI flag rename. Always with a migration note. |
| MINOR | New commands, new flags, new sinks, new providers. Backwards-compatible additions. |
| PATCH | Bug fixes, prompt-template tweaks, internal refactors invisible to users. |

The on-disk formats (config.json, changelog.json, state.json, projects.json) are part of the contract — changing one of them incompatibly without a MAJOR bump is a bug.

Release artifacts

Each tagged release publishes:

Upgrade behavior

Re-running the installer always pulls the latest. To pin: SHIPLOG_VERSION=v0.1.0 curl … | sh. To downgrade: same — pin to the older tag, the installer overwrites in place. Your ~/.shiplog/ data is never touched by the installer; downgrades carry your data forward.

If a release breaks something

Pin back to the previous tag (SHIPLOG_VERSION=…) while the fix is being prepared. The previous tarballs stay on R2 forever — no version is ever deleted.


17 · Uninstall

Two layers — the binary, and your data.

Remove the binary

$ rm "$(command -v shiplog)"

If installed with sudo to /usr/local/bin/shiplog, that's sudo rm /usr/local/bin/shiplog.

Remove the data

$ rm -rf ~/.shiplog

Wipes config, changelog, state, projects, voice, ignore. Can't be recovered without a sink mirror.

Stop the daemon

If you backgrounded shiplog daemon with launchd, systemd, tmux, or nohup, stop it through the same supervisor before removing the binary. There's no shiplog-side uninstaller — the daemon is just a foreground process.

Remove the keyring entries

API keys persist after the binary and data are gone. Two paths:

Verify clean

$ command -v shiplog
$ ls ~/.shiplog 2>/dev/null

Both empty = clean.