shiplog manual
Every command, every flag, every config file. Companion to the landing page — assumes you've installed shiplog and want the depth.
Conventions in this doc:
- Code spans like shiplog scan are commands you type.
- Black terminal blocks are real output (truncated for fit).
- Light gray code blocks are file contents or shell snippets.
- Esc-style spans are key presses inside an interactive view.
01 Install
The canonical install path is a one-liner that downloads a signed, sha256-verified tarball and drops the shiplog binary on PATH.
Targets: macOS & Linux, amd64 & arm64. The script is a short POSIX sh script you can read before piping — it's idempotent (re-running upgrades in place) and short enough to audit in 30 seconds.
Environment knobs
| Variable | Default | Effect |
|---|---|---|
| SHIPLOG_VERSION | resolved from https://get.shiplog.life/latest | Pin a specific tag, e.g. v0.1.0. |
| SHIPLOG_PREFIX | ~/.local/bin if writable, else /usr/local/bin | Override install directory. |
| SHIPLOG_NO_VERIFY | 0 | Skip the sha256 verification. Don't. |
| SHIPLOG_DOWNLOAD_HOST | get.shiplog.life | For mirrors or air-gapped environments. |
Set them on the receiving end of the pipe — the env block has to reach sh, not curl:
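A stand-in sketch (the fetched script is replaced by an echo so the mechanics are visible): the assignment prefixes sh, which runs the script, not curl, which only fetches it.

```shell
# Stand-in for the real installer: sh reads the script from the pipe,
# so env assignments must prefix sh to be visible to the script.
echo 'printf "installing shiplog %s\n" "${SHIPLOG_VERSION:-latest}"' \
  | SHIPLOG_VERSION=v0.1.0 sh
# → installing shiplog v0.1.0
```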
What gets installed
One file: shiplog, dropped at $SHIPLOG_PREFIX/shiplog. No daemons, no launch agents, no config — those are created lazily on first setup.
If ~/.local/bin isn't on your PATH, the installer prints a one-line export to add to your shell profile:
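The line in question is the standard PATH prepend:

```shell
# Add to ~/.zshrc or ~/.bashrc so future shells find shiplog:
export PATH="$HOME/.local/bin:$PATH"
```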
macOS code signing
macOS binaries are codesigned with a Developer ID Application certificate and notarized by Apple. Gatekeeper validates them online on first launch — no "unverified developer" prompt, no xattr -d com.apple.quarantine dance.
The binaries are not stapled (Apple's stapler tool only handles .app/.pkg/.dmg, not bare Mach-O). This is the same shape terraform, kubectl, gh, and mise ship in. Practical effect: first launch on an offline machine will show "cannot verify developer" until the machine is online; subsequent launches are fine.
Upgrade
Re-run the same one-liner. The installer overwrites the existing binary in place. Your config in ~/.shiplog/ is untouched — it's never read or written by the installer.
To pin a downgrade: curl … | SHIPLOG_VERSION=v0.1.0 sh. As above, the assignment goes on the sh side of the pipe.
Verify the install
Both work — shiplog --version prints the bare one-line version string that install.sh greps for; shiplog version adds commit and build metadata.
02 First run
One wizard. Pick your identity, the directory shiplog should crawl, and the AI provider it should call.
What setup actually does
- Creates ~/.shiplog/ if it doesn't exist.
- Writes config.json with the answers above.
- Stores the AI key in the OS keyring (macOS Keychain on darwin, Secret Service on Linux).
- Seeds the changelog and state files lazily — they're written on first scan.
Nothing else. No network calls beyond the AI provider's test ping.
Re-running setup
shiplog setup is idempotent — it loads the current config, lets you change any field, then writes back. To start fresh, delete ~/.shiplog/config.json; setup will treat the next run as a first-time wizard.
Individual config fields can be edited without re-running the whole wizard. shiplog config list shows what's settable; get / set / unset mutate one key at a time.
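A typical round-trip with the single-key commands (the field and value here are just examples):

```
shiplog config list
shiplog config get granularity
shiplog config set granularity per-week
shiplog config unset granularity
```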
Fields not in config list (identities, roots, repos, exclude_repos, sinks) are managed by their dedicated commands — repo add, repo rm, sink add, etc. — or by editing config.json directly with shiplog config edit.
What lives where
| Path | Contains |
|---|---|
| ~/.shiplog/config.json | Identities, roots, AI, schedule, sinks, repo overrides. |
| ~/.shiplog/changelog.json | Source of truth — every entry, sorted newest-first. |
| ~/.shiplog/voice.md | Style instructions injected into AI prompts. |
| ~/.shiplog/ignore.conf | Gitignore-style patterns excluded from diffs. |
| ~/.shiplog/projects.json | Project name + public URL + assigned repos. |
| OS keyring | API keys (AI provider, R2/S3 sinks). Never on disk. |
03 scan
The workhorse. Walks every tracked repo since the last scan, groups new commits into bundles, summarizes each bundle through your configured AI, writes the result to ~/.shiplog/changelog.json, and fans out to any configured sinks.
Cursor model
Each tracked repo has a cursor in state.json: the SHA of the last commit shiplog summarized. On scan, shiplog walks git log <cursor>..HEAD, groups the new commits into bundles, summarizes them, and advances the cursor only if the AI call succeeds. Failures leave the cursor in place so the next run retries the same range.
This is what makes scans cheap to run on a schedule: a no-op scan is essentially git rev-parse HEAD per repo + a comparison, no AI calls, no diff reads.
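You can reproduce the walk by hand. A sketch, assuming $REPO points at a tracked repo and $CURSOR holds its saved SHA from state.json:

```shell
# Everything past the cursor is what the next scan would bundle:
git -C "$REPO" log --oneline "$CURSOR..HEAD"

# The no-op check is a single HEAD comparison — no diffs, no AI:
[ "$(git -C "$REPO" rev-parse HEAD)" = "$CURSOR" ] && echo "nothing new"
```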
Bundles & granularity
A bundle is one AI call that produces one changelog entry. Granularity controls how commits group into bundles:
| Mode | One bundle is… | Best for |
|---|---|---|
| per-commit | Each commit standalone | Repos where every commit is meaningful (libraries, infra). |
| per-day | All commits with the same author-date | Active product repos with many small commits. |
| per-week | All commits in the same ISO week | Background repos where week-level cadence is enough. |
Set the default with shiplog config set granularity per-day; override per-repo via shiplog repo set <path> granularity=per-week; override for one run with --group=….
Triggers
Triggers control when shiplog considers a repo's commits ready to summarize:
| Trigger | Meaning |
|---|---|
| any-commit | Default. Every commit past the cursor is fair game. |
| push | Only commits that have been pushed to the remote. |
| manual | Never auto-pick; the repo only gets scanned when explicitly named with --repo. |
Set the default with shiplog config set default_trigger push; override per-repo with shiplog repo set <path> trigger=manual.
Flags
| Flag | Purpose |
|---|---|
| --since=<cutoff> | Only consider commits at or after this cutoff. Accepts today, Nd (e.g. 7d), or YYYY-MM-DD. |
| --repo=<path> | Limit scan to one repo (path or ~/path). |
| --force | Ignore saved cursors and re-walk the full range. Pair with --since to bound the walk. |
| --group=<mode> | One-run override of granularity (per-commit, per-day, per-week). |
| --dry-run | Show what would be summarized without calling the AI. |
| --dump=<dir> | Write each bundle's full prompt to a file in <dir> instead of calling the AI. Useful for debugging voice drift. |
What the AI sees
For each bundle, the prompt includes:
- voice.md verbatim (style instructions).
- Project name, repo path, branch.
- Commit messages and authors.
- A diff summary, filtered through ignore.conf patterns (so package-lock.json and friends don't dominate).
- The previous N changelog entries for this project, for stylistic continuity (default 3).
Use --dump to see the exact prompt for a real bundle:
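For example (the repo and dump directory are illustrative):

```
shiplog scan --repo=~/code/notes-app --dump=/tmp/shiplog-prompts
cat /tmp/shiplog-prompts/*
```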
04 log
Browse what you've shipped. Two modes, picked automatically: interactive when stdout is a TTY, flat list when piped.
Interactive
| Key | Action |
|---|---|
| ↑ ↓ / j k | Move highlight. |
| Enter | Open the highlighted entry full-screen (note + tags + media + timestamp). |
| e | Edit the highlighted entry — drops into the same form as shiplog edit. |
| d | Delete the highlighted entry. Confirms with y / n. |
| / | Filter. See "filter syntax" below. |
| ? | Toggle the help overlay. |
| Esc | Close filter / help / drop back to the menu. |
Filter
/ opens a filter prompt. Whatever you type matches as a case-insensitive substring against the project name, the tags, and the note body all at once. Clear the filter with Esc.
Piped
Either pipe the command (stdout isn't a TTY) or pass --oneline explicitly. Output is a flat, grep-friendly list.
Flags
| Flag | Purpose |
|---|---|
| -n, --limit=N | Max entries to show. Default 50; 0 = all. |
| --since=<cutoff> | Only entries from this window: today, Nd, or YYYY-MM-DD. |
| --project=<name> | Filter by project (case-insensitive substring). |
| --oneline | Force the flat one-row-per-entry output even on a TTY. |
05 add
Manual entry — for shipping things that aren't commits (a launch, a talk, a write-up). Same shape as the AI-generated entries, written to the same changelog.json.
Fields
| Field | Required | Notes |
|---|---|---|
| Project | Yes | Free-text project name. If it matches an entry in projects.json, the public URL is attached automatically. New names get added to the registry — you'll be prompted for a public URL on first use. |
| Note | Yes | Multi-line. Enter inserts a newline, Ctrl-D finishes input. |
| Tags | No | Comma-separated. Lowercased and deduped on write. |
| Media | No | One or more file paths. See "Media handling" below. |
Media handling
Files passed to the Media field are uploaded to the first sink that supports media (R2, S3) and the public URL is stored in the entry. Local-only setups skip this step.
- Soft 50 MB limit per file — larger files prompt for confirmation.
- .mov → .mp4 via ffmpeg when available (-vcodec h264 -acodec aac -movflags +faststart).
- Content-Type set explicitly for jpg/jpeg/png/gif/mp4/webp; defaults to application/octet-stream otherwise.
- R2 key pattern: ship/media/<YYYYMMDDTHHMMSS>-<filename>.
- Single file → media is a string. Multiple → media is an array of strings.
shiplog add is interactive — there are no flags today. Non-interactive bulk import is coming soon.
Editing
shiplog edit opens an interactive picker (same shape as log) to pick the entry, then drops into the same form as add with the current values pre-filled.
Special inputs in the Media field when editing:
- Empty (just Enter) — keep current media as-is.
- clear — remove all media from the entry.
- replace — discard current media and start a fresh list.
- Anything else — treated as the first file path in a new list (replaces current media).
06 status
What does shiplog know about right now? One row per tracked repo.
Columns
| Column | Meaning |
|---|---|
| AGE | Time since the last successful scan touched this repo. never means scanned but nothing summarized; — means newly discovered, never scanned. |
| REPO | Repo path, with ~ for your home directory. |
| ENTRIES | Total changelog entries for this project. |
| PENDING | Commits past the cursor that would be summarized on next scan. — = nothing pending. |
| SOURCE | Why shiplog tracks it: roots (auto-discovered under a code_roots entry), added (explicit repo add), ignored (would be auto-discovered but excluded). |
shiplog status takes no flags today. JSON output and pending-only filtering are coming soon.
07 daemon
Run scan on a schedule. Foreground by default; pass --background to detach.
Foreground
The interval comes from config.schedule (default 30m). Override per-run with --schedule 1h. Ctrl-C exits cleanly — the in-flight scan finishes first.
Background
Detaches from the controlling terminal, redirects stdout and stderr to ~/.shiplog/daemon.log, and writes the child pid to ~/.shiplog/daemon.pid. Survives the shell that started it; survives a logout if your platform's session manager doesn't reap orphaned processes (most don't).
Trying to start a second background daemon while one is already running fails with a clear message — the pidfile is the lock.
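The liveness check behind that message can be sketched like this (an illustration, not shiplog's actual code):

```shell
pidfile="$HOME/.shiplog/daemon.pid"
if [ -f "$pidfile" ] && kill -0 "$(cat "$pidfile")" 2>/dev/null; then
  echo "daemon already running (pid $(cat "$pidfile"))"
else
  echo "no live daemon; safe to start"
fi
```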
Watching the log
logs is a thin wrapper around tail -F — it follows rotations and never exits on its own. Ctrl-C stops the tail, not the daemon.
Snapshot
One-shot. Reads the pidfile, validates the process is alive, prints uptime from the pidfile's mtime. Reports not running if there's no pidfile or the recorded pid is stale.
Stopping
Sends SIGTERM, waits up to 10 seconds for the daemon to flush its in-flight scan, then removes the pidfile. If the daemon hangs longer than that, stop reports the still-running pid and exits — at which point you can kill -9 if needed.
"Bring back to foreground"
There isn't one — once the daemon detaches, its stdio is a file and the parent shell is gone. To interact with a running scan, stop the background daemon and run shiplog daemon in your terminal:
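That is:

```
shiplog daemon stop
shiplog daemon
```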
If you want to truly attach/detach to a running process (interrupt mid-scan, pipe stdin, etc.), don't use --background at all — run shiplog inside tmux or screen and detach the multiplexer instead.
One-shot
--once runs a single scan and exits. Pair with cron when you'd rather not have a long-running process at all:
0 * * * * /Users/you/.local/bin/shiplog daemon --once >> /Users/you/.shiplog/cron.log 2>&1
Schedule format
A Go duration: 30m, 1h, 4h, etc. Minimum 1 minute. Set the default with shiplog config set schedule 1h; override per-run with --schedule.
Survives logout?
The detached daemon is in a fresh session (setsid), so it's not killed when its parent shell exits. It is killed if your OS terminates user processes on logout — most desktop sessions don't, but some Linux distros configured with KillUserProcesses=yes in logind.conf do. If that bites you, write a systemd user unit (a launchd plist on macOS) — that surface is coming soon as a built-in installer; for now, hand-write one with ExecStart=shiplog daemon (no --background needed; the supervisor handles backgrounding).
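A minimal hand-written user unit might look like this. This is a sketch, not an official template; the ExecStart path assumes the default install prefix:

```ini
# ~/.config/systemd/user/shiplog.service
[Unit]
Description=shiplog daemon

[Service]
ExecStart=%h/.local/bin/shiplog daemon
Restart=on-failure

[Install]
WantedBy=default.target
```

Enable it with systemctl --user enable --now shiplog.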
08 Configuration files
Everything lives in ~/.shiplog/. Files are JSON or plain text — no binary state, no databases. Edit them by hand if you want; shiplog config validate sanity-checks before any subcommand reads them.
config.json
{
"identities": ["you@example.com"],
"summarize_others": false,
"granularity": "per-day",
"default_trigger": "any-commit",
"schedule": "30m",
"token_budget": 40000,
"roots": ["~/code"],
"repos": ["~/work/secret-side-project"],
"exclude_repos": ["~/code/sandbox"],
"skip_dirs": ["node_modules", ".venv", "vendor", "target", "build", "dist", ".next", "__pycache__"],
"ai": {
"provider": "anthropic",
"model": "claude-sonnet-4-6",
"base_url": ""
},
"sinks": [
{
"type": "r2",
"bucket": "my-bucket",
"account_id": "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"public_host": "data.example.com"
}
],
"granularity_overrides": {
"~/code/cli-tool": "per-week"
},
"trigger_overrides": {
"~/code/cli-tool": "push"
}
}
| Field | Notes |
|---|---|
| identities | Git emails that count as "you." Seeded from git config user.email on first run. |
| summarize_others | If true, summarize commits not authored by anyone in identities. Default false. |
| granularity | per-commit · per-day · per-week. |
| default_trigger | any-commit · push · manual. |
| schedule | Go duration string, used by daemon. |
| token_budget | Soft ceiling on tokens spent per scan run. |
| roots | Directories scanned recursively for git repos. |
| repos | Explicit repos outside any root. |
| exclude_repos | Repos to skip even if they fall under a root. |
| skip_dirs | Directory basenames the root walk skips descending into. |
| ai.api_key | Never written. Keys live in the OS keyring. |
| granularity_overrides | Per-repo path → granularity override. |
| trigger_overrides | Per-repo path → trigger override. |
Tildes in path-valued fields are expanded at read time. Edit by hand with shiplog config edit; the binary parses on next read.
changelog.json
An array of entries, sorted newest-first. One entry per shipped thing.
[
{
"id": "f12e34c5d678",
"timestamp": "2026-05-04T18:33:11+05:30",
"project_id": "a3b4c5d6",
"project": "Notes App",
"url": "https://notes.example.com",
"note": "Cleaned up the search index so deleted notes drop out…",
"tags": ["search", "performance"],
"media": "https://data.example.com/ship/media/20260504T183311-screenshot.png",
"author": "you@example.com"
}
]
| Field | Type | Notes |
|---|---|---|
| id | string | Stable entry ID; used for dedup and edit/delete. |
| timestamp | string | ISO 8601 with offset. |
| project_id | string | 8-char hash. See "Project ID derivation" below. |
| project | string | Display name from projects.json. |
| url | string | Public URL for the project. |
| note | string | The summary itself. |
| tags | string[] | Lowercased, deduped. |
| media | string or string[] | String for one file, array for many. Both shapes round-trip cleanly. |
| author | string | The git email associated with the bundle's commits. |
Retention. The changelog trims to entries newer than 180 days after every write. Older entries are silently dropped — long-term archive is the sink's job, not the source-of-truth file's.
state.json
Per-repo cursors and metadata. Don't edit by hand — scan --force is the supported way to re-walk.
{
"~/code/notes-app": {
"last_sha": "a3b4c5d6e7f8…",
"last_scan": "2026-05-05T09:00:00+05:30",
"last_entry_id": "f12e34c5d678"
}
}
projects.json
Keyed by project ID, with display name and metadata.
{
"a3b4c5d6": {
"name": "Notes App",
"url": "https://notes.example.com",
"repo": "git@github.com:you/notes-app.git",
"repos": ["~/code/notes-app", "~/code/notes-app-mobile"]
}
}
Project ID derivation
For repos directly under one of your code_roots, the ID is md5(origin_url)[:8]. For nested repos (e.g. ~/code/mailbox/Mbox), the ID is md5(parent_dir_name)[:8] — this lets monorepo-ish layouts group multiple inner repos under a single "mailbox" project. First time a project ID is seen, shiplog prompts for a name and public URL.
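The derivation is easy to reproduce. A sketch, assuming the hash input is the raw origin URL string (md5sum is the GNU tool; on macOS use md5 -q):

```shell
origin="git@github.com:you/notes-app.git"      # example origin URL
printf '%s' "$origin" | md5sum | cut -c1-8     # first 8 hex chars = project ID
```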
voice.md
The prompt fragment that shapes how the AI sounds. Plain Markdown, injected verbatim into every summarization prompt before the diff. See voice.md below for examples.
ignore.conf
Gitignore-style patterns applied to diffs before the prompt is built. Patterns must live inside a [section] header — the section name is either an absolute path (literal match) or a glob. Multiple matching sections union their patterns.
# Match one repo by path
[~/code/mbox]
*.sql
migrations/
secrets/
# Match a glob — applies to every repo under ~/code/work/
[~/code/work/*]
*.env
*.lock
vendor/
Add a pattern interactively with shiplog repo ignore <pattern> from inside the repo. Skip-entire-directory globs (like node_modules) are handled separately by the top-level skip_dirs config — those don't even appear in the discovery walk, let alone in diffs.
09 ai
The provider that writes your summaries. Bring your own key; nothing is proxied through us.
Supported providers
| Provider | Default model | Notes |
|---|---|---|
| anthropic | claude-sonnet-4-6 | Hosted; key required. |
| gemini | gemini-2.5-flash | Hosted; key required. |
| ollama | qwen3:8b | Local; no key. Endpoint defaults to http://127.0.0.1:11434. |
Coming soon: OpenAI.
Setup
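The wizard is one command; it prompts for the provider and its key:

```
shiplog ai setup
```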
Switching providers
shiplog ai setup overwrites the active provider in config.json. The previous provider's key stays in the keyring — switching back doesn't ask for it again. To remove a stored key:
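For example, to drop the stored Anthropic key:

```
shiplog ai rm anthropic
```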
Run shiplog ai rm with no argument to pick from a list interactively.
Listing
Test
shiplog ai test sends a one-line prompt to the configured provider and prints the response. Useful to validate a key without burning a full scan.
Local Ollama
Set provider to ollama and shiplog talks to http://127.0.0.1:11434 by default. Override with shiplog config set ai.base_url http://gpu-box.local:11434. No data leaves the machine in this configuration.
Spend
shiplog ai budget exists as a placeholder for month-to-date AI spend tracking. Token accounting is coming soon; today the command prints a "not tracked yet" notice.
10 sink
Sinks publish your changelog elsewhere. ~/.shiplog/changelog.json is always written; sinks are extras. Multi-sink: every configured sink runs on every successful entry write.
Supported types
| Type | Writes | Use for |
|---|---|---|
| file | Mirror copy at a chosen path | Backups, alternate locations. |
| csv | UTF-8 BOM + CRLF Excel-friendly mirror | Spreadsheets. |
| r2 | Object on Cloudflare R2 + media uploads | The /now-style page on your site. |
| s3 | Object on AWS S3 + media uploads | Same, on AWS. |
Planned: webhook, sftp.
Adding a sink
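Each sink type is added with sink add; like the other add/rm commands, it can be driven interactively (exact prompts vary by type):

```
shiplog sink add
```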
R2 layout
The R2 sink writes to two paths in your bucket:
- changelog.json — full changelog, overwritten on every successful entry write.
- ship/media/<timestamp>-<filename> — uploaded files (one per upload, never overwritten).
Public URLs use the public_host from sink config: https://data.example.com/changelog.json, https://data.example.com/ship/media/…. The r2 sink talks to the Cloudflare API directly — no wrangler dependency at runtime.
S3 layout
Same, but uses standard AWS SDK auth: access_key_id + secret_access_key + region + bucket. Optional endpoint for S3-compatible services (Backblaze B2, MinIO, Wasabi).
CSV sink
One row per entry, columns: timestamp, project, url, note, tags, media. Encoded as UTF-8 with a BOM and CRLF line endings so Excel opens it without prompting for an encoding. Multiple tags are joined with ";", as are multiple media URLs.
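The encoding can be sketched in shell. The row below is illustrative, not real output; the leading three bytes are the UTF-8 BOM that Excel keys on:

```shell
out=/tmp/shiplog-sample.csv
printf '\357\273\277' > "$out"    # UTF-8 BOM: EF BB BF
printf 'timestamp,project,url,note,tags,media\r\n' >> "$out"
printf '2026-05-04T18:33:11+05:30,Notes App,https://notes.example.com,"Cleaned up search",search;performance,\r\n' >> "$out"
head -c 3 "$out" | od -An -tx1    # the BOM bytes: ef bb bf
```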
Listing & removing
Run shiplog sink rm with no argument to pick from a list interactively. shiplog sink test picks one sink (interactively or by index) and republishes the current changelog to it — handy for validating credentials after a rotation.
Republishing
shiplog sync republishes the local changelog to every configured sink. Useful after adding a sink to a long-running setup, or when restoring from backup.
11 repo
Decide what gets crawled. Anything under your code_roots is auto-discovered. Use repo add to track something outside, repo ignore to exclude one inside.
List
Per-repo overrides
Override granularity or trigger per-repo. Pass key=value:
| Key | Values |
|---|---|
| granularity | per-commit · per-day · per-week |
| trigger | any-commit · push · manual |
Set the value to the empty string to clear an override: shiplog repo set ~/code/cli-tool granularity=.
Adding & excluding
Anything under your roots is auto-discovered. repo add tracks a path outside the roots; repo ignore appends a pattern to ~/.shiplog/ignore.conf; repo rm adds the path to exclude_repos so it's skipped on the next scan.
Projects
A project is the display name + URL that show up next to entries on the page. Projects auto-register the first time a new repo gets summarized — shiplog uses the parent directory or the origin URL to derive a project ID and prompts you for the human name on first sight. Most users never touch shiplog project directly.
When you do, the relevant subcommands are:
- shiplog project list — every project and its assigned repos.
- shiplog project add — create a project explicitly (name, URL, optional repo slug).
- shiplog project assign — move a repo into a project (handy when shiplog grouped repos differently than you'd like).
- shiplog project unassign — drop a repo's explicit assignment back to the auto-derived ID.
- shiplog project rename — rename a project; existing entries pick up the new name on next read.
- shiplog project rm — delete a project that has no repos assigned.
All of these accept interactive pickers when called with no arguments.
12 voice.md
The prompt fragment that shapes how the AI sounds. Plain Markdown injected verbatim before each diff. This is where you teach shiplog to sound like you.
Default seed
# Voice
Write in first-person plural ("we shipped…") or impersonal active voice
("Search now drops deleted notes…"). Skip "I added" / "I fixed."
Be specific. Name the user-visible change, not the implementation.
"Added a debounce to search input" → "Search no longer flickers while
you type."
Skip filler: no "we are excited," no "today we are pleased to."
One paragraph per bundle. 2–4 sentences. No headers, no lists, no
markdown.
Drop boring commits — if the bundle is just version bumps and lint
fixes, write a single sentence acknowledging there's nothing
user-visible.
Editing
shiplog voice edit opens voice.md in $EDITOR. shiplog voice show prints the current contents. shiplog voice path prints the file path so you can hand it to other tools.
How it threads into prompts
For each bundle, the prompt assembled by shiplog looks roughly like:
System: You are summarizing a batch of git commits as one
changelog entry. Follow the voice instructions strictly.
User: <voice.md verbatim>
## Project: Notes App
## Branch: main
## Commits (3):
- 1a2b3c4 (you@example.com) Add reset password page
- 5d6e7f8 (you@example.com) Wire up forgot-password email
- 9a0b1c2 (you@example.com) Add captcha to auth forms
## Diff summary:
<filtered diff>
## Recent entries from this project (for stylistic continuity):
- "Search now drops deleted notes immediately…"
- "Sidebar keyboard shortcuts…"
Write one paragraph following the voice above.
Use shiplog scan --dump=DIR to see the actual fully-assembled prompt for any bundle.
Drift
If the AI starts drifting back to its trained "we are excited to announce" voice after a model update, edit voice.md to add an explicit anti-pattern and re-run the latest few entries with --force.
13 doctor
One command, every check. If anything's off, doctor tells you exactly what's broken and how to fix it.
What it checks
- ~/.shiplog/ and the files inside it (config.json, voice.md, ignore.conf, optional redact.conf, changelog.json, state.json).
- git binary version and path.
- ffmpeg on PATH (warn if missing — only matters for .mov media uploads).
- Identities seeded.
- AI provider configured; key in keyring for hosted providers.
- Each configured sink: credentials present, target reachable.
- Media uploader's public host configured (where applicable).
- Discovery: how many repos shiplog would walk on the next scan.
Pass --ai-ping to also send a tiny prompt to the configured provider and confirm the round-trip. Costs a few tokens.
shiplog doctor --ai-ping
14 Security & privacy
What leaves the machine
Three things, each opt-in and configurable:
- AI provider calls. The diff summary + commit messages + voice + recent entries go to whatever provider you configured. Pick ollama for fully local; pick anthropic or gemini for hosted (OpenAI is coming soon).
- Sink uploads. The changelog (and any media) go to whatever sinks you configured. Local-only setups skip this entirely.
- Update checks. Off by default. Re-running install.sh is the supported upgrade path.
That's it. No telemetry, no analytics, no usage pings, no error reporting. The binary doesn't phone home for anything.
What stays local
- The full changelog (~/.shiplog/changelog.json) is always the source of truth, even if you have remote sinks. Sinks are mirrors.
- API keys live in the OS keyring (macOS Keychain, Linux Secret Service). Never in config.json, never in env files, never in process arg lists.
- State (cursors, last-scan timestamps) stays local.
What shiplog never does
- Touch your repos. Read-only — no checkouts, no fetches, no writes. git log and git diff only.
- Store secrets in config.json. Every credential goes to the keyring; config.json only references them by name.
- Send your code anywhere you didn't configure. The diff goes to your AI provider; nothing else.
- Lock you in. The changelog is a plain JSON array. Take it and go.
Diff filtering
Before any diff is sent to the AI, ignore.conf patterns are applied. The default seed excludes lockfiles, build output, and minified bundles — but if you have proprietary patterns (generated code with embedded keys, customer data fixtures), add them.
To audit exactly what would be sent for a bundle without making the call:
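Using the documented --dump flag (directory and search term are illustrative):

```
shiplog scan --dump=/tmp/shiplog-audit
grep -ril "customer_email" /tmp/shiplog-audit
```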
Read the dumped files; if anything sensitive made it through, add a pattern to ignore.conf and re-dump.
Code signing & integrity
- The install script verifies the tarball's sha256 against the published checksums.txt on R2 before extracting.
- macOS binaries are codesigned (Developer ID Application) and notarized.
- Both checksums.txt and the tarballs are served over TLS from Cloudflare.
- The release process is fully local — no CI, no remote build runner has ever held the signing certificate.
15 Troubleshooting
"shiplog: command not found" after install
The install dir isn't on your PATH. Add it:
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc
exec zsh
"checksum mismatch" during install
Either Cloudflare's edge cache is mid-flight (try again in 30s) or the tarball was tampered with in transit. Don't bypass with SHIPLOG_NO_VERIFY=1; verify the served checksum matches what's on the GitHub release:
curl -fsS https://get.shiplog.life/v0.1.0/checksums.txt
gh release view v0.1.0 --json assets -q '.assets[] | select(.name=="checksums.txt") | .url' \
| xargs curl -fsSL
If they differ, file an issue — the R2 upload is corrupted.
"the developer cannot be verified" on first run (macOS)
The machine is offline and Gatekeeper can't reach Apple's notarization service. Connect to a network and re-run; subsequent launches don't need the network. If you genuinely need offline-first, file an issue and we'll look at packaging a stapled .pkg.
AI test call fails
shiplog ai test dumps the full error. Common causes:
- 401 / invalid key. Re-run shiplog ai setup. Most providers don't show keys after creation — paste from the source.
- 429 / rate limit. Most often hit during scan --force over months of history. Wait a few minutes or switch to a higher tier.
- Model not available. Provider deprecated or renamed the model. Check the provider dashboard for current model IDs.
- Connection refused (Ollama). Ollama isn't running. Start it: ollama serve.
Scan summarizes the same commits twice
The cursor failed to advance — usually because the AI call returned an error after the entry was partially written, or the process was killed mid-write. shiplog scan --force walks the full history, re-summarizes via bundle ID, and dedupes. Existing duplicates can be cleaned with shiplog log + d.
Sink upload fails after entry write
The local changelog.json is the source of truth — entry is saved even if the sink upload fails. shiplog sync retries every configured sink against the full local changelog; idempotent.
.mov uploads fail
shiplog transcodes .mov → .mp4 via ffmpeg before upload. Install it with your package manager: brew install ffmpeg on macOS, apt install ffmpeg on Debian/Ubuntu. shiplog doctor warns when ffmpeg is missing.
Daemon stopped on its own
Run shiplog daemon status. Three possibilities:
- not running with no pidfile — the daemon exited cleanly (SIGTERM, OS shutdown, or a fatal config error during scan). Check ~/.shiplog/daemon.log for the last entries; restart with shiplog daemon --background.
- not running (stale pidfile) — the process was killed without running its cleanup (e.g. kill -9, OOM, panic). Run shiplog daemon stop to clean up, then restart.
- running but no recent activity in the log — the daemon is alive but the most recent scan tick is still in flight (slow AI provider, slow git in a giant repo). Wait one schedule interval. If still nothing, shiplog daemon stop && shiplog daemon --once to surface any per-scan error in your terminal.
Reset everything and start over
Nuclear option: delete ~/.shiplog/ and re-run setup. Keys in the keyring outlive the directory; remove them with shiplog ai rm per provider and shiplog sink rm per sink, or delete the entries from Keychain Access directly.
16 Releasing & versioning
Mostly a developer concern; included here so users understand the upgrade and pinning model.
SemVer with intent
shiplog follows semantic versioning, but with a deliberately conservative read:
| Bump | Triggers |
|---|---|
| MAJOR | Breaking change to the changelog JSON shape, the config file schema, or any CLI flag rename. Always with a migration note. |
| MINOR | New commands, new flags, new sinks, new providers. Backwards-compatible additions. |
| PATCH | Bug fixes, prompt-template tweaks, internal refactors invisible to users. |
The on-disk formats (config.json, changelog.json, state.json, projects.json) are part of the contract — bumping one of those without a MAJOR bump is a bug.
Release artifacts
Each tagged release publishes:
- Four tarballs at https://get.shiplog.life/<version>/shiplog_<version>_<os>_<arch>.tar.gz (darwin amd64/arm64, linux amd64/arm64).
- A checksums.txt alongside each version's tarballs.
- A plain-text https://get.shiplog.life/latest containing the most recent tag (e.g. v0.1.0\n) — what the installer reads when no SHIPLOG_VERSION is set.
- A GitHub Release with the same tarballs attached and an autogenerated changelog.
Upgrade behavior
Re-running the installer always pulls the latest. To pin: curl … | SHIPLOG_VERSION=v0.1.0 sh. To downgrade: same — pin to the older tag; the installer overwrites in place. Your ~/.shiplog/ data is never touched by the installer; downgrades carry your data forward.
If a release breaks something
Pin back to the previous tag (SHIPLOG_VERSION=…) while the fix is being prepared. The previous tarballs stay on R2 forever — no version is ever deleted.
17 Uninstall
Two layers — the binary, and your data.
Remove the binary
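For the default prefix:

```
rm ~/.local/bin/shiplog
```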
If installed with sudo to /usr/local/bin/shiplog, that's sudo rm /usr/local/bin/shiplog.
Remove the data
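One directory:

```
rm -rf ~/.shiplog
```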
Wipes config, changelog, state, projects, voice, ignore. Can't be recovered without a sink mirror.
Stop the daemon
If you backgrounded shiplog daemon with launchd, systemd, tmux, or nohup, stop it through the same supervisor before removing the binary. There's no shiplog-side uninstaller — the daemon is just a foreground process.
Remove the keyring entries
API keys persist after the binary and data are gone. Two paths:
- Through shiplog (before you remove the binary): run
shiplog ai rmfor each AI provider andshiplog sink rmfor each sink — both clean up the keyring entries they own. - By hand: on macOS, open Keychain Access and search for shiplog; delete every match. On Linux (Secret Service), use
secret-tool clear application shiplogor your keyring manager's UI.
Verify clean
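Something like this, assuming the default locations:

```shell
ls ~/.shiplog 2>/dev/null
command -v shiplog || true
```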
Both empty = clean.