CLI Reference

A full-featured terminal client for managing knowledge, running queries, and administering your NocturnusAI server — interactively or from scripts.

Two modes. Use the CLI as an interactive REPL for exploration, or pass -e to run a single command for scripting and automation.

Install with AI

Copy a prompt and paste it into your AI coding assistant — it will run the installer for you.

Claude Code
Install the NocturnusAI CLI on my machine using the official installer: curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash — then verify it works by running: nocturnusai --version
GitHub Copilot
@terminal Install the NocturnusAI CLI: curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash
Cursor
Run this in my terminal to install the NocturnusAI CLI: curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash

Installation

Recommended. The native binary is a single self-contained executable — no JVM, no Docker, instant startup. Install it with the one-liner installer or download directly from GitHub Releases.

One-liner installer (recommended)

The standard install script automatically downloads and installs the native CLI binary for your platform:

$ curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash

The CLI is placed at /usr/local/bin/nocturnusai (or ~/.local/bin/nocturnusai if that isn't writable) and added to your PATH automatically.
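
As a quick sanity check after installing, you can confirm the install directory is actually on your PATH before launching the CLI. This is a minimal sketch based only on the two install locations named above:

```shell
# Check that the installer's target directory is on PATH
# (/usr/local/bin, falling back to ~/.local/bin when that isn't writable)
case ":$PATH:" in
  *:/usr/local/bin:*|*:"$HOME/.local/bin":*) echo "install dir is on PATH" ;;
  *) echo "install dir is NOT on PATH; add it in your shell profile" ;;
esac
# nocturnusai --version   # then confirm the binary itself runs
```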

Download directly from GitHub Releases

Download the pre-built binary for your platform from the latest release:

# macOS (Apple Silicon)
$ curl -fsSL https://github.com/Auctalis/nocturnusai/releases/latest/download/nocturnusai-macos-arm64 -o /usr/local/bin/nocturnusai && chmod +x /usr/local/bin/nocturnusai

# macOS (Intel)
$ curl -fsSL https://github.com/Auctalis/nocturnusai/releases/latest/download/nocturnusai-macos-x86_64 -o /usr/local/bin/nocturnusai && chmod +x /usr/local/bin/nocturnusai

# Linux (x86_64)
$ curl -fsSL https://github.com/Auctalis/nocturnusai/releases/latest/download/nocturnusai-linux-x86_64 -o /usr/local/bin/nocturnusai && chmod +x /usr/local/bin/nocturnusai

# Linux (ARM64)
$ curl -fsSL https://github.com/Auctalis/nocturnusai/releases/latest/download/nocturnusai-linux-arm64 -o /usr/local/bin/nocturnusai && chmod +x /usr/local/bin/nocturnusai

Build from Source

Requires JDK 17+ and GraalVM for native compilation:

# Run via Gradle (JVM — no native build needed)
$ ./gradlew :nocturnusai-cli:run

# Build a native binary
$ ./gradlew :nocturnusai-cli:nativeCompile
$ ./nocturnusai-cli/build/native/nativeCompile/nocturnusai

With Docker Compose

The Docker image runs the server, not the standalone CLI binary. Start the server with Docker Compose, then point the host-installed CLI at it:

$ docker compose up -d
$ nocturnusai --server http://localhost:9300 --db default

Connecting to a Server

Control which server, database, tenant, and API key to use via command-line flags:

Flag       Short  Default                Description
--server   -s     http://localhost:9300  Server URL
--db       -d     default                Database name
--api-key  -k                            API key for authentication
--tenant   -t                            Tenant ID for multi-tenant databases
--exec     -e                            Run a single command and exit

# Connect to a remote server with specific database and API key
$ nocturnusai -s https://api.example.com -d production -k nai_abc123

# Connect to a specific tenant
$ nocturnusai -d mydb -t tenant_acme

Interactive REPL

Launch the CLI without -e to enter interactive mode. You'll see a connection banner and a prompt showing the active database:

$ nocturnusai

NocturnusAI CLI — logic server for agentic AI
Server:   http://localhost:9300
Database: default
Connected.

Type 'help' for commands, 'status' for details, 'setup' to configure LLM.

default> _

The prompt shows your current database. Type commands directly — no need for prefixes or wrappers.

Switch databases on the fly. Use use <database> to change the active database without reconnecting. Your prompt updates to reflect the change.

default> use production
Switched to production

production> _

Knowledge Management

tell — Store a Fact

Store a structured fact using predicate syntax, or pass natural language to auto-extract facts via LLM:

# Structured fact
default> tell human(socrates)
OK Fact stored

# Natural language (requires LLM configured on server)
default> tell the president met with the prime minister yesterday
+ met_with(president, prime_minister)  95%
+ time(meeting, yesterday)             90%
2 facts, 0 rules — asserted

Shortcut: +

ask — Query Knowledge

Ask queries in predicate syntax (with ? variables) or natural language. The CLI auto-detects the format:

# Structured query — variables start with ?
default> ask mortal(?who)
  who = socrates
1 result(s)

# Natural language question (routed to /synthesize)
default> ask who is mortal?
  Socrates is mortal.

Derivation:
  fact  human(socrates)
  rule  mortal(?x) :- human(?x)  via mortal_rule
Confidence: 95%  anthropic/claude-sonnet-4-20250514

Shortcut: ?

teach — Define a Rule

Rules use :- syntax. Variables (prefixed with ?) are automatically unified:

default> teach mortal(?x) :- human(?x)
OK Rule added

# Multi-body rules
default> teach grandparent(?x, ?z) :- parent(?x, ?y), parent(?y, ?z)
OK Rule added

# Negation-as-Failure (NAF) — use NOT in rule bodies
default> teach can_fly(?x) :- bird(?x), NOT penguin(?x)
OK Rule added

# NAF means "not provable" (closed-world assumption)
# can_fly(robin) succeeds if bird(robin) exists and penguin(robin) does not

NOT vs explicit negation. NOT in rule bodies uses Negation-as-Failure: the condition succeeds when the atom cannot be proven from known facts. This is different from asserting that something is explicitly false.

Shortcut: ++

forget — Retract a Fact

Remove a fact from the knowledge base:

default> forget human(socrates)
OK Retracted

Shortcut: -

ingest — Extract Knowledge from Text

Feed plain text — the server's LLM automatically extracts facts and rules and asserts them:

# Inline text
default> ingest Alice is Bob's mother. Bob has three children.
  + mother(alice, bob)        92%
  + has_children(bob, three)  88%
2 facts, 0 rules — asserted  via anthropic/claude-sonnet-4-20250514

# From a file
default> ingest -f article.txt
  + founded(elon_musk, spacex)  95%
  + ...
12 facts, 2 rules — asserted

Inspection & Export

inspect — Browse All Knowledge

List all facts and rules in the current database. Optionally filter by a search term:

default> inspect
  fact  human(socrates)
  fact  parent(alice, bob)
  rule  mortal(?x) :- human(?x)
3 item(s)

# Filter results
default> inspect parent
  fact  parent(alice, bob)
  fact  parent(bob, charlie)
2 item(s)

Shortcut: ls

export — Dump Knowledge to File

Export all facts and rules in a loadable text format:

# Print to stdout
default> export

# Save to file
default> export knowledge.ab
Exported — 5 facts, 2 rules → knowledge.ab

Shortcut: dump

import — Load Knowledge from File

Import facts and rules from an .ab file. Lines starting with # are treated as comments:

default> import knowledge.ab
Imported — 5 facts, 2 rules

Shortcut: load

File format. Each line is either a fact (predicate(arg1, arg2)) or a rule (head(?x) :- body(?x)). Comments start with #. This is the same format that export produces.
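
For illustration, a minimal file in this format can be created and sanity-checked from the shell (the predicates below are examples, not required names):

```shell
# Write a sample knowledge file in the export/import format
cat > knowledge.ab <<'EOF'
# Facts
human(socrates)
parent(alice, bob)

# Rules
mortal(?x) :- human(?x)
EOF

# Count the loadable lines (everything except comments and blanks)
grep -vc -e '^#' -e '^$' knowledge.ab   # prints 3
```

A file like this round-trips: export produces it, and import <file> loads it back.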

Memory Operations

NocturnusAI has a built-in salience model. These commands let you manage it:

context — Salience-Ranked Context Window

See the most relevant facts, ranked by salience score:

default> context 10
Context Window — 10 of 42 facts (by salience)
  [0.950]  subscription_tier(acme_corp, enterprise)
  [0.880]  location(acme_corp, austin)
  [0.720]  parent(alice, bob)
  ...

Shortcut: ctx

compress — Consolidate Episodic Patterns

Merge repeated or related facts into summary facts:

default> compress
Compressed — 8 consolidated, 2 new summary facts

cleanup — Evict Low-Salience Facts

Remove expired and low-salience facts below a threshold:

# Default threshold: 0.05
default> cleanup

# Custom threshold
default> cleanup 0.1
Cleanup — 3 expired, 5 evicted (threshold=0.1)

Context optimization from CLI workflows. Use context for salience-ranked retrieval in interactive sessions, then run compress and cleanup regularly. For goal-driven context windows and diff sessions, call /context/optimize and /context/diff from the HTTP API alongside CLI usage.

Workflow Step             CLI Command          HTTP Method + Endpoint
Salience context window   context [max]        POST /memory/context
Memory consolidation      compress             POST /memory/compress
Memory cleanup/decay      cleanup [threshold]  POST /memory/cleanup
Goal-driven context       API-only             POST /context/optimize
Incremental context diff  API-only             POST /context/diff
Session cleanup           API-only             POST /context/session/clear

The CLI uses the simplified REST aliases /memory/compress and /memory/cleanup. The lower-level API docs also show the canonical endpoints /memory/consolidate and /memory/decay.


Administration

tenant — Switch Tenant

Switch the active tenant within the current database. Multi-tenant databases isolate knowledge per tenant:

default> tenant acme
Switched to tenant acme

default> tenant
Current tenant: acme

dbs — List Databases

Show all databases on the server. An arrow marks the active one:

default> dbs
  default <-
  production
  staging

health — Server Health Check

default> health
healthy
  pass  database  Connected
  pass  memory    128 MB free

status — Full Server Status

A dashboard view showing health, LLM configuration, databases, auth mode, and knowledge stats:

default> status
NocturnusAI Status
Server: http://localhost:9300
Database: default

  Health:     healthy
  LLM:        anthropic/claude-sonnet-4-20250514
  Extraction: enabled
  Databases:  3
  Auth:       RBAC (2 keys)
  Knowledge:  42 facts, 5 rules (in default)

setup — Configure LLM Provider

Interactive wizard to configure which LLM provider powers natural language features:

default> setup

NocturnusAI Setup
Configure your LLM provider for natural language features.

  1) Ollama (local, free, private)
  2) Anthropic Claude
  3) OpenAI GPT
  4) Google Gemini
  5) Custom (any OpenAI-compatible endpoint)
  q) Cancel

Choice [1]: 1

Ollama selected.
Make sure Ollama is running: ollama serve
Model [llama3.2]: _

The wizard writes the configuration to your .env file after confirmation.


Authentication

Manage RBAC-based API keys directly from the CLI.

login — Bootstrap or Check Auth

If auth is enabled but no keys exist yet, login guides you through creating the first admin key:

default> login

NocturnusAI Auth
  Mode: rbac
RBAC auth enabled — no keys yet.
Let's create your first admin key.

Admin username [admin]: admin
Admin password: ********
Key name [admin]: admin

Admin key created!

  API Key:  nai_abc123...xyz789
  Prefix:   nai_abc1
  ID:       550e8400-e29b-41d4-a716-446655440000

Save this key — it won't be shown again.
Use it with: --api-key nai_abc123...xyz789

whoami — Show Current Identity

default> whoami
  Mode:     rbac
  Name:     admin
  Role:     admin
  DBs:      all
  Tenants:  all
  Perms:    read, write, admin, keys

keys — Manage API Keys

# List all keys
default> keys list
API Keys:
  nai_abc1...  admin  admin     550e8400-...
  nai_def2...  writer agent-1   661f9500-...

# Create a new key (roles: admin, writer, reader)
default> keys create my-agent writer
Key created:
  Key:    nai_xyz789...
  Prefix: nai_xyz7
  Role:   writer
Save this key — it won't be shown again.

# Revoke a key by ID
default> keys revoke 661f9500-e29b-41d4-a716-446655440000
Key revoked.

Single-Command Mode & Scripting

Pass -e to run a single command and exit — perfect for shell scripts, CI/CD, and automation:

# Store a fact
$ nocturnusai -d mydb -e "tell human(socrates)"

# Query
$ nocturnusai -d mydb -e "ask mortal(?who)"

# Export a database
$ nocturnusai -d mydb -e "export backup.ab"

# Import into a database
$ nocturnusai -d mydb -e "import knowledge.ab"

# Health check in CI
$ nocturnusai -e "health"

Shell Scripting Examples

#!/bin/bash
# Seed a database with a knowledge file
nocturnusai -d production -e "import seed_data.ab"
nocturnusai -d production -e "inspect"

# Backup all databases (one file per database)
mkdir -p backups
for db in default production staging; do
    nocturnusai -d "$db" -e "export backups/${db}.ab"
done

Raw DSL Mode

For advanced use, the dsl command passes raw Logiql DSL directly to the server:

default> dsl ASSERT human(socrates).
default> dsl QUERY mortal(?who).

Shortcut: exec


Advanced Features (API Only)

Several advanced features are available through the HTTP API and MCP protocol but not yet exposed as CLI commands. Use these via direct API calls or MCP-connected agents:

Feature            API Endpoint           Description
Confidence         POST /assert/fact      Attach confidence scores (0.0–1.0) to facts; filter queries with minConfidence
Conflict Strategy  POST /assert/fact      Control duplicate handling: NEWEST_WINS, CONFIDENCE, KEEP_BOTH, REJECT
Scope Fork         POST /scope/fork       Copy all atoms from one scope to another for hypothetical reasoning
Scope Diff         POST /scope/diff       Compare two scopes — see what's only in A, only in B, or in both
Scope Merge        POST /scope/merge      Merge scopes with strategy (SOURCE_WINS, TARGET_WINS, KEEP_BOTH, REJECT)
Aggregation        POST /aggregate        COUNT, SUM, MIN, MAX, AVG over matched facts
Bulk Assert        POST /assert/facts     Assert multiple facts in a single request (best-effort)
Retract Pattern    POST /retract/pattern  Retract all facts matching a pattern with variable wildcards

Full details. See the Aggregation, Scope Management, and Confidence & Conflict sections in the API and Multi-Tenancy docs.
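
One pattern for using these endpoints alongside the CLI is to keep request bodies in files and POST them with curl. The JSON field names below (fact, confidence, conflictStrategy) are assumptions for illustration, not taken from the API docs; check the API reference for the exact schema:

```shell
# Build a request body for a confidence-scored assert
# (field names here are assumed, not documented in this section)
cat > assert-fact.json <<'EOF'
{
  "fact": "subscription_tier(acme_corp, enterprise)",
  "confidence": 0.9,
  "conflictStrategy": "NEWEST_WINS"
}
EOF

# Sanity-check the JSON before sending it
python3 -m json.tool assert-fact.json >/dev/null && echo "payload ok"

# Then POST it to a running server (uncomment and adjust):
# curl -s -X POST http://localhost:9300/assert/fact \
#      -H "Content-Type: application/json" \
#      -d @assert-fact.json
```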

Command Reference

Command                    Shortcut  Description
tell <fact>                +         Store a fact (structured or natural language)
ask <query>                ?         Query with inference (structured or NL)
teach <rule>               ++        Define a logical rule
forget <fact>              -         Retract a fact
ingest <text | -f file>              Extract knowledge from text via LLM
inspect [filter]           ls        Browse all facts and rules
context [max]              ctx       Salience-ranked context window
compress                             Consolidate episodic patterns
cleanup [threshold]                  Evict expired/low-salience facts
dsl <command>              exec      Raw Logiql DSL passthrough
import <file>              load      Load facts & rules from file
export [file]              dump      Dump knowledge to file or stdout
use <database>                       Switch active database
tenant [name]                        Switch or show active tenant
dbs                                  List all databases
health                               Server health check
status                               Full server status dashboard
setup                                Interactive LLM provider configuration
login                                Bootstrap auth or check auth status
whoami                               Show current key identity & permissions
keys <list|create|revoke>            API key management
help                       h         Show all commands
exit                       q         Quit the REPL

What's Next?

API Reference →

Full HTTP endpoint documentation

Core Concepts →

Facts, rules, inference, and salience scoring

Operations →

Monitoring, persistence, and deployment