create

Overview

Bootstraps .ouro/ontology/main.md through a guided interview. Unless --human is passed, the command enlists the external codex CLI to auto-answer the facilitator's questions, then streams an @openai/agents run that incrementally updates the ontology after each response.
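
With no options, a bare invocation drives the whole flow described above (assuming the codex CLI is installed and reachable):

ouro create

Codex answers each interview question in turn, and .ouro/ontology/main.md is rewritten incrementally as the answers arrive.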

Usage

ouro create [--human] [--agent-model <model>] [--codex-model <model>] [--max-questions <n>]

Key Options

  • --human – Disable Codex auto-responses and answer prompts manually.
  • --agent-model <model> – Override the model that rewrites the ontology after each answer.
  • --codex-model <model> – Override the codex CLI model used for automated answers.
  • --max-questions <n> – Cap the follow-up refinement dialog (default 12 turns).
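
For example, the options above can be combined as follows (the model names are illustrative placeholders, not required values):

# Answer every prompt yourself and cap the refinement dialog at 6 turns
ouro create --human --max-questions 6

# Keep codex auto-answers but override both models
ouro create --agent-model my-agent-model --codex-model my-codex-model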

Inputs & Outputs

  • Inputs: interactive answers covering architecture, technologies, communication, data layer, testing, and error handling. If .ouro/technologies/MAP.md is missing, the command can launch gather to detect technologies first (see the example after this list).
  • Outputs: initializes or updates .ouro/ontology/main.md after each question and starts an additional refinement loop once the questionnaire completes.
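
A typical first run on a repository without a technologies map might look like this (a sketch; it assumes gather is exposed as ouro gather, and create can also launch it for you, as noted above):

ouro gather   # detect technologies and write .ouro/technologies/MAP.md
ouro create   # run the interview and update the ontology after each answer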

When to Use

  • First-time setup for a repository adopting Ouro.
  • Major architecture shifts that merit rewriting large portions of the ontology.

Follow-up Checklist

  • Run ouro map to regenerate the technologies mapping from the new ontology.
  • Commit the refreshed ontology together with any changes produced by gather.
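
A possible follow-up sequence (the git commands and commit message are illustrative):

ouro map
git add .ouro
git commit -m "Refresh ontology and regenerate technologies mapping"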