Quick Start
The fastest way to understand what codectx does is to run it on your own codebase.
1. **Navigate to your repository.** Open your terminal and navigate to any Python, JavaScript, TypeScript, or Rust project.

   ```sh
   cd ~/my-awesome-repo
   ```

2. **Run codectx.** Run the `analyze` command, specifying the current directory (`.`) as the target:

   ```sh
   codectx analyze .
   ```

3. **Check the output.** `codectx` will generate a structured `CONTEXT.md` file in the root of your project. Open it in a text editor to see a tiered layout detailing your architectural entry points, dependency connections, and key implementations.

4. **Try a specific task profile.** `codectx` can adjust its scoring and output based on what your AI assistant is trying to accomplish. For example, if you are asking an agent to write a new feature, run:

   ```sh
   codectx analyze . --task feature
   ```
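To make the tiered layout concrete, the generated file might look roughly like the sketch below. This is a hypothetical example: the section names, tiers, and file paths are illustrative, not the exact output of any codectx version.

```md
# CONTEXT.md (hypothetical sketch)

## Tier 1: Architectural Entry Points
- src/main.py — application entry point; wires configuration and dispatch

## Tier 2: Dependency Connections
- src/api/client.py — imported by many modules; central I/O boundary

## Tier 3: Key Implementations
- src/core/scoring.py — core ranking logic referenced by the entry points
```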
Passing Context to an LLM
Now that you have your `CONTEXT.md`, here is how you use it:
- Open your AI coding assistant (Claude, Cursor, ChatGPT, etc.).
- Drag and drop the `CONTEXT.md` file into the chat.
- Prompt the agent: “Read the attached context file to understand my project’s architecture, then implement [Feature X] in [Module Y].”
By providing the structured context up front, the LLM starts with a consistent, well-organized understanding of how your project fits together.
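If you drive a model through an API rather than a chat window, the same hand-off can be scripted. The following Python sketch is illustrative only: the helper name and prompt wording are assumptions, and codectx itself is only responsible for producing `CONTEXT.md`.

```python
from pathlib import Path


def build_prompt(context: str, request: str) -> str:
    """Prepend the generated project context to a task request.

    Illustrative helper (not part of codectx): it simply wraps the
    CONTEXT.md contents and your instruction into one prompt string.
    """
    return (
        "Read the following project context before answering.\n\n"
        f"--- CONTEXT.md ---\n{context}\n--- END CONTEXT ---\n\n"
        f"Task: {request}"
    )


# Assumes you have already run `codectx analyze .` in this directory.
context_file = Path("CONTEXT.md")
context = (
    context_file.read_text(encoding="utf-8")
    if context_file.exists()
    else "(no CONTEXT.md found; run `codectx analyze .` first)"
)
prompt = build_prompt(context, "Implement [Feature X] in [Module Y].")
print(prompt)
```

You would then send `prompt` to your model provider of choice; the wrapping delimiters just make it easy for the model (and you) to see where the generated context ends and your instruction begins.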
Next, learn more commands in Basic Usage.