Using Project State and LLM Context with Architect
This guide explains how to leverage Architect to synchronize your project's current state and generate a comprehensive context file suitable for use with Large Language Models (LLMs). This context can then be used to assist in generating revision drafts for your project components.
1. Synchronize Project State
First, you need to capture the current state of your application's codebase. This involves scanning your project for defined components and their operations.
Use the following command:
```shell
architect --verbose --components-base-path=${COMPONENTS_BASE_PATH} --app-root-name=${APP_ROOT_NAME} state sync-from-code
```
Explanation of Parameters:
- `--verbose`: Enables detailed logging output, which can be helpful for troubleshooting.
- `--components-base-path=${COMPONENTS_BASE_PATH}`: Specifies the root directory of your target project. Replace `${COMPONENTS_BASE_PATH}` with the actual path (e.g., `.` if running from the project root, or an absolute path like `/path/to/your/project`).
- `--app-root-name=${APP_ROOT_NAME}`: Defines the name of your main application package directory (e.g., `app`, `src`). Replace `${APP_ROOT_NAME}` with your project's specific application root directory name.
This command will analyze your code and update Architect's internal representation of your project's state.
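If you run this step from a script or CI job, the invocation can be assembled programmatically. A minimal Python sketch follows; the helper name and the commented `subprocess` call are illustrative, not part of Architect, and it assumes `architect` is on your `PATH`:

```python
import subprocess  # only needed if you uncomment the run below


def build_sync_command(components_base_path: str, app_root_name: str) -> list[str]:
    """Assemble the `state sync-from-code` invocation as an argument list."""
    return [
        "architect",
        "--verbose",
        f"--components-base-path={components_base_path}",
        f"--app-root-name={app_root_name}",
        "state",
        "sync-from-code",
    ]


# Run the sync (uncomment to execute):
# subprocess.run(build_sync_command(".", "app"), check=True)
```

Passing the arguments as a list (rather than a single shell string) avoids quoting issues when paths contain spaces.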
2. Generate LLM Context File
Once the project state is synchronized, you can generate a consolidated context file. This file aggregates information about your project's components, operations, and relationships in a format optimized for LLMs.
Use the following command:
```shell
architect --verbose --components-base-path=${COMPONENTS_BASE_PATH} --app-root-name=${APP_ROOT_NAME} context combine-for-llm -o path/to/your/context.json
```
Explanation of Parameters:
- Global options (`--verbose`, `--components-base-path`, `--app-root-name`) are the same as above.
- `context combine-for-llm`: This is the specific command to generate the LLM context.
- `-o path/to/your/context.json`: Specifies the output file path for the generated JSON context. Replace `path/to/your/context.json` with your desired location and filename (e.g., `project_context.json`, `llm_input.json`).
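As with the sync step, this invocation can be assembled in a script. A minimal Python sketch; the helper name and the commented `subprocess` call are illustrative, and it assumes `architect` is on your `PATH`:

```python
import subprocess  # only needed if you uncomment the run below


def build_context_command(base_path: str, app_root: str, output: str) -> list[str]:
    """Assemble the `context combine-for-llm` invocation as an argument list."""
    return [
        "architect",
        "--verbose",
        f"--components-base-path={base_path}",
        f"--app-root-name={app_root}",
        "context",
        "combine-for-llm",
        "-o",
        output,
    ]


# Generate the context file (uncomment to execute):
# subprocess.run(build_context_command(".", "app", "project_context.json"), check=True)
```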
Tip: For a full list of available CLI options and their corresponding environment variables, consult the Architect CLI reference documentation.
3. Utilize the Context with an LLM
With the `context.json` file generated, you can now use it with your preferred Large Language Model.
How to use the context:
- Provide the `context.json` content to the LLM. This might involve pasting the JSON directly into the prompt, or using an API if the LLM supports file uploads or structured data inputs.
- Craft your prompt. Along with the context, instruct the LLM on what you want to achieve. For example:
  - Ask the LLM to summarize the "big picture" of your application based on the context.
  - Describe a specific workflow or feature you intend to implement or modify.
  - Detail any new permissions or changes to existing ones.
  - Request the LLM to generate a revision draft (in JSON format compatible with Architect) containing all necessary `ComponentOperation` definitions for the new feature or changes.
Example Prompt Idea:
"Based on the provided project context (see `context.json` content below), I need to implement a new 'Order Cancellation' feature. This feature involves a new API endpoint, updates to the 'Order' aggregate, and a new 'OrderCancelled' domain event. The user performing this action must have the 'cancel_order' permission. Please generate an Architect revision draft with all the necessary ComponentOperations to implement this."
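Assembling such a prompt can itself be scripted so that the context is always embedded verbatim. A minimal Python sketch; the function name and prompt wording are illustrative, not prescribed by Architect:

```python
import json


def build_prompt(context_path: str, task_description: str) -> str:
    """Embed the generated Architect context into an LLM prompt.

    The prompt framing here is an example; adapt it to your model.
    """
    with open(context_path, encoding="utf-8") as f:
        context = json.load(f)
    return (
        f"{task_description}\n\n"
        "Project context (generated by `architect context combine-for-llm`):\n"
        f"{json.dumps(context, indent=2)}"
    )


# prompt = build_prompt("project_context.json",
#                       "Implement a new 'Order Cancellation' feature ...")
```

Loading and re-serializing the JSON (rather than pasting the raw file) also verifies that the context file is well-formed before it reaches the model.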
Iterative Approach:
- You don't have to ask for a complete revision draft at once.
- You can start by asking the LLM to identify relevant components for a specific task or to suggest modifications to existing components based on the context.
- Iterate on the LLM's suggestions, refine your requirements, and gradually build up the `revisions.json` file.
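When iterating, it helps to sanity-check each draft the LLM returns before feeding it back to Architect. A minimal Python sketch; the expected schema here (a top-level "operations" list whose entries name a component) is an assumption for illustration only, so compare it against your Architect version's actual revision format:

```python
import json


def check_revision_draft(text: str) -> list[str]:
    """Return a list of problems found in an LLM-produced revision draft.

    An empty list means the draft passed these (assumed) structural checks.
    """
    try:
        draft = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    ops = draft.get("operations")
    if not isinstance(ops, list):
        problems.append('missing top-level "operations" list')
    else:
        for i, op in enumerate(ops):
            if not isinstance(op, dict) or "component" not in op:
                problems.append(f'operation {i} lacks a "component" field')
    return problems
```

Even a lightweight check like this catches the most common failure mode: the LLM returning prose or truncated JSON instead of a machine-readable draft.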
By following these steps, you can effectively use Architect's state and context generation features to streamline your development process and leverage LLMs for component design and modification.