Analyzing a Specific Prompt

If a question produced an unexpected answer, you can extract detailed information about the process used to generate that answer.

This process produces an archive containing the relevant debugging files.

Gateway Process

Preparation

docker-compose.override.debug.yml

Make sure you’re running businessgpt/debug_collector:1.1c or newer.

Ensure this setting is present:

DEBUG_KEEP_RUNNING=true
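As a rough sketch, the relevant parts of docker-compose.override.debug.yml might look like the following. The service name and the debug profile are taken from the compose command used later on this page; the exact structure of your file may differ, so adapt accordingly.

```yaml
# Sketch of docker-compose.override.debug.yml (structure assumed; adapt to your file).
services:
  debug_collector_bgpt:
    image: businessgpt/debug_collector:1.1c   # 1.1c or newer, as required above
    profiles: ["debug"]
    environment:
      - DEBUG_KEEP_RUNNING=true               # setting required above
```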


docker.env

Add

SAVE_DEBUG_INFO=True

Ensure the debug container is running, e.g.:

docker compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.override.debug.yml --profile debug up -d debug_collector_bgpt

Getting the Prompt Debug file

Run the following command, passing a chat ID, prompt ID, or request ID:

docker exec -it debug_collector_bgpt python /app/prompt_debug_collector.py <ID>

Collect the .7z file from the prompt debug directory, or pick out individual files as needed,
e.g. from /DebugData/prompt_debug_<ID>_<TimeStamp>/
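If several collections have run, the newest output directory can be picked out with a modification-time sort. This sketch mocks the /DebugData layout under a temporary directory (the directory names and the archive filename prompt_debug.7z are illustrative placeholders) so it runs anywhere; on a real host, point base at /DebugData instead.

```shell
# Sketch: find the newest prompt-debug output directory and list its contents.
# The layout is mocked here; on a real host use base=/DebugData.
base=$(mktemp -d)
mkdir -p "$base/prompt_debug_42_20250101-120000"
touch "$base/prompt_debug_42_20250101-120000/prompt_debug.7z"
sleep 1   # ensure distinct mtimes so the sort below is deterministic
mkdir -p "$base/prompt_debug_43_20250102-093000"
touch "$base/prompt_debug_43_20250102-093000/prompt_debug.7z"

# -t sorts by modification time (newest first), -d lists directories themselves.
latest=$(ls -td "$base"/prompt_debug_*/ | head -n 1)
echo "Latest debug output: $latest"
ls "$latest"
```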

Repeat the docker exec command above for each prompt you wish to debug. Alternatively, run the debug container as usual to collect the files for all prompts.
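When several prompts need debugging, the single docker exec can be wrapped in a small loop. The IDs below are placeholders, and the leading echo makes this a dry run that only prints each command; drop the echo to actually invoke the collector.

```shell
# Dry-run sketch: build the collector command for several IDs (placeholder IDs).
ids="chat-123 prompt-456"
for id in $ids; do
  cmd="docker exec -it debug_collector_bgpt python /app/prompt_debug_collector.py $id"
  echo "$cmd"   # dry run: prints the command; remove the echo to execute it
done
```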

Setting SAVE_DEBUG_INFO=True may affect performance and increase disk space usage. Set it to False in production environments where possible.