Prompt versioning with LLMs | James' Coffee Blog
The templates used to generate prompts for my GPT-3.5-powered chatbot are versioned in a custom-made system. The need became clear after I wrote the initial logic to query sources and return a result that references them: I decided that all prompts should be saved separately from my application code, so that I couldn't overwrite them during testing and lose the history of the prompts I had been working with.