Release 0.5.0
This release of vscode-gpt-automate includes a few backend tweaks, some frontend improvements, and entirely new functionality.
Frontend
- New progress bar while awaiting a response from the API (see the sketch after this list)
- New command: RFC (see 'RFC Handshake' below), which allows the AI to read code from files and fix issues
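For illustration only, this is roughly how a progress notification is shown in a VS Code extension using `vscode.window.withProgress`; the title text and the `queryApi` callback are placeholders, not the extension's actual code.

```ts
import * as vscode from 'vscode';

// Show an indeterminate progress notification while the API call runs.
// queryApi stands in for the extension's actual request logic.
async function runWithProgress(queryApi: () => Promise<string>): Promise<string> {
    return vscode.window.withProgress(
        {
            location: vscode.ProgressLocation.Notification,
            title: 'Waiting for a response from the API...',
            cancellable: false
        },
        () => queryApi()
    );
}
```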
Backend
- Added persistent logging so I can better diagnose issues. An option to disable this telemetry is planned for the next release.
- Improved the prompt to make the AI MUCH more willing to write actual code
- Added token prioritization weighting to keep prompts within the 4,096-token limit, meaning the AI has a much greater understanding of your workspace (see the sketch after this list)
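To illustrate the idea (a minimal sketch, not the extension's actual implementation): rank workspace entries by priority and keep the highest-priority ones until the budget left after the pre-prompt is used up. The ~4-characters-per-token estimate and the `priority` field are assumptions made for this sketch.

```ts
// Sketch of token-budget prioritization under assumed heuristics.
interface WorkspaceEntry {
    path: string;
    priority: number; // higher = more important to keep in the prompt
}

const TOKEN_LIMIT = 4096;
const RESERVED_TOKENS = 700; // roughly the size of the pre-prompt

// Very rough token estimate: ~4 characters per token.
function estimateTokens(text: string): number {
    return Math.ceil(text.length / 4);
}

// Keep the highest-priority entries that still fit in the remaining budget.
function buildWorkspaceListing(entries: WorkspaceEntry[]): string {
    let budget = TOKEN_LIMIT - RESERVED_TOKENS;
    const kept: string[] = [];

    for (const entry of [...entries].sort((a, b) => b.priority - a.priority)) {
        const cost = estimateTokens(entry.path + '\n');
        if (cost > budget) continue; // skip entries that no longer fit
        kept.push(entry.path);
        budget -= cost;
    }

    return kept.join('\n');
}
```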
RFC Handshake
This feature is actively being worked on but is now ready for a preview release. The RFC handshake allows the AI to request the contents of a file only when necessary.
There are several reasons why the AI can't simply do this already:
- The ChatGPT turbo API is limited to 4,096 tokens
- A workspace's file and folder names alone can add up to thousands of tokens.
- The pre-prompt in the backend is almost 700 tokens by itself.
With so much of the token budget already spoken for, fitting file contents into the mix is difficult. That's why I created the RFC Handshake.
The AI reads the prompt sent to it and judges whether it needs to read a file to complete the user's request. If so, the AI replies with nothing but RFC "path/to/file". Upon receiving this, the client automatically responds with the contents of the requested file, and the AI then completes the prompt as normal.
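As a rough sketch of the client side of this exchange (the `sendToApi` callback and message handling below are assumptions for illustration, not the extension's actual code):

```ts
// Sketch of the client side of the RFC handshake, assuming a sendToApi
// callback that wraps the actual chat completion request.
import * as fs from 'fs/promises';

type Message = { role: 'system' | 'user' | 'assistant'; content: string };
type SendFn = (messages: Message[]) => Promise<string>;

const RFC_PATTERN = /^RFC "(.+)"$/;

export async function runPrompt(messages: Message[], sendToApi: SendFn): Promise<string> {
    let reply = await sendToApi(messages);

    // If the model answers with only an RFC request, reply with the file
    // contents so it can finish the original prompt.
    const match = reply.trim().match(RFC_PATTERN);
    if (match) {
        const fileContents = await fs.readFile(match[1], 'utf8');
        messages.push({ role: 'assistant', content: reply });
        messages.push({ role: 'user', content: fileContents });
        reply = await sendToApi(messages);
    }

    return reply;
}
```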
I am still tweaking the prompt so the AI reliably understands what RFC means, but the feature is already implemented in the meantime.
Happy prompting!