Major Refactor and New Features: Modularization, Enhanced CLI, and Logging (#20)
* added whitelist and blacklist
* added unit test
* update rules
* fail
* unit tests
* strong unit tests
* update main
* compilation
* update code
* extracted filter
* filter test implemented
* glob approach
* filter in progress
* filter test pass
* test filter good
* split into modules
* include ok
* logger
* docstring
* fixed the exclude pattern empty bug
* unit test done
* git test
* code cleaning
* README Updated
* cleanup
* debug cleaned
* template support
* improve api
* test template
* git test
* clean the code
* handling error properly
* cleaned
* documentation
* semver
* ver 2.0.0
---------
Co-authored-by: SChattot <stephane.chattot@heig-vd.ch>

You can run this tool on an entire directory and it will generate a well-formatted Markdown prompt detailing the source tree structure and all the code. You can then upload this document to GPT or Claude models with higher context windows and ask them to:

- Rewrite the code in another language.
- Find bugs/security vulnerabilities.
- Document the code.
- Implement new features.

> I initially wrote this for personal use to utilize Claude 3.0's 200K context window and it has proven to be pretty useful so I decided to open-source it!

`code2prompt` is a command-line tool (CLI) that converts your codebase into a single LLM prompt with a source tree, prompt templating, and token counting.
## Table of Contents

- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
- [Templates](#templates)
- [User Defined Variables](#user-defined-variables)
- [Tokenizers](#tokenizers)
- [Build From Source](#build-from-source)
- [Contribution](#contribution)
- [License](#license)
- [Support The Author](#support-the-author)

## Features
- Quickly generate LLM prompts from codebases of any size.
- Customize prompt generation with Handlebars templates. (See the [default template](src/default_template.hbs))
- Respects `.gitignore`.
- Filter and exclude files using glob patterns.
- Display the token count of the generated prompt. (See [Tokenizers](#tokenizers) for more details.)
- Optionally include Git diff output (staged files) in the generated prompt.
- Automatically copy the generated prompt to the clipboard.
- Save the generated prompt to an output file.
- Exclude files and folders by name or path.
- Add line numbers to source code blocks.

You can customize the prompt template to achieve any of the desired use cases. It essentially traverses a codebase and creates a prompt with all source files combined. In short, it automates copy-pasting multiple source files into your prompt, formatting them, and letting you know how many tokens your code consumes.

## Installation
### Latest Release
Download the latest binary for your OS from [Releases](https://github.com/mufeedvh/code2prompt/releases) OR install with `cargo`:
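
The `cargo` route would typically look like the following; this is an illustrative sketch that assumes the crate is published on crates.io under the same name as the binary:

```shell
# Install the published crate (assumes the crate name is `code2prompt`).
cargo install code2prompt
```
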
## Templates
`code2prompt` comes with a set of built-in templates for common use cases. You can find them in the [`templates`](templates) directory.
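
For a sense of what a template looks like, here is a minimal Handlebars sketch; the variable names (`absolute_code_path`, `source_tree`, `files`, `path`, `code`) are illustrative and may differ from the ones the tool actually exposes:

```handlebars
Project Path: {{absolute_code_path}}

Source Tree:
{{source_tree}}

{{#each files}}
`{{path}}`:
{{code}}
{{/each}}
```
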

Use this template to generate prompts for documenting the code. It will add documentation comments to the code.

Use this template to generate prompts for finding potential security vulnerabilities in the codebase. It will look for common security issues and provide recommendations on how to fix or mitigate them.

Use this template to generate prompts for cleaning up and improving the code quality. It will look for opportunities to improve readability, adherence to best practices, efficiency, error handling, and more.

Use this template to generate prompts for improving the performance of the codebase.

You can use these templates by passing the `-t` flag followed by the path to the template file. For example:
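
An illustrative invocation (both the codebase path and the template path here are placeholders, not quoted from the project):

```shell
# Generate a prompt for the codebase at path/to/codebase using a custom template.
# Substitute your own paths.
code2prompt path/to/codebase -t templates/document-the-code.hbs
```
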
For more context on the different tokenizers, see the [OpenAI Cookbook](https://…).

`code2prompt` makes it easy to generate prompts for LLMs from your codebase. It traverses the directory, builds a tree structure, and collects information about each file. You can customize the prompt generation using Handlebars templates. The generated prompt is automatically copied to your clipboard and can also be saved to an output file. `code2prompt` helps streamline the process of creating LLM prompts for code analysis, generation, and other tasks.
## Build From Source
### Prerequisites
For building `code2prompt` from source, you need to have these tools installed:

- [Git](https://git-scm.org/downloads)
- [Rust](https://rust-lang.org/tools/install)
- Cargo (automatically installed when installing Rust)

The first command clones the `code2prompt` repository to your local machine. The next two commands change into the `code2prompt` directory and build it in release mode.
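
The three commands that paragraph refers to would typically be the following (the repository URL is taken from the Releases link above):

```shell
# Clone the repository, enter it, and build an optimized release binary.
git clone https://github.com/mufeedvh/code2prompt.git
cd code2prompt
cargo build --release
```
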
## Contribution
Ways to contribute:

## License

Licensed under the MIT License, see <a href="https://github.com/mufeedvh/code2pr…">LICENSE</a>.

## Support The Author
If you liked the project and found it useful, please give it a :star: and consider supporting the authors!