A command-line tokenizer written in Rust. It reads a text file, tokenizes it into words, and counts how many times each word appears.
./target/debug/tokenizer $TEXT_FILE.txt
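
For reference, the behavior described above boils down to a split-and-count loop. A minimal sketch of that idea using only the standard library (the hardcoded `input.txt` path is a placeholder for illustration, not the tool's actual input handling):

```rust
use std::collections::HashMap;
use std::fs;

fn main() {
    // "input.txt" is a placeholder path; the real binary takes a file argument.
    let text = fs::read_to_string("input.txt").expect("failed to read input file");

    // Split on whitespace and tally each token.
    let mut counts: HashMap<&str, u32> = HashMap::new();
    for word in text.split_whitespace() {
        *counts.entry(word).or_insert(0) += 1;
    }

    for (word, count) in &counts {
        println!("{word}: {count}");
    }
}
```

Keying the map with `&str` slices borrowed from the input avoids allocating a new `String` per token.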
- convert the hardcoded input text into a command-line argument (see the sketch after this list)
- write the output JSON file to a location of the user's choosing
- allow the user to configure the delimiters used for tokenization
- refactor the code to meet the style guide
- implement benchmark tests and optimize performance accordingly
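
As one possible shape for the first three items, here is a sketch using only the standard library; the argument order, the default output path `counts.json`, the default delimiter set, and the naive JSON escaping are assumptions for illustration, not the tool's actual interface:

```rust
use std::collections::HashMap;
use std::env;
use std::fs;

fn main() {
    // Assumed invocation: tokenizer <input> [output.json] [delimiter characters]
    let args: Vec<String> = env::args().collect();
    let input_path = args
        .get(1)
        .expect("usage: tokenizer <input> [output] [delimiters]");
    let output_path = args.get(2).map(String::as_str).unwrap_or("counts.json");
    let delimiters: Vec<char> = args
        .get(3)
        .map(|s| s.chars().collect())
        .unwrap_or_else(|| vec![' ', '\t', '\n']);

    let text = fs::read_to_string(input_path).expect("failed to read input file");

    // Split on the configured delimiters, skipping empty tokens.
    let mut counts: HashMap<&str, u32> = HashMap::new();
    for word in text
        .split(|c: char| delimiters.contains(&c))
        .filter(|w| !w.is_empty())
    {
        *counts.entry(word).or_insert(0) += 1;
    }

    // Serialize as a flat JSON object; the escaping here is naive and
    // only handles backslashes and quotes.
    let entries: Vec<String> = counts
        .iter()
        .map(|(word, n)| {
            let escaped = word.replace('\\', "\\\\").replace('"', "\\\"");
            format!("  \"{escaped}\": {n}")
        })
        .collect();
    fs::write(output_path, format!("{{\n{}\n}}\n", entries.join(",\n")))
        .expect("failed to write output file");
}
```

Under this assumed interface, the tool might be invoked as `./target/debug/tokenizer input.txt out.json " ,."` to split on spaces, commas, and periods.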