A small compiler implementation for the TINY programming language, written in C with an optional Python GUI. This project demonstrates the fundamental concepts of lexical analysis and parsing in compiler construction.
This compiler consists of two main components:
**Scanner (Lexical Analyzer)**
- Reads TINY source code and breaks it into tokens
- Recognizes keywords, identifiers, numbers, and operators
- Outputs token information for the parser
**Parser (Syntax Analyzer)**
- Takes tokens from the scanner and builds a parse tree
- Implements the grammar rules of the TINY language
- Generates a visual representation of the program structure
The project includes a Makefile for easy compilation:
```sh
# Build both scanner and parser
make

# Run the complete compilation process
make run

# Clean compiled files
make clean
```
You can also run the components individually:
```sh
# Run just the scanner
./scanner

# Run just the parser
./parser
```
For a more user-friendly experience, there's a Python GUI application:
```sh
# Make sure you have the required dependencies
pip install PySide6

# Run the GUI
python guii.py
```
The GUI provides a visual interface for editing TINY programs, running the compiler, and viewing both the token output and parse tree.
TINY supports the following language constructs:
- Variables: Simple identifiers for storing values
- Numbers: Integer literals
- Arithmetic: Addition (+), subtraction (-), multiplication (*), division (/)
- Assignment: Using the `:=` operator
- Comparison: Less than (<) and equality (=) operators
- Control Flow:
  - Conditional statements with `if-then`
  - Loops with `repeat-until`
- I/O Operations: `read` and `write` statements
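Put together, a short TINY program using these constructs might look like the following. This listing is purely illustrative; the bundled `input.tiny` may differ:

```
{ sum the numbers from 1 to n -- illustrative TINY program }
read n;
sum := 0;
repeat
    sum := sum + n;
    n := n - 1
until n < 1;
if 0 < sum then
    write sum
end
```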
- `scanner.c` - Lexical analyzer implementation
- `parser.c` - Syntax analyzer implementation
- `tokens.h` - Token type definitions and constants
- `token_strings.c` - String representations of tokens
- `input.tiny` - Sample TINY program for testing
- `guii.py` - Optional graphical user interface
- `run.sh` - Shell script for GUI setup
- `Makefile` - Build configuration
- Lexical Analysis: The scanner reads the source file character by character, grouping them into meaningful tokens (keywords, identifiers, operators, etc.)
- Syntax Analysis: The parser takes the stream of tokens and checks if they follow the grammar rules of TINY, building a parse tree in the process
- Output: The compiler generates two files:
  - `tokens.txt` - List of all tokens found in the source
  - `tree.txt` - Visual representation of the parse tree