Version: 1.0.0 (MVP) | License: GPL v3 | Language: Go 1.21+
The core transformation engine for Infrar - converts provider-agnostic code (Infrar SDK) into native cloud provider SDK code at deployment time, enabling true multi-cloud portability with zero runtime overhead.
Infrar Engine uses compile-time code transformation to convert your infrastructure-agnostic code into provider-specific implementations:
User Code (Infrar SDK) → Infrar Engine → Provider Code (Native SDK)
   (GitHub repo)        (Transformation)   (Deployed to cloud)
Input (Infrar SDK):

```python
from infrar.storage import upload

upload(bucket='my-bucket', source='file.txt', destination='backup/file.txt')
```

Output (AWS/boto3):

```python
import boto3

s3 = boto3.client('s3')
s3.upload_file('file.txt', 'my-bucket', 'backup/file.txt')
```

Output (GCP/Cloud Storage):

```python
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket('my-bucket')
blob = bucket.blob('backup/file.txt')
blob.upload_from_filename('file.txt')
```

- ✅ Zero Runtime Overhead - Code is transformed at deployment time, not runtime
- ✅ AST-Based Transformation - Accurate code parsing using language-native parsers
- ✅ Plugin Architecture - Extensible transformation rules via YAML
- ✅ Multi-Provider Support - AWS, GCP, Azure (MVP: AWS + GCP for storage)
- ✅ Validation - Generated code is validated for syntax correctness
- ✅ Type-Safe - Full type system for transformation pipeline
- Go 1.21 or higher
- Python 3.8+ (for Python AST parsing)
```bash
git clone https://github.com/QodeSrl/infrar-engine.git
cd infrar-engine
go build -o bin/transform ./cmd/transform
```

```bash
# Run all tests
go test ./...

# Run with verbose output
go test ./... -v

# Run specific package
go test ./pkg/parser -v
```

```python
# app.py
from infrar.storage import upload

def backup_data():
    upload(
        bucket='my-data-bucket',
        source='/tmp/data.csv',
        destination='backups/data.csv'
    )
```

```bash
./bin/transform -provider aws -input app.py -output app_aws.py
```

Output (app_aws.py):

```python
import boto3

s3 = boto3.client('s3')

def backup_data():
    s3.upload_file('/tmp/data.csv', 'my-data-bucket', 'backups/data.csv')
```

```bash
./bin/transform -provider gcp -input app.py -output app_gcp.py
```

The transformation pipeline consists of six core components:
```
┌────────┐   ┌──────────┐   ┌─────────────┐   ┌───────────┐   ┌───────────┐   ┌────────┐
│ Parser │──>│ Detector │──>│ Transformer │──>│ Generator │──>│ Validator │──>│ Result │
└────────┘   └──────────┘   └─────────────┘   └───────────┘   └───────────┘   └────────┘
     │             │               │                │               │
     ▼             ▼               ▼                ▼               ▼
    AST      Infrar Calls    Transformed       Final Code      Validated
                                 Calls                            Code
```
- Parser (`pkg/parser`) - Parses source code into an AST using Python's native parser
- Detector (`pkg/detector`) - Identifies Infrar SDK calls in the AST
- Plugin Loader (`pkg/plugin`) - Loads transformation rules from YAML files
- Transformer (`pkg/transformer`) - Applies rules to generate provider code
- Generator (`pkg/generator`) - Combines transformed calls into the final code
- Validator (`pkg/validator`) - Validates generated code syntax
See ARCHITECTURE.md for detailed technical documentation.
Transformation rules are defined in YAML files:
```yaml
# storage/aws/rules.yaml
operations:
  - name: upload
    pattern: "infrar.storage.upload"
    target:
      provider: aws
      service: s3
    transformation:
      imports:
        - "import boto3"
      setup_code: |
        s3 = boto3.client('s3')
      code_template: |
        s3.upload_file(
            {{ .source }},
            {{ .bucket }},
            {{ .destination }}
        )
      parameter_mapping:
        bucket: bucket
        source: source
        destination: destination
    requirements:
      - package: boto3
        version: ">=1.28.0"
```

Plugin Locations:

- Production plugins: infrar-plugins repository (`../infrar-plugins/packages`)
- Test plugins: `./test-plugins` directory (for local development and testing)
Use the production plugins repository for actual transformations. The test-plugins directory is kept for development convenience.
```go
package main

import (
	"fmt"

	"github.com/QodeSrl/infrar-engine/pkg/engine"
	"github.com/QodeSrl/infrar-engine/pkg/types"
)

func main() {
	// Create engine
	eng, err := engine.New()
	if err != nil {
		panic(err)
	}

	// Load transformation rules
	err = eng.LoadRules("../infrar-plugins/packages", types.ProviderAWS, "storage")
	if err != nil {
		panic(err)
	}

	// Transform code
	sourceCode := `
from infrar.storage import upload

upload(bucket='data', source='file.txt', destination='file.txt')
`
	result, err := eng.Transform(sourceCode, types.ProviderAWS)
	if err != nil {
		panic(err)
	}

	fmt.Println(result.TransformedCode)
}
```

```bash
# Transform from stdin
echo "from infrar.storage import upload" | ./bin/transform -provider aws

# Transform file
./bin/transform -provider aws -input app.py -output app_aws.py

# Transform to GCP
./bin/transform -provider gcp -input app.py -output app_gcp.py

# Specify plugin directory
./bin/transform -provider aws -plugins ./custom-plugins -input app.py
```

```
-provider string
      Target cloud provider (aws, gcp, azure) (default "aws")
-plugins string
      Path to plugins directory (default "../infrar-plugins/packages")
-capability string
      Capability to transform (storage, database, etc.) (default "storage")
-input string
      Input file to transform (or use stdin)
-output string
      Output file (or use stdout)
```
Current test coverage (MVP):
- ✅ Parser: 100% (all tests passing)
- ✅ Detector: 100% (all tests passing)
- ✅ Plugin Loader: 100% (all tests passing)
- ✅ Transformer: 100% (all tests passing)
- ✅ Generator: 100% (all tests passing)
- ✅ Validator: 100% (all tests passing)
- ✅ Engine (E2E): 100% (all tests passing)
```bash
# Run all tests
go test ./... -v

# Run with coverage
go test ./... -cover

# Generate coverage report
go test ./... -coverprofile=coverage.out
go tool cover -html=coverage.out
```

```
infrar-engine/
├── cmd/
│   └── transform/      # CLI tool
├── pkg/
│   ├── types/          # Core type definitions
│   ├── parser/         # AST parsing (Python)
│   ├── detector/       # Infrar call detection
│   ├── plugin/         # Plugin loader & registry
│   ├── transformer/    # Core transformation logic
│   ├── generator/      # Code generation
│   ├── validator/      # Code validation
│   └── engine/         # Main engine (public API)
├── internal/
│   └── util/           # Internal utilities
├── tests/
│   ├── integration/    # Integration tests
│   └── fixtures/       # Test fixtures
├── go.mod
├── go.sum
├── README.md           # This file
├── ARCHITECTURE.md     # Technical architecture
└── LICENSE
```
Target performance metrics (MVP):
- Transform 100 lines in < 100ms ✅
- Support files up to 10,000 lines ✅
- Cache parsed ASTs for repeated transformations 🚧
- Parallel transformation of multiple files 🚧
- Python AST parser
- Infrar call detector
- Plugin system with YAML rules
- Transformation engine
- Code generator
- Code validator
- AWS S3 transformations
- GCP Cloud Storage transformations
- CLI tool
- Comprehensive test suite
- Node.js/TypeScript support
- Database capability (RDS, Cloud SQL)
- Messaging capability (SQS, Pub/Sub)
- Azure support
- Performance optimizations (caching, parallelization)
- Go language support
- Multi-file project transformation
- IDE integration (VS Code extension)
- Language Server Protocol (LSP)
- Advanced optimizations
GNU General Public License v3.0 - see LICENSE file for details.
- Issues: https://github.com/QodeSrl/infrar-engine/issues
- Docs: https://docs.infrar.io
- Email: support@infrar.io
Made with ❤️ by the Infrar Team

Website • Documentation • GitHub