A detailed simulation modeling a complete self-replicating factory with 250+ components, 16 specialized module types, and a fully modular architecture for extensibility and experimentation.
This simulation addresses the question: What is the true complexity and resource requirement for a solar-powered factory system that starts from completely raw materials and builds a full copy of itself?
The detailed model reveals that autonomous replication requires approximately 2-3x more resources and time than simplified models suggest, primarily due to software development overhead, transport logistics, quality control, thermal management, and maintenance downtime.
- 250+ Components: From raw ores to microprocessors
- 16 Specialized Modules: Mining, chemical, CNC, cleanroom, transport, etc.
- Spec System: Define complete factories in external configuration files
- Modular Architecture: Swappable subsystems, parallel execution, event-driven
- Comprehensive Systems: Thermal management, contamination control, AGV routing
- Configurable Profiles: High-throughput, energy-efficient, high-quality modes
- Dynamic Configuration: Load different factory designs without code changes
- Input Validation: File size limits, circular inheritance detection, security hardening
- Event Bus Metrics: Backpressure handling, queue monitoring, dropped event tracking
- Structured Logging: Production-grade error handling with context and tracebacks
- CI/CD Pipeline: Automated testing, linting, type-checking, security scanning
- Modern Packaging: PEP 621 compliant pyproject.toml, CLI entry points
- Code Quality: Eliminated duplication with ResourceEnumMixin, comprehensive type hints
- Integration Tests: Full end-to-end workflow testing for critical paths
# Install with all features (development ready)
pip install -e ".[all]"
# Or install specific feature sets
pip install -e ".[viz]" # Visualization only
pip install -e ".[dev]" # Development tools
pip install -e . # Core only
# CLI commands available after installation
factory-sim --help
factory-builder list

# No dependencies required for core simulation
python3 self_replicating_factory_sim.py
# Optional: Install dependencies for full features
pip3 install -r requirements.txt
# - pyyaml: For YAML spec files (.spec, .yaml)
# - matplotlib: For visualization graphs

# Quick automated run (sets up venv, runs simulation & analysis)
./run_simulation.sh
# Or run manually:
python3 self_replicating_factory_sim.py
python3 analyze_factory_sim.py # Requires matplotlib
python3 visualize_factory_system.py # Requires matplotlib

# Default simulation (ultra-realistic with 250+ components)
python3 self_replicating_factory_sim.py
# Minimal spec for faster testing (~40 components)
python3 self_replicating_factory_sim.py --spec minimal.json --max-hours 100
# Use spec with profile (high_throughput, energy_efficient, etc.)
python3 self_replicating_factory_sim.py --spec minimal.json --profile fast_simulation
# Use the full ultra-realistic spec (requires PyYAML)
python3 self_replicating_factory_sim.py --spec default.spec --output results.json
# NEW: Run with modular architecture using custom subsystems from spec
python3 self_replicating_factory_sim.py --spec specs/genetic_optimized.json --modular
# Use factory_builder module for complete spec-to-factory conversion
python3 factory_builder.py create specs/default.spec high_throughput

# Option 1: Programmatic factory creation
from modular_factory_adapter import ModularFactory
from modular_framework import UpdateStrategy  # needed for set_update_strategy() below
from custom_subsystems import GeneticRoutingTransport
# Create modular factory
factory = ModularFactory()
# Add custom subsystem
factory.add_custom_subsystem("genetic_transport", GeneticRoutingTransport())
# Enable parallel execution
factory.set_update_strategy(UpdateStrategy.PARALLEL)
# Run simulation
result = factory.run_simulation(max_hours=1000)
# Option 2: Create from spec file with custom subsystems
from factory_builder import create_factory_from_spec
# Spec defines both configuration AND custom subsystem implementations
factory = create_factory_from_spec("specs/my_advanced_factory.json")
result = factory.run_simulation(max_hours=1000)

- self_replicating_factory_sim.py - Main factory simulation (2600+ lines)
- analyze_factory_sim.py - Enhanced analysis with 12-panel dashboard
- visualize_factory_system.py - System architecture and flow diagrams
- run_simulation.sh - Automated runner script
- modular_framework.py - Core interfaces, event bus, orchestrator
- modular_factory_adapter.py - Bridges existing simulation with modular system
- custom_subsystems.py - 7 advanced example subsystems
- configs/modular_base.json - Hierarchical configuration with profiles
- CLAUDE.md - Development guide for Claude Code
- README.md - This file
- SPEC_FORMAT.md - Complete spec system format documentation
- IMPROVEMENTS.md - Detailed changelog of v1.0.0 improvements (NEW)
- CONTRIBUTING.md - Contribution guidelines
- MIGRATION_GUIDE.md - Guide for migrating to dynamic subsystems
- docs/archive/ - Completed implementation plans and historical docs
- .github/workflows/test.yml - Automated testing pipeline
- .github/workflows/docs.yml - Documentation workflow
- .github/dependabot.yml - Dependency management
- pyproject.toml - Modern Python packaging configuration
- pytest.ini - Test configuration
- mypy.ini - Type checking configuration
- spec_loader.py - Dynamic spec loading and validation system
- factory_builder.py - Creates complete factories from spec files (NEW)
- specs/ - Factory specification files
  - default.spec - Ultra-realistic factory with 250+ components
  - default_recipes.yaml - Complete recipe chains
  - minimal.json - Simplified ~40 component factory
The spec system allows you to define complete factory configurations externally without modifying code. The new subsystem_implementations feature extends this to define both WHAT (resources, recipes) and HOW (subsystem implementations) in a single spec file.
The factory_builder.py module enables complete factory creation from spec files, bridging the spec system with the modular architecture.
- Complete Spec-to-Factory Translation: Creates fully configured ModularFactory instances from specs
- Subsystem Validation: Validates that all specified subsystems are available
- Dynamic Configuration: Applies profiles and configurations at runtime
- No-Code Factory Creation: Define advanced factories without writing Python code
# List all available subsystem implementations
python3 factory_builder.py list
# Validate a spec file's subsystem implementations
python3 factory_builder.py validate specs/my_factory.json
# Create a factory from a spec
python3 factory_builder.py create specs/genetic_optimized.json high_throughput

from factory_builder import create_factory_from_spec, list_available_subsystems
# See all available subsystems
subsystems = list_available_subsystems()
print("Available:", list(subsystems.keys()))
# Create factory with genetic algorithms and smart grid
factory = create_factory_from_spec("specs/advanced_factory.json", "optimization_mode")
result = factory.run_simulation(max_hours=2000)

- Complete Configuration: Define both WHAT (resources, recipes) and HOW (subsystem implementations) in specs
- Modular Design: Define complete factory configurations in external files
- Dynamic Loading: ResourceType enum generated from specs at runtime
- Subsystem Implementations: Specify custom subsystems (genetic routing, smart grids, etc.) in specs
- Inheritance: Specs can extend parent specs for variations
- Profiles: Switch between optimization modes (high_throughput, energy_efficient, etc.)
- Validation: Automatic checking for dependency cycles and missing resources
- Format Flexibility: Support for JSON and YAML formats (YAML requires PyYAML)
- No-Code Factory Creation: Create complete modular factories without writing Python code
- specs/default.spec: Ultra-realistic factory with 250+ components and 16 modules
- specs/minimal.json: Simplified factory with ~40 components for faster simulation
- --spec: Path to spec file (e.g., specs/minimal.json)
- --profile: Configuration profile from spec (e.g., high_throughput)
- --max-hours: Maximum simulation hours (default: 10000)
- --output: Output file for results (default: factory_simulation_log.json)
- --modular: Use ModularFactory with custom subsystems from spec (NEW)
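These options also have a programmatic counterpart via the factory_builder helper described later in this README; a minimal sketch (the spec path and profile name are just examples):

```python
from factory_builder import create_factory_from_spec

# Roughly: --spec specs/minimal.json --profile high_throughput --max-hours 100 --modular
factory = create_factory_from_spec("specs/minimal.json", "high_throughput")
result = factory.run_simulation(max_hours=100)
```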
See SPEC_FORMAT.md for complete documentation. Basic structure:
metadata:
  name: "My Custom Factory"
  version: "1.0.0"

resources:
  STEEL:
    density: 7.8
    storage_temp: 25
    contamination_sensitivity: 0.1

recipes:
  - output: STEEL
    output_quantity: 500
    inputs:
      IRON_ORE: 600
    energy_kwh: 150
    time_hours: 1.5
    required_module: refining

modules:
  refining:
    max_throughput: 5.0
    power_consumption_idle: 10.0
    power_consumption_active: 100.0

# NEW: Define custom subsystem implementations
subsystem_implementations:
  transport: "genetic_routing"
  energy: "smart_grid"
  quality: "spc_quality"

# Configure subsystems with specific parameters
subsystem_data:
  transport:
    population_size: 100
    mutation_rate: 0.15
  energy:
    grid_connection: true
    battery_strategy: "economic"

profiles:
  fast_simulation:
    processing_speed_multiplier: 2.0
    enable_degradation: false

The simulation features a fully modular, event-driven architecture where subsystems can be easily swapped, parallelized, and extended without modifying core code.
- ISubsystem: Abstract interface all subsystems must implement
- EventBus: Publish/subscribe communication between subsystems
- SubsystemOrchestrator: Manages and coordinates subsystem execution
- ConfigManager: Hierarchical configuration management
- SubsystemContainer: Dependency injection container
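A minimal sketch of how these pieces fit together, using only calls shown elsewhere in this README (the ModularFactory adapter exposes both the orchestrator and the event bus; MockSubsystem is the built-in test double):

```python
from modular_framework import EventType, MockSubsystem
from modular_factory_adapter import ModularFactory

factory = ModularFactory()

# Register a subsystem with the orchestrator.
factory.orchestrator.register_subsystem("monitor", MockSubsystem("monitor"))

# Listen for events on the shared bus.
factory.event_bus.subscribe(EventType.TASK_STARTED, lambda event: print(event.data))

result = factory.run_simulation(max_hours=10)
```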
- GeneticRoutingTransport: Uses genetic algorithms for route optimization
- SwarmTransportSystem: Swarm intelligence for coordination
- StatisticalProcessControl: SPC for quality monitoring
- PredictiveMaintenanceSystem: Predictive maintenance with degradation modeling
- SmartGridEnergySystem: Grid integration with demand response
- RenewableEnergyOptimizer: Multi-source renewable optimization
- DigitalTwinSubsystem: Predictive simulation and optimization
from modular_framework import SubsystemBase, EventType

class MyCustomSubsystem(SubsystemBase):
    def __init__(self, name: str = "my_subsystem"):
        super().__init__(name)

    def initialize(self, config, event_bus):
        super().initialize(config, event_bus)
        # Subscribe to events
        event_bus.subscribe(EventType.TASK_STARTED, self.handle_event)

    def update(self, delta_time, context):
        # Your logic here
        result = self.process_something(context)
        # Publish events
        self.publish_event(EventType.RESOURCE_PRODUCED, {
            "resource": "steel",
            "quantity": 100
        })
        return {"processed": result}

The system includes predefined profiles in configs/modular_base.json:
- high_throughput: 2x AGVs, 5x solar capacity, larger storage
- energy_efficient: Optimized cooling, reduced transport speed
- high_quality: Class 100 cleanroom, enhanced quality control
- experimental: Genetic routing, smart grid, advanced features
from modular_framework import ConfigManager
config_manager = ConfigManager()
config_manager.load_from_file("configs/modular_base.json")
config_manager.apply_profile("high_throughput")Subsystems communicate via events, not direct method calls:
# Publishing events
self.publish_event(EventType.RESOURCE_PRODUCED, {
    "resource": "steel",
    "quantity": 100,
    "quality": 0.95
})

# Subscribing to events
def initialize(self, config, event_bus):
    event_bus.subscribe(EventType.RESOURCE_PRODUCED, self.on_resource_produced)

def on_resource_produced(self, event):
    self.inventory[event.data["resource"]] += event.data["quantity"]

# Sequential (default)
factory.set_update_strategy(UpdateStrategy.SEQUENTIAL)
# Parallel execution
factory.set_update_strategy(UpdateStrategy.PARALLEL)
# Priority-based
factory.set_update_strategy(UpdateStrategy.PRIORITY)

- Mining Module - Raw material extraction
- Refining Module - Material processing
- Chemical Module - Reactions, distillation, catalysis
- Electronics Module - PCB and component assembly
- Mechanical Module - Gears, bearings, springs
- CNC Module - Precision machining (±5 μm tolerance)
- Laser Module - Cutting and welding
- Cleanroom Module - Class 10-100,000 environments
- Assembly Module - Complex system integration
- Software Module - Firmware and AI development
- Transport Module - AGV fleet and conveyors
- Recycling Module - Waste processing (60-95% recovery)
- Testing Module - Quality assurance and metrology
- Thermal Module - Cooling and heat management
- Power Module - Solar and battery systems
- Control Module - SCADA and automation
- Chemical Products: Acids, solvents, polymers, electrolytes
- Precision Parts: Ball screws, linear guides, precision bearings
- Electronics: 40+ types from transistors to microprocessors
- Software: PLC programs, robot firmware, AI models
- Sensors: 15+ types including LIDAR, thermocouples, load cells
- Testing Equipment: CMM probes, oscilloscopes, tensile testers
- Transport System: AGV routing with battery management
- Waste Management: Material recovery and recycling streams
- Software Production: Version control and bug tracking
- Contamination Control: Particle counting and yield impact
- Thermal Management: COP calculations and cooling requirements
- Quality Control: Statistical variation and defect rates
CONFIG = {
    # Energy
    "initial_solar_capacity_kw": 100,
    "solar_panel_efficiency": 0.22,
    "battery_efficiency": 0.95,
    # Physical constraints
    "factory_area_m2": 20000,
    "max_storage_volume_m3": 15000,
    # All realism features enabled by default
    "enable_thermal_management": True,
    "enable_contamination": True,
    "enable_software_production": True,
    # ... and more
}

{
  "subsystems": {
    "transport": {
      "agv_fleet_size": 10,
      "routing_algorithm": "dijkstra"
    },
    "energy": {
      "solar_capacity_kw": 100,
      "grid_connection": false
    }
  },
  "profiles": {
    "high_throughput": { ... },
    "energy_efficient": { ... }
  }
}

The codebase includes a comprehensive pytest-based testing suite with 79 tests covering core functionality:
# Run all tests
pytest tests/ -v
# Run with coverage report
pytest tests/ --cov=. --cov-report=html
# Run specific test file
pytest tests/unit/test_exceptions.py -v
# Run integration tests
pytest tests/integration/ -v
# Open coverage report
open htmlcov/index.html

Test Structure:
- tests/unit/ - Unit tests for individual components (exceptions, spec loader, modular framework)
- tests/integration/ - End-to-end workflow tests
- tests/conftest.py - Shared fixtures (EventBus, FactorySpec, temp directories)
- Current Status: 79 tests, 71% pass rate, 20% baseline code coverage
Available Test Fixtures:
# Use in any test via conftest.py
def test_example(event_bus, minimal_factory_spec, temp_spec_dir):
    # event_bus - Fresh EventBus instance with bounded history
    # minimal_factory_spec - Complete minimal factory configuration
    # temp_spec_dir - Temporary directory for test spec files
    assert event_bus.max_history == 100

Custom exception hierarchy provides clear, actionable error messages:
from exceptions import (
    FactorySimulationError,      # Base exception
    SpecValidationError,         # Spec validation failures
    ResourceNotFoundError,       # Missing resources
    SubsystemNotFoundError,      # Missing subsystems
    InsufficientResourcesError,  # Resource shortages
    ModuleNotAvailableError,     # Module unavailable
    CircularDependencyError,     # Dependency cycles
    # ... 25+ specialized exceptions
)
from factory_builder import create_factory_from_spec

# Example usage with context
try:
    factory = create_factory_from_spec("specs/my_factory.json")
except SpecValidationError as e:
    print(f"Validation failed: {e}")
    print("Specific errors:")
    for error in e.errors:
        print(f"  - {error}")
except ResourceNotFoundError as e:
    print(f"Resource '{e.resource_name}' not found in {e.context}")

Exception Categories:
- SpecError - Spec loading, parsing, validation, inheritance
- ResourceError - Resource availability, storage, allocation
- SubsystemError - Subsystem registration, initialization, configuration
- ModuleError - Module availability, capacity, failures
- TaskError - Task blocking, dependencies, execution
- EventError - Event publishing, queue overflow
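Since the specialized exceptions share the FactorySimulationError base, a coarse top-level handler can catch any simulation failure in one place; a minimal sketch:

```python
from exceptions import FactorySimulationError
from factory_builder import create_factory_from_spec

try:
    factory = create_factory_from_spec("specs/my_factory.json")
    factory.run_simulation(max_hours=1000)
except FactorySimulationError as exc:
    # Any spec, resource, subsystem, module, task, or event error lands here.
    print(f"Simulation failed: {exc}")
    raise
```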
Runtime configuration validation using Pydantic catches errors before simulation starts:
from config_validation import (
    FactoryConfig,
    EnergyConfig,
    ProcessingConfig,
    FeatureToggles,
    validate_config,
)
from pydantic import ValidationError

# Option 1: Use Pydantic models directly
config = FactoryConfig(
    energy=EnergyConfig(
        initial_solar_capacity_kw=200,
        battery_capacity_kwh=1000
    ),
    processing=ProcessingConfig(
        parallel_processing_limit=20
    ),
    features=FeatureToggles(
        enable_degradation=True,
        enable_quality_control=True
    )
)

# Option 2: Validate existing dictionary
try:
    validated_config = validate_config({
        "initial_solar_capacity_kw": 200,
        "battery_capacity_kwh": 1000,
        "parallel_processing_limit": 20
    })
except ValidationError as e:
    print("Configuration invalid:")
    for error in e.errors():
        print(f"  {error['loc']}: {error['msg']}")
    raise

# Convert back to flat dictionary for legacy code
config_dict = config.to_dict()

Validation Features:
- Field-level validation (range checks, type enforcement, bounds)
- Cross-field validation (consistency between related fields)
- Auto-documentation (every field has description and constraints)
- Clear error messages with specific field locations
Configuration Models:
- EnergyConfig - Solar capacity, battery, efficiency, weather
- ProcessingConfig - Parallelization, batch sizes, throughput limits
- FeatureToggles - Enable/disable simulation features
- PhysicalConstraints - Factory area, storage volumes, cooling capacity
- QualityConfig - Defect rates, tolerances, cleanroom standards
- TransportConfig - AGV fleet size, routing algorithms, battery management
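As an illustration of the field-level checks, an obviously invalid value is rejected before any simulation work starts (the exact bounds live in the Pydantic models, so treat the negative capacity below simply as an assumed out-of-range example):

```python
from pydantic import ValidationError
from config_validation import EnergyConfig

try:
    EnergyConfig(initial_solar_capacity_kw=-50)  # negative capacity assumed to fail validation
except ValidationError as e:
    for error in e.errors():
        print(f"{error['loc']}: {error['msg']}")
```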
Built-in profiling, caching, and debugging tools in performance_utils.py:
from performance_utils import resource_cache, cached_resource_calculation

# Decorator for automatic caching (10-100x speedup)
@cached_resource_calculation
def calculate_requirements(resource, quantity):
    # Expensive recursive calculation
    return requirements_dict

# Check cache performance
stats = resource_cache.get_stats()
print(f"Cache hit rate: {stats['hit_rate']:.2%}")
print(f"Size: {stats['size']}/{stats['max_size']}")

# Clear if needed
resource_cache.clear()

from performance_utils import profiler
# Enable profiling
profiler.enable()
# Decorate functions
@profiler.profile("expensive_calculation")
def calculate_something():
    # Work here
    pass

# Run simulation
factory.run_simulation(max_hours=1000)

# Print results
profiler.print_stats(top_n=20)

# Output:
# Function                Calls  Total(s)  Avg(ms)  Min(ms)  Max(ms)
# calculate_requirements  1000   5.234     5.234    2.1      15.8
# update_subsystems       500    3.123     6.246    4.2      12.1

from performance_utils import DebugMode
# Enable debug mode (zero overhead when disabled)
DebugMode.enable(strict=False) # Log warnings only
DebugMode.enable(strict=True) # Raise exceptions
# Domain-specific assertions
DebugMode.assert_positive(quantity, "quantity")
DebugMode.assert_range(efficiency, 0, 1, "efficiency")
DebugMode.assert_energy_conservation(generated, consumed, stored)
DebugMode.assert_resource_balance(produced, consumed, tolerance=0.01)
DebugMode.check_invariant(total >= 0, "Resources cannot be negative")
# Disable for production (no performance impact)
DebugMode.disable()

Static type analysis with mypy:
# Check specific file
mypy spec_loader.py
# Check entire codebase
mypy .
# Check with strict mode
mypy --strict modular_framework.py

mypy Configuration (mypy.ini):
- Python 3.10+ type hints required
- Gradual strict mode adoption (exceptions.py fully strict)
- Per-module configuration for legacy code
- Clear error messages with line numbers
EventBus and modular framework are fully thread-safe for parallel execution:
Thread-Safe Operations:
- Event subscription/unsubscription during runtime
- Concurrent event publishing from multiple subsystems
- Event history access without race conditions
- Subsystem registration/unregistration
Implementation Pattern:
# Fine-grained locking prevents deadlocks
with self._subscribers_lock:
    # Copy handler list under lock
    handlers = list(self.subscribers[event_type])

# Call handlers WITHOUT lock (prevents deadlock)
for handler in handlers:
    handler(event)

Parallel Execution:
from modular_framework import UpdateStrategy
# Enable parallel subsystem updates
factory.set_update_strategy(UpdateStrategy.PARALLEL)
# Subsystems run concurrently in thread pool
result = factory.run_simulation(max_hours=1000)

Complete development cycle:
# 1. Set up environment
pip install -e ".[dev]" # Installs pytest, mypy, black, etc.
# 2. Make changes
# ... edit files ...
# 3. Run type checking
mypy .
# 4. Run tests with coverage
pytest tests/ -v --cov=.
# 5. Check coverage report
open htmlcov/index.html
# 6. Run simulation to verify
python3 self_replicating_factory_sim.py --spec specs/minimal.json --max-hours 100
# 7. Analyze results
python3 analyze_factory_sim.py

Pre-commit Checklist:
- Tests pass: pytest tests/ -v
- Type checking passes: mypy .
- Code coverage maintained or improved
- New exceptions added for error cases
- Configuration validation updated for new fields
- Documentation updated (docstrings, README)
Enable verbose logging:
import logging
logging.basicConfig(level=logging.DEBUG)

Track event flow:
# Get complete event history
events = factory.event_bus.get_history()
# Filter by type and source
resource_events = factory.event_bus.get_history(
event_type=EventType.RESOURCE_PRODUCED,
source="refining_module"
)
# Analyze event patterns
from collections import Counter
event_counts = Counter(e.type for e in events)
print(f"Most common: {event_counts.most_common(5)}")
# Check for dropped events (backpressure)
if factory.event_bus.dropped_events > 0:
    print(f"Warning: {factory.event_bus.dropped_events} events dropped")

Profile performance bottlenecks:
from performance_utils import profiler
profiler.enable()
factory.run_simulation(max_hours=100)
profiler.print_stats(top_n=20)  # Top 20 slowest functions

Validate configuration:
from config_validation import validate_config
from pydantic import ValidationError
try:
    config = validate_config(my_config_dict)
    print("Configuration valid")
except ValidationError as e:
    print("Invalid configuration:")
    for error in e.errors():
        loc = " -> ".join(str(x) for x in error['loc'])
        print(f"  {loc}: {error['msg']}")

Check cache efficiency:
from performance_utils import resource_cache
# Run simulation
factory.run_simulation(max_hours=1000)
# Check cache stats
stats = resource_cache.get_stats()
if stats['hit_rate'] < 0.5:
    print(f"Warning: Low cache hit rate ({stats['hit_rate']:.2%})")
    print(f"Consider increasing cache size (current: {stats['max_size']})")

- Replication Time: 800-1200 days
- Factory Mass: 200-300 tons
- Factory Footprint: 20,000 m²
- Energy Requirement: 5-10 MW continuous
- Material Efficiency: 75-85% with recycling
- Overall Yield: 70-80% after quality losses
- Module Count: 200+ specialized modules needed
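As a rough cross-check of these figures (back-of-envelope arithmetic on the midpoints of the ranges above, not simulation output):

```python
# Midpoints of the reported ranges (assumptions for a rough estimate only).
avg_power_mw = 7.5        # 5-10 MW continuous
replication_days = 1000   # 800-1200 days

total_energy_gwh = avg_power_mw * 24 * replication_days / 1000
print(f"Roughly {total_energy_gwh:.0f} GWh consumed over one replication cycle")  # ~180 GWh
```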
- Energy - Needs 3-5x more solar capacity than base
- Software - Development time for control systems
- Transport - AGV charging and routing delays
- Thermal - Cooling power requirements
- Quality - Rework and defect management
from modular_framework import MockSubsystem
# Create mock for testing
mock_transport = MockSubsystem("transport", update_return={"completed": 5})
factory.orchestrator.register_subsystem("transport", mock_transport)
# Run tests
result = factory.run_simulation(max_hours=10)
assert mock_transport.update_count > 0

# Get event history
events = factory.event_bus.get_history()
# Filter by type
transport_events = factory.event_bus.get_history(
event_type=EventType.TRANSPORT_COMPLETED
)
# Analyze patterns
from collections import Counter
event_types = Counter(e.type for e in events)
print(f"Most common: {event_types.most_common(5)}")# Profile all subsystems
for name, subsystem in factory.orchestrator.subsystems.items():
    factory.orchestrator.subsystems[name] = ProfiledSubsystem(subsystem)

# Run simulation
factory.run_simulation(max_hours=100)

# Get results
for name, subsystem in factory.orchestrator.subsystems.items():
    if isinstance(subsystem, ProfiledSubsystem):
        print(f"{name}: {subsystem.get_average_time():.4f}s per update")

# Save state
factory.save_state("simulation_state.json")
# Load state
factory = ModularFactory()
factory.load_state("simulation_state.json")
# Continue from saved point
factory.run_simulation(max_hours=factory.time + 100)

checkpoint_interval = 100  # hours
for i in range(10):
    factory.run_simulation(max_hours=checkpoint_interval * (i + 1))
    factory.save_state(f"checkpoint_{i}.json")

from modular_factory_adapter import ModularFactory
from modular_framework import UpdateStrategy  # needed for set_update_strategy() below
from custom_subsystems import *
# Create factory
factory = ModularFactory()
# Replace default subsystems
factory.orchestrator.unregister_subsystem("transport")
factory.add_custom_subsystem(
    "genetic_transport",
    GeneticRoutingTransport(),
    SubsystemConfig({
        "population_size": 100,
        "mutation_rate": 0.15
    })
)

# Add smart grid
factory.add_custom_subsystem(
    "smart_grid",
    SmartGridEnergySystem(),
    SubsystemConfig({
        "grid_connection": True,
        "battery_strategy": "economic"
    })
)
# Enable parallel execution
factory.set_update_strategy(UpdateStrategy.PARALLEL)
# Run simulation
result = factory.run_simulation(max_hours=1000)

from self_replicating_factory_sim import Factory, CONFIG
results = []
for solar_capacity in [100, 200, 500, 1000]:
    config = CONFIG.copy()
    config["initial_solar_capacity_kw"] = solar_capacity
    factory = Factory(config)
    result = factory.run_simulation(max_hours=10000)
    results.append({
        "solar": solar_capacity,
        "time": result["final_status"]["time"]
    })
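The collected results can then be plotted with matplotlib (the optional visualization dependency); a small sketch, assuming the recorded time is in simulation hours:

```python
import matplotlib.pyplot as plt

solar = [r["solar"] for r in results]
hours = [r["time"] for r in results]

plt.plot(solar, hours, marker="o")
plt.xlabel("Initial solar capacity (kW)")
plt.ylabel("Replication time (hours)")
plt.savefig("solar_sweep.png")
```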
from modular_factory_adapter import ModularFactoryBridge

# Create bridge for gradual migration
bridge = ModularFactoryBridge()
hybrid_factory = bridge.create_hybrid_factory()
# Run using existing interface
result = hybrid_factory.run_simulation()
Complete factory system architecture showing material flow hierarchy and module production network
Detailed production dependency graph showing all 250+ components and their relationships
12-panel dashboard showing key metrics, bottlenecks, and system performance
- factory_simulation_log.json - Complete simulation data
- factory_simulation_analysis_ultra.png - 12-panel metrics dashboard
- factory_system_diagram.png - System architecture visualization
- factory_production_graph.png - Production dependency graph
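The JSON log can also be loaded directly for custom analysis; a minimal sketch (only the file name comes from this README, so inspect the structure before relying on specific keys):

```python
import json

with open("factory_simulation_log.json") as f:
    log = json.load(f)

# Peek at the top-level structure before digging deeper.
if isinstance(log, dict):
    print("Top-level keys:", list(log.keys()))
else:
    print("Entries:", len(log))
```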
- Tasks/Day: Production rate (target: >10)
- Module Efficiency: Equipment degradation (keep >80%)
- Storage Utilization: Material flow bottlenecks
- Transport Jobs: Logistics congestion
- Software Bugs: Control system reliability
- Thermal Load: Cooling capacity limits
- Software is Critical: Control systems often gate production
- Transport Adds 15-20%: Material movement is non-trivial
- Thermal Limits Scale: Cooling becomes dominant at scale
- Quality Cascades: Small defect rates compound dramatically
- Maintenance Windows Matter: Scheduled downtime impacts throughput
This simulation provides a testbed for:
- Manufacturing system optimization
- Autonomous production planning
- Resource allocation strategies
- Quality control methodologies
- Energy system design
- Supply chain resilience
- Circular economy modeling
- Digital twin development
# Install development dependencies
pip install -e ".[dev]"
# Run full test suite with coverage
pytest --cov=. --cov-report=term-missing
# Run specific test files
pytest tests/unit/test_spec_loader.py -v
pytest tests/integration/ -v
# Run with markers
pytest -m "not slow" # Skip slow tests
pytest -m integration  # Integration tests only

# Format code
black .
# Lint code
ruff check .
# Type checking
mypy .
# Security scanning
bandit -r . -ll

# Build package
python -m build
# Check package
twine check dist/*
# Install locally
pip install -e .

The project includes automated GitHub Actions workflows:
- Testing: Multi-Python version (3.10-3.12) test matrix
- Linting: Black formatting and Ruff linting checks
- Type Checking: MyPy static analysis
- Security: Bandit security scanning, Safety vulnerability checks
- Build: Package build verification
View workflows in .github/workflows/ directory.
See IMPROVEMENTS.md for detailed documentation of all improvements.
- Production-Grade Logging: Replaced all print() calls with structured logging
- Event Bus Backpressure: Queue limits, metrics, dropped event tracking
- Code Deduplication: ResourceEnumMixin eliminates 60+ duplicate lines
- Input Validation: File size limits, circular inheritance detection
- Modern Packaging: PEP 621 pyproject.toml with optional dependencies
- CI/CD Pipeline: Automated testing, linting, security scanning
- Integration Tests: 200+ lines of end-to-end workflow tests
- Type Hints: Comprehensive type annotations across core modules
All improvements maintain 100% backward compatibility. Existing code continues to work without modifications.
- Dynamic Subsystems: Completed - subsystems now configurable via spec files
- Async/Await Support: Async subsystem updates for better parallelization
- Database Backend: SQLite persistence for large simulations
- Checkpointing: Save/resume simulation state
- API Documentation: Sphinx-generated API docs
- Performance Profiling: Identify and optimize bottlenecks
- Plugin System: Entry point-based subsystem discovery
- Spatial Layout: 2D/3D factory floor optimization
- Economic Modeling: Cost optimization and market dynamics
- Learning Curves: Efficiency improvements over time
- Fault Tolerance: Redundancy and backup systems
- Network Effects: Module-to-module direct connections
- Multi-Factory: Distributed production networks
- Spec Optimization: Automated spec generation and optimization tools
[Your license here]
[Contributing guidelines]
[Contact information]
"The factory must grow" - Every self-replicating system, probably