A modular, high-performance particles framework for visionOS (RealityKit/RealityView) and SwiftUI 2D overlays. Built around four core parts—Detection → Trigger → Emissor → Controller—with a mergeable Design System of presets.
- Make immersive particles simple: a single SwiftUI entry point with a fluent builder, sensible defaults, and great performance out-of-the-box.
- Bridge ML signals to visuals: map SoundAnalysis and Vision/CoreML outputs to particle effects via declarative triggers.
- Be flexible and extensible: plug in external/custom detectors (async closures/streams), add custom particle runtimes, and merge user catalogs with the built-in Design System.
- Ship delightful demos: provide showcase apps for sound-driven rain/sparkles and multimedia (brightness→sparkles, dusty→dust).
```
+--------------------+     +--------------------+     +--------------------+     +--------------------+
|   1) Detection     | --> |     2) Trigger     | --> |     3) Emissor     | --> |   4) Controller    |
|  (Sound/Vision/    |     | (Rules/Conditions) |     | (Particle Builder) |     |  (Orchestration)   |
|   External Fn)     |     |                    |     |     + Animator     |     |    + Budgeting     |
+--------------------+     +--------------------+     +--------------------+     +--------------------+
          |                          |                          |                          |
    Async Events         Evaluated Conditions       Spawn/Update/Recycle       Life-cycle, Scopes
          v                          v                          v                          v
      EventBus ─────────────────────────────────────────────────────────────────> Immersive State
```
- Detection: adapters for SoundAnalysis, Vision/CoreML, and custom async sources. Emit normalized `DetectionEvent`s (see the sketch after this list).
- Trigger: `TriggerPredicate`s with confidence thresholds, combinators (AND/OR/NOT), and per-rule rate-limit/debounce.
- Emissor: builder/DSL creates `ParticleConfig` and delegates to `ParticleRuntime` implementations (e.g., Rain/Dust/Sparkle/Glow), all updated by a single Animator and recycled through pools.
- Controller: central orchestration (scopes, budget/priorities, routing Detection→Trigger→Actions, lifecycle). SwiftUI modifiers configure it.
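The event and bus types ship with the framework's Core; the shapes below are assumptions for illustration, not the shipped API. A minimal sketch of how normalized events might flow from detection adapters to the rest of the pipeline:

```swift
import Foundation

// Illustrative shapes only; the shipped DetectionEvent/EventBus may differ.
struct DetectionEvent: Sendable {
    let sourceID: String     // e.g., "sound", "vision", "externalWeather"
    let label: String        // normalized label such as "rain" or "bright"
    let confidence: Double   // 0...1
    let timestamp: Date
}

// A minimal async bus: detection adapters publish, the controller subscribes.
actor EventBus {
    private var subscribers: [UUID: AsyncStream<DetectionEvent>.Continuation] = [:]

    func publish(_ event: DetectionEvent) {
        for continuation in subscribers.values { continuation.yield(event) }
    }

    func subscribe() -> AsyncStream<DetectionEvent> {
        let (stream, continuation) = AsyncStream.makeStream(of: DetectionEvent.self)
        subscribers[UUID()] = continuation
        return stream
    }
}
```

Using an actor keeps subscription state race-free under Swift 6 strict concurrency.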
- ARParticlesKit (library): the framework sources.
- ARParticlesShowcase (demos): sound & multimedia showcases, plus controls for scopes/intensity/accessibility.
- ARParticlesKitTests (tests): unit & smoke tests for core, detection, trigger, emissor, and rendering.
Build platforms: visionOS for RealityView; iOS 17+ for 2D overlays (SwiftUI Canvas).
```
ARParticlesKit/
├─ Package.swift
├─ Sources/
│ ├─ ARParticlesKit/
│ │ ├─ Core/
│ │ │ ├─ EventBus.swift
│ │ │ ├─ ParticlesController.swift
│ │ │ ├─ ParticleBudget.swift
│ │ │ ├─ AccessibilityPolicy.swift
│ │ │ └─ (Routing, Scopes, AdaptivePolicy, Backpressure)
│ │ ├─ Detection/
│ │ │ ├─ DetectionSource.swift
│ │ │ ├─ SoundDetection.swift
│ │ │ ├─ VisionDetection.swift
│ │ │ └─ CustomDetection.swift
│ │ ├─ Trigger/
│ │ │ ├─ Trigger.swift
│ │ │ ├─ RuleEngine.swift
│ │ │ └─ TriggerCombinators.swift
│ │ ├─ Emissor/
│ │ │ ├─ Particle.swift
│ │ │ ├─ ParticleBuilder.swift
│ │ │ ├─ ParticleSystems/
│ │ │ │ ├─ RainParticle.swift
│ │ │ │ ├─ DustParticle.swift
│ │ │ │ ├─ SparkleParticle.swift
│ │ │ │ └─ GlowParticle.swift
│ │ │ ├─ Animator.swift
│ │ │ └─ Pooling.swift
│ │ ├─ Render/
│ │ │ ├─ RealityViewAdaptor.swift
│ │ │ └─ Particle2DView.swift
│ │ ├─ DesignSystem/
│ │ │ ├─ ParticleCatalog.swift
│ │ │ ├─ Categories.swift
│ │ │ └─ JSONLoader.swift
│ │ └─ PublicAPI/
│ │ ├─ ParticleImmersiveView.swift
│ │ └─ Modifiers.swift
│ └─ ARParticlesShowcase/
│ ├─ SoundDemo/
│ ├─ MultimediaDemo/
│ └─ Common/Controls.swift
└─ Tests/
└─ ARParticlesKitTests/
├─ CoreTests.swift
├─ DetectionTests.swift
├─ TriggerTests.swift
├─ EmissorTests.swift
└─ RenderTests.swift
```
Add this package URL in Xcode (File → Add Packages…) and select ARParticlesKit.
```swift
import SwiftUI
import ARParticlesKit

struct DemoImmersiveView: View {
    var body: some View {
        ParticleImmersiveView()
            .scope([.sound, .multimedia])
            .designSystem(.default) // can be merged later with custom
            .particle("Rain") { b in
                b.style(.rain)
                    .spawnRate(120)
                    .lifetime(2.0...3.0)
                    .opacity(0.6)
                    .trigger(.sound(.class("rain"), minConfidence: 0.8))
            }
            .onAppear { ARParticles.current.controller.start() }
            .onDisappear { ARParticles.current.controller.stop() }
    }
}
```
```swift
import SwiftUI
import ARParticlesKit

struct HUD: View {
    var body: some View {
        Particle2DView()
            .scope([.sound])
            .particle("Dust") { $0.style(.dust()).spawnRate(20).opacity(0.5) }
    }
}
```
`ARParticles.current` is a small convenience façade used in the demos to access the shared `ParticlesController`.
- Built-in: `SoundDetection` (SoundAnalysis), `VisionDetection` (brightness/tags).
- Custom: provide an async source that emits `DetectionEvent`s; registration is shown below, and a stream-based sketch follows it.
```swift
// Register an external/custom detector (e.g., weather)
ARParticles.current.detectors.register(
    source: .custom(id: "externalWeather") {
        // emit events when appropriate
        .label("rain", confidence: 0.9)
    }
)
```
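A custom source can also live as a standalone async stream and feed the same registration point. A sketch that polls a hypothetical weather feed, reusing the illustrative `DetectionEvent` shape from the architecture section (the interval and the hard-coded result are placeholders):

```swift
import Foundation

// Hypothetical wrapper: poll an external weather feed and emit labels.
// `DetectionEvent` is the illustrative shape sketched earlier; the real
// payload type ships with the framework.
func weatherEvents(pollEvery interval: Duration = .seconds(60)) -> AsyncStream<DetectionEvent> {
    AsyncStream { continuation in
        let task = Task {
            while !Task.isCancelled {
                // Replace with a real forecast request; randomized here.
                let rainLikely = Bool.random()
                if rainLikely {
                    continuation.yield(DetectionEvent(sourceID: "externalWeather",
                                                      label: "rain",
                                                      confidence: 0.9,
                                                      timestamp: .now))
                }
                try? await Task.sleep(for: interval)
            }
            continuation.finish()
        }
        continuation.onTermination = { _ in task.cancel() }
    }
}
```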
- Describe when to act: `.soundClass("rain", minConfidence: 0.8)`, `.imageTag("bright", minConfidence: 0.7)`, `.feature("loudness", >, 0.6)`, or `.custom(...)`.
- Combine with AND/OR/NOT and apply per-rule debounce/rate limits (see the predicate sketch after this list).
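One plausible shape for that predicate tree, again using the illustrative `DetectionEvent`; the shipped combinator API may be spelled differently:

```swift
// Hypothetical predicate tree; the shipped TriggerPredicate may differ.
indirect enum TriggerPredicate {
    case soundClass(String, minConfidence: Double)
    case imageTag(String, minConfidence: Double)
    case and(TriggerPredicate, TriggerPredicate)
    case or(TriggerPredicate, TriggerPredicate)
    case not(TriggerPredicate)

    func evaluate(_ event: DetectionEvent) -> Bool {
        switch self {
        case let .soundClass(label, min):
            return event.sourceID == "sound"
                && event.label == label && event.confidence >= min
        case let .imageTag(tag, min):
            return event.sourceID == "vision"
                && event.label == tag && event.confidence >= min
        case let .and(lhs, rhs): return lhs.evaluate(event) && rhs.evaluate(event)
        case let .or(lhs, rhs):  return lhs.evaluate(event) || rhs.evaluate(event)
        case let .not(inner):    return !inner.evaluate(event)
        }
    }
}

// Fire on confident rain audio or a dusty-looking frame.
let natureRule: TriggerPredicate = .or(
    .soundClass("rain", minConfidence: 0.8),
    .imageTag("dusty", minConfidence: 0.7)
)
```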
- Fluent builder to configure: `spawnRate`, `speed`, `lifetime`, `opacity`, `angle`, `depth`, `priority`, `categories`.
- Runtimes: `RainParticle`, `DustParticle`, `SparkleParticle`, `GlowParticle` (or create your own by conforming to `ParticleRuntime`; a conformance sketch follows this list).
- Animator: single tick actor; Pooling: avoids per-frame allocations.
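What a custom runtime might look like. The protocol requirements below are assumptions (the real `ParticleRuntime` ships with the framework); the point is the flat buffers that let one Animator tick update everything without per-frame allocation:

```swift
// Hypothetical protocol shape; the shipped ParticleRuntime may differ.
protocol ParticleRuntime {
    mutating func spawn(count: Int)
    mutating func update(deltaTime: Float)
}

// A falling-snow runtime backed by flat particle buffers.
struct SnowRuntime: ParticleRuntime {
    private var positions: [SIMD3<Float>] = []
    private var velocities: [SIMD3<Float>] = []

    mutating func spawn(count: Int) {
        positions.reserveCapacity(positions.count + count)
        velocities.reserveCapacity(velocities.count + count)
        for _ in 0..<count {
            positions.append(SIMD3(Float.random(in: -1...1), 2, Float.random(in: -1...1)))
            velocities.append(SIMD3(0, -Float.random(in: 0.2...0.5), 0))
        }
    }

    mutating func update(deltaTime: Float) {
        for i in positions.indices {
            positions[i] += velocities[i] * deltaTime
            if positions[i].y < 0 { positions[i].y = 2 }  // recycle at the floor
        }
    }
}
```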
- The orchestrator: manages scopes, budgets/priorities, routing, lifecycle (`start`/`stop`), and manual control.
- SwiftUI modifiers configure behavior: `.scope()`, `.designSystem()`, `.budget()`, `.accessibility()`.
- `ParticleCatalog.default` ships presets with documented defaults.
- Merge with user catalogs: `.default + .myCustomSet` (a merge sketch follows this list).
- Optional JSON/asset loader to import presets at runtime.
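A sketch of how the merge could work, with a placeholder `ParticleConfig` standing in for the real preset payload; the shipped `ParticleCatalog` may differ:

```swift
// Placeholder payload for illustration only.
struct ParticleConfig {
    var spawnRate: Double
    var opacity: Double
}

struct ParticleCatalog {
    var presets: [String: ParticleConfig]

    static let `default` = ParticleCatalog(presets: [
        "Rain": ParticleConfig(spawnRate: 120, opacity: 0.6),
        "Dust": ParticleConfig(spawnRate: 20, opacity: 0.5),
    ])

    // Right-hand side wins on name collisions, so user presets override built-ins.
    static func + (lhs: ParticleCatalog, rhs: ParticleCatalog) -> ParticleCatalog {
        var merged = lhs
        merged.presets.merge(rhs.presets) { _, user in user }
        return merged
    }
}

// Usage: let catalog = ParticleCatalog.default + myCustomSet
```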
Activate only the inputs you want the app to listen to (e.g., sound-only environments):
```swift
ParticleImmersiveView().scope([.sound]) // image/video triggers won’t fire
```
Keep performance stable with global/per-category caps; preempt low-priority effects first.
```swift
.budget(.init(policy: .adaptive(cpuTarget: 0.6),
              perCategoryCaps: [.natural: 400, .magical: 200]))
```
- Auto-contrast adapts particle materials against scene luminance.
- Reduce Motion scales spawn rate/speed for sensitive users; an app-side sketch follows the snippet below.
```swift
.accessibility(.autoContrast(minRatio: 4.5))
```
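Beyond the built-in policy, apps can read the same signal through SwiftUI's standard Reduce Motion environment value and tune presets themselves; a sketch built on the HUD API from the quickstart:

```swift
import SwiftUI
import ARParticlesKit

struct CalmHUD: View {
    // Standard SwiftUI environment value; true when Reduce Motion is enabled.
    @Environment(\.accessibilityReduceMotion) private var reduceMotion

    var body: some View {
        Particle2DView()
            .scope([.sound])
            .particle("Dust") { b in
                b.style(.dust())
                 .spawnRate(reduceMotion ? 5 : 20)  // gentler when Reduce Motion is on
                 .opacity(0.5)
            }
    }
}
```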
- RealityViewAdaptor (visionOS): bridges particle runtimes to RealityKit `Entity` trees; camera-relative emitters; depth bands.
- Particle2DView (SwiftUI): Canvas + TimelineView for HUDs and non-immersive overlays.
- Single Animator Tick: one scheduler updates all particles using delta time.
- Pooling & Batching: avoid entity churn; reuse memory and node graphs.
- Adaptive Policy: throttles spawn/lifetime to meet `cpuTarget`, with hysteresis to avoid oscillation (see the sketch after this list).
- Backpressure: per-rule debounce and global rate limiting to prevent action storms.
- Sound Immersive: `rain` → RainParticle; `ambient_music` → SparkleParticle.
- Multimedia: `bright` → SparkleParticle; `dusty` → DustParticle.
- Custom Detection: external weather feed emits a `rain` label.
- Controls: UI toggles for scopes, intensity, and reduce-motion.
- Core: RuleEngine, EventBus, Budget (high coverage; deterministic seeds).
- Detection: fixtures for labels/tags; confidence thresholds and sampling.
- Trigger: truth tables for AND/OR/NOT; debounce/rate-limit behavior.
- Emissor: deterministic sequences under seeded randomness (see the sketch after this list).
- Render: smoke tests for spawn/update; snapshot checks for 2D when feasible.
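Deterministic sequences usually come from swapping the system RNG for a seedable one in tests. A sketch of the idea; SplitMix64 is an illustrative choice, not necessarily what the suite uses:

```swift
import XCTest

// Seedable RNG for reproducible particle sequences
// (SystemRandomNumberGenerator cannot be seeded).
struct SplitMix64: RandomNumberGenerator {
    private var state: UInt64
    init(seed: UInt64) { state = seed }

    mutating func next() -> UInt64 {
        state &+= 0x9E3779B97F4A7C15
        var z = state
        z = (z ^ (z >> 30)) &* 0xBF58476D1CE4E5B9
        z = (z ^ (z >> 27)) &* 0x94D049BB133111EB
        return z ^ (z >> 31)
    }
}

final class SeededEmissionTests: XCTestCase {
    func testSameSeedSameSequence() {
        var a = SplitMix64(seed: 42)
        var b = SplitMix64(seed: 42)
        let lifetimesA = (0..<5).map { _ in Double.random(in: 2.0...3.0, using: &a) }
        let lifetimesB = (0..<5).map { _ in Double.random(in: 2.0...3.0, using: &b) }
        XCTAssertEqual(lifetimesA, lifetimesB)  // identical seeds, identical spawns
    }
}
```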
- M1 — MVP Engine + Sound Demo
  - EventBus, Events, RuleEngine, Builder, Animator, Pooling, Budget
  - RealityView adaptor, Rain preset, SoundDetection & permissions
  - Triggers + mapping, Controller lifecycle, SwiftUI entry points, Quickstart
- M2 — Multimedia + 2D + Adaptive Perf
  - VisionDetection + frame pipeline, Sparkle preset, Particle2DView
  - Batching + telemetry, perf harness, adaptive throttling
- M3 — Design System + Extensibility + Polish
  - Catalog presets & categories, Merge API, JSON loader
  - CustomDetection demo, Accessibility polish, CI & docs
Detailed, Notion-ready CSV backlogs are available per area (Core, Detection, Trigger, Render, Controller, DesignSystem, etc.).
- Fork and create a feature branch (`feature/area-shortname`).
- Add tests and docs for any new feature.
- Run CI locally (build + tests + coverage).
- Open a PR with a concise summary and screenshots/clips for visual changes.
Code style: Swift 6 concurrency-first; avoid per-frame allocations; favor value types for configs.
TBD. Include license text here (e.g., MIT/Apache-2.0).