A Unity AR Foundation application demonstrating real-time plane detection and interactive 3D object placement in augmented reality environments.
Demo video: `Recording_2025-06-24-07-46-55.mp4`
DepthForge showcases core AR development concepts using Unity's AR Foundation framework. The application scans real-world environments to detect horizontal and vertical surfaces, then allows users to place and interact with 3D objects through touch input.
- Real-time Plane Detection: Automatically detects horizontal and vertical surfaces using SLAM technology
- Interactive Object Placement: Touch-to-place system for positioning 3D objects on detected surfaces
- Multi-object Support: Cycle through different 3D models for placement
- Cross-platform Compatibility: Supports both iOS (ARKit) and Android (ARCore) devices
- Visual Surface Feedback: Real-time visualization of detected planes
- Placement Validation: Ensures objects are only placed on valid surfaces
- Unity 2021.3 LTS or newer
- AR Foundation 4.2+
- ARCore XR Plugin (Android)
- ARKit XR Plugin (iOS)
- Android: ARCore-compatible device running Android 7.0+
- iOS: ARKit-compatible device running iOS 11.0+
- Device with rear-facing camera
- Sufficient processing power for real-time AR tracking
1. Clone the repository

   ```bash
   git clone https://github.com/prakash-aryan/DepthForge.git
   cd DepthForge
   ```

2. Open in Unity
   - Launch Unity Hub
   - Click "Open" and select the project folder
   - Ensure Unity 2021.3 LTS or newer is installed

3. Configure the platform
   - Go to File > Build Settings
   - Select the target platform (Android/iOS)
   - Click "Switch Platform"

4. Install the required packages
   - Open Window > Package Manager
   - Install AR Foundation
   - Install the ARCore XR Plugin (Android) or the ARKit XR Plugin (iOS)
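Alternatively, the packages can be declared directly in `Packages/manifest.json`. The version numbers below are illustrative (AR Foundation 4.2.x matches the stated requirement); use the versions the Package Manager offers for your Unity release:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "4.2.7",
    "com.unity.xr.arcore": "4.2.7",
    "com.unity.xr.arkit": "4.2.7"
  }
}
```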
```
Assets/
├── Scripts/
│   └── ARObjectPlacer.cs        # Main AR interaction logic
├── Prefabs/
│   ├── AR Session Origin/       # AR camera and session setup
│   ├── Object Prefabs/          # 3D models for placement
│   └── Plane Visualization/     # Surface detection feedback
├── Materials/
│   └── Plane Materials/         # Visual materials for detected surfaces
├── Models/
│   └── 3D Objects/              # 3D models and textures
└── Scenes/
    └── SampleScene.unity        # Main AR scene
```
`ARObjectPlacer.cs` is the main interaction component, handling:
- Input detection (touch/mouse)
- Surface validation through raycasting
- Object instantiation and management
- Placement timing and cooldown logic
Key Methods:
- `HandleInput()`: Processes user touch input
- `TryGetPlacementPose()`: Converts screen coordinates to 3D world positions
- `PlaceObject()`: Instantiates objects at calculated positions
- `CanPlaceObject()`: Validates placement conditions
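As a rough sketch of how these methods fit together (using AR Foundation's `ARRaycastManager`; the names mirror the methods listed above, but the project's actual `ARObjectPlacer.cs` may differ in detail):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Simplified sketch of the touch-to-place flow, not the project's exact code.
public class ARObjectPlacer : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject[] objectPrefabs;       // cycled on each placement
    [SerializeField] private float placementCooldown = 0.5f;   // matches the default above

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    private float lastPlacementTime;
    private int prefabIndex;

    private void Update() => HandleInput();

    private void HandleInput()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began || !CanPlaceObject()) return;

        if (TryGetPlacementPose(touch.position, out Pose pose))
            PlaceObject(pose);
    }

    private bool TryGetPlacementPose(Vector2 screenPoint, out Pose pose)
    {
        // Raycast against detected plane polygons only, not feature points,
        // so objects can never land off a valid surface.
        if (raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
        {
            pose = hits[0].pose;
            return true;
        }
        pose = default;
        return false;
    }

    private bool CanPlaceObject() =>
        Time.time - lastPlacementTime >= placementCooldown;

    private void PlaceObject(Pose pose)
    {
        Instantiate(objectPrefabs[prefabIndex], pose.position, pose.rotation);
        prefabIndex = (prefabIndex + 1) % objectPrefabs.Length;  // cycle through models
        lastPlacementTime = Time.time;
    }
}
```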
- AR Session: Manages AR system lifecycle
- AR Session Origin: Contains AR Camera and coordinate system
- AR Plane Manager: Detects and tracks surface planes
- AR Raycast Manager: Handles touch-to-world coordinate conversion
- Launch Application: Deploy to AR-compatible device
- Environment Scanning: Move device to scan surroundings
- Surface Detection: Wait for plane indicators to appear
- Object Placement: Tap detected surfaces to place objects
- Multiple Objects: Continue tapping to place additional objects
Android:
- Configure Player Settings:
  - Set the minimum API level to Android 7.0 (API 24)
  - Configure the package name and version
  - Enable ARCore in XR settings
- Build and deploy to the device

iOS:
- Configure Player Settings:
  - Set the deployment target to iOS 11.0+
  - Configure the bundle identifier
  - Enable ARKit in XR settings
- Build the Xcode project and deploy via Xcode
- Placement Cooldown: Minimum time between object placements (default: 0.5s)
- Object Cycling: Automatically cycle through available prefabs
- Surface Types: Configure detection for horizontal/vertical planes
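The surface-type option maps onto AR Foundation's `ARPlaneManager`; a minimal, hypothetical helper (not part of the project) could apply it at runtime like this:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical helper: restricts which plane orientations are detected.
public class PlaneDetectionConfig : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    private void Start()
    {
        // Horizontal surfaces only; combine flags
        // (PlaneDetectionMode.Horizontal | PlaneDetectionMode.Vertical) for both.
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
    }
}
```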
- Limit concurrent plane detection
- Optimize 3D model polycount
- Configure appropriate texture resolutions
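One common way to limit plane-detection cost, sketched here as an illustrative helper (not taken from the project), is to stop detection and hide existing plane visuals once placement is finished:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: disable plane detection and hide plane visuals
// to reduce per-frame tracking work on mobile devices.
public class PlaneDetectionThrottle : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    public void StopDetection()
    {
        planeManager.enabled = false;              // stop searching for new planes
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(false);     // hide already-detected planes
    }
}
```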
Common Issues:
- No plane detection: Ensure adequate lighting and textured surfaces
- Poor tracking: Move device slowly and avoid reflective surfaces
- Performance issues: Reduce 3D model complexity or texture sizes
- Build errors: Verify all required packages are installed
Debug Features:
- Console logging for placement events
- Visual plane indicators for detected surfaces
- Error messages for missing components
- Fork the repository
- Create a feature branch (`git checkout -b feature/new-feature`)
- Commit your changes (`git commit -m 'Add new feature'`)
- Push to the branch (`git push origin feature/new-feature`)
- Open a Pull Request
- Uses Unity's component-based architecture
- Implements event-driven plane detection
- Supports both marker-based and markerless tracking
- Optimized for mobile AR performance constraints
This project is licensed under the MIT License - see the LICENSE file for details.
- Unity Technologies for AR Foundation framework
- Google for ARCore platform support
- Apple for ARKit platform support