InspectAR

iOS/iPadOS AR model viewer for placing, inspecting, and interacting with 3D models in augmented reality

ROLE: AR Developer
PERIOD: Dec 2023 — Present
PLATFORMS: iOS / iPadOS / visionOS
STATUS: ACTIVE
DOMAIN: Extended Reality
Unity · C# · AR Foundation · ARKit · XR Interaction Toolkit · Universal Render Pipeline · Input System · UI Toolkit
[Cover image: InspectAR]

Overview

InspectAR v2.0 is a complete ground-up rebuild of the ARchitect iOS/iPadOS model viewer, developed at Shenandoah University's Center for Innovative Learning. The app enables users to place an AR anchor on a detected surface, select from a library of 3D models, and interact with the placed model: rotating, scaling, inspecting individual parts, and viewing animated shader-based reveal effects.

The rebuild replaced the v1.0 prototype's ad-hoc architecture with a formal SessionStateMachine driving four states, attribute-based dependency injection via a custom service-registry package, and a sequential async pipeline for initialization. The entire UI layer uses Unity's UI Toolkit with a custom dark-themed StyleKit package, and the app runs in a single scene with all transitions managed by the state machine.

The project evolved through an experimental Command Pattern phase (July–November 2025) before being simplified to the current lean state-machine-plus-services architecture in early 2026, demonstrating iterative architectural refinement driven by real usage.

Non-Technical Summary

InspectAR is an iPad and iPhone app that lets you view detailed 3D models in augmented reality. When you open the app, you see a brief loading screen while it prepares your model library. Then you're guided through a simple flow: tap to start, point your camera at a flat surface like a table or desk, tap to place a virtual anchor point, and choose which model you'd like to view.

Once a model is placed, it appears with a smooth animated reveal effect, materializing from the bottom up as if being constructed in real time. You can rotate the model using a slider, shrink it to fit the anchor area, or expand it back to full size. For complex models, you can tap "View Parts" to see a list of individual components and isolate any single part for closer inspection.

The app was built for Shenandoah University to help students explore objects that are difficult to bring into a classroom, like architectural models of campus buildings, drones, and vehicles. Instead of looking at flat images or diagrams, students can walk around a life-size (or scaled) 3D model and examine it from every angle, right on their desk.

Highlights

  • Architected and built a ground-up rebuild of an iOS AR model viewer using a custom state machine, attribute-based dependency injection, and sequential pipeline initialization, replacing an ad-hoc prototype with a maintainable, extensible architecture
  • Designed and implemented a shader-based reveal/despawn animation system using a custom VerticalReveal shader with runtime material swapping, delivering polished visual transitions for 3D model placement in AR
  • Engineered a complete AR anchor placement pipeline with ARAnchorManager attachment, plane-aware raycasting, bounds-based alignment, and automatic model scaling to fit anchor dimensions
  • Built an interactive model part inspection system allowing users to isolate and view individual components of complex 3D models with per-part metadata and artist credits

Quick Highlights

  • Single-scene architecture with a POCO state machine managing all application flow
  • Custom attribute-based DI system: [AutoRegister] + [Inject], no manual wiring
  • Shader-based model reveal/despawn animations with runtime material swapping
  • Individual part inspection with per-component metadata and toggle visibility
  • 7 service interfaces with strict interface segregation

Technical Breakdown

The v2.0 architecture is organized around a central SessionController that implements ISessionController and acts as the facade through which all states access the application's 7 services. The entire app runs in a single Unity scene with no scene transitions.

State Machine: SessionStateMachine is a plain C# class (no MonoBehaviour) that holds a reference to ISessionController as its context. States are instantiated on each transition: IntroductionState, AnchorSelectState, ModelSelectState, and ModelViewState, each implementing ISessionState with Enter()/Exit() lifecycle methods. States subscribe to service events in Enter() and strictly unsubscribe in Exit(), preventing memory leaks and stale callbacks.
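The shape of this state machine can be sketched in a few lines of plain C#. Everything Unity-specific is stripped out: ISessionController is reduced to an empty marker interface here, and TransitionTo is an assumed method name standing in for the real transition API.

```csharp
// Minimal sketch of the POCO state machine described above.
public interface ISessionController { }

public interface ISessionState
{
    void Enter(ISessionController context); // subscribe to service events here
    void Exit();                            // unsubscribe here
}

public sealed class SessionStateMachine
{
    private readonly ISessionController _context;
    private ISessionState _current;

    public SessionStateMachine(ISessionController context) => _context = context;

    // Exit the old state first so its event subscriptions are torn down
    // before the new state subscribes in Enter().
    public void TransitionTo(ISessionState next)
    {
        _current?.Exit();
        _current = next;
        _current.Enter(_context);
    }
}
```

Because states are plain objects instantiated per transition, they carry no stale state between visits, which is what makes the strict subscribe-in-Enter / unsubscribe-in-Exit discipline enforceable.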

Dependency Injection: A custom attribute-based DI system provides [AutoRegister(ServiceLifetime.Scene, typeof(IFoo))] on MonoBehaviours to register them under their interface at scene load. Consumer fields marked with [Inject] are populated via reflection by ServiceInjector. Seven services are registered: SessionController, ModelPlacementHandler, UIHandler, ModelDataHandler, AnchorSelectHandler, ModelSelectUIHandler, and ModelViewUIHandler.
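A minimal sketch of the attribute pair and the reflection-based injector, assuming the registry exposes a resolve-by-type lookup (modeled here as a delegate); the real service-registry package's API will differ.

```csharp
using System;
using System.Reflection;

public enum ServiceLifetime { Scene, App } // lifetimes assumed for illustration

[AttributeUsage(AttributeTargets.Class)]
public sealed class AutoRegisterAttribute : Attribute
{
    public ServiceLifetime Lifetime { get; }
    public Type ServiceType { get; }
    public AutoRegisterAttribute(ServiceLifetime lifetime, Type serviceType)
    {
        Lifetime = lifetime;
        ServiceType = serviceType;
    }
}

[AttributeUsage(AttributeTargets.Field)]
public sealed class InjectAttribute : Attribute { }

public static class ServiceInjector
{
    // Populate every [Inject] field on the target from the registry lookup.
    public static void InjectInto(object target, Func<Type, object> resolve)
    {
        const BindingFlags flags =
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic;
        foreach (FieldInfo field in target.GetType().GetFields(flags))
        {
            if (field.IsDefined(typeof(InjectAttribute)))
                field.SetValue(target, resolve(field.FieldType));
        }
    }
}
```

A consumer then looks something like `[AutoRegister(ServiceLifetime.Scene, typeof(ISessionController))] class SessionController : MonoBehaviour, ISessionController { [Inject] private IUIHandler _ui; }` (an illustrative pairing, not the actual class body).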

Initialization Pipeline: SessionBootstrapProvider extends SequentialPipelineProviderBehaviour and builds a 4-step async pipeline: show splash screen, load model data from Resources, populate the model selection UI, and apply a 1.5-second splash delay. On success, it calls SessionController.StartSession(); on failure, it displays an error in the UI.
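A hedged sketch of what a sequential async pipeline of this shape might look like; the class and method names below are illustrative, not the real SequentialPipelineProviderBehaviour API.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public sealed class SequentialPipeline
{
    private readonly List<(string Name, Func<Task> Step)> _steps = new();

    public SequentialPipeline Add(string name, Func<Task> step)
    {
        _steps.Add((name, step));
        return this;
    }

    // Runs steps strictly in order; stops at the first failure and
    // reports which step failed so the UI can surface a useful error.
    public async Task<bool> RunAsync(Action<string, Exception> onError)
    {
        foreach (var (name, step) in _steps)
        {
            try { await step(); }
            catch (Exception e)
            {
                onError?.Invoke(name, e);
                return false;
            }
        }
        return true;
    }
}

// Hypothetical bootstrap mirroring the four steps described above:
//   new SequentialPipeline()
//       .Add("Splash", ShowSplashAsync)
//       .Add("LoadModelData", LoadModelDataAsync)
//       .Add("PopulateUI", PopulateModelSelectAsync)
//       .Add("SplashDelay", () => Task.Delay(1500));
```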

AR Anchor System: AnchorSelectHandler polls Touch.activeTouches (via the new Input System) during the anchor select phase. It raycasts against TrackableType.PlaneWithinPolygon, then attempts ARAnchorManager.AttachAnchor() to the detected AR plane, falling back to a manual ARAnchor component if no trackable plane exists. An offset prefab is instantiated under the anchor with bounds-based Y adjustment.
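The attach-or-fallback logic can be sketched with standard AR Foundation calls; the touch-polling loop, the offset-prefab step, and the class name here are omissions and assumptions for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class AnchorPlacementSketch : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private ARAnchorManager anchorManager;

    private static readonly List<ARRaycastHit> Hits = new();

    public ARAnchor TryPlaceAnchor(Vector2 screenPoint)
    {
        if (!raycastManager.Raycast(screenPoint, Hits, TrackableType.PlaneWithinPolygon))
            return null;

        ARRaycastHit hit = Hits[0];

        // Preferred path: attach to the trackable plane so the anchor
        // follows plane refinement and merges.
        if (hit.trackable is ARPlane plane)
        {
            ARAnchor attached = anchorManager.AttachAnchor(plane, hit.pose);
            if (attached != null) return attached;
        }

        // Fallback: a standalone ARAnchor component at the hit pose.
        var go = new GameObject("ManualAnchor");
        go.transform.SetPositionAndRotation(hit.pose.position, hit.pose.rotation);
        return go.AddComponent<ARAnchor>();
    }
}
```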

Model Placement: ModelPlacementHandler instantiates the selected model prefab under the anchor offset, calls ModelAlignService.AlignBottomToAnchor() for vertical alignment, optionally scales via ModelScaleService.ResizeToFit(), fires a PostProcessGlitchService URP Volume flash, and runs a ShaderRevealEffect coroutine for the bottom-to-top spawn animation. Despawn reverses the process (top-to-bottom reveal, then Destroy).

Shader Reveal Effect: ShaderRevealEffect computes world bounds across all renderers, swaps their sharedMaterials to temporary copies using the custom VerticalReveal.shader, animates a _RevealHeight property from min to max Y (or reverse for despawn), then restores originals or destroys the model. A finally block ensures all temporary materials are cleaned up.
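The swap-animate-restore flow can be sketched as below, assuming a shader exposing a _RevealHeight float as described; copying properties from the original materials into the temporaries, and the reversed despawn direction, are omitted for brevity, and the duration parameter is illustrative.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public static class ShaderRevealSketch
{
    private static readonly int RevealHeight = Shader.PropertyToID("_RevealHeight");

    public static IEnumerator Reveal(GameObject model, Shader revealShader, float duration)
    {
        Renderer[] renderers = model.GetComponentsInChildren<Renderer>();
        if (renderers.Length == 0) yield break;

        // World-space bounds across every renderer define the sweep range.
        Bounds bounds = renderers[0].bounds;
        foreach (Renderer r in renderers) bounds.Encapsulate(r.bounds);

        var originals = new Dictionary<Renderer, Material[]>();
        var temporaries = new List<Material>();
        try
        {
            // Swap every renderer's materials for temporary reveal copies.
            foreach (Renderer r in renderers)
            {
                originals[r] = r.sharedMaterials;
                var copies = new Material[r.sharedMaterials.Length];
                for (int i = 0; i < copies.Length; i++)
                {
                    copies[i] = new Material(revealShader);
                    temporaries.Add(copies[i]);
                }
                r.sharedMaterials = copies;
            }

            // Animate the clip height from the bottom of the bounds to the top.
            for (float t = 0f; t < duration; t += Time.deltaTime)
            {
                float height = Mathf.Lerp(bounds.min.y, bounds.max.y, t / duration);
                foreach (Material m in temporaries) m.SetFloat(RevealHeight, height);
                yield return null;
            }
        }
        finally
        {
            // Restore originals and destroy copies even if the coroutine is stopped.
            foreach (var pair in originals) pair.Key.sharedMaterials = pair.Value;
            foreach (Material m in temporaries) Object.Destroy(m);
        }
    }
}
```

The try/finally works because C# iterator blocks permit yield inside a try that has only a finally clause, which is exactly what makes interruption-safe cleanup possible here.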

Static Utility Services: Six stateless static classes handle pure computations: BoundsService (collider/renderer bounds), ModelAlignService (anchor alignment), AnchorAlignmentService (floor alignment), ModelScaleService (fit-to-anchor scaling), ModelPositionOverrideService (debug offset), and PostProcessGlitchService (URP volume flash with a self-cleaning coroutine runner).

UI Toolkit Layer: Five UXML documents (one per screen) are managed by a central UIHandler service. StyleKitMotion.FadeIn() is called on every transition. ModelViewUIHandler provides the richest interaction: rotate slider (0–360), shrink/restore buttons, and a parts inspection mode that builds a renderer lookup from the model hierarchy and toggles visibility per named component.
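The rotate-slider hookup might look like this with UI Toolkit; the element name "rotate-slider" and the modelRoot reference are assumptions for illustration.

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class ModelViewUISketch : MonoBehaviour
{
    [SerializeField] private UIDocument document;
    [SerializeField] private Transform modelRoot;

    private void OnEnable()
    {
        VisualElement root = document.rootVisualElement;

        // Slider range (0-360) is configured in the UXML document.
        var slider = root.Q<Slider>("rotate-slider");
        slider.RegisterValueChangedCallback(evt =>
            modelRoot.localRotation = Quaternion.Euler(0f, evt.newValue, 0f));
    }
}
```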

Model Data: ModelDataEntryScriptableObject assets in Resources/ModelDataEntries/ define each model's name, description, prefab reference, artist credits, and per-object part metadata. LocalModelDataService loads all enabled entries at runtime via Resources.LoadAll. A custom editor script auto-generates the object list from the prefab's MeshRenderer hierarchy.
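The asset and loader described above could be sketched like this; top-level field names follow the description, while the nested part-metadata type and the enabled flags are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

[CreateAssetMenu(menuName = "InspectAR/Model Data Entry")]
public class ModelDataEntryScriptableObject : ScriptableObject
{
    public string modelName;
    [TextArea] public string description;
    public GameObject prefab;
    public string artistCredits;
    public bool entryEnabled = true;
    public PartEntry[] objects; // per-object part metadata

    [System.Serializable]
    public class PartEntry
    {
        public string objectName;      // matches a renderer in the prefab hierarchy
        public string partDescription;
        public bool partEnabled = true;
    }
}

public static class LocalModelDataServiceSketch
{
    // Loads every enabled entry placed under Resources/ModelDataEntries/.
    public static List<ModelDataEntryScriptableObject> LoadAll()
    {
        var result = new List<ModelDataEntryScriptableObject>();
        foreach (var entry in
            Resources.LoadAll<ModelDataEntryScriptableObject>("ModelDataEntries"))
        {
            if (entry.entryEnabled) result.Add(entry);
        }
        return result;
    }
}
```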

Systems Used

  • Session State Machine - POCO state machine managing application flow through Introduction, AnchorSelect, ModelSelect, and ModelView states with strict enter/exit lifecycle and event subscription management
  • Attribute-based Dependency Injection - Custom DI system providing [AutoRegister] and [Inject] attributes for scene-scoped MonoBehaviour service resolution via reflection, with 7 registered service interfaces
  • Sequential Pipeline Initialization - Ordered async bootstrap pipeline with splash screen, model data loading, UI population, and graceful error handling before session start
  • Shader-based Reveal Effect - Custom VerticalReveal shader with runtime material swapping for animated bottom-to-top model spawn and top-to-bottom despawn sequences, with automatic temporary material cleanup
  • ScriptableObject Model Data System - ModelDataEntryScriptableObject assets loaded from Resources at runtime, supporting per-model metadata, artist credits, and individual part inspection with editor tooling for auto-generating object lists
  • UI Toolkit Screen Management - Five UXML documents driven by a central UIHandler service with StyleKit dark theme and FadeIn transitions between screens
  • AR Anchor Placement System - Touch-driven AR plane raycasting with ARAnchorManager attachment, offset prefab instantiation, and bounds-based Y alignment
  • Post-Process Glitch Effect - URP Volume weight flash triggered on model placement using a self-cleaning coroutine runner with [RuntimeInitializeOnLoadMethod] domain reload safety

Impact & Results

  • Reduced codebase complexity by ~6,500 lines through architectural simplification, removing the Command Pattern and experimental infrastructure in favor of a lean state-machine-plus-services pattern
  • Expanded the model library from drone-only assets to 9 diverse entries including campus architecture (Performing Arts Center), vehicles, and industrial equipment
  • Achieved a single-scene architecture eliminating scene transition overhead and simplifying the deployment pipeline for iOS builds
  • Established a formal interface-segregated service layer with 7 distinct contracts, enabling independent testing and future service substitution

Deep Dive

The v2.0 rebuild represents a complete rethinking of the ARchitect application, driven by the limitations encountered in v1.0's ad-hoc prototype architecture. The rebuild spanned approximately 9 months and went through a significant internal architectural evolution before arriving at the current lean design.

The Command Pattern Experiment (July to November 2025): The initial rebuild introduced a formal Command Pattern alongside the state machine. Commands like GlitchCommand, RevealAnimationCommand, and DespawnAnimationCommand encapsulated individual actions within a CommandSequence. While this provided clean separation of concerns, it introduced excessive indirection for an app of this scope. A download service and remote connection infrastructure were also built to support fetching models from a server, but this was stripped in November 2025 when the scope was narrowed to local-only model loading.

The Infrastructure Spike (February 2026): An ambitious push introduced three new infrastructure systems (ConfigRegistry, ProcessPipeline, and ServiceRegistry) totaling ~4,100 lines of new code. These were integrated and tested, then completely removed within two weeks. The lesson: the infrastructure was being developed in-repo rather than as standalone packages, creating coupling without reuse.

The Simplification (March 2026): The final architectural pass removed the entire Command Pattern implementation (~1,913 lines across 81 files), the in-repo infrastructure (~4,650 lines across 169 files), and legacy USS style files. The replacement was dramatically simpler: states call services directly, services are registered via attribute-based DI, and initialization runs through a sequential pipeline. The app went from a complex multi-pattern architecture to a clean state-machine-plus-services design.

Dependency Injection Design Decisions: The [AutoRegister] / [Inject] system was designed to eliminate Unity's fragile Inspector-drag-drop wiring for service references. All 7 MonoBehaviour services are discovered at scene load via attribute reflection, and all consumer fields are populated automatically. Static utility services (BoundsService, ModelAlignService, etc.) deliberately bypass DI: they hold no state, need no lifecycle, and are pure functions. This hybrid approach avoids the trap of injecting everything while still getting the benefits of DI for stateful services.

The Shader Reveal System: The ShaderRevealEffect is one of the most technically intricate pieces. It computes world-space bounds across all Renderer components, creates temporary material copies using the custom VerticalReveal.shader (which clips fragments below a _RevealHeight threshold), and animates that threshold over time. The spawn animation reveals bottom-to-top with configurable bottom/top gradient colors; despawn reverses the direction. A critical detail: the effect runs as a coroutine yielded from ModelPlacementHandler, and the finally block ensures temporary materials are destroyed even if the coroutine is interrupted. Original materials are cached and restored after spawn; for despawn, the entire model GameObject is destroyed.

AR Anchor Pipeline: The anchor placement system handles a subtle complexity: AR planes detected by ARKit can disappear or merge. The preferred path uses ARAnchorManager.AttachAnchor(hitPlane, hitPose), which creates a trackable-aware anchor that moves with the plane. The fallback path creates a standalone ARAnchor component for cases where the raycast hits geometry but no trackable plane is available. The offset prefab (instantiated as a child of the anchor) provides a consistent coordinate space for model placement, and bounds-based Y adjustment ensures models sit flush on the detected surface rather than floating above it.

Model Part Inspection: ModelViewUIHandler implements a three-mode interaction system: Normal (full model visible with rotate/scale controls), Selecting (parts panel open, scrollable list of named components), and ViewingPart (single component isolated, all other renderers disabled). The parts list is built dynamically from the model prefab's MeshRenderer and SkinnedMeshRenderer hierarchy, cross-referenced against the ModelDataEntryScriptableObject's objects list for enabled/disabled filtering and per-part descriptions. A custom Editor script (ModelDataEntryScriptableObjectEditor) auto-generates the object list from any prefab's renderer hierarchy.
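The renderer lookup and single-part isolation reduce to a small amount of code; the method names here are illustrative.

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class PartInspectionSketch
{
    // GetComponentsInChildren<Renderer> covers MeshRenderer and
    // SkinnedMeshRenderer alike; keyed by GameObject name.
    public static Dictionary<string, Renderer> BuildLookup(GameObject model)
    {
        var lookup = new Dictionary<string, Renderer>();
        foreach (Renderer r in model.GetComponentsInChildren<Renderer>(true))
            lookup[r.gameObject.name] = r;
        return lookup;
    }

    // ViewingPart mode: enable only the named component, hide the rest.
    public static void IsolatePart(Dictionary<string, Renderer> lookup, string partName)
    {
        foreach (var pair in lookup)
            pair.Value.enabled = pair.Key == partName;
    }

    // Normal mode: restore everything.
    public static void ShowAll(Dictionary<string, Renderer> lookup)
    {
        foreach (Renderer r in lookup.Values)
            r.enabled = true;
    }
}
```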


v1.0 — Original ARchitect / DroneVR

v1.0 Highlights

  • Built an AR model viewer prototype for iOS/iPadOS using Unity and AR Foundation, establishing the foundation for an interactive 3D inspection tool targeting university educational use cases
  • Integrated XR Interaction Toolkit and ARKit to enable real-time surface detection and model placement on physical surfaces
  • Upgraded the project from Unity 2022 to Unity 6, modernizing the rendering pipeline and XR subsystem dependencies
  • Delivered a functional DroneVR demo with imported 3D drone models viewable in augmented reality on iOS devices

v1.0 Overview

ARchitect v1.0 was the initial prototype of an augmented reality model viewer for iOS and iPadOS, developed at Shenandoah University's Center for Innovative Learning (SCiL). The app allowed users to view 3D models, primarily drone assets, placed on detected real-world surfaces using Apple's ARKit via Unity's AR Foundation.

The prototype was built in stages: an initial scaffold in December 2023, followed by XR tooling integration in mid-2024, and a Unity 6 upgrade in early 2025. It served as the proof-of-concept that validated the AR model inspection workflow before a complete architectural rebuild.

v1.0 Technical Breakdown

The v1.0 prototype used a straightforward Unity architecture with a multi-scene setup: a main menu scene and a separate viewer scene. The app relied on Unity's AR Foundation layer with ARKit as the iOS-specific subsystem provider.

XR Stack: XR Interaction Toolkit provided the interaction framework, with ARKit handling plane detection and anchor placement. The XR Simulation package was also integrated for Windows-based development testing without a physical iOS device.

UI: TextMeshPro was used for text rendering in the initial version, with a basic UIHandler MonoBehaviour managing screen transitions between the menu and viewer.

Model Pipeline: 3D drone models (FBX format) were imported directly into Unity with standard material assignments. Model placement on detected surfaces used basic AR raycast hit testing.

