Indoor Navigator.

An indoor navigation system that speaks through vibration,
guides with confidence, and recovers with dignity.
Designed for users who navigate the world without sight.

Role
UX + System Design
Platform
Android
Domain
Assistive Technology
Scope
Dual-App System
Solution

The Answer, First

Map Planner dashboard with system health metrics
Floor plan editor with rooms and objects
Space change reporting form with priority levels
Mode selection with Study, Travel, Shadow, and Emergency options
Active navigation showing Walk Straight 10m instruction
Video support with live camera feed for human assistance
Haptic Language
A vocabulary of six distinct vibration patterns that replaces visual navigation cues. Users feel directions through their phone without needing to look at a screen.
Confidence-Driven
The system changes how it speaks, vibrates, and routes based on how certain it actually is about the user's position. No false confidence.
Dignity-First Recovery
When things go wrong, the system blames itself, not the user. Three layers of human fallback ensure the user is never abandoned.
Challenge

The Indoor Navigation Crisis

Indoor spaces are architecturally hostile to people who cannot see. GPS drops to over 10 meters of error the moment you walk through a door. Existing navigation apps assume the user can read a map, follow a blue dot, or glance at a screen mid-walk. For a blind user carrying a cane, none of that applies.

The gap is not just technical. It is emotional. One wrong instruction in an unfamiliar hallway does not just add 30 seconds to a journey. It destroys the user's trust in the entire system. And once trust is broken, the user stops relying on the app and starts relying on strangers. The technology failed its one job: to give someone independence.

The challenge was to build a navigation system where the positioning layer works without GPS, the feedback layer works without vision, and the recovery layer works without abandoning the user when things inevitably go wrong.

Research

Understanding the Landscape

Idea Communication and Audience

The project started with a core philosophy: navigation should be as tactile and accessible as braille itself. I mapped the target audience and their reasons for needing indoor navigation, and defined the product scope before any design work began.

Idea communication sheet defining the product vision
Target audience and rationale analysis

Competitive Audit

I studied Be My Eyes (relies entirely on human volunteers, no autonomous navigation), Google Maps (visual-first, no indoor routing, no haptic feedback), and Navigine (freemium vendor lock-in, limited accessibility features). None of them solved the core problem: giving a blind user independent, real-time indoor navigation with non-visual feedback.

Competitive analysis of existing indoor navigation solutions

Competitive audit mapping feature gaps across existing solutions

Three User Personas

Research identified three distinct user archetypes, each requiring different levels of guidance and different interaction preferences.

Tara (Student)
High technical literacy. Navigates campus buildings daily. Needs efficient, fast routing. Prefers "Less Talking" mode because she already knows the general layout.
Noor (Professional)
Moderate tech literacy. Navigates office buildings for meetings. Needs reliable, predictable guidance. Prefers "Balanced" mode with clear turn-by-turn instructions.
Eva (Elderly)
Low technical literacy. Visits hospitals, malls, and public buildings. Needs maximum clarity and reassurance. Prefers "More Guidance" with frequent check-ins.

Visual Impairment Research

I catalogued 8 types of visual impairment and mapped each to specific UI adaptations. Total blindness requires 18-point minimum fonts with audio-first interaction. Low vision users need 28-point fonts with high contrast. Color blindness requires an accessible palette that avoids red-green combinations. Every design decision passed through the filter: does this work without sight?

Research on types of visual impairment and required UI adaptations

Insights on visual impairments mapped to design system requirements

Design System Study for Accessibility

I studied existing design systems to understand how accessibility requirements translate into component-level decisions. High contrast color schemes, minimum touch target sizes, screen reader compatibility patterns, and font scaling requirements all fed into the system design.

Design system study for accessibility patterns
System design study for accessibility architecture

Technology Stack Decision

Android was chosen over iOS because iOS restricts background Wi-Fi scanning, which is critical for continuous indoor positioning. The primary positioning engine uses Wi-Fi fingerprinting (Anyplace SDK) because it works with standard Wi-Fi hardware already installed in buildings. When Wi-Fi scans are throttled by the operating system (Android limits scans to 4 per 2 minutes), step-counting sensors fill the gap using pedestrian dead reckoning. A mathematical filter continuously blends both data sources to produce the smoothest possible position estimate.
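
A minimal sketch of how that blending could work, assuming a simple complementary filter; the actual Anyplace SDK calls and Android sensor wiring are omitted, and the names and weight are illustrative:

```kotlin
// Complementary-filter sketch: blends an absolute Wi-Fi fingerprint fix
// (accurate but infrequent) with relative pedestrian dead reckoning steps
// (smooth but drifting). Illustrative only, not the production integration.
data class Position(val x: Double, val y: Double)

class PositionFuser(private val wifiWeight: Double = 0.3) {
    private var estimate = Position(0.0, 0.0)

    // On every detected step: advance the estimate by a step vector derived
    // from step length and compass heading (dead reckoning).
    fun onStep(stepLengthMeters: Double, headingRadians: Double) {
        estimate = Position(
            estimate.x + stepLengthMeters * kotlin.math.cos(headingRadians),
            estimate.y + stepLengthMeters * kotlin.math.sin(headingRadians)
        )
    }

    // Whenever a throttled Wi-Fi scan returns a fingerprint fix:
    // pull the drifting dead-reckoning estimate toward the absolute fix.
    fun onWifiFix(fix: Position) {
        estimate = Position(
            (1 - wifiWeight) * estimate.x + wifiWeight * fix.x,
            (1 - wifiWeight) * estimate.y + wifiWeight * fix.y
        )
    }

    fun current(): Position = estimate
}
```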

Technology stack analysis and selection rationale
Technical scoping document
Synthesis

From Research to System Logic

Research finding: Blind users cannot process visual directional cues. Voice-only guidance creates cognitive overload when the user is also listening for environmental sounds like footsteps, doors, and echoes.
Design response: I created a vocabulary of six distinct vibration patterns. Each pattern feels fundamentally different, not just louder or softer. A left turn is a double-knock. A right turn is a sustained pulse. A wall boundary is a grainy sandpaper buzz. The user learns this dictionary during onboarding and then navigates primarily by touch.

Research finding: Positioning accuracy drops significantly near metal structures like elevators and in areas with thick concrete. Wi-Fi signals bounce and create false location readings.
Design response: Instead of hiding this uncertainty, I made it a first-class design material. The Confidence Index drives every aspect of the system. When accuracy is high, the system speaks directly: "Turn left now." When accuracy drops, the system hedges: "I think a left turn is coming up." The haptic texture also shifts from a solid pulse to an irregular flutter so the user can literally feel the system's uncertainty.

Research finding: Users who receive blaming language after a navigation error ("You missed the turn") show significantly lower willingness to continue using the system compared to users who receive neutral language.
Design response: I built a "dignity layer" into the recovery system. When the user drifts off-path, the system says "I might be mistaken. Did we pass the turn?" The system takes responsibility for the confusion. It recalculates the route silently in the background and only announces the new path once it is confident. Three consecutive successful instructions are required before the system returns to its normal confidence tone.
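
A small sketch of that rule, assuming exactly the behavior described above (hedged, self-blaming tone until three consecutive instructions are confirmed); the phrasing strings are placeholders:

```kotlin
// Sketch of the "dignity layer" recovery tone. After an off-path event the
// system stays hedged until three consecutive instructions are followed.
class RecoveryTone {
    private var successStreak = 0
    private var recovering = false

    fun onOffPath() {
        recovering = true
        successStreak = 0
    }

    fun onInstructionFollowed() {
        if (recovering && ++successStreak >= 3) recovering = false
    }

    // Hedged phrasing while recovering; direct phrasing once trust is rebuilt.
    fun phrase(direction: String): String =
        if (recovering) "I might be mistaken. I think a $direction turn is coming up."
        else "Turn $direction now."
}
```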

Edge Case Evaluation with ECE

To systematically identify failure points before they reached users, I ran the system design through ECE (Edge Case Evaluator), a structured evaluation tool I built specifically for forcing AI and design thinking to map consequences before proposing solutions. ECE's approach (Map, Expand, Validate, Audit, Synthesize) produced 55 edge cases across 12 failure dimensions: perception breakdowns, system accuracy failures, feedback channel failures, cognitive overload, behavioral variability, trust erosion, interaction constraints, environmental volatility, infrastructure issues, social interruptions, arrival ambiguity, and meta failures like phone drops and app crashes. Each edge case was traced to its worst-case outcome and paired with a specific mitigation strategy that fed directly into the system design.

The Wireflow: Six Phases of Navigation

The entire navigation experience maps to six phases. Each phase has a defined system action, a specific haptic signature the user feels, and a corresponding user action.

Phase | What the System Does | What the User Feels | What the User Does
Phase 1: Connection | Generates a counted pulse to confirm the app is active and listening | Short heartbeat pulses | Multi-tap to match the rhythm and confirm presence
Phase 2: Intent | Pre-loads the building map and calculates possible routes before the user starts walking | Quick 100ms selection tick | Taps a destination card or speaks a task
Phase 3: Study | Enables finger-tracing over a simplified map with different haptic textures for walls, edges, and open spaces | Buzz for walls, sandpaper texture for edges | Traces the map with their finger to build a mental model
Phase 4: Travel | Fuses Wi-Fi positioning with step-counting sensors to track the user in real time | Directional pulses (left or right side) | Walks the route, guided by vibrations and voice
Phase 5: Recovery | Detects if the user has drifted off-path, verifies before correcting, and uses non-blaming language | Ghost pulse (irregular flutter indicating uncertainty) | Confirms or corrects with a tap or voice response
Phase 6: Closure | Announces arrival with confidence-appropriate language and asks for feedback | Cascade pattern (rising intensity) | Taps to confirm arrival and leave feedback
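
As a rough sketch, the six phases can be modeled as a small state machine; the transition rules below are my reading of the wireflow, not a verbatim spec:

```kotlin
// Illustrative model of the six-phase flow from the table above.
// Recovery loops back into Travel once the user is re-oriented.
enum class Phase { CONNECTION, INTENT, STUDY, TRAVEL, RECOVERY, CLOSURE }

fun nextPhase(current: Phase, offPath: Boolean = false, arrived: Boolean = false): Phase =
    when (current) {
        Phase.CONNECTION -> Phase.INTENT
        Phase.INTENT -> Phase.STUDY
        Phase.STUDY -> Phase.TRAVEL
        Phase.TRAVEL -> when {
            offPath -> Phase.RECOVERY   // drift detected: verify before correcting
            arrived -> Phase.CLOSURE    // within the arrival threshold
            else -> Phase.TRAVEL
        }
        Phase.RECOVERY -> Phase.TRAVEL  // once re-oriented, resume guidance
        Phase.CLOSURE -> Phase.CLOSURE
    }
```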

The Confidence Index

The system's positioning confidence (a score from 0 to 1) modulates everything: the haptic texture, the voice tone, the word choice, and even the routing strategy.

System State | What the User Feels | What the User Hears | What Changes
High Confidence | Solid 50ms pulse, clean and predictable | Standard tone, direct language | The system speaks with certainty. "Turn left now." Routing uses the fastest available path.
Low Confidence | Irregular flutter, like a hesitant heartbeat | "Signal is weak." Softer, hedged phrasing. | The system admits uncertainty. Recommends wall-following for safety. Increases the margin around obstacles.
Off-Path | Three rapid stabs, unmistakably different from normal pulses | "Let's pause." Calm, non-blaming. | Immediate stop instruction. The system recalculates silently before speaking. Never says "you went wrong."
Target Proximity | Increasing frequency and pitch, like approaching a finish line | "Approaching your destination." | Slows down instruction cadence. Gives the user time to orient. Announces arrival within one meter.
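
A sketch of how the confidence score could select phrasing and haptic texture, following the table above; the threshold and strings are illustrative assumptions:

```kotlin
// Confidence-driven tone shift only; off-path and arrival are separate events.
data class Guidance(val haptic: String, val speech: String)

fun guidanceFor(confidence: Double, direction: String): Guidance =
    if (confidence >= 0.7) Guidance(
        haptic = "solid 50ms pulse",
        speech = "Turn $direction now."
    ) else Guidance(
        haptic = "irregular flutter",
        speech = "Signal is weak. I think a $direction turn is coming up. " +
                 "Following the wall may help."
    )
```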
Design Intent

Designing for Trust

Dignity Over Efficiency. The system never blames the user. When navigation goes wrong, the technology is the one that was confused, not the person. Recovery language like "I might be mistaken" preserves the user's sense of competence. This is not just politeness. Research shows that language blaming the user after a navigation error correlates directly with abandonment of the assistive technology.

Chunked Cognition. No instruction contains more than two sequential steps. "Walk 10 meters, then wait" instead of "Walk 10 meters, turn left, then turn right." Silent periods between instructions let the user hear environmental cues like echoing hallways, door hinges, or crowd noise. These natural landmarks are often more reliable than the technology itself.

Confidence Transparency. When the system is uncertain, it says so. "My signal is weak, instructions might be less precise." This prevents the worst outcome: the user trusting a confident-sounding instruction that turns out to be wrong. Transparent uncertainty builds more long-term trust than false certainty.

Multi-Modal Redundancy. Every instruction is delivered through two independent channels: voice and vibration. If the environment is noisy, haptics carry the message. If the user misses a vibration through thick clothing, voice fills the gap. The two channels reinforce the same instruction simultaneously without competing.

The Haptic Dictionary

Six patterns make up the complete vibration language. Each was designed to feel distinct from all others, even through clothing, even while walking.

Turn Left
Two short pulses, like a double knock
A left turn is coming up. The double-beat pattern is distinct from other cues so users can recognize it while walking.
Turn Right
One long, sustained pulse
A right turn is ahead. The single long pulse feels fundamentally different from the double-knock of a left turn.
Reassurance
Subtle 50ms tap, barely perceptible
You are on track. This repeats every 5 meters during straight sections to prevent silence panic without being intrusive.
Arrival
Triple strong pulse with pauses between
You have reached your destination. The intensity demands attention and signals a clear transition from "traveling" to "arrived."
Boundary Warning
Rapid grainy buzz, like sandpaper
You are near a wall or edge. This texture-based pattern feels inherently different from directional pulses.
Uncertainty
Chaotic flutter, irregular and hesitant
The system is not confident in its position. The irregular pattern honestly communicates that something is off.
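
A sketch of how this dictionary could map to Android vibration waveforms; the timings and amplitudes below are illustrative, not the tuned production values:

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Each pattern is a waveform of (duration ms, amplitude 0-255) pairs;
// amplitude 0 means a pause. repeat = -1 plays the pattern once.
object HapticDictionary {
    val turnLeft: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(80, 100, 80), intArrayOf(255, 0, 255), -1)        // double knock
    val turnRight: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(600), intArrayOf(255), -1)                         // long sustained pulse
    val reassurance: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(50), intArrayOf(80), -1)                           // subtle 50ms tap
    val arrival: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(200, 150, 200, 150, 200),
        intArrayOf(255, 0, 255, 0, 255), -1)                           // triple strong pulse
    val boundaryWarning: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(20, 20, 20, 20, 20, 20),
        intArrayOf(180, 0, 180, 0, 180, 0), -1)                        // grainy sandpaper buzz
    val uncertainty: VibrationEffect = VibrationEffect.createWaveform(
        longArrayOf(40, 120, 30, 200, 60),
        intArrayOf(120, 0, 200, 0, 90), -1)                            // irregular flutter

    fun play(vibrator: Vibrator, effect: VibrationEffect) = vibrator.vibrate(effect)
}
```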
Solution

The Complete Experience

A. Establishing Tone

Research finding: First-time users of assistive navigation apps report high anxiety during onboarding. The first 30 seconds determine whether the user continues or uninstalls.
Design response: The splash screen is high-contrast black with minimal elements. The welcome screen introduces the system in first-person voice: "I will help you navigate indoors using sound and touch." A "Talking Model" selection lets the user choose how much the system speaks, giving them control from the very first interaction.

Splash screen
Welcome screen with first-person system voice
Talking model selection with three verbosity options

"Double-tap anywhere to hear this again" provides TalkBack redundancy on every onboarding screen

B. The Gesture Lab

Research finding: Blind users often have both hands occupied (cane in one hand, bag in the other). Traditional touch interactions like swipe and pinch are impractical during navigation.
Design response: I designed two primary gestures that work without precise targeting. Triple-tap anywhere on the screen is the emergency abort (stops all navigation and sound immediately). Long-press anywhere activates voice assistant mode. Both work regardless of where the user's finger lands. The onboarding includes a practice area where the user rehearses each gesture until the system confirms they have it right.
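
A sketch of the two anywhere-on-screen gestures; the 900 ms tap window is an illustrative assumption, and the production app would wire this into the view's touch events and TalkBack gestures:

```kotlin
// Triple-tap anywhere aborts navigation; long-press anywhere opens assistance.
class GestureRouter(
    private val onAbort: () -> Unit,
    private val onAssistance: () -> Unit
) {
    private val tapTimes = ArrayDeque<Long>()

    // Call when a finger lands anywhere on the screen, regardless of target.
    fun onTap(nowMs: Long = System.currentTimeMillis()) {
        tapTimes.addLast(nowMs)
        while (tapTimes.size > 3) tapTimes.removeFirst()
        // Three taps within 900 ms stop all navigation and sound immediately.
        if (tapTimes.size == 3 && nowMs - tapTimes.first() <= 900) {
            tapTimes.clear()
            onAbort()
        }
    }

    // Call once the finger has been held past the long-press threshold.
    fun onLongPress() = onAssistance()
}
```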

Triple tap gesture practice screen
Long press gesture practice
Green success confirmation after completing gesture practice

C. Haptic Calibration

Research finding: Haptic patterns are only useful if the user can distinguish them. Vibration perception varies by device model, phone case thickness, and individual sensitivity.
Design response: Before navigation begins, the user goes through a haptic calibration screen where they tap each pattern to feel it. A "Didn't catch it? Tap again." retry option ensures every pattern is learned. The onboarding also introduces Study Mode (trace the map with your finger to build a mental model) and Shadow Mode (you lead, the system follows silently and only alerts to hazards).

Haptic pattern learning cards
Boundary warning pattern selected with green confirmation
Study Mode and Shadow Mode introduction

D. Mode Selection and Emergency Setup

Research finding: Users have different autonomy preferences. Some want full guidance; others know the general direction and just want positioning confirmation. Forcing one mode on all users reduces adoption.
Design response: Four distinct modes let users choose their level of system involvement. Study mode lets them explore the map before walking. Travel mode provides full turn-by-turn guidance. Shadow mode follows silently unless there is a hazard. Emergency mode prioritizes safety with immediate human escalation. The emergency contact setup includes a multi-layer fallback question: "If they don't answer, should I connect you to live video help?"

Four navigation modes: Study, Travel, Shadow, Emergency
Emergency contact setup with fallback options
Contact list selection screen
Contact confirmation with yellow CTA

E. Active Travel

Research finding: Cognitive overload during walking is the primary cause of navigation errors. Stacking more than two instructions causes the user to lose track of where they are in the sequence.
Design response: The active navigation screen shows one instruction at a time in large text with a directional arrow. Instructions are capped at two sequential steps maximum. A red "Pause Navigation" button and white "Repeat Instructions" button are always visible at the bottom of the screen. "Long press for Assistance" is persistently available. Behind the scenes, Wi-Fi positioning and step-counting sensors are continuously fused through a mathematical filter that produces real-time position updates.
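
A sketch of the two-step cap, assuming a plain list of route steps; the phrasing is illustrative:

```kotlin
// Never speak more than two sequential steps per instruction, leaving silent
// gaps between chunks so environmental sounds stay audible.
fun chunkInstructions(steps: List<String>): List<String> =
    steps.chunked(2).map { chunk ->
        if (chunk.size == 2) "${chunk[0]}, then ${chunk[1]}" else chunk[0]
    }

// chunkInstructions(listOf("Walk 10 meters", "turn left", "turn right"))
//   -> ["Walk 10 meters, then turn left", "turn right"]
```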

Active navigation: Walk Straight 10m with directional arrow, pause and repeat buttons

The core navigation screen. One instruction. One direction. Always-visible controls.

Research finding: When obstacles appear unexpectedly, panic response is immediate. The user needs unambiguous stop signals, not gentle suggestions.
Design response: The obstacle detection screen shifts to red, the universal danger color. A prohibition icon and "Obstacle ahead. Stop." in red text leave no ambiguity. The system auto-pauses navigation and delivers three rapid haptic stabs. It does not just warn. It stops. The system then recalculates a route around the obstacle before resuming guidance.
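
A sketch of that sequence; every function passed in is a placeholder for the corresponding system behavior, not a real API:

```kotlin
// Obstacle response: stop first, signal unmistakably, then reroute silently.
fun onObstacleDetected(
    pauseNavigation: () -> Unit,
    playThreeStabs: () -> Unit,
    announce: (String) -> Unit,
    rerouteAroundObstacle: () -> Unit,
    resumeNavigation: () -> Unit
) {
    pauseNavigation()                  // auto-pause: never just warn
    playThreeStabs()                   // unmistakable haptic stop signal
    announce("Obstacle ahead. Stop.")
    rerouteAroundObstacle()            // recalculate a path around the obstacle
    resumeNavigation()                 // resume only once a clear path exists
}
```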

Assistance activated with green state showing emergency contact being reached
Obstacle detection: red screen with stop warning and prohibition icon

F. Human-Centric Recovery

Research finding: The biggest fear of blind users using navigation technology is being abandoned when the technology fails. Every existing solution has a single point of failure: if the app crashes or loses signal, the user is on their own.
Design response: I designed a three-layer fallback system. When the user requests help, the system first calls their emergency contact with a live video feed. If the call fails, it automatically sends a text message with the user's indoor position. If the contact is unreachable, it connects to a remote video assistance service. At no point is the user abandoned. Each escalation happens automatically, with clear status messages so the user knows what the system is doing.

Layer 1
Call Emergency Contact
The system calls the user's pre-set emergency contact and opens a live video feed so the contact can see the user's surroundings.
Triggered by: long-press assistance gesture
Layer 2
SMS with Location
If the call fails, the system automatically sends a text message containing the user's indoor position to the emergency contact.
Triggered by: call unanswered or failed
Layer 3
Remote Assistance
If the contact is unreachable, the system connects to a remote video assistance service. The user is never left without help.
Triggered by: contact unreachable
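
A sketch of the escalation order, assuming the behavior described in the three layers above; the call, SMS, and remote-assist functions are placeholders, not real APIs:

```kotlin
// Each layer is tried in order, with a status announcement at every step
// so the user always knows what the system is doing.
suspend fun requestHelp(
    callContact: suspend () -> Boolean,      // Layer 1: live call + video feed
    smsWithLocation: suspend () -> Boolean,  // Layer 2: SMS with indoor position
    remoteAssistance: suspend () -> Boolean, // Layer 3: remote video assistance
    announce: (String) -> Unit
) {
    announce("Calling your emergency contact.")
    if (callContact()) return

    announce("The call didn't go through. Sending your location by text.")
    if (smsWithLocation()) return

    announce("Connecting you to remote assistance.")
    remoteAssistance()
}
```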
Video support connected with camera feed and End Support / Resume Navigation controls
Call failed state with automatic SMS and location share fallback
Remote assistance connection as third-layer fallback

Three layers of escalation. The user is never abandoned.

G. The Map Planner

Research finding: Map accuracy directly determines navigation safety. If the digital map does not match the physical building, the user gets routed into walls, blocked paths, or non-existent doors.
Design response: I designed a separate companion app for building administrators and helpers. The Map Planner provides four ways to create indoor maps: Draw (manual precision on a grid), Walk (physically walk the space and trace your path using the same step-counting technology the blind user navigates with), Upload (overlay an existing floor plan image), and AR Mesh (use the phone camera to scan the room in 3D). A dashboard shows system health metrics including flag reports, accuracy percentage, object count, and working exits.

Map Planner dashboard showing system health metrics
Building management with campus list
Floor plan editor with draw tools
Floor plan with rooms, objects, and multi-floor support
Map editor insert menu with objects, upload, room, and AR mesh options

Research finding: Maps decay over time. Furniture moves, temporary obstacles appear, paths get blocked for maintenance. Static maps become dangerous.
Design response: A flag reporting system lets anyone (users, helpers, building staff) report space changes. Flags have types (exit obstruction, new obstacle), priority levels, and a lifecycle (Pending, Approved, Rejected). When an admin approves a flag, the map updates and the navigation routing engine immediately reflects the change. The next blind user who walks that path gets accurate guidance.
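
A sketch of the flag lifecycle as data; field and type names are illustrative:

```kotlin
// Flags move through Pending -> Approved/Rejected; approval patches the map
// so the routing engine reflects the change for the next user.
enum class FlagStatus { PENDING, APPROVED, REJECTED }
enum class FlagType { EXIT_OBSTRUCTION, NEW_OBSTACLE }
enum class Priority { LOW, MEDIUM, HIGH }

data class SpaceChangeFlag(
    val type: FlagType,
    val priority: Priority,
    val description: String,
    var status: FlagStatus = FlagStatus.PENDING
)

fun approve(flag: SpaceChangeFlag, applyToMap: (SpaceChangeFlag) -> Unit) {
    flag.status = FlagStatus.APPROVED
    applyToMap(flag)  // update the map and routing immediately
}
```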

Flag reporting list with exit obstruction and obstacle types
Flag detail view with description, location, and assignment workflow
Space change reporting form with priority and visual proof
Raw Figma design of the Map Planner interface

Map Planner raw Figma design showing the full admin interface

Design Iterations

The UI went through two full iterations. The first iteration was wireframed before the backend logic existed. The second iteration, designed in Figma, was driven by the system behavior requirements, specifically how each phase of the wireflow maps to what the user sees and feels on screen.

First iteration wireframe before backend logic

First iteration wireframes

Second iteration in Figma Make driven by system behavior

Second iteration in Figma Make, shaped by backend phase requirements

Reflection

Key Learnings

Accessibility Is Architecture
Building for blind users first forced every design decision through the filter: does this work without sight? The result is a system that is more robust for everyone, not just the target audience.
Confidence Is a Design Material
The system's self-awareness about its own uncertainty became the most important variable. When technology admits its limits, users trust it more than when it fakes certainty.
Recovery Over Prevention
You cannot prevent every navigation failure. But you can make every failure feel like a gentle redirect instead of an abandonment. The dignity layer changed everything about how users responded to errors.

Future Scope

01
Production Anyplace SDK Integration
Replace the mock positioning layer with real Wi-Fi fingerprinting in mapped buildings.
02
Community Flag Network
Map corrections propagate across buildings, so improvements made by one admin benefit all users on the platform.
03
Full Tactile Map Exploration
In Study Mode, the user traces walls, doorways, and corridors with different haptic textures to build a complete mental model before traveling.
04
Companion Mode for Sighted Helpers
A helper can monitor the blind user's position in real time and provide guidance through the Map Planner app.