Clarity is an iOS app designed to help visually impaired users read text in their environment. Using real-time text recognition and built-in accessibility features, it makes everyday reading easier and more engaging.
I built Clarity because I’ve seen firsthand how something as simple as reading a newspaper or a medicine label can become a struggle. My thatha (grandfather) often squints, holding newspapers and medicine bottles unnaturally close to his eyes, trying to make sense of the text. Whenever I suggest getting his eyes checked, he refuses—out of habit, reluctance, or simply the thought of wearing glasses all the time. And I get it.
Even I’ve had moments when I forget to wear my glasses and suddenly find myself struggling to read a restaurant menu on a board. In those moments, I realized that clarity shouldn’t be a privilege—it should be accessible to everyone, anywhere.
That’s why I designed Clarity to be effortless, lightweight, and empowering, so that anyone, regardless of age or tech familiarity, can use it without frustration. The real-time text recognition makes it feel natural, and the text-to-speech feature ensures that even if reading is difficult, the words can still be heard. I chose SwiftUI for its clean, adaptive design, ensuring the app feels intuitive. VisionKit and Vision handle text detection seamlessly, while AVFoundation enables multi-accent voice support. I integrated CoreHaptics to add a subtle but meaningful layer of feedback—reinforcing interactions without overwhelming the user.
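The live-scanning flow described above can be sketched by bridging VisionKit's `DataScannerViewController` into SwiftUI. This is a minimal illustration, not the app's actual source: the VisionKit types and delegate method are real (iOS 16+), but the wrapper name `ScannerView` and the `onRecognize` callback are assumed names for this example.

```swift
import SwiftUI
import VisionKit

// Minimal SwiftUI bridge for VisionKit's live text scanner (iOS 16+).
// `ScannerView` and `onRecognize` are illustrative names, not from the app's source.
struct ScannerView: UIViewControllerRepresentable {
    var onRecognize: (String) -> Void

    func makeUIViewController(context: Context) -> DataScannerViewController {
        let scanner = DataScannerViewController(
            recognizedDataTypes: [.text()],
            qualityLevel: .balanced,
            isHighlightingEnabled: true
        )
        scanner.delegate = context.coordinator
        return scanner
    }

    func updateUIViewController(_ scanner: DataScannerViewController, context: Context) {
        // Start scanning once the view is on screen; throws if the camera is unavailable.
        try? scanner.startScanning()
    }

    func makeCoordinator() -> Coordinator { Coordinator(onRecognize: onRecognize) }

    final class Coordinator: NSObject, DataScannerViewControllerDelegate {
        let onRecognize: (String) -> Void
        init(onRecognize: @escaping (String) -> Void) { self.onRecognize = onRecognize }

        func dataScanner(_ scanner: DataScannerViewController,
                         didAdd addedItems: [RecognizedItem],
                         allItems: [RecognizedItem]) {
            // Forward each newly recognized text item's transcript to the caller.
            for case let .text(text) in addedItems {
                onRecognize(text.transcript)
            }
        }
    }
}
```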
This app isn’t just about seeing words. It’s about restoring independence. About ensuring a grandfather can read his medicine label without strain, a student can study from a textbook without frustration, and a traveler can navigate signs with confidence. I built this app because representation matters. Because innovation should include everyone. And because, sometimes, a little clarity can change someone’s world.
~ The aim is not just to see things, but to be self-reliant and independent. ~
ScreenRecording_02-11-2025_19-20-59_1.mov
## Features

- Real-time text recognition
- Voice guidance
- Customizable text display
- Offline functionality
- Haptic feedback
- Multi-accent support
- Dark/Light mode support
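The voice guidance and multi-accent features above can be sketched with AVFoundation's speech synthesizer. The accent codes are standard BCP 47 identifiers supported by `AVSpeechSynthesisVoice`; the `speak` helper and its default accent are assumptions for this example, not the app's actual API.

```swift
import AVFoundation

// Sketch of multi-accent text-to-speech. The helper name is illustrative.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String, accent: String = "en-IN") {
    let utterance = AVSpeechUtterance(string: text)
    // Pick a voice by language/region code, e.g. "en-US", "en-GB", "en-IN".
    utterance.voice = AVSpeechSynthesisVoice(language: accent)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}
```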
## Requirements

- iOS 16.0+
- Xcode 15.0+
- Swift 5.9+
- Camera access for text scanning
## Built With

- SwiftUI - For the user interface
- VisionKit - For real-time text scanning (DataScannerViewController)
- Vision - For advanced text recognition and analysis
- AVFoundation - For camera control and text-to-speech
- CoreHaptics - For haptic feedback
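To illustrate the Vision framework's role alongside VisionKit's live scanner, here is a hedged sketch of one-shot text recognition on a still image using `VNRecognizeTextRequest`. The Vision APIs shown are real; the function name and completion-handler shape are assumptions for this example.

```swift
import Vision
import UIKit

// Sketch of one-shot text recognition with Vision on a still image.
// The function name and callback shape are illustrative.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```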
## Permissions

- Camera
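Camera access on iOS also requires a usage description in the app's Info.plist; without the key, the system terminates the app on first camera use. The description string below is illustrative, not the app's actual copy.

```xml
<key>NSCameraUsageDescription</key>
<string>Clarity uses the camera to scan and read text around you.</string>
```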