DESIGN RESEARCH AND EXPLORATION

"Whispers of the Canvas" emerged from a need to bridge the gap between traditional art appreciation and modern digital engagement. The project draws inspiration from advancements in AR, AI, and multi-sensory design, aiming to set a new standard for interactive museum experiences.

VISION:
To transform passive observation into active exploration, making art accessible, educational, and emotionally impactful for diverse audiences worldwide.

KEY FINDINGS:
- Immersive Learning: Create an AR experience that emotionally resonates with users, fostering a deeper connection to artwork.
- Personalized Exploration: Tailor content to individual preferences, learning styles, and cultural backgrounds.
- Multi-Sensory Engagement: Integrate audio, visual, and haptic feedback to engage users across sensory modalities.
- Accessibility for All: Ensure inclusivity for users with diverse abilities, including visual, auditory, and motor impairments.
- Curatorial Storytelling: Empower curators to weave compelling narratives and contextual insights seamlessly into the AR experience.

PROBLEM:
Traditional methods (static labels, audio guides, digital screens) feel detached and fail to engage modern audiences, especially younger generations and those with diverse learning needs.

- Visitors gain only a surface-level understanding, missing deeper historical, cultural, and personal stories.
- Accessibility barriers and a lack of interactivity limit inclusivity and impact.

KEY FEATURES:
- Real-Time Object Recognition: Instant identification of artwork via device cameras, triggering dynamic AR overlays with contextual information (a minimal sketch follows this list).
- Interactive Timelines: Users can explore the artwork’s creation process chronologically, visualizing key milestones and techniques.
- Layered Historical Context: AR overlays reveal social, political, and cultural influences, presented in digestible, interactive layers.
- Layered 3D Exploration: Expand on the prototype’s ability to rotate 3D objects toward the canvas, allowing users to examine each layer individually.
- Accessibility Enhancements: Voice navigation, high-contrast visuals, and haptic cues for visually or hearing-impaired users.
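To illustrate how the real-time recognition flow could work in practice, here is a minimal sketch assuming a native iOS/ARKit build. The "Artworks" reference-image group, the view controller, and the translucent overlay are illustrative placeholders rather than the project's actual implementation.

import ARKit
import SceneKit
import UIKit

// Illustrative sketch only: the "Artworks" asset group and the overlay styling
// are placeholders, not the project's production code.
final class ArtworkRecognitionViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Curators would supply one calibrated reference image per artwork.
        let configuration = ARImageTrackingConfiguration()
        if let artworks = ARReferenceImage.referenceImages(inGroupNamed: "Artworks", bundle: nil) {
            configuration.trackingImages = artworks
            configuration.maximumNumberOfTrackedImages = 1
        }
        sceneView.session.run(configuration)
    }

    // ARKit calls this when one of the reference images is recognized in the camera feed.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize

        // A translucent plane over the canvas stands in for the contextual AR overlay.
        let overlay = SCNPlane(width: size.width, height: size.height)
        overlay.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.25)
        let overlayNode = SCNNode(geometry: overlay)
        overlayNode.eulerAngles.x = -.pi / 2  // lay the plane flat against the recognized artwork
        node.addChildNode(overlayNode)
    }
}

In a full build, the overlay node would host the interactive timeline and layered historical content described above rather than a plain plane.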
FUTURE ITERATIONS
- Refining 3D Interaction: Improve the prototype’s rotation and layer-separation mechanics for smoother, more intuitive exploration.
- AI and Contextual Expansion: Integrate AI to dynamically adjust 3D overlays based on user behavior, expanding beyond the prototype’s static 3D objects.
- Multi-Sensory Integration: Add haptic, auditory, and potentially olfactory feedback to complement the prototype’s visual 3D experience (a minimal feedback sketch follows this list).
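As one way the multi-sensory direction could be prototyped, the sketch below pairs a haptic tap with a spoken description whenever a visitor isolates a layer of the 3D artwork. The ArtworkLayer model, the controller, and the example layer data are assumptions for illustration, not part of the current prototype.

import UIKit
import AVFoundation

// Hypothetical layer model; the prototype's real data structure may differ.
struct ArtworkLayer {
    let name: String
    let spokenDescription: String
}

final class LayerFeedbackController {
    private let haptics = UIImpactFeedbackGenerator(style: .medium)
    private let speech = AVSpeechSynthesizer()

    // Call this when the user separates or selects a layer in the 3D view.
    func didIsolate(_ layer: ArtworkLayer) {
        haptics.impactOccurred()  // tactile cue: a layer has been isolated
        speech.speak(AVSpeechUtterance(string: layer.spokenDescription))  // auditory cue for low-vision visitors
    }
}

// Example trigger with hypothetical layer data:
// let feedback = LayerFeedbackController()
// feedback.didIsolate(ArtworkLayer(name: "Underpainting",
//                                  spokenDescription: "The underpainting, blocked in with burnt umber."))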