
Imagink

Imagink is an app designed to help artists and hobbyists improve tracing accuracy and speed. The project is fully funded by Snap Inc.’s accelerator program.

Role

0→1 Product Designer | Co-Founder | Technical Artist responsible for end-to-end design, including user flows, UI, and interactions driven by user research, as well as 3D modeling and coding.

Project

Snap Accelerator

Duration

January 2025 - October 2025

App Availability

Built for Snap’s Spectacles (AR Lens), with a planned device launch in Q1 2026.

The Problem

Creators often start with a small sketch or reference images, but turning that into a large, final piece isn’t easy.

Most have to rely on clunky, manual methods like grids, projectors, or guesswork. It's slow, frustrating, and often leads to results that don’t match their vision.

The Process

User Interviews

After securing funding by pitching our solution, I conducted user interviews with creators to validate the problem. These sessions provided qualitative insights into key pain points and helped uncover potential design solutions based on real user needs.

Pain Points

1. Inaccurate Results

2. Lacking Flexibility

3. Lacking Efficiency

The Adaptive Designer

Age: 36

Gender: Female

Needs

  • Flexible tools that support back-and-forth workflows

  • Easy version tracking across sketches and edits

  • Seamless support from ideation to execution

Goals

  • Translate expressive ideas into workable solutions

  • Balance emotional intent with technical feasibility

  • Create designs that feel personal yet buildable

Motivation

  • Tell a personal, meaningful story through design

  • Keep the emotional essence intact through production

  • Bridge creative vision with real-world constraints

Behavior

  • Discovers ideas through making

  • Jumps between sketching, testing, and refining

  • Preserves emotional intent from early sketches

Prototype Test

We prototyped in Lens Studio to test the basic flow and surface technical challenges. The image is layered with a pre-made tracing line for testing purposes.

Establishing the Core Mechanics 

After multiple iterations of the user flow, I designed a core flow for the app that aligned with technical feasibility.

[Imagink user flow diagram]

Executing the Core Concept

We refined the core mechanics and created a set of lo-fi and hi-fi wireframes, clarifying interaction patterns and screen-level flows.

Initial Onboarding to Image Gen Experience

[Wireframes: Welcome → Select project type → Image generation → Processing]

Initial Image Editing to Tracing

[Wireframes: Refine → Trace with feedback]

Constraints

After building the prototype and exploring Lens Studio’s development capabilities, we developed new solutions to work around the following technical constraints.

Hardware Capabilities

Due to hardware limitations, we avoided feature overload and focused on writing cost-efficient scripts; the GPU and CPU couldn’t handle complex scripts or high-polygon assets without risking crashes.
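
As an illustration of the kind of cost-cutting this pushed us toward, here is a minimal TypeScript sketch (Lens Studio scripts are TypeScript/JavaScript) of throttling expensive per-frame work; the class and interval below are illustrative, not our production code.

```typescript
// Illustrative sketch: run expensive work at a fixed interval instead of
// every frame so the Spectacles CPU/GPU is never saturated. In a real
// Lens, `tick` would be driven by Lens Studio's UpdateEvent.

class ThrottledUpdater {
  private elapsed = 0;

  constructor(
    private readonly intervalSec: number, // e.g. 0.1 → 10 updates per second
    private readonly work: () => void,    // the expensive task to throttle
  ) {}

  // Call once per frame with the frame's delta time.
  tick(deltaTime: number): void {
    this.elapsed += deltaTime;
    if (this.elapsed >= this.intervalSec) {
      this.elapsed = 0;
      this.work();
    }
  }
}

// Usage: re-run a heavy check 10 times per second instead of every frame.
const accuracyCheck = new ThrottledUpdater(0.1, () => {
  // ...expensive comparison or scene-query logic would go here...
});
// per frame: accuracyCheck.tick(deltaTime);
```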

Battery Life

Snap AR Spectacles have only 40 minutes of battery life, so it was crucial to design an experience that fits within that limit, using scripts and visual effects optimized to minimize power consumption.
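
One power-saving pattern consistent with that budget, sketched below with hypothetical names (IdlePowerSaver, setVfxEnabled), is to shut off the heaviest visual effects after a stretch of inactivity:

```typescript
// Illustrative sketch: shut off power-hungry VFX after a stretch of
// inactivity. `setVfxEnabled` stands in for whatever enables or disables
// the effect component in the Lens; the 30 s timeout is an assumption.

class IdlePowerSaver {
  private idleSec = 0;

  constructor(
    private readonly timeoutSec: number,
    private readonly setVfxEnabled: (on: boolean) => void,
  ) {}

  // Call whenever the user pinches, speaks, or moves the image.
  onUserActivity(): void {
    this.idleSec = 0;
    this.setVfxEnabled(true);
  }

  // Call once per frame.
  tick(deltaTime: number): void {
    this.idleSec += deltaTime;
    if (this.idleSec >= this.timeoutSec) {
      this.setVfxEnabled(false); // shed GPU load while the user is idle
    }
  }
}

const saver = new IdlePowerSaver(30, (on) => { /* toggle VFX component */ });
```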

Limited API Access

With no support for Python or external APIs in Lens Studio, we turned to the platform’s native APIs and SDKs, crafting solutions that balanced feasibility with design intent.

Unsupported Mobile Data Transfer

Spectacles do not support mobile-to-device data transfer, preventing users from uploading their original drawings.

Image Scan Inaccuracy

Given the currently supported APIs and our timeframe, it was not feasible to develop a script capable of precisely scanning a physical image.

Limited Field of View

The available field of view was 900 × 450, and we designed the app to fit within it. We also had to account for how users interact with objects that move offscreen and adjust content size based on user actions.
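
A minimal sketch of the offscreen check this implies, using plain vector math and an assumed half-FOV angle rather than Lens Studio’s actual camera API:

```typescript
// Illustrative sketch: flag a panel for recentering once it drifts
// outside the usable view cone. Vectors are plain tuples here; a real
// Lens would use Lens Studio's vec3 and the camera's transform.

type Vec3 = [number, number, number];

const dot = (a: Vec3, b: Vec3) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const normalize = (a: Vec3): Vec3 => {
  const len = Math.sqrt(dot(a, a));
  return [a[0] / len, a[1] / len, a[2] / len];
};

// True when `target` sits more than `halfFovDeg` away from the gaze axis.
function isOffscreen(
  cameraPos: Vec3,
  cameraForward: Vec3,
  target: Vec3,
  halfFovDeg: number, // assumed value; derive from the device's real FOV
): boolean {
  const toTarget = normalize([
    target[0] - cameraPos[0],
    target[1] - cameraPos[1],
    target[2] - cameraPos[2],
  ]);
  const cos = Math.max(-1, Math.min(1, dot(normalize(cameraForward), toTarget)));
  const angleDeg = (Math.acos(cos) * 180) / Math.PI;
  return angleDeg > halfFovDeg;
}

// e.g. if (isOffscreen(camPos, camFwd, panelPos, 20)) recenterPanel();
```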

Testing the Flow

After prototyping and defining the core flow, we built a working concept and validated the initial experience with 10 users.

01. Difficult to Pinch

Users had difficulty aiming the cursor to pinch interactable objects.

02. Losing Content

Users lost the floating instruction during the projection step and forgot what to do next.

03. Mic Input Error

The device struggled to recognize users’ pronunciation of the word “project,” which prevented users from placing the image.
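
One plausible mitigation, sketched here with a standard edit-distance match, is to accept near-miss transcriptions of “project”; the transcript source and the distance threshold are assumptions, not the shipped fix.

```typescript
// Illustrative sketch: accept near-miss transcriptions of the trigger
// word "project" instead of requiring an exact match. The transcript
// string is assumed to come from the platform's voice input module.

function editDistance(a: string, b: string): number {
  // Standard Levenshtein distance via dynamic programming.
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) =>
      i === 0 ? j : j === 0 ? i : 0,
    ),
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1), // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Treat any word within edit distance 2 of "project" as the command;
// the threshold of 2 is an assumption to tune against real transcripts.
function heardProjectCommand(transcript: string): boolean {
  return transcript
    .toLowerCase()
    .split(/\s+/)
    .some((word) => editDistance(word, "project") <= 2);
}

// heardProjectCommand("please projekt the image") → true
```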

04. Distracting VFX

VFX feedback during tracing was slightly distracting because of its size; users wanted feedback, but more subtle.

Visual Guidelines

A Playful Theme

We made the visual language bold, playful, and a little imperfect, just like the creative process. Hand-drawn shapes and illustrations, energetic colors, and friendly type keep growing creators inspired from the first scanned sketch to the final trace.

Font

  • Headline: DynaPuff

  • Secondary Header & Body: Manrope

Color Palette

Visual Style

[Visual style examples]

The Solution

Our AR tool helps creators scale sketches from small drawings onto real-world surfaces, ensuring precision as they trace and refine their work in real time.


Whether painting a mural, designing clothing, or planning home decor, users can visualize their projects in real-time, adjust proportions, and receive precise material measurements.

Gen AI

An experience that generates any image from a spoken prompt.

Headtrack Projection

A way for users to discover surfaces to project onto simply by moving their head.
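
Under the hood, this amounts to intersecting the head’s gaze ray with a detected surface. The sketch below shows that placement math with plain types; in the real Lens, the plane would come from the platform’s surface detection and head tracking.

```typescript
// Illustrative sketch: place content where the head's gaze ray meets a
// detected surface plane. In the real Lens, `headPos`/`headForward`
// would come from head tracking and the plane from surface detection.

type V3 = { x: number; y: number; z: number };

const sub = (a: V3, b: V3): V3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot3 = (a: V3, b: V3) => a.x * b.x + a.y * b.y + a.z * b.z;

// Returns the gaze ray's hit point on the plane, or null when the user
// is looking parallel to the surface or away from it.
function gazeHitOnPlane(
  headPos: V3,
  headForward: V3,   // assumed normalized
  planePoint: V3,
  planeNormal: V3,
): V3 | null {
  const denom = dot3(headForward, planeNormal);
  if (Math.abs(denom) < 1e-6) return null; // gaze parallel to surface
  const t = dot3(sub(planePoint, headPos), planeNormal) / denom;
  if (t < 0) return null; // surface is behind the user
  return {
    x: headPos.x + headForward.x * t,
    y: headPos.y + headForward.y * t,
    z: headPos.z + headForward.z * t,
  };
}
```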

Image Control

The power to change opacity and saturation, and to lock the image in place.
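
A minimal sketch of what that control state could look like, with illustrative names and defaults: both values clamp to [0, 1], and edits are ignored once the image is locked for tracing.

```typescript
// Illustrative sketch of the image-control state: opacity and saturation
// clamp to [0, 1], and edits are ignored once the image is locked in
// place for tracing. Names and defaults are assumptions.

const clamp01 = (v: number) => Math.min(1, Math.max(0, v));

class TracingImage {
  opacity = 0.5;    // translucent by default so the surface shows through
  saturation = 1.0;
  locked = false;

  setOpacity(value: number): void {
    if (!this.locked) this.opacity = clamp01(value);
  }

  setSaturation(value: number): void {
    if (!this.locked) this.saturation = clamp01(value);
  }

  lock(): void {
    this.locked = true; // freeze the projection before tracing starts
  }
}
```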

VFX Tracing Feedback

Provides magical visual-effect feedback when the user is tracing successfully.
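
One way such feedback could be gated, sketched with an assumed 2D simplification and tolerance value: measure the stroke point’s distance to the projected guide polyline and fire the VFX only when it is close enough.

```typescript
// Illustrative sketch: fire the tracing VFX only when the user's stroke
// point is within a tolerance of the projected guide line, modeled here
// as a 2D polyline. The tolerance value is an assumption.

type Pt = { x: number; y: number };

// Distance from point p to the segment a–b.
function distToSegment(p: Pt, a: Pt, b: Pt): number {
  const abx = b.x - a.x;
  const aby = b.y - a.y;
  const lenSq = abx * abx + aby * aby;
  const t = lenSq === 0
    ? 0
    : Math.max(0, Math.min(1, ((p.x - a.x) * abx + (p.y - a.y) * aby) / lenSq));
  return Math.hypot(p.x - (a.x + t * abx), p.y - (a.y + t * aby));
}

function isOnGuide(stroke: Pt, guide: Pt[], tolerance = 0.5): boolean {
  for (let i = 0; i < guide.length - 1; i++) {
    if (distToSegment(stroke, guide[i], guide[i + 1]) <= tolerance) return true;
  }
  return false;
}

// per stroke sample: if (isOnGuide(penTip, guideLine)) playTracingVfx();
```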

Turning imagination into reality for tracing

A magical experience that lets users generate images from their imagination and bring them into the real world for tracing.

Takeaways

Knowing What’s Feasible Beforehand

Researching available tools beforehand is essential; it informs what can realistically be prototyped and ensures that design ideas are communicated effectively to engineers.

Rapid Iteration

With new tools, fast iteration and prototyping become the key to discovering what truly makes an experience work.

Understanding Users’ Movements

It’s crucial to observe subtle user movements and tendencies so we can design interactions that minimize hand input, maximize comfort, and deliver a seamless, delightful automated experience.

Potential Impact

Impact metrics will be available a few months after the 2026 launch of Snap Spectacles (AR Lens).

80% 

Reduction in Workflow Time

In usability testing, users reduced their tracing workflow from 1.5 hours to just 10–15 minutes.
