Crafticulate

A living archive of pottery gestures, preserving the silent language of making through real-time interaction and machine learning.

Role: Researcher, Developer, Designer, Maker
Timeline: 4 months
Year: 2025
Thesis: GSAPP-CDP

A gesture-based archive that preserves traditional craft knowledge through the lens of technology. Focused on the embodied movements of pottery-making, the project captures, classifies, and reanimates hand gestures using real-time machine learning and computer vision.

Inspired by how techniques are passed through demonstration rather than text, Crafticulate transforms ephemeral acts into interactive digital memory. Each gesture—pinching, pulling, smoothing—becomes a traceable artifact, stored not as video, but as data: adaptable, searchable, and open to reinterpretation.

Users explore the archive through an interface that is itself a site of learning. They can view 3D gesture visualizations, listen to artisan reflections, and even try gestures live using their own hands via webcam. The system responds in real time, offering feedback and inviting users to contribute their own interpretations.

At its core, Crafticulate reimagines preservation not as storage, but as dialogue: between human and machine, memory and movement, tradition and transformation.

It begins with a question

Question

"how can we preserve gestures as living knowledge, not just recorded motion?"

The project was developed through iterative prototyping and real-time testing. I started by analyzing pottery-making gestures through observational video, then trained a gesture recognition model using MediaPipe and TensorFlow. From there, I built a 3D visualization interface to explore movement beyond flat video. A physical tray of clay forms was also created to ground the digital gestures in tactile reality.
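A common first step in a pipeline like this, before any classification, is normalizing the tracked hand landmarks so the model sees pose rather than position. The sketch below is illustrative rather than the project's actual code (function names are my own); it assumes MediaPipe-style input, where each frame yields 21 (x, y, z) landmarks with index 0 at the wrist.

```python
# Illustrative sketch (not the project's actual code): normalizing one frame
# of MediaPipe-style hand landmarks before feeding a gesture classifier.
# MediaPipe Hands reports 21 (x, y, z) landmarks; index 0 is the wrist.

import math

def normalize_landmarks(landmarks):
    """Translate landmarks so the wrist sits at the origin, then scale so the
    largest wrist-to-landmark distance is 1. This makes the feature vector
    invariant to where the hand is in frame and (roughly) to hand size."""
    wx, wy, wz = landmarks[0]
    centered = [(x - wx, y - wy, z - wz) for x, y, z in landmarks]
    scale = max(math.sqrt(x*x + y*y + z*z) for x, y, z in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]

def to_feature_vector(landmarks):
    """Flatten 21 normalized landmarks into a 63-value feature vector."""
    return [v for point in normalize_landmarks(landmarks) for v in point]
```

With this kind of normalization, the same pinching gesture performed at the left or right edge of the webcam frame produces (nearly) the same feature vector.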

Each user interaction feeds the system—expanding the dataset and refining the model to better reflect the diversity of embodied craft.

In Progress

Demo Video

When you visit Crafticulate, you begin at the Gesture Gallery — a constellation of floating pottery gestures you can explore freely.

You can filter gestures by:

  • Process (e.g., forming, shaping)
  • Stage (e.g., centering, finishing)
  • Application (e.g., bowl-making, vase-making)
  • Hand Use (single-hand, double-hand)
  • Technique (e.g., coiling, pinching)
  • Cultural Context (comparisons across traditions)

🎟️ Following a Gesture Journey

Suppose you choose bowl-making. You're guided into a process timeline—almost like a recipe made of movements.

Each gesture flows into the next: kneading, pinching, pulling, lifting—each shaping the clay toward form.

🔍 Diving Deeper: Gesture Metadata Pages

When you click on a gesture—say, Ball Kneading—you open its deeper archive page, revealing:

  • A 3D model of the gesture, which you can rotate and animate.
  • A description of when and why this movement is used.
  • The tool or material involved (hands, ribs, wheels).
  • Voice reflections or short clips from artisans sharing personal insights.
  • Cultural overlays comparing similar gestures across global craft traditions.

In Progress

👐🏻 Practicing the Gestures Yourself

When ready, you can turn on your webcam and practice a gesture live. A real-time hand tracking model maps your movements through 21 keypoints, offering visual feedback—like a digital mirror guiding your hands.

Because gestures live on a spectrum, the system doesn't enforce rigid categories. Small variations are part of learning, just as they are in real craft.
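One way to express that spectrum in code is to score a live pose continuously against a reference instead of forcing a hard match. This is a minimal sketch under my own assumptions (the project's actual feedback logic may differ): it compares the user's 21 tracked keypoints to a stored reference pose and returns a smooth 0-to-1 similarity.

```python
# Sketch of "soft" gesture feedback (illustrative; names are hypothetical).
# Rather than a hard match/no-match, compare the user's 21 tracked keypoints
# to a reference pose and return a continuous score, so small variations
# read as "close" rather than "wrong".

import math

def pose_similarity(user, reference):
    """Mean Euclidean distance over 21 (x, y, z) keypoints, mapped to a
    0..1 score where 1.0 is a perfect match."""
    assert len(user) == len(reference) == 21
    distances = [math.dist(u, r) for u, r in zip(user, reference)]
    mean = sum(distances) / len(distances)
    return 1.0 / (1.0 + mean)  # smooth falloff, never a hard cutoff
```

A score like this can drive the "digital mirror" feedback: tint the on-screen hand greener as the score rises, without ever declaring a gesture flatly incorrect.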

In Progress

For me, Crafticulate is not just about storing craft—it's about reactivating it. It's about feeling it in your fingers, translating it across generations, and inviting others to mold it anew. Craft is knowledge. Knowledge is movement. Movement is memory.

Technical Details

The system utilizes:

Motion Tracking Sensors

Hand tracking via MediaPipe with 21 keypoint landmarks for finger and palm articulation

Machine Learning Algorithms

A custom-trained gesture classification model using TensorFlow.js, achieving ~95% validation accuracy
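To show the basic shape of this classification step without reproducing the actual TensorFlow.js model, here is a deliberately simplified stand-in: a 1-nearest-neighbor classifier over flattened keypoint vectors. It is a sketch for intuition only, not the trained network described above.

```python
# Simplified stand-in (illustrative, NOT the project's TensorFlow.js model):
# a 1-nearest-neighbor classifier over flattened 63-value keypoint vectors,
# showing the basic shape of gesture classification from landmark data.

import math

def classify(sample, examples):
    """examples: list of (label, feature_vector) pairs.
    Returns the label of the closest training example by Euclidean distance."""
    best_label, best_dist = None, float("inf")
    for label, vector in examples:
        d = math.dist(sample, vector)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

A neural network replaces the explicit distance search with learned decision boundaries, but the contract is the same: a keypoint feature vector goes in, a gesture label comes out.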

Real-Time Data Visualization

3D gesture rendering with matplotlib, converting recorded JSON keypoint data into interactive coordinates
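The JSON-to-coordinates step might look like the sketch below. The field names are assumptions for illustration, not the project's actual schema: it parses one recorded gesture frame into (x, y, z) tuples that a 3D renderer such as matplotlib can consume.

```python
# Illustrative sketch of the JSON-to-coordinates step (field names are
# assumptions, not the project's actual schema): parse a recorded gesture
# frame into (x, y, z) tuples for a 3D renderer.

import json

def frame_to_coords(frame_json):
    """Convert one recorded frame, e.g.
    {"landmarks": [{"x": 0.1, "y": 0.2, "z": 0.0}, ...]},
    into a list of (x, y, z) tuples."""
    frame = json.loads(frame_json)
    return [(p["x"], p["y"], p["z"]) for p in frame["landmarks"]]
```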

Custom Software Interface

A web-based UI allowing users to explore, practice, and contribute gestures, built with HTML, CSS, JavaScript, and integrated model prediction


References & Acknowledgments

Tools & Technologies

Spatial Pixel - Real-time gesture tracking and visualization framework

MediaPipe - Hand tracking and pose estimation library

TensorFlow - Machine learning framework for gesture classification

Three.js - 3D visualization library for gesture modeling

p5.js - JavaScript library for creative coding

Figma - Collaborative design tool

Cursor - AI-powered coding assistant

Craft Communities & Inspiration

Pakkret Pottery Village, Thailand - Traditional pottery community and knowledge center

Clay Circle Workshop, Bangkok - Contemporary ceramics studio and learning space

Ban Bat Community, Bangkok - Traditional alms-bowl (bat) making community

Fingerspelling - ASL movement documentation project by Hello Monday

Cybersubin - "Human-AI co-dancing," blending traditional Thai dance knowledge with virtual choreographic agents

People

William Martin - Project Advisor

Seth Thompson - Faculty Support

Catherine Griffith - Faculty Support

Laura Kurgan - Faculty Support

Snoweria Zhang - Faculty Support

Violet Whitney - Faculty Support