Chaerim Wendy Moon

PhD Candidate in Mechanical Science and Engineering
Novel Mobile Robots Lab (NMbL)
University of Illinois at Urbana-Champaign


Hi! I am a robotics researcher, passionate about shaping a world that's not only more capable, but also more magical 🪄. Currently, I am a 4th-year PhD student at UIUC, advised by Professor Justin Yim. Previously, I received my M.S. and B.S. (with Great Honor) in Mechanical Engineering from Korea University in 2022 and 2020, respectively.

My research focuses on designing motion planning architectures for heterogeneous robotic systems, with an emphasis on how embodiment, task demands, and environmental constraints shape architectural design. Previous and ongoing research projects span domains including human-robot communication; teleoperation- and vision-based manipulation; and unconventional legged and non-legged locomotion (click the cards for details):

🤖 Interactive Head Module

Robotic Head Module for Non-Verbal Communication [🔗 View Project Page]
A modular 3-DOF robotic head is developed with a hierarchical system framework that integrates perception, planning, and control in layers. It enables human–robot interaction through expressive head gestures (i.e., nodding, shaking, and lateral tilting) and vision-based subject tracking. The system is also designed to adapt to arbitrary and dynamic mounting configurations, enhancing its modular deployability and robustness to platform variability.

Keywords: Social Human-Robot Interaction · Non-Verbal Communication · Vision-Based Interaction

🦾 Superlimb Manipulation

Supernumerary Robotic Limbs for Manipulation Assistance [🔗 View Project Page]
A dual-layer coordinated motion planning framework is introduced to support daily manipulation tasks using a reconfigurable wearable supernumerary robotic limb (SRL) platform. The system supports multiple operational modes (i.e., upper-body motion-based teleoperation and vision-based object manipulation) and accommodates flexible robot configurations through a modular software architecture. To account for the physically coupled human–robot context, an additional planning layer generates moment-compensation motions that mitigate asymmetric physical loads transferred to the user.

Keywords: Physical Human-Robot Interaction · Multi-Limb Coordination · Object Manipulation · Teleoperation

🦿 Superlimb Locomotion

Supernumerary Robotic Limbs for Locomotion Assistance
Details are confidential for now. This study proposes a grasp-based locomotion planning framework for an SRL platform operating under microgravity.

Keywords: Grasp-Based Quadruped Locomotion · Multi-Limb Coordination · Locomotion Optimization

❓ Non-Legged Locomotion

Novel Non-Legged Locomotion Paradigm
Details are confidential for now. This study proposes a novel locomotion paradigm that addresses the limitations of conventional robots, enabling operation in challenging terrains.


Across these domains, the proposed system architectures share a common design philosophy: (1) scenario-driven formulation of physical and operational constraints; (2) behavior-level modularization; and (3) task-sensitive hierarchical structuring. Through these implementations, my research offers design-level insights into how motion planning architectures can be structured in response to domain-specific constraints, highlighting recurring patterns across varied deployment scenarios and heterogeneous robotic systems. These principles further extend to underexplored domains, demonstrating their potential to inform system design in emerging application areas.

Outside of research, I’m an avid orchestral violinist 🎻 and sport climber 🧗🏼‍♀️ — pursuits that challenge me to stay focused, adaptable, and in sync with complex systems.