A six-axis Omron robot arm that identifies groceries using AR tags and autonomously loads them into a real refrigerator.

Industrial hardware deployed in a real kitchen to explore perception, manipulation, and task-level autonomy.
A multidisciplinary team from EECS, ME, BioE & Business.
Mechanical Engineering & Business major interested in robotics. Well-versed in SolidWorks, ROS, Python, and C++.
Mechanical Engineering major and Materials Science minor interested in robotics. Works with humanoids; skilled in SolidWorks, ROS, Python, and C++.
EECS major interested in robotics/optimization. Experience with CV, Python, C, PyTorch, and TensorFlow.
Bioengineering major & EECS minor interested in surgical robotics. Experience with SolidWorks, C++, Python, and embedded systems.
Third-year EECS major interested in EE & robotics. Experience with Fusion, KiCad, and Python.
We will use an industrial six-axis robot arm acquired through Omron Robotics to autonomously load groceries into a household refrigerator. The system includes a mobile base, vision-based item identification via AR tags, and a gripper capable of handling cartons, cans, and loose produce. After mapping the kitchen, the robot detects tagged groceries, plans collision-free trajectories, and places each item on a matching AR-tagged shelf inside the fridge. Safety is ensured through soft limits, supervised operation, and an emergency stop. The prototype demonstrates practical home automation and provides a platform for experimenting with perception, grasp planning, and task sequencing in cluttered environments, supporting elderly, disabled, or recovering individuals with day-to-day tasks.
A ROS 2 system using an Omron six-axis arm, a Robotiq gripper, and an RGB-D camera. AR tags provide the identity and location of each grocery item. A task manager controls the robot's state. MoveIt with inverse kinematics generates collision-free grasp and place trajectories. Force sensing prevents items from slipping or being crushed.
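The task manager described above can be sketched as a simple finite-state machine. The state names and transitions below are illustrative assumptions, not the project's actual implementation:

```python
from enum import Enum, auto

class State(Enum):
    # Illustrative states for the grocery-loading task manager (assumptions)
    IDLE = auto()
    DETECT = auto()   # scan for AR-tagged groceries
    PICK = auto()     # grasp the selected item
    PLACE = auto()    # place it on the matching AR-tagged shelf
    DONE = auto()

class TaskManager:
    """Minimal sketch of a task-level state machine for one loading run."""

    def __init__(self, items):
        self.queue = list(items)  # groceries still to be loaded
        self.state = State.IDLE
        self.current = None

    def step(self):
        # Advance one state per call; the real system would gate each
        # transition on perception and motion-planning results.
        if self.state is State.IDLE:
            self.state = State.DETECT if self.queue else State.DONE
        elif self.state is State.DETECT:
            self.current = self.queue.pop(0)  # nearest-first in practice
            self.state = State.PICK
        elif self.state is State.PICK:
            self.state = State.PLACE
        elif self.state is State.PLACE:
            self.state = State.DETECT if self.queue else State.DONE
        return self.state
```

Each `step()` would be driven by the ROS 2 node's timer or action callbacks in the actual system.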
RGB-D camera detects AR tags on groceries & fridge shelves; gripper force sensor ensures safe grasps.
MoveIt + IK compute collision-free pick-and-place trajectories around obstacles like fridge doors.
Omron TM5-700 arm and Robotiq gripper manipulate groceries safely using category-based grip forces.
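The category-based grip forces mentioned above can be sketched as a lookup from the detected AR tag class to a target force. The categories and force values (in newtons) here are illustrative assumptions, not measured values from the project:

```python
# Sketch: map an AR tag's grocery category to a target grip force.
# Force values in newtons are illustrative assumptions.
GRIP_FORCE_N = {
    "carton": 15.0,   # firm, but below the crushing threshold
    "can": 30.0,      # rigid; tolerates a stronger grip
    "produce": 8.0,   # delicate items get a light grip
}

def grip_force_for(tag_class: str, default: float = 10.0) -> float:
    """Return the commanded grip force for a detected AR tag class,
    falling back to a conservative default for unknown classes."""
    return GRIP_FORCE_N.get(tag_class, default)
```

In the real system, the gripper's force sensor would close the loop around this commanded value.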
N/A
| Item | Qty | Owner / Location |
|---|---|---|
| Omron TM5-700 | 1 | Arya / Living Room |

| Item | Qty | Justification |
|---|---|---|
| Grocery Items | 10+ | Objects to load into fridge |
| AR Tags | 15+ | Identify groceries & shelves |
| Sticker Paper | 15+ | Attach AR tags to produce |


Example pipeline showing vision → pose estimation → IK planning.
# Example ROS 2 pipeline for GroceryGizmo
#
# Camera node publishes RGB (+ depth):
#   /camera/image_raw
# AR tag node detects markers and publishes:
#   /aruco/poses
#   /tf
# Pose estimator outputs a PoseStamped in the camera frame:
#   /grocery_pose
# Object selector:
#   - picks the nearest target
#   - transforms its pose into base_link
#   - publishes /target_pose
# MoveIt + IK:
#   - generates collision-free pick-and-place trajectories
#   - adjusts gripping force based on the AR tag class
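The object-selector step can be sketched in plain Python without ROS by treating tag detections as points in the camera frame and applying a homogeneous transform to express the chosen one in `base_link`. The helper names and transform values below are illustrative assumptions:

```python
import math

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists)
    to a 3D point, returning the transformed (x, y, z)."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

def select_nearest(targets_cam, T_base_cam):
    """Pick the detected grocery nearest to the camera origin and
    return its position expressed in the base_link frame."""
    nearest = min(targets_cam, key=lambda p: math.dist((0.0, 0.0, 0.0), p))
    return transform_point(T_base_cam, nearest)
```

In the actual pipeline, `T_base_cam` would come from `/tf` and the result would be published as the `/target_pose` PoseStamped.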
GroceryGizmo • EECS 106A • UC Berkeley