A six-degree-of-freedom Omron robot arm that uses AR tags to recognize groceries and autonomously load them into a real refrigerator.
You can drag, pinch, and orbit the full TM5 model to see the workspace clearances and wrist-mounted RealSense setup directly in the browser.
For a lot of people, putting groceries into the fridge is more than just a small chore. It can be physically demanding and uncomfortable, especially when items are heavy, shelves are deep, or space is tight. Reaching high shelves, bending down over and over, and twisting around open doors can be hard or unsafe for older adults, people with limited mobility, shorter users, or anyone dealing with fatigue or injury.
On top of that, there is a constant mental task of figuring out where everything should go in a crowded, messy fridge. Robots do not get tired or frustrated by cramped spaces, but teaching a robot to understand those spaces and move safely inside them is a real challenge.
GroceryGizmo is an autonomous fridge-loading system that uses AR tags, computer vision, and a six-degree-of-freedom Omron arm to handle both perception and manipulation.
The result is a full pipeline that shows how a robot can handle a real household task from start to finish.
As AI and robotics become more common, we want them to support people in everyday life, not just in factories. The goal is not to replace meaningful work, but to offload tasks that are repetitive, physically demanding, or simply out of reach for many people.
GroceryGizmo is one step toward assistive home robots that quietly take care of the small, tiring tasks in the background.
A multidisciplinary team from EECS, Mechanical Engineering, Bioengineering, and Business.
Mechanical Engineering and Business student focused on robotics, with experience in SolidWorks, ROS, Python, and C++.
Mechanical Engineering major and Materials Science minor who works with humanoid robots and designs in SolidWorks, with experience in ROS, Python, and C++.
EECS major interested in robotics and optimization, with experience in computer vision, Python, C, PyTorch, and TensorFlow.
Bioengineering major and EECS minor interested in surgical robotics, with experience in SolidWorks, C++, Python, and embedded systems.
Third-year EECS major interested in electronics and robotics, with experience in Fusion, KiCad, and Python.
The robot must reliably detect AR-tagged objects on the counter and estimate their 3D positions, regardless of how each item is oriented.
AR tags inside the refrigerator act as labeled anchor points, telling the robot exactly where each item should be placed.
The system should automatically pick the closest reachable item to reduce motion and keep the workflow efficient.
Accurate coordinate transforms are crucial so the Omron arm can move to the correct pre-grasp and placement poses in its own base frame.
The robot should approach with a safe offset, lower the gripper for a stable grasp, move through the fridge without collisions, and place items gently at their tagged locations.
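The first criterion above — estimating a tagged item's 3D position — comes down to back-projecting the tag's pixel location with a depth reading through the pinhole camera model. A minimal sketch, with hypothetical RealSense-like intrinsics (the `fx`, `fy`, `cx`, `cy` values are illustrative, not our calibrated parameters):

```python
def deproject_pixel(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a depth reading (meters) into a
    3D point in the camera frame using the pinhole model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics; a tag center detected at pixel (400, 300), 0.5 m away.
point = deproject_pixel(400, 300, 0.5, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
print(point)  # ~ (0.065, 0.049, 0.5) in the camera frame
```

In practice the RealSense SDK provides an equivalent deprojection given the camera's calibrated intrinsics; the sketch just shows the geometry the detector relies on.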
Together, these criteria guide GroceryGizmo toward being predictable, safe, and actually helpful in a real kitchen setting instead of just a lab environment.
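The coordinate-transform criterion can be sketched as a single homogeneous transform that maps a detected tag from the camera frame into the arm's base frame. The numbers below are illustrative (a hypothetical camera pose 0.5 m above the base, looking down), not our calibrated extrinsics:

```python
import numpy as np

def transform_point(T_base_camera: np.ndarray, p_camera: np.ndarray) -> np.ndarray:
    """Map a 3D point from the camera frame into the robot base frame
    using a 4x4 homogeneous transform."""
    p_h = np.append(p_camera, 1.0)       # homogeneous coordinates
    return (T_base_camera @ p_h)[:3]

# Hypothetical extrinsics: camera 0.5 m above the base, z-axis pointing down.
T = np.array([
    [1.0,  0.0,  0.0, 0.0],
    [0.0, -1.0,  0.0, 0.0],
    [0.0,  0.0, -1.0, 0.5],
    [0.0,  0.0,  0.0, 1.0],
])

tag_in_camera = np.array([0.1, 0.2, 0.3])   # AR tag position from the detector
tag_in_base = transform_point(T, tag_in_camera)
print(tag_in_base)  # → [0.1, -0.2, 0.2]
```

In the actual stack, tf2 maintains this transform chain (camera → wrist → base) from the robot's kinematics, so the lookup is a single frame query rather than a hand-written matrix.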
The GroceryGizmo stack combines collaborative robot hardware with ROS 2, MoveIt2 planning, and a lightweight GUI. Each grocery item is scanned on the counter, matched to a location in the fridge, and moved along a planned trajectory that respects the door, shelves, and workspace limits.
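The closest-reachable-item step in the workflow above can be sketched as a simple filter-then-minimize over detected items. The item names and the reach radius here are illustrative assumptions, not measured TM5 limits:

```python
import math

def pick_nearest_reachable(items, base_xy=(0.0, 0.0), max_reach=0.7):
    """Return (name, (x, y)) for the closest item within the arm's reach.

    `items` maps item names to planar (x, y) positions in the base frame;
    `max_reach` is a hypothetical workspace radius for the arm.
    """
    reachable = [
        (math.dist(base_xy, xy), name, xy)
        for name, xy in items.items()
        if math.dist(base_xy, xy) <= max_reach
    ]
    if not reachable:
        return None           # nothing in the workspace this cycle
    _, name, xy = min(reachable)
    return name, xy

groceries = {'milk': (0.6, 0.1), 'eggs': (0.3, 0.2), 'juice': (0.9, 0.0)}
print(pick_nearest_reachable(groceries))  # → ('eggs', (0.3, 0.2))
```

Choosing the nearest item each cycle keeps individual motions short and lets the planner re-evaluate after every place, so items added mid-run are picked up naturally.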
Expand each diagram to explore how perception, planning, and hardware all connect in GroceryGizmo.
Browse a few representative Python files from the ROS 2 stack: launch orchestration, AR tag detection, package entry points, and linter tests.
These samples highlight key parts of the system: ROS launch orchestration, AR tag detection, package configuration, and supporting utilities.
Setuptools entry points for ROS 2 console scripts (GUI, gripper, AR detection).
from setuptools import find_packages, setup
import os
from glob import glob

package_name = 'tm5_grocery_gizmo'

setup(
    name=package_name,
    version='0.0.0',
    packages=find_packages(exclude=['test']),
    data_files=[
        ('share/ament_index/resource_index/packages',
            ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
        (os.path.join('share', package_name, 'launch'), glob('launch/*.launch.py')),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='arya',
    maintainer_email='arya@todo.todo',
    description='Custom applications for TM5-700 robot',
    license='MIT',
    extras_require={
        'test': [
            'pytest',
        ],
    },
    entry_points={
        'console_scripts': [
            'save_camera_images = tm5_grocery_gizmo.camera_image_saver:main',
            'gripper_control = tm5_grocery_gizmo.robotiq_gripper_control:main',
            'gripper_tmscript = tm5_grocery_gizmo.robotiq_gripper_tmscript:main',
            'gripper_variable = tm5_grocery_gizmo.robotiq_gripper_variable:main',
            'gripper_position_control = tm5_grocery_gizmo.robotiq_gripper_position_control:main',
            'robot_gui = tm5_grocery_gizmo.robot_joint_gui:main',
            'robot_gui_smooth = tm5_grocery_gizmo.robot_joint_gui_smooth:main',
            'robot_gui_camera = tm5_grocery_gizmo.robot_joint_gui_camera:main',
            'robot_gui_leader = tm5_grocery_gizmo.robot_joint_gui_leader:main',
            'robot_gui_recorder = tm5_grocery_gizmo.robot_joint_gui_recorder:main',
            'ar_tag_detector_service = tm5_grocery_gizmo.ar_tag_detector_service:main',
        ],
    },
)
Explore the simplified CAD assets we use to plan the workspace layout, storage locations, and reachability inside the fridge.
Custom Intel RealSense mount designed for the Omron robotic arm.
Material-coded reference cube used to demonstrate MTL-based shading in the viewer.
We evaluated GroceryGizmo on a full countertop-to-fridge workflow using AR-tagged groceries of different shapes and sizes. The system executed autonomous pick-and-place cycles, logged motion telemetry, and demonstrated how perception, planning, and manipulation interact in a real setup.
GroceryGizmo met its main design goals: it could detect groceries, match them to fridge placements, and execute constrained pick-and-place motions. The robot handled several item types without collisions and maintained consistent localization accuracy across repeated runs.
GroceryGizmo • EECS 106A • UC Berkeley