GroceryGizmo
EECS 106A Robotics Final Project

GroceryGizmo:
Robot-Assisted Grocery Loading

A six-axis Omron robot arm that identifies groceries using AR tags and autonomously loads them into a real refrigerator.

Omron TM5-700 Arm staging groceries
Omron TM5-700 + Robotiq Gripper

Industrial hardware deployed in a real kitchen to explore perception, manipulation, and task-level autonomy.


Team

A multidisciplinary team from EECS, ME, BioE & Business.

Arya Sasikumar

Mechanical Engineering & Business major interested in robotics. Well-versed in Solidworks, ROS, Python, and C++.

Gursimar Virk

Mechanical Engineering major and Material Science minor interested in robotics. Works with humanoids; skilled in Solidworks, ROS, Python, and C++.

Yamuna Rao

EECS major interested in robotics/optimization. Experience with CV, Python, C, PyTorch, and TensorFlow.

Divya Krishnaswamy

Bioengineering major & EECS minor interested in surgical robotics. Experience with Solidworks, C++, Python, and embedded systems.

Patrick O’Connor

Third-year EECS major with interest in EE & robotics. Experience with Fusion, KiCad, and Python.


Abstract

We will use an industrial six-axis robot arm acquired through Omron Robotics to autonomously load groceries into a household refrigerator. The system includes a mobile base, vision-based item identification via AR tags, and a gripper capable of handling cartons, cans, and loose produce. After mapping the kitchen, the robot detects tagged groceries, plans collision-free trajectories, and places each item on a matching AR-tagged shelf inside the fridge. Safety is ensured through soft limits, supervised operation, and an emergency stop. The prototype demonstrates practical home automation and provides a platform for experimenting with perception, grasp planning, and task sequencing in cluttered environments—supporting elderly, disabled, or recovering individuals with day-to-day tasks.


Project Description

Project Goals
  • Identify grocery items using AR tags
  • Pick and load groceries into a refrigerator
  • Match fridge shelf AR tags to object AR tags
Stretch Goals
  • Automatically open the fridge door
  • Use pretrained CV models instead of AR tags
  • Place items by semantic label (e.g., “Dairy”)
  • Sort groceries by expiration date
Software & Hardware Architecture

The system runs on ROS 2 with an Omron six-axis arm, a Robotiq gripper, and an RGB-D camera. AR tags provide the identity and location of each grocery item, and a task manager sequences the robot through its states. MoveIt and inverse kinematics generate collision-free grasp and place trajectories, while gripper force sensing prevents items from slipping or being crushed.
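
As a concrete illustration, the task manager could be a small state machine like the sketch below; the state names and transition rule are our own and purely illustrative.

# Minimal task-manager sketch (state names are illustrative, not a fixed API).
from enum import Enum, auto

class State(Enum):
    SCAN = auto()    # look for AR-tagged groceries on the counter
    PICK = auto()    # grasp the selected item
    MOVE = auto()    # carry it toward the matching shelf tag
    PLACE = auto()   # release the item and retract
    DONE = auto()    # no tagged items remain

def next_state(state: State, items_left: int) -> State:
    """Advance one pick-and-place cycle, looping until no items remain."""
    transitions = {
        State.SCAN: State.PICK if items_left else State.DONE,
        State.PICK: State.MOVE,
        State.MOVE: State.PLACE,
        State.PLACE: State.SCAN,
        State.DONE: State.DONE,
    }
    return transitions[state]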

Sensing

RGB-D camera detects AR tags on groceries & fridge shelves; gripper force sensor ensures safe grasps.
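
A rough sketch of the tag-detection step, assuming an OpenCV 4.7+ ArUco detector, a 4×4 marker dictionary, and a 5 cm marker size (all of these are assumptions, not finalized choices):

# Sketch of AR-tag detection on the RGB image.
import cv2
import numpy as np

MARKER_LEN = 0.05  # assumed marker side length in metres

def detect_tags(gray, camera_matrix, dist_coeffs):
    """Return {marker_id: (rvec, tvec)} poses in the camera frame."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    poses = {}
    if ids is None:
        return poses
    # Marker corners in the marker's own frame (top-left, clockwise), for solvePnP.
    h = MARKER_LEN / 2.0
    obj_pts = np.array([[-h, h, 0], [h, h, 0], [h, -h, 0], [-h, -h, 0]],
                       dtype=np.float32)
    for marker_id, c in zip(ids.flatten(), corners):
        ok, rvec, tvec = cv2.solvePnP(
            obj_pts, c.reshape(4, 2).astype(np.float32),
            camera_matrix, dist_coeffs)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses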

Planning

MoveIt + IK compute collision-free pick-and-place trajectories around obstacles like fridge doors.
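
For collision-aware planning, the fridge (and later its open door) would be added to the MoveIt planning scene as an obstacle. A sketch of the collision-object message, with placeholder dimensions and pose:

# Sketch: describe the fridge body as a box obstacle for the MoveIt planning
# scene (dimensions and pose are placeholders to be replaced with measurements).
from moveit_msgs.msg import CollisionObject
from shape_msgs.msg import SolidPrimitive
from geometry_msgs.msg import Pose

def fridge_collision_object() -> CollisionObject:
    box = SolidPrimitive()
    box.type = SolidPrimitive.BOX
    box.dimensions = [0.7, 0.7, 1.8]  # assumed width, depth, height in metres

    box_pose = Pose()
    box_pose.position.x = 1.0  # assumed fridge centre ~1 m in front of the arm
    box_pose.position.z = 0.9
    box_pose.orientation.w = 1.0

    obj = CollisionObject()
    obj.header.frame_id = "base_link"
    obj.id = "fridge"
    obj.pose.orientation.w = 1.0          # object frame aligned with base_link
    obj.primitives.append(box)
    obj.primitive_poses.append(box_pose)
    obj.operation = CollisionObject.ADD
    return obj

Once this object is applied to the planning scene, MoveIt routes grasp and place trajectories around the fridge rather than through it.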

Actuation

Omron TM5-700 arm and Robotiq gripper manipulate groceries safely using category-based grip forces.
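
The category-based grip forces could be as simple as a lookup keyed by the AR tag's class; the values below are placeholders to be tuned on the real Robotiq gripper and real packaging.

# Sketch: category-based grip force lookup (illustrative values).
GRIP_FORCE_N = {
    "carton": 15.0,   # deformable cartons (milk, juice)
    "can": 30.0,      # rigid cans tolerate a firmer grip
    "produce": 8.0,   # loose fruit and vegetables need a gentle grip
}

def grip_force_for(tag_class: str) -> float:
    """Pick a target grip force in newtons, defaulting to the gentlest."""
    return GRIP_FORCE_N.get(tag_class, min(GRIP_FORCE_N.values()))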


Tasks

Build
  • Manufacture AR tag stickers (see the sketch after this list)
  • Apply tags to groceries and fridge shelf locations
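
A sketch of how the printable tag images could be generated with OpenCV; the 4×4 dictionary, tag count, and pixel size are assumptions, and older OpenCV releases expose the same call as drawMarker.

# Render printable ArUco markers for the sticker sheets (illustrative values).
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

for tag_id in range(15):
    # 400 px per side prints to roughly 5 cm on a 200 dpi sticker sheet.
    img = cv2.aruco.generateImageMarker(dictionary, tag_id, 400)
    cv2.imwrite(f"tag_{tag_id:02d}.png", img)
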
Code
  • Camera Node → /camera/image_raw
  • AR Node → /aruco/poses, /tf
  • Pose Estimator → PoseStamped in camera frame (sketched after this list)
  • Object Selector → /target_pose (base_link)
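
A sketch of the Pose Estimator step, wrapping a detected marker pose (rvec, tvec from the AR node) as a PoseStamped in the camera frame; the frame name and the use of SciPy for the rotation conversion are illustrative choices.

# Sketch: marker pose (rvec, tvec) -> PoseStamped in the camera frame.
from geometry_msgs.msg import PoseStamped
from scipy.spatial.transform import Rotation

def marker_to_pose(rvec, tvec, stamp, frame_id="camera_color_optical_frame"):
    msg = PoseStamped()
    msg.header.stamp = stamp
    msg.header.frame_id = frame_id
    x, y, z = (float(v) for v in tvec.flatten())
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = x, y, z
    # Rotation vector (axis-angle) -> quaternion in x, y, z, w order.
    qx, qy, qz, qw = Rotation.from_rotvec(rvec.flatten()).as_quat()
    msg.pose.orientation.x = float(qx)
    msg.pose.orientation.y = float(qy)
    msg.pose.orientation.z = float(qz)
    msg.pose.orientation.w = float(qw)
    return msg
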
Test
  • Begin with fridge open and item on counter
  • Perform pick + move sequence
  • Place item in correct AR-tagged fridge location

Bill of Materials

Lab Resources

N/A

Other Robotic Platforms

Item            Qty   Owner / Location
Omron TM5-700   1     Arya / Living Room

Other Purchases

Item            Qty   Justification
Grocery Items   10+   Objects to load into fridge
AR Tags         15+   Identify groceries & shelves
Sticker Paper   15+   Attach AR tags to produce


ROS 2 Code Pipeline

Example pipeline showing vision → pose estimation → IK planning.

# Example ROS 2 pipeline for GroceryGizmo

# Camera node publishes RGB(+depth)
#   /camera/image_raw

# AR Node → detects markers
#   /aruco/poses
#   /tf

# Pose Estimator → outputs PoseStamped in camera frame
#   /grocery_pose

# Object Selector:
#   - picks nearest target
#   - transforms pose to base_link
#   - publishes:
#       /target_pose

# MoveIt + IK:
#   - generate collision-free pick+place trajectories
#   - adjust gripping force based on AR tag class
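
Below is a reduced sketch of the Object Selector node, assuming it receives one grocery pose at a time on /grocery_pose (the nearest-target selection is omitted). The topic names follow the pipeline above; the rest is illustrative.

#!/usr/bin/env python3
# Sketch: take a grocery pose in the camera frame, transform it to base_link,
# and publish it as /target_pose.
import rclpy
from rclpy.node import Node
from rclpy.duration import Duration
from geometry_msgs.msg import PoseStamped
import tf2_ros
import tf2_geometry_msgs  # noqa: F401  (registers PoseStamped transform support)

class ObjectSelector(Node):
    def __init__(self):
        super().__init__("object_selector")
        self.tf_buffer = tf2_ros.Buffer()
        self.tf_listener = tf2_ros.TransformListener(self.tf_buffer, self)
        self.pub = self.create_publisher(PoseStamped, "/target_pose", 10)
        self.sub = self.create_subscription(
            PoseStamped, "/grocery_pose", self.on_pose, 10)

    def on_pose(self, pose: PoseStamped):
        try:
            # Express the grocery pose in the arm's base frame before planning.
            target = self.tf_buffer.transform(
                pose, "base_link", timeout=Duration(seconds=0.5))
        except tf2_ros.TransformException as exc:
            self.get_logger().warn(f"TF lookup failed: {exc}")
            return
        self.pub.publish(target)

def main():
    rclpy.init()
    node = ObjectSelector()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()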

GroceryGizmo • EECS 106A • UC Berkeley