Vision-Guided Robotic Sorting System

A complete computer vision and robotics integration project using ArUco markers, homography mapping, and the EEZYbotARM MK2 to autonomously detect, track, and sort colored objects.


Project Overview

This system combines advanced computer vision techniques with robotic control to create an autonomous object sorting solution. Using a downward-facing camera and ArUco marker-based homography, the system accurately maps pixel coordinates to real-world millimeter positions on a workspace. Objects are detected, tracked through multiple frames, classified by color and orientation, and then picked and sorted by a 4-DOF robotic arm.

The project demonstrates practical applications of OpenCV, inverse kinematics, serial communication, and state machine design for real-time robotic control.

Key Features

🎯 ArUco Homography

Four corner markers establish precise coordinate mapping from camera pixels to workspace millimeters with millimeter-level accuracy.

🔍 Multi-Object Tracking

Sophisticated tracking algorithm maintains object identity across frames, handles occlusions, and manages appearance/disappearance.

🎨 Color Classification

HSV-based color detection with a voting system ensures accurate classification (RED, GREEN, BLUE, YELLOW) even under varying lighting.

📐 Orientation Detection

Calculates object angle using minimum area rectangles with circular mean smoothing for stable rotation measurements.

🤖 Robot Integration

Serial communication with the Arduino-controlled EEZYbotARM using inverse kinematics for precise pick-and-place operations.

⚡ Real-Time Processing

Runs at 30+ FPS with state machine logic ensuring objects are stable before picking, preventing false detections.

System Demonstration

Workflow Pipeline

  1. Detection: Camera detects colored objects on white table surface
  2. Stabilization: Objects tracked until position is stable (8 frames, <5mm variation)
  3. Classification: Color identified via HSV voting (70% majority required)
  4. Orientation: Angle locked using circular mean of last 8 measurements
  5. Queueing: Object marked ready, queued for robot pickup
  6. Pick Command: System sends PICK <id> <x> <y> <angle> <color> <target_x> <target_y>
  7. Robot Execution: Arm calculates inverse kinematics, picks object, sorts to color zone
  8. Completion: Robot returns DONE, system removes object from tracking
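The command exchange in steps 6-8 can be sketched as follows. This is a minimal host-side illustration assuming pySerial and a single-line DONE reply; the function name and baud rate are illustrative, not necessarily those in the source.

```python
def format_pick(obj_id, x, y, angle, color, tx, ty):
    """Build the PICK command line described above (coordinates in mm)."""
    return f"PICK {obj_id} {x:.1f} {y:.1f} {angle:.1f} {color} {tx:.1f} {ty:.1f}\n"

# Hypothetical usage with pySerial (port and baud rate are assumptions):
# import serial
# ser = serial.Serial("COM3", 115200, timeout=1)
# ser.write(format_pick(3, 120.0, 85.5, 45.0, "RED", 600.0, 50.0).encode())
# while ser.readline().strip() != b"DONE":
#     pass  # wait for the arm to finish before dispatching the next object
```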

Technical Implementation

Computer Vision Pipeline

# ArUco Marker Detection & Homography
import cv2
import cv2.aruco as aruco
import numpy as np

aruco_dict = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
detector = aruco.ArucoDetector(aruco_dict, aruco.DetectorParameters())
corners, ids, _ = detector.detectMarkers(gray)
H, _ = cv2.findHomography(image_points, world_points)

# Object Detection (HSV-based)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)
mask = cv2.bitwise_and(cv2.inRange(s, 21, 255), cv2.inRange(v, 0, 249))
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Coordinate Transformation (pixel -> mm); input must be float32 of shape (N, 1, 2)
world_pos = cv2.perspectiveTransform(pixel_pos.reshape(-1, 1, 2).astype(np.float32), H)

Robot Control (Arduino)

// Inverse Kinematics: base rotation and shoulder-to-wrist distance
base = atan2(y, x) * 180.0 / PI;
d = sqrt(sq(reach) + sq(z_adj));

// Law of Cosines for Joint Angles
cosElbow = (sq(L1) + sq(L2) - sq(d)) / (2.0 * L1 * L2);
elbowAngle = acos(cosElbow) * 180.0 / PI;

// Smooth Servo Movement: step one degree per loop iteration
for (int pos = current; pos <= target; pos++) {
  servo.write(pos);
  delay(MOVE_DELAY);
}

System Components

Hardware

- EEZYbotARM MK2 4-DOF robotic arm (servo-driven, Arduino Uno controlled)
- Downward-facing camera over a white table workspace
- Four printed ArUco corner markers (DICT_4X4_50, IDs 0-3)

Software Stack

- Python 3 with OpenCV (opencv-python, opencv-contrib-python), NumPy, and pySerial
- Arduino firmware (eezybot_controller.ino) handling inverse kinematics and serial commands

Installation & Setup

Python Environment

# Install dependencies
pip install opencv-python opencv-contrib-python numpy pyserial

# Clone repository
git clone https://github.com/BenjaminKemmitz/vision-robot-sorting
cd vision-robot-sorting
    

Arduino Setup

Upload the firmware to the Arduino:

  1. Open eezybot_controller.ino in the Arduino IDE
  2. Calibrate arm dimensions (ARM_SHOULDER_LENGTH, ARM_ELBOW_LENGTH)
  3. Adjust servo limits (SHOULDER_MIN/MAX, ELBOW_MIN/MAX)
  4. Upload to the Arduino Uno

Running the System

# Configure workspace dimensions and COM port
# Edit configuration in homography_robot_sorting.py:
XMAX = 663.0          # Workspace width (mm)
YMAX = 316.0          # Workspace height (mm)
ROBOT_PORT = "COM3"   # Arduino serial port

# Run the system
python homography_robot_sorting.py
    

Calibration Process

1. ArUco Markers

Print four markers (IDs 0-3) from the DICT_4X4_50 dictionary. Place them at the table corners in clockwise order, starting at the top-left.

2. Workspace Measurement

Measure distances between markers in millimeters. Update XMAX and YMAX values in config.

3. Robot Dimensions

Measure shoulder-to-elbow and elbow-to-gripper lengths. Critical for accurate inverse kinematics.

4. Sort Zones

Define target drop-off positions for each color in the SORT_ZONES dictionary. Ensure all zones are reachable by the arm.
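A SORT_ZONES entry might look like the following; the coordinates below are placeholders for illustration and must be replaced with positions measured on your own workspace.

```python
# Hypothetical sort-zone configuration: color -> (x_mm, y_mm) drop-off target
SORT_ZONES = {
    "RED":    (550.0,  60.0),
    "GREEN":  (550.0, 140.0),
    "BLUE":   (550.0, 220.0),
    "YELLOW": (550.0, 300.0),
}
```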

Core Algorithms

Object Tracking State Machine

STATE_NEW → Object first detected, position tracking begins

STATE_STABLE → Position stable for 8 frames, color voting active

STATE_QUEUED → Ready to pick, waiting for robot availability

STATE_PICKING → Robot currently executing pick sequence

STATE_PICKED → Successfully sorted, removed from tracking
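The NEW → STABLE transition hinges on the stability criterion quoted above (8 frames, under 5 mm of movement). A minimal sketch of that check, with illustrative parameter names:

```python
import math

def is_stable(positions, n=8, tol_mm=5.0):
    """True if the last n (x, y) positions stay within tol_mm of each other."""
    if len(positions) < n:
        return False
    xs, ys = zip(*positions[-n:])
    # Bound the movement by the diagonal of the positions' bounding box
    return math.hypot(max(xs) - min(xs), max(ys) - min(ys)) < tol_mm
```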

Circular Angle Smoothing

def circular_mean(angles_deg):
    """Average angles that wrap at 0°/180° (e.g. 179° and 1° average to 0°)."""
    # Double the angles so the 0/180 wrap becomes a standard 0/360 wrap,
    # take the vector mean, then halve the result back into [0, 180)
    doubled = np.deg2rad(np.asarray(angles_deg) * 2.0)
    mean = np.arctan2(np.mean(np.sin(doubled)), np.mean(np.cos(doubled)))
    return (np.rad2deg(mean) / 2.0) % 180

HSV Color Classification

# Hue ranges for different colors (OpenCV hue spans 0-179)
def classify_hue(h):
    if h < 10 or h > 170:  return "RED"
    if 15 < h < 30:        return "YELLOW"
    if 35 < h < 85:        return "GREEN"
    if 100 < h < 130:      return "BLUE"
    return "UNKNOWN"

Performance Metrics

⚡ Processing Speed

30-35 FPS

Real-time vision processing on standard hardware

🎯 Position Accuracy

±2-3 mm

Homography-based coordinate transformation precision

🔄 Sort Speed

~15 sec/object

Complete pick-and-place cycle including stabilization

✓ Success Rate

95%+

Successful picks under optimal lighting conditions

Project Structure

vision-robot-sorting/
│
├── homography_robot_sorting.py    # Main vision system
├── eezybot_controller.ino         # Arduino firmware
├── SETUP_GUIDE.md                 # Complete setup instructions
│
├── assets/
│   └── aruco_markers/             # Printable ArUco markers
│
├── docs/
│   ├── calibration.md             # Calibration procedures
│   └── troubleshooting.md         # Common issues & fixes
│
└── requirements.txt               # Python dependencies

Applications & Extensions

๐Ÿญ Manufacturing

Automated part sorting, quality inspection, assembly line pick-and-place

๐Ÿ“š Education

Teaching computer vision, robotics, and control systems integration

๐Ÿ”ฌ Research

Vision-guided manipulation, human-robot collaboration studies

๐ŸŽฎ Competitions

Robotics competitions requiring autonomous object manipulation

Future Enhancements

Technical Challenges Solved

โš™๏ธ Angle Wrapping

Implemented circular mean to properly average angles near 0ยฐ/180ยฐ boundary, preventing jitter.

🎨 Variable Lighting

HSV color space with voting system provides robust classification across lighting conditions.
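The voting idea can be sketched with a simple majority tally over recent per-frame hue classifications; the 70% threshold matches the pipeline description above, while the function name is illustrative.

```python
from collections import Counter

def vote_color(samples, majority=0.7):
    """Return a color label only when one label wins >= 70% of the samples."""
    if not samples:
        return None
    label, count = Counter(samples).most_common(1)[0]
    return label if count / len(samples) >= majority else None
```

Requiring a supermajority rather than a single-frame reading means a few frames of specular glare or shadow cannot flip the classification.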

๐Ÿ” Object Persistence

Distance-based matching with missed frame counter maintains tracking through brief occlusions.
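Distance-based matching can be sketched as a greedy nearest-neighbour assignment between existing tracks and fresh detections; the thresholds and data shapes here are illustrative assumptions, not the source's exact values.

```python
import math

def match_detections(tracks, detections, max_dist_mm=20.0):
    """Greedily match track id -> detection index by nearest distance.

    tracks:     {track_id: (x_mm, y_mm)} last known positions
    detections: [(x_mm, y_mm), ...] positions in the current frame
    Returns (matched, unmatched_detection_indices); tracks left unmatched
    would increment a missed-frame counter and be dropped past a limit.
    """
    matched = {}
    free = list(range(len(detections)))
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist_mm
        for i in free:
            d = math.hypot(detections[i][0] - tx, detections[i][1] - ty)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            matched[tid] = best
            free.remove(best)
    return matched, free  # free = indices of brand-new objects
```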

🤖 Kinematic Limits

Reachability checking prevents invalid arm commands, with graceful fallback for unreachable targets.
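A reachability check for a two-link arm reduces to testing that the target lies inside the annulus between the folded and fully extended configurations. The link lengths below are placeholders; use the values measured during calibration.

```python
import math

L1, L2 = 135.0, 147.0   # shoulder and elbow link lengths (mm, illustrative)

def reachable(x, y, z=0.0):
    """True if (x, y, z) lies inside the two-link arm's annular workspace."""
    r = math.hypot(math.hypot(x, y), z)   # straight-line distance to target
    return abs(L1 - L2) < r < (L1 + L2)
```

Commands failing this test would be rejected before transmission, letting the system fall back gracefully instead of driving a servo against its limits.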