
Computer Vision Controlled Stewart Platform

Problem

The goal of this project was to balance a ball on a platform using a microcontroller running real-time control code.

Design

Made as a group project for a class on microprocessor-based mechanical design, this six-degree-of-freedom Stewart platform uses computer vision to balance a ball on a flat platform. The system was controlled with a LabVIEW GUI.

The flow of information is as follows:

  • A PlayStation Eye camera mounted above the platform captures images of the ball.
  • LabVIEW processes each image to extract the ball's X and Y coordinates, then sends them wirelessly to an ESP32 over TCP.
  • The ESP32 performs PD calculations to adjust the angles of six servo motors (see the sketch after this list).
  • The platform's tilt in the XZ and YZ planes is adjusted to move the ball back to the center of the platform.
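
As a rough illustration of the PD step, the sketch below maps the ball's pixel-space position error to two platform tilt commands. The gains, loop rate, and example coordinates are hypothetical placeholders, not the project's tuned values; in the real system the two tilt commands would still have to pass through the platform's inverse kinematics to produce the six individual servo angles.

```c
#include <stdio.h>

/* Hypothetical PD gains and loop period; not the project's tuned values. */
#define KP    0.04f   /* deg of tilt per pixel of error              */
#define KD    0.01f   /* deg of tilt per (pixel/s) of error rate     */
#define DT_S  0.02f   /* control period in seconds (50 Hz assumed)   */

typedef struct {
    float prev_error_x;  /* last X error, for the derivative term */
    float prev_error_y;  /* last Y error, for the derivative term */
} pd_state_t;

/* Compute platform tilt commands (degrees) from the ball position
 * reported by LabVIEW. (target_x, target_y) is the platform center. */
static void pd_update(pd_state_t *s,
                      float ball_x, float ball_y,
                      float target_x, float target_y,
                      float *tilt_x_deg, float *tilt_y_deg)
{
    float ex = target_x - ball_x;
    float ey = target_y - ball_y;

    float dex = (ex - s->prev_error_x) / DT_S;  /* error rate, px/s */
    float dey = (ey - s->prev_error_y) / DT_S;

    *tilt_x_deg = KP * ex + KD * dex;
    *tilt_y_deg = KP * ey + KD * dey;

    s->prev_error_x = ex;
    s->prev_error_y = ey;
}

int main(void)
{
    pd_state_t state = {0};
    float tx, ty;

    /* Example: ball at (350, 210) px, platform center at (320, 240) px. */
    pd_update(&state, 350.0f, 210.0f, 320.0f, 240.0f, &tx, &ty);
    printf("tilt X = %.2f deg, tilt Y = %.2f deg\n", tx, ty);
    return 0;
}
```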

This project was completed with Chase LaForge-Sciacqua, Jacob Lopez, Kyuheon Kim, and Nathan Seymour.

Real Time and Multitasking with RTOS

Driving six independent servo motors put us beyond the pulse-width modulation capacity of the ESP32's standard output. This issue was overcome by using a set of hardware timers to generate the pulses that set each servo's duty cycle.
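
As one possible illustration (not necessarily the peripheral or pulse range the team used), the sketch below drives a hobby servo from the ESP32's LEDC hardware timers in ESP-IDF C: one timer runs at the standard 50 Hz servo frame rate, and an angle is converted into a 16-bit duty value. The GPIO pin, 1000-2000 microsecond pulse range, and center position are assumptions.

```c
#include <stdint.h>
#include "driver/ledc.h"

#define SERVO_FREQ_HZ  50      /* standard hobby-servo frame rate        */
#define SERVO_MIN_US   1000    /* assumed pulse width at 0 degrees       */
#define SERVO_MAX_US   2000    /* assumed pulse width at 180 degrees     */
#define PERIOD_US      (1000000 / SERVO_FREQ_HZ)

/* Convert a servo angle (0-180 deg) into a 16-bit LEDC duty value. */
static uint32_t angle_to_duty(float angle_deg)
{
    float pulse_us = SERVO_MIN_US +
                     (SERVO_MAX_US - SERVO_MIN_US) * (angle_deg / 180.0f);
    return (uint32_t)((pulse_us / PERIOD_US) * ((1 << 16) - 1));
}

/* Configure one LEDC timer at 50 Hz and attach a servo on a GPIO pin. */
static void servo_channel_init(int gpio, ledc_channel_t channel)
{
    ledc_timer_config_t tcfg = {
        .speed_mode      = LEDC_LOW_SPEED_MODE,
        .duty_resolution = LEDC_TIMER_16_BIT,
        .timer_num       = LEDC_TIMER_0,
        .freq_hz         = SERVO_FREQ_HZ,
        .clk_cfg         = LEDC_AUTO_CLK,
    };
    ledc_timer_config(&tcfg);

    ledc_channel_config_t ccfg = {
        .gpio_num   = gpio,
        .speed_mode = LEDC_LOW_SPEED_MODE,
        .channel    = channel,
        .timer_sel  = LEDC_TIMER_0,
        .duty       = angle_to_duty(90.0f),  /* start roughly centered */
        .hpoint     = 0,
    };
    ledc_channel_config(&ccfg);
}

/* Update one servo's commanded angle. */
static void servo_write(ledc_channel_t channel, float angle_deg)
{
    ledc_set_duty(LEDC_LOW_SPEED_MODE, channel, angle_to_duty(angle_deg));
    ledc_update_duty(LEDC_LOW_SPEED_MODE, channel);
}
```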

A Real Time Operating System (RTOS) was used to schedule the remaining tasks needed for smooth operation of the system. The RTOS set the relative priorities of the task communicating with LabVIEW and the task performing the PD calculations.
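
A minimal sketch of how two FreeRTOS tasks could be created with different priorities is shown below. The task names, stack sizes, loop rates, and the choice to rank the control loop above communication are all assumptions for illustration, not the team's actual task layout.

```c
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

/* Handles TCP traffic from LabVIEW: receives the ball's (X, Y) coordinates. */
static void comm_task(void *arg)
{
    for (;;) {
        /* ... read the latest coordinates from the TCP socket ... */
        vTaskDelay(pdMS_TO_TICKS(10));
    }
}

/* Runs the PD calculation and updates the six servo commands. */
static void control_task(void *arg)
{
    for (;;) {
        /* ... compute PD output and write new servo duty cycles ... */
        vTaskDelay(pdMS_TO_TICKS(20));  /* assumed 50 Hz control rate */
    }
}

void app_main(void)
{
    /* Higher number = higher priority in FreeRTOS.  Here the control loop
     * is given priority over communication so servo updates stay on time. */
    xTaskCreate(control_task, "pd_control",  4096, NULL, 5, NULL);
    xTaskCreate(comm_task,    "labview_tcp", 4096, NULL, 4, NULL);
}
```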

The ESP32 was programmed in C with VS Code.