
SorterBot

GitHub repo for my portion of the project

Introduction

SorterBot was a team project created for a control system design course. The goal was to sort items from one box to another based on their color. We decided that an RRP (two rotational joints, one prismatic joint) robot in a SCARA configuration would be ideal for the task.

SorterBot hardware layout
SorterBot end effector

We considered a few different end effectors; in the end, we decided to use a suction cup as it would allow us to lift items regardless of their dimensions. All the items had flat surfaces suitable for the suction cup.

Although I assisted with the design and construction of the robot, my primary responsibility was to program the BeagleBone Black (BBB) portion of the robot's software.

Programming

Because of the computer vision requirement, we decided to perform the image processing on a laptop and communicate the target coordinates to the BBB, which would control the arm.

SorterBot behavior flowchart

As seen in the diagram above, we decided to separate the behavior into four modules, or nodes: inverse kinematics, arm positioning, end effector control, and a "BBB core" module to coordinate the other modules.

We used the Robot Operating System (ROS) framework to enable communications between all the parts of the program. The arrows in the diagram represent ROS messages being sent between different nodes.

I wrote the nodes that perform calculations in C++ for greater speed. Nodes that directly interfaced with the hardware (using GPIO or PWM) were written in Python to make use of the Adafruit BBIO library.
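To give a sense of the message flow, here is a minimal sketch (in Python, using rospy) of how a target coordinate could be handed from the vision side to the BBB core. The topic name `/sorterbot/target` and the use of `geometry_msgs/Point` are my own assumptions for illustration, not the project's actual interface.

```python
# Sketch of the target-coordinate handoff over ROS. The topic name and
# message type are assumptions for illustration only.
import rospy
from geometry_msgs.msg import Point

# Vision / laptop side (runs as its own node).
def run_vision_publisher():
    rospy.init_node('vision_node')
    pub = rospy.Publisher('/sorterbot/target', Point, queue_size=1)
    rospy.sleep(0.5)                              # let the connection come up
    pub.publish(Point(x=0.12, y=0.30, z=0.0))     # example target in meters

# BBB core side (runs as a separate node).
def on_target(msg):
    rospy.loginfo("New target: (%.3f, %.3f)", msg.x, msg.y)
    # ...forward the target to the inverse kinematics node here...

def run_core_listener():
    rospy.init_node('bbb_core')
    rospy.Subscriber('/sorterbot/target', Point, on_target)
    rospy.spin()
```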

BBB Core Node

The BBB Core node coordinates the other nodes and implements the state machine that makes up the robot's logic. At any given time, the robot is in one of three states.
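As a rough illustration of this structure, the sketch below shows a small state machine for the core node. The state names are placeholders I chose for the example; they are not the project's actual states.

```python
# Minimal state-machine sketch for the BBB core node.
# The state names here are hypothetical placeholders.
from enum import Enum

class State(Enum):
    WAITING = 1        # no target yet; wait for the vision system
    MOVING = 2         # arm is being driven to the commanded joint angles
    TRANSFERRING = 3   # end effector is grabbing or dropping an item

class CoreStateMachine:
    def __init__(self):
        self.state = State.WAITING

    def on_target_received(self):
        if self.state is State.WAITING:
            self.state = State.MOVING

    def on_arm_in_position(self):
        if self.state is State.MOVING:
            self.state = State.TRANSFERRING

    def on_transfer_done(self):
        if self.state is State.TRANSFERRING:
            self.state = State.WAITING
```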

End Effector Control Node

The End Effector Control (EEC) node interfaces between the logic and the hardware of the end effector. It uses PWM to signal the servo to raise and lower the suction cup, and a digital GPIO signal to switch the vacuum pump on and off. A contact sensor lets the node detect when the suction cup touches a hard surface.

If the EEC node receives a grab command, it lowers the suction cup until it either senses contact or detects that the suction cup has traveled too far without contact. If the suction cup makes contact, the node activates the vacuum pump. Finally, it raises the suction cup and reports success or failure.

If the EEC node receives a drop command, it simply turns off the vacuum pump.
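A minimal sketch of how such a grab/drop sequence could look with the Adafruit BBIO library is shown below. The pin names, servo duty cycles, and step size are illustrative assumptions, not the values used on the robot.

```python
# Sketch of the grab/drop sequence using the Adafruit BBIO library.
# Pin assignments, duty cycles, and the step size are assumptions.
import time
import Adafruit_BBIO.GPIO as GPIO
import Adafruit_BBIO.PWM as PWM

SERVO_PIN = "P9_14"      # PWM pin driving the lift servo (hypothetical)
PUMP_PIN = "P8_10"       # GPIO pin switching the vacuum pump (hypothetical)
CONTACT_PIN = "P8_12"    # GPIO pin reading the contact sensor (hypothetical)
UP_DUTY, DOWN_DUTY = 5.0, 10.0   # servo duty cycles for raised/lowered (assumed)

def setup():
    GPIO.setup(PUMP_PIN, GPIO.OUT)
    GPIO.setup(CONTACT_PIN, GPIO.IN)
    PWM.start(SERVO_PIN, UP_DUTY, 50.0)     # 50 Hz hobby-servo signal

def grab():
    """Lower until contact (or travel limit), start the pump, raise, report."""
    duty = UP_DUTY
    while duty < DOWN_DUTY:                 # DOWN_DUTY acts as the travel limit
        duty += 0.2
        PWM.set_duty_cycle(SERVO_PIN, duty)
        time.sleep(0.05)
        if GPIO.input(CONTACT_PIN):         # suction cup touched a surface
            GPIO.output(PUMP_PIN, GPIO.HIGH)
            PWM.set_duty_cycle(SERVO_PIN, UP_DUTY)
            return True
    PWM.set_duty_cycle(SERVO_PIN, UP_DUTY)  # no contact: raise and report failure
    return False

def drop():
    """Release the item by switching off the vacuum pump."""
    GPIO.output(PUMP_PIN, GPIO.LOW)
```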

Inverse Kinematics Node

The Inverse Kinematics (IK) node calculates the joint angles required for the end effector to reach a given (x, y) position.

SorterBot inverse kinematics diagram

The second angle is calculated using the law of cosines, and the first angle is then calculated using the geometric constraints between x, y, and the angles:

\(x = L_1 \cos(\theta_1) - L_2 \cos(\theta_1 + \theta_2)\)
\(y = L_1 \sin(\theta_1) - L_2 \sin(\theta_1 + \theta_2)\)

Finally, the node checks that the requested position is within the robot's reach and does not put the arm into a singular configuration.
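For illustration, a minimal Python version of this calculation might look like the sketch below. It uses the common convention x = L1 cos(θ1) + L2 cos(θ1 + θ2), so the signs differ from the diagram above and the returned angles may need an offset to match the robot's frame definitions; the link lengths are placeholders.

```python
# Hedged sketch of 2-link planar inverse kinematics with a workspace check.
# Link lengths are placeholder values, not the robot's real dimensions.
import math

L1, L2 = 0.20, 0.15   # link lengths in meters (placeholders)

def inverse_kinematics(x, y):
    """Return (theta1, theta2) in radians, or None if (x, y) is unreachable."""
    r2 = x * x + y * y
    r = math.sqrt(r2)
    # Reachability check: reject targets outside the annular workspace, which
    # also avoids the singular fully-stretched and fully-folded poses.
    if r >= L1 + L2 or r <= abs(L1 - L2):
        return None
    # Law of cosines gives the elbow angle.
    cos_t2 = (r2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    theta2 = math.acos(cos_t2)
    # Shoulder angle from the geometric constraint between x, y, and theta2.
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2
```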

Arm Positioning Node

The Arm Positioning node uses PWM to instruct the arm servos to move to a given angle.
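A sketch of that angle-to-PWM mapping is shown below. The pin name and the 1-2 ms pulse over 0-180 degrees calibration are typical hobby-servo values assumed for illustration, not measured values from the robot.

```python
# Sketch of mapping a commanded joint angle to a servo PWM signal with the
# Adafruit BBIO library. Pin name and servo calibration are assumptions.
import Adafruit_BBIO.PWM as PWM

JOINT_PIN = "P9_16"   # PWM pin for one arm servo (hypothetical)
FREQ_HZ = 50.0        # standard hobby-servo frame rate

def angle_to_duty(angle_deg):
    """Map 0-180 degrees to a 1-2 ms pulse, i.e. 5-10 % duty at 50 Hz."""
    pulse_ms = 1.0 + (angle_deg / 180.0)
    return pulse_ms / 20.0 * 100.0        # 20 ms period at 50 Hz

def move_joint(angle_deg):
    PWM.set_duty_cycle(JOINT_PIN, angle_to_duty(angle_deg))

PWM.start(JOINT_PIN, angle_to_duty(90.0), FREQ_HZ)   # start centered
```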

Lessons and Future Improvements

A few lessons I learned from this project and things I would change if I were to redo it now:

Acknowledgments