Robot Arm Upgrade Project

Updated 13 January 2022

This project has been a long and tricky one for me. I purchased my robot arm kit from Jaycar many years ago, intending from the very start to control it properly.

I used to work with industrial robots, so when I saw this robot kit for sale I immediately wondered how I could make it move in full Cartesian coordinates, since I can't afford a full-size industrial robot. That's the reason for this project.

I have gained stacks of experience and knowledge in a number of areas, the biggest being ROS. This operating system is amazing, and I've barely scratched the surface of what it can do; this project has let me learn a range of its different facets. I've also gained experience with the Particle Photon microcontroller (I'd used them on other projects, so it was an obvious choice for this one) and with the Blynk app, which is great for quick control from a phone or for remote monitoring.

Step 1: Install joint positioning sensors

My first step was to add a pulse counter (I'm reluctant to call it an encoder) to each motor, like this:

diy-positioning-sensors

hall-effect-soldered-into-gripper

The above images show the insides of the gripper motor. I drilled and installed three magnets into the primary orange gear just after the motor's worm gear (later changed to neodymium magnets for stronger sensing). These are detected by a Hall-effect sensor hidden under the gear.

I repeated the same process for each of the five motors; in the other yellow motors I installed four 3 mm diameter magnets for slightly better positional accuracy.

 

Step 2: Build motor control board

My next step was to build the joint motor controllers. I thought I would be clever and design a PCB that included a microprocessor, an H-bridge motor driver, and the associated hardware, which I did. Unfortunately, I'm not an expert at PCB design, and electrical noise from motor commutation was a HUGE issue :(

My idea was to connect five of these boards together using RS232 (it's all I knew back then) and then use Modbus RTU to send destination commands to each motor via slave addresses. I wrote all the firmware with the CCS compiler for the PIC16F88 microcontroller, plus a VB6 computer interface for testing. But I really struggled with back-EMF noise from the motors, and what I didn't realise was that RS232 is a point-to-point link with actively driven lines, so you can't daisy-chain devices easily; multi-drop busses like that are what RS485 is meant for. So the project stalled for many years.

 completed-pcb

Step 3: Take a break and return with new knowledge

After many years of the arm living on the shelf, and with heaps of new knowledge gained, namely ROS, I was ready to take on the project again.

I found out that ROS, the Robot Operating System, supports kinematics for robot end-effector positioning. This is awesome and exactly what I was after. Despite my previous board failures and RS232 problems, I still needed a way to control and coordinate all five joints to drive to each position; I hadn't solved that problem, until now.

 

I swapped the multi-board PCB approach for a single Particle Photon, which on paper could drive all 5 motors via 5 PWM outputs, 5 direction control outputs, and 5 pulse counter inputs on external interrupts. This board would then interface with ROS via the Photon's USB connection, which ironically appears as an RS232 serial port, but only one RS232 link now :)

Photon Pin Mapping

photon-pin-mapping

With the above pin configuration I was able to get 5 dedicated PWM outputs, 5 direction outputs, and 5 separate dedicated external interrupts, while still leaving the built-in I2C data and clock pins free. That maxes out the Photon and, spoiler alert, it would have worked too if it wasn't for an interrupt issue.
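To give an idea of how the firmware hangs together, here is a minimal single-joint sketch for the Photon. The pin assignments and jogging logic are placeholders for illustration, not my actual firmware:

// Minimal one-joint sketch for the Particle Photon (Wiring/Device OS API).
// Pin numbers here are placeholders, not my real mapping.
volatile long jt1Count = 0;      // pulse count, updated in the interrupt
volatile int  jt1Dir   = +1;     // +1 or -1, set whenever we change direction

void jt1PulseISR() {             // one Hall-effect pulse = one count
    jt1Count += jt1Dir;
}

void setup() {
    pinMode(D6, OUTPUT);                       // direction output to the motor driver
    pinMode(A4, OUTPUT);                       // PWM output (A4 is PWM-capable on the Photon)
    pinMode(D2, INPUT_PULLUP);                 // NPN sinking Hall sensor pulls this low
    attachInterrupt(D2, jt1PulseISR, FALLING); // count on each falling edge
}

void loop() {
    // demo only: drive joint 1 forwards at roughly half speed
    jt1Dir = +1;
    digitalWrite(D6, HIGH);
    analogWrite(A4, 128);        // 0-255 duty cycle
    delay(100);
}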

I made a prototype Vero board on which I mounted three DRV8835 dual motor drivers for the Photon to interface with. Things were looking up!

veroboard-layout-with-particle-photon

Step 4: How to test joint operation?

I used Blynk to remote-control the arm from my phone; that way I could jog the joints and read back the positions.

blynk-app-interface-to-control-robot-arm
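For anyone curious, jogging a joint from the app boils down to a Blynk virtual-pin handler along these lines. The virtual pin numbers, GPIO pins, and auth token are made up for this example:

// Illustrative Blynk handler for jogging one joint from the phone app.
#include <blynk.h>                         // Blynk library for Particle

char auth[] = "YOUR_BLYNK_AUTH_TOKEN";     // placeholder token
volatile long jt1Count = 0;                // updated elsewhere by the pulse interrupt

BLYNK_WRITE(V1) {                          // app slider on virtual pin V1 jogs joint 1
    int speed = param.asInt();             // -255..255 from the slider
    digitalWrite(D6, speed >= 0 ? HIGH : LOW);   // direction output
    analogWrite(A4, abs(speed));                 // PWM duty cycle
}

void setup() {
    pinMode(D6, OUTPUT);
    pinMode(A4, OUTPUT);
    Blynk.begin(auth);                     // connect to the Blynk cloud
}

void loop() {
    Blynk.run();                                   // service Blynk
    static unsigned long last = 0;
    if (millis() - last > 500) {                   // report position twice a second
        Blynk.virtualWrite(V10, jt1Count);         // show the pulse count in the app
        last = millis();
    }
}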

Once I had the Blynk app running I noticed that my position counting was quite erratic. It would count hundreds of extra pulses. I suspected electrical noise again! Argh!

So I did all the normal things you're meant to do to reduce interference issues, like this:

  • Installing capacitors across the motor terminals

  • Adding extra capacitors from each motor terminal to the motor body

  • Shielding the motor cables with grounded shields

  • Grounding the motor body to the shield, which was also grounded at the controller (i.e. both ends)

  • Common-point grounding, so all 0 V references originate from a central point

  • Adding extra electrolytic capacitors at various points around the circuit board

  • Shielding the pulse counter cables, grounded at one end only

  • Increasing the current draw on the interrupt inputs (not just relying on internal pull-ups)

Nothing seemed to eliminate it. I connected my oscilloscope to the pulse counter interrupt lines and the signal looked clean enough: only a slight ripple of a few mV, nothing I could attribute to triggering extra pulses. I was running a 3.3 V system, and my Hall-effect sensors were the NPN sinking type, so I increased the current draw on them by adding stronger external pull-up resistors; this was meant to make it harder for noise to affect the signal. It still didn't eliminate the problem.

A suggestion from the Particle forum was to add filtering to the interrupt in software, which to me seemed like a band-aid solution; something was wrong here.
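For reference, the sort of software filter that was suggested looks roughly like this: ignore pulses that arrive faster than the mechanics could possibly produce them. A minimal sketch of the idea, with a made-up threshold, and not the route I ended up taking:

// Rough sketch of interrupt "debouncing" in software.
volatile long jt1Count = 0;
volatile unsigned long lastPulseUs = 0;
const unsigned long MIN_PULSE_GAP_US = 500;   // made-up threshold

void jt1PulseISR() {
    unsigned long now = micros();
    if (now - lastPulseUs < MIN_PULSE_GAP_US) return;  // too soon, treat as noise
    lastPulseUs = now;
    jt1Count++;
}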

My next attempt at solving this was to set up separate microcontrollers just for counting pulses, with the Photon reading the counts back over the I2C bus. Wow, did that open a can of worms! Particle's implementation of I2C requires you to do a lot of things manually, I guess because I wasn't using a predefined library. Anyway, I figured it out in the end, and now I have more knowledge of I2C comms too :)

i2c-microprocessors-handling-the-encoders-output
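On the Photon side, polling one of the external counter chips looks something like this. The slave address and the 4-byte big-endian payload are assumptions I've made for this sketch, not my exact protocol:

// Sketch of the Photon (I2C master) reading a count from one external pulse counter.
const uint8_t COUNTER_ADDR = 0x10;      // hypothetical slave address

long readJointCount() {
    Wire.requestFrom(COUNTER_ADDR, (uint8_t)4);   // ask the slave for 4 bytes
    long count = 0;
    for (int i = 0; i < 4 && Wire.available(); i++) {
        count = (count << 8) | Wire.read();       // assemble a big-endian 32-bit count
    }
    return count;
}

void setup() {
    Wire.begin();                        // join the I2C bus as master
}

void loop() {
    long jt1 = readJointCount();
    (void)jt1;                           // feed this into the position/velocity loop
    delay(10);
}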


So, back to the 'electrical noise' problem. As soon as I had the external microcontrollers reading the pulses and sending count values back over I2C, the counts were perfect! No extra counts, no lost counts! So there seems to be something wrong with external interrupts on the Particle Photon. This also lends weight to another issue I have with my Photon Weather Shield project, where it sometimes reports rain when we haven't had any. Interesting…

Step 5: More resolution

After fixing all of the above, I decided I needed more resolution. My 3- or 4-magnet pulse counter didn't quite cut it even when counting both rising and falling edges. On top of that, I had to remember which direction the motor was running in order to count up or down, and now that the counting was done on separate external micros, that was getting difficult.

So here comes quadrature, and now I can call it an encoder. I also hugely increased the resolution by mounting the magnets directly on the motor's worm gear. This required a 3D-printed magnet-holding widget, so I designed one in SolidWorks and glued it to the motor shaft.

motor-encoder-mod-cad

motor-encoder-completed-mod

Then I mounted an extra Hall-effect sensor on the top side of the yellow motor housing. This worked out great: the top sensor sits slightly offset from the bottom one, which creates a phase offset between the two signals, and the end result is a full quadrature encoder!

encoder-mod-attached-to-motor
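Decoding two phase-offset signals into a signed count is the classic quadrature trick: channel B's state at the moment channel A changes tells you the direction. A minimal sketch of the general technique (pin names are placeholders, and this is not my exact counter firmware):

// Classic quadrature decoding from two phase-offset Hall-effect sensors.
volatile long encoderCount = 0;

void channelAISR() {                     // fires on every edge of channel A
    bool a = digitalRead(D3);            // Hall sensor A
    bool b = digitalRead(D4);            // Hall sensor B (phase offset)
    encoderCount += (a == b) ? +1 : -1;  // direction from the A/B phase relationship
}

void setup() {
    pinMode(D3, INPUT_PULLUP);
    pinMode(D4, INPUT_PULLUP);
    attachInterrupt(D3, channelAISR, CHANGE);   // count both edges for extra resolution
}

void loop() {
    // encoderCount now increments one way and decrements the other
}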

Now I have stand-alone, external, high-speed position sensing for four of the five joints that's accurate, repeatable, and most importantly reliable.

Step 6: Model creation

My first step was to reverse-engineer the robot arm piece by piece. This took about 30 hours of painstaking measuring and modelling. I used SolidWorks because it has a built-in URDF exporter.

computer-aided-design-model-of-the-robot-arm

exploded-computer-aided-design-model-of-the-robot-arm

Each part had to be constrained properly so the exporter would know which parts move at a joint and which are fixed. You can also set joint limits, centre of mass, collision data, and all kinds of cool stuff.

After export I had to configure ROS to use the model; for this I used MoveIt. This package handles the inverse kinematics calculations and other cool visualisation things. I used the Setup Assistant to create all the configuration files. But I was still missing one part: the link between the model and the hardware.

Step 7: Build ROS hardware interface

This was probably the trickiest part (besides the motor noise issues); it required lots of C++ programming and compiling. I followed this example here, but got stuck on the CMakeLists.txt formatting and package.xml layout, as there were no premade examples.

It took me ages to find all the relevant things to include and stuff. Not my strength.

There was also quite a bit of mucking around with ROS controller types and how the different ones interact with MoveIt. It required heaps of searching, reading, and finally posting on the ROS forum for help. It's quite involved and very complicated; I don't claim to fully understand it all, but I'm getting lots of experience with this pretty amazing operating system. Quick note: I'm doing another ROS project too, controlling my autonomous robot lawn mower (based on a previous project posted here), so stay tuned.
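To give a feel for what the hardware interface layer involves, here is a heavily trimmed skeleton of a ros_control velocity-based hardware interface. The class name and details are just for illustration (my real class also owns the serial comms); only the joint names match my setup:

// Trimmed sketch of a ros_control hardware interface for a velocity-controlled arm.
#include <hardware_interface/joint_command_interface.h>
#include <hardware_interface/joint_state_interface.h>
#include <hardware_interface/robot_hw.h>

class ArmHardware : public hardware_interface::RobotHW {
public:
  ArmHardware() {
    const char* names[4] = {"jt1_joint", "jt2_joint", "jt3_joint", "jt4_joint"};
    for (int i = 0; i < 4; ++i) {
      // expose joint state (position/velocity/effort) to ROS
      hardware_interface::JointStateHandle state(names[i], &pos_[i], &vel_[i], &eff_[i]);
      state_interface_.registerHandle(state);
      // expose a velocity command handle for the trajectory controller
      hardware_interface::JointHandle cmd(state, &cmd_[i]);
      vel_interface_.registerHandle(cmd);
    }
    registerInterface(&state_interface_);
    registerInterface(&vel_interface_);
  }

  void read()  { /* read encoder counts over the serial link, update pos_[] and vel_[] */ }
  void write() { /* convert cmd_[] (rad/s) to PWM values and send to the Photon */ }

private:
  hardware_interface::JointStateInterface    state_interface_;
  hardware_interface::VelocityJointInterface vel_interface_;
  double pos_[4] = {0}, vel_[4] = {0}, eff_[4] = {0}, cmd_[4] = {0};
};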

Anyway, it also required a method of communicating between ROS and the Particle Photon. For this I built a custom RS232 packet protocol with a rudimentary CRC byte on the end to make sure the data received was valid. This also took a bit of nutting out: it seems Linux serial port handling requires you to flush the buffer before you use it. Talk about a trap for young players; this also took ages to solve!

Code snippet:

// Serial port setup on the Linux/ROS side
// (needs <termios.h>, <unistd.h>, <cstring>, <cstdio> and <cerrno>)

// Create new termios struct, we call it 'tty' for convention
struct termios tty;
memset(&tty, 0, sizeof tty);

// Read in existing settings, and handle any error
if (tcgetattr(rs232Handle_, &tty) != 0) {
    printf("Error %i from tcgetattr: %s\n", errno, strerror(errno));
    return;
}

tty.c_cflag &= ~CSTOPB;        // Clear stop field, only one stop bit used in communication
tty.c_cflag &= ~CRTSCTS;       // Disable RTS/CTS hardware flow control
tty.c_cflag |= CREAD | CLOCAL; // Turn on READ & ignore ctrl lines (CLOCAL = 1)
cfmakeraw(&tty);               // sets all the things to make it RAW

tty.c_cc[VTIME] = 0;           // no read timeout...
tty.c_cc[VMIN]  = 0;           // ...and no minimum byte count, so reads return immediately

// Set in/out baud rate to be 115200
cfsetispeed(&tty, B115200);
cfsetospeed(&tty, B115200);

// Apply the settings, otherwise none of the above takes effect
if (tcsetattr(rs232Handle_, TCSANOW, &tty) != 0) {
    printf("Error %i from tcsetattr: %s\n", errno, strerror(errno));
    return;
}

// clear buffer first, VERY important! screws comms otherwise
::tcflush(rs232Handle_, TCIOFLUSH);

  

Note the ::tcflush command; man, that took some finding!
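As for the packet format itself, the idea is just a start marker, a length, the payload, and a trailing check byte computed over the payload. A minimal sketch of that framing; the field layout and the XOR-style check byte here are illustrative, not my exact protocol:

// Illustrative packet framing with a simple trailing check byte.
#include <cstdint>
#include <cstddef>

// XOR all payload bytes together to get a rudimentary check byte
uint8_t checkByte(const uint8_t* data, size_t len) {
    uint8_t c = 0;
    for (size_t i = 0; i < len; ++i) c ^= data[i];
    return c;
}

// Build a packet: [0xAA][length][payload...][check]
size_t buildPacket(const uint8_t* payload, uint8_t len, uint8_t* out) {
    out[0] = 0xAA;                            // start-of-packet marker
    out[1] = len;                             // payload length
    for (uint8_t i = 0; i < len; ++i) out[2 + i] = payload[i];
    out[2 + len] = checkByte(payload, len);   // trailing check byte
    return 3 + len;
}

// On receive, recompute the check byte and compare it with the last byte
bool packetValid(const uint8_t* pkt, size_t total) {
    if (total < 3 || pkt[0] != 0xAA) return false;
    uint8_t len = pkt[1];
    if (total != (size_t)len + 3) return false;
    return checkByte(pkt + 2, len) == pkt[total - 1];
}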

 

Step 8: ROS and MoveIt

Now, with all the hardware layers built, I could move on to the fun stuff of controlling my arm, my goal from the very start. Should be easy, or so I thought! Hurdles everywhere...

 

You still have to configure MoveIt to access the ROS hardware interface via controllers; the key here was the configuration of the controllers in the YAML files (see my forum post). For example:

 

File snippet:

controller_list: []

arm_controller:
  type: velocity_controllers/JointTrajectoryController
  joints:
    - jt1_joint
    - jt2_joint
    - jt3_joint
    - jt4_joint
  constraints:
    goal_time: 10.0
    stopped_velocity_tolerance: 0.1
    jt1_joint: {trajectory: 0.02, goal: 0.01}
    jt2_joint: {trajectory: 0.02, goal: 0.01}
    jt3_joint: {trajectory: 0.02, goal: 0.01}
    jt4_joint: {trajectory: 0.02, goal: 0.01}
  gains:
    jt1_joint: {p: 0.1, i: 0.2, d: 0.1, i_clamp: 0.1}
    jt2_joint: {p: 0.1, i: 0.2, d: 0.1, i_clamp: 0.05}
    jt3_joint: {p: 0.1, i: 0.2, d: 0.1, i_clamp: 0.1}
    jt4_joint: {p: 0.5, i: 0.2, d: 0.1, i_clamp: 0.1}
  stop_trajectory_duration: 0.5
  state_publish_rate: 25
  action_monitor_rate: 10

This is a screenshot of the arm displayed in RVIZ:

robot-arm-in-moveit

This is my model represented in RVIZ; notice the interactive markers around joint 4. These are used to drag the arm to the required position, like this:

robot-arm-in-moveit-with-new-position

Because the encoders are not absolute, you have to home the robot each time, so I use the Blynk interface to drive the arm to the vertical, straight-up position and reset all the joint position values. This is the default home position that ROS/MoveIt starts in. The joint velocity targets are then converted from radians/sec into PWM motor speed values and sent over the USB RS232 link.
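The rad/s-to-PWM conversion is nothing fancy: conceptually it's just scaling against each joint's maximum speed and clamping. A small sketch of the idea, where the maximum speed figure is a placeholder rather than a measured value for my motors:

// Sketch of converting a commanded joint velocity (rad/s) to a PWM duty and direction.
#include <cmath>
#include <cstdint>

const double MAX_JOINT_SPEED = 1.5;   // rad/s at 100% duty (assumed figure)

struct MotorCommand {
    bool    forward;   // direction pin state
    uint8_t pwm;       // 0-255 duty cycle
};

MotorCommand velocityToPwm(double velRadPerSec) {
    MotorCommand cmd;
    cmd.forward = (velRadPerSec >= 0.0);
    double duty = std::fabs(velRadPerSec) / MAX_JOINT_SPEED;   // 0.0 - 1.0
    if (duty > 1.0) duty = 1.0;                                // clamp to full speed
    cmd.pwm = static_cast<uint8_t>(duty * 255.0 + 0.5);        // scale to 8-bit PWM
    return cmd;
}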

Summary

This has been an epic project for me over many years, and I'm so stoked that I was able to achieve my goal. I still have to properly tune my PID loop variables to make the movement a little smoother, but overall it works.

What I wanted to see was the arm's end effector follow a linear path by itself, with all the joints travelling at different speeds to compensate for the different arm segment lengths and angles. It can do this, albeit a little jerkily, but it works.

I don't really have a purpose for it, except as a learning tool to get familiar with ROS and all the associated interfaces. It has been fun and frustrating at the same time, but very rewarding. Thanks for reading, and enjoy the video (my very first YouTube video too!).

 
