Thursday, January 26, 2017

Humanoid robot video

Here's a video of my Bioloid humanoid robot. As mentioned in a previous post, I controlled it using a BeagleBone Black embedded system instead of its proprietary controller. I created the routines for the arms, legs, and hips to move based on inverse kinematics equations, trajectory generators, and some high-level behaviours.

Information about the project was in a previous post here.
Detailed information is here (in Spanish).
All code used is here.

Wednesday, January 25, 2017

My humanoid robot (Bioloid robot controlled by a BeagleBone Black)

Well... actually it wasn't mine but my university's; I used it for my Master's thesis and taught it how to move from scratch.

The mathematical calculations and the code implemented are detailed in my thesis. You can find it here (it's in Spanish).

Bioloid robot with a BeagleBone Black board on the chest.

Bioloid humanoid robots are usually controlled by Robotis' proprietary controller (a microcontroller board) and its proprietary software. On that system the basic moves are already implemented, and you just have to program the high-level instructions you want, like "Robot, go get me some coffee". Well, maybe not that high level, but basic moves like "walk forward" or "turn right" come already programmed. The idea at the university research center (AI2) was to replace that controller with an embedded system, specifically a BeagleBone Black. By using an embedded system on future Bioloid projects, we could get the benefits of this type of system: true parallelism in processes, extended computation capacity, and easy integration with cameras and IMUs. But before all that, the guts of the system had to be built, and that's where I came in...

My thesis objective was to develop the locomotion libraries (as an API) for the robot to move, that is, to create from scratch all the movements of the robot in order to provide high-level instructions like the ones the proprietary software has.

That's me with my robot.

The first thing I did was take the code created for a single robotic arm (see previous post) and use it for the robot's two arms. For the legs, I programmed the kinematic equations that describe the movement of the foot using the hip as a reference, and the equations that describe the hip movement using the foot as a reference. The reason for having two sets of equations is that walking, for example, is divided into two main phases: foot on the ground (where you move the hip) and foot in the air (where you move the foot). Likewise, I created two linear trajectory generators, one for the foot and one for the hip. Both generate movements in straight lines, since it's important to control where the foot and hip are at every moment, and thus avoid any possible unstable position.
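
A linear trajectory generator of this kind fits in a few lines. Here is an illustrative Python sketch (not the actual thesis code, which ran on the BeagleBone), interpolating a 3D point along a straight line in equal steps:

```python
def linear_trajectory(start, end, steps):
    """Return points evenly spaced along the straight line from start to end (inclusive)."""
    return [tuple(s + (e - s) * i / steps for s, e in zip(start, end))
            for i in range(steps + 1)]

# Example: move the foot 40 mm forward and 20 mm up (relative to the hip) in 4 steps
waypoints = linear_trajectory((0.0, 0.0, 0.0), (40.0, 0.0, 20.0), 4)
```

Sampling the line in small increments is what lets you know where the foot and hip are at every instant of the motion.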

Matlab simulation of the legs as kinematics chains.

This time, for the communications from the BeagleBone Black to the servos (Dynamixel bus), I used a USB2Dynamixel device. The low-level drivers for the BeagleBone Black to send the Dynamixel data frames were created by David Cisternes as a generic API for communications on a Dynamixel bus.

The next task was to create those high-level movements everybody loves. For example, for a "one step forward" movement I divided the motion into 7 segments: lean sideways, lift the foot, lower the foot, move the hip forward and sideways, lift the back foot, lower the back foot, and return to the normal position. Then I created the trajectories for all the limbs during each segment. This required a lot of trial-and-error correction to reach a stable movement.
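
To illustrate the idea, such a movement can be encoded as an ordered list of per-segment targets, chaining a straight-line trajectory between consecutive poses. The segment names follow the post, but the pose values below are made-up placeholders, not the tuned values from the thesis:

```python
def lerp(a, b, t):
    """Linear interpolation between two poses."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# First three of the seven segments, with made-up swing-foot targets
# (x forward, y lateral, z up, in mm); the real values came from tuning.
SEGMENTS = [
    ("lean sideways", (0, 15, 0)),
    ("lift foot",     (0, 15, 30)),
    ("lower foot",    (40, 15, 0)),
]

def build_step(start=(0, 0, 0), points_per_segment=5):
    """Chain a straight-line trajectory through every segment target."""
    path, pose = [], start
    for _name, target in SEGMENTS:
        path += [lerp(pose, target, i / points_per_segment)
                 for i in range(1, points_per_segment + 1)]
        pose = target
    return path

path = build_step()
```

Each segment ends exactly where the next one begins, so the concatenated path is continuous.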

Robot during one of the segments of the "one step forward" movement.

I called these high-level movements "behaviors" so they could be used in a state machine that ruled the robot's logic. The robot would move between behaviors based on the input from sensors, in my case the ones implemented on the smart foot (see previous post). Finally, the robot was able to climb or descend stairs (not big ones, it was a small robot after all!), walk forward, and make small turns to the right or left.
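
A minimal Python sketch of the state-machine idea; the behavior names and sensor fields here are illustrative, not the ones from the thesis:

```python
def next_behavior(sensors):
    """Choose the next behavior from smart-foot readings.
    sensors: dict with 'step_detected' (proximity) and 'foot_on_ground' (pressure)."""
    if sensors["step_detected"]:       # proximity sensor sees a stair step ahead
        return "climb_step"
    if not sensors["foot_on_ground"]:  # pressure sensors report no contact yet
        return "lower_foot"
    return "walk_forward"

next_behavior({"step_detected": False, "foot_on_ground": True})  # 'walk_forward'
```

The main loop just executes the current behavior's trajectories and re-evaluates the sensors at each transition.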

Robot climbing one step of a stair

Robot descending the same step of the stair

The code used on the BeagleBone Black and my thesis are on my GitHub page:

Wednesday, December 7, 2016

Arduino-based smart feet for a Bioloid robot (Arduino as a slave on the Dynamixel bus)

During my Master's thesis, I needed some proprioception from the humanoid robot's foot. The final objective was to make the robot climb and descend stairs, so I placed some pressure sensors and a proximity sensor on the robot's foot.

With all these sensors in the foot, direct wiring to the controller would imply a lot of cables (ground, Vcc, signals...), so I decided to connect them to the Dynamixel bus instead. I used an Arduino Nano for this.

I used a Sharp GP2Y0A21YK0F proximity sensor and four FSR pressure sensors, all connected to the Arduino analog inputs. FSR sensors are quite noisy, so I applied a rolling average to filter the measured values.
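
A rolling average is straightforward to implement; here is a Python sketch of the idea (the real filter ran on the Arduino):

```python
from collections import deque

class RollingAverage:
    """Average of the last `size` samples; smooths out noise spikes."""
    def __init__(self, size):
        self.window = deque(maxlen=size)

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

filt = RollingAverage(4)
readings = [512, 520, 9, 515]   # the 9 is a noise spike from the FSR
smoothed = [filt.update(r) for r in readings]
```

A wider window smooths more but makes the reported value lag behind fast pressure changes.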

The Arduino has a UART (TTL) that communicates using 3 lines: Tx, Rx, and GND. The Dynamixel bus has 3 different lines: Vcc, GND, and Data, the last one used both to send and to receive. To convert the 2 data lines on the Arduino board into only one, I used a 74LS241 chip, which has several 3-state buffers, implementing the connections described on this web page. Note that I had to use an enable signal from the Arduino to the buffer.

External circuit from Arduino to Dynamixel bus. Image from Savage Electronics.

I programmed the Arduino to keep listening for serial messages on the bus: when it detects the header of a Dynamixel data frame, it checks the device ID, and if it is the right ID, it processes the rest of the message.

Dynamixel frame format. Message sent from Master to Slave.

The data from each sensor were assigned to a different address, creating a logical table with the sensor data. When the Arduino is asked for data from a specific address, it responds with a message containing a 2-byte value, as shown in the following example.

Dynamixel frame format. Message response from Slave to Master.
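
The Dynamixel 1.0 framing is simple enough to sketch. Below is an illustrative Python version of the slave-side logic (the real code runs on the Arduino in C; the ID, addresses, and table contents are made up). A frame starts with two 0xFF header bytes, and the checksum is the inverted low byte of the sum of every byte after the header:

```python
def checksum(payload):
    """Dynamixel 1.0 checksum: inverted low byte of the byte sum (header excluded)."""
    return (~sum(payload)) & 0xFF

def handle_read(packet, my_id, table):
    """Parse a READ instruction packet; answer with a status packet if it is for us.
    (Checksum verification of the incoming packet is omitted for brevity.)"""
    if packet[0:2] != bytes([0xFF, 0xFF]) or packet[2] != my_id:
        return None                      # bad header or not our ID
    if packet[4] != 0x02:                # 0x02 = READ instruction
        return None
    addr, count = packet[5], packet[6]
    data = table[addr:addr + count]
    body = [my_id, len(data) + 2, 0x00] + list(data)  # ID, LENGTH, ERROR = 0, params
    return bytes([0xFF, 0xFF] + body + [checksum(body)])

# Logical table: one pressure reading (0x0212) stored little-endian at address 0
table = bytes([0x12, 0x02])

# READ 2 bytes from address 0 of device 0x64 (ID and address are hypothetical)
request = bytes([0xFF, 0xFF, 0x64, 0x04, 0x02, 0x00, 0x02])
request += bytes([checksum(request[2:])])
reply = handle_read(request, 0x64, table)
```

Returning `None` for other IDs is what lets several slaves share the same bus: each device only answers frames addressed to it.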

Finally, I soldered all the circuits onto a prototype board and placed the Arduino Nano on top of the components to save space, as shown in the picture below. The pins are for connecting the sensors and the Dynamixel bus.

Implemented circuit (components are below Arduino Nano).

With the board placed in the foot and the sensors connected, it looked like the following pictures.

Assembled circuit on the robot's foot

New smart feet placed on the robot.

The Arduino code is on my GitHub page:

Wednesday, November 30, 2016

New Robotic Arm with Dynamixel servos

Over the last year and a few months I was getting my Master's degree. As part of the embedded systems class, I created a robotic arm that can write some letters. Pretty cool, right?

The arm has 3 degrees of freedom (DOF) using Dynamixel RX-10 servos. I controlled the robot with a BeagleBone Black board with a cape that has a MAX485 chip. This chip converts UART TTL communications to the RS-485 differential protocol (used by the Dynamixel servos). Also, I had to create some drivers for the controller to speak the Dynamixel language (a specific data frame format) on the communications bus.

For the arm to reach a point in 3D space, I programmed the kinematic equations that translate this desired point into a set of angles for the servos. I used the equations proposed by Núñez et al. in the paper “Explicit analytic solution for inverse kinematics of Bioloid humanoid robot” (IEEE paper site).
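
The full Bioloid equations are in the Núñez et al. paper; as a flavor of the approach, here is the classic law-of-cosines inverse kinematics for a simplified 2-link planar arm (illustrative only, not the equations used in the project):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Inverse kinematics of a 2-link planar arm (elbow-down solution).
    Returns (shoulder, elbow) joint angles in radians to reach (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))  # clamp for rounding errors
    # Shoulder = angle to the target minus the offset introduced by the bent elbow
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Fully extended along x: both joint angles are zero
s, e = two_link_ik(2.0, 0.0, 1.0, 1.0)
```

The mirrored (elbow-up) solution exists too; which one you pick depends on the arm's joint limits.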

I also created two trajectory generators: one for free trajectories (moving from one point to another without caring about the described path) and one for linear trajectories (moving between two points describing a straight line).

Finally, I programmed a routine on the robot using these trajectory functions so it can draw some letters on a vertical plane. Here's a video of the robot:

The code for the project is found here: project link.
Also, here is the technical document for the project (in Spanish): document link

My github site:

Thursday, January 15, 2015

Getting feedback from the servo

You see, hobby servos are extremely easy to use, but they are also pretty dumb: you tell them something like "go to position 75 degrees" and then have faith that they reach that point, but you never get any confirmation from the servo. So in a situation that can bias the position, like a lot of torque applied against the rotor, you get no clue where it actually is (besides visual feedback, when you see the robot arm smashing somewhere it shouldn't).

I decided I wanted to know where the servo rotor and horn are actually located at any time. For this endeavor I used the potentiometer already built into the servo.

First I opened the servo. Then I desoldered the connections from the servo PCB to the motor, so I could remove the PCB and gain access to the potentiometer. Once I could see the potentiometer, I soldered a wire to the middle connection point (there are 3 connection points). I made a small notch in the servo case to route the new wire, re-soldered the PCB to the motor, and finally closed the servo.

Opening servo case
Servo PCB just desoldered from the motor connections.
Note the potentiometer on the right; I soldered a line to the yellow wire.

Servo closed again with the position feedback line coming from the case.

With this method I measured the voltage from the moving part of the potentiometer (the middle connection point) to ground. I used an analog input on the Arduino, so I had to convert the analog value (a number from 0 to 1023) to a voltage value (a number from 0 to 5V).

To characterize which voltage value corresponds to each position, I took each servo off the arm and swept across all positions without any load. While doing this, the Arduino was sending the position and measured voltage values to my PC over serial. With all that information, I graphed voltage vs. position and fitted a linear trend line; the resulting best-fit equation became my model of the servo position.
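
Both steps (the ADC-to-volts conversion and the linear fit) can be sketched in Python; the sweep data below is made up for illustration, not my actual measurements:

```python
def adc_to_volts(raw, vref=5.0):
    """Convert a 10-bit Arduino ADC reading (0-1023) to volts."""
    return raw * vref / 1023

def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b; returns (m, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

# Hypothetical sweep: commanded angle (degrees) vs. measured pot voltage (V)
angles = [0, 45, 90, 135, 180]
volts = [0.50, 1.25, 2.00, 2.75, 3.50]
m, b = fit_line(volts, angles)   # position model: angle = m * voltage + b
```

Once fitted, plugging any later voltage reading into `m * voltage + b` gives the estimated servo position.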

Shoulder servo graph with best fit line and equation.

The next step will be to use this position model in my arm code and develop some closed-loop control logic to correct the position based on the feedback.

Friday, October 10, 2014

Coming from a halt... Servo calibration

Coming back from a halt… lots of things happened last year that consumed my time, but I’m back.

During the last weeks I've been resuming the work I did before. Although I have new ideas for the arm, the issue with the servos showing position errors kept bothering me, so I decided to give it one last try and solve it for good.

Background: the issue was that the arm shows a position error which increases as the arm extends. The largest error was seen on the shoulder servo. The hypothesis was that the error was caused by the arm's weight.

Debug process: first I calculated the sum of torques that would be applied to the shoulder servo (based on arm weight and extension) and compared it with the spec data. Even at full extension, the sum of torques won't be larger than 4 kg·cm (the unit typically used for hobby servo torque), and the shoulder servo (HS-755HB) is rated to hold much more than that. Now the weight hypothesis wasn't looking so good...

Following a suggestion from the Lynxmotion staff, I also checked that the power supply was capable of providing enough power to the arm (servos and control). My wall power adapter was sufficient, and I also tried a DC power supply (5V, 2A). The measured current consumption was around 350mA, and with both power supplies the servos behaved similarly.

Then, also at Lynxmotion's suggestion, I took the arm apart and tested the servo without any load... surprise! The servo kept showing errors even with no load. Now the error statement changed to: the shoulder servo shows a position error as it approaches 0 degrees.

I did some research and found an important fact about hobby servos. Their origin is very influential: they come from the RC world, where you need precision at the center position (90 degrees) and then just need to go right or left, but without a precise amount.

What I found in the literature is that there's agreement on the center position corresponding to a 1500us signal, but there's no consensus on the signal length for 0 degrees and 180 degrees on hobby servos.

Lynxmotion mentioned that "500 is 0 degrees, and 2500 is 180 degrees". The Arduino Servo library says the "0-degree angle on the servo defaults to 544... 180-degree angle on the servo defaults to 2400". Another Arduino reference even says: "1000 is fully counter-clockwise, 2000 is fully clockwise".

But the most important statement I found was: "Note that some manufacturers do not follow this standard very closely so that servos often respond to values between 700 and 2300".

Since I was using the Arduino Servo library, it uses its default values, and my experimental data was consistent with them: 547us for 0 degrees and 2392us for 180 degrees. (The values were slightly different from the theoretical ones, probably because I didn't have a scope, so I used a DVM to measure the mean voltage and then calculated the time.)

This article gave me some insight into how to modify the time values for 0 and 180 degrees in the Arduino Servo library when calling the servo attach function. I found that my HS-755 servo uses 670us for 0 degrees and 2300us for 180 degrees. I also tried an HS-422 servo and found a different set of values.
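
In Python terms, the mapping the Servo library applies is just a linear interpolation between the attach-time endpoints; the 670/2300us values below are the ones I measured for the HS-755, and the defaults are the library's:

```python
def angle_to_pulse_us(angle, min_us=544, max_us=2400):
    """Map a 0-180 degree command to a pulse width in microseconds,
    like Arduino's Servo library does between its attach() endpoints."""
    return min_us + (max_us - min_us) * angle / 180

# The same 90-degree command produces different pulses depending on calibration
default_90 = angle_to_pulse_us(90)            # 1472.0 us (library defaults)
hs755_90 = angle_to_pulse_us(90, 670, 2300)   # 1485.0 us (measured HS-755 endpoints)
```

This is why an uncalibrated servo shows a position error: the controller and the servo disagree on which pulse width means which angle.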

Technical conclusion: each servo may have different time values for 0 and 180 degrees. In the future, if I need precise angle settings, I will have to measure those time values manually first and adjust the Arduino Servo function accordingly.

Time for some reflection: the main difference this time is that I asked for help early in the debugging process. I consulted a Lynxmotion help forum, and during the discussion I got new ideas and found new answers. I could have reached this point long ago if I had just asked for help earlier.

Philosophical conclusion: while debugging an issue, try on your own for a while, but if it's taking too long, ask for help.

Monday, July 22, 2013

Back on writing soon

I haven't written anything new for some months; the reason is that I got married and have been busy setting up our new home. But I'm anxious to get my hands on the robot again, so new posts are coming soon!