Tuesday, January 28, 2020

Neck movement of InMoov robotic head

A quick update on the project. In the video below you can see the InMoov robotic head now moving its neck. There are two types of rotation: "yaw", driven by the servo inside the head, and "pitch", driven by the screw + servo mechanism.

For control I used the same Arduino + shield setup as before (previous post). In the video the robot moves in two modes: manually, controlled by the potentiometers on the shield, and automatically, following a fixed sequence.
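As a rough sketch of how the two modes can be structured (pin numbers, angle limits and the mode switch here are assumptions, not the exact values from my shield):

#include <Servo.h>

Servo yawServo;    // servo inside the head
Servo pitchServo;  // servo driving the screw mechanism

const int YAW_POT = A0;    // potentiometers on the shield (assumed pins)
const int PITCH_POT = A1;
const int MODE_PIN = 2;    // assumed switch: HIGH = automatic sequence

void setup() {
  yawServo.attach(9);
  pitchServo.attach(10);
  pinMode(MODE_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(MODE_PIN) == HIGH) {
    // Automatic mode: a fixed sequence of (yaw, pitch) poses.
    const int sequence[][2] = {{90, 90}, {60, 100}, {120, 100}, {90, 80}};
    for (auto &pose : sequence) {
      yawServo.write(pose[0]);
      pitchServo.write(pose[1]);
      delay(1000);  // give the neck time to reach each pose
    }
  } else {
    // Manual mode: map each potentiometer directly to a servo angle.
    yawServo.write(map(analogRead(YAW_POT), 0, 1023, 60, 120));
    pitchServo.write(map(analogRead(PITCH_POT), 0, 1023, 80, 100));
  }
}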



Sunday, November 4, 2018

Humanoid Robot Head (based on InMoov project)




I've been following the InMoov project for a while. It's an open source humanoid robot from French designer Gaël Langevin, who has done a great job creating a nice and stylish robot.

Two years ago I started working as a full-time lecturer at the Costa Rica Institute of Technology, and I see this project as an engagement tool to attract high school students into STEM majors at college.

The InMoov website has detailed construction videos, bills of materials and STL files for the 3D printed parts. We 3D printed all the head parts and followed the assembly instructions. We took special care with the outside parts to make our robot look extra cool: we applied automotive primer sealer and carefully sanded everything to get a nice, smooth finish, then painted it white. For the eyes I followed Gaël's design of covering the eyeballs with a thermoplastic material to make them look glossy and shiny. It really enhances the appearance of the robot.


 

On the InMoov webpage the robot is controlled by a software called MyRobotLab; however, I decided to just use an Arduino. I also stacked 3 shields on the Arduino: one to power all the servos and 2 for the joysticks.
The robot has 2 servos on the eyes for two-dimensional movement, one servo to open and close the jaw, and 2 servos on the neck. Currently only 1 servo is running on the neck; I expect to mount the remaining one soon.


My first job in the Arduino code was to set the center of each servo and the limits it can reach before something breaks. Then I wrote some code to read the joystick positions (they are just potentiometers) and map those movements to an increment/decrement of the servo position.
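A minimal sketch of that increment/decrement idea for one servo; the center, limits, deadband and pins below are assumed example values, since on the real robot each servo gets its own calibrated numbers:

#include <Servo.h>

Servo eyeX;

const int CENTER = 90;
const int MIN_POS = 60;    // limit before something breaks
const int MAX_POS = 120;
const int DEADBAND = 100;  // ignore small joystick offsets around rest

int position = CENTER;

void setup() {
  eyeX.attach(9);          // assumed pin
  eyeX.write(position);
}

void loop() {
  int stick = analogRead(A0) - 512;  // joystick rests near mid-scale
  if (abs(stick) > DEADBAND) {
    // Nudge the servo one degree in the joystick's direction,
    // never past the safe limits.
    position += (stick > 0) ? 1 : -1;
    position = constrain(position, MIN_POS, MAX_POS);
    eyeX.write(position);
  }
  delay(15);  // sets the speed of the movement
}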



Future work on the robot head will be making a strong and nice-looking head base (not a cardboard box ;) ) and installing the missing neck servo in it. Then we want to start merging this robot with a chatbot developed by a researcher at the university, and hopefully we will have a talking robot!

The Arduino code is available here: GitHub Gist code




Thursday, January 26, 2017

Humanoid robot video

Here's a video of my Bioloid humanoid robot. As mentioned in a previous post, I controlled it using a BeagleBone Black embedded system instead of its proprietary controller. I created the routines for the arms, legs and hips to move based on inverse kinematics equations, trajectory generators, and some high-level behaviors.


Information about the project is in a previous post here.
Detailed information is here (in Spanish).
All the code used is here.

Wednesday, January 25, 2017

My humanoid robot (Bioloid robot controlled by a BeagleBone Black)

Well... actually it was not mine but my university's; still, I used it for my master's thesis and I taught it how to move from scratch.

The mathematical calculations and the code implemented are detailed in my thesis. You can find it here (it's in Spanish).

Bioloid robot with a BeagleBone Black board on the chest.

Usually Bioloid humanoid robots are controlled by Robotis' proprietary controller (a microcontroller board) and their proprietary software. On that system the basic moves come ready-made, and you just have to program the high-level instructions you want, like: "Robot, go get me some coffee". Well, maybe not that high level, but you do get basic moves like "walk forward" or "turn right" already programmed. The idea at the university research center (AI2) was to replace that controller with an embedded system, specifically a BeagleBone Black. By using an embedded system on future Bioloid projects, we could get the benefits of this type of system, like true parallelism in processes, extended computation capacity and easy integration with cameras and IMUs. But before any of that, the guts of the system had to be built, and that's where I came in...

My thesis objective was to develop the locomotion libraries (as an API) for the robot to move, that is, to create from scratch all the movements of the robot in order to provide high-level instructions like the ones the proprietary software has.

That's me with my robot.

The first thing I did was take the code created for a single robotic arm (see previous post) and use it for the two arms of the robot. For the legs I programmed the kinematic equations that describe the movement of the foot using the hip as a reference, and the equations that describe the movement of the hip using the foot as a reference. The reason to have 2 sets of equations for the legs is that, for example, walking is divided into two main phases: foot on the ground (here you move the hip) and foot in the air (here you move the foot). Likewise, I created two linear trajectory generators, one for the foot and one for the hip. Both generate movements in straight lines, since it's important to control at every moment where the foot and the hip are, and this way avoid any possible unstable position.

Matlab simulation of the legs as kinematic chains.
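The core idea behind the linear trajectory generators is simple; here is a sketch (the names and step logic are illustrative, not the actual API code). One generator is run with the foot coordinates expressed in the hip frame, the other with the hip coordinates in the foot frame, and each waypoint then goes through the corresponding inverse kinematics:

#include <vector>

struct Point3D { double x, y, z; };

// Sample a straight line between two Cartesian points in small steps,
// so the limb position is known and controlled at every moment.
std::vector<Point3D> linearTrajectory(Point3D from, Point3D to, int steps) {
  std::vector<Point3D> waypoints;
  for (int i = 1; i <= steps; ++i) {
    double t = static_cast<double>(i) / steps;  // 0..1 along the line
    waypoints.push_back({from.x + t * (to.x - from.x),
                         from.y + t * (to.y - from.y),
                         from.z + t * (to.z - from.z)});
  }
  return waypoints;
}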

This time, for the communications from the BeagleBone Black to the servos (the Dynamixel bus), I used a USB2Dynamixel device. The low-level drivers for the BeagleBone Black to send the Dynamixel data frames were created by David Cisternes as a generic API for communications on a Dynamixel bus.

The next task was to create those high-level movements everybody loves. For example, for a "one step forward" movement I divided the motion into 7 segments: lean sideways, lift foot, lower foot, move hip forward and sideways, lift back foot, lower back foot, return to normal position. Then I created the trajectories for all the limbs during each segment. This duty required a lot of trial-and-error correction to reach a stable movement.

Robot during one of the segments of the "one step forward" movement.

I called these high-level movements "behaviors", to be used in a state machine that ruled the robot logic. The robot would move between behaviors based on the input from sensors, in my case the ones implemented on the smart foot (see previous post). Finally the robot was able to climb or descend stairs (not big ones, it was a small robot after all!), walk forward and make small turns to the right or left.

Robot climbing one step of a stair

Robot descending the same step of the stair
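As a rough illustration of the behavior state machine idea (the names here are made up; the real logic lives in the Locomotion API linked below), transitions are driven by the smart-foot readings:

enum class Behavior { Idle, WalkForward, ClimbStep, DescendStep, TurnRight };

struct FootSensors {
  bool stepDetectedAhead;  // from the proximity sensor
  bool edgeDetectedBelow;  // from the pressure sensors
};

// Pick the next behavior from the current one and the foot sensors.
Behavior nextBehavior(Behavior current, const FootSensors& foot) {
  if (foot.stepDetectedAhead) return Behavior::ClimbStep;
  if (foot.edgeDetectedBelow) return Behavior::DescendStep;
  return Behavior::WalkForward;  // default: keep walking
}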

The code used on the BeagleBone Black and my thesis are on my GitHub page: https://github.com/jcbrenes/Bioloid-Locomotion-API




Wednesday, December 7, 2016

Arduino-based smart feet for a Bioloid robot (Arduino as a slave on the Dynamixel bus)

During my master's thesis, I needed some proprioception from the humanoid robot's feet. The final objective was to make the robot climb and descend stairs, so I placed some pressure sensors and a proximity sensor on the robot's foot.

With all these sensors in the foot, direct wiring to the controller would mean a lot of cables (ground, Vcc, signals…), so I decided to connect them to the Dynamixel bus instead. I used an Arduino Nano for this.

I used a Sharp GP2Y0A21YK0F proximity sensor and four FSR pressure sensors, all connected to the Arduino's analog inputs. The FSR sensors are quite noisy, so I applied a rolling average to filter the measured values.
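A minimal sketch of that rolling average for one input (the window size and pin are assumptions; in the real code each sensor needs its own buffer):

const int WINDOW = 8;
int samples[WINDOW];  // circular buffer of recent readings
int idx = 0;
long sum = 0;

int readFsrFiltered(int pin) {
  sum -= samples[idx];             // drop the oldest sample
  samples[idx] = analogRead(pin);  // take a fresh reading
  sum += samples[idx];
  idx = (idx + 1) % WINDOW;
  return sum / WINDOW;             // average of the last WINDOW samples
}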

The Arduino has a TTL UART that communicates using 3 lines: Tx, Rx and GND. The Dynamixel communication bus has 3 different lines: Vcc, GND and Data, this last one used both to send and to receive. To convert the 2 data lines on the Arduino board into a single one, I used a 74LS241 chip, which contains several 3-state buffers, implementing the connections described on this web page. Note that I had to use an enable signal from the Arduino to the buffer to switch between transmitting and receiving.

External circuit from Arduino to Dynamixel bus. Image from Savage Electronics.
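In software, that enable signal boils down to a direction switch around each transmission; something like this (the pin number is an assumption):

const int DIR_PIN = 4;  // assumed pin driving the buffer enables

// Enable the TX buffer only while the Arduino is transmitting;
// otherwise stay in receive mode so the slave doesn't fight the bus.
void busTransmitMode() { digitalWrite(DIR_PIN, HIGH); }

void busReceiveMode() {
  Serial.flush();          // wait until the last byte has gone out
  digitalWrite(DIR_PIN, LOW);
}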

I programmed the Arduino to keep listening for messages on the serial bus: when it detects the header of a Dynamixel data frame, it checks the ID of the addressed device, and if it is the right ID, it processes the rest of the message.

Dynamixel frame format. Message sent from Master to Slave.

The data from each sensor was assigned to a different address, creating a logical table of sensor data. When the Arduino is asked for the data at a specific address, it responds with a message carrying a 2-byte value, as shown in the following example.

Dynamixel frame format. Message response from Slave to Master.
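For flavor, here is a sketch of how such a status packet can be built and sent over the Dynamixel 1.0 protocol. The ID, direction pin and sensor table are illustrative (the full code is linked at the end of the post); the checksum is the standard inverted sum over ID, length, error and parameters:

const byte MY_ID = 100;  // assumed ID for the smart foot
const int DIR_PIN = 4;   // buffer enable (see circuit above)
int sensorTable[5];      // logical table: one entry per address

byte checksum(const byte* data, int len) {
  int sum = 0;
  for (int i = 0; i < len; i++) sum += data[i];
  return ~sum;           // Dynamixel 1.0 checksum
}

void sendStatusPacket(byte address) {
  int value = sensorTable[address];
  byte packet[] = { 0xFF, 0xFF, MY_ID,          // header and ID
                    0x04, 0x00,                 // length, no error
                    (byte)(value & 0xFF),       // low byte of the value
                    (byte)(value >> 8), 0 };    // high byte, checksum slot
  packet[7] = checksum(packet + 2, 5);          // over ID..params
  digitalWrite(DIR_PIN, HIGH);                  // take the bus
  Serial.write(packet, sizeof(packet));
  Serial.flush();
  digitalWrite(DIR_PIN, LOW);                   // release the bus
}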

Finally I soldered all the circuits onto a prototype board and placed the Arduino Nano on top of the components to save space, as shown in the picture below. The pins are for connecting the sensors and the Dynamixel bus.

Implemented circuit (components are below Arduino Nano).

Once I placed the board in the foot and connected the sensors, it looked like the following pictures.

Assembled circuit on the robot's foot

New smart feet placed on the robot.

The Arduino code is on my GitHub page: https://github.com/jcbrenes/Planta-SerialDynamixel


Wednesday, November 30, 2016

New Robotic Arm with Dynamixel servos

For the last year and some months I was working on my master's degree. As part of the embedded systems class, I created a robotic arm that can write some letters. Pretty cool, right?

The arm has 3 degrees of freedom (DOF) using Dynamixel RX-10 servos. I controlled the robot with a BeagleBone Black board with a cape that has a MAX485 chip, which converts TTL UART communications to the RS-485 differential protocol used by the Dynamixel servos. I also had to create some drivers for the controller to speak the Dynamixel language (a specific data frame format) on the communications bus.
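To give an idea of what "speaking Dynamixel" means, here is an illustrative construction of a WRITE_DATA instruction frame targeting the goal-position register (the register address shown is the standard one for RX-series servos; my actual drivers were more general than this):

#include <cstdint>
#include <vector>

// Build a Dynamixel 1.0 frame that commands one servo to a position.
std::vector<uint8_t> goalPositionFrame(uint8_t id, uint16_t position) {
  const uint8_t GOAL_POSITION_L = 0x1E;  // goal-position register address
  std::vector<uint8_t> frame = {
    0xFF, 0xFF, id,
    0x05,                                // length = params + 2
    0x03,                                // WRITE_DATA instruction
    GOAL_POSITION_L,
    static_cast<uint8_t>(position & 0xFF),
    static_cast<uint8_t>(position >> 8)
  };
  uint8_t sum = 0;
  for (size_t i = 2; i < frame.size(); ++i) sum += frame[i];
  frame.push_back(static_cast<uint8_t>(~sum));  // checksum
  return frame;  // bytes to push out through the MAX485
}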


For the arm to reach a point in 3D space, I programmed the inverse kinematic equations that translate the desired point into a set of angles for the servos. I used the equations proposed by Núñez et al. in the paper "Explicit analytic solution for inverse kinematics of Bioloid humanoid robot" (IEEE paper site).
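This is not the exact closed-form solution from that paper, but the classic two-link planar case gives the flavor of these equations: given a target (x, y) and link lengths L1 and L2, the elbow and shoulder angles come out of the law of cosines:

#include <cmath>

// Returns false if the target is out of reach; otherwise fills in the
// elbow-down solution for the two joint angles (in radians).
bool planarIk(double x, double y, double L1, double L2,
              double& shoulder, double& elbow) {
  double d2 = x * x + y * y;
  double c2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
  if (c2 < -1.0 || c2 > 1.0) return false;   // target out of reach
  elbow = std::acos(c2);
  shoulder = std::atan2(y, x)
           - std::atan2(L2 * std::sin(elbow), L1 + L2 * std::cos(elbow));
  return true;
}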

I also created two trajectory generators: one for free trajectories (moving from one point to another without caring about the path described) and one for linear trajectories (moving between 2 points along a straight line).
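The difference between the two, sketched in C++ (solveIkAndCommand is a stand-in for the real API calls, not their actual names): the free generator solves the inverse kinematics once and lets the servos travel however they like, while the linear one samples intermediate points on the straight line and solves the IK for each waypoint.

#include <cstdio>

struct Point3D { double x, y, z; };

// Stand-in: solve IK for p and command the servos accordingly.
void solveIkAndCommand(const Point3D& p) {
  std::printf("waypoint: %.1f %.1f %.1f\n", p.x, p.y, p.z);
}

void moveFree(Point3D target) {
  solveIkAndCommand(target);        // one IK solve, one command
}

void moveLinear(Point3D from, Point3D to, int steps) {
  for (int i = 1; i <= steps; ++i) {
    double t = static_cast<double>(i) / steps;
    Point3D p = { from.x + t * (to.x - from.x),
                  from.y + t * (to.y - from.y),
                  from.z + t * (to.z - from.z) };
    solveIkAndCommand(p);           // IK solve per waypoint
  }
}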

Finally I programmed a routine on the robot using these trajectory functions, so it can trace some letters on a vertical plane. Here's a video of the robot:


The code for the project can be found here: project link.
And here is the technical document for the project (in Spanish): document link

My github site: https://github.com/jcbrenes


Thursday, January 15, 2015

Getting feedback from the servo

You see, hobby servos are extremely easy to use, but they are also pretty dumb: you tell them something like "go to 75 degrees" and then have faith that the servo reaches that point, because you never get any confirmation from it. So in a situation that can bias the position, like a lot of torque applied against the rotor, you get no clue where it actually is (besides the visual feedback of seeing the robot arm smash into somewhere it shouldn't).

I decided I wanted to know where the servo rotor and horn are actually located at any time. For this endeavor I used the potentiometer that is already built into the servo.

First I opened the servo. Then I desoldered the connections from the servo PCB to the motor, so I could remove the PCB and gain access to the potentiometer. Once I could see the potentiometer, I soldered a wire to the middle connection point (there are 3 connection points). I made a small notch in the servo case to route the new wire, re-soldered the PCB to the motor, and finally closed the servo.

Opening the servo case.
Servo PCB just desoldered from the motor connections.
Note the potentiometer on the right; I soldered a line to the yellow wire.

Servo closed again with the position feedback line coming from the case.

With this method I measured the voltage from the wiper of the potentiometer (that's the middle connection point) to ground. I used an analog input on the Arduino, so I had to convert the analog reading (a number from 0 to 1023) to a voltage value (a number from 0 to 5 V).

To characterize which voltage value corresponds to each position, I removed each servo from the arm and swept across all positions without any load. While doing this, the Arduino sent the commanded position and the measured voltage to my PC over serial communication. With all that information, I graphed voltage vs. position and fitted a linear trend line to the graph; the resulting best-fit equation is my model of the servo position.

Shoulder servo graph with best fit line and equation.
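Using the fitted line is then a one-liner per servo; the slope and offset below are placeholder example values, since each servo gets its own numbers from its graph:

// Estimate the real servo angle from the feedback wire.
const float SLOPE = 55.0;    // degrees per volt (example value)
const float OFFSET = -20.0;  // degrees (example value)

float readServoAngle(int pin) {
  float volts = analogRead(pin) * (5.0 / 1023.0);  // ADC counts to volts
  return SLOPE * volts + OFFSET;                   // fitted line model
}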

The next step will be to use this position model in my arm code and develop some closed-loop control logic to correct the position based on the feedback.