
Sunday, November 4, 2018

Humanoid Robot Head (based on InMoov project)




I've been following the InMoov project for a while. It's an open-source humanoid robot by French designer Gaël Langevin, who has done a great job creating a nice and stylish robot.

Two years ago I started working as a full-time lecturer at the Costa Rica Institute of Technology, and I see this project as an engagement tool to attract high school students into STEM majors at college.

The InMoov website has detailed construction videos, bills of materials, and STL files for the 3D-printed parts. We 3D printed all the head parts and followed the assembly instructions. We took special care with the outside parts to make our robot look extra cool: we applied automotive primer sealer and carefully sanded everything to get a nice, smooth finish, then painted it white. For the eyes I followed Gaël's design of covering the eyeballs with a thermoplastic material to make them look glossy and shiny. It really enhances the appearance of the robot.


 

On the InMoov webpage the robot is controlled by a software package called MyRobotLab; however, I decided to just use an Arduino. I also stacked 3 shields on top of the Arduino: one to power all the servos and two for the joysticks.
The robot has 2 servos on the eyes for two-dimensional movement, one servo to open and close the jaw, and 2 servos on the neck. Currently only 1 neck servo is running; I expect to mount the remaining one soon.


My first job in the Arduino code was to set the center position of each servo and the limits it can reach before something breaks. Then I wrote code to read the joystick positions (they are just potentiometers) and map those readings to an increment/decrement of the servo position.
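
To give an idea of the logic, here is a minimal sketch for one axis. The pin numbers, center, and limit values are made-up examples, not the calibrated values from my robot (those are in the code linked below).

#include <Servo.h>

const int EYE_PAN_PIN = 9;      // PWM pin for the eye pan servo (example value)
const int JOY_X_PIN   = A0;     // joystick X axis (just a potentiometer)

const int CENTER_DEG = 90;      // calibrated center position (example value)
const int MIN_DEG    = 60;      // limits before something breaks (example values)
const int MAX_DEG    = 120;

Servo eyePan;
int pos = CENTER_DEG;

void setup() {
  eyePan.attach(EYE_PAN_PIN);
  eyePan.write(pos);            // start at the calibrated center
}

void loop() {
  // Map the raw reading (0..1023) to a small increment (-2..+2 degrees)
  int delta = map(analogRead(JOY_X_PIN), 0, 1023, -2, 2);
  // Apply the increment and clamp to the safe range
  pos = constrain(pos + delta, MIN_DEG, MAX_DEG);
  eyePan.write(pos);
  delay(20);                    // ~50 Hz update rate
}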



Future work on the robot head will be making a strong and nice-looking head base (not a cardboard box ;) ) and installing the missing neck servo in it. Then we want to start merging this robot with a chatbot developed by a researcher at the university, and hopefully we will have a talking robot!

The Arduino code is available here: GitHub Gist code




Wednesday, January 25, 2017

My humanoid robot (Bioloid robot controlled by a BeagleBone Black)

Well... actually it was not mine but my university's; still, I used it for my master's thesis and I taught it how to move from scratch.

The mathematical calculations and the code implemented are detailed in my thesis. You can find it here (it's in Spanish).

Bioloid robot with a BeagleBone Black board on the chest.

Usually Bioloid humanoid robots are controlled by Robotis' proprietary controller (a microcontroller board) and their proprietary software. On that system the basic moves are already implemented, and you just have to program the high-level instructions you want, like: "Robot, go get me some coffee". Well, maybe not that high-level, but basic moves like "walk forward" or "turn right" come already programmed. The idea at the university research center (AI2) was to replace that controller with an embedded system, specifically a BeagleBone Black. By using an embedded system in future Bioloid projects, we could get the benefits of this type of system, like true parallelism in processes, extended computation capacity, and easy integration with cameras and IMUs. But before that, the guts of the system had to be built, and here I come...

My thesis objective was to develop the locomotion libraries (as an API) for the robot to move, that is, to create all the robot's movements from scratch in order to provide high-level instructions like the ones the proprietary software offers.

That's me with my robot.

The first thing I did was to take the code created for a single robotic arm (see previous post) and reuse it for the robot's two arms. For the legs, I programmed the kinematic equations that describe the movement of the foot using the hip as a reference, and the equations that describe the hip movement using the foot as a reference. The reason for having 2 sets of equations for the legs is that walking, for example, is divided into two main phases: foot on the ground (here you move the hip) and foot in the air (here you move the foot). Likewise, I created two linear trajectory generators, one for the foot and one for the hip. Both generate movements in straight lines, since it's important to control where the foot and hip are at every moment and thus avoid any possibly unstable position.
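
To illustrate the idea, here is a minimal sketch of such a linear trajectory generator. The structure, names, and numbers are simplified placeholders; the real implementation is in the API linked at the end of this post.

#include <array>
#include <cstdio>

struct Point3D { double x, y, z; };

// Generate equally spaced waypoints on the straight segment from a to b.
std::array<Point3D, 10> linearTrajectory(const Point3D& a, const Point3D& b) {
  std::array<Point3D, 10> path;
  const int n = static_cast<int>(path.size());
  for (int i = 0; i < n; ++i) {
    double t = static_cast<double>(i) / (n - 1);   // 0.0 .. 1.0
    path[i] = { a.x + t * (b.x - a.x),
                a.y + t * (b.y - a.y),
                a.z + t * (b.z - a.z) };
  }
  return path;
}

int main() {
  // Example: lift the foot 30 mm while moving it 40 mm forward.
  Point3D start{0.0, 0.0, 0.0}, goal{40.0, 0.0, 30.0};
  for (const Point3D& p : linearTrajectory(start, goal)) {
    // In the real API each waypoint goes through the leg's inverse
    // kinematics to obtain joint angles for the servos.
    std::printf("x=%5.1f  y=%5.1f  z=%5.1f\n", p.x, p.y, p.z);
  }
  return 0;
}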

Matlab simulation of the legs as kinematic chains.

This time, for communication from the BeagleBone Black to the servos (the Dynamixel bus), I used a USB2Dynamixel device. The low-level drivers that let the BeagleBone Black send the Dynamixel data frames were created by David Cisternes as a generic API for communication on a Dynamixel bus.
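
For reference, this is roughly what building one of those data frames looks like. The packet layout follows the documented Dynamixel Protocol 1.0 format; the servo ID and register values below are just illustrative, and the actual driver is David's API, not this sketch.

#include <cstdint>
#include <vector>

// Build a Protocol 1.0 WRITE_DATA packet:
// 0xFF 0xFF | ID | LENGTH | INSTRUCTION | PARAMS... | CHECKSUM
std::vector<uint8_t> makeWritePacket(uint8_t id, uint8_t reg,
                                     const std::vector<uint8_t>& data) {
  std::vector<uint8_t> pkt = {0xFF, 0xFF, id,
                              // LENGTH = (#params) + 2; params = reg + data
                              static_cast<uint8_t>(data.size() + 3),
                              0x03,   // WRITE_DATA instruction
                              reg};   // first parameter: register address
  pkt.insert(pkt.end(), data.begin(), data.end());
  uint8_t sum = 0;
  for (size_t i = 2; i < pkt.size(); ++i) sum += pkt[i];  // skip the 0xFF header
  pkt.push_back(static_cast<uint8_t>(~sum));              // checksum = ~sum, low byte
  return pkt;
}

int main() {
  // Example: move servo ID 3 to goal position 512 (AX-12 goal position
  // register 30, low byte first). ID and value chosen just for illustration.
  uint16_t goal = 512;
  std::vector<uint8_t> pkt = makeWritePacket(3, 30,
      {static_cast<uint8_t>(goal & 0xFF), static_cast<uint8_t>(goal >> 8)});
  // pkt would then be written to the serial port behind the USB2Dynamixel.
  return 0;
}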

The next task was to create those high-level movements everybody loves. For example, for a "one step forward" movement I divided the motion into 7 segments: lean sideways, lift foot, lower foot, move hip forward and sideways, lift back foot, lower back foot, return to normal position. Then I created the trajectories for all the limbs during each segment. This duty required a lot of trial-and-error correction to reach a stable movement.
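
A simplified sketch of how such a segment sequence can be represented; the durations here are invented for illustration, and the real trajectories live in the API.

#include <cstdio>
#include <string>
#include <vector>

struct Segment {
  std::string name;
  int durationMs;   // rough duration of the trajectory for this segment
};

// The 7 segments of "one step forward", in execution order.
const std::vector<Segment> stepForward = {
  {"lean sideways",                 400},
  {"lift foot",                     300},
  {"lower foot",                    300},
  {"move hip forward and sideways", 500},
  {"lift back foot",                300},
  {"lower back foot",               300},
  {"return to normal position",     400},
};

int main() {
  for (const Segment& s : stepForward) {
    // In the real API each segment triggers the trajectory generators,
    // whose waypoints are turned into servo goal positions.
    std::printf("segment: %-31s %4d ms\n", s.name.c_str(), s.durationMs);
  }
  return 0;
}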

Robot during one of the segments of the "one step forward" movement.

I called these high-level movements "behaviors" so they could be used in a state machine that governed the robot's logic. The robot would move between behaviors based on sensor input, in my case from the sensors implemented in the smart foot (see previous post). Finally the robot was able to climb or descend stairs (not big ones, it was a small robot after all!), walk forward, and make small turns to the right or left.
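
Here is a toy sketch of that state machine idea, with simplified sensor inputs and transition rules; the real logic used the smart foot's sensors and more behaviors.

#include <cstdio>

enum class Behavior { WalkForward, ClimbStep, DescendStep, TurnLeft, TurnRight };

// Simplified stand-in for the smart foot readings.
struct FootSensors {
  bool obstacleAhead;   // e.g. a step detected in front of the foot
  bool edgeDetected;    // e.g. no ground contact ahead of the foot
};

// Pick the next behavior from the current one and the sensor input.
Behavior nextBehavior(Behavior /*current*/, const FootSensors& s) {
  if (s.obstacleAhead) return Behavior::ClimbStep;
  if (s.edgeDetected)  return Behavior::DescendStep;
  return Behavior::WalkForward;   // default behavior
}

int main() {
  Behavior b = Behavior::WalkForward;
  FootSensors sensors{true, false};       // pretend a step was detected
  b = nextBehavior(b, sensors);
  std::printf("next behavior: %d\n", static_cast<int>(b));  // 1 = ClimbStep
  return 0;
}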

Robot climbing one step of a staircase.

Robot descending the same step of the staircase.

The code used on the BeagleBone Black and my thesis are on my GitHub: https://github.com/jcbrenes/Bioloid-Locomotion-API