This is Max. He is a NAO robot from Aldebaran.

Max is a NAO robot from Aldebaran, a 58 cm tall humanoid. He is small, cute and round; you can't help but love him! NAO is intended to be a friendly companion around the house. He moves, recognises you, hears you and even talks to you. He fulfils all the criteria to be regarded as a robot, and his software has sufficient capacity, capability and flexibility for Max to be considered a form of Artificial Intelligence.

Since his birth in 2006, he has been constantly evolving to please, amuse, understand and love you. In short, to one day become your friend. Aldebaran created NAO to be a true daily companion. He is the little creature who helps you be your best. His humanoid form and extreme interactivity make him really endearing and loveable.

Robots like Max are already at work doing important jobs. NAO robots are active in special needs schools, where their ability to deal tirelessly with children on the autism spectrum is producing encouraging results. It is not only children with special needs who can benefit: residents in retirement homes and care homes can interact with robots like Max in a meaningful way, and Max never gets bored, frustrated or distracted when answering the same question for the 20th time today, or the 200th. In all caring and nurturing environments there are a variety of tasks; one way to characterise them is to classify them as either warm or cold. Warm tasks might, for example, require the intervention of a real human being with empathy, authority and responsibility, whereas cold tasks need only standard responses to a limited range of stimuli. Max is very good at cold tasks: he can run a bingo game complete with jokes and all the usual patter, and he can demonstrate exercises and physical activities for children or elderly people. Max can provide basic information on request, including using the internet for up-to-date information on what's in the news and the latest weather reports.

He can manage enough in the way of polite conversation to engage someone who might struggle with real people.

Max has an array of sensors, and the ability to communicate with him over Wi-Fi in real time means that a healthcare professional can monitor and interact with residents and patients through Max.



NAO is a programmable, 58 cm tall humanoid robot with the following key components:

  • Body with 25 degrees of freedom (DOF) whose key elements are electric motors and actuators
  • Sensor network: two cameras, four directional microphones, sonar rangefinder, two IR emitters and receivers, one inertial board, nine tactile sensors and eight pressure sensors
  • Various communication devices, including voice synthesizer, LED lights, and 2 high-fidelity speakers
  • Intel ATOM 1.6 GHz CPU (located in the head) that runs a Linux kernel and supports Aldebaran’s proprietary middleware (NAOqi)
  • Second CPU (located in the torso)
  • 48.6-watt-hour battery that provides NAO with 1.5 or more hours of autonomy, depending on usage

Omnidirectional walking

NAO's walking uses a simple dynamic model (linear inverted pendulum) and quadratic programming. It is stabilized using feedback from joint sensors. This makes walking robust and resistant to small disturbances, and torso oscillations in the frontal and lateral planes are absorbed. NAO can walk on a variety of floor surfaces, such as carpeted, tiled, and wooden floors, and can transition between these surfaces while walking.
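
For anyone who wants to try the walk engine, here is a minimal sketch (not official Aldebaran sample code) using the NAOqi Python SDK; the hostname "nao.local", the velocities and the five second duration are just illustrative assumptions.

    import time
    from naoqi import ALProxy

    NAO_IP = "nao.local"   # assumption: replace with your robot's address

    motion = ALProxy("ALMotion", NAO_IP, 9559)
    posture = ALProxy("ALRobotPosture", NAO_IP, 9559)

    motion.wakeUp()                        # switch the motors on (stiffness)
    posture.goToPosture("StandInit", 0.5)  # a stable standing posture

    motion.moveInit()                      # prepare the walk engine
    # moveToward(x, y, theta) takes normalized velocities in [-1, 1]:
    # forwards/backwards, sideways, and rotation around the vertical axis.
    motion.moveToward(0.5, 0.0, 0.2)       # walk forwards while turning left
    time.sleep(5.0)
    motion.stopMove()
    motion.rest()                          # sit down and relax the motors

Because the walk is omnidirectional, the same call can combine forward, sideways and rotational velocities in a single command.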

Whole body motion

NAO's motion module is based on generalized inverse kinematics, which handles Cartesian coordinates, joint control, balance, redundancy, and task priority. This means that when you ask NAO to extend its arm, it bends over, because its arm and leg joints are taken into account together. NAO will stop a movement if necessary to maintain balance.
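
As a rough illustration of this behaviour, the sketch below asks the whole body balancer to keep both feet planted while the right arm is driven in Cartesian space; the address, the target coordinates and the availability of the balancer API on your NAOqi version are assumptions.

    import time
    from naoqi import ALProxy

    motion = ALProxy("ALMotion", "nao.local", 9559)  # address is an assumption
    motion.wakeUp()

    motion.wbEnable(True)                            # turn the whole body balancer on
    motion.wbFootState("Fixed", "Legs")              # both feet stay planted
    motion.wbEnableBalanceConstraint(True, "Legs")   # keep the CoM over the support polygon

    # Reach forwards with the right arm; the legs bend so the robot
    # leans without losing its balance.
    motion.wbEnableEffectorControl("RArm", True)
    motion.wbSetEffectorControl("RArm", [0.25, -0.10, 0.30])  # x, y, z in metres
    time.sleep(3.0)

    motion.wbEnableEffectorControl("RArm", False)
    motion.wbEnable(False)                           # back to normal joint control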

Fall Manager

The Fall Manager protects NAO when it falls. Its main function is to detect when NAO's center of mass (CoM) shifts outside the support polygon. The support polygon is determined by the position of the foot or feet in contact with the ground. When a fall is detected, all motion tasks are killed and, depending on the direction, NAO's arms assume protective positioning, the CoM is lowered, and robot stiffness is reduced to zero.
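
A hedged sketch of working with this reflex from Python: check that the Fall Manager is enabled and watch ALMemory for the fall event. The event name "robotHasFallen" and the polling approach are written from memory of the NAOqi documentation, so treat them as assumptions to verify.

    import time
    from naoqi import ALProxy

    motion = ALProxy("ALMotion", "nao.local", 9559)  # address is an assumption
    memory = ALProxy("ALMemory", "nao.local", 9559)

    motion.setFallManagerEnabled(True)   # the reflex is on by default
    print("Fall Manager enabled: %s" % motion.getFallManagerEnabled())

    while True:
        if memory.getData("robotHasFallen"):   # raised when a fall is detected
            print("NAO has fallen; motion tasks were killed by the reflex")
            break
        time.sleep(0.1)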

NAO has two cameras and can track, learn, and recognize images and faces.

To do this, NAO includes a set of algorithms for detecting and recognizing faces and shapes, so it can recognize who is talking to it or find a ball or, eventually, more complex objects.

These algorithms have been specially developed, with constant attention to using a minimum of processor resources.

Furthermore, NAO’s SDK lets you develop your own modules to interface with OpenCV (the Open Source Computer Vision library originally developed by Intel).

Since you can execute modules on NAO or transfer them to a PC connected to NAO, you can easily use the OpenCV display functions to develop and test your algorithms with image feedback.
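
Here is a minimal sketch of that workflow: frames are pulled from NAO's top camera over the network and displayed with OpenCV on the PC. The address is an assumption, and the resolution and colour-space codes should be checked against the ALVideoDevice documentation.

    import numpy as np
    import cv2
    from naoqi import ALProxy

    video = ALProxy("ALVideoDevice", "nao.local", 9559)  # address is an assumption

    # Top camera (0), VGA resolution (2), BGR colour space (13), 10 fps.
    client = video.subscribeCamera("opencv_demo", 0, 2, 13, 10)
    try:
        for _ in range(200):
            frame = video.getImageRemote(client)
            width, height, raw = frame[0], frame[1], frame[6]
            img = np.frombuffer(raw, dtype=np.uint8).reshape((height, width, 3))
            cv2.imshow("NAO camera", img)
            if cv2.waitKey(1) == 27:   # press Esc to quit
                break
    finally:
        video.unsubscribe(client)
        cv2.destroyAllWindows()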

NAO uses four directional microphones to detect sounds, and his voice recognition and text-to-speech capabilities allow him to communicate in 19 languages.
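
A small sketch of that pipeline, using ALTextToSpeech for speaking and ALSpeechRecognition with a fixed vocabulary for listening; the address, the vocabulary and the five second listening window are illustrative assumptions.

    import time
    from naoqi import ALProxy

    NAO_IP = "nao.local"   # assumption

    tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
    asr = ALProxy("ALSpeechRecognition", NAO_IP, 9559)
    memory = ALProxy("ALMemory", NAO_IP, 9559)

    tts.setLanguage("English")
    tts.say("Hello, I am listening.")

    asr.setLanguage("English")
    asr.setVocabulary(["yes", "no", "bingo"], False)  # False = no word spotting
    asr.subscribe("asr_demo")                         # start the recognition engine
    time.sleep(5.0)                                   # listen for a few seconds
    asr.unsubscribe("asr_demo")

    word, confidence = memory.getData("WordRecognized")[:2]
    print("Heard '%s' with confidence %.2f" % (word, confidence))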

Sound Source Localization

One of the main purposes of humanoid robots is to interact with people. Sound localization allows a robot to identify the direction of sounds. To produce robust and useful outputs while meeting CPU and memory requirements, NAO sound source localization is based on an approach known as “Time Difference of Arrival.”

When a nearby source emits a sound, each of NAO’s four microphones receives the sound wave at slightly different times.

For example, if someone talks to NAO on its left side, the corresponding sound wave first hits the left microphones, then the front and rear microphones a few milliseconds later, and finally the right microphone.

These differences, known as interaural time difference (ITD), can then be mathematically processed to determine the location of the emitting source.

By solving these equations every time it hears a sound, NAO can determine the direction of the emitting source (azimuth and elevation angles) from the ITDs between its four microphones.
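
To make the idea concrete, here is a worked example using a simplified far-field model with just two microphones; it illustrates the principle rather than the actual NAOqi implementation. A plane wave arriving at azimuth theta reaches the two microphones with a time difference dt = d * sin(theta) / c, so the azimuth can be recovered with an arcsine.

    import math

    SPEED_OF_SOUND = 343.0   # m/s at room temperature
    MIC_SPACING = 0.10       # metres between the two microphones (illustrative value)

    def azimuth_from_itd(itd_seconds):
        """Estimated azimuth in degrees for a measured time difference."""
        ratio = SPEED_OF_SOUND * itd_seconds / MIC_SPACING
        ratio = max(-1.0, min(1.0, ratio))   # clamp against measurement noise
        return math.degrees(math.asin(ratio))

    # A sound that hits the left microphone 0.2 ms before the right one
    print(azimuth_from_itd(0.0002))   # roughly 43 degrees to the left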

This feature is available as a NAOqi module called ALAudioSourceLocalization; it provides a C++ and Python API that allows precise interactions with a Python script or NAOqi module.
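
A hedged sketch of using the module from Python: subscribe to the extractor, then read its output from ALMemory. The memory key "ALAudioSourceLocalization/SoundLocated" and the payload layout are written from memory of the NAOqi documentation, so verify them before relying on this.

    import time
    from naoqi import ALProxy

    loc = ALProxy("ALAudioSourceLocalization", "nao.local", 9559)  # address is an assumption
    memory = ALProxy("ALMemory", "nao.local", 9559)

    loc.subscribe("sound_demo")   # start the extractor
    try:
        for _ in range(50):
            data = memory.getData("ALAudioSourceLocalization/SoundLocated")
            if data:
                azimuth, elevation = data[1][0], data[1][1]   # radians
                print("Sound at azimuth %.2f rad, elevation %.2f rad" % (azimuth, elevation))
            time.sleep(0.2)
    finally:
        loc.unsubscribe("sound_demo")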

Two Choregraphe boxes that allow easy use of the feature inside a behavior are also available.

Possible applications include:

  • Human Detection, Tracking, and Recognition
  • Noisy Object Detection, Tracking, and Recognition
  • Speech Recognition in a specific direction
  • Speaker Recognition in a specific direction
  • Remote Monitoring/Security applications
  • Entertainment applications

Audio Signal Processing

In robotics, embedded processors have limited computational power, making it useful to perform some calculations remotely on a desktop computer or server.

This is especially true for audio signal processing; for example, speech recognition often takes place more efficiently, faster, and more accurately on a remote processor. Most modern smartphones process voice recognition remotely.

Users may want to use their own signal processing algorithms directly in the robot.

The NAOqi framework uses the Simple Object Access Protocol (SOAP) to send and receive audio signals over the network to a remote processing platform.

Sound is produced and recorded in NAO using the Advanced Linux Sound Architecture (ALSA) library.

The ALAudioDevice module manages audio inputs and outputs.

Using NAO’s audio capabilities, a wide range of experiments and research can take place in the fields of communications and human-robot interaction.

For example, users can employ NAO as a communication device, talking to it and listening to it as if it were a human being.

Signal processing is of course an interesting example. Thanks to the audio module, you can get the raw audio data from the microphones in real time and process it with your own code.
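
Here is a hedged sketch of that remote pattern: a small NAOqi module running on a PC receives the microphone buffers from ALAudioDevice and processes them locally (here it just prints a rough energy level). The class and variable names are my own; the ALModule/ALBroker/processRemote structure follows the NAOqi SDK, and the channel and sample-rate settings are assumptions.

    import time
    import numpy as np
    from naoqi import ALModule, ALProxy, ALBroker

    NAO_IP, NAO_PORT = "nao.local", 9559   # assumptions

    class SoundReceiver(ALModule):
        def __init__(self, name):
            ALModule.__init__(self, name)
            self.audio = ALProxy("ALAudioDevice")
            # 16 kHz, front microphone only (3), interleaved buffers (0)
            self.audio.setClientPreferences(name, 16000, 3, 0)
            self.audio.subscribe(name)

        def processRemote(self, nbChannels, nbSamples, timeStamp, inputBuffer):
            # Called by ALAudioDevice with raw 16-bit PCM samples.
            samples = np.frombuffer(inputBuffer, dtype=np.int16).astype(np.float32)
            print("energy: %.1f" % np.abs(samples).mean())

    # A broker is needed so NAOqi on the robot can call processRemote back here.
    broker = ALBroker("audioBroker", "0.0.0.0", 0, NAO_IP, NAO_PORT)
    receiver = SoundReceiver("receiver")   # global name must match the module name

    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        receiver.audio.unsubscribe("receiver")
        broker.shutdown()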

Tactile Sensors

Besides cameras and microphones, NAO is fitted with capacitive sensors positioned on top of its head in three sections and on its hands.

You can therefore give NAO information through touch: pressing once to tell it to shut down, for example, or using the sensors as a series of buttons to trigger an associated action.

The system comes with LED lights that indicate the type of contact. You can also program complex sequences.
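
As a sketch of the buttons-plus-LEDs idea, the loop below watches the three head sensors and flashes the face LEDs when one is touched; the address, colours and polling approach are assumptions, while the memory key names follow the NAOqi documentation as I recall it.

    import time
    from naoqi import ALProxy

    memory = ALProxy("ALMemory", "nao.local", 9559)   # address is an assumption
    leds = ALProxy("ALLeds", "nao.local", 9559)
    tts = ALProxy("ALTextToSpeech", "nao.local", 9559)

    HEAD_SENSORS = ["FrontTactilTouched", "MiddleTactilTouched", "RearTactilTouched"]

    while True:
        for key in HEAD_SENSORS:
            if memory.getData(key):                        # 1.0 while the section is pressed
                leds.fadeRGB("FaceLeds", 0x0000FF00, 0.2)  # flash the face LEDs green
                tts.say("Head sensor touched")
                leds.fadeRGB("FaceLeds", 0x00FFFFFF, 0.2)  # back to white
        time.sleep(0.1)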

Sonar Rangefinders

NAO is equipped with two sonar channels: two transmitters and two receivers.

They allow NAO to estimate the distances to obstacles in his environment. The detection range is 1 cm to 3 metres.

Below 15 cm there is no distance information; NAO only knows that an object is present.
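
A hedged sketch of reading the rangefinders: start the ALSonar extractor, then read the left and right distances (in metres) from ALMemory. The memory key names follow the NAOqi documentation as I recall it, and the address is an assumption.

    import time
    from naoqi import ALProxy

    sonar = ALProxy("ALSonar", "nao.local", 9559)
    memory = ALProxy("ALMemory", "nao.local", 9559)

    sonar.subscribe("obstacle_demo")   # powers up the transmitters and receivers
    try:
        for _ in range(20):
            left = memory.getData("Device/SubDeviceList/US/Left/Sensor/Value")
            right = memory.getData("Device/SubDeviceList/US/Right/Sensor/Value")
            print("obstacle at %.2f m (left) / %.2f m (right)" % (left, right))
            time.sleep(0.5)
    finally:
        sonar.unsubscribe("obstacle_demo")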

Ethernet and Wi-Fi

NAO supports Wi-Fi and Ethernet, currently the most widespread network communication protocols. In addition, infrared transceivers in his eyes allow connection to objects in the environment. NAO is compatible with the IEEE 802.11 b/g/n Wi-Fi standard and can be used on both WPA and WEP networks, making it possible to connect him to most home and office networks. NAO's OS supports both Ethernet and Wi-Fi connections and requires no Wi-Fi setup other than entering the password.

NAO's ability to connect to networks offers a wide range of possibilities. You can pilot and program NAO using any computer on the network.
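
A minimal "hello over the network" sketch: any machine on the same Wi-Fi or Ethernet network can open proxies to the robot's NAOqi services. The hostname is an assumption; pressing NAO's chest button makes him say his current IP address if you are unsure.

    from naoqi import ALProxy

    NAO_IP = "nao.local"   # assumption: use your robot's hostname or IP

    tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
    battery = ALProxy("ALBattery", NAO_IP, 9559)

    tts.say("I am connected to the network.")
    print("Battery charge: %d%%" % battery.getBatteryCharge())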

Here are a few examples of applications NAO users have already created:

  • Based on NAO's IP address, NAO can figure out its location and give you a personalized weather report.
  • Ask NAO about a topic and it connects to Wikipedia and reads you the relevant entry.
  • Connect NAO to an audio stream and it plays an Internet radio station for you.

Using XMPP technology (like in the Google Chat system), you can control NAO remotely and stream video from its cameras.

Open Source

With over five years of experience in developing embedded systems for robotics platforms, Aldebaran Robotics is sharing its cross-platform build tools, the core communication library, and other essential modules with researchers, developers, and emerging projects in humanoid robotics.

By capitalizing on Aldebaran Robotics's extensive experience, users can concentrate their efforts on creating innovative and exciting applications.

In addition, users benefit from the strong innovation that characterizes the growing NAO community.

Robotics and its associated applications are still emerging fields of research.

Collaboration in exploring future applications and ongoing exchange within our user community are essential.

SDKs are available in Python, Java and C++, along with Aldebaran's graphical programming environment, Choregraphe.


NAO Version 5 - Gallery

[Photo gallery: NAO Version 5, photos credit Vincent Desailly]