Saturday, March 29, 2014

Honda U3-X Personal Mobility

A unique fusion of rider and vehicle, the U3-X makes new strides in the advancement of human mobility.

ASIMO Balance Control Technology

Technology developed from research into walking and balance for ASIMO provides the U3-X with balance and free movement in all directions, just as in human walking. Any change in the incline of the U3-X caused by a rider's weight shift is analyzed to determine the rider's intention in terms of direction and speed. Precise control then returns the device to an upright position. The result is smooth, agile movement and simple operation, achieved just by shifting your weight.
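Honda has not published the U3-X control laws, but the behavior described above can be sketched as a toy inverted-pendulum balancer: treat measured tilt as the rider's intent and command wheel speed to roll back under the center of mass. Everything below (function names, gains, units) is illustrative, not Honda's interface.

```python
# Illustrative only: Honda has not published the U3-X controller.
# A toy balancer in the same spirit: read the body's tilt, treat it as
# the rider's intent, and command wheel speed to roll under the lean.

def balance_step(pitch_deg, pitch_rate_dps, kp=8.0, kd=1.2):
    """Return a wheel-speed command (arbitrary units) from the tilt state.

    A PD law drives the device back under its center of mass; a steady
    lean therefore produces steady travel in the lean direction.
    """
    return kp * pitch_deg + kd * pitch_rate_dps

# Example: rider leans 3 degrees forward, tilt still growing slightly.
speed_cmd = balance_step(pitch_deg=3.0, pitch_rate_dps=0.5)
print("wheel speed command: %.1f" % speed_cmd)
```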

Honda Omni Traction Drive System

The Honda Omni Traction Drive System is the world’s first wheel structure that enables movement in all directions. Multiple small-diameter motor-controlled wheels are connected in-line to form one large-diameter wheel. Rotating the large-diameter wheel moves the U3-X forward and backward, while rotating the small-diameter wheels moves it side-to-side. Combining these movements causes the U3-X to move diagonally.
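The drive mixing described here is easy to picture in code. This sketch is purely illustrative (the axis names and normalized units are assumptions): the large wheel carries the fore/aft component of a desired velocity and the small in-line wheels carry the lateral component, so a diagonal command simply drives both at once.

```python
import math

# Illustrative decomposition of a desired motion into the two drive
# axes the text describes: the large wheel handles fore/aft (x), the
# in-line small wheels handle side-to-side (y). Diagonal travel is
# both running at once. Speeds are in arbitrary normalized units.

def omni_mix(vx, vy):
    big_wheel = vx        # fore/aft from the large-diameter wheel
    small_wheels = vy     # lateral from the small-diameter wheels
    return big_wheel, small_wheels

# A 45-degree diagonal at unit speed drives both axes equally.
bw, sw = omni_mix(math.cos(math.radians(45)), math.sin(math.radians(45)))
print("big wheel: %.2f, small wheels: %.2f" % (bw, sw))
```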

Compact and Innovative Design

Weighing in at less than 22 pounds, the U3-X is lightweight and portable. The carbon fiber body-cover doubles as the support frame and also houses the foldable seat and foot rests when not in use.

ASIMO

ASIMO is a humanoid robot designed and developed by Honda. One of the most complex humanoid robots yet built, it can learn from its experience. Introduced on 21 October 2000, ASIMO was designed to be a multi-functional mobile assistant. With aspirations of helping those who lack full mobility, ASIMO is frequently used in demonstrations across the world to encourage the study of science and mathematics. At 130 cm (4 ft 3 in) tall and 48 kg (106 lb), ASIMO was designed to operate in real-world environments, with the ability to walk or run on two feet at speeds of up to 6 kilometres per hour (3.7 mph). In the USA, ASIMO is part of the Innoventions attraction at Disneyland and has been featured in a 15-minute show called "Say 'Hello' to Honda's ASIMO" since June 2005. The robot has made public appearances around the world, including the Consumer Electronics Show (CES), the Miraikan Museum and Honda Collection Hall in Japan, and the Ars Electronica festival in Austria.

Development history

Honda began developing humanoid robots in the 1980s, including several prototypes that preceded ASIMO. It was the company's goal to create a walking robot which could not only adapt and interact in human situations, but also improve the quality of life. The E0 was the first bipedal (two-legged) model produced as part of the Honda E series, which was an early experimental line of humanoid robots created between 1986 and 1993. This was followed by the Honda P series of robots produced from 1993 through 1997, which included the first self-regulating, humanoid walking robot with wireless movement.
The research conducted on the E- and P-series led to the creation of ASIMO. Development began at Honda's Wako Fundamental Technical Research Center in Japan in 1999 and ASIMO was unveiled in October 2000.
Differing from its predecessors, ASIMO was the first to incorporate predicted movement control, allowing for increased joint flexibility and a smoother, more human-like walking motion. Introduced in 2000, the first version of ASIMO was designed to function in a human environment, which would enable it to better assist people in real-world situations. Since then, several updated models have been produced to improve upon its original abilities of carrying out mobility assistance tasks. A new ASIMO was introduced in 2005, with its running speed increased to 3.7 mph, twice as fast as the original robot. ASIMO fell during an attempt to climb stairs at a presentation in Tokyo in December 2006, but a month later it demonstrated tasks such as kicking a football, running, and walking up and down a set of steps at the Consumer Electronics Show in Las Vegas, Nevada.
In 2007, Honda updated ASIMO's intelligence technologies, enabling multiple ASIMO robots to work together in coordination. This version also introduced the ability to step aside when humans approach the robot and the ability to return to its charging unit upon sensing low battery levels.

Features and technology

Form

ASIMO stands 130 cm (4 ft 3 in) tall and weighs 54 kg (119 lb). Research conducted by Honda found that the ideal height for a mobility assistant robot was between 120 cm and the height of an average adult, which is conducive to operating door knobs and light switches. ASIMO is powered by a rechargeable 51.8 V lithium-ion battery with an operating time of one hour. Switching from a nickel metal hydride battery in 2004 increased the amount of time ASIMO can operate before recharging. ASIMO has a three-dimensional computer processor, created by Honda, that consists of three stacked dies: a processor, a signal converter and memory. The computer that controls ASIMO's movement is housed in the robot's waist area and can be controlled by a PC, wireless controller, or voice commands.

Abilities

ASIMO has the ability to recognize moving objects, postures, gestures, its surrounding environment, sounds and faces, which enables it to interact with humans. The robot can detect the movements of multiple objects by using visual information captured by two camera "eyes" in its head and also determine distance and direction. This feature allows ASIMO to follow or face a person when approached. The robot interprets voice commands and human gestures, enabling it to recognize when a handshake is offered or when a person waves or points, and then respond accordingly. ASIMO's ability to distinguish between voices and other sounds allows it to identify its companions. ASIMO is able to respond to its name and recognizes sounds associated with a falling object or collision. This allows the robot to face a person when spoken to or look towards a sound. ASIMO responds to questions by nodding or providing a verbal answer and can recognize approximately 10 different faces and address them by name.

Mobility

ASIMO has a walking speed of 2.7 kilometres per hour (1.7 mph) and a running speed of 6 kilometres per hour (3.7 mph). Its movements are determined by floor reaction control and target Zero Moment Point control, which enables the robot to keep a firm stance and maintain position. ASIMO can adjust the length of its steps, body position, speed and the direction in which it is stepping. Its arms, hands, legs, waist and neck also have varying degrees of movement. The technology that allows the robot to maintain its balance was later used by Honda when it began the research and development project for its motorized unicycle, U3-X, in 2009. ASIMO has a total of 34 degrees of freedom. The neck, shoulder, wrist and hip joints each have three degrees of freedom, while each hand has four fingers and a thumb that have two degrees of freedom. Each ankle has two degrees of freedom, and the waist, knees and elbows each have one degree of freedom.
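As a quick sanity check, the joint counts listed above really do sum to 34:

```python
# Sanity check of the degrees-of-freedom arithmetic described above.
dof = {
    "neck": 3,
    "shoulders": 3 * 2,
    "wrists": 3 * 2,
    "hips": 3 * 2,
    "hands": 2 * 2,   # four fingers + thumb acting with 2 DoF per hand
    "ankles": 2 * 2,
    "waist": 1,
    "knees": 1 * 2,
    "elbows": 1 * 2,
}
assert sum(dof.values()) == 34
print(sum(dof.values()))  # -> 34
```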

Impact and technologies

Honda's work with ASIMO led to further research on Walking Assist™ devices that resulted in innovations such as the Stride Management Assist and the Bodyweight Support Assist.
In honor of ASIMO's 10th anniversary in November 2010, Honda developed an application for the iPhone and Android smartphones called "Run with ASIMO." Users learn about the development of ASIMO by virtually walking the robot through the steps of a race and then sharing their lap times on Twitter and Facebook.

Specifications

Model years: 2000, 2001, 2002 · 2004 · 2005, 2007 · 2011
Mass: 52 kg (2000–2002) · 54 kg (2004–2007) · 48 kg (2011)
Height: 120 cm (2000–2002) · 130 cm (2004 onward)
Width: 45 cm (all models)
Depth: 44 cm (2000–2002) · 37 cm (2004–2007) · 34 cm (2011)
Walking speed: 1.6 km/h (2000–2002) · 2.5 km/h, or 1.6 km/h carrying 1 kg (2004) · 2.7 km/h (2005 onward)
Running speed: 3 km/h (2004) · 6 km/h straight / 5 km/h circling (2005, 2007) · 9 km/h straight (2011)
Airborne time (running motion): 0.05 s (2004) · 0.08 s (2005 onward)
Battery: nickel metal hydride, 38.4 V / 10 Ah / 7.7 kg, 4 hours to fully charge (early models) · lithium ion, 51.8 V / 6 kg, 3 hours to fully charge (later models)
Continuous operating time: 30 minutes (2000–2002) · 40 minutes to 1 hour walking (2004) · 1 hour running/walking (2005 onward)
Degrees of freedom: 26 (head 2, arm 5×2, hand 1×2, leg 6×2; 2000–2002) · 34 (head 3, arm 7×2, hand 2×2, torso 1, leg 6×2; 2004–2007) · 57 (head 3, arm 7×2, hand 13×2, torso 2, leg 6×2; 2011)


Controlling and Powering ASIMO

ASIMO is not an autonomous robot. It can't enter a room and make decisions on its own about how to navigate. ASIMO either has to be programmed to do a specific job in a specific area that has markers that it understands, or it has to be manually controlled by a human.
ASIMO can be controlled by four methods:
  • PC (a laptop or desktop over an 802.11 wireless link)
  • Wireless controller (sort of like a joystick)
  • Gestures
  • Voice commands
Using 802.11 wireless technology and a laptop or desktop computer, you can control ASIMO as well as see what ASIMO sees via its camera eyes. ASIMO can also use its PC connection to access the Internet and retrieve information for you, such as weather reports and news.
The wireless joystick controller operates ASIMO's movements the same way you would operate a remote-control car. You can make ASIMO go forward, backward, sideways, diagonally, turn in place, walk around a corner or run in circles. Making ASIMO move by remote control may not seem that advanced, but ASIMO does have the ability to self-adjust its steps. If you have it walk forward, and it encounters a slope or some sort of obstacle, ASIMO automatically adjusts its steps to accommodate the terrain.
ASIMO can recognize and react to several gestures and body postures, allowing users to command ASIMO nonverbally. You can point to a particular spot you want ASIMO to walk towards, for example, and it will follow your lead. If you wave to ASIMO, it will respond with a wave of its own. It can even recognize when you want to shake its hand.
ASIMO can understand and execute simple, preprogrammed verbal commands. The number of commands that can be programmed into its memory is practically unlimited. You can also have your voice registered in its programming, making it easier for ASIMO to recognize you.
In addition to the voice commands for controlling ASIMO's movements, there are also spoken commands to which ASIMO can respond verbally. This is the feature that has made it possible for ASIMO to work as a receptionist, greeting visitors and answering questions.
Like most other technologies in the robotics field, ASIMO is powered by servo motors. These are small but powerful motors with a rotating shaft that moves limbs or surfaces to a specific angle as directed by a controller. Once the motor has turned to the appropriate angle, it shuts off until it is instructed to turn again. For example, a servo may control the angle of a robot's arm joint, keeping it at the right angle until it needs to move, and then controlling that move. Servos use a position-sensing device (also called a digital decoder) to ensure that the shaft of the motor is in the right position. They usually use power proportional to the mechanical load they are carrying. A lightly loaded servo, for example, doesn't use much energy.
ASIMO has 34 servo motors in its body that move its torso, arms, hands, legs, feet, ankles and other moving parts. ASIMO manages a series of servo motors to control each kind of movement.
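ASIMO's servo firmware is proprietary, but the behavior described above (turn to a commanded angle, then shut off and hold) is essentially proportional position control with a deadband. A minimal sketch, with all names and gains hypothetical:

```python
# Illustrative only -- ASIMO's servo firmware is proprietary. This toy
# loop shows the idea the text describes: drive a motor toward a target
# angle, then hold (near-zero power) once the position sensor agrees.

def servo_update(target_deg, measured_deg, kp=0.5, deadband_deg=0.5):
    """Return a motor effort; zero inside the deadband ("shuts off")."""
    error = target_deg - measured_deg
    if abs(error) < deadband_deg:
        return 0.0             # holding position costs little power
    return kp * error          # effort proportional to remaining error

angle = 0.0
for _ in range(20):            # simulate a few control ticks
    effort = servo_update(target_deg=30.0, measured_deg=angle)
    angle += effort            # crude plant model: angle follows effort
print("settled at %.1f degrees" % angle)
```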
ASIMO is powered by a rechargeable, 51.8 volt lithium ion (Li-ION) battery that lasts for one hour on a single charge. The battery is stored in ASIMO's backpack and weighs about 13 pounds. ASIMO's battery takes three hours to fully charge, so a second (and third) battery is crucial if you need ASIMO to operate for very long. Users can charge the battery onboard ASIMO through a power connection or remove the backpack to charge it separately.


ASIMO's Motion: Walk Like a Human

Honda researchers began by studying the legs of insects, mammals, and the motion of a mountain climber with prosthetic legs to better understand the physiology and all of the things that take place when we walk -- particularly in the joints. For example, the fact that we shift our weight using our bodies and especially our arms in order to balance was very important in getting ASIMO's walking mechanism right. The fact that we have toes that help with our balance was also taken into consideration: ASIMO actually has soft projections on its feet that play a similar role to the one our toes play when we walk. This soft material also absorbs impact on the joints, just as our soft tissues do when we walk.
ASIMO has hip, knee, and foot joints. Robots have joints that researchers refer to as "degrees of freedom." A single degree of freedom allows movement either right and left or up and down. ASIMO has 34 degrees of freedom spread over different points of its body in order to allow it to move freely. There are three degrees of freedom in ASIMO's neck, seven on each arm and six on each leg. The number of degrees of freedom necessary for ASIMO's legs was decided by measuring human joint movement while walking on flat ground, climbing stairs and running.
ASIMO also has a speed sensor and a gyroscope sensor mounted on its body. They perform the tasks of:
  • sensing the position of ASIMO's body and the speed at which it is moving
  • relaying adjustments for balance to the central computer
These sensors work similarly to our inner ears in the way they maintain balance and orientation.
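Honda has not detailed the filter it uses, but a common way to fuse a gyroscope (fast but drifting) with a slower absolute tilt measurement is a complementary filter. A minimal sketch, with made-up gains and sample data:

```python
# A common way to fuse this kind of sensor pair (a sketch, not Honda's
# actual filter): integrate the gyroscope for short-term tilt, and pull
# slowly toward an absolute tilt measurement for the long term.

def complementary_filter(tilt_prev, gyro_rate, tilt_measured, dt, alpha=0.98):
    gyro_estimate = tilt_prev + gyro_rate * dt   # fast but drifts
    return alpha * gyro_estimate + (1 - alpha) * tilt_measured

tilt = 0.0
for gyro_rate, measured in [(0.5, 0.02), (0.4, 0.03), (0.3, 0.04)]:
    tilt = complementary_filter(tilt, gyro_rate, measured, dt=0.01)
print("fused tilt estimate: %.4f rad" % tilt)
```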
ASIMO also has floor surface sensors in its feet and six ultrasonic sensors in its midsection. These sensors enhance ASIMO's ability to interact with its environment by detecting objects around ASIMO and comparing gathered information with maps of the area stored in ASIMO's memory.
To accomplish the job our muscles and skin do in sensing muscle power, pressure and joint angles, ASIMO has both joint-angle sensors and a six-axis force sensor.

Unless you know a lot about robotics, you may not fully grasp the incredible milestone it is that ASIMO walks as we do. The most significant part of ASIMO's walk is the turning capabilities. Rather than having to stop and shuffle, stop and shuffle, and stop and shuffle into a new direction, ASIMO leans and smoothly turns just like a human. ASIMO can also self-adjust its steps in case it stumbles, is pushed, or otherwise encounters something that alters normal walking.
In order to accomplish this, ASIMO's engineers had to find a way to work with the inertial forces created when walking. For example, the earth's gravity creates a force, as does the speed at which you walk. Those two forces are called the "total inertial force." There is also the force created when your foot connects with the ground, called the "ground reaction force." These forces have to balance out, and posture has to work to make it happen. This is called the "zero moment point" (ZMP).
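The ZMP has a compact form in the standard cart-table / linear inverted pendulum approximation found in the robotics literature (a textbook model, not necessarily Honda's exact formulation). A worked example:

```python
# Worked example of the ZMP idea using the standard linear-inverted-
# pendulum approximation: for a point mass at height z_c,
#     x_zmp = x_com - (z_c / g) * x_com_acceleration
# The robot is balanced while x_zmp stays inside the foot's support area.

G = 9.81  # gravitational acceleration, m/s^2

def zmp_x(x_com, x_com_accel, z_com):
    return x_com - (z_com / G) * x_com_accel

# COM 0.6 m high, 2 cm ahead of the ankle, accelerating forward:
print("ZMP at %.3f m" % zmp_x(0.02, 0.8, 0.6))  # ~ -0.029 m (behind the ankle)
```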
To control ASIMO's posture, engineers worked on three areas of control:
  • Floor reaction control means that the soles of the feet absorb floor unevenness while still maintaining a firm stance.
  • Target ZMP control means that when ASIMO can't stand firmly and its body begins to fall forward, it maintains position by moving its upper body in the direction opposite the impending fall. At the same time, it speeds up its walking to quickly counterbalance the fall.
  • Foot-planting location control kicks in when the target ZMP control has been activated. It adjusts the length of the step to regain the right relationship between the position and speed of the body and the length of the step.

ASIMO's Motion: Smooth Moves

ASIMO can sense falling movements and react to them quickly; but ASIMO's engineers wanted more. They wanted the robot to have a smooth gait as well as do something that other robots can't do -- turn without stopping.

When we walk around corners, we shift our center of gravity into the turn. ASIMO uses a technology called "predictive movement control," also called Honda's Intelligent Real-Time Flexible Walking Technology or i-WALK, to accomplish that same thing. ASIMO predicts how much it should shift its center of gravity to the inside of the turn and how long that shift should be maintained. Because this technology works in real time, ASIMO can do this without stopping between steps, which other robots must do.
Essentially, with every step ASIMO takes, it has to determine its inertia and then predict how its weight needs to be shifted for the next step in order to walk and turn smoothly. It adjusts any of the following factors in order to maintain the right position:
  • the length of its steps
  • its body position
  • its speed
  • the direction in which it is stepping
While reproducing a human-like walk is an amazing achievement, ASIMO can now run at speeds up to 3.7 miles per hour (6 kilometers per hour). In order to qualify as a true running robot, ASIMO must have both feet off the ground for an instant in each step. ASIMO manages to be airborne for 0.08 seconds with each step while running.
Honda engineers encountered an entirely new set of challenges while trying to give ASIMO the ability to run. They gave ASIMO's torso a degree of freedom to aid in bending and twisting so that the robot could adjust its posture while airborne. Without this ability, ASIMO would lose control while airborne, possibly spinning in the air or tripping when landing.
In order to make turns smoothly while running, the engineers enhanced ASIMO's ability to tilt its center of gravity inside turns to maintain balance and counteract centrifugal force. ASIMO can even anticipate turns and begin to lean into them before starting the turn, much like you would if you were skiing or skating.
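The size of that lean follows from ordinary mechanics (illustrative; Honda's actual planner is unpublished): balancing gravity against the centripetal acceleration v²/r gives a lean angle of atan(v²/(r·g)). A worked example with a plausible, assumed turn radius:

```python
import math

# Estimate the lean into a turn from ordinary mechanics:
#     theta = atan(v^2 / (r * g))

def lean_angle_deg(speed_mps, turn_radius_m, g=9.81):
    return math.degrees(math.atan(speed_mps**2 / (turn_radius_m * g)))

# Running at 6 km/h (~1.67 m/s) around an assumed 1.5 m radius turn:
print("lean ~ %.1f degrees" % lean_angle_deg(6 / 3.6, 1.5))  # ~10.7 degrees
```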

NAO

Nao (pronounced "now") is an autonomous, programmable humanoid robot developed by Aldebaran Robotics, a French robotics company headquartered in Paris. The robot's development began with the launch of Project Nao in 2004. On 15 August 2007, Nao replaced Sony's robot dog Aibo as the robot used in the RoboCup Standard Platform League (SPL), an international robot soccer competition. The Nao was used in RoboCup 2008 and 2009, and the NaoV3R was chosen as the platform for the SPL at RoboCup 2010.
Numerous versions of the robot have been released since 2008. The Nao Academics Edition was developed for universities and laboratories for research and education purposes. It was released to institutions in 2008, and was made publicly available by 2011. The robot has since entered use in numerous academic institutions worldwide, including the University of Tokyo, India's IIT Kanpur and Saudi Arabia's King Fahd University of Petroleum and Minerals. In December 2011, Aldebaran Robotics released the Nao Next Gen, featuring enhanced software, a more powerful CPU and HD cameras.

History

Six prototypes of Nao were designed between 2005 and 2007. In March 2008, the first production version of the robot, the Nao RoboCup Edition, was released to the contestants of that year's RoboCup. The Nao Academics Edition was released to universities, education institutes and research laboratories in late 2008.
In the summer of 2010, Nao made global headlines with a synchronized dance routine at the Shanghai Expo in China. In October 2010, the University of Tokyo purchased 30 Nao robots for their Nakamura Lab, with hopes of developing the robots into active lab assistants. In December 2010, a Nao robot was demonstrated doing a stand-up comedy routine, and a new version of the robot was released, featuring sculpted arms and improved motors. In May 2011, Aldebaran announced that it would release Nao's controlling source code to the public as open source software. In June 2011, Aldebaran raised US$13 million in a round of venture funding led by Intel Capital.
In December 2011, Aldebaran released the Nao Next Gen, featuring hardware and software enhancements such as HD cameras, improved robustness, anti-collision systems and a faster walking speed. Since 2011, over 200 academic institutions worldwide have made use of the robot. In 2012, donated Nao robots were used to teach autistic children in a UK school; some of the children found the childlike, expressive robots more relatable than human beings.

Design

The various versions of the Nao robotics platform feature either 14, 21 or 25 degrees of freedom (DoF). A specialised model with 21 DoF and no actuated hands was created for the RoboCup competition. All Nao Academics versions feature an inertial measurement unit with accelerometer, gyrometer and four ultrasonic sensors that provide Nao with stability and positioning within space. The legged versions included eight force-sensing resistors and two bumpers.
The Nao robot also features an onboard Linux-powered multimedia system, including four microphones (for voice recognition and sound localization), two speakers (for text-to-speech synthesis) and two HD cameras (for computer vision, including facial and shape recognition). The robot comes with a software suite that includes a graphical programming tool ("Choregraphe"), simulation software and a software developer's kit. Nao is also compatible with the Microsoft Robotics Studio, Cyberbotics Webots, and the Gostai Urbi Studio. An upgraded version of the robot, featuring enhanced multilingual speech synthesis, will be released in 2014.

Specifications

Nao Next Gen (2011)
Height 58 centimetres (23 in)
Weight 4.3 kilograms (9.5 lb)
Autonomy 60 minutes (active use), 90 minutes (normal use)
Degrees of freedom 21 to 25
CPU Intel Atom @ 1.6 GHz
Built-in OS Linux
Compatible OS Windows, Mac OS, Linux
Programming languages C++, Python, Java, MATLAB, Urbi, C, .Net
Vision Two HD 1280x960 cameras
Connectivity Ethernet, Wi-Fi


How does NAO work?

NAO is a little artificial character with a unique combination of hardware and software: it consists of sensors, motors and software driven by NAOqi, Aldebaran's dedicated operating system.

It gets its magic from its programming and animation.

Movement libraries are available through graphics tools such as Choregraphe and other advanced programming software.
They allow users to create elaborate behaviors, access the data acquired by the sensors, and control the robot... to give it life!

NAO has:
  • 25 motors, for movement
  • Two cameras, to see its surroundings
  • An inertial measurement unit, which lets it know whether it is upright or lying down
  • Touch sensors to detect your touch
  • Four microphones, so it can hear you
This combination of technologies (and many others) gives NAO the ability to detect its surroundings.
Now it must interpret what it detected. This is where the embedded software in NAO's head comes in. Aldebaran created a dedicated operating system, NAOqi, allowing the small humanoid to understand and interpret the data received by its sensors. Beyond that, it's all programming and imagination.
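For a concrete taste of what programming NAO looks like, here is a minimal example using the classic NAOqi Python SDK. The IP address is a placeholder for your robot's, and ALTextToSpeech and ALMotion are standard NAOqi modules:

```python
# Minimal NAOqi example (classic Python SDK). Replace NAO_IP with your
# robot's address; 9559 is the standard NAOqi port.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "192.168.1.10", 9559  # hypothetical address

tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)
tts.say("Hello, I am NAO")

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
motion.wakeUp()                    # stiffen joints and stand up
motion.moveTo(0.2, 0.0, 0.0)       # walk 20 cm forward
```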

Hardware Platform

NAO is a programmable, 58 cm tall humanoid robot with the following key components:

  • Body with 25 degrees of freedom (DOF) whose key elements are electric motors and actuators
  • Sensor network, including 2 cameras, 4 microphones, sonar rangefinder, 2 IR emitters and receivers, 1 inertial board, 9 tactile sensors, and 8 pressure sensors
  • Various communication devices, including voice synthesizer, LED lights, and 2 high-fidelity speakers
  • Intel Atom 1.6 GHz CPU (located in the head) that runs a Linux kernel and supports Aldebaran’s proprietary middleware (NAOqi)
  • Second CPU (located in the torso)
  • 27.6-watt-hour battery that provides NAO with 1.5 or more hours of autonomy, depending on usage

Motion

Omnidirectional walking

NAO's walking uses a simple dynamic model (linear inverse pendulum) and quadratic programming. It is stabilized using feedback from joint sensors. This makes walking robust and resistant to small disturbances, and torso oscillations in the frontal and lateral planes are absorbed. NAO can walk on a variety of floor surfaces, such as carpeted, tiled, and wooden floors. NAO can transition between these surfaces while walking.
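To see why the linear inverse pendulum model is useful, consider its one-line dynamics: with the center of mass held at constant height z, its horizontal motion obeys x″ = (g/z)(x − p), where p is the current support point; stepping moves p to steer x. The sketch below simulates this (illustrative only; the COM height is an assumed NAO-scale value, and this is not Aldebaran's walk engine):

```python
# Sketch of the linear inverse pendulum dynamics the walk is built on.
# With the COM at height z, horizontal motion obeys x'' = (g/z)(x - p),
# where p is the support point. Stepping moves p to steer x.

G, Z, DT = 9.81, 0.26, 0.01   # assumed NAO-scale COM height of ~26 cm

def simulate(x, v, p, steps):
    for _ in range(steps):
        a = (G / Z) * (x - p)      # pendulum diverges away from support
        v += a * DT
        x += v * DT
    return x, v

x, v = simulate(x=0.01, v=0.0, p=0.0, steps=30)   # 0.3 s on one support
print("COM drifted to %.1f cm; time to step and move the support" % (x * 100))
```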

Whole body motion

NAO's motion module is based on generalized inverse kinematics, which handles Cartesian coordinates, joint control, balance, redundancy, and task priority. This means that when NAO is asked to extend its arm, it also bends at the hips, because its arm and leg joints are taken into account together. If a requested motion would compromise balance, NAO stops the movement to keep its footing.
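Generalized inverse kinematics can be sketched in miniature with damped least squares on a planar two-link arm; the same machinery (a Jacobian mapping joint speeds to effector speeds, inverted with damping) underlies whole-body solvers. Link lengths and gains below are assumptions, not NAO's real dimensions:

```python
import numpy as np

# Damped-least-squares IK on a planar 2-link arm (illustrative only,
# not Aldebaran's solver).

L1, L2 = 0.10, 0.06  # assumed NAO-ish arm segment lengths, meters

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_step(q, target, damping=0.01):
    J, err = jacobian(q), target - fk(q)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
    return q + dq

q = np.array([0.3, 0.5])
for _ in range(50):
    q = ik_step(q, np.array([0.12, 0.08]))
print(fk(q))  # converges close to the target
```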

Fall Manager

The Fall Manager protects NAO when it falls. Its main function is to detect when NAO's center of mass (CoM) shifts outside the support polygon. The support polygon is determined by the position of the foot or feet in contact with the ground. When a fall is detected, all motion tasks are killed and, depending on the direction, NAO's arms assume protective positioning, the CoM is lowered, and robot stiffness is reduced to zero.
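The geometric heart of this check is a point-in-polygon test: is the projected center of mass inside the convex support polygon? A plain-Python sketch (not Aldebaran's code; the footprint coordinates are made up):

```python
# Core geometric test for a fall manager: is the projected COM inside
# the convex support polygon? Same-sign cross products along the edges
# mean the point never leaves the boundary.

def inside_support_polygon(com, polygon):
    """com: (x, y); polygon: convex corners in counterclockwise order."""
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        cross = (x2 - x1) * (com[1] - y1) - (y2 - y1) * (com[0] - x1)
        if cross < 0:          # COM is to the right of this edge: outside
            return False
    return True

foot = [(0.0, 0.0), (0.10, 0.0), (0.10, 0.05), (0.0, 0.05)]  # one footprint
print(inside_support_polygon((0.05, 0.02), foot))  # True  -> balanced
print(inside_support_polygon((0.15, 0.02), foot))  # False -> fall reflex
```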

Vision

NAO has two cameras and can track, learn, and recognize images and faces.

NAO sees using two 960p (1280×960) cameras, which can capture up to 30 images per second.
The first camera, located on NAO’s forehead, scans the horizon, while the second, located at mouth level, scans the immediate surroundings.
The software lets you retrieve photos and video streams of what NAO sees. But eyes are only useful if you can interpret what you see.
That’s why NAO contains a set of algorithms for detecting and recognizing faces and shapes. NAO can recognize who is talking to it, find a ball, and, in time, recognize more complex objects.
These algorithms have been specially developed, with constant attention to using a minimum of processor resources.
Furthermore, NAO’s SDK lets you develop your own modules to interface with OpenCV (the Open Source Computer Vision library originally developed by Intel).
Since you can execute modules on NAO or transfer them to a PC connected to NAO, you can easily use the OpenCV display functions to develop and test your algorithms with image feedback.
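As an illustration of the OpenCV route, here is a minimal face-detection sketch. Frame acquisition from NAO (via ALVideoDevice) is omitted; a local image file stands in for a camera frame, and the Haar cascade file ships with OpenCV:

```python
import cv2

# Minimal OpenCV face detection on a single frame. "frame.jpg" is a
# stand-in for an image pulled from NAO's camera.

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
print("%d face(s) found" % len(faces))
```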

Audio

NAO uses four microphones to track sounds, and its voice recognition and text-to-speech capabilities allow it to communicate in 8 languages.

Sound Source Localization

One of the main purposes of humanoid robots is to interact with people. Sound localization allows a robot to identify the direction of sounds. To produce robust and useful outputs while meeting CPU and memory requirements, NAO sound source localization is based on an approach known as “Time Difference of Arrival.”
When a nearby source emits a sound, each of NAO’s four microphones receives the sound wave at slightly different times.
For example, if someone talks to NAO on its left side, the corresponding sound wave first hits the left microphones, then the front and rear microphones a few milliseconds later, and finally the right microphone.
These differences, known as interaural time difference (ITD), can then be mathematically processed to determine the current location of the emitting source.
By solving the equation every time it hears a sound, NAO can determine the direction of the emitting source (azimuthal and elevation angles) from ITDs between the four microphones.
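For a worked example of the arithmetic, consider just two microphones a distance d apart: a distant source at azimuth θ produces an arrival-time difference Δt = d·sin(θ)/c, so θ = asin(Δt·c/d). The microphone spacing below is an assumed figure, not NAO's actual geometry:

```python
import math

# Time-difference-of-arrival direction estimate for a two-microphone
# pair (illustrative numbers; NAO uses four microphones).

C = 343.0        # speed of sound, m/s
D = 0.10         # assumed microphone spacing, m

def azimuth_from_itd(dt_seconds):
    ratio = max(-1.0, min(1.0, dt_seconds * C / D))  # clamp for asin
    return math.degrees(math.asin(ratio))

# The left microphone hears the sound 150 microseconds before the right:
print("source at ~%.0f degrees" % azimuth_from_itd(150e-6))  # ~31 degrees
```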
This feature is available as a NAOqi module called ALAudioSourceLocalization; it provides a C++ and Python API that allows precise interactions with a Python script or NAOqi module.
Two Choregraphe boxes that make it easy to use this feature inside a behavior are also available.

Possible applications include:

  • Human Detection, Tracking, and Recognition
  • Noisy Object Detection, Tracking, and Recognition
  • Speech Recognition in a specific direction
  • Speaker Recognition in a specific direction
  • Remote Monitoring/Security applications
  • Entertainment applications

Audio Signal Processing

In robotics, embedded processors have limited computational power, making it useful to perform some calculations remotely on a desktop computer or server.
This is especially true for audio signal processing; for example, speech recognition often takes place more efficiently, faster, and more accurately on a remote processor. Most modern smartphones process voice recognition remotely.
Users may want to use their own signal processing algorithms directly in the robot.
The NAOqi framework uses Simple Object Access Protocol (SOAP) to send and receive audio signals over the Web.
Sound is produced and recorded in NAO using the Advanced Linux Sound Architecture (ALSA) library.
The ALAudioDevice module manages audio inputs and outputs.
Using NAO’s audio capabilities, a wide range of experiments and research can take place in the fields of communications and human-robot interaction.
For example, users can employ NAO as a communication device, interacting with NAO (talk and hear) as if it were a human being.
Signal processing is of course an interesting example. Thanks to the audio module, you can get the raw audio data from the microphones in real time and process it with your own code.
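As a small example of "your own code" on raw samples, here is an RMS-energy voice-activity check. Pulling real buffers from ALAudioDevice is omitted; a synthetic sine buffer stands in, and the sample format is an assumption:

```python
import numpy as np

# Crude voice-activity detection via RMS energy on 16-bit mono samples.
# A synthetic buffer stands in for data from NAO's microphones.

RATE = 16000                       # samples per second (assumed format)
buffer = (np.sin(2 * np.pi * 440 * np.arange(RATE) / RATE)
          * 3000).astype(np.int16)  # fake 1 s of microphone data

def rms(samples):
    x = samples.astype(np.float64)
    return float(np.sqrt(np.mean(x * x)))

THRESHOLD = 500.0                  # tuned by experiment in practice
level = rms(buffer)
print("RMS %.0f -> %s" % (level,
      "sound detected" if level > THRESHOLD else "silence"))
```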

Tactile sensors & Sonar Rangefinders

Tactile Sensors

Besides cameras and microphones, NAO is fitted with capacitive sensors positioned on top of its head in three sections and on its hands.
You can therefore give NAO information through touch: pressing once to tell it to shut down, for example, or using the sensors as a series of buttons to trigger an associated action.
The system comes with LED lights that indicate the type of contact. You can also program complex sequences.
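One simple way to read these sensors from Python is to poll ALMemory for the documented touch keys (note Aldebaran's idiosyncratic spelling, "Tactil"); the address below is a placeholder:

```python
# Poll the front head touch sensor via ALMemory (classic NAOqi API).
import time
from naoqi import ALProxy

memory = ALProxy("ALMemory", "192.168.1.10", 9559)  # hypothetical address

while True:
    if memory.getData("FrontTactilTouched"):   # 1.0 while pressed
        print("front head sensor touched")
    time.sleep(0.1)
```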

Sonar Rangefinders

NAO is equipped with two sonar channels: two transmitters and two receivers.
They allow NAO to estimate the distances to obstacles in its environment. The detection range is 0–70 cm.
Below 15 cm, NAO receives no distance information; it only knows that an object is present.

Connectivity

Ethernet and Wi-Fi

NAO currently supports Wi-Fi (b/g/n) and Ethernet, the most widespread network communication protocols. In addition, infrared transceivers in the eyes allow connection to objects in the environment. NAO is compatible with the IEEE 802.11g Wi-Fi standard and can be used on both WPA and WEP networks, making it possible to connect to most home and office networks. NAO's OS supports both Ethernet and Wi-Fi connections and requires no Wi-Fi setup other than entering the password.
NAO's ability to connect to networks offers a wide range of possibilities. You can pilot and program NAO using any computer on the network.

Here are a few examples of applications NAO users have already created:

  • Based on NAO's IP address, NAO can figure out its location and give you a personalized weather report.
  • Ask NAO about a topic and it connects to Wikipedia and reads you the relevant entry.
  • Connect NAO to an audio stream and it plays an Internet radio station for you.
Using XMPP technology (like in the Google Chat system), you can control NAO remotely and stream video from its cameras.

Infrared

Using infrared, NAO can communicate with other NAOs and other devices that support infrared. You can configure NAO to use infrared to control other devices (“NAO, please turn on the TV”). In addition, NAO can also receive instructions from infrared emitters, such as remote controls. And of course, two NAOs can communicate with each other directly.
Infrared is already the most common method of controlling appliances, making NAO easily adaptable to domotics (home automation) applications. NAO can also detect whether a received infrared signal is coming from its left or its right.


Monday, March 3, 2014

DRASTIC DEVELOPMENT IN FLYING ROBOTS

Tiny Robot Flies Like a Jellyfish

A new teeny-tiny robot flies through the air like a jellyfish swims.
The jellyfish flier is a strange sight — it looks a little bit like a Chinese lantern that's developed a hankering for the open skies — but its unique design keeps it from tipping over without the use of sensors or external controls. That talent could make it handy for maneuvering in small spaces, said its inventor Leif Ristroph, a postdoctoral researcher at New York University.
"What's cool is you can actually build these flying things yourself," Ristroph told LiveScience. "All the components I used to make this, they cost about $15 and they're available on hobby airplane websites." 


Google's Latest Robot Acquisition Is the Smartest Yet

Google’s remarkable push into robotics continued today with the acquisition of Boston Dynamics, a particularly exciting—and potentially important—company.
Boston Dynamics creates legged robots with eerily lifelike running and balancing abilities. These machines are more than just spectacular feats of engineering, though; they embody a powerful approach to robot locomotion that might have a big impact on the way future machines move around our world.
Google is making a big drive into robotics. It has acquired seven other robotics startups in recent months as part of an effort led by Andy Rubin, the man who previously led the development of the Android mobile operating system (see “Why Google Is Buying So Many Robotics Startups”). Google hasn’t yet said what it plans to do with the robotics technologies it has acquired, but they include manipulation, vision, and other areas of innovation that are likely to become increasingly important. Although Boston Dynamics’ machines can bound across wild terrain and run at great speed, the approach the company has pioneered could be useful for more mundane purposes, like climbing a flight of stairs or staying balanced when accidentally shoved.
Boston Dynamics was founded by Marc Raibert, a roboticist who developed a new approach to legged locomotion in the 1980s and led research labs at CMU and MIT dedicated to building walking machines. At a time when most walking robots were rigid and moved slowly, Raibert sought to mimic biology, devising engineering principles that made it possible for machines to remain stable on uneven or treacherous terrain. Like living creatures, Raibert’s robots moved quickly and continually, storing energy in their limbs. His first dynamic robot was a one-legged machine that kept from falling over simply by moving its leg as it bounced around.
Raibert left MIT and founded Boston Dynamics in 1992, initially to develop simulation software. But the company soon began building robots based on his dynamic balancing ideas. Its YouTube videos show the astonishingly lifelike results: a St. Bernard-sized four-legged robot, called BigDog, walking across all sorts of treacherous terrain, and a slightly larger robot, called WildCat, running about the company’s car park at about 30 kilometers per hour. Most recently, Boston Dynamics developed a humanoid robot called Atlas for a DARPA challenge designed to foster the development of robots that can perform rescue missions in environments like Fukushima (see “Meet Atlas, the Robot Designed to Save the Day”). Most of these robots were developed largely with funding from the military.
The technologies developed by Boston Dynamics could be significant because most robots cannot cover uneven terrain. The few mobile robots that exist today, such as the bomb-disposal bots used by the military, tend to travel using wheels or tracks.
In recent weeks I’ve visited Boston Dynamics several times to see Atlas, WildCat, and other Boston Dynamics robots in action. Significant challenges remain, including making its hydraulic actuators more efficient and making its robots quieter. But the company is already making impressive strides in these areas.
“It has been a great time at Boston Dynamics, getting the robots this far along,” Raibert told me via e-mail. “Now we are excited to see how far we can take things as part of Google.”