196 posts tagged robots
"The thing that’s been missing in robotics is a sense of smell," said biology professor Joseph Ayers. For more than four decades, he has been working to develop robots that do not rely on algorithms or external controllers. Instead, they incorporate electronic nervous systems that take in sensory inputs from the environment and produce autonomous behaviors. For example, his team’s robo-lobsters are designed to seek out underwater mines without following a predetermined course.

"Now people want robots to do group behavior," said Ayers, noting that social insect colonies are the perfect model. "If you’re doing large field explorations for mines, you want to have 20 or 30 robots out there." In order to get robots to cooperate with each other, he needs them to act like ants or bees or termites. Bees waggle their behinds to communicate. Ants use almost two dozen scent glands, depositing a trail of "stinks" as they go about their business. It’s this behavior that Ayers wants to mimic in his next generation of biomimetic robots.

To do so, he needs electronic devices that can sense chemical inputs, such as explosives. His idea is to integrate various microelectronic sensors that can interface with living cells. For example, a bacterial cell programmed to bind odorants in the environment may undergo a conformational change; that change may translate to an influx of calcium ions, which are detected by a second cell that is programmed to generate light when bound to calcium. In this way, Ayers said, "you can see smell." That output would then trigger microelectronic actuators that tell the robot to perform a particular action, such as moving toward or away from the stimulus. But in order for any of this to play out, somebody needs to build these futuristic devices.
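The cascade Ayers describes is essentially a signal chain: odorant binding triggers a calcium influx, calcium triggers light emission, and the light level drives an actuator command. A minimal sketch of that chain, with entirely illustrative thresholds and scaling (nothing here comes from Ayers' actual hardware), might look like:

```python
# Hypothetical sketch of the odorant -> calcium -> light -> actuator chain.
# All function names, thresholds, and scale factors are assumptions for
# illustration; they are not taken from Ayers' lab.

def odorant_to_calcium(odorant_concentration: float) -> float:
    """Bacterial reporter cell: odorant binding elicits a calcium influx,
    modeled here as linear in concentration, capped at saturation."""
    return min(1.0, 2.5 * odorant_concentration)

def calcium_to_light(calcium_level: float, threshold: float = 0.3) -> float:
    """Second cell emits light only once bound calcium exceeds a threshold."""
    return max(0.0, calcium_level - threshold)

def actuator_command(light_intensity: float) -> str:
    """Photodetector output maps to a movement decision."""
    if light_intensity > 0.4:
        return "approach"         # strong signal: move toward the source
    elif light_intensity > 0.0:
        return "search"           # weak signal: cast about for a gradient
    return "continue_patrol"      # nothing detected

def sense_smell(odorant_concentration: float) -> str:
    """Run the full chain: this is how the robot 'sees smell'."""
    calcium = odorant_to_calcium(odorant_concentration)
    light = calcium_to_light(calcium)
    return actuator_command(light)
```

The point of the sketch is the pipeline shape, not the numbers: each living cell acts as one stage of a transducer, and only the final optical stage needs to touch the microelectronics.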
Pterodactyl-like bot “Daler” uses its wings for walking
The Daler can fly through the air with grace, but the fun doesn’t stop when this remote-controlled robot lands. The wings segment, and it clambers across the ground like an ungainly pterodactyl or bat. Most robots use only one type of locomotion. They fly through the air (raining death from above, in the case of US military drones), swim through the seas, like the Sharkbot or this robotic jellyfish, or crawl, run, and roll across the earth. However, the Daler—Deployable Air-Land Exploration Robot—uses “adaptive morphology” to master the skies and the earth. It has a wingspan of 60cm, and using its battery-powered “Whegs”—wheel-legs—can fly for 30 minutes or walk for an hour. Developed at the Laboratory of Intelligent Systems at École Polytechnique Fédérale de Lausanne, the Daler could one day be used in search-and-rescue, the robot’s inventor told Wired.co.uk. “The goal is to cover large distances as fast as possible in a forward flight configuration and then use hover or ground locomotion to search for victims,” said Ludovic Daler. “Being capable of multi-modal locomotion for a robot has an advantage in complex terrains where other robots would get stuck at some point.” At the moment, the Daler’s mastery of the earth is rather turtle-like, with a top speed of just 0.2m/s, and it needs to be thrown by hand to launch. But despite these shortcomings, this initial prototype shows how robots of the future could tackle any obstacle by changing their shape. The next version of the robot will include vertical take-off and landing capabilities, says Daler. He is also investigating ways of allowing the robot to choose its gait depending on the terrain, as well as reducing the wingspan with deployable wings to increase its ground manoeuvrability. (via Pterodactyl-like bot “Daler” uses its wings for walking | Ars Technica)
Japan has launched the world’s first talking robot into space to serve as a companion to astronaut Koichi Wakata, who will begin his mission in November.
The android took off from the island of Tanegashima in an unmanned rocket also carrying supplies for crew onboard the International Space Station (ISS). Measuring 34cm (13 inches), Kirobo is due to arrive at the ISS on 9 August. It is part of a study to see how machines can lend emotional support to people isolated over long periods. The launch of the H-2B rocket was broadcast online by the Japan Aerospace Exploration Agency (Jaxa). The unmanned rocket is also carrying drinking water, food, clothing and work supplies to the six permanent crew members based at the ISS. (via BBC News - Kirobo is world’s first talking robot sent into space)
Robotic Skin Lights Up When Touched
Imagine how awesome — or distracting — it would be if human skin lit up every time something pushed on it. Pulsing arteries, mosquitoes, a rude shoulder-check on the sidewalk, or scratching an itch would transform a person into a blinking light show.
Now, scientists at the University of California, Berkeley have designed an electronic skin that actually does this: super-thin and flexible, the skin lights up when touched. More pressure produces a brighter light, the team reports July 21 in Nature Materials.
Thinner than a sheet of paper, the skin is made from layers of plastic and a pressure-sensitive rubber. A conductive silver ink, organic LEDs, and thin-film transistors made from semiconductor-enriched carbon nanotubes are sandwiched between the layers. Applying pressure sends a signal through the rubber that ultimately turns on the LEDs, which light up in red, green, yellow or blue.
Instead of using the material to create bodysuits for Burning Man or other illuminated party tricks, scientists suggest that it might be used for smart wallpapers, health-monitoring devices, or in robotics. The type of interactive pressure sensor developed by the Berkeley scientists could also be useful in artificial skin for prosthetic limbs. For years, scientists have been working on developing systems and materials that could be integrated into a functioning, stimulus-responsive skin — something that can sense temperature, pressure, and stretch, and can heal itself. In addition, such a sheath might one day transform an ordinary robot into an interactive machine that’s capable of responding to tiny changes in its environment.
If and when that day comes, we will welcome our touchy-feely glow-bot overlords.
Aaron Saenz: I, For One, Welcome Our New Robot Overlords
Aaron Saenz is a former physicist, an improvisational actor, and a tech journalist. He currently writes for SingularityHub.com, covering everything from crowd-sourced holographic Japanese pop stars to open source research robotics. He’s also the host of Singularity Hub’s Accelerated Tech News, a new video recap of the week’s top stories in science and technology. Follow his work on Twitter: @adsaenz
@ BAASICS.2: The Future (by BAASICSsf)
US unveils ‘Atlas’ humanoid robot test bed
A humanoid robot called Atlas could pave the way for intelligent machines to help in the wake of natural disasters. The two-metre-tall robot was created as a test bed for a US Defense Advanced Research Projects Agency challenge. The Darpa challenge demands Atlas complete eight tasks that it might have to perform in an emergency. Six teams have until December 2013 to develop software that will help Atlas complete the tasks. Atlas has been developed by the Boston Dynamics robotics firm, which has been working on robots that can aid the military. Like a human, Atlas has two arms and legs and gets around by walking. It sees using a stereo laser scanning system and has gripping hands developed by two separate robotics companies. Unlike humans, it has a high-speed networking system built in so it can communicate with its creators and pipe data back from disaster areas. Before now, the teams taking part in the robotic challenge have only worked with virtual versions of Atlas. In the next stage of the competition, algorithms and control programs for the virtual Atlas will be transferred to the real thing. The teams will then have five months to refine Atlas’s abilities before taking part in a series of trials. During those, a tethered version of Atlas will be expected to complete tasks which include driving a car, removing debris blocking doors, climbing a ladder, finding and closing a valve, and connecting a fire hose. The best-performing teams in the December 2013 trials will win funding to continue refining Atlas so it can perform all eight tasks autonomously during the challenge finals in late 2014. (via BBC News - US unveils ‘Atlas’ humanoid robot test bed)
Calculating art: Meet e-David, the painting machine (w/ Video)
Sometime in the future, you will be at an art gallery where you are drawn to a nice-looking tree, or haunting line drawing of a woman’s face, or historical portrait, and you will wonder who the artist is. Eye the lower corner of the canvas and it will tell you: “David.” What you might not realize is that David is a robot—e-David, to be exact. A team at the University of Konstanz in Germany has developed e-David as a robot “artist” that uses software to decide where to add the next brush stroke. After each brush stroke, e-David takes a picture, and its software calculates such moves as where the image needs to be lightened or darkened. At the University of Konstanz, the group said their project objective is to build a robot that can paint, pure and simple. By paint, they do not mean adding a fresh coat to a kitchen ceiling but delivering art.
The school’s Department of Computer and Information Science has a structure where workgroups share a common research topic, and e-David is a project within the topic, “Exploration and visualization of large information spaces.”
The robot is not “person-able,” more like the metallic skeleton of a mythical and very studious canine. The team used an industrial robot normally used to weld car bodies, and enhanced it with sensors, a camera, a control computer, and art supplies. They chose the name “David” not because they especially liked that name but because it stands for what they tried to accomplish: Drawing Apparatus for Vivid Image Display.
The computer program provides drawing commands that are executed by the machine. This can be considered a step above humans painting by numbers: just as one observes sidewalk artists repeatedly tweaking their lines and dabs and brush strokes as they fill an empty canvas, the robot does something similar. The device takes a picture of what it wants to copy. The robot watches itself paint and decides where to add the next stroke, constantly tweaking its moves based on what it’s seeing through a camera pointed at its canvas. (via Calculating art: Meet e-David, the painting machine (w/ Video))
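The paint-photograph-compare loop described above is a classic visual feedback control scheme. A toy version, using tiny grayscale grids (0 = black, 1 = white) in place of real photographs, might look like the following; the stroke model and update rule are illustrative assumptions, not the Konstanz team’s actual software:

```python
# Toy sketch of an e-David-style feedback loop: compare a 'photograph' of
# the canvas against the target image, pick the worst-matching spot, and
# apply a lightening or darkening stroke there. Grids are lists of lists
# of floats in [0, 1]; the stroke step size is an arbitrary assumption.

def next_stroke(target, canvas):
    """Find the cell where the canvas differs most from the target.
    Returns (row, col, direction), or None if the canvas matches."""
    best, best_err = None, 0.0
    for r, row in enumerate(target):
        for c, want in enumerate(row):
            err = abs(want - canvas[r][c])
            if err > best_err:
                best, best_err = (r, c), err
    if best is None:
        return None
    r, c = best
    return r, c, "lighten" if target[r][c] > canvas[r][c] else "darken"

def paint(target, canvas, step=0.5, max_strokes=100):
    """Iterate: choose a stroke, apply it, then re-'photograph' the canvas
    (here, simply reread it) before choosing the next stroke."""
    for _ in range(max_strokes):
        stroke = next_stroke(target, canvas)
        if stroke is None:
            break  # canvas matches target: the painting is done
        r, c, direction = stroke
        delta = step if direction == "lighten" else -step
        canvas[r][c] = min(1.0, max(0.0, canvas[r][c] + delta))
    return canvas
```

The essential idea survives even in this toy: the robot never needs a full plan of the painting up front, because each stroke is chosen from what the camera currently shows.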
Robots Hallucinate Humans to Aid in Object Recognition
Almost exactly a year ago, we posted about how Ashutosh Saxena’s lab at Cornell was teaching robots to use their “imaginations” to try to picture how a human would want a room organized. The research was successful, with algorithms that used hallucinated humans (which are the best sort of humans) to influence the placement of objects performing significantly better than other methods. Cool stuff indeed, and now comes the next step: labeling 3D point-clouds obtained from RGB-D sensors by leveraging contextual hallucinated people.
A significant amount of research has been done investigating the relationships between objects and other objects. It’s called semantic mapping, and it’s very valuable in giving robots what we’d call things like “intuition” or “common sense.” However, being humans, we tend to live human-centered lives, and that means that the majority of our stuff tends to be human-centered too, and keeping this in mind can help to put objects in context. (via Robots Hallucinate Humans to Aid in Object Recognition - IEEE Spectrum)
The octopus is a natural escape artist. It can squeeze its soft body into impossibly tight spaces and often baffles aquarium workers with its ability to break out of tanks. These abilities could be very useful in an underwater robot, which is why the OCTOPUS Project, a consortium of European robotics labs, is attempting to reverse engineer it in all its tentacled glory. Now researchers from the Foundation for Research and Technology – Hellas (FORTH) in Greece are learning how the robot might use its tentacles to swim. (via Unleash the Kraken! Robot octopus learning to swim)
Japanese robots make a stink about bad breath, body odor
Have you got a case of dog breath? How about smelly feet? Friends and family may not tell you, but a couple of new robots will. Built by the Kitakyushu National College of Technology and a group of inventive pranksters calling itself CrazyLabo, the pair of odor-detecting robots are giving people a lesson in hygiene and a few chuckles.
Kaori-chan, a decapitated mannequin head that sits atop a pink box, is the one that smells your breath. Simply blow into her face and don’t expect her to spare your feelings if you could use a mint or two. Responses range from the blunt, “Yuck, you have bad breath!” to an embarrassing, “Emergency! There’s an emergency taking place! That’s beyond the limit of patience!”
The foot-sniffing dog, Shuntaro-kun, is a bit less eloquent but just as clear with his responses. He’ll cuddle up to you if you smell ok, but if you stink he’ll bark, fall down and growl, or play dead. Both robots get their sense of smell from a commercially available odor sensor and grade your aroma on a four point scale. (via Japanese robots make a stink about bad breath, body odor)
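Since both robots quantize a raw sensor reading onto a four-point scale and pick a canned reaction from it, the behavior logic is easy to imagine. Here is one guess at how Shuntaro-kun’s mapping might work; the scale boundaries, sensor range, and response names are assumptions pieced together from the reactions the article describes, not CrazyLabo’s actual code:

```python
# Hypothetical sketch of a four-point odor scale driving dog-robot behavior.
# The sensor's full-scale value and the grade cutoffs are invented here.

DOG_RESPONSES = {
    0: "cuddle",               # no noticeable smell: friendly greeting
    1: "bark",                 # mild odor
    2: "growl_and_fall_down",  # strong odor
    3: "play_dead",            # overwhelming odor
}

def grade_odor(sensor_reading: float, full_scale: float = 100.0) -> int:
    """Quantize a raw odor-sensor reading onto the four-point scale (0-3)."""
    fraction = min(max(sensor_reading / full_scale, 0.0), 1.0)
    return min(3, int(fraction * 4))

def react(sensor_reading: float) -> str:
    """Map the graded reading to one of the dog's canned reactions."""
    return DOG_RESPONSES[grade_odor(sensor_reading)]
```

Kaori-chan’s spoken verdicts would follow the same pattern, just with a lookup table of insults instead of dog antics.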