Human beings have long performed sexual acts with artifacts. Ancient religious rituals often involved the performance of sexual acts with statues, and down through the ages a vast array of devices for sexual stimulation and gratification have been created. Little wonder, then, that a perennial goal among roboticists and AI experts has been the creation of sex robots (“sexbots”): robots from whom we can receive sexual gratification, and with whom we may even be able to achieve an emotional connection.
Qualcomm’s brain-inspired chip: Good phone, good robot
This month, chipmaker Qualcomm opened up about its progress and goals in its work on a brain-inspired chip architecture. The results are impressive. Computers that can mimic the human brain pose a challenge that attracts many computer scientists. While some people take comfort in the difference between computers and humans, such scientists see the difference as a challenge and ask if the gap can be narrowed. Qualcomm, for one, is working away at a computer architecture modeled after the brain, imitating brain processes. In a recent blog post, Samir Kumar, Qualcomm’s director of business development, presented his overview of the company’s brain-inspired Zeroth processors. “For the past few years our Research and Development teams have been working on a new computer architecture that breaks the traditional mold. We wanted to create a new computer processor that mimics the human brain and nervous system so devices can have embedded cognition driven by brain inspired computing—this is Qualcomm Zeroth processing.” The company envisions “neuro-inspired” chips for robots, vision systems, brain implants and smartphones that will sense and process information more efficiently than ever before. Qualcomm has been focusing on a class of processors called neural processing units (NPUs), designed to be massively parallel, reprogrammable, and capable of cognitive tasks such as classification and prediction. (via Qualcomm’s brain-inspired chip: Good phone, good robot)
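The post doesn’t detail Zeroth’s internals, but chips in the NPU class are generally built around spiking neuron models rather than conventional instruction streams. As a rough illustration of the computational primitive involved (a sketch only, not Qualcomm’s design, and with invented parameter values), here is a minimal leaky integrate-and-fire neuron in Python:

    # Minimal leaky integrate-and-fire (LIF) neuron, the basic unit that
    # spiking "neuromorphic" designs implement in silicon. Illustrative
    # only: parameter values are invented, not Qualcomm's.
    def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                     v_threshold=1.0, v_reset=0.0):
        """Return the time steps at which the neuron fires."""
        v = v_rest
        spikes = []
        for t, i_in in enumerate(input_current):
            # The membrane potential leaks toward rest while integrating input.
            v += (i_in - (v - v_rest)) * (dt / tau)
            if v >= v_threshold:          # threshold crossed: fire
                spikes.append(t)
                v = v_reset               # reset after the spike
        return spikes

    # A steady drive above threshold produces a regular spike train.
    print(simulate_lif([1.5] * 100))

An NPU implements many such units side by side in silicon, which is roughly what “massively parallel” means in the description above.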
Imagine if an army of completely flat-faced cubes could roll around and even jump on their own, joining with one another to form a variety of large-scale structures. Well, that’s exactly what a team of robotics researchers at MIT is trying to turn into a reality – and they’ve already developed the cubes that could do it. Known as M-Blocks, the devices were created by MIT’s John Romanishin, Daniela Rus and Kyle Gilpin. Along with electronics that allow them to orient themselves relative to one another, each cube also contains a motor-driven flywheel that spins at speeds of up to 20,000 rpm. When that flywheel suddenly brakes, the transferred momentum sends the cube flying in the direction that the wheel was spinning. Because the cubes additionally have magnets on each of their faces, they stick to one another when they make contact, until the flywheel in one sends it on its way again. In order to make sure that the magnets of any two cubes meet north-pole-to-south-pole, the magnets themselves are cylindrical, and mounted in such a way that they can roll in place. If the magnets on two cubes are brought together north-to-north or south-to-south, the resulting repellent force will cause them to turn until their north and south poles are facing one another – at which point they’ll join together. (via Self-propelled robotic cubes can form into structures)
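The article gives the flywheel’s top speed but not its dimensions, so the numbers below are guesses; even so, a back-of-the-envelope calculation shows why braking a 20,000 rpm flywheel in a few milliseconds is enough to send a small cube tumbling:

    import math

    # Back-of-the-envelope look at the M-Block jumping trick: the flywheel's
    # angular momentum, dumped during a few milliseconds of braking, becomes
    # a large torque on the cube. Flywheel mass, radius, and braking time
    # are assumed values, not MIT's published specs.
    rpm = 20_000                       # top speed, from the article
    omega = rpm * 2 * math.pi / 60     # rad/s
    m, r = 0.05, 0.02                  # assumed: 50 g flywheel, 2 cm radius
    inertia = 0.5 * m * r ** 2         # solid disc, kg*m^2
    L = inertia * omega                # angular momentum, kg*m^2/s

    brake_time = 0.01                  # assumed: 10 ms to stop the wheel
    torque = L / brake_time            # average torque on the cube, N*m

    print(f"angular momentum: {L:.4f} kg*m^2/s")
    print(f"average braking torque: {torque:.2f} N*m")
    # Gravity resists a ~150 g cube pivoting on a 5 cm edge with only about
    # 0.04 N*m, so a braking torque of ~2 N*m flips or launches it easily.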
It takes years of practice and intense concentration to master the art of painting, or, if you’re a welding robot, just some really good programming. In a studio at the University of Konstanz in Germany, just such a robot is dabbing its brush in paint as it works. The robot is called e-David, and it can reproduce any work of art it’s shown.
A welding robot is actually a good choice for a makeshift artist. These robot arms have three degrees of freedom that let them precisely aim a torch at bits of metal, and they can just as easily be programmed to point a paintbrush at a canvas as an arc welder at a car door. Researchers have given e-David a palette of 24 colors to work with, and it does okay for a robot. (via Re-purposed welding robot can forge any painting it’s shown | News | Geek.com)
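Since e-David works from a fixed palette of 24 paints, one small but necessary piece of reproducing an arbitrary image is mapping every target color to the nearest available paint. A minimal sketch of that step (the palette entries are placeholders, and this is not e-David’s actual software):

    # Map a target RGB color to the nearest paint in a fixed palette,
    # as any fixed-palette painter (human or robot) must do. The palette
    # entries here are placeholders; e-David's real 24 paints differ.
    PALETTE = [
        (0, 0, 0), (255, 255, 255), (200, 30, 30),
        (30, 120, 200), (240, 200, 40), (40, 150, 60),
        # ... 18 more paints in the real 24-color palette
    ]

    def nearest_paint(target):
        """Pick the palette color with the smallest squared RGB distance."""
        return min(
            PALETTE,
            key=lambda p: sum((a - b) ** 2 for a, b in zip(p, target)),
        )

    print(nearest_paint((210, 60, 50)))   # -> (200, 30, 30), the reddish paint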
"The thing that’s been missing in robotics is a sense of smell," said biology professor Joseph Ayers. For more than four decades, he has been working to develop robots that do not rely on algorithms or external controllers. Instead, they incorporate electronic nervous systems that take in sensory inputs from the environment and spit out autonomous behaviors. For example, his team’s robo-lobsters are designed to seek out underwater mines without following a predetermined course. "Now people want robots to do group behavior," said Ayers, noting that social insect colonies are the perfect model. "If you’re doing large field explorations for mines, you want to have 20 or 30 robots out there." In order to get robots to cooperate with each other, he needs them to act like ants or bees or termites. Bees waggle their behinds to communicate. Ants use almost two dozen scent glands, depositing a trail of "stinks" as they go about their business. It’s this behavior that Ayers wants to mimic in his next generation of biomimetic robots. To do so, he needs electronic devices that can sense chemical inputs, such as explosives. His idea is to integrate various microelectronic sensors that can interface with living cells. For example, a bacterial cell programmed to bind odorants in the environment may elicit a conformational change; that change may translate to an influx of calcium ions, which are detected by a second cell that is programmed to generate light when bound to calcium. In this way, Ayers said, "you can see smell." That output would then trigger microelectronic actuators that tell the robot to perform a particular action, such as moving toward or away from the stimulus. But in order for any of this to play out, somebody needs to build these futuristic device
Pterodactyl-like bot “Daler” uses its wings for walking
The Daler can fly through the air with grace, but the fun doesn’t stop when this remote-controlled robot lands. The wings segment, and it clambers across the ground like an ungainly pterodactyl or bat. Most robots only use one type of locomotion. They fly through the air (raining death from above, in the case of US military drones), swim through the seas, like the Sharkbot or this robotic jellyfish, or crawl, run, and roll across the earth. However, the Daler (Deployable Air-Land Exploration Robot) uses “adaptive morphology” to master the skies and the earth. It has a wingspan of 60cm, and using its battery-powered “Whegs” (wheel-legs) can fly for 30 minutes or walk for an hour. Developed at the Laboratory of Intelligent Systems at Ecole Polytechnique Fédérale de Lausanne, the Daler could one day be used in search-and-rescue, the robot’s inventor told Wired.co.uk. “The goal is to cover large distances as fast as possible in a forward flight configuration and then use hover or ground locomotion to search for victims,” said Ludovic Daler. “Being capable of multi-modal locomotion for a robot has an advantage in complex terrains where other robots would get stuck at some point.” At the moment, the Daler’s mastery of the earth is rather turtle-like, with a top speed of just 0.2 m/s, and it needs to be thrown by hand to launch. But despite these shortcomings, this initial prototype shows how robots of the future could tackle any obstacle by changing their shape. The next version of the robot will include vertical take-off and landing capabilities, says Daler. He is also investigating ways of allowing the robot to choose its gait depending on the terrain, as well as reducing the wingspan with deployable wings to increase its ground manoeuvrability. (via Pterodactyl-like bot “Daler” uses its wings for walking | Ars Technica)
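The figures quoted above (an hour of walking at 0.2 m/s versus 30 minutes of flight) make the case for the hybrid design concrete. A quick comparison, with the cruise speed assumed since the article doesn’t give one:

    # Range comparison for the Daler's two modes. Walking figures are from
    # the article; the flight speed is an ASSUMED cruise value.
    walk_speed = 0.2                   # m/s, from the article
    walk_endurance = 60 * 60           # one hour, in seconds
    fly_speed = 10.0                   # m/s, assumed
    fly_endurance = 30 * 60            # 30 minutes, in seconds

    walk_range = walk_speed * walk_endurance   # 720 m
    fly_range = fly_speed * fly_endurance      # 18 km at the assumed speed

    print(f"walking range: {walk_range:.0f} m")
    print(f"flight range:  {fly_range:.0f} m (at assumed {fly_speed} m/s)")
    # Flight covers far more ground, but only walking lets the robot search
    # slowly and persistently once it lands: hence the two-mode design.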
Japan has launched the world’s first talking robot into space to serve as a companion to astronaut Koichi Wakata, who will begin his mission in November.
The android took off from the island of Tanegashima in an unmanned rocket also carrying supplies for crew onboard the International Space Station (ISS). Measuring 34cm (13 inches), Kirobo is due to arrive at the ISS on 9 August. It is part of a study to see how machines can lend emotional support to people isolated over long periods. The launch of the H-2B rocket was broadcast online by the Japan Aerospace Exploration Agency (Jaxa). The unmanned rocket is also carrying drinking water, food, clothing and work supplies to the six permanent crew members based at the ISS. (via BBC News - Kirobo is world’s first talking robot sent into space)
Robotic Skin Lights Up When Touched
Imagine how awesome — or distracting — it would be if human skin lit up every time something pushed on it. Pulsing arteries, mosquitoes, a rude shoulder-check on the sidewalk, or scratching an itch would transform a person into a blinking light show.
Now, scientists at the University of California, Berkeley have designed an electronic skin that actually does this: Super-thin and flexible, the skin lights up when touched. More pressure produces a brighter light, the team reports July 21 in Nature Materials.
Thinner than a sheet of paper, the skin is made from layers of plastic and a pressure-sensitive rubber. A conductive silver ink, organic LEDs, and thin-film transistors made from semiconductor-enriched carbon nanotubes are sandwiched between the layers. Applying pressure sends a signal through the rubber that ultimately turns on the LEDs, which light up in red, green, yellow or blue.
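The causal chain described here, pressure lowering the rubber’s resistance and the transistor feeding more current to the LED, is easy to caricature in a few lines. A toy model, with every constant invented for illustration:

    # Toy model of one e-skin pixel: pressure lowers the rubber layer's
    # resistance, more current reaches the LED, and the pixel brightens.
    # Every constant here is invented for illustration, not from the paper.
    def led_brightness(pressure_kpa, v_supply=5.0, r_fixed=1_000.0):
        # Assumed behavior: the pressure-sensitive rubber starts at
        # 10 kOhm unloaded and its resistance falls as pressure rises.
        r_rubber = 10_000.0 / (1.0 + pressure_kpa)
        current_ma = 1_000 * v_supply / (r_fixed + r_rubber)  # simple divider
        return current_ma                  # brightness tracks drive current

    for p in (0, 5, 20, 50):
        print(f"{p:>3} kPa -> {led_brightness(p):.2f} mA of LED drive")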
Instead of using the material to create bodysuits for Burning Man or other illuminated party tricks, scientists suggest that it might be used for smart wallpapers, health-monitoring devices, or in robotics. The type of interactive pressure sensor developed by the Berkeley scientists could also be useful in artificial skin for prosthetic limbs. For years, scientists have been working on developing systems and materials that could be integrated into a functioning, stimulus-responsive skin — something that can sense temperature, pressure, and stretch, and can heal itself. In addition, such a sheath might one day transform an ordinary robot into an interactive machine that’s capable of responding to tiny changes in its environment.
If and when that day comes, we will welcome our touchy-feely glow-bot overlords.
Aaron Saenz: I, For One, Welcome Our New Robot Overlords
Aaron Saenz is a former physicist, an improvisational actor, and a tech journalist. He currently writes for SingularityHub.com, covering everything from crowd-sourced holographic Japanese pop stars to open source research robotics. He’s also the host of Singularity Hub’s Accelerated Tech News, a new video recap of the week’s top stories in science and technology. Follow his work on Twitter: @adsaenz
@ BAASICS.2: The Future (by BAASICSsf)
US unveils ‘Atlas’ humanoid robot test bed
A humanoid robot called Atlas could pave the way for intelligent machines to help in the wake of natural disasters. The two-meter-tall robot was created as a test bed for a US Defence Advanced Research Projects Agency challenge. The Darpa challenge demands that Atlas complete eight tasks it might have to perform in an emergency. Six teams have until December 2013 to develop software that will help Atlas complete the tasks. Atlas has been developed by Boston Dynamics, a robotics firm which has been working on robots that can aid the military. Like a human, Atlas has two arms and two legs and gets around by walking. It sees using a stereo laser-scanning system and has gripping hands developed by two separate robotics companies. Unlike humans, it has a high-speed networking system built in so it can communicate with its creators and pipe data back from disaster areas. Before now, the teams taking part in the robotic challenge have only worked with virtual versions of Atlas. In the next stage of the competition, algorithms and control programs for the virtual Atlas will be transferred to the real thing. The teams will then have five months to refine Atlas’s abilities before taking part in a series of trials. During those, a tethered version of Atlas will be expected to complete tasks which include driving a car, removing debris blocking doors, climbing a ladder, finding and closing a valve, and connecting a fire hose. The best-performing teams in the December 2013 trials will win funding to continue refining Atlas so it can perform all eight tasks autonomously during the challenge finals in late 2014. (via BBC News - US unveils ‘Atlas’ humanoid robot test bed)