248 posts tagged Robotics
When Will Robots Take Over the World?
Jul 17, 2014 | 2-part series
Kasia Cieplak-Mayr von Baldegg, Sam Price-Waldman, Paul Rosenfeld
The Big Question is a series inspired by The Atlantic's back-page feature.
Courtesy of The Atlantic
The Moral Hazards and Legal Conundrums of Our Robot-Filled Future
The robots are coming, and they’re getting smarter. They’re evolving from single-task devices like Roomba and its floor-mopping, pool-cleaning cousins into machines that can make their own decisions and autonomously navigate public spaces. Thanks to artificial intelligence, machines are getting better at understanding our speech and at detecting and reflecting our emotions. In many ways, they’re becoming more like us.

Whether you find it exhilarating or terrifying (or both), progress in robotics and related fields like AI is raising new ethical quandaries and challenging legal codes that were created for a world in which a sharp line separates man from machine. Last week, roboticists, legal scholars, and other experts met at the University of California, Berkeley law school to talk through some of the social, moral, and legal hazards that are likely to arise as that line starts to blur.

The July 11 panel discussion ranged from whether police should be allowed to have drones that can taser suspected bad guys to whether life-like robots should have legal rights. One of the most provocative topics was robot intimacy. If, for example, pedophilia could be eradicated by assigning child-like robots to sex offenders, would it be ethical to do so? Is it even ethical to do the research to find out whether it would work?

“We’re poised at the cusp of really being surrounded by robots in daily life,” said Jennifer Urban, the Berkeley law professor who moderated the panel. That, she says, is why now is the time to start grappling with these questions. A future filled with robots may be inevitable, but we still have an opportunity to shape it. Below are five thought-provoking themes that emerged from the discussion. (via The Moral Hazards and Legal Conundrums of Our Robot-Filled Future | Science | WIRED)
Meet Ray: Düsseldorf Airport’s autonomous robot car parking concierge
You’ve just caught the “red-eye” flight home. The baby in the next row screamed all the way, your inflight meal was awful, and you’re beyond tired. You drag yourself off the plane and schlep your heavy baggage over to the car park – only to realize that you’ve forgotten where you parked your car. At times like this, wouldn’t it be nice if you could just have your vehicle magically appear? Well, if you’re at Düsseldorf Airport in Germany, you can. That’s because “Ray,” the parking robot concierge installed there, knows when your flight arrives, picks up your car in its mechanical arms, and delivers it right to you. (via Meet Ray: Düsseldorf Airport’s autonomous robot car parking concierge)
An intelligent robot equipped with emotion might feel sad at its lack of progress, and eventually give up and do something else. (via Artificial Emotions - Issue 1: What Makes You So Special - Nautilus)
hitchBOT aims to be first robot to hitchhike across Canada
In what is hailed as a world first for robots, a Canadian robot dubbed “hitchBOT” hopes to be the first to hitchhike across Canada this July. Wearing jaunty red boots and yellow garden gloves (with one in a permanent “thumbing a ride” gesture), hitchBOT is going to try to use his good looks and power of speech to convince people to pick him up and drive him from Halifax, Nova Scotia to Victoria, British Columbia. According to his designers, hitchBOT boasts artificial intelligence (AI) and has been endowed with speech recognition and speech processing capabilities so that he may understand and converse with the people he encounters on his journey. To keep them engaged in conversation, hitchBOT apparently also runs social media and Wikipedia APIs, so that he will not only be able to talk to the people who pick him up, but also make interesting and informed small talk with them whilst tweeting and posting his “thoughts” to a wider audience. A collaborative venture first conceived in 2013, hitchBOT is a product of the work of Dr. David Harris Smith of McMaster University and Dr. Frauke Zeller of Ryerson University. Since its inception, the team has expanded to include other collaborators and researchers from a wide range of disciplines at both universities, including computer science, electrical engineering, communication, and mechatronics. (via hitchBOT aims to be first robot to hitchhike across Canada)
Thanks to new technology, sex toys are becoming tools for connection - but will sexbots reverse that trend?
There is only one true sexbot that you can go out and buy today. Her name is Roxxxy, and she is a ‘robot companion’ intended to look human, or something very close. She’s 5’7” and slender. She comes in a wide range of hair and eye colours. And depending on the model you choose, she can ‘hear’ you, ‘talk’ to you, and ‘feel’ you. First introduced in 2010 after nearly a decade of development, the Roxxxy line now includes RoxxxyGold, RoxxxySilver, and RoxxxyPillow, as well as Rocky. Only RoxxxyGold comes equipped with a ‘personality,’ although RoxxxySilver will talk during sex. RoxxxyPillow, the least expensive model, is only the torso, head, and three ‘inputs’ – vagina, anus, and mouth. Unlike the other models, which are full-sized, RoxxxyPillow can be tucked away discreetly when not in use.
Go read the full article.
For my own take on the issue, read here:
Photos of a Strange, Thriving Humanoid Robotics Movement
Japan is famous for its robotics industry which has developed everything from faceless industrial robots that power factories to cybernetic cats that provide companionship to the elderly. There’s also a subculture of scientists trying to create robots that could pass as humans and London-based photographer Luisa Whitton has captured their stories in a series called What About the Heart? A scholarship provided Whitton with the opportunity to travel to Japan to meet with robotics pioneer Hiroshi Ishiguro, who became famous in tech circles for having built an eerily creepy robotic copy of himself. “I was initially drawn to the uncanny and surrealistic aspect to Ishiguro’s story, and this area of robotics specific to Japan which has a reputation in pushing the boundaries between science, art, and philosophy,” says Whitton. The result is a collection of photos that appear to capture robots in the throes of electronic existential crises. (via Photos of a Strange, Thriving Humanoid Robotics Movement | Design | WIRED)
Robot Linda to meet the public at London’s Natural History Museum
Having a robot around the house might be nice, but not if it keeps stepping on the cat and tripping over the coffee table. This month, the public will get the chance to meet a robot at the Natural History Museum in London that may be a bit kinder to furniture and tabbies. The University of Lincoln’s Linda robot, which will mingle with visitors, is designed to learn about its surroundings and make it easier for robots to work in human environments. Looking like a pair of eyes in a fishbowl stuck on a traffic cone, Linda is a mobile robot developed by the University of Lincoln’s School of Computer Science. Its name is a reference to “Lindum Colonia,” the ancient name for the city of Lincoln. It’s one of six robots built for the £7.2 million (US$12 million) STRANDS project, which aims to produce robots suitable for working with security guards and staff in nursing homes. Most state-of-the-art robots are given maps of their surroundings, or create them when they begin operating in an area. This works, but human environments tend to change over time as furniture is moved, people come and go, and objects disappear and reappear. These changes degrade the robot’s map as anomalies build up. The result is that most robots can only operate for a few hours before needing to restart and remap the area. (via Robot Linda to meet the public at London’s Natural History Museum)
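One way to picture the long-term mapping problem the article describes is with a toy sketch (this is an illustration, not the actual STRANDS software; the decay and pruning constants are made up): each mapped object carries a confidence score that is refreshed when the robot re-observes it and decays when it goes missing, so stale furniture positions fade out instead of piling up as anomalies.

```python
# Hypothetical sketch of long-term map maintenance (not the STRANDS
# implementation): objects the robot fails to re-observe lose
# confidence and are eventually pruned from the map.

DECAY = 0.7         # confidence multiplier per missed observation (assumed)
BOOST = 1.0         # confidence assigned on a successful observation
FORGET_BELOW = 0.1  # prune objects once confidence drops below this

def update_map(obj_confidence, observed_ids):
    """Update a {object_id: confidence} map given the ids seen this patrol."""
    updated = {}
    for obj_id, conf in obj_confidence.items():
        # refresh objects we saw again; decay the ones we missed
        conf = BOOST if obj_id in observed_ids else conf * DECAY
        if conf >= FORGET_BELOW:
            updated[obj_id] = conf
    # newly seen objects enter the map at full confidence
    for obj_id in observed_ids:
        updated.setdefault(obj_id, BOOST)
    return updated

m = update_map({}, {"sofa", "table"})  # first patrol: both seen
m = update_map(m, {"sofa"})            # the table has been moved
m = update_map(m, {"sofa"})            # still gone: its confidence decays
```

After enough missed patrols the table drops out of the map entirely, which is the behaviour that lets a robot like Linda keep running for weeks instead of restarting and remapping every few hours.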
MIT’s cooking up robots that can assemble themselves in the oven
It’s 2050, and you’re prepping the oven to bake your next robotic minion while a 3D printer spews out its components. Wait a sec… bake a robot? As strange as that sounds, there’s already a group of MIT researchers developing the technology and the printable materials that can self-assemble into a robot when heated. Since we usually bake food and not robots (and this is all very new), the researchers are experimenting with different materials to find the best option. One is aluminum-coated polyster that folds or twirls itself to form the proper components inside an oven. The other is PVC plastic sandwiched between rigid polyester sheets full of cuts and slits — upon heating, the PVC becomes deformed and the slits close, forcing the whole thing to bend and fold into place. Also, the scientists are looking into developing a system that uses CAD files to create 2D patterns, as described in one of the two papers they published about the research. Obviously, the team’s not going to develop the perfect material and method overnight, but MIT professor Daniela Rus says they ultimately hope to make it possible to create useful robots anytime. (via MIT’s cooking up robots that can assemble themselves in the oven)
If a robot read a novel, how would it feel? You might get a sense from these little jingles. Below are some songs that were automatically created by a series of algorithms that turn the emotions in novels into short pieces of music. If the songs remind you, traumatically, of your untalented little sister practicing piano… well, you can’t say I didn’t warn you. Actually, the origins of the songs are pretty cool, as the Physics arXiv Blog reports. They start with sentiment analysis, a field in computer science that got hot not long after Twitter did. As more and more people started tweeting, computer scientists and companies wanted to automatically process those tweets, to figure out what emotions people were expressing in them. For example, do people feel negatively or positively about… snack cakes? How do people feel about a specific brand, say, Little Debbie? You can see the commercial interest in this. The same techniques computer scientists use to analyze Twitter can also read the feels in any text. So now it’s possible to automatically read the emotions in novels, too. To make the songs below, two researchers—one of them a programmer and a musician—went one step beyond that. After running novels through a sentiment-analysis algorithm, they created an algorithm that would express those sentiments through music.
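The pipeline described above — score the text for sentiment, then express the scores as music — can be sketched in a few lines. This is a deliberately crude illustration, not the researchers’ system: the mini-lexicon, chunk size, and pitch mapping are all invented for the example.

```python
# Toy text-to-music pipeline (illustrative only): score chunks of a
# novel with a tiny sentiment lexicon, then map each score to a MIDI
# pitch, so happier passages sound higher and sadder ones lower.

POSITIVE = {"love", "joy", "hope", "delight", "smile"}   # made-up mini-lexicon
NEGATIVE = {"fear", "grief", "death", "dark", "tears"}

def sentiment(chunk):
    """Crude lexicon score: (#positive - #negative) / #words, in [-1, 1]."""
    words = chunk.lower().split()
    if not words:
        return 0.0
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / len(words)

def to_midi_notes(text, chunk_size=8):
    """Split the text into fixed-size word chunks and map each chunk's
    sentiment to a pitch: neutral -> middle C (MIDI 60), positive higher,
    negative lower, spanning two octaves at the extremes."""
    words = text.split()
    chunks = [" ".join(words[i:i + chunk_size])
              for i in range(0, len(words), chunk_size)]
    return [60 + round(24 * sentiment(c)) for c in chunks]

notes = to_midi_notes("joy and love filled the bright hall until "
                      "dark grief and tears followed")
```

A real system would use a proper sentiment lexicon and map scores to keys, tempos, and chord choices rather than single pitches, but the shape of the pipeline — text in, emotion scores in the middle, notes out — is the same.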