A Momentary Flow

Updating Worldviews one World at a time

Team builds ‘reliable’ array for quantum computing
-
University of California, Santa Barbara Original Study
-
Physicists are closer to making a quantum computer a reality by demonstrating a new level of reliability in a five-qubit array. A fully functional quantum computer is one of the holy grails of physics. Unlike conventional computers, the quantum version uses qubits (quantum bits), which make direct use of the multiple states of quantum phenomena. When realized, a quantum computer will be millions of times more powerful at certain computations than today’s supercomputers. Quantum computing relies on aspects of quantum mechanics such as superposition. This notion holds that any physical object, such as an atom or electron—what quantum computers use to store information—can exist in all of its theoretical states simultaneously. This could take parallel computing to new heights. “Quantum hardware is very, very unreliable compared to classical hardware,” says Austin Fowler, a staff scientist in the physics department at the University of California, Santa Barbara, whose theoretical work inspired the experiments. “Even the best state-of-the-art hardware is unreliable. Our paper shows that for the first time reliability has been reached.” While Fowler and colleagues have shown logic operations at the threshold, the array must operate below the threshold to provide an acceptable margin of error. “Qubits are faulty, so error correction is necessary,” says graduate student and co-lead author Julian Kelly, who worked on the five-qubit array. “We need to improve and we would like to scale up to larger systems,” says lead author Rami Barends, a postdoctoral fellow with the group. “The intrinsic physics of control and coupling won’t have to change but the engineering around it is going to be a big challenge.” (via Team builds ‘reliable’ array for quantum computing | Futurity)
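
The threshold Fowler describes can be illustrated with a far simpler classical analogue: a repetition code decoded by majority vote. The Python sketch below is only an illustration of that general principle, not the surface-code scheme the Santa Barbara group is working toward, and every number in it is invented; it shows that adding redundancy suppresses errors only when the per-component error rate sits below a threshold, and makes things worse above it.

```python
# Minimal classical sketch (illustrative only, not the error-correction scheme
# used in the experiment): encode one bit as n noisy copies and decode by
# majority vote. Redundancy helps only when the per-copy error rate is below
# the classical threshold of 0.5; above it, more copies make things worse.
import random

def logical_error_rate(p_physical, n_copies, trials=20000):
    """Estimate how often majority voting over n noisy copies decodes the bit wrongly."""
    errors = 0
    for _ in range(trials):
        # Encode bit 0 into n copies; each copy flips independently with prob p_physical.
        flips = sum(1 for _ in range(n_copies) if random.random() < p_physical)
        if flips > n_copies // 2:      # majority vote returns the wrong value
            errors += 1
    return errors / trials

if __name__ == "__main__":
    for p in (0.01, 0.2, 0.6):         # well below, below, and above the threshold
        rates = [logical_error_rate(p, n) for n in (1, 3, 5, 9)]
        print(f"p = {p}: logical error vs. code size ->",
              " ".join(f"{r:.4f}" for r in rates))
```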

More brains don’t always lead to better decisions
-
Princeton University Original Study
-
We tend to think that a group decision is more likely to be accurate when there are more brains involved—but that might not be true in all situations. Researchers report that smaller groups actually tend to make more accurate decisions, while larger assemblies may become excessively focused on only certain pieces of information. The findings present a significant caveat to what is known about collective intelligence, or the “wisdom of crowds,” wherein individual observations—even if imperfect—coalesce into a single, accurate group decision. A classic example of crowd wisdom is English statistician Sir Francis Galton’s 1907 observation of a contest in which villagers attempted to guess the weight of an ox. Although not one of the 787 estimates was correct, the average of the guessed weights was just one pound short of the animal’s recorded heft. Along those lines, the consensus has been that group decisions are enhanced as more individuals have input. But collective decision-making has rarely been tested under complex, “realistic” circumstances where information comes from multiple sources, the researchers report in the journal Proceedings of the Royal Society B. (via More brains don’t always lead to better decisions | Futurity)
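
The caveat can be made concrete with a toy simulation of my own (not the model in the Proceedings of the Royal Society B paper; all probabilities below are invented for illustration). If part of the information reaching a group is shared, so that its errors hit everyone at once, majority votes in large groups lock onto that shared cue, and accuracy drifts toward the cue's reliability instead of improving with size.

```python
# Toy sketch under invented assumptions (not the published model): each voter
# relies either on a shared cue, whose error is common to the whole group, or
# on a private cue with independent errors. Large majorities lock onto the
# shared cue, so accuracy falls toward its reliability (0.55 here) as the
# group grows, and the smallest groups do best with these parameters.
import random

P_FOLLOW_SHARED = 0.6   # chance an individual relies on the shared cue
Q_SHARED = 0.55         # shared cue is correct in 55% of trials (one draw for everyone)
Q_PRIVATE = 0.9         # private cue is correct 90% of the time, independently per voter

def group_accuracy(group_size, trials=20000):
    """Fraction of trials in which the majority vote picks the correct option."""
    correct = 0
    for _ in range(trials):
        shared_is_right = random.random() < Q_SHARED
        votes_right = 0
        for _ in range(group_size):
            if random.random() < P_FOLLOW_SHARED:
                votes_right += shared_is_right
            else:
                votes_right += random.random() < Q_PRIVATE
        if votes_right > group_size / 2:
            correct += 1
    return correct / trials

if __name__ == "__main__":
    for n in (1, 3, 5, 11, 25, 101):    # odd sizes avoid tied votes
        print(f"group size {n:3d}: accuracy ~ {group_accuracy(n):.3f}")
```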

Medieval bishop’s theory resembles modern concept of multiple universes
-
A 13th-century bishop’s theory about the formation of the universe has intriguing parallels with the theory of multiple universes. This was uncovered by the Ordered Universe project at Durham University, which has brought together researchers from the humanities and the sciences in a radically collaborative way. The project explores the conceptual world of Robert Grosseteste, one of the most dazzling minds of his generation (1170 to 1253): sometime bishop of Lincoln, church reformer, theologian, poet, politician, and one of the first to absorb, teach and debate new texts on natural phenomena that were becoming available to western scholars. These texts, principally the natural science of the Greek scholar Aristotle, were translated from Arabic into Latin during the course of the 12th and 13th centuries, along with a wonderful array of material from Islamic and Jewish commentators. They revolutionised the intellectual resources of western scholars, posing challenges to established ways of thinking. We now recognise that the thinking they stimulated prepared the way for the scientific advances of the 16th and 17th centuries, too. Nearly 800 years later, the example of Grosseteste’s works provides the basis for doing great interdisciplinary work, offering unexpected challenges to modern scientists and humanities experts alike, especially in working closely together. (via Medieval bishop’s theory resembles modern concept of multiple universes)

People eat more in restaurants when the temperature is cool, possibly because we need more energy to warm up; Soft lighting (candlelight, in particular) puts us at ease and makes us eat for longer periods of time, while bright lights make us eat faster; Nice smells were shown to increase soda consumption in movie-watching experiments, while awful smells make us feel full faster; Social distractions — particularly watching TV or eating with friends — can lead to longer periods of eating because, like the amnesic patients at the top of the article, they make us forget what we’ve just consumed.

The Ways Food Tricks Our Brains - Derek Thompson - The Atlantic

Source: The Atlantic

go read: The Ways Food Tricks Our Brains
How restaurants, low-cal labels, candles, music, and even salads fool us into unhealthy eating.
-
In 1998, researchers from the University of Pennsylvania published a study that might strike you as kind of mean. They took two people with severe amnesia, who couldn’t remember events occurring more than a minute earlier, and fed them lunch. Then a few minutes later, they offered a second lunch. The amnesic patients eagerly ate it. Then a few minutes later, they offered a third lunch, and the patients ate that, too. Days later, they repeated the experiment, telling two people with no short-term memory that it was lunch time over and over and observing them readily eat multiple meals in a short period of time. This might seem like a somewhat trivial discovery, but it unveils a simple truth about why we eat. Hunger doesn’t come from our stomachs alone. It comes from our heads, too. We need our active memories to know when to begin and end a meal. While our stomachs know exactly what food we’re eating (since they’re the organ responsible for processing it), our brains are a bit more easily tricked. In this month’s Journal of Consumer Research, two studies on our brains and food open a crack into a depressing world of the eating brain’s awful gullibility. (via The Ways Food Tricks Our Brains - Derek Thompson - The Atlantic)

Regrown nerves boost bionic ears

Gene therapy aids performance of cochlear implants in guinea pigs.

Gene therapy delivered to the inner ear can help shrivelled auditory nerves to regrow — and in turn, improve bionic ear technology, researchers report today in Science Translational Medicine. The work, conducted in guinea pigs, suggests a possible avenue for developing a new generation of hearing prosthetics that more closely mimics the richness and acuity of natural hearing.


See on nature.com

With Farm Robotics, the Cows Decide When It’s Milking Time

Farms in upstate New York and elsewhere are using automatic milkers that scan and map the underbellies of cows, extract the milk, and monitor its quality, without the use of human hands.

-

…

The cows seem to like it, too.

Robots allow the cows to set their own hours, lining up for automated milking five or six times a day — turning the predawn and late-afternoon sessions around which dairy farmers long built their lives into a thing of the past.

With transponders around their necks, the cows get individualized service. Lasers scan and map their underbellies, and a computer charts each animal’s “milking speed,” a critical factor in a 24-hour-a-day operation.

The robots also monitor the amount and quality of milk produced, the frequency of visits to the machine, how much each cow has eaten, and even the number of steps each cow has taken per day, which can indicate when she is in heat.
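
As a rough sketch of how such per-cow telemetry might be organised (the field names, numbers, and step-count heuristic below are hypothetical illustrations, not any vendor’s actual software), a record keyed to the transponder could look like this:

```python
# Hypothetical per-cow record; field names, thresholds, and the estrus
# heuristic are illustrative assumptions, not taken from any milking system.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class CowRecord:
    transponder_id: str
    milking_speed_lpm: float                              # litres per minute, used to plan a 24-hour rotation
    milk_yield_l: list = field(default_factory=list)      # litres recorded at each visit
    daily_steps: list = field(default_factory=list)       # activity history, one entry per day

    def log_visit(self, litres: float) -> None:
        self.milk_yield_l.append(litres)

    def possible_estrus(self, today_steps: int, spike_factor: float = 1.5) -> bool:
        """Flag a possible heat: today's activity well above the cow's recent average."""
        if len(self.daily_steps) < 3:                     # not enough history to compare against
            return False
        return today_steps > spike_factor * mean(self.daily_steps[-7:])

# Example: a step count roughly 80% above this cow's recent baseline gets flagged.
cow = CowRecord("TAG-0042", milking_speed_lpm=3.2, daily_steps=[4200, 4100, 4300, 4150])
cow.log_visit(9.5)
print(cow.possible_estrus(7600))    # True with these made-up numbers
```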

“The animals just walk through,” said Jay Skellie, a dairyman from Salem, N.Y., after watching a demonstration. “I think we’ve got to look real hard at robots.”

Many of those running small farms said the choice of a computerized milker came down to a bigger question: whether to upgrade or just give up.


See on nytimes.com

First transfusions of “manufactured” blood planned for 2016

According to the World Health Organization, approximately 107 million blood donations are collected globally every year. Nonetheless, blood is often in short supply – particularly in developing nations. Despite new safeguards, there is still the risk of incompatibility, or of infections being transmitted from donors to recipients. Charitable organization the Wellcome Trust hopes to address these problems by developing the ability to manufacture blood outside of the body. Last week, it announced that test subjects should begin receiving transfusions of blood made with lab-grown red blood cells by late 2016.

The research program is being led by the Scottish National Blood Transfusion Service, thanks to the Wellcome Trust’s £5 million (US$8.4 million) Strategic Award grant. Institutions collaborating on the project include the University of Glasgow, the University of Edinburgh, Loughborough University, NHS Blood and Transplant, the Irish Blood Transfusion Service, Roslin Cells Ltd and the Cell Therapy Catapult, in collaboration with Bristol University and the University of Cambridge.


See on gizmag.com