Tools like Lumosity promise to stimulate your mind, though researchers question how much they improve cognitive performance.
For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.
While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”
And last month, in a move that could significantly improve the financial prospects for brain-game developers, the Centers for Medicare and Medicaid Services began seeking comments on a proposal that would, in some cases, reimburse the cost of “memory fitness activities.”
Let’s say Martians land on the Earth and wish to understand more about humans. Someone hands them a copy of the Complete Works of Shakespeare and says: “When you understand what’s in there, you will understand everything important about us.” The Martians set to work – they allocate vast resources to recording every detail of this great tome until eventually they know where every “e”, every “a”, every “t” is on every page. They remain puzzled, and return to Earth. “We have completely characterised this book,” they say, “but we still aren’t sure we really understand you people at all.”

The problem is that characterising a language is not the same as understanding it, and this is the problem faced by brain researchers too. Neurons (brain cells) use language of a kind, a “code”, to communicate with each other, and we can tap into that code by listening to their “chatter” as they fire off tiny bursts of electricity (nerve impulses). We can record this chatter and document all its properties. We can also determine the location of every single neuron, all of its connections, and its chemical messengers. Having done this, though, we still will not understand how the brain works. To understand a code we need to anchor that code to the real world.
Philosophy of cosmology, philosophy of physics, philosophy of science, metaphysics, philosophy of mathematics, University of Oxford.
Philosophy of cosmology is an expanding discipline, directed to the conceptual foundations of cosmology and the philosophical contemplation of the universe as a totality. It draws on the fundamental theories of physics — thermodynamics, statistical mechanics, quantum mechanics, quantum field theory, and special and general relativity — and on several branches of philosophy — philosophy of physics, philosophy of science, metaphysics, philosophy of mathematics, and epistemology.
Central questions concern limits to explanation; physical infinity; laws, especially laws (if any) of initial conditions; selection effects and the anthropic principle; objective probability; the nature of space, time, and spacetime; the arrow of time; the measurement problem of quantum mechanics; dark energy and quantum fluctuations; scale; the origins of structure formation; the origins and fate of the universe; and the place of life and intelligence within it.
The book business is merging into the magazine business as more publishers sell literature via subscription to highly targeted clusters of readers. High-profile literary studio Plympton is leading the charge with its $5-a-month iOS service Rooster.
Humans live on a water world, and yet many of us still struggle to slake our thirst. Why is that? Earth’s oceans are salty, and just 2.5% of the Earth’s water is freshwater. Of that, 60% is trapped in glaciers, 30% is in groundwater (not all of which is accessible), and just 10% is on the surface in lakes and rivers.
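The percentages quoted above compound: only a quarter of one percent of Earth’s water is easily accessible surface freshwater. A quick back-of-the-envelope check, using only the article’s figures:

```python
# Back-of-the-envelope check of the freshwater figures quoted above.
# The percentages come from the article; the arithmetic is the only addition.

freshwater_share = 0.025  # 2.5% of Earth's water is freshwater
surface_share = 0.10      # 10% of that freshwater sits in lakes and rivers

# Fraction of ALL water on Earth that is accessible surface freshwater
accessible = freshwater_share * surface_share
print(f"{accessible:.2%} of Earth's water is surface freshwater")  # 0.25%
```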
There is, of course, great demand for freshwater, and it isn’t all for drinking: freshwater is used for industrial and agricultural purposes too. Because current methods for removing the salt from ocean water (desalination) are energy-intensive and expensive, there is increasing competition for a limited supply of freshwater.
Confronted with a simple mathematical problem, most children ages 4 to 6 can use algebraic concepts intuitively to solve for a hidden variable, say researchers. “These very young children, some of whom are just learning to count, and few of whom have even gone to school yet, are doing basic algebra and with little effort,” says Melissa Kibbe, a post-doctoral fellow at Johns Hopkins University. “They do it by using what we call their ‘Approximate Number System:’ their gut-level, inborn sense of quantity and number.” Kibbe, lead author of a report in the journal Developmental Science, says the “Approximate Number System,” or “number sense,” is the ability to quickly estimate the quantity of objects in their everyday environments. Humans and a host of other animals are born with this ability, probably an evolutionary adaptation to help them survive in the wild, scientists say. Previous research has revealed that adolescents with better math abilities also had superior number sense when they were preschoolers, and that number sense peaks around age 35.
Bertrand Russell (1872–1970) summarizes the problem with defining knowledge in his Theory of Knowledge (1913):

“At first sight it might be thought that knowledge might be defined as belief which is in agreement with the facts. The trouble is that no one knows what a belief is, no one knows what a fact is, and no one knows what sort of agreement between them would make a belief true.”
“Information is perhaps the rawest material in the process out of which we arrive at meaning: an undifferentiated stream of sense and nonsense in which we go fishing for facts. But the journey from information to meaning involves more than simply filtering the signal from the noise. It is an alchemical transformation, always surprising. It takes skill, time and effort, practice and patience. No matter how experienced we become, success cannot be guaranteed. In most human societies, there have been specialists in this skill, yet it can never be the monopoly of experts, for it is also a very basic, deeply human activity, essential to our survival. If boredom has become a sickness in modern societies, this is because the knack of finding meaning is harder to come by.”—The problem with too much information – Dougald Hine – Aeon
An artificial-intelligence system has learned to spot the telltale language people use when lying in court or in fawning online book reviews
LAWYERS and judges use skill and instinct to sense who might be lying in court. Soon they may be able to rely on a computer, too.
An AI system trained on false statements is highly accurate at spotting deceptive language in written or spoken testimony. It can also be used to weed out fake online reviews of books, hotels and restaurants.
The system is the work of computational linguists Massimo Poesio at the University of Essex in Colchester, UK, and Tommaso Fornaciari at the Center for Mind/Brain Sciences in Trento, Italy. It is based on a technique called stylometry, which counts how often certain words appear in a passage.
The method is often applied to determine who wrote a piece of text, but software can employ it to pick out deception instead. The strategy is to seek out the overuse of linguistic hedges such as “to the best of my knowledge”, or overzealous expressions such as “I swear to god”.
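As a rough illustration of the stylometric idea (counting how often marker phrases appear in a passage), here is a minimal sketch. The phrase lists and the per-100-words normalization are illustrative assumptions, not Poesio and Fornaciari’s actual features, and a real system would feed such counts into a trained classifier rather than reading them directly:

```python
# Minimal stylometry sketch: rate of "marker" phrases per 100 words.
# Phrase lists below are hypothetical examples, not the researchers' features.

HEDGES = ["to the best of my knowledge", "as far as i know", "i believe"]
OVERZEALOUS = ["i swear to god", "i would never lie", "honestly"]

def marker_rate(text: str, phrases: list) -> float:
    """Occurrences of the given phrases per 100 words of text."""
    lowered = text.lower()
    hits = sum(lowered.count(p) for p in phrases)
    n_words = max(len(text.split()), 1)
    return 100.0 * hits / n_words

statement = ("I swear to god I was home, and to the best of my knowledge "
             "no one called.")
print(marker_rate(statement, HEDGES))       # hedge markers per 100 words
print(marker_rate(statement, OVERZEALOUS))  # overzealous markers per 100 words
```

In a full pipeline these rates would be two of many features handed to a machine-learning model trained on known-truthful and known-deceptive texts.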
“But all previous studies had used deceptive texts created in the lab,” Poesio says. “What has been missing was a system that could work on real-world lies.”
So he and Fornaciari trained a machine learning system by feeding it Italian courtroom depositions and statements by defendants known to have committed perjury. The researchers say it is now nearly 75 per cent accurate at indicating whether a defendant or witness is being deceptive. “We can achieve an accuracy that is way above chance,” says Poesio.
Neuroscientists monitor inhibitory neurons that link sense of smell with memory and cognition in mice, shaping perception from experiences
Odors have a way of connecting us with moments buried deep in our past. But researchers have long wondered how the process works in reverse: how do our memories shape the way sensory information is collected?

In work published in Nature Neuroscience, scientists from Cold Spring Harbor Laboratory (CSHL) demonstrate for the first time a way to observe this process in awake animals. The team, led by Assistant Professor Stephen Shea, was able to measure the activity of a group of inhibitory neurons that links the odor-sensing area of the brain with brain areas responsible for thought and cognition. This connection provides feedback so that memories and experiences can alter the way smells are interpreted.

The inhibitory neurons that forge the link are known as granule cells. They are found in the core of the olfactory bulb, the area of the mouse brain responsible for receiving odor information from the nose. Granule cells receive inputs from areas deep within the brain involved in memory formation and cognition, and relay that information back to the olfactory bulb, where they inhibit the neurons that receive sensory inputs. In this way, “the granule cells provide a way for the brain to ‘talk’ to the sensory information as it comes in,” explains Shea. “You can think of these cells as conduits which allow experiences to shape incoming data.”

Why might an animal want to inhibit or block out specific parts of a stimulus, like an odor? Every scent is made up of hundreds of different chemicals, and “granule cells might help animals to emphasize the important components of complex mixtures,” says Shea. For example, an animal might have learned through experience to associate a particular scent, such as a predator’s urine, with danger. But each encounter with the smell is likely to be different. Maybe it is mixed with the smell of pine on one occasion and seawater on another. Granule cells provide the brain with an opportunity to filter away the less important odors and to focus sensory neurons only on the salient part of the stimulus.

Now that it is possible to measure the activity of granule cells in awake animals, Shea and his team are eager to look at how sensory information changes when the expectations and memories associated with an odor change. “The interplay between a stimulus and our expectations is truly the merger of ourselves with the world. It’s exciting to see just how the brain mediates that interaction,” says Shea.

This work was supported by the Klingenstein fellowship and a fellowship from the Natural Sciences and Engineering Research Council of Canada.
“It is the only rule of liberation that you will need to follow, the law of liberating yourself from that which makes you free in appearance to that which makes you free in substance.”—my latest.. (in the ultrashort project) Wildcat: Suchness reveals Haecceity
Obesity may have harmful effects on the brain, and exercise may counteract many of those negative effects even when the animals do not lose much weight, according to sophisticated new neurological experiments with mice. While it’s impossible to know if human brains respond in precisely the same way to fat and physical activity, the findings offer one more reason to get out and exercise.
It’s been known for some time that obesity can alter cognition in animals. Past experiments with lab rodents, for instance, have shown that obese animals display poor memory and learning skills compared to their normal-weight peers. They don’t recognize familiar objects or recall the location of the exit in mazes that they’ve negotiated multiple times.
But scientists hadn’t understood how excess weight affects the brain. Fat cells, they knew, manufacture and release substances into the bloodstream that flow to other parts of the body, including the heart and muscles. There, these substances jump-start biochemical processes that produce severe inflammation and other conditions that can lead to poor health.
Many thought the brain, though, should be insulated from those harmful effects. It contains no fat cells and sits behind the protective blood-brain barrier that usually blocks the entry of undesirable molecules.
However, recent disquieting studies in animals indicate that obesity weakens that barrier, leaving it leaky and permeable. In obese animals, substances released by fat cells can ooze past the barrier and into the brain.
Craig Venter, who managed to make science both lucrative and glamorous with his pioneering approach to gene sequencing and synthetic biology, is taking on a new venture: aging.
He has joined forces with the founder of the X Prize and an expert in cell therapy to launch on Tuesday a new company called Human Longevity Inc. The man who once took off on his personal yacht to sample all the microscopic life in the seas plans to leverage some of the most fashionable new scientific approaches to figure out what makes us sick and old.
The San Diego-based company will tackle aging using gene sequencing; stem cell approaches; the collection of bacteria and other life forms that live in and on us called the microbiome; and the metabolome, which includes the byproducts of life called metabolites.
They’ll start out with what they are calling the largest human sequencing operation in the world.
“We are building a lab to a scale never attempted (before),” Venter told NBC News.
Venter first shot to fame when he raced with government scientists to finish the first map of all human DNA, called the human genome. Venter, himself a former government scientist, annoyed his former colleagues with a brash new approach to gene sequencing that was much faster but far less accurate, in their opinion.
The Golden Age of universities may be dead. And while much of the commentary around the online disruption of education ranges from cost-benefit analyses to assessing the ideology of what drives MOOCs (massive open online courses), the real question becomes — what is the point of the university in this landscape?
It’s clear that universities will have to figure out the balance between commercial relevance and basic research, as well as how to prove their value beyond being vehicles for delivering content. But lost in the shuffle of commentary here is something arguably more important than and yet containing all of these factors: culture.
Online courses can be part of, and have, their own culture, but university culture cannot be replicated in an online environment (at least not easily). Once this cultural difference is acknowledged, we can revisit the cost-benefit analysis: Is cheaper tuition worth it if it pays for education that isn’t optimized for innovation? Will university culture further stratify the socioeconomic difference MOOCs may level? And so on…
While innovation is a buzzword that’s bandied about a bit too loosely, we think this is the lens we need to use in judging the relevance of universities. It’s the only thing that prevents us from programming students as robots, a workforce whose jobs can be automated away. In fact, universities that excel at preparing students for such a creative economy prioritize the same three things that drive successful startup cultures: density, shared resources, and community.
Think back to a time when you were completely engaged in an activity. Maybe it was reading a comic book, or catching up with an old friend. Whatever it was, what do you remember about the experience? Are “effort” and “persistence” words you would use to describe the activity? Even though something technically got done (a comic book was read, a fruitful discussion ensued), it most likely felt effortless and enjoyable.
After interviewing people about their “peak experiences” —from rock climbers to chess masters to artists to scientists— psychologist Mihaly Csikszentmihalyi found that people kept describing a state of intense concentration and absorption in which no mental resources were left over for distraction. In this state of flow, people felt in control of their consciousness, their inner critic disappeared, and time seemed to recede into the background. Importantly, the activity felt effortless.
The great educational philosopher John Dewey was one of the first to emphasize the important linkages among interest, curiosity, and effort. Dewey made the persuasive case that interest-based learning is more beneficial than effort-based learning. He noted that “willing attention” is more effective than “forced effort” because interest drives active learning: “If we can secure interest in a given set of facts or ideas we may be perfectly sure that the pupil will direct his energies toward mastering them.” In contrast, he noted, an education based on forcing children to expend energy unwillingly only results in a “character dull, mechanical, unalert, because the vital juice of spontaneous interest has been squeezed out.”
When we read, our eyes move across a page or a screen to digest the words. All of that eye movement slows us down, but a new technology called Spritz claims to have figured out a way to turn us into speed-readers. By flashing words onto a single point on a screen, much like watching TV, Spritz says it will double your reading speed.
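The technique Spritz builds on is known as rapid serial visual presentation (RSVP): each word is flashed at a single point for a fixed interval, so reading speed is simply the inverse of the per-word delay. The sketch below is a toy terminal illustration of that idea, not Spritz’s actual software; the 20-character display width and the uniform per-word timing are assumptions for demonstration.

```python
import time

def per_word_delay(wpm: int) -> float:
    """Seconds each word stays on screen at a given words-per-minute rate."""
    return 60.0 / wpm

def rsvp(text: str, wpm: int = 300) -> None:
    """Flash the words of `text` one at a time at a fixed spot on screen."""
    for word in text.split():
        # '\r' rewrites the same line, centering each word in a 20-char field
        print(f"\r{word:^20}", end="", flush=True)
        time.sleep(per_word_delay(wpm))
    print()

# At 300 wpm each word is shown for 0.2 s; doubling the rate to 600 wpm
# halves the delay to 0.1 s, which is the sense in which the speed "doubles".
```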