Monday, February 3, 2014

Oversaturated minds?

The other day my friend Alma sent me this article and asked me what I thought of it... Well, I'm going to let you all read it first, and over the course of this week I'll write up my point of view; I'll also write about whether there is any scientific evidence to suggest that we only use 10% of our brain...


Good day to everyone :)


The Older Mind May Just Be a Fuller Mind

People of a certain age (and we know who we are) don’t spend much leisure time reviewing the research into cognitive performance and aging. The story is grim, for one thing: Memory’s speed and accuracy begin to slip around age 25 and keep on slipping.
The story is familiar, too, for anyone who is over 50 and, having finally learned to live fully in the moment, discovers it’s a senior moment. The finding that the brain slows with age is one of the strongest in all of psychology.
Illustration: Lisa Haney
Over the years, some scientists have questioned this dotage curve. But these challenges have had an ornery-old-person slant: that the tests were biased toward the young, for example. Or that older people have learned not to care about clearly trivial things, like memory tests. Or that an older mind must organize information differently from one attached to some 22-year-old who records his every Ultimate Frisbee move on Instagram.
Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases.
Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
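To make that idea concrete, here is a minimal sketch (my own toy illustration, with assumed numbers throughout; it is not the Tübingen group's actual model): word frequencies roughly follow Zipf's law, so a learner keeps meeting new words over a lifetime, and seventy years of linguistic input leaves far more vocabulary to search through than twenty.

```python
import random
from collections import Counter

# Toy sketch only: the vocabulary size, token counts, and exposure
# rate are all assumptions for illustration, not the paper's data.
random.seed(0)

VOCAB = 200_000  # assumed size of the language's available word stock
weights = [1.0 / rank for rank in range(1, VOCAB + 1)]  # Zipf: freq proportional to 1/rank

def distinct_words_seen(n_tokens):
    """Sample n_tokens of linguistic input and count the distinct words met."""
    tokens = random.choices(range(VOCAB), weights=weights, k=n_tokens)
    return len(Counter(tokens))

# Assume exposure grows with age: roughly 25,000 tokens per year of input.
print("20 years of input, distinct words:", distinct_words_seen(20 * 25_000))
print("70 years of input, distinct words:", distinct_words_seen(70 * 25_000))
```

The exact counts are arbitrary; the point is only that the longer exposure inevitably yields a larger mental "database" for any retrieval model to work over.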
“What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email. But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.”
Can it be? Digital tools have confounded predigital generations; now here they are, coming to the rescue. Or is it that younger scientists are simply pretesting excuses they can use in the future to cover their own golden-years lapses?
In fact, the new study is not likely to overturn 100 years of research, cognitive scientists say. Neuroscientists have some reason to believe that neural processing speed, like many reflexes, slows over the years; anatomical studies suggest that the brain also undergoes subtle structural changes that could affect memory.
Still, the new report will very likely add to a growing skepticism about how steep age-related decline really is. It goes without saying that many people remain disarmingly razor-witted well into their 90s; yet doubts about the average extent of the decline are rooted not in individual differences but in study methodology. Many studies comparing older and younger people, for instance, did not take into account the effects of pre-symptomatic Alzheimer’s disease, said Laura Carstensen, a psychologist at Stanford University.
Dr. Carstensen and others have found, too, that with age people become biased in their memory toward words and associations that have a positive connotation — the “age-related positivity effect,” as it’s known. This bias very likely applies when older people perform so-called paired-associate tests, a common measure that involves memorizing random word pairs, like ostrich and house.
“Given that most cognitive research asks participants to engage with neutral (and in emotion studies, negative) stimuli, the traditional research paradigm may put older people at a disadvantage,” Dr. Carstensen said by email.
The new data-mining analysis also raises questions about many of the measures scientists use. Dr. Ramscar and his colleagues applied leading learning models to an estimated pool of words and phrases that an educated 70-year-old would have seen, and another pool suitable for an educated 20-year-old. Their model accounted for more than 75 percent of the difference in scores between older and younger adults on items in a paired-associate test, he said.
That is to say, the larger the library you have in your head, the longer it usually takes to find a particular word (or pair).
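As a rough intuition pump for that claim (again a sketch of my own, with made-up vocabulary sizes rather than anything from Dr. Ramscar's paper), retrieval can be treated as discrimination among stored entries: in a bigger lexicon, any target word simply has more look-alike competitors that must be ruled out, so the search takes longer.

```python
import random

random.seed(1)

def build_lexicon(n_words, n_features=20, feature_pool=200):
    """Each 'word' is a random bundle of features; larger lexicons
    inevitably contain more near-neighbours of any given target."""
    return [frozenset(random.sample(range(feature_pool), n_features))
            for _ in range(n_words)]

def competitors(lexicon, target, overlap=6):
    """Count entries similar enough to the target to compete during
    retrieval; more competitors means a longer, noisier search."""
    return sum(1 for word in lexicon if len(word & target) >= overlap)

young = build_lexicon(20_000)   # assumed 20-year-old vocabulary size
older = build_lexicon(60_000)   # assumed 70-year-old vocabulary size

target = build_lexicon(1)[0]    # an arbitrary word to retrieve
print("competitors in the smaller lexicon:", competitors(young, target))
print("competitors in the larger lexicon:", competitors(older, target))
```

Run it and the larger lexicon turns up proportionally more competitors for the same target, which is the whole argument in miniature: slower lookup without any decline in the machinery doing the looking.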
Scientists who study thinking and memory often make a broad distinction between “fluid” and “crystallized” intelligence. The former includes short-term memory, like holding a phone number in mind, analytical reasoning, and the ability to tune out distractions, like ambient conversation. The latter is accumulated knowledge, vocabulary and expertise.
“In essence, what Ramscar’s group is arguing is that an increase in crystallized intelligence can account for a decrease in fluid intelligence,” said Zach Hambrick, a psychologist at Michigan State University. In a variety of experiments, Dr. Hambrick and Timothy A. Salthouse of the University of Virginia have shown that crystallized knowledge (as measured by New York Times crosswords, for example) climbs sharply between ages 20 and 50 and then plateaus, even as the fluid kind (like analytical reasoning) is dropping steadily — by more than 50 percent between ages 20 and 70 in some studies. “To know for sure whether the one affects the other, ideally we’d need to see it in human studies over time,” Dr. Hambrick said.
Dr. Ramscar’s report was a simulation and included no tested subjects, though he said he does have several memory studies with normal subjects on the way.
For the time being, this new digital-era challenge to “cognitive decline” can serve as a ready-made explanation for blank moments, whether senior or otherwise.
It’s not that you’re slow. It’s that you know so much.

Source: The New York Times
