Wednesday, 13 June 2012

Kids Today Are Not Inattentive

There's no evidence that children today are less attentive or more distractible than kids in the past, according to research just published by a team of Pennsylvania psychologists: Long-Term Temporal Stability of Measured Inattention and Impulsivity in Typical and Referred Children.


The study gave a large sample of kids the Gordon Diagnostic System (GDS), a test of sustained concentration. The GDS dates to the 1980s and consists of a box with a button and a display with three digits. There are three different tasks, but the main one is a test of sustained attention: watch a series of numbers and quickly press the button whenever a "1" is followed by a "9". Easy... but it takes concentration to do well.
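
To make the task concrete, here's a minimal sketch of how a "1-then-9" vigilance task of this kind might be scored. To be clear, this is not the GDS's own software or scoring algorithm; the digit stream, response format, and numbers are illustrative assumptions only.

```python
# Toy scoring sketch for a "press when a 1 is followed by a 9" vigilance task.
# This is NOT the Gordon Diagnostic System's actual software or algorithm;
# the digit stream, response format, and numbers here are illustrative only.

import random

def make_digit_stream(n_trials=540, seed=0):
    """Generate a random stream of single digits, one per trial."""
    rng = random.Random(seed)
    return [rng.randint(0, 9) for _ in range(n_trials)]

def score_vigilance(stream, responses):
    """Count hits and errors, given the trial indices where the button was pressed.

    A press at index i counts as a hit if the digit at i is a 9 preceded by a 1.
    """
    targets = {i for i in range(1, len(stream))
               if stream[i - 1] == 1 and stream[i] == 9}
    return {"targets": len(targets),
            "hits": len(targets & responses),
            "misses": len(targets - responses),
            "false_alarms": len(responses - targets)}

# Simulated child: catches every target except the last, plus one impulsive
# press at the very first digit (which can never be a target).
stream = make_digit_stream()
targets = sorted(i for i in range(1, len(stream))
                 if stream[i - 1] == 1 and stream[i] == 9)
responses = set(targets[:-1]) | {0}
print(score_vigilance(stream, responses))
```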

Over the period of 2000-2006, the researchers gave the GDS to 445 healthy American kids, not diagnosed with any learning or behavioural disorder and not taking medication. They compared their scores to the standardized norms - which were based on a sample of American kids back in 1983.
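
In outline, that comparison amounts to converting each child's raw score into an age-standardized score using the norm tables, then checking whether the modern sample's average sits near the norm mean. Here's a rough sketch of the arithmetic; the norm values and sample scores are invented for illustration and are not the real 1983 GDS norms.

```python
# Rough sketch of age-standardization against published norms.
# All norm means/SDs and sample scores below are invented for illustration;
# they are not the real 1983 GDS norms.

def z_score(raw, norm_mean, norm_sd):
    """Convert a raw score to a z-score relative to its age-group norm."""
    return (raw - norm_mean) / norm_sd

# Hypothetical norm table: age group -> (mean correct responses, SD)
norms_1983 = {"6-7": (35.0, 6.0), "8-9": (40.0, 5.0), "10-11": (43.0, 4.0)}

# Hypothetical modern sample: (age group, raw score) pairs
sample = [("6-7", 34), ("8-9", 41), ("10-11", 44), ("8-9", 39)]

zs = [z_score(raw, *norms_1983[age]) for age, raw in sample]
mean_z = sum(zs) / len(zs)
print(f"Mean z-score relative to the old norms: {mean_z:.2f}")
# A mean near zero means today's kids score about the same as the norm sample did.
```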

The results showed that today's kids scored pretty much the same, on average, as the 1983 kids. The average age-standardized scores were extremely close to the 1983 means, across the board. The study also included referred children: those diagnosed with ADHD, as expected, scored much worse. Oddly, kids with an Autism Spectrum Disorder did just as badly as the ADHD ones.

One of the researchers on this study is none other than Michael Gordon, who invented the GDS and, one assumes, makes money selling it. (Each GDS kit costs $1595, so someone is making a killing here.) So perhaps we should take this paper with a pinch of salt, because it's kind of an advertisement for the reliability of the GDS.

Still, these results seem pretty solid. That's good news for American children... but bad news for people like Professor Susan Greenfield, who thinks that the internet and videogames are causing an epidemic of ADHD, and all kinds of other problems.

These data indicate instead that, while ADHD diagnoses are certainly rising, children as a whole are not getting less attentive; the rise of ADHD looks more like a cultural shift.

Mayes, S., Gordon, M., Calhoun, S., & Bixler, E. (2012). Long-Term Temporal Stability of Measured Inattention and Impulsivity in Typical and Referred Children. Journal of Attention Disorders. DOI: 10.1177/1087054712448961

Sunday, 6 November 2011

Susan Greenfield's Dopamine Disaster

It's Susan Greenfield again.

Continuing her campaign warning of the dangers that modern technology poses to the vulnerable brains of the young, the British neuroscientist and Baroness has written another article. It is the latest of many; none of them have appeared in peer-reviewed academic journals.

This one's behind the Great Times Paywall so I can't link to it, but it's called Are video games taking away our identities?

The first part of the article is hard to argue with, because it's essentially a matter of opinion: either you'll agree with it or you won't. Personally, I find that videogames as Greenfield describes them bear little resemblance to any games I've played recently, and the same goes for her account of the Internet. But maybe this rings true for some:

Screen images do not depend for their impact on seeing one thing in terms of anything else. Their premium lies invariably in their raw sensory content... we are perhaps heading towards a much weaker sense of identity by engaging in a world where we are the passive recipient of senses and where there is no fixed narrative of past and future but an atomised thrill of the moment. One could even suggest that the constant self-centred readout on Twitter belies a more childlike insecurity, an existential crisis.

Greenfield then moves into discussing the brain, and this is where the science comes in. This is her "home turf" - she's Professor of physiology at Oxford. Yet it's a shambles.
There is one alarm bell ringing, which suggests that increasing 2D screen existence may be having undesirable effects: it is the threefold increase over the past decade in prescriptions for drugs for attention deficit hyperactivity disorder.
While this could be due to changes in doctors’ prescribing procedures, or indeed to a greater recognition and medicalisation of attentional problems, a third possibility could indeed be that if the young brain is exposed from the outset to a world of fast action-reaction, of instant new screen images flashing up with each press of a key, then such rapid interchange might lead to a shorter attention span.

The human condition can be basically divided into two alternating modes, first described by Euripedes... the rational “bread force”, characterised by a strong cognitive take on the world — a personalised past, present and future, in turn related to an active prefrontal cortex and lower levels of the brain chemical dopamine; and the “wine force”, more the state of young children or those adults indulging in “letting themselves go”, in situations perhaps involving wine, women and song, where a strong sensory environment demands less reflection, more passive reaction.
...An increase in physiological arousal can be linked to excessive release of dopamine. Could the screen experience be tilting this ancient balance in favour of the more infantile, senses-driven brain state?
Greenfield says that high dopamine and low prefrontal cortex activity are associated with irrationality and a deficit in attention. Video games, on her account, cause a flood of dopamine and thereby cause ADHD. That would make sense, if ADHD were caused by too much dopamine, and if drugs for ADHD reduced dopamine release.

The problem is that it's the exact opposite. Drugs for ADHD increase dopamine release and ADHD is widely believed (although it's controversial) to be caused by a dopamine deficit.

Greenfield then says "We know too that dopamine suppresses the activity of neurons in the prefrontal cortex", but this is a serious oversimplification. Dopamine has complex effects on target neurons. It can inhibit their firing, but it can also excite them; it all depends on the conditions. Here's what the authors of an influential scientific review said in 2004: "It is agreed by most researchers is that dopamine is a neuromodulator and is clearly not an excitatory or inhibitory neurotransmitter".

Some say that dopamine helps to "tune" the prefrontal cortex by increasing the signal-to-noise ratio: more signal, less noise. Here's one of the most cited papers about dopamine and the PFC: Cognitive deficit caused by regional depletion of dopamine in prefrontal cortex of rhesus monkey.
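
One cartoonish way to picture that "tuning" idea (this is a toy illustration of the signal-to-noise claim, not a model of real neurons, and not anything Greenfield or the cited papers actually computed): treat dopamine as a gain factor that amplifies the task-related signal more than the background noise.

```python
# Cartoon of "dopamine as gain/tuning" rather than simple suppression.
# The numbers and the gain rule are invented purely for illustration.

import random

def pfc_response(signal, noise_sd, dopamine_gain, rng):
    """Caricature of a firing rate: dopamine multiplies the task signal,
    while background noise stays roughly the same."""
    return dopamine_gain * signal + rng.gauss(0, noise_sd)

rng = random.Random(1)
signal, noise_sd, trials = 1.0, 0.5, 10_000

for gain in (0.5, 1.0, 2.0):
    responses = [pfc_response(signal, noise_sd, gain, rng) for _ in range(trials)]
    mean = sum(responses) / trials
    sd = (sum((r - mean) ** 2 for r in responses) / trials) ** 0.5
    print(f"gain={gain:.1f}  mean={mean:.2f}  signal-to-noise={mean / sd:.2f}")
# Higher "dopamine gain" raises the signal-to-noise ratio - quite different
# from dopamine simply shutting prefrontal neurons down.
```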

Remember that drugs for ADHD like Ritalin, which are sometimes used illicitly by students without that disorder to help them focus and concentrate, cause dopamine release. If Greenfield were right, they ought to have the exact opposite effect.

...[other] people characterised by an underactive prefrontal cortex are those with schizophrenia, this time not due to physical damage but rather a chemical imbalance, in particular an excessive amount of the transmitter dopamine. In schizophrenia, like children, the patient is easily distracted, cannot interpret proverbs, is not strong on metaphor but takes the world literally; it is a vibrant world that can implode on, and overwhelm, the fragile firewall of the schizophrenic mindset.
This, again, is a serious oversimplification. Actually, you don't need to be a neuroscientist to work that out. Just recall the earlier bit: Greenfield has said that ADHD is caused by too much dopamine leading to an underactive prefrontal cortex. Now she says that schizophrenia is the same. So why are the symptoms of ADHD completely different from those of schizophrenia?

Why is it, in fact, that Ritalin and similar dopamine releasing drugs help with ADHD, but can make schizophrenia worse?

As a neuroscientist, I can tell you that we don't really know what's going on with dopamine in ADHD or schizophrenia. There's decent evidence that dopamine is involved in schizophrenia, but not in any straightforward sense. Schizophrenia is now believed to be linked to reduced dopamine in the prefrontal cortex, and too much in other areas.

As for ADHD, remember: the leading theory is that it's about too little dopamine. Not too much.

The only disease we know for certain is associated with too little dopamine is Parkinson's. Contrary to Greenfield's theory, people with Parkinson's often have cognitive and mood problems as well as the better-known difficulties with movement. They're not super-intelligent, prefrontal-cortex-wielding geniuses.

I appreciate that an opinion piece in the Times is never going to be a rigorously argued scientific paper, but the fact that Greenfield's article contains several claims which are the exact opposite of the truth (or at least of current scientific thinking) calls her credibility into serious question.

Friday, 12 August 2011

Debating Greenfield


British neuroscientist Susan Greenfield regrets the recent controversy over certain of her remarks, and calls for a serious debate over "mind change" -
"Mind change" is an appropriately neutral, umbrella concept encompassing the diverse issues of whether and how modern technologies may be changing the functional state of the human brain, both for good and bad.
Very well, here goes. I wonder if Greenfield will reply.

As Greenfield points out, the human brain is plastic and interacts with the environment. Indeed, this is how we are able to learn and adapt to anything. Were our brains entirely unresponsive to what happens to them, we would have no memory and probably no behaviour at all.

The modern world is changing your brain, in other words.

However, the same is true of every other era. The Victorian era, the Roman Empire, the invention of agriculture - human brains were never the same after those came along.

Because the brain is where behaviour happens, any change in behaviour must be accompanied by a change in the brain. By talking about how behaviour changes, we will, implicitly, also be discussing the brain.

However, it doesn't work in reverse. Changes in the brain can't be assumed to mean changes in behaviour. Greenfield cites, for example, this paper, which purports to show reductions in the grey matter volume of certain areas of the cerebral cortex in Chinese students with internet addiction compared to those without.

The obvious comment here is that it doesn't prove causality, as it is only a correlation. Maybe the reason they got addicted was that they already had these brain differences.
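
To see how easily that can happen, here's a toy simulation, with entirely invented numbers, in which a pre-existing difference drives both heavy internet use and lower grey matter; the two end up strongly correlated even though internet use has no effect on the brain at all in this little world.

```python
# Toy simulation: a pre-existing trait drives BOTH heavy internet use and
# lower grey matter volume, so the two correlate despite internet use having
# zero causal effect on the brain here. All numbers are invented.

import random

rng = random.Random(42)
n = 1000

trait = [rng.gauss(0, 1) for _ in range(n)]                      # pre-existing difference
internet_hours = [5 + 2 * t + rng.gauss(0, 1) for t in trait]    # trait -> more use
grey_matter = [100 - 3 * t + rng.gauss(0, 2) for t in trait]     # trait -> less grey matter
# Note: internet_hours never appears in the grey_matter equation.

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(f"r(internet use, grey matter) = {pearson(internet_hours, grey_matter):.2f}")
# Strongly negative, even though internet use does not change grey matter here.
```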

However, there is a more subtle point. Even if these were a direct consequence of excessive internet use, it wouldn't mean that the internet use was changing behaviour.

We have no idea what a slight decrease in grey matter volume in the cerebellum, dorsolateral prefrontal cortex, and supplementary motor area would do to cognition and behaviour. It might not do anything.

My point here is that rather than worrying about the brain, we ought to focus on behaviour. Because that is also focussing on the brain, but it's focussing on the aspects of brain function that actually matter.

Greenfield then poses three questions.
1. Could sustained and often obsessive game-playing, in which actions have no consequences, enhance recklessness in real life?
It's possible that it could, although I don't think we do live in an especially reckless society, given that crime rates are lower now than they have been for 20 years.

However, the question assumes that game playing has no consequences. Yet in-game actions do have in-game consequences. To a non-gamer, these may not seem like consequences at all, because they're not real.

Yet in the game, they're perfectly real, and if you spend 12 hours a day playing that game, and all your friends do as well - you are going to care about that. Those consequences will matter, to you, and with luck, you'll learn not to be so impulsive in the future.

In World of Warcraft, for example, actions have all too many consequences. If you impulsively decide to attack an enemy in the middle of a raid, you could cause a wipe, which would, quite possibly, ruin everyone's evening and get you a reputation as an oaf.

Exactly as your reputation would suffer if you and your friends went for an evening at the opera, and you stood up in the middle and shouted a profanity. Ah, but that's real life, the response goes. Is it? Is a performance in which hundreds of people sit solemnly, while grown adults dress up and pretend to be singing gods and fairies on the instructions of a deceased anti-semite, any more real than this?
3. How can young people develop empathy if they conduct relationships via a medium which does not allow them the opportunity to gain full experience of eye contact, interpret voice tone or body language, and learn how and when to give and receive hugs?
I do not think that this accurately represents the experience of most children today. However, assuming that it were true, what would be the problem?

If everyone's relationships were conducted online, surely it would be more important to learn how to navigate the online world, than it would be to learn how to interpret body language, which (webcams aside), you would never see, or need to see.

If the brain is plastic and adapts to the environment, as Greenfield argues, then surely the fact that it is adapting to the information age is neither surprising nor concerning. If anything, we ought to be trying to help the process along, to make ourselves better adapted. It would be more worrying if it didn't adapt.

Some might be concerned by this. Surely, there is value in the old way of doing things, value that would be lost in the new era. Unless one can point to definite reasons why the new state of affairs is inherently worse than the old - not just different from it - it is hard to distinguish these concerns from the simple feeling of nostalgia over the past.

The same point could have equally well been made at any time in history. When our ancestors first settled down to farm crops, an early conservative might have lamented - "Young people today are growing up with no idea of how to stab a mammoth in the eye with a spear. All they know is how to plant, water and raise this new-fangled 'wheat'."

Monday, 8 August 2011

Susan Greenfield Causes Autism

British neuroscientist Susan Greenfield has caused a storm with her suggestion that the recent rise in the use of the internet and social media may be related to the recent rise in autism.
I point to the increase in autism and I point to internet use. That's all. Establishing a causal relationship is very hard but there are trends out there that we must think about.

This has led to fellow Oxford neuroscientist Dorothy Bishop of BishopBlog writing an Open Letter asking her to "please, please, stop talking about autism". Twitter has been enlivened by #greenfieldisms such as "I point to the rise of Rebecca Black and the Greek sovereign debt crisis, that is all."

However, in a Neuroskeptic exclusive, I can reveal that the situation is far worse than anyone feared. Greenfield is not merely spreading unwarranted speculations about the recent rise in autism diagnoses.

She caused that rise.

The graph above shows the total number of scientific citations for Susan Greenfield's papers, over time. This is as good a measure as any of the influence Greenfield has had over our culture.

The trend is obvious, the growth is dramatic, and the correlation with the modern autism epidemic is undeniable.

Tuesday, 10 March 2009

In Defense of Susan Greenfield

Baroness Susan Greenfield has been taking a lot of flak these past few days for her comments about Facebook and computers in general:
If the young brain is exposed from the outset to a world of fast action and reaction, of instant new screen images flashing up with the press of a key, such rapid interchange might accustom the brain to operate over such timescales. Perhaps when in the real world such responses are not immediately forthcoming, we will see such behaviours and call them attention-deficit disorder...
and
I often wonder whether real conversation in real time may eventually give way to these sanitised and easier screen dialogues, in much the same way as killing, skinning and butchering an animal to eat has been replaced by the convenience of packages of meat on the supermarket shelf
She's taken a lot of flak, and she fully deserves it. Her comments were ill-judged and they bring her position as head of the Royal Institution into disrepute. Her speculations about clinical diagnoses such as ADHD and autism were especially dubious.

Greenfield's statements also display the vacuous obsession with "The Brain" so common today - if she'd simply said that spending hours on the internet might plausibly make kids grow up anti-social, that would be fair enough, but she had to bring the brain into it (several times in her various comments). Hence the headlines to the effect that Facebook could change or damage the brain. Well, Facebook does change the brain - as does everything else - because every experience we have has an influence somewhere in the brain. I'm reminded of Vicky Tuck on boys' and girls' brains; Tuck, however, is not a neuroscientist. Greenfield should know better.

But despite all this, Baroness Greenfield does make an important point.
At the moment I think we're sleepwalking into these technologies and assuming that everything will shake down just fine
These are very wise words. As a society, we are in danger of "sleepwalking" into social and cultural changes which we may end up regretting. Profound changes in the way people live rarely happen overnight, and they are rarely presented to us as a choice that we can either accept or reject. Societies just change, over a span of decades, often without anyone noticing what is happening until the change has happened.

One of my favorite books is Bowling Alone by the sociologist Robert D. Putnam. Putnam assembled data from a wide range of sources to support his theory that a profound change took place in America over the years from about 1960 to 1990; namely, that Americans stopped participating in community life. Union membership, church attendance, charitable giving, league bowling, voter turnout, card-playing, and many other such statistics fell markedly over this period, after a high peak in the 1950s. Meanwhile, solitary or small-group activities such as TV watching, spectator sports, and so on, exploded. Over the span of 20 years or so, Americans lost interest in "the community" as a whole and turned to themselves and their immediate circle of friends and family. Putnam also makes a convincing case that this is, in many ways, a bad thing.

I doubt that Putnam's thesis is watertight; for all I know he may have cherry-picked the statistics that support his theory and ignored those that don't. It wouldn't be the first time someone has done that. Yet what's interesting about Bowling Alone is that even if Putnam's theory is only part of the truth, it's hard to deny that there's something in it - and yet it still took a book published in 2000 to bring it to people's attention. Putnam was writing about profound changes that every American will have felt to some degree. Yet these changes went unnoticed, or at least, few noticed that the various individual changes were part of a larger trend.

Putnam proposes various causes for the fragmentation of American community life, ranging from suburbanization to the increasing time pressures of work to that old favorite "the breakdown of the family". None of these were deliberate choices. Over 20 years or so America sleepwalked into a different way of life. This is hard to deny, even if you don't accept everything Putnam says. Baroness Greenfield, clearly, is no Robert Putnam. But her point about the dangers of sleepwalking is a sound one. Sleepwalking happens. It would be a pity if that message were to be lost in all the nonsense about Facebook and the brain.
