Sunday, January 18, 2015

Interfering With Traumatic Memories of the Boston Marathon Bombings

The Boston Marathon bombings of April 15, 2013 killed three people and injured hundreds of others near the finish line of the iconic footrace. The oldest and most prominent annual marathon in the world, the race attracts over 20,000 runners and 500,000 spectators. The terrorist act shocked, traumatized, and unified the city.

What should the survivors do with their traumatic memories of the event? Many with disabling post-traumatic stress disorder (PTSD) receive therapy to lessen the impact of the trauma. Should they forget completely? Is it possible to selectively “alter” or “remove” a specific memory? Studies in rodents are investigating the use of pharmacological manipulations (Otis et al., 2014) and behavioral interventions (Monfils et al., 2009) to disrupt the reconsolidation of a conditioned fear memory. Translating these interventions into clinically effective treatments in humans is an ongoing challenge.

The process of reconsolidation may provide a window for altering unwanted memories. When an old memory is retrieved, it enters a transiently labile state in which it's susceptible to change before being consolidated and stored again (Nader & Hardt, 2009). There's some evidence that the autonomic response to a conditioned fear memory can be lessened by an “updating” procedure during the reconsolidation period (Schiller et al., 2010).1 How this might apply to the recollection of personally experienced trauma memories is uncertain.

Remembering the Boston Bombings

Can you interfere with recall of a traumatic event by presenting competing information during the so-called reconsolidation window? A new study by Kredlow and Otto (2015) recruited 113 Boston University undergraduates who were in Boston on the day of the bombings. In the first testing session, participants wrote autobiographical essays recounting the details of their experience, prompted by specific questions. In principle, this procedure re-activated the traumatic memory, rendering it vulnerable to updating during the reconsolidation window (~6 hours).

The allotted time for the autobiographical essay was 4 min. After that, separate groups of subjects read either a neutral story, a negative story, or a positive story (for 5 min). A fourth group did not read a story. Presentation of a story that is not one's own would presumably “update” the personal memory of the bombings.

A second session occurred one week later. The participants were again asked to write an autobiographical essay for 4 min, under the same conditions as Session #1. They were also asked about their physical proximity to the bombings, whether they watched the marathon in person, feared for anyone's safety, and knew anyone who was injured or killed. Nineteen subjects were excluded for various reasons, leaving the final n=94.

One notable weakness is that we don't know anything about the mental health of these undergrads, except that they completed the 10-item Positive and Negative Affect Schedule, short form (PANAS-SF) before each session. And they were “provided with mental health resources” after testing (presumably links to resources, since the study was conducted online).

In terms of proximity, 10% of the participants were within one block of the bombings (“Criterion A” stressor), placing them at risk for developing PTSD. Most (95%) feared for someone's safety and 12% knew someone who was injured or killed (also considered Criterion A). But we don't know if anyone had a current or former PTSD diagnosis.

The authors predicted that reading the negative story during the “autobiographical reconsolidation window” would yield the greatest reduction in episodic details recalled from Session #1 (S1) to Session #2 (S2), relative to the No-Story condition. This is because the negative story and the horrific memories are both negative in valence [although I'm not sure what mechanism would account for this effect].2
Specifically, we hypothesized that learning a negative affective story during the reconsolidation window compared to no interference would interfere with the reconsolidation of memories of the Boston Marathon bombings. In addition, we expected the neutral and positive stories to result in some interference, but not as much as the negative story.

The essays were coded for the number of memory details recalled in S1 and S2 (by 3-5 raters3), and the main measure was the number of details recalled in S2 for each of the four conditions. Other factors taken into account were the number of words used in S1, and time between the Boston Marathon and the testing session (both of which influenced the number of details recalled).

The results are shown in Table 1 below. The authors reported comparisons between Negative Story vs. No Story (p<.05, d = 0.62), Neutral Story vs. No Story (p=.20, d = 0.39), and Positive Story vs. No Story (p=.83, d = 0.06). The effect sizes are “medium-ish” for both the Negative and Neutral comparisons, but only “significant” for Negative.

I would argue that the comparison between Negative Story and Neutral Story, which was not reported, is the only way to evaluate the valence aspect of the prediction, i.e., whether the reduction in details recalled was specific to reading a negative story or would follow from reading any story at all. I'm also not sure why they didn't run an omnibus ANOVA across all four conditions in the first place.
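To make the two statistics at issue concrete, here is a minimal, self-contained sketch of a pooled-SD Cohen's d (for the pairwise comparisons the authors reported) and the omnibus one-way ANOVA F across all four conditions. The scores below are invented for illustration; they are NOT the study's data, though the group pattern loosely echoes the reported one.

```python
# Hypothetical sketch: pooled-SD Cohen's d and a one-way ANOVA F statistic.
from statistics import mean, variance

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

def one_way_anova_F(*groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Made-up "details recalled at S2" scores for the four conditions:
negative = [10, 12, 11, 9, 13, 10]
neutral  = [13, 14, 12, 15, 13, 14]
positive = [15, 14, 16, 15, 13, 16]
no_story = [15, 16, 14, 15, 16, 14]

# The comparison the post argues is missing: Negative vs. Neutral.
print(f"Negative vs. Neutral: d = {cohens_d(negative, neutral):.2f}")
# The omnibus test that would precede any pairwise comparisons:
print(f"One-way ANOVA: F = {one_way_anova_F(negative, neutral, positive, no_story):.2f}")
```

The omnibus F answers “do the four conditions differ at all?” before any pairwise test; reporting only selected pairwise comparisons against No Story skips that step.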

Nonetheless, Kredlow and Otto (2015) suggest that their study...
...represent[s] a step toward translating reconsolidation interference work to the clinic, as, to our knowledge, no published studies to date have examined nonpharmacological reconsolidation interference for clinically-relevant negative memories. Additional studies should examine reconsolidation interference paradigms, such as this one, in clinical populations.

If this work were indeed extended to clinical populations, I would suggest conducting the study under more controlled conditions (in the lab, not online), which would also allow close monitoring of any distress elicited by writing the autobiographical essay (essentially a symptom provocation design). As the authors acknowledge, it would be especially important to evaluate not only the declarative, detail-oriented aspects of the traumatic memories, but also any change in their emotional impact.

Further Reading

Brief review of memory reconsolidation

Media’s role in broadcasting acute stress following the Boston Marathon bombings

Autobiographical Memory for a Life-Threatening Airline Disaster

I Forget...


1 But this effect hasn't replicated in other studies (e.g., Golkar et al., 2012).

2 Here, the authors say:
...some degree of similarity between the original memory and interference task may be required to achieve interference effects. This is in line with research suggesting that external and internal context is an important factor in extinction learning and may also be relevant to reconsolidation. As such, activating the affective context in which a memory was originally consolidated may facilitate reconsolidation interference.
This is a very different strategy than the “updating of fear memories” approach, where a safety signal occurs before extinction. But conditioned fear (blue square paired with mild shock) is very different from episodic memories of a bombing scene.

3 Details of the coding system:
A group consensus coding system was used to code the memories. S1 and S2 memory descriptions for each participant were compared and coded for recall of memory details. One point was given for each detail from the S1 memory description that was recalled in the S2 memory description. Each memory pair was coded by between three to five raters until a consensus between three raters was reached. Raters were blind to participant randomization, but not to each other's ratings. Consensus was reached in 83% of memory pairs.


Kredlow MA, & Otto MW (2015). Interference with the reconsolidation of trauma-related memories in adults. Depression and Anxiety, 32 (1), 32-7. PMID: 25585535

Monfils MH, Cowansage KK, Klann E, LeDoux JE. (2009). Extinction-reconsolidation boundaries: key to persistent attenuation of fear memories. Science 324:951-5.

Nader K, Hardt O. (2009). A single standard for memory: the case for reconsolidation. Nat Rev Neurosci. 10:224-34.

Otis JM, Werner CT, Mueller D. (2014). Noradrenergic Regulation of Fear and Drug-Associated Memory Reconsolidation. Neuropsychopharmacology. [Epub ahead of print]

Schiller D, Monfils MH, Raio CM, Johnson DC, Ledoux JE, & Phelps EA (2010). Preventing the return of fear in humans using reconsolidation update mechanisms. Nature 463: 49-53.


Saturday, January 10, 2015

The Incredible Growing Brain!

The Incredible Grow Your Own Brain (Barron Bob)

Using super-absorbent material from disposable diapers, MIT neuroengineers Ed Boyden, Fei Chen, and Paul Tillberg went well beyond the garden-variety novelty store "Grow Brain" to expand real brain slices to nearly five times their normal size.

Boyden, E., Chen, F. & Tillberg, P. / MIT / Courtesy of NIH

A slice of a mouse brain (left) was expanded nearly five-fold in each dimension by adding a water-absorbing salt. The result — shown at smaller magnification (right) for comparison — has essentially unchanged anatomical structures. (Nature - E. Callaway)

As covered by Ewen Callaway in Nature:
Blown-up brains reveal nanoscale details

Material used in diaper absorbent can make brain tissue bigger and enable ordinary microscopes to resolve features down to 60 nanometres.

Microscopes make living cells and tissues appear bigger. But what if we could actually make the things bigger?

It might sound like the fantasy of a scientist who has read Alice’s Adventures in Wonderland too many times, but the concept is the basis for a new method that could enable biologists to image an entire brain in exquisite molecular detail using an ordinary microscope, and to resolve features that would normally be beyond the limits of optics.

The technique, called expansion microscopy, involves physically inflating biological tissues using a material more commonly found in baby nappies (diapers).

. . .

“What we’ve been trying to do is figure out if we can make everything bigger,” Boyden told the meeting at the NIH in Bethesda, Maryland. To manage this, his team used a chemical called acrylate that has two useful properties: it can form a dense mesh that holds proteins in place, and it swells in the presence of water.

Sodium polyacrylate (via Leonard Gelfand Center, CMU)

Acrylate, a type of salt also known as waterlock, is the substance that gives nappies their sponginess. When inflated, Boyden's tissues grow about 4.5 times in each dimension.

Just add water

Before swelling, the tissue is treated with a chemical cocktail that makes it transparent, and then with the fluorescent molecules that anchor specific proteins to the acrylate, which is then infused into tissue. Just as with nappies, adding water causes the acrylate polymer to swell. After stretching, the fluorescent-tagged molecules move further away from each other; proteins that were previously too close to distinguish with a visible-light microscope come into crisp focus. In his NIH presentation, Boyden suggested that the technique can resolve molecules that had been as close as 60nm before expansion.
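The resolution claim follows from simple arithmetic: expanding the tissue by a linear factor divides the effective resolution limit of a conventional light microscope by that same factor. A quick back-of-the-envelope check, where the ~300 nm diffraction limit is an illustrative round number rather than a figure from the presentation:

```python
# Back-of-the-envelope: how physical expansion beats the diffraction limit.
diffraction_limit_nm = 300   # rough resolution limit of visible-light microscopy (assumed)
expansion_factor = 4.5       # linear expansion reported for the technique

# Features separated by d before expansion sit at d * 4.5 afterward,
# so the smallest pre-expansion separation a diffraction-limited scope
# can resolve is the limit divided by the expansion factor.
effective_resolution = diffraction_limit_nm / expansion_factor
print(f"Effective pre-expansion resolution: ~{effective_resolution:.0f} nm")  # ~67 nm
```

That lands right around the ~60 nm figure Boyden quoted, which is why an ordinary microscope suddenly becomes a nanoscale instrument.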

Most scientists thought it was cool, but there were some naysayers: “This is certainly highly ingenious, but how much practical use it will be is less clear,” notes Guy Cox, a microscopy specialist at the University of Sydney, Australia.

Others saw nothing new with the latest brain-transforming gimmick. Below, Marc Schuster displays his 2011 invention, the inflatable brain.

“An inflatable brain makes a great prop for your Zombie Prom King costume,” says Schuster, author of The Grievers.

Link via Roger Highfield.


Friday, January 02, 2015

The Futility of Progesterone for Traumatic Brain Injury (but hope for the future)

Traumatic Brain Injury (TBI) is a serious public health problem that affects about 1.5 million people per year in the US, with direct and indirect medical costs of over $50 billion. Rapid intervention to reduce the risk of death and disability is crucial. The diagnosis and treatment of TBI is an active area of preclinical and clinical research funded by NIH and other federal agencies.

But during the White House BRAIN Conference, a leading neurosurgeon painted a pessimistic picture of current treatments for acute TBI. In response to a question about clinical advances based on cellular neurobiology, Dr. Geoffrey Manley noted that the field is on its 32nd or 33rd failed clinical trial. The termination of a very promising trial of progesterone for TBI had just been announced (the ProTECT III Phase III clinical trial, “based on 17 years of work with 200 positive papers in preclinical models”), although I couldn't find any notice at the time (Sept. 30, 2014).

Now, the results from ProTECT III have been published in the New England Journal of Medicine (Wright et al., 2014). 882 TBI patients from 49 trauma centers were enrolled in the study and randomized to receive progesterone, thought to be a neuroprotective agent, or placebo within 4 hours of major head injury. The severity of TBI fell in the moderate to severe range, as indicated by scores on the Glasgow Coma Scale (which rates the degree of impaired consciousness).

The primary outcome measure was the Extended Glasgow Outcome Scale (GOS-E) at six months post-injury. The trial was stopped at 882 patients (out of a planned 1140) because the interim analysis showed that progesterone had essentially no chance of improving outcomes:
After the second interim analysis, the trial was stopped because of futility. For the primary hypothesis comparing progesterone with placebo, favorable outcomes occurred in 51.0% of patients assigned to progesterone and in 55.5% of those assigned to placebo. 

Analysis of subgroups by race, ethnicity, and injury severity showed no differences between them, but there was a suggestive (albeit non-significant) sex difference.


Modified from Fig. 2 (Wright et al., 2014). Adjusted Relative Benefit in Predefined Subgroups. Note the red box p value for sex differences.

Squares to the left of the dotted line indicate that placebo performed better than progesterone in a given patient group, while values to the right favor progesterone. The error bars show confidence intervals, which indicate that nearly all groups overlap with 0 (representing zero benefit for progesterone). The red box indicates a near-significant difference between men and women, with women actually faring worse with progesterone than with placebo. You may quibble about conventional significance, but women on average deteriorated with treatment, while men were largely unaffected.
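The futility call becomes concrete with a little arithmetic on the reported proportions. This is a minimal sketch assuming an even randomization split (~441 per arm, which is not stated above; the exact per-arm numbers are in Wright et al., 2014), using a normal-approximation 95% confidence interval for the difference in favorable-outcome rates:

```python
# Sketch: 95% CI for the difference in favorable-outcome proportions,
# progesterone vs. placebo, assuming ~441 patients per arm (an assumption).
from math import sqrt

p_prog, p_plac = 0.510, 0.555   # favorable outcomes reported in ProTECT III
n_prog = n_plac = 441           # assumed per-arm sample sizes (882 / 2)

diff = p_prog - p_plac
# Standard error of a difference between two independent proportions:
se = sqrt(p_prog * (1 - p_prog) / n_prog + p_plac * (1 - p_plac) / n_plac)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"Risk difference: {diff:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The interval straddles zero and leans toward placebo, so continuing to the planned 1140 patients had essentially no prospect of demonstrating a progesterone benefit.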

This was a highly disappointing outcome for a well-conducted study that built on promising results in smaller Phase II Clinical Trials (which were backed by a boatload of preclinical data). The authors reflect on this gloomy state of affairs:
The PROTECT III trial joins a growing list of negative or inconclusive trials in the arduous search for a treatment for TBI. To date, more than 30 clinical trials have investigated various compounds for the treatment of acute TBI, yet no treatment has succeeded at the confirmatory trial stage. Many reasons for the disappointing record of translating promising agents from the laboratory to the clinic have been postulated, including limited preclinical development work, poor drug penetration into the brain, delayed initiation of treatment, heterogeneity of injuries, variability in routine patient care across sites, and insensitive outcome measures.

If that isn't enough, a second failed trial of progesterone was published in the same issue of NEJM (Skolnick et al., 2014). This group reported on negative results from an even larger pharma-funded trial (SyNAPse, which is the tortured acronym for Study of a Neuroprotective Agent, Progesterone, in Severe Traumatic Brain Injury). The SyNAPse trial enrolled the projected number of 1180 patients across 21 countries, all with severe TBI. The percentage of patients with favorable outcomes at six months was 50.4% in the progesterone group and 50.5% in the placebo group.
The negative result of this study, combined with the results of the PROTECT III trial, should stimulate a rethinking of procedures for drug development and testing in TBI.

This led Dr. Lee H. Schwamm (2014) to expound on the flawed culture of research in an Editorial, invoking the feared god of false positive findings (Ioannidis, 2005) and his minions: small effect sizes, small n's, too few studies, flexibility of analysis, and bias. Schwamm pointed to problematic aspects of the Phase II Trials that preceded ProTECT III and SyNAPse, including modest effect sizes and better-than-expected outcomes in the placebo group.

Hope for the Future

“And you have to give them hope.”
--Harvey Milk

When the going gets tough in research, who better to rally the troops than your local university press office? The day after Dr. Manley's presentation at the BRAIN conference on Sept. 30, the University of California San Francisco issued this optimistic news release:

$17M DoD Award Aims to Improve Clinical Trials for Traumatic Brain Injury

An unprecedented, public-private partnership funded by the Department of Defense (DoD) is being launched to drive the development of better-run clinical trials and may lead to the first successful treatments for traumatic brain injury, a condition affecting not only athletes and members of the military, but also millions among the general public, ranging from youngsters to elders.

Under the partnership, officially launched Oct. 1 with a $17 million, five-year award from the DoD, the research team, representing many universities, the Food and Drug Administration (FDA), companies and philanthropies, will examine data from thousands of patients in order to identify effective measures of brain injury and recovery, using biomarkers from blood, new imaging equipment and software, and other tools.
. . .

“TBI is really a multifaceted condition, not a single event,” said UCSF neurosurgeon Geoffrey T. Manley, MD, PhD, principal investigator for the new award... “TBI lags 40 to 50 years behind heart disease and cancer in terms of progress and understanding of the actual disease process and its potential aftermath. More than 30 clinical trials of potential TBI treatments have failed, and not a single drug has been approved.”

The TED (TBI Endpoints Development) Award is meant to accelerate research to improve TBI diagnostics, classification, and patient selection for clinical trials. Quite a reversal of fortune in one day.

Out of the ashes of two failed clinical trials, a phoenix arises. Hope for TBI patients and their families takes wing.

Further Reading (and viewing)

White House BRAIN Conference (blog post)

90 min video of the conference

Brief Storify (summary of the conference); listings for SyNAPSe and ProTECT III.


Schwamm, L. (2014). Progesterone for Traumatic Brain Injury — Resisting the Sirens' Song. New England Journal of Medicine, 371 (26), 2522-2523. DOI: 10.1056/NEJMe1412951

Skolnick, B., Maas, A., Narayan, R., van der Hoop, R., MacAllister, T., Ward, J., Nelson, N., & Stocchetti, N. (2014). A Clinical Trial of Progesterone for Severe Traumatic Brain Injury. New England Journal of Medicine, 371 (26), 2467-2476. DOI: 10.1056/NEJMoa1411090

Wright, D., Yeatts, S., Silbergleit, R., Palesch, Y., Hertzberg, V., Frankel, M., Goldstein, F., Caveney, A., Howlett-Smith, H., Bengelink, E., Manley, G., Merck, L., Janis, L., & Barsan, W. (2014). Very Early Administration of Progesterone for Acute Traumatic Brain Injury. New England Journal of Medicine, 371 (26), 2457-2466 DOI: 10.1056/NEJMoa1404304


Thursday, December 25, 2014

Eliciting Mirth and Laughter via Cortical Stimulation

Ho ho ho!

“Laughter consists of both motor and emotional aspects. The emotional component, known as mirth, is usually associated with the motor component, namely, bilateral facial movements.”

-Yamao et al. (2014)

The subject of laughter has come under increasing scientific scrutiny. A recent review by Dr. Sophie Scott and colleagues (Scott et al., 2014) emphasized that laughter is a social emotion. During conversations, voluntary laughter by the speaker is a communicative act. This contrasts with involuntary laughter, which is elicited by external events like jokes and funny behavior.

One basic idea about the neural systems involved in the production of laughter relies on this dual process theme:
The coordination of human laughter involves the periaqueductal grey [PAG] and the reticular formation [RF], with inputs from cortex, the basal ganglia, and the hypothalamus. The hypothalamus is more active during reactive laughter than during voluntary laughter. Motor and premotor cortices are involved in the inhibition of the brainstem laughter centres and are more active when suppressing laughter than when producing it.

Figure 1 (Scott et al., 2014). Voluntary and involuntary laughter in the brain.

An earlier paper on laughter and humor focused on neurological conditions such as pathological laughter and gelastic epilepsy (Wild et al., 2003). In gelastic epilepsy, laughter is the major symptom of a seizure. These gelastic (“laughing”) seizures usually originate from the temporal poles, the frontal poles, or from benign tumors in the hypothalamus (Wild et al., 2003). Some patients experience these seizures as pleasant (even mirthful), while others do not:
During gelastic seizures, some patients report pleasant feelings which include exhilaration or mirth. Other patients experience the attacks of laughter as inappropriate and feel no positive emotions during their laughter. It has been claimed that gelastic seizures originating in the temporal regions involve mirth but that those originating in the hypothalamus do not. This claim has been called into question, however...

In their extensive review of the literature, Wild et al. (2003) concluded that the “laughter‐coordinating centre” must lie in the dorsal midbrain, with intimate connections to PAG and RF. Together, this system may comprise the “final common pathway” for laughter (i.e., coordinating changes in facial muscles, respiration, and vocalizations). During emotional reactions, prefrontal cortex, basal temporal cortex, the hypothalamus, and the basal ganglia transmit excitatory inputs to the PAG and RF, which in turn generate laughter.

Can direct cortical stimulation produce laughter and mirth?

It turns out that the basal temporal cortex (wearing a Santa hat above) plays a surprising role in the generation of mirth, at least according to a recent paper by Yamao et al. (2014). Over a period of 13 years, they recorded neural activity from the cortical surface of epilepsy patients undergoing seizure monitoring, with the purpose of localizing the aberrant epileptogenic tissue. They enrolled 13 patients with implanted subdural grids to monitor for left temporal lobe seizures, and identified induced feelings of mirth in two patients (resulting from electrical stimulation in specific regions).

Obviously, this is not the typical way we feel amusement and utter guffaws of delight, but direct stimulation of the cortical surface goes back to Wilder Penfield as a way for neurosurgeons to map the behavioral functions of the brain. Of particular interest is the localization of language-related cortex that should be spared from surgical removal if at all possible.

The mirth-inducing region (Yamao et al., 2014) encompasses what is known as the basal temporal language area (BTLA), first identified by Lüders and colleagues in 1986. The region includes the left fusiform gyrus, about 3-7 cm from the tip of the temporal lobe. Stimulation at high intensities produces total speech arrest (inability to speak) and global language comprehension problems. Low stimulation intensity produces severe anomia, an inability to name things (or places or people). Remarkably, however, Lüders et al. (1991) found that “Surgical resection of the basal temporal language area produces no lasting language deficit.”

With this background in mind, let's look at the results from the mirthful patients. The locations of induced mirth (shown below) are the white circle in Patient 1 and the black circles in Patient 2. In comparison, the locations of stimulation-induced language impairment are shown as diamonds. Note, however, that mirth was co-localized with language impairment in Patient 2.

Fig. 1 (modified from Yamao et al., 2014). The results of high-frequency electrical cortical stimulation. “Mirth” (circles) and “language” (diamonds) electrodes are shown in white and black colors for Patients 1 and 2, respectively. Note that mirth was elicited at or adjacent to the electrode associated with language impairment.  R = right side. The view is of the bottom of the brain.

How do the authors interpret this finding?
...the ratio of electrodes eliciting language impairment was higher for the mirth electrodes than in no-mirth electrodes, suggesting an association between mirth and language function. Since the BTLA is actively involved in semantic processing (Shimotake et al., 2014 and Usui et al., 2003), this semantic/language area was likely involved in the semantic aspect of humor detection in our cases.

Except there was no external humor to detect, as the laughter and feelings of mirth were spontaneous. After high-frequency stimulation, one patient reported, “I do not know why, but something amused me and I laughed.” The other patient said, “A familiar melody that I had heard in a television program in my childhood came to mind; its tune sounded funny and amused me.”

The latter description sounds like memory-induced nostalgia or reminiscence, which can occur with electrical stimulation of the temporal lobe (or TL seizures). But most of the relevant stimulation sites for those déjà vu-like experiences are not in the fusiform gyrus, which has been mostly linked to higher-level visual processing.

The authors also found that stimulation of the left hippocampus consistently caused contralateral (right-sided) facial movement that led to laughter.

I might have missed it, but one thing we don't know is whether stimulation of the right fusiform gyrus would have produced similar effects. Another thing to keep in mind is that these little circles are only one part of a larger system (see Scott et al. figure above). Presumably, the stimulated BTLA sites send excitatory projections to PAG and RF, which initiate laughter. But where is mirth actually represented, if you can feel amused and laugh for no apparent reason? By bypassing higher-order regions1, laughter can be a surprising and puzzling experience.


1 Like, IDK, maybe ventromedial PFC, other places in both frontal lobes, hypothalamus, basal ganglia, and more "classically" semantic areas in the left temporal lobe...

Link originally via @Neuro_Skeptic.


Lüders, H., Lesser, R., Hahn, J., Dinner, D., Morris, H., Wyllie, E., & Godoy, J. (1991). Basal temporal language area. Brain, 114 (2), 743-754. DOI: 10.1093/brain/114.2.743

Scott, S., Lavan, N., Chen, S., & McGettigan, C. (2014). The social life of laughter. Trends in Cognitive Sciences, 18 (12), 618-620. DOI: 10.1016/j.tics.2014.09.002

Wild, B., et al. (2003). Neural correlates of laughter and humour. Brain, 126 (10), 2121-2138. DOI: 10.1093/brain/awg226

Yamao, Y., Matsumoto, R., Kunieda, T., Shibata, S., Shimotake, A., Kikuchi, T., Satow, T., Mikuni, N., Fukuyama, H., Ikeda, A., & Miyamoto, S. (2014). Neural correlates of mirth and laughter: A direct electrical cortical stimulation study. Cortex. DOI: 10.1016/j.cortex.2014.11.008


Sunday, December 21, 2014

Go to Bed Early and Cure Your Negative Ruminations!

Source: Alyssa L. Miller, Flickr.

For nearly 9 years, this blog has been harping on the blight of overblown press releases, with posts like:

Irresponsible Press Release Gives False Hope to People With Tourette's, OCD, and Schizophrenia

Press Release: Press Releases Are Prestidigitation

New research provides fresh evidence that bogus press releases may depend largely on our biological make-up

Save Us From Misleading Press Releases


So it was heartening to see a team of UK researchers formally evaluate the content of 462 health-related press releases issued by leading universities in 2011 (Sumner et al., 2014). They classified three types of exaggerated claims and found that 40% of the press releases contained exaggerated health advice, 33% made causal statements based on correlational results, and 36% extrapolated from animal research to humans.

A fine duo of exaggerated health advice and causal statements based on correlational results recently caught my eye. Here's a press release issued by Springer, the company that publishes Cognitive Therapy and Research:

Don’t worry, be happy: just go to bed earlier

When you go to bed, and how long you sleep at a time, might actually make it difficult for you to stop worrying. So say Jacob Nota and Meredith Coles of Binghamton University in the US, who found that people who sleep for shorter periods of time and go to bed very late at night are often overwhelmed with more negative thoughts than those who keep more regular sleeping hours.

The PR issues health advice (“just go to bed earlier”) based on correlational data: “people who sleep for shorter periods of time and go to bed very late at night are often overwhelmed with more negative thoughts.” But does staying up late cause you to worry, or do worries keep you awake at night? A survey can't distinguish between the two.

The study by Nota and Coles (2014) recruited 100 teenagers (or near-teenagers, mean age = 19.4 ± 1.9) from the local undergraduate research pool. They filled out a number of self-report questionnaires that assessed negative affect, sleep quality, chronotype (morning person vs. evening person), and aspects of repetitive negative thinking (RNT).

RNT is a transdiagnostic construct that encompasses symptoms typical of depression (rumination), anxiety (worry), and obsessive-compulsive disorder (obsessions). Thus, the process of RNT is considered similar across the disorders, but the content may differ. The undergraduates were not clinically evaluated so we don't know if any of them actually had the diagnoses of depression, anxiety, and/or OCD. But one can look at whether the types of symptoms that are endorsed (whether clinically relevant or not) are related to sleep duration and timing. Which is what the authors did.

Shorter sleep duration and a later bedtime were indeed associated with more RNT. However, after accounting for levels of negative affect, the sleep variables no longer showed a significant correlation. Not a completely overwhelming relationship, then.
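“Accounting for negative affect” here means something like a first-order partial correlation: the sleep-RNT association that remains once the shared relationship with negative affect is removed. A sketch with made-up r values (NOT figures from Nota & Coles, 2014) shows how a modest zero-order correlation can collapse toward zero:

```python
# First-order partial correlation: r between x and y with z partialed out.
from math import sqrt

def partial_r(r_xy, r_xz, r_yz):
    """Correlation of x and y after removing z's contribution to both."""
    return (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical values, chosen only to illustrate the pattern:
r_sleep_rnt    = -0.25  # shorter sleep ~ more repetitive negative thinking
r_sleep_affect = -0.40  # shorter sleep ~ more negative affect
r_affect_rnt   =  0.60  # more negative affect ~ more RNT

print(f"zero-order r = {r_sleep_rnt:.2f}")
print(f"partial r (controlling negative affect) = "
      f"{partial_r(r_sleep_rnt, r_sleep_affect, r_affect_rnt):.2f}")
```

With these illustrative numbers the partial correlation shrinks to essentially zero, which is the pattern the study reported: the sleep-RNT link is largely carried by negative affect.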

But as expected, the night owls reported more RNT than the non-night owls. 

Here's how the findings were interpreted in the Springer press release and, conspicuously, by the authors themselves (a pattern also observed in the study by Sumner et al., 2014). Note the exaggerated health advice and the causal statements based on correlational results.

“Making sure that sleep is obtained during the right time of day may be an inexpensive and easily disseminable intervention for individuals who are bothered by intrusive thoughts,” remarks Nota.

The findings also suggest that sleep disruption may be linked to the development of repetitive negative thinking. Nota and Coles therefore believe that it might benefit people who are at risk of developing a disorder characterized by such intrusive thoughts to focus on getting enough sleep.

“If further findings support the relation between sleep timing and repetitive negative thinking, this could one day lead to a new avenue for treatment of individuals with internalizing disorders,” adds Coles. “Studying the relation between reductions in sleep duration and psychopathology has already demonstrated that focusing on sleep in the clinic also leads to reductions in symptoms of psychopathology.”

As they mentioned, we already know that many psychiatric disorders are associated with problematic sleep, and that improved sleep is helpful in these conditions. Recommending that people suffering from debilitating and uncontrollable intrusive thoughts “just go to bed earlier” isn't particularly helpful. Not only that, such advice can be downright irritating.

Here's a news story from Yahoo that plays up the “sleep reduces worry” causal relationship even more:
This Sleep Tweak Could Help You Worry Less

Can the time you hit the hay actually influence the types of thoughts you have? Science says yes.

Are you a chronic worrier? The hour you’re going to sleep, and how much sleep you’re getting overall, may exacerbate your anxiety, according to a new study published in the journal Cognitive Therapy and Research.

The great news here? By tweaking your sleep habits you could actually help yourself worry less. Really.

Great! So internal monologues of self-loathing (“I'm a complete failure”, “No one likes me”) and deep anxiety about the future (“My career prospects are dismal”, “I worry about my partner's terrible diagnosis”) can be cured by going to bed earlier!

Even if you could forcibly alter your chronotype (and I don't know if this is possible), what do you do when you wake up in the middle of the night haunted by your repetitive negative thoughts?

Further Reading

Alexis Delanoir on the RNT paper and much more in Depression And Stress/Mood Disorders: Causes Of Repetitive Negative Thinking And Ruminations

Scicurious, with an amusingly titled piece: This study of hype in press releases will change journalism


1 Chronotype was dichotomously classified as evening type vs. moderately morning-type / neither type (not a lot of early birds, I guess). And only 75 students completed questionnaires in this part of the study.

2 It's notable that the significance level for these correlations was not corrected for multiple comparisons in the first place.


Nota, J., & Coles, M. (2014). Duration and Timing of Sleep are Associated with Repetitive Negative Thinking. Cognitive Therapy and Research DOI: 10.1007/s10608-014-9651-7

Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Venetis, C., Davies, A., Ogden, J., Whelan, L., Hughes, B., Dalton, B., Boy, F., & Chambers, C. (2014). The association between exaggeration in health related science news and academic press releases: retrospective observational study. BMJ, 349, g7015. DOI: 10.1136/bmj.g7015


Monday, December 08, 2014

Hipster Neuroscience

According to Urban Dictionary,
Hipsters are a subculture of men and women typically in their 20's and 30's that value independent thinking, counter-culture, progressive politics, an appreciation of art and indie-rock, creativity, intelligence, and witty banter.  ...  Hipsters reject the culturally-ignorant attitudes of mainstream consumers, and are often be seen wearing vintage and thrift store inspired fashions, tight-fitting jeans, old-school sneakers, and sometimes thick rimmed glasses.

by Trey Parasuco November 22, 2007 

Makes them sound so cool. But we all know that everyone loves to complain about hipsters and the endless lifestyle/culture/fashion pieces written about them.

And they're so conformist in their nonconformity.

Recently, Jonathan Touboul posted a paper at arXiv to model The hipster effect: When anticonformists all look the same:
The hipster effect is this non-concerted emergent collective phenomenon of looking alike trying to look different. Uncovering the structures behind this apparent paradox ... can have implications in deciphering collective phenomena in economics and finance, where individuals may find an interest in taking positions in opposition to the majority (for instance, selling stocks when others want to buy). Applications also extend to the case of neuronal networks with inhibition, where neurons tend to fire when others are silent, and reciprocally.

You can find great write-ups of the paper at Neuroecology and the Washington Post:
There are two kinds of people in this world: those who like to go with the flow, and those who do the opposite — hipsters, in other words. Over time, people perceive what the mainstream trend is, and either align themselves with it or oppose it.

What if this world contained equal numbers of conformists and hipsters? No matter how the population starts out, it will end up in some kind of cycle, as the conformists try to catch up to the hipsters, and the hipsters try to differentiate themselves from the conformists.
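The cycle described above can be illustrated with a toy simulation. This is a deliberately reduced, all-hipster version of the idea (not Touboul's actual delayed-dynamics model; the agent count, delay, and update rule are arbitrary choices for illustration): each anticonformist reacts to a slightly stale snapshot of the crowd and switches to whatever the majority is not wearing.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, delay = 100, 20, 1   # made-up parameters
state = rng.integers(0, 2, n)  # each hipster wears style 0 or 1
history = [state.copy()]

for _ in range(steps):
    # Everyone perceives the majority as it looked `delay` steps ago...
    perceived = history[-delay]
    majority = int(perceived.mean() >= 0.5)
    # ...and, being an anticonformist, switches to the opposite style.
    state = np.full(n, 1 - majority)
    history.append(state.copy())

# After one step every anticonformist looks identical, and the whole
# crowd then flips styles in lockstep, period 2.
```

The delay is what produces the synchrony: because every hipster reacts to the same stale snapshot of the majority, their "opposite" choices coincide, which is the apparent paradox of the paper's title.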

But there aren't equal numbers of conformists and hipsters. And this type of cycle doesn't apply to neuroscience research, which is always moving forward in terms of trends and technical advances (right)?

It may be the Dream of the 1890s in Portland, but it's BRAIN 2015 all the way (RFA-MH-15-225):

BRAIN Initiative: Development and Validation of Novel Tools to Analyze Cell-Specific and Circuit-Specific Processes in the Brain (U01)

Although hipsters are in their 20s and 30s, the august NIH crowd (and its advisors) has set the BRAIN agenda that everyone else has to follow. When the cutting-edge tools (e.g., optogenetics) become commonplace, you have to do amazing things with them like create false memories in mice, or else develop methods like Dreadd2.0: An Enhanced Chemogenetic Toolkit or Ultra-Multiplexed Nanoscale In Situ Proteomics for Understanding Synapse Types.

The BRAIN Initiative wants to train the hipsters and other "graduate students, medical students, postdoctoral scholars, medical residents, and/or early-career faculty" in Research Tools and Methods and Computational Neuroscience. This will "complement and/or enhance the training of a workforce to meet the nation’s biomedical, behavioral and clinical research needs."

But this is an era when the average age of first-time R01 Principal Investigators is 42 1 and post-docs face harsh realities:
Research in 2014 is a brutal business, at least for those who want to pursue academic science as a career. Perhaps the most telling line comes from the UK report: of 100 science PhD graduates, about 30 will go on to postdoc research, but just four will secure permanent academic posts with a significant research component. There are too many scientists chasing too few academic careers.

How do you respond to these brutal challenges? I don't have an answer.2  But many young neuroscientists may have to start pickling their own vegetables, raising their own chickens, and curing their own meats.


1  The average age of first-time Principal Investigators on NIH R01 grants has risen from 36 in 1980 to 42 in 2001, where it remains today (see this PPT). So this has been going on for a while.

2  Or at least, not an answer that will fit within the scope of this post. Some obvious places to start are to train fewer scientists, enforce a reasonable retirement age, and increase funding somehow. And decide whether all research should be done by 20 megalabs, or else reduce the $$ amount and number of grants awarded to any one investigator.


Monday, November 24, 2014

The Humanities Are Ruining Neuroscience

Photo illustration by Andrea Levy for The Chronicle Review

Inflammatory title, isn't it? Puzzled about how it could possibly happen? Then read on!

A few days ago, The Chronicle of Higher Education published a piece called Neuroscience Is Ruining the Humanities. You can find it in a Google search and at reddit, among other places. The URL is {notice the “Neuroscience-Is-Ruining” part}.

Oh wait. Here's a tweet.

At some point along the way, without explanation, the title of the article was changed to the more mundane The Shrinking World of Ideas. The current take-home bullet points are:
  • We have shifted our focus from the meaning of ideas to the means by which they’re produced.
  • When professors began using critical theory to teach literature they were, in effect, committing suicide by theory.

The author is essayist Arthur Krystal, whose 4,000+ word piece can be summarized as “postmodernism ruined everything.” In the olden days of the 19th century, ideas mattered. Then along came the language philosophers and some French historians in the 1920s/30s, who opened the door for Andy Warhol and Jacques Derrida and what do you know, ideas didn't matter any more. That's fine, he can express that opinion, and normally I wouldn't care. I'm not going to debate the cultural harms or merits of postmodernism today.

What did catch my eye was this: “...what the postmodernists indirectly accomplished was to open the humanities to the sciences, particularly neuroscience.”

My immediate response: “that is the most ironic thing I've ever heard!! there is no truth [scientific or otherwise] in postmodernism!” Meaning: scientific inquiry was either irrelevant to these theorists, or something to be distrusted, if not disdained. So how could they possibly invite Neuroscience into the Humanities Building?

Let's look at Krystal's extended quote (emphasis mine):
“...By exposing the ideological codes in language, by revealing the secret grammar of architectural narrative and poetic symmetries, and by identifying the biases that frame "disinterested" judgment, postmodern theorists provided a blueprint of how we necessarily think and express ourselves. In their own way, they mirrored the latest developments in neurology, psychology, and evolutionary biology. [Ed. warning: non sequitur ahead.] To put it in the most basic terms: Our preferences, behaviors, tropes, and thoughts—the very stuff of consciousness—are byproducts of the brain’s activity. And once we map the electrochemical impulses that shoot between our neurons, we should be able to understand—well, everything. So every discipline becomes implicitly a neurodiscipline, including ethics, aesthetics, musicology, theology, literature, whatever.”

I'm as reductionist as the next neuroscientist, sure, but Krystal's depiction of the field is either quite the caricature, or incredibly naïve. Ultimately, I can't tell if he's actually in favor of "neurohumanities"...
In other words, there’s a good reason that "neurohumanities" are making headway in the academy. Now that psychoanalytic, Marxist, and literary theory have fallen from grace, neuroscience and evolutionary biology can step up. And what better way for the liberal arts to save themselves than to borrow liberally from science?

...or opposed:
Even more damning are the accusations in Sally Satel and Scott O. Lilienfeld’s Brainwashed: The Seductive Appeal of Mindless Neuroscience, which argues that the insights gathered from neurotechnologies have less to them than meets the eye. The authors seem particularly put out by the real-world applications of neuroscience as doctors, psychologists, and lawyers increasingly rely on its tenuous and unprovable conclusions. Brain scans evidently are "often ambiguous representations of a highly complex system … so seeing one area light up on an MRI in response to a stimulus doesn’t automatically indicate a particular sensation or capture the higher cognitive functions that come from those interactions." 1

Then he links to articles like Adventures in Neurohumanities and Can ‘Neuro Lit Crit’ Save the Humanities? (in a non-critical way) 2  before meandering back down memory lane. They sure don't make novelists like they used to!

So you see, neuroscience hasn't really ruined the humanities.3 Have the humanities ruined neuroscience? Although there has been a disturbing proliferation of neuro- fields, I think we can weather the storm of Jane Austen neuroimaging studies.


1 Although I haven't always seen eye to eye with Satel and Lilienfeld, here Krystal clearly overstates the extent of their dismissal of the entire field (which has happened before).

2 Read Professor of Literary Neuroimaging instead.

3 The author of the Neurocultures Manifesto may disagree, however.

link via @vaughanbell

