HUNTINGTON BEACH, CALIF. – Five years ago, Dr. Bradley S. Peterson was about to give up on the idea that advanced neuroimaging could one day be used as a diagnostic tool for clinical practice in child psychiatry.
“I thought it was completely hopeless; I really did,” Dr. Peterson told attendees at the annual meeting of the American College of Psychiatrists. “I’m extremely optimistic now.”
Thanks to advances in technology and more rigorously designed studies, imaging will aid clinical diagnoses in the foreseeable future, predicted Dr. Peterson, director of the Institute for the Developing Mind at Children’s Hospital Los Angeles.
“I sincerely believe it’s around the corner,” he said. “I think the biggest challenge may be addressing regulatory issues, as some of these computer algorithms, depending on how they are used, may constitute a medical device. Getting things through the [Food and Drug Administration] can be prohibitively expensive, time consuming, and arduous.”
Using examples from studies conducted by his lab and those of colleagues in the field, he discussed four cutting-edge uses of imaging in child psychiatry.
Identifying biological vulnerabilities
In an ongoing study conducted with Myrna M. Weissman, Ph.D., division chief of epidemiology in the department of psychiatry at Columbia University, New York, researchers are prospectively following a three-generation cohort for more than 25 years in an effort to understand families at high and low risk for major depressive disorder. A large sample of patients with chronic, severe, highly impairing depression was recruited in and around New Haven, Conn., along with a group of community controls who, according to self-report, spouses, and other family members, had never suffered from depression. Longitudinal studies of that cohort to date have demonstrated that grandchildren in families with multiple generations of major depressive disorder are at high risk for depression and anxiety disorders.
In a study that Dr. Peterson conducted with Dr. Weissman and colleagues, the brains of 131 study participants were imaged “to identify in the brain what is transmitted between these generations that place these biological offspring of depressed people at risk for depression,” he explained. He presented published brain measurement findings from 66 subjects in the high-risk group and 65 in the low-risk group (PNAS 2009;106:6273-8). The primary measurement of interest was the cortical mantle, which he described as “the gray matter at the surface of the brain, which contains most of the nerve cell bodies and synapses of the brain that carry information from one part of the brain to another. This is generally about 6 mm thick on average, but it varies slightly across the brain.”
Dr. Peterson and his associates found that subjects at high risk for depression had a 28% average reduction in cortical thickness, compared with their counterparts in the low-risk group, primarily in the right hemisphere of the brain. He characterized this as “a massive finding in two respects. It’s massive in its spatial extent, from the back of the brain to the front. It’s also massive at each point of the brain. The average reduction of 28% in offspring of the high-risk people is a massive biological effect. It’s astounding that we can find this in offspring. Even people who are offspring of depressed individuals two generations removed carry this abnormality, and it’s there even if they’ve never been sick in their lifetime. This high-risk approach is one way of identifying true biological vulnerability to illness.”
Identifying brain-based causal mechanisms
This effort involves yoking MRI or other imaging technology to randomized, controlled trials. “Instead of having a change in symptoms be an outcome measure, here it’s a change in MRI or brain-based measure,” said Dr. Peterson, who is also director of child and adolescent psychiatry at the University of Southern California, Los Angeles.
In a recent study, he and other researchers, including Dr. David J. Hellerstein and Dr. Jonathan Posner at Columbia University, conducted functional MRIs to determine whether antidepressant medication normalizes default mode network connectivity in adults with dysthymia. They imaged 25 healthy controls at one point in time and imaged 41 dysthymic adults twice: once before and once after a 10-week trial of duloxetine (JAMA Psychiatry 2013;70:373-82). They were interested in the effects of duloxetine on the default mode network of the brain, a “circuit” of brain regions that include the ventral anterior cingulate, the posterior cingulate, and the inferior parietal lobule.
“It’s called the default mode [circuit] because when you daydream or mind wander or introspect, this set of circuits is highly active,” Dr. Peterson said. “If you have to perform a task rather than let your mind wander, that system has to shut off. This region has been implicated many times in depression, because it’s been shown to be hyperactive in currently depressed people. It’s been especially related to ruminations. So the more people ruminate, the more this default system is active.”
At baseline, the researchers found that the coherence of neural activity within the brain’s default mode network was greater in persons with dysthymia than in healthy controls. Following the 10-week trial, they found that treatment with duloxetine, but not placebo, normalized default mode network connectivity (P < .03). “If they received placebo, the activity [in this brain region] didn’t change at all; it’s exactly the same as it was before the medication trial,” Dr. Peterson said. “If they received active medication, activity normalized; it reduced the cross-talk across nodes of the default mode network so that now, their [default mode network] activity is no longer discernible or different from the healthy controls. This shows that duloxetine is causing the reduction in the cross-talk between these circuits, and by doing so, it’s normalizing activity in the default mode system.”
A similar effect was observed in a study that assessed the impact of stimulant medications in children with attention-deficit/hyperactivity disorder (ADHD). Specifically, researchers including Dr. Peterson used cross-sectional MRI to examine the morphologic features of the basal ganglia nuclei in 48 children with ADHD who were off medication and 56 healthy controls (Am. J. Psychiatry 2010;167:977-86).
“Reduced volume in portions of the basal ganglia structures is important for impulse control and attention,” Dr. Peterson said. “We found that those same structures are enlarged when kids are on their medication so as not to be different from healthy controls. We think that stimulant medications in ADHD are normalizing these disturbances in the structure of the basal ganglia.”
Identifying neurometabolic dysfunction
Prior studies measured lactate in peripheral blood, muscle, or postmortem samples of people with autism spectrum disorders (ASD), “but these do not necessarily indicate the presence of metabolic dysfunction in the brain,” Dr. Peterson said. In a recent study, he and other researchers used magnetic resonance spectroscopic imaging to measure lactate in the brains of people with ASD. The analysis included 75 high-functioning ASD participants and 96 typically developing children and adults (JAMA Psychiatry 2014;71:665-71). Definite lactate peaks were present at a significantly higher rate in the ASD participants, compared with controls (13% vs. 1%; P = .001). In addition, the presence of lactate was significantly greater in adults, compared with children (20% vs. 6%; P = .004), “perhaps suggesting that this could be a degenerative process that exacerbates through time,” Dr. Peterson said.
The presence of lactate did not correlate with clinical symptoms based on ASD subtype, Autism Diagnostic Observation Schedule domain score, or full-scale IQ. Dr. Peterson said the presence of lactate is “definitive proof that mitochondria are dysfunctional in the brains of a substantial number of autistic people. This disturbance is found now in autism, but it will likely be true in people with other neuropsychiatric disorders as well.”
A key advantage of MR spectroscopic imaging, he continued, “is that we can determine where in the brain lactate’s being produced. These kinds of studies will help guide studies in mitochondrial genetics and dysfunction in ASD and other conditions. It also has important clinical implications, because there are novel treatment approaches now for mitochondrial dysfunction, such as dietary interventions that can reduce the metabolic dysfunction.”
Using automated, brain-based diagnostic classifications
Dr. Peterson and other researchers used an automated method to diagnose individuals as having one of various neuropsychiatric illnesses using only anatomical MRI scans. “The method employs a semisupervised learning algorithm that discovers natural groupings of brains based on the spatial patterns of variation in the morphology of the cerebral cortex and other brain regions,” they explained in their article (PLoS One 2012;7:e50698). “We used split-half and leave-one-out cross-validation analyses in large MRI datasets to assess the reproducibility and diagnostic accuracy of those groupings.”
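To illustrate the leave-one-out validation the authors describe, here is a minimal, hypothetical sketch (not the study's actual pipeline or data): each subject is held out in turn, a toy nearest-neighbor classifier is trained on the remaining subjects' feature vectors, and the held-out subject is then classified. The feature vectors and labels below are randomly generated stand-ins for real morphometric measurements.

```python
# Hypothetical sketch of leave-one-out cross-validation; the data and
# the 1-nearest-neighbor classifier are illustrative stand-ins, not the
# semisupervised method used in the published study.
import random

random.seed(0)

# Fake cohort: 10 "patients" and 10 "controls", 5 morphometric features
# each, with group means shifted by 1 unit so the classes are separable.
subjects = [([random.gauss(label, 1.0) for _ in range(5)], label)
            for label in (0, 1) for _ in range(10)]

def predict(train, features):
    """Classify by the label of the nearest training subject."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda s: sq_dist(s[0], features))[1]

# Leave-one-out: hold each subject out, train on the rest, test on it.
correct = 0
for i, (features, label) in enumerate(subjects):
    train = subjects[:i] + subjects[i + 1:]
    correct += int(predict(train, features) == label)

accuracy = correct / len(subjects)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

Leave-one-out is well suited to clinical imaging samples because every subject contributes to both training and testing without ever being used to evaluate a model that saw their own scan.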
Conceptually, “we look at the volume increases and decreases throughout the surface of the cortex,” Dr. Peterson said at the meeting. “We do the same thing at the basal ganglia, thalamus, cerebellum, amygdala, and hippocampus. What we’re trying to do is to use structural variation in the brain to identify spatial patterns in brain structure that help to identify people who have brain features in common, just as the pattern of dermatomal ridges on your finger can identify specific individuals with great accuracy.
“We use machine learning algorithms to identify a pattern of abnormality in brain structure that’s more similar among people with Tourette’s syndrome, for example, than in people who have ADHD.”
This automated approach to clinical diagnosis yielded impressive results. For example, sensitivity and specificity for differentiating ADHD subjects from healthy controls were 93.6% and 89.5%, respectively. Performance was also strong for Tourette’s syndrome (94.6% sensitivity, 79% specificity) and for schizophrenia (93.1% sensitivity, 94.5% specificity).
Dr. Peterson disclosed that he has received research funding from the National Institute of Mental Health, the National Institute on Drug Abuse, the National Institute of Environmental Health Sciences, the National Science Foundation, the Brain & Behavior Research Foundation (formerly NARSAD), and the Simons Foundation. He also has received investigator-initiated funding from Eli Lilly and Pfizer.
On Twitter @dougbrunk