Book Review: The hope that comes from ‘Growing Pains’
You might be surprised by child psychiatrist Mike Shooter's response, revealed in his book, “Growing Pains: Making Sense of Childhood: A Psychiatrist’s Story” (London: Hodder & Stoughton, 2018). Rather than hospitalizing this patient, as had been done many times before, he makes a bold decision to listen to the group members, who help the patient develop a plan that ultimately leads to greater resilience.
Dr. Shooter shares many stories about the power of therapy to heal, often visiting patients at their homes to better understand the dynamics of their distress. Stories themselves heal: “It is the job of the therapist to encourage them to reveal their story, to listen to it, and to help them find a better outcome.”
From these stories, we learn about Dr. Shooter’s passion and commitment to his relationship with the child – listening, fostering autonomy, recognizing the power of family systems, working with a multidisciplinary team, and using his own experiences with depression to better help his patients.
Dr. Shooter closes the distance between himself and readers by sharing his own story – his difficult relationship with his strict father, his own uncertainty about his future profession, the deep depression that could have derailed his family life and career, and the treatment that got him back on track.
This book is an excellent read for psychiatrists and other mental health professionals, whether they work with children or adults. It is especially valuable to psychiatrists like me who work with college students – transitional-age youth at the border between childhood and adulthood. Dr. Shooter beautifully describes the societal ills that have contributed to a global rise in child and adolescent mental health problems:
“We live in an ever-more competitive world. To the normal pressures of growing up are added the educational demands to pass more and more exams, a gloominess about the future, and a loss of faith in political processes to put it right; private catastrophes at home and global catastrophes beamed in from all over the world; and a media that’s in love with how to be popular, how to look attractive, and how to be a success.”
The general public would also find this book an interesting glimpse into the world of child psychiatry. The public as well as politicians would benefit from knowing the value child psychiatry can provide at a time when services are underfunded in many countries, including the United States.
This book uses the words of children to highlight the challenges young people face – from bereavement to bullying to abuse. He writes about children on the “margins of margins.” As I read the book, Dr. Shooter reminded me of psychiatrist and author Robert Coles, who taught my favorite college class and wrote about children in crisis from the Appalachians to Africa.
Not surprisingly, Dr. Shooter describes spending time with Dr. Coles at a conference on bereavement. He adheres to the advice Dr. Coles offered, which was to “Listen to what the children say, not what the adults say about them. ... Follow what your gut tells you, not your head.”
In addition to listening to the patient and your gut, Dr. Shooter describes offering hope as another essential element to treatment. He describes giving hope to children of parents who die by suicide, as these children often fear they will meet their parents’ fate. “And they need to know, too, that suicide is not inevitable. … Help is ready and available to stop the children and young people ever getting to that state.”
One element of treatment Dr. Shooter addresses only minimally, and mostly in a negative way, is psychopharmacology. While he acknowledges that some children genuinely do have attention-deficit/hyperactivity disorder or depression, he feels they are overdiagnosed and thus overtreated with medication. I would have liked to hear more about the times he prescribed medication and how it was integrated into comprehensive care that included therapy and lifestyle changes. I would not want parents reading this book to feel bad if they have supported having their child take medication for a mental health disorder.
Dr. Shooter does make the important point that therapy is often left on the sidelines in current medical systems. Therapy can benefit people of all ages as we face our own “growing pains.” He highlights the “opportunity for growth” that challenges provide, and indeed gives us a great sense of hope in our lives and our work as psychiatrists.
Dr. Morris is an associate professor of psychiatry and associate program director for student health psychiatry at the University of Florida, Gainesville. She is the author of “The Campus Cure: A Parent’s Guide to Mental Health and Wellness for College Students” (Lanham, Md.: Rowman & Littlefield, 2018).
Memories, flashbacks, and PTSD in NYC
On June 10, 2019, a rainy, foggy day, there was a news flash that a plane had crashed into a building in the middle of New York City. I first saw this notification on my iPhone and my immediate thought was: Could this be a redo of Sept. 11?
I was especially concerned because I knew the area fairly well: A clinic I had worked in for more than 10 years was only a few blocks away. More than that, my memory brought me back to that day almost 18 years ago when, from a hospital window, many of us doctors, nurses, social workers, and patients saw the fire in the north tower and then saw the second plane crash into the south tower of the World Trade Center. Once we all knew what had happened, we spent that night at the hospital awaiting the arrival of people in need of care. Unfortunately, very few arrived.
On this past June day, before anyone really knew the facts, what we heard and saw on TV was buildings being evacuated in midtown Manhattan, people running and moving in all directions with police officers directing people and diverting traffic, firemen entering the building, and EMT first responders in place. What mayhem!
Gov. Andrew Cuomo got to the scene very quickly and assured us that the incident did not appear to be a terrorist attack. Furthermore, he thoughtfully pointed out, we in New York City all seem to have a version or a form of posttraumatic stress disorder taking us back to Sept. 11, 2001. From my point of view, Gov. Cuomo could not have been more correct in his short, televised talk to a nervous public. The incident, and the governor’s reaction to it, started me thinking about how easily triggered the memories and flashbacks of PTSD can be.
It soon became clear that the aircraft was not a plane but a helicopter. The pilot had lost control on that foggy, rainy June day and tried to make an emergency landing on the roof of a Manhattan high-rise. The landing did not go well; the helicopter crashed on the roof, and the lone pilot died.
As it turned out, mental health care workers treated many PTSD sufferers at the Bellevue and Mount Sinai hospital programs set up after Sept. 11, including those who were part of the rescue as well as the clean-up. In addition, it appears that many who witnessed the disaster also were vulnerable to PTSD and were treated in various programs. I have seen and interviewed many of those people over the last 10 years.
PTSD is defined mainly in terms of experiencing a traumatic event during a man-made or natural disaster: torture, assaults, the tragedies of war, or any event that causes physical or psychological injury. According to research, it can occur right after the event or years later. Besides those major traumatic events, I’ve seen PTSD occur from much lesser traumas; much depends on how individuals process what is happening around them. For example, in some people, I’ve seen PTSD occur after job loss, where identity and persona are lost and the brain experiences the psychological shock consistent with more dangerously threatening aspects of PTSD. I’ve seen dog bites, auto accidents, even “fender benders” and emotional break-ups bring out the symptoms of PTSD (J Adv Nurs. 2005 Oct;52[1]:22-30). Luckily, in most of those cases, treatment or time itself can heal the problems.
Going back to that June day, for a few brief moments, my memory was jogged back to Sept. 11. A few people I spoke with about the event last month also reported being taken back to that fateful day (Am Psychol. 2011 Sep;66[6]:429-46).
For some experiencing PTSD, flashbacks to the physically threatening or psychologically shocking event occur as opposed to memory alone. During a flashback, the person actually relives the experience as if it were in the present. Flashbacks are quite different from recall alone. In my experience, the flashback is not unlike age regression, where an individual actually relives an event as opposed to having a memory of an event.
PTSD is a serious emotional problem, and I believe that much of it goes undiagnosed in society – partly because we tend to look for the disorder only after major traumatic events, such as war, man-made and natural disasters, assaults, and torture. As we know in medicine and mental health care, there are certain vulnerabilities to some disorders. I believe that, whether through education, environment, or genetics, we have vulnerabilities to PTSD (Brain Behav Immun. 2013 May;30:12-21; Clin Psychol Rev. 2012 Nov;32[7]:630-41), not only from major disastrous physical and psychological shocks but from less obvious events in life that might create the same clinical picture we see in more traditional cases of PTSD.
Some PTSD survivors will improve and get better with time. Others do well after getting treatments with interventions such as cognitive-behavioral therapy (CBT) and prolonged exposure therapy, both of which are fairly short term in many instances. An ongoing relationship with a supportive therapist or friends and family is extremely important, in order to keep PTSD survivors from isolating and endlessly “living in their heads” as they relive the experience and face the multiple symptom complexes of PTSD.
Dr. London is a practicing psychiatrist and has been a newspaper columnist for 35 years, specializing in and writing about short-term therapy, including CBT and guided imagery. He recently published a book called “Find Freedom Fast” (New York: Kettlehole Publishing, 2018).
Update on Diet and Acne
Acne is a common condition that most often affects adolescents but is not uncommon in adults. It can result in considerable anxiety, depression, and medical and pharmaceutical costs. Additionally, oral antibiotics, the standard treatment for acne, are increasingly under suspicion for causing bacterial resistance as well as disruption of the cutaneous and gut microbiomes.1,2 These factors are among those that often drive patients and physicians to search for alternative and complementary treatments, including dietary modification.
Over the last few decades, the interaction between diet and acne has been one of the most fluid areas of research in dermatology. The role of diet in acne incidence and presentation has evolved from the general view in the 1970s that there was no connection to today’s more data-driven understanding that the acne disease course likely is modified by specific dietary components. Better designed and more rigorous studies have supported a link between acne severity and glycemic index (GI)/glycemic load (GL) and possibly dairy consumption. The ability to use data-driven evidence to counsel patients regarding dietary treatment of acne is increasingly important to counteract the pseudoadvice that patients can easily find on the Internet.
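For readers unfamiliar with the distinction drawn above, glycemic load is simple arithmetic: it scales a food's glycemic index by the carbohydrate actually consumed in a serving (GL = GI × available carbohydrate in grams / 100). The sketch below illustrates the calculation; the GI values and serving sizes are approximate, illustrative assumptions rather than figures from this article.

```python
# Illustrative sketch of the glycemic load (GL) calculation.
# GL = GI * available carbohydrate per serving (g) / 100.
# GI values and serving carbohydrate amounts below are rough,
# hypothetical examples, not data from the article.

def glycemic_load(gi: float, carbs_g: float) -> float:
    """Return the glycemic load for a serving with the given GI and carbs."""
    return gi * carbs_g / 100

foods = {
    "white bread (1 slice, ~14 g carbs)": (75, 14),
    "watermelon (1 cup, ~11 g carbs)": (76, 11),
    "lentils (1 cup, ~40 g carbs)": (32, 40),
}

for name, (gi, carbs) in foods.items():
    print(f"{name}: GI={gi}, GL={glycemic_load(gi, carbs):.1f}")
```

Note why the two measures can diverge: watermelon has a high GI but so little carbohydrate per serving that its GL is low, which is why the literature discussed here tracks both measures.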
This article summarizes the history of beliefs about diet and acne, reviews more recent published data regarding dietary components that can modify acne severity, and outlines the current American Academy of Dermatology (AAD) guidelines and recommendations for diet and acne.
History of Diet and Acne
In most of the current literature, acne frequently is referred to as a disease of modern civilization or a consequence of the typical Western diet.3 For clarity, the Western diet is most commonly described as “a dietary regimen characterized by high amounts of sugary desserts, refined grains, high protein, high-fat dairy products, and high-sugar drinks.”4 The role of dairy in the etiology of acne typically is discussed separately from the Western diet. It has been reported that acne is not found in nonwesternized populations where a Paleolithic diet, which does not include consumption of high-GI carbohydrates, milk, or other dairy products, is common.5
Extending this line of argument, acne vulgaris has been called a metabolic syndrome of the sebaceous follicle and one of the mammalian target of rapamycin complex 1–driven diseases of civilization, along with cancer, obesity, and diabetes mellitus.3 This view seems somewhat extreme and discounts other drivers of acne incidence and severity. Twin studies have shown that acne is highly heritable, with 81% of the population variance attributed to genetic factors.6 Similar incidence numbers for acne vulgaris have been reported worldwide, and global incidence in late adolescence is rising; however, it is unknown whether this increase is a result of the adoption of the Western diet, which is thought to encourage early onset of puberty; genetic drift; changes in regional and cultural understanding and reporting of acne; or a byproduct of unknown environmental factors.4 More nuanced views acknowledge that acne is a multifactorial disease,7 and therefore genetic and possibly epigenetic factors as well as the cutaneous and gut microbiomes also must be taken into account. An interesting historical perspective on acne by Mahmood and Shipman8 outlined acne descriptions, diagnoses, topical treatments, and dietary advice going back to ancient Greek and Egyptian civilizations. They also cited recommendations from the 1930s that suggested avoiding “starchy foods, bread rolls, noodles, spaghetti, potatoes, oily nuts, chop suey, chow mein, and waffles” and listed the following foods as suitable to cure acne: “cooked and raw fruit, farina, rice, wheat, oatmeal, green vegetables, boiled or broiled meat and poultry, clear soup, vegetable soup, and an abundance of water.”8
More Recent Evidence of Dietary Influence on Acne
Importantly, the available research does not demonstrate that diet causes acne but rather that it may influence or aggravate existing acne. Data collection for acne studies also can be confounded by the interplay of many factors, such as increased access to health care, socioeconomic status, and shifting cultural perceptions of skin care and beauty.4 An important facet of any therapeutic recommendation is that it should be supported by confirmable mechanistic pathways.
GI and GL
Over the last few decades, a number of observational and intervention studies have focused on the possible influence of the GI/GL of foods on acne incidence and/or severity. A high GI diet is characterized by a relatively high intake of carbohydrate-containing foods that are quickly digested and absorbed, increasing blood glucose and insulin concentrations. Glycemic load takes the portion size of dietary carbohydrates into consideration and therefore is a measure of both the quality and quantity of carbohydrate-containing foods.9 The GI/GL values of more than 2480 food items are available in the literature.10
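As an illustrative aside (not from the article itself), the quality-vs-quantity distinction above follows directly from the conventional formula: glycemic load equals glycemic index multiplied by the grams of available carbohydrate in the serving, divided by 100. The food values below are approximate and for illustration only.

```python
def glycemic_load(glycemic_index: float, carbs_g: float) -> float:
    """Glycemic load of a serving.

    GL = GI x available carbohydrate (g) / 100
    Conventional cutoffs: GL >= 20 is high, 11-19 medium, <= 10 low.
    """
    return glycemic_index * carbs_g / 100.0

# Watermelon illustrates the divergence: its GI is high (~72),
# but a 120 g serving contains only ~6 g of available carbohydrate,
# so its glycemic load is low.
watermelon_gl = glycemic_load(72, 6)   # ~4.3, a low GL despite a high GI
```

This is why a food can rank as high GI yet contribute little glycemic load per typical serving, and why GL is the more informative measure when counseling patients about realistic portions.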
Evidence from several studies supports the role of high GI/GL diets in exacerbating acne and suggests that transitioning to low GI/GL diets may lead to decreased lesion counts after 12 weeks.11-13 In one randomized controlled trial, male participants aged 15 to 25 years with mild to moderate facial acne were instructed to eat either a high protein/low GI diet or a conventional high GL control diet.13 After 12 weeks, total lesion counts had decreased more in the low GI diet group than in the control group. As partial confirmation of a mechanistic pathway for a high GI diet and acne, the low GI group demonstrated lower free androgen index and insulin levels than the control group.13 In a Korean study, a 10-week low GL regimen led to a reduction in acne lesion count, a decrease in sebaceous gland size, decreased inflammation, and reduced expression of sterol regulatory element-binding protein 1 and IL-8.14
More recent studies have further solidified the role of high GI/GL diets in acne severity.9,15,16 High GI/GL diets are believed to activate acne pathways by stimulating insulinlike growth factor 1 (IGF-1), which induces proliferation of both keratinocytes and sebocytes and stimulates androgen production.17 An excellent diagram showing the connection between high GI diets (and dairy) and IGF-1, insulin and its receptors, androgen and its receptors, mammalian target of rapamycin, and the pilosebaceous unit was published in the literature in 2016.4 Interestingly, metformin has been shown to be an effective adjunctive therapy in the treatment of moderate to severe acne vulgaris.18,19
Milk and Dairy Consumption
Milk consumption also has been examined for its potential role in the pathogenesis of acne, including its ability to increase insulin and IGF-1 levels and bind to the human IGF-1 receptor as well as the fact that it contains bovine IGF-1 and dihydrotestosterone precursors.20 Although not studied quite as extensively or rigorously as GI/GL, consumption of milk and dairy products does appear to have the potential to exacerbate acne lesions. Beginning with a series of retrospective and prospective epidemiologic studies published from 2005 to 2008,21-23 a link between clinical acne and milk or dairy consumption in adolescent subjects was reported. A recent meta-analysis found a positive relationship between dairy, total milk, whole milk, low-fat milk, and skim milk consumption and acne occurrence but no significant association between yogurt/cheese consumption and acne development.24
AAD Guidelines
In its public forum, the AAD has advised that a low-glycemic diet may reduce the number of lesions in acne patients and has highlighted data from around the world supporting the concept that a high-glycemic diet and dairy are correlated with acne severity. It stated that consumption of milk—whole, low fat, and skim—may be linked to an increase in acne breakouts but that no studies have found that products made from milk, such as yogurt or cheese, lead to more breakouts.25
Other Considerations
Acne can be a serious quality-of-life issue with considerable psychological distress, physical morbidity, and social prejudice.9 Consequently, acne patients may be more willing to accept nonprofessional treatment advice, and there is no shortage of non–health care “experts” willing to provide an array of unfounded and fantastical advice. Dietary recommendations found online range from specific “miracle” foods to the more data-driven suggestions to “avoid dairy” or “eat low GI foods.” An important study recently published in Cutis concluded that most of the information found online regarding diet and acne is unfounded and/or misleading.26
Two additional reasons for recommending that acne patients consider dietary modification are not directly related to the disease: (1) the general health benefits of a lower GI/GL diet, and (2) the potential for decreasing the use of antibiotics. Antibiotic resistance is a growing problem across medicine, and dermatologists prescribe more antibiotics per provider than any other specialty.17 Dietary modification, where appropriate, could provide an approach to limiting the use of antibiotics in acne.
Final Thoughts
When advising acne patients, dermatologists can refer to the Table for general guidelines that incorporate the most current data-driven information on the relationship between diet and acne. Dietary modification, of course, will not work for all patients but can be safely recommended in cases of mild to moderate acne.
- Barbieri JS, Bhate K, Hartnett KP, et al. Trends in oral antibiotic prescription in dermatology, 2008 to 2016 [published online January 16, 2019]. JAMA Dermatol. 2019. doi:10.1001/jamadermatol.2018.4944.
- Barbieri JS, Spaccarelli N, Margolis DJ, et al. Approaches to limit systemic antibiotic use in acne: systemic alternatives, emerging topical therapies, dietary modification, and laser and light-based treatments. J Am Acad Dermatol. 2019;80:538-549.
- Melnik BC. Acne vulgaris: the metabolic syndrome of the pilosebaceous follicle [published online September 8, 2017]. Clin Dermatol. 2018;36:29-40.
- Lynn DD, Umari T, Dunnick CA, et al. The epidemiology of acne vulgaris in late adolescence. Adolesc Health Med Ther. 2016;7:13-25.
- Cordain L, Lindeberg S, Hurtado M, et al. Acne vulgaris: a disease of Western civilization. Arch Dermatol. 2002;138:1584-1590.
- Zaenglein AL, Pathy AL, Schlosser BJ, et al. Guidelines of care for the management of acne vulgaris [published online February 17, 2016]. J Am Acad Dermatol. 2016;74:945.e33-973.e33.
- Rezaković S, Bukvić Mokos Z, Basta-Juzbašić A. Acne and diet: facts and controversies. Acta Dermatovenerol Croat. 2012;20:170-174.
- Mahmood NF, Shipman AR. The age-old problem of acne. Int J Womens Dermatol. 2017;3:71-76.
- Burris J, Shikany JM, Rietkerk W, et al. A low glycemic index and glycemic load diet decreases insulin-like growth factor-1 among adults with moderate and severe acne: a short-duration, 2-week randomized controlled trial. J Acad Nutr Diet. 2018;118:1874-1885.
- Atkinson FS, Foster-Powell K, Brand-Miller JC. International tables of glycemic index and glycemic load values: 2008 [published online October 3, 2008]. Diabetes Care. 2008;31:2281-2283.
- Smith RN, Braue A, Varigos GA, et al. The effect of a low glycemic load diet on acne vulgaris and the fatty acid composition of skin surface triglycerides. J Dermatol Sci. 2008;50:41-52.
- Smith RN, Braue A, Varigos GA, et al. A low-glycemic-load diet improves symptoms in acne vulgaris patients: a randomized controlled trial. Am J Clin Nutr. 2007;86:107-115.
- Smith RN, Mann NJ, Braue A, et al. The effect of a high-protein, low glycemic-load diet versus a conventional, high glycemic-load diet on biochemical parameters associated with acne vulgaris: a randomized, investigator-masked, controlled trial. J Am Acad Dermatol. 2007;57:247-256.
- Kwon HH, Yoon JY, Hong JS, et al. Clinical and histological effect of a low glycaemic load diet in treatment of acne vulgaris in Korean patients: a randomized, controlled trial. Acta Derm Venereol. 2012;92:241-246.
- Burris J, Rietkerk W, Woolf K. Differences in dietary glycemic load and hormones in New York City adults with no and moderate/severe acne. J Acad Nutr Diet. 2017;117:1375-1383.
- Burris J, Rietkerk W, Woolf K. Relationships of self-reported dietary factors and perceived acne severity in a cohort of New York young adults [published online January 9, 2014]. J Acad Nutr Diet. 2014;114:384-392.
- Barbieri JS, Bhate K, Hartnett KP, et al. Trends in oral antibiotic prescription in dermatology, 2008 to 2016 [published online January 16, 2019]. JAMA Dermatol. 2019. doi:10.1001/jamadermatol.2018.4944.
- Lee JK, Smith AD. Metformin as an adjunct therapy for the treatment of moderate to severe acne vulgaris [published online November 15, 2017]. Dermatol Online J. 2017;23. pii:13030/qt53m2q13s.
- Robinson S, Kwan Z, Tang MM. Metformin as an adjunct therapy for the treatment of moderate to severe acne vulgaris: a randomized open-labeled study [published online May 1, 2019]. Dermatol Ther. 2019. doi:10.1111/dth.12953.
- Barbieri JS, Spaccarelli N, Margolis DJ, et al. Approaches to limit systemic antibiotic use in acne: systemic alternatives, emerging topical therapies, dietary modification, and laser and light-based treatments [published online October 5, 2018]. J Am Acad Dermatol. 2019;80:538-549.
- Adebamowo CA, Spiegelman D, Berkey CS, et al. Milk consumption and acne in adolescent girls. Dermatol Online J. 2006;12:1.
- Adebamowo CA, Spiegelman D, Berkey CS, et al. Milk consumption and acne in teenaged boys. J Am Acad Dermatol. 2008;58:787-793.
- Adebamowo CA, Spiegelman D, Danby FW, et al. High school dietary dairy intake and teenage acne. J Am Acad Dermatol. 2005;52:207-214.
- Aghasi M, Golzarand M, Shab-Bidar S, et al. Dairy intake and acne development: a meta-analysis of observational studies. Clin Nutr. 2019;38:1067-1075.
- Can the right diet get rid of acne? American Academy of Dermatology website. https://www.aad.org/public/diseases/acne-and-rosacea/can-the-right-diet-get-rid-of-acne. Accessed June 13, 2019.
- Khanna R, Shifrin N, Nektalova T, et al. Diet and dermatology: Google search results for acne, psoriasis, and eczema. Cutis. 2018;102:44-46, 48.
- Barbieri JS, Bhate K, Hartnett KP, et al. Trends in oral antibiotic prescription in dermatology, 2008 to 2016 [published online January 16, 2019]. JAMA Dermatol. doi:10.1001/jamadermatol.2018.4944.
- Barbieri JS, Spaccarelli N, Margolis DJ, et al. Approaches to limit systemic antibiotic use in acne: systemic alternatives, emerging topical therapies, dietary modification, and laser and light-based treatments. J Am Acad Dermatol. 2019;80:538-549.
- Melnik BC. Acne vulgaris: the metabolic syndrome of the pilosebaceous follicle [published online September 8, 2017]. Clin Dermatol. 2018;36:29-40.
- Lynn DD, Umari T, Dunnick CA, et al. The epidemiology of acne vulgaris in late adolescence. Adolesc Health Med Ther. 2016;7:13-25.
- Cordain L, Lindeberg S, Hurtado M, et al. Acne vulgaris: a disease of Western civilization. Arch Dermatol. 2002;138:1584-1590.
- Zaenglein AL, Pathy AL, Schlosser BJ, et al. Guidelines of care for the management of acne vulgaris [published online February 17, 2016]. J Am Acad Dermatol. 2016;74:945.e33-973.e33.
- Rezaković S, Bukvić Mokos Z, Basta-Juzbašić A. Acne and diet: facts and controversies. Acta Dermatovenerol Croat. 2012;20:170-174.
- Mahmood NF, Shipman AR. The age-old problem of acne. Int J Womens Dermatol. 2017;3:71-76.
- Burris J, Shikany JM, Rietkerk W, et al. A low glycemic index and glycemic load diet decreases insulin-like growth factor-1 among adults with moderate and severe acne: a short-duration, 2-week randomized controlled trial. J Acad Nutr Diet. 2018;118:1874-1885.
- Atkinson FS, Foster-Powell K, Brand-Miller JC. International tables of glycemic index and glycemic load values: 2008 [published online October 3, 2008]. Diabetes Care. 2008;31:2281-2283.
- Smith RN, Braue A, Varigos GA, et al. The effect of a low glycemic load diet on acne vulgaris and the fatty acid composition of skin surface triglycerides. J Dermatol Sci. 2008;50:41-52.
- Smith RN, Braue A, Varigos GA, et al. A low-glycemic-load diet improves symptoms in acne vulgaris patients: a randomized controlled trial. Am J Clin Nutr. 2007;86:107-115.
- Smith RN, Mann NJ, Braue A, et al. The effect of a high-protein, low glycemic-load diet versus a conventional, high glycemic-load diet on biochemical parameters associated with acne vulgaris: a randomized, investigator-masked, controlled trial. J Am Acad Dermatol. 2007;57:247-256.
- Kwon HH, Yoon JY, Hong JS, et al. Clinical and histological effect of a low glycaemic load diet in treatment of acne vulgaris in Korean patients: a randomized, controlled trial. Acta Derm Venereol. 2012;92:241-246.
- Burris J, Rietkerk W, Woolf K. Differences in dietary glycemic load and hormones in New York City adults with no and moderate/severe acne. J Acad Nutr Diet. 2017;117:1375-1383.
- Burris J, Rietkerk W, Woolf K. Relationships of self-reported dietary factors and perceived acne severity in a cohort of New York young adults [published online January 9, 2014]. J Acad Nutr Diet. 2014;114:384-392.
- Barbieri JS, Bhate K, Hartnett KP, et al. Trends in oral antibiotic prescription in dermatology, 2008 to 2016 [published online January 16, 2019]. JAMA Dermatol. doi:10.1001/jamadermatol.2018.4944.
- Lee JK, Smith AD. Metformin as an adjunct therapy for the treatment of moderate to severe acne vulgaris [published online November 15, 2017]. Dermatol Online J. 2017;23. pii:13030/qt53m2q13s.
- Robinson S, Kwan Z, Tang MM. Metformin as an adjunct therapy for the treatment of moderate to severe acne vulgaris: a randomized open-labeled study [published online May 1, 2019]. Dermatol Ther. 2019. doi:10.1111/dth.12953.
- Barbieri JS, Spaccarelli N, Margolis DJ, et al. Approaches to limit systemic antibiotic use in acne: systemic alternatives, emerging topical therapies, dietary modification, and laser and light-based treatments [published online October 5, 2018]. J Am Acad Dermatol. 2019;80:538-549.
- Adebamowo CA, Spiegelman D, Berkey CS, et al. Milk consumption and acne in adolescent girls. Dermatol Online J. 2006;12:1.
- Adebamowo CA, Spiegelman D, Berkey CS, et al. Milk consumption and acne in teenaged boys. J Am Acad Dermatol. 2008;58:787-793.
- Adebamowo CA, Spiegelman D, Danby FW, et al. High school dietary dairy intake and teenage acne. J Am Acad Dermatol. 2005;52:207-214.
- Aghasi M, Golzarand M, Shab-Bidar S, et al. Dairy intake and acne development: a meta-analysis of observational studies. Clin Nutr. 2019;38:1067-1075.
- Can the right diet get rid of acne? American Academy of Dermatology website. https://www.aad.org/public/diseases/acne-and-rosacea/can-the-right-diet-get-rid-of-acne. Accessed June 13, 2019.
- Khanna R, Shifrin N, Nektalova T, et al. Diet and dermatology: Google search results for acne, psoriasis, and eczema. Cutis. 2018;102:44-46, 48.
Appropriateness of performing in-office uterine aspiration
In their article, "Uterine aspiration: From OR to office" (February 2019), Lauren Thaxton, MD, MBA, and Bri Tristan, MD, made the case for why, in appropriate clinical situations, office-based uterine aspiration, compared with uterine aspiration in the OR, should be the standard surgical management of early pregnancy failure. Their reasons included an equivalent safety profile, reduced costs, and patient-centered characteristics.
OBG Management posed this query to readers in a website poll: "Should the standard location for uterine aspiration be in the office?" See how readers responded below.
Poll results
A total of 73 readers cast their vote:
- 86.3% (63 readers) said yes, in appropriate clinical situations
- 13.7% (10 readers) said no
Reader comments
"Yes, in appropriate clinical situations."
-Yardlie Toussaint-Foster, DO, Downingtown, Pennsylvania
"I have been doing it this way (in the office) for years, up to 11 to 12 weeks without complication."
-John Lane, MD, Raleigh, North Carolina
Treatment in prison systems might lead to drop in overdose deaths
Incarceration versus treatment takes center stage in a new analysis of U.S. data from researchers in the United Kingdom.
The researchers performed an observational study looking at rates of incarceration, income, and drug-related deaths from 1983 to 2014 in the United States. They found a strong association between incarceration rates and drug-related deaths. Also, a very strong association was found between lower household income and drug-related deaths. Strikingly, in the counties with the highest incarceration rates, there was a 50% higher rate of drug deaths, reported Elias Nosrati, PhD, and associates (Lancet Public Health. 2019 Jul 3;4:e326-33). It is clearer every day that our opioid epidemic was in part wrought by a zealous push to change protocols on treating chronic pain. The epidemic also appears tied to well-meaning but overprescribing doctors and allegedly unscrupulous pharmaceutical companies and distributors. What we are learning through this most recent study is that another factor tied to the opioid and overdose epidemic could be incarceration.
According to the study, an increase in crime rates combined with sentencing reforms led the number of people incarcerated in state and federal prisons to soar from less than 200,000 in 1970 to almost 1 million in 1995. Furthermore, Dr. Nosrati and associates wrote, “Incarceration is directly associated with stigma, discrimination, poor mental health, and chronic economic hardship, all of which are linked to drug use disorders.”
Treatment for drug addiction in prison systems is rare, as is adequate mental health treatment. However, treatment for this population would likely help reduce drug overdose deaths and improve the quality of life for people who are incarcerated and their families. In the Philadelphia prison system, for example, treatment for inmates is available for opioid addiction, both with methadone and now more recently with buprenorphine (Suboxone). The Philadelphia Department of Prisons also provides cognitive-behavioral therapy. In Florida, Chapter 397 of the Florida statutes – known as the Marchman Act – provides for the involuntary (and voluntary) treatment of individuals with substance abuse problems.
The court systems in South Florida have a robust drug-diversion program, aimed at directing people facing incarceration for drug offenses into treatment instead. North Carolina has studied this issue specifically and found through a model simulation that diverting 10% of drug-abusing offenders out of incarceration and into treatment would save $4.8 billion in legal costs for North Carolina counties and state legal systems. Diverting 40% of individuals would nearly triple those savings.
There are striking data from programs treating individuals who are leveraged into treatment in order to maintain professional licenses. These individuals, many of whom are physicians, airline pilots, and nurses, have sobriety rates of 90% or greater after 5 years. These data suggest that treatment, particularly when backed by strong external incentives, can produce durable recovery.
In addition to the potential reduction in morbidity and mortality as well as the financial savings, why is treatment important? Because of societal costs. When parents or family members are put in jail for a drug charge or other charge, they leave behind a community, family, and very often children who are affected economically, emotionally, and socially. Those children in particular have higher risks of depression and PTSD. Diverting an offender into treatment or treating an incarcerated person for drug and mental health problems can change the life of a child or family member, and ultimately can change society.
Dr. Jorandby is chief medical officer of Lakeview Health in Jacksonville, Fla. She trained in addiction psychiatry at Yale University, New Haven, Conn.
The Shot That Won the Revolutionary War and Is Still Reverberating
The disputes about those who decline to vaccinate their children for communicable infectious diseases, especially measles, have been in the headlines of late. Those refusals are often done in the name of “medical freedom.”1 Yet this is a much older debate for the military. It seems fitting in this month in which we celebrate the 243rd anniversary of the Declaration of Independence to reflect on the earliest history of the interaction between vaccinations and war in the US and what it tells us about the fight for religious and political freedom and individual liberty.
Go back in time with me to 1776, long before the Fourth of July was a day for barbecues and fireworks. We are in Boston, Philadelphia, and other important cities in colonial America. This time, concern was not about measles but the even more dreaded smallpox. In the first years of the Revolutionary War, General George Washington took command of a newly formed and named Continental Army. A catastrophic 90% of casualties in the Continental Army were from infectious diseases, with the lion’s share of these from smallpox, which at that time had a mortality rate of about 30%.2,3
Early efforts to introduce inoculation into the colonies had failed for many of the same reasons parents across the US today refuse immunization: fear and anxiety. When the renowned New England Puritan minister and scientist Cotton Mather attempted in 1721 to introduce variolation, his house was firebombed and his fellow clergy and physicians alleged that his efforts at inoculation were challenging God’s will to send a plague.3 Variolation was the now antiquated and then laborious process in which a previously unexposed individual was inoculated with material from the vesicle of someone infected with the disease.4,5 Variolation was practiced in parts of Africa and Asia and among wealthy Europeans but remained controversial in many colonies where few Americans had been exposed to smallpox or could afford the procedure.3
Notably, variolation was practiced before Edward Jenner famously demonstrated in 1798 that cowpox vaccine could provide immunity to smallpox. The majority of those inoculated developed a mild case of smallpox requiring a 5-week period of illness and recovery, after which they had lifelong immunity. During those 5 weeks, however, they remained a vector of disease for the uninoculated. Southern and New England colonies passed laws that prohibited variolation. Those anti-inoculation attitudes were the basis for the order given to the surgeons general of the Continental Army in 1776 that all inoculations of the troops were forbidden, despite the fact that perhaps only 25% of soldiers possessed any natural immunity.2,3
There was yet another reason that many colonial Americans opposed government-sponsored preventative care, and it was the same reason that they were fighting a war of independence: distrust and resentment of authority. The modern antivaccine movement voices similar fears and suspicions regarding public health campaigns and especially legislative efforts to mandate vaccinations or remove extant exemptions.
In 1775 in Boston, a smallpox outbreak occurred at the same time the Americans laid siege to the British troops occupying the city. Greater natural immunity to the scourge of smallpox either through exposure or variolation provided the British with a stronger defense than the mere city fortifications. There are even some suspicions that the British used the virus as a proto-biologic weapon.
General Washington had initially been against inoculation until he realized that without it the British might win the war. This possibility presented him with a momentous decision: inoculate despite widespread anxiety that variolation would spread the disease or risk the virus ravaging the fighting force. Perhaps the most compelling reason to variolate was that new recruits refused to sign up, fearing not that they would die in battle but of smallpox. In 1777, Washington mandated variolation of the nonimmune troops and new recruits, making it the first large-scale military preventative care measure in history.
In a foreshadowing of an ethical dilemma that still rages in the military nearly 3 centuries later, inoculation was voluntary for British soldiers but compulsory for the Americans. There was so much opposition to Washington's order that communications with surgeons were kept secret, and commanding officers had to oversee the inoculations.2,3
Washington’s policy not only contributed mightily to the American victory in the war, but also set the precedent for compulsory vaccination in the US military for the next 3 centuries. Currently, regulations require that service members be vaccinated for multiple infectious diseases. Of interest, this mandatory vaccination program has led to no reported cases of measles among military families to date, in part because of federal regulations requiring families of those service members to be vaccinated.6
Ironically, once General Washington made the decision for mass inoculation, he encountered little actual resistance among the troops. However, throughout military history some service members have objected to compulsory vaccination on medical, religious, and personal grounds. In United States v Chadwell, a military court ruled against 2 Marine Corps members who refused vaccination for smallpox, typhoid, paratyphoid, and influenza, citing religious grounds. The court opined that the military orders that ensure the health and safety of the armed forces and thereby that of the public override personal religious beliefs.7
The paradox of liberty—the liberty first won in the Revolutionary War—is that in a pluralistic representative democracy like ours to secure the freedom for all, some, such as the military, must relinquish the very choice to refuse. Their sacrifices grant liberty to others. On June 6, we commemorated the seventy-fifth anniversary of D-Day, remembering how great the cost of that eternal vigilance, which the patriot Thomas Paine said was the price of liberty. On Memorial Day, we remember all those men and women who died in the service of their country. And while they gave up the most precious gift, we must never forget that every person in uniform also surrenders many other significant personal freedoms so that their fellow civilians may exercise them.
The question General Washington faced is one that public health authorities and our legislators again confront. When should the freedom to refuse, which was won with the blood of many valiant heroes and has been defended since 1776, be curtailed for the greater good? We are the one nation in history that has made the defense of self-determination its highest value and in so doing, its greatest challenge.
1. Sun LH. Senate panel warns of dangers of anti-vaccine movement. https://www.washingtonpost.com/health/2019/03/05/combat-anti-vaxxers-us-needs-national-campaign-top-washington-state-official-says/?utm_term=.9a4201be0ed1. Published March 5, 2019. Accessed June 9, 2019.
2. Filsinger AL, Dwek R. George Washington and the first mass military inoculation. http://www.loc.gov/rr/scitech/GW&smallpoxinoculation.html. Published February 12, 2009. Accessed June 10, 2019.
3. Fenn EA. Pox Americana. New York: Hill and Wang; 2001.
4. Stedman’s Medical Dictionary. 28th edition. Philadelphia, PA: Lippincott, Williams & Wilkins; 2006.
5. Artenstein AW, Opal JM, Opal SM, Tramont EC, Georges P, Russell PK. History of U.S. military contributions to the study of vaccines and infectious diseases. Mil Med. 2005;170(suppl 4):3-11.
6. Jowers K. So far, no measles cases at military medical facilities—but officials are watching. https://www.militarytimes.com/pay-benefits/2019/04/19/so-far-no-measles-cases-at-military-medical-facilities-but-officials-are-watching/. Published April 19, 2019. Accessed June 9, 2019.
7. Cole JP, Swendiman KS. Mandatory vaccinations: precedent and current laws. https://fas.org/sgp/crs/misc/RS21414.pdf. Published May 21, 2014. Accessed June 10, 2019.
The disputes about those who decline to vaccinate their children for communicable infectious diseases, especially measles, have been in the headlines of late. Those refusals are often done in the name of “medical freedom.”1 Yet this is a much older debate for the military. It seems fitting in this month in which we celebrate the 243rd anniversary of the Declaration of Independence to reflect on the earliest history of the interaction between vaccinations and war in the US and what it tells us about the fight for religious and political freedom and individual liberty.
Go back in time with me to 1776, long before the Fourth of July was a day for barbecues and fireworks. We are in Boston, Philadelphia, and other important cities in colonial America. This time, concern was not about measles but the even more dreaded smallpox. In the first years of the Revolutionary War, General George Washington took command of a newly formed and named Continental Army. A catastrophic 90% of casualties in the Continental Army were from infectious diseases, with the lion’s share of these from smallpox, which at that time had a mortality rate of about 30%.2,3
Early efforts to introduce inoculation into the colonies had failed for many of the same reasons parents across the US today refuse immunization: fear and anxiety. When the renowned New England Puritan minister and scientist Cotton Mather attempted in 1721 to introduce variolation, his house was firebombed, and his fellow clergy and physicians alleged that inoculation defied God’s will in sending a plague.3 Variolation was the now antiquated and then laborious process in which a previously unexposed individual was inoculated with material from the vesicle of someone infected with the disease.4,5 Variolation was practiced in parts of Africa and Asia and among wealthy Europeans but remained controversial in many colonies where few Americans had been exposed to smallpox or could afford the procedure.3
It is important to note that variolation was practiced before Edward Jenner famously demonstrated in 1798 that cowpox vaccine could provide immunity to smallpox. The majority of those inoculated developed a mild case of smallpox, followed by a 5-week period of illness and recovery that conferred lifelong immunity. However, during those 5 weeks, they remained a vector of disease for the uninoculated. Southern and New England colonies passed laws that prohibited variolation. Those anti-inoculation attitudes were the basis for the order given to the surgeons general of the Continental Army in 1776 forbidding all inoculations of the troops, despite the fact that perhaps only 25% of soldiers possessed any natural immunity.2,3
There was yet another reason that many colonial Americans opposed government-sponsored preventative care, and it was the same reason that they were fighting a war of independence: distrust and resentment of authority. The modern antivaccine movement voices similar fears and suspicions regarding public health campaigns and especially legislative efforts to mandate vaccinations or remove extant exemptions.
In 1775 in Boston, a smallpox outbreak occurred at the same time the Americans laid siege to the British troops occupying the city. Greater natural immunity to the scourge of smallpox either through exposure or variolation provided the British with a stronger defense than the mere city fortifications. There are even some suspicions that the British used the virus as a proto-biologic weapon.
General Washington had initially been against inoculation until he realized that without it the British might win the war. This possibility presented him with a momentous decision: inoculate despite widespread anxiety that variolation would spread the disease or risk the virus ravaging the fighting force. Perhaps the most compelling reason to variolate was that new recruits refused to sign up, fearing not that they would die in battle but of smallpox. In 1777, Washington mandated variolation of the nonimmune troops and new recruits, making it the first large-scale military preventative care measure in history.
Foreshadowing an ethical dilemma that still arises in the military nearly two and a half centuries later, inoculation was voluntary for British soldiers but compulsory for the Americans. There was so much opposition to Washington’s order that communications with surgeons were kept secret, and commanding officers had to oversee the inoculations.2,3
Washington’s policy not only contributed mightily to the American victory in the war, but also set the precedent for compulsory vaccination in the US military for the next 3 centuries. Currently, regulations require that service members be vaccinated for multiple infectious diseases. Of interest, this mandatory vaccination program has led to no reported cases of measles among military families to date, in part because of federal regulations requiring families of those service members to be vaccinated.6
Ironically, once General Washington made the decision for mass inoculation, he encountered little actual resistance among the troops. However, throughout military history some service members have objected to compulsory vaccination on medical, religious, and personal grounds. In United States v Chadwell, a military court ruled against 2 Marine Corps members who refused vaccination for smallpox, typhoid, paratyphoid, and influenza, citing religious grounds. The court opined that the military orders that ensure the health and safety of the armed forces and thereby that of the public override personal religious beliefs.7
The paradox of liberty—the liberty first won in the Revolutionary War—is that in a pluralistic representative democracy like ours to secure the freedom for all, some, such as the military, must relinquish the very choice to refuse. Their sacrifices grant liberty to others. On June 6, we commemorated the seventy-fifth anniversary of D-Day, remembering how great the cost of that eternal vigilance, which the patriot Thomas Paine said was the price of liberty. On Memorial Day, we remember all those men and women who died in the service of their country. And while they gave up the most precious gift, we must never forget that every person in uniform also surrenders many other significant personal freedoms so that their fellow civilians may exercise them.
The question General Washington faced is one that public health authorities and our legislators again confront. When should the freedom to refuse, which was won with the blood of many valiant heroes and has been defended since 1776, be curtailed for the greater good? We are the one nation in history that has made the defense of self-determination its highest value and in so doing, its greatest challenge.
1. Sun LH. Senate panel warns of dangers of anti-vaccine movement. https://www.washingtonpost.com/health/2019/03/05/combat-anti-vaxxers-us-needs-national-campaign-top-washington-state-official-says/?utm_term=.9a4201be0ed1. Published March 5, 2019. Accessed June 9, 2019.
2. Filsinger AL, Dwek R. George Washington and the first mass military inoculation. http://www.loc.gov/rr/scitech/GW&smallpoxinoculation.html. Published February 12, 2009. Accessed June 10, 2019.
3. Fenn EA. Pox Americana. New York: Hill and Wang; 2001.
4. Stedman’s Medical Dictionary. 28th ed. Philadelphia, PA: Lippincott, Williams & Wilkins; 2006.
5. Artenstein AW, Opal JM, Opal SM, Tramont EC, Georges P, Russell PK. History of U.S. military contributions to the study of vaccines and infectious diseases. Mil Med. 2005;170(suppl 4):3-11.
6. Jowers K. So far, no measles cases at military medical facilities—but officials are watching. https://www.militarytimes.com/pay-benefits/2019/04/19/so-far-no-measles-cases-at-military-medical-facilities-but-officials-are-watching/. Published April 19, 2019. Accessed June 9, 2019.
7. Cole JP, Swendiman KS. Mandatory vaccinations: precedent and current laws. https://fas.org/sgp/crs/misc/RS21414.pdf. Published May 21, 2014. Accessed June 10, 2019.
Psychosis as a common thread across psychiatric disorders
Ask a psychiatrist to name a psychotic disorder, and the answer will most likely be “schizophrenia.” But if you closely examine the symptom structure of DSM-5 psychiatric disorders, you will note the presence of psychosis in almost all of them.
Fixed false beliefs and impaired reality testing are core features of psychosis. Those are certainly prominent in severe psychoses such as schizophrenia, schizoaffective disorder, or delusional disorder. But psychosis is actually a continuum of varying severity across most psychiatric disorders, although they carry different diagnostic labels. Irrational false beliefs and impaired functioning due to poor reality testing are embedded among many DSM-5 disorders. Hallucinations are less common; they are perceptual aberrations, not thought abnormalities, although they can trigger delusional explanations as to their causation.
Consider the following:
- Bipolar disorder. A large proportion of patients with bipolar disorder manifest delusions, usually grandiose, but often paranoid or referential.
- Major depressive disorder (MDD). Although regarded as a “pure mood disorder,” the core symptoms of MDD—self-deprecation and sense of worthlessness—as well as the poor reality testing of suicidal thoughts (that death is a better option than living) are psychotic false beliefs.
- Anxiety and panic disorder. The central symptom in anxiety and panic attacks is a belief in impending doom and/or death. The fear in anxiety disorders is actually based on a false belief (eg, if I get on the plane, it will crash, and I will die). Thus, technically an irrational/psychotic thought process underpins the terror and fear of anxiety disorders.
- Borderline personality disorder. Frank psychotic symptoms, such as paranoid beliefs, are known to be a component of borderline personality disorder symptoms. Although these symptoms tend to be brief and episodic, they can have a deleterious effect on the person’s coping and relationships.
- Other personality disorders. While many individuals with narcissistic personality disorder are functional, their exaggerated sense of self-importance, entitlement, and self-aggrandizement certainly qualifies as a fixed false belief. Patients with other personality disorders, such as schizotypal and paranoid, are known to harbor false beliefs or magical thinking.
- Body dysmorphic disorder. False beliefs about one’s appearance (such as blemishes or asymmetry) are at the center of this disorder, and it meets the litmus test of a psychosis.
- Anorexia nervosa. This disorder is well known to be characterized by a fixed false belief that one is “fat,” even when the patient’s body borders on being cachectic in appearance according to objective observers.
- Autism. This spectrum of disorders includes false beliefs that drive the ritualistic or odd behaviors.
- Obsessive-compulsive disorder. Although obsessions are usually ego-dystonic, in severe cases, they become ego-syntonic, similar to delusions. On the other hand, compulsions are often driven by a false belief, such as believing that one’s hands are dirty and must be washed incessantly, or that the locks on the door must be rechecked repeatedly because an intruder may break into the house and harm the inhabitants.
- Neurodegenerative syndromes. Neurodegenerative syndromes are neuropsychiatric disorders that very frequently include psychotic symptoms, such as paranoid delusions, delusions of marital infidelity, Capgras syndrome, or folie à deux. These disorders include Alzheimer’s disease, Parkinson’s disease, Lewy body dementia, frontotemporal dementia, metachromatic leukodystrophy, Huntington’s chorea, temporal lobe epilepsy, stroke, xenomelia, reduplicative phenomena, and others. This reflects the common emergence of faulty thinking with disintegration of neural tissue, both gray and white matter.
So it should not be surprising that antipsychotic medications, especially second-generation agents, have been shown to be helpful as monotherapy or adjunctive therapy in practically all the above psychiatric disorders, whether on-label or off-label.
Finally, it should also be noted that a case has been made for the existence of a single dimension underlying all mental disorders that manifests in multiple psychopathologies.1 It is possible that a continuum of delusional thinking is a common thread across many psychiatric disorders due to this putative shared dimension. A milder form of this dimension may also explain the presence of pre-psychotic thinking in a significant proportion of the general population who do not seek psychiatric help.2 Just think of how many of the people you befriend, socialize with, and regard as perfectly “normal” endorse wild superstitions and astrological predictions, or believe in various conspiracy theories that have no basis in reality.
To comment on this editorial or other topics of interest: [email protected].
1. Caspi A, Moffitt TE. All for one and one for all: mental disorders in one dimension. Am J Psychiatry. 2018;175(9):831-844.
2. van Os J, Linscott RJ, Myin-Germeys I, et al. A systematic review and meta-analysis of the psychosis continuum: evidence for a psychosis proneness-persistence-impairment model of psychotic disorder. Psychol Med. 2009;39(2):179-195.
Factors that change our brains; The APA’s stance on neuroimaging
Factors that change our brains
I greatly enjoyed Dr. Nasrallah’s editorial, “Your patient’s brain is different at every visit” (From the Editor,
In reading this editorial, it is clear that the myriad factors we consider and address with our patients at each visit underlie intricate neurobiologic mechanisms and processes that continue to deepen our understanding of the brain. In discussing the changes taking place in our patients, I can’t help but wonder what changes are also occurring in our own brains (as Dr. Nasrallah noted). What would be the resulting impact of these changes on our next patient interaction, or on subsequent interactions with the same patient? Looking through the editorial’s bullet points, many (if not all) of the factors contributing to brain changes apply equally and naturally to clinicians as well as patients. In this light, the editorial serves not only as a broad guideline for patient psychoeducation but also as a reminder of wellness and well-being for clinicians.
As a “fresh-out-of-training” psychiatrist, I can definitely work on several of these factors, such as diet and exercise. Trainees and residents may be especially susceptible to overlooking these factors in themselves, even as they base the clinical advice they give their patients on them. As a child psychiatrist, I value the importance of modeling healthy behaviors for my patients and their families, and with coworkers and colleagues. Given the impact these factors have on our brains, it is important to emphasize what we can do to further strengthen rapport and therapeutic value through modeling. I strive to model the desired behaviors, attitudes, and dynamics that are the external, observable manifestation of what takes place in my brain. To do so, I understand I need to be mindful in proactively managing the contributing factors, such as those listed in Dr. Nasrallah’s editorial. I imagine patients and their families would easily notice if we were in suboptimal physical or mental health that left us less than prompt, fully engaged, or receptive. I believe that attending to these facets during training falls under the umbrella of professionalism. Being a professional in our field often entails practicing what we preach. So I am grateful that what we preach is informed by our field’s exciting research, continued advancements, and expertise that benefits our patients and us, professionally and personally.
Philip Yen-Tsun Liu, MD
Child and adolescent psychiatrist
innovaTel Telepsychiatry
San Antonio, Texas
Dr. Nasrallah responds
I would like to thank Dr. Liu for his thoughtful response to my editorial. He is clearly cognizant of the fact that experiential neuroplasticity and brain tissue remodeling occur in both the patient and the physician. I admire his focus on psychoeducation, wellness, and professionalism. He is right that we as psychiatrists (and nurse practitioners) must be role models for our patients in multiple ways, because doing so may help enhance clinical outcomes and have a positive impact on their brains.
I would also like to point Dr. Liu to the editorial “The most powerful placebo is not a pill” (From the Editor,
Henry A. Nasrallah, MD
Editor-in-Chief
Sydney W. Souers Endowed Chair
Professor and Chairman
Department of Psychiatry and Behavioral Neuroscience
Saint Louis University School of Medicine
St. Louis, Missouri
The APA’s stance on neuroimaging
Can anyone in the modern world argue that the brain is irrelevant to psychiatry? Yet surprisingly, in September 2018, the American Psychiatric Association (APA) officially declared that neuroimaging of the brain has no clinical value in psychiatry.1
Unfortunately, the APA focused almost exclusively on functional magnetic resonance imaging (fMRI) and neglected an extensive library of studies of single-photon emission computed tomography (SPECT) and positron emission tomography (PET). The APA’s position on neuroimaging is as follows1,2:
- A neuroimaging finding must have a sensitivity and specificity (S/sp) of no less than 80%.
- The psychiatric imaging literature does not support using neuroimaging in psychiatric diagnostics or treatment.
- Neuroimaging has not had a significant impact on the diagnosis and treatment of psychiatric disorders.
The APA set unrealistic standards for biomarkers in a field that lacks pathologic markers of specific disease entities.3 Moreover, numerous widely used tests fall below the APA’s unrealistic S/sp cutoff, including the Hamilton Depression Rating Scale,4 the Zung Depression Scale,5 the clock drawing test,6 and even the chest X-ray.3 Curiously, numerous replicated SPECT and PET studies were not included in the APA’s analysis.1-3 For example, in a study of 196 veterans, posttraumatic stress disorder was distinguished from traumatic brain injury with an S/sp of 0.92/0.85.7,8 Also, fluorodeoxyglucose (FDG)-PET has an S/sp of 0.84/0.74 in differentiating patients with Alzheimer’s disease from controls, while perfusion SPECT, using multi-detector cameras, has an S/sp of 0.93/0.84.3,9 Moreover, both FDG-PET and SPECT can differentiate other forms of dementia from Alzheimer’s disease, yielding an additional benefit compared with amyloid imaging alone.2,9 As President of the International Society of Applied Neuroimaging, I suggest that neuroimaging should not be feared. Neuroimaging does not replace the diagnostician; rather, it aids him or her in complex cases.
Theodore A. Henderson, MD, PhD
President
Neuro-Luminance Brain Health Centers, Inc.
Denver, Colorado
Director
The Synaptic Space
Vice President
The Neuro-Laser Foundation
President
International Society of Applied Neuroimaging
Centennial, Colorado
Disclosure
The author has no ownership in, and receives no remuneration from, any neuroimaging company.
References
1. First MB, Drevets WC, Carter C, et al. Clinical applications of neuroimaging in psychiatric disorders. Am J Psychiatry. 2018;175:
2. First MB, Drevets WC, Carter C, et al. Data supplement for Clinical applications of neuroimaging in psychiatric disorders. Am J Psychiatry. 2018;175(suppl).
3. Henderson TA. Brain SPECT imaging in neuropsychiatric diagnosis and monitoring. EPatient. http://nmpangea.com/2018/10/09/738/. Published 2018. Accessed May 31, 2019.
4. Bagby RM, Ryder AG, Schuller DR, et al. The Hamilton Depression Rating Scale: has the gold standard become a lead weight? Am J Psychiatry. 2004;161(12):2163-2177.
5. Biggs JT, Wylie LT, Ziegler VE. Validity of the Zung Self-rating Depression Scale. Br J Psychiatry. 1978;132:381-385.
6. Seigerschmidt E, Mösch E, Siemen M, et al. The clock drawing test and questionable dementia: reliability and validity. Int J Geriatr Psychiatry. 2002;17(11):1048-1054.
7. Raji CA, Willeumier K, Taylor D, et al. Functional neuroimaging with default mode network regions distinguishes PTSD from TBI in a military veteran population. Brain Imaging Behav. 2015;9(3):527-534.
8. Amen DG, Raji CA, Willeumier K, et al. Functional neuroimaging distinguishes posttraumatic stress disorder from traumatic brain injury in focused and large community datasets. PLoS One. 2015;10(7):e0129659. doi: 10.1371/journal.pone.0129659.
9. Henderson TA. The diagnosis and evaluation of dementia and mild cognitive impairment with emphasis on SPECT perfusion neuroimaging. CNS Spectr. 2012;17(4):176-206.
Wellness seminars won’t fix burnout
“Burnout” has been defined as long-term, unresolvable job stress that leads to exhaustion, depression, and in some tragic circumstances, suicide. One of our lead articles this month concerns an attempt to place a financial cost on physician burnout. More important, I think, is the toll burnout takes on individuals, their families, and their patients. In my role as Chief Clinical Officer of the University of Michigan Medical Group (our faculty and other clinical providers), I struggle to balance productivity demands with the increasing damage such demands are doing to our clinicians. Few primary care physicians at Michigan Medicine work full-time as clinicians (defined as 32 hours of patient-facing time per week for 46 weeks). Almost all request part-time status if they do not have protected, grant-funded time. They simply cannot keep up with the documentation required in our electronic health record, combined with our “patient-friendly” access via the electronic portal. One-third of the private practice group I helped build was part-time when I left in 2012, and it is not unusual to hear complaints about burnout from my ex-partners.
Let’s be clear: burnout is not going to be solved by increasing the resilience of our physicians or sending us to wellness seminars. That approach is a direct blame-the-victim paradigm. Physicians are burned out because of the constant assault on the core reason we entered medicine – to help people (an assault that has been termed “moral injury”). Best practice alerts (BPAs), coding requirements, inbox demands, prior authorizations (see the practice management section of this issue), electronic order entry, and most other practice enhancement tools rely on the willingness of physicians to sacrifice more time and energy to sit in front of a computer screen.
Salvation of our health care system will not come from mass retirements (although those are happening), concierge practices, part-time status, or other individual responses to this crisis. We will need a fundamental reorganization of our practice, in which we physicians confine our work to the activities for which we trained and shift nonphysician work to others, supported by better technology, virtual visits, and ancillary personnel. Patient expectations must become realistic, and legal protections need strengthening. The politics of health care has focused on funds flow and ideology. We need a stronger voice that articulates the daily microaggressions we each endure as we try to live up to Oslerian physician ideals.
John I. Allen, MD, MBA, AGAF
Editor in Chief