Update on Tinea Capitis Diagnosis and Treatment
Tinea capitis (TC) most often is caused by Trichophyton tonsurans and Microsporum canis. The peak incidence is between 3 and 7 years of age. Noninflammatory TC typically presents as fine scaling with single or multiple scaly patches of circular alopecia (grey patches); diffuse or patchy, fine, white, adherent scaling of the scalp resembling generalized dandruff with subtle hair loss; or single or multiple patches of well-demarcated areas of alopecia with fine scale studded with broken-off hairs at the scalp surface, resulting in a black dot appearance. Inflammatory variants of TC include kerion and favus.1 Herein, updates on diagnosis, treatment, and monitoring of TC are provided, as well as a discussion of changes in the fungal microbiome associated with TC. Lastly, insights into some queries that practitioners may encounter when treating children with TC are provided.
Genetic Susceptibility
Molecular techniques have identified a number of macrophage regulator, leukocyte activation and migration, and cutaneous permeability genes associated with susceptibility to TC. These findings indicate that genetically determined deficiency in adaptive immune responses may affect the predisposition to dermatophyte infections.2
Clinical Varieties of Infection
Dermatophytes causing ringworm are capable of invading the hair shafts and can simultaneously invade smooth or glabrous skin (eg, T tonsurans, Trichophyton schoenleinii, Trichophyton violaceum). Some causative dermatophytes can even penetrate the nails (eg, Trichophyton soudanense). The clinical presentation is dependent on 3 main patterns of hair invasion3:
• Ectothrix: A mid-follicular pattern of invasion with hyphae growing down to the hair bulb that commonly is caused by Microsporum species. It clinically presents with scaling and inflammation with hair shafts breaking 2 to 3 mm above the scalp level.
• Endothrix: This pattern is nonfluorescent on Wood lamp examination, and hairs often break at the scalp level (black dot type). Trichophyton tonsurans, T soudanense, Trichophyton rubrum, and T violaceum are common causes.
• Favus: In this pattern, T schoenleinii is a common cause, and hairs grow to considerable lengths above the scalp with less damage than the other patterns. The hair shafts present with characteristic air spaces, and hyphae form clusters at the level of the epidermis.
Diagnosis
Optimal treatment of TC relies on proper identification of the causative agent. Fungal culture remains the gold standard of mycologic diagnosis despite its delayed results: proper identification of the fungal colonies may take up to 4 weeks and requires ample expertise to interpret their morphologic features.4
Other tests such as the potassium hydroxide preparation are nonspecific and do not identify the dermatophyte species. Although this method has been reported to yield 5% to 15% false-negative results in routine practice depending on the skill of the observer and the quality of sampling, microscopic examination remains essential, as it may allow the clinician to start treatment sooner pending culture results. The Wood lamp is not suitable for definitive species identification; it primarily is useful for observing fluorescence in ectothrix infection caused by Microsporum species, whereas Trichophyton species, which cause endothrix infections, do not fluoresce, with the exception of T schoenleinii.5

Polymerase chain reaction is a sensitive technique that can help identify both the genus and species of common dermatophytes. Common target sequences include the ribosomal internal transcribed spacer and translation elongation factor 1α. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry also has become popular for dermatophyte identification.6

Trichoscopic diagnosis of TC, which is simple and noninvasive, is becoming increasingly popular. Features such as short, broken, black dot, comma, corkscrew, and/or zigzag hairs, as well as perifollicular scaling, are helpful for diagnosing TC (Figure). Moreover, trichoscopy can be useful for differentiating other common causes of hair loss, such as trichotillomania and alopecia areata. It has been reported that the trichoscopic features of TC begin to resolve as early as 2 weeks after starting treatment, making this a reliable interval at which to follow up with the patient to evaluate progress; the disappearance of black dots and comma hairs can be appreciated from 2 weeks onward on trichoscopic evaluation.4
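For clinicians or informaticists who encode such rules in decision-support tools, the fluorescence expectations above can be captured in a small lookup. The following Python sketch is purely illustrative, assuming a hypothetical expected_wood_lamp helper and covering only the species named in this article; it is not a diagnostic tool.

```python
# Minimal illustrative sketch (not a diagnostic tool) encoding the
# Wood lamp expectations described above: Microsporum ectothrix
# infections fluoresce; Trichophyton endothrix infections do not,
# with T schoenleinii (favus) as the exception.
WOOD_LAMP_FLUORESCES = {
    "Microsporum canis": True,           # ectothrix: fluoresces
    "Trichophyton tonsurans": False,     # endothrix: nonfluorescent
    "Trichophyton violaceum": False,     # endothrix: nonfluorescent
    "Trichophyton rubrum": False,        # endothrix: nonfluorescent
    "Trichophyton soudanense": False,    # endothrix: nonfluorescent
    "Trichophyton schoenleinii": True,   # favus: the Trichophyton exception
}

def expected_wood_lamp(species: str) -> str:
    """Return the expected Wood lamp finding for a species (illustrative)."""
    fluoresces = WOOD_LAMP_FLUORESCES.get(species)
    if fluoresces is None:
        return "unknown; rely on culture, PCR, or MALDI-TOF for identification"
    return "fluorescent" if fluoresces else "nonfluorescent"

print(expected_wood_lamp("Trichophyton tonsurans"))  # -> nonfluorescent
```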
Treatment
The common recommendation for first-line treatment of TC is the use of systemic antifungals, with a topical agent as an adjuvant to prevent the spread of fungal spores. For almost 6 decades, griseofulvin was the gold-standard fungistatic agent for treating TC in patients older than 2 years, until the 2007 US Food and Drug Administration (FDA) approval of fungicidal terbinafine oral granules for treatment of TC in patients older than 4 years.7
Meta-analyses have demonstrated that a 4-week course of terbinafine is comparable overall to a 6-week course of griseofulvin for TC, with relative efficacy depending on the infecting organism: terbinafine was superior for T tonsurans and similar for T violaceum, whereas griseofulvin was superior for M canis and other Microsporum species.8,9
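Read as a decision rule, the organism-dependent pattern above might be sketched as follows. The first_line_oral_agent helper is a hypothetical name introduced for illustration; the sketch deliberately omits dosing, duration, and patient factors, and is not clinical guidance.

```python
def first_line_oral_agent(genus: str) -> str:
    """Suggest a first-line oral agent by genus, following the
    meta-analysis pattern summarized above (illustrative only)."""
    genus = genus.strip().lower()
    if genus == "trichophyton":
        # Terbinafine (4 weeks): superior for T tonsurans,
        # similar to griseofulvin for T violaceum.
        return "terbinafine"
    if genus == "microsporum":
        # Griseofulvin (6 weeks): superior for M canis and
        # other Microsporum species.
        return "griseofulvin"
    return "identify the organism first (culture or PCR)"

print(first_line_oral_agent("Microsporum"))  # -> griseofulvin
```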
The off-label use of fluconazole and itraconazole to treat TC is gaining popularity, with a limited number of trials providing growing evidence of their effectiveness. There is little clinical evidence to support the use of other oral antifungals, including newer azoles such as voriconazole and posaconazole.9
Newer limited evidence suggests that the off-label use of photodynamic therapy is a promising alternative to systemic antifungal therapy for TC, pending validation in larger trials.10

In my practice, I have found that severe cases of TC demonstrating inflammation or possible widespread id reactions are better treated with oral steroids. Ketoconazole or selenium sulfide shampoo used 2 to 3 times weekly to prevent spread in the early phases of therapy is a good adjunct to systemic treatment. Cases with kerions should be assessed for the possibility of a coexisting bacterial infection under the crusts, and if one is confirmed, antibiotics should be started.9

The commonly used systemic antifungals generally are safe with a low side-effect profile, but there is a risk for hepatotoxicity. The FDA recommends that baseline alanine transaminase and aspartate transaminase levels be obtained prior to beginning a terbinafine-based treatment regimen.11 The American Academy of Pediatrics has specifically stated that laboratory testing of serum hepatic enzymes is not required if a griseofulvin-based regimen does not exceed 8 weeks; however, transaminase levels (alanine transaminase and aspartate transaminase) should be considered at baseline in patients using terbinafine or if treatment is prolonged beyond 4 to 6 weeks.12 In agreement with the FDA guidelines, the Canadian Paediatric Society has suggested that liver enzymes be monitored periodically in patients treated with terbinafine beyond 4 to 6 weeks.13
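Because the monitoring recommendations differ by agent, duration, and guideline body, laying them out as explicit conditions can help. Below is a minimal sketch of the rules summarized above, assuming a hypothetical transaminase_monitoring function; it condenses guideline language and is not a clinical protocol — consult the primary sources.

```python
def transaminase_monitoring(drug: str, planned_weeks: int) -> list[str]:
    """Summarize the ALT/AST monitoring guidance described above
    (FDA, AAP, CPS). Illustrative sketch only, not a protocol."""
    advice = []
    drug = drug.strip().lower()
    if drug == "terbinafine":
        advice.append("Obtain baseline ALT/AST before starting (FDA; AAP concurs).")
        if planned_weeks > 6:  # conservative end of the 4- to 6-week threshold
            advice.append("Monitor ALT/AST periodically when treatment "
                          "extends beyond 4 to 6 weeks (AAP, CPS).")
    elif drug == "griseofulvin":
        if planned_weeks <= 8:
            advice.append("Routine hepatic enzyme testing not required "
                          "for courses of 8 weeks or less (AAP).")
        else:
            advice.append("Course exceeds 8 weeks; the AAP exemption no "
                          "longer applies, so consider monitoring (assumption).")
    return advice

for line in transaminase_monitoring("terbinafine", 8):
    print(line)
```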
Changes in the Fungal Microbiome
Research has shown that changes in the fungal microbiome were associated with an altered bacterial community in patients with TC. During fungal infection, the relative abundances of Cutibacterium and Corynebacterium increased, and the relative abundance of Streptococcus decreased. In addition, some uncommon bacterial genera such as Herbaspirillum and Methylorubrum were detected on the scalp in TC.14
Carrier State
The carrier state describes siblings and other contacts of patients who have a clinically normal scalp but a positive fungal culture. These individuals can represent a reservoir responsible for contamination (or recontamination) of the patient as well as treatment failure. Opinions remain divided as to whether to treat these carriers with oral antifungal therapy or with antifungal shampoos containing ketoconazole or povidone-iodine alone. Despite the paucity of available data, in my experience antifungal shampoos are sufficient for such carriers. In zoophilic infections, it also is important to identify and treat the animal source.6-9
Final Thoughts
Successful treatment of TC requires accurate identification of the pathogen, which commonly is achieved via fungal culture. Despite its practical value, conventional identification of dermatophytes based on morphologic features can be highly challenging due to low positivity rates and delayed results. Trichoscopy is a quick, handy, and noninvasive tool that can support the diagnosis and also is helpful for following treatment progress. With a better understanding of the immunology and genetic susceptibility associated with TC, the current treatment pipeline offers promise for better control of this condition. Increased surveillance, prompt diagnosis, and early initiation of systemic treatment are the keys to preventing the spread of TC.
1. Leung AKC, Hon KL, Leong KF, et al. Tinea capitis: an updated review. Recent Pat Inflamm Allergy Drug Discov. 2020;14:58-68.
2. Abdel-Rahman SM, Preuett BL. Genetic predictors of susceptibility to cutaneous fungal infections: a pilot genome wide association study to refine a candidate gene search. J Dermatol Sci. 2012;67:147-152.
3. Hay RJ. Tinea capitis: current status. Mycopathologia. 2017;182:87-93.
4. Wahbah HR, Atallah RB, Eldahshan RM, et al. A prospective clinical and trichoscopic study of tinea capitis in children during treatment [published online May 23, 2022]. Dermatol Ther. 2022;35:E15582. doi:10.1111/dth.15582
5. Salehi Z, Shams-Ghahfarokhi M, Razzaghi-Abyaneh M. Molecular epidemiology, genetic diversity, and antifungal susceptibility of major pathogenic dermatophytes isolated from human dermatophytosis. Front Microbiol. 2021;12:643509.
6. Lamisil. Package insert. Novartis; 2011. Accessed October 17, 2022. https://www.accessdata.fda.gov/drugsatfda_docs/label/2012/020539s021lbl.pdf
7. Gupta AK, Drummond-Main C. Meta-analysis of randomized, controlled trials comparing particular doses of griseofulvin and terbinafine for the treatment of tinea capitis. Pediatr Dermatol. 2013;30:1-6.
8. Tey HL, Tan AS, Chan YC. Meta-analysis of randomized, controlled trials comparing griseofulvin and terbinafine in the treatment of tinea capitis. J Am Acad Dermatol. 2011;64:663-670.
9. Gupta AK, Friedlander SF, Simkovich AJ. Tinea capitis: an update. Pediatr Dermatol. 2022;39:167-172.
10. Aspiroz C, Melcon B, Cerro PA, et al. Tinea capitis caused by Microsporum canis treated with methyl-aminolevulinate daylight photodynamic therapy and ketoconazole shampooing. Photodermatol Photoimmunol Photomed. 2021;37:567-568.
11. Aleohin N, Bar J, Bar-Ilan E, et al. Laboratory monitoring during antifungal treatment of paediatric tinea capitis. Mycoses. 2021;64:157-161.
12. Kimberlin DW, Brady MT, Jackson MA, et al, eds. Tinea capitis. In: Red Book 2018-2021: Report of the Committee of Infectious Diseases. American Academy of Pediatrics; 2018:798-801.
13. Bortolussi R, Martin S, Audcent T, et al. Antifungal agents for common outpatient paediatric infections. Canadian Paediatric Society website. Published June 20, 2019. Accessed October 4, 2022. https://www.cps.ca/en/documents/position/antifungal-agents-common-infections
14. Tao R, Zhu P, Zhou Y, et al. Altered skin fungal and bacterial community compositions in tinea capitis. Mycoses. 2022;65:834-840.
A 95-year-old White male with hypertension presented with itchy patches and bullae on the trunk and extremities
Bullous pemphigoid (BP) is an autoimmune blistering disease that typically occurs in elderly patients and is associated with various predisposing factors, including HLA genes, comorbidities, aging, and trigger factors such as drugs, trauma, radiation, chemotherapy, and infections. The autoimmune reaction is mediated by a dysregulation of T cells in which IgG and IgE autoantibodies form against hemidesmosomal proteins (BP180 and BP230). These autoantibodies induce neutrophil activation and recruitment, leading to degradation of the basement membrane of the skin.
Typically, patients present with intense pruritus followed by an urticarial or eczematous eruption. Tense blisters and bullae occur commonly on the trunk and extremities. Drug-associated bullous pemphigoid (DABP) is a common manifestation of the disease with histologic and immunologic features similar to those of the idiopathic version. Eruptions can be triggered by systemic or topical medications, and incidence of these reactions may be related to a genetic predisposition for the disease.
Some research suggests that drug-induced changes to the antigenic properties of the epidermal basement membrane result in an augmented immune response, while others point to structural modification in these zones that stimulate the immune system. Thiol- and phenol-based drugs have been largely implicated in the development of DABP because they are capable of structural modification and disruption of the dermo-epidermal junction in the basement membrane.
DABP often presents in patients taking multiple medications. Some of the most commonly implicated medications are gliptins, PD-1 inhibitors, diuretics, antibiotics, anti-inflammatory drugs, ACE inhibitors, and other cardiovascular drugs. Unlike its idiopathic counterpart, which is mostly confined to the skin, DABP may present with mucosal eruptions.
Two punch biopsies were taken from this patient. Histopathology revealed an eosinophil-rich subepidermal blister with a smooth epidermal undersurface, consistent with bullous pemphigoid. Direct immunofluorescence was positive, with deposition of IgG and C3 on the epidermal side of the salt-split basement membrane zone.
Treatment for BP includes high-potency topical and systemic steroids. Tetracyclines and niacinamide have been reported to improve the condition. Treatment is tailored to promote cutaneous healing and control pruritus, but the physician must be mindful of the patient's comorbidities and capacity for self-care. Prognosis often is better for DABP, as withdrawal of the offending medication greatly accelerates clearance of the lesions. Worse prognosis is associated with a greater number of comorbidities and older age. Our patient's BP currently is controlled with topical steroids and oral doxycycline.
This case and photo were submitted by Lucas Shapiro, BS, Nova Southeastern University College of Osteopathic Medicine, Tampa, and Dr. Bilu Martin.
Dr. Bilu Martin is a board-certified dermatologist in private practice at Premier Dermatology, MD, in Aventura, Fla. More diagnostic cases are available at mdedge.com/dermatology. To submit a case for possible publication, send an email to [email protected].
Four methods to chip away at imposter syndrome
Regardless of the setting, one of the most frequently discussed topics in health care is imposter syndrome.
Imposter syndrome was first defined by Clance and Imes as an inability to internalize success, and the tendency to attribute success to external causes such as luck, error, or knowing the appropriate individual.1 This definition is essential because most health care professionals have had a sense of doubt or questioned the full extent of their competencies in various situations. I would argue that this is normal and – within reason – helpful to the practice of medicine. The problem with true imposter syndrome is that the individual does not incorporate success in a way that builds healthy self-esteem and self-efficacy.2
Imposter syndrome has a very nasty way of interacting with burnout. Studies have shown that imposter syndrome can be associated with high levels of emotional exhaustion at work.3 In my experience, this makes clinical sense. Professionals suffering from imposter syndrome can spend a great deal of time and energy trying to maintain a particular image.4 They are acting a part 24/7. Have you ever seriously tried to act? It’s arduous work. A friend once asked me to read a role for a play because “you’d be great; you’re a natural.” By the time I was done with rehearsal, I felt like I had run a 4-by-400-meter relay, by myself, in Victoria, Tex.
And any talk of imposter syndrome must include its running mate, perfectionism. These two conditions exist together so commonly it can be a bit of a chicken or egg question as to which came first.
Imposter syndrome, perfectionism, and burnout can form a deadly triad if not recognized and addressed quickly. In medicine, perfectionism can be a coping strategy that sets up unrelenting standards. Failure to meet unrelenting standards then serves as fuel and validation for imposter syndrome and emotional exhaustion. The consequences of this cycle going unchecked over a health care professional’s career are seismic and can include downstream effects ranging from depression to suicide.
Some readers will relate to this, while others will shrug their shoulders and say that this has never happened in their professional life. I get it. However, I would now ask if you have ever felt like an imposter in your personal life. I’ll make a cup of tea and wait for you to figure out precisely what is the boundary between your personal and professional life. Okay, all done? Great. Now I’ll give you some more time to sincerely reflect if any of the traits of imposter syndrome have described you at times in your personal life. Hmmm, interesting to think about, isn’t it?
I believe that health care professionals frequently use one credit card to pay off another, but the debt remains the same. So even if things are going well at work, we may have just shifted the debt to our personal lives. (At some point in the future, I’ll share my 10 greatest father fails to date to elucidate my point.)
In my work at the GW Resiliency and Well-Being Center, I’ve gravitated toward a few methods supported by evidence that help alleviate imposter syndrome symptoms and potentially serve as protective factors against the future development of imposter syndrome.4 These include but are not limited to:
- Keep a record of small personal successes that are yours alone.
- Have a mentor to share failures with.
- Use personal reflection to examine what it means to successfully reach your goals and fulfill your purpose, not a relative value unit target.
- Share experiences with each other, so you know you’re not alone.
The last method is one of my favorites because it involves connecting to others and shining a light on our shared experiences and, coincidentally, our collective strengths. Once this collective strength is realized, the circumstances of that 4-by-400-meter relay change drastically. Be safe and well, everyone.
Lorenzo Norris, MD, is a psychiatrist and chief wellness officer for the George Washington University Medical Enterprise and serves as associate dean of student affairs and administration for the George Washington University School of Medicine and Health Sciences. A version of this article first appeared on Medscape.com.
References
1. Clance PR, Imes SA. The imposter phenomenon in high achieving women: Dynamics and therapeutic intervention. Psychotherapy: Theory, Research & Practice. 1978;15(3): 241-7. doi: 10.1037/h0086006.
2. Thomas M, Bigatti S. Perfectionism, impostor phenomenon, and mental health in medicine: A literature review. Int J Med Educ. 2020 Sep 28;11:201-3. doi: 10.5116/ijme.5f54.c8f8.
3. Liu RQ et al. Impostorism and anxiety contribute to burnout among resident physicians. Med Teach. 2022 Jul;44(7):758-64. doi: 10.1080/0142159X.2022.2028751.
4. Gottlieb M et al. Impostor syndrome among physicians and physicians in training: A scoping review. Med Educ. 2020 Feb;54(2):116-24. doi: 10.1111/medu.13956.
Regardless of the setting, one of the most frequently discussed topics in health care is imposter syndrome.
Imposter syndrome was first defined by Clance and Imes as an inability to internalize success, and the tendency to attribute success to external causes such as luck, error, or knowing the appropriate individual.1 This definition is essential because most health care professionals have had a sense of doubt or questioned the full extent of their competencies in various situations. I would argue that this is normal and – within reason – helpful to the practice of medicine. The problem with true imposter syndrome is that the individual does not incorporate success in a way that builds healthy self-esteem and self-efficacy.2
Imposter syndrome has a very nasty way of interacting with burnout. Studies have shown that imposter syndrome can be associated with high levels of emotional exhaustion at work.3 In my experience, this makes clinical sense. Professionals suffering from imposter syndrome can spend a great deal of time and energy trying to maintain a particular image.4 They are acting a part 24/7. Have you ever seriously tried to act? It’s arduous work. A friend once asked me to read a role for a play because “you’d be great; you’re a natural.” By the time I was done with rehearsal, I felt like I had run a 4-by-400-meter relay, by myself, in Victoria, Tex.
And any talk of imposter syndrome must include its running mate, perfectionism. These two conditions exist together so commonly it can be a bit of a chicken or egg question as to which came first.
Imposter syndrome, perfectionism, and burnout can form a deadly triad if not recognized and addressed quickly. In medicine, perfectionism can be a coping strategy that sets up unrelenting standards. Failure to meet unrelenting standards then serves as fuel and validation for imposter syndrome and emotional exhaustion. The consequences of this cycle going unchecked over a health care professional’s career are seismic and can include downstream effects ranging from depression to suicide.
Some readers will relate to this, while others will shrug their shoulders and say that this has never happened in their professional life. I get it. However, I would now ask if you have ever felt like an imposter in your personal life. I’ll make a cup of tea and wait for you to figure out precisely what is the boundary between your personal and professional life. Okay, all done? Great. Now I’ll give you some more time to sincerely reflect if any of the traits of imposter syndrome have described you at times in your personal life. Hmmm, interesting to think about, isn’t it?
I believe that health care professionals frequently use one credit card to pay off another, but the debt remains the same. So even if things are going well at work, we may have just shifted the debt to our personal lives. (At some point in the future, I’ll share my 10 greatest father fails to date to elucidate my point.)
In my work at the GW Resiliency and Well-Being Center, I’ve gravitated toward a few methods supported by evidence that help alleviate imposter syndrome symptoms and potentially serve as protective factors against the future development of imposter syndrome.4 These include but are not limited to:
- Keep a record of small personal success that is yours alone.
- Have a mentor to share failures with.
- Use personal reflection to examine what it means to successfully reach your goals and fulfill your purpose, not a relative value unit target.
- Share experiences with each other, so you know you’re not alone.
The last method is one of my favorites because it involves connecting to others and shining a light on our shared experiences and, coincidentally, our collective strengths. Once this collective strength is realized, the circumstances of that 4-by-400-meter relay change drastically. Be safe and well, everyone.
Lorenzo Norris, MD, is a psychiatrist and chief wellness officer for the George Washington University Medical Enterprise and serves as associate dean of student affairs and administration for the George Washington University School of Medicine and Health Sciences. A version of this article first appeared on Medscape.com.
References
1. Clance PR, Imes SA. The imposter phenomenon in high achieving women: Dynamics and therapeutic intervention. Psychotherapy: Theory, Research & Practice. 1978;15(3): 241-7. doi: 10.1037/h0086006.
2. Thomas M, Bigatti S. Perfectionism, impostor phenomenon, and mental health in medicine: A literature review. Int J Med Educ. 2020 Sep 28;11:201-3. doi: 10.5116/ijme.5f54.c8f8.
3. Liu RQ et al. Impostorism and anxiety contribute to burnout among resident physicians. Med Teach. 2022 Jul;44(7):758-64. doi: 10.1080/0142159X.2022.2028751.
4. Gottlieb M et al. Impostor syndrome among physicians and physicians in training: A scoping review. Med Educ. 2020 Feb;54(2):116-24. doi: 10.1111/medu.13956.
Regardless of the setting, one of the most frequently discussed topics in health care is imposter syndrome.
Imposter syndrome was first defined by Clance and Imes as an inability to internalize success, and the tendency to attribute success to external causes such as luck, error, or knowing the appropriate individual.1 This definition is essential because most health care professionals have had a sense of doubt or questioned the full extent of their competencies in various situations. I would argue that this is normal and – within reason – helpful to the practice of medicine. The problem with true imposter syndrome is that the individual does not incorporate success in a way that builds healthy self-esteem and self-efficacy.2
Imposter syndrome has a very nasty way of interacting with burnout. Studies have shown that imposter syndrome can be associated with high levels of emotional exhaustion at work.3 In my experience, this makes clinical sense. Professionals suffering from imposter syndrome can spend a great deal of time and energy trying to maintain a particular image.4 They are acting a part 24/7. Have you ever seriously tried to act? It’s arduous work. A friend once asked me to read a role for a play because “you’d be great; you’re a natural.” By the time I was done with rehearsal, I felt like I had run a 4-by-400-meter relay, by myself, in Victoria, Tex.
And any talk of imposter syndrome must include its running mate, perfectionism. These two conditions exist together so commonly it can be a bit of a chicken or egg question as to which came first.
Imposter syndrome, perfectionism, and burnout can form a deadly triad if not recognized and addressed quickly. In medicine, perfectionism can be a coping strategy that sets up unrelenting standards. Failure to meet unrelenting standards then serves as fuel and validation for imposter syndrome and emotional exhaustion. The consequences of this cycle going unchecked over a health care professional’s career are seismic and can include downstream effects ranging from depression to suicide.
Some readers will relate to this, while others will shrug their shoulders and say that this has never happened in their professional life. I get it. However, I would now ask if you have ever felt like an imposter in your personal life. I’ll make a cup of tea and wait for you to figure out precisely what is the boundary between your personal and professional life. Okay, all done? Great. Now I’ll give you some more time to sincerely reflect if any of the traits of imposter syndrome have described you at times in your personal life. Hmmm, interesting to think about, isn’t it?
I believe that health care professionals frequently use one credit card to pay off another, but the debt remains the same. So even if things are going well at work, we may have just shifted the debt to our personal lives. (At some point in the future, I’ll share my 10 greatest father fails to date to elucidate my point.)
In my work at the GW Resiliency and Well-Being Center, I’ve gravitated toward a few methods supported by evidence that help alleviate imposter syndrome symptoms and potentially serve as protective factors against the future development of imposter syndrome.4 These include but are not limited to:
- Keep a record of small personal successes that are yours alone.
- Have a mentor to share failures with.
- Use personal reflection to examine what it means to successfully reach your goals and fulfill your purpose, not a relative value unit target.
- Share experiences with each other, so you know you’re not alone.
The last method is one of my favorites because it involves connecting to others and shining a light on our shared experiences and, coincidentally, our collective strengths. Once this collective strength is realized, the circumstances of that 4-by-400-meter relay change drastically. Be safe and well, everyone.
Lorenzo Norris, MD, is a psychiatrist and chief wellness officer for the George Washington University Medical Enterprise and serves as associate dean of student affairs and administration for the George Washington University School of Medicine and Health Sciences. A version of this article first appeared on Medscape.com.
References
1. Clance PR, Imes SA. The imposter phenomenon in high achieving women: Dynamics and therapeutic intervention. Psychotherapy: Theory, Research & Practice. 1978;15(3): 241-7. doi: 10.1037/h0086006.
2. Thomas M, Bigatti S. Perfectionism, impostor phenomenon, and mental health in medicine: A literature review. Int J Med Educ. 2020 Sep 28;11:201-3. doi: 10.5116/ijme.5f54.c8f8.
3. Liu RQ et al. Impostorism and anxiety contribute to burnout among resident physicians. Med Teach. 2022 Jul;44(7):758-64. doi: 10.1080/0142159X.2022.2028751.
4. Gottlieb M et al. Impostor syndrome among physicians and physicians in training: A scoping review. Med Educ. 2020 Feb;54(2):116-24. doi: 10.1111/medu.13956.
Fitness trackers: Useful in sleep medicine?
Who doesn’t love data, especially their own? With that thought in mind, over the years I have owned several activity trackers, including at least two Fitbits, and I frequently check my iPhone to see how far I’ve walked or how many steps I have taken. My most recent acquisition is an Oura (smart ring, third generation), which includes my first sleep tracker.
Sleep trackers are not unique to the Oura Ring; they are included on many of the newer activity trackers and smart watches, but the design and breakdown of daily sleep, activity, and readiness scores are hallmarks of Oura Rings.
The ring generates data for different phases of sleep, movements, oxygen saturation, disturbances in breathing, heart rate, and heart rate variability. I began to wonder how useful this information would be clinically and whether it might be helpful in either the diagnosis or treatment of sleep disorders.
David Neubauer, MD, is a psychiatrist at the Johns Hopkins Sleep Disorders Center. “Sleep tracking devices are more than just toys but less than medical devices. They do have clinical utility and might show findings that warrant further medical workup,” Dr. Neubauer said. “It is impressive that these devices estimate sleep as well as they do, but there is a problem with how they divide sleep stages that can lead people to believe their sleep is worse than it really is.”
For more than 50 years, he explained, sleep researchers and clinicians have categorized sleep as non–rapid eye movement (NREM) sleep stages 1-4 and REM sleep. More recently, sleep was reorganized to N1, N2, and N3 (which combines the older stages 3 and 4, representing “deep sleep” or “slow wave sleep”) and REM sleep. We normally spend more time in N2 than the other stages. However, the device companies often categorize their sleep estimates as “light sleep,” “deep sleep,” or “REM.” With “light sleep,” they are lumping together N1 and N2 sleep, and this is misleading, said Dr. Neubauer. “Understandably, people often think that there is something wrong if their tracker reports they are spending a lot of time in light sleep, when actually their sleep may be entirely normal.”
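To make the stage-lumping issue concrete, here is a minimal Python sketch. It is purely illustrative and does not represent any vendor's actual firmware or scoring algorithm; the stage proportions in the example are hypothetical but plausible for a healthy adult. The point is not the numbers themselves but the relabeling: once N1 and N2 are pooled, more than half of an entirely normal night gets reported as "light sleep."

```python
# Illustrative sketch (not any vendor's actual algorithm): how a consumer
# device might collapse clinically scored sleep stages into coarser labels.
# Stage names follow the AASM scheme described above (N1, N2, N3, REM).

CLINICAL_TO_DEVICE = {
    "N1": "light sleep",   # lumped together with N2 ...
    "N2": "light sleep",   # ... even though lots of N2 is entirely normal
    "N3": "deep sleep",    # formerly stages 3 and 4 ("slow wave sleep")
    "REM": "REM",
}

def device_summary(epochs: list[str]) -> dict[str, float]:
    """Collapse a night of 30-second clinically scored epochs into the
    coarser device categories and report minutes per category."""
    minutes: dict[str, float] = {}
    for stage in epochs:
        label = CLINICAL_TO_DEVICE.get(stage, "awake")
        minutes[label] = minutes.get(label, 0.0) + 0.5  # 30 s = 0.5 min
    return minutes

# A hypothetical but normal night (about 7.8 hours of sleep):
night = ["N1"] * 20 + ["N2"] * 520 + ["N3"] * 180 + ["REM"] * 220
print(device_summary(night))
# {'light sleep': 270.0, 'deep sleep': 90.0, 'REM': 110.0}
```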
Sleep tracker validity
A study by Massimiliano de Zambotti, PhD, and colleagues, “The Sleep of the Ring: Comparison of the ŌURA Sleep Tracker Against Polysomnography,” examined the sleep patterns of 41 adolescents and young adults and found that the second-generation tracker was accurate for total sleep time but underestimated time spent in N3 sleep by approximately 20 minutes while overestimating time spent in REM sleep by 17 minutes. The authors concluded that the ring had the potential to be clinically useful but that further studies and validation were needed.
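For readers curious how such figures are derived, the short Python sketch below computes a per-stage bias as the mean signed difference between device-scored and polysomnography-scored minutes. It is a minimal illustration, not the study's actual analysis, and the paired participant values are hypothetical numbers chosen only to reproduce biases of roughly -20 and +17 minutes.

```python
# Minimal sketch of a per-stage bias calculation (device minus PSG).
# The paired minutes-per-stage values below are hypothetical, chosen only
# to illustrate biases similar to those reported in the text.
from statistics import mean

def stage_bias(device_minutes: list[float], psg_minutes: list[float]) -> float:
    """Mean signed difference in minutes: negative means the device
    underestimates the stage, positive means it overestimates it."""
    return mean(d - p for d, p in zip(device_minutes, psg_minutes, strict=True))

# Hypothetical paired data for three participants:
n3_device, n3_psg = [70.0, 55.0, 62.0], [92.0, 73.0, 82.0]
rem_device, rem_psg = [118.0, 101.0, 95.0], [99.0, 86.0, 78.0]

print(f"N3 bias:  {stage_bias(n3_device, n3_psg):+.1f} min")   # -20.0 min
print(f"REM bias: {stage_bias(rem_device, rem_psg):+.1f} min") # +17.0 min
```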
A larger study of the newest, third-generation Oura tracker, conducted by Altini and Kinnunen at Oura Health, found that the added sensors with the newer-generation ring led to improved accuracy, but they noted that the study was done with a healthy population and might not generalize to clinical populations.
Fernando Goes, MD, and Matthew Reid, PhD, both at Johns Hopkins, are working on a multicenter study using the Oura Ring and the mindLAMP app to look at the impact of sleep on mood in people with mood disorders as well as healthy controls. Dr. Reid said that “validation of sleep stages takes a hit when the ring is used in people with insomnia. We find it useful for total sleep time, but when you look at sleep architecture, the concordance is only 60%. And oxygen saturation measures are less accurate in people with dark skin.”
Clinical uses for sleep trackers
More accurate information might prove reassuring to patients. Dr. Goes added, “One use, for example, might be to help patients to limit or come off of long-term hypnotics with a more benign intervention that incorporates passive monitoring such as that in the Oura Ring. Some patients worry excessively about not being able to sleep, and sleep monitoring data can be helpful to reduce some of these concerns so patients can focus on safer interventions, such as cognitive behavioral therapy for insomnia.” Dr. Reid believes that wearable trackers have potential usefulness in monitoring sleep in patients with insomnia. “In insomnia, sleep state misperception is common. They are hyper-aroused, and they perceive that they are awake when in fact they are sleeping.”
Dr. Goes mentioned another use for sleep trackers in clinical settings: “In our inpatient units, the nurses open the door to look in on patients every hour to monitor and document if they are sleeping. If they look in and the patient isn’t moving, they will ask the patient to raise their hand, which of course is not going to help someone to fall back asleep.” Wearable devices might provide data on sleep without the risk of waking patients every hour through the night.
Not medical devices
However, Dr. Neubauer emphasized that current sleep trackers are not medical devices, saying “they may be measuring the same parameters that are measured with medical devices, for example pulse oximetry or sleep states, but there’s no simple answer yet to the question of whether the devices provide reliable data for clinical decision-making.”
Dr. Neubauer is skeptical about the accuracy of some of the measures the device provides. “I would not use the information from a consumer device to rule out obstructive sleep apnea based on good oxygen saturation numbers. So much depends on the history – snoring, gasping awakenings, reports from bed partners, and daytime sleepiness. These devices do not measure respiratory effort or nasal airflow as sleep studies do. But big drops in oxygen saturation from a consumer device certainly warrant attention for further evaluation.” Dr. Neubauer also noted that the parameters on sleep trackers do not differentiate between central and obstructive sleep apnea and that insurers won’t pay for continuous positive airway pressure to treat sleep apnea without a sleep study.
I enjoy looking at the data, even knowing that they are not entirely accurate, and we may yet find more clinical uses for these devices. For now, I’m off to get more exercise, at the suggestion of my tracker!
Dinah Miller, MD, is assistant professor of psychiatry and behavioral sciences, Johns Hopkins Medicine, Baltimore.
A version of this article first appeared on Medscape.com.
Rheumatic diseases and assisted reproductive technology: Things to consider
The field of “reproductive rheumatology” has received growing attention in recent years as we learn more about how autoimmune rheumatic diseases and their treatment affect women of reproductive age. In 2020, the American College of Rheumatology published a comprehensive guideline that includes recommendations and supporting evidence for managing issues related to reproductive health in patients with rheumatic diseases and has since launched an ongoing Reproductive Health Initiative, with the goal of translating established guidelines into practice through various education and awareness campaigns. One area addressed by the guideline that comes up commonly in practice but receives less attention and research is the use of assisted reproductive technology (ART) in patients with rheumatic diseases.
Literature is conflicting regarding whether patients with autoimmune rheumatic diseases are inherently at increased risk for infertility, defined as failure to achieve a clinical pregnancy after 12 months or more of regular unprotected intercourse, or subfertility, defined as a delay in conception. Regardless, several factors indirectly contribute to a disproportionate risk for infertility or subfertility in this patient population, including active inflammatory disease, reduced ovarian reserve, and medications.
Patients with subfertility or infertility who desire pregnancy may pursue ovulation induction with timed intercourse or intrauterine insemination; in vitro fertilization (IVF)/intracytoplasmic sperm injection with embryo transfer; or gestational surrogacy. Those who require treatment with cyclophosphamide or who plan to defer pregnancy for whatever reason can opt for oocyte cryopreservation (colloquially known as “egg freezing”). For IVF and oocyte cryopreservation, controlled ovarian stimulation is typically the first step (except in unstimulated, or “natural cycle,” IVF).
Various protocols are used for ovarian stimulation and ovulation induction, the nuances of which are beyond the scope of this article. In general, ovarian stimulation involves gonadotropin therapy (follicle-stimulating hormone and/or human menopausal gonadotropin) administered via scheduled subcutaneous injections to stimulate follicular growth, as well as gonadotropin-releasing hormone (GnRH) agonists or antagonists to suppress luteinizing hormone, preventing ovulation. Adjunctive oral therapy (clomiphene citrate or letrozole, an aromatase inhibitor) may be used as well. The patient has frequent lab monitoring of hormone levels and transvaginal ultrasounds to measure follicle number and size and, when the timing is right, receives an “ovulation trigger” – either human chorionic gonadotropin or GnRH agonist, depending on the protocol. At this point, transvaginal ultrasound–guided egg retrieval is done under sedation. Recovered oocytes are then either frozen for later use or fertilized in the lab for embryo transfer. Lastly, exogenous hormones are often used: estrogen to support frozen embryo transfers and progesterone for so-called luteal phase support.
ART is not contraindicated in patients with autoimmune rheumatic diseases, but there may be additional factors to consider, particularly for those with systemic lupus erythematosus (SLE), antiphospholipid syndrome (APS), and antiphospholipid antibodies (aPL) without clinical APS.
Ovarian stimulation elevates estrogen levels to varying degrees depending on the patient and the medications used. In all cases, though, peak levels are significantly lower than levels reached during pregnancy. It is well established that elevated estrogen – whether from hormone therapies or pregnancy – significantly increases thrombotic risk, even in healthy people. High-risk patients should receive low-molecular-weight heparin – a prophylactic dose for patients with either positive aPL without clinical APS (including those with SLE) or with obstetric APS, and a therapeutic dose for those with thrombotic APS – during ART procedures.
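To make the risk-stratified dosing explicit, here is a minimal Python sketch that restates the rule from the preceding paragraph as a decision function. It is illustrative only and emphatically not clinical guidance; the category names are this sketch's own shorthand, not terminology taken from the guideline.

```python
# Illustration only -- NOT clinical guidance. This merely restates the LMWH
# dosing logic described in the text as a simple decision rule.
from enum import Enum, auto

class Risk(Enum):
    APL_POSITIVE_NO_APS = auto()  # positive aPL without clinical APS (incl. SLE)
    OBSTETRIC_APS = auto()
    THROMBOTIC_APS = auto()

def lmwh_dose_during_art(risk: Risk) -> str:
    """Return the LMWH dosing tier the text assigns to each risk category."""
    if risk is Risk.THROMBOTIC_APS:
        return "therapeutic-dose LMWH"
    # Positive aPL without clinical APS and obstetric APS both receive
    # prophylactic dosing per the text.
    return "prophylactic-dose LMWH"

print(lmwh_dose_during_art(Risk.OBSTETRIC_APS))   # prophylactic-dose LMWH
print(lmwh_dose_during_art(Risk.THROMBOTIC_APS))  # therapeutic-dose LMWH
```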
In patients with SLE, another concern is that increased estrogen will cause disease flare. One case series published in 2017 reported 37 patients with SLE and/or APS who underwent 97 IVF cycles, of which 8% were complicated by flare or thrombotic events. Notably, half of these complications occurred in patients who stopped prescribed therapies (immunomodulatory therapy in two patients with SLE, anticoagulation in two patients with APS) after failure to conceive. In a separate study from 2000 including 19 patients with SLE, APS, or high-titer aPL who underwent 68 IVF cycles, 19% of cycles in patients with SLE were complicated by flare, and no thrombotic events occurred in the cohort. The authors concluded that ovulation induction does not exacerbate SLE or APS. In these studies, the overall pregnancy rates were felt to be consistent with those achieved by the general population through IVF. Although obstetric complications, such as preeclampsia and preterm delivery, were reported in about half of the pregnancies described, these are known to occur more frequently in those with SLE and APS, especially when active disease or other risk factors are present. There are no large-scale, controlled studies evaluating ART outcomes in patients with autoimmune rheumatic diseases to date.
Finally, ovarian hyperstimulation syndrome (OHSS) is an increasingly rare but severe complication of ovarian stimulation. OHSS is characterized by capillary leak, fluid overload, and cytokine release syndrome and can lead to thromboembolic events. Comorbidities like hypertension and renal failure, which can go along with autoimmune rheumatic diseases, are risk factors for OHSS. The use of human chorionic gonadotropin to trigger ovulation is also associated with an increased risk for OHSS, so a GnRH agonist trigger may be preferable.
The ACR guideline recommends that individuals with any of these underlying conditions undergo ART only in expert centers. The ovarian stimulation protocol needs to be tailored to the individual patient to minimize risk and optimize outcomes. The overall goal when managing patients with autoimmune rheumatic diseases during ART is to establish and maintain disease control with pregnancy-compatible medications (when pregnancy is the goal). With adequate planning, appropriate treatment, and collaboration between obstetricians and rheumatologists, individuals with autoimmune rheumatic diseases can safely pursue ART and go on to have successful pregnancies.
Dr. Siegel is a 2022-2023 UCB Women’s Health rheumatology fellow in the rheumatology reproductive health program of the Barbara Volcker Center for Women and Rheumatic Diseases at Hospital for Special Surgery/Weill Cornell Medicine, New York. Her clinical and research focus is on reproductive health issues in individuals with rheumatic disease. Dr. Chan is an assistant professor at Weill Cornell Medical College and an attending physician at Hospital for Special Surgery and Memorial Sloan Kettering Cancer Center in New York. Before moving to New York City, she spent 7 years in private practice in Rhode Island and was a columnist for a monthly rheumatology publication, writing about the challenges of starting life as a full-fledged rheumatologist in private practice. Dr. Siegel and Dr. Chan disclosed no relevant financial relationships.
A version of this article – an editorial collaboration between Medscape and the Hospital for Special Surgery – first appeared on Medscape.com.
Is evolution’s greatest triumph its worst blunder?
Of all the dazzling achievements of evolution, the most glorious by far is the emergence of the advanced human brain, especially the prefrontal cortex. Homo sapiens (the wise humans) are without doubt the most transformative development in the consequential annals of evolution. It was evolution’s spectacular “moonshot.” Ironically, it may also have been the seed of its destruction.
The unprecedented growth of the human brain over the past 7 million years (tripling in size) was a monumental tipping point in evolution that ultimately disrupted the entire orderly cascade of evolution on Planet Earth. Because of their superior intelligence, Homo sapiens have substantially “tinkered” with the foundations of evolution, such as “natural selection” and “survival of the fittest,” and may eventually change the course of evolution, or even reverse it. It should also be recognized that roughly 20% of the Neanderthal genome survives, scattered across modern human populations (any one individual of non-African ancestry typically carries only about 1% to 2% Neanderthal DNA), and that the 2022 Nobel Prize in Physiology or Medicine was awarded to Svante Pääbo, the founder of the field of paleogenetics, who demonstrated genetically that Homo sapiens interbred with Homo neanderthalensis (who disappeared roughly 30,000 years ago).
The majestic evolution of the human brain, in both size and complexity, led to monumental changes in the history of humankind compared with its primitive predecessors. Thanks to a superior cerebral cortex, humans developed traits and abilities that were nonexistent, even unimaginable, in the rest of the animal kingdom, including primates and other mammals. Humans developed thought and speech (hundreds of languages, spoken and written) to communicate among themselves; composed music and created numerous instruments to play it; invented mathematics, physics, and chemistry; developed agriculture to sustain and feed the masses; built homes, palaces, and pyramids, with water and sewage systems; hatched hundreds of religions and built thousands of houses of worship; built machines to transport themselves (cars, trains, ships, planes, and space shuttles); paved airports and countless miles of roads and railways; established companies, universities, hospitals, and research laboratories; built sports facilities such as stadiums for the Olympic Games and all their athletics; created hotels, restaurants, coffee shops, newspapers, and magazines; discovered the amazing DNA double helix and the genome, with its roughly 20,000 protein-coding genes containing instructions to build the brain and 200 other body tissues; developed surgeries and invented medications for diseases that would otherwise have killed millions every year; and established paper money to replace gold and silver coins. Humans established governments that included monarchies, dictatorships, democracies, and pseudodemocracies; stipulated constitutions, laws, and regulations to maintain various societies; and created several civilizations around the world that thrived and then faded. Over the past century, the advanced human brain elevated human existence to a higher sophistication with technologies such as electricity, telephones, computers, the internet, artificial intelligence, and machine learning. Using powerful rockets and space stations, humans have begun to expand their influence to the moon and the planets of the solar system. Humans are very likely to continue achieving what evolution alone could never have done; in evolving the human brain, evolution created the most powerful force in nature.
The key ingredient that has enabled humans to achieve so much is advanced cognition, with superior functions that far exceed those of other living organisms. These include neurocognitive functions such as memory and attention, and executive functions that include planning, problem-solving, decision-making, abstract thinking, and insight. Those cognitive functions generate lofty prose, splendiferous poetry, and heavenly symphonies that inspire their creators and audiences alike. The human brain also developed social cognition, with empathy, theory of mind, recognition of facial expressions, and courtship rituals that can trigger infatuation and love. Homo sapiens can experience a wide range of emotions in addition to love and attachment (necessary for procreation), including shame, guilt, surprise, embarrassment, disgust, and indifference, as well as a unique sense of right and wrong.
Perhaps the most distinctive human attribute, generated by an advanced prefrontal cortex, is a belief system that includes philosophy, politics, religion, and faith. Hundreds of different religions sprouted throughout human history (each claiming a monopoly on “the truth”), mandating rituals and behaviors, but also promoting a profound and unshakable belief in a divine “higher being” and an afterlife that mitigates the fear of death. Humans, unlike other animals, are painfully aware of mortality and the inevitability of death. Faith is an antidote for thanatophobia. Unfortunately, religious beliefs often generated severe and protracted schisms and warfare, with fatal consequences for their followers.
The anti-evolution aspect of the advanced brain
Despite remarkable talents and achievements, the unprecedented evolutionary expansion of the human brain also has a detrimental downside. The same intellectual power that led to astonishing positive accomplishments has a wicked side as well. While most animals have a predator, humans have become the “omni-predator” that preys on all living things. The balanced ecosystems of animals and plants have been dominated and disrupted by humans. Thousands of species that evolution had so ingeniously spawned became extinct because of human actions. The rainforests, jewels of nature’s plantation system, were victimized by human indifference to the deleterious effects on nature and climate. The excavation of coal and oil, exploited as necessary sources of energy for societal infrastructure, came back to haunt humans with climate consequences. In many ways, human “progress” corrupted evolution and dismantled its components. Survival of the fittest among various species was whittled down to “survival of humans” (and their domesticated animals) at the expense of all other organisms, animal or plant.
Among Homo sapiens, momentous scientific, medical, and technological advances completely undermined the principle of survival of the fittest. Very premature infants, who would have certainly died, were kept alive. Children with disabling genetic disorders who would have perished in childhood were kept alive into the age of procreation, perpetuating the genetic mutations. The discovery of antibiotic and antiviral medications, and especially vaccines, ensured the survival of millions of humans who would have succumbed to infections. With evolution’s natural selection, humans who survived severe infections without medications would have passed on their “infection-resistant genes” to their progeny. The triumph of human medical progress can be conceptualized as a setback for the principles of evolution.
The most malignant consequence of the exceptional human brain is the evil of which it is capable. Human ingenuity led to the development of weapons of individual killing (guns), large-scale murder (machine guns), and massive destruction (nuclear weapons). And because aggression and warfare are an inherent part of human nature, the most potent predator for a human is another human. The history of humans is riddled with conflict and death on a large scale. Ironically, many wars were instigated by various religious groups around the world, who developed intense hostility towards one another.
There are other downsides to the advanced human brain. It can channel its talents and skills into unimaginably wicked and depraved behaviors, such as premeditated and well-planned murder, slavery, cults, child abuse, domestic abuse, pornography, fascism, dictatorships, and political corruption. Astonishingly, the same brain that can be loving, kind, friendly, and empathetic can suddenly become hateful, vengeful, cruel, vile, sinister, vicious, diabolical, and capable of unimaginable violence and atrocities. The advanced human brain definitely has a very dark side.
Finally, unlike other members of the animal kingdom, the human brain generates its virtual counterpart: the highly complex human mind, which is prone to various maladies, labeled as “psychiatric disorders.” No other animal species develops delusions, hallucinations, thought disorders, melancholia, mania, obsessive-compulsive disorder, generalized anxiety, panic attacks, posttraumatic stress disorder, psychopathy, narcissistic and borderline personality disorders, alcohol addiction, and drug abuse. Homo sapiens are the only species whose members decide to end their own life in large numbers. About 25% of human minds are afflicted with one or more of those psychiatric ailments.1,2 The redeeming grace of the large human brain is that it led to the development of pharmacologic and somatic treatments for most of them, including psychotherapy, which is a uniquely human treatment strategy that can mend many psychiatric disorders.
Evolution may not have realized what it hath wrought when it evolved the dramatically expanded human brain, with its extraordinary cognition. This awe-inspiring “biological computer” can be creative and adaptive, with superlative survival abilities, but it can also degenerate and become nefarious, villainous, murderous, and even demonic. The human brain has essentially brought evolution to a screeching halt and may at some point end up destroying Earth and all of its Homo sapiens inhabitants, who may foolishly use their weapons of mass destruction. The historic achievement of evolution has become the ultimate example of “the law of unintended consequences.”
1. Robins LN, Regier DA. Psychiatric Disorders in America: The Epidemiologic Catchment Area Study. Free Press; 1990.
2. Johns Hopkins Medicine. Mental Health Disorder Statistics. Accessed October 12, 2022. https://www.hopkinsmedicine.org/health/wellness-and-prevention/mental-health-disorder-statistics
Of all the dazzling achievements of evolution, the most glorious by far is the emergence of the advanced human brain, especially the prefrontal cortex. Homo sapiens (the wise humans) are without doubt the most transformative development in the consequential annals of evolution. It was evolution’s spectacular “moonshot.” Ironically, it may also have been the seed of its destruction.
The unprecedented growth of the human brain over the past 7 million years (tripling in size) was a monumental tipping point in evolution that ultimately disrupted the entire orderly cascade of evolution on Planet Earth. Because of their superior intelligence, Homo sapiens have substantially “tinkered” with the foundations of evolution, such as “natural selection” and “survival of the fittest,” and may eventually change the course of evolution, or even reverse it. It should also be recognized that 20% of the human genome is Neanderthal, and the 2022 Nobel Prize in Physiology or Medicine was awarded to Svante Pääbo, the founder of the field of paleogenetics, who demonstrated genetically that Homo sapiens interbred with Homo neanderthalensis (who disappeared 30,000 years ago).
The majestic evolution of the human brain, in both size and complexity, led to monumental changes in the history of humankind compared to their primitive predecessors. Thanks to a superior cerebral cortex, humans developed traits and abilities that were nonexistent, even unimaginable, in the rest of animal kingdom, including primates and other mammals. These include thoughts; speech (hundreds of languages), spoken and written, to communicate among themselves; composed music and created numerous instruments to play it; invented mathematics, physics, and chemistry; developed agriculture to sustain and feed the masses; built homes, palaces, and pyramids, with water and sewage systems; hatched hundreds of religions and built thousands of houses of worship; built machines to transport themselves (cars, trains, ships, planes, and space shuttles); paved airports and countless miles of roads and railways; established companies, universities, hospitals, and research laboratories; built sports facilities such as stadiums for Olympic games and all its athletics; created hotels, restaurants, coffee shops, newspapers, and magazines; discovered the amazing DNA double helix and its genome with 23,000 coding genes containing instructions to build the brain and 200 other body tissues; developed surgeries and invented medications for diseases that would have killed millions every year; and established paper money to replace gold and silver coins. Humans established governments that included monarchies, dictatorships, democracies, and pseudodemocracies; stipulated constitutions, laws, and regulations to maintain various societies; and created several civilizations around the world that thrived and then faded. Over the past century, the advanced human brain elevated human existence to a higher sophistication with technologies such as electricity, phones, computers, internet, artificial intelligence, and machine learning. Using powerful rockets and space stations, humans have begun to expand their influence to the moon and planets of the solar system. Humans are very likely to continue achieving what evolution could never have done without evolving the human brain to become the most powerful force in nature.
The key ingredient that has enabled humans to achieve so much is the brain’s development of advanced cognition, with superior functions that far exceed those of other living organisms. These include neurocognitive functions such as memory and attention, and executive functions that include planning, problem-solving, decision-making, abstract thinking, and insight. Those cognitive functions generate lofty prose, splendiferous poetry, and heavenly symphonies that inspire their creators and audiences alike. The human brain also developed social cognition, with empathy, theory of mind, recognition of facial expressions, and courtship rituals that can trigger infatuation and love. Homo sapiens can experience a wide range of emotions in addition to love and attachment (necessary for procreation), including shame, guilt, surprise, embarrassment, disgust, and indifference, as well as a unique sense of right and wrong.
Perhaps the most distinctive human attribute, generated by an advanced prefrontal cortex, is a belief system that includes philosophy, politics, religion, and faith. Hundreds of different religions sprouted throughout human history (each claiming a monopoly on “the truth”), mandating rituals and behaviors, but also promoting a profound and unshakable belief in a divine “higher being” and an afterlife that mitigates the fear of death. Humans, unlike other animals, are painfully aware of mortality and the inevitability of death. Faith is an antidote for thanatophobia. Unfortunately, religious beliefs often generated severe and protracted schisms and warfare, with fatal consequences for their followers.
The anti-evolution aspect of the advanced brain
Despite remarkable talents and achievements, the unprecedented evolutionary expansion of the human brain also has a detrimental downside. The same intellectual power that led to astonishing positive accomplishments has a wicked side as well. While most animals have a predator, humans have become the “omni-predator” that preys on all living things. The balanced ecosystems of animals and plants have been dominated and disrupted by humans. Thousands of species that evolution had so ingeniously spawned became extinct because of human actions. The rainforests, jewels of nature’s plantation system, were victimized by human indifference to the deleterious effects on nature and climate. The excavation of coal and oil, exploited as necessary sources of energy for societal infrastructure, came back to haunt humans with climate consequences. In many ways, human “progress” corrupted evolution and dismantled its components. Survival of the fittest among various species was whittled down to “survival of humans” (and their domesticated animals) at the expense of all other organisms, animal or plant.
Among Homo sapiens, momentous scientific, medical, and technological advances completely undermined the principle of survival of the fittest. Very premature infants, who would have certainly died, were kept alive. Children with disabling genetic disorders who would have perished in childhood were kept alive into the age of procreation, perpetuating the genetic mutations. The discovery of antibiotic and antiviral medications, and especially vaccines, ensured the survival of millions of humans who would have succumbed to infections. With evolution’s natural selection, humans who survived severe infections without medications would have passed on their “infection-resistant genes” to their progeny. The triumph of human medical progress can be conceptualized as a setback for the principles of evolution.
The most malignant consequence of the exceptional human brain is the evil of which it is capable. Human ingenuity led to the development of weapons of individual killing (guns), large-scale murder (machine guns), and massive destruction (nuclear weapons). And because aggression and warfare are an inherent part of human nature, the most potent predator for a human is another human. The history of humans is riddled with conflict and death on a large scale. Ironically, many wars were instigated by various religious groups around the world, who developed intense hostility towards one another.
There are other downsides to the advanced human brain. It can channel its talents and skills into unimaginably wicked and depraved behaviors, such as premeditated and well-planned murder, slavery, cults, child abuse, domestic abuse, pornography, fascism, dictatorships, and political corruption. Astonishingly, the same brain that can be loving, kind, friendly, and empathetic can suddenly become hateful, vengeful, cruel, vile, sinister, vicious, diabolical, and capable of unimaginable violence and atrocities. The advanced human brain definitely has a very dark side.
Finally, unlike other members of the animal kingdom, the human brain generates its virtual counterpart: the highly complex human mind, which is prone to various maladies labeled as “psychiatric disorders.” No other animal species develops delusions, hallucinations, thought disorders, melancholia, mania, obsessive-compulsive disorder, generalized anxiety, panic attacks, posttraumatic stress disorder, psychopathy, narcissistic and borderline personality disorders, alcohol addiction, or drug abuse. Homo sapiens are the only species whose members decide to end their own lives in large numbers. About 25% of human minds are afflicted with one or more of those psychiatric ailments.1,2 The redeeming grace of the large human brain is that it led to the development of pharmacologic and somatic treatments for most of them, as well as psychotherapy, a uniquely human treatment strategy that can mend many psychiatric disorders.
Evolution may not realize what it hath wrought when it evolved the dramatically expanded human brain, with its extraordinary cognition. This awe-inspiring “biological computer” can be creative and adaptive, with superlative survival abilities, but it can also degenerate and become nefarious, villainous, murderous, and even demonic. The human brain has essentially brought evolution to a screeching halt and may at some point end up destroying Earth and all of its Homo sapiens inhabitants, who may foolishly use their weapons of mass destruction. This historic achievement of evolution has become the ultimate example of “the law of unintended consequences.”
1. Robins LN, Regier DA. Psychiatric Disorders in America: The Epidemiologic Catchment Area Study. Free Press; 1990.
2. Johns Hopkins Medicine. Mental Health Disorder Statistics. Accessed October 12, 2022. https://www.hopkinsmedicine.org/health/wellness-and-prevention/mental-health-disorder-statistics
Warning: Watch out for ‘medication substitution reaction’
Editor’s note: Readers’ Forum is a department for correspondence from readers that is not in response to articles published in
I (MZP) recently started medical school, and one of the first things we learned in our Human Dimension class was to listen to our patients. While this may seem prosaic to seasoned practitioners, I quickly realized the important, real-world consequences of doing so.
Clinicians rightfully presume that when they send a prescription to a pharmacy, the patient will receive what they have ordered, or the generic equivalent, unless it is ordered “Dispense as written.” Unfortunately, a confluence of increased demand and supply chain disruptions has produced nationwide shortages of generic Adderall extended-release (XR) and Adderall, which are commonly prescribed to patients with attention-deficit/hyperactivity disorder (ADHD).1 While pharmacies should notify patients when they do not have these medications in stock, we have encountered numerous cases in which, due to shortages, prescriptions for generic dextroamphetamine/amphetamine salts XR or immediate-release (IR) were filled with the same milligram strength of dextroamphetamine alone (XR or IR, respectively) without notifying the patient or the prescribing clinician. These pharmacies have included several national chains and local independent stores in the New York/New Jersey region.
Over the past several months, we have encountered patients who had been well stabilized on their ADHD medication regimen who began to report anxiety, jitteriness, agitation, fatigue, poor concentration, and/or hyperactivity, and who also reported that their pills “look different.” First, we considered their symptoms could be attributed to a switch between generic manufacturers. However, upon further inspection, we discovered that the medication name printed on the label was different from what had been prescribed. We confirmed this by checking the Prescription Monitoring Program database.
Pharmacists have recently won prescribing privileges for nirmatrelvir/ritonavir (Paxlovid) to treat COVID-19, but they certainly are not permitted to fill prescriptions for psychoactive controlled substances that have different pharmacologic profiles than the medication the clinician ordered. Adderall contains D-amphetamine and L-amphetamine in a ratio of 3:1, which makes it different in potency from dextroamphetamine alone and requires adjustment to the dosage and potentially to the frequency to achieve near equivalency.
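To make the potency mismatch concrete, the arithmetic below splits a labeled mixed-salts dose into its isomer components. This is a minimal sketch for illustration only, not dosing guidance; the function name is hypothetical, and it assumes the stated 3:1 dextro-to-levo ratio applies directly to the labeled milligram strength.

# A minimal arithmetic sketch (illustration only; NOT dosing guidance).
# Assumes the stated 3:1 dextro:levo ratio applies to the labeled dose.

def amphetamine_isomer_split(total_mg: float) -> dict:
    """Split a mixed amphetamine salts dose into d- and l-isomer portions."""
    d_mg = total_mg * 0.75  # 3 parts dextroamphetamine
    l_mg = total_mg * 0.25  # 1 part levoamphetamine
    return {"dextroamphetamine_mg": d_mg, "levoamphetamine_mg": l_mg}

# Example: a 20-mg mixed-salts dose contains ~15 mg of dextroamphetamine.
print(amphetamine_isomer_split(20))
# {'dextroamphetamine_mg': 15.0, 'levoamphetamine_mg': 5.0}

Under these assumptions, a 20-mg mixed-salts prescription filled with 20 mg of dextroamphetamine alone delivers roughly one-third more of the more potent d-isomer, one plausible contributor to the symptom changes described above.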
Once we realized the issue and helped our patients locate a pharmacy that had generic Adderall XR and Adderall in stock so they could resume their previous regimen, their symptoms resolved.
It is important for all clinicians to add “medication substitution reaction” to their differential diagnosis of new-onset ADHD-related symptoms in previously stable patients.
1. Pharmaceutical Commerce. Innovative solutions for pandemic-driven pharmacy drug shortages. Published February 28, 2022. Accessed September 8, 2022. https://www.pharmaceuticalcommerce.com/view/innovative-solutions-for-pandemic-driven-pharmacy-drug-shortages
The light at the end of the tunnel: Reflecting on a 7-year training journey
Throughout my training, a common refrain from more senior colleagues was that training “goes by quickly.” At the risk of sounding cliché, and even after a 7-year journey spanning psychiatry and preventive medicine residencies as well as a consultation-liaison psychiatry fellowship, I agree without reservations that it does indeed go quickly. In the waning days of my training, reflection and nostalgia have become commonplace, as one might expect after such a meaningful pursuit. In sharing my reflections, I hope others progressing through training will also reflect on elements that added meaning to their experience and how they might improve the journey for future trainees.
Residency is a team sport
One realization that quickly struck me was that residency is a team sport, and finding supportive communities is essential to survival. Other residents, colleagues, and mentors played integral roles in making my experience rewarding. Training might be considered a shared traumatic experience, but having peers to commiserate with at each step has been among its greatest rewards. Residency automatically provided a cohort of colleagues who shared and validated my experiences. Additionally, having mentors who have been through it themselves and find ways to improve the training experience made mine superlative. Mentors assisted me in tailoring my training and developing interests that I could integrate into my future practice. The interpersonal connections I made were critical in helping me survive and thrive during training.
See one, do one, teach one
Residency and fellowship programs might be considered “see one, do one, teach one”1 at large scale. Since their inception, these programs—designed to develop junior physicians—have been inherently educational in nature. The structure is elegant, allowing trainees to continue learning while incrementally gaining more autonomy and teaching responsibility.2 Naively, I did not understand that implicit within my education was an expectation to become an educator and hone my teaching skills. Initially, being a newly minted resident receiving brand-new 3rd-year medical students charged me with apprehension. Thoughts I internalized, such as “these students probably know more than me” or “how can I be responsible for patients and students simultaneously,” may have resulted from a paucity of instruction about teaching available during medical school.3,4 I quickly found, though, that teaching was among the most rewarding facets of training. Helping other learners grow became one of my passions and added to my experience.
Iron sharpens iron
Although my experience was enjoyable, I would be remiss if I did not also consider the accompanying trials and tribulations. Seemingly interminable night shifts, sleep deprivation, lack of autonomy, and system inefficiencies frustrated me. Eventually, these frustrations seemed less bothersome. The challenges likely had not vanished with time, but perhaps my capacity to tolerate distress improved, likely corresponding with increasing skill and confidence. These challenges allowed me to hone my clinical decision-making abilities while under duress. My struggles and frustrations were not unique, but were perhaps lessons themselves.
Residency is not meant to be easy. The crucible of residency taught me that I had resilience to draw upon during challenging times. “Iron sharpens iron,” as the adage goes, and I believe adversity ultimately helped me become a better psychiatrist.
Self-reflection is part of completing training
Reminders that my journey is at an end are everywhere. Seeing notes written by past residents or fellows reminds me that soon I too will merely be a name in the chart to future trainees. Perhaps this line of thought is unfair, reducing my training experience to notes I signed—whereas my training experience was defined by connections made with colleagues and mentors, opportunities to teach junior learners, and confidence gained by overcoming adversity.
While becoming an attending psychiatrist fills me with trepidation, fear need not be an inherent aspect of new beginnings. Reflection has been a powerful practice, allowing me to realize what made my experience so meaningful, and that training is meant to be process-oriented rather than outcome-oriented. My reflection has underscored the realization that challenges are inherent in training, although not without purpose. I believe these struggles were meant to allow me to build meaningful relationships with colleagues, discover joy in teaching, and build resiliency.
The purpose of residencies and fellowships should be to produce clinically excellent psychiatrists, but I feel the journey was as important as the destination. Psychiatrists likely understand this better than most, as we were trained to thoughtfully approach the process of termination with patients.5 While the conclusion of our training journeys may seem unceremonious or anticlimactic, the termination process should include self-reflection on meaningful facets of training. For me, this reflection has itself been invaluable, while also making me hopeful to contribute value to the training journeys of future psychiatrists.
1. Gorrindo T, Beresin EV. Is “See one, do one, teach one” dead? Implications for the professionalization of medical educators in the twenty-first century. Acad Psychiatry. 2015;39(6):613-614. doi:10.1007/s40596-015-0424-8
2. Wright JR Jr, Schachar NS. Necessity is the mother of invention: William Stewart Halsted’s addiction and its influence on the development of residency training in North America. Can J Surg. 2020;63(1):E13-E19. doi:10.1503/cjs.003319
3. Dandavino M, Snell L, Wiseman J. Why medical students should learn how to teach. Med Teach. 2007;29(6):558-565. doi:10.1080/01421590701477449
4. Liu AC, Liu M, Dannaway J, et al. Are Australian medical students being taught to teach? Clin Teach. 2017;14(5):330-335. doi:10.1111/tct.12591
5. Vasquez MJ, Bingham RP, Barnett JE. Psychotherapy termination: clinical and ethical responsibilities. J Clin Psychol. 2008;64(5):653-665. doi:10.1002/jclp.20478
Lamotrigine for bipolar depression?
In reading Dr. Nasrallah's August 2022 editorial (“Reversing depression: A plethora of therapeutic strategies and mechanisms,”
Dr. Nasrallah responds
Thanks for your message. Lamotrigine is not FDA-approved for bipolar or unipolar depression, either as monotherapy or as an adjunctive therapy. It has never been approved for mania, either (no efficacy at all). Its only FDA-approved psychiatric indication is maintenance therapy after a patient with bipolar I disorder emerges from mania with the help of one of the antimanic drugs. Yet many clinicians may perceive lamotrigine as useful for bipolar depression because more than 20 years ago the manufacturer sponsored several small studies (not FDA trials). Two studies that showed efficacy were published, but 4 other studies that failed to show efficacy were not published. As a result, many clinicians got the false impression that lamotrigine is an effective antidepressant. I hope this explains why lamotrigine was not included in the list of antidepressants in my editorial.
Then and now: Gut microbiome
The Human Microbiome Project (HMP), which was supported by “only” approximately $20 million of funding in its first year, served as a catalyst for the development of computational tools, clinical protocols, and reference datasets for an emerging field whose diagnostics and therapeutics now approach $2 billion per year in market value.
Over the past 15 years, many important discoveries about the microbiome have been made, particularly in the fields of gastroenterology, hepatology, and nutrition. Transplantation of the gut microbiome from one person to another (fecal microbiota transplantation) has been shown to be more than 90% effective in the treatment of recurrent C. difficile infection, disrupting our current therapeutic algorithms of repetitive antibiotics. Other exciting discoveries have included the relationship between the gut microbiome and the enteric nervous system, and the microbiome’s roles in the regulation of metabolism and obesity and in the progression of liver fibrosis and cancer.
Looking ahead, several exciting areas related to digestive health and the microbiome are being prioritized, including the role of probiotics in nutrition, the complex relationship of the bidirectional “gut-brain” axis, and further development of analytics to define and deliver precision medicine across a wide range of digestive disorders. Without a doubt, emerging microbiome discoveries will be prominently featured in the pages of GI & Hepatology News over the coming years to keep our readers informed of these cutting-edge findings.
Dr. Rosenberg is medical director of the North Shore Endoscopy Center and director of clinical research at GI Alliance of Illinois in Gurnee, Ill. Dr. Rosenberg is a consultant for Aimmune Therapeutics and performs clinical research with Ferring Pharmaceuticals.