Cancer centers may not allow for dignified deaths
Credit: NCI and Mathews Media Group
A new study suggests many patients in cancer centers do not experience a dignified death.
Study investigators surveyed physicians and nurses in 16 hospitals belonging to 10 cancer centers in Baden-Württemberg, Germany.
The results revealed a need for cancer centers to invest more in palliative care services, adequate rooms for dying patients, staff training in end-of-life care, and advance-care-planning standards.
Karin Jors, of the University Medical Center Freiburg, and her colleagues reported these findings in Cancer.
Previous research has shown that hospitals are often ill-prepared to provide care for dying patients.
To investigate whether the circumstances for dying on cancer center wards allow for a dignified death, Jors and her colleagues surveyed physicians and nurses in German cancer centers.
Among 1131 survey respondents, 57% believed that patients could die with dignity on their ward.
Half of the surveyed staff members indicated that they rarely have enough time to care for dying patients, and 55% found the rooms available for dying patients unsatisfactory.
Only 19% of respondents felt well-prepared to care for dying patients; among physicians alone, the figure was just 6%.
On the other hand, physicians perceived the circumstances for dying patients much more positively than nurses, especially regarding communication and life-prolonging measures.
While 72% of physicians reported that patients can usually die a dignified death on their ward, only 52% of nurses shared this opinion.
Palliative care staff reported much better conditions for dying patients than staff from other wards, with 95% of palliative care staff indicating that patients die with dignity on their wards.
“In our aging society, it is predicted that the number of hospital deaths will continue to rise in the coming years, and many of these deaths will be attributable to cancer,” Jors said.
“For this reason, it is particularly important that cancer centers strive to create a comfortable, dignified experience for dying patients and their families. Above all, this requires that staff members are provided with the adequate resources to care for these patients.”
The investigators therefore encourage the integration of palliative care into standard oncology care, beginning as early as diagnosis. They also believe physicians and nurses would benefit from increased education and training in end-of-life care.
WHO supports study of blood transfusions for Ebola
Credit: Elise Amendola
Experts from the World Health Organization (WHO) have identified several interventions that should be the focus of clinical evaluation for treating and preventing Ebola.
Transfusions of blood products from Ebola survivors topped this list.
Of course, such blood preparations, like the other interventions the WHO discussed, have not been approved to treat or prevent Ebola.
However, they could be available before the year is out, according to WHO estimates. The organization is exploring options to conduct clinical trials of blood products in Ebola patients.
Previous studies have suggested blood transfusions from Ebola survivors might prevent or treat Ebola virus infection. However, it is unclear whether antibodies in the plasma of survivors are sufficient to treat or prevent the disease.
Safety is also a concern, although the WHO said transfusions should be safe if they are provided by well-managed blood banks. Still, there is a risk of transmitting blood-borne pathogens and a theoretical concern about antibody-dependent enhancement of Ebola virus infection.
“[T]here was a lot of discussion and emphasis on blood, on blood transfusion, whole-blood transfusion, as well as on plasma that can be purified from convalescent serum,” said Marie-Paule Kieny, Assistant Director-General at the WHO.
“There was consensus that this has a good chance to work and that, also, this is something that can be produced now from the affected countries themselves.”
The experts also agreed that the international community needs to help affected countries create the necessary infrastructure to draw blood safely and prepare the blood products safely.
Aside from blood transfusions, the WHO experts mentioned 2 potential Ebola vaccines that should be a priority. Safety studies of these vaccines—based on vesicular stomatitis virus (VSV-EBO) and chimpanzee adenovirus (ChAd-EBO)—are beginning in the US and are slated to begin in Africa and Europe in mid-September.
If proven safe, a vaccine could be available in November 2014 for priority use in healthcare workers.
The WHO experts also discussed the availability and evidence supporting the use of novel therapeutic drugs, including monoclonal antibodies, RNA-based drugs, and small antiviral molecules. They considered the potential use of existing drugs approved for other diseases and conditions as well.
Of the novel products discussed, some have shown great promise in monkey models. Others have been used in a few Ebola patients and appear safe, but the numbers are too small to permit any definitive conclusions about efficacy.
Existing supplies of all experimental medicines are limited, the WHO said. While many efforts are underway to accelerate production, supplies will not be sufficient for several months to come. The prospects of having augmented supplies of vaccines rapidly look slightly better.
The WHO also cautioned that the investigation of the aforementioned interventions should not detract attention from measures to prevent Ebola from spreading.
Drug shows early promise for hematologic malignancies
A drug that targets mitochondrial function is largely safe and can be active in heavily pretreated patients with hematologic malignancies, a phase 1 trial indicates.
The drug, CPI-613, prompted responses in only 4 of 21 evaluable patients. However, 2 of those responses lasted more than 2 years.
CPI-613 was generally well-tolerated and did not induce bone marrow suppression. Four patients experienced renal failure, but it was reversed in 3 of them.
These results appear in Clinical Cancer Research.
“This drug is selectively taken up by cancer cells and then shuts down the production of energy in the mitochondria,” said study author Timothy Pardee, MD, PhD, of the Comprehensive Cancer Center of Wake Forest University in Winston-Salem, North Carolina.
“This is the first drug to inhibit mitochondria in this way, and, if it proves effective in further clinical trials, it will open up a whole new approach to fighting cancer.”
Dr Pardee and his colleagues evaluated CPI-613 in 26 patients with relapsed or refractory hematologic malignancies—11 with acute myeloid leukemia, 6 with non-Hodgkin lymphoma, 4 with multiple myeloma, 4 with myelodysplastic syndrome (MDS), and 1 with Hodgkin lymphoma.
The median patient age was 65 years (range, 19-81), and the median number of prior therapies was 3 (range, 1-9).
Treatment dosing and toxicity
Patients received CPI-613 as a 2-hour infusion on days 1 and 4 for 3 weeks every 28 days.
When the infusion time was shortened to 1 hour, renal failure occurred in 2 patients. At 3780 mg/m², there were 2 dose-limiting toxicities. There were no such toxicities at a dose of 2940 mg/m² over 2 hours, so this was considered the maximum tolerated dose.
The following grade 2 or higher toxicities were probably or definitely related to treatment: nausea (1 grade 2), vomiting (1 grade 3), diarrhea (3 grade 2), proteinuria (1 grade 2), renal failure (4 grade 3), hypotension (1 grade 2), hypocalcemia (1 grade 2), hypoalbuminemia (1 grade 2), and hyperkalemia (1 grade 3).
Renal failure was resolved in 3 of the 4 patients. The remaining patient chose hospice care.
Response data
Five patients discontinued treatment—1 refused therapy, 1 acquired an infection, and 3 developed acute kidney failure.
Of the 21 patients evaluable for response, 4 had an objective response following CPI-613 treatment, and 2 had prolonged stable disease.
One patient with MDS achieved a complete response that has been maintained for more than 3 years. A patient with acute myeloid leukemia achieved a morphologically leukemia-free state, went on to transplant, and is still alive and leukemia-free.
A patient with Burkitt lymphoma achieved a partial response after 3 cycles of therapy that was maintained for 17 cycles. She discontinued CPI-613 to have her residual disease resected, and has not received any treatment since. She is now disease-free more than 12 months later.
A patient with cutaneous T-cell lymphoma achieved a partial response that has been sustained for more than 2 years. At her request, she started to receive continuous therapy (no 1-week rest period), and she remains on treatment without significant toxicities and no evidence of marrow suppression.
The 2 patients with prolonged stable disease had MDS. Their disease was stable for 8 and 12 cycles, respectively. Two patients with multiple myeloma also initially had stable disease, but they progressed after 2 and 4 cycles, respectively.
Two patients died from disease progression while on study.
The researchers said these results suggest that agents targeting mitochondrial metabolism can be safe and active in hematologic malignancies. A phase 2 trial of CPI-613 is now underway.
Support for the phase 1 trial was provided by National Cancer Institute grants P30CA012197 and 1K08CA169809, the Doug Coley Foundation for Leukemia Research, the Frances P. Tutwiler Fund, The MacKay Foundation for Cancer Research, and Cornerstone Pharmaceuticals, which manufactured and provided CPI-613.
USPSTF recommends low-dose aspirin for preeclampsia prevention
The use of low-dose aspirin is advisable after 12 weeks of gestation in asymptomatic pregnant women at high risk for developing preeclampsia, according to a recommendation from the U.S. Preventive Services Task Force.
The recommendation, published online Sept. 8 in the Annals of Internal Medicine, is based on a review of new evidence suggesting that the net benefit of low-dose aspirin for preventing preeclampsia is of substantial magnitude. It updates a 1996 recommendation from the USPSTF, which concluded that there was insufficient evidence at that time to recommend for or against the routine use of aspirin for the prevention of preeclampsia.
The current evidence – including 15 randomized controlled trials used to assess the health benefits of low-dose aspirin, 13 randomized controlled trials used to evaluate preeclampsia incidence, and 19 randomized controlled trials and 2 good-quality observational studies used to evaluate harms associated with low-dose aspirin use – suggests that women at risk may benefit from low-dose aspirin beginning after 12 weeks of gestation.
Preeclampsia complicates 2%-8% of pregnancies worldwide, and accounts for 15% of preterm births and 12% of maternal deaths in the United States, according to the task force.
The use of low-dose aspirin is advisable after 12 weeks of gestation in asymptomatic pregnant women at high risk for developing preeclampsia, according to a recommendation from the U.S. Preventive Services Task Force.
The recommendation, published online Sept. 8 in the Annals of Internal Medicine, is based on a review of new evidence suggesting that the net benefit of low-dose aspirin for preventing preeclampsia is of substantial magnitude. It updates a 1996 recommendation from the USPSTF, which concluded that there was insufficient evidence at that time to recommend for or against the routine use of aspirin for the prevention of preeclampsia.
The current evidence – including 15 randomized controlled trials used to assess the health benefits of low-dose aspirin, 13 randomized controlled trials used to evaluate preeclampsia incidence, and 19 randomized controlled trials and 2 good-quality observational studies used to evaluate harms associated with low-dose aspirin use – suggests that women at risk may benefit from low-dose aspirin beginning after 12 weeks of gestation.
Preeclampsia complicates 2%-8% of pregnancies worldwide, and accounts for 15% of preterm births and 12% of maternal deaths in the United States, according to the task force.
"The USPSTF found adequate evidence of a reduction in risk for preeclampsia, preterm birth, and IUGR [intrauterine growth restriction] in women at increased risk for preeclampsia who received low-dose aspirin, thus demonstrating substantial benefit. Low-dose aspirin (range, 60-150 mg/day) reduced the risk for preeclampsia by 24% in clinical trials [pooled relative risk, 0.76] and reduced the risk for preterm birth by 14% and IUGR by 20% [pooled relative risk, 0.86 and 0.80, respectively]," the updated recommendation stated (Ann. Intern. Med. 2014 Sept. 8 [doi:10.7326/m14-1884]).
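The percent reductions quoted above follow directly from the pooled relative risks: a relative risk of 0.76 means the treated group's risk was 76% of the control group's, i.e., a 24% reduction. A minimal illustration of that arithmetic (the outcome labels and figures are taken from the recommendation; the code itself is only a check, not part of the USPSTF analysis):

```python
# Percent risk reduction implied by a pooled relative risk:
# reduction (%) = (1 - RR) * 100
pooled_rr = {"preeclampsia": 0.76, "preterm birth": 0.86, "IUGR": 0.80}

for outcome, rr in pooled_rr.items():
    reduction = round((1 - rr) * 100)
    print(f"{outcome}: RR {rr} -> {reduction}% risk reduction")
# preeclampsia -> 24%, preterm birth -> 14%, IUGR -> 20%
```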
Adequate evidence also indicates that low-dose aspirin is not associated with any increase in the risk of placental abruption, postpartum hemorrhage, fetal intracranial bleeding, or perinatal mortality.
"Evidence on long-term outcomes in offspring exposed in utero to low-dose aspirin is limited, but no developmental harms were identified by age 18 months in the one study reviewed," the task force wrote, concluding – with moderate certainty – that there is a substantial net benefit of daily low-dose aspirin use to reduce the risk for preeclampsia, preterm birth, and IUGR in women at high risk.
The decision to initiate low-dose aspirin therapy in this population is typically based on medical history; there are no validated methods for identifying women at high risk based on biomarkers, clinical diagnostic tests, or medical history. However, as part of the recommendation, the USPSTF provided a pragmatic approach that may help identify those at risk.
"Women with one or more risk factors should receive low-dose aspirin. Women with several moderate risk factors may also benefit from low-dose aspirin," the task force noted, adding that the evidence for the latter approach is less certain, and that clinicians should use clinical judgment and discuss the risks and benefits with patients.
The recommendation applies to asymptomatic women at risk in whom low-dose aspirin is not contraindicated, and defines women at high risk as those with a history of preeclampsia, especially those with an adverse outcome; chronic hypertension, renal disease, type 1 or 2 diabetes, or an autoimmune disease; and those with multifetal gestation, according to the updated recommendation.
Moderate risk factors include nulliparity, obesity, a family history of preeclampsia, age 35 years or older, African American race, low socioeconomic status, low birth weight or small for gestational age, a pregnancy interval of greater than 10 years, or a previous adverse pregnancy outcome.
As for appropriate dosing, the most common dosage across studies was 100 mg, but the two largest trials contributing to benefit estimates used 60 mg.
An 81-mg dose was not specifically evaluated, but is commonly available in the United States in tablet form, and is a reasonable dosage for preeclampsia prophylaxis, the task force said.
The updated recommendation is generally in keeping with those of other organizations, including the American College of Obstetricians and Gynecologists, the World Health Organization, the National Institute for Health and Clinical Excellence, the American Heart Association/American Stroke Association, and the American Academy of Family Physicians. For example, ACOG recommends initiating daily low-dose aspirin during the late first trimester in those with a history of early-onset preeclampsia and preterm delivery, or with a history of preeclampsia in more than one prior pregnancy ("Hypertension in Pregnancy" [Washington, D.C.: American College of Obstetricians and Gynecologists, 2013]), and WHO recommends daily low-dose aspirin as early as 12 weeks for those at high risk ("WHO Recommendations for Prevention and Treatment of Pre-Eclampsia and Eclampsia" [Geneva: World Health Organization, 2011]).
The review by the USPSTF identified several research needs. For example, additional study is needed on the effects of low-dose aspirin on the development of preeclampsia and how patient response is affected by various risk factors. Research is also needed on how to improve clinicians’ ability to identify those at risk, and particularly those who would benefit most from prophylaxis. Study is needed on risk assessment tools, and on populations at particular risk, such as African American and nulliparous women.
Future trials should recruit adequate numbers of women from racial/ethnic populations that are at disproportionate risk.
"Larger studies on aspirin use in the first or early second trimester may improve the evidence base on optimal timing of low-dose aspirin preventive medication. Other areas of research include optimal therapies that individualize the aspirin dosage and timing of administration (e.g., morning vs. bedtime)," they concluded, noting that research is also needed to explore less-well-established risk factors, and to investigate whether preeclampsia prevention with low-dose aspirin affects long-term risk for cardiovascular disease, and whether there is any benefit to continuing low-dose aspirin after delivery in those at high risk.
FROM ANNALS OF INTERNAL MEDICINE
Psychiatry, free speech, school safety, and cannibalism
Over the past few days, an article has circulated about a 23-year-old middle school teacher in Cambridge, Md., who was suspended from his job because of two futuristic novels he wrote, including one about a school massacre 900 years in the future. The story was reported in The Atlantic under the headline, "In Maryland, a Soviet-Style Punishment for a Novelist."
The article, by Jeffrey Goldberg, said the young teacher had self-published his novels some time ago under a pseudonym. In addition to his being suspended, an "emergency medical evaluation" was ordered, his house was searched, and the school was swept for bombs by K-9 dogs. No charges have been filed as of this writing.
This response was deemed an "overreaction," and it has certainly been good for book sales, though probably not for the young man’s teaching career. The idea that artistic expression must conform to a specific standard or else jeopardize one’s job leaves those with creative pursuits worried and civil rights advocates protesting.
Soon after, the Los Angeles Times published an article stating that the issue was not the novels – the school knew about those in 2012 – but rather the content of a four-page letter the teacher had written to the school board, which suggested that he was suffering from some type of psychiatric condition and may have indicated that he was suicidal or dangerous. With this information, it was less clear whether the police response was an overreaction, and such determinations are generally made in hindsight: If a bomb is found, the response was heroic; if not, it was an overreaction and a civil rights violation.
The case reminded me of the story about a New York City police officer who had Internet discussions about his desire to cook and eat women, including his ex-wife. While the officer never ate anyone, he was part of an online community called the Dark Fetish Network, which has tens of thousands of registered users who discuss violent sexual fantasies. The officer, known in the media frenzy as Cannibal Cop, lost his job and was convicted of plotting to kidnap, a crime that could carry a life sentence. He reportedly had graphic discussions of plans to kill, roast, and eat specified victims, and he claimed that he had the means to do so. An investigation revealed that he did not own the implements that would enable him to carry out such a plan. His lawyer insisted that he was engaged in a role-playing fantasy, but he was convicted by a jury in 2012. In July, his conviction was overturned and he was released on bond. By that time, Cannibal Cop had served a year and a half in prison, several months of it in solitary confinement.
Situations in which a person has done nothing illegal but has spoken or written words that indicate he or she might be a threat to public safety are fraught with concerns. While violent fantasies might be seen as "creepy" at a minimum, the criminal justice system is left to decide where the line is between fantasy and plan, and when a real threat exists. A person has the right to his dark fantasies, and the First Amendment right to free speech allows for discussion of those fantasies, while artistic endeavors allow for their expression. At the same time, if there are named or presumed victims, those individuals should not have to live with the terror of wondering if the fantasizer is going to act on the fantasies.
Invariably, psychiatrists end up being involved, even if the individual in question has no psychiatric history or obvious diagnosis. In a New York magazine article about the police officer titled, "A Dangerous Mind," Robert Kolker noted: "Pre-crime and psychiatry often go hand in hand. Legal instruments like institutionalization and sex-offender registration all share the goal of preventing crime from taking place, and for better or worse, they’re based on a psychiatric rationale."
As we all know, it can be difficult – if not impossible – to distinguish those who are having fantasies from those who are planning to commit a dangerous act. As psychiatrists, we deal with this uncertainty for patients who have suicidal thoughts on a regular basis. Often, even the patients don’t know for sure if they will act on their impulses. Fantasies that involve harming others are more unusual in clinical practice, and our risk assessment often begins with the stated intent of the individual. Our strongest predictor of future behavior continues to be past behavior, and neither the teacher nor the police officer in the stories above had criminal records.
To make it even more confusing, the Internet has added to the uncertainty; people have always had dangerous and fetishistic fantasies, but now there are ways others can learn the content of what was once very private. The risk, of course, is that fantasies and artistic endeavors become subject to both psychiatric scrutiny and criminal prosecution in a way that threatens civil rights and squelches creativity.
Dr. Miller is a coauthor of "Shrink Rap: Three Psychiatrists Explain Their Work" (Baltimore: The Johns Hopkins University Press, 2011).
Nostalgic for making diagnoses based on medical history alone? Me, neither
"It is clear that physicians take widely different attitudes towards investigations, some relying on them much more heavily than others."
– J.R. Hampton, et al., British Medical Journal, May 31, 1975
In 1975, a study was published in the British Medical Journal looking at new patients referred to a medical clinic. The study looked specifically at how a diagnosis was made for each new patient, and it concluded that in 66 out of 80 patients, a diagnosis was arrived at based on the history alone, that the physical exam was useful in an additional 7 patients, and that lab testing was useful in the last 7 patients (Br. Med. J. 1975;2:486-9). Since then, medical students have been taught that a good history will lead to the diagnosis about 80% of the time.
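The aphorism is a rounding of the study's own tally: 66 of 80 diagnoses came from the history alone, which is 82.5%. A simple check, using only the breakdown reported in the paper:

```python
# Hampton et al. (Br. Med. J. 1975): diagnostic yield among 80 new outpatients
total = 80
by_history = 66   # diagnosis reached from the history alone
by_exam = 7       # physical exam supplied the diagnosis
by_lab = 7        # laboratory testing supplied the diagnosis

assert by_history + by_exam + by_lab == total
print(f"History alone: {by_history / total:.1%}")  # 82.5% -- the "about 80%" rule
```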
One of my professors in my internal medicine class taught me this aphorism. I thought he was brilliant. Surely, anyone who can come up with a diagnosis just by talking to a patient is a minor god? (He is a rheumatologist. He became my mentor, and is, in fact, one of the major influences in my choice of specialty.)
Medical school in the Philippines forces one to think that way anyhow. There is very little government-provided health care; almost everything is paid for out of pocket. This means that every CBC I order, every electrolyte panel, every antinuclear antibody, every urinalysis, is charged to the patient. There are not enough hospital beds, ventilators, or MRI machines.
So patients waited a while before seeking medical attention, which meant we took full advantage of histories that were remarkably evolved, with classic, textbook physical exam findings. The general medicine wards were crammed with patients with jaundice, whether from hepatitis or from the carcass of a dead ascaris worm lodged in the bile ducts. We saw fungating breast masses. Hyperthyroidism was not hard to identify when the patient was in frank thyrotoxicosis. Patients with pulmonary tuberculosis arrived in the emergency department with massive hemoptysis, buckets of blood, and acid-fast bacilli.
This is the environment in which I trained. Could I say then that I would be able to identify a problem just from taking a history alone? If you had months’, nay, years’ worth of history to work with, you’d be able to identify the problem, too. I’ll bet doctors who practiced in the 1960s and 1970s in the United States had similar experiences, having the benefit of witnessing full-blown cases of anything and everything.
Do I practice this way now? Not at all.
But that isn’t to say I don’t take a good enough history or physical exam; it’s just that patients seek medical attention earlier, and we have so many more resources at our disposal. We have the ability to detect illness before it wreaks havoc. (There are other, less charitable interpretations of this behavior, such as lack of time, patient expectations, etc. But that’s a topic for another time.)
I have a great deal of respect for my mentors who practice with very real limitations. I have no doubt they are better doctors than I am. And, of course, I feel nostalgic for the way we used to do things. It is easy to romanticize the sepia-toned snapshots of my third-world youth. But really, modernity is a blessing. We should celebrate our ability to find things early. Nostalgia is for meals and memories. Medicine is much more pedestrian than that.
Dr. Chan practices rheumatology in Pawtucket, R.I.
Method may fight inhibitor formation in hemophilia A
A new strategy may one day prevent hemophilia patients from developing antibodies that inhibit clotting factors.
With this method, plant cells “teach” the immune system to tolerate the clotting factor protein.
In mice with hemophilia A, the strategy prevented and reversed the formation of factor VIII (FVIII) inhibitors.
Henry Daniell, PhD, of the University of Pennsylvania School of Dental Medicine in Philadelphia, and his colleagues described the approach in Blood. The work was supported by the National Institutes of Health and Bayer.
“The only current treatments for inhibitor formation cost $1 million and are risky for patients,” Dr Daniell said. “Our technique, which uses plant-based capsules, has the potential to be a cost-effective and safe alternative.”
Developing the technique
Previous studies had shown that exposing the immune system to individual components of the clotting factor protein could induce tolerance to the whole protein.
FVIII is composed of a heavy chain and a light chain, with each containing 3 domains. For their study, the researchers used the whole heavy chain and the C2 domain of the light chain.
Dr Daniell and his colleagues developed a platform for delivering drugs and biotherapeutics that relies on genetically modifying plants so they express the protein of interest.
Trying that same method with the components of the FVIII molecule, the team first fused the heavy chain DNA with DNA encoding a cholera toxin subunit, a protein that can cross the intestinal wall and enter the bloodstream, and did the same with the C2 DNA.
They introduced the fused genes into tobacco chloroplasts, so that some plants expressed the heavy chain and cholera toxin proteins and others expressed the C2 and cholera toxin proteins. They then ground up the plant leaves and suspended them in a solution, mixing the heavy chain and C2 solutions together.
Testing in mice
The researchers fed the mixed solution to mice with hemophilia A twice a week for 2 months and compared them to mice that consumed unmodified plant material. The team then gave the mice infusions of FVIII.
As expected, the control mice formed high levels of inhibitors. But the mice fed the experimental plant material formed much lower levels of inhibitors—on average, 7 times lower.
Mice that consumed the experimental plants exhibited upregulation of cytokines associated with suppressing or regulating immune responses, while control mice showed upregulation of cytokines associated with triggering an immune response.
By transferring subsets of regulatory T cells taken from the mice that received the experimental plants into normal mice, the researchers were able to suppress inhibitor formation. This suggests the T cells were able to carry tolerance-inducing characteristics to the new population of animals.
“This gives us an explanation for the mechanism of how this tolerance is being created,” Dr Daniell said.
Finally, the researchers tried to reverse inhibitor formation. They fed the experimental plant material to mice that had already developed inhibitors.
Compared to a control group, the mice given the FVIII-containing plant material showed inhibitor formation that slowed and then reversed, with levels decreasing 3- to 7-fold over 2 to 3 months of feeding.
This strategy holds promise for preventing, and even reversing, inhibitor formation in patients with hemophilia who receive FVIII infusions. However, the researchers’ experiments showed that inhibitor levels can rise again over time.
“After some time, antibodies do develop if you stop giving them the plant material,” Dr Daniell said. “This is not a one-time treatment. You need to do it repetitively to maintain the tolerance.”
Dr Daniell and the Penn Center for Innovation are now working with a pharmaceutical company to test this oral tolerance strategy in other animal species, with plans to begin human trials shortly thereafter. For human use, the goal would be to use lettuce plants instead of tobacco plants.
Banked blood grows stiffer with age, study shows
Credit: Daniel Gay
The longer blood is stored, the less it is able to carry oxygen into the tiny microcapillaries of the body, according to a study published in Scientific Reports.
Using advanced optical techniques, researchers measured the stiffness of the membrane surrounding red blood cells.
They found that, even though the cells retain their shape and hemoglobin content, the membranes get stiffer over time, which steadily decreases the cells’ functionality.
“Our results show some surprising facts: Even though the blood looks good on the surface, its functionality is degrading steadily with time,” said study author Gabriel Popescu, PhD, of the University of Illinois at Urbana-Champaign.
Dr Popescu and his colleagues wanted to measure changes in red blood cells over time to help determine what effect older blood could have on a patient.
They used an optical technique called spatial light interference microscopy (SLIM), which was developed in Dr Popescu’s lab in 2011. It uses light to noninvasively measure cell mass and topology with nanoscale accuracy. Through software and hardware advances, the SLIM system today acquires images almost 100 times faster than it did 3 years ago.
The researchers took time-lapse images of red blood cells, measuring and charting their properties. In particular, the team was able to measure nanometer-scale motions of the cell membrane, which are indicative of the cell’s stiffness and function. The fainter the membrane motion, the less functional the cell.
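The link between fainter membrane motion and a stiffer cell can be illustrated with a toy calculation. This is only a sketch: the fluctuation amplitudes, the synthetic data, and the simple equipartition-style scaling (stiffness inversely proportional to mean-square displacement) are assumptions for illustration, not the study’s actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_fluctuation(heights):
    """RMS deviation (nm) of a membrane-height time series from its mean."""
    h = np.asarray(heights, dtype=float)
    return float(np.sqrt(np.mean((h - h.mean()) ** 2)))

# Hypothetical time series: a fresh cell's membrane flickers more than a stored one's.
fresh  = 50 + 40 * rng.standard_normal(10_000)   # ~40 nm fluctuations
stored = 50 + 10 * rng.standard_normal(10_000)   # ~10 nm fluctuations

# In an equipartition picture, effective stiffness k_eff ~ k_B*T / <dh^2>
# (up to geometry factors), so fainter motion reads as a stiffer membrane.
k_rel_fresh  = 1.0 / rms_fluctuation(fresh) ** 2
k_rel_stored = 1.0 / rms_fluctuation(stored) ** 2
print(k_rel_stored / k_rel_fresh)  # much greater than 1: stored cell is stiffer
```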
The measurements revealed that a lot of characteristics stay the same over time. The cells retain their shape, mass, and hemoglobin content, for example.
However, the membranes become stiffer and less elastic as time passes. This is important because the cells need to be flexible enough to travel through tiny capillaries and permeable enough for oxygen to pass through.
“In microcirculation, such as that in the brain, cells need to squeeze through very narrow capillaries to carry oxygen,” said study author Basanta Bhaduri, PhD, of the University of Illinois at Urbana-Champaign.
“If they are not deformable enough, the oxygen transport is impeded to that particular organ, and major clinical problems may arise. This is the reason why new red blood cells are produced continuously by the bone marrow, such that no cells older than 100 days or so exist in our circulation.”
The researchers hope the SLIM imaging method will be used clinically to monitor stored blood before it is given to patients, since conventional white-light microscopes can be easily adapted for SLIM with a few extra components.
“These results can have a wide variety of clinical applications,” said author Krishna Tangella, MD, of the University of Illinois at Urbana-Champaign.
“Functional data from red blood cells would help physicians determine when to give red cell transfusions for patients with anemia. This study may help better utilization of red cell transfusions, which will not only decrease healthcare costs but also increase the quality of care.”
Overcoming an obstacle to RBC development
Researchers have discovered a natural barrier to hematopoiesis and a way to circumvent it, according to a paper published in Blood.
The group found that components of the exosome complex—exosc8 and exosc9—suppress red blood cell (RBC) maturation.
“From a fundamental perspective, this is very important because this mechanism counteracts the development of precursor cells into red blood cells, thereby establishing a balance between developed cells and the progenitor population,” said study author Emery Bresnick, PhD, of the University of Wisconsin School of Medicine and Public Health in Madison.
“In the context of translation, if you want to maximize the output of end-stage red blood cells, which we’re not able to do at this time, our study provides a rational approach involving lowering the levels of these subunits.”
Specifically, the researchers found that GATA1 and Foxo3 can repress the exosome components, thereby allowing RBC maturation to proceed.
The barrier explained
Dr Bresnick and his colleagues noted that the primary obstacle in converting hematopoietic stem cells into RBCs involves late-stage maturation.
“The problem isn’t simply getting erythroid precursors produced by the bucket, but understanding how these cells systematically lose their nuclei and organelles to become a red blood cell, the final product,” Dr Bresnick said.
“This is the bottleneck, even in the stem cell world of embryonic and induced pluripotent stem cells. We know little about how the cell orchestrates the intricate processes that constitute late-stage maturation.”
At the end of RBC development, the erythroid precursor must eject its own genetic material via enucleation. Although it’s clear why enucleation is important (making the cell more flexible and allowing it to carry more oxygen), exactly how the cell does it has been unclear.
Besides ejecting the nucleus, the cell must be cleared of other organelles, such as the endoplasmic reticulum and mitochondria. This process (autophagy) is linked to a pair of transcription factors—GATA1 and Foxo3—that control gene expression important in RBC development.
Because they knew GATA1 and Foxo3 promote autophagy, Dr Bresnick and his colleagues wondered if the proteins these transcription factors repress play an important role in cell maturation.
This led them to identify exosc8 and exosc9, two subunits of the exosome complex that ultimately establish the developmental barrier.
The researchers plan to continue studying the exosome, as many RNAs in the cell escape degradation by it. Determining exactly how the exosome selects which RNAs to dispose of may provide an even better understanding of the newly discovered barrier.
“One goal we have is to establish the specific RNA targets the exosome is regulating that are responsible for the blockade,” Dr Bresnick said. “In doing so, we might even uncover targets that are easier to manipulate than the exosome itself.”
Health Canada approves dabigatran for VTE
Credit: Kevin MacKenzie
Health Canada has approved dabigatran etexilate (Pradaxa) for the treatment and prevention of venous thromboembolism (VTE).
Dabigatran is a novel, reversible, oral direct thrombin inhibitor that has been on the market for more than 5 years and is approved in more than 100 countries.
Health Canada’s latest approval of dabigatran is based on results from four phase 3 trials—RE-MEDY, RE-SONATE, and RE-COVER I and II.
The trials suggested that dabigatran, given at 150 mg twice daily, can treat deep vein thrombosis and pulmonary embolism and prevent their recurrence.
RE-COVER I
In the first RE-COVER trial, dabigatran proved noninferior to warfarin for preventing VTE recurrence, and rates of major bleeding were similar between the treatment arms. However, patients were more likely to discontinue dabigatran due to adverse events.
VTE recurred in 2.4% of patients treated with dabigatran and 2.1% of patients who received warfarin (P<0.001 for noninferiority).
Bleeding events occurred in 16.1% of patients who received dabigatran and 21.9% of warfarin-treated patients (P<0.001). Major bleeding occurred in 1.6% and 1.9% of patients, respectively (P=0.38).
The numbers of deaths, acute coronary syndromes, and abnormal liver-function tests were similar between the treatment arms. But adverse events leading to treatment discontinuation occurred in 9.0% of dabigatran-treated patients and 6.8% of patients in the warfarin arm (P=0.05).
Results from RE-COVER were presented at ASH 2009 and published in NEJM.
RE-COVER II
The RE-COVER II trial suggested that dabigatran was noninferior to warfarin for preventing VTE recurrence and related deaths. This outcome occurred in 2.3% of dabigatran-treated patients and 2.2% of warfarin-treated patients (P<0.001 for noninferiority).
Major bleeding occurred in 1.2% of patients who received dabigatran and 1.7% of patients who received warfarin. Any bleeding occurred in 15.6% and 22.1% of patients, respectively.
Overall, rates of death, adverse events, and acute coronary syndromes were similar between the treatment arms.
Results from RE-COVER II were published in Circulation in 2013.
RE-MEDY and RE-SONATE
The RE-MEDY and RE-SONATE trials were designed to evaluate dabigatran as extended VTE prophylaxis. Results of both trials were reported in a single NEJM article published in 2013.
The RE-MEDY trial showed that dabigatran was noninferior to warfarin as extended prophylaxis for recurrent VTE, and warfarin presented a significantly higher risk of bleeding.
VTE recurred in 1.8% of patients in the dabigatran arm and 1.3% of patients in the warfarin arm (P=0.01 for noninferiority). And the rate of clinically relevant or major bleeding was lower with dabigatran than with warfarin—at 5.6% and 10.2%, respectively (P<0.001).
Results of the RE-SONATE trial showed that dabigatran was superior to placebo for preventing recurrent VTE, although the drug significantly increased the risk of major or clinically relevant bleeding.
VTE recurred in 0.4% of patients in the dabigatran arm and 5.6% of patients in the placebo arm (P<0.001). Clinically relevant or major bleeding occurred in 5.3% of patients in the dabigatran arm and 1.8% of patients in the placebo arm (P=0.001).
Safety concerns with dabigatran
Over the years, the safety of dabigatran has been called into question, as serious bleeding events have been reported in patients taking the drug.
However, results of two investigations by the US Food and Drug Administration—one reported in 2012 and one reported this year—have suggested the benefits of dabigatran outweigh the risks.
Recently, a series of papers published in The BMJ raised concerns about dabigatran, claiming the drug’s developer underreported adverse events and withheld data showing that monitoring and dose adjustment could improve the safety of dabigatran without compromising its efficacy. The developer, Boehringer Ingelheim, denied these allegations.
For more information on dabigatran, see its product monograph.