CAR T-cell therapy: The good and the bad
Credit: MSKCC
Several studies have shown that infusions of T cells modified with chimeric antigen receptors (CARs) can elicit complete responses in leukemia patients who have run out of treatment options.
However, the therapy also puts patients at risk of developing cytokine release syndrome (CRS).
In an updated analysis, investigators have again shown that CAR T cells can produce complete responses in patients with relapsed or refractory B-cell acute lymphoblastic leukemia (B-ALL), thereby allowing them to receive an allogeneic stem cell transplant (allo-SCT).
But the researchers have also used this group of patients to define diagnostic criteria for severe CRS. And the team has discovered that measuring C-reactive protein levels can help predict the severity of CRS.
Michel Sadelain, MD, PhD, of Memorial Sloan-Kettering Cancer Center in New York, and his colleagues described these findings in Science Translational Medicine.
Response, bridge to allo-SCT
Dr Sadelain and his colleagues previously reported results in 5 patients with relapsed/refractory B-ALL who received autologous T cells expressing a CD19-specific, CD28/CD3ζ CAR called 19-28z.
After receiving salvage chemotherapy and CAR T cells, all 5 patients were negative for minimal residual disease. And 4 of the patients went on to receive allo-SCT.
Now, the investigators have expanded upon these findings, reporting results in a total of 16 patients with relapsed/refractory B-ALL who received the 19-28z CAR T cells.
Forty-four percent of patients (n=7) had a complete response to the salvage chemotherapy, and 88% (n=14) had a complete response after CAR T-cell therapy (although some had incomplete count recovery). Sixty-three percent of patients (n=10) achieved a complete remission.
Of the 10 patients who were eligible for allo-SCT, 7 underwent the procedure, and all 7 remain free of relapse.
“These extraordinary results demonstrate that cell therapy is a powerful treatment for patients who have exhausted all conventional therapies,” Dr Sadelain said. “Our initial findings have held up in a larger cohort of patients, and we are already looking at new clinical studies to advance this novel therapeutic approach in fighting cancer.”
CRS diagnosis, stratification
In their analysis of 5 B-ALL patients, Dr Sadelain and his colleagues observed a correlation between cytokine elevation and tumor burden at the time of CAR T-cell administration. The team confirmed this correlation in the larger cohort of 16 patients and identified 7 cytokines whose elevation was correlated with pretreatment tumor burden and severe CRS.
Patients with CRS that required intensive medical intervention had a 75-fold increase over baseline levels in 2 of 7 cytokines: IFN-γ, IL-5, IL-6, IL-10, Flt-3L, fractalkine, and GM-CSF. These patients also had at least 1 of the following: hypoxia, hypotension, and neurologic changes (such as delirium and seizure-like activity).
Taking these findings together, the researchers concluded that patients had severe CRS if they had persistent fevers (≥38°C) for more than 3 days, selected cytokine elevations, and additional clinical evidence of toxicity.
The investigators stressed that these patients should be closely monitored. Patients with severe CRS are more likely to need medical intervention than patients with mild CRS, which is characterized by low-grade fever and mild cytokine increases, or absent CRS, which is defined as no fevers and/or no significant cytokine elevations.
Finally, the researchers found that measuring C-reactive protein in serum samples could predict the severity of CRS. Only those patients who met the criteria for severe CRS had a C-reactive protein level of 20 mg/dL or higher.
Patients who had received high-dose steroids were excluded from this analysis, due to the inverse correlation between high-dose steroid treatment and serum C-reactive protein.
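Expressed as decision logic, the severity criteria described above might look like the following sketch. This is a minimal illustration assuming the thresholds reported here (fever of at least 38°C for more than 3 days, a 75-fold rise over baseline in 2 of the 7 cytokines, plus at least 1 clinical sign of toxicity); the function and its inputs are hypothetical, not part of the published algorithm.

```python
# Minimal sketch of the CRS severity criteria described above.
# Thresholds come from this article; names and inputs are hypothetical.

CYTOKINES = ["IFN-gamma", "IL-5", "IL-6", "IL-10", "Flt-3L",
             "fractalkine", "GM-CSF"]

def classify_crs(fever_days, fold_changes, clinical_toxicity):
    """Classify CRS as 'severe', 'mild', or 'absent'.

    fever_days        -- consecutive days of fever at or above 38 C
    fold_changes      -- dict mapping cytokine name -> fold increase
                         over the patient's pre-infusion baseline
    clinical_toxicity -- True if hypoxia, hypotension, or neurologic
                         changes are present
    """
    elevated = sum(fold_changes.get(c, 0) >= 75 for c in CYTOKINES)
    if fever_days > 3 and elevated >= 2 and clinical_toxicity:
        return "severe"
    if fever_days > 0 or elevated > 0:
        return "mild"  # low-grade fever and/or mild cytokine increases
    return "absent"

# Example: 5 days of fever, marked IL-6 and IFN-gamma rises, hypotension
print(classify_crs(5, {"IL-6": 120, "IFN-gamma": 90}, True))  # severe
```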
Incidentally, the investigators confirmed prior findings that the monoclonal antibody tocilizumab can ameliorate severe CRS as effectively as steroid treatment, without inhibiting the expansion of CAR T cells.
CDC Identifies Greatest Antibiotic Resistance Threats of Next Decade in U.S.
Clinical question: What antibiotic-resistant bacteria are the greatest threats for the next 10 years?
Background: Two million people in the United States suffer antibiotic-resistant infections each year, and 23,000 die as a result. Most of these infections occur in the community, but deaths usually occur in healthcare settings. Cost estimates vary but may be as high as $20 billion in excess direct healthcare costs.
Study design: The CDC used several different surveys and databanks, including the National Antimicrobial Resistance Monitoring System, to collect data. The threat level for antibiotic-resistant bacteria was determined using several factors: clinical impact, economic impact, incidence, 10-year projection of incidence, transmissibility, availability of effective antibiotics, and barriers to prevention.
Setting: United States.
Synopsis: The CDC has three classifications of antibiotic-resistant bacteria: urgent, serious, and concerning. Urgent threats are high-consequence, antibiotic-resistant threats because of significant risks identified across several criteria. These threats might not currently be widespread but have the potential to become so and require urgent public health attention to identify infections and to limit transmission. They include carbapenem-resistant Enterobacteriaceae, drug-resistant Neisseria gonorrhoeae, and Clostridium difficile (which is not truly antibiotic resistant but is included because infection is a consequence of antibiotic overuse).
Serious threats are significant antibiotic-resistant threats. These threats will worsen and might become urgent without ongoing public health monitoring and prevention activities. They include multidrug-resistant Acinetobacter, drug-resistant Campylobacter, fluconazole-resistant Candida (a fungus), extended-spectrum β-lactamase-producing Enterobacteriaceae, vancomycin-resistant Enterococcus, multidrug-resistant Pseudomonas aeruginosa, drug-resistant non-typhoidal Salmonella, drug-resistant Salmonella Typhi, drug-resistant Shigella, methicillin-resistant Staphylococcus aureus, drug-resistant Streptococcus pneumoniae, and drug-resistant tuberculosis.
Concerning threats are bacteria for which the threat of antibiotic resistance is low, and/or there are multiple therapeutic options for resistant infections. These bacterial pathogens cause severe illness.
Threats in this category require monitoring and, in some cases, rapid incident or outbreak response. These include vancomycin-resistant Staphylococcus aureus, erythromycin-resistant Group A Streptococcus, and clindamycin-resistant Group B Streptococcus.
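For readers who want the three tiers at a glance, the classification can be captured in a simple lookup structure. The sketch below is only an illustration of the taxonomy as summarized in this article (the serious tier is abridged to four examples); it is not an official CDC data format.

```python
# Illustrative encoding of the CDC's 2013 threat tiers described above.
# Organism lists are abridged to examples named in this article.
THREAT_TIERS = {
    "urgent": [
        "carbapenem-resistant Enterobacteriaceae",
        "drug-resistant Neisseria gonorrhoeae",
        "Clostridium difficile",  # listed for antibiotic overuse, not true resistance
    ],
    "serious": [
        "multidrug-resistant Acinetobacter",
        "vancomycin-resistant Enterococcus",
        "methicillin-resistant Staphylococcus aureus",
        "drug-resistant tuberculosis",
        # ...plus the other serious threats listed above
    ],
    "concerning": [
        "vancomycin-resistant Staphylococcus aureus",
        "erythromycin-resistant Group A Streptococcus",
        "clindamycin-resistant Group B Streptococcus",
    ],
}

def tier_of(organism):
    """Return the CDC threat tier for an organism, or None if unlisted."""
    for tier, organisms in THREAT_TIERS.items():
        if organism in organisms:
            return tier
    return None

print(tier_of("drug-resistant Neisseria gonorrhoeae"))  # -> urgent
```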
Research has shown patients with resistant infections have significantly longer hospital stays, delayed recuperation, long-term disability, and higher mortality. As resistance to current antibiotics occurs, providers are forced to use antibiotics that are more toxic, more expensive, and less effective.
The CDC recommends four core actions to fight antibiotic resistance:
- Preventing infections from occurring and preventing resistant bacteria from spreading (immunization, infection control, screening, treatment, and education);
- Tracking resistant bacteria;
- Improving the use of antibiotics (antibiotic stewardship); and
- Promoting the development of new antibiotics and new diagnostic tests for resistant bacteria.
Bottom line: Antibiotics are a limited resource. The more antibiotics are used today, the less likely they are to remain effective in the future. The CDC lists 18 antibiotic-resistant organisms as urgent, serious, or concerning and recommends actions to combat the spread of current organisms and the emergence of new antibiotic-resistant organisms.
Citation: Centers for Disease Control and Prevention. Antibiotic resistance threats in the United States, 2013. CDC website. September 16, 2013. Available at: www.cdc.gov/drugresistance/threat-report-2013. Accessed Nov. 30, 2013.
Team-Based Care Model Improves Communication, Coordination Among Hospital Staffs
A care model in which physicians and nurses do bedside rounds as a team can reduce average length of stay (LOS) and in-hospital mortality, according to a recent Harvard Business Review blog post written by a group of healthcare providers from Emory Healthcare in Atlanta.
In September 2010, members of a quality-improvement program at Emory University Hospital in Atlanta reorganized a 24-bed medical unit where six hospital medicine teams had seen patients into an Accountable Care Unit (ACU). They designed the ACU to have four key features: unit-based physician teams; structured interdisciplinary bedside rounds; unit-level performance reports; and unit co-management by nurse and physician directors.
Physicians were assigned to units so more than 90% of their patients could be located on the same floor, which allows scheduling of a permanent daily start time for bedside rounds. This consistent schedule allows the entire health care team to round together. It also makes it easier for family members to know when rounds are happening so they can ask questions and learn about the care plan.
"Rather than having six different hospital medicine teams rounding on eight different units every day, we thought it would make more sense to have those teams round on a single unit each day where all their patients were cohorted together," says hospitalist Jason Stein, MD, SFHM, and lead author of the HBR blog post. Dr. Stein is director for quality in the division of hospital medicine at the Emory University School of Medicine and an innovation advisor to the Center for Medicare and Medicaid Innovation, a unit within the Centers for Medicare and Medicaid Services.
According to the blog post, in the first year that the ACU model was implemented, the average LOS for hospitalized patients decreased from 5 days to 4.5 days and in-hospital mortality declined from 2.3 deaths per 100 encounters to 1.1 deaths per 100 encounters. The ACU model was recognized as the top innovation at the 2012 SHM Annual Meeting.
Dr. Stein and fellow blog authors noted two challenges in the ACU model: creating unit teams required assigning attending physicians to a home unit, and the structure required physicians and staff to follow a patient- and family-centered approach to care-planning activities. Dr. Stein says the group behind the creation of the ACU model is in the process of submitting its data for publication.
Bevacizumab fails to improve survival in newly diagnosed glioblastoma
The addition of bevacizumab to standard treatment did not improve survival in patients with newly diagnosed glioblastoma, and in some cases, worsened quality of life and led to cognitive decline.
Bevacizumab (Avastin) added to frontline radiation and temozolomide therapy extended progression-free survival, but did not improve overall survival in the Radiation Therapy Oncology Group (RTOG) 0825 study, a double-blind, placebo-controlled phase III trial, Dr. Mark R. Gilbert of the University of Texas M.D. Anderson Cancer Center, Houston, and his associates reported Feb. 19 in the New England Journal of Medicine.
Among 637 patients with centrally confirmed glioblastoma who were randomized, median overall survival reached 16.1 months in those assigned to radiation, temozolomide, and placebo, compared with 15.7 months in patients assigned to radiation, temozolomide, and bevacizumab (N. Engl. J. Med. 2014; 370:699-708).
Median overall survival data were virtually identical in a second similarly designed study, also reported Feb. 19 in the New England Journal of Medicine.
Median survival in Avaglio, which ran parallel to RTOG 0825, was 16.8 months in 458 patients in its radiation, temozolomide, and bevacizumab arm, vs. 16.7 months in 463 patients in its radiation, temozolomide, and placebo arm, Dr. Olivier L. Chinot of Aix-Marseille University, Marseille, France, and his associates reported (N. Engl. J. Med. 2014; 370:709-22).
Median progression-free survival in Avaglio reached 10.6 months in the bevacizumab arm vs. 6.2 months in the placebo arm, and the difference was significant (hazard ratio, 0.64; P < .0001).
The Avaglio trial showed maintained or improved quality-of-life measures but did not assess neurocognitive outcomes. More patients in the bevacizumab group than in the placebo group had grade 3 or higher adverse events (66.8% vs. 51.3%) and grade 3 or higher adverse events often associated with bevacizumab (32.5% vs. 15.8%), reported Dr. Chinot and his associates.
Both studies were presented last year at the annual meeting of the American Society of Clinical Oncology.
The RTOG 0825 study was supported by the National Cancer Institute, with additional support from Genentech. Avaglio was supported by Roche. Dr. Gilbert disclosed consulting for, and receiving honoraria and research support from, Genentech. Dr. Chinot disclosed receiving financial and nonfinancial support from Roche.
On Twitter @NikolaidesLaura
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Major finding: Two similar studies found no difference in median survival with the addition of bevacizumab to standard therapy for patients with newly diagnosed glioblastoma. Median overall survival was 15.7 months in the bevacizumab arm vs. 16.1 months in the placebo arm in one study, and 16.8 months in the bevacizumab arm vs. 16.7 months in the placebo arm in a second, similar study.
Data source: Two randomized, double-blind, placebo-controlled phase III trials; the Radiation Therapy Oncology Group (RTOG) 0825 trial included 637 patients, and the Avastin in Glioblastoma (Avaglio) trial involved 921 patients.
Disclosures: The RTOG 0825 study was supported by the National Cancer Institute, with additional support from Genentech. Avaglio was supported by Roche. Dr. Gilbert disclosed consulting for, and receiving honoraria and research support from, Genentech. Dr. Chinot disclosed receiving financial and nonfinancial support from Roche.
Treating female pattern hair loss
Female pattern hair loss (FPHL), which affects over 21 million women in the United States, is a nonscarring hair loss primarily involving the frontal and vertex scalp. FPHL causes women significant emotional and psychological distress. We see, and will continue to see, a lot of these cases in primary care. If left untreated, FPHL results in a slow, progressive decline in the density of scalp hair.
FPHL is characterized by the production of shorter and finer hairs and shortening of the growth phase of hair follicles. It is important to rule out secondary causes of hair loss, such as hyperandrogenism. So after excluding secondary causes, what are the best treatment options?
Researchers conducted a systematic review (Cochrane Database Syst. Rev. 2012 May, CD007628 [doi:10.1002/14651858.CD007628.pub3]) assessing the effectiveness of interventions for female pattern hair loss. Studies were included if they compared any type of monotherapy or combination therapy to treat FPHL. Studies evaluating treatments in women with increased circulating androgens (such as polycystic ovarian syndrome) were included. Primary outcomes included self-reported hair regrowth, quality of life, and adverse effects.
The Cochrane review included 22 studies that enrolled a total of 2,349 subjects. Ten studies evaluated minoxidil, four evaluated finasteride, two cyproterone acetate, and two flutamide. A variety of other exotic interventions were evaluated, including topical melatonin-alcohol solution, adenosine lotion, and pulsed electrostatic field.
The best data continue to exist for minoxidil. "Pooled data from 4 studies indicated that a greater proportion of participants (121/488) treated with minoxidil reported a moderate increase in their hair regrowth when compared with placebo (64/476) (risk ratio, 1.86; 95% confidence interval [CI], 1.42 to 2.43). In 7 studies, there was an important increase of 13.28 in total hair count per cm² in the minoxidil group compared to the placebo group (95% CI, 10.89 to 15.68). There was no difference in the number of adverse events in the twice daily minoxidil and placebo intervention groups, with the exception of a reported increase of adverse events (additional hair growth on areas other than the scalp) with minoxidil (5%) twice daily," according to the Cochrane report.
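As a sanity check on those numbers, the risk ratio can be recomputed from the raw counts. The sketch below collapses the 4 studies into a single 2×2 table and applies the standard large-sample confidence-interval formula, so it lands slightly below the Cochrane estimate, which pooled the studies individually.

```python
# Back-of-the-envelope check on the pooled risk ratio quoted above,
# using the aggregate counts (121/488 minoxidil vs 64/476 placebo).
from math import exp, log, sqrt

events_tx, n_tx = 121, 488  # minoxidil: moderate regrowth / total
events_pl, n_pl = 64, 476   # placebo

rr = (events_tx / n_tx) / (events_pl / n_pl)

# Approximate 95% CI on the log scale (single-table formula)
se = sqrt(1/events_tx - 1/n_tx + 1/events_pl - 1/n_pl)
lower, upper = (exp(log(rr) + z * se) for z in (-1.96, 1.96))

print(f"RR = {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
# RR = 1.84 (95% CI 1.40 to 2.43) -- close to the pooled 1.86 (1.42 to 2.43)
```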
Other promising agents might be octyl nicotinate (0.5%), myristyl nicotinate (5%), and flutamide. Fulvestrant, adenosine, pulsed electrostatic field, and estradiol valerate appeared ineffective.
Minoxidil it is. It may be important to remind patients that 2% twice daily may be as effective and safe as 5% once a day.
Dr. Ebbert is professor of medicine and a general internist at the Mayo Clinic in Rochester, Minn., and a diplomate of the American Board of Addiction Medicine. The opinions expressed are those of the author. He reports no conflicts of interest.
Digital Dermatology: VisualDx
"You should always consider three differential diagnoses for each patient." This was sound advice from my dermatology residency director, but advice I often neglect to take. What about you?
If you are like most of us, then your brain in clinic is on autopilot. It instantly selects the diagnosis and moves on. But when pushed by a confusing rash or a disease unresponsive to our standard treatment, we quickly encounter the limits of the human brain.
When stumped, we all do the same thing: Recruit more eyeballs (and brains). We find a colleague nearby and pull him into the room. "So, what do you think this is? What would you do?" Although often helpful, this method is inefficient and fails to capitalize on the most important of all medical tools: the computer.
Unlike our brains, computers, in the form of clinical decision support (CDS) tools, are not prone to cognitive errors. Good CDS tools aren’t subject to top-of-mind biases. Their ability to generate differential diagnoses exceeds that of even the masters among us. Fortunately, there is such a CDS tool for skin disease: VisualDx.
VisualDx is a CDS tool focused on dermatologic conditions. It covers common and rare skin conditions and has more than 25,000 professional images.
What’s unique here is that VisualDx is more than a database of images. "It is truly a diagnostic decision support tool that allows you to search by multiple factors at once – symptoms, diagnoses, medications, medical history, travel, skin color, etc.," said Dr. Noah Craft, practicing dermatologist and chief medical officer of VisualDx.
In contrast to textbooks and other medical knowledge databases, this system is designed for easy, point-of-care use – it makes a dermatologist’s or even a primary care physician’s work easier. By quickly reviewing photos and diagnostic pearls, our brains are supercharged with deep differentials and management ideas.
For example, I recently had a patient who presented with papulosquamous eruptions that involved his body and hands. Among other diagnoses was secondary syphilis. Yes, I had thought of that, but a quick scan through VisualDx prompted me to ask about other symptoms, including vision changes (which he had). The patient also had HIV. Quick, which test is the best for me to order? Too slow, it’s already there in front of me on my screen.
In addition to improving quality, tools such as these also can improve access. Studies from the company show that the average user saves between 15 and 26 minutes per day using its product. For the working dermatologist, that means being able to see two additional patients a day.
VisualDx also educates and empowers patients. Don’t believe those bumps you have are molluscum? You can see here that these photos look exactly like the bumps you have. Rather than explain conditions through difficult doctor-speak, physicians can show complex knowledge to patients visually. As Dr. Craft notes and many of us have experienced: "For many patients, seeing is believing."
Whether it’s corroborating a diagnosis or exploring treatment options, having the doctor and patient share the same screen is an effective way to increase comprehension and build trust. No matter how good our drawings on the back of a prescription pad may be, they are not as accurate or helpful as curated digital photos. Our screen-savvy patients will soon expect this type of technology with every visit.
Good digital medicine tools also will help remedy one of medicine’s oldest and most glaring defects: We don’t account for the fact that the vast majority of health care happens in between doctor visits. Now patient education doesn’t stop at the culmination of the visit. Physicians can either print or e-mail images and information to patients so that they can have an accurate record at home to share with family members and caregivers.
VisualDx is a leading technology in what will be the future of medicine: Digital tools that serve doctors with everything they need to diagnose and treat patients with a click or flick of the screen. Having ten thousand treatment options instantly in your pocket – try that with any lab coat reference book.
Oftentimes, technology is more sparkle than substance. Not so with VisualDx. Have you used it in your practice? Let us know what you think about it.
For more information and to learn how to subscribe, visit www.visualdx.com. VisualDx is a paid subscription service.
Dr. Benabio is a practicing dermatologist and physician director of health care transformation at Kaiser Permanente in San Diego. Dr. Benabio said he has no financial interest in VisualDx, but he has had complimentary access. Connect with him on Twitter @Dermdoc or drop him a line at [email protected].
"You should always consider three differential diagnoses for each patient." This was sound advice from my dermatology residency director, but advice I often neglect to take. What about you?
If you are like most of us, then your brain in clinic is on autopilot. It instantly selects the diagnosis and moves on. But when pushed by a confusing rash or a disease unresponsive to our standard treatment, we quickly encounter the limits of the human brain.
When stumped, we all do the same thing: Recruit more eyeballs (and brains). We find a colleague nearby and pull him into the room. "So, what do you think this is? What would you do?" Although often helpful, this method is inefficient and fails to capitalize on the most important of all medical tools: the computer.
Unlike our brains, computers in the form of clinical decision support (CDS) tools, are not prone to cognitive errors. Good CDS tools aren’t subject to top-of-mind biases. Their ability to generate differential diagnoses exceeds even the masters among us. Fortunately, there is such a CDS for skin disease: VisualDx.
VisualDx is a CDS tool focused on dermatologic conditions. It covers common and rare skin conditions and has more than 25,000 professional images.
What’s unique here is that VisualDx is more than a database of images. "It is truly a diagnostic decision support tool that allows you to search by multiple factors at once – symptoms, diagnoses, medications, medical history, travel, skin color, etc.," said Dr. Noah Craft, practicing dermatologist and chief medical officer of VisualDx.
In contrast to textbooks and other medical knowledge databases, this system is designed for easy, point-of-care use – it makes a dermatologist’s or even a primary care physician’s work easier. By quickly reviewing photos and diagnostic pearls, our brains are supercharged with deep differentials and management ideas.
For example, I recently had a patient who presented with papulosquamous eruptions that involved his body and hands. Among other diagnoses was secondary syphilis. Yes, I had thought of that, but a quick scan through VisualDx prompted me to ask about other symptoms, including vision changes (which he had). The patient also had HIV. Quick, which test is the best for me to order? Too slow, it’s already there in front of me on my screen.
In addition to improving quality, tools such as these also can improve access. Studies from the company show that the average user saves between 15 and 26 minutes per day using their product. For the working dermatologist, that means being able to see two additional patients a day.
VisualDx also educates and empowers patients. Don’t believe those bumps you have are molluscum? You can see here that these photos look exactly like the bumps you have. Rather than explain conditions through difficult doctor-speak, physicians can show complex knowledge to patients visually. As Dr. Craft notes and many of us have experienced: "For many patients, seeing is believing."
Whether it’s corroborating a diagnosis or exploring treatment options, having the doctor and patient share the same screen is an effective way to increase comprehension and build trust. No matter how good our drawings on the back of a prescription pad may be, they are not as accurate or helpful as curated digital photos. Our screen-savvy patients will soon expect this type of technology with every visit.
Good digital medicine tools also will help remedy one of medicine’s oldest and most glaring defects: We don’t account for the fact that the vast majority of health care happens in between doctor visits. Now patient education doesn’t stop at the culmination of the visit. Physicians can either print or e-mail images and information to patients so that they can have an accurate record at home to share with family members and caregivers.
VisualDx is a leading technology in what will be the future of medicine: Digital tools that serve doctors with everything they need to diagnose and treat patients with a click or flick of the screen. Having ten thousand treatment options instantly in your pocket – try that with any lab coat reference book.
Oftentimes, technology is more sparkle than substance. Not so with VisualDx. Have you used it in your practice? Let us know what you think about it.
For more information and to learn how to subscribe, visit www.visualdx.com. VisualDx is a paid subscription service.
Dr. Benabio is a practicing dermatologist and physician director of health care transformation at Kaiser Permanente in San Diego. Dr. Benabio said he has no financial interest in VisualDx, but he has had complimentary access. Connect with him on Twitter @Dermdoc or drop him a line at [email protected].
"You should always consider three differential diagnoses for each patient." This was sound advice from my dermatology residency director, but advice I often neglect to take. What about you?
If you are like most of us, then your brain in clinic is on autopilot. It instantly selects the diagnosis and moves on. But when pushed by a confusing rash or a disease unresponsive to our standard treatment, we quickly encounter the limits of the human brain.
When stumped, we all do the same thing: Recruit more eyeballs (and brains). We find a colleague nearby and pull him into the room. "So, what do you think this is? What would you do?" Although often helpful, this method is inefficient and fails to capitalize on the most important of all medical tools: the computer.
Unlike our brains, computers in the form of clinical decision support (CDS) tools, are not prone to cognitive errors. Good CDS tools aren’t subject to top-of-mind biases. Their ability to generate differential diagnoses exceeds even the masters among us. Fortunately, there is such a CDS for skin disease: VisualDx.
VisualDx is a CDS tool focused on dermatologic conditions. It covers common and rare skin conditions and has more than 25,000 professional images.
What’s unique here is that VisualDx is more than a database of images. "It is truly a diagnostic decision support tool that allows you to search by multiple factors at once – symptoms, diagnoses, medications, medical history, travel, skin color, etc.," said Dr. Noah Craft, practicing dermatologist and chief medical officer of VisualDx.
In contrast to textbooks and other medical knowledge databases, this system is designed for easy, point-of-care use – it makes a dermatologist’s or even a primary care physician’s work easier. By quickly reviewing photos and diagnostic pearls, our brains are supercharged with deep differentials and management ideas.
For example, I recently had a patient who presented with papulosquamous eruptions that involved his body and hands. Among other diagnoses was secondary syphilis. Yes, I had thought of that, but a quick scan through VisualDx prompted me to ask about other symptoms, including vision changes (which he had). The patient also had HIV. Quick, which test is the best for me to order? Too slow, it’s already there in front of me on my screen.
In addition to improving quality, tools such as these also can improve access. Studies from the company show that the average user saves between 15 and 26 minutes per day using their product. For the working dermatologist, that means being able to see two additional patients a day.
VisualDx also educates and empowers patients. Don’t believe those bumps you have are molluscum? You can see here that these photos look exactly like the bumps you have. Rather than explain conditions through difficult doctor-speak, physicians can show complex knowledge to patients visually. As Dr. Craft notes and many of us have experienced: "For many patients, seeing is believing."
Whether it’s corroborating a diagnosis or exploring treatment options, having the doctor and patient share the same screen is an effective way to increase comprehension and build trust. No matter how good our drawings on the back of a prescription pad may be, they are not as accurate or helpful as curated digital photos. Our screen-savvy patients will soon expect this type of technology with every visit.
Good digital medicine tools also will help remedy one of medicine’s oldest and most glaring defects: We don’t account for the fact that the vast majority of health care happens in between doctor visits. Now patient education doesn’t stop at the culmination of the visit. Physicians can either print or e-mail images and information to patients so that they can have an accurate record at home to share with family members and caregivers.
VisualDx is a leading technology in what will be the future of medicine: Digital tools that serve doctors with everything they need to diagnose and treat patients with a click or flick of the screen. Having ten thousand treatment options instantly in your pocket – try that with any lab coat reference book.
Oftentimes, technology is more sparkle than substance. Not so with VisualDx. Have you used it in your practice? Let us know what you think about it.
For more information and to learn how to subscribe, visit www.visualdx.com. VisualDx is a paid subscription service.
Dr. Benabio is a practicing dermatologist and physician director of health care transformation at Kaiser Permanente in San Diego. Dr. Benabio said he has no financial interest in VisualDx, but he has had complimentary access. Connect with him on Twitter @Dermdoc or drop him a line at [email protected].
MRI method appears comparable to PET/CT
A modified MRI technique can effectively detect tumors in young cancer patients without exposing them to radiation, according to a small study published in The Lancet Oncology.
The method, called whole-body diffusion-weighted MRI, employs a contrast agent consisting of iron oxide nanoparticles.
This technique proved roughly as effective as 18F-FDG-PET/CT scans for detecting lymphoma and sarcoma in pediatric and young adult patients.
Researchers also noted that, as the MRI technique does not employ ionizing radiation, it might help prevent some of the adverse effects typically observed in patients who have undergone radiographic staging, particularly secondary malignancies.
“I’m excited about having an imaging test for cancer patients that requires zero radiation exposure,” said senior study author Heike Daldrup-Link, MD, of the Stanford University School of Medicine in California.
She and her colleagues pointed out that, in the past, certain obstacles prevented physicians from using whole-body MRIs. For one, the scans take up to 2 hours, whereas a whole-body PET/CT takes only a few minutes.
In addition, in many organs, MRI does not distinguish healthy tissue from cancerous tissue. And existing contrast agents leave the tissues too quickly to be used in a lengthy, whole-body MRI.
In an attempt to overcome these obstacles, Dr Daldrup-Link and her colleagues used a contrast agent consisting of ferumoxytol nanoparticles. Injections of these iron oxide nanoparticles are approved by the US Food and Drug Administration (FDA) to treat anemia, and the researchers obtained FDA permission to use them in this study.
The nanoparticles are retained in the body for days. On MRIs, they cause blood vessels to appear brighter, providing anatomic landmarks. The nanoparticles also cause healthy bone marrow, lymph nodes, livers, and spleens to appear darker, which makes tumors stand out.
The researchers compared the whole-body diffusion-weighted MRI method to PET/CTs in 22 patients, ages 8 to 33, who had lymphoma or sarcoma. Fourteen of the patients had Hodgkin lymphoma, 5 had non-Hodgkin lymphoma, 1 had Burkitt leukemia, 1 had Ewing’s sarcoma, and 1 had osteosarcoma.
The team found the MRI scans and PET/CT scans provided comparable information, although tumor detection was slightly better with PET/CT. The PET/CTs detected 163 of the 174 total tumors, and the MRIs detected 158.
The two methods had similar levels of sensitivity, specificity, and diagnostic accuracy. Sensitivity was 93.7% with PET/CT and 90.8% with MRI. Specificity was 97.7% with PET/CT and 99.5% with MRI. And diagnostic accuracy was 97.2% with PET/CT and 98.3% with MRI.
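The reported sensitivities can be verified directly from the detection counts (a minimal check; specificity and diagnostic accuracy also depend on true-negative counts, which are not given here):

```python
# Sensitivity = detected tumors / total tumors. Specificity and
# accuracy also require true-negative counts, not reported here.
total_tumors = 174
detected = {"PET/CT": 163, "MRI": 158}

for method, count in detected.items():
    print(f"{method}: sensitivity = {count / total_tumors:.1%}")
# -> PET/CT: sensitivity = 93.7%
# -> MRI: sensitivity = 90.8%
```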
The researchers also noted that none of the patients experienced adverse reactions to the ferumoxytol nanoparticles, although the FDA previously observed a small risk of allergic reaction to the nanoparticles’ coating.
Dr Daldrup-Link said future research will aim to validate the MRI method in larger, more diverse groups of cancer patients, as well as examine its possible use for monitoring tumors over the course of cancer treatment. The technique also holds promise for scanning patients after their treatment is complete.
Compression device can prevent VTE after surgery
Credit: Piotr Bodzek
A mobile compression device can prevent venous thromboembolism (VTE) after joint replacement surgery, according to research published in the Journal of Bone and Joint Surgery.
The device, called ActiveCare+S.F.T., delivers compressions to the leg that are synchronized with the patient’s respiration rate, thereby improving blood flow.
Of more than 3000 patients who used the device, with or without aspirin, less than 1% developed VTE.
When the researchers compared this rate to VTE rates observed in previous studies of warfarin, enoxaparin, rivaroxaban, and dabigatran, they found the device to be noninferior to anticoagulant therapy.
“Blood thinners have long been considered the standard of care to prevent blood clots after orthopedic surgery, but they can have side effects that are concerning for many patients,” said study author Clifford Colwell, MD, of the Scripps Clinic in La Jolla, California.
“Through this research, we have found and established an equally effective means of accomplishing the same goal, with an added layer of safety for patients.”
Dr Colwell and his colleagues established a registry of 3060 patients to determine the rate of symptomatic VTE after primary knee arthroplasty (n=1551) or hip arthroplasty (n=1509) performed at 10 different sites.
All of the patients enrolled were 18 years of age or older. They had no known history of VTE, coagulation disorders, or solid tumor malignancies.
Patients wore the ActiveCare+S.F.T. device both during and after surgery, for a minimum of 10 days. The researchers evaluated patients 3 months after surgery to document any evidence of deep vein thrombosis (DVT) or pulmonary embolism (PE).
In all, 28 patients (0.92%) developed VTE. Twenty patients had distal DVT, 3 had proximal DVT, and 5 had PE. One patient died of coronary failure, but there was no autopsy, so it is not clear if the patient developed a PE.
Overall, the rate of VTE with the compression device—0.92%—was considered noninferior to rates previously observed with anticoagulants—2.2% for warfarin, 1.1% for enoxaparin, 0.64% for rivaroxaban, and 1.2% for dabigatran.
However, among patients who underwent knee arthroplasty, the device fell short of the noninferiority margin (1.0%) for rivaroxaban by 0.06%.
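The noninferiority logic can be illustrated as follows (a sketch only, assuming the historical anticoagulant rates are fixed benchmarks and using a simple one-sided 95% Wald upper bound; the study’s actual statistical methods are described in the paper):

```python
from math import sqrt

# Illustrative noninferiority check: is the upper confidence bound on
# the device's VTE rate below each historical rate plus the margin?
# This is a sketch of the logic, not the study's actual procedure.
events, n = 28, 3060
p_hat = events / n                                     # ~0.92%
upper = p_hat + 1.645 * sqrt(p_hat * (1 - p_hat) / n)  # ~1.20%

margin = 0.010  # the 1.0% noninferiority margin
historical_rates = {"warfarin": 0.022, "enoxaparin": 0.011,
                    "rivaroxaban": 0.0064, "dabigatran": 0.012}

for drug, rate in historical_rates.items():
    threshold = rate + margin
    print(f"{drug}: {upper:.2%} < {threshold:.2%} -> {upper < threshold}")
```

On the full cohort, all four comparisons clear their thresholds, consistent with the article; the rivaroxaban comparison, with the lowest threshold, is the tightest, which is why the knee-arthroplasty subgroup, with its higher event rate, narrowly missed it.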
The device’s manufacturer, Medical Compression Systems Inc., funded the registry used in this study but did not have a role in the registry design or protocol. And the researchers did not receive compensation from the manufacturer.
Environment may play role in malaria transmission
The wall of a mosquito midgut
Credit: Krijn Paaijmans
Researchers have found the environment can significantly influence whether or not Wolbachia bacteria will prevent mosquitoes from transmitting malaria.
“Bacteria in the genus Wolbachia represent a promising new tool for controlling malaria due to their demonstrated ability to block the development of the pathogen within Anopheles mosquitoes,” said study investigator Courtney Murdock, PhD, of Pennsylvania State University.
“However, much of the work on the Wolbachia-malaria interaction has been conducted under highly simplified laboratory conditions. In this study, we investigated the ability of Wolbachia to block transmission of malaria—Plasmodium—parasites across variable environmental conditions, which are more reflective of conditions in the field.”
Dr Murdock and her colleagues described this research in Scientific Reports.
The researchers used the malaria parasite Plasmodium yoelii, which affects rodents, and the mosquito Anopheles stephensi as a model system to investigate whether Wolbachia would block the ability of the malaria parasite to infect the mosquitoes.
The team divided the mosquitoes into an uninfected control group and a group infected with Wolbachia. Next, they raised both groups of mosquitoes in incubators set to different experimental temperatures—68, 72, 75, 79, and 82 degrees Fahrenheit.
At 82 degrees, Wolbachia reduced the number of mosquitoes infected by malaria parasites, the number of malaria parasites within each mosquito, and the intensity of oocysts.
At 75 degrees, Wolbachia had no effect on the prevalence of malaria parasites but increased oocyst intensity. At 68 degrees, Wolbachia had no effect on the prevalence of parasites or the intensity of oocysts.
The researchers also identified a previously undiscovered effect of Wolbachia. Infection with the bacterium reduced the development of sporozoites across all temperatures. This suggests that Wolbachia and malaria parasites may compete for the same host resources.
“Typically, the more oocysts a mosquito has on its midgut, the more sporozoites it produces,” Dr Murdock said. “So, depending on the environmental temperature, Wolbachia either reduced, enhanced, or had no effect on the number of oocysts.”
“At 75 degrees Fahrenheit, Wolbachia-infected mosquitoes had 3 times the number of oocysts relative to uninfected mosquitoes. Thus, we would predict these mosquitoes to produce more sporozoites. But we see that this is not the case, and that is because Wolbachia infection significantly reduces the number of sporozoites produced per oocyst, regardless of the environmental temperature.”
“This effect counteracts the enhancement we see at 75 degrees Fahrenheit. How the influence of Wolbachia on parasite establishment and the production of sporozoites under different temperatures plays out to ultimately affect transmission remains to be determined.”
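The offsetting effects are easier to see with concrete numbers (purely hypothetical values for illustration; the article reports neither baseline oocyst counts nor per-oocyst sporozoite yields):

```python
# Hypothetical numbers only, to illustrate how a 3-fold increase in
# oocysts can be cancelled by a 3-fold drop in per-oocyst output.
control_oocysts = 10           # hypothetical baseline
sporozoites_per_oocyst = 300   # hypothetical yield

wolbachia_oocysts = 3 * control_oocysts              # 3x more oocysts
wolbachia_per_oocyst = sporozoites_per_oocyst / 3    # 3x fewer per oocyst

print(control_oocysts * sporozoites_per_oocyst)       # 3000
print(int(wolbachia_oocysts * wolbachia_per_oocyst))  # 3000 -> net wash
```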
Dr Murdock and her colleagues plan to repeat their experiment using a species of malaria parasite that infects humans to determine whether the temperature effects they observed apply to human malaria parasites as well.
The team also intends to explore the effects of additional environmental variation—such as daily temperature fluctuation and differential access to food resources in the mosquito larval and adult environments—on the transmission-blocking ability of Wolbachia.
Group finds progenitors of ILCs
Image with ILCs (green), epithelial cells (red), and nuclei (blue)
Credit: University of Pennsylvania
Scientists say they’ve discovered the progenitors of innate lymphoid cells (ILCs) in the liver of fetal mice and the bone marrow of adult mice.
ILCs are among the first components of the immune system to confront certain pathogens, yet the cells went undetected by researchers for a century.
“Scientists tend to look for immune cells in the blood, lymph nodes, or spleen,” said Albert Bendelac, PhD, of the University of Chicago in Illinois.
“That is precisely where you would not find these cells. Once they mature, they directly go to tissues, such as the gut or the skin. You seldom see them in blood.”
To understand how ILCs fit into the ecosystem of cells that fight off infections and cancers, Dr Bendelac’s team focused on finding ILCs’ source.
And they reported their findings in a letter to Nature.
The team noted that ILCs, which were first recognized 5 years ago, are rare. A mouse might have 200 million lymphocytes and only a few thousand ILCs.
But previous work on natural killer (NK) cells showed that ILCs express the transcription factor PLZF during their development.
So Dr Bendelac and his colleagues created mice with the gene for green fluorescent protein (GFP) inserted just downstream of the PLZF gene. As a result, cells that expressed PLZF appeared bright green under the microscope.
Nevertheless, finding the precursors to ILCs was not easy. The precursors are not in the blood, and, by the time they migrate to the lungs or gut, they have already matured into ILCs.
The researchers eventually found the precursor cells—known as ILCPs—in the liver of fetal mice and in the bone marrow of adult mice.
When the team purified the ILCPs, which still contained the GFP gene, and transferred them into mice that lacked ILCs, the precursors were able to reconstitute the 3 known types of ILCs—ILCs 1, 2, and 3.
“There were no B cells or T cells or myeloid cells—no other immune cells, just these,” Dr Bendelac said. “So we think the ILCP really is a committed precursor to innate lymphoid cells.”
To confirm their finding, the researchers engineered mice in which PLZF gene expression was tied to the gene for diphtheria toxin. When cells expressed PLZF, they also produced the toxin, which killed those cells. The result was a mouse that had a normal immune system except that it completely lacked ILCs.
“ILCs are found in the most exposed tissues,” Dr Bendelac noted. “They are one of your first lines of defense. We now suspect they may also influence the ensuing adaptive immune response, priming the pump, influencing how T-helper cells respond.”
Each of the 3 types of ILCs has different properties and serves different functions. ILC1 cells help prevent viral infections and can detect and remove some cancerous cells. They are similar to NK cells, except that NK cells circulate in the blood, and ILC1s live in the gut and the liver.
ILC2s are found in the lungs, where they can detect and respond to parasites. But they can also initiate an allergic reaction and mucus hyper-secretion.
ILC3 cells cluster in the gut, where they help mediate interactions between the bowel and bacteria. When that balance is disturbed, they can accelerate inflammation and may play a role in inflammatory bowel disease.
Dr Bendelac said his group’s research provides “one more tool for understanding this complex system,” and it could help generate a “powerful new way to assess the function of innate lymphocytes.”