Reassurance for women taking certolizumab during pregnancy

The use of certolizumab pegol during pregnancy does not appear to be associated with an increased risk of fetal death or congenital malformations, according to results of a new study.

Megan E.B. Clowse, MD, of Duke University Medical Center, Durham, N.C., and her coauthors reported on a prospective and retrospective analysis of data from 528 pregnancies – including 10 twin pregnancies – in which the mother was exposed to the anti–tumor necrosis factor (anti-TNF) drug certolizumab during pregnancy.

There were 459 (85.3%) live births, 47 (8.7%) miscarriages, 27 (5%) elective abortions, and five (0.9%) stillbirths, figures that are similar to those seen in the general population. Among the singleton births, 11.7% were low birth weight, which the authors noted was slightly higher than the frequency observed in the general population but was in line with previous reports with TNF inhibitors.
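
These percentages appear to be calculated against the total number of pregnancy outcomes rather than pregnancies: the 528 pregnancies included 10 twin pregnancies, giving 538 reported outcomes. The short sketch below simply reproduces that arithmetic from the figures quoted above; it is an illustrative check, not part of the study's analysis.

    # Illustrative check of the outcome percentages quoted above.
    # Assumption: the denominator is pregnancy outcomes (528 pregnancies,
    # 10 of them twin pregnancies, giving 538 outcomes in total).
    outcomes = {
        "live births": 459,
        "miscarriages": 47,
        "elective abortions": 27,
        "stillbirths": 5,
    }
    total = sum(outcomes.values())  # 538
    for name, n in outcomes.items():
        print(f"{name}: {n} ({100 * n / total:.1f}%)")
    # Prints 85.3%, 8.7%, 5.0%, and 0.9%, matching the reported figures.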

Two neonatal deaths were reported. One occurred in one member of a female twin pair born at 25 weeks, who died of brain damage and pneumoperitoneum. The other occurred in a female twin pair born at 27 weeks; one of the pair was born with a heart defect and died during surgery for an unspecified intestinal infection, the authors reported in Arthritis & Rheumatology.

The study noted eight (1.7%) reports of congenital malformations in the infants born alive, including accessory auricle, polydactyly, hydronephrosis, cerebral ventricle dilatation, and congenital heart disease.

Four cases were in infants whose mothers had rheumatoid arthritis, one in an infant whose mother had ankylosing spondylitis, and three in infants whose mothers had Crohn’s disease (CD).

In five of these cases the women were exposed to certolizumab at least during the first trimester, in four at least during the second trimester, and in five at least during the third trimester of pregnancy.

“The teratologist review concluded no temporal association with CZP [certolizumab] exposure for the case of hydronephrosis, owing to evidence of anomalies on ultrasound prior to initiation of medication (during the third trimester); the possibility of an association could not be ruled out for the cases of anal fistula, accessory auricle, vesicoureteric reflux, talipes, and congenital heart disease,” the researchers wrote.

Twenty-two women (4.2%) had serious infections during pregnancy, which was in line with the frequency of serious infections reported in patients taking certolizumab.

Among the live births, certolizumab exposure occurred during all three trimesters in nearly half (44.5%), at least during the first trimester in 81.2%, and during the first trimester only in 29.2%.

“More women with rheumatic diseases than CD had first-trimester–only or first- and second-trimester exposure to CZP, reflecting the differing clinical practice between CD and rheumatic diseases,” the authors wrote. “Spontaneous disease improvement in women with IBD [inflammatory bowel disease] during pregnancy is less likely, compared to RA and other rheumatic diseases, and many patients may require treatment throughout pregnancy; thus, potentially reducing the risk of adverse outcomes, such as spontaneous miscarriage, premature birth, low birth weight, small gestational age.”

The authors commented that, despite increasing evidence of the safety of TNF inhibitors during pregnancy – recent systematic reviews and meta-analyses have found no link to adverse outcomes with anti-TNF exposure during the first trimester – patients and physicians have continued to have concerns about their use.

“This analysis of prospective pregnancy reports from the UCB Pharma safety database represents the largest published cohort of pregnant women exposed to an anti-TNF for the management of chronic inflammatory diseases,” the authors wrote. “It addresses the need for new and detailed studies on the impact of drug exposure during pregnancy and supports the conclusion that CZP exposure in utero does not appear to increase the risk of major congenital malformations or fetal loss.”

However, they acknowledged the lack of an untreated control group and said this meant a formal statistical analysis was not possible. They also noted that outcomes data were not available for about one-third of the entire cohort of 1,137 women whose pregnancies were exposed to certolizumab.

“However, these data are reassuring for women of childbearing age affected by chronic inflammatory diseases who need an anti-TNF to control their condition and wish to become or are pregnant during CZP treatment,” they wrote.

The study was funded by UCB Pharma. One author was a contractor for, and three were employees of, UCB Pharma. Six authors declared grants, consulting fees, and other remuneration from pharmaceutical companies, including UCB Pharma.

SOURCE: Clowse MEB et al. Arthritis & Rheumatology. 2018 Apr 5. doi: 10.1002/art.40508.

Vitals


Key clinical point: Certolizumab use during pregnancy does not appear to increase the risk of congenital malformations.

Major finding: The incidence of fetal death and congenital malformations in certolizumab-exposed pregnancies is similar to that in the general population.

Study details: A prospective and retrospective analysis of data from 528 certolizumab-exposed pregnancies.

Disclosures: The study was funded by UCB Pharma. One author was a contractor for, and three were employees of, UCB Pharma. Six authors declared grants, consulting fees, and other remuneration from pharmaceutical companies, including UCB Pharma.

Source: Clowse MEB et al. Arthritis & Rheumatology. 2018 Apr 5. doi: 10.1002/art.40508.


Bisphosphonate use linked with lower risk of lung cancer for never-smokers


Oral bisphosphonates may confer a protective effect against lung cancer, according to an observational study.

However, this effect was only observed among never-smokers, as there was no significant difference in lung cancer incidence between oral bisphosphonate use and nonuse in the overall study cohort, Meng-Hua Tao, PhD, of the University of North Texas Health Science Center, Fort Worth, and associates reported in Annals of Oncology.

While bisphosphonates are commonly prescribed to prevent and treat osteoporosis, research suggests that they may have a range of other benefits, such as inhibition of tumor angiogenesis and cellular proliferation, prevention of tumor-cell adhesion and invasion, and induction of tumor-cell apoptosis. Recent studies have also investigated the association between bisphosphonate use and the risk of several cancers including breast, gastrointestinal tract, endometrial, and ovarian. Additionally, a few studies have looked at the relationship between oral bisphosphonates and lung cancer risk, but results have been inconsistent.

However, previous studies had very limited information on lung cancer risk factors, such as tobacco use, and none had analyzed the potential associations between bisphosphonate use and lung cancer and how it might vary by smoking status, the authors noted.

In the current study, the relationship between oral bisphosphonate use and lung cancer risk, and its potential interaction with smoking status, was examined in a prospective cohort of 151,432 postmenopausal women enrolled in the Women’s Health Initiative during 1993-1998. At baseline and at follow-up, an inventory of regularly used medications, including bisphosphonates, was completed.

After a mean cumulative follow-up of 13.3 years, 2,511 women were diagnosed with incident lung cancer: 2,257 nonusers and 254 ever-users of oral bisphosphonates. On multivariable-adjusted analysis, use of any type of oral bisphosphonate, compared with never-use, was not significantly associated with lung cancer incidence (HR = 0.91; 95% confidence interval, 0.80-1.04; P = .16).

In a subgroup analysis, the authors found that the association patterns differed between never-smokers and ever-smokers (P = .02). Among women who never smoked, oral bisphosphonate use was inversely associated with lung cancer risk in the multivariate adjusted models (HR = 0.57; 95% CI, 0.39-0.84; P less than .01), and this association was stronger when the duration of bisphosphonate usage was 1.5 years or longer (HR, 0.36; 95% CI, 0.18-0.73).
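
As a rough guide to how the reported confidence intervals relate to the P values, the sketch below back-calculates approximate two-sided P values from the hazard ratios and 95% CIs quoted above, assuming normality on the log scale; this is a standard shortcut used here for illustration only and is not part of the authors' analysis. A 95% CI that includes 1.0, as in the overall cohort, corresponds to P greater than .05, while one that excludes 1.0, as in the never-smoker subgroup, corresponds to P less than .05.

    import math

    def approx_p_from_hr_ci(hr, lower, upper):
        """Approximate two-sided P value from a hazard ratio and its 95% CI,
        assuming the log hazard ratio is normally distributed."""
        se = (math.log(upper) - math.log(lower)) / (2 * 1.96)  # implied standard error
        z = math.log(hr) / se
        return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    # Overall cohort: the CI includes 1.0, so the association is not significant.
    print(approx_p_from_hr_ci(0.91, 0.80, 1.04))  # ~0.16, as reported
    # Never-smokers: the CI excludes 1.0, consistent with P less than .01.
    print(approx_p_from_hr_ci(0.57, 0.39, 0.84))  # ~0.004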

For ever-smokers, there was no association between bisphosphonate use and lung cancer risk.

When lung cancer subtypes were examined, the incidences of small cell and non–small cell lung cancer were similar between bisphosphonate users and nonusers overall, but among never-smokers, bisphosphonate use was associated with a lower risk of non–small cell lung cancer (HR = 0.62; 95% CI, 0.41-0.92; P = .02).

“These observational study findings need to be confirmed. As the U.S. Food and Drug Administration issued safety announcements related to potential risk of long-term bisphosphonate use further studies are warranted to investigate how duration of bisphosphonate use may influence risk of lung cancer and evaluate optimal dose of oral bisphosphonates for lung cancer prevention in older women,” wrote Dr. Tao and her associates.

The study was primarily supported by a grant from the National Cancer Institute at the National Institutes of Health. Dr. Tao has no disclosures.

SOURCE: Tao MH et al. Ann Oncol. 2018 Mar 29. doi:10.1093/annonc/mdy097.

Vitals


Key clinical point: Oral bisphosphonates appeared to confer a protective effect against lung cancer in women who have never smoked.

Major finding: Oral bisphosphonate use was inversely associated with lung cancer risk in never-smokers (hazard ratio = 0.57; 95% confidence interval, 0.39-0.84; P less than .01).

Study details: Data were drawn from the observational Women’s Health Initiative and included 151,432 participants.

Disclosures: The study was primarily supported by a grant from the National Cancer Institute at the National Institutes of Health. Dr. Tao has no disclosures.

Source: Tao MH et al. Ann Oncol. 2018 Mar 29. doi:10.1093/annonc/mdy097.


Researchers seek better understanding of von Willebrand disease


Several groups of researchers are examining cohorts of patients with von Willebrand disease (VWD), looking at its pathogenesis and molecular causes, as well as ways to improve treatment strategies.

At the biennial summit of the Thrombosis & Hemostasis Societies of North America, Robert F. Sidonio Jr., MD, MSc, highlighted the efforts underway, including the Zimmerman Program for the Molecular and Clinical Biology of VWD study (ZPMCB-VWD), a large grant project funded by the National Institutes of Health.

At the time of publication, there were more than 700 index cases and more than 2,200 families in the study. It includes data from 8 primary centers and 23 secondary centers. The goals are to characterize the molecular causes of VWD and to examine the fidelity of diagnosis, “which is a critical component of the study,” said Dr. Sidonio, clinical director of the hemostasis/thrombosis program at Children’s Healthcare of Atlanta.

In the ZPMCB-VWD cohort, 74% of subjects had VWF sequence variations when von Willebrand factor antigen (VWF:Ag) levels, measured by immunologic assay, were less than 40 IU/dL (Hematology Am Soc Hematol Educ Program. 2014;2014[1]:531-5). The precise cutoff varies by study, with VWF:Ag levels below 20-40 IU/dL most strongly correlated with the presence of a VWF sequence variation.

When the ZPMCB-VWD investigators examined the correlation between bleeding phenotype and genotype, they found that as the level of von Willebrand factor goes down, the bleeding score generally goes up, although the relationship flattens in the 20-30 IU/dL range.

“Where we spend a lot of our time is with patients who have levels of 30%-50%, which can be quite heterogeneous,” Dr. Sidonio said.

Another finding from the ZPMCB-VWD project was the discovery of the single nucleotide polymorphism p.D1472H, which is more common in African American patients and leads to low von Willebrand factor ristocetin cofactor activity (VWF:RCo) and a low VWF:RCo/VWF:Ag ratio, but does not increase the bleeding score.

The fidelity of diagnosis was another key issue examined in ZPMCB-VWD. Most type 1 VWD cases were identified by low VWF:RCo. There was poor correlation between historical and current assays (r² = 0.22), and diagnostic accuracy improved after central laboratory testing.

Next, Dr. Sidonio discussed findings from RENAWI 1 and 2, which are Italian registries of about 1,000 VWD patients that were organized by 12 centers in 2002. The goals are to evaluate the natural history of VWD in Italy and to characterize treatment strategies. According to preliminary findings from the researchers, the biological response to desmopressin (DDAVP) was 69% in those with VWD1, 26% in those with VWD2A, 20% in those with VWD2B, 33% in those with VWD2M, 71% in those with VWD2N, and 0% in those with VWD3 (Blood 2014;123:4037-44).

These researchers also found that a mean bleeding score of 3.5 corresponds to a VWF:RCo level of 30 U/dL or greater. “This indicates that there is something slightly different about patients that are above and below that threshold,” Dr. Sidonio said. “I think that’s something we’ve all been struggling with: trying to understand where the differences are and how aggressively we should be treating our patients with mild VWD.”

Another effort, the Willebrand in the Netherlands (WiN) study, is a prospective cohort study of about 700 patients with types 1, 2, and 3 VWD from 12 centers in that country (Blood 2008;112:4510). It was the first large study to use VWF propeptide (VWFpp) to discriminate between severe type 1 and type 3 VWD. It also found that, in contrast to type 1 VWD, type 2 VWD is characterized more by increased VWF clearance, leading to a higher VWFpp/VWF:Ag ratio. In addition, in type 1 VWD, VWF:Ag levels increased by about 3.5 U/dL per decade, VWF:RCo by about 9.5 U/dL per decade, and factor VIII:C by about 7.1 U/dL per decade (J Thromb Haemost. 2014;12[7]:1066-75).
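
To illustrate what those per-decade increases imply for an individual patient, the sketch below projects levels forward using the WiN slopes quoted above; the baseline levels and ages are hypothetical values chosen for illustration, not data from the study.

    # Illustrative projection only: the per-decade slopes are the WiN figures
    # quoted above; the baseline levels and ages are hypothetical.
    SLOPE_U_DL_PER_DECADE = {"VWF:Ag": 3.5, "VWF:RCo": 9.5, "FVIII:C": 7.1}

    def project(baseline, slope_per_decade, age_start, age_end):
        """Linear projection of a factor level between two ages."""
        return baseline + slope_per_decade * (age_end - age_start) / 10

    # Hypothetical patient with mild type 1 VWD, levels measured at age 30.
    baseline_levels = {"VWF:Ag": 38, "VWF:RCo": 32, "FVIII:C": 45}  # U/dL
    for assay, level in baseline_levels.items():
        at_70 = project(level, SLOPE_U_DL_PER_DECADE[assay], 30, 70)
        print(f"{assay}: {level} U/dL at age 30 -> about {at_70:.0f} U/dL at age 70")
    # The upward drift with age is one reason mildly low levels measured
    # decades ago may no longer look abnormal on current testing.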

“I don’t think we have a study to be able to follow patients for 20 years or so, knowing that we rely on the assays that we were using 20 years ago,” Dr. Sidonio said. “That’s a challenge at a lot of our centers, but we know that VWF generally rises with age in mild VWD patients.”

In patients with definitively diagnosed type 2 VWD, no age-related VWF or FVIII changes were observed. The researchers also observed an increase in surgical bleeding and GI bleeding in elderly VWD patients.

In the meantime, the Canadian Type 1 VWD study was one of the first to elucidate the complexity of the pathogenesis of type 1 VWD. It identified CLEC4M as playing a role in VWF clearance, with polymorphisms contributing to the variability of VWF.

More data collection is underway through a partnership between the Centers for Disease Control and Prevention and the American Thrombosis and Hemostasis Network (ATHN).

The CDC Universal Data Collection Project gathered surveillance data on patients with bleeding disorders from 1998 to 2011. The goals were to characterize bleeding complications, monitor the safety of blood-based products used to manage bleeding, identify health issues in need of additional research, evaluate bleeding disorders over the lifespan, and evaluate quality of life. In 2006, the ATHN was formed to provide stewardship of the secured national database housed at the CDC. To date, at least 34,000 patients have opted in to the data set, which includes demographic and clinical data used for research.

In a separate phase 4 study of about 130 patients, funded by Shire and led by Dr. Sidonio and Angela C. Weyand, MD, researchers will conduct a “real-world” evaluation of the safety and efficacy of prophylaxis for severe VWD. Known as ATHN 9, the study includes patients currently enrolled in the ATHN data set. The treatment regimen is at the discretion of each patient’s provider, and patients will be followed for up to 2 years from enrollment. The study’s primary aim is to collect data on effectiveness and safety, including adverse events, of various VWF regimens in adult and pediatric patients with severe congenital VWD.

Another effort is the Medical and Scientific Advisory Council (MASAC) Working Group, of which Dr. Sidonio is a member. The first meeting took place in July of 2016. The goals include making improvements to diagnostic testing and laboratory standards, assessing existing standards of care and clinical practice guidelines, developing educational programming, conducting research to better understand and develop effective treatments for VWD, and collaborating with partner organizations.

Dr. Sidonio reported that he has participated in advisory boards for Shire, CSL Behring, Biogen/Bioverativ, Pfizer, Emergent Solutions, Roche/Genentech, Aptevo, Novo Nordisk, Hema Biologics, and Octapharma. He also has received investigator-initiated grant funding from Bioverativ, Grifols, Kedrion, and Shire.


Why is gene therapy for hemophilia taking so long?


The goal of gene therapy for hemophilia and other genetic diseases is to achieve long-term expression and levels adequate to improve the phenotype of disease, according to Katherine A. High, MD.

“Sometimes people ask me, ‘Why is it taking so long to develop these therapeutics?’ ” Dr. High said at the biennial summit of the Thrombosis & Hemostasis Societies of North America. “The answer is that gene therapy vectors are arguably one of the most complex therapeutics yet developed.”

Currently, gene therapy vectors consist of both a protein and a DNA/RNA component that must be precisely assembled. “Most vectors are engineered from viruses and it has taken time to understand and manage the human immune response, which was poorly predicted by animal models,” said Dr. High, a hematologist who is cofounder, president, and head of research and development at Philadelphia-based Spark Therapeutics. “It took 22 years from the first clinical trial of gene therapy vectors to the first licensed product.”

Spark Therapeutics is currently developing gene therapies for hemophilia A (SPK-8011) and hemophilia B (SPK-9001).



Hemostasis and thrombosis targets in gene therapy trials include hemophilia, as well as peripheral artery disease/claudication and congestive heart failure. In the latter, a prior phase 2b trial of an adeno-associated virus (AAV) vector expressing SERCA2a did not support efficacy (Lancet 2016;387:1178-86), while a trial of an adenovirus 5 vector expressing adenylyl cyclase type 6 is entering phase 3 (NCT03360448).

To get a sense of how long it may take for a new class of therapeutics to become established, Dr. High noted that the first monoclonal antibody to be licensed was OKT3 (muromonab-CD3) in 1986, followed by abciximab in 1994, rituximab and daclizumab in 1997, and four additional products in 1998. By 2007, 8 of the top 20 biotech drugs were monoclonal antibodies.

Hemophilia has long been a favored gene therapy target because biology is in its favor. “It has a wide therapeutic window, it does not require tissue-specific expression of transgene, small and large animal models exist, and endpoints are well validated and easy to measure,” she said. “Thus, early gene-therapy clinical investigation since 1998 explored many strategies.”

There are several current investigational efforts in AAV-mediated gene transfer in hemophilia, including:

  • A single-arm study to evaluate the efficacy and safety of valoctocogene roxaparvovec in hemophilia A patients at a dose of 4×10¹³ vector genomes per kilogram (NCT03392974).
  • A dose-ranging study of recombinant AAV2/6 human factor 8 gene therapy SB-525 in subjects with severe hemophilia A (NCT03061201).
  • A safety and dose-escalation study of an adeno-associated viral vector for gene transfer in hemophilia A subjects (NCT03370172).

Other approaches in preclinical investigation include lentiviral transduction of hematopoietic stem cells with megakaryocyte-restricted expression, lentiviral transduction of liver cells and endothelial cells, and genome editing using zinc finger nucleases.

“AAV vectors are one of the smallest of all naturally occurring viruses,” said Dr. High, who is also emeritus professor of pediatrics at the University of Pennsylvania, Philadelphia. “The recombinant AAV consists of a highly ordered set of proteins [vector capsid] containing DNA [the active agent].”

Overall goals for a hemophilia gene therapy include long-term expression and levels adequate to prevent bleeds in someone with a normal active lifestyle. “We’d like to see consistency of results from one person to the next, and we’d like to use the lowest possible dose,” she said. “In the setting of gene transfer, the lower the dose, the lower the likelihood of immune responses that need to be managed. Theoretically, the lower the dose, the lower the risk of insertional mutagenesis, and the shorter-term duration of vector shedding in body fluids, including in semen.”

Going forward, a key question for researchers relates to the long-term effect of gene therapy. “How long is long enough?” Dr. High asked. “The longest reported durability is 8 years, with observation ongoing, from studies initially reported in men with severe hemophilia B. The durability in large animal models exceeds 10 years.”

Another unanswered question is what level of factor VIII to aim for in treatment. “Some data suggest that FVIII levels greater than 100 IU/dL are associated with a greater level of thrombosis,” Dr. High said. “So I think somewhere between 12% and 100% is probably the ideal level.”


EXPERT ANALYSIS FROM THSNA 2018


Age and race affect access to myeloma treatment

Older patients, African Americans, and individuals of low socioeconomic status may be less likely to receive systemic treatment for newly diagnosed multiple myeloma, results of a recent study suggest.

Comorbidities and poor performance indicators also reduced the likelihood of receiving first-line treatment, according to results of the retrospective cohort study published in Clinical Lymphoma, Myeloma & Leukemia.

The findings highlight the need for a “multifaceted approach” to address outcome disparities in multiple myeloma, according to researcher Bita Fakhri, MD, MPH, of the division of oncology at Washington University, St. Louis, and her coinvestigators.

“Particular attention to aging-related issues is essential to ensure older patients will benefit from the advances achieved in the field, similar to young patients,” the investigators wrote.

Racial and socioeconomic barriers should also be addressed, they added.

The retrospective cohort analysis included data on 3,814 patients with active multiple myeloma in the Surveillance, Epidemiology, and End Results–Medicare database from 2007 to 2011. Investigators found that overall, 1,445 patients (38%) had no insurance claims confirming that they had received systemic treatment.

Older age increased the odds of not receiving treatment, with the likelihood increasing by 7% for each year of advancing age (adjusted odds ratio, 1.07; 95% confidence interval, 1.06-1.08). Likewise, African American patients were 26% more likely to have had no treatment (aOR, 1.26; 95% CI, 1.03-1.54), and patients who were enrolled in both Medicaid and Medicare – a proxy for lower income – had 21% greater odds of no treatment (aOR, 1.21; 95% CI, 1.02-1.42).
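To put the per-year estimate in more concrete terms, a rough compounding illustration (not a figure reported by the investigators, and it ignores the model’s other covariates):

\[ \text{aOR over a 10-year age difference} \approx 1.07^{10} \approx 1.97, \]

that is, under the fitted model a patient 10 years older had roughly twice the adjusted odds of receiving no systemic treatment.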

 

 


Similarly increased odds of no treatment were reported for patients with comorbidities and poor performance status indicators.

“In a subset of older and frail patients, the risks of treatments approved for [multiple myeloma] might outweigh the benefits or might not be in line with the individual’s goals of care,” the investigators wrote.

The study did not track supportive-care treatments that patients may have received instead of active disease treatment, such as bisphosphonates for skeletal lesions or plasmapheresis for hyperviscosity syndrome.

Lack of treatment was associated with poorer survival in the study. Median overall survival was just 9.6 months for individuals with no record of treatment, compared with 32.3 months for patients who had received treatment.

Dr. Fakhri and coauthors reported having no financial disclosures related to the study, which was supported by the National Cancer Institute.

SOURCE: Fakhri B et al. Clin Lymphoma Myeloma Leuk. 2018 Mar;18(3):219-24.



FROM CLINICAL LYMPHOMA, MYELOMA & LEUKEMIA

Vitals

 

Key clinical point: Older patients, African Americans, and lower-income patients may be less likely to receive systemic treatment for newly diagnosed multiple myeloma.

Major finding: Factors significantly associated with no systemic treatment included older age (adjusted odds ratio, 1.07 per year), African American descent (aOR, 1.26), and dual Medicare-Medicaid enrollment (aOR, 1.21).

Study details: A retrospective cohort analysis including data on 3,814 patients with active multiple myeloma in the Surveillance, Epidemiology, and End Results–Medicare database from 2007 to 2011.

Disclosures: The research was supported by the National Cancer Institute. The investigators reported having no financial disclosures.

Source: Fakhri B et al. Clin Lymphoma Myeloma Leuk. 2018 Mar;18(3):219-24.


AAD guidelines’ conflict-of-interest policies discussed in pro-con debate

– The American Academy of Dermatology’s policies that regulate conflicts of interest among members of its guidelines panels are “pretty good, but could be improved,” Lionel G. Bercovitch, MD, said at the annual meeting of the American Academy of Dermatology.

One positive step might be to tighten the current American Academy of Dermatology requirement that more than half of the members of clinical guideline work groups be free of any financial conflicts, raising that minimum to a higher percentage, such as more than 70%, suggested Dr. Bercovitch, a professor of dermatology at Brown University, Providence, R.I.

“No matter how expert you are, no matter how objective you think you are, if you have financial conflicts, they will influence you,” declared Dr. Bercovitch, who is also director of pediatric dermatology at Hasbro Children’s Hospital in Providence.

But his concern over the adequacy of existing conflict barriers during the writing of clinical guidelines wasn’t shared by Clifford Perlis, MD, who countered that “there are reasons not to waste too much time wringing our hands over conflicts of interest.”

He offered four reasons to support his statement:
  • Conflicts of interest are ubiquitous and thus impossible to eliminate.
  • Excluding working group members with conflicts can deprive the guidelines of valuable expertise.
  • Checks and balances that are already in place in guideline development prevent inappropriate influence from conflicts of interest.
  • No evidence has shown that conflicts of interest have inappropriately influenced development of treatment guidelines.

“Allowing conflicts of interest adds to the expertise of guideline development and probably does not adversely affect the guidelines,” said Dr. Perlis, a dermatologist and Mohs surgeon who practices in King of Prussia, Pa.
 

 


Conflicts of interest may not be as well managed as AAD policies suggest, Dr. Bercovitch noted. He cited a report published in late 2017 that tallied the actual conflicts of 49 people who served as authors of three AAD guidelines published during 2013-2016. To double-check each author’s conflicts objectively, the researchers used the Open Payments database run by the Centers for Medicare & Medicaid Services (JAMA Dermatol. 2017 Dec;153[12]:1229-35).

The analysis showed that 40 of the 49 authors (82%) had received some amount of industry payment, 63% had received more than $1,000, and 51% had received more than $10,000. The median amount received from industry was just over $33,000. The analysis also showed that 22 of the 40 authors who received an industry payment had disclosure statements for the guideline they participated in that did not agree with the information in the Open Payments database.

A rebuttal to these findings appeared a few weeks later, written by three people with AAD positions, including first author Henry W. Lim, MD, the immediate past president of the AAD and chair emeritus of dermatology at the Henry Ford Health System in Detroit (JAMA Dermatol. 2018 Feb 7. doi: 10.1001/jamadermatol.2017.6207).

“The AAD relies on information obtained through its self-reported online member disclosure system. This internal system collects updates to disclosed relationships on a real-time, ongoing basis, allowing the AAD to regularly assess any changes,” wrote Dr. Lim and his coauthors. “This provides information in a more meaningful and time-sensitive way” than the Open Payments database, which they noted “is known to be inaccurate.” The AAD’s process also includes an assessment of the relevancy of each conflict to the guideline involved. “This critical evaluation of relevancy was not addressed in the authors’ analysis,” they added.
 

 


In an adjusted analysis of the three guidelines examined in the initial study, they reported, the percentages of authors with relevant conflicts shrank to 0%, 40%, and 43% – figures that fell within the AAD’s ceiling for the acceptable proportion of work group members with conflicts.

The discussion on this topic was presented during a forum on dermatoethics at the meeting, structured as a debate in which presenters are assigned an ethical argument or point of view to discuss and defend. The position taken by the speaker need not (and often does not) correspond to the speaker’s personal views.



EXPERT ANALYSIS FROM AAD 18


Updated CLL guidelines incorporate a decade of advances

Updated clinical guidelines for diagnosis and treatment of chronic lymphocytic leukemia (CLL) include new and revised recommendations based on major advances in genomics, targeted therapies, and biomarkers that have occurred since the last iteration in 2008.

The guidelines are an update of a consensus document issued a decade ago by the International Workshop on CLL, focusing on the conduct of clinical trials in patients with CLL. The new guidelines are published in Blood.

Major changes or additions include:

Molecular genetics: The updated guidelines recognize the clinical importance of specific genomic alterations/mutations on response to standard chemotherapy or chemoimmunotherapy, including the 17p deletion and mutations in TP53.

“Therefore, the assessment of both del(17p) and TP53 mutation has prognostic and predictive value and should guide therapeutic decisions in routine practice. For clinical trials, it is recommended that molecular genetics be performed prior to treating a patient on protocol,” the guidelines state.

IGHV mutational status: The mutational status of the immunoglobulin heavy chain variable region (IGHV) genes has been demonstrated to offer important prognostic information, according to the guideline authors, led by Michael Hallek, MD, of the University of Cologne, Germany.

Specifically, CLL with unmutated IGHV genes is associated with worse clinical outcomes than CLL with mutated IGHV. Patients with mutated IGHV and other favorable prognostic factors, such as favorable cytogenetics or minimal residual disease (MRD) negativity, generally have excellent outcomes with a chemoimmunotherapy regimen consisting of fludarabine, cyclophosphamide, and rituximab, the authors noted.

 

 


Biomarkers: The guidelines call for standardization and use in prospective clinical trials of assays for serum markers such as soluble CD23, thymidine kinase, and beta-2-microglobulin. These markers have been shown in several studies to be associated with overall survival or progression-free survival, and of these markers, beta-2-microglobulin “has retained independent prognostic value in several multiparameter scores,” the guidelines state.

The authors also tip their hats to recently developed or improved prognostic scores, especially the CLL International Prognostic Index (CLL-IPI), which incorporates clinical stage, age, IGHV mutational status, beta-2-microglobulin, and del(17p) and/or TP53 mutations.

Organ function assessment: Not new, but improved in the current version of the guidelines, are recommendations for evaluation of splenomegaly, hepatomegaly, and lymphadenopathy in response assessment. These recommendations were harmonized with the relevant sections of the updated lymphoma response guidelines.
 

 


Continuous therapy: The guidelines panel recommends assessment of response duration during continuous therapy with oral agents and after the end of therapy, especially after chemotherapy or chemoimmunotherapy.

“Study protocols should provide detailed specifications of the planned time points for the assessment of the treatment response under continuous therapy. Response durations of less than six months are not considered clinically relevant,” the panel cautioned.

Response assessments for treatments with a maintenance phase should be performed at least 2 months after patients achieve their best responses.

MRD: The guidelines call for minimal residual disease (MRD) assessment in clinical trials aimed at maximizing remission depth, with emphasis on reporting the sensitivity of the MRD evaluation method used, and the type of tissue assessed.
 

 


Antiviral prophylaxis: The guidelines caution that because patients treated with anti-CD20 antibodies, such as rituximab or obinutuzumab, could have reactivation of hepatitis B virus (HBV) infections, patients should be tested for HBV serological status before starting on an anti-CD20 agent.

“Progressive multifocal leukoencephalopathy has been reported in a few CLL patients treated with anti-CD20 antibodies; therefore, infections with John Cunningham (JC) virus should be ruled out in situations of unclear neurological symptoms,” the panel recommended.

They note that patients younger than 65 years who receive fludarabine-based therapy in the first line do not require routine monitoring or infection prophylaxis, given the low reported incidence of infections in this group.

The authors reported having no financial disclosures related to the guidelines.

FROM BLOOD


MDedge Daily News: Antibiotic resistance leads to ‘nightmare’ bacteria

Antibiotic resistance leads to “nightmare” bacteria. PPIs aren’t responsible for cognitive decline. Levothyroxine comes with risks for older patients. And Medicare formulary changes could be on the way.

Listen to the MDedge Daily News podcast for all the details on today’s top news.


 


HSV-2 Has Little to No Effect on HIV Progression

New data challenge the idea that herpes simplex virus type 2 has any effect on the viral load and CD4 counts of patients with HIV.

Patients with HIV often also have herpes simplex virus type 2 (HSV-2) infection in part because lesions act as entry portals to susceptible HIV target cells. Some research also has suggested that HSV-2 accelerates HIV progression by upregulating HIV replication and increasing HIV viral load, but data are inconclusive, say researchers from the Iranian Research Center for HIV/AIDS, Pasteur Institute of Iran, Iranian Society for Support of Patients With Infectious Disease, Kermanshah University of Medical Sciences, Tehran University of Medical Sciences, and Zanjan University of Medical Sciences in Iran. They conducted a study to investigate HSV-2 seroprevalence in patients with and without HIV and to find out whether HSV-2 serostatus changed as CD4 counts and HIV viral load changed after 1 year.

The researchers compared 116 HIV patients who were not on HAART with 85 healthy controls. The prevalence and incidence of HSV-2 infection were low in the HIV cases and “negligible” in the control group: 18% of the treatment-naïve HIV patients had HSV-2 IgG, and none of the control patients did.

Few data exist about HSV-2 seroconversion in HIV patients, the researchers say. In this study, HSV-2 seroconversion was found in 2.43% of HIV patients after 1 year.

Co-infection with HSV-2 had no association with CD4 count or HIV RNA viral load changes in the study participants at baseline or over time, the researchers say. CD4 counts after 1 year were 550 cells/mm³ in the HSV-2 seropositive patients and 563 cells/mm³ in the seronegative patients. The viral load was 3.97 log copies/mL in the seropositive group and 3.49 log copies/mL in the seronegative group.
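For readers less used to log-scale viral loads, a quick conversion of that difference into a fold change (a rough illustration, not a comparison the investigators reported as statistically significant):

\[ 3.97 - 3.49 = 0.48\ \log_{10}\ \text{copies/mL}, \qquad 10^{0.48} \approx 3, \]

that is, roughly a threefold higher viral load in the seropositive group.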

The researchers conclude that HIV-HSV-2 co-infection does not seem to play a role in HIV infection progression.



Stopping the Suicide “Contagion” Among Native Americans

The CDC finds that rates of suicide linked to another person’s suicide death are extremely high in the American Indian/Alaska Native community.

American Indians/Alaska Natives (AI/AN) have a disproportionately high rate of suicide—more than 3.5 times those of racial/ethnic groups with the lowest rates, according to a CDC study. And the rate has been steadily rising since 2003.

Those at highest risk are young people aged 10 to 24 years: More than one-third of AI/AN suicides occurred in that age group, compared with 11% of suicides among whites.

In the CDC study, about 70% of AI/AN decedents lived in nonmetropolitan areas, including rural areas, which underscores the importance of implementing suicide prevention strategies in rural AI/AN communities, the researchers say. Rural areas often have fewer mental health services due to provider shortages and social barriers, among other factors. The researchers point out that in their study AI/AN had lower odds than did white decedents of having received a mental health diagnosis or mental health treatment.

The researchers also found suggestions of “suicide contagion”; AI/AN decedents were more than twice as likely to have a friend’s or family member’s suicide contribute to their death. Community-level programs that focus on “postvention,” such as survivor support groups, should be considered, the researchers say. They also advise that media should focus on “safe reporting of suicides,” for example, by not using sensationalized headlines.

Nearly 28% of the people who died had reported alcohol abuse problems, and 49% had used alcohol in the hours before their death. The researchers caution that differences in the prevalence of alcohol use among AI/AN might be a symptom of “disproportionate exposure to poverty, historical trauma, and other contexts of inequity and should not be viewed as inherent to AI/AN culture.”

