In clinical scenarios where warfarin remains the drug of choice for anticoagulation, achieving an optimal time in therapeutic range is still a challenge. Is there promise in a new dosing paradigm?
The novel oral anticoagulants (NOACs) have eliminated some of the challenges of warfarin dosing. There are still, however, many clinical situations in which warfarin is the preferred oral anticoagulant, such as during pacemaker implantation, following surgery, and after a stroke. This ongoing practice stems largely from the limited data available on the safety of NOACs in such clinical settings.
For this reason, optimizing time in therapeutic range (TTR) for warfarin remains an important issue. This is especially true in the US, where the average TTR is quite low, approximately 60%. Previous studies have raised the question of whether using pharmacogenetic (PG) information (vs a clinical algorithm) to dose warfarin may maximize the number of patients who stay within the appropriate therapeutic window.
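The article does not specify how TTR is calculated; anticoagulation trials commonly use the Rosendaal linear-interpolation method, which assumes the INR changes linearly between successive measurements and credits the fraction of each interval spent inside the target range (typically INR 2.0-3.0). A minimal sketch under those assumptions:

```python
from datetime import date

def rosendaal_ttr(readings, low=2.0, high=3.0):
    """Estimate percent time in therapeutic range (TTR) by linear
    interpolation between successive INR readings (Rosendaal method).
    `readings` is a date-sorted list of (date, INR) tuples."""
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(readings, readings[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue
        total_days += span
        if inr0 == inr1:
            # Flat segment: either entirely in or entirely out of range.
            frac = 1.0 if low <= inr0 <= high else 0.0
        else:
            # Times (as fractions of the interval) at which the
            # interpolated INR crosses the range boundaries.
            lo_t = (low - inr0) / (inr1 - inr0)
            hi_t = (high - inr0) / (inr1 - inr0)
            t0, t1 = sorted((lo_t, hi_t))
            # Clip the in-range window to the interval [0, 1].
            frac = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += frac * span
    return 100.0 * in_range_days / total_days if total_days else 0.0

# Hypothetical weekly INR values during warfarin initiation.
readings = [
    (date(2013, 1, 1), 1.5),   # subtherapeutic at start
    (date(2013, 1, 8), 2.5),   # rises into range mid-interval
    (date(2013, 1, 15), 2.8),  # stays in range
    (date(2013, 1, 22), 3.6),  # drifts above range
]
print(round(rosendaal_ttr(readings), 1))  # → 58.3
```

The dates and INR values are illustrative only; the point is that TTR rewards partial intervals, so a patient who crosses the range mid-week still accrues some in-range time.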
The Clarification of Optimal Anticoagulation Through Genetics (COAG) trial, which was presented at the American Heart Association Scientific Sessions in 2013 and simultaneously published in the NEJM, was a large, randomized controlled trial that attempted to answer this question in a prospective fashion. A total of 1015 patients were randomized 1:1 to either clinical-guided warfarin dosing (n=501) or PG-guided dosing (n=514). Within the study, patients were stratified by race (black vs nonblack), since it is established that PG algorithms do not perform well in black patients. Clinical-guided dosing algorithms were based on age, race, body surface area, smoking status, amiodarone use, target INR, and warfarin indication. Genotype information included cytochrome P450 2C9 (CYP2C9) and vitamin K epoxide reductase complex 1 (VKORC1). Dose revisions were done on days 4 and/or 5.
The primary endpoint of TTR during the first 28 days of treatment was no different between the clinical-guided dosing arm and the PG-dosing arm (mean TTR 45.4% vs 45.2%, respectively; P = .91). Interestingly, among black patients, TTR was lower with PG-guided dosing than with clinical-guided dosing (adjusted mean difference, -8.3 percentage points; 95% CI, -15.0 to -2.0; P = .01).
This study definitively lays to rest the question of the advantage of PG-guided warfarin dosing; clinical-guided algorithms are equally effective. Notably, however, both strategies performed rather poorly, with a TTR below 50% in each arm. Furthermore, particular attention should be paid to INR monitoring in black patients, in whom warfarin pharmacology may be more complex.
Kimmel SE, French B, Kasner SE, et al. A pharmacogenetic versus a clinical algorithm for warfarin dosing. N Engl J Med. 2013;369:2283-2293. doi: 10.1056/NEJMoa1310669. Epub 2013 Nov 19.