Automated determination of spatial QT interval distribution in cardiac magnetic field mapping reveals repolarization inhomogeneities in high-risk patients

Available online at www.sciencedirect.com

Journal of Electrocardiology 40 (2007) S129 – S135 www.jecgonline.com

Abstracts
Poster Session 3

Impaired T-amplitude adaptation to heart rate changes identifies IKr inhibition in the congenital and acquired form of the long QT syndrome

Jean-Philippe Couderc,a Martino Vaglio,a Xiajuan Xia,a Scott McNitt,a Pierre Wicker,b Nenad Sarapa,c Arthur J Moss,a Wojciech Zarebaa
a Heart Research Follow-up Program, Cardiology Department, University of Rochester Medical Center, NY, USA
b Pfizer Inc., Global Research and Development, CT, USA
c Daiichi Sankyo Pharma Development, NJ, USA

Background: QTc interval prolongation is not a perfect surrogate marker of increased risk for arrhythmic events. In the congenital form of the long QT syndrome (LQTS), 10% to 15% of patients carrying a HERG mutation do not have prolongation of the QT interval. In the acquired form of LQTS, polymorphic ventricular tachycardia is documented in individuals taking IKr-blocking drugs but without strong QT prolongation. We therefore investigated the role of repolarization adaptation to heart rate changes for identifying abnormal IKr kinetics in the congenital and acquired forms of LQTS.

Method: Our investigation is based on 5 study groups: genotyped LQT1 patients (n = 49, 35♀), genotyped LQT2 patients (n = 25, 19♀), and healthy individuals (n = 37, 11♀) on and off sotalol, including a single dose (SD; 160 mg; n = 37) and a double dose (DD; 320 mg; n = 21). Twelve-lead digital Holter recordings were acquired in LQTS patients and during periods on and off drug in the healthy groups. Median beat analysis was performed in all leads during the diurnal period. We investigated various repolarization parameters and their heart rate dependency, including the QT interval and T-wave amplitude.

Results: The results from lead II reveal a loss of heart rate dependency of T-wave amplitude as a common feature in individuals with decreased IKr kinetics (LQT2 and subjects under sotalol). The Table provides the values of the T amplitude/RR and QT/RR slopes in all groups.

Table. T amplitude/RR and QT/RR slopes in all groups:

Group                 QT/RR            T amplitude/RR (μV/ms)
Healthy (n = 39)      0.12 ± 0.04      0.55 ± 0.29
LQT1 (n = 49)         0.17 ± 0.10 a    0.62 ± 0.40
LQT2 (n = 25)         0.22 ± 0.16 a    0.31 ± 0.27 a,b
SD sotalol (n = 37)   0.15 ± 0.05 a    0.26 ± 0.19 a,b
DD sotalol (n = 21)   0.14 ± 0.06      0.21 ± 0.14 a,b

a P < .05 in reference to healthy. b P < .05 in reference to LQT1.

Conclusion: Impaired adaptation of T-wave amplitude was evidenced as a common electrocardiographic feature associated with HERG mutation and the IKr-blocking drug sotalol. This electrocardiographic marker may play an important role in the future assessment of both the penetrance of HERG mutation and the IKr-related cardiotoxicity of drugs.

doi:10.1016/j.jelectrocard.2007.08.042
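The slopes in the Table are heart rate dependency measures: each repolarization parameter taken from the diurnal median beats is regressed against the preceding RR interval. Below is a minimal sketch of how such slopes could be computed, assuming ordinary least squares over per-beat measurements; the function name and the example values are illustrative, not the authors' implementation.

```python
import numpy as np

def rate_dependency_slope(rr_ms, param):
    """Least-squares slope of a repolarization parameter vs the RR interval.

    rr_ms : RR intervals (ms) preceding each median beat
    param : matching QT intervals (ms) or T-wave amplitudes (uV)
    Returns the slope in parameter units per ms of RR.
    """
    rr_ms = np.asarray(rr_ms, dtype=float)
    param = np.asarray(param, dtype=float)
    slope, _intercept = np.polyfit(rr_ms, param, deg=1)
    return slope

# Hypothetical diurnal lead II measurements for one subject
rr = [780, 810, 845, 900, 950, 1010]      # ms
qt = [380, 384, 389, 395, 402, 409]       # ms
t_amp = [310, 322, 335, 352, 370, 391]    # uV

print("QT/RR slope:", round(rate_dependency_slope(rr, qt), 2))              # ms/ms
print("T amplitude/RR slope:", round(rate_dependency_slope(rr, t_amp), 2))  # uV/ms
```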

Using the European Society of Cardiology ST database to compare 2 automated methods for ST episode measurement

Dirk Feild
Advanced Algorithm Research Center, Philips Medical Systems, Thousand Oaks, CA

Background: An effective way to evaluate ST measurements and episodes is critically needed. One solution is the European Society of Cardiology's (ESC) ST Database. As supplied, this database fails to fully meet the evaluator's needs, for multiple reasons. With supplemental processing, however, the database can be turned into an effective evaluation tool. The limitations of the database include (1) a method for establishing a baseline that fails to match practical use models, (2) sparse ST measurements, (3) fixed episode criteria, (4) shortness of records, and (5) atypically clean records. Of these issues, only the first is critical. The database defines ST episodes relative to a baseline measurement typically taken from the first beat of the record. No practical computer algorithm would take this approach. ST measurements are inherently sensitive to numerous sources of artifact; baseline wander, muscle artifact, electrode noise, and positional changes in particular affect measurements significantly. Various filtering methods are used in a practical system to overcome these issues, so clinical algorithms will invariably measure a filtered ST baseline value distant from the first beat. Differences in this one measurement then propagate to all other measurements, effectively destroying episode evaluation.

Methods: To overcome this "one measurement/use model" problem, a DC offset is applied to each channel of each record in the database. This correction is fixed throughout all measurements and all episodes in that record. The DC offset is chosen to minimize the total nonoverlap time of the reference and test episodes, effectively canceling the impact of the initial reference measurement. In this work, we compare 2 methods of automated ST-interval measurement implemented in the Philips Holter application: (1) measure the raw baseline wander corrected electrocardiogram (ECG) and then smooth the measurements, and (2) average the beats (baseline wander corrected ECG), then measure the result. In each case, the ESC database is used to independently evaluate performance.

Results: Combined channels, gross measurements (as if 1 record):

Algorithm method   Episode Se (%)   Episode +P (%)   Duration Se (%)   Duration +P (%)
Post averaging     96               91               91                85
Pre averaging      98               92-96 a          94                83

a ≥Both.

Conclusions: The ESC ST database was an effective tool in showing the superiority of smoothing before measuring the baseline corrected ECG in this application.

doi:10.1016/j.jelectrocard.2007.08.043
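The per-record DC offset described in the Methods is selected to minimize the total time during which reference and test ST episode annotations disagree. The sketch below illustrates that idea, assuming episodes are represented as boolean per-second masks and the offset is searched over a small grid; the episode criterion and the search strategy are simplifications for illustration, not the Philips implementation.

```python
import numpy as np

def nonoverlap_seconds(ref_mask, test_mask):
    """Total time (s) where reference and test episode annotations disagree."""
    return int(np.sum(ref_mask != test_mask))

def episodes_from_st(st_uv, threshold_uv=100.0, min_duration_s=30):
    """Mark seconds belonging to ST episodes: |ST deviation| >= threshold
    sustained for at least min_duration_s (simplified episode criterion)."""
    above = np.abs(np.asarray(st_uv, dtype=float)) >= threshold_uv
    mask = np.zeros_like(above)
    run_start = None
    for i, flag in enumerate(np.append(above, False)):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start >= min_duration_s:
                mask[run_start:i] = True
            run_start = None
    return mask

def best_dc_offset(ref_mask, st_uv, offsets_uv=np.arange(-200, 201, 5)):
    """Choose the fixed per-channel DC offset that minimizes the total
    nonoverlap time between reference and test episodes."""
    scores = [nonoverlap_seconds(ref_mask, episodes_from_st(np.asarray(st_uv) + off))
              for off in offsets_uv]
    return offsets_uv[int(np.argmin(scores))]
```

Once chosen, the offset would be held fixed for all measurements and all episodes in that record, as the abstract describes.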

doi:10.1016/j.jelectrocard.2007.08.044

Automated determination of spatial QT interval distribution in cardiac magnetic field mapping reveals repolarization inhomogeneities in high-risk patients

Robert Fischer,a Vinzenz von Tscharner,b Andrej Gapelyuk,a Udo Zacharzowsky,a Henry Schutt,a Alexander Schirdewana



a Franz-Volhard-Klinik, Charité Berlin, Germany
b Human Performance Laboratory, University of Calgary, Canada

Background: Mechanisms of arrhythmogenesis involve spatial inhomogeneities of myocardial activation and repolarization. These result from regional changes in myocardial electrophysiological properties, which occur in the development of most cardiac pathologies. Heterogeneity of ventricular repolarization is usually assessed by measurements of QT dispersion (QTd) and the TPeak-TEnd interval in the 12-lead electrocardiogram. Given its contactless nature and fixed sensor distribution, cardiac magnetic field mapping (CMFM) allows fast and reproducible noninvasive mapping.

Methods: We investigated whether CMFM is a suitable tool to image repolarization inhomogeneities in patients at increased risk for sudden cardiac death. Multichannel magnetocardiograms were registered in a planar area over the thorax. A new automated procedure was applied to determine TPeak and TEnd time points in 36 magnetocardiogram waveforms, which represent the cardiac magnetic field distribution in a regular 6 × 6 grid covering 20 × 20 cm over the heart. Standard QTd parameters as well as maps of spatial TPeak, TEnd, and TPeak-TEnd distribution were compared between 20 healthy subjects (NORMAL) and 27 patients at risk for recurrent ventricular tachycardia before implantable cardiac defibrillator implantation (HIGH_RISK).

Results: QT dispersion and the standard deviation of QT intervals (QTsd) were significantly higher in HIGH_RISK patients compared with healthy controls (QTd, 141 ± 13 vs 94 ± 8 milliseconds; QTsd, 36.8 ± 4 vs 18.4 ± 1 milliseconds, respectively; P < .005). The same was true for TPeak dispersion; however, the most significant changes were found in TPeak-TEnd dispersion (TPeak-TEnd dispersion, 143.8 ± 12 vs 82.5 ± 10 milliseconds; TPeak-TEnd standard deviation, 34.5 ± 3.4 vs 17.1 ± 1 milliseconds; P < 10⁻⁵). Maps reflecting the distribution of TPeak, TEnd, and TPeak-TEnd showed a homogeneous pattern in healthy subjects. Earliest repolarization was found in central and right inferior areas, corresponding to septal and right ventricular sites, whereas late left superior and right superior areas correspond to inferior left ventricular and right ventricular outflow tract sites. In contrast, the same maps of HIGH_RISK patients showed inhomogeneous multipolar patterns with sharp rises in QT duration, probably indicating significant differences in the repolarization process of neighboring myocardial areas.

Conclusion: Application of an automated procedure to multichannel CMFM revealed increased dispersion of repolarization in patients at high risk for sudden cardiac death. Distribution patterns of TPeak, TEnd, and TPeak-TEnd values may help to image spatial inhomogeneities in repolarization and may thus help to detect patients at high risk for ventricular arrhythmias.

doi:10.1016/j.jelectrocard.2007.08.045
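The dispersion statistics reported above (QTd, QTsd, and their TPeak-TEnd counterparts) are summaries over the 36 grid channels. The sketch below shows one way such parameters could be derived from per-channel TPeak and TEnd annotations on a 6 × 6 grid, assuming dispersion is taken as the max-min range across usable channels; the array layout and function names are illustrative, not the authors' automated procedure.

```python
import numpy as np

def repolarization_dispersion(q_onset_ms, t_peak_ms, t_end_ms):
    """Dispersion summaries over a 6 x 6 grid of magnetocardiogram channels.

    q_onset_ms, t_peak_ms, t_end_ms : (6, 6) arrays of per-channel time
    points (ms) relative to a common fiducial; NaN marks unusable channels.
    """
    qt = t_end_ms - q_onset_ms        # per-channel QT interval
    tpe = t_end_ms - t_peak_ms        # per-channel TPeak-TEnd interval
    summary = {}
    for name, grid in {"QT": qt, "TpTe": tpe}.items():
        values = grid[~np.isnan(grid)]
        summary[name + "d"] = float(values.max() - values.min())  # dispersion (range)
        summary[name + "sd"] = float(values.std(ddof=1))          # standard deviation
    return summary

# Hypothetical grid with QRS onset at 0 ms in every channel
rng = np.random.default_rng(0)
t_peak = 300 + rng.normal(0, 10, size=(6, 6))
t_end = t_peak + 90 + rng.normal(0, 8, size=(6, 6))
print(repolarization_dispersion(np.zeros((6, 6)), t_peak, t_end))
```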
Increasing efficiency of thorough QT/QTc studies

Cynthia L. Green, PhD
Duke University Medical Center, Durham, North Carolina, USA

Background: QT/QTc prolongation may cause ventricular arrhythmias, including ventricular fibrillation, which can be fatal, although the degree of this association is not known. Current International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) E14 guidance documents request that sponsors submitting new drug applications conduct a thorough QT/QTc study during early clinical development. The thorough QT study is usually conducted in healthy volunteers to determine whether the effect of a novel drug on the QT/QTc interval in target patient populations should be studied further during later stages of drug development.

Methods: A negative thorough QT/QTc study is typically one in which the upper bound of a 95% one-sided confidence interval for the largest time-matched mean effect of the drug on the QTc interval excludes 10 milliseconds. When designing QT studies, controlling variability is critical to reducing the required sample size. Repeated QT measurements at each time point are recommended, and the mean of these replicates is used for analysis. Baseline QT measurements are typically acquired at matching time points on the day before treatment. The study is designed to test the null hypothesis that some time-matched difference is larger than the threshold, whereas the alternative hypothesis claims that all differences are less than the threshold.

Results: Let R be the rejection region of a level α test of the null hypothesis that a given time-matched difference is larger than the threshold; the intersection-union statistical test, which rejects only when every time-matched test rejects, is still a level α test of the global null hypothesis, eliminating the need to adjust for multiple tests at each time point. Simulation studies show that the significance level is well controlled and that the power of the statistical test depends on the true mean difference between drug and placebo, the noninferiority margin, the variability of the data, the number of time points analyzed, and the sample size, with minimal impact on the study results in most cases.

Conclusions: A QT/QTc study is negative if the study drug is noninferior to placebo with respect to QT/QTc interval measurements. If all one-sided 95% upper limits of the time-matched mean difference between drug and placebo after baseline adjustment are below the noninferiority threshold at each time point, then we can claim a negative thorough QT/QTc study. By the intersection-union statistical test, there is no need for multiplicity adjustments to claim that the drug is noninferior to placebo, thus reducing the required sample size.

doi:10.1016/j.jelectrocard.2007.08.046
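The decision rule in the Conclusions reduces to checking every time-matched one-sided 95% upper confidence limit against the 10-millisecond margin. Below is a minimal sketch of that intersection-union check, assuming a normal-approximation confidence limit and the input format shown; the function name and example numbers are illustrative only.

```python
from statistics import NormalDist

def negative_thorough_qt_study(mean_diff_ms, se_ms, margin_ms=10.0, alpha=0.05):
    """Intersection-union check for a negative thorough QT/QTc study.

    mean_diff_ms : baseline-adjusted, time-matched mean QTc differences
                   (drug minus placebo), one per post-dose time point
    se_ms        : standard errors of those differences
    Returns True when every one-sided (1 - alpha) upper confidence limit
    falls below the noninferiority margin.
    """
    z = NormalDist().inv_cdf(1 - alpha)  # about 1.645 for alpha = 0.05
    upper_limits = [d + z * s for d, s in zip(mean_diff_ms, se_ms)]
    return all(u < margin_ms for u in upper_limits)

# Hypothetical study with 6 post-dose time points
diffs = [2.1, 3.4, 4.0, 5.2, 3.8, 1.9]   # ms
ses = [1.5, 1.6, 1.4, 1.7, 1.5, 1.6]     # ms
print(negative_thorough_qt_study(diffs, ses))  # True: all upper limits < 10 ms
```

Because the test rejects only when every per-time-point test rejects, no multiplicity adjustment is needed, which is the sample-size saving the abstract describes.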

Comparison of 2 automated methods for QT interval measurement

Richard Gregg, Saeed Babaeizadeh, Dirk Feild, Eric Helfenbein, James Lindauer, Sophia Zhou
Advanced Algorithm Research Center, Philips Medical Systems, Thousand Oaks, CA

Background: In this work, we compared 2 methods of automated QT interval measurement on standard electrocardiogram databases: the root-mean-square (RMS) lead-combining method aimed at QT monitoring, and the method of taking the median of lead-by-lead QT interval measurements.

Study population: We used the Physionet Physikalisch-Technische Bundesanstalt (PTB) (n = 548) and Common Standards for Electrocardiography (CSE) measurement (n = 125) standard databases (DBs). Both have reference QT interval measurements from a group of annotators. The last 10 seconds of each PTB record was down-sampled from 1000 sps and 1 μV amplitude resolution to 500 sps and 5 μV to match the CSE set. PTB records #205 and #557 were excluded because of ventricular paced rhythm and artifact, respectively. Twenty-five cases were excluded from the CSE set to match the selection of cases for International Electrotechnical Commission (IEC) algorithm testing (IEC 60601-2-51).

Methods: We processed all records using the Philips resting electrocardiogram algorithm to generate representative beats for QT interval measurement. The RMS method measures QRS onset and the end of T on an RMS waveform constructed from the 8 primary leads I, II, and V1-V6. The lead-by-lead method takes the median QT interval across leads. The automated QT intervals from the RMS and lead-by-lead methods were compared with the reference manual QT measurements.

Results: The differences between automated and manual QT interval measurements are reported in the table below. F tests indicate that the SD does not differ significantly between methods (P = .8 for PTB and P = .3 for CSE). Because the mean difference between the methods is 13 milliseconds for both databases (0.79 to 14.7 and −12.3 to 0.62), and the same 13-millisecond difference is seen across databases for each method (0.62 to 14.7 and −12.3 to 0.79), we suspect the measurements are 13 milliseconds short on the PTB database. The bias may be due to the high gain used during manual measurement of the CSE cases.

Method         QT difference (ms), mean ± SD
               PTB DB (n = 546)      CSE DB (n = 100)
Lead-by-lead   14.7 ± 21.4           0.62 ± 11.0
RMS            0.79 ± 20.5           −12.3 ± 11.2

Conclusion: The lead-by-lead and RMS methods perform similarly; the choice between them should therefore be based on considerations such as the number of leads available or computational efficiency.

Note: Results using experimental version 060728bug.

doi:10.1016/j.jelectrocard.2007.08.047
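The two strategies compared above differ only in where the leads are combined: the RMS method measures a single composite waveform built from the 8 primary leads, while the lead-by-lead method measures each lead separately and takes the median interval. The sketch below illustrates those two combination steps; the QT detector itself is left abstract as a passed-in callable, and the function names are assumptions rather than the Philips algorithm.

```python
import numpy as np

def rms_waveform(beats_uv):
    """Combine representative beats into one root-mean-square waveform.

    beats_uv : (n_leads, n_samples) array of representative beats from the
    8 primary leads (I, II, V1-V6), aligned on the same time axis.
    """
    return np.sqrt(np.mean(np.square(np.asarray(beats_uv, dtype=float)), axis=0))

def qt_rms_method(beats_uv, measure_qt_ms):
    """QT interval measured once, on the RMS-combined waveform."""
    return measure_qt_ms(rms_waveform(beats_uv))

def qt_lead_by_lead_method(beats_uv, measure_qt_ms):
    """Median of QT intervals measured independently on each lead."""
    return float(np.median([measure_qt_ms(lead) for lead in np.asarray(beats_uv, dtype=float)]))
```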

Transitioning prescription use medical devices to over-the-counter use

Nina Mezu-Nwaba, PharmD, Felipe Aguel, PhD, Randall Brockman, MD, Benjamin Eloff, PhD, Lesley Ewing, MD, Charles Ho, PhD, Frank Lacy,