How to account for different intervals of time between interrogations
- by primaldiva
- 2017-09-19 02:34:27
- Checkups & Settings
- 1487 views
- 2 comments
I have been keeping a spreadsheet of my event counters since I got my Medtronic Adapta implanted in 2009. This data, along with some advice I got on this forum, was instrumental in the case I presented to the EPs that they had programmed the pacemaker wrong and it was actually harming my heart. I am now about 2 years on from when I had it reset to the settings I told them to use, and thankfully my heart seems to have reverse remodeled: my % of ventricular pacing as well as my event counts have gone down considerably.
In order to make a solid case that might help others (I don't mean a legal case), I would like to present my data to my new EP on Thursday. The issue is that my interrogations, whether in person or via remote capture, were not always at perfect intervals in terms of days, so when the data is plotted, a particular data point on the X axis actually covers a range of dates since the last interrogation. My plan is to normalize this data by dividing the percentage (e.g. AS-VP) by the number of days in the range of dates, so 6 months would be 180 days. For the event counters I would just divide the number of events by the number of days. Does that make sense? Is this the correct way to normalize for the differing lengths of the periods between interrogations? In the future I can make sure I remote capture every 3 months, but I need to get my data from the last few years.
If anyone knows how to build a macro in Excel for this, let me know! Thanks
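To be concrete, here is a rough sketch of the calculation I mean, in Python rather than a macro since that's easier to post. The file name and column names ("date", "events") are just placeholders for whatever your spreadsheet exports:

```python
# Rough sketch of normalizing event counters by the number of days
# between interrogations. File name and column names are placeholders.
import csv
from datetime import datetime

# Each row: interrogation date plus the event count reported for the
# period since the previous interrogation.
rows = []
with open("interrogations.csv", newline="") as f:
    for r in csv.DictReader(f):
        when = datetime.strptime(r["date"], "%Y-%m-%d").date()
        rows.append((when, int(r["events"])))

rows.sort()  # oldest first, so consecutive rows define each period

# Divide each period's events by the period length in days.
for (prev_date, _), (this_date, events) in zip(rows, rows[1:]):
    days = (this_date - prev_date).days
    print(this_date, f"{events / days:.2f} events/day over {days} days")
```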
2 Comments
data plotting ventricular pacing and event counters over time
by primaldiva - 2017-09-20 00:14:32
Donr, thanks for the comment; it makes perfect sense. Leave the percentages alone because they are already "normalized," even though they won't show accurate percentages per fixed unit on my X axis. I guess we are just looking for a trend here, so it doesn't matter. It bothers me that the ranges are differing units of time, but the way around this in the future is to get my data checked via remote capture at set intervals. Remote capture does not reset the counters, though, so you have to do the subtraction when you come in for your checkup in person.
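For anyone following along, that subtraction step looks something like this (a sketch with made-up running totals; the real numbers would come from your interrogation printouts):

```python
# Sketch of the subtraction: if remote captures report running
# (cumulative) totals that only reset at in-person checkups, each
# period's count is the difference between consecutive readings.
# The totals below are made up for illustration.
cumulative = [1200, 1950, 2600, 3100]
per_period = [b - a for a, b in zip(cumulative, cumulative[1:])]
print(per_period)  # -> [750, 650, 500]
```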
My second graph, with the events, I would normalize as you describe. I don't have that many events, but I did want to show the trend over time.
Thanks for your help. andrea
NOT the correct way to do it!
by donr - 2017-09-19 09:20:27
At least NOT for percentages. They are already "Normalized" when you convert from a raw number into a percent. Just plot them as they are.
For raw numbers, divide by the number of days in the period to get the average number of events per day. For PVCs, say the report shows 147,000 single PVCs in 6 months: divided by 180 days, that yields about 817 per day on average. Unfortunately, that's the BEST you can do. There is an assumption in this presentation - that the PVCs are distributed evenly across the entire period - which is not true, but you have no other choice.
Let me give you a F'rinstance - it happened to me this past month. I had a download after a 6-month period that reported 1.47 MILLION PVCs. That is exactly TEN times the normal count my reports show. Breaking that number down to an average per minute for the entire period yields about 5.7 per minute - one PVC every 10 seconds (roughly). Now I KNOW that is wrong. Why? Because at the end of the period I was in a condition called trigeminy - triplets of 2 normal beats & 1 PVC - for hours. I could feel it & confirmed it w/ an ECG. That's about 20 PVCs per minute, based on an ASSUMED rate of 60 BPM. I use 60 BPM because I can work that out in my head. Unfortunately, my HR has been running high lately, about 120 BPM, and trigeminy at 120 BPM generates 40 PVCs per minute.
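For anyone who wants to reproduce those figures, here is a quick sanity check (a sketch, nothing more - the numbers are the ones from my report above):

```python
# Sanity check of the figures above.
print(147_000 / 180)             # ~817 PVCs per day over a 6-month period
print(1_470_000 / (180 * 1440))  # ~5.67 PVCs per minute over 6 months

# Trigeminy: every third beat is a PVC, so PVCs/min = heart rate / 3.
for bpm in (60, 120):
    print(bpm, "BPM ->", bpm / 3, "PVCs per minute")  # 20 and 40
```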
Hope this all makes sense to you, but it highlights the problems of using bulk data w/o any knowledge of the distribution.
Donr