Hi
I have a problem where the clock on a datalogger was drifting by up to 10 minutes from real time. We are trying to correct the data (recorded at 1-second frequency), which we have been able to do by multiplying each timestamp by the slope of real time versus datalogger time. However, the 'corrected' timestamps now contain partial seconds.
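For reference, the correction step looks roughly like this (a minimal Python/pandas sketch; df, 'logger_time', and slope are placeholders for our actual table and fitted value):

import pandas as pd

# sketch of the drift correction, assuming a DataFrame with a
# 'logger_time' datetime column and a fitted slope of real time
# versus datalogger time
t0 = df['logger_time'].iloc[0]                         # start of the record
elapsed = (df['logger_time'] - t0).dt.total_seconds()  # seconds on the logger clock
df['corrected_time'] = t0 + pd.to_timedelta(elapsed * slope, unit='s')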
The goal is to automate a process that averages the data from a 1-second to a 1-minute timestep. Normally, if the data were not at partial seconds, I would take the average of every 60 rows and fill down. At this point, though, the minute intervals are not necessarily 60 cells long, so a fill-down procedure won't work.
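In other words, I'm after something that buckets each record by the calendar minute its corrected timestamp falls in, whether that bin holds 59, 60, or 61 rows. A rough pandas sketch of what I mean, assuming the same df as above:

# group records by the minute they fall in, instead of by
# fixed 60-row blocks; resample tolerates bins with varying
# numbers of records
out = (df.set_index('corrected_time')
         .resample('1min')
         .mean(numeric_only=True))  # average the numeric data columns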
Any ideas?
Thanks,
Evan