I have a large number of structurally identical workbooks. Each of them gets a web update of its data on command via a VBA macro/sub. I believe this part works fine.
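To make that concrete (this is a simplified sketch only, not my actual code, and the sheet and sub names are invented), the per-workbook update sub is conceptually something like:
Code:
' Inside each data workbook - illustrative sketch only; names are invented.
Public Sub UpdateFromWeb()
    Dim qt As QueryTable
    ' Refresh each web query on the data sheet and wait for it to finish
    For Each qt In ThisWorkbook.Worksheets("Data").QueryTables
        qt.BackgroundQuery = False   ' synchronous refresh
        qt.Refresh
    Next qt
End Sub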
There is a summary workbook that links in data from all of the workbooks mentioned above. Each data file fills one row of the summary.
The summary also holds other data, including each data file's filename and path. This too seems to work fine.
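For concreteness (made-up names and paths, not my actual layout): column A of a summary row might hold something like C:\Data\DataFile01.xlsx, with the rest of that row being link formulas such as ='C:\Data\[DataFile01.xlsx]Data'!$B$2 that pull in that file's values.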
There is a VBA sub in the summary workbook that runs through the list of filenames and causes each data workbook to update its data.
This process also seems to work fine ... SOMETIMES. That intermittent behavior is the focus of my question.
While it runs, the process pretty much monopolizes the computer as far as Office apps are concerned: I can't readily switch from one Office window to another, although I can still do other things (like browse file folder contents). There remains plenty of free RAM and CPU capacity throughout.
I have purposely left out the files and code because I'm interested in some "big picture" thinking.
First of all, there is no indication when the process stops.
Normally the data files are never visibly open. But when the process hangs, sometimes one of the data files remains open. Manually closing (with Save) that data file doesn't cause the process to continue. By itself, this suggests that it's the VBA code driving the data files (from the summary file) that hangs.
Stepping through the code reveals nothing, because the code generally just works, and I can well imagine that the slowness of manual stepping masks any timing or race conditions that may be present.
The debugger is never triggered, which indicates that the process is "waiting" for something rather than raising an error.
This is frustrating because it gives no indication of what's going wrong.
How would you tackle this kind of problem in concept?
Pseudo code process:
Code:
Loop
    Get data filename and path.
    Open the file and initiate a data update via the web.
    [this seems to be where it hangs at times]
    Close the file with Save.
End loop
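In actual VBA terms, that loop is conceptually something like this (again just a sketch with invented names, not the real code; the path list is assumed to be in column A of a "Summary" sheet):
Code:
' In the summary workbook - illustrative sketch only; names are invented.
Public Sub UpdateAllDataFiles()
    Dim ws As Worksheet, wb As Workbook, r As Long
    Set ws = ThisWorkbook.Worksheets("Summary")
    For r = 2 To ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
        Set wb = Workbooks.Open(Filename:=ws.Cells(r, "A").Value)
        ' Run the update macro in the data workbook
        ' [this is roughly where it hangs at times]
        Application.Run "'" & wb.Name & "'!UpdateFromWeb"
        wb.Close SaveChanges:=True
    Next r
End Sub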
Thanks