MrExcel Publishing
Your One Stop for Excel Tips & Solutions

Processing Time

Posted by Gary_41701 on April 18, 2001 9:26 PM

There's a debate going on at work regarding processing
time. We use VLOOKUPs to do a lot of reporting.
Some of the files are extremely large (3 MB or bigger --
yes, I know we should use Access -- anyway). The
source data comes down in thousands and must be
converted to millions to conform to corporate standards.

One group claims that Excel processes data faster
if, on your reporting tabs, you insert your VLOOKUPs
below the reporting section, then reference those numbers
into the formatted schedules and divide them by 1,000.

I find it hard to believe that this is faster than just
inserting a VLOOKUP formula directly into the
reporting section of the tab and incorporating the
dividing/rounding factor in the VLOOKUP formula itself.
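For concreteness, the two approaches being debated might look something like this (the sheet name, ranges, and column number here are hypothetical placeholders, not Gary's actual formulas):

```
' Approach A: helper VLOOKUP below the reporting section,
' referenced into the schedule and scaled there.
' Helper cell (e.g. B100):
=VLOOKUP($A2,Data!$A$2:$D$5000,3,FALSE)
' Reporting cell:
=B100/1000

' Approach B: one formula directly in the reporting cell,
' with the scaling and rounding built in:
=ROUND(VLOOKUP($A2,Data!$A$2:$D$5000,3,FALSE)/1000,0)
```

Either way, the VLOOKUP itself runs once per reported figure; the helper-cell version mainly adds one extra cell reference per number.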

Any input would be appreciated.

Please skip the custom formatting route - been there.

Posted by Aladin Akyurek on April 18, 2001 11:27 PM


I think your 'opponents' have a point there. Doing the VLOOKUP at report time boils down, I have the impression, to eliminating extra intermediate computations. I also believe that converting to millions as late as possible is a good idea, which would shorten the processing time, because you convert only the few numbers you actually need, not all of them.

My 2 cents.


Posted by Dave Hawley on April 19, 2001 12:19 AM

Hi Gary

Could you provide a sample of each VLOOKUP ?

Also, you mention "formatting". No matter how you format a cell, its underlying (true) value will not change.

If your file is over 3 MB then you either have an awful lot of formulas or the workbook (or a worksheet) is corrupt. To find out if it is the latter, follow these steps.

1. Save as "Microsoft Excel Workbook" as opposed to 97/95. In other words, avoid saving in multiple-version format.

2. Open the VBE and export all Modules and Forms to your hard drive (right-click on them and select Export). Then delete all Modules and Forms in your workbook. Now import all Forms and Modules from your hard drive back in.

3. On each worksheet, select the row below the last used row, push Ctrl+Shift+End, then go to Edit>Clear>All.

4. Right-click on any sheet name tab and select "Move or Copy". From the "To book" dropdown select "(new book)". Check the "Create a copy" box and click OK. Now save this new book under any name and go to File>Properties|General to see how much size that sheet adds. Do the same for all sheets.

If a sheet causes an unreasonable increase in file size then it is probably corrupt. Try highlighting all the data on the sheet and copying it to a new sheet. If there's still no change, copy small areas at a time, saving and checking the file size as you go. This will narrow down any corrupt data, which will need redoing.

Another factor that can slooooww down Excel is too many array formulas. Most of the time these can be replaced with the database functions, e.g. DSUM, DGET, etc.
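As an illustration of that swap (the ranges, region value, and "Sales" field label here are invented for the example), an array formula and a roughly equivalent DSUM might look like:

```
' Array formula, confirmed with Ctrl+Shift+Enter
' (sums column C where column A equals "East"):
{=SUM(IF(Data!$A$2:$A$5000="East",Data!$C$2:$C$5000))}

' Database-function version: criteria range F1:F2 holds the
' field name "Region" in F1 and the value "East" in F2,
' and "Sales" is the label on the column being summed.
=DSUM(Data!$A$1:$C$5000,"Sales",$F$1:$F$2)
```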

I would also strongly suggest considering the use of Pivot Tables. These are one of Excel's most powerful features and in some cases can eliminate the need for any formulas.

Following the above can (and has) reduced file size by over 75% and increased recalculation speed by more than 1000%.


OzGrid Business Applications

Posted by Mark W. on April 19, 2001 7:41 AM

Gary, I'm inclined to side with you on this issue.
Of course, all of us are speculating about the
root cause of your recalc times -- kinda like
performing heart surgery by telephone -- but if
your worksheet is laden with VLOOKUP() functions
that have 0 or FALSE as their last argument, there
are ways to tune them for better performance.

One approach would be to reorganize your lookup
table so that the most commonly found lookup
values are placed at the top of the lookup range.

If your lookup range is sparsely populated and,
therefore, most of your lookup values are not
found, then you should consider using VLOOKUP()
functions with 1 or TRUE as their last argument.
Of course, such a VLOOKUP() would require that
your lookup table range be sorted, and you'd
need to return the value from the 1st column
of the lookup table along with the value you're
seeking. I've shaved an enormous amount of
recalc time using this 2nd approach on co-workers'
worksheets. But, as with most things, there are
trade-offs. In this case, you're trading off
workbook size to gain recalc speed.
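A sketch of that second approach (the table range and column numbers are hypothetical, and the lookup table is assumed sorted ascending on its first column):

```
' Exact match: scans the table row by row, slow on a big
' table, and returns #N/A when the key is not found.
=VLOOKUP(A2,Table!$A$2:$C$10000,3,FALSE)

' Approximate-match pair: first verify that the key in the
' 1st column really matches, then return column 3 the same way.
=IF(VLOOKUP(A2,Table!$A$2:$C$10000,1,TRUE)=A2,
    VLOOKUP(A2,Table!$A$2:$C$10000,3,TRUE),
    NA())
```

The TRUE form uses a binary search over the sorted table, which is where the speed comes from; the guard lookup on the first column is the extra bulk Mark describes as the trade-off.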

Posted by Aladin Akyurek on April 19, 2001 8:05 AM

Mark -- Reading your response, I've come to think that I misread Gary's case. I'm curious, though, about those VLOOKUP formulas.

Gary -- Would you like to share with us the most used VLOOKUP-formula? I'm just curious.


Posted by Aladin Akyurek on April 19, 2001 11:11 AM


Did you look at the exchange between Adam and me?


I've been looking for quite some time now for a different technique to create dynamic ranges with less recalc. Any thoughts on this one?


Posted by Mark W. on April 19, 2001 1:20 PM

Aladin, maybe I'm tired...or left my thinking cap
at home, but wasn't the original request to prevent
his cell range from being affected by row deletions?


Wouldn't =INDEX(INDIRECT("[Book1.xls]Sheet1!$A$1:$M$10000"), MATCH(ramsecname, INDIRECT("[Book1.xls]Sheet1!$A$1:$A$10000"), FALSE), 7)
have solved this problem? What am I missing here?

Posted by Aladin Akyurek on April 19, 2001 1:53 PM

Yep. It would...

Posted by Gary04_1901 on April 19, 2001 5:20 PM

WOW!!! It'll take a while to digest all of the
messages and try out your tips, but I will reply.