optimize VBA for i7 processor and 64 bit

mountainclimber11

Board Regular
Joined
Dec 1, 2009
Messages
79
Hello,

Here is what I have:
1) an i7 (8 core) processor,
2) 64 bit machine,
3) Excel 2007 (32 bit),
4) a massive permutation program written in VBA with lots of seemingly unavoidable loops that take weeks to run,
5) one (of 8) cores is about maxed out constantly while this program runs.

Questions:
a) What options do I have to speed this program up?
b) How can I take advantage of the other 7 cores?
c) Is there anything I can do regarding the 64 bit machine / 32 bit Excel situation, and would it help with anything (not just speed)?
d) Is there any way to run two VBA scripts at once on one machine?
e) Any links on the subject would be great.

I did some searching online and didn't see any way to take advantage of the multi-core processor using VBA nor did I see a 64 bit version of Excel. (It looked like a 64 bit Excel is either in beta or will be out in 2010.)

Thanks!
 
Updating the status bar takes longer than running through a slow loop.

You're probably not still doing that.

For timing long-running loops try
Code:
Dim LC As Long
Dim LCa As Long
Dim Timer1 As Single 'Timer returns a Single (seconds since midnight)
 
'Before the start of the loop
LC = 0
LCa = 0
Timer1 = Timer
 
'at the end of the loop
 
LC = LC + 1
If LC = 10000 Then
    LCa = LCa + 1
    Application.StatusBar = LCa * 10000 & " loops took " & Timer - Timer1 & " Secs"
    LC = 0 'reset so the counter fires again on the next 10,000
End If

That only adds one increment and one If test to the loop's cycle time, except on every 10,000th loop.
 
Excel 2007 or later will multi-thread its calculations (http://www.dailydoseofexcel.com/arc...multi-threading-calculations-on-8-processors/). However, VBA will not. VBA is single-threaded and I would be surprised if Microsoft ever invests in changing that.

You can switch to VB.Net and implement your own multi-threading code. Or you can optimize VBA by, as you already have, removing Selects/Activates, and by moving data from/to Excel in a single step (if you are looping through individual cells). You can also re-examine your basic algorithms to see if you can get a substantial gain from some other approach.

Tweaking data types is at best a minor optimization. You may get a few percent improvement, whereas a superior algorithm could improve performance by an order of magnitude.

Also, depending on how much work you do with each file you are opening/closing, the I/O overhead might be substantial. Disk is, after all, the slowest of CPU, memory, and disk.
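
A minimal sketch of the "single step" idea, for anyone following along (the sheet name and range addresses here are made up for illustration):
Code:
Dim v As Variant, r As Long, total As Double

'One read instead of 30,000 individual cell reads
v = Worksheets("Sheet1").Range("A1:A30000").Value

'Loop over the in-memory array, not the cells
For r = 1 To UBound(v, 1)
    total = total + v(r, 1)
Next r

'One write instead of 30,000 individual cell writes
Worksheets("Sheet1").Range("B1:B30000").Value = v

Note that reading a multi-cell range into a Variant always gives a two-dimensional, 1-based array (rows, then columns), even for a single column.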
I agree that VBA is not the best, but I don't know SQL or something like that, that would work better.

I think the issue is the number of iterations, which is something I can deal with, but I need to maximize/optimize things.

Note: I WANT to use all 8 processors, but VBA will only use one.
 
Correct, I am not using that anymore. Was just using it to help keep track of things while I tested. Yes I have a simple timing system in place now:
Code:
Dim StartTime As Single, EndTime As Single 'Timer returns a Single
StartTime = Timer
 
'code here
 
EndTime = Timer
Debug.Print Format(EndTime - StartTime, "0.0")
Thanks! It sounds like I am on the right track then. I feel pretty good about my progress so far. On a very small set of data that I test on, I cut the run time from about 15 minutes to around 20 seconds! I still need to replace my use of Split(txt, delim)(idx) with Mid and InStr calls, so I suspect I will cut a bit more time.
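
A sketch of that replacement, assuming you only want one field from a delimited string (the delimiter and field index are illustrative):
Code:
'Split allocates an array holding every field just to get one of them:
s = Split(txt, ",")(2)

'InStr/Mid$ walk to the field you want without allocating the rest:
Dim p1 As Long, p2 As Long
p1 = InStr(1, txt, ",")          'end of field 0
p1 = InStr(p1 + 1, txt, ",")     'end of field 1
p2 = InStr(p1 + 1, txt, ",")     'end of field 2
If p2 = 0 Then p2 = Len(txt) + 1 'field 2 is the last field
s = Mid$(txt, p1 + 1, p2 - p1 - 1)

The $-suffixed Mid$ returns a String rather than a Variant, which avoids a conversion on each call.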

You mention memory. Is there a way to test for memory leaks? Also, are there some best practices in VBA for memory conservation? I don't think my system will have any issues with it, but it's good to know for the future...

thanks!


are there some best practices in VBA for memory conservation?
Same as any other language: release it, or shrink its memory footprint, when it's not needed for a while.

Code:
Erase ArrayVar          'Frees a dynamic array's memory
'or
ReDim ArrayVar(0)       'ReDim requires new bounds; shrinks it to one element
StringVar = ""          'Frees a variable-length string
VariantVar = Null
Set ObjectVar = Nothing 'Makes the var empty. Can reuse
Unload FormObject       'Kills a loaded UserForm. Must reload to reuse
Reset                   'Closes all open disk files
 
If you are reading ranges in one cell at a time, it will be faster to read the whole range into a Variant in one hit, then loop through the resulting array. Ditto when outputting the data - do it in one hit, not cell by cell if at all possible.
 
Sam Tyler - Thanks! Before I do that, is there a way to check whether this is even an issue? If I'm not having memory issues, I'll probably just leave it be for now.

thanks.


rorya - Thanks! Yes, this is on the list of to-dos. I am currently referencing cell by cell in a few key areas.

I just found this link, which I am finding helpful:
http://www.avdf.com/apr98/art_ot003.html

I was about to ask what is the fastest way to get a range of values into an array and back again, but it looks like this is it:
Code:
'to read in
Dim A As Variant 'MUST be Variant, no brackets
A = Range("SomeRange").Resize(10, 20) 'reads 10x20 array starting at range SomeRange
'(NB I've used Resize above but you can specify a range of cells any way you want)

'to write back to sheet
Range("SomeRange").Resize(10, 20) = A
'A can be any data type but MUST be two dimensional even if you are only writing one
'column or row - first dimension is used for rows, and the second for columns
'this can be slow - see third question below for workaround..

I wouldn't worry about it. The memory hogs are workbook variables and large open files. However, a 10x20 range array of Longs will be (3200 + 200 times the length of a Long type) bytes in memory. But arrays are fast! They actually only contain pointers to memory addresses; that 10x20 array takes 3200 bytes to store all the pointers.

As with any other variable declaration, unless you specify a data type for the array, the data type of the elements in a declared array is Variant. Each numeric Variant element of the array uses 16 bytes. Each string Variant element uses 22 bytes. To write code that is as compact as possible, explicitly declare your arrays to be of a data type other than Variant.

The following lines of code compare the size of several arrays:

Code:
' Integer array uses 22 bytes (11 elements * 2 bytes).
ReDim MyIntegerArray(10) As Integer
' Double-precision array uses 88 bytes (11 elements * 8 bytes).
ReDim MyDoubleArray(10) As Double
' Variant array uses at least 176 bytes (11 elements * 16 bytes).
ReDim MyVariantArray(10)
' Integer array uses 100 * 100 * 2 bytes (20,000 bytes).
ReDim MyIntegerArray(99, 99) As Integer
' Double-precision array uses 100 * 100 * 8 bytes (80,000 bytes).
ReDim MyDoubleArray(99, 99) As Double
' Variant array uses at least 160,000 bytes (100 * 100 * 16 bytes).
ReDim MyVariantArray(99, 99)
 
Upvote 0
Sam - Thanks. I'm working on this right now.

I have a maximum of 200 files with approximately 30,000 (and change) rows of values in a single column on each file. So one column, 30,000 rows per file, 200 files. Each is accessed over and over again at different times. I see two ways of doing this:

1) Use a 200x30,000 array. <- too big, would lock me up probably?

2) Use 200 individual arrays and maybe store them in an array of arrays 200 elements long?

What do you suggest? What would be the memory usage on those?

If I follow the math in your last post, it would be 3200 + 200 x 30,000 = 6,003,200 in both #1 and #2 above. What does the 6 million mean to me? I used a byte-to-GB converter online and that's only about 0.006 of a GB of memory usage. Is that correct? If that is correct (seems low), I can close all the open files, use the array(s), and save some memory.

There is a 3rd possibility, which will be much slower I think:
3) I can load up to 5 arrays, get rid of one, reuse the remaining 4, and load a new one every 30,000 iterations.

Thanks!
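
Option #2 (an array of arrays, i.e. a jagged array) is straightforward in VBA because a Variant element can hold a whole array. As a sketch, with made-up file paths and a fixed 30,000-row column:
Code:
Dim AllFiles(1 To 200) As Variant 'each element will hold one file's column
Dim i As Long, v As Variant

'Load each file's column once, then close the file to drop the I/O cost
For i = 1 To 200
    With Workbooks.Open("C:\Data\File" & i & ".xls", ReadOnly:=True)
        AllFiles(i) = .Worksheets(1).Range("A1:A30000").Value
        .Close SaveChanges:=False
    End With
Next i

'Later: read file 57, row 12345 (each element is a 30000x1 array)
v = AllFiles(57)(12345, 1)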
 
