ADODB performance: Excel VBA vs VBScript

RAYLWARD102

Well-known Member
Joined
May 27, 2010
Messages
529
I have a rather large query, containing a table join, that returns approximately 80k rows; executing this query via Excel VBA results in a minute or longer of processing. I don't have the same speed issue when returning 5-10,000 rows in Excel; just the query that returns 80k rows. I replicated my VBA ADODB connections in a VBScript and wonder why the VBScript can return the same 80k rows in about 3 seconds. Big difference in timing: 1 minute vs 3 seconds. I assure you, I'm invoking ADODB the same way in VBA and VBScript; I promise it has nothing to do with how I coded it. Any ideas as to why this is?
 

Based on the limited information: no idea.
3 seconds sounds like the right ballpark for 80,000 records from multiple tables; a minute certainly does not.
Can you check it on someone else's computer?
 
How are you measuring when the data has returned? Please post your code for both versions.
 
I was about to craft code examples when I discovered the exact bottleneck. I use the ADODB GetRows method in both Excel and VBScript. Directly after GetRows (in both), I create an empty 2D array with one extra row, populate the first row with the column names (a header for my 2D array), then loop through the GetRows array to transfer it into the 2D array with headers. The VBScript is much quicker at the transfer than Excel.

Here is the Excel code for transferring the GetRows array into an array with headers:
Code:
        ReDim FormattedDataSet(0 To UBound(QueriedFields), 0 To 0)
        For y = 0 To UBound(QueriedFields)
            FormattedDataSet(y, 0) = QueriedFields(y)
        Next
        For x = 0 To UBound(DataSet, 2)
            ReDim Preserve FormattedDataSet(0 To UBound(FormattedDataSet, 1), 0 To (UBound(FormattedDataSet, 2) + 1))
            For y = 0 To UBound(QueriedFields)
                If Not IsNull(DataSet(y, x)) Then
                    FormattedDataSet(y, UBound(FormattedDataSet, 2)) = DataSet(y, x)
                End If
            Next
        Next

Here is the VBScript code for transferring the GetRows array into an array with headers:
Code:
            ReDim A(UBound(Temp), (UBound(V, 2) + 1))
            For x = 0 To UBound(Temp)
                A(x, 0) = Temp(x)
            Next
            For x = 0 To UBound(V, 2)
                i = x + 1
                For y = 0 To UBound(V, 1)
                    If IsNull(V(y, x)) = False Then
                        A(y, i) = V(y, x)
                    Else:
                        A(y, i) = ""
                    End If
                Next
            Next


As you can see: same logic, but Excel takes nearly 45-55 seconds longer than VBScript.
If I skip the transfer into the array with headers, they both perform at about the same speed (getting query results).
Also: I did try from different computers; same result. I even put the Access database on a different server to see if it made any difference; same all around. I no longer believe this to be a problem related to ADODB, but I want to understand why the Excel For loops are so much slower than my VBScript For loops.
 
I may have just answered my own question; Excel is likely running slower due to the ReDim Preserve statement at every row. I will try it and get back.
 
That was the difference; I shouldn't have ReDim Preserved every line. ADODB performance is the same between Excel and VBScript.
Anyone know a better way to turn my headerless array into an array with headers, or is this about all we can do for efficiency?
 
Probably. ReDim Preserve is very expensive - it releases and rebuilds the whole array (allocating new memory and copying every element) each time it is called.
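Since GetRows reports its dimensions up front, the destination array can be sized once before the loop instead of being grown per record. A minimal sketch, reusing the variable names from the code above (DataSet from GetRows, QueriedFields holding the column names); the rest is illustrative:
Code:
        Dim x As Long, y As Long
        Dim FormattedDataSet() As Variant
        ' Size once: same first dimension, one extra slot in the
        ' second dimension for the header "row"
        ReDim FormattedDataSet(0 To UBound(DataSet, 1), 0 To UBound(DataSet, 2) + 1)
        For y = 0 To UBound(QueriedFields)
            FormattedDataSet(y, 0) = QueriedFields(y)
        Next
        For x = 0 To UBound(DataSet, 2)
            For y = 0 To UBound(DataSet, 1)
                If Not IsNull(DataSet(y, x)) Then
                    FormattedDataSet(y, x + 1) = DataSet(y, x)
                End If
            Next
        Next

That replaces tens of thousands of allocate-and-copy operations with a single allocation.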
 
Maybe explain what you're doing. What is the objective? Why make the 2D array?
If we understand what is required, there may be a totally different approach.

PS. E.g. leave the data in a recordset, or clone it, or make a disconnected recordset, or use CopyFromRecordset (and then separately load the headers).
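If the destination is ultimately a worksheet, the CopyFromRecordset route avoids the array shuffle entirely. A minimal sketch, assuming an open ADODB Recordset in a variable named rs that has not yet been read to EOF (the sheet and cell references are illustrative):
Code:
        Dim f As Long
        With Sheet1
            ' Write the headers from the recordset's own field names
            For f = 0 To rs.Fields.Count - 1
                .Cells(1, f + 1).Value = rs.Fields(f).Name
            Next
            ' Then let Excel pull all the data in one shot below the header row
            .Range("A2").CopyFromRecordset rs
        End With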
 
Why do you need an array with a header?
 
I like keeping an Access database cached in memory for quick finds and manipulations; looping through a recordset is far slower than looping an array.
Because I have many tables loaded into memory, it's easier for me to find the correct data when I have a header included. This is why I add headers to my arrays.
 

