Definitely. Union queries aren't the most efficient things.
One option is to do the following at run-time:
Instead of using Union Queries, have Append Queries write the data from each linked table to a "Temporary" table, and base your final results on that. Just have your process delete all records in that temporary table at the start of each run. If you can add indexes to that table, it could make things more efficient too.
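A minimal sketch of that approach from an Access VBA module. The table names (tblTempData, and the linked tables Linked2019/Linked2020) are hypothetical placeholders - substitute your own:

```vba
' Rebuild the temporary table from each linked source in turn.
' Assumes tblTempData is a local Access table (which you can index)
' with the same columns as the linked tables.
Public Sub RebuildTempTable()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Clear out the previous run's rows first
    db.Execute "DELETE FROM tblTempData;", dbFailOnError

    ' Append from each linked table - no UNION query needed
    db.Execute "INSERT INTO tblTempData SELECT * FROM Linked2019;", dbFailOnError
    db.Execute "INSERT INTO tblTempData SELECT * FROM Linked2020;", dbFailOnError
End Sub
```

Your final queries then run against tblTempData, where Access can actually use the indexes.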
Just to add my 2 cents' worth. In my current position I inherited a database that had a number of tables linked to Excel worksheets. One of them was a customer mapping that mapped customer number to a customer group/location/etc. Even though there were only about 20,000 records in that Excel file, the performance was awful - any query that included that table took anywhere from 30 seconds to 5 minutes to run, and I knew that wasn't right. After a bit of playing around I converted the customer table to a native Access table with appropriate indexes added, and the performance improvement was MASSIVE - 10 to 100 times faster for some queries.
So the moral of my story is: don't link to Excel worksheets - use Access tables instead. If you still need to maintain an Excel version of your tables, have Access do it for you - maintain the table in Access through a form, and once that's done you can open the Excel file and refresh it with the latest data (all automated, obviously). That's what I've ended up doing here and it works almost flawlessly (the only problem occurs if someone else has the Excel file open in read-write mode when I try to update it).
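One way to automate that refresh is to export the Access table back out to the workbook. A sketch, assuming a table and path of your own (tblCustomers and C:\Data\Customers.xlsx here are made up):

```vba
' Overwrite the Excel copy with the current contents of the Access table.
' This fails if someone has the workbook open read-write, so trap that.
Public Sub ExportCustomersToExcel()
    On Error GoTo ExportFailed
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
        "tblCustomers", "C:\Data\Customers.xlsx", True
    Exit Sub
ExportFailed:
    MsgBox "Could not update the Excel file - is it open elsewhere?", vbExclamation
End Sub
```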
The linked tables (6 in total) contain a year's worth of data, which often stretches to some 100K rows. Would having this stored within the database make it too big and thus cause more problems?
Well, having 100K+ rows in an Access table is going to be more efficient than linking to an Excel file containing the same data. The problem is the extra layer of communication (via ODBC) when you link to Excel, which adds time to every query. I've got a database that works fine with tables of 1.5 million records, so 100,000 should be OK.
Another two cents' worth, regarding the Excel file being open in read-write mode: I had something rather similar once. VBA in the Excel file created, on file closure, a copy of the required data in a dedicated subdirectory; the Access file then queried the data file in that subdirectory. No one ever went into the subdirectory, so I could rely on the file being available for querying.
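The "snapshot on close" idea can be sketched in the Excel file's ThisWorkbook module like this (the subfolder name "Snapshot" is made up, and the folder must already exist):

```vba
' Runs just before the workbook closes: write a copy into a subfolder
' that only Access reads. SaveCopyAs leaves the open workbook untouched,
' so Access never hits the read-write lock on the live file.
Private Sub Workbook_BeforeClose(Cancel As Boolean)
    Dim snapPath As String
    snapPath = Me.Path & "\Snapshot\" & Me.Name
    Me.SaveCopyAs snapPath
End Sub
```

Point the Access linked table (or import routine) at the copy in the Snapshot folder rather than the live workbook.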