Objective: Load only the table data from http://www.mdb.uscourts.gov/unclaimed_funds_results?foo=a&bar=1 (notice no "page=" is specified) and then from http://www.mdb.uscourts.gov/unclaimed_funds_results?page=1&foo=a&bar=1. I want to be able to specify the number of pages for the second URL form; in this case there are 128 pages. The only variable that changes is "page=x" (where x is the page number). I would like everything imported into a single worksheet, with the number of pages as an input. I then want to run this daily, comparing this worksheet to a new worksheet (the next day's query), highlighting rows that are new in the next day's query in one color and rows that have been removed since the previous day's query in a different color.
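To make the URL pattern concrete: the first request has no "page=" parameter, and every subsequent request is the same URL with "page=x" added. Here is a minimal sketch of building that URL list, written in Python rather than Excel VBA purely to illustrate the loop (it assumes the unnumbered URL counts as the first of the 128 pages, so page=1 through page=127 follow):

```python
def page_urls(base, query, pages):
    """Build the full URL sequence for a paginated result set.

    The first request carries no 'page=' parameter; the remaining
    pages add 'page=x' for x = 1 .. pages-1 (assumption: the
    unnumbered URL is page 1 of the total count).
    """
    urls = [f"{base}?{query}"]  # unnumbered first page
    for x in range(1, pages):
        urls.append(f"{base}?page={x}&{query}")
    return urls

urls = page_urls(
    "http://www.mdb.uscourts.gov/unclaimed_funds_results",
    "foo=a&bar=1",
    128,  # 128 pages in this case; meant to be user-specified
)
```

The same loop translates directly to a VBA For loop that feeds each URL to a web query or HTTP request in turn.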
I am clearly in over my head. I searched the internet and various forums (this seems to be the best one). I figured someone must have wanted to import data from a website that returns a large amount of data but only allows visitors to view 20 results per page.
I started with the Excel web query tool. Problem #1 is that I cannot get it to bring in just the table; it brings in the entire page. It appears all I should have to do is place the arrow on the table I want to import, but I cannot move the arrow. I am not sure if this is due to the design of the website or something I just do not understand. NOTE: When I look at the page's HTML source, the data I want is marked up as a table. The site was built on a CMS (Drupal 7).
At the moment, I am having difficulty importing both URL forms and applying the page range to just the second one so that the rest of the data lands in one worksheet. I am also not sure how to deal with the extra rows I am importing (i.e., the rest of the page rather than the table only).
I have a decent compare macro that will compare the two pages and highlight the difference in color. I could call this separately once I get the data into worksheets.
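For clarity, the comparison I am after boils down to two set differences: new rows = today's rows minus yesterday's, removed rows = yesterday's rows minus today's. A minimal sketch of that logic (in Python rather than VBA, with made-up row tuples, purely to illustrate what my compare macro does before it applies the colors):

```python
def diff_rows(previous_day, next_day):
    """Return (added, removed) between two daily snapshots.

    Each snapshot is a collection of rows, where a row is a tuple
    of the table's cell values (hypothetical columns shown below).
    """
    prev, curr = set(previous_day), set(next_day)
    added = curr - prev    # rows new in the next day's query
    removed = prev - curr  # rows gone from the previous day's query
    return added, removed

# Hypothetical example rows, not real data from the site:
yesterday = [("Smith", "12-00001", "$100.00"),
             ("Jones", "12-00002", "$50.00")]
today = [("Smith", "12-00001", "$100.00"),
         ("Brown", "12-00003", "$75.00")]

added, removed = diff_rows(yesterday, today)
```

In Excel terms, `added` would be highlighted in one color on the new worksheet and `removed` in another color on the old one.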
I would appreciate some direction and/or specifics on the best method for importing all this data cleanly. I have been working on this for 6 hours (including research) without any real progress beyond being able to import via the Web Query tool. I would really appreciate some assistance.
Thomas99