I have an application which outputs a comma-delimited file with automobile
data for posting on a Web site, and I am having problems with speed.
There is a table of vehicles with about 10K records belonging to about 50
dealers. Then there are two related tables which are joined to provide a
list of options for each vehicle. Since there is usually a long list of
options, the joined tables have about 600K records.
Since I need to do a lot of data manipulation, I open a recordset on the auto
table and step through it, one vehicle at a time. For each vehicle, I then
need to retrieve the text for each option, e.g. Tinted Glass, Power
Steering, CD Player, etc.
First, I tried opening a recordset for each vehicle, selected by the VIN,
then stepping through it and concatenating the options into a single
comma-separated string for the Options column in the text file. This proved
very slow: a typical run took 3 to 4 hours. For debugging purposes I use an
Access 2000 database on a 1.7 GHz Win XP machine with 256 MB of memory; the
live data feed will run on SQL Server, but I was shocked by the slow speed.
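For reference, here is a minimal sketch of that per-vehicle approach, using Python with an in-memory SQLite database as a stand-in for ADO and Access; the table and column names (Vehicles, VehicleOptions, Options, VIN, OptionText) are assumptions, not the actual schema:

```python
import sqlite3

# Stand-in schema with a few sample rows; real names are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Vehicles (VIN TEXT PRIMARY KEY, Make TEXT);
    CREATE TABLE VehicleOptions (VIN TEXT, OptionID INTEGER);
    CREATE TABLE Options (OptionID INTEGER PRIMARY KEY, OptionText TEXT);
""")
conn.executemany("INSERT INTO Vehicles VALUES (?, ?)",
                 [("VIN001", "Ford"), ("VIN002", "Honda")])
conn.executemany("INSERT INTO Options VALUES (?, ?)",
                 [(1, "Tinted Glass"), (2, "Power Steering"), (3, "CD Player")])
conn.executemany("INSERT INTO VehicleOptions VALUES (?, ?)",
                 [("VIN001", 1), ("VIN001", 3), ("VIN002", 2)])

rows = []
for vin, make in conn.execute("SELECT VIN, Make FROM Vehicles ORDER BY VIN"):
    # One query per vehicle: on the real data this is ~10K separate
    # round trips, which is where the hours go.
    opts = [text for (text,) in conn.execute(
        "SELECT o.OptionText FROM VehicleOptions vo "
        "JOIN Options o ON o.OptionID = vo.OptionID "
        "WHERE vo.VIN = ? ORDER BY o.OptionText", (vin,))]
    rows.append((vin, make, ", ".join(opts)))
```

Each element of `rows` is then one line of the comma-delimited output file, with the options already collapsed into a single string.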
Then I tried opening a single recordset on the entire options table, about
600K records, and using Find to pick out the records for each vehicle; the
time dropped to about 25 minutes!
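A rough analogue of that second approach (again Python as a stand-in, names assumed): fetch the whole joined options list once, then do a sequential scan per vehicle, much as ADO's Find does, so there is no per-vehicle recordset open/close:

```python
# Assumed to be the full 600K-row join, fetched once up front.
all_options = [
    ("VIN001", "Tinted Glass"),
    ("VIN002", "Power Steering"),
    ("VIN001", "CD Player"),
]

def options_for(vin):
    # Sequential scan, like ADO's Find: no per-vehicle query,
    # but still a pass over the rows for every vehicle looked up.
    return ", ".join(sorted(text for v, text in all_options if v == vin))

vin001_options = options_for("VIN001")
```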
I have noticed that opening and closing ADODB recordsets seems to be slow, but
I would have thought searching through 600K records would take some time too.
Any comments? Any suggestions for a better way of doing this? I try to do as
much as I can with SQL, because it seems to be the fastest way of retrieving
data, but I can't see any way of getting the options data into a single
string with SQL. Any help would be appreciated.
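To make the goal concrete, one pattern that touches the 600K rows exactly once is to sort the joined options by VIN and concatenate in a single control-break pass. A minimal sketch (Python with SQLite as a stand-in; table and column names are assumptions):

```python
import sqlite3
from itertools import groupby

conn = sqlite3.connect(":memory:")
# Stand-in for the 600K-row join of the two options tables.
conn.executescript("""
    CREATE TABLE VehicleOptions (VIN TEXT, OptionText TEXT);
    INSERT INTO VehicleOptions VALUES
        ('VIN002', 'Power Steering'),
        ('VIN001', 'Tinted Glass'),
        ('VIN001', 'CD Player');
""")

# One ordered query over the whole join; groupby then collapses
# consecutive rows for the same VIN into one comma-separated string.
cur = conn.execute(
    "SELECT VIN, OptionText FROM VehicleOptions ORDER BY VIN, OptionText")
option_strings = {vin: ", ".join(text for _, text in grp)
                  for vin, grp in groupby(cur, key=lambda row: row[0])}
```

The resulting dictionary maps each VIN to its finished Options string, ready to be merged with the vehicle data when writing the output file.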