Could someone shed some light on this, please?
I have a VB5 app that uses ADODB connections and recordsets.
On a timer, it creates an ADODB Connection and Recordset, runs a query,
and then closes the recordset and connection. It also sets the local
variables for both the Connection object and the Recordset object to Nothing.
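For reference, the timer routine is essentially the following sketch (the control name, connection string, and query are made up; the actual code differs):

```vb
' Rough outline of what runs on each timer tick.
Private Sub Timer1_Timer()
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "DSN=MyData"              ' connection string is hypothetical

    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM MyTable", cn

    ' ... read the results ...

    ' Clean up: close both objects and release the local references.
    rs.Close
    cn.Close
    Set rs = Nothing
    Set cn = Nothing
End Sub
```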
If I leave it running for a few hours, the Windows NT Performance Monitor
shows the app using more and more memory. It starts at about 3500K and
climbs steadily until either the system runs out of virtual memory OR,
and get this, I MINIMIZE the application! Once minimized, the memory
usage drops back to about 3500K again!
Any ideas would be greatly appreciated.
Thanx in advance.