Result Sets vs copying result sets into memory.

Post by AB » Fri, 12 Apr 2002 00:39:07

Does anyone see a problem with copying a result set into a collection
for display vs looping through the result set directly for display, so
I can return the connection back to the database a little faster? I am
using connection pooling, so there is no connection overhead in
creating a new query.
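
Here's roughly what I mean, as a sketch (the customers table, the
columns, and the class/method names are all made up):

import java.sql.*;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;

public class CustomerDao {
    private final DataSource pool; // pooled DataSource from the app server

    public CustomerDao(DataSource pool) {
        this.pool = pool;
    }

    // Copy the rows into a plain List so the connection can go back to
    // the pool before any display work starts.
    public List<String[]> loadCustomers() throws SQLException {
        List<String[]> rows = new ArrayList<>();
        try (Connection con = pool.getConnection();
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM customers")) {
            while (rs.next()) {
                rows.add(new String[] { rs.getString(1), rs.getString(2) });
            }
        } // close() here is what hands the connection back to the pool
        return rows;
    }
}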

The only performance issue I can see is that each result set will take
up memory, but after it's displayed, that memory will be available for
something else, right?  I'm in a situation where up to 500 people can
hit a web page for data at the same time.

I'm starting to run out of database connections on the web application
and a bigger database isn't an option for me. I can (and do) cache
persistent non-changing data, but have been told that queries for
changing or frequently updated data must always go straight to the
database.

Some of the result sets are over 10,000 records long and there's no
way to split them. Through SQL, though, I am able to return only rows
X through X+20, so I only return what I am going to show.
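
For example, something like this (again just a sketch; LIMIT/OFFSET is
MySQL/PostgreSQL-style syntax, and other databases spell row limiting
differently):

import java.sql.*;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;

public class CustomerPageDao {
    private final DataSource pool;

    public CustomerPageDao(DataSource pool) {
        this.pool = pool;
    }

    // Return only rows firstRow .. firstRow+pageSize-1, so at most one
    // page of data ever crosses the wire.
    public List<String[]> loadPage(int firstRow, int pageSize)
            throws SQLException {
        String sql = "SELECT id, name FROM customers"
                   + " ORDER BY id LIMIT ? OFFSET ?";
        List<String[]> page = new ArrayList<>();
        try (Connection con = pool.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setInt(1, pageSize);   // how many rows to show
            ps.setInt(2, firstRow);   // X, the first row of the page
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    page.add(new String[] { rs.getString(1), rs.getString(2) });
                }
            }
        }
        return page;
    }
}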

Any POSITIVE feedback would be appreciated.

Thanks in advance,

-- ABS

Post by Joseph P. Larso » Fri, 12 Apr 2002 06:58:38

Quote:> I'm starting to run out of database connections on the web application
> and a bigger database isn't an option for me. I can (and do) cache
> persistent non-changing data, but have been told that queries for
> changing or frequently updated data must always go straight to the
> database.

I've never heard this "rule" you've been told.  My standard procedure
has always been to use some sort of surrogate for ALL my database
access.  My SQL queries always return a vector of objects -- I never
deal with raw rows.  I have even gone so far as to write code generators
that read a schema file and generate all the database code -- for
returning surrogates.

If I'm going to do processing on the data and then throw the rows away,
the load routines include a callback mechanism for each row returned.
I have the callback method return a boolean indicating whether the row
should be inserted into the returned vector.
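
A rough sketch of the callback idea (the real generated code looks
different, and every name here is invented):

import java.sql.*;
import java.util.ArrayList;
import java.util.List;

// Decides, per row, whether the mapped object goes into the returned list.
interface RowCallback<T> {
    boolean accept(T row);
}

class NameLoader {
    // Maps each row to an object and keeps it only if the callback says to.
    static List<String> loadNames(Connection con, RowCallback<String> cb)
            throws SQLException {
        List<String> out = new ArrayList<>();
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT name FROM customers")) {
            while (rs.next()) {
                String name = rs.getString(1);
                if (cb.accept(name)) { // true means "insert into the result"
                    out.add(name);
                }
            }
        }
        return out;
    }
}

So loadNames(con, n -> n.startsWith("A")) would touch every row but
only keep the ones starting with "A".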

In other words -- what you said you're thinking about doing has been
my standard operating procedure for nearly 10 years.  Not only do
I think it's acceptable -- I think it's the cleanest, most object-oriented
way to deal with the database.

But then, I'm not the sort who uses hash tables except in rare situations.

-Joe

Post by AV » Fri, 12 Apr 2002 08:59:12

Hello ABS,


Quote:> Does anyone see a problem with copying a result set into a collection
> for display vs looping through the result set directly for display, so
> I can return the connection back to the database a little faster? I am
> using connection pooling, so there is no connection overhead in
> creating a new query.

Moving the data from the result set into an array (or Collection) is
better from the so-called layer-separation point of view. The
presentation layer should [preferably] be free of any java.sql.*
objects.
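
For example, with a hypothetical Customer bean -- only the data layer
touches the ResultSet, and the JSP/servlet layer sees plain objects:

import java.sql.ResultSet;
import java.sql.SQLException;

// Plain bean handed to the presentation layer; no java.sql.* leaks out.
public class Customer {
    private final int id;
    private final String name;

    public Customer(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() { return id; }
    public String getName() { return name; }

    // Lives in the data layer, so the ResultSet itself never escapes it.
    static Customer fromRow(ResultSet rs) throws SQLException {
        return new Customer(rs.getInt("id"), rs.getString("name"));
    }
}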

Quote:> The only performance issue I can see is that each result set will take
> up memory, but after it's displayed, that memory will be available for
> something else, right?  I'm in a situation where up to 500 people can
> hit a web page for data at the same time.

A big collection will also take up memory, though. Thus, much effort
must be applied to reducing the result set size with a proper SQL
query.

Quote:> I'm starting to run out of database connections on the web application
> and a bigger database isn't an option for me. I can (and do) cache
> persistent non-changing data, but have been told that queries for
> changing or frequently updated data must always go straight to the
> database.

If you do not hold connections (e.g. in a session object) and your
result sets are already "SQL-limited", then the connection pool is
simply your natural bottleneck.

Quote:> Some of the result sets are over 10,000 records long and there's no
> way to split them. Through SQL, though, I am able to return only rows
> X through X+20, so I only return what I am going to show.

Putting limits in SQL is one of the best ways. Fetching 100,000
records to show only 20 is definitely wrong. It is not so hard to
implement so-called result paging in SQL. If the data changes very
quickly, there will be some possibility of inconsistency...
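
One way to soften that inconsistency (my own suggestion, and just a
sketch with made-up names) is keyset paging: instead of OFFSET,
remember the last key shown and seek past it, so inserts and deletes
in earlier pages cannot shift the current page:

import java.sql.*;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;

public class KeysetPager {
    private final DataSource pool;

    public KeysetPager(DataSource pool) {
        this.pool = pool;
    }

    // Fetch the page after lastSeenId; unlike OFFSET paging, rows added
    // or removed before that id cannot shift this page.
    public List<String[]> nextPage(int lastSeenId, int pageSize)
            throws SQLException {
        String sql = "SELECT id, name FROM customers"
                   + " WHERE id > ? ORDER BY id LIMIT ?";
        List<String[]> page = new ArrayList<>();
        try (Connection con = pool.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setInt(1, lastSeenId);
            ps.setInt(2, pageSize);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    page.add(new String[] { rs.getString(1), rs.getString(2) });
                }
            }
        }
        return page;
    }
}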

AlexV

Quote:> Any POSITIVE feedback would be appreciated.
> Thanks in advance,

> -- ABS