Querying and cursors on large databases returning large result sets
Hi,
I'm busy looking into changing a query console that performs queries on a
live system. Obviously, there are issues involved when doing this, and I'd
just like some input as to what other people out there do.
The database I will be testing with is quite big (I haven't restored it yet,
but I've been told it's in the region of 25 GB). Anyway, the current
consoles have been an issue for a while. A user enters search criteria
for the transactions he wants, at which point the console runs the
query. The query can be quite intensive on the live system, causing serious
degradation in its performance. I was wondering what people out there
with this kind of experience do.
Some of the questions I have are:
* How do search engines retrieve results so quickly, and know more or less
how many results they have? An example is searching for "dynamic
cursor" on the Google search engine. It can tell you that it took 0.13
seconds to find 99,400 results. How does it know this so quickly? And how
does it know how many results are available (even if it is an estimate),
unless the entire query was performed? Creating any kind of cursor
(client/server) for that would take a lot of memory.
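For what it's worth, one common explanation (an assumption on my part, not a description of Google's actual internals) is that an inverted index already knows the posting-list length of each term, so a hit count for a multi-term query can be estimated from term selectivities without ever materializing the result set. A minimal sketch, with a toy index:

```python
# Toy inverted index: each term maps to the set of document ids containing it.
# The per-term counts are known up front, so estimating a combined hit count
# is arithmetic, not a scan of all matching documents.
index = {
    "dynamic": {1, 2, 3, 5, 8, 9},   # 6 documents contain "dynamic"
    "cursor":  {2, 3, 4, 8},         # 4 documents contain "cursor"
}
total_docs = 10

def estimated_hits(terms):
    # Naive independence assumption: term selectivities multiply.
    est = total_docs
    for t in terms:
        est *= len(index[t]) / total_docs
    return round(est)

print(estimated_hits(["dynamic", "cursor"]))  # cheap estimate, no full query
```

That is also why the number is only an estimate: the independence assumption is wrong for correlated terms, and the engine never bothers to find out the exact count.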
* The current consoles, I think, populate a grid with all the results, so if
there are a few hundred thousand results, all of them have to move
across the network. A solution would be to use a server
cursor instead. There are the keyset and dynamic options. A keyset cursor
initially takes a long time to get all the results, a lot of memory is
used on the server, and it takes space in tempdb. A dynamic cursor seems more
responsive initially and uses less memory, because it doesn't determine the
whole result set. However, the documentation seems to indicate that subsequent
fetches are resource intensive. How much of an issue is this? Would you
suggest using a dynamic cursor?
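To make the grid point concrete, here is a minimal sketch of fetching in batches on demand rather than pulling the whole result set into the grid at once, which is roughly the behavior a server-side forward-only cursor gives you. The table and column names are invented for illustration, and sqlite3 stands in for the real database driver:

```python
import sqlite3

# Invented example table standing in for the live transaction table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO tx (amount) VALUES (?)",
                 [(i * 1.5,) for i in range(1000)])

cur = conn.execute("SELECT id, amount FROM tx ORDER BY id")
fetched = 0
batch = cur.fetchmany(100)       # only 100 rows cross the wire at a time
while batch:
    fetched += len(batch)        # here: render the batch in the grid,
    batch = cur.fetchmany(100)   # fetching more only when the user scrolls
print(fetched)  # 1000
```

Only the rows the user actually scrolls to need to be transferred, which sidesteps the few-hundred-thousand-row network hit regardless of which cursor type sits behind it.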
* Obviously another solution is to never select more than a certain number
of records (doing a SELECT TOP n). Is this what anyone else out there does?
This is not always what is wanted, since someone might want to see more than
n results (although this value could be configurable).
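One refinement of the TOP n idea (my own sketch, not something from the docs) is to ask for n + 1 rows, so the console can at least tell the user that more results exist beyond the cap. Names below are illustrative, with sqlite3's LIMIT playing the role of TOP:

```python
import sqlite3

# Invented example table; 500 matching rows, but we cap the display at 100.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO tx (id) VALUES (?)", [(i,) for i in range(500)])

n = 100
# Request one extra row purely as a "more results exist" probe.
rows = conn.execute("SELECT id FROM tx ORDER BY id LIMIT ?", (n + 1,)).fetchall()
has_more = len(rows) > n
rows = rows[:n]
print(len(rows), has_more)  # 100 True
```

The user still only sees n rows, but the UI can show "more than n results; refine your criteria" instead of silently truncating.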
* Yet another option would be to implement paging yourself (although the
commands would be specific in each case - I mean this is not a generic
solution for applications). How does this affect the query optimizer, since
the SQL statement changes all the time?
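On the optimizer question: if the paging is done with a parameterized keyset (seek) predicate, the statement text never changes, only the parameter does, so the server can reuse the cached plan instead of compiling a fresh ad-hoc statement per page. A hedged sketch of that pattern, with invented names and sqlite3 standing in for the real server:

```python
import sqlite3

# Invented example table with 250 matching rows, paged 100 at a time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO tx VALUES (?, ?)",
                 [(i, i * 2.0) for i in range(1, 251)])

PAGE = 100
last_id = 0
pages = []
while True:
    # Same SQL text every time; only the bound parameter changes,
    # so the optimizer sees one statement, not one per page.
    rows = conn.execute(
        "SELECT id, amount FROM tx WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, PAGE)).fetchall()
    if not rows:
        break
    pages.append(rows)
    last_id = rows[-1][0]   # seek key for the next page
print([len(p) for p in pages])  # [100, 100, 50]
```

The seek predicate also avoids the cost of skipping over earlier pages, which an OFFSET-style approach would pay on every fetch.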
* When having a query console, do you ensure that all search criteria are on
indexed columns? Or do you allow queries against columns that are not indexed
(which results in a table scan)?
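One way a console could enforce this (a sketch of the idea, assuming invented table and column names) is to inspect the plan for the user's criteria before running the real query and warn when it would scan. SQLite's EXPLAIN QUERY PLAN is used here for illustration; on SQL Server the equivalent check would go through its own showplan facilities:

```python
import sqlite3

# Invented schema: "ref" is indexed, "note" is not.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY, ref TEXT, note TEXT)")
conn.execute("CREATE INDEX ix_ref ON tx (ref)")

def uses_scan(where_sql, params):
    # Ask the engine how it would execute the query, without running it.
    plan = conn.execute(
        f"EXPLAIN QUERY PLAN SELECT * FROM tx WHERE {where_sql}",
        params).fetchall()
    return any("SCAN" in row[-1] for row in plan)

print(uses_scan("ref = ?", ("abc",)))   # index search, no scan
print(uses_scan("note = ?", ("abc",)))  # unindexed column: full table scan
```

The console could then either refuse the unindexed criteria outright or warn the user that the query will be expensive on the live system.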
thanks,
malcolm