We are developing a telecoms application which requires very fast
access to a single table. This table contains about 3.5 million rows
of about 750 bytes each.
There is a definite profile to the usage of these records... some are
accessed hundreds of times more often than others. To get the best
performance from this table, we have allocated an SGA of about 3/4 GB
in which to keep the most frequently used rows.
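A quick back-of-envelope check on those figures (illustrative only; it uses raw row size and ignores Oracle block overhead, PCTFREE, indexes, and the non-buffer-cache portions of the SGA):

```python
# Rough sizing from the numbers quoted above.
rows = 3_500_000
row_bytes = 750

table_gb = rows * row_bytes / (1024 ** 3)   # raw data volume only
cache_gb = 0.75                             # the ~3/4 GB SGA mentioned
fraction_cached = cache_gb / table_gb       # upper bound on what fits

print(f"Table size: ~{table_gb:.2f} GB")
print(f"Cache holds at most ~{fraction_cached:.0%} of the table")
```

So at best somewhere under a third of the table can be resident at once, which is why the caching policy matters so much here.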
What I'm not sure about is how Oracle decides which rows to keep in
memory, as obviously we can't fit all of them in. Should I enforce
cost-based optimisation, or does that only decide on the best way to
join tables together? In other words, what can the optimiser do with
the statistics it saves?
Ex-Ingres DBA!