Performance Issues using Excel 2000 Pivot Table functionality and Analysis Services

Post by Lou Habla » Thu, 14 Sep 2000 02:26:16



We're developing an OLAP solution that will provide analysis support to a
larger software app.  We've got quite a few measures (pure and calculated)
and dimensions, and our fact table currently holds 1.6MM rows of data (pared
down from 3.1MM while testing).  We're seeing some huge performance hits
when we try to put this in the context of Excel 2000.  The server has 1 GB of
memory and two 800 MHz CPUs; can anybody shed light on the upper limits of
working with pivot tables in Excel on the hardware described?  Processing and
opening the pivot table take forever!  Besides paring the datasets down to the
bare minimum, what else should we look at regarding performance tuning?

Thanks for taking time to respond to this question.

-Lou

 
 
 

Performance Issues using Excel 2000 Pivot Table functionality and Analysis Services

Post by pc.. » Fri, 15 Sep 2000 04:00:00


I have worked with huge cubes, and this is what I've found.  Watch
Performance Monitor while processing; I'll bet you see memory not spiking
too badly, but the processors getting killed.  Tip: turn off anything that
doesn't need to be running.

If you have calculated measures that involve crossjoins, you're in for a
long show on the process screen: the more crossjoins you have, the slower
it goes.  Tip: can you write the measure another way to eliminate the
crossjoins?  Can you use the Generate command to limit the number of rows
that are brought back to the client?  Remember that calculated measures are
evaluated at query time, not during processing.  Can you narrow your cubes
down?  Can you split them out by month or year?  If you do that for your
fact table, try creating a view of each large dimension that holds only the
data relevant to that fact table.

As far as processing goes, look at the properties of the OLAP server: under
the Processing tab, check the process buffer and read-ahead buffer sizes.
The read-ahead buffer defaults to 4 MB and controls how much data is read
at a time; if you have the memory, increase it to 64 MB.  What's the
process buffer size?  Again, if you have the memory, increase it.  There is
a point of diminishing returns, so play with the numbers to get the most
performance without giving away too much memory.
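The crossjoin point can be made concrete outside MDX.  A small Python
sketch (the dimension sizes and fact data are made up for illustration) of
why a full crossjoin explodes while NonEmpty-style pruning keeps only the
combinations actually backed by fact rows:

```python
import random

random.seed(0)
n_customers, n_skus = 1000, 500
customers = range(n_customers)

# Sparse fact data: each customer bought only ~5 of the 500 SKUs.
facts = {(c, random.randrange(n_skus)) for c in customers for _ in range(5)}

# A full crossjoin materializes every (customer, sku) cell, mostly empty.
full_cells = n_customers * n_skus

# NonEmpty-style pruning keeps only cells backed by actual fact rows.
nonempty_cells = len(facts)

print(full_cells, nonempty_cells)  # 500000 vs. at most 5000
```

The same ratio is why a calculated measure that crossjoins two large
dimensions hurts so much: the engine has to evaluate every combination,
not just the populated ones.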

Peter Cwik


Quote:> We're developing a OLAP solution that will provide analysis support to a
> larger software app.  We've got quite a few measures (pure and calculated)
> and dimensions and our Fact table currently holds 1.6MM rows of data (pared
> down from 3.1MM while testing).  We're seeing some huge performance hits
> when we try and put this in context of Excel 2000.  Server has 1GB memory
> and (2) 800 Mhz CPU's; can anybody shed light on what the upper limits are
> of working with Pivot tables in the context of Excel and the hardware
> described?  Processing and opening of Pivot Table take 4EVER!  Besides
> paring datasets down to bare minimum, what else should we look at re:
> performance tuning, etc?

> Thanks for taking time to respond to this question.

> -Lou

Sent via Deja.com http://www.deja.com/
Before you buy.

 
 
 

Performance Issues using Excel 2000 Pivot Table functionality and Analysis Services

Post by George Spofford » Fri, 15 Sep 2000 04:00:00


How many dimensions? How large are they? MOLAP, ROLAP, or HOLAP? Is the
server taking the hit, the client, or both? Are you using any Excel or
other external functions (as opposed to built-ins)?  Does the hit occur
when creating a query or re-running a stored query? Even 3.1MM rows is
pretty trivial in and of itself, even with a much less powerful server.
Also, what kind of client hardware are you running? And, is there a
client context other than Excel that doesn't cause the same problems?


> We're developing a OLAP solution that will provide analysis support to a
> larger software app.  We've got quite a few measures (pure and calculated)
> and dimensions and our Fact table currently holds 1.6MM rows of data (pared
> down from 3.1MM while testing).  We're seeing some huge performance hits
> when we try and put this in context of Excel 2000.  Server has 1GB memory
> and (2) 800 Mhz CPU's; can anybody shed light on what the upper limits are
> of working with Pivot tables in the context of Excel and the hardware
> described?  Processing and opening of Pivot Table take 4EVER!  Besides
> paring datasets down to bare minimum, what else should we look at re:
> performance tuning, etc?

> Thanks for taking time to respond to this question.

> -Lou

--
George Spofford
DSS Lab

http://www.dsslab.com
 
 
 

Performance Issues using Excel 2000 Pivot Table functionality and Analysis Services

Post by poo.. » Fri, 15 Sep 2000 04:00:00


What is the 'Generate' command that you mention?


Quote:>> Can you use the Generate
>> command to limit the number of rows that are brought back to the
>> client.

Sent via Deja.com http://www.deja.com/
Before you buy.
 
 
 
