I cannot see anything wrong with what you are doing, but then I am no MSOLAP
guru! That said, 256MB of memory does seem rather low for your server.
Precalculating the results and saving them back into a table is an idea, but
remember that MSOLAP will only take data into a cube at the lowest level. Since
you have aggregated dimensions, you would have to bring the data into several
cubes at differing levels and combine them in a virtual cube. Seems
very messy to me!
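To make the precalculation idea concrete, here is a rough sketch (Python, with made-up fact rows and column names, not your actual schema) of rolling the view's rows up to one value per day/minute and then deriving the per-day max and average for a summary table:

```python
from collections import defaultdict

# Hypothetical fact rows: (day, minute, product, conc_users).
# In practice these would come from the warehouse view.
facts = [
    ("2000-01-01", 0, "A", 3), ("2000-01-01", 0, "B", 2),
    ("2000-01-01", 1, "A", 5),
    ("2000-01-02", 0, "B", 4), ("2000-01-02", 1, "A", 1),
]

# Sum concurrent users across all non-time dimensions, leaving
# one value per (day, minute) -- the lowest grain of the cube.
per_minute = defaultdict(int)
for day, minute, _product, users in facts:
    per_minute[(day, minute)] += users

# Group the per-minute values by day.
per_day = defaultdict(list)
for (day, _minute), users in per_minute.items():
    per_day[day].append(users)

# Precalculated summary table: per-day max and average over minutes.
summary = {day: (max(v), sum(v) / len(v)) for day, v in per_day.items()}
```

If that table is loaded as its own fact source, the expensive MAX/AVG is paid once at load time rather than per query; the catch, as above, is that you then need a cube (or virtual cube) per aggregation level.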
I urge you to post your two messages to the microsoft.public.sqlserver.olap
newsgroup and, if you are desperate for an answer, to email
As I said before, distinct counts are very important, and the calculation
times you are getting are not acceptable, which does not bode well for my
> Thanks for the response Harsha. I'll put the problem into context.
> We have a data warehouse of web usage data. I'm using MDX here to
> overcome a classic 'semi-additive measure' data warehouse issue because
> I have found the problem cumbersome to solve using standard SQL. The
> measure is Concurrent Users. We have five dimensions, including two
> time dimensions: Calendar Yr By Mth goes from Year down to Day Of
> Month, Hrs and Mins allows drill-down to minute level. I need to
> calculate the average and the maximum concurrent users over both time
> dimensions, whilst summing over all other dimensions. Here are the
> object definitions:
> AvgConcUsers: [Measures].[Conc Users] / Count(Descendants([Calendar Yr
> By Mth].CurrentMember,[Day Of Month])) / Count(Descendants([Hrs and
> Mins].CurrentMember,[Minute]))
> MaxConcUsers: MAX(Crossjoin(Descendants([Hrs and
> Mins].CurrentMember,Levels("Minute")) , Descendants([Calendar Yr By
> Mth].CurrentMember, Levels("[Day Of Month]"))), [Measures].[Conc Users])
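If I read those definitions right, the AvgConcUsers arithmetic works because the aggregated [Conc Users] cell is the sum over all descendant (day, minute) cells, so dividing by the two Descendants counts yields the per-minute mean. A quick sanity check of that identity (plain Python, toy numbers):

```python
# Rows are days, columns are minutes; each entry is the per-minute
# Conc Users value after summing over all non-time dimensions.
values = [[3, 5, 2], [4, 1, 6]]

# The aggregated measure at the (day-level, hour-level) cell is the SUM
# over all descendant (day, minute) cells.
total = sum(sum(row) for row in values)
n_days, n_minutes = len(values), len(values[0])

# AvgConcUsers: measure / Count(days) / Count(minutes) ...
avg = total / n_days / n_minutes

# ... which is exactly the mean over every (day, minute) cell.
flat = [v for row in values for v in row]
assert avg == sum(flat) / len(flat)
```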
> These definitions give the right results. AvgConcUsers is slow to
> calculate but MaxConcUsers (which contains the CROSSJOIN) is
> appallingly slow. The cube is a MOLAP cube and I selected 100%
> performance gain in the storage settings (I have also tested 75% and
> 50%, which were worse). The fact table is actually a view (which may
> impact cube processing time but not cube query time I believe) with
> only 3000 rows. The CROSSJOINed dimensions, i.e. the two time
> dimensions, contain 1826 and 1440 table rows respectively. The server
> is a Dell PowerEdge PIII 450MHz 256MB 9GB SCSI HDD NT Server, the
> client a Dell Precision PIII 450MHz 256MB NT Workstation. During the
> trials I was the only user of the server and there were no other apps
> running on the client.
> Using BrioQuery as the query tool, I repeatedly ran a simple query with
> the objects Month, Day Of Month, Hour, AvgConcUsers and MaxConcUsers.
> Query time was 35 secs; 5 secs if I used only AvgConcUsers. I then
> added the Product object (i.e. now using 3 dimensions) and the query
> time with both measures was 15 min, 1 min with just AvgConcUsers. We
> will consistently need to run queries across all 5 dimensions, hence
> these response times are totally unacceptable.
> This has now become a major problem for us. I don't see any
> hardware/network issues. Can I use a different formula for
> MaxConcUsers? Can I precalculate the results into another cube somehow?
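On the "different formula" question: the maximum over a crossjoin of two sets is the same as a nested maximum taken over each set in turn, which an engine may be able to evaluate without materializing the whole cross product. A quick check of the identity in Python (arbitrary values standing in for [Conc Users]):

```python
import random

random.seed(0)
days = range(5)      # stand-in for Descendants(..., [Day Of Month])
minutes = range(60)  # stand-in for Descendants(..., [Minute])

# A stand-in for [Measures].[Conc Users] at each (day, minute) cell.
value = {(d, m): random.randint(0, 100) for d in days for m in minutes}

# MAX over the full crossjoin of the two time dimensions...
max_crossjoin = max(value[(d, m)] for d in days for m in minutes)

# ...equals a nested MAX: per-day maxima first, then the max of those.
max_nested = max(max(value[(d, m)] for m in minutes) for d in days)

assert max_crossjoin == max_nested
```

Whether MSOLAP's optimizer actually exploits a nested-MAX formulation I cannot say; it would be worth trying and timing both forms.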
> Thanks loads for any assistance.
> > Hello Morley
> > I am about to use the same method for a distinct count problem. Could you
> > tell me what the specs of the machines were, the crossjoined dimension
> > sizes, whether you were using MOLAP/ROLAP/HOLAP and the timings.
> > Also, could you post this to the olap newsgroup as well. I think this is a
> > very important issue.
> > Many Thanks
> > > Hey folks ... I have a rather pressing question.
> > > I have a Plato cube which contains a calculated measure using a
> > > CROSSJOIN function. The calculated measure gives the right results but
> > > takes several minutes to perform the calculation on a very small
> > > dataset. Both the server and the clients being used are high-spec
> > > machines which give superb performance in all other circumstances.
> > > My question is: is there some way of precalculating and storing the
> > > results of this calculated measure, so that it wouldn't have to be
> > > calculated at runtime? Essentially this means building a cube on a
> > > cube, I suppose. Is this possible?
> > > Thanks for any assistance.
> > > Morley.
> > > Sent via Deja.com http://www.deja.com/
> > > Before you buy.