My fact table gets updated every day with thousands of records, and I have been increasing the aggregations manually. Is there a way I can increase aggregations (design storage) programmatically? My cube gets processed once a day by a VB program, and I want to increase the aggregations before processing the cube through VB.
It's urgent.
Please help me.
Thanks.|||Can you do a full process? If you're not running into time constraints, then I'd do a full process.|||As the fact table's record count increases, don't I have to increase the aggregations in the cube?|||No. Aggregations are dimension related, so as long as you're not creating new dimensions, you don't have to worry about increasing aggregations. However, if processing time is not an issue, I'd do a full process.|||I am confused by your statement:
When I create the cube from a fact table with a single record, the aggregation count is zero.
When I create the cube from a fact table with thousands of records, the aggregation count is 850 or more.|||Here's what I'm trying to say:
The theoretical maximum number of possible aggregations in a cube is the product of the number of levels in each cube dimension. As you add levels and dimensions to a cube, the number of possible aggregations increases exponentially, and the higher the number of dimensions and levels, the greater the cube's complexity. In the example in Figure 2, the Time dimension has four levels, the Customers dimension has five levels, and the Products dimension has five levels. This yields a theoretical maximum of 100 aggregations (5 customer levels x 5 product levels x 4 time levels). However, this number increases exponentially as you add dimensions or levels. For example, if you add the Day level to the Time dimension, the theoretical maximum increases to 125 (5 x 5 x 5). Now suppose you add two more dimensions to this cube, each with three levels: the theoretical maximum increases to 1,125 (5 x 5 x 5 x 3 x 3). A cube with nine dimensions containing five levels each yields a theoretical maximum of 1,953,125 aggregations and is considered a cube of medium complexity. A cube of high complexity might have 20 dimensions with five levels each and yield a theoretical maximum of approximately 95 trillion aggregations. As you can see, you can directly affect the theoretical maximum number of aggregations in a cube by changing the number of dimensions or the number of levels. Having multiple dimensions with deep hierarchies improves users' ability to perform analysis, but having too many of either can lead to resource problems during querying and processing.
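If you want to see that number for your own cube, DSO exposes the level counts, so you can just multiply them out. A quick, untested sketch ("LocalHost", "MyDB" and "MyCube" are placeholders you'd change):

' Requires a project reference to "Microsoft Decision Support Objects"
' (DSO), which ships with Analysis Services 2000.
Dim dsoServer As New DSO.Server
Dim dsoCube As DSO.MDStore
Dim dsoDim As DSO.Dimension
Dim dblMaxAggs As Double    ' Double, since this can exceed Long's range

dsoServer.Connect "LocalHost"                                 ' placeholder
Set dsoCube = dsoServer.MDStores("MyDB").MDStores("MyCube")   ' placeholders

' Theoretical maximum = product of each dimension's level count
' (the (All) level counts as a level here).
dblMaxAggs = 1
For Each dsoDim In dsoCube.Dimensions
    dblMaxAggs = dblMaxAggs * dsoDim.Levels.Count
Next

Debug.Print "Theoretical maximum aggregations: " & dblMaxAggs
dsoServer.CloseServer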
Follow this link for more info...
http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/ansvcspg.mspx|||Yes, I understand that.
Here's what I did: my fact table was empty (no records), so I added one dummy record and created the cube from it; at that point each dimension basically contains a single level. I have 40 dimensions and 35 measures. I generated a script from that cube and I'm sending it to users so they can create the cube on their own Analysis Server; the script prompts for the user's data source name. The users' fact tables contain millions of records, and once a user processes the cube against those records the dimensions will have multiple levels. So does the user have to recreate the aggregations?
Sorry, I know you're trying to clear up my doubts, but I'm still confused.
Thank you so much.|||I think I remember now: you're sharing an identical cube structure with someone, but you're each pointing to different data sources, correct?
If that is the case, the user would have to re-process, especially since the dimensions will have multiple levels once your script has run against their data. With the numbers you've provided, I wouldn't do a full process. However, again, aggregations are based on dimensions, so you don't need to programmatically increase the number of aggregations as new records are added to the fact table. Even if new dimension levels are being created, I wouldn't increase the number of aggregations; just select a particular performance gain level (e.g. 30%) and let AS do the rest. Having said that, you should probably look into usage-based optimization. I have a cube of a similar size to the one you've described, and I know there are a lot of wasted aggregations in it (fully processed). It only gets updated once a month, so a full process is not a big deal. Remember, as your cube gets more complex, it's unlikely that your users are making the most of that complexity; that's why usage-based optimization makes sense. I haven't done usage-based optimization programmatically, so I'm no help there, although the DSO PartitionAnalyzer does expose AddGoalQuery and PrepareGoalQueries for it. An untested sketch follows.
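Something like this might be a starting point; I haven't run it, and the msmdqlog.mdb path and the QueryLog table/column names are from a default install, so verify them on your server:

' Untested sketch: feed logged user queries to the analyzer as "goal
' queries" so the design favors the aggregations people actually use.
' Requires references to DSO and Microsoft ActiveX Data Objects (ADO).
Dim dsoServer As New DSO.Server
Dim dsoPartition As DSO.MDStore
Dim dsoAnalyzer As DSO.PartitionAnalyzer
Dim cn As New ADODB.Connection
Dim rs As New ADODB.Recordset

dsoServer.Connect "LocalHost"   ' placeholder server name
Set dsoPartition = dsoServer.MDStores("MyDB").MDStores("MyCube").MDStores(1)
Set dsoAnalyzer = dsoPartition.Analyzer
dsoAnalyzer.InitializeDesign

' Default AS 2000 query log location and shape; verify on your server.
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
        "C:\Program Files\Microsoft Analysis Services\Bin\msmdqlog.mdb"
rs.Open "SELECT Dataset, COUNT(*) AS Freq FROM QueryLog GROUP BY Dataset", cn

Do While Not rs.EOF
    dsoAnalyzer.AddGoalQuery rs!Dataset, rs!Freq
    rs.MoveNext
Loop
dsoAnalyzer.PrepareGoalQueries

rs.Close
cn.Close
' ...then run the usual NextAnalysisStep loop and save the partition,
' as in the storage design sketch further down the thread.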
I hope that makes sense.|||Thanks so much.
The user doesn't know anything about Analysis Server, so I want to give them a single click option to design storage.
After creating the cube on the user's server, I am looking for VB code to:
1. Count dimension members
2. Design storage
so the user doesn't have to do either step manually.
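A rough DSO sketch of both steps, untested, with "LocalHost", "MyDB" and "MyCube" as placeholder names. For step 1 it reprocesses the dimensions so the member counts the analyzer works from reflect the user's data, which is one way to get what the Storage Design Wizard's member-counting step does (alternatively, set each level's EstimatedSize property by hand):

' Requires a project reference to "Microsoft Decision Support Objects" (DSO).
Dim dsoServer As New DSO.Server
Dim dsoDB As DSO.MDStore
Dim dsoCube As DSO.MDStore
Dim dsoPartition As DSO.MDStore
Dim dsoAnalyzer As DSO.PartitionAnalyzer
Dim dsoDim As DSO.Dimension

Dim dblPctGain As Double    ' cumulative estimated performance gain
Dim dblAccumSize As Double  ' estimated storage used so far
Dim lngAggCount As Long     ' number of aggregations designed so far
Dim i As Integer

dsoServer.Connect "LocalHost"            ' placeholder server name
Set dsoDB = dsoServer.MDStores("MyDB")   ' placeholder database name
Set dsoCube = dsoDB.MDStores("MyCube")   ' placeholder cube name
Set dsoPartition = dsoCube.MDStores(1)   ' the cube's default partition

' 1. Count dimension members: reprocess the shared dimensions so the
'    member counts reflect what's in the user's dimension tables.
For Each dsoDim In dsoDB.Dimensions
    dsoDim.Process processFull
Next

' 2. Design storage: keep adding aggregations until the estimated
'    performance gain reaches 30% (the same figure you would pick in
'    the Storage Design Wizard), then copy the design to the partition.
Set dsoAnalyzer = dsoPartition.Analyzer
With dsoAnalyzer
    .InitializeDesign
    Do While .NextAnalysisStep(dblPctGain, dblAccumSize, lngAggCount)
        If dblPctGain >= 30 Then Exit Do
    Loop
    For i = 1 To .DesignedAggregations.Count
        dsoPartition.MDStores.Add .DesignedAggregations(i)
    Next
    .CloseAggregationsAnalysis
End With

dsoPartition.Update            ' save the new aggregation design
dsoCube.Process processFull    ' then run the full process
dsoServer.CloseServer

You could call this from the existing VB program right before the nightly process, so the user only ever clicks the one button that drives it.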