- Where can I find trend and comparison information to help me plan for future growth of my data warehouse?
- How many CPUs do other customers use per terabyte?
- How many partitions are typically used in large tables? How many indexes?
- How much memory should I allocate for the buffer cache?
- How does my warehouse compare to others of similar and larger scale?
From a company perspective, we are also interested in feedback on the features we have added to the database: are these features used, and how are they used? For example, we are keen to understand:
- Which initialization parameters are most frequently used at what values?
- How many Oracle data warehouses run on RAC? on single nodes?
- Is there a trend one way or the other, especially as data volumes increase?
- Does this change with newer releases of the database?
Terabyte-scale and larger data warehouses are our primary interest, but information on any data warehouse environment is useful. We would like as many customers as possible to submit results, ideally by the end of this week. However, this will be an ongoing process, so regular feedback after this week is also extremely useful.
To help our developers and product management team, please download and run the DW measurement script kit from OTN, available from the following link:
Please return the script outputs using the link shown on the above web page (see the FAQ section), or alternatively mail them directly to me: email@example.com.
Thank you and we look forward to your responses.