## why divide by 1800 when calculating text/image storage?

### why divide by 1800 when calculating text/image storage?

To calculate the storage requirements for a text/image column, the BOL
states:

> The text chains that store text and image data have 112 bytes of overhead
> per page.
>
> Use the following formula to calculate the number of text chain pages that
> a particular entry will use:
>
> Data length / 1800 = Number of 2K pages
>
> The result should be rounded up in all cases; for example, a data length
> of 1800 bytes requires two 2K pages.

Considering a page is 2048 bytes, where do the other 248 bytes (or 136
bytes, once the 112-byte page overhead is subtracted) go?
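A minimal sketch of the byte accounting behind the question, assuming a 2048-byte page, the 112-byte per-page overhead quoted from the BOL, and the 1800-byte divisor:

```python
PAGE_SIZE = 2048      # bytes in one 2K page
PAGE_OVERHEAD = 112   # per-page overhead quoted from the BOL
DIVISOR = 1800        # the divisor the BOL formula uses

# Bytes between the raw page size and the divisor.
total_gap = PAGE_SIZE - DIVISOR              # 248

# Bytes still unaccounted for after the documented overhead.
unexplained = PAGE_SIZE - PAGE_OVERHEAD - DIVISOR   # 136

print(total_gap)    # 248
print(unexplained)  # 136
```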

Richard.
Polymorphic.
UK.

### why divide by 1800 when calculating text/image storage?

Richard,

It has to do with the way text/image data is stored.  The "text" data type
is stored as a linked list of 2K pages.  It would appear to me that you
would have a lot of wasted space (1664 bytes): you would need two 2K pages
for each record, with the second page being almost empty.

Grant

### why divide by 1800 when calculating text/image storage?

Richard,

I was confused by this statement:

> Data length / 1800 = Number of 2K pages
>
> The result should be rounded up in all cases; for example, a data
> length of 1800 bytes requires two 2K pages.

If you have a data length of 1800, then wouldn't
1800/1800 = one 2K page?

You round up only if it is over 1800.  For example, if you had a record of
1900 bytes you would have:

1900/1800 = 1.06, which rounds up to two 2K pages, wasting almost the whole
second page.
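Grant's reading of the rule is just a ceiling on the division; a quick sketch, which also shows that a plain ceiling gives one page, not two, for exactly 1800 bytes:

```python
import math

def text_pages(data_length: int) -> int:
    """Number of 2K text-chain pages, per the divide-by-1800 rule."""
    return math.ceil(data_length / 1800)

print(text_pages(1900))  # 2 pages (1.06 rounded up)
print(text_pages(1800))  # 1 page by plain ceiling, though the BOL example says 2
print(text_pages(3600))  # 2 pages
```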

I am not sure what the 136 bytes are used for.  Maybe pointers for the
linked list.  Sorry for my confusion!

Grant

Suppose a supermarket chain. It has 50 supermarkets, 1800 suppliers and
30,000 products. They want to offer information to their suppliers through
an extranet and SQL Server 7 OLAP Services.

First issue:
They want each supplier to see the sales at each supermarket, but only for
its own products.

E.g.: Coca Cola can see only their products. Coca Cola cannot see which the
other suppliers are, or the products from other suppliers. All of that
through a browser with security access.
Problems:

-We don't want to build 1800 cubes.
-We don't want to build 1800 fact tables and 1800 product and supermarket
dimensions.

Second issue:
On the intranet, the supermarket chain wants to see the data from all the
suppliers.

Problems:

If I build a Virtual Cube from all 1800 cubes, the "sales" measure from
each cube comes through as a separate measure. Is there any way to group
these measures into only one?

The data warehouse has a Sales fact table and Product, Supermarket,
Supplier and Time dimensions.
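The single-cube idea behind both issues can be sketched in miniature: one shared fact table with a Supplier dimension, where each supplier's extranet view is just a row filter and the intranet view is the unfiltered total. The table layout and names here are assumptions for illustration only, not OLAP Services API calls:

```python
# Toy rows from a single shared fact table:
# (supplier, supermarket, product, sales)
fact_sales = [
    ("CocaCola", "Store01", "Cola 1L", 120.0),
    ("CocaCola", "Store02", "Cola 2L", 80.0),
    ("Nestle",   "Store01", "Cereal",  45.0),
]

def supplier_view(fact, supplier):
    """Extranet: a supplier sees only its own rows (a row filter,
    instead of 1800 separate cubes/fact tables)."""
    return [row for row in fact if row[0] == supplier]

def total_sales(fact):
    """Intranet: the chain sees one 'Sales' measure across all suppliers."""
    return sum(row[3] for row in fact)

print(supplier_view(fact_sales, "CocaCola"))  # only the two Coca Cola rows
print(total_sales(fact_sales))                # 245.0
```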