We had a similar requirement and developed a solution using DTS with a VB
GUI front end. The approach we took was to identify and develop extract
queries for the large tables based on user supplied criteria (e.g. customer
list, transaction date range). To keep things simple (the application
consists of multiple databases and several hundred tables) smaller tables
are transferred in their entirety. The process performs the following steps:
1) create new target database
2) copy table structure only to target db (no constraints, indexes, etc.)
3) copy data using selective extract queries
4) copy data for remaining tables
5) copy all other database objects (constraints, indexes, views, procs, etc.)
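The steps above can be sketched in Transact-SQL. All database, table, and
column names below are invented for illustration; our app actually generated
equivalent DTS transformations on the fly rather than running hand-written
batches like this:

```sql
-- 1) create the new target database
CREATE DATABASE TestDB
GO

-- 2) copy table structure only: SELECT INTO with a false predicate
--    creates the table without constraints or indexes
SELECT * INTO TestDB.dbo.Orders FROM ProdDB.dbo.Orders WHERE 1 = 0
GO

-- 3) copy data for a large table via a selective extract query
--    (here, a user-supplied transaction date range)
INSERT INTO TestDB.dbo.Orders
SELECT * FROM ProdDB.dbo.Orders
WHERE OrderDate BETWEEN '19990101' AND '19990331'
GO

-- 4) smaller tables are copied in their entirety
INSERT INTO TestDB.dbo.Region SELECT * FROM ProdDB.dbo.Region
GO
```

Step 5 (constraints, indexes, views, procs) can then be scripted from the
source database and replayed against the target.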
The VB app creates the DTS package and transformations on the fly. The
process is actually pretty simple since the source and target schemas are
identical. The hard part was developing the selective extract queries.
This required a lot of effort for us because of our complex data model and
some data issues.
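As an example of what a selective extract query has to do to preserve
referential integrity, child rows are restricted to parents already
extracted (again, the names here are hypothetical):

```sql
-- copy only invoices whose customer made it into the subset,
-- so every foreign key in the target still resolves
INSERT INTO TestDB.dbo.Invoices
SELECT i.*
FROM ProdDB.dbo.Invoices i
JOIN TestDB.dbo.Customers c ON c.CustomerID = i.CustomerID
```

With a deep hierarchy the joins cascade (invoice lines join to extracted
invoices, and so on), which is where most of the effort went for us.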
Hope this helps.
SQL FAQ links (courtesy Neil Pike):
(faqxxx.zip in lib 7)
> Our production databases are growing quite large (40-50Gb) and we would
> like to create smaller duplicates for testing and development. Does anybody
> know of a tool that can extract a subset of data from a database, while
> maintaining referential integrity?
> Darren Fuller