Data Corruption w/Data Pipeline (Watcom)

Post by Gene Hubert » Wed, 07 Feb 1996

I just tried my first data pipeline with PB.  I just wanted to move a
small Watcom table from one Watcom DB to another Watcom DB.
I thought it would be really simple and painless.  Ha!

I set up the pipeline and ran it interactively.  It ran and created a
table with the right structure in the destination DB.  However, the data
is obviously corrupted: there are lots of high-order ASCII characters
stuck in various fields where there were none in the source.

The source table is all char or varchar fields.  There are a number of
backquote characters in the data (could these cause problems?).  The
primary key is on a char(8) field.  All but the key field accept null, and
quite a few values actually are null.  There are about 50 records with 9
columns.  Beyond that, I'm still speechless.
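One way to pin down where the garbage appears is to unload both tables to text files and scan them for high-order bytes.  This is a diagnostic sketch in Python, not PowerBuilder; the file names are hypothetical, and it assumes each table was unloaded to a plain text file.

```python
# Diagnostic sketch: scan a text unload of a table for high-order
# (non-ASCII) bytes, to see exactly where the pipeline introduced garbage.
# The source unload should report no hits; the destination unload should
# show which rows and byte offsets were corrupted.

def find_high_ascii(path):
    """Return (line_no, byte_offset, byte_value) for every byte > 0x7F."""
    hits = []
    with open(path, "rb") as f:              # read raw bytes, no decoding
        for line_no, line in enumerate(f, 1):
            for offset, byte in enumerate(line, 1):
                if byte > 0x7F:
                    hits.append((line_no, offset, byte))
    return hits

# Hypothetical usage: compare the two unload files.
# for line_no, offset, byte in find_high_ascii("dest_unload.txt"):
#     print("line %d, byte %d: 0x%02X" % (line_no, offset, byte))
```

If the source file comes back clean and the destination does not, the corruption happened inside the pipeline transfer rather than in the original data.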

Any help appreciated.

Gene Hubert

1. Data Pipeline - Watcom Bind Error

I'm trying to execute a data pipeline from an application. I've
created two separate databases under Watcom 4.0. When tracing the
start command, I get the message:

"Bind parameter value ":30" is too big(10)"

I don't get any errors back when executing the pipeline, but there is
also no data in the target database. Any recommendations?
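The traced message suggests that bind parameter 30 carries a value wider than its destination column (width 10).  A quick pre-check of the source values against the destination column widths can confirm which column is at fault.  This is a hedged Python sketch; the column names and widths below are hypothetical examples, not the actual schema.

```python
# Sketch: flag source values that will not fit their destination columns.
# dest_widths maps column name -> declared char/varchar width in the
# destination table (hypothetical values here).

dest_widths = {"cust_id": 8, "cust_name": 10}

def oversized_values(row, widths):
    """Return (column, value) pairs where the value exceeds the column width."""
    return [(col, val) for col, val in row.items()
            if val is not None and len(val) > widths.get(col, float("inf"))]

# Hypothetical row: cust_name is 27 characters, wider than the 10-char column.
row = {"cust_id": "A1", "cust_name": "a name that is far too long"}
print(oversized_values(row, dest_widths))
```

Any pair this reports would trigger the same kind of "too big" bind error; widening the destination column or truncating the source value should clear it.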

2. IID_IOutlookExtCallback and OUTLOOK Events

3. Data Archiving - Overview | Data Archiving - Procedure - Data Archiving - SAP Tutorials

4. Electronic Eavesdropping

5. Urgent Need // Data Mapping ( Data Modeler/Data Analyst) // Chicago IL or Richardson, TX

6. Problems making uEmacs 3.10 on AOS/VS, VMS, DOS

7. Data Pipeline Problem

8. ANN : AGA classes posted to PP contributed class archive

9. Data Pipeline

10. Data-Pipelines (currupt in .exe ?)

11. Oracle Data Pipeline

12. Using data pipeline in EXE file

13. Data Pipeline feature in V4, any details