Data Pipeline Problem

Post by Kevin Eb » Thu, 08 Feb 1996 04:00:00



I am trying to use the Data Pipeline to transfer data from Sybase to Oracle.
All of the columns in both Sybase and Oracle are defined as NOT NULL.
When I run the pipeline, I get the error ORA-01400: mandatory (NOT NULL)
column is missing or NULL during insert. The column it is
failing on is char(6) and contains spaces in every row. I can make it
work by allowing nulls in the Oracle table, but this is not what we
need. It seems as though the pipeline is compressing the spaces to a null
value during the transfer. This is very discouraging. Any ideas
would be greatly appreciated.  Thanks...
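For what it's worth, the symptom can be reproduced outside the pipeline itself: if trailing blanks are stripped during the transfer, a space-only char(6) becomes a zero-length string, and Oracle treats a zero-length string as NULL. A minimal sketch of that interaction (the trim step is only an assumption about what the pipeline is doing in transit):

```python
def oracle_insert_value(raw):
    """Simulate the suspected pipeline behavior (assumption):
    trailing blanks are trimmed in transit, and Oracle treats a
    zero-length string as NULL -- tripping the NOT NULL check."""
    trimmed = raw.rstrip()                      # pipeline strips trailing blanks
    return None if trimmed == "" else trimmed  # Oracle: '' is NULL

print(oracle_insert_value("      "))  # None -> would raise ORA-01400 on a NOT NULL column
print(oracle_insert_value("AB    "))  # 'AB' -> inserts fine
```

If that is what is happening, padding the column back out (or substituting a non-blank sentinel) in the pipeline's source SELECT would be one workaround to try.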

Data Corruption w/Data Pipeline (Watcom)

I just tried my first data pipeline with PB.  I just wanted to move a
small Watcom table from one Watcom DB to another Watcom DB.
I thought it would be really simple and painless.  Ha!

I set up the pipeline and ran it interactively.  It ran and created a
table of the right structure in the destination DB.  However, the data
is obviously corrupted.  I have lots of high-order ASCII characters
stuck in various fields where there were none in the source.

The source table is all char or varchar fields.  There are a number of
backquote characters in the data (could these cause problems?).  The
primary key is on a char(8) field.  All but the key field accept nulls,
and quite a few are actually null.  There are about 50 records with 9
columns.  Beyond that, I'm still speechless.
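One way to narrow this down might be to dump the destination rows to text and scan each field for bytes outside printable ASCII; a rough sketch (the row and field names here are made up for illustration):

```python
def high_order_chars(value):
    """Return the characters in a string that fall outside
    printable ASCII (32-126) -- candidates for corruption."""
    return [ch for ch in value if not 32 <= ord(ch) <= 126]

# hypothetical row as it might come back from the destination DB
row = {"code": "AB`C1234", "name": "Acme\xc4 Co"}
for field, text in row.items():
    bad = high_order_chars(text)
    if bad:
        print(field, bad)  # flags the field holding the high-order byte
```

Comparing the flagged positions against the source rows would at least show whether the backquotes line up with where the garbage appears.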

Any help appreciated.

Gene Hubert
