How to catch Bulk insert error

Post by Nikola Milic » Fri, 22 Feb 2002 18:44:36



Hi,
How can I catch an error from Bulk Insert if the file for import doesn't have
good data types? I want to send an email to myself if Bulk Insert fails, but
it's not possible to catch the error number.

The error message is below.

I'm using SS2000 Enterprise edition SP1 on Win2000 Advanced Server SP2.

Thanks in advance
Nikola Milic

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1.
Make sure the field terminator and row terminator are specified correctly.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any
information about the error.
The statement has been terminated.

How to catch Bulk insert error

Post by lindawi » Sat, 23 Feb 2002 01:36:17


Nikola,

The bcp command-line utility does a better job of reporting error
conditions. Use the -o switch to write bcp's output messages to a file and
the -e switch to write the offending records to an error file. You can also
check the program's return code (0 = success, non-zero = error).
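
For example, something along these lines run as a batch file or job step
(the server, database, table and file names below are only placeholders,
and -T assumes a trusted connection):

    bcp MyDb.dbo.MyTable in C:\import\data.txt -c -S MYSERVER -T ^
        -o C:\import\bcp_output.log -e C:\import\bcp_errors.txt
    if errorlevel 1 (
        rem Non-zero return code: the load failed. bcp_output.log holds the
        rem messages and bcp_errors.txt holds the rejected rows, so this is
        rem the place to trigger the notification e-mail.
        echo bcp failed - see C:\import\bcp_output.log
    )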

Linda


> Hi,
> How can I catch an error from Bulk Insert if the file for import doesn't
> have good data types? I want to send an email to myself if Bulk Insert
> fails, but it's not possible to catch the error number.
>
> The error message is below.
>
> I'm using SS2000 Enterprise edition SP1 on Win2000 Advanced Server SP2.
>
> Thanks in advance
> Nikola Milic
>
> Server: Msg 4866, Level 17, State 66, Line 1
> Bulk Insert fails. Column is too long in the data file for row 1, column 1.
> Make sure the field terminator and row terminator are specified correctly.
> Server: Msg 7399, Level 16, State 1, Line 1
> OLE DB provider 'STREAM' reported an error. The provider did not give any
> information about the error.
> The statement has been terminated.


1. Error when bulk insert follows another large bulk insert

I am running a job out of MS SQL Enterprise Manager where the first step
turns off transaction logging, the 2nd step is a large bulk insert, and the
3rd step is another large bulk insert. I receive the following error when
executing the 3rd step. Is this error occurring because the database is
still committing data from the previous large bulk insert at the same time
as it is performing the next bulk insert? If so, is there a workaround?
Thank you for any help you can provide. (The job layout is sketched after
the error message below.)

Error message from Step 3 (2nd bulk insert):
Backup, CHECKALLOC, bulk copy, SELECT INTO, and file manipulation
(such as CREATE FILE) operations on a database must be serialized.
Reissue the statement after the current backup, CHECKALLOC, or
file manipulation operation is completed. [SQLSTATE 42000] (Error 3023)  
The statement has been terminated. [SQLSTATE 01000] (Error 3621).  
The step failed.
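
A rough sketch of the three job steps as described (the database, table and
file names are placeholders, and "turns off transaction logging" is assumed
to mean enabling the 'select into/bulkcopy' option so the loads are
minimally logged):

    -- Step 1: allow minimally logged bulk operations (assumed meaning of
    -- "turns off transaction logging" on SQL Server 2000)
    EXEC sp_dboption 'MyDb', 'select into/bulkcopy', 'true'

    -- Step 2: first large load
    BULK INSERT MyDb.dbo.Table1 FROM 'C:\import\file1.txt' WITH (TABLOCK)

    -- Step 3: second large load - the step that fails with error 3023
    BULK INSERT MyDb.dbo.Table2 FROM 'C:\import\file2.txt' WITH (TABLOCK)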


2. Debugging SP

3. Catch() not catching error during CDaoDatabase.Open()

4. multiple progress jobs

5. select + insert vs. insert and catching exception

6. Another XML question

7. Bulk Delete (as opposed to Bulk Insert)

8. Change a File's Date within a Trigger. Possible???

9. BULK INSERT or BULK COPY

10. Bulk Insert / Bulk copy

11. BULK INSERT under bulk copy?

12. DTS Error (BULK INSERT)

13. Bulk Insert Error with SQL 2000