BULK INSERT bug with large files?

BULK INSERT bug with large files?

Post by Sam Elmor » Mon, 29 Dec 2003 23:34:32



I am using BULK INSERT to import a large web log file into
a SQL Server 2000 SP3a table.  I have set the
rowterminator to '\n' and the fieldterminator to ' '.  The
web log file has been converted to ASCII and has had all #
comment lines removed.  The log file is about 430MB.  When
I run the bulk insert on the large file, it gets a
datetime conversion error on row 1500000 or somewhere
about there.  When I split the file into 100MB chunks
however, it works fine!  Clearly, there is nothing wrong
with the data in the file.  Is this a known limitation of
BULK INSERT, or is this some sort of bug?

Thanks,
Sam

 
 
 

BULK INSERT bug with large files?

Post by Steve Kass » Tue, 30 Dec 2003 01:26:03


Sam,

Most likely there are errors in the data, and specifically there are
more than 10.  Unless you specify a value of MAXERRORS in the BULK
INSERT statement, it defaults to allowing 10 errors.  Perhaps by
breaking the file up into chunks, none of the chunks contains more than
10 errors, but the file as a whole contains at least 11 bad lines.

I'm probably not the only person who was a bit surprised that the
default number of errors BULK INSERT ignores is greater than zero, but
it is, and it's documented in Books Online.
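
For illustration, a minimal sketch of turning that safety net off (the table
and file names here are placeholders, not Sam's actual objects):

   -- With MAXERRORS = 0 the load stops on the very first bad row
   -- instead of silently skipping up to ten of them.
   BULK INSERT dbo.WebLog
   FROM 'C:\logs\ex031229.log'
   WITH (FIELDTERMINATOR = ' ',
         ROWTERMINATOR = '\n',
         MAXERRORS = 0)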

SK


>I am using BULK INSERT to import a large web log file into
>a SQL Server 2000 SP3a table.  I have set the
>rowterminator to '\n' and the fieldterminator to ' '.  The
>web log file has been converted to ASCII and has had all #
>comment lines removed.  The log file is about 430MB.  When
>I run the bulk insert on the large file, it gets a
>datetime conversion error on row 1500000 or somewhere
>about there.  When I split the file into 100MB chunks
>however, it works fine!  Clearly, there is nothing wrong
>with the data in the file.  Is this a known limitation of
>BULK INSERT, or is this some sort of bug?

>Thanks,
>Sam


 
 
 

BULK INSERT bug with large files?

Post by Erland Sommarskog » Tue, 30 Dec 2003 02:00:44



> I am using BULK INSERT to import a large web log file into
> a SQL Server 2000 SP3a table.  I have set the
> rowterminator to '\n' and the fieldterminator to ' '.  The
> web log file has been converted to ASCII and has had all #
> comment lines removed.  The log file is about 430MB.  When
> I run the bulk insert on the large file, it gets a
> datetime conversion error on row 1500000 or somewhere
> about there.  When I split the file into 100MB chunks
> however, it works fine!  Clearly, there is nothing wrong
> with the data in the file.  Is this a known limitation of
> BULK INSERT, or is this some sort of bug?

In addition to Steve's response, you may want to try the command-line
tool BCP instead. With BCP you can define an error file, to which
BCP will log the erroneous rows.

As with BULK INSERT, the default behaviour of BCP is to permit ten errors
and stop on the eleventh.
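
A rough sketch of such a BCP call (server, database, table, and file names
are placeholders): -c loads character data, -t and -r set the field and row
terminators, -m 0 stops on the first error, and -e names the file that
receives the rejected rows.

   bcp MyDb.dbo.WebLog in C:\logs\ex031229.log -c -t " " -r "\n" -m 0 -e C:\logs\errors.txt -S MYSERVER -T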

--

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Sam Elmor » Tue, 30 Dec 2003 03:54:01


Thanks for the feedback.  Yes, I am quite surprised that
they would default the maxerrors to 10!  Seems like that
is quite a problem to me.  In any case, I am not certain
that there are errors in my data.  When I run the bulk
insert, it tells me the line number that has caused the
error (or the 11th as the case may be).  I have copied
that row, along with about 100 rows on both sides of the
errored row into a new file.  When I try to bulk insert
this file, supposedly containing the error that SQL Server
reported, with maxerrors set to 0, it succeeds!  How is
this possible?  Is the line number that SQL Server reports
back reliable?

Thanks,
Sam
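
One way to run the same test without copying lines into a new file is to
point BULK INSERT at a window of the original file using the FIRSTROW and
LASTROW options (a sketch only; the names and row numbers are placeholders):

   -- Re-read just the rows around the reported error, in place,
   -- and fail on the first bad one.
   BULK INSERT dbo.WebLog
   FROM 'C:\logs\ex031229.log'
   WITH (FIELDTERMINATOR = ' ',
         ROWTERMINATOR = '\n',
         FIRSTROW = 1499900,
         LASTROW = 1500100,
         MAXERRORS = 0)

If that window loads cleanly but the full file does not, it points away from
the data itself.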

>-----Original Message-----

>> I am using BULK INSERT to import a large web log file into
>> a SQL Server 2000 SP3a table.  I have set the
>> rowterminator to '\n' and the fieldterminator to ' '.  The
>> web log file has been converted to ASCII and has had all #
>> comment lines removed.  The log file is about 430MB.  When
>> I run the bulk insert on the large file, it gets a
>> datetime conversion error on row 1500000 or somewhere
>> about there.  When I split the file into 100MB chunks
>> however, it works fine!  Clearly, there is nothing wrong
>> with the data in the file.  Is this a known limitation of
>> BULK INSERT, or is this some sort of bug?

>In addition to Steve's response, you may want to try the command-line
>tool BCP instead. With BCP you can define an error file, to which
>BCP will log the erroneous rows.

>As with BULK INSERT, the default behaviour of BCP is to permit ten errors
>and stop on the eleventh.

>--

>Books Online for SQL Server SP3 at
>http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Sam Elmor » Tue, 30 Dec 2003 04:08:50


One other note: If I set the maxerrors to some ridiculously
high number, like 10,000, it still spits out the same
error on the same line.  I am quite certain that there are
fewer than 10,000 errors in this file (in fact, I am
reasonably certain there are none)...

Sam

>-----Original Message-----

>> I am using BULK INSERT to import a large web log file into
>> a SQL Server 2000 SP3a table.  I have set the
>> rowterminator to '\n' and the fieldterminator to ' '.  The
>> web log file has been converted to ASCII and has had all #
>> comment lines removed.  The log file is about 430MB.  When
>> I run the bulk insert on the large file, it gets a
>> datetime conversion error on row 1500000 or somewhere
>> about there.  When I split the file into 100MB chunks
>> however, it works fine!  Clearly, there is nothing wrong
>> with the data in the file.  Is this a known limitation of
>> BULK INSERT, or is this some sort of bug?

>In addition to Steve's response, you may want to try the command-line
>tool BCP instead. With BCP you can define an error file, to which
>BCP will log the erroneous rows.

>As with BULK INSERT, the default behaviour of BCP is to permit ten errors
>and stop on the eleventh.

>--

>Books Online for SQL Server SP3 at
>http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Steve Kass » Tue, 30 Dec 2003 06:10:03


It's sounding a bit mysterious, and your followup to this suggests a few
questions (at least until Linda W. catches up on news and likely solves
the problem):

Can we see a few lines of the source file as well as the CREATE TABLE
statement for the destination table?
   (Feel free to change some characters to X's, but don't change the
lengths or spacing of anything.)
Are all the lines terminated with the standard Windows CHAR(13)+CHAR(10)?
(See the terminator sketch below.)
When you import the file in chunks, do you get the same number of rows
in the table as there are lines in the original file?
Is the last line of the file terminated with an end-of-line?
What is the exact file size in bytes?  (I recall some weird error
that happens when a file is an exact multiple of some magic number.)
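
If the terminators are in doubt, one way to rule them out is to build the
statement dynamically so the row terminator is an explicit CHAR(13)+CHAR(10)
rather than the '\n' shorthand (a sketch only; the table and file names are
placeholders):

   DECLARE @sql nvarchar(1000)
   SET @sql = N'BULK INSERT dbo.WebLog FROM ''C:\logs\ex031229.log''
                WITH (FIELDTERMINATOR = '' '',
                      ROWTERMINATOR = ''' + CHAR(13) + CHAR(10) + N''',
                      MAXERRORS = 0)'
   EXEC (@sql)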

Since you seem to have a workaround that imports all the data, I'm more
interested in whether this is a bug or not at this point.  It would be
nice to feel sure enough it's a bug to open a case with product support,
but if it's being caused by something in your file (which is too large
to post), you'll get charged...

SK


>Thanks for the feedback.  Yes, I am quite surprised that
>they would default the maxerrors to 10!  Seems like that
>is quite a problem to me.  In any case, I am not certain
>that there are errors in my data.  When I run the bulk
>insert, it tells me the line number that has caused the
>error (or the 11th as the case may be).  I have copied
>that row, along with about 100 rows on both sides of the
>errored row into a new file.  When I try to bulk insert
>this file, supposedly containing the error that SQL Server
>reported, with maxerrors set to 0, it succeeds!  How is
>this possible?  Is the line number that SQL Server reports
>back reliable?

>Thanks,
>Sam

>>-----Original Message-----

>>>I am using BULK INSERT to import a large web log file into
>>>a SQL Server 2000 SP3a table.  I have set the
>>>rowterminator to '\n' and the fieldterminator to ' '.  The
>>>web log file has been converted to ASCII and has had all #
>>>comment lines removed.  The log file is about 430MB.  When
>>>I run the bulk insert on the large file, it gets a
>>>datetime conversion error on row 1500000 or somewhere
>>>about there.  When I split the file into 100MB chunks
>>>however, it works fine!  Clearly, there is nothing wrong
>>>with the data in the file.  Is this a known limitation of
>>>BULK INSERT, or is this some sort of bug?

>>In addition to Steve's response, you may want to try the command-line
>>tool BCP instead. With BCP you can define an error file, to which
>>BCP will log the erroneous rows.

>>As with BULK INSERT, the default behaviour of BCP is to permit ten errors
>>and stop on the eleventh.

>>--

>>Books Online for SQL Server SP3 at
>>http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Erland Sommarskog » Wed, 31 Dec 2003 01:26:13



> Thanks for the feedback.  Yes, I am quite surprised that
> they would default the maxerrors to 10!  Seems like that
> is quite a problem to me.  In any case, I am not certain
> that there are errors in my data.  When I run the bulk
> insert, it tells me the line number that has caused the
> error (or the 11th as the case may be).  I have copied
> that row, along with about 100 rows on both sides of the
> errored row into a new file.  When I try to bulk insert
> this file, supposedly containing the error that SQL Server
> reported, with maxerrors set to 0, it succeeds!  How is
> this possible?  Is the line number that SQL Server reports
> back reliable?

But when you get this error, are all the other rows really discarded?
I did this:

   create table hhh(d datetime NOT NULL, comment varchar(40) NOT NULL)
   go
   bulk insert hhh from 'e:\temp\slask.bcp' with (fieldterminator=';')
   select * from hhh
   go
   drop table hhh

And my datafile was:

   2003011;bcp formatted as YYYYMMDD
   2003-01-22;bcp formatted as YYYY-MM-DD
   01/03/2003;bcp formatted as MM/DD/YYYY

The first line gave a message, but the other two made it to the table.

Since you get this error only with the full file, it does smell like
some data is being corrupted while in memory. Then again, I would
expect bulk insert to be tested and tried inside and out for
large data volumes. (But I have seen buffer-overrun errors in BCP
with regard to the input parameters.) To research this, I'm
afraid that you will have to open a case with Microsoft, since to
file a bug we need a repro, and that file appears to be too big for a
post...

Did you ever post the exact error message you are getting, by the way?

And did you ever try command-line BCP? I believe BULK INSERT uses
OLE DB, but the command-line tool uses ODBC, so the bug may not be
in both. Also, with -e, BCP gets the troublesome line printed
to the error file.

--

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Stefan Berglund » Wed, 31 Dec 2003 08:12:52





>It's sounding a bit mysterious, and your followup to this suggests a few
>questions (at least until Linda W. catches up on news and likely solves
>the problem):

>Can we see a few lines of the source file as well as the CREATE TABLE
>statement for the destination table?
>   (feel free to change some characters to X's, but don't change the
>lengths or spacing of anything.
>Are all the lines terminated with the standard Windows CHAR(13)+CHAR(10) ?
>When you import the file in chunks, do you get the same number of rows
>in the table as there are lines in the original file?
>Is the last line of the file terminated with an end-of-line?
>What is the exact file size in bytes (I recall some weird error once
>that happens when a file is an exact multiple of some magic number.)?

>Since you seem to have a workaround that imports all the data, I'm more
>interested in whether this is a bug or not at this point.  It would be
>nice to feel sure enough it's a bug to open a case with product support,
>but if it's being caused by something in your file (which is too large
>to post), you'll get charged...

Would you or could you please elucidate the possible motivations
for opening a case with product support?  I can understand their
disincentive to having logic shortcomings wantonly reported as
bugs, since the volume alone would be prohibitive, but frankly,
I'd be much more inclined to spend my time on a workaround than
to waste any effort on the gamble that a possible problem is
indeed a bug.

I can only suppose that their system is predicated on the assumption
that those who report bugs have either large egos in need of
stroking or unlimited corporate budgets.  It seems that if MS
were truly interested in the integrity of their products, a
modest reward would be more appropriate in the case of actual
bugs.

I'm not trying to be a smartass.  I'm just really having a hard
time understanding how and why the system works the way it does.

 
 
 

BULK INSERT bug with large files?

Post by Steve Kass » Wed, 31 Dec 2003 09:49:55






>>It's sounding a bit mysterious, and your followup to this suggests a few
>>questions (at least until Linda W. catches up on news and likely solves
>>the problem):

>>Can we see a few lines of the source file as well as the CREATE TABLE
>>statement for the destination table?
>>  (feel free to change some characters to X's, but don't change the
>>lengths or spacing of anything.
>>Are all the lines terminated with the standard Windows CHAR(13)+CHAR(10) ?
>>When you import the file in chunks, do you get the same number of rows
>>in the table as there are lines in the original file?
>>Is the last line of the file terminated with an end-of-line?
>>What is the exact file size in bytes (I recall some weird error once
>>that happens when a file is an exact multiple of some magic number.)?

>>Since you seem to have a workaround that imports all the data, I'm more
>>interested in whether this is a bug or not at this point.  It would be
>>nice to feel sure enough it's a bug to open a case with product support,
>>but if it's being caused by something in your file (which is too large
>>to post), you'll get charged...

>Would you or could you please elucidate the possible motivations
>for opening a case with product support.  I can understand their
>disincentive to having logic shortcomings wantonly reported as
>bugs since the volume alone would be prohibitive, but frankly,
>I'd be much more inclined to spend my time on a workaround rather
>than waste any effort on the gamble that a possible problem is
>indeed a bug or otherwise.  

>I can only suppose that their system is predicated on the fact
>that those who report bugs have either large egos in need of
>stroking or unlimited corporate budgets.  It seems that if MS
>were truly interested in the integrity of their products that a
>modest reward would be more appropriate in the case of actual
>bugs.

You're quite right that working with product support to track down a bug
would take your time and likely provide little reward beyond a warm
fuzzy feeling of possibly having helped make the product better and/or
saved someone else some trouble.  On the other hand, there is still a
mystery here, and at least a slim chance product support might uncover
something you want to know, particularly if this turns out not to be a
bug, and you have a hardware or configuration issue worth remedying (of
course then you pay for the support call).

I won't speculate on what assumptions underlie the system, and it's
certainly up to you whether to consider opening a case.  I will pass the
word on about the problem you're having.

Your comments do make me wonder whether Microsoft should consider
acknowledgements in Knowledge Base articles and lists of bugs fixed.  
The security team does this, and I could pass on the suggestion to
consider notes like "Microsoft thanks Stefan Berglund of Keep It In The
Groups for discovering this bug and providing helpful information to
Product Support Services."  Let me know (in the group or offline) what
you think.  Would something like this be welcome or just annoying?

SK


>I'm not trying to be a smartass.  I'm just really having a hard
>time understanding how and why the system works the way it does.

 
 
 

BULK INSERT bug with large files?

Post by Stefan Berglund » Thu, 01 Jan 2004 21:11:06









>>>It's sounding a bit mysterious, and your followup to this suggests a few
>>>questions (at least until Linda W. catches up on news and likely solves
>>>the problem):

>>>Can we see a few lines of the source file as well as the CREATE TABLE
>>>statement for the destination table?
>>>  (feel free to change some characters to X's, but don't change the
>>>lengths or spacing of anything.
>>>Are all the lines terminated with the standard Windows CHAR(13)+CHAR(10) ?
>>>When you import the file in chunks, do you get the same number of rows
>>>in the table as there are lines in the original file?
>>>Is the last line of the file terminated with an end-of-line?
>>>What is the exact file size in bytes (I recall some weird error once
>>>that happens when a file is an exact multiple of some magic number.)?

>>>Since you seem to have a workaround that imports all the data, I'm more
>>>interested in whether this is a bug or not at this point.  It would be
>>>nice to feel sure enough it's a bug to open a case with product support,
>>>but if it's being caused by something in your file (which is too large
>>>to post), you'll get charged...

>>Would you or could you please elucidate the possible motivations
>>for opening a case with product support.  I can understand their
>>disincentive to having logic shortcomings wantonly reported as
>>bugs since the volume alone would be prohibitive, but frankly,
>>I'd be much more inclined to spend my time on a workaround rather
>>than waste any effort on the gamble that a possible problem is
>>indeed a bug or otherwise.  

>>I can only suppose that their system is predicated on the fact
>>that those who report bugs have either large egos in need of
>>stroking or unlimited corporate budgets.  It seems that if MS
>>were truly interested in the integrity of their products that a
>>modest reward would be more appropriate in the case of actual
>>bugs.

>You're quite right that working with product support to track down a bug
>would take your time and likely provide little reward beyond a warm
>fuzzy feeling of possibly having helped make the product better and/or
>saved someone else some trouble.  On the other hand, there is still a
>mystery here, and at least a slim chance product support might uncover
>something you want to know, particularly if this turns out not to be a
>bug, and you have a hardware or configuration issue worth remedying (of
>course then you pay for the support call).

>I won't speculate on what assumptions underlie the system, and it's
>certainly up to you whether to consider opening a case.  I will pass the
>word on about the problem you're having.

>Your comments do make me wonder whether Microsoft should consider
>acknowledgements in Knowledge Base articles and lists of bugs fixed.  
>The security team does this, and I could pass on the suggestion to
>consider notes like "Microsoft thanks Stefan Berglund of Keep It In The
>Groups for discovering this bug and providing helpful information to
>Product Support Services."  Let me know (in the group or offline) what
>you think.  Would something like this be welcome or just annoying?

>SK

My apologies for jumping into the middle of this thread as I'm
sure you've confused my missive with that of the OP.  I was
merely looking for clarity in what appears to me to be a rather
murky subject.

As an entrepreneur and sole proprietor I honestly would never
consider relying on PSS for a resolution nor would I seek any
form of recognition from MS.  Rather, my primary concern is the
well-being of my clients and therefore I'd tend toward providing
the most expedient workaround solution necessary to that end.

In my opinion these groups provide a level of support and
resources that simply surpasses anything MS could ever muster,
were they so inclined.  The exchange of ideas here is invaluable.

Thank you again for your reply and I apologize once again for any
confusion I may have caused.

 
 
 

BULK INSERT bug with large files?

Post by Kevin » Thu, 01 Jan 2004 23:09:52


Just for curiosity's sake - have you ever been involved in a PSS case for a
SQL Server incident?

I think they're great.  Especially in comparison to some of the other
software vendors' support.

--
Kevin Connell, MCDBA
--------------------------------------------------
The views expressed here are my own
and not of my employer.
----------------------------------------------------


> In my opinion these groups provide a level of support and
> resources that simply surpasses anything MS could ever muster,
> were they so inclined.  The exchange of ideas here is invaluable.

> Thank you again for your reply and I apologize once again for any
> confusion I may have caused.

 
 
 

BULK INSERT bug with large files?

Post by Stefan Berglund » Fri, 02 Jan 2004 00:39:26


On Wed, 31 Dec 2003 13:09:52 -0800, "Kevin" wrote:

>Just for curiosity's sake - have you ever been involved in a PSS case for a
>SQL Server incident?

>I think they're great.  Especially in comparison to some of the other
>software vendors' support.

No, I've never been involved in a PSS case for SQL Server or any
of their products for that matter.  I thought I made it clear as
to my reasons why.  And I'm not passing judgment either!  It's
just that I get paid for results and there are always a multitude
of approaches that will solve a problem.  

I neither meant nor intended anything derogatory regarding PSS.
It's just that if there was some form of reward beyond a pat on
the back I might be inclined to consider it, but a system that
only offers a punishment when you're wrong (you get to pay) seems
a bit harsh to say the least.  It just strikes me as inequitable
and I was merely trying to see how others saw it - that's all.

Just out of curiosity - have you ever been the one to foot the
bill or does your employer cover that?  :-)

 
 
 

BULK INSERT bug with large files?

Post by Erland Sommarskog » Fri, 02 Jan 2004 02:04:21



> As an entrepreneur and sole proprietor I honestly would never
> consider relying on PSS for a resolution nor would I seek any
> form of recognition from MS.  Rather, my primary concern is the
> well-being of my clients and therefore I'd tend toward providing
> the most expedient workaround solution necessary to that end.

> In my opinion these groups provide a level of support and
> resources that simply surpasses anything MS could ever muster,
> were they so inclined.  The exchange of ideas here is invaluable.

It depends on the issue you are having. I very rarely open cases
with PSS, but it has happened. For instance, when running our
database scripts on SQL7, an ALTER TABLE crashed on me. I could
figure out a workaround, but I wanted to know exactly when this
problem could occur as input to our migration plans. And this is
not an issue you can sort out in the newsgroups, but it takes
PSS staff that have access to internal databases to find out.

And if you read the groups, you will see the recommendation every
once in a while to open a case with PSS. If someone's database
has crashed, he lost the log file and does not have a good backup,
PSS is the way to go. There are some undocumented tricks that you
sometimes see on the newsgroups, but they are too dangerous to
be put in the hands of inexperienced people. They really need
hand-holding through the process and PSS are the people for that.

--

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp

 
 
 

BULK INSERT bug with large files?

Post by Stefan Berglund » Fri, 02 Jan 2004 04:58:38




>> As an entrepreneur and sole proprietor I honestly would never
>> consider relying on PSS for a resolution nor would I seek any
>> form of recognition from MS.  Rather, my primary concern is the
>> well-being of my clients and therefore I'd tend toward providing
>> the most expedient workaround solution necessary to that end.

>> In my opinion these groups provide a level of support and
>> resources that simply surpasses anything MS could ever muster,
>> were they so inclined.  The exchange of ideas here is invaluable.

>It depends on the issue you are having. I very rarely open cases
>with PSS, but it has happened. For instance, when running our
>database scripts on SQL7, an ALTER TABLE crashed on me. I could
>figure out a workaround, but I wanted to know exactly when this
>problem could occur as input to our migration plans. And this is
>not an issue you can sort out in the newsgroups, but it takes
>PSS staff that have access to internal databases to find out.

>And if you read the groups, you will see the recommendation every
>once in a while to open a case with PSS. If someone's database
>has crashed, he lost the log file and does not have a good backup,
>PSS is the way to go. There are some undocumented tricks that you
>sometimes see on the newsgroups, but they are too dangerous to
>be put in the hands of inexperienced people. They really need
>hand-holding through the process and PSS are the people for that.

No arguments there.  

btw  HAPPY NEW YEAR to all.

 
 
 

BULK INSERT bug with large files?

Post by Sam Elmor » Sat, 03 Jan 2004 00:32:59


To follow up, I did more testing of my file.  I started
cutting the file in half, seeing which half broke, then
cutting that in half and repeating.  I finally did narrow
the problem down to a single line in the web log.  The
line had an extra space in the query string, and a space
is what the W3C log format defines as the field delimiter.  Seems
like IIS should be escaping it to %20 or something.  In
any case, BULK INSERT was not providing a correct line
number in the error message, even with MAXERRORS = 0.  So
I have worked around this issue by checking for the
expected number of space delimiters in my log preparation
utility.

Interestingly, this extra space issue was not in my
original file that caused me to split the logs into 100MB
chunks.  I can't, however, reproduce that error, so I am
stumped as to how I originally had this problem.  Oh well,
thanks for all the great feedback!

Sam
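
Sam's check lives in his log-preparation utility; purely as an illustration,
an equivalent screen can be done in T-SQL by loading each raw line into a
one-column staging table and counting the delimiters (a sketch with
placeholder names; it assumes the log contains no tab characters, and the
expected count of 13 spaces per line depends on the log's #Fields list):

   CREATE TABLE dbo.WebLogStage (line varchar(8000) NOT NULL)

   -- The default field terminator (tab) never occurs in the data,
   -- so each whole line lands in the single column.
   BULK INSERT dbo.WebLogStage
   FROM 'C:\logs\ex031229.log'
   WITH (ROWTERMINATOR = '\n')

   -- Lines whose space count differs from the expected number of
   -- delimiters are the ones that would break the real load.
   SELECT line
   FROM dbo.WebLogStage
   WHERE LEN(line) - LEN(REPLACE(line, ' ', '')) <> 13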

>-----Original Message-----
>It's sounding a bit mysterious, and your followup to this suggests a few
>questions (at least until Linda W. catches up on news and likely solves
>the problem):

>Can we see a few lines of the source file as well as the CREATE TABLE
>statement for the destination table?
>   (Feel free to change some characters to X's, but don't change the
>lengths or spacing of anything.)
>Are all the lines terminated with the standard Windows CHAR(13)+CHAR(10)?
>When you import the file in chunks, do you get the same number of rows
>in the table as there are lines in the original file?
>Is the last line of the file terminated with an end-of-line?
>What is the exact file size in bytes?  (I recall some weird error
>that happens when a file is an exact multiple of some magic number.)

>Since you seem to have a workaround that imports all the data, I'm more
>interested in whether this is a bug or not at this point.  It would be
>nice to feel sure enough it's a bug to open a case with product support,
>but if it's being caused by something in your file (which is too large
>to post), you'll get charged...

>SK

>>Thanks for the feedback.  Yes, I am quite surprised that
>>they would default the maxerrors to 10!  Seems like that
>>is quite a problem to me.  In any case, I am not certain
>>that there are errors in my data.  When I run the bulk
>>insert, it tells me the line number that has caused the
>>error (or the 11th as the case may be).  I have copied
>>that row, along with about 100 rows on both sides of the
>>errored row into a new file.  When I try to bulk insert
>>this file, supposedly containing the error that SQL Server
>>reported, with maxerrors set to 0, it succeeds!  How is
>>this possible?  Is the line number that SQL Server reports
>>back reliable?

>>Thanks,
>>Sam

>>>-----Original Message-----

>>>>I am using BULK INSERT to import a large web log file into
>>>>a SQL Server 2000 SP3a table.  I have set the
>>>>rowterminator to '\n' and the fieldterminator to ' '.  The
>>>>web log file has been converted to ASCII and has had all #
>>>>comment lines removed.  The log file is about 430MB.  When
>>>>I run the bulk insert on the large file, it gets a
>>>>datetime conversion error on row 1500000 or somewhere
>>>>about there.  When I split the file into 100MB chunks
>>>>however, it works fine!  Clearly, there is nothing wrong
>>>>with the data in the file.  Is this a known limitation of
>>>>BULK INSERT, or is this some sort of bug?

>>>In addition to Steve's response, you may want to try the command-line
>>>tool BCP instead. With BCP you can define an error file, to which
>>>BCP will log the erroneous rows.

>>>As with BULK INSERT, the default behaviour of BCP is to permit ten errors
>>>and stop on the eleventh.

 
 
 

1. Error when bulk insert follows another large bulk insert

I am running a job out of MS SQL Enterprise Manager where the first step
turns off transaction logging, the 2nd step is a large bulk insert step,
and the 3rd step is another large bulk insert step.  I receive the
following error when executing the 3rd step.  Is this error occurring
because the database is still committing data from the previous large bulk
insert at the same time as it is performing the next bulk insert?  If so,
is there a workaround?  Thank you for any help you can provide.

Error message from Step 3 (2nd bulk insert):
Backup, CHECKALLOC, bulk copy, SELECT INTO, and file manipulation
(such as CREATE FILE) operations on a database must be serialized.
Reissue the statement after the current backup, CHECKALLOC, or
file manipulation operation is completed. [SQLSTATE 42000] (Error 3023)  
The statement has been terminated. [SQLSTATE 01000] (Error 3621).  
The step failed.


2. VB4-32bit/Win95 Oracle ODBC problems. Can anyone help?

3. Bulk Insert: Unexpected end-of-file (EOF) encountered in data file

4. ODBC and Native error codes

5. Slow Bulk Insert into Large Table

6. Bulk Insert: Unexpected end-of-file (EOF) encountered in data file

7. Tempdb can't handle large bulk insert operations

8. BULK INSERT bug?

9. Bulk Insert Bug?

10. 9i Bulk Insert Bug

11. BCP command line / error capturing / bulk insert bugs