Reading large .txt file help!

Post by Lorne Corrigan - Tiffan » Thu, 30 Nov 1995 04:00:00



I have a large .txt file (100,000 rows) that needs to be QUICKLY read and
inserted into a database table.  Input column sequence is variable and
supplied dynamically at run time by a user.  This is to run on different
databases and platforms.
      1.  What is the fastest way of doing this? I assume a FileRead
          function into a string variable, parsing my way through the
          string and pulling out the individual columns into variables,
          then doing inserts into the DB.
      2.  Can this .txt file be tab separated?  If so I can't figure out

-------------------------------------------------------------------------
Disclaimer:  "All opinions above are my own, untouched by human mind, and
                 not necessarily those of my employer or clients!"

Lorne Corrigan

Reading large .txt file help!

Post by Lorne Corrigan - Tiffan » Thu, 30 Nov 1995 04:00:00


Sorry...itchy enter key finger.

        I can't figure out how to parse out the columns when tabs are the
separators, as opposed to fixed-width columns.
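
Roughly the kind of loop I have in mind, as a sketch only (the file name is a
placeholder, and the column assignments and the INSERT itself are left as
comments); it assumes FileRead in line mode and ~t as the tab character:

int    li_file
long   ll_tab
string ls_line, ls_col

li_file = FileOpen("c:\data\input.txt", LineMode!)   // placeholder path
IF li_file = -1 THEN RETURN

DO WHILE FileRead(li_file, ls_line) >= 0      // returns -100 at end of file
    ll_tab = Pos(ls_line, "~t")
    DO WHILE ll_tab > 0
        ls_col  = Left(ls_line, ll_tab - 1)   // one column, up to the tab
        ls_line = Mid(ls_line, ll_tab + 1)    // remainder of the line
        // ... assign ls_col to whichever variable the user's column order says
        ll_tab = Pos(ls_line, "~t")
    LOOP
    ls_col = ls_line                          // last column has no trailing tab
    // ... do the INSERT for this row
LOOP
FileClose(li_file)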

All help is appreciated.

Lorne


Reading large .txt file help!

Post by Ahmad Ghoshe » Fri, 01 Dec 1995 04:00:00


If you have a tab-separated file then you can read it into a DataWindow
using ImportFile(). This is a very fast function that can read the file in
seconds.  To create a tab-separated file you need to go to the source. What
is the source of the document?  If it is a PC application then, for the most
part, you can do a Save As from the other application and specify Tab
Delimited format. If it is a mainframe source, God help us, then you need to
write a service request and hope it is done before you retire :)
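
As a rough sketch (dw_1 and the file name are placeholders, and the
DataWindow's columns are assumed to be in the same order as the file):

long ll_rows

dw_1.SetTransObject(SQLCA)
ll_rows = dw_1.ImportFile("c:\data\input.txt")   // rows imported, < 0 on error
IF ll_rows > 0 THEN
    IF dw_1.Update() = 1 THEN
        COMMIT USING SQLCA;
    ELSE
        ROLLBACK USING SQLCA;
    END IF
END IF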

--
Ahmad Ghosheh                
PowerDesign
The PowerBuilder Experts
Kansas City, MO              

 
 
 

Reading large .txt file help!

Post by Hoyt Nels » Fri, 01 Dec 1995 04:00:00


I suggest that you (1) dynamically create a DataWindow and (2) use its
ImportFile() function. This will very quickly get your data into the
DataWindow, at which point the DataWindow's optimized DB facilities
will quickly handle the database inserts. Put the DW on an invisible
window. Tab-delimited works well; the file suffix MUST be .txt.
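
One way the dynamic-create step might look, as a sketch only (the SELECT,
table, and column names are placeholders; the column order would come from
whatever the user chose at run time):

string ls_sql, ls_syntax, ls_err
long   ll_rows
int    li_rc

// build the SELECT in the user's column order (placeholder names)
ls_sql = "SELECT col_a, col_b, col_c FROM my_table"

// generate DataWindow source from the SELECT, then create the DW object
ls_syntax = SQLCA.SyntaxFromSQL(ls_sql, "style(type=grid)", ls_err)
IF Len(ls_err) > 0 THEN RETURN

dw_1.Create(ls_syntax, ls_err)        // dw_1 lives on the invisible window
IF Len(ls_err) > 0 THEN RETURN

dw_1.SetTransObject(SQLCA)
ll_rows = dw_1.ImportFile("c:\data\input.txt")   // file suffix must be .txt
IF ll_rows > 0 THEN
    li_rc = dw_1.Update()   // 1 = success; COMMIT or ROLLBACK accordingly
END IF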



Independent consultant (and looking)
Messages: (617) 734-3824