ftp >2GB

Post by Ira Childres » Wed, 12 Jul 2000 04:00:00



We have been trying to ftp files larger than 2 GB to an Ultra 5 running
Solaris 2.7 (32-bit mode), and the transfer keeps failing at the 2 GB mark
with a broken pipe error. According to the man pages, ftp is supposed to be
largefile-aware.

Has anyone seen this before or, better yet, does anyone know of a fix?


ftp >2GB

Post by Roland Main » Wed, 12 Jul 2000 04:00:00



> We have been trying to ftp files larger than 2 GB to an Ultra 5 running
> Solaris 2.7 (32-bit mode), and the transfer keeps failing at the 2 GB mark
> with a broken pipe error. According to the man pages, ftp is supposed to be
> largefile-aware.

> Has anyone seen this before or, better yet, does anyone know of a fix?

Maybe the destination filesystem isn't largefile-aware; see mount_ufs(1M).
If the filesystem isn't mounted with the "largefiles" option, it can only
hold files <= 2 GB. Simply type "mount" and check whether the filesystem in
question is listed like this:
-- snip --
/var on /dev/dsk/c1t3d0s1 read/write/setuid/logging/largefiles on Wed
Jun  7 03:34:05 2000
/export on /dev/dsk/c1t3d0s7 read/write/setuid/largefiles/logging on Wed
Jun  7 03:34:07 2000
-- snip --
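
If it turns out the destination filesystem is mounted without the option, a
UFS filesystem on 2.6 or later can usually be switched over on the fly with
a remount. A rough sketch, assuming the uploads land under /export
(substitute your own mount point, and check mount_ufs(1M) first):

-- snip --
# is "largefiles" already listed for the target filesystem?
mount | grep /export

# enable large-file support on the live mount (UFS, Solaris 2.6+)
mount -F ufs -o remount,largefiles /export

# add "largefiles" to the options field in /etc/vfstab to keep it after a reboot
-- snip --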

----

Bye,
Roland

--
  __ .  . __
  /O /==\ O\  MPEG specialist, C&&JAVA&&Sun&&Unix programmer
 (;O/ \/ \O;) TEL +49 641 99-13193 FAX +49 641 99-41359


ftp >2GB

Post by Toomas Soom » Thu, 13 Jul 2000 04:00:00


: We have been trying to ftp files larger than 2 GB to an Ultra 5 running
: Solaris 2.7 (32-bit mode), and the transfer keeps failing at the 2 GB mark
: with a broken pipe error. According to the man pages, ftp is supposed to be
: largefile-aware.

: Has anyone seen this before or, better yet, does anyone know of a fix?

The ftp command (the client) is largefile-aware, not merely largefile-safe,
so the client should handle files over 2 GB. The broken pipe error makes me
think that your FTP server is not capable of large-file transfers.
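
One way to test that theory is to take the original files out of the picture
and push a synthetic file just past the 2 GB boundary. A rough sketch from a
Solaris sender (mkfile is Solaris-specific; the host name, user and paths are
placeholders, and the local filesystem holding the test file must itself
allow large files):

-- snip --
# sparse test file a little over 2 GB
mkfile -n 2200m /var/tmp/bigtest

ftp -niv <<EOF
open <host>
user <userid> <password>
binary
put /var/tmp/bigtest bigtest
bye
EOF
-- snip --

If this also dies at the 2 GB mark, suspect the remote ftpd or the remote
filesystem rather than the local ftp client.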

toomas
--
Marriage is the waste-paper basket of the emotions.


FTP > 2GB from Solaris 2.6 to 2.5.x

I've experienced problems streaming more than 2 GB of data from a 64-bit
(Sol 2.6) box to a 32-bit (2.5.5 or 2.5.1, not sure) Solaris box.  The data
I would like to transfer is approx 40 GB uncompressed, 14 GB compressed,
stored as multiple files.  I am attempting to stream the entire data
content to a FIFO pipe on the 2.5 box, where an application processes the
data.  The application provides its own support for large files via data
set partitioning.

An example of the problem occurs if I have two files, Big1 and Big2, each
just under 2 GB, and try to ftp them from the 2.6 host to the 2.5 host:

echo "
connect host32
user <userid> <password>
ascii
put - FIFO
`cat Big1 Big2`          # those are backquotes
" | ftp -niv

On host32 I'm running:

cat FIFO > /dev/null

...ftp bombs after several minutes with "No space on device" on the 2.6
box.  Watching /tmp and /var on both boxes shows no appreciable
utilization.

I suspect a 2GB filesize issue.  Not sure if it's the 2.6 ftp process or
the 2.5 ftpd server, or something else.  Hints?
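
In the meantime, one thing I may try to narrow it down is replacing the
plain cat on host32 with something that reports how many bytes actually
arrive before the writer goes away; if the count stops right around
2147483647 bytes, that would point at a 32-bit file-offset limit in one of
the ftp endpoints rather than at the FIFO or the application. A small
sketch of the receiving side (same FIFO as above; I'm assuming dd on 2.5.x
copes, since reading a pipe involves no seeks):

-- snip --
mkfifo FIFO                       # only needed if the FIFO does not already exist
dd if=FIFO of=/dev/null bs=1024k  # prints the number of records copied at EOF
-- snip --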

--

    What part of "gestalt" don't you understand?
    Welchen Teil von "gestalt" verstehen Sie nicht?

web:       http://www.netcom.com/~kmself
SAS/Linux: http://www.netcom.com/~kmself/SAS/SAS4Linux.html    

 12:31pm  up 74 days, 10:00,  3 users,  load average: 1.80, 1.69, 1.56
