Linux Users' Group of Davis

The following is an archive of a post made to our 'vox-tech mailing list' by one of its subscribers.


Re: [vox-tech] Decompressing large files

  • Subject: Re: [vox-tech] Decompressing large files
  • From: Henry House <hajhouse@houseag.com>
  • Date: Mon, 30 Oct 2000 14:18:07 -0800
  • References: 39FDF3AC.82047452@sagresdiscovery.com

On Mon, Oct 30, 2000 at 02:18:20PM -0800, Eric Engelhard wrote:
> I am decompressing large (>1GB compressed) genome sequence files. I have
> plenty of free space on a fast SCSI disk and 256MB of RAM. When I tried to
> use "uncompress filename.Z" or "gzip -d filename.Z" I got a write error
> each time. The solution (well, MY solution) was to use the "-c" option
> with uncompress and pipe the output to a specified file. Does anyone
> know enough about uncompress to tell me why this solves the problem?
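For reference, the workaround described in the quoted message amounts to decompressing to standard output and redirecting it to a file yourself, rather than letting the tool create the output file. A rough sketch, with filename.Z standing in for the actual sequence file:

    # Invocations that produced the write errors:
    uncompress filename.Z
    gzip -d filename.Z

    # Workaround: decompress to stdout and redirect to a named file
    uncompress -c filename.Z > filename
    # gzip can do the same with -dc (or via zcat)
    gzip -dc filename.Z > filename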

Is it possible that gzip is attempting to create a temporary file that
exceeds 2 GB in size? Files are generally limited to 2 GB or less on 32-bit
systems without large-file support.
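
One way to check this hypothesis (assuming GNU zcat, which also reads .Z files) is to count the uncompressed bytes without writing anything to disk and see whether the total crosses the 2 GB (2^31 byte) mark:

    # Stream the decompressed data and count the bytes; nothing is written to disk
    zcat filename.Z | wc -c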

Henry House
OpenPGP key available from http://hajhouse.org/hajhouse.asc
