
The following is an archive of a post made to our 'vox-tech mailing list' by one of its subscribers.

Re: [vox-tech] Perl Net::HTTP Content-encoding/gzip support broken...



On Mon, Mar 24, 2003 at 01:41:02AM -0500, Mike Simons wrote:
>   Trying to use Net::HTTP to pull compressed web content but it appears
> to be broken.
[...]
> - Know of any perl HTTP modules that really handle compressed content?

  Bleh... I patched up the HTTP module to request and handle
content-encoded data streams.  Sent patches to the libwww-perl list,
but it's not really clean yet.

... Everything is a mess.
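
  For reference, the request side with stock Net::HTTP looks roughly
like this (www.example.com is just a placeholder); the patches are
about what happens afterwards, since stock Net::HTTP hands the body
back still compressed:

    use Net::HTTP;

    # ask for gzip; stock Net::HTTP will not decode the response body
    my $s = Net::HTTP->new(Host => "www.example.com")
        or die "connect failed: $@";
    $s->write_request(GET => "/", 'Accept-Encoding' => 'gzip');
    my ($code, $mess, %h) = $s->read_response_headers;

    my $body = '';
    while (1) {
        my $buf;
        my $n = $s->read_entity_body($buf, 4096);
        die "read failed: $!" unless defined $n;
        last unless $n;
        $body .= $buf;   # raw gzip bytes if the server honored us
    }
    # check %h for Content-Encoding to see if $body needs inflating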

1) "mod_gzip" requires configuration before it actually compresses
   anything, despite what the Debian package maintainer says in his
   readme.  In this configuration you basically need to list every
   filetype/uri/path/mimetype that you want compressed, or not
   compressed... so you end up with a huge list (don't compress my
   jpgs, mp3s, pngs, mpegs, etc); see the config sketch after this
   list.

2) "mod_gzip" only sends 'gzip' style Content-encoding, even though
   gzip and deflate use the exact same compression algorithm... gzip
   just puts a slightly different wrapper around the compressed bytes
   (*: I think this is the case but haven't verified it yet)... what's
   really bizarre is that the gzip format records the *length* of the
   uncompressed data, in the trailer at the very end of the stream, so
   you can't check it until you have *streamed* all the way through.

3) Compress::Zlib doesn't provide obvious hooks to do block-by-block
   inflation of data... from the published API it looks like you have
   to have the whole data stream in hand.  Although poorly documented,
   there is a way to do block-by-block decompression (see the inflate
   sketch after this list)...

4) I wasn't paying enough attention to details and wasted two hours
   before I could *see* that the gzip header had to be removed before
   feeding the data to the inflate function call.  Then I spent more
   time figuring out that I had to keep track of "bytes pending on the
   socket" versus "bytes already decompressed in the local buffer".

5) The libwww-perl authors started work on a revised API which would
   handle all of this much better, plus HTTP/1.1 support, but the most
   *recent* snapshot is code from 1998.

> - Know of a good documentation source detailing 'Content-encoding'
>   data flow over HTTP?

http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html

  This is a really handy RFC; section 14 has the HTTP/1.1 header field
definitions, including Content-Encoding and Accept-Encoding.

    Later,
      Mike
_______________________________________________
vox-tech mailing list
vox-tech@lists.lugod.org
http://lists.lugod.org/mailman/listinfo/vox-tech


