The following is an archive of a post made to our 'vox-tech mailing list' by one of its subscribers.


Re: [vox-tech] Gotta question



Thanks Jeff,

that worked like a charm.  As an aside, I was using "strict", but I can't use
the line:

open( SCRIPTINPUT, "ksh_script |" ) or die "could not start ksh_script";

because "SCRIPTINPUT" is a bareword.  So instead I used "strict 'vars'", and
it worked.  But how would I define the above line if I wanted to continue
with "strict"?

Jay
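
(A sketch of one common answer, assuming Perl 5.6 or later: pass a lexical
scalar to open() instead of a bareword handle, so there is nothing for
"strict" to reject.)

use strict;

# The handle lives in an ordinary "my" variable rather than a bareword,
# which is legal under full "strict".
open( my $scriptinput, "ksh_script |" ) or die "could not start ksh_script";
print while <$scriptinput>;
close $scriptinput;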

----- Original Message -----
From: <jdnewmil@dcn.davis.ca.us>
To: <vox-tech@franz.mother.com>
Sent: Thursday, August 16, 2001 8:46 PM
Subject: Re: [vox-tech] Gotta question


> On Thu, 16 Aug 2001, Sam Peterson wrote:
>
> > Set $| = 1 at the beginning of the program.
> >
> > It forces flushing of the output buffers.
>
> But it doesn't prevent the backticks from collecting all of the output
> from the script in RAM before passing it to the output buffers.
>
> Try this:
>
> open( SCRIPTINPUT, "ksh_script |" ) or die "could not start ksh_script";
> print while <SCRIPTINPUT>;
> close SCRIPTINPUT;
>
> > At 04:42 PM 8/16/2001 -0500, Jay Strauss wrote:
> >
> > >Fancy this, I gotta new problem
> > >
> > >I'm calling a ksh script within my perl program.  The ksh script
> > >basically dumps a ton (multiple Gb) of data to stdout.  I'd like my
> > >perl program to dump it to stdout too.  But if I do a:
> > >
> > >print `ksh_script`;
> > >
> > >The data doesn't get written to stdout till the ksh script completes,
> > >as opposed to if I execute the ksh script from the cmd line like:
> > >
> > ># ksh_script > out
> > >
> > >If I call the ksh script within my perl program I figure if I have to
> > >dump multiple Gb of data I'll run out of memory long before the ksh
> > >script completes.  Is there a different way to call the ksh script
> > >from within perl so that the output of the ksh goes straight to
> > >stdout?  So that I can do:
> > >
> > ># perl_script > out
> > >
> > >without the memory requirements
> > >
> > >Thanks
> > >Jay
> > >
> > >Jay Strauss
> > >jjstrauss@yahoo.com
> > >
> > >
> >
> > Sam Peterson
> > Hart Interdisciplinary Programs
> > 2201 Hart Hall
> > University of California, Davis
> > One Shields Avenue
> > Davis, California 95616
> > (530) 752-9332
> >
>
> ---------------------------------------------------------------------------
> Jeff Newmiller                        The     .....       .....  Go Live...
> DCN:<jdnewmil@dcn.davis.ca.us>        Basics: ##.#.       ##.#.  Live Go...
>                                       Live:   OO#.. Dead: OO#..  Playing
> Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
> /Software/Embedded Controllers)               .OO#.       .OO#.  rocks...2k
> ---------------------------------------------------------------------------


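(For completeness, a sketch of a different route the thread doesn't take,
under the same assumptions: if the perl program doesn't need to touch the
data itself, system() runs ksh_script as a child that inherits STDOUT, so
the output streams straight through without being buffered in perl.)

use strict;

$| = 1;    # setting $| forces a flush of anything perl has already buffered

# The child process writes to the same STDOUT as the perl script,
# so multi-Gb output never accumulates in perl's memory.
system("ksh_script") == 0 or die "ksh_script failed: $?";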

