Posted to modperl@perl.apache.org by Dermot Paikkos <de...@sciencephoto.co.uk> on 2005/05/03 15:29:39 UTC
Refresh referring page
Hi,
MP2 RC5, Template, Image::Magick
I hope this is not off topic; apologies if it is.
I have a perl script written as a handler. It scans a dir for image
files and offers the user a chance to convert them from tiff to jpeg.
As there can be lots of files of some size, it can take some time for
the process to return.
What I wanted was to loop through the files and refresh the page as
each file was processed. So I had something like;
$r = shift if $ENV{MOD_PERL};
my $tt = Template->new...

foreach my $f (@files) {
    my @files_processed;
    ...snip
    push(@files_processed,$f);

    my $vars = {
        files => \@files_processed,
    };
    $r->content_type('text/html');
    $r->headers_out;
    $tt->process($tt_file,$vars)
        || die $tt->error;
} # End of foreach

# $r->headers_out;
I thought this would re-send the $vars and headers_out until the list
was exhausted, but in practice what I got was the page repeated for
each file. E.g. if there are 4 files I get 4 <html></html> and a messy
looking page.
I am not sure what I am doing wrong. If I move the headers_out
outside the foreach loop I get the array contents, but I still get
(size of @files_processed) x <html> tags rather than one nice
page, and I still have to wait for the whole process to complete.
I imagine I am going to have to take another approach but can't think
of one. Does anyone know how to refresh a referring page, or loop in
the way I described?
Thanx in advance.
Dp.
Re: Refresh referring page
Posted by Issac Goldstand <ma...@beamartyr.net>.
Dermot Paikkos wrote:
> On 3 May 2005 at 17:11, Issac Goldstand wrote:
>
>
>>Is there any particular reason why you must split it into 4 pages?
>
> Three reasons: I want it to appear as if the page is refreshing on
> its own; I thought a large batch of, say, 30 x 50MB tiffs would cause
> the browser to time out, or give the user the impression the process
> was 'dead'; and it was a simple way to pass lists (the original list of
> files, those processed, and the results) to the page.
>
>
>>Why can't you do something like:
>>
>>local $|=1;
>>$r->headers_out;
>>print $tt_header;
>>foreach my $f (@files) {
>> ... process file ...
>> print $tt_file_info($f);
>>}
>>print $tt_footer;
>>
>>The idea being do everything in 1 single page. Split the template
>>into a header, a footer and information for each processed image, and
>>just loop the per-picture content instead of looping an entire page.
>
>
> This looks like it might work if I throw away Template::Toolkit. I
> don't think it can be configured to just output a header, loop and
> output a footer but I could be wrong. It might be worth a try if I
> could reproduce the template I have with some other templating system
> - or forget templating altogether for this handler.
>
>
>>The only other way I can think of to do this would be to open a second
>>window which calls a second handler which can share the information of
>>the first response handler, via shared memory, or a shared cache, or
>>whatever (or to move the "work" into a cleanup handler and use the
>>original window with a second handler who can share information with
>>the cleanup stuff - but I don't know if you can delay reading POST
>>information beyond the response handler....)
>
>
> This looks complicated. I was hoping for something in the HTTP
> headers that I could use that might ask for a new page if the
> existing one was older than 30 secs.
There is. Use a refresh header.
$r->header_out('Refresh'=>'30;url=http://my.site.com/loopscript');
The problem is that your handler is supposed to generate 1 page, and it
won't be done until you finish processing all that info.
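One way around that (a sketch only -- read_todo_list(), write_todo_list(),
convert_one() and the URL are all made-up names here) is to convert a single
file per request and let the Refresh header bring the browser back until the
list is empty. In mod_perl 2 the header goes into the headers_out table:

use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestIO ();
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;
    my @todo = read_todo_list();   # hypothetical: files still to convert

    if (@todo) {
        my $f = shift @todo;
        convert_one($f);           # hypothetical: tiff -> jpeg for one file
        write_todo_list(@todo);
        # ask the browser to come back for the next file
        $r->headers_out->set('Refresh' => '1;url=http://my.site.com/loopscript');
    }

    $r->content_type('text/html');
    $r->print(@todo ? scalar(@todo) . " file(s) still to go...\n"
                    : "All done.\n");
    return Apache2::Const::OK;
}

The work list has to live somewhere between requests (a file, a session,
shared memory); that part is deliberately left vague above.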
Issac
Re: Refresh referring page
Posted by Dermot Paikkos <de...@sciencephoto.co.uk>.
On 3 May 2005 at 17:11, Issac Goldstand wrote:
> Is there any particular reason why you must split it into 4 pages?
Three reasons: I want it to appear as if the page is refreshing on
its own; I thought a large batch of, say, 30 x 50MB tiffs would cause
the browser to time out, or give the user the impression the process
was 'dead'; and it was a simple way to pass lists (the original list of
files, those processed, and the results) to the page.
> Why can't you do something like:
>
> local $|=1;
> $r->headers_out;
> print $tt_header;
> foreach my $f (@files) {
>     ... process file ...
>     print $tt_file_info($f);
> }
> print $tt_footer;
>
> The idea being do everything in 1 single page. Split the template
> into a header, a footer and information for each processed image, and
> just loop the per-picture content instead of looping an entire page.
This looks like it might work if I throw away Template::Toolkit. I
don't think it can be configured to just output a header, loop and
output a footer but I could be wrong. It might be worth a try if I
could reproduce the template I have with some other templating system
- or forget templating altogether for this handler.
> The only other way I can think of to do this would be to open a second
> window which calls a second handler which can share the information of
> the first response handler, via shared memory, or a shared cache, or
> whatever (or to move the "work" into a cleanup handler and use the
> original window with a second handler who can share information with
> the cleanup stuff - but I don't know if you can delay reading POST
> information beyond the response handler....)
This looks complicated. I was hoping for something in the HTTP
headers that I could use that might ask for a new page if the
existing one was older than 30 secs.
I might have a look at the method above.
> Issac
Thanx.
~~
Dermot Paikkos * dermot@sciencephoto.com
Network Administrator @ Science Photo Library
Phone: 0207 432 1100 * Fax: 0207 286 8668
Re: Refresh referring page
Posted by Issac Goldstand <ma...@beamartyr.net>.
Is there any particular reason why you must split it into 4 pages? Why
can't you do something like:
local $|=1;
$r->headers_out;
print $tt_header;
foreach my $f (@files) {
    ... process file ...
    print $tt_file_info($f);
}
print $tt_footer;
The idea being to do everything in one single page. Split the template
into a header, a footer, and information for each processed image, and
just loop the per-picture content instead of looping an entire page.
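Template Toolkit itself can do that -- process() can be called once per
fragment. A rough sketch (header.tt, row.tt, footer.tt and convert_one()
are made-up names):

use strict;
use warnings;
use Template;

local $| = 1;   # unbuffer, so each row reaches the browser as it is done
my $tt = Template->new({ INCLUDE_PATH => '/path/to/templates' });

$tt->process('header.tt') || die $tt->error;
foreach my $f (@files) {
    convert_one($f);   # hypothetical tiff -> jpeg conversion
    $tt->process('row.tt', { file => $f }) || die $tt->error;
}
$tt->process('footer.tt') || die $tt->error;

With no output argument, process() prints to STDOUT, which is what you
want inside a handler with buffering turned off.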
The only other way I can think of to do this would be to open a second
window which calls a second handler which can share the information of
the first response handler, via shared memory, or a shared cache, or
whatever (or to move the "work" into a cleanup handler and use the
original window with a second handler who can share information with the
cleanup stuff - but I don't know if you can delay reading POST
information beyond the response handler....)
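The cleanup-handler variant might look something like this (a sketch
only; do_conversions() and how its results get shared are left
hypothetical):

# in the response handler: answer quickly, defer the heavy work
# until after the response has gone out to the client
use Apache2::RequestUtil ();
use Apache2::Const -compile => qw(OK);

$r->push_handlers(PerlCleanupHandler => sub {
    my $r = shift;
    do_conversions(@files);   # hypothetical: runs after the page is sent
    return Apache2::Const::OK;
});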
Issac