Posted to users@cocoon.apache.org by Alex Romayev <ro...@yahoo.com> on 2004/02/26 18:12:18 UTC

Aggregating unknown sources

I'm trying to parse an HTML page that contains links
to other pages, which I then need to aggregate.

I understand I can develop a pipeline like this:

<map:match pattern="links">
  <map:generate type="html" src="http://foo.org"/>
  <map:transform src="create-links.xsl"/>
  <map:serialize type="xml"/>
</map:match>

This will produce:
<links>
  <link>http://foo.org/page1.html</link>
  <link>http://foo.org/some-other-page.html</link>
  <link>http://foo.org/and-another-page.html</link>
</links>
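
For reference, create-links.xsl just collects the anchors,
roughly like this (a sketch only; whether the html generator's
output carries the XHTML namespace depends on its configuration,
so both cases are matched):

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:xhtml="http://www.w3.org/1999/xhtml">

  <!-- Collect every anchor href from the html generator's output
       into a flat <links> list. -->
  <xsl:template match="/">
    <links>
      <xsl:for-each select="//a[@href] | //xhtml:a[@href]">
        <link><xsl:value-of select="@href"/></link>
      </xsl:for-each>
    </links>
  </xsl:template>

</xsl:stylesheet>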

Now, at this point, I need to aggregate all of the pages
via the "html" generator into one page.

Any ideas?

Thanks,
-Alex

=====
Alex Romayev
Software Architect
http://www.romayev.com
romayev@yahoo.com



Re: Aggregating unknown sources

Posted by Geoff Howard <co...@leverageweb.com>.
Alex Romayev wrote:

>Oh, good point... feel silly now ;-)

:). Don't.

>--- Geoff Howard <co...@leverageweb.com> wrote:
>>Alex Romayev wrote:
>>>[original question snipped]
>>
>>Use a cinclude or xinclude transformer step.
>>
>>Geoff




Re: Aggregating unknown sources

Posted by Alex Romayev <ro...@yahoo.com>.
Oh, good point... feel silly now ;-)

--- Geoff Howard <co...@leverageweb.com> wrote:
> Alex Romayev wrote:
> >[original question snipped]
>
> Use a cinclude or xinclude transformer step.
>
> Geoff

=====
Alex Romayev
Software Architect
http://www.romayev.com
romayev@yahoo.com



Re: Aggregating unknown sources

Posted by Geoff Howard <co...@leverageweb.com>.
Alex Romayev wrote:

>I'm trying to parse an HTML page that contains links
>to other pages, which I then need to aggregate.
>
>[sitemap snippet and sample <links> output snipped]
>
>Now, at this point, I need to aggregate all of the pages
>via the "html" generator into one page.

Use a cinclude or xinclude transformer step.

Geoff
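
Roughly, the idea is to turn each <link> into a cinclude
instruction and let the cinclude transformer pull the pages in.
Untested sketch below; links-to-includes.xsl, the "aggregated"
and "fetch" matchers and the request-param lookup are just
illustrations, and since the targets are HTML the includes go
back through an internal pipeline that runs the html generator
first (verify that request parameters survive the cocoon:
protocol in your setup):

links-to-includes.xsl:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:cinclude="http://apache.org/cocoon/include/1.0">

  <!-- Wrap everything in <pages> and turn each <link> into an
       include that goes back through the sitemap. -->
  <xsl:template match="links">
    <pages>
      <xsl:apply-templates select="link"/>
    </pages>
  </xsl:template>

  <xsl:template match="link">
    <cinclude:include
      src="{concat('cocoon:/fetch?url=', normalize-space(.))}"/>
  </xsl:template>

</xsl:stylesheet>

and in the sitemap:

<map:match pattern="aggregated">
  <map:generate type="html" src="http://foo.org"/>
  <map:transform src="create-links.xsl"/>
  <map:transform src="links-to-includes.xsl"/>
  <map:transform type="cinclude"/>
  <map:serialize type="xml"/>
</map:match>

<!-- Internal helper: tidy one linked HTML page into XML.
     Assumes the request-param input module is available. -->
<map:match pattern="fetch">
  <map:generate type="html" src="{request-param:url}"/>
  <map:serialize type="xml"/>
</map:match>

If the linked pages were already well-formed XML you could drop
the fetch matcher and point the include src straight at the http
URL. Either way, watch out for URLs that need escaping before
they go into the query string.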
