Posted to users@wicket.apache.org by Jeffrey Schneller <je...@envisa.com> on 2010/10/12 17:30:34 UTC

robots.txt & wicket urls

How can I add an entry to robots.txt for a page of my site that is not
currently mounted and cannot be made bookmarkable?  The URL is the
Wicket-generated "?wicket:interface" URL.

I guess I could block everything that contains "?wicket:interface",
but that seems like brute force.
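
For reference, the brute-force version would be something like the entry below.  This is only a sketch: wildcard matching in robots.txt is a crawler extension (Google and Bing honor it; the original robots exclusion standard does not), and the exact query-string form of the Wicket URLs may differ.

    User-agent: *
    # block any URL whose query string carries a Wicket interface callback
    Disallow: /*?wicket:interface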

 

I tried using IndexedHybridUrlCodingStrategy and HybridUrlCodingStrategy
to mount the page as a named page.  This works, but if someone changes
the number after the "." in the URL, they may request a page that is no
longer in the pagemap; an exception is thrown and we show our error page.
Is this the only way to do it?
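
For reference, that kind of mount goes in the application's init().  A minimal sketch for Wicket 1.4.x, assuming hypothetical MyApplication, LoginPage and HomePage classes (the real class names may differ):

    import org.apache.wicket.Page;
    import org.apache.wicket.protocol.http.WebApplication;
    import org.apache.wicket.request.target.coding.HybridUrlCodingStrategy;

    public class MyApplication extends WebApplication {
        @Override
        protected void init() {
            super.init();
            // mount the page at /Login; page instances render as /Login.1, /Login.2, ...
            mount(new HybridUrlCodingStrategy("/Login", LoginPage.class));
        }

        @Override
        public Class<? extends Page> getHomePage() {
            return HomePage.class; // whatever the real home page is
        }
    }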

 

Thanks.

 


Re: robots.txt & wicket urls

Posted by Igor Vaynberg <ig...@gmail.com>.
The URL to the user is still /Login; it's just that when they go to it
they are redirected to /Login.<n>.

In your robots.txt, if you want to reference the page, use /Login - this
is the URL the robot has to hit to get to /Login.<n>.

Also, when the user tweaks the <n>, they should simply get a new
instance of the page, not an error.
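
Concretely, assuming the goal is simply to keep crawlers off the login page, the entry could be as small as the sketch below; since Disallow matches by prefix, /Login also covers /Login.1, /Login.2 and so on.

    User-agent: *
    # reference the mount path only; the ".<n>" suffix never needs to appear here
    Disallow: /Login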

-igor

On Tue, Oct 12, 2010 at 9:32 AM, Jeffrey Schneller
<je...@envisa.com> wrote:
> So using the HybridUrlCodingStrategy is correct, and if users want to muck with the URL they can, and they may just get an error?  Then for the robots.txt I just block "XYZ" without the ".".  For example, I have a page called Login that is mounted with the HybridUrlCodingStrategy, so the URL to the user is Login.1 or Login.7 depending on how many times they hit the page.  In the robots.txt I just need to say to ignore the URL of Login?
>
> Can I set the pagemap to store only N pages per session?  I tried getSessionSettings().setMaxPageMaps(1), but what does this really do?  Is it 1 page per session, or 1 of each page per session?
>
> Thanks.
>
>
>
> -----Original Message-----
> From: Igor Vaynberg [mailto:igor.vaynberg@gmail.com]
> Sent: Tuesday, October 12, 2010 12:21 PM
> To: users@wicket.apache.org
> Subject: Re: robots.txt & wicket urls
>
> In your robots.txt, put the URL without the "." - that way a new version
> of the page will get created.
>
> -igor
>
> On Tue, Oct 12, 2010 at 8:30 AM, Jeffrey Schneller
> <je...@envisa.com> wrote:
>> How can I add an entry to robots.txt for a page of my site that is not
>> currently mounted and cannot be made bookmarkable?  The URL is the
>> Wicket-generated "?wicket:interface" URL.
>>
>> I guess I could block everything that contains "?wicket:interface",
>> but that seems like brute force.
>>
>>
>>
>> I tried using IndexedHybridUrlCodingStrategy and HybridUrlCodingStrategy
>> to mount the page as a named page.  This works, but if someone changes
>> the number after the "." in the URL, they may request a page that is no
>> longer in the pagemap; an exception is thrown and we show our error page.
>> Is this the only way to do it?
>>
>>
>>
>> Thanks.
>>
>>
>>
>>
>
>
>



RE: robots.txt & wicket urls

Posted by Jeffrey Schneller <je...@envisa.com>.
So using the HybridUrlCodingStrategy is correct, and if users want to muck with the URL they can, and they may just get an error?  Then for the robots.txt I just block "XYZ" without the ".".  For example, I have a page called Login that is mounted with the HybridUrlCodingStrategy, so the URL to the user is Login.1 or Login.7 depending on how many times they hit the page.  In the robots.txt I just need to say to ignore the URL of Login?

Can I set the pagemap to store only N pages per session?  I tried getSessionSettings().setMaxPageMaps(1), but what does this really do?  Is it 1 page per session, or 1 of each page per session?
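
As far as I can tell from the Wicket 1.4 API, setMaxPageMaps limits the number of page maps a session may hold (least recently used maps are evicted), not the number of pages inside a map, so it is neither of the two readings above.  A sketch of where the call usually lives, assuming a hypothetical WebApplication subclass:

    // in the WebApplication subclass's init(), Wicket 1.4.x
    // cap the session at a single page map; pages inside that map are still
    // stored and versioned by the session store, so this is not "one page per session"
    getSessionSettings().setMaxPageMaps(1);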

Thanks.



-----Original Message-----
From: Igor Vaynberg [mailto:igor.vaynberg@gmail.com] 
Sent: Tuesday, October 12, 2010 12:21 PM
To: users@wicket.apache.org
Subject: Re: robots.txt & wicket urls

In your robots.txt, put the URL without the "." - that way a new version
of the page will get created.

-igor

On Tue, Oct 12, 2010 at 8:30 AM, Jeffrey Schneller
<je...@envisa.com> wrote:
> How can I add an entry to robots.txt for a page of my site that is not
> currently mounted and cannot be made bookmarkable?  The URL is the
> Wicket-generated "?wicket:interface" URL.
>
> I guess I could block everything that contains "?wicket:interface",
> but that seems like brute force.
>
>
>
> I tried using IndexedHybridUrlCodingStrategy and HybridUrlCodingStrategy
> to mount the page as a named page.  This works, but if someone changes
> the number after the "." in the URL, they may request a page that is no
> longer in the pagemap; an exception is thrown and we show our error page.
> Is this the only way to do it?
>
>
>
> Thanks.
>
>
>
>



Re: robots.txt & wicket urls

Posted by Igor Vaynberg <ig...@gmail.com>.
In your robots.txt, put the URL without the "." - that way a new version
of the page will get created.

-igor

On Tue, Oct 12, 2010 at 8:30 AM, Jeffrey Schneller
<je...@envisa.com> wrote:
> How can I add an entry to robots.txt for a page of my site that is not
> currently mounted and cannot be made bookmarkable?  The URL is the
> Wicket-generated "?wicket:interface" URL.
>
> I guess I could block everything that contains "?wicket:interface",
> but that seems like brute force.
>
>
>
> I tried using IndexedHybridUrlCodingStrategy and HybridUrlCodingStrategy
> to mount the page as a named page.  This works, but if someone changes
> the number after the "." in the URL, they may request a page that is no
> longer in the pagemap; an exception is thrown and we show our error page.
> Is this the only way to do it?
>
>
>
> Thanks.
>
>
>
>
