Posted to dev@jspwiki.apache.org by "Janne Jalkanen (JIRA)" <ji...@apache.org> on 2009/04/28 16:40:30 UTC
[jira] Resolved: (JSPWIKI-527) Parsing of WikiDocument cuts too long lines/paragraphs
[ https://issues.apache.org/jira/browse/JSPWIKI-527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Janne Jalkanen resolved JSPWIKI-527.
------------------------------------
Resolution: Won't Fix
This is a feature, not a bug. Lines that are too long may cause memory management problems, so we filter out any really large lines.
A simple workaround is to emit a newline at suitable points in your HTML.
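The workaround above can be sketched as a small helper that a page filter could run over its generated HTML before returning it. This is a hedged sketch only: `insertSoftBreaks` and `SoftBreaks` are hypothetical names, not part of the JSPWiki API, and the 10000-character limit is taken from the report below.

```java
// Hedged sketch: insert soft newlines into long HTML output so that no
// single line approaches the parser's cut-off. A '\n' between tags is
// harmless in HTML, so we only break immediately after a closing '>'.
// SoftBreaks/insertSoftBreaks are hypothetical helper names, not JSPWiki API.
public final class SoftBreaks {

    // Limit observed in JSPWIKI-527; we break well before reaching it.
    private static final int MAX_LINE = 10000;

    public static String insertSoftBreaks(String html) {
        StringBuilder out = new StringBuilder(html.length() + 16);
        int sinceBreak = 0; // characters since the last newline
        for (int i = 0; i < html.length(); i++) {
            char c = html.charAt(i);
            out.append(c);
            sinceBreak = (c == '\n') ? 0 : sinceBreak + 1;
            // Once the current line grows past half the limit, emit a
            // newline at the next tag boundary.
            if (c == '>' && sinceBreak > MAX_LINE / 2) {
                out.append('\n');
                sinceBreak = 0;
            }
        }
        return out.toString();
    }
}
```

A filter would call this on its HTML output as a last step; content with no `>` at all (e.g. one enormous text run) would still need a break inserted by other means.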
> Parsing of WikiDocument cuts too long lines/paragraphs
> ------------------------------------------------------
>
> Key: JSPWIKI-527
> URL: https://issues.apache.org/jira/browse/JSPWIKI-527
> Project: JSPWiki
> Issue Type: Bug
> Components: Core & storage
> Affects Versions: 2.8.1
> Environment: Win XP & Linux
> Reporter: Jochen Reutelshoefer
>
> After the filters are run, the WikiDocument is re-parsed if a filter changed some data. If there is overly long content without line breaks, the paragraph gets cut (> 10000 characters).
> I have a PageFilter that creates (longer) HTML output. If I don't put in any line breaks, it always gets trimmed to the same length, destroying the HTML structure of the page.
> This might have something to do with the algorithm parsing the tree of Content objects and/or the data structures used.
--
This message is automatically generated by JIRA.