Posted to rampart-dev@ws.apache.org by "Jean Marc (JIRA)" <ji...@apache.org> on 2010/06/04 21:12:54 UTC

[jira] Created: (AXIS2-4731) Inefficient inputstream reading in JSONDataSource

Inefficient inputstream reading in JSONDataSource
-------------------------------------------------

                 Key: AXIS2-4731
                 URL: https://issues.apache.org/jira/browse/AXIS2-4731
             Project: Axis2
          Issue Type: Sub-task
          Components: modules
    Affects Versions: 1.5.1
            Reporter: Jean Marc


I am experiencing 100% CPU in InputStream.read() on 30 KB of JSON data from JSONDataSource.
Reading one byte at a time with InputStream.read() is highly inefficient since we don't know the size of the JSON data up front;
a reasonably sized char buffer should be used instead, to cut down on CPU usage and to reduce the number of calls to StringBuilder.append().
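
For context, the old pattern looks roughly like this (a hypothetical sketch of the byte-at-a-time loop described above, not the actual JSONDataSource code; jsonInputStream is the same stream used in the snippet further down):

            // Hypothetical sketch of the byte-at-a-time pattern: each read()
            // call returns a single byte, and the int-to-char cast assumes
            // one byte == one character.
            StringBuilder sb = new StringBuilder();
            int b;
            while ((b = jsonInputStream.read()) != -1) {
                sb.append((char) b); // corrupts multibyte UTF-8 sequences
            }
            String json = sb.toString();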

Note also that, regardless of the solution, a charset conversion has to happen between an InputStream and a Reader (or a String, for that matter), and that the int-to-char cast in the old code will corrupt a stream containing multibyte characters. Since JSON data is usually sent in UTF-8, we could hardcode the charset in the InputStreamReader to avoid problems with the platform default encoding on Windows:

            // Decode the stream as UTF-8 and read it through a fixed-size
            // char buffer instead of one byte at a time.
            BufferedReader in = new BufferedReader(new InputStreamReader(jsonInputStream, "UTF-8"));
            StringBuilder sb = new StringBuilder(512);
            char[] tempBuf = new char[512];
            int readLen;

            while ((readLen = in.read(tempBuf)) != -1) {
                sb.append(tempBuf, 0, readLen);
            }
            tempBuf = null; // release the buffer for garbage collection
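
To illustrate the corruption mentioned above (a hypothetical example with made-up sample data; exception handling omitted, as in the snippet above): a character such as "é" is two bytes in UTF-8, and appending each byte as a char produces two mangled characters instead of one:

            // Hypothetical illustration: "é" (U+00E9) encodes to the two
            // UTF-8 bytes 0xC3 0xA9.
            byte[] utf8 = "é".getBytes("UTF-8");
            StringBuilder corrupted = new StringBuilder();
            for (byte b : utf8) {
                corrupted.append((char) (b & 0xFF)); // byte-by-byte cast yields "Ã©"
            }
            String decoded = new String(utf8, "UTF-8"); // proper decoding yields "é"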

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


---------------------------------------------------------------------
To unsubscribe, e-mail: java-dev-unsubscribe@axis.apache.org
For additional commands, e-mail: java-dev-help@axis.apache.org