Posted to commits@cassandra.apache.org by "Shotaro Kamio (JIRA)" <ji...@apache.org> on 2011/02/18 03:54:15 UTC

[jira] Created: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

json2sstable fails due to OutOfMemory
-------------------------------------

                 Key: CASSANDRA-2189
                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
             Project: Cassandra
          Issue Type: Bug
          Components: Tools
    Affects Versions: 0.7.2
         Environment: linux
            Reporter: Shotaro Kamio
            Priority: Minor


I have a json file created with sstable2json from a column family of super column type. It is about 1.9GB. (It's a dump of all keys, because I could not find a way to tell sstable2json which keys to dump.)
When I tried to create an sstable from the json file, it failed with an OutOfMemoryError as follows.

 WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
        at java.lang.String.intern(Native Method)
        at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
        at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
        at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
        at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
        at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
        at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
        at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
        at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
        at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
        at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
        at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
        at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
        at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
        at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
        at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
        at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
        at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
        at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
        at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)

So what I had to do was split the json file with the "split" command, fix each piece so it was a valid json file, and create an sstable from each of the small files.

Could you change json2sstable to avoid OutOfMemory?
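The stack trace shows where the memory goes: Jackson's InternCache calls String.intern() on every distinct JSON field name, and in this dump the field names are row keys, so they are effectively unbounded; on the pre-Java-8 JVMs of this era, interned strings live in PermGen. A minimal stdlib-Java sketch of that mechanism (illustrative only, not the Cassandra patch; the "row-key-NNNN" names are made up):

```java
// Why interning every JSON field name is dangerous when the names are
// row keys: intern() canonicalizes each distinct value for the life of
// the JVM, so an unbounded name vocabulary grows the intern table
// (PermGen, before Java 8) without bound.
public class InternDemo {
    public static void main(String[] args) {
        // intern() returns one canonical instance per distinct value...
        String a = new String("row-key-0001").intern();
        String b = new String("row-key-0001").intern();
        System.out.println(a == b);  // true: same canonical reference

        // ...which is a win for a small, fixed set of field names, but
        // each *distinct* name is retained permanently. Run over millions
        // of unique row keys, this access pattern is what filled PermGen.
        for (int i = 0; i < 1000; i++) {
            ("row-key-" + i).intern();
        }
        System.out.println("interned 1000 distinct names");
    }
}
```

The win of interning (cheap `==` comparison of repeated names) only pays off when the set of names is small and fixed, which JSON object keys usually are — but not here.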

-- 
This message is automatically generated by JIRA.
-
For more information on JIRA, see: http://www.atlassian.com/software/jira

        

[jira] [Updated] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (Updated) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis updated CASSANDRA-2189:
--------------------------------------

    Attachment: 2189-2.txt

Patch attached to move configuration to the factory.
                

[jira] [Updated] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (Updated) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis updated CASSANDRA-2189:
--------------------------------------

    Affects Version/s:     (was: 0.7.0)
        Fix Version/s:     (was: 0.7.3)
                       0.8.9
    

[jira] [Resolved] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (Resolved) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis resolved CASSANDRA-2189.
---------------------------------------

    Resolution: Fixed

Customer testing indicates that this is a big improvement, especially when combined with an upgrade to Jackson 1.9.2.  I'll commit this patch to 0.8.9 and 1.0.6, and upgrade Jackson in trunk.
                

[jira] [Reopened] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (Reopened) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis reopened CASSANDRA-2189:
---------------------------------------


This didn't actually fix the problem.  Tatu again:

"This calls configure on parser; but at that point the symbol table has already been created. So configure must be called on the factory first (and only need to be called once, really), and then settings will be passed as expected."
                

[jira] Commented: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12998616#comment-12998616 ] 

Hudson commented on CASSANDRA-2189:
-----------------------------------

Integrated in Cassandra-0.7 #313 (See [https://hudson.apache.org/hudson/job/Cassandra-0.7/313/])
    turnoff string interning in json2sstable
patch by jbellis for CASSANDRA-2189



[jira] Commented: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12996224#comment-12996224 ] 

Jonathan Ellis commented on CASSANDRA-2189:
-------------------------------------------

That kind of looks like a jackson bug to me.  I'll ask Tatu.


[jira] Updated: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis updated CASSANDRA-2189:
--------------------------------------

    Attachment: 2189.txt

patch to disable interning.  (Thanks to Tatu for pointing me at the right Feature.)


[jira] Updated: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis updated CASSANDRA-2189:
--------------------------------------

    Remaining Estimate: 1h
     Original Estimate: 1h

> json2sstable fails due to OutOfMemory
> -------------------------------------
>
>                 Key: CASSANDRA-2189
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
>             Project: Cassandra
>          Issue Type: Bug
>          Components: Tools
>    Affects Versions: 0.7.0
>         Environment: linux
>            Reporter: Shotaro Kamio
>            Assignee: Jonathan Ellis
>            Priority: Minor
>             Fix For: 0.7.3
>
>         Attachments: 2189.txt
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> I have a json file created with sstable2json for a column family of super column type. Its size is about 1.9GB. (It's a dump of all keys because I cannot find out how to specify keys to dump in sstable2json.)
> When I tried to create sstable from the json file, it failed with OutOfMemoryError as follows.
>  WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
> Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
>         at java.lang.String.intern(Native Method)
>         at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
>         at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
>         at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
>         at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
>         at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
>         at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
>         at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
>         at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
>         at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
>         at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
>         at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
>         at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
>         at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)
> As a workaround, I had to split the json file with the "split" command, edit each piece back into a valid json file, and then create an sstable from each small file.
> Could you change json2sstable so that it avoids running out of memory?


        

[jira] Assigned: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis reassigned CASSANDRA-2189:
-----------------------------------------

    Assignee: Jonathan Ellis

> json2sstable fails due to OutOfMemory
> -------------------------------------
>
>                 Key: CASSANDRA-2189
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
>             Project: Cassandra
>          Issue Type: Bug
>          Components: Tools
>    Affects Versions: 0.7.2
>         Environment: linux
>            Reporter: Shotaro Kamio
>            Assignee: Jonathan Ellis
>            Priority: Minor
>         Attachments: 2189.txt
>
>
> I have a json file created with sstable2json for a column family of super column type. It is about 1.9 GB. (It is a dump of all keys, since I could not find a way to tell sstable2json to dump only specific keys.)
> When I tried to create an sstable from this json file, it failed with an OutOfMemoryError, as follows.
>  WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
> Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
>         at java.lang.String.intern(Native Method)
>         at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
>         at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
>         at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
>         at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
>         at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
>         at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
>         at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
>         at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
>         at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
>         at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
>         at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
>         at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
>         at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)
> As a workaround, I had to split the json file with the "split" command, edit each piece back into a valid json file, and then create an sstable from each small file.
> Could you change json2sstable so that it avoids running out of memory?


        

[jira] [Updated] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (Updated) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Ellis updated CASSANDRA-2189:
--------------------------------------

    Fix Version/s: 1.0.6
    
> json2sstable fails due to OutOfMemory
> -------------------------------------
>
>                 Key: CASSANDRA-2189
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
>             Project: Cassandra
>          Issue Type: Bug
>          Components: Tools
>         Environment: linux
>            Reporter: Shotaro Kamio
>            Assignee: Jonathan Ellis
>            Priority: Minor
>             Fix For: 0.8.9, 1.0.6
>
>         Attachments: 2189-2.txt, 2189.txt
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> I have a json file created with sstable2json for a column family of super column type. It is about 1.9 GB. (It is a dump of all keys, since I could not find a way to tell sstable2json to dump only specific keys.)
> When I tried to create an sstable from this json file, it failed with an OutOfMemoryError, as follows.
>  WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
> Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
>         at java.lang.String.intern(Native Method)
>         at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
>         at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
>         at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
>         at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
>         at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
>         at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
>         at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
>         at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
>         at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
>         at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
>         at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
>         at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
>         at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)
> As a workaround, I had to split the json file with the "split" command, edit each piece back into a valid json file, and then create an sstable from each small file.
> Could you change json2sstable so that it avoids running out of memory?


        

[jira] Commented: (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Jonathan Ellis (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12997567#comment-12997567 ] 

Jonathan Ellis commented on CASSANDRA-2189:
-------------------------------------------

Shotaro, can you test the patch?

> json2sstable fails due to OutOfMemory
> -------------------------------------
>
>                 Key: CASSANDRA-2189
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
>             Project: Cassandra
>          Issue Type: Bug
>          Components: Tools
>    Affects Versions: 0.7.0
>         Environment: linux
>            Reporter: Shotaro Kamio
>            Assignee: Jonathan Ellis
>            Priority: Minor
>             Fix For: 0.7.3
>
>         Attachments: 2189.txt
>
>
> I have a json file created with sstable2json for a column family of super column type. It is about 1.9 GB. (It is a dump of all keys, since I could not find a way to tell sstable2json to dump only specific keys.)
> When I tried to create an sstable from this json file, it failed with an OutOfMemoryError, as follows.
>  WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
> Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
>         at java.lang.String.intern(Native Method)
>         at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
>         at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
>         at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
>         at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
>         at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
>         at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
>         at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
>         at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
>         at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
>         at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
>         at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
>         at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
>         at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)
> As a workaround, I had to split the json file with the "split" command, edit each piece back into a valid json file, and then create an sstable from each small file.
> Could you change json2sstable so that it avoids running out of memory?


        

[jira] [Commented] (CASSANDRA-2189) json2sstable fails due to OutOfMemory

Posted by "Hudson (Commented) (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/CASSANDRA-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13165349#comment-13165349 ] 

Hudson commented on CASSANDRA-2189:
-----------------------------------

Integrated in Cassandra-0.8 #413 (See [https://builds.apache.org/job/Cassandra-0.8/413/])
    turn off string interning in json2sstable, take 2
patch by jbellis; tested by George Ciubotaru for CASSANDRA-2189

jbellis : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1211976
Files : 
* /cassandra/branches/cassandra-0.8
* /cassandra/branches/cassandra-0.8/CHANGES.txt
* /cassandra/branches/cassandra-0.8/src/java/org/apache/cassandra/tools/SSTableImport.java

                
> json2sstable fails due to OutOfMemory
> -------------------------------------
>
>                 Key: CASSANDRA-2189
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-2189
>             Project: Cassandra
>          Issue Type: Bug
>          Components: Tools
>         Environment: linux
>            Reporter: Shotaro Kamio
>            Assignee: Jonathan Ellis
>            Priority: Minor
>             Fix For: 0.8.9, 1.0.6
>
>         Attachments: 2189-2.txt, 2189.txt
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> I have a json file created with sstable2json for a column family of super column type. It is about 1.9 GB. (It is a dump of all keys, since I could not find a way to tell sstable2json to dump only specific keys.)
> When I tried to create an sstable from this json file, it failed with an OutOfMemoryError, as follows.
>  WARN 00:31:58,595 Schema definitions were defined both locally and in cassandra.yaml. Definitions in cassandra.yaml were ignored.
> Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
>         at java.lang.String.intern(Native Method)
>         at org.codehaus.jackson.util.InternCache.intern(InternCache.java:40)
>         at org.codehaus.jackson.sym.BytesToNameCanonicalizer.addName(BytesToNameCanonicalizer.java:471)
>         at org.codehaus.jackson.impl.Utf8StreamParser.addName(Utf8StreamParser.java:893)
>         at org.codehaus.jackson.impl.Utf8StreamParser.findName(Utf8StreamParser.java:773)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseLongFieldName(Utf8StreamParser.java:379)
>         at org.codehaus.jackson.impl.Utf8StreamParser.parseMediumFieldName(Utf8StreamParser.java:347)
>         at org.codehaus.jackson.impl.Utf8StreamParser._parseFieldName(Utf8StreamParser.java:304)
>         at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:140)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.mapObject(UntypedObjectDeserializer.java:93)
>         at org.codehaus.jackson.map.deser.UntypedObjectDeserializer.deserialize(UntypedObjectDeserializer.java:65)
>         at org.codehaus.jackson.map.deser.MapDeserializer._readAndBind(MapDeserializer.java:197)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:145)
>         at org.codehaus.jackson.map.deser.MapDeserializer.deserialize(MapDeserializer.java:23)
>         at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:1261)
>         at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:517)
>         at org.codehaus.jackson.JsonParser.readValueAs(JsonParser.java:897)
>         at org.apache.cassandra.tools.SSTableImport.importUnsorted(SSTableImport.java:208)
>         at org.apache.cassandra.tools.SSTableImport.importJson(SSTableImport.java:197)
>         at org.apache.cassandra.tools.SSTableImport.main(SSTableImport.java:421)
> As a workaround, I had to split the json file with the "split" command, edit each piece back into a valid json file, and then create an sstable from each small file.
> Could you change json2sstable so that it avoids running out of memory?
