Posted to issues@carbondata.apache.org by "Kunal Kapoor (Jira)" <ji...@apache.org> on 2020/05/15 17:48:00 UTC

[jira] [Resolved] (CARBONDATA-3815) Insert into table select from another table throws exception for spatial tables

     [ https://issues.apache.org/jira/browse/CARBONDATA-3815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kunal Kapoor resolved CARBONDATA-3815.
--------------------------------------
    Fix Version/s: 2.0.0
       Resolution: Fixed

> Insert into table select from another table throws exception for spatial tables
> -------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-3815
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-3815
>             Project: CarbonData
>          Issue Type: Bug
>          Components: core, spark-integration
>    Affects Versions: 2.0.0
>            Reporter: Venugopal Reddy K
>            Priority: Major
>             Fix For: 2.0.0
>
>          Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> *Issue:*
> Insert into table select from another table throws an exception for spatial tables: a NoSuchElementException ("key not found") is thrown for the 'mygeohash' column.
> Exception in thread "main" java.util.NoSuchElementException: key not found: mygeohash
>     at scala.collection.MapLike$class.default(MapLike.scala:228)
>     at scala.collection.AbstractMap.default(Map.scala:59)
>     at scala.collection.mutable.HashMap.apply(HashMap.scala:65)
>     at org.apache.spark.sql.execution.command.management.CarbonInsertIntoCommand$$anonfun$getReArrangedIndexAndSelectedSchema$5.apply(CarbonInsertIntoCommand.scala:504)
>     at org.apache.spark.sql.execution.command.management.CarbonInsertIntoCommand$$anonfun$getReArrangedIndexAndSelectedSchema$5.apply(CarbonInsertIntoCommand.scala:497)
>     at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>     at org.apache.spark.sql.execution.command.management.CarbonInsertIntoCommand.getReArrangedIndexAndSelectedSchema(CarbonInsertIntoCommand.scala:496)
>     at org.apache.spark.sql.execution.command.management.CarbonInsertIntoCommand.processData(CarbonInsertIntoCommand.scala:164)
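>
> The trace ends in a bare HashMap.apply inside CarbonInsertIntoCommand.getReArrangedIndexAndSelectedSchema. The sketch below only illustrates that failure mode and is not the actual CarbonData code: the map name, its contents and the defensive lookup are assumptions. The generated spatial column 'mygeohash' is part of the table schema but missing from the lookup map, so apply() throws instead of falling back.
>
> import scala.collection.mutable
>
> // Hypothetical map from column name to projection index; the real map is
> // built inside getReArrangedIndexAndSelectedSchema (name assumed here).
> val columnToIndex = mutable.HashMap(
>   "timevalue" -> 0,
>   "longitude" -> 1,
>   "latitude"  -> 2)
>
> // The implicitly generated spatial column is part of the table schema ...
> val tableColumns = Seq("mygeohash", "timevalue", "longitude", "latitude")
>
> // ... so a bare apply() on the map fails for "mygeohash":
> // tableColumns.map(columnToIndex)   // java.util.NoSuchElementException: key not found: mygeohash
>
> // A defensive get/getOrElse avoids the crash; the actual fix for
> // CARBONDATA-3815 may handle the generated column differently.
> val reArranged = tableColumns.flatMap(columnToIndex.get)
> println(reArranged)   // List(0, 1, 2)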
> *Steps to reproduce:*
>  1. Create the source and target spatial tables.
>  2. Load data into the source table.
>  3. Insert into the target table with a select from the source table (see the test case below).
> *TestCase:*
> spark.sql(s"""
> CREATE TABLE source(
> timevalue BIGINT,
> longitude LONG,
> latitude LONG) COMMENT "This is a GeoTable"
> STORED AS carbondata
> TBLPROPERTIES ('INDEX_HANDLER'='mygeohash',
> 'INDEX_HANDLER.mygeohash.type'='geohash',
> 'INDEX_HANDLER.mygeohash.sourcecolumns'='longitude, latitude',
> 'INDEX_HANDLER.mygeohash.originLatitude'='39.832277',
> 'INDEX_HANDLER.mygeohash.gridSize'='50',
> 'INDEX_HANDLER.mygeohash.minLongitude'='115.811865',
> 'INDEX_HANDLER.mygeohash.maxLongitude'='116.782233',
> 'INDEX_HANDLER.mygeohash.minLatitude'='39.832277',
> 'INDEX_HANDLER.mygeohash.maxLatitude'='40.225281',
> 'INDEX_HANDLER.mygeohash.conversionRatio'='1000000')
> """.stripMargin)
>  
> val path = s"$rootPath/examples/spark/src/main/resources/geodata.csv"
> spark.sql(s"""
> LOAD DATA LOCAL INPATH '$path'
> INTO TABLE source
> OPTIONS('COMPLEX_DELIMITER_LEVEL_1'='#')
> """.stripMargin)
>  
> spark.sql(s"""
> CREATE TABLE target(
> timevalue BIGINT,
> longitude LONG,
> latitude LONG) COMMENT "This is a GeoTable"
> STORED AS carbondata
> TBLPROPERTIES ('INDEX_HANDLER'='mygeohash',
> 'INDEX_HANDLER.mygeohash.type'='geohash',
> 'INDEX_HANDLER.mygeohash.sourcecolumns'='longitude, latitude',
> 'INDEX_HANDLER.mygeohash.originLatitude'='39.832277',
> 'INDEX_HANDLER.mygeohash.gridSize'='50',
> 'INDEX_HANDLER.mygeohash.minLongitude'='115.811865',
> 'INDEX_HANDLER.mygeohash.maxLongitude'='116.782233',
> 'INDEX_HANDLER.mygeohash.minLatitude'='39.832277',
> 'INDEX_HANDLER.mygeohash.maxLatitude'='40.225281',
> 'INDEX_HANDLER.mygeohash.conversionRatio'='1000000')
> """.stripMargin)
>  
>  
> spark.sql("insert into target select * from source")



--
This message was sent by Atlassian Jira
(v8.3.4#803005)