Posted to dev@phoenix.apache.org by "Xavier Jodoin (JIRA)" <ji...@apache.org> on 2017/04/19 17:56:42 UTC

[jira] [Commented] (PHOENIX-2648) Phoenix Spark Integration does not allow Dynamic Columns to be mapped

    [ https://issues.apache.org/jira/browse/PHOENIX-2648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15975173#comment-15975173 ] 

Xavier Jodoin commented on PHOENIX-2648:
----------------------------------------

The view-based workaround no longer works with the latest Phoenix release (4.10): dynamic columns can't be mapped in the view anymore.
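
For reference, the workaround in question is sketched below; the table, view, and column names (EVENTS, EVENTS_V, ID, DYN_COL) and the ZooKeeper URL are illustrative assumptions, not taken from this issue. The idea is to declare the dynamic column as a regular column of a Phoenix view, then load the view through phoenix-spark:

    // Sketch of the view-based workaround (all names here are assumed).
    // First, in Phoenix SQL, declare the dynamic column on a view:
    //   CREATE VIEW EVENTS_V (DYN_COL VARCHAR) AS SELECT * FROM EVENTS;
    // Then load the view instead of the base table:
    import org.apache.phoenix.spark._
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("phoenix-dyncols"))
    val rdd = sc.phoenixTableAsRDD(
      "EVENTS_V",                    // the view exposing the dynamic column
      Seq("ID", "DYN_COL"),          // DYN_COL is now a declared view column
      zkUrl = Some("zk-host:2181")   // assumed ZooKeeper quorum
    )

Per this comment, it is exactly this path that breaks under 4.10, because the dynamic column can no longer be declared on the view.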

> Phoenix Spark Integration does not allow Dynamic Columns to be mapped
> ---------------------------------------------------------------------
>
>                 Key: PHOENIX-2648
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2648
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.6.0
>         Environment: phoenix-spark-4.6.0-HBase-0.98  , spark-1.5.0-bin-hadoop2.4
>            Reporter: Suman Datta
>            Assignee: Xiaopeng Liao
>              Labels: patch, phoenixTableAsRDD, spark
>             Fix For: 4.6.0
>
>
> I am using spark-1.5.0-bin-hadoop2.4 and phoenix-spark-4.6.0-HBase-0.98 to load Phoenix tables on HBase into Spark RDDs. Using the steps in https://phoenix.apache.org/phoenix_spark.html, I can successfully map the standard columns of a table to a Phoenix RDD.
> But my table has some important dynamic columns (https://phoenix.apache.org/dynamic_columns.html) which are not getting mapped to the Spark RDD in this process (using sc.phoenixTableAsRDD).
> This is proving to be a showstopper for using Phoenix with Spark (a sketch of the gap follows below).
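
To make the reported gap concrete, here is a minimal sketch of the documented usage next to the dynamic-column query it cannot express; MY_TABLE, ID, COL1, DYN_COL, and the zkUrl are assumptions for illustration only:

    // Documented phoenix-spark usage (https://phoenix.apache.org/phoenix_spark.html):
    // declared columns map fine. All names below are assumed.
    import org.apache.phoenix.spark._
    val rdd = sc.phoenixTableAsRDD(
      "MY_TABLE", Seq("ID", "COL1"), zkUrl = Some("zk-host:2181"))

    // Plain Phoenix SQL can declare a dynamic column at query time:
    //   SELECT ID, DYN_COL FROM MY_TABLE (DYN_COL VARCHAR);
    // phoenixTableAsRDD takes only a list of column names, with no way to
    // attach the type declaration a dynamic column requires, so DYN_COL
    // cannot be requested through it.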



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)