Posted to issues@hbase.apache.org by "Manas (Jira)" <ji...@apache.org> on 2020/08/06 17:49:00 UTC

[jira] [Updated] (HBASE-24828) Unable to call HBaseContext From PySpark

     [ https://issues.apache.org/jira/browse/HBASE-24828?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Manas updated HBASE-24828:
--------------------------
    Description: 
I'm able to get a JavaHBaseContext object from PySpark, but not an HBaseContext.
{code:java}
temp = sc._jvm.org.apache.hadoop.hbase.HBaseConfiguration
conf = temp.create()
hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf){code}
Running the code above gives this error:

{code:java}
AttributeError: 'SparkContext' object has no attribute '_get_object_id'

AttributeError                            Traceback (most recent call last)
in engine
----> 1 hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf)

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1543 
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546 
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in <listcomp>(.0)
   1543 
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546 
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/protocol.py in get_command_part(parameter, python_proxy_pool)
    296             command_part += ";" + interface
    297     else:
--> 298         command_part = REFERENCE_TYPE + parameter._get_object_id()
    299 
    300     command_part += "\n"

AttributeError: 'SparkContext' object has no attribute '_get_object_id'
{code}
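The traceback suggests the failure happens in py4j's argument marshalling, not in HBase itself: py4j can only send basic Python types or JVM object proxies (which expose {{_get_object_id()}}) across the gateway, and PySpark's {{SparkContext}} is a plain Python wrapper, so it falls into the reference branch and crashes. A rough sketch of that check, plus the likely workaround of passing the JVM-side context (the {{sc._jsc.sc()}} path is an assumption drawn from PySpark internals, not something the ticket confirms):

```python
# Rough mimic (an approximation, not py4j's real code) of the argument check
# that get_command_part() effectively performs before sending an object to the
# JVM: basic types are serialized directly, everything else must be a JVM
# proxy that carries _get_object_id().

def can_cross_gateway(obj):
    """Return True if py4j could marshal this argument over the gateway."""
    basic = (bool, int, float, str, bytes, type(None))
    return isinstance(obj, basic) or hasattr(obj, "_get_object_id")

class PlainPythonContext:        # stand-in for pyspark.SparkContext
    pass

class JavaObjectProxy:           # stand-in for a py4j JavaObject proxy
    def _get_object_id(self):
        return "o123"

print(can_cross_gateway(PlainPythonContext()))   # False -> the AttributeError path
print(can_cross_gateway(JavaObjectProxy()))      # True

# Likely workaround on a real cluster (untested sketch): hand HBaseContext the
# JVM-side org.apache.spark.SparkContext instead of the Python wrapper:
#   jsc = sc._jsc.sc()
#   hbase_con = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(jsc, conf)
```

Note that the Scala {{HBaseContext}} constructor also has an optional {{tmpHdfsConfgFile}} parameter; py4j cannot apply Scala default arguments, so it may need to be passed explicitly.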
 

  was:
I'm able to call the JavaHBaseContext From PySpark but not HBaseContext

 
{code:java}
temp = sc._jvm.org.apache.hadoop.hbase.HBaseConfiguration
conf = temp.create()
hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf){code}
AttributeError: 'SparkContext' object has no attribute '_get_object_id'
AttributeError                            Traceback (most recent call last)
in engine
----> 1 hbaseCon = sc._jvm.org.apache.hadoop.hbase.spark.HBaseContext(sc, conf)

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1543 
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546 
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in <listcomp>(.0)
   1543 
   1544         args_command = "".join(
-> 1545             [get_command_part(arg, self._pool) for arg in new_args])
   1546 
   1547         command = proto.CONSTRUCTOR_COMMAND_NAME +\

/usr/local/lib/python3.6/site-packages/py4j/protocol.py in get_command_part(parameter, python_proxy_pool)
    296             command_part += ";" + interface
    297     else:
--> 298         command_part = REFERENCE_TYPE + parameter._get_object_id()
    299 
    300     command_part += "\n"

AttributeError: 'SparkContext' object has no attribute '_get_object_id'


> Unable to call HBaseContext From PySpark
> ----------------------------------------
>
>                 Key: HBASE-24828
>                 URL: https://issues.apache.org/jira/browse/HBASE-24828
>             Project: HBase
>          Issue Type: Bug
>          Components: hbase-connectors
>            Reporter: Manas
>            Priority: Minor
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)