Posted to user@phoenix.apache.org by "Russell, Bob" <rr...@gmail.com> on 2014/07/30 04:34:11 UTC

Phoenix and HDP 2.0

Any luck with this configuration? HBase version is 0.96.0.2.0. I've tried
Phoenix 3.0 and 4.0 and am having issues with both. Before digging
deeper, I figured I'd check to see whether there has been success with HDP 2.0
or whether it's known to not be possible.

Thanks,
Bob

Re: Phoenix and HDP 2.0

Posted by James Taylor <ja...@apache.org>.
Hi Bob,
Phoenix doesn't support HBase 0.96. You'll need to either:
- upgrade to HDP 2.1
- fix PHOENIX-848
Thanks,
James
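For anyone landing on this thread later, the constraint behind this advice can be summed up in a few lines. The sketch below encodes only what is stated in this thread (Phoenix 3.x built against HBase 0.94, Phoenix 4.x against HBase 0.98, HDP 2.0 shipping HBase 0.96); it is an illustration, not an official support matrix:

```python
# Client-line compatibility as discussed in this thread (assumption:
# Phoenix 3.x targets HBase 0.94, Phoenix 4.x targets HBase 0.98).
COMPAT = {
    "3.0": "0.94",
    "4.0": "0.98",
}

def phoenix_supports(phoenix_line, hbase_version):
    """Return True if the given Phoenix client line targets the
    minor HBase line that hbase_version belongs to."""
    needed = COMPAT.get(phoenix_line)
    return needed is not None and hbase_version.startswith(needed)

# HDP 2.0 ships HBase 0.96.0.2.0, which neither Phoenix line targets:
phoenix_supports("3.0", "0.96.0.2.0")  # False
phoenix_supports("4.0", "0.96.0.2.0")  # False
# HDP 2.1 ships HBase 0.98.x, which Phoenix 4.0 does target:
phoenix_supports("4.0", "0.98.0.2.1")  # True
```

Since HBase 0.96 falls between the two lines, there is no Phoenix release to drop in as-is, which is why the remedies above are an upgrade or a code fix rather than a configuration change.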


On Wed, Jul 30, 2014 at 10:32 AM, Russell, Bob <rr...@gmail.com> wrote:

> Nicolas,
>
> Error messages for both 3.0 and 4.0 are below. Hopefully there is something
> simple to get this going without going through an upgrade.
>
> *With Phoenix 3.0 I get the following:*
>
> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
> issuing: !connect jdbc:phoenix:localhost:2181 none none org.apache.phoenix.jdbc.PhoenixDriver
> Connecting to jdbc:phoenix:localhost:2181
> 14/07/29 14:15:49 WARN conf.Configuration: dfs.df.interval is deprecated. Instead, use fs.df.interval
> 14/07/29 14:15:49 WARN conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> 14/07/29 14:15:49 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
> 14/07/29 14:15:49 WARN conf.Configuration: topology.script.number.args is deprecated. Instead, use net.topology.script.number.args
> 14/07/29 14:15:49 WARN conf.Configuration: dfs.umaskmode is deprecated. Instead, use fs.permissions.umask-mode
> 14/07/29 14:15:49 WARN conf.Configuration: topology.node.switch.mapping.impl is deprecated. Instead, use net.topology.node.switch.mapping.impl
> 14/07/29 14:15:50 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
> 14/07/29 14:15:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> java.lang.IllegalArgumentException: Not a host:port pair: PBUF
> 0
> #testhost�������(
>         at org.apache.hadoop.hbase.util.Addressing.parseHostname(Addressing.java:60)
>         at org.apache.hadoop.hbase.ServerName.<init>(ServerName.java:101)
>         at org.apache.hadoop.hbase.ServerName.parseVersionedServerName(ServerName.java:283)
>         at org.apache.hadoop.hbase.MasterAddressTracker.bytesToServerName(MasterAddressTracker.java:77)
>         at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:61)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:805)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:739)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1021)
>         at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>         at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>         at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1351)
>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>         at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>         at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:601)
>         at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>         at sqlline.SqlLine.dispatch(SqlLine.java:817)
>         at sqlline.SqlLine.initArgs(SqlLine.java:633)
>         at sqlline.SqlLine.begin(SqlLine.java:680)
>         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>         at sqlline.SqlLine.main(SqlLine.java:424)
> sqlline version 1.1.2
> 0: jdbc:phoenix:localhost:2181> !quit
> Connection is already closed.
>
> *With Phoenix 4.0 I get the following:*
> ./sqlline.py localhost
> java -cp
> ".:/opt/phoenix-4.0.0-incubating/bin/../hadoop-2/phoenix-4.0.0-incubating-client.jar"
> -Dlog4j.configuration=file:/opt/phoenix-4.0.0-incubating/bin/log4j.properties
> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
> jdbc:phoenix:localhost -n none -p none --color=true --fastConnect=false
> --verbose=true --isolation=TRANSACTION_READ_COMMITTED
> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
> issuing: !connect jdbc:phoenix:localhost none none
> org.apache.phoenix.jdbc.PhoenixDriver
> Connecting to jdbc:phoenix:localhost
> 14/07/30 13:28:16 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> ####################################################################
> ### Here it hangs for a bit and then spits out the following error over
> and over...
> ####################################################################
>
> java.io.IOException: Input/output error
>         at java.io.FileInputStream.read(Native Method)
>         at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:169)
>         at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:137)
>         at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:246)
>         at jline.internal.InputStreamReader.read(InputStreamReader.java:261)
>         at jline.internal.InputStreamReader.read(InputStreamReader.java:198)
>         at jline.console.ConsoleReader.readCharacter(ConsoleReader.java:2038)
>         at jline.console.ConsoleReader.readLine(ConsoleReader.java:2242)
>         at jline.console.ConsoleReader.readLine(ConsoleReader.java:2162)
>         at sqlline.SqlLine.begin(SqlLine.java:699)
>         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>         at sqlline.SqlLine.main(SqlLine.java:424)
> 0: jdbc:phoenix:localhost> java.io.IOException: Input/output error
>
>
>
> On Wed, Jul 30, 2014 at 12:31 AM, Nicolas Maillard <
> nmaillard@hortonworks.com> wrote:
>
>> Hello Russell,
>>
>> Phoenix works transparently on HDP 2.1, but I have not tried it on HDP 2.0,
>> and I am not sure many tests have been done on HBase 0.96. If I am not
>> mistaken, Phoenix 3 is compiled against HBase 0.94 and Phoenix 4 against
>> HBase 0.98, but this can be changed.
>> Do you have any information on a specific error message?
>>
>>
>> On Wed, Jul 30, 2014 at 4:34 AM, Russell, Bob <rr...@gmail.com>
>> wrote:
>>
>>> Any luck with this configuration? HBase version is 0.96.0.2.0. I've
>>> tried Phoenix 3.0 and 4.0 and am having issues with both. Before digging
>>> deeper, I figured I'd check to see whether there has been success with HDP 2.0
>>> or whether it's known to not be possible.
>>>
>>> Thanks,
>>> Bob
>>>
>>>
>>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>
>
>

Re: Phoenix and HDP 2.0

Posted by "Russell, Bob" <rr...@gmail.com>.
Nicolas,

Error messages for both 3.0 and 4.0 are below. Hopefully there is something
simple to get this going without going through an upgrade.

*With Phoenix 3.0 I get the following:*

Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost:2181 none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost:2181
14/07/29 14:15:49 WARN conf.Configuration: dfs.df.interval is deprecated. Instead, use fs.df.interval
14/07/29 14:15:49 WARN conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
14/07/29 14:15:49 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
14/07/29 14:15:49 WARN conf.Configuration: topology.script.number.args is deprecated. Instead, use net.topology.script.number.args
14/07/29 14:15:49 WARN conf.Configuration: dfs.umaskmode is deprecated. Instead, use fs.permissions.umask-mode
14/07/29 14:15:49 WARN conf.Configuration: topology.node.switch.mapping.impl is deprecated. Instead, use net.topology.node.switch.mapping.impl
14/07/29 14:15:50 WARN conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
14/07/29 14:15:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.IllegalArgumentException: Not a host:port pair: PBUF
0
#testhost�������(
        at org.apache.hadoop.hbase.util.Addressing.parseHostname(Addressing.java:60)
        at org.apache.hadoop.hbase.ServerName.<init>(ServerName.java:101)
        at org.apache.hadoop.hbase.ServerName.parseVersionedServerName(ServerName.java:283)
        at org.apache.hadoop.hbase.MasterAddressTracker.bytesToServerName(MasterAddressTracker.java:77)
        at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:61)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:805)
        at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:739)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1021)
        at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
        at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
        at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1351)
        at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
        at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
        at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
        at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
        at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
        at sqlline.SqlLine.dispatch(SqlLine.java:817)
        at sqlline.SqlLine.initArgs(SqlLine.java:633)
        at sqlline.SqlLine.begin(SqlLine.java:680)
        at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
        at sqlline.SqlLine.main(SqlLine.java:424)
sqlline version 1.1.2
0: jdbc:phoenix:localhost:2181> !quit
Connection is already closed.
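An aside for readers who find this thread: the "Not a host:port pair: PBUF" message is the classic symptom of a pre-0.96 HBase client reading the master address znode of a 0.96+ cluster. Older clients expect a plain host:port string in ZooKeeper, whereas HBase 0.96 started storing a protobuf blob prefixed with the magic bytes "PBUF", which the 0.94-era client bundled with Phoenix 3.0 cannot parse. A simplified Python sketch of the mismatch (the parser below is an illustrative stand-in, not the real Addressing.parseHostname logic):

```python
# Illustrative stand-in for the legacy (pre-0.96) znode parser; the real
# implementation lives in org.apache.hadoop.hbase.util.Addressing.
PBUF_MAGIC = b"PBUF"  # magic prefix HBase 0.96+ writes before protobuf data

def parse_master_znode(data):
    """Parse a pre-0.96 master znode payload of the form b'host:port'."""
    text = data.decode("utf-8", errors="replace")
    if ":" not in text:
        # Mirrors the IllegalArgumentException in the stack trace above.
        raise ValueError("Not a host:port pair: " + text[:4])
    host, _, port = text.rpartition(":")
    return host, int(port)

# A 0.94-era payload parses fine:
parse_master_znode(b"testhost:60000")  # ('testhost', 60000)

# A 0.96+ payload is a protobuf blob behind the PBUF magic, so the old
# parser rejects it the same way the stack trace above shows:
try:
    parse_master_znode(PBUF_MAGIC + b"\x00\x08binarydata")
except ValueError as err:
    pass  # "Not a host:port pair: PBUF"
```

This also matches James's advice in this thread: the mismatch is in the client code itself, so a version change (or the PHOENIX-848 fix) is needed rather than a configuration tweak.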

*With Phoenix 4.0 I get the following:*
./sqlline.py localhost
java -cp
".:/opt/phoenix-4.0.0-incubating/bin/../hadoop-2/phoenix-4.0.0-incubating-client.jar"
-Dlog4j.configuration=file:/opt/phoenix-4.0.0-incubating/bin/log4j.properties
sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
jdbc:phoenix:localhost -n none -p none --color=true --fastConnect=false
--verbose=true --isolation=TRANSACTION_READ_COMMITTED
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost none none
org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost
14/07/30 13:28:16 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

####################################################################
### Here it hangs for a bit and then spits out the following error over and
over...
####################################################################

java.io.IOException: Input/output error
        at java.io.FileInputStream.read(Native Method)
        at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:169)
        at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:137)
        at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:246)
        at jline.internal.InputStreamReader.read(InputStreamReader.java:261)
        at jline.internal.InputStreamReader.read(InputStreamReader.java:198)
        at jline.console.ConsoleReader.readCharacter(ConsoleReader.java:2038)
        at jline.console.ConsoleReader.readLine(ConsoleReader.java:2242)
        at jline.console.ConsoleReader.readLine(ConsoleReader.java:2162)
        at sqlline.SqlLine.begin(SqlLine.java:699)
        at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
        at sqlline.SqlLine.main(SqlLine.java:424)
0: jdbc:phoenix:localhost> java.io.IOException: Input/output error



On Wed, Jul 30, 2014 at 12:31 AM, Nicolas Maillard <
nmaillard@hortonworks.com> wrote:

> Hello Russell,
>
> Phoenix works transparently on HDP 2.1, but I have not tried it on HDP 2.0,
> and I am not sure many tests have been done on HBase 0.96. If I am not
> mistaken, Phoenix 3 is compiled against HBase 0.94 and Phoenix 4 against
> HBase 0.98, but this can be changed.
> Do you have any information on a specific error message?
>
>
> On Wed, Jul 30, 2014 at 4:34 AM, Russell, Bob <rr...@gmail.com>
> wrote:
>
>> Any luck with this configuration? HBase version is 0.96.0.2.0. I've
>> tried Phoenix 3.0 and 4.0 and am having issues with both. Before digging
>> deeper, I figured I'd check to see whether there has been success with HDP 2.0
>> or whether it's known to not be possible.
>>
>> Thanks,
>> Bob
>>
>>
>>
>

Re: Phoenix and HDP 2.0

Posted by Nicolas Maillard <nm...@hortonworks.com>.
Hello Russell,

Phoenix works transparently on HDP 2.1, but I have not tried it on HDP 2.0,
and I am not sure many tests have been done on HBase 0.96. If I am not
mistaken, Phoenix 3 is compiled against HBase 0.94 and Phoenix 4 against
HBase 0.98, but this can be changed.
Do you have any information on a specific error message?


On Wed, Jul 30, 2014 at 4:34 AM, Russell, Bob <rr...@gmail.com> wrote:

> Any luck with this configuration? HBase version is 0.96.0.2.0. I've tried
> Phoenix 3.0 and 4.0 and am having issues with both. Before digging
> deeper, I figured I'd check to see whether there has been success with HDP 2.0
> or whether it's known to not be possible.
>
> Thanks,
> Bob
>
>
>
