Posted to dev@hive.apache.org by Marcelo Vanzin <va...@cloudera.com> on 2014/12/08 20:39:25 UTC

Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 7:39 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description (updated)
-------

This is the RPC layer for spark-client; it just provides the RPC mechanism
that replaces akka for spark-client communication. See README.md for more
discussion of the approach.

The client implementation still uses akka; that will be changed in a separate commit.

[spark-client] Goodbye akka, hello netty.

Use the netty-based RPC implementation for spark-client. The API is still
exactly the same; only the internals have changed. The API semantics should
also remain the same, although with the new RPC system there is still
room for improvement.

The main thing to remember right now is that all types sent over the wire
need an empty constructor. Currently, failing to do that results in errors
being printed to the logs and the RPC channel being closed; for easier
debugging in the future I'm planning to change the RPC internals a bit so
that failure to deserialize the payload causes an RPC failure instead of
the current behavior.
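
For illustration only, here is a minimal sketch of what a message type looks like under that constraint (the class and field names below are made up, not part of the patch):

    // Hypothetical message sent over the new RPC layer. The serializer
    // instantiates the class reflectively, so a no-arg constructor is
    // required in addition to the "real" one.
    public class JobSubmitted {
      private String jobId;
      private long submitTime;

      // Empty constructor used only during deserialization.
      JobSubmitted() { }

      public JobSubmitted(String jobId, long submitTime) {
        this.jobId = jobId;
        this.submitTime = submitTime;
      }
    }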

[spark-client] Use a Promise in JobHandleImpl.

Makes it integrate better with the RPC layer, and avoids some ugly
code in the process.
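
As a rough sketch of the idea (the names below are illustrative, not the actual JobHandleImpl code), the handle can simply delegate completion and failure to a netty Promise obtained from the RPC layer's event executor:

    import io.netty.util.concurrent.EventExecutor;
    import io.netty.util.concurrent.Promise;

    // Illustrative only: a job handle backed by a netty Promise, so the
    // RPC layer can complete or fail the job without hand-rolled locking.
    class SketchJobHandle<T> {
      private final Promise<T> promise;

      SketchJobHandle(EventExecutor executor) {
        this.promise = executor.newPromise();
      }

      void complete(T result) { promise.setSuccess(result); }

      void fail(Throwable error) { promise.setFailure(error); }

      T get() throws Exception { return promise.get(); }
    }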

[spark-client] Set up RPC thread factory, add missing timeout.


[spark-client] Better handle serialization errors.

It's hard to report back serialization errors that happen during an
RPC if the RPC header and payload are serialized as a single object.
So break those down into two different objects; we're sure the internal
RPC header is serializable, so this allows us to better handle the case
when the user payload is not serializable and provide nicer errors,
making it easier to debug things.
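
A minimal sketch of the resulting message layout (the type and field names are hypothetical, not the actual codec classes):

    // Illustrative only: each RPC is written as two separately encoded
    // objects. The header uses only primitives, so encoding it cannot
    // fail; the payload is user-provided and may fail to (de)serialize,
    // in which case the error can be reported for the call "id" from
    // the header instead of tearing down the channel.
    class RpcHeader {
      long id;          // correlation id of the call
      boolean isError;  // whether the payload carries an error
    }

    class RpcPayload {
      Object userMessage;  // arbitrary user object
    }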

Fix buglets.


Fix some serialization issues.


More fixes.

- More types needed serialization tweaks (empty constructors or
  transient fields)
- Fix a race when registering outgoing RPCs.
- Add a TODO to fix some very suspicious code in SparkCounter.

Fix SparkCounters serialization.

This code is a little bit weird; the "Accumulator" instance cannot
be serialized by Kryo's default serializer, so we cannot include
it when sending SparkCounters back to the RSC. But we can't make the
field transient either, because apparently it needs to be serialized
when sent to actual tasks.

Long term it would be better to separate internal counter usage
from RSC counter views (in the spirit of keeping the RSC completely
independent from the rest of Hive). But for now, make things work
by copying the values.
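
A rough sketch of the copy-based workaround (the class and method below are hypothetical, not the patch's SparkCounter code): copy the accumulated values into a plain serializable snapshot before shipping counters back, so the Accumulator itself never crosses the RSC wire.

    import java.io.Serializable;
    import java.util.HashMap;
    import java.util.Map;

    // Illustrative only: a serializable snapshot of counter values. The
    // live counters keep their non-Kryo-friendly Accumulator field; only
    // this plain copy is sent back over the RSC connection.
    class CounterSnapshot implements Serializable {
      private final Map<String, Long> values = new HashMap<String, Long>();

      // Copy the current value of each live counter into the snapshot.
      static CounterSnapshot copyOf(Map<String, Long> liveValues) {
        CounterSnapshot copy = new CounterSnapshot();
        copy.values.putAll(liveValues);
        return copy;
      }
    }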

Configurable channel traffic log level.
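
As a sketch of what this looks like with netty's built-in LoggingHandler (the configuration handling below is simplified for illustration):

    import io.netty.channel.Channel;
    import io.netty.handler.logging.LogLevel;
    import io.netty.handler.logging.LoggingHandler;

    // Illustrative only: install a LoggingHandler whose level comes from
    // configuration, so channel traffic logging can be tuned without
    // code changes.
    class TrafficLogging {
      static void install(Channel channel, String configuredLevel) {
        LogLevel level = LogLevel.valueOf(configuredLevel.toUpperCase());
        channel.pipeline().addFirst("logger", new LoggingHandler(level));
      }
    }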


Avoid an ugly exception when remote end dies.


Make rpc server instance private in factory.


Add max message size checks.


Fix bug when looking for a suitable address.


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Brock Noland <br...@cloudera.com>.

> On Dec. 8, 2014, 9:03 p.m., Brock Noland wrote:
> > Hey Marcelo,
> > 
> > When I send an HTTP request to the port where the RSC is listening, the message below is printed. Thus it's doing a good job of checking the max message size, which is awesome, but I feel we need to:
> > 
> > 1) Add a small header so that when junk data is sent to this port we can log a better exception than the one below. As I mentioned, we've had massive problems with this in flume, which also uses netty for communication.
> > 
> > 2) Ensure the incoming size is not negative.
> > 
> > 
> > 2014-12-08 20:56:41,070 WARN  [RPC-Handler-7]: rpc.RpcDispatcher (RpcDispatcher.java:exceptionCaught(154)) - [HelloDispatcher] Caught exception in channel pipeline.
> > io.netty.handler.codec.DecoderException: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:280)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:149)
> > 	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:108)
> > 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
> > 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
> > 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
> > 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> > 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> > 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
> > 	at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.checkSize(KryoMessageCodec.java:117)
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:77)
> > 	at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:249)
> > 	... 12 more
> 
> Marcelo Vanzin wrote:
>     I can add the check for negative sizes, but I still don't understand why you want a header. It doesn't serve any practical purpose. The protocol itself has a handshake that must succeed before the connection is established; adding a header would add nothing to the process, just complexity.

The only thing I would add is that it's easy for engineers who work on this code to look at the exception and know it's unrelated, but it's not easy for operations folks. When they turn on debug logging and see these exceptions, they will be taken off the trail of the real problem they are trying to debug.


- Brock


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64279
-----------------------------------------------------------


On Dec. 8, 2014, 9:11 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 8, 2014, 9:11 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.

> On Dec. 8, 2014, 9:03 p.m., Brock Noland wrote:
> > Hey Marcelo,
> > 
> > When I send an HTTP request to the port where the RSC is listening, the message below is printed. Thus it's doing a good job of checking the max message size, which is awesome, but I feel we need to:
> > 
> > 1) Add a small header so that when junk data is sent to this port we can log a better exception than the one below. As I mentioned, we've had massive problems with this in flume, which also uses netty for communication.
> > 
> > 2) Ensure the incoming size is not negative.
> > 
> > 
> > 2014-12-08 20:56:41,070 WARN  [RPC-Handler-7]: rpc.RpcDispatcher (RpcDispatcher.java:exceptionCaught(154)) - [HelloDispatcher] Caught exception in channel pipeline.
> > io.netty.handler.codec.DecoderException: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:280)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:149)
> > 	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:108)
> > 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
> > 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
> > 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
> > 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> > 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> > 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
> > 	at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.checkSize(KryoMessageCodec.java:117)
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:77)
> > 	at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:249)
> > 	... 12 more
> 
> Marcelo Vanzin wrote:
>     I can add the check for negative sizes, but I still don't understand why you want a header. It doesn't serve any practical purpose. The protocol itself has a handshake that must succeed before the connection is established; adding a header would add nothing to the process, just complexity.
> 
> Brock Noland wrote:
>     The only thing I would add is that it's easy for engineers who work on this code to look at the exception and know it's unrelated, but it's not easy for operations folks. When they turn on debug logging and see these exceptions, they will be taken off the trail of the real problem they are trying to debug.

Ops folks should not turn on debug logging unless they're told to; otherwise they'll potentially see a lot of these kinds of things. If they do turn on debug logging by themselves, then they shouldn't be surprised to see things they may not fully understand. There's a reason why it's called "debug", and not "just print the log messages specific to the problem I'm having".


- Marcelo


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64279
-----------------------------------------------------------


On Dec. 8, 2014, 9:11 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 8, 2014, 9:11 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.

> On Dec. 8, 2014, 9:03 p.m., Brock Noland wrote:
> > Hey Marcelo,
> > 
> > When I send an HTTP request to the port where the RSC is listening, the message below is printed. Thus it's doing a good job of checking the max message size, which is awesome, but I feel we need to:
> > 
> > 1) Add a small header so that when junk data is sent to this port we can log a better exception than the one below. As I mentioned, we've had massive problems with this in flume, which also uses netty for communication.
> > 
> > 2) Ensure the incoming size is not negative.
> > 
> > 
> > 2014-12-08 20:56:41,070 WARN  [RPC-Handler-7]: rpc.RpcDispatcher (RpcDispatcher.java:exceptionCaught(154)) - [HelloDispatcher] Caught exception in channel pipeline.
> > io.netty.handler.codec.DecoderException: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:280)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:149)
> > 	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:108)
> > 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
> > 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
> > 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
> > 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> > 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> > 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> > 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
> > 	at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.checkSize(KryoMessageCodec.java:117)
> > 	at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:77)
> > 	at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
> > 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:249)
> > 	... 12 more

I can add the check for negative sizes, but I still don't understand why you want a header. It doesn't serve any practical purpose. The protocol itself has a handshake that must succeed before the connection is established; adding a header would add nothing to the process, just complexity.
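
For reference, a minimal sketch of what such a bounds check looks like for a length-prefixed frame (illustrative only, not the actual KryoMessageCodec code):

    import io.netty.buffer.ByteBuf;

    // Illustrative only: validate a 4-byte length prefix before decoding,
    // rejecting both oversized frames and negative (junk/corrupt) lengths.
    final class FrameSizeCheck {
      static void check(ByteBuf in, int maxMessageSize) {
        int size = in.getInt(in.readerIndex());   // peek without consuming
        if (size < 0 || size > maxMessageSize) {
          throw new IllegalArgumentException(
              "Invalid message size: " + size + " (max " + maxMessageSize + ")");
        }
      }
    }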


- Marcelo


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64279
-----------------------------------------------------------


On Dec. 8, 2014, 7:47 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 8, 2014, 7:47 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Brock Noland <br...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64279
-----------------------------------------------------------


Hey Marcelo,

When I send an HTTP request to the port where the RSC is listening, the message below is printed. Thus it's doing a good job of checking the max message size, which is awesome, but I feel we need to:

1) Add a small header so that when junk data is sent to this port we can log a better exception than the one below. As I mentioned, we've had massive problems with this in flume, which also uses netty for communication.

2) Ensure the incoming size is not negative.


2014-12-08 20:56:41,070 WARN  [RPC-Handler-7]: rpc.RpcDispatcher (RpcDispatcher.java:exceptionCaught(154)) - [HelloDispatcher] Caught exception in channel pipeline.
io.netty.handler.codec.DecoderException: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:280)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:149)
	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:108)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: Message exceeds maximum allowed size (10485760 bytes).
	at org.apache.hive.spark.client.rpc.KryoMessageCodec.checkSize(KryoMessageCodec.java:117)
	at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:77)
	at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:249)
	... 12 more

- Brock Noland


On Dec. 8, 2014, 7:47 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 8, 2014, 7:47 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by chengxiang li <ch...@intel.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64356
-----------------------------------------------------------



spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java
<https://reviews.apache.org/r/28779/#comment107008>

    unused import.



spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md
<https://reviews.apache.org/r/28779/#comment107028>

    you mean kryo here, right?



spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java
<https://reviews.apache.org/r/28779/#comment107030>

    @param egroup


Patch looks good to me; I only added a few minor comments.

- chengxiang li


On Dec. 9, 2014, 1:01 a.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 9, 2014, 1:01 a.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.

> On Dec. 9, 2014, 7:05 p.m., Xuefu Zhang wrote:
> > pom.xml, line 152
> > <https://reviews.apache.org/r/28779/diff/7/?file=786238#file786238line152>
> >
> >     Is there a reason that we cannot keep 3.7.0? Upgrading a dep version usually gives some headaches.

This version is not used anywhere in the Hive build. In fact, there is no version "3.7.0.Final" of "io.netty" (that's for the old "org.jboss.netty" package).


- Marcelo


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64411
-----------------------------------------------------------


On Dec. 9, 2014, 6:49 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 9, 2014, 6:49 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Xuefu Zhang <xz...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/#review64411
-----------------------------------------------------------



pom.xml
<https://reviews.apache.org/r/28779/#comment107108>

    Is there a reason that we cannot keep 3.7.0? Upgrading a dep version usually gives some headaches.


- Xuefu Zhang


On Dec. 9, 2014, 6:49 p.m., Marcelo Vanzin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/28779/
> -----------------------------------------------------------
> 
> (Updated Dec. 9, 2014, 6:49 p.m.)
> 
> 
> Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.
> 
> 
> Bugs: HIVE-9036
>     https://issues.apache.org/jira/browse/HIVE-9036
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).
> 
> With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.
> 
> The full change set, with more detailed descriptions, can be seen here:
> https://github.com/vanzin/hive/commits/spark-client-netty
> 
> 
> Diffs
> -----
> 
>   pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
>   spark-client/pom.xml PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
>   spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
>   spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 
> 
> Diff: https://reviews.apache.org/r/28779/diff/
> 
> 
> Testing
> -------
> 
> spark-client unit tests, plus some qtests.
> 
> 
> Thanks,
> 
> Marcelo Vanzin
> 
>


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 9, 2014, 9:17 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 9, 2014, 6:49 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend we can think about adding some nice features such as future listeners (which were awkward with akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 9, 2014, 1:01 a.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces akka with a simple netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.
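To illustrate the kind of listener support the Netty backend would make straightforward (this is only a sketch of a possible future API, not part of this patch), a job handle backed by a Netty Promise could let callers register callbacks instead of blocking on get():

    import io.netty.util.concurrent.Future;
    import io.netty.util.concurrent.GenericFutureListener;
    import io.netty.util.concurrent.ImmediateEventExecutor;
    import io.netty.util.concurrent.Promise;

    public class ListenerSketch {
      public static void main(String[] args) {
        // The Promise stands in for the future backing a job handle.
        Promise<String> jobResult =
            ImmediateEventExecutor.INSTANCE.newPromise();

        // Callers register a callback instead of blocking on get().
        jobResult.addListener(new GenericFutureListener<Future<String>>() {
          @Override
          public void operationComplete(Future<String> f) {
            if (f.isSuccess()) {
              System.out.println("Job finished: " + f.getNow());
            } else {
              System.out.println("Job failed: " + f.cause());
            }
          }
        });

        // Completing the promise (e.g. when the remote driver reports
        // back over the RPC channel) fires the listener.
        jobResult.setSuccess("spark job result");
      }
    }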

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 9:54 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces Akka with a simple Netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 9:52 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces Akka with a simple Netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 9:11 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces Akka with a simple Netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 7:47 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description
-------

This patch replaces Akka with a simple Netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs (updated)
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin


Re: Review Request 28779: [spark-client] Netty-based RPC implementation.

Posted by Marcelo Vanzin <va...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/28779/
-----------------------------------------------------------

(Updated Dec. 8, 2014, 7:40 p.m.)


Review request for hive, Brock Noland, chengxiang li, Szehon Ho, and Xuefu Zhang.


Bugs: HIVE-9036
    https://issues.apache.org/jira/browse/HIVE-9036


Repository: hive-git


Description (updated)
-------

This patch replaces Akka with a simple Netty-based RPC layer. It doesn't add any features on top of the existing spark-client API, which is unchanged (except for the need to add empty constructors in some places).

With the new backend, we can think about adding some nice features such as future listeners (which were awkward with Akka because of Scala), but those are left for a different time.

The full change set, with more detailed descriptions, can be seen here:
https://github.com/vanzin/hive/commits/spark-client-netty


Diffs
-----

  pom.xml 630b10ce35032e4b2dee50ef3dfe5feb58223b78 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION 
  ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/impl/RemoteSparkJobStatus.java PRE-CREATION 
  spark-client/pom.xml PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/ClientUtils.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/JobHandleImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/Protocol.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/KryoMessageCodec.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/README.md PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/Rpc.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcConfiguration.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcDispatcher.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcException.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/client/rpc/RpcServer.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounterGroup.java PRE-CREATION 
  spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounters.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestKryoMessageCodec.java PRE-CREATION 
  spark-client/src/test/java/org/apache/hive/spark/client/rpc/TestRpc.java PRE-CREATION 

Diff: https://reviews.apache.org/r/28779/diff/


Testing
-------

spark-client unit tests, plus some qtests.


Thanks,

Marcelo Vanzin