Posted to reviews@ambari.apache.org by Andrew Onischuk <ao...@hortonworks.com> on 2017/01/30 13:59:11 UTC

Review Request 56079: lzo broken for hive on tez

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/56079/
-----------------------------------------------------------

Review request for Ambari and Dmitro Lisnichenko.


Bugs: AMBARI-19777
    https://issues.apache.org/jira/browse/AMBARI-19777


Repository: ambari


Description
-------

select count(*) on an LZO-compressed Hive table throws java.lang.RuntimeException:
native-lzo library not available.

In our Sqoop suite (test_copyTableToHiveWithCompression[mysql-LZO]) we first
import a table into Hive with LZO compression:

    
    
    
    /usr/hdp/current/sqoop-client/bin/sqoop import \
        --connection-manager org.apache.sqoop.manager.MySQLManager \
        --connect jdbc:mysql://ctr-e85-1482808692054-1557-01-000002.hwx.site/employees \
        --username sqoop --password sqoop \
        --query "select id,data from mytable_compress where locale='en_US' AND\$CONDITIONS" \
        --hive-import --hive-overwrite \
        --hive-partition-key locale --hive-partition-value "en_US" \
        --hive-table mytable_compress \
        --target-dir /user/hrt_qa/test-sqoop/out \
        -m 1 --compress --compression-codec lzo
    

The import succeeds for the different partition values.

After that we verify the number of records in the Hive table, and this step fails:

    
    
    
    2016-12-29 06:54:33,003|INFO|MainThread|machine.py:132 - run()|RUNNING: /usr/hdp/current/hive-client/bin/hive -e "select count(*) from mytable_compress;"
    2016-12-29 06:54:37,391|INFO|MainThread|machine.py:145 - run()|
    2016-12-29 06:54:37,391|INFO|MainThread|machine.py:145 - run()|Logging initialized using configuration in file:/etc/hive/2.6.0.0-275/0/hive-log4j.properties
    2016-12-29 06:54:52,463|INFO|MainThread|machine.py:145 - run()|Query ID = hrt_qa_20161229065450_36da5b59-89f5-4bcb-96a5-4ddab3745ea8
    2016-12-29 06:54:52,463|INFO|MainThread|machine.py:145 - run()|Total jobs = 1
    2016-12-29 06:54:52,474|INFO|MainThread|machine.py:145 - run()|Launching Job 1 out of 1
    2016-12-29 06:54:53,628|INFO|MainThread|machine.py:145 - run()|
    2016-12-29 06:54:53,629|INFO|MainThread|machine.py:145 - run()|
    2016-12-29 06:54:53,877|INFO|MainThread|machine.py:145 - run()|Status: Running (Executing on YARN cluster with App id application_1482982635541_0221)
    2016-12-29 06:54:53,877|INFO|MainThread|machine.py:145 - run()|
    2016-12-29 06:54:53,877|INFO|MainThread|machine.py:145 - run()|Map 1: -/-	Reducer 2: 0/1
    2016-12-29 06:54:54,081|INFO|MainThread|machine.py:145 - run()|Map 1: 0/1	Reducer 2: 0/1
    2016-12-29 06:54:57,140|INFO|MainThread|machine.py:145 - run()|Map 1: 0/1	Reducer 2: 0/1
    2016-12-29 06:54:59,393|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1)/1	Reducer 2: 0/1
    2016-12-29 06:55:01,028|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-1)/1	Reducer 2: 0/1
    2016-12-29 06:55:04,089|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-1)/1	Reducer 2: 0/1
    2016-12-29 06:55:05,108|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-2)/1	Reducer 2: 0/1
    2016-12-29 06:55:08,178|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-2)/1	Reducer 2: 0/1
    2016-12-29 06:55:11,245|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-2)/1	Reducer 2: 0/1
    2016-12-29 06:55:12,267|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-3)/1	Reducer 2: 0/1
    2016-12-29 06:55:15,344|INFO|MainThread|machine.py:145 - run()|Map 1: 0(+1,-3)/1	Reducer 2: 0/1
    2016-12-29 06:55:15,965|INFO|MainThread|machine.py:145 - run()|Status: Failed
    2016-12-29 06:55:15,967|INFO|MainThread|machine.py:145 - run()|Vertex failed, vertexName=Map 1, vertexId=vertex_1482982635541_0221_1_00, diagnostics=[Task failed, taskId=task_1482982635541_0221_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: java.lang.RuntimeException: native-lzo library not available
    2016-12-29 06:55:15,967|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
    2016-12-29 06:55:15,967|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
    2016-12-29 06:55:15,967|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
    2016-12-29 06:55:15,967|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at java.security.AccessController.doPrivileged(Native Method)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at javax.security.auth.Subject.doAs(Subject.java:415)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
    2016-12-29 06:55:15,968|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at java.lang.Thread.run(Thread.java:745)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|Caused by: java.lang.RuntimeException: java.io.IOException: java.lang.RuntimeException: native-lzo library not available
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:196)
    2016-12-29 06:55:15,969|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.<init>(TezGroupedSplitsInputFormat.java:135)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat.getRecordReader(TezGroupedSplitsInputFormat.java:101)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.lib.MRReaderMapred.setupOldRecordReader(MRReaderMapred.java:149)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.lib.MRReaderMapred.setSplit(MRReaderMapred.java:80)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.input.MRInput.initFromEventInternal(MRInput.java:674)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.input.MRInput.initFromEvent(MRInput.java:633)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.input.MRInputLegacy.checkAndAwaitRecordReaderInitialization(MRInputLegacy.java:145)
    2016-12-29 06:55:15,970|INFO|MainThread|machine.py:145 - run()|at org.apache.tez.mapreduce.input.MRInputLegacy.init(MRInputLegacy.java:109)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.getMRInput(MapRecordProcessor.java:405)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:124)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:149)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|... 14 more
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|Caused by: java.io.IOException: java.lang.RuntimeException: native-lzo library not available
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
    2016-12-29 06:55:15,971|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:253)
    2016-12-29 06:55:15,972|INFO|MainThread|machine.py:145 - run()|at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:193)
    2016-12-29 06:55:15,972|INFO|MainThread|machine.py:145 - run()|... 25 more
    2016-12-29 06:55:15,972|INFO|MainThread|machine.py:145 - run()|Caused by: java.lang.RuntimeException: native-lzo library not available
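
The "native-lzo library not available" RuntimeException is typically thrown by the
hadoop-lzo codec when the LzoCodec class is on the classpath but the native
libgplcompression/liblzo2 libraries cannot be loaded inside the Tez containers. As a
rough sanity check (not part of this patch; the file locations and property name below
are common Hadoop/HDP conventions and are assumptions here), one can verify on a worker
node that the codec is configured and that the native library actually exists:

    # check_lzo.py - hypothetical sanity check, not part of this patch.
    # Verifies that core-site.xml lists the LZO codec and that the hadoop-lzo
    # native library is present under the usual (assumed) HDP locations.
    import glob
    import os
    import xml.etree.ElementTree as ET

    CORE_SITE = "/etc/hadoop/conf/core-site.xml"              # assumed default location
    NATIVE_DIRS = glob.glob("/usr/hdp/*/hadoop/lib/native")   # assumed HDP layout

    def configured_codecs(path=CORE_SITE):
        """Return the value of io.compression.codecs from core-site.xml."""
        root = ET.parse(path).getroot()
        for prop in root.findall("property"):
            if prop.findtext("name") == "io.compression.codecs":
                return prop.findtext("value") or ""
        return ""

    codecs = configured_codecs() if os.path.exists(CORE_SITE) else ""
    print("LzoCodec configured :", "com.hadoop.compression.lzo.LzoCodec" in codecs)

    native_found = any(
        os.path.exists(os.path.join(d, "libgplcompression.so")) for d in NATIVE_DIRS
    )
    print("libgplcompression.so:", native_found)

If the codec is configured but the native library is not visible to the containers,
queries over LZO-compressed files fail exactly as in the log above.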


Diffs
-----

  ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py 10bdd45 

Diff: https://reviews.apache.org/r/56079/diff/
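
The one-file fix lands in Ambari's Linux parameter script for the HIVE service; the
actual change is in the diff above. Purely as an illustration of the kind of logic such
a script carries, here is a hedged sketch (all names, paths, and keys below are
hypothetical and not taken from the patch) of deriving an lzo_enabled flag from the
cluster configuration and, when it is set, extending the auxiliary jars handed to
Hive/Tez:

    # Hypothetical sketch only - illustrates the style of params_linux.py logic,
    # not the actual AMBARI-19777 change.
    import glob

    def lzo_params(core_site):
        """Derive LZO-related parameters from a core-site configuration dict."""
        codecs = core_site.get("io.compression.codecs", "")
        lzo_enabled = "com.hadoop.compression.lzo" in codecs

        aux_jars = []
        if lzo_enabled:
            # Pick up whatever hadoop-lzo jar the stack ships (assumed location).
            aux_jars = glob.glob("/usr/hdp/current/hadoop-client/lib/hadoop-lzo-*.jar")

        return {"lzo_enabled": lzo_enabled, "hive_aux_jars": aux_jars}

    # Example: a minimal core-site snippet with the LZO codec configured.
    print(lzo_params({
        "io.compression.codecs":
            "org.apache.hadoop.io.compress.GzipCodec,"
            "com.hadoop.compression.lzo.LzoCodec"
    }))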


Testing
-------

mvn clean test


Thanks,

Andrew Onischuk


Re: Review Request 56079: lzo broken for hive on tez

Posted by Dmitro Lisnichenko <dl...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/56079/#review163515
-----------------------------------------------------------


Ship it!




Ship It!

- Dmitro Lisnichenko

