This article discusses an issue that prevents startup of the Spark SQL Thrift server on DataStax Enterprise.
- DataStax Enterprise 6.x
- DataStax Enterprise 5.x
When attempting to start the Spark SQL Thrift server on a node running in DSE Analytics mode, a java.lang.NoSuchMethodError is reported in the output file. For example:
$ dse spark-sql-thriftserver start
starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to \
/home/dse/spark-thrift-server/spark-dse-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-ds9.out
Here are sample entries in an output file for a node running with DSE 5.1.9:
ERROR 2019-03-05 15:34:43,662 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.NoSuchMethodError: org.apache.hive.service.cli.operation.LogDivertAppender.setWriter(Ljava/io/Writer;)V
    at org.apache.hive.service.cli.operation.LogDivertAppender.<init>(LogDivertAppender.java:166) ~[spark-hive-thriftserver_2.11-188.8.131.52.jar:184.108.40.206]
    at org.apache.hive.service.cli.operation.OperationManager.initOperationLogCapture(OperationManager.java:85) ~[spark-hive-thriftserver_2.11-220.127.116.11.jar:18.104.22.168]
    at org.apache.hive.service.cli.operation.OperationManager.init(OperationManager.java:63) ~[spark-hive-thriftserver_2.11-22.214.171.124.jar:126.96.36.199]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-188.8.131.52.jar:184.108.40.206]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-220.127.116.11.jar:18.104.22.168]
    ...
The version of Apache Hive included in DSE calls WriterAppender.setWriter(), but the Log4j bridge (the log4j-over-slf4j library JAR) does not implement this method, which is why the NoSuchMethodError is thrown. During the Hive initialization phase, the call to WriterAppender.setWriter() fails, preventing the Spark SQL Thrift server from starting.
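This is the classic drop-in replacement hazard: Hive was compiled against the real Log4j WriterAppender, but at run time the bridge supplies a class of the same name that lacks the method, so the failure surfaces only at the point of the call. A minimal Python analogy of the mechanism (AttributeError standing in for Java's link-time NoSuchMethodError; the class and function names are illustrative, not DSE code):

```python
# Illustrative analogy only: a "bridge" class mimics the real appender's
# role but omits one method, so a caller written against the full
# interface fails exactly at the missing call.
import io

class WriterAppender:
    """Stand-in for the real Log4j appender: implements set_writer."""
    def set_writer(self, writer):
        self.writer = writer

class BridgeWriterAppender:
    """Stand-in for the log4j-over-slf4j bridge: same name and role,
    but set_writer is not implemented."""
    pass

def init_operation_logging(appender):
    # Mirrors Hive's initOperationLogCapture: assumes set_writer exists.
    appender.set_writer(io.StringIO())

init_operation_logging(WriterAppender())  # real class: succeeds

try:
    init_operation_logging(BridgeWriterAppender())  # bridge: fails
except AttributeError as e:
    print("startup fails:", e)  # analogue of NoSuchMethodError
```

The point of the analogy is that neither class is "broken" in isolation; the error only appears when the substituted class meets a caller that expects the full interface, which is exactly where the Thrift server startup dies.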
The issue with Hive's incorrect Log4j usage was resolved in DSE 5.0.0 (DSP-7012) by disabling the redundant Hive operation logging in hive-site.xml with the following configuration:
<property>
  <name>hive.server2.logging.operation.enabled</name>
  <value>false</value>
</property>
The startup failure occurs on nodes that were upgraded from earlier versions of DSE but are still using configuration files from the earlier version.
After upgrading DSE, be sure to use the configuration files that are installed with the upgraded version of DSE. DataStax recommends carefully reviewing the Planning and upgrading instructions to ensure a smooth upgrade and to avoid common pitfalls.
For the issue discussed in this article, use the upgraded version of the hive-site.xml configuration file, which contains the property that disables Hive operation logging, so that the Spark SQL Thrift server can start successfully.
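As a quick sanity check after an upgrade, you can confirm that the property is present and set to false in the active hive-site.xml before starting the Thrift server. A small sketch (the file path is an assumption; point it at the hive-site.xml your DSE installation actually uses):

```python
# Sketch: verify hive.server2.logging.operation.enabled is disabled
# in hive-site.xml. The path below is an assumption -- adjust it to
# the configuration directory of your DSE installation.
import os
import xml.etree.ElementTree as ET

def operation_logging_disabled(hive_site_path):
    """Return True if the property exists and is set to 'false'."""
    root = ET.parse(hive_site_path).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == "hive.server2.logging.operation.enabled":
            return prop.findtext("value", "").strip().lower() == "false"
    return False  # property absent: operation logging left at its default

if __name__ == "__main__":
    path = "/etc/dse/hive/hive-site.xml"  # assumed location
    if os.path.exists(path):
        print("disabled" if operation_logging_disabled(path)
              else "NOT disabled -- Thrift server startup may fail")
```

If the function reports that the property is absent or not false, copy the property block shown above into the active hive-site.xml (or switch to the upgraded configuration file) before starting the Thrift server.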