Applies to
- DataStax Enterprise 6.7
- DataStax Enterprise 6.0
Symptom
An error similar to the following can appear in the system.log when running Spark jobs, and every executor process dies before it even starts:
ERROR [ExecutorRunner for app-20190724064306-0000/5407] 2019-07-24 07:04:55,032 Logging.scala:91 - Error running executor
java.lang.IllegalStateException: Cannot find any build directories.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:240)
at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:194)
at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:117)
at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:45)
at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63)
at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51)
at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:150)
at org.apache.spark.deploy.worker.DseExecutorRunner$$anon$2.run(DseExecutorRunner.scala:88)
Cause
We have observed this problem after an upgrade in which the dse-spark-env.sh file was merged incorrectly by the user, so the required Scala version setting was lost.
Solution
Verify that the following environment variable is set in dse-spark-env.sh (default location: /etc/dse/spark) on every node in the cluster:
export SPARK_SCALA_VERSION="2.11"
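As a quick check, you can confirm the setting is present on each node. The snippet below is a minimal sketch that assumes passwordless SSH, a hypothetical host list, and the default package install path; adjust it for your environment:

# Hypothetical example: check SPARK_SCALA_VERSION on each node over SSH.
# Host names (node1, node2, node3) and the config path are assumptions.
for host in node1 node2 node3; do
  ssh "$host" 'grep -H SPARK_SCALA_VERSION /etc/dse/spark/dse-spark-env.sh'
done

Any node where the variable is missing or set to a different value should have its dse-spark-env.sh corrected to match the line above.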