Set default JAVA_HOME path for Apache Spark

Introduction 

Currently, Apache Spark does not support Java 11 and later versions; it requires Java 8. When you run a Spark application under a newer JDK, you may get the following exception:

Exception

pyspark.sql.utils.IllegalArgumentException: 'Unsupported class file major version 55'
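
Before changing anything, you can confirm the cause: class file major version 55 corresponds to Java 11 (Java 8 produces major version 52), so the error means Spark is running on a JDK that is too new. A quick check from a terminal (on Windows, use echo %JAVA_HOME% instead):

      # confirm which JDK is on the PATH and where JAVA_HOME points
      java -version
      echo $JAVA_HOME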

Solution

There are different ways to fix this exception, such as:

  • Set the JAVA_HOME environment variable globally (a quick sketch follows this list)
  • Modify Apache Spark's environment configuration file, i.e. spark-env.sh (Linux/Mac) or spark-env.cmd (Windows)
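
The first approach looks like the following sketch; the JDK path is an example, adjust it to your machine:

      # Linux/Mac: append to ~/.bashrc (or ~/.zshrc) so every new shell gets it
      export JAVA_HOME=/usr/java/jdk1.8.0_201
      export PATH=$JAVA_HOME/bin:$PATH

On Windows, the equivalent is setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_201", or setting it under System Properties > Environment Variables.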

In this post, I will show you how to set JAVA_HOME using Spark's environment configuration file.

Windows Environment 


  • Go to the spark-directory\conf
  • Create a file named spark-env.cmd
  • Paste the following line into spark-env.cmd:

      set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201
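
That single line is the whole file; Spark's launch scripts read conf\spark-env.cmd automatically (via bin\load-spark-env.cmd) every time you start spark-shell, pyspark, or spark-submit. A commented version, again with an example JDK path:

      rem conf\spark-env.cmd -- read by bin\load-spark-env.cmd on every launch
      rem Point Spark at a Java 8 JDK; adjust the path to your installation
      set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201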


Linux and Mac


  • Go to the spark-directory/conf
  • Create spark-env.sh if it does not exist (the conf folder ships a spark-env.sh.template you can copy) and open it
  • Paste the following line into spark-env.sh:

      export JAVA_HOME=/usr/java/jdk1.8.0_201
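
On a Mac you do not have to hard-code the path: macOS ships the /usr/libexec/java_home utility, which looks up an installed JDK by version, so the following line keeps working across JDK updates:

      # macOS only: resolve the home directory of an installed JDK 8
      export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)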

Note: Adjust the path to match your installed Java directory.
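
To verify the change, start a Spark shell again from the installation directory; its startup banner prints the Java version Spark is actually using, and the exception above should no longer appear:

      ./bin/spark-shell        # on Windows: bin\spark-shell.cmd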

Please leave your comments in the comments box if you find this post useful.





Comments

  1. yes this is useful. keep updating

  2. worth noting that this does not change your actual JAVA_HOME path variable

  3. Worked for me, only difference was the file name was `load-spark-env.sh` and it was in bin folder

