Introduction

Apache Spark versions before 3.0 do not support Java 11 or later. If you run a Spark application on a newer JDK, you may get the following exception:

```
pyspark.sql.utils.IllegalArgumentException: 'Unsupported class file major version 55'
```

Solution

There are different ways to fix this exception, such as:

- Setting the JAVA_HOME environment variable
- Modifying Spark's environment configuration file, i.e. spark-env.sh (Linux/Mac) or spark-env.cmd (Windows)

In this post, I will show you how to set JAVA_HOME using Spark's configuration file.

Windows Environment

1. Go to spark-directory\conf
2. Create a file named spark-env.cmd
3. Paste the following line into spark-env.cmd:

```
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201
```

Linux and Mac

1. Go to spark-directory/conf
2. Open spark-env.sh (copy it from spark-env.sh.template if it does not exist)
3. Paste the following line into spark-env.sh. On Mac you can let java_home locate the JDK; on Linux, point directly at your JDK installation:

```
# Mac
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
# Linux (example path)
export JAVA_HOME=/usr/java/jdk1.8.0_201
```

Note: Change the path to match your installed Java directory.
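As a side note on what the error message actually encodes: the "major version" in a class file identifies the Java release it was compiled for, and since Java 1.2 the release number is simply the major version minus 44. Here is a small sketch of that mapping, so you can decode other values of this error too:

```python
def java_release_for(class_file_major: int) -> int:
    """Map a class-file major version to the Java release it targets.

    Since Java 1.2 (major version 46), each Java release increments the
    class-file major version by one, so release = major - 44.
    For example, 52 -> Java 8 and 55 -> Java 11.
    """
    return class_file_major - 44


# The "major version 55" from the Spark exception means Java 11:
print(java_release_for(55))  # 11
```

So the exception is telling you that some class on the classpath was compiled for Java 11, which is why pointing JAVA_HOME at a Java 8 JDK resolves it for pre-3.0 Spark.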