Introduction
Apache Spark versions prior to 3.0 do not support Java 11 and later. When you try to run a Spark application under Java 11 or newer, you may get the following exception.
Exception
pyspark.sql.utils.IllegalArgumentException: 'Unsupported class file major version 55'
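Class file major version 55 corresponds to Java 11, so this exception means Spark is being launched with a newer JVM than it supports. You can check which Java version is currently on your PATH with the java -version command; the version shown below is just an example:

java -version
java version "1.8.0_201"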
Solution
There are different ways to fix this exception, such as:
- Set the JAVA_HOME environment variable before launching Spark (see the sketch after this list)
- Modify Apache Spark's environment configuration file, i.e. spark-env.sh or spark-env.cmd
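If you prefer the first option, here is a minimal sketch, assuming Java 8 is installed under /usr/lib/jvm/java-8-openjdk-amd64 and your application is called my_app.py (both names are examples):

# Point this shell session at a Java 8 JDK, then launch the application
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
spark-submit my_app.py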
In this post, I will show you how to set JAVA_HOME using Spark's configuration file.
Windows Environment
- Go to the spark-directory\conf folder
- Create a file named spark-env.cmd
- Paste the following line into spark-env.cmd
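For example, assuming Java 8 is installed at C:\Program Files\Java\jdk1.8.0_201 (adjust the path to your own installation):

set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201

If spaces in the path cause trouble with Spark's Windows scripts, the 8.3 short form of the folder, e.g. C:\Progra~1\Java\jdk1.8.0_201, may help.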
Linux and Mac
- Go to the spark-directory/conf folder
- Open spark-env.sh (if it does not exist, create it from the spark-env.sh.template file in the same folder)
- Paste the following line into spark-env.sh
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
Note: Adjust the path to match the Java installation on your machine.
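The $(/usr/libexec/java_home -v 1.8) command above is macOS-specific. On Linux, a common alternative sketch is to point JAVA_HOME directly at the JDK install directory (the path below is an example):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

After saving the file, restart pyspark or spark-submit and the exception should no longer appear.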
Please leave your comments in the comments box if you find this post useful.
Comments
- Yes, this is useful. Keep updating.
- Worth noting that this does not change your actual JAVA_HOME path variable.
- Worked for me; the only difference was the file name was `load-spark-env.sh` and it was in the bin folder.