Spark binary comes with an interactive spark-shell. In order to start a shell, go to your SPARK_HOME/bin directory and type "spark-shell2". This command loads Spark and displays what version of Spark you are using. By default, spark-shell provides the spark (SparkSession) and sc (SparkContext) objects to use.

Download the winutils.exe file from winutils and copy it to the %SPARK_HOME%\bin folder. Winutils are different for each Hadoop version, hence download the right version for your Hadoop distribution. Now set the following environment variables (adjust the path if you wanted to use a different version of Spark):

PATH=%PATH%;C:\apps\spark-3.0.0-bin-hadoop2.7\bin
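The setup above can be sketched as a short Windows cmd session; this is a minimal sketch assuming Spark is unpacked under C:\apps\spark-3.0.0-bin-hadoop2.7 (substitute your own install path and Spark/Hadoop versions):

```shell
:: Point SPARK_HOME at the extracted Spark distribution
:: (C:\apps\spark-3.0.0-bin-hadoop2.7 is an assumed example path)
set SPARK_HOME=C:\apps\spark-3.0.0-bin-hadoop2.7

:: On Windows, Spark uses winutils.exe via HADOOP_HOME;
:: place winutils.exe in %SPARK_HOME%\bin first
set HADOOP_HOME=%SPARK_HOME%

:: Add the bin directory to PATH so spark-shell resolves from any directory
set PATH=%PATH%;%SPARK_HOME%\bin

:: Launch the interactive shell; it prints the Spark version banner
:: and provides the spark (SparkSession) and sc (SparkContext) objects
spark-shell
```

Once the shell is up, you can confirm the version it printed with `spark.version` and inspect `sc` at the `scala>` prompt.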