Resolving Spark 1.6.0 "java.lang.NullPointerException, not found: value sqlContext" error when running spark-shell on Windows 10 (64-bit)

It is easy to follow the instructions on http://spark.apache.org/docs/latest/ and download Spark 1.6.0 (Jan 04 2016) with the "Pre-built for Hadoop 2.6 and later" package type from http://spark.apache.org/downloads.html. However, when you try to run spark-shell on your Windows 10 (64-bit) machine, you may receive a java.lang.RuntimeException: java.lang.NullPointerException ("not found: value sqlContext") instead of a working shell:

java.lang.RuntimeException: java.lang.NullPointerException
        at…
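To see why this error matters, recall what a healthy spark-shell session gives you. In Spark 1.6.0 the shell pre-creates a SparkContext bound to sc and a SQLContext bound to sqlContext, so a line like the sketch below can be typed straight at the scala> prompt with no setup (this is a minimal illustration intended for the REPL, not a standalone program). The NullPointerException above means that sqlContext was never created, so any such call fails with "not found: value sqlContext".

    // Typed at the scala> prompt of a working Spark 1.6.0 spark-shell.
    // The shell pre-creates `sc` (SparkContext) and `sqlContext` (SQLContext).
    sqlContext.range(0, 3).show()
    // +---+
    // | id|
    // +---+
    // |  0|
    // |  1|
    // |  2|
    // +---+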
