PySpark - Interview Questions
Why is PySpark SparkConf used?
PySpark SparkConf is used to set the configurations and parameters required to run applications on a cluster or a local system. The SparkConf class has the following signature:
class pyspark.SparkConf(
    loadDefaults=True,
    _jvm=None,
    _jconf=None
)

where:

* loadDefaults : a boolean indicating whether to load values from Java system properties. It is True by default.
* _jvm : an internal parameter of type py4j.java_gateway.JVMView used to pass the handle to the JVM. It should not be set by users.
* _jconf : an optional parameter of type py4j.java_gateway.JavaObject that can be used to pass an existing SparkConf handle so its parameters are reused.
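In practice, a SparkConf object is built with setter methods and then passed to a SparkContext. The sketch below is a minimal illustration; the application name, master URL, and property value are placeholder assumptions, not part of the original answer.

from pyspark import SparkConf, SparkContext

# Build a configuration object; setters return the same conf, so they chain.
conf = (
    SparkConf()
    .setAppName("ExampleApp")            # name shown in the Spark UI (placeholder)
    .setMaster("local[2]")               # run locally with 2 worker threads (placeholder)
    .set("spark.executor.memory", "1g")  # example configuration property (placeholder)
)

# Create a SparkContext driven by this configuration.
sc = SparkContext(conf=conf)

# Inspect the effective settings as a list of (key, value) pairs.
print(sc.getConf().getAll())

sc.stop()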