PySpark - Interview Questions
What do you mean by PySpark architecture?
PySpark follows the same master-worker architecture as Apache Spark. The controlling node runs the driver program, and the remaining nodes are workers. When an application starts, the Spark driver creates a SparkContext, which serves as the entry point to the cluster. The actual computations run as tasks on executors hosted by the worker nodes, while a cluster manager (for example standalone, YARN, or Kubernetes) allocates the resources those workers need.
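A minimal sketch of how this looks in code, assuming a local run for illustration (the app name and the local[*] master URL are placeholders; on a real cluster the master would point at the cluster manager):

from pyspark.sql import SparkSession

# The driver program creates the SparkSession; its underlying SparkContext
# is the entry point that connects to the cluster manager.
spark = (
    SparkSession.builder
    .appName("ArchitectureDemo")   # hypothetical app name
    .master("local[*]")            # local mode for illustration only
    .getOrCreate()
)

sc = spark.sparkContext

# Work submitted through the SparkContext is split into tasks and
# executed on the worker nodes (executors), not on the driver.
rdd = sc.parallelize(range(10))
print(rdd.sum())  # 45

spark.stop()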