Correct Answer : SparkContext
Explanation : SparkContext is used to create an RDD in PySpark. It is the entry point to the Spark computing system and provides APIs to create RDDs, accumulate values, and manipulate data. Other entry-point components in PySpark include SQLContext, SparkSession, and DataFrameReader; since Spark 2.0, SparkSession is the unified entry point and exposes the underlying SparkContext via its `sparkContext` attribute.