Spark - Interview Questions
What are the downsides of Spark?
Spark keeps data in memory, so the developer has to be careful. A casual developer might make the following mistakes:
 
* She may end up running everything on the local node instead of distributing work over to the cluster.
* She might hit some external web service too many times, because the calls are issued in parallel from many executors across the cluster.

The first problem is well tackled by the Hadoop MapReduce paradigm: it ensures that the data your code is churning at any point in time is fairly small, so you cannot make the mistake of trying to handle the whole dataset on a single node.
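As a minimal sketch of the first mistake, the snippet below (using an assumed HDFS path and a made-up log-filtering task) shows how calling collect() pulls the entire dataset onto the driver, so everything afterwards runs on one node, versus keeping the work distributed:

```scala
import org.apache.spark.sql.SparkSession

object DriverBottleneck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("driver-bottleneck").getOrCreate()
    val lines = spark.sparkContext.textFile("hdfs:///logs/*.log") // assumed path

    // Mistake: collect() pulls the entire dataset onto the driver (a single
    // node), so all subsequent work runs locally and may exhaust driver memory.
    val localCount = lines.collect().count(_.contains("ERROR"))

    // Better: keep the work distributed; each executor filters its own
    // partitions and only the final count comes back to the driver.
    val distributedCount = lines.filter(_.contains("ERROR")).count()

    println(s"local=$localCount distributed=$distributedCount")
    spark.stop()
  }
}
```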

The second mistake is possible in MapReduce too: while writing a MapReduce job, the user may call a service from inside map() or reduce() too many times. The same overloading of a service is possible while using Spark.
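A minimal sketch of the second mistake, assuming a hypothetical lookup() call standing in for a real HTTP/RPC client and an assumed input path: calling the service from map() issues one request per record in parallel from every executor, while mapPartitions lets a single client (or a batched request) serve many records and bound the call rate.

```scala
import org.apache.spark.sql.SparkSession

object ServiceOverload {
  // Placeholder for a real HTTP/RPC client; assumed to exist for illustration.
  def lookup(id: String): String = s"profile-for-$id"

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("service-overload").getOrCreate()
    val ids = spark.sparkContext.textFile("hdfs:///input/user-ids.txt") // assumed path

    // Mistake: one service call per record, issued in parallel from every
    // executor core, so the external service sees a flood of requests.
    val naive = ids.map(id => (id, lookup(id)))

    // Mitigation: work per partition so one client (or one batched request)
    // can serve many records, bounding the request rate per executor.
    val batched = ids.mapPartitions { part =>
      // create one client per partition here instead of per record
      part.map(id => (id, lookup(id)))
    }

    println(naive.count() + batched.count())
    spark.stop()
  }
}
```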