Conversant Spark Profiler

Spark Profiler shows how the events generated by Spark applications can be analyzed to profile them. Profiling here means understanding how and where an application spent its time, how much processing it did, its memory footprint, and so on. Because Apache Spark is a distributed processing framework, this kind of analysis helps you understand application resource utilization and provides a framework for optimizing and tuning applications.
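As an illustration of the kind of analysis involved (a sketch, not this project's actual implementation), the snippet below aggregates executor run time per stage. It assumes Spark's standard JSON event-log format, where each log line is one event object such as `SparkListenerTaskEnd` carrying a `Task Metrics` section:

```python
import json
from collections import defaultdict

def executor_run_time_by_stage(lines):
    """Sum executor run time (ms) per stage from Spark event-log JSON lines."""
    totals = defaultdict(int)
    for line in lines:
        event = json.loads(line)
        # Only task-end events carry per-task metrics.
        if event.get("Event") == "SparkListenerTaskEnd":
            metrics = event.get("Task Metrics") or {}
            totals[event["Stage ID"]] += metrics.get("Executor Run Time", 0)
    return dict(totals)

# Synthetic events: two tasks in stage 0, one in stage 1.
log = [
    json.dumps({"Event": "SparkListenerTaskEnd", "Stage ID": 0,
                "Task Metrics": {"Executor Run Time": 120}}),
    json.dumps({"Event": "SparkListenerTaskEnd", "Stage ID": 0,
                "Task Metrics": {"Executor Run Time": 80}}),
    json.dumps({"Event": "SparkListenerTaskEnd", "Stage ID": 1,
                "Task Metrics": {"Executor Run Time": 50}}),
]
print(executor_run_time_by_stage(log))  # {0: 200, 1: 50}
```

In a real run, the input would be the files Spark writes when event logging is turned on (the standard `spark.eventLog.enabled` and `spark.eventLog.dir` properties).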

See the project repo for more information.

Wiki | Source Code