Spark page in the Hadoop view
The Spark page in the Hadoop view provides insights into application execution and the corresponding resource utilization for the Hadoop Spark service.
To access the page, in the Helix Capacity Optimization Dashboard navigation pane, click Capacity Views > Hadoop > Hadoop View, and in the Hadoop View page, click the Spark tab.
The data in the page is categorized by subsystem and grouped under separate tabs; click a tab to view the metrics for that subsystem. Metrics per cluster are displayed in a tabular format.
This topic contains the following sections:

- Summary
- Details
- Applications
Summary
Resource utilization and Spark performance indicators are displayed for each cluster. Each row in the table represents a Hadoop cluster and displays its resource utilization and Spark performance metrics.
Details
This page displays Spark information for the selected cluster under the following sections or panels:
| Section or Panel | Description |
|---|---|
| Main page | A graphical analysis of task execution and the corresponding resource utilization for the Spark service of the selected cluster. |
| Configuration Details page | Configuration information about the cluster. |
Applications
Application-level analysis is presented, with insights into both Spark performance and resource utilization. Each row in the table represents a Spark application and displays the following metrics:
For more information, see Sorting tables in views and Using filtering options in views in the Helix Capacity Optimization console.