HEBench
The following describes the format and contents of the statistics and summary files produced by Report Compiler.
The example below shows the contents of a statistics file. A summary file is very similar, except that the table of values is reduced to a few highlight columns. The table describing each column in a later section indicates which columns are common to both statistics and summary files.
The header section of a statistics file starts at the beginning of the file and extends to the "Notes" section. It consists of some standard information:
The elements listed above always appear in a header. Extra elements adding human-readable descriptions may appear interspersed throughout.
The "Notes" section follows the header and may contain important information regarding the benchmark performed, or it may be blank if no extra information is required. For example, it may note whether validation was disabled during the benchmark execution.
During a benchmark run, Test Harness measures timings for every event that occurs during execution. Events are grouped by type because some types may occur more than once. While all events are measured, each supported workload focuses on the performance of a specific event. Unless otherwise specified, a workload focuses on the "Operation" event type by default (corresponding to calls to the hebench::APIBridge::operate() function). However, the framework is designed such that workloads may indicate a main event type other than "Operation". The event type that is the focus of a workload is known as the main event.
The main event is workload specific and is part of the workload definition; thus, it cannot be changed. Test Harness takes care of properly executing the main event such that statistics can be collected correctly, such as performing the requested number of warmup iterations in a latency test, or establishing a minimum execution time in the offline category.
Each event type is uniquely identified within the report by a numeric ID. The main event for the workload executed in the benchmark is indicated in the "Main event" section of the statistics and summary files. This section is a single line containing the ID for the main event, followed by the name of the event type.
In general, the name given to an event type is used for informational purposes only and is inconsequential; only the unique event type ID is important. Each event type corresponds to one of the main functions from the API Bridge function pipeline.
The final section of a statistics or summary file is the table containing the data with information regarding the benchmark executed. This data is the result of the statistical computations based on all the events recorded in the benchmark report.
Test Harness measures both wall time and CPU time for a benchmark.
The statistics table contains a section for each: Wall time and CPU time.
The following information is collected for both Wall and CPU time.
Column name | Appears in Summary | Description |
---|---|---|
Average | Y | Average time that the implementation took to process an input sample. |
Standard Deviation | Y | Standard deviation from the average. |
Time Unit | Y | Time unit used to represent all timings for the section. |
Time Factor | Y | Time factor representing the time unit in relation to a second; that is, how many seconds there are in one time unit. Thus, any timing in the section can be converted from the time unit into seconds by multiplying by this value. |
Min | N | For all recorded events of the same type, this is the time it took to complete the shortest event. |
Max | N | For all recorded events of the same type, this is the time it took to complete the longest event. |
Median | N | 50th percentile timing. |
Trimmed Average * | N | Average time that the implementation took to process an input sample, computed from the trimmed input samples dataset. |
Trimmed Standard Deviation * | N | Standard deviation from the trimmed average. |
1-th percentile | N | 1st percentile timing. |
10-th percentile | N | 10th percentile timing. |
90-th percentile | N | 90th percentile timing. |
99-th percentile | N | 99th percentile timing. |
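As an illustrative sketch of how the columns above relate to the raw measurements (not Report Compiler's actual implementation), the statistics can be computed from a list of recorded event timings. The timing values, the nearest-rank percentile definition, and the millisecond time unit here are all assumptions made for the example:

```python
import statistics

# Hypothetical wall-time measurements (in seconds) for one event type.
timings = [0.010, 0.012, 0.011, 0.013, 0.050, 0.009, 0.011, 0.012, 0.010, 0.011]

def percentile(sorted_data, pct):
    # Nearest-rank percentile; the exact definition used by Report Compiler
    # may differ.
    k = round(pct / 100 * (len(sorted_data) - 1))
    return sorted_data[k]

data = sorted(timings)
average = statistics.mean(data)          # "Average" column
std_dev = statistics.stdev(data)         # "Standard Deviation" column
minimum, maximum = data[0], data[-1]     # "Min" and "Max" columns
median = statistics.median(data)         # "Median" column
p90 = percentile(data, 90)               # "90-th percentile" column

# "Time Factor" relates the chosen time unit to seconds: multiplying a value
# in the unit by the factor yields seconds. With milliseconds as the unit:
time_factor = 0.001                      # 1 ms = 0.001 s
average_ms = average / time_factor       # seconds -> milliseconds
```

Note that all timings in a section share one time unit, so a single Time Factor suffices to convert the whole section back to seconds.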
The following information is specific for Wall time only.
Column name | Appears in Summary | Description |
---|---|---|
Total Wall Time | N | Accumulated total wall time elapsed for all events of the same type. |
Samples per sec | Y | Number of input samples that the benchmark implementation was able to process per second for the specified event type. |
Samples per sec trimmed * | N | Number of input samples from the trimmed set that the benchmark implementation was able to process per second for the specified event type. |
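Under the assumption that "Samples per sec" is the ratio of input samples processed to total wall time for the event type, the relationship between the columns above can be sketched as follows; the figures are made up for illustration:

```python
# Hypothetical figures for one event type.
total_wall_time_s = 2.5    # "Total Wall Time", expressed here in seconds
input_samples = 1000       # "Input Samples" column

# Throughput as reported in the "Samples per sec" column.
samples_per_sec = input_samples / total_wall_time_s
print(samples_per_sec)     # 400.0
```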
The following information appears once and applies to both measurements.
Column name | Appears in Summary | Description |
---|---|---|
ID | Y | Numeric value identifying the event type. The main event section of the file will indicate one of these values. |
Event | Y | Name of the event type corresponding to the identifier. |
Input Samples | Y | Total number of input samples processed by the specified operation during the benchmark run. |
* The trimmed input samples dataset is obtained from the complete input samples dataset by discarding the top 10% and bottom 10% of values. So, if a dataset has 100 input samples, the corresponding trimmed dataset has 80 values.
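The trimming rule above can be sketched as follows; this is an illustrative reimplementation, not Report Compiler's code:

```python
def trim(samples, fraction=0.10):
    # Sort, then discard `fraction` of the values at each end.
    data = sorted(samples)
    k = int(len(data) * fraction)    # number of values to drop at each end
    return data[k:len(data) - k]

samples = list(range(100))           # a dataset of 100 input samples
trimmed = trim(samples)
print(len(trimmed))                  # 80, matching the example above
```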