Perf Validation at LAVA
Blueprint information
- Status: Complete
- Approver: Ricardo Salveti
- Priority: Medium
- Drafter: None
- Direction: Approved
- Assignee: Avik Sil
- Definition: Approved
- Series goal: Accepted for 11.11
- Implementation: Implemented
- Milestone target: 11.10
- Started by: Ricardo Salveti
- Completed by: David Zinman
Related branches
Whiteboard
[dmart] Validation on multiple trees and SoCs is important, since SoC-specific support is needed for perf to work properly -- this seems to be a reason for the current flakiness: we confirm that perf works on one tree, while in the meantime something causes it to break in another.
[aviksil, Oct 25, 2011]: "perf test" runs a built-in testsuite that performs sanity checks. Unfortunately, it failed on both Panda and Beagle (though on Beagle the other perf tools work fine). I wrote a lava-test definition for this perf test: https:/
[aviksil, Oct 25, 2011]: plars plans to look at the integration after 11.10 release
[dzin, Nov 8, 2011]: break out postponed item to bug 888115. Mark as implemented.
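As a rough sketch of what a lava-test wrapper around "perf test" might do, the following shell function runs the testsuite and counts failed subtests. The `run_perf_test` helper name and the failure-counting logic are assumptions for illustration, not the actual lava-test definition; `perf test` itself prints one result line per subtest (ending in `Ok`, `FAILED!`, or `Skip`).

```shell
#!/bin/sh
# Sketch only: wrap `perf test`, echo its output, and report how many
# subtests failed. `perf test` prints one line per subtest, e.g.
#   1: vmlinux symtab matches kallsyms : Ok
#   2: detect open syscall event      : FAILED!
run_perf_test() {
    # "$@" is the command to run (normally: perf test).
    output=$("$@" 2>&1)
    echo "$output"
    # Count subtests that reported FAILED ("|| true" keeps grep's
    # no-match exit status from tripping `set -e` callers).
    fails=$(printf '%s\n' "$output" | grep -c 'FAILED' || true)
    echo "perf-test failures: $fails"
    # Return non-zero if anything failed, so the harness records a failure.
    [ "$fails" -eq 0 ]
}
```

A LAVA test step could then invoke `run_perf_test perf test` and rely on the exit status for pass/fail reporting.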
Headline: Perf for ARM has been validated using distinct test cases.
Acceptance: Perf validated for ARM by running testsuites through the LAVA framework
Work items:
Identify a set of useful test cases to validate perf: DONE
Check whether the already-available test cases, if any, are sufficient: DONE
Improve/develop the test cases to be able to validate the tool: DONE
Create a LAVA test job containing the test cases identified by the previous work items: DONE
Announce the new test case once it is integrated in LAVA: POSTPONED