End-to-End Demo of Enabling Something for LAVA
LAVA is amazing, but understanding how the pieces fit together end-to-end is still a bit confusing and not well documented. This session will walk through a scenario I recently went through: enabling some benchmarks for LAVA in the Android build. Topics will include:
* the LAVA pieces - some context on the components, the code repos
behind them, and how they fit together
* getting started - setting up a virtualenv so that you can
easily run lava-android-test from your dev environment
* how to add a test - a demonstration of two benchmark tests I
created: a simple one using the default parser, and a more complex
one where I had to extend the parser
* getting the results into the dashboard - setting up and using
lava-dashboard-tool from your local dev system
* running Android tests in the validation lab - how to work
with android-build.l.o to run specific tests after a build
* playing with the data - a brief look at how you can pull bundles
from the validation server and analyze them. For example, I've
written a script that looks at test_results from multiple runs,
computes an average and standard deviation for each result, and
generates some nice charts.
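To give a feel for the "how to add a test" topic: the default parser works by matching lines of a test's output against a regular expression with named groups. This is a minimal sketch of that idea in plain Python; the sample output, pattern, and test-case names here are made up for illustration, not taken from a real benchmark or from the lava-android-test source.

```python
import re

# Hypothetical benchmark output; real benchmarks each have their own format.
sample_output = """\
mem_bandwidth: 1523.4 MB/s
fp_ops: 88.1 MFLOPS
"""

# A parser pattern in the lava-test style: named groups pick out the
# test case id, the measurement, and its units from each output line.
PATTERN = r"^(?P<test_case_id>\w+):\s+(?P<measurement>[\d.]+)\s+(?P<units>\S+)$"

results = []
for line in sample_output.splitlines():
    match = re.match(PATTERN, line)
    if match:
        results.append(match.groupdict())

for r in results:
    print(r["test_case_id"], r["measurement"], r["units"])
```

When the default pattern can't express a benchmark's output (multi-line results, derived values), that's where extending the parser with custom code comes in.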
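The analysis step from the last bullet can be sketched as a short script. The test names and numbers below are invented; in practice the per-run measurements would be pulled out of bundles fetched from the validation server.

```python
from statistics import mean, stdev

# Hypothetical measurements gathered from several runs of the same
# benchmarks; real data would come from downloaded result bundles.
runs = {
    "mem_bandwidth": [1523.4, 1519.0, 1530.2],
    "fp_ops": [88.1, 87.5, 89.0],
}

# Compute the average and (sample) standard deviation per test case.
summary = {
    test: {"mean": mean(values), "stdev": stdev(values)}
    for test, values in runs.items()
}

for test, stats in summary.items():
    print(f"{test}: mean={stats['mean']:.2f} stdev={stats['stdev']:.2f}")
```

From a summary like this it's a small step to feed the per-test means and deviations into a plotting library to get the charts mentioned above.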