mirror of https://github.com/ARM-software/workload-automation.git synced 2025-07-14 19:13:37 +01:00

157 Commits

Author SHA1 Message Date
06518ad40a Version bump for release 2019-12-20 16:07:10 +00:00
009fd831b8 docs/changelog: Update changelog for version 3.2 release. 2019-12-20 16:07:10 +00:00
88284750e7 Dockerfile: Update to reference new release of WA and devlib 2019-12-20 16:07:10 +00:00
8b337768a3 Dockerfile: Update ubuntu base to 19.10 2019-12-20 16:07:10 +00:00
38aa9d12bd fw/entrypoint: Fix devlib version check
The absence of a value in the "dev" version field indicates a release
version; ensure this is taken into account when comparing version numbers.
2019-12-20 16:07:10 +00:00
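A minimal sketch of the ordering this implies (hypothetical helper for illustration, not WA's actual code):

.. code-block:: python

    # An empty "dev" field marks a release, which must compare as newer
    # than any devN pre-release of the same major.minor.revision.
    def version_key(major, minor, revision, dev=''):
        return (major, minor, revision, dev == '', dev)

    assert version_key(3, 2, 0) > version_key(3, 2, 0, 'dev1')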
769c883a3a requirements: Update to latest known working package versions 2019-12-20 16:07:10 +00:00
90db655959 instrument/perf: Fix incorrect argument 2019-12-20 16:07:10 +00:00
817d98ed72 wa/instruments: Refactor collectors to use Collector Interface
Update the WA instruments which rely on the refactored devlib collectors
to reflect the new API.
2019-12-20 15:17:01 +00:00
d67668621c Geekbench: Adding 4.4.2 as a supported version
There have been no UI changes to the application so simply adding the
new supported version to the list of accepted versions.
2019-12-19 14:18:10 +00:00
1531ddcdef workloads/speedometer: Only close tabs on supported devices
Some devices don't have the option to close all tabs so don't error if
this element cannot be found.
2019-12-18 10:07:11 +00:00
322f9be2d3 workloads/googleplaybooks: Fix book selector
Do not try to use a parent element of the book entry; search for the entry
directly.
2019-12-18 10:07:11 +00:00
494424c8ea utils/types: Fix ParameterDict update method.
When updating a ParameterDict with another ParameterDict the unencoded
values were being merged. Ensure consistent behaviour by implicitly
iterating via `__iter__` which will cause ParameterDict values to be
decoded before being re-encoded as expected.
2019-12-18 10:07:11 +00:00
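A simplified, self-contained stand-in illustrating the decode/re-encode round trip (not WA's actual ParameterDict, which lives in wa/utils/types.py):

.. code-block:: python

    # Values are stored encoded; iteration and item access decode them,
    # so update() must go through __iter__/__getitem__ to stay consistent.
    class EncodingDict(dict):
        def __setitem__(self, key, value):
            super().__setitem__(key, 'enc:' + str(value))  # encode on entry

        def __getitem__(self, key):
            return super().__getitem__(key)[4:]  # strip 'enc:' on access

        def update(self, other):
            for key in other:           # __iter__ yields keys, values decoded
                self[key] = other[key]  # __setitem__ re-encodes them

    d, e = EncodingDict(), EncodingDict()
    e['x'] = 1
    d.update(e)
    assert d['x'] == '1'  # decoded, then consistently re-encoded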
ee54a68b65 Updating the framework to include the app package data
As part of our continuous integration system it has become
clear that gathering the app package data as well as the
version name can prove useful.

Adding this functionality to mainline as it could prove
useful to other developers.
2019-12-03 17:48:59 +00:00
cc1cc6f77f tests: fix pytest warnings
Fix warnings reported when running unit tests via pytest.

- Rename TestDevice to MockDevice, so that it is not interpreted as a
  test case.
- Fix collections abstract base class imports.
- Add an ini file to ignore the same from "past".
2019-12-03 14:03:18 +00:00
da0ceab027 PCMark: Updating to handle new Android 10 permission warnings
Android 10 has introduced a new permission warning and a
separate warning about the APK being built for an older version
of the Android OS. Added two checks to accept these
permissions and continue with the workload and the given APK
rather than attempting to update.
2019-11-28 16:02:20 +00:00
683eec2377 PCMark: Editing the screen orientation
Some devices have proved to have a natural orientation
that does not lend itself well to this workload, therefore
change the orientation lock to portrait instead of
natural.
2019-11-22 16:29:59 +00:00
07e47de807 Geekbench: Updating supported versions
Adding support for Geekbench 4.3.4 and 4.4.0.

Adding support for Geekbench Corporate 5.0.1 and 5.0.3.

There are no changes required to the functional workload.
2019-11-22 12:07:50 +00:00
5906bca6b3 instruments/acme_cape: Fix missing parameter to get_instruments
The signature of `get_instruments` was missing the `keep_raw` parameter
so fix this and use it as part of the subsequent common invocation.
2019-11-18 16:09:09 +00:00
9556c3a004 docs: Fix typos 2019-11-05 08:35:57 +00:00
1f4bae92bf docs/device_setup: Explicitly mention load_default_modules
This is becoming a commonly used parameter in the `device_config` so
explicitly list its functionality.
2019-11-05 08:35:57 +00:00
dcbc00addd docs/faq: Add workaround for module initialisation failure 2019-11-05 08:35:57 +00:00
4ee75be7ab docs/userguide: Update reference to outdated output_processor 2019-11-05 08:35:57 +00:00
796dfb1de6 Update googlephotos workload to work against most recent version
* Updated googlephotos workload to work against version 4.28.0
2019-10-29 16:17:20 +00:00
f3e7b14b28 Update googlemaps workload to work against most recent version
* Updated google maps workload to work against apk version 10.19.1
2019-10-29 13:34:08 +00:00
e9839d52c4 output_processor/postgres: Fix out of range for hostid
Change the field type of `hostid` as part of `TargetInfo` from `Int` to
Bigint to prevent some ids from exceeding the maximum value.
2019-10-23 15:45:56 +01:00
7ebbb05934 target/info: Fix missing return statement
Add missing return statement when upgrading a `TargetInfo` POD to v5.
2019-10-23 15:45:56 +01:00
13166f66d1 Update gmail workload to work against most recent version
* Updated gmail to work against 2019.05.26.252424914.release.
2019-10-17 14:17:56 +01:00
ab5d12be72 output_processors/cpustates: Improve handling of missing cpuinfo data
Improve checking of whether cpu idle state information is available for
processing.
Add debug message to inform user if the cpuidle module is not detected
on the target.
2019-10-15 15:17:02 +01:00
298bc3a7f3 output_processors/cpustates: Deal with cpufreq data being unavailable
If the `cpufreq` module is not detected as present then warn the user
and process the remaining data instead of crashing.
2019-10-15 15:17:02 +01:00
09d6f4dea1 utils/cpustates: Fix inverted no_idle check
If there is no information about idle states then `no_idle` should be
set to `True` instead of `False`.
2019-10-15 15:17:02 +01:00
d7c95fa844 instrument/energy_measurement: Fix typo and formatting 2019-10-03 12:48:24 +01:00
0efd20cf59 uiauto: Update all applications to target SDK version 28
On devices running Android 9 with Google Play services, PlayProtect
blocks the installation of our automation APKs due to targeting a lower
SDK version. Update all APK builds to target SDK version 28 (Android 9);
however do not change the minimum version, to maintain backwards
compatibility.
2019-10-03 12:18:48 +01:00
e41aa3c967 instruments/energy_measurement: Add a keep_raw parameter
Add a `keep_raw` parameter to control whether raw files should be
deleted during teardown.
2019-10-03 11:38:29 +01:00
3bef4fc92d instrument/energy_measurement: Invoke teardown method for backends
Forward the teardown method invocation to the instrument backend.
2019-10-03 08:39:53 +01:00
0166180f30 PCMark: Fixing click co-ordinates
The workload is clicking the run button in the centre
of the element but this is no longer starting the
run operation.

Refactoring the code to click in the top left of the
object seems to rectify the issue.
2019-09-24 16:01:37 +01:00
a9f3ee9752 Update adobe reader workload to work with most recent version
Updated adobe reader workload to work with version 19.7.1.10709
2019-09-24 12:57:26 +01:00
35ce87068c Antutu: Updating to handle the new shortcut popup
Added logic to dismiss the new popup message which asks
to add a shortcut to the android homescreen.
2019-09-19 14:07:50 +01:00
6beac11ee2 Add simpleperf type to perf instrument
* Added simpleperf type to perf instrument as it's more stable
  on Android devices.
* Added record command to instrument
* Added output processing for simpleperf
2019-09-18 12:55:59 +01:00
2f231b5ce5 fw/target: detect module variations in TargetInfo
- Add modules entry to TargetInfo
- When retrieving TargetInfo from cache, make sure info modules match
  those for the current target, otherwise mark info as stale and
  re-generate.
2019-09-12 15:27:23 +01:00
75878e2f27 uiauto/build_scripts: Update to use python3
Ensure we are invoking python3 when attempting to import `wa` and
update the printing syntax to be compatible.
2019-09-06 11:02:13 +01:00
023cb88ab1 templates/setup: Update package setup template to specify python3 2019-09-06 11:02:13 +01:00
d27443deb5 workloads/rt_app: Update to use python3
Update the workload generation script to be python3 compatible and
invoke with python3.
2019-09-06 11:02:13 +01:00
1a15f5c761 Geekbench: Adding support for Version 5 of Geekbench Corporate 2019-09-06 10:34:41 +01:00
d3af4e7515 setup.py: Update pandas version restrictions
Pandas versions 0.25+ require Python 3.5.3 as a minimum, so ensure that
an older version of pandas is installed for older versions of Python.
2019-08-30 14:03:03 +01:00
73b0b0d709 readthedocs: Add ReadtheDocs configuration
Provide configuration file for building the documentation. We need to
specify to use python3 explicitly.
2019-08-28 15:03:38 +01:00
bb18a1a51c travis: Remove tests for python2.7 2019-08-28 11:46:26 +01:00
062be6d544 output_processors/postgresql: Don't reconnect if not initialised
Update check to clarify that we should not attempt to reconnect to
the database if the initialisation of the output processor failed.
2019-08-28 11:33:46 +01:00
c1e095be51 output_processors/postgresql: Ensure still connected to the database
Before exporting output, ensure that we are still connected to the
database. The connection may be dropped, so reconnect if necessary; this
is more of an issue with longer running jobs.
2019-08-28 11:33:46 +01:00
eeebd010b9 output_processors/postgresql: Group database connection operations
Refactors connection operations into the `connect_to_database`
method.
2019-08-28 11:33:46 +01:00
e387e3d9b7 Update to remove Python2 as supported version. 2019-07-19 17:07:46 +01:00
6042fa374a fw/version: Version Bump 2019-07-19 17:07:46 +01:00
050329a5ee fw/version: Update version for WA and required devlib 2019-07-19 16:37:00 +01:00
d9e7aa9af0 Dockerfile: Update to use the latest versions of WA and devlib 2019-07-19 16:37:00 +01:00
125cd3bb41 docs/changes: Changelog for v3.1.4 2019-07-19 16:37:00 +01:00
75ea78ea4f docs/faq: Fix formatting 2019-07-19 16:37:00 +01:00
12bb21045e instruments/SysfsExtractor: Add extracted directories as artifacts
Add the directories that have been extracted by the `SysfsExtractor` and
derived instruments as artifacts.
2019-07-19 16:36:11 +01:00
4bb1f4988f fw/DatabaseOutput: Only attempt to extract config if available
Do not try to parse `kernel_config` if no data is present.
2019-07-19 16:36:11 +01:00
0ff6b4842a fw/DatabaseOutput: Fix the retrieval of job level artifacts 2019-07-19 16:36:11 +01:00
98b787e326 fw/DatabaseOutput: Enabled retrieving of directory artifacts
To provide the same user experience as accessing a directory
artifact from a standard `wa_output`, when attempting to retrieve the
path of the artifact extract the stored tar file to a
temporary location on the host and return that path.
2019-07-19 16:36:11 +01:00
e915436661 commands/postgres: Upgrade the database schema to v1.3
Upgrade the database schema to reflect the additions of directory
artifacts and the missing TargetInfo property.
2019-07-19 16:36:11 +01:00
68e1806c07 output_processors/postgresql: Add support for new directory Artifacts
Reflecting the addition of being able to store directories as Artifacts,
enable uploading of a directory as a compressed tar file rather than
storing the file directly.
2019-07-19 16:36:11 +01:00
f19ebb79ee output_processors/postgresql: Add missing system_id field
When storing the `TargetInfo` the `system_id` attribute was omitted.
2019-07-19 16:36:11 +01:00
c950f5ec8f utils/postgres: Fix formatting 2019-07-19 16:36:11 +01:00
6aaa28781b fw/Artifact: Allows adding directories as artifacts
Adds an `is_dir` property to an `Artifact` to indicate that the
artifact represents a directory rather than an individual file.
2019-07-19 16:36:11 +01:00
d87025ad3a output_processors/postgres: Fix empty iterable
In the case of an empty iterable an empty string would be returned;
however this was not a valid value, so ensure that the brackets are
always inserted into the output.
2019-07-19 16:36:11 +01:00
ac5819da8e output_processors/postgres: Fix incorrect dict keys 2019-07-19 16:36:11 +01:00
31e08a6477 instruments/interrupts: Add interrupt files as artifacts
Ensure that the interrupt files pulled and diffed from the device are
added as artifacts.
2019-07-19 16:36:11 +01:00
47769cf28d Add a workload for Motionmark tests 2019-07-19 14:31:04 +01:00
d8601880ac setup.py: Add README as package description
Add the project README to be displayed as the project description on
PyPI.
2019-07-18 15:17:24 +01:00
0efc9b9ccd setup.py: Clean up extra dependencies
- Remove unused dependencies left over from WA2.
- Allow installing of the `daqpower` module as an optional dependency.
2019-07-18 15:17:24 +01:00
501d3048a5 requirements: Add initial version
Adds a "requirements.txt" to the project. This will not be used during a
standard installation however will be used to indicate which are known
working packages in cases of conflicts.

Update README and documentaion to reflect this.
2019-07-18 15:17:24 +01:00
c4daccd800 README: Update installation instruction to match documentation.
When installing from github we recommend installing with setup.py, as
installing with pip does not always resolve dependencies correctly.
2019-07-18 15:17:24 +01:00
db944629f3 setup.py: Update classifiers 2019-07-18 15:17:24 +01:00
564738a2ad workloads/monoperf: Fix typos 2019-07-18 15:17:24 +01:00
c092128e94 workloads: Sets requires_network attribute for workloads
Both speedometer and aitutu require internet to function; however this
attribute was missing from the workloads.
2019-07-18 15:17:24 +01:00
463840d2b7 docs/faq: Add question about non UTF-8 environments. 2019-07-12 13:32:28 +01:00
43633ab362 extras/Dockerfile: Ensure we are using utf-8 in our docker container
For compatibility we want to be using utf-8 by default when we interact
with files within WA so ensure that our environment is configured
accordingly.
2019-07-12 13:32:28 +01:00
a6f0ab31e4 fw/entrypoint: Add check for system default encoding
Check what the default encoding for the system is set to. If this is not
configured to use 'UTF-8', log a warning to the user as this is known
to cause issues when attempting to parse non-ASCII files during operation.
2019-07-12 13:32:28 +01:00
72fd5b5139 setup.py: Set maximum package version for python2.7 support
In the latest versions of pandas and numpy Python 2.7 support has been
dropped, therefore restrict the maximum version of these packages.
2019-07-08 13:46:35 +01:00
766bb4da1a fw/uiauto: Allow specifying landscape and portrait orientation
Previously the `setScreenOrientation` function only accepted relative
orientations; this caused issues when attempting to use it across tablets
and phones with different natural orientations. Now take into account the
current orientation and screen resolution to allow specifying portrait vs
landscape across different types of devices.
2019-07-04 13:18:48 +01:00
a5f0521353 utils/types: Fix typos 2019-06-28 17:56:13 +01:00
3435c36b98 fw/workload: Improve version matching and error propagation
Ensure that the appropriate error message is returned to the user to
outline what caused the version matching to fail.
Additionally fix the case where if specifying a package name directly
the version matching result would be ignored.
2019-06-28 17:56:13 +01:00
bd252a6471 fw/workload: Introduce max / min versions for apks
Allow specifying a maximum and minimum version of an APK to be used for
a workload.
2019-06-28 17:56:13 +01:00
f46851a3b4 utils/types: Add version_tuple
Allow for `version_tuple` to be used more generically to enable
natural comparing of versions encoded as strings.
2019-06-28 17:56:13 +01:00
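A hedged sketch of the comparison this enables (illustrative only; the real `version_tuple` may differ in detail):

.. code-block:: python

    # Comparing dotted versions as strings is lexicographic and wrong;
    # tuples of ints compare each component numerically instead.
    def version_tuple(version):
        return tuple(int(p) for p in version.split('.') if p.isdigit())

    assert '4.10.1' < '4.9.2'                                # string: wrong
    assert version_tuple('4.10.1') > version_tuple('4.9.2')  # tuple: right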
8910234448 fw/workload: Don't override the package manager for ApkRevent workloads
`ApkRevent` workloads should be able to use the same Apk selection
criteria as `ApkWorkloads` therefore rely on the superclass to
instantiate the `PackageHandler`.
2019-06-28 17:56:13 +01:00
1108c5701e workloads: Update to better utilize cleanup_assets and uninstall
Update the workload classes to attempt to standardize the use of the
`cleanup_assets` parameter and the newly added `uninstall` parameter.
2019-06-28 17:54:04 +01:00
f5d1a9e94a fw/workload: Add the uninstall parameter to the base workload class
In addition to being able to specify whether the APK should be
uninstalled as part of an `ApkWorkload`'s teardown, add the `uninstall`
parameter to the base `Workload` class in order to specify whether any
binaries installed for a workload should be uninstalled again.
2019-06-28 17:54:04 +01:00
959106d61b fw/workload: Update description of cleanup_assets parameter
Improve the description of the parameter as the parameter may be used in
other places aside from the teardown method.
2019-06-28 17:54:04 +01:00
0aea3abcaf workloads: Add support for UIBench Jank Tests
Add a workload that launches UIBenchJankTests. This differs from the
UIBench application as it adds automation and instrumentation to that
APK. This therefore requires a different implementation than classical
ApkWorkloads as 2 APKs are required (UIBench and UIBenchJankTests) and
the main APK is invoked through `am instrument` (as opposed to `am
start`).
2019-06-28 09:27:56 +01:00
24ccc024f8 framework.workload: am instrument APK manager
Add support for Android applications that are invoked through `am
instrument` (as opposed to `am start`) _i.e._ that have been
instrumented. See AOSP `/platform_testing/tests/` for examples of such
applications.
2019-06-28 09:27:56 +01:00
42ab811032 workloads/lmbench: Fix missing run method declaration 2019-06-19 11:28:28 +01:00
832ed797e1 fw/config/execution: Raise error if no jobs are available for running
If no jobs have been generated that are available for running then WA
will crash when trying to access the job queue. Add an explicit check to
ensure that a sensible error is raised in this case, for example if
attempting to run a specific job ID that is not found.
2019-06-06 15:17:42 +01:00
31b44e447e setup.py: Add missing dependency for building documentation
In later versions of sphinx the rtd theme needs installing explicitly
as it is no longer included in the main package.
2019-06-04 14:53:59 +01:00
179b2e2264 Dockerfile: Update to install all available extras for WA and devlib
Install all extras of WA and devlib to be able to use all available
features within the docker container.
2019-06-04 14:53:59 +01:00
22437359b6 setup.py: Change the shorthand to install all extras to `all`
In our documentation we detail being able to install the `all` extra
as a shorthand for installing all the available extra packages that WA
may require; however this was actually implemented as `everything`.
2019-06-04 14:53:59 +01:00
2347c8c007 setup.py: Add postgres dependency in extras list 2019-06-04 14:53:59 +01:00
52a0a79012 build_plugin_docs: Pylint fix
Fix various pylint warnings.
2019-06-04 14:53:20 +01:00
60693e1b65 doc: Fix build_instrument_method_map script
Fix a wrong call to a function in the script execution path.
2019-06-04 14:53:20 +01:00
8ddf16dfea doc: Patch for doc generation under Py3
Patch scripts with methods that are supported under Py2.7 and Py3.
2019-06-04 14:53:20 +01:00
9aec4850c2 workloads/uibench: Pylint Fix 2019-05-28 09:33:15 +01:00
bdaa26d772 Geekbench: Updating supported versions to include 4.3.2 2019-05-24 17:47:49 +01:00
d7aedae69c workloads/uibench: Initial commit
Add support for Android's UIBench suite of tests as a WA workload.
2019-05-24 17:47:35 +01:00
45af8c69b8 ApkWorkload: Support implicit activity path
If the activity field of an instance of ApkWorkload does not contain the
'.' character, it is assumed that it is in the Java namespace of the
application. This is similar to how activities can be referred to with
relative paths:
    com.domain.app/.activity
instead of
    com.domain.app/com.domain.app.activity
2019-05-24 17:47:35 +01:00
e398083f6e PCMark: Removing hard coded index to make the workload more robust 2019-05-22 11:07:43 +01:00
4ce41407e9 tests/test_agenda_parser: Ensure anchors can be used as part of agenda
Ensure that yaml anchors and aliases can be used within a WA agenda.
2019-05-17 20:04:33 +01:00
aa0564e8f3 tests/test_agenda_parser: Use custom yaml loader for test cases
Instead of using the default yaml loader, make sure to use our customised
loader. Also move the loading stage into the individual test cases, as
this should be part of each test case to ensure that it functions
correctly.
2019-05-17 20:04:33 +01:00
83f826d6fe utils/serializer: Re-fix support for YAML anchors
Include missing `flatten_mapping` call in our implementation of
`construct_mapping`. This is performed by a subclass in the default
implementation which was missing in our previous fix.
2019-05-17 20:04:33 +01:00
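For context, a small self-contained example of the YAML features involved (hypothetical document, not a real WA agenda); ``flatten_mapping`` is what resolves the ``<<`` merge key inside a custom ``construct_mapping``:

.. code-block:: python

    import yaml

    # Anchors (&) and aliases (*) let a document reuse a block; the merge
    # key (<<) only works if the loader flattens mappings on construction.
    doc = """
    defaults: &defaults
      iterations: 3
    workloads:
      - name: dhrystone
        <<: *defaults
    """
    print(yaml.safe_load(doc))
    # {'defaults': {'iterations': 3},
    #  'workloads': [{'iterations': 3, 'name': 'dhrystone'}]}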
1599b59770 workloads: add aitutu
Add a workload to execute the Aitutu benchmark.
2019-05-17 13:26:36 +01:00
8cd9862e32 workloads/geekbench: Clean up version handling
The workload could attempt to use the version attribute to assess the
workload activity before it was discovered, causing an error; the whole
process can be simplified using newer discovery features.
2019-05-17 09:15:23 +01:00
b4ea2798dd tests/test_agenda_parser: Remove attribute assignment
Do not try to assign the name attribute of the yaml loaded agenda as
this is not used.
2019-05-15 19:48:39 +01:00
76e6f14212 utils/serializer: pylint fixes 2019-05-15 19:48:39 +01:00
ce59318e66 utils/serializer: Fix using incorrect loader and imports
- Ensure that the new loader is used when opening files to ensure that our
custom constructors are used.
- Fix missing imports
2019-05-15 19:48:39 +01:00
5652057adb utils/serializer: fix support for YAML anchors.
Change the way maps get processed by YAML constructor to support YAML
features, such as anchors, while still maintaining dict ordering.
2019-05-15 09:59:14 +01:00
e9f5577237 utils/serializer: fix error reporting for YAML
When attempting to access the message of an exception, check not only that
e.args is populated, but also that e.args[0] actually contains
something, before defaulting to str(e).
2019-05-15 09:57:52 +01:00
ec3d928b3b docs: Fix incorrect environment variable name 2019-04-26 08:05:51 +01:00
ee8bab365b docs/revent: Clarify the naming conventions for revent recordings
As per https://github.com/ARM-software/workload-automation/issues/968
the current documentation detailing the naming scheme for a revent
recording is unclear. Reword the descriptions, focusing on the typical
use case rather than on a customized target class.
2019-04-26 08:05:51 +01:00
e3406bdb74 instruments/delay: Convert module name to identifier
- Ensure cooling module name is converted to identifier when resolving
- Fix typo
2019-04-26 08:04:45 +01:00
55d983ecaf workloads/vellamo: Fix initialization order
Ensure that uiauto parameters are set before calling the super method.
2019-04-26 08:04:45 +01:00
f8908e8194 Dockerfile: Update to newer base and Python version
- Update the base Ubuntu image to 18.10 and switch to using Python3 for
installing WA.
- Fix typo in documentation.
2019-04-18 10:48:00 +01:00
dd44d6fa16 docs/api/workload: Update documentation for activity attribute 2019-04-18 10:44:50 +01:00
753786a45c fw/workload: Add activity attribute to APK workloads
Allow specifying an `activity` attribute for an APK based workload which
will override the automatically detected activity from the resolved APK.
2019-04-18 10:44:50 +01:00
8647ceafd8 workloads/meabo: Fix incorrect add_metric call 2019-04-03 11:33:27 +01:00
2c2118ad23 fw/resource: Fix attempting to match against empty values
Update the checking of attributes to allow for empty structures, as they
can be set to empty lists etc., and therefore we should not just be
checking whether they are explicitly `None`.
2019-04-02 07:54:05 +01:00
0ec8427d05 fw/output: Implement retrieving "augmentations" for JobDatabaseOutputs
Enable retrieving augmentations on a per job basis when using a Postgres
database backend.
2019-03-18 15:26:19 +00:00
cf5c3a2723 fw/output: Add missing "augmentation" attribute to JobOutput
Add attribute to `JobOutput` to allow easy listing of enabled augmentations
for individual jobs rather than just the overall run level.
2019-03-18 15:26:19 +00:00
8ddc1c1eba fw/version: Bump to development version 2019-03-04 15:50:13 +00:00
b5db4afc05 fw/version: Version Bump
Bump to the next revision release.
2019-03-04 15:50:13 +00:00
f977c3dfc8 setup.py: Update PyYaml Dependency 2019-03-04 15:50:13 +00:00
769aae3047 utils/serializer: Explicitly state yaml loader
In newer versions of PyYAML we need to manually specify the `Loader` to
be used as per https://msg.pyyaml.org/load.
`FullLoader` is now the default loader, which attempts to avoid arbitrary
code execution; however, if we are running an older version where this is
not available, fall back to the original Loader.
2019-03-04 15:50:13 +00:00
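A hedged sketch of the described fallback (WA wraps this inside its serializer; the exact code may differ):

.. code-block:: python

    import yaml

    # Prefer FullLoader, which avoids constructing arbitrary Python
    # objects; fall back to the legacy Loader on older PyYAML versions.
    YamlLoader = getattr(yaml, 'FullLoader', yaml.Loader)

    def load_yaml(path):
        with open(path) as fh:
            return yaml.load(fh, Loader=YamlLoader)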
a1ba3c6f69 utils/misc: Update load structure to use WA's yaml wrapper 2019-03-04 15:50:13 +00:00
536fc7eb92 workloads/stress_ng: Update to use WA's yaml wrapper 2019-03-04 15:50:13 +00:00
de36dacb82 fw/version: Bump to development versions 2019-03-04 10:37:39 +00:00
637bf57cbc fw/version: Bump revision versions
Bump the revision version for WA and the required version for
devlib.
2019-03-04 10:37:39 +00:00
60ffd27bba extras/Dockerfile: Update to use the latest release version 2019-03-04 10:37:39 +00:00
984a74a6ca doc/changes: Update changelog for v3.1.2 2019-03-04 10:37:39 +00:00
5b8dc1779c setup.py: Limit the maximum version of PyYAML
Specify the latest stable release of PyYAML should be installed rather
than the latest pre-release.
2019-03-04 10:37:39 +00:00
ba0cd7f842 fw/target/info: Bump target info version
Due to mismatches in WA and devlib versions this previous upgrade method
could have been triggered before it was needed and would not be called a
second time. Now that we can be sure WA and devlib are updated together,
bump the version number again to ensure the upgrade method is called a
second time and the POD is upgraded correctly.
2019-03-04 10:37:39 +00:00
adb3ffa6aa fw/version: Introduce required version for devlib
To ensure that a compatible version of devlib is installed on the system
keep track of the version of devlib that is required by WA and provide a
more useful error message if this is not satisfied.
2019-03-04 10:37:39 +00:00
bedd3bf062 docs/faq: Add entry about missing kernel config errors
Although WA supports automatic updating of a serialized `kernel_config`
from devlib during parsing, if the installed versions of WA and devlib
have become out of sync such that WA has "updated" the old implementation,
it will not attempt to update it again when devlib is later updated to use
the new implementation, and therefore will not trigger the existing
checks that are in place.
2019-02-21 11:57:32 +00:00
03e463ad4a docs/installation: Add warning about using pip to install from github 2019-02-20 16:30:53 +00:00
2ce8d6fc95 output_processors/postgresql: Add missing default
In the case of no screen resolution being present ensure that a default
is used instead of `None`.
2019-02-14 10:51:38 +00:00
1415f61e36 workloads/chrome: Fix for tablet devices
Some tablet devices use an alternate tab switching method due to the
larger screen space. Add support for adding new tabs via the menu
instead of via the tab switcher.
2019-02-08 14:32:58 +00:00
6ab1ae74a6 wa/apk_workloads: Update to not specify a default apk version.
No longer specify a default version, to allow any available apks to be
detected and then choose the appropriate automation based on the
detected version.
Refactor to support the new supported_versions attribute, and since APK
resolution needs to have happened before setting the uiauto parameter,
move assignments to ``initialize``.
2019-02-08 13:56:55 +00:00
a1cecc0002 fw/workload: Add "supported_versions" attribute to workloads
Allow for specifying a list of supported APK versions for a workload. If
a specific version is not specified then attempt to resolve any valid
version for the workload.
2019-02-08 13:56:55 +00:00
0cba3c68dc fw/resource: Support matching APKs on multiple versions.
In the case where a range of apk versions is valid, allow the matching
process to accommodate a list of versions instead of a single value.
2019-02-08 13:56:55 +00:00
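A hypothetical sketch of list-aware matching (WA's real resolver is more involved, with loose matching and version ranges):

.. code-block:: python

    # Accept either a single version string or a list of them when
    # deciding whether a resolved APK matches; no constraint matches all.
    def version_matches(apk_version, wanted):
        if not wanted:
            return True
        if not isinstance(wanted, list):
            wanted = [wanted]
        return any(apk_version.lower() == w.lower() for w in wanted)

    assert version_matches('4.3.2', ['4.3.2', '4.3.4'])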
f267fc9277 fw/workload: Use apk version for workload if not set.
If a workload's `version` attribute is not set, and an APK file is
found, use this as the version number. This allows for workloads to not
specify a default version via parameters and for an available APK to be
automatically chosen.
2019-02-08 13:56:55 +00:00
462a5b651a fw/output: add label property to Metric
Add a "label" property to Metric that combines its name with its
classifiers into a single string.
2019-02-05 10:27:06 +00:00
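A minimal sketch of how such a label could be composed, matching the ``name/classifier1=value1/...`` format described in the API documentation further down (illustrative, not WA's exact code):

.. code-block:: python

    # Combine a metric's name with its classifiers into a single string
    # suitable for grouping values across iterations.
    def metric_label(name, classifiers):
        parts = ['{}={}'.format(k, v) for k, v in sorted(classifiers.items())]
        return '/'.join([name] + parts)

    assert metric_label('execution_time', {'iteration': 2}) == \
        'execution_time/iteration=2'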
7cd7b73f58 Fixed an error emptying the reading buffer of the poller
Fixed indentation.
2019-02-04 09:46:13 +00:00
4a9a2ad105 fw/target/info: Fix for KernelConfig refactor
The Devlib KernelConfig object was refactored in commit f65130b7c7,
therefore update the way KernelConfig objects are deserialized to reflect
the new implementation and provide a conversion for PODs.
2019-01-31 09:44:30 +00:00
9f88459f56 fw/workload: Fix Typo 2019-01-30 15:46:54 +00:00
a2087ea467 workloads/manual: Fix incorrect attribute used to access target 2019-01-30 15:46:54 +00:00
31a5a95803 output_processors/postgresql: Ensure screen resolution is a list
Ensure that the screen resolution is converted to a list to prevent
casting errors.
2019-01-30 15:46:54 +00:00
3f202205a5 doc/faq: Add answer on how to fall back to surfaceflinger 2019-01-28 12:45:10 +00:00
ce7720b26d instruments/fps: Fix Typo 2019-01-28 12:45:10 +00:00
766b96e2ad fw/workload: Add a 'View' parameter to ApkWorkloads
Allow for easy configuring of a view for a particular workload, as this
can vary depending on the device; the view can then be used by certain
instruments, for example `fps`.
2019-01-11 10:12:42 +00:00
3c9de98a4b setup: Update devlib requirements to development versions. 2019-01-11 10:12:26 +00:00
5263cfd6f8 fw/version: Add development tag to version
Add a development tag to the version format instead of using the
revision field.
2019-01-11 10:12:26 +00:00
175 changed files with 3179 additions and 616 deletions
.readthedocs.yml
.travis.yml
README.rst
doc
extras
pytest.ini
requirements.txt
setup.py
tests
wa
__init__.py
commands
framework
instruments
output_processors
utils
workloads
adobereader
aitutu
androbench
antutu
applaunch
benchmarkpi
chrome
dhrystone
exoplayer
geekbench
gfxbench
glbenchmark
gmail
googlemaps
googlephotos
googleplaybooks
googleslides
hackbench
hwuitest
lmbench
manual
meabo
memcpy
mongoperf
motionmark
openssl
pcmark
rt_app
shellscript
speedometer
stress_ng
sysbench
uibench
uibenchjanktests
vellamo
youtube

.readthedocs.yml Normal file

@ -0,0 +1,21 @@
# .readthedocs.yml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Build documentation in the docs/ directory with Sphinx
sphinx:
  builder: html
  configuration: doc/source/conf.py
# Build the docs in additional formats such as PDF and ePub
formats: all
# Set the version of Python and requirements required to build your docs
python:
  version: 3.7
  install:
    - method: setuptools
      path: .

@ -17,7 +17,6 @@ language: python
python:
- "3.6"
- "2.7"
install:
- pip install nose
@ -45,10 +44,3 @@ env:
- TEST="$PROCESS_CMD && $SHOW_CMD && $LIST_CMD && $CREATE_CMD"
script:
- echo $TEST && eval $TEST
matrix:
  exclude:
    - python: "2.7"
      env: TEST=$PYLINT
    - python: "2.7"
      env: TEST=$PEP8

@ -18,7 +18,7 @@ workloads, instruments or output processing.
Requirements
============
- Python 2.7 or Python 3
- Python 3
- Linux (should work on other Unixes, but untested)
- Latest Android SDK (ANDROID_HOME must be set) for Android devices, or
- SSH for Linux devices
@ -30,7 +30,11 @@ Installation
To install::
git clone git@github.com:ARM-software/workload-automation.git workload-automation
sudo -H pip install ./workload-automation
sudo -H python setup.py [install|develop]
Note: A `requirements.txt` is included however this is designed to be used as a
reference for known working versions rather than as part of a standard
installation.
Please refer to the `installation section <http://workload-automation.readthedocs.io/en/latest/user_information.html#install>`_
in the documentation for more details.

@ -1,5 +1,5 @@
#!/usr/bin/env python
# Copyright 2015-2015 ARM Limited
# Copyright 2015-2019 ARM Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@ -26,7 +26,7 @@ OUTPUT_TEMPLATE_FILE = os.path.join(os.path.dirname(__file__), 'source', 'instr
def generate_instrument_method_map(outfile):
signal_table = format_simple_table([(k, v) for k, v in SIGNAL_MAP.iteritems()],
signal_table = format_simple_table([(k, v) for k, v in SIGNAL_MAP.items()],
headers=['method name', 'signal'], align='<<')
priority_table = format_simple_table(zip(CallbackPriority.names, CallbackPriority.values),
headers=['decorator', 'priority'], align='<>')
@ -37,4 +37,4 @@ def generate_instrument_method_map(outfile):
if __name__ == '__main__':
generate_instrumentation_method_map(sys.argv[1])
generate_instrument_method_map(sys.argv[1])

@ -1,5 +1,5 @@
#!/usr/bin/env python
# Copyright 2014-2015 ARM Limited
# Copyright 2014-2019 ARM Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@ -25,7 +25,12 @@ from wa.utils.doc import (strip_inlined_text, get_rst_from_plugin,
get_params_rst, underline, line_break)
from wa.utils.misc import capitalize
GENERATE_FOR_PACKAGES = ['wa.workloads', 'wa.instruments', 'wa.output_processors']
GENERATE_FOR_PACKAGES = [
    'wa.workloads',
    'wa.instruments',
    'wa.output_processors',
]
def insert_contents_table(title='', depth=1):
"""
@ -41,6 +46,7 @@ def insert_contents_table(title='', depth=1):
def generate_plugin_documentation(source_dir, outdir, ignore_paths):
# pylint: disable=unused-argument
pluginloader.clear()
pluginloader.update(packages=GENERATE_FOR_PACKAGES)
if not os.path.exists(outdir):
@ -57,7 +63,7 @@ def generate_plugin_documentation(source_dir, outdir, ignore_paths):
exts = pluginloader.list_plugins(ext_type)
sorted_exts = iter(sorted(exts, key=lambda x: x.name))
try:
wfh.write(get_rst_from_plugin(sorted_exts.next()))
wfh.write(get_rst_from_plugin(next(sorted_exts)))
except StopIteration:
return
for ext in sorted_exts:
@ -73,9 +79,11 @@ def generate_target_documentation(outdir):
'juno_linux',
'juno_android']
intro = '\nThis is a list of commonly used targets and their device '\
'parameters, to see a complete for a complete reference please use the '\
'WA :ref:`list command <list-command>`.\n\n\n'
intro = (
    '\nThis is a list of commonly used targets and their device '
    'parameters; for a complete reference please use the'
    ' WA :ref:`list command <list-command>`.\n\n\n'
)
pluginloader.clear()
pluginloader.update(packages=['wa.framework.target.descriptor'])
@ -112,7 +120,8 @@ def generate_config_documentation(config, outdir):
if not os.path.exists(outdir):
os.mkdir(outdir)
outfile = os.path.join(outdir, '{}.rst'.format('_'.join(config.name.split())))
config_name = '_'.join(config.name.split())
outfile = os.path.join(outdir, '{}.rst'.format(config_name))
with open(outfile, 'w') as wfh:
wfh.write(get_params_rst(config.config_points))

@ -315,9 +315,12 @@ methods
.. method:: RunDatabaseOutput.get_artifact_path(name)
Returns a `StringIO` object containing the contents of the artifact
specified by ``name``. This will only look at the run artifacts; this will
not search the artifacts of the individual jobs.
If the artifact is a file this method returns a `StringIO` object containing
the contents of the artifact specified by ``name``. If the artifact is a
directory, the method returns a path to a locally extracted version of the
directory which is left to the user to remove after use. This will only look
at the run artifacts; it will not search the artifacts of the individual
jobs.
:param name: The name of the artifact whose path to retrieve.
:return: A `StringIO` object with the contents of the artifact
@ -452,8 +455,11 @@ methods
.. method:: JobDatabaseOutput.get_artifact_path(name)
Returns a ``StringIO`` object containing the contents of the artifact
specified by ``name`` associated with this job.
If the artifact is a file this method returns a `StringIO` object containing
the contents of the artifact specified by ``name`` associated with this job.
If the artifact is a directory, the method returns a path to a locally
extracted version of the directory which is left to the user to remove after
use.
:param name: The name of the artifact whose path to retrieve.
:return: A `StringIO` object with the contents of the artifact
@ -497,6 +503,11 @@ A :class:`Metric` has the following attributes:
or they may have been added by the workload to help distinguish between
otherwise identical metrics.
``label``
This is a string constructed from the name and classifiers, to provide a
more unique identifier, e.g. for grouping values across iterations. The
format is in the form ``name/classifier1=value1/classifier2=value2/...``.
:class:`Artifact`
-----------------
@ -597,6 +608,12 @@ The available attributes of the class are as follows:
The name of the target class that was used to interact with the device
during the run, e.g. ``"AndroidTarget"``, ``"LinuxTarget"`` etc.
``modules``
A list of names of modules that have been loaded by the target. Modules
provide additional functionality, such as access to ``cpufreq``, and which
modules are installed may impact how much of the ``TargetInfo`` has been
populated.
``cpus``
A list of :class:`CpuInfo` objects describing the capabilities of each CPU.

@ -178,6 +178,16 @@ methods.
locations) and device will be searched for an application with a matching
package name.
``supported_versions``
This attribute should be a list of apk versions that are suitable for this
workload; if a specific apk version is not specified then any available
supported version may be chosen.
``activity``
This attribute can be optionally set to override the default activity that
will be extracted from the selected APK file which will be used when
launching the APK.
``view``
This is the "view" associated with the application. This is used by
instruments like ``fps`` to monitor the current framerate being generated by

@ -2,6 +2,183 @@
What's New in Workload Automation
=================================
***********
Version 3.2
***********
.. warning:: This release only supports Python 3.5+. Python 2 support has now
been dropped.
Fixes/Improvements
==================
Framework:
----------
- ``TargetInfo`` now tracks installed modules and will ensure the cache is
also updated on module change.
- Migrated the build scripts for uiauto based workloads to Python 3.
- Uiauto applications now target SDK version 28 to prevent PlayProtect
blocking the installation of the automation apks on some devices.
- The workload metadata now includes the apk package name if applicable.
Instruments:
------------
- ``energy_instruments`` will now have their ``teardown`` method called
correctly.
- ``energy_instruments``: Added a ``keep_raw`` parameter to control whether
raw files generated during execution should be deleted upon teardown.
- Update relevant instruments to make use of the new devlib collector
interface, for more information please see the
`devlib documentation <https://devlib.readthedocs.io/en/latest/collectors.html>`_.
Output Processors:
------------------
- ``postgres``: If initialisation fails then the output processor will no
longer attempt to reconnect at a later point during the run.
- ``postgres``: Will now ensure that the connection to the database is
re-established if it is dropped, e.g. due to a long-running workload.
- ``postgres``: Change the type of the ``hostid`` field to ``Bigint`` to
allow a larger range of ids.
- ``postgres``: Bump schema version to 1.5.
- ``perf``: Added support for the ``simpleperf`` profiling tool for android
devices.
- ``perf``: Added support for the perf ``record`` command.
- ``cpustates``: Improve handling of situations where cpufreq and/or cpuinfo
data is unavailable.
Workloads:
----------
- ``adobereader``: Now supports apk version 19.7.1.10709.
- ``antutu``: Supports dismissing of popup asking to create a shortcut on
the homescreen.
- ``gmail``: Now supports apk version 2019.05.26.252424914.
- ``googlemaps``: Now supports apk version 10.19.1.
- ``googlephotos``: Now supports apk version 4.28.0.
- ``geekbench``: Added support for versions 4.3.4, 4.4.0 and 4.4.2.
- ``geekbench-corporate``: Added support for versions 5.0.1 and 5.0.3.
- ``pcmark``: Now locks device orientation to portrait to increase
compatibility.
- ``pcmark``: Supports dismissing new Android 10 permission warnings.
Other:
------
- Improve documentation to help debugging module installation errors.
*************
Version 3.1.4
*************
.. warning:: This is the last release that supports Python 2. Subsequent versions
will support Python 3.5+ only.
New Features:
==============
Framework:
----------
- ``ApkWorkload``: Allow specifying a maximum and minimum version of an APK
instead of requiring a specific version.
- ``TestPackageHandler``: Added to support running android applications that
are invoked via ``am instrument``.
- Directories can now be added as ``Artifacts``.
Workloads:
----------
- ``aitutu``: Executes the Aitutu Image Speed/Accuracy and Object
Speed/Accuracy tests.
- ``uibench``: Run a configurable activity of the UIBench workload suite.
- ``uibenchjanktests``: Run an automated and instrumented version of the
UIBench JankTests.
- ``motionmark``: Run a browser graphical benchmark.
Other:
------
- Added ``requirements.txt`` as a reference for known working package versions.
Fixes/Improvements
==================
Framework:
----------
- ``JobOutput``: Added an ``augmentation`` attribute to allow listing of
enabled augmentations for individual jobs.
- Better error handling for misconfigured job selection.
- All ``Workload`` classes now have an ``uninstall`` parameter to control whether
any binaries installed to the target should be uninstalled again once the
run has completed.
- The ``cleanup_assets`` parameter is now more consistently utilized across
workloads.
- ``ApkWorkload``: Added an ``activity`` attribute to allow for overriding the
automatically detected activity from the APK.
- ``ApkWorkload``: Added support for providing an implicit activity path.
- Fixed retrieving job level artifacts from a database backend.
Output Processors:
------------------
- ``SysfsExtractor``: Ensure that the extracted directories are added as
``Artifacts``.
- ``InterruptStatsInstrument``: Ensure that the output files are added as
``Artifacts``.
- ``Postgres``: Fix missing ``system_id`` field from ``TargetInfo``.
- ``Postgres``: Support uploading directory ``Artifacts``.
- ``Postgres``: Bump the schema version to v1.3.
Workloads:
----------
- ``geekbench``: Improved apk version handling.
- ``geekbench``: Now supports apk version 4.3.2.
Other:
------
- ``Dockerfile``: Now installs all optional extras for use with WA.
- Fixed support for YAML anchors.
- Fixed building of documentation with Python 3.
- Changed shorthand of installing all of WA extras to `all` as per
the documentation.
- Upgraded the Dockerfile to use Ubuntu 18.10 and Python 3.
- Restricted maximum versions of ``numpy`` and ``pandas`` for Python 2.7.
*************
Version 3.1.3
*************
Fixes/Improvements
==================
Other:
------
- Security update for PyYAML to attempt prevention of arbitrary code execution
during parsing.
*************
Version 3.1.2
*************
Fixes/Improvements
==================
Framework:
----------
- Implement an explicit check for Devlib versions to ensure that versions
are kept in sync with each other.
- Added a ``View`` parameter to ApkWorkloads for use with certain instruments
for example ``fps``.
- Added ``"supported_versions"`` attribute to workloads to allow specifying a
list of supported version for a particular workload.
- Change default behaviour to run any available version of a workload if a
specific version is not specified.
Output Processors:
------------------
- ``Postgres``: Fix handling of ``screen_resolution`` during processing.
Other
-----
- Added additional information to documentation.
- Added fix for Devlib's ``KernelConfig`` refactor.
- Added a ``"label"`` property to ``Metrics``.
*************
Version 3.1.1
*************

@ -5,10 +5,12 @@ Convention for Naming revent Files for Revent Workloads
-------------------------------------------------------------------------------
There is a convention for naming revent files which you should follow if you
want to record your own revent files. Each revent file must start with the
device name(case sensitive) then followed by a dot '.' then the stage name
then '.revent'. All your custom revent files should reside at
``'~/.workload_automation/dependencies/WORKLOAD NAME/'``. These are the current
want to record your own revent files. Each revent file must be called (case sensitive)
``<device name>.<stage>.revent``,
where ``<device name>`` is the name of your device (as defined by the model
name of your device which can be retrieved with
``adb shell getprop ro.product.model`` or by the ``name`` attribute of your
customized device class), and ``<stage>`` is one of the following currently
supported stages:
:setup: This stage is where the application is loaded (if present). It is
@ -26,10 +28,12 @@ Only the run stage is mandatory, the remaining stages will be replayed if a
recording is present otherwise no actions will be performed for that particular
stage.
For instance, to add a custom revent files for a device named "mydevice" and
a workload name "myworkload", you need to add the revent files to the directory
``/home/$WA_USER_HOME/dependencies/myworkload/revent_files`` creating it if
necessary. ::
All your custom revent files should reside at
``'$WA_USER_DIRECTORY/dependencies/WORKLOAD NAME/'``. So
typically to add a custom revent files for a device named "mydevice" and a
workload name "myworkload", you would need to add the revent files to the
directory ``~/.workload_automation/dependencies/myworkload/revent_files``
creating the directory structure if necessary. ::
mydevice.setup.revent
mydevice.run.revent

@ -69,7 +69,72 @@ WA3 config file.
**Q:** My Juno board keeps resetting upon starting WA even if it hasn't crashed.
--------------------------------------------------------------------------------
Please ensure that you do not have any other terminals (e.g. ``screen``
**A:** Please ensure that you do not have any other terminals (e.g. ``screen``
sessions) connected to the board's UART. When WA attempts to open the connection
for its own use this can cause the board to reset if a connection is already
present.
**Q:** I'm using the FPS instrument but I do not get any/correct results for my workload
-----------------------------------------------------------------------------------------
**A:** If your device is running Android 6.0+ then the default utility for
collecting fps metrics will be ``gfxinfo`` however this does not seem to be able
to extract any meaningful information for some workloads. In this case please
try setting the ``force_surfaceflinger`` parameter for the ``fps`` augmentation
to ``True``. This will attempt to guess the "View" for the workload
automatically however this is device specific and therefore may need
customizing. If this is required please open the application and execute
``dumpsys SurfaceFlinger --list`` on the device via adb. This will provide a
list of all views available for measuring.
As an example, when trying to find the view for the AngryBirds Rio workload you
may get something like:
.. code-block:: none
...
AppWindowToken{41dfe54 token=Token{77819a7 ActivityRecord{a151266 u0 com.rovio.angrybirdsrio/com.rovio.fusion.App t506}}}#0
a3d001c com.rovio.angrybirdsrio/com.rovio.fusion.App#0
Background for -SurfaceView - com.rovio.angrybirdsrio/com.rovio.fusion.App#0
SurfaceView - com.rovio.angrybirdsrio/com.rovio.fusion.App#0
com.rovio.angrybirdsrio/com.rovio.fusion.App#0
boostedAnimationLayer#0
mAboveAppWindowsContainers#0
...
From these, ``"SurfaceView - com.rovio.angrybirdsrio/com.rovio.fusion.App#0"`` is
most likely the View that needs to be set as the ``view`` workload
parameter and will be picked up by the ``fps`` augmentation.
**Q:** I am getting an error which looks similar to ``'CONFIG_SND_BT87X is not exposed in kernel config'...``
-------------------------------------------------------------------------------------------------------------
**A:** If you are receiving this under normal operation this can be caused by a
mismatch of your WA and devlib versions. Please update both to their latest
versions and delete your ``$USER_HOME/.workload_automation/cache/targets.json``
(or equivalent) file.
**Q:** I get an error which looks similar to ``UnicodeDecodeError('ascii' codec can't decode byte...``
------------------------------------------------------------------------------------------------------
**A:** If you receive this error or a similar warning about your environment,
please ensure that you configure your environment to use a locale which supports
UTF-8. Otherwise this can cause issues when attempting to parse files containing
non-ASCII characters.
**Q:** I get the error ``Module "X" failed to install on target``
------------------------------------------------------------------------------------------------------
**A:** By default a set of devlib modules will be automatically loaded onto the
target designed to add additional functionality. If the functionality provided
by the module is not required then the module can be safely disabled by setting
``load_default_modules`` to ``False`` in the ``device_config`` entry of the
:ref:`agenda <config-agenda-entry>` and then re-enabling any specific modules
that are still required. An example agenda snippet is shown below:
.. code-block:: none
config:
device: generic_android
device_config:
load_default_modules: False
modules: ['list', 'of', 'modules', 'to', 'enable']

@ -16,7 +16,7 @@ Configuration
Default configuration file change
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Instead of the standard ``config.py`` file located at
``$WA_USER_HOME/config.py`` WA now uses a ``config.yaml`` file (at the same
``$WA_USER_DIRECTORY/config.py`` WA now uses a ``config.yaml`` file (at the same
location) which is written in the YAML format instead of python. Additionally
upon first invocation WA3 will automatically try and detect whether a WA2 config
file is present and convert it to use the new WA3 format. During this process

@ -17,6 +17,8 @@ further configuration will be required.
Android
-------
.. _android-general-device-setup:
General Device Setup
^^^^^^^^^^^^^^^^^^^^
@ -44,12 +46,15 @@ common parameters you might want to change are outlined below.
Android builds. If this is not the case for your device, you will need to
specify an alternative working directory (e.g. under ``/data/local``).
:load_default_modules: A number of "default" modules (e.g. for cpufreq
subsystem) are loaded automatically, unless explicitly disabled. If you
encounter an issue with one of the modules then this setting can be set to
``False`` and any specific modules that you require can be requested via the
``modules`` entry.
:modules: A list of additional modules to be installed for the target. Devlib
implements functionality for particular subsystems as modules. A number of
"default" modules (e.g. for cpufreq subsystem) are loaded automatically,
unless explicitly disabled. If additional modules need to be loaded, they
may be specified using this parameter.
implements functionality for particular subsystems as modules. If additional
modules need to be loaded, they may be specified using this parameter.
Please see the `devlib documentation <http://devlib.readthedocs.io/en/latest/modules.html>`_
for information on the available modules.
@ -83,6 +88,7 @@ or a more specific config could be:
device_config:
device: 0123456789ABCDEF
working_directory: '/sdcard/wa-working'
load_default_modules: True
modules: ['hotplug', 'cpufreq']
core_names : ['a7', 'a7', 'a7', 'a15', 'a15']
# ...

@ -14,9 +14,9 @@ Using revent with workloads
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Some workloads (pretty much all games) rely on recorded revents for their
execution. ReventWorkloads will require between 1 and 4 revent files be be ran.
There is one mandatory recording ``run`` for performing the actual execution of
the workload and the remaining are optional. ``setup`` can be used to perform
execution. ReventWorkloads require between 1 and 4 revent files to be run.
There is one mandatory recording, ``run``, for performing the actual execution of
the workload and the remaining stages are optional. ``setup`` can be used to perform
the initial setup (navigating menus, selecting game modes, etc).
``extract_results`` can be used to perform any actions after the main stage of
the workload for example to navigate a results or summary screen of the app. And
@ -26,17 +26,21 @@ exiting the app.
Because revents are very device-specific\ [*]_, these files would need to
be recorded for each device.
The files must be called ``<device name>.(setup|run|extract_results|teardown).revent``
, where ``<device name>`` is the name of your device (as defined by the ``name``
attribute of your device's class). WA will look for these files in two
places: ``<install dir>/wa/workloads/<workload name>/revent_files``
and ``~/.workload_automation/dependencies/<workload name>``. The first
location is primarily intended for revent files that come with WA (and if
The files must be called ``<device name>.(setup|run|extract_results|teardown).revent``,
where ``<device name>`` is the name of your device (as defined by the model
name of your device which can be retrieved with
``adb shell getprop ro.product.model`` or by the ``name`` attribute of your
customized device class).
WA will look for these files in two places:
``<installdir>/wa/workloads/<workload name>/revent_files`` and
``$WA_USER_DIRECTORY/dependencies/<workload name>``. The
first location is primarily intended for revent files that come with WA (and if
you did a system-wide install, you'll need sudo to add files there), so it's
probably easier to use the second location for the files you record. Also,
if revent files for a workload exist in both locations, the files under
``~/.workload_automation/dependencies`` will be used in favour of those
installed with WA.
probably easier to use the second location for the files you record. Also, if
revent files for a workload exist in both locations, the files under
``$WA_USER_DIRECTORY/dependencies`` will be used in favour
of those installed with WA.
.. [*] It's not just about screen resolution -- the event codes may be different
even if devices use the same screen.

@ -12,8 +12,9 @@ Installation
.. module:: wa
This page describes the 3 methods of installing Workload Automation 3. The first
option is to use :ref:`pip` which
will install the latest release of WA, the latest development version from :ref:`github <github>` or via a :ref:`dockerfile`.
option is to use :ref:`pip` which will install the latest release of WA, the
latest development version from :ref:`github <github>` or via a
:ref:`dockerfile`.
Prerequisites
@ -97,8 +98,8 @@ similar distributions, this may be done with APT::
If you do run into this issue after already installing some packages,
you can resolve it by running ::
sudo chmod -R a+r /usr/local/lib/python2.7/dist-packagessudo
find /usr/local/lib/python2.7/dist-packages -type d -exec chmod a+x {} \;
sudo chmod -R a+r /usr/local/lib/python2.7/dist-packages
sudo find /usr/local/lib/python2.7/dist-packages -type d -exec chmod a+x {} \;
(The paths above will work for Ubuntu; they may need to be adjusted
for other distros).
@ -171,9 +172,11 @@ install them upfront (e.g. if you're planning to use WA to an environment that
may not always have Internet access).
* nose
* PyDAQmx
* pymongo
* jinja2
* mock
* daqpower
* sphinx
* sphinx_rtd_theme
* psycopg2-binary
@ -199,6 +202,18 @@ Alternatively, you can also install the latest development version from GitHub
cd workload-automation
sudo -H python setup.py install
.. note:: Please note that if using pip to install from github this will most
likely result in an older and incompatible version of devlib being
installed alongside WA. If you wish to use pip please also manually
install the latest version of
`devlib <https://github.com/ARM-software/devlib>`_.
.. note:: Please note that while a `requirements.txt` is included, this is
          designed to be a reference of known working packages rather than to
          be used as part of a standard installation. The version restrictions
          in place as part of `setup.py` should automatically ensure the correct
          packages are installed, however if you encounter issues please try
          updating/downgrading to the package versions listed within.
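If you would like the optional dependencies to be pulled in automatically,
the extras defined in ``setup.py`` (for example ``doc``, ``postgres``, ``daq``,
or the umbrella ``all``) can be requested when installing from a source
checkout, mirroring what the WA Dockerfile does::

    cd workload-automation
    pip install .[all]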
If the above succeeds, try ::
@ -222,7 +237,7 @@ image in a container.
The Dockerfile can be found in the "extras" directory or online at
`<https://github.com/ARM-software/workload-automation/blob/next/extras/Dockerfile>`_
which contains additional information about how to build and to use the file.
(Optional) Post Installation

@ -125,7 +125,7 @@ There are multiple options for configuring your device depending on your
particular use case.
You can either add your configuration to the default configuration file
``config.yaml``, under the ``$WA_USER_DIRECTORY/`` directory or you can specify it in
the ``config`` section of your agenda directly.
Alternatively if you are using multiple devices, you may want to create separate
@ -318,7 +318,7 @@ like this:
config:
augmentations:
- ~execution_time
- targz
iterations: 2
workloads:
- memcpy
@ -332,7 +332,7 @@ This agenda:
- Specifies two workloads: memcpy and dhrystone.
- Specifies that dhrystone should run in one thread and execute five million loops.
- Specifies that each of the two workloads should be run twice.
- Enables the targz output processor, in addition to the output processors enabled in
the config.yaml.
- Disables execution_time instrument, if it is enabled in the config.yaml

@ -30,7 +30,7 @@ An example agenda can be seen here:
device: generic_android
device_config:
device: R32C801B8XY # The adb name of our device we want to run on
disable_selinux: true
load_default_modules: true
package_data_directory: /data/data
@ -116,7 +116,9 @@ whole will behave. The most common options that you may want to specify are
to connect to (e.g. ``host`` for an SSH connection or
``device`` to specific an ADB name) as well as configure other
options for the device for example the ``working_directory``
or the list of ``modules`` to be loaded onto the device. (For
more information please see
:ref:`here <android-general-device-setup>`)
:execution_order: Defines the order in which the agenda spec will be executed.
:reboot_policy: Defines when during execution of a run a Device will be rebooted.
:max_retries: The maximum number of times failed jobs will be retried before giving up.
@ -124,7 +126,7 @@ whole will behave. The most common options that that you may want to specify are
For more information and a full list of these configuration options please see
:ref:`Run Configuration <run-configuration>` and
:ref:`"Meta Configuration" <meta-configuration>`.
:ref:`Meta Configuration <meta-configuration>`.
Plugins

@ -6,7 +6,7 @@
#
# docker build -t wa .
#
# This will create an image called wa, which is preconfigured to
# run WA and devlib. Please note that the build process automatically
# accepts the licenses for the Android SDK, so please be sure that you
# are willing to accept these prior to building and running the image
@ -38,21 +38,27 @@
#
# We want to make sure to base this on a recent ubuntu release
FROM ubuntu:19.10
# Please update the references below to use different versions of
# devlib, WA or the Android SDK
ARG DEVLIB_REF=v1.2
ARG WA_REF=v3.2
ARG ANDROID_SDK_URL=https://dl.google.com/android/repository/sdk-tools-linux-3859397.zip
RUN apt-get update
RUN apt-get install -y python3 python3-pip git wget zip openjdk-8-jre-headless vim emacs nano curl sshpass ssh usbutils locales
RUN pip3 install pandas
# Ensure we're using utf-8 as our default encoding
RUN locale-gen en_US.UTF-8
ENV LANG en_US.UTF-8
ENV LANGUAGE en_US:en
ENV LC_ALL en_US.UTF-8
# Let's get the two repos we need, and install them
RUN git clone -v https://github.com/ARM-software/devlib.git /tmp/devlib && cd /tmp/devlib && git checkout $DEVLIB_REF && python3 setup.py install && pip3 install .[full]
RUN git clone -v https://github.com/ARM-software/workload-automation.git /tmp/wa && cd /tmp/wa && git checkout $WA_REF && python3 setup.py install && pip3 install .[all]
# Clean-up
RUN rm -R /tmp/devlib /tmp/wa

pytest.ini Normal file

@ -0,0 +1,3 @@
[pytest]
filterwarnings=
ignore::DeprecationWarning:past[.*]

requirements.txt Normal file

@ -0,0 +1,21 @@
certifi==2019.11.28
chardet==3.0.4
colorama==0.4.3
devlib==1.2.0
future==0.18.2
idna==2.8
Louie-latest==1.3.1
nose==1.3.7
numpy==1.17.4
pandas==0.25.3
pexpect==4.7.0
ptyprocess==0.6.0
pyserial==3.4
python-dateutil==2.8.1
pytz==2019.3
PyYAML==5.2
requests==2.22.0
six==1.13.0
urllib3==1.25.7
wlauto==3.2.0
wrapt==1.11.2

@ -29,7 +29,8 @@ except ImportError:
wa_dir = os.path.join(os.path.dirname(__file__), 'wa')
sys.path.insert(0, os.path.join(wa_dir, 'framework'))
from version import (get_wa_version, get_wa_version_with_commit,
format_version, required_devlib_version)
# happens if falling back to distutils
warnings.filterwarnings('ignore', "Unknown distribution option: 'install_requires'")
@ -61,9 +62,14 @@ for root, dirs, files in os.walk(wa_dir):
scripts = [os.path.join('scripts', s) for s in os.listdir('scripts')]
with open("README.rst", "r") as fh:
long_description = fh.read()
devlib_version = format_version(required_devlib_version)
params = dict(
name='wlauto',
description='A framework for automating workload execution and measurement collection on ARM devices.',
long_description=long_description,
version=get_wa_version_with_commit(),
packages=packages,
package_data=data_files,
@ -74,41 +80,43 @@ params = dict(
maintainer='ARM Architecture & Technology Device Lab',
maintainer_email='workload-automation@arm.com',
setup_requires=[
'numpy<=1.16.4; python_version<"3"',
'numpy; python_version>="3"',
],
install_requires=[
'python-dateutil', # converting between UTC and local time.
'pexpect>=3.3', # Send/receive to/from device
'pyserial', # Serial port interface
'colorama', # Printing with colors
'pyYAML>=5.1b3', # YAML-formatted agenda parsing
'requests', # Fetch assets over HTTP
'devlib>={}'.format(devlib_version), # Interacting with devices
'louie-latest', # callbacks dispatch
'wrapt', # better decorators
'pandas>=0.23.0,<=0.24.2; python_version<"3.5.3"', # Data analysis and manipulation
'pandas>=0.23.0; python_version>="3.5.3"', # Data analysis and manipulation
'future', # Python 2-3 compatibility
],
dependency_links=['https://github.com/ARM-software/devlib/tarball/master#egg=devlib-{}'.format(devlib_version)],
extras_require={
'other': ['jinja2'],
'test': ['nose', 'mock'],
'mongodb': ['pymongo'],
'notify': ['notify2'],
'doc': ['sphinx', 'sphinx_rtd_theme'],
'postgres': ['psycopg2-binary'],
'daq': ['daqpower'],
},
# https://pypi.python.org/pypi?%3Aaction=list_classifiers
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'License :: OSI Approved :: Apache Software License',
'Operating System :: POSIX :: Linux',
'Programming Language :: Python :: 3',
],
)
all_extras = list(chain(iter(params['extras_require'].values())))
params['extras_require']['all'] = all_extras
class sdist(orig_sdist):
@ -122,7 +130,6 @@ class sdist(orig_sdist):
orig_sdist.initialize_options(self)
self.strip_commit = False
def run(self):
if self.strip_commit:
self.distribution.get_version = get_wa_version

@ -17,7 +17,7 @@
from wa import Plugin
class MockDevice(Plugin):
name = 'test-device'
kind = 'device'

@ -18,7 +18,6 @@
# pylint: disable=R0201
import os
import sys
import yaml
from collections import defaultdict
from unittest import TestCase
@ -31,6 +30,7 @@ os.environ['WA_USER_DIRECTORY'] = os.path.join(DATA_DIR, 'includes')
from wa.framework.configuration.execution import ConfigManager
from wa.framework.configuration.parsers import AgendaParser
from wa.framework.exception import ConfigError
from wa.utils.serializer import yaml
from wa.utils.types import reset_all_counters
@ -44,8 +44,6 @@ workloads:
workload_parameters:
test: 1
"""
invalid_agenda = yaml.load(invalid_agenda_text)
invalid_agenda.name = 'invalid1'
duplicate_agenda_text = """
global:
@ -58,14 +56,10 @@ workloads:
- id: "1"
workload_name: benchmarkpi
"""
duplicate_agenda = yaml.load(duplicate_agenda_text)
duplicate_agenda.name = 'invalid2'
short_agenda_text = """
workloads: [antutu, dhrystone, benchmarkpi]
"""
short_agenda = yaml.load(short_agenda_text)
short_agenda.name = 'short'
default_ids_agenda_text = """
workloads:
@ -78,8 +72,6 @@ workloads:
cpus: 1
- vellamo
"""
default_ids_agenda = yaml.load(default_ids_agenda_text)
default_ids_agenda.name = 'default_ids'
sectioned_agenda_text = """
sections:
@ -102,8 +94,6 @@ sections:
workloads:
- memcpy
"""
sectioned_agenda = yaml.load(sectioned_agenda_text)
sectioned_agenda.name = 'sectioned'
dup_sectioned_agenda_text = """
sections:
@ -116,8 +106,22 @@ sections:
workloads:
- memcpy
"""
dup_sectioned_agenda = yaml.load(dup_sectioned_agenda_text)
dup_sectioned_agenda.name = 'dup-sectioned'
yaml_anchors_agenda_text = """
workloads:
- name: dhrystone
params: &dhrystone_single_params
cleanup_assets: true
cpus: 0
delay: 3
duration: 0
mloops: 10
threads: 1
- name: dhrystone
params:
<<: *dhrystone_single_params
threads: 4
"""
class AgendaTest(TestCase):
@ -132,6 +136,8 @@ class AgendaTest(TestCase):
assert_equal(len(self.config.jobs_config.root_node.workload_entries), 4)
def test_duplicate_id(self):
duplicate_agenda = yaml.load(duplicate_agenda_text)
try:
self.parser.load(self.config, duplicate_agenda, 'test')
except ConfigError as e:
@ -140,6 +146,8 @@ class AgendaTest(TestCase):
raise Exception('ConfigError was not raised for an agenda with duplicate ids.')
def test_yaml_missing_field(self):
invalid_agenda = yaml.load(invalid_agenda_text)
try:
self.parser.load(self.config, invalid_agenda, 'test')
except ConfigError as e:
@ -148,20 +156,26 @@ class AgendaTest(TestCase):
raise Exception('ConfigError was not raised for an invalid agenda.')
def test_defaults(self):
short_agenda = yaml.load(short_agenda_text)
self.parser.load(self.config, short_agenda, 'test')
workload_entries = self.config.jobs_config.root_node.workload_entries
assert_equal(len(workload_entries), 3)
assert_equal(workload_entries[0].config['workload_name'], 'antutu')
assert_equal(workload_entries[0].id, 'wk1')
def test_default_id_assignment(self):
default_ids_agenda = yaml.load(default_ids_agenda_text)
self.parser.load(self.config, default_ids_agenda, 'test2')
workload_entries = self.config.jobs_config.root_node.workload_entries
assert_equal(workload_entries[0].id, 'wk2')
assert_equal(workload_entries[3].id, 'wk3')
def test_sections(self):
sectioned_agenda = yaml.load(sectioned_agenda_text)
self.parser.load(self.config, sectioned_agenda, 'test')
root_node_workload_entries = self.config.jobs_config.root_node.workload_entries
leaves = list(self.config.jobs_config.root_node.leaves())
section1_workload_entries = leaves[0].workload_entries
@ -171,8 +185,22 @@ class AgendaTest(TestCase):
assert_true(section1_workload_entries[0].config['workload_parameters']['markers_enabled'])
assert_equal(section2_workload_entries[0].config['workload_name'], 'antutu')
def test_yaml_anchors(self):
yaml_anchors_agenda = yaml.load(yaml_anchors_agenda_text)
self.parser.load(self.config, yaml_anchors_agenda, 'test')
workload_entries = self.config.jobs_config.root_node.workload_entries
assert_equal(len(workload_entries), 2)
assert_equal(workload_entries[0].config['workload_name'], 'dhrystone')
assert_equal(workload_entries[0].config['workload_parameters']['threads'], 1)
assert_equal(workload_entries[0].config['workload_parameters']['delay'], 3)
assert_equal(workload_entries[1].config['workload_name'], 'dhrystone')
assert_equal(workload_entries[1].config['workload_parameters']['threads'], 4)
assert_equal(workload_entries[1].config['workload_parameters']['delay'], 3)
@raises(ConfigError)
def test_dup_sections(self):
dup_sectioned_agenda = yaml.load(dup_sectioned_agenda_text)
self.parser.load(self.config, dup_sectioned_agenda, 'test')
@raises(ConfigError)

@ -23,7 +23,7 @@ from wa.utils.exec_control import (init_environment, reset_environment,
activate_environment, once,
once_per_class, once_per_instance)
class MockClass(object):
called = 0
@ -32,7 +32,7 @@ class TestClass(object):
@once
def called_once(self):
MockClass.called += 1
@once
def initilize_once(self):
@ -50,7 +50,7 @@ class TestClass(object):
return '{}: Called={}'.format(self.__class__.__name__, self.called)
class SubClass(MockClass):
def __init__(self):
super(SubClass, self).__init__()
@ -110,7 +110,7 @@ class AnotherClass(object):
self.count += 1
class AnotherSubClass(MockClass):
def __init__(self):
super(AnotherSubClass, self).__init__()
@ -142,7 +142,7 @@ class EnvironmentManagementTest(TestCase):
def test_reset_current_environment(self):
activate_environment('CURRENT_ENVIRONMENT')
t1 = MockClass()
t1.initilize_once()
assert_equal(t1.count, 1)
@ -152,7 +152,7 @@ class EnvironmentManagementTest(TestCase):
def test_switch_environment(self):
activate_environment('ENVIRONMENT1')
t1 = MockClass()
t1.initilize_once()
assert_equal(t1.count, 1)
@ -166,7 +166,7 @@ class EnvironmentManagementTest(TestCase):
def test_reset_environment_name(self):
activate_environment('ENVIRONMENT')
t1 = MockClass()
t1.initilize_once()
assert_equal(t1.count, 1)
@ -195,7 +195,7 @@ class OnlyOnceEnvironmentTest(TestCase):
reset_environment('TEST_ENVIRONMENT')
def test_single_instance(self):
t1 = MockClass()
ac = AnotherClass()
t1.initilize_once()
@ -209,8 +209,8 @@ class OnlyOnceEnvironmentTest(TestCase):
def test_mulitple_instances(self):
t1 = MockClass()
t2 = MockClass()
t1.initilize_once()
assert_equal(t1.count, 1)
@ -220,7 +220,7 @@ class OnlyOnceEnvironmentTest(TestCase):
def test_sub_classes(self):
t1 = MockClass()
sc = SubClass()
ss = SubSubClass()
asc = AnotherSubClass()
@ -250,7 +250,7 @@ class OncePerClassEnvironmentTest(TestCase):
reset_environment('TEST_ENVIRONMENT')
def test_single_instance(self):
t1 = MockClass()
ac = AnotherClass()
t1.initilize_once_per_class()
@ -264,8 +264,8 @@ class OncePerClassEnvironmentTest(TestCase):
def test_mulitple_instances(self):
t1 = MockClass()
t2 = MockClass()
t1.initilize_once_per_class()
assert_equal(t1.count, 1)
@ -275,7 +275,7 @@ class OncePerClassEnvironmentTest(TestCase):
def test_sub_classes(self):
t1 = MockClass()
sc1 = SubClass()
sc2 = SubClass()
ss1 = SubSubClass()
@ -308,7 +308,7 @@ class OncePerInstanceEnvironmentTest(TestCase):
reset_environment('TEST_ENVIRONMENT')
def test_single_instance(self):
t1 = MockClass()
ac = AnotherClass()
t1.initilize_once_per_instance()
@ -322,8 +322,8 @@ class OncePerInstanceEnvironmentTest(TestCase):
def test_mulitple_instances(self):
t1 = MockClass()
t2 = MockClass()
t1.initilize_once_per_instance()
assert_equal(t1.count, 1)
@ -333,7 +333,7 @@ class OncePerInstanceEnvironmentTest(TestCase):
def test_sub_classes(self):
t1 = MockClass()
sc = SubClass()
ss = SubSubClass()
asc = AnotherSubClass()

@ -33,7 +33,7 @@ from wa.framework.target.descriptor import (TargetDescriptor, TargetDescription,
create_target_description, add_description_for_target)
from wa.framework.workload import (Workload, ApkWorkload, ApkUiautoWorkload,
ApkReventWorkload, UIWorkload, UiautoWorkload,
PackageHandler, ReventWorkload, TestPackageHandler)
from wa.framework.version import get_wa_version, get_wa_version_with_commit

@ -1,4 +1,4 @@
--!VERSION!1.5!ENDVERSION!
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
CREATE EXTENSION IF NOT EXISTS "lo";
@ -78,10 +78,11 @@ CREATE TABLE Targets (
oid uuid NOT NULL,
run_oid uuid NOT NULL references Runs(oid),
target text,
modules text[],
cpus text[],
os text,
os_version jsonb,
hostid bigint,
hostname text,
abi text,
is_rooted boolean,
@ -96,6 +97,7 @@ CREATE TABLE Targets (
android_id text,
_pod_version int,
_pod_serialization_version int,
system_id text,
PRIMARY KEY (oid)
);
@ -164,6 +166,7 @@ CREATE TABLE Artifacts (
kind text,
_pod_version int,
_pod_serialization_version int,
is_dir boolean,
PRIMARY KEY (oid)
);

@ -0,0 +1,3 @@
ALTER TABLE targets ADD COLUMN system_id text;
ALTER TABLE artifacts ADD COLUMN is_dir boolean;

@ -0,0 +1,2 @@
ALTER TABLE targets ADD COLUMN modules text[];

@ -0,0 +1 @@
ALTER TABLE targets ALTER hostid TYPE BIGINT;

@ -7,3 +7,18 @@
was done following an extended discussion and tests that verified
that the savings in processing power were not enough to warrant
the creation of a dedicated server or file handler.
## 1.2
- Rename the `resourcegetters` table to `resource_getters` for consistency.
- Add Job and Run level classifiers.
- Add missing android specific properties to targets.
- Add new POD meta data to relevant tables.
- Correct job column name from `retires` to `retry`.
- Add missing run information.
## 1.3
- Add missing "system_id" field from TargetInfo.
- Enable support for uploading Artifact that represent directories.
## 1.4
- Add "modules" field to TargetInfo to list the modules loaded by the target
during the run.
## 1.5
- Change the type of the "hostid" in TargetInfo from Int to Bigint.

@ -59,7 +59,7 @@ params = dict(
'Environment :: Console',
'License :: Other/Proprietary License',
'Operating System :: Unix',
'Programming Language :: Python :: 3',
],
)

@ -1,12 +1,12 @@
apply plugin: 'com.android.application'
android {
compileSdkVersion 28
buildToolsVersion '28.0.0'
defaultConfig {
applicationId "${package_name}"
minSdkVersion 18
targetSdkVersion 28
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {

@ -16,7 +16,7 @@ fi
# Copy base class library from wlauto dist
libs_dir=app/libs
base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
mkdir -p $$libs_dir
cp $$base_class $$libs_dir

@ -24,7 +24,7 @@ from wa.framework.configuration.core import (MetaConfiguration, RunConfiguration
JobGenerator, settings)
from wa.framework.configuration.parsers import ConfigParser
from wa.framework.configuration.plugin_cache import PluginCache
from wa.framework.exception import NotFoundError, ConfigError
from wa.framework.job import Job
from wa.utils import log
from wa.utils.serializer import Podable
@ -148,6 +148,9 @@ class ConfigManager(object):
def generate_jobs(self, context):
job_specs = self.jobs_config.generate_job_specs(context.tm)
if not job_specs:
msg = 'No jobs available for running.'
raise ConfigError(msg)
exec_order = self.run_config.execution_order
log.indent()
for spec, i in permute_iterations(job_specs, exec_order):

@ -16,19 +16,25 @@
import sys
import argparse
import locale
import logging
import os
import warnings
import devlib
try:
from devlib.utils.version import version as installed_devlib_version
except ImportError:
installed_devlib_version = None
from wa.framework import pluginloader
from wa.framework.command import init_argument_parser
from wa.framework.configuration import settings
from wa.framework.configuration.execution import ConfigManager
from wa.framework.host import init_user_directory, init_config
from wa.framework.exception import ConfigError, HostError
from wa.framework.version import (get_wa_version_with_commit, format_version,
required_devlib_version)
from wa.utils import log
from wa.utils.doc import format_body
@ -64,6 +70,27 @@ def split_joined_options(argv):
return output
# Instead of presenting an obscure error due to a version mismatch, explicitly warn the user.
def check_devlib_version():
if not installed_devlib_version or installed_devlib_version[:-1] <= required_devlib_version[:-1]:
# Check the 'dev' field separately to account for comparing with release versions.
if installed_devlib_version.dev and installed_devlib_version.dev < required_devlib_version.dev:
msg = 'WA requires Devlib version >={}. Please update the currently installed version {}'
raise HostError(msg.format(format_version(required_devlib_version), devlib.__version__))
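# For reference, the [:-1] slices above drop the trailing 'dev' field so the
# numeric components are compared first; illustrative values:
#     VersionTuple(1, 2, 0, 'dev1')[:-1]   -> (1, 2, 0)
#     bool(VersionTuple(1, 2, 0, '').dev)  -> False, i.e. a release version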
# If the default encoding is not UTF-8 warn the user as this may cause compatibility issues
# when parsing files.
def check_system_encoding():
system_encoding = locale.getpreferredencoding()
msg = 'System Encoding: {}'.format(system_encoding)
if 'UTF-8' not in system_encoding:
logger.warning(msg)
logger.warning('To prevent encoding issues please use a locale setting which supports UTF-8')
else:
logger.debug(msg)
def main():
if not os.path.exists(settings.user_directory):
init_user_directory()
@ -102,6 +129,8 @@ def main():
logger.debug('Version: {}'.format(get_wa_version_with_commit()))
logger.debug('devlib version: {}'.format(devlib.__full_version__))
logger.debug('Command Line: {}'.format(' '.join(sys.argv)))
check_devlib_version()
check_system_encoding()
# each command will add its own subparser
subparsers = parser.add_subparsers(dest='command')

@ -23,6 +23,8 @@ except ImportError:
import logging
import os
import shutil
import tarfile
import tempfile
from collections import OrderedDict, defaultdict
from copy import copy, deepcopy
from datetime import datetime
@ -145,9 +147,10 @@ class Output(object):
if not os.path.exists(path):
msg = 'Attempting to add non-existing artifact: {}'
raise HostError(msg.format(path))
is_dir = os.path.isdir(path)
path = os.path.relpath(path, self.basepath)
self.result.add_artifact(name, path, kind, description, classifiers, is_dir)
def add_event(self, message):
self.result.add_event(message)
@ -346,6 +349,13 @@ class JobOutput(Output):
self.spec = None
self.reload()
@property
def augmentations(self):
job_augs = set([])
for aug in self.spec.augmentations:
job_augs.add(aug)
return list(job_augs)
class Result(Podable):
@ -378,9 +388,10 @@ class Result(Podable):
logger.debug('Adding metric: {}'.format(metric))
self.metrics.append(metric)
def add_artifact(self, name, path, kind, description=None, classifiers=None,
is_dir=False):
artifact = Artifact(name, path, kind, description=description,
classifiers=classifiers, is_dir=is_dir)
logger.debug('Adding artifact: {}'.format(artifact))
self.artifacts.append(artifact)
@ -516,7 +527,7 @@ class Artifact(Podable):
"""
_pod_serialization_version = 2
@staticmethod
def from_pod(pod):
@ -525,9 +536,11 @@ class Artifact(Podable):
pod['kind'] = ArtifactType(pod['kind'])
instance = Artifact(**pod)
instance._pod_version = pod_version # pylint: disable =protected-access
instance.is_dir = pod.pop('is_dir')
return instance
def __init__(self, name, path, kind, description=None, classifiers=None,
is_dir=False):
""""
:param name: Name that uniquely identifies this artifact.
:param path: The *relative* path of the artifact. Depending on the
@ -543,7 +556,6 @@ class Artifact(Podable):
:param classifiers: A set of key-value pairs to further classify this
metric beyond current iteration (e.g. this can be
used to identify sub-tests).
"""
super(Artifact, self).__init__()
self.name = name
@ -555,11 +567,13 @@ class Artifact(Podable):
raise ValueError(msg.format(kind, ARTIFACT_TYPES))
self.description = description
self.classifiers = classifiers or {}
self.is_dir = is_dir
def to_pod(self):
pod = super(Artifact, self).to_pod()
pod.update(self.__dict__)
pod['kind'] = str(self.kind)
pod['is_dir'] = self.is_dir
return pod
@staticmethod
@ -567,11 +581,17 @@ class Artifact(Podable):
pod['_pod_version'] = pod.get('_pod_version', 1)
return pod
@staticmethod
def _pod_upgrade_v2(pod):
pod['is_dir'] = pod.get('is_dir', False)
return pod
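# Illustration (hypothetical pod): version-1 pods predate 'is_dir', so the
# v2 upgrade above defaults it to False:
#     pod = {'name': 'log', 'path': 'wa.log', 'kind': 'log'}
#     Artifact._pod_upgrade_v2(pod)['is_dir']   -> False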
def __str__(self):
return self.path
def __repr__(self):
ft = 'dir' if self.is_dir else 'file'
return '{} ({}) ({}): {}'.format(self.name, ft, self.kind, self.path)
class Metric(Podable):
@ -602,6 +622,12 @@ class Metric(Podable):
instance._pod_version = pod_version # pylint: disable =protected-access
return instance
@property
def label(self):
parts = ['{}={}'.format(n, v) for n, v in self.classifiers.items()]
parts.insert(0, self.name)
return '/'.join(parts)
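# Illustration (hypothetical metric): classifiers are rendered as 'key=value'
# parts joined onto the name with '/':
#     Metric('fps', 58.3, 'fps', classifiers={'stage': 'run'}).label
#     -> 'fps/stage=run'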
def __init__(self, name, value, units=None, lower_is_better=False,
classifiers=None):
super(Metric, self).__init__()
@ -798,6 +824,19 @@ class DatabaseOutput(Output):
def get_artifact_path(self, name):
artifact = self.get_artifact(name)
if artifact.is_dir:
return self._read_dir_artifact(artifact)
else:
return self._read_file_artifact(artifact)
def _read_dir_artifact(self, artifact):
artifact_path = tempfile.mkdtemp(prefix='wa_')
with tarfile.open(fileobj=self.conn.lobject(int(artifact.path), mode='b'), mode='r|gz') as tar_file:
tar_file.extractall(artifact_path)
self.conn.commit()
return artifact_path
def _read_file_artifact(self, artifact):
artifact = StringIO(self.conn.lobject(int(artifact.path)).read())
self.conn.commit()
return artifact
@ -886,13 +925,15 @@ class DatabaseOutput(Output):
def _get_artifacts(self):
columns = ['artifacts.name', 'artifacts.description', 'artifacts.kind',
('largeobjects.lo_oid', 'path'), 'artifacts.oid', 'artifacts.is_dir',
'artifacts._pod_version', 'artifacts._pod_serialization_version']
tables = ['largeobjects', 'artifacts']
joins = [('classifiers', 'classifiers.artifact_oid = artifacts.oid')]
conditions = ['artifacts.{}_oid = \'{}\''.format(self.kind, self.oid),
'artifacts.large_object_uuid = largeobjects.oid']
# If retrieving run level artifacts we want those that don't also belong to a job
if self.kind == 'run':
conditions.append('artifacts.job_oid IS NULL')
pod = self._read_db(columns, tables, conditions, joins)
for artifact in pod:
artifact['path'] = str(artifact['path'])
@ -907,8 +948,9 @@ class DatabaseOutput(Output):
def kernel_config_from_db(raw):
kernel_config = {}
if raw:
for k, v in zip(raw[0], raw[1]):
kernel_config[k] = v
return kernel_config
@ -942,9 +984,10 @@ class RunDatabaseOutput(DatabaseOutput, RunOutputCommon):
@property
def _db_targetfile(self):
columns = ['os', 'is_rooted', 'target', 'modules', 'abi', 'cpus', 'os_version',
'hostid', 'hostname', 'kernel_version', 'kernel_release',
'kernel_sha1', 'kernel_config', 'sched_features', 'page_size_kb',
'system_id', 'screen_resolution', 'prop', 'android_id',
'_pod_version', '_pod_serialization_version']
tables = ['targets']
conditions = ['targets.run_oid = \'{}\''.format(self.oid)]
@ -997,6 +1040,7 @@ class RunDatabaseOutput(DatabaseOutput, RunOutputCommon):
jobs = self._read_db(columns, tables, conditions)
for job in jobs:
job['augmentations'] = self._get_job_augmentations(job['oid'])
job['workload_parameters'] = workload_params.pop(job['oid'], {})
job['runtime_parameters'] = runtime_params.pop(job['oid'], {})
job.pop('oid')
@ -1160,6 +1204,15 @@ class RunDatabaseOutput(DatabaseOutput, RunOutputCommon):
logger.debug('Failed to deserialize job_oid:{}-"{}":"{}"'.format(job_oid, k, v))
return parm_dict
def _get_job_augmentations(self, job_oid):
columns = ['jobs_augs.augmentation_oid', 'augmentations.name',
'augmentations.oid', 'jobs_augs.job_oid']
tables = ['jobs_augs', 'augmentations']
conditions = ['jobs_augs.job_oid = \'{}\''.format(job_oid),
'jobs_augs.augmentation_oid = augmentations.oid']
augmentations = self._read_db(columns, tables, conditions)
return [aug['name'] for aug in augmentations]
def _list_runs(self):
columns = ['runs.run_uuid', 'runs.run_name', 'runs.project',
'runs.project_stage', 'runs.status', 'runs.start_time', 'runs.end_time']
@ -1211,3 +1264,11 @@ class JobDatabaseOutput(DatabaseOutput):
def __str__(self):
return '{}-{}-{}'.format(self.id, self.label, self.iteration)
@property
def augmentations(self):
job_augs = set([])
if self.spec:
for aug in self.spec.augmentations:
job_augs.add(aug)
return list(job_augs)

@ -24,7 +24,7 @@ from wa.framework.exception import ResourceError
from wa.framework.configuration import settings
from wa.utils import log
from wa.utils.misc import get_object_name
from wa.utils.types import enum, list_or_string, prioritylist, version_tuple
SourcePriority = enum(['package', 'remote', 'lan', 'local',
@ -142,10 +142,12 @@ class ApkFile(Resource):
def __init__(self, owner, variant=None, version=None,
package=None, uiauto=False, exact_abi=False,
supported_abi=None, min_version=None, max_version=None):
super(ApkFile, self).__init__(owner)
self.variant = variant
self.version = version
self.max_version = max_version
self.min_version = min_version
self.package = package
self.uiauto = uiauto
self.exact_abi = exact_abi
@ -158,21 +160,25 @@ class ApkFile(Resource):
def match(self, path):
name_matches = True
version_matches = True
version_range_matches = True
package_matches = True
abi_matches = True
uiauto_matches = uiauto_test_matches(path, self.uiauto)
if self.version:
version_matches = apk_version_matches(path, self.version)
if self.max_version or self.min_version:
version_range_matches = apk_version_matches_range(path, self.min_version,
self.max_version)
if self.variant:
name_matches = file_name_matches(path, self.variant)
if self.package:
package_matches = package_name_matches(path, self.package)
if self.supported_abi:
abi_matches = apk_abi_matches(path, self.supported_abi,
self.exact_abi)
return name_matches and version_matches and \
version_range_matches and uiauto_matches \
and package_matches and abi_matches
def __str__(self):
text = '<{}\'s apk'.format(self.owner)
@ -273,15 +279,40 @@ class ResourceResolver(object):
def apk_version_matches(path, version):
version = list_or_string(version)
info = ApkInfo(path)
for v in version:
if info.version_name == v or info.version_code == v:
return True
if loose_version_matching(v, info.version_name):
return True
return False
def apk_version_matches_range(path, min_version=None, max_version=None):
info = ApkInfo(path)
return range_version_matching(info.version_name, min_version, max_version)
def range_version_matching(apk_version, min_version=None, max_version=None):
if not apk_version:
return False
apk_version = version_tuple(apk_version)
if max_version:
max_version = version_tuple(max_version)
if apk_version > max_version:
return False
if min_version:
min_version = version_tuple(min_version)
if apk_version < min_version:
return False
return True
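# A quick sanity check of the helper above (illustrative versions, assuming
# version_tuple() splits a dotted version string for component-wise comparison):
#     range_version_matching('4.3.2', min_version='4.1')   -> True
#     range_version_matching('4.3.2', max_version='4.1')   -> False
#     range_version_matching(None)                         -> False (no version name)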
def loose_version_matching(config_version, apk_version):
config_version = version_tuple(config_version)
apk_version = version_tuple(apk_version)
if len(apk_version) < len(config_version):
return False # More specific version requested than available
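# In other words (illustrative): a configured '4.3' loosely matches an APK
# reporting '4.3.2', but a configured '4.3.2' cannot match an APK that only
# reports '4.3', since the request is more specific than what is available.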

@ -53,9 +53,9 @@ def kernel_version_from_pod(pod):
def kernel_config_from_pod(pod):
config = KernelConfig('')
config.typed_config._config = pod['kernel_config']
lines = []
for key, value in config.items():
if value == 'n':
lines.append('# {} is not set'.format(key))
else:
@ -221,6 +221,7 @@ class CpuInfo(Podable):
def get_target_info(target):
info = TargetInfo()
info.target = target.__class__.__name__
info.modules = target.modules
info.os = target.os
info.os_version = target.os_version
info.system_id = target.system_id
@ -313,12 +314,13 @@ def cache_target_info(target_info, overwrite=False):
class TargetInfo(Podable):
_pod_serialization_version = 5
@staticmethod
def from_pod(pod):
instance = super(TargetInfo, TargetInfo).from_pod(pod)
instance.target = pod['target']
instance.modules = pod['modules']
instance.abi = pod['abi']
instance.cpus = [CpuInfo.from_pod(c) for c in pod['cpus']]
instance.os = pod['os']
@ -343,6 +345,7 @@ class TargetInfo(Podable):
def __init__(self):
super(TargetInfo, self).__init__()
self.target = None
self.modules = []
self.cpus = []
self.os = None
self.os_version = None
@ -362,6 +365,7 @@ class TargetInfo(Podable):
def to_pod(self):
pod = super(TargetInfo, self).to_pod()
pod['target'] = self.target
pod['modules'] = self.modules
pod['abi'] = self.abi
pod['cpus'] = [c.to_pod() for c in self.cpus]
pod['os'] = self.os
@ -401,3 +405,20 @@ class TargetInfo(Podable):
pod['page_size_kb'] = pod.get('page_size_kb')
pod['_pod_version'] = pod.get('format_version', 0)
return pod
@staticmethod
def _pod_upgrade_v3(pod):
config = {}
for key, value in pod['kernel_config'].items():
config[key.upper()] = value
pod['kernel_config'] = config
return pod
@staticmethod
def _pod_upgrade_v4(pod):
return TargetInfo._pod_upgrade_v3(pod)
@staticmethod
def _pod_upgrade_v5(pod):
pod['modules'] = pod.get('modules') or []
return pod
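# Illustration (hypothetical pod) of chaining the upgrades above:
#     pod = {'kernel_config': {'config_abc': 'y'}}
#     pod = TargetInfo._pod_upgrade_v3(pod)   # upper-cases kernel config keys
#     pod = TargetInfo._pod_upgrade_v5(pod)   # guarantees a 'modules' list
#     pod['kernel_config']   -> {'CONFIG_ABC': 'y'}
#     pod['modules']         -> []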

@ -92,9 +92,18 @@ class TargetManager(object):
@memoized
def get_target_info(self):
info = get_target_info_from_cache(self.target.system_id)
if info is None:
info = get_target_info(self.target)
cache_target_info(info)
else:
# If module configuration has changed from when the target info
# was previously cached, it is possible additional info will be
# available, so we should re-generate the cache.
if set(info.modules) != set(self.target.modules):
info = get_target_info(self.target)
cache_target_info(info, overwrite=True)
return info
def reboot(self, context, hard=False):

@ -1,11 +1,11 @@
apply plugin: 'com.android.library'
android {
compileSdkVersion 28
buildToolsVersion '28.0.3'
defaultConfig {
minSdkVersion 18
targetSdkVersion 28
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
}

@ -45,7 +45,7 @@ public class BaseUiAutomation {
public enum FindByCriteria { BY_ID, BY_TEXT, BY_DESC };
public enum Direction { UP, DOWN, LEFT, RIGHT, NULL };
public enum ScreenOrientation { RIGHT, NATURAL, LEFT, PORTRAIT, LANDSCAPE };
public enum PinchType { IN, OUT, NULL };
// Time in milliseconds
@ -176,6 +176,8 @@ public class BaseUiAutomation {
}
public void setScreenOrientation(ScreenOrientation orientation) throws Exception {
int width = mDevice.getDisplayWidth();
int height = mDevice.getDisplayHeight();
switch (orientation) {
case RIGHT:
mDevice.setOrientationRight();
@ -186,6 +188,30 @@ public class BaseUiAutomation {
case LEFT:
mDevice.setOrientationLeft();
break;
case LANDSCAPE:
if (mDevice.isNaturalOrientation()){
if (height > width){
mDevice.setOrientationRight();
}
}
else {
if (height > width){
mDevice.setOrientationNatural();
}
}
break;
case PORTRAIT:
if (mDevice.isNaturalOrientation()){
if (height < width){
mDevice.setOrientationRight();
}
}
else {
if (height < width){
mDevice.setOrientationNatural();
}
}
break;
default:
throw new Exception("No orientation specified");
}

Binary file not shown.

@ -19,15 +19,23 @@ from collections import namedtuple
from subprocess import Popen, PIPE
VersionTuple = namedtuple('Version', ['major', 'minor', 'revision', 'dev'])
version = VersionTuple(3, 2, 0, '')
required_devlib_version = VersionTuple(1, 2, 0, '')
def format_version(v):
version_string = '{}.{}.{}'.format(
v.major, v.minor, v.revision)
if v.dev:
version_string += '.{}'.format(v.dev)
return version_string
def get_wa_version():
return format_version(version)
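# Illustrative values: the empty 'dev' field of a release is omitted:
#     format_version(VersionTuple(3, 2, 0, ''))      -> '3.2.0'
#     format_version(VersionTuple(1, 2, 0, 'dev1'))  -> '1.2.0.dev1'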
def get_wa_version_with_commit():

@ -1,4 +1,4 @@
# Copyright 2014-2019 ARM Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@ -14,15 +14,22 @@
#
import logging
import os
import threading
import time
try:
from shlex import quote
except ImportError:
from pipes import quote
from devlib.utils.android import ApkInfo
from wa.framework.plugin import TargetedPlugin, Parameter
from wa.framework.resource import (ApkFile, ReventFile,
File, loose_version_matching,
range_version_matching)
from wa.framework.exception import WorkloadError, ConfigError
from wa.utils.types import ParameterDict, list_or_string, version_tuple
from wa.utils.revent import ReventRecorder
from wa.utils.exec_control import once_per_instance
@ -42,9 +49,15 @@ class Workload(TargetedPlugin):
aliases=['clean_up'],
default=True,
description="""
If ``True``, assets that are deployed or created as part of the
workload will be removed again from the device.
"""),
Parameter('uninstall', kind=bool,
default=True,
description="""
If ``True``, executables that are installed to the device
as part of the workload will be uninstalled again.
"""),
]
# Set this to True to mark that this workload poses a risk of exposing
@ -73,7 +86,7 @@ class Workload(TargetedPlugin):
supported_platforms = getattr(self, 'supported_platforms', [])
if supported_platforms and self.target.os not in supported_platforms:
msg = 'Supported platforms for "{}" are "{}", attempting to run on "{}"'
raise WorkloadError(msg.format(self.name, ' '.join(self.supported_platforms),
self.target.os))
@ -174,6 +187,8 @@ class ApkWorkload(Workload):
# Times are in seconds
loading_time = 10
package_names = []
supported_versions = []
activity = None
view = None
clear_data_on_reset = True
@ -198,6 +213,16 @@ class ApkWorkload(Workload):
description="""
The version of the package to be used.
"""),
Parameter('max_version', kind=str,
default=None,
description="""
The maximum version of the package to be used.
"""),
Parameter('min_version', kind=str,
default=None,
description="""
The minimum version of the package to be used.
"""),
Parameter('variant', kind=str,
default=None,
description="""
@ -217,6 +242,7 @@ class ApkWorkload(Workload):
"""),
Parameter('uninstall', kind=bool,
default=False,
override=True,
description="""
If ``True``, will uninstall the workload's APK as part of teardown.
"""),
@ -235,6 +261,12 @@ class ApkWorkload(Workload):
will fall back to the version on the target if available. If
``False`` then the version on the target is preferred instead.
"""),
Parameter('view', kind=str, default=None, merge=True,
description="""
Manually override the 'View' of the workload for use with
instruments such as the ``fps`` instrument. If not specified,
a workload dependent 'View' will be automatically generated.
"""),
]
@property
@ -249,22 +281,40 @@ class ApkWorkload(Workload):
raise ConfigError('Target does not appear to support Android')
super(ApkWorkload, self).__init__(target, **kwargs)
if self.activity is not None and '.' not in self.activity:
# If we're receiving just the activity name, it's taken relative to
# the package namespace:
self.activity = '.' + self.activity
self.apk = PackageHandler(self,
package_name=self.package_name,
variant=self.variant,
strict=self.strict,
version=self.version or self.supported_versions,
force_install=self.force_install,
install_timeout=self.install_timeout,
uninstall=self.uninstall,
exact_abi=self.exact_abi,
prefer_host_package=self.prefer_host_package,
clear_data_on_reset=self.clear_data_on_reset,
activity=self.activity,
min_version=self.min_version,
max_version=self.max_version)
def validate(self):
if self.min_version and self.max_version:
if version_tuple(self.min_version) > version_tuple(self.max_version):
msg = 'Cannot specify min version ({}) greater than max version ({})'
raise ConfigError(msg.format(self.min_version, self.max_version))
@once_per_instance
def initialize(self, context):
super(ApkWorkload, self).initialize(context)
self.apk.initialize(context)
# pylint: disable=access-member-before-definition, attribute-defined-outside-init
if self.version is None:
self.version = self.apk.apk_info.version_name
if self.view is None:
self.view = 'SurfaceView - {}/{}'.format(self.apk.package,
self.apk.activity)
@ -327,7 +377,8 @@ class ApkUIWorkload(ApkWorkload):
@once_per_instance
def finalize(self, context):
super(ApkUIWorkload, self).finalize(context)
if self.cleanup_assets:
self.gui.remove()
class ApkUiautoWorkload(ApkUIWorkload):
@ -365,7 +416,6 @@ class ApkReventWorkload(ApkUIWorkload):
def __init__(self, target, **kwargs):
super(ApkReventWorkload, self).__init__(target, **kwargs)
self.apk = PackageHandler(self)
self.gui = ReventGUI(self, target,
self.setup_timeout,
self.run_timeout,
@ -407,7 +457,8 @@ class UIWorkload(Workload):
@once_per_instance
def finalize(self, context):
super(UIWorkload, self).finalize(context)
if self.cleanup_assets:
self.gui.remove()
class UiautoWorkload(UIWorkload):
@ -603,12 +654,12 @@ class ReventGUI(object):
if self.revent_teardown_file:
self.revent_recorder.replay(self.on_target_teardown_revent,
timeout=self.teardown_timeout)
self.target.remove(self.on_target_setup_revent)
self.target.remove(self.on_target_run_revent)
self.target.remove(self.on_target_extract_results_revent)
self.target.remove(self.on_target_teardown_revent)
def remove(self):
self.revent_recorder.remove()
def _check_revent_files(self):
@ -637,18 +688,24 @@ class PackageHandler(object):
@property
def activity(self):
if self._activity:
return self._activity
if self.apk_info is None:
return None
return self.apk_info.activity
# pylint: disable=too-many-locals
def __init__(self, owner, install_timeout=300, version=None, variant=None,
package_name=None, strict=False, force_install=False, uninstall=False,
exact_abi=False, prefer_host_package=True, clear_data_on_reset=True,
activity=None, min_version=None, max_version=None):
self.logger = logging.getLogger('apk')
self.owner = owner
self.target = self.owner.target
self.install_timeout = install_timeout
self.version = version
self.min_version = min_version
self.max_version = max_version
self.variant = variant
self.package_name = package_name
self.strict = strict
@ -657,6 +714,7 @@ class PackageHandler(object):
self.exact_abi = exact_abi
self.prefer_host_package = prefer_host_package
self.clear_data_on_reset = clear_data_on_reset
self._activity = activity
self.supported_abi = self.target.supported_abi
self.apk_file = None
self.apk_info = None
@ -669,6 +727,7 @@ class PackageHandler(object):
def setup(self, context):
context.update_metadata('app_version', self.apk_info.version_name)
context.update_metadata('app_name', self.apk_info.package)
self.initialize_package(context)
self.start_activity()
self.target.execute('am kill-all') # kill all *background* activities
@ -714,7 +773,9 @@ class PackageHandler(object):
version=self.version,
package=self.package_name,
exact_abi=self.exact_abi,
supported_abi=self.supported_abi,
min_version=self.min_version,
max_version=self.max_version),
strict=self.strict)
else:
available_packages = []
@ -724,47 +785,57 @@ class PackageHandler(object):
version=self.version,
package=package,
exact_abi=self.exact_abi,
supported_abi=self.supported_abi,
min_version=self.min_version,
max_version=self.max_version),
strict=self.strict)
if apk_file:
available_packages.append(apk_file)
if len(available_packages) == 1:
self.apk_file = available_packages[0]
elif len(available_packages) > 1:
self.error_msg = self._get_package_error_msg('host')
def resolve_package_from_target(self): # pylint: disable=too-many-branches
self.logger.debug('Resolving package on target')
found_package = None
if self.package_name:
if not self.target.package_is_installed(self.package_name):
return
else:
installed_versions = [self.package_name]
else:
installed_versions = []
for package in self.owner.package_names:
if self.target.package_is_installed(package):
installed_versions.append(package)
if self.version or self.min_version or self.max_version:
matching_packages = []
for package in installed_versions:
package_version = self.target.get_package_version(package)
if self.version:
for v in list_or_string(self.version):
if loose_version_matching(v, package_version):
matching_packages.append(package)
else:
if range_version_matching(package_version, self.min_version,
self.max_version):
matching_packages.append(package)
if len(matching_packages) == 1:
found_package = matching_packages[0]
elif len(matching_packages) > 1:
self.error_msg = self._get_package_error_msg('device')
else:
if len(installed_versions) == 1:
found_package = installed_versions[0]
elif len(installed_versions) > 1:
self.error_msg = 'Package version not set and multiple versions found on device.'
if found_package:
self.logger.debug('Found matching package on target; Pulling to host.')
self.apk_file = self.pull_apk(found_package)
self.package_name = found_package
def initialize_package(self, context):
installed_version = self.target.get_package_version(self.apk_info.package)
@ -794,11 +865,11 @@ class PackageHandler(object):
self.apk_version = host_version
def start_activity(self):
if not self.activity:
cmd = 'am start -W {}'.format(self.apk_info.package)
else:
cmd = 'am start -W -n {}/{}'.format(self.apk_info.package,
self.activity)
output = self.target.execute(cmd)
if 'Error:' in output:
# this will dismiss any error dialogs
@ -842,3 +913,76 @@ class PackageHandler(object):
self.target.execute('am force-stop {}'.format(self.apk_info.package))
if self.uninstall:
self.target.uninstall_package(self.apk_info.package)
def _get_package_error_msg(self, location):
if self.version:
msg = 'Multiple matches for "{version}" found on {location}.'
elif self.min_version and self.max_version:
msg = 'Multiple matches between versions "{min_version}" and "{max_version}" found on {location}.'
elif self.max_version:
msg = 'Multiple matches less than or equal to "{max_version}" found on {location}.'
elif self.min_version:
msg = 'Multiple matches greater than or equal to "{min_version}" found on {location}.'
else:
msg = ''
return msg.format(version=self.version, min_version=self.min_version,
max_version=self.max_version, location=location)
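# Illustrative output (hypothetical versions): with min_version='4.3' and
# max_version='4.4' set, multiple host-side matches produce:
#     'Multiple matches between versions "4.3" and "4.4" found on host.'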
class TestPackageHandler(PackageHandler):
"""Class wrapping an APK used through ``am instrument``.
"""
def __init__(self, owner, instrument_args=None, raw_output=False,
instrument_wait=True, no_hidden_api_checks=False,
*args, **kwargs):
if instrument_args is None:
instrument_args = {}
super(TestPackageHandler, self).__init__(owner, *args, **kwargs)
self.raw = raw_output
self.args = instrument_args
self.wait = instrument_wait
self.no_checks = no_hidden_api_checks
self.cmd = ''
self.instrument_thread = None
self._instrument_output = None
def setup(self, context):
self.initialize_package(context)
words = ['am', 'instrument']
if self.raw:
words.append('-r')
if self.wait:
words.append('-w')
if self.no_checks:
words.append('--no-hidden-api-checks')
for k, v in self.args.items():
words.extend(['-e', str(k), str(v)])
words.append(str(self.apk_info.package))
if self.apk_info.activity:
words[-1] += '/{}'.format(self.apk_info.activity)
self.cmd = ' '.join(quote(x) for x in words)
self.instrument_thread = threading.Thread(target=self._start_instrument)
def start_activity(self):
self.instrument_thread.start()
def wait_instrument_over(self):
self.instrument_thread.join()
if 'Error:' in self._instrument_output:
cmd = 'am force-stop {}'.format(self.apk_info.package)
self.target.execute(cmd)
raise WorkloadError(self._instrument_output)
def _start_instrument(self):
self._instrument_output = self.target.execute(self.cmd)
self.logger.debug(self._instrument_output)
@property
def instrument_output(self):
if self.instrument_thread.is_alive():
self.instrument_thread.join() # writes self._instrument_output
return self._instrument_output
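# Sketch of the command setup() assembles for a hypothetical test APK
# 'com.example.tests' with runner activity '.TestRunner', raw output and
# waiting enabled, and a single instrument argument:
#     am instrument -r -w -e iterations 5 com.example.tests/.TestRunner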

@ -20,6 +20,7 @@ import time
from wa import Instrument, Parameter
from wa.framework.exception import ConfigError, InstrumentError
from wa.framework.instrument import extremely_slow
from wa.utils.types import identifier
class DelayInstrument(Instrument):
@ -32,7 +33,7 @@ class DelayInstrument(Instrument):
The delay may be specified as either a fixed period or a temperature
threshold that must be reached.
Optionally, if an active cooling solution is available on the device to
speed up temperature drop between runs, it may be controlled using this
instrument.
@ -222,7 +223,7 @@ class DelayInstrument(Instrument):
for module in self.active_cooling_modules:
if self.target.has(module):
if not cooling_module:
cooling_module = getattr(self.target, identifier(module))
else:
msg = 'Multiple cooling modules found "{}" "{}".'
raise InstrumentError(msg.format(cooling_module.name, module))

@ -144,7 +144,13 @@ class DAQBackend(EnergyInstrumentBackend):
connector on the DAQ (varies between DAQ models). The default
assumes DAQ 6363 and similar with AI channels on connectors
0-7 and 16-23.
""")
"""),
Parameter('keep_raw', kind=bool, default=False,
description="""
If set to ``True``, this will prevent the raw files obtained
from the device before processing from being deleted
(this is mainly used for debugging).
"""),
]
instrument = DaqInstrument
@ -189,6 +195,12 @@ class EnergyProbeBackend(EnergyInstrumentBackend):
description="""
Path to /dev entry for the energy probe (it should be /dev/ttyACMx)
"""),
Parameter('keep_raw', kind=bool, default=False,
description="""
If set to ``True``, this will prevent the raw files obtained
from the device before processing from being deleted
(this is mainly used for debugging).
"""),
]
instrument = EnergyProbeInstrument
@ -224,6 +236,12 @@ class ArmEnergyProbeBackend(EnergyInstrumentBackend):
description="""
Path to config file of the AEP
"""),
Parameter('keep_raw', kind=bool, default=False,
description="""
If set to ``True``, this will prevent the raw files obtained
from the device before processing from being deleted
(this is mainly used for debugging).
"""),
]
instrument = ArmEnergyProbeInstrument
@ -282,11 +300,17 @@ class AcmeCapeBackend(EnergyInstrumentBackend):
description="""
Size of the capture buffer (in KB).
"""),
Parameter('keep_raw', kind=bool, default=False,
description="""
If set to ``True``, this will prevent the raw files obtained
from the device before processing from being deleted
(this is mainly used for debugging).
"""),
]
# pylint: disable=arguments-differ
def get_instruments(self, target, metadir,
iio_capture, host, iio_devices, buffer_size, keep_raw):
#
# Devlib's ACME instrument uses iio-capture under the hood, which can
@ -307,7 +331,7 @@ class AcmeCapeBackend(EnergyInstrumentBackend):
for iio_device in iio_devices:
ret[iio_device] = AcmeCapeInstrument(
target, iio_capture=iio_capture, host=host,
iio_device=iio_device, buffer_size=buffer_size, keep_raw=keep_raw)
return ret
@ -510,3 +534,7 @@ class EnergyMeasurement(Instrument):
units = metrics[0].units
value = sum(m.value for m in metrics)
context.add_metric(name, value, units)
def teardown(self, context):
for instrument in self.instruments.values():
instrument.teardown()

@ -164,7 +164,7 @@ class FpsInstrument(Instrument):
os.remove(entry)
if not frame_count.value:
context.add_event('Could not find frames data in gfxinfo output')
context.set_status('PARTIAL')
self.check_for_crash(context, fps.value, frame_count.value,

@ -174,8 +174,14 @@ class SysfsExtractor(Instrument):
self.target.list_directory(dev_dir)):
self.logger.error('sysfs files were not pulled from the device.')
self.device_and_host_paths.remove(paths) # Path is removed to skip diffing it
for dev_dir, before_dir, after_dir, diff_dir in self.device_and_host_paths:
diff_sysfs_dirs(before_dir, after_dir, diff_dir)
context.add_artifact('{} [before]'.format(dev_dir), before_dir,
kind='data', classifiers={'stage': 'before'})
context.add_artifact('{} [after]'.format(dev_dir), after_dir,
kind='data', classifiers={'stage': 'after'})
context.add_artifact('{} [diff]'.format(dev_dir), diff_dir,
kind='data', classifiers={'stage': 'diff'})
def teardown(self, context):
self._one_time_setup_done = []
@ -276,9 +282,15 @@ class InterruptStatsInstrument(Instrument):
wfh.write(self.target.execute('cat /proc/interrupts'))
def update_output(self, context):
context.add_artifact('interrupts [before]', self.before_file, kind='data',
classifiers={'stage': 'before'})
# If workload execution failed, the after_file may not have been created.
if os.path.isfile(self.after_file):
diff_interrupt_files(self.before_file, self.after_file, _f(self.diff_file))
context.add_artifact('interrupts [after]', self.after_file, kind='data',
classifiers={'stage': 'after'})
context.add_artifact('interrupts [diff]', self.diff_file, kind='data',
classifiers={'stage': 'diff'})
class DynamicFrequencyInstrument(SysfsExtractor):

@ -15,13 +15,14 @@
# pylint: disable=unused-argument
import csv
import os
import re
from devlib.collector.perf import PerfCollector
from wa import Instrument, Parameter
from wa.utils.types import list_or_string, list_of_strs, numeric
PERF_COUNT_REGEX = re.compile(r'^(CPU\d+)?\s*(\d+)\s*(.*?)\s*(\[\s*\d+\.\d+%\s*\])?\s*$')
@ -31,29 +32,40 @@ class PerfInstrument(Instrument):
name = 'perf'
description = """
Perf is a Linux profiling tool with performance counters.
Simpleperf is an Android profiling tool with performance counters.
It is highly recommended to use perf_type = simpleperf when using this instrument
on Android devices, since it recognises Android symbols in record mode and is much more stable
when reporting record .data files. For more information see the simpleperf documentation at:
https://android.googlesource.com/platform/system/extras/+/master/simpleperf/doc/README.md
Performance counters are CPU hardware registers that count hardware events
such as instructions executed, cache-misses suffered, or branches
mispredicted. They form a basis for profiling applications to trace dynamic
control flow and identify hotspots.
pref accepts options and events. If no option is given the default '-a' is
used. For events, the default events are migrations and cs. They both can
be specified in the config file.
perf accepts options and events. If no option is given the default '-a' is
used. For events, the default events for perf are migrations and cs. The default
events for simpleperf are raw-cpu-cycles, raw-l1-dcache, raw-l1-dcache-refill, raw-instructions-retired.
They both can be specified in the config file.
Events must be provided as a list, for example ::
perf_events = ['migrations', 'cs']
(for perf_type = perf ) perf_events = ['migrations', 'cs']
(for perf_type = simpleperf) perf_events = ['raw-cpu-cycles', 'raw-l1-dcache']
Events can be obtained by typing the following in the command line on the
device ::
perf list
simpleperf list
Options can be provided as a single string, for example ::
perf_options = '-a -i'
perf_options = '--app com.adobe.reader'
Options can be obtained by running the following in the command line ::
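To make the relationship between ``events`` and ``optionstring`` concrete, here is a hedged sketch (not WA's actual invocation code) of how one ``perf stat`` command line could be assembled per option string; the event names and the ``sleep 1`` payload are invented for illustration:

    events = ['migrations', 'cs']      # default perf events
    optionstrings = ['-a']             # one perf instance is launched per option string
    for opts in optionstrings:
        flags = ' '.join('-e {}'.format(e) for e in events)
        print('perf stat {} {} sleep 1'.format(opts, flags))
    # perf stat -a -e migrations -e cs sleep 1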
@ -61,21 +73,32 @@ class PerfInstrument(Instrument):
"""
parameters = [
Parameter('events', kind=list_of_strs, default=['migrations', 'cs'],
global_alias='perf_events',
constraint=(lambda x: x, 'must not be empty.'),
Parameter('perf_type', kind=str, allowed_values=['perf', 'simpleperf'], default='perf',
global_alias='perf_type', description="""Specifies which type of perf binaries
to install. Use simpleperf for collecting perf data on Android systems."""),
Parameter('command', kind=str, default='stat', allowed_values=['stat', 'record'],
global_alias='perf_command', description="""Specifies which perf command to use. If in record mode,
the report command will also be executed and the results pulled from the target
along with the raw data file"""),
Parameter('events', kind=list_of_strs, global_alias='perf_events',
description="""Specifies the events to be counted."""),
Parameter('optionstring', kind=list_or_string, default='-a',
global_alias='perf_options',
description="""Specifies options to be used for the perf command. This
may be a list of option strings, in which case, multiple instances of perf
will be kicked off -- one for each option string. This may be used, e.g., to
collect different events from different big.LITTLE clusters.
collect different events from different big.LITTLE clusters. In order to
profile a particular application process on Android with simpleperf, use
the --app option, e.g. --app com.adobe.reader
"""),
Parameter('report_option_string', kind=str, global_alias='perf_report_options', default=None,
description="""Specifies options to be used to gather report when record command
is used. It's highly recommended to use perf_type simpleperf when running on
android devices as reporting options are unstable with perf"""),
Parameter('labels', kind=list_of_strs, default=None,
global_alias='perf_labels',
description="""Provides labels for pref output. If specified, the number of
labels must match the number of ``optionstring``\ s.
description="""Provides labels for perf/simpleperf output for each optionstring.
If specified, the number of labels must match the number of ``optionstring``\ s.
"""),
Parameter('force_install', kind=bool, default=False,
description="""
@ -86,15 +109,21 @@ class PerfInstrument(Instrument):
def __init__(self, target, **kwargs):
super(PerfInstrument, self).__init__(target, **kwargs)
self.collector = None
self.outdir = None
def initialize(self, context):
self.collector = PerfCollector(self.target,
self.perf_type,
self.command,
self.events,
self.optionstring,
self.report_option_string,
self.labels,
self.force_install)
def setup(self, context):
self.outdir = os.path.join(context.output_directory, self.perf_type)
self.collector.set_output(self.outdir)
self.collector.reset()
def start(self, context):
@ -105,12 +134,32 @@ class PerfInstrument(Instrument):
def update_output(self, context):
self.logger.info('Extracting reports from target...')
outdir = os.path.join(context.output_directory, 'perf')
self.collector.get_trace(outdir)
self.collector.get_data()
for host_file in os.listdir(outdir):
if self.perf_type == 'perf':
self._process_perf_output(context)
else:
self._process_simpleperf_output(context)
def teardown(self, context):
self.collector.reset()
def _process_perf_output(self, context):
if self.command == 'stat':
self._process_perf_stat_output(context)
elif self.command == 'record':
self._process_perf_record_output(context)
def _process_simpleperf_output(self, context):
if self.command == 'stat':
self._process_simpleperf_stat_output(context)
elif self.command == 'record':
self._process_simpleperf_record_output(context)
def _process_perf_stat_output(self, context):
for host_file in os.listdir(self.outdir):
label = host_file.split('.out')[0]
host_file_path = os.path.join(outdir, host_file)
host_file_path = os.path.join(self.outdir, host_file)
context.add_artifact(label, host_file_path, 'raw')
with open(host_file_path) as fh:
in_results_section = False
@ -118,21 +167,150 @@ class PerfInstrument(Instrument):
if 'Performance counter stats' in line:
in_results_section = True
next(fh) # skip the following blank line
if in_results_section:
if not line.strip(): # blank line
in_results_section = False
break
else:
line = line.split('#')[0] # comment
match = PERF_COUNT_REGEX.search(line)
if match:
classifiers = {}
cpu = match.group(1)
if cpu is not None:
classifiers['cpu'] = int(cpu.replace('CPU', ''))
count = int(match.group(2))
metric = '{}_{}'.format(label, match.group(3))
context.add_metric(metric, count, classifiers=classifiers)
if not in_results_section:
continue
if not line.strip(): # blank line
in_results_section = False
break
else:
self._add_perf_stat_metric(line, label, context)
def teardown(self, context):
self.collector.reset()
@staticmethod
def _add_perf_stat_metric(line, label, context):
line = line.split('#')[0] # comment
match = PERF_COUNT_REGEX.search(line)
if not match:
return
classifiers = {}
cpu = match.group(1)
if cpu is not None:
classifiers['cpu'] = int(cpu.replace('CPU', ''))
count = int(match.group(2))
metric = '{}_{}'.format(label, match.group(3))
context.add_metric(metric, count, classifiers=classifiers)
def _process_perf_record_output(self, context):
for host_file in os.listdir(self.outdir):
label, ext = os.path.splitext(host_file)
context.add_artifact(label, os.path.join(self.outdir, host_file), 'raw')
column_headers = []
column_header_indeces = []
event_type = ''
if ext == '.rpt':
with open(os.path.join(self.outdir, host_file)) as fh:
for line in fh:
words = line.split()
if not words:
continue
event_type = self._get_report_event_type(words, event_type)
column_headers = self._get_report_column_headers(column_headers, words, 'perf')
for column_header in column_headers:
column_header_indeces.append(line.find(column_header))
self._add_report_metric(column_headers,
column_header_indeces,
line,
words,
context,
event_type,
label)
@staticmethod
def _get_report_event_type(words, event_type):
if words[0] != '#':
return event_type
if len(words) == 6 and words[4] == 'event':
event_type = words[5]
event_type = event_type.strip("'")
return event_type
def _process_simpleperf_stat_output(self, context):
labels = []
for host_file in os.listdir(self.outdir):
labels.append(host_file.split('.out')[0])
for opts, label in zip(self.optionstring, labels):
stat_file = os.path.join(self.outdir, '{}{}'.format(label, '.out'))
if '--csv' in opts:
self._process_simpleperf_stat_from_csv(stat_file, context, label)
else:
self._process_simpleperf_stat_from_raw(stat_file, context, label)
@staticmethod
def _process_simpleperf_stat_from_csv(stat_file, context, label):
with open(stat_file) as csv_file:
readCSV = csv.reader(csv_file, delimiter=',')
line_num = 0
for row in readCSV:
if line_num > 0 and 'Total test time' not in row:
classifiers = {'scaled from(%)': row[len(row) - 2].replace('(', '').replace(')', '').replace('%', '')}
context.add_metric('{}_{}'.format(label, row[1]), row[0], 'count', classifiers=classifiers)
line_num += 1
@staticmethod
def _process_simpleperf_stat_from_raw(stat_file, context, label):
with open(stat_file) as fh:
for line in fh:
if '#' in line:
tmp_line = line.split('#')[0]
tmp_line = tmp_line.strip()
count, metric = tmp_line.split(' ')[0], tmp_line.split(' ')[2]
count = int(count.replace(',', ''))
scaled_percentage = line.split('(')[1].strip().replace(')', '').replace('%', '')
scaled_percentage = int(scaled_percentage)
metric = '{}_{}'.format(label, metric)
context.add_metric(metric, count, 'count', classifiers={'scaled from(%)': scaled_percentage})
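For reference, a standalone sketch of the raw-format parsing above on an assumed simpleperf stat line (the exact field spacing is approximated from the parsing logic, not verified against real simpleperf output):

    # Assumed sample; note the parsing relies on the count and event name
    # being separated by exactly two spaces (hence split(' ')[2]).
    line = '1,234,567  raw-cpu-cycles   # 1.234 GHz (99%)'
    tmp_line = line.split('#')[0].strip()
    count = int(tmp_line.split(' ')[0].replace(',', ''))
    metric = tmp_line.split(' ')[2]
    scaled = int(line.split('(')[1].strip().replace(')', '').replace('%', ''))
    print(count, metric, scaled)  # 1234567 raw-cpu-cycles 99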
def _process_simpleperf_record_output(self, context):
for host_file in os.listdir(self.outdir):
label, ext = os.path.splitext(host_file)
context.add_artifact(label, os.path.join(self.outdir, host_file), 'raw')
if ext != '.rpt':
continue
column_headers = []
column_header_indeces = []
event_type = ''
with open(os.path.join(self.outdir, host_file)) as fh:
for line in fh:
words = line.split()
if not words:
continue
if words[0] == 'Event:':
event_type = words[1]
column_headers = self._get_report_column_headers(column_headers,
words,
'simpleperf')
for column_header in column_headers:
column_header_indeces.append(line.find(column_header))
self._add_report_metric(column_headers,
column_header_indeces,
line,
words,
context,
event_type,
label)
@staticmethod
def _get_report_column_headers(column_headers, words, perf_type):
if 'Overhead' not in words:
return column_headers
if perf_type == 'perf':
words.remove('#')
column_headers = words
# Concatenate Shared Objects header
if 'Shared' in column_headers:
shared_index = column_headers.index('Shared')
column_headers[shared_index:shared_index + 2] = ['{} {}'.format(column_headers[shared_index],
column_headers[shared_index + 1])]
return column_headers
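A worked example of the header concatenation above, on an assumed perf report header row:

    words = ['#', 'Overhead', 'Command', 'Shared', 'Object', 'Symbol']
    words.remove('#')                      # perf prefixes header rows with '#'
    shared_index = words.index('Shared')
    words[shared_index:shared_index + 2] = ['{} {}'.format(words[shared_index],
                                                           words[shared_index + 1])]
    print(words)  # ['Overhead', 'Command', 'Shared Object', 'Symbol']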
@staticmethod
def _add_report_metric(column_headers, column_header_indeces, line, words, context, event_type, label):
if '%' not in words[0]:
return
classifiers = {}
for i in range(1, len(column_headers)):
classifiers[column_headers[i]] = line[column_header_indeces[i]:column_header_indeces[i + 1]].strip()
context.add_metric('{}_{}_Overhead'.format(label, event_type),
numeric(words[0].strip('%')),
'percent',
classifiers=classifiers)

Binary file not shown.

Binary file not shown.

@ -196,7 +196,7 @@ int main(int argc, char ** argv) {
strip(buf);
printf(",%s", buf);
buf[0] = '\0'; // "Empty" buffer
memset(buf, 0, sizeof(buf)); // "Empty" buffer
}
printf("\n");
usleep(interval);

@ -15,7 +15,7 @@
import os
from devlib.trace.screencapture import ScreenCaptureCollector
from devlib.collector.screencapture import ScreenCaptureCollector
from wa import Instrument, Parameter
@ -47,8 +47,9 @@ class ScreenCaptureInstrument(Instrument):
output_path = os.path.join(context.output_directory, "screen-capture")
os.mkdir(output_path)
self.collector = ScreenCaptureCollector(self.target,
output_path,
self.period)
self.collector.set_output(output_path)
self.collector.reset()
def start(self, context): # pylint: disable=unused-argument
self.collector.start()

@ -47,35 +47,36 @@ class SerialMon(Instrument):
def __init__(self, target, **kwargs):
super(SerialMon, self).__init__(target, **kwargs)
self._collector = SerialTraceCollector(target, self.serial_port, self.baudrate)
self._collector.reset()
def start_logging(self, context):
def start_logging(self, context, filename="serial.log"):
outpath = os.path.join(context.output_directory, filename)
self._collector.set_output(outpath)
self._collector.reset()
self.logger.debug("Acquiring serial port ({})".format(self.serial_port))
if self._collector.collecting:
self.stop_logging(context)
self._collector.start()
def stop_logging(self, context, filename="serial.log", identifier="job"):
def stop_logging(self, context, identifier="job"):
self.logger.debug("Releasing serial port ({})".format(self.serial_port))
if self._collector.collecting:
self._collector.stop()
outpath = os.path.join(context.output_directory, filename)
self._collector.get_trace(outpath)
context.add_artifact("{}_serial_log".format(identifier),
outpath, kind="log")
data = self._collector.get_data()
for l in data:
context.add_artifact("{}_serial_log".format(identifier),
l.path, kind="log")
def on_run_start(self, context):
self.start_logging(context)
self.start_logging(context, "preamble_serial.log")
def before_job_queue_execution(self, context):
self.stop_logging(context, "preamble_serial.log", "preamble")
self.stop_logging(context, "preamble")
def after_job_queue_execution(self, context):
self.start_logging(context)
self.start_logging(context, "postamble_serial.log")
def on_run_end(self, context):
self.stop_logging(context, "postamble_serial.log", "postamble")
self.stop_logging(context, "postamble")
def on_job_start(self, context):
self.start_logging(context)

@ -203,7 +203,8 @@ class TraceCmdInstrument(Instrument):
def update_output(self, context): # NOQA pylint: disable=R0912
self.logger.info('Extracting trace from target...')
outfile = os.path.join(context.output_directory, 'trace.dat')
self.collector.get_trace(outfile)
self.collector.set_output(outfile)
self.collector.get_data()
context.add_artifact('trace-cmd-bin', outfile, 'data')
if self.report:
textfile = os.path.join(context.output_directory, 'trace.txt')

@ -94,6 +94,12 @@ class CpuStatesProcessor(OutputProcessor):
if not trace_file:
self.logger.warning('Text trace does not appear to have been generated; skipping this iteration.')
return
if 'cpufreq' not in target_info.modules:
msg = '"cpufreq" module not detected on target, cpu frequency information may be missing.'
self.logger.warning(msg)
if 'cpuidle' not in target_info.modules:
msg = '"cpuidle" module not detected on target, cpu idle information may be missing.'
self.logger.debug(msg)
self.logger.info('Generating power state reports from trace...')
reports = report_power_stats( # pylint: disable=unbalanced-tuple-unpacking

@ -16,6 +16,7 @@
import os
import uuid
import collections
import tarfile
try:
import psycopg2
@ -24,6 +25,7 @@ try:
except ImportError as e:
psycopg2 = None
import_error_msg = e.args[0] if e.args else str(e)
from devlib.target import KernelVersion, KernelConfig
from wa import OutputProcessor, Parameter, OutputProcessorError
@ -88,10 +90,10 @@ class PostgresqlResultProcessor(OutputProcessor):
"VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)",
"update_run": "UPDATE Runs SET event_summary=%s, status=%s, timestamp=%s, end_time=%s, duration=%s, state=%s WHERE oid=%s;",
"create_job": "INSERT INTO Jobs (oid, run_oid, status, retry, label, job_id, iterations, workload_name, metadata, _pod_version, _pod_serialization_version) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s);",
"create_target": "INSERT INTO Targets (oid, run_oid, target, cpus, os, os_version, hostid, hostname, abi, is_rooted, kernel_version, kernel_release, kernel_sha1, kernel_config, sched_features, page_size_kb, screen_resolution, prop, android_id, _pod_version, _pod_serialization_version) "
"VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)",
"create_target": "INSERT INTO Targets (oid, run_oid, target, modules, cpus, os, os_version, hostid, hostname, abi, is_rooted, kernel_version, kernel_release, kernel_sha1, kernel_config, sched_features, page_size_kb, system_id, screen_resolution, prop, android_id, _pod_version, _pod_serialization_version) "
"VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)",
"create_event": "INSERT INTO Events (oid, run_oid, job_oid, timestamp, message, _pod_version, _pod_serialization_version) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s",
"create_artifact": "INSERT INTO Artifacts (oid, run_oid, job_oid, name, large_object_uuid, description, kind, _pod_version, _pod_serialization_version) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)",
"create_artifact": "INSERT INTO Artifacts (oid, run_oid, job_oid, name, large_object_uuid, description, kind, is_dir, _pod_version, _pod_serialization_version) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)",
"create_metric": "INSERT INTO Metrics (oid, run_oid, job_oid, name, value, units, lower_is_better, _pod_version, _pod_serialization_version) VALUES (%s, %s, %s, %s, %s, %s , %s, %s, %s)",
"create_augmentation": "INSERT INTO Augmentations (oid, run_oid, name) VALUES (%s, %s, %s)",
"create_classifier": "INSERT INTO Classifiers (oid, artifact_oid, metric_oid, job_oid, run_oid, key, value) VALUES (%s, %s, %s, %s, %s, %s, %s)",
@ -126,8 +128,6 @@ class PostgresqlResultProcessor(OutputProcessor):
'Postgresql Output Processor: {}'.format(import_error_msg))
# N.B. Typecasters are for postgres->python and adapters the opposite
self.connect_to_database()
self.cursor = self.conn.cursor()
self.verify_schema_versions()
# Register the adapters and typecasters for enum types
self.cursor.execute("SELECT NULL::status_enum")
@ -190,6 +190,7 @@ class PostgresqlResultProcessor(OutputProcessor):
self.target_uuid,
self.run_uuid,
target_pod['target'],
target_pod['modules'],
target_pod['cpus'],
target_pod['os'],
target_pod['os_version'],
@ -205,12 +206,13 @@ class PostgresqlResultProcessor(OutputProcessor):
target_info.kernel_config,
target_pod['sched_features'],
target_pod['page_size_kb'],
target_pod['system_id'],
# Android Specific
target_pod.get('screen_resolution'),
list(target_pod.get('screen_resolution', [])),
target_pod.get('prop'),
target_pod.get('android_id'),
target_pod.get('pod_version'),
target_pod.get('pod_serialization_version'),
target_pod.get('_pod_version'),
target_pod.get('_pod_serialization_version'),
)
)
@ -221,6 +223,8 @@ class PostgresqlResultProcessor(OutputProcessor):
''' Run once for each job to upload information that is
updated on a job by job basis.
'''
# Ensure we're still connected to the database.
self.connect_to_database()
job_uuid = uuid.uuid4()
# Create a new job
self.cursor.execute(
@ -302,8 +306,11 @@ class PostgresqlResultProcessor(OutputProcessor):
''' A final export of the RunOutput that updates existing parameters
and uploads ones which are only generated after jobs have run.
'''
if not self.cursor: # Database did not connect correctly.
if self.cursor is None: # Output processor did not initialise correctly.
return
# Ensure we're still connected to the database.
self.connect_to_database()
# Update the job statuses following completion of the run
for job in run_output.jobs:
job_id = job.id
@ -510,6 +517,8 @@ class PostgresqlResultProcessor(OutputProcessor):
raise OutputProcessorError(
"Database error, if the database doesn't exist, " +
"please use 'wa create database' to create the database: {}".format(e))
self.cursor = self.conn.cursor()
self.verify_schema_versions()
def execute_sql_line_by_line(self, sql):
cursor = self.conn.cursor()
@ -532,7 +541,7 @@ class PostgresqlResultProcessor(OutputProcessor):
'with the create command'
raise OutputProcessorError(msg.format(db_schema_version, local_schema_version))
def _sql_write_lobject(self, source, lobject):
def _sql_write_file_lobject(self, source, lobject):
with open(source) as lobj_file:
lobj_data = lobj_file.read()
if len(lobj_data) > 50000000: # Notify if LO inserts larger than 50MB
@ -540,10 +549,18 @@ class PostgresqlResultProcessor(OutputProcessor):
lobject.write(lobj_data)
self.conn.commit()
def _sql_write_dir_lobject(self, source, lobject):
with tarfile.open(fileobj=lobject, mode='w|gz') as lobj_dir:
lobj_dir.add(source, arcname='.')
self.conn.commit()
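The directory variant above streams a gzipped tarball straight into the large object. A self-contained sketch of the same technique, with a ``BytesIO`` standing in for the psycopg2 lobject:

    import io
    import tarfile

    buf = io.BytesIO()                     # stand-in for conn.lobject(..., mode='w')
    with tarfile.open(fileobj=buf, mode='w|gz') as tar:
        tar.add('.', arcname='.')          # archive the current directory
    print('{} bytes written'.format(len(buf.getvalue())))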
def _sql_update_artifact(self, artifact, output_object):
self.logger.debug('Updating artifact: {}'.format(artifact))
lobj = self.conn.lobject(oid=self.artifacts_already_added[artifact], mode='w')
self._sql_write_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
if artifact.is_dir:
self._sql_write_dir_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
else:
self._sql_write_file_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
def _sql_create_artifact(self, artifact, output_object, record_in_added=False, job_uuid=None):
self.logger.debug('Uploading artifact: {}'.format(artifact))
@ -551,8 +568,10 @@ class PostgresqlResultProcessor(OutputProcessor):
lobj = self.conn.lobject()
loid = lobj.oid
large_object_uuid = uuid.uuid4()
self._sql_write_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
if artifact.is_dir:
self._sql_write_dir_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
else:
self._sql_write_file_lobject(os.path.join(output_object.basepath, artifact.path), lobj)
self.cursor.execute(
self.sql_command['create_large_object'],
@ -571,6 +590,7 @@ class PostgresqlResultProcessor(OutputProcessor):
large_object_uuid,
artifact.description,
str(artifact.kind),
artifact.is_dir,
artifact._pod_version, # pylint: disable=protected-access
artifact._pod_serialization_version, # pylint: disable=protected-access
)

@ -151,7 +151,7 @@ class PowerStateProcessor(object):
def __init__(self, cpus, wait_for_marker=True, no_idle=None):
if no_idle is None:
no_idle = True if cpus[0].cpuidle else False
no_idle = False if cpus[0].cpuidle and cpus[0].cpuidle.states else True
self.power_state = SystemPowerState(len(cpus), no_idle=no_idle)
self.requested_states = {}  # cpu_id -> requested state
self.wait_for_marker = wait_for_marker
@ -368,7 +368,7 @@ class PowerStateTimeline(object):
if frequency is None:
if idle_state == -1:
row.append('Running (unknown kHz)')
elif idle_state is None:
elif idle_state is None or not self.idle_state_names[cpu_idx]:
row.append('unknown')
else:
row.append(self.idle_state_names[cpu_idx][idle_state])
@ -499,7 +499,7 @@ class PowerStateStats(object):
state = 'Running (unknown KHz)'
elif freq:
state = '{}-{:07}KHz'.format(self.idle_state_names[cpu][idle], freq)
elif idle is not None:
elif idle is not None and self.idle_state_names[cpu]:
state = self.idle_state_names[cpu][idle]
else:
state = 'unknown'
@ -560,12 +560,12 @@ class CpuUtilizationTimeline(object):
headers = ['ts'] + ['{} CPU{}'.format(cpu.name, cpu.id) for cpu in cpus]
self.writer.writerow(headers)
self._max_freq_list = [cpu.cpufreq.available_frequencies[-1] for cpu in cpus]
self._max_freq_list = [cpu.cpufreq.available_frequencies[-1] for cpu in cpus if cpu.cpufreq.available_frequencies]
def update(self, timestamp, core_states): # NOQA
row = [timestamp]
for core, [_, frequency] in enumerate(core_states):
if frequency is not None:
if frequency is not None and core < len(self._max_freq_list):
frequency /= float(self._max_freq_list[core])
row.append(frequency)
else:

@ -41,7 +41,6 @@ else:
from itertools import chain, cycle
from distutils.spawn import find_executable # pylint: disable=no-name-in-module, import-error
import yaml
from dateutil import tz
# pylint: disable=wrong-import-order
@ -325,6 +324,11 @@ def load_struct_from_python(filepath=None, text=None):
def load_struct_from_yaml(filepath=None, text=None):
"""Parses a config structure from a .yaml file. The structure should be composed
of basic Python types (strings, ints, lists, dicts, etc.)."""
# Import here to avoid circular imports
# pylint: disable=wrong-import-position,cyclic-import
from wa.utils.serializer import yaml
if not (filepath or text) or (filepath and text):
raise ValueError('Exactly one of filepath or text must be specified.')
try:

@ -199,7 +199,6 @@ def create_iterable_adapter(array_columns, explicit_iterate=False):
array_string = "{" + array_string + "}"
final_string = final_string + array_string + ","
final_string = final_string.strip(",")
final_string = "{" + final_string + "}"
else:
# Simply return each item in the array
if explicit_iterate:
@ -208,8 +207,7 @@ def create_iterable_adapter(array_columns, explicit_iterate=False):
else:
for item in param:
final_string = final_string + str(item) + ","
final_string = "{" + final_string + "}"
return AsIs("'{}'".format(final_string))
return AsIs("'{{{}}}'".format(final_string))
return adapt_iterable
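The net effect of the quoting fix above: a Python iterable is rendered as a single quoted PostgreSQL array literal. Illustration with invented values:

    items = [1, 2, 3]
    final_string = ','.join(str(item) for item in items)
    print("'{{{}}}'".format(final_string))  # '{1,2,3}'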
@ -245,10 +243,10 @@ def get_schema(schemafilepath):
def get_database_schema_version(conn):
with conn.cursor() as cursor:
cursor.execute('''SELECT
DatabaseMeta.schema_major,
DatabaseMeta.schema_minor
FROM
DatabaseMeta;''')
DatabaseMeta.schema_major,
DatabaseMeta.schema_minor
FROM
DatabaseMeta;''')
schema_major, schema_minor = cursor.fetchone()
return (schema_major, schema_minor)

@ -60,9 +60,17 @@ import os
import re
import json as _json
from collections import OrderedDict
from collections.abc import Hashable
from datetime import datetime
import dateutil.parser
import yaml as _yaml # pylint: disable=wrong-import-order
from yaml import MappingNode
try:
from yaml import FullLoader as _yaml_loader
except ImportError:
from yaml import Loader as _yaml_loader
from yaml.constructor import ConstructorError
# pylint: disable=redefined-builtin
from past.builtins import basestring # pylint: disable=wrong-import-order
@ -203,16 +211,6 @@ def _wa_cpu_mask_representer(dumper, data):
return dumper.represent_scalar(_cpu_mask_tag, data.mask())
def _wa_dict_constructor(loader, node):
pairs = loader.construct_pairs(node)
seen_keys = set()
for k, _ in pairs:
if k in seen_keys:
raise ValueError('Duplicate entry: {}'.format(k))
seen_keys.add(k)
return OrderedDict(pairs)
def _wa_regex_constructor(loader, node):
value = loader.construct_scalar(node)
flags, pattern = value.split(':', 1)
@ -230,14 +228,34 @@ def _wa_cpu_mask_constructor(loader, node):
return cpu_mask(value)
class _WaYamlLoader(_yaml_loader): # pylint: disable=too-many-ancestors
def construct_mapping(self, node, deep=False):
if isinstance(node, MappingNode):
self.flatten_mapping(node)
if not isinstance(node, MappingNode):
raise ConstructorError(None, None,
"expected a mapping node, but found %s" % node.id,
node.start_mark)
mapping = OrderedDict()
for key_node, value_node in node.value:
key = self.construct_object(key_node, deep=deep)
if not isinstance(key, Hashable):
raise ConstructorError("while constructing a mapping", node.start_mark,
"found unhashable key", key_node.start_mark)
value = self.construct_object(value_node, deep=deep)
mapping[key] = value
return mapping
_yaml.add_representer(OrderedDict, _wa_dict_representer)
_yaml.add_representer(regex_type, _wa_regex_representer)
_yaml.add_representer(level, _wa_level_representer)
_yaml.add_representer(cpu_mask, _wa_cpu_mask_representer)
_yaml.add_constructor(_mapping_tag, _wa_dict_constructor)
_yaml.add_constructor(_regex_tag, _wa_regex_constructor)
_yaml.add_constructor(_level_tag, _wa_level_constructor)
_yaml.add_constructor(_cpu_mask_tag, _wa_cpu_mask_constructor)
_yaml.add_constructor(_regex_tag, _wa_regex_constructor, Loader=_WaYamlLoader)
_yaml.add_constructor(_level_tag, _wa_level_constructor, Loader=_WaYamlLoader)
_yaml.add_constructor(_cpu_mask_tag, _wa_cpu_mask_constructor, Loader=_WaYamlLoader)
_yaml.add_constructor(_mapping_tag, _WaYamlLoader.construct_yaml_map, Loader=_WaYamlLoader)
class yaml(object):
@ -249,12 +267,13 @@ class yaml(object):
@staticmethod
def load(fh, *args, **kwargs):
try:
return _yaml.load(fh, *args, **kwargs)
return _yaml.load(fh, *args, Loader=_WaYamlLoader, **kwargs)
except _yaml.YAMLError as e:
lineno = None
if hasattr(e, 'problem_mark'):
lineno = e.problem_mark.line # pylint: disable=no-member
raise SerializerSyntaxError(e.args[0] if e.args else str(e), lineno)
message = e.args[0] if (e.args and e.args[0]) else str(e)
raise SerializerSyntaxError(message, lineno)
loads = load
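A minimal usage sketch of the wrapper, assuming the module is importable as ``wa.utils.serializer`` (as the import earlier in this patch suggests): mappings come back as ``OrderedDict`` and malformed input surfaces as ``SerializerSyntaxError`` with a line number.

    from io import StringIO
    from wa.utils.serializer import yaml, SerializerSyntaxError

    try:
        config = yaml.load(StringIO('device: generic_android\niterations: 3\n'))
        print(type(config).__name__, dict(config))  # OrderedDict {...}
    except SerializerSyntaxError as e:
        print('parse error:', e)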

@ -17,7 +17,7 @@ import re
import logging
from itertools import chain
from devlib.trace.ftrace import TRACE_MARKER_START, TRACE_MARKER_STOP
from devlib.collector.ftrace import TRACE_MARKER_START, TRACE_MARKER_STOP
from wa.utils.misc import isiterable
from wa.utils.types import numeric

@ -38,7 +38,8 @@ if sys.version_info[0] == 3:
else:
from urllib import quote, unquote # pylint: disable=no-name-in-module
# pylint: disable=wrong-import-position
from collections import defaultdict, MutableMapping
from collections import defaultdict
from collections.abc import MutableMapping
from functools import total_ordering
from future.utils import with_metaclass
@ -170,9 +171,9 @@ def list_or_caseless_string(value):
def list_or(type_):
"""
Generator for "list or" types. These take either a single value or a list
values and return a list of the specfied ``type_`` performing the
values and return a list of the specified ``type_`` performing the
conversion on the value (if a single value is specified) or each of the
elemented of the specified list.
elements of the specified list.
"""
list_type = list_of(type_)
@ -208,6 +209,13 @@ def regex(value):
return re.compile(value)
def version_tuple(v):
"""
Converts a version string into a tuple of ints that can be used for natural comparison.
"""
return tuple(map(int, (v.split("."))))
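Example of the natural comparison this enables (tuples of ints compare element-wise, so '1.10' correctly sorts after '1.9', unlike plain string comparison):

    def version_tuple(v):
        return tuple(map(int, (v.split("."))))

    assert version_tuple('1.10.0') > version_tuple('1.9.3')   # (1, 10, 0) > (1, 9, 3)
    assert version_tuple('3.2') == (3, 2)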
__counters = defaultdict(int)
@ -281,7 +289,7 @@ class prioritylist(object):
- ``new_element`` the element to be inserted in the prioritylist
- ``priority`` is the priority of the element which specifies its
order withing the List
order within the List
"""
self._add_element(new_element, priority)
@ -774,11 +782,8 @@ class ParameterDict(dict):
def update(self, *args, **kwargs):
for d in list(args) + [kwargs]:
if isinstance(d, ParameterDict):
dict.update(self, d)
else:
for k, v in d.items():
self[k] = v
for k, v in d:
self[k] = v
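A hedged sketch of the fixed behaviour; it assumes only what the loop above relies on, namely that iterating a ``ParameterDict`` yields decoded ``(key, value)`` pairs which are then re-encoded by ``__setitem__``:

    from wa.utils.types import ParameterDict

    a = ParameterDict()
    a['threshold'] = 10
    b = ParameterDict()
    b['threshold'] = 20
    a.update(b)                 # merges via decoded pairs, not raw dict storage
    assert a['threshold'] == 20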
class cpu_mask(object):

@ -41,7 +41,7 @@ class AdobeReader(ApkUiautoWorkload):
Search ``document_name`` for each string in the ``search_string_list``
4. Close the document
Known working APK version: 16.1
Known working APK version: 19.7.1.10709
'''
default_search_strings = [

@ -3,12 +3,12 @@ apply plugin: 'com.android.application'
def packageName = "com.arm.wa.uiauto.adobereader"
android {
compileSdkVersion 25
compileSdkVersion 28
buildToolsVersion '25.0.0'
defaultConfig {
applicationId "${packageName}"
minSdkVersion 18
targetSdkVersion 25
targetSdkVersion 28
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {

@ -110,6 +110,28 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
private void dismissWelcomeView() throws Exception {
//Close optional sign in screen on newer versions (19.4.0.9813)
UiObject closeWelcomeImage = mDevice.findObject(new UiSelector().resourceId(packageID + "optional_signing_cross_button")
.className("android.widget.ImageView"));
if (closeWelcomeImage.exists()) {
closeWelcomeImage.click();
}
// Deal with popup dialog message tutorial on newer versions
UiObject tutorialDialog = mDevice.findObject(new UiSelector().resourceId(packageID + "close_card_button")
.className("android.widget.ImageButton"));
if (tutorialDialog.waitForExists(TimeUnit.SECONDS.toMillis(3))) {
tutorialDialog.click();
}
//Check to see if app is on home screen
if (mDevice.findObject(new UiSelector().textContains("Home")).exists()) {
return;
}
// Support older version (Last known working 16.1)
UiObject welcomeView = getUiObjectByResourceId("android:id/content",
"android.widget.FrameLayout");
welcomeView.swipeLeft(10);
@ -150,37 +172,8 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
private void openFile(final String filename) throws Exception {
String testTag = "open_document";
ActionLogger logger = new ActionLogger(testTag, parameters);
// Select the local files list from the My Documents view
clickUiObject(BY_TEXT, "LOCAL", "android.widget.TextView");
UiObject directoryPath =
mDevice.findObject(new UiSelector().resourceId(packageID + "directoryPath"));
if (!directoryPath.waitForExists(TimeUnit.SECONDS.toMillis(60))) {
throw new UiObjectNotFoundException("Could not find any local files");
}
// Click the button to search from the present file list view
UiObject searchButton =
mDevice.findObject(new UiSelector().resourceId(packageID + "split_pane_search"));
if (!searchButton.waitForExists(TimeUnit.SECONDS.toMillis(10))) {
throw new UiObjectNotFoundException("Could not find search button");
}
searchButton.click();
// Force a refresh of files before searching
uiDeviceSwipe(Direction.DOWN, 100);
// Repeat as first swipe is sometimes ignored.
uiDeviceSwipe(Direction.DOWN, 100);
// Enter search text into the file searchBox. This will automatically filter the list.
UiObject searchBox =
mDevice.findObject(new UiSelector().resourceIdMatches(".*search_src_text")
.classNameMatches("android.widget.Edit.*"));
searchBox.setText(filename);
// Open a file from a file list view by searching for UiObjects containing the doc title.
UiObject fileObject = getUiObjectByText(filename, "android.widget.TextView");
UiObject fileObject = findFileObject(filename);
logger.start();
fileObject.clickAndWaitForNewWindow(uiAutoTimeout);
@ -194,6 +187,68 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
logger.stop();
}
private UiObject findFileObject(String filename) throws Exception {
UiObject localFilesTab = mDevice.findObject(new UiSelector().textContains("LOCAL")
.className("android.widget.TextView"));
// Support older versions
if (localFilesTab.exists()) {
localFilesTab.click();
UiObject directoryPath =
mDevice.findObject(new UiSelector().resourceId(packageID + "directoryPath"));
if (!directoryPath.waitForExists(TimeUnit.SECONDS.toMillis(60))) {
throw new UiObjectNotFoundException("Could not find any local files");
}
// Click the button to search from the present file list view
UiObject searchButton =
mDevice.findObject(new UiSelector().resourceId(packageID + "split_pane_search"));
if (!searchButton.waitForExists(TimeUnit.SECONDS.toMillis(10))) {
throw new UiObjectNotFoundException("Could not find search button");
}
searchButton.click();
// Force a refresh of files before searching
uiDeviceSwipe(Direction.DOWN, 100);
// Repeat as first swipe is sometimes ignored.
uiDeviceSwipe(Direction.DOWN, 100);
// Enter search text into the file searchBox. This will automatically filter the list.
UiObject searchBox =
mDevice.findObject(new UiSelector().resourceIdMatches(".*search_src_text")
.classNameMatches("android.widget.Edit.*"));
searchBox.setText(filename);
// Open a file from a file list view by searching for UiObjects containing the doc title.
return getUiObjectByText(filename, "android.widget.TextView");
}
// Support for newer version
UiObject searchNavigationButton = mDevice.findObject(new UiSelector()
.resourceIdMatches(packageID + "bottombaritem_search")
.className("android.widget.FrameLayout"));
// On devices with larger screen sizes, layout is different
if(!searchNavigationButton.exists()) {
searchNavigationButton = getUiObjectByResourceId(packageID + "search_button_home",
"android.widget.TextView");
}
searchNavigationButton.click();
UiObject searchBox =
mDevice.findObject(new UiSelector().resourceIdMatches(".*search_src_text")
.classNameMatches("android.widget.EditText"));
searchBox.click();
searchBox.setText(filename);
mDevice.pressEnter();
// Remove file extension
return getUiObjectByText(filename.substring(0,filename.lastIndexOf(".")), "android.widget.TextView");
}
private void gesturesTest() throws Exception {
String testTag = "gesture";
@ -295,21 +350,7 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
progressBar.waitUntilGone(searchTimeout);
logger.stop();
// Get back to the main document view by clicking twice on the close button
UiObject searchCloseButton =
mDevice.findObject(new UiSelector().resourceIdMatches(".*search_close_btn")
.className("android.widget.ImageView"));
searchCloseButton.click();
if (searchCloseButton.exists()){
searchCloseButton.clickAndWaitForNewWindow();
}
else {
UiObject searchBackButton = getUiObjectByDescription("Collapse",
"android.widget.ImageButton");
searchBackButton.clickAndWaitForNewWindow();
}
mDevice.pressBack();
}
}
@ -333,14 +374,9 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
menuButton.click();
}
else {
menuButton =
mDevice.findObject(new UiSelector().resourceIdMatches(".*up.*")
.classNameMatches("android.widget.Image.*"));
menuButton.click();
mDevice.pressBack();
}
clickUiObject(BY_DESC, "My Documents", "android.widget.LinearLayout", true);
UiObject searchBackButton =
mDevice.findObject(new UiSelector().description("Collapse")
.className("android.widget.ImageButton"));
@ -348,7 +384,7 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
searchBackButton.click();
}
else {
clickUiObject(BY_ID, "android:id/up", "android.widget.ImageView", true);
mDevice.pressBack();
}
}
}

@ -31,7 +31,7 @@ fi
# Copy base class library from wa dist
libs_dir=app/libs
base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
mkdir -p $libs_dir
cp $base_class $libs_dir

wa/workloads/aitutu/__init__.py Executable file

@ -0,0 +1,69 @@
# Copyright 2014-2018 ARM Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import re
from wa import ApkUiautoWorkload
from wa.framework.exception import WorkloadError
class Aitutu(ApkUiautoWorkload):
name = 'aitutu'
package_names = ['com.antutu.aibenchmark']
regex_matches = [re.compile(r'Overall Score ([\d.]+)'),
re.compile(r'Image Total Score ([\d.]+) ([\w]+) ([\w]+)'),
re.compile(r'Image Speed Score ([\d.]+) ([\w]+) ([\w]+)'),
re.compile(r'Image Accuracy Score ([\d.]+) ([\w]+) ([\w]+)'),
re.compile(r'Object Total Score ([\d.]+) ([\w]+) ([\w]+)'),
re.compile(r'Object Speed Score ([\d.]+) ([\w]+) ([\w]+)'),
re.compile(r'Object Accuracy Score ([\d.]+) ([\w]+) ([\w]+)')]
description = '''
Executes the Aitutu Image Speed/Accuracy and Object Speed/Accuracy tests.
The Aitutu workflow carries out the following tasks.
1. Open Aitutu application
2. Download the resources for the test
3. Execute the tests
Known working APK version: 1.0.3
'''
requires_network = True
def __init__(self, target, **kwargs):
super(Aitutu, self).__init__(target, **kwargs)
self.gui.timeout = 1200000
def update_output(self, context):
super(Aitutu, self).update_output(context)
expected_results = len(self.regex_matches)
logcat_file = context.get_artifact_path('logcat')
with open(logcat_file) as fh:
for line in fh:
for regex in self.regex_matches:
match = regex.search(line)
if match:
classifiers = {}
result = match.group(1)
if (len(match.groups())) > 1:
entry = regex.pattern.rsplit(None, 3)[0]
classifiers = {'model': match.group(3)}
else:
entry = regex.pattern.rsplit(None, 1)[0]
context.add_metric(entry, result, '', lower_is_better=False, classifiers=classifiers)
expected_results -= 1
if expected_results > 0:
msg = "The Aitutu workload has failed. Expected {} scores, Detected {} scores."
raise WorkloadError(msg.format(len(self.regex_matches), expected_results))
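Illustrative only: how the first pattern above is matched against a logcat line of the shape emitted by the workload's UiAutomation class (sample score invented):

    import re

    regex = re.compile(r'Overall Score ([\d.]+)')
    line = 'D/UXPERF  ( 1234): Overall Score 2741.0'
    match = regex.search(line)
    if match:
        entry = regex.pattern.rsplit(None, 1)[0]   # metric name from the pattern
        print(entry, '->', match.group(1))         # Overall Score -> 2741.0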

Binary file not shown.

@ -0,0 +1,35 @@
apply plugin: 'com.android.application'
def packageName = "com.arm.wa.uiauto.aitutu"
android {
compileSdkVersion 28
buildToolsVersion '25.0.0'
defaultConfig {
applicationId "${packageName}"
minSdkVersion 18
targetSdkVersion 28
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
applicationVariants.all { variant ->
variant.outputs.each { output ->
output.outputFile = file("$project.buildDir/apk/${packageName}.apk")
}
}
}
}
dependencies {
compile fileTree(include: ['*.jar'], dir: 'libs')
compile 'com.android.support.test:runner:0.5'
compile 'com.android.support.test:rules:0.5'
compile 'com.android.support.test.uiautomator:uiautomator-v18:2.1.2'
compile(name: 'uiauto', ext: 'aar')
}
repositories {
flatDir {
dirs 'libs'
}
}

@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.arm.wa.uiauto.aitutu"
android:versionCode="1"
android:versionName="1.0">
<instrumentation
android:name="android.support.test.runner.AndroidJUnitRunner"
android:targetPackage="${applicationId}"/>
</manifest>

@ -0,0 +1,143 @@
/* Copyright 2013-2018 ARM Limited
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.arm.wa.uiauto.aitutu;
import android.app.Activity;
import android.os.Bundle;
import android.graphics.Rect;
import android.support.test.runner.AndroidJUnit4;
import android.support.test.uiautomator.UiObject;
import android.support.test.uiautomator.UiObjectNotFoundException;
import android.support.test.uiautomator.UiSelector;
import android.support.test.uiautomator.UiScrollable;
import android.view.KeyEvent;
import android.util.Log;
import com.arm.wa.uiauto.BaseUiAutomation;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import java.util.concurrent.TimeUnit;
@RunWith(AndroidJUnit4.class)
public class UiAutomation extends BaseUiAutomation {
public static String TAG = "UXPERF";
@Test
public void setup() throws Exception {
clearPopups();
downloadAssets();
}
@Test
public void runWorkload() throws Exception {
runBenchmark();
}
@Test
public void extractResults() throws Exception {
getScores();
}
public void clearPopups() throws Exception {
UiSelector selector = new UiSelector();
UiObject cancel = mDevice.findObject(selector.textContains("CANCEL")
.className("android.widget.Button"));
cancel.waitForExists(60000);
if (cancel.exists()){
cancel.click();
}
//waitObject(cancel);
//cancel.click();
}
public void downloadAssets() throws Exception {
UiSelector selector = new UiSelector();
//Start the tests
UiObject start = mDevice.findObject(selector.textContains("Start Testing")
.className("android.widget.TextView"));
waitObject(start);
start.click();
UiObject check = mDevice.findObject(selector.textContains("classification")
.className("android.widget.TextView"));
waitObject(check);
}
public void runBenchmark() throws Exception {
UiSelector selector = new UiSelector();
//Wait for the tests to complete
UiObject complete =
mDevice.findObject(selector.text("TEST AGAIN")
.className("android.widget.Button"));
complete.waitForExists(1200000);
}
public void getScores() throws Exception {
UiSelector selector = new UiSelector();
//Declare the models used
UiObject imageMod =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(1))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewAIModelName"));
UiObject objectMod =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(4))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewAIModelName"));
//Log the scores and models
UiObject totalScore =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/textViewTotalScore"));
Log.d(TAG, "Overall Score " + totalScore.getText());
UiObject imageTotal =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(1))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewSIDScore"));
Log.d(TAG, "Image Total Score " + imageTotal.getText() + " Model " + imageMod.getText());
UiObject imageSpeed =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(2))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewBIDScore"));
Log.d(TAG, "Image Speed Score " + imageSpeed.getText() + " Model " + imageMod.getText());
UiObject imageAcc =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(3))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewBIDScore"));
Log.d(TAG, "Image Accuracy Score " + imageAcc.getText() + " Model " + imageMod.getText());
UiObject objectTotal =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(4))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewSIDScore"));
Log.d(TAG, "Object Total Score " + objectTotal.getText() + " Model " + objectMod.getText());
UiObject objectSpeed =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(5))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewBIDScore"));
Log.d(TAG, "Object Speed Score " + objectSpeed.getText() + " Model " + objectMod.getText());
UiObject objectAcc =
mDevice.findObject(selector.resourceId("com.antutu.aibenchmark:id/recyclerView"))
.getChild(selector.index(6))
.getChild(selector.resourceId("com.antutu.aibenchmark:id/textViewBIDScore"));
Log.d(TAG, "Object Accuracy Score " + objectAcc.getText() + " Model " + objectMod.getText());
}
}

@ -0,0 +1,23 @@
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:2.3.1'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}

@ -0,0 +1,55 @@
#!/bin/bash
# Copyright 2013-2018 ARM Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
set -e
# CD into build dir if possible - allows building from any directory
script_path='.'
if `readlink -f $0 &>/dev/null`; then
script_path=`readlink -f $0 2>/dev/null`
fi
script_dir=`dirname $script_path`
cd $script_dir
# Ensure gradlew exists before starting
if [[ ! -f gradlew ]]; then
echo 'gradlew file not found! Check that you are in the right directory.'
exit 9
fi
# Copy base class library from wa dist
libs_dir=app/libs
base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
mkdir -p $libs_dir
cp $base_class $libs_dir
# Build and return appropriate exit code if failed
# gradle build
./gradlew clean :app:assembleDebug
exit_code=$?
if [[ $exit_code -ne 0 ]]; then
echo "ERROR: 'gradle build' exited with code $exit_code"
exit $exit_code
fi
# If successful move APK file to workload folder (overwrite previous)
package=com.arm.wa.uiauto.aitutu
rm -f ../$package
if [[ -f app/build/apk/$package.apk ]]; then
cp app/build/apk/$package.apk ../$package.apk
else
echo 'ERROR: UiAutomator apk could not be found!'
exit 9
fi

Binary file not shown.

@ -0,0 +1,6 @@
#Wed May 03 15:42:44 BST 2017
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-3.3-all.zip

wa/workloads/aitutu/uiauto/gradlew vendored Executable file

@ -0,0 +1,160 @@
#!/usr/bin/env bash
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn ( ) {
echo "$*"
}
die ( ) {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
esac
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Split up the JVM_OPTS And GRADLE_OPTS values into an array, following the shell quoting and substitution rules
function splitJvmOpts() {
JVM_OPTS=("$@")
}
eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"

wa/workloads/aitutu/uiauto/gradlew.bat vendored Normal file

@ -0,0 +1,90 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windowz variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

@ -0,0 +1 @@
include ':app'

@ -3,12 +3,12 @@ apply plugin: 'com.android.application'
def packageName = "com.arm.wa.uiauto.androbench"
android {
compileSdkVersion 25
compileSdkVersion 28
buildToolsVersion '25.0.0'
defaultConfig {
applicationId "${packageName}"
minSdkVersion 18
targetSdkVersion 25
targetSdkVersion 28
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {

@ -31,7 +31,7 @@ fi
# Copy base class library from wa dist
libs_dir=app/libs
base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
mkdir -p $libs_dir
cp $base_class $libs_dir

@ -3,12 +3,12 @@ apply plugin: 'com.android.application'
def packageName = "com.arm.wa.uiauto.antutu"
android {
compileSdkVersion 25
buildToolsVersion "25.0.3"
compileSdkVersion 28
buildToolsVersion "28.0.3"
defaultConfig {
applicationId "${packageName}"
minSdkVersion 18
targetSdkVersion 25
targetSdkVersion 28
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"

@ -44,7 +44,8 @@ public class UiAutomation extends BaseUiAutomation {
@Test
public void setup() throws Exception {
dismissAndroidVersionPopup();
dismissAndroidVersionPopup();
clearPopups();
}
@Test
@@ -65,6 +66,15 @@ public class UiAutomation extends BaseUiAutomation {
         sleep(1);
     }
 
+    public void clearPopups() throws Exception {
+        UiObject cancel =
+            mDevice.findObject(new UiSelector().textContains("CANCEL"));
+        cancel.waitForExists(5000);
+        if (cancel.exists()){
+            cancel.click();
+        }
+    }
+
     public void waitforCompletion() throws Exception {
         UiObject totalScore =
             mDevice.findObject(new UiSelector().resourceId("com.antutu.ABenchMark:id/textViewTotalScore"));

@@ -31,7 +31,7 @@ fi
 
 # Copy base class library from wa dist
 libs_dir=app/libs
-base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
+base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
 mkdir -p $libs_dir
 cp $base_class $libs_dir
 

@@ -3,12 +3,12 @@ apply plugin: 'com.android.application'
 def packageName = "com.arm.wa.uiauto.applaunch"
 
 android {
-    compileSdkVersion 25
-    buildToolsVersion "25.0.3"
+    compileSdkVersion 28
+    buildToolsVersion "28.0.3"
     defaultConfig {
         applicationId "${packageName}"
         minSdkVersion 18
-        targetSdkVersion 25
+        targetSdkVersion 28
         testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
     }
     buildTypes {

@@ -31,7 +31,7 @@ fi
 
 # Copy base class library from wa dist
 libs_dir=app/libs
-base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
+base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
 mkdir -p $libs_dir
 cp $base_class $libs_dir
 

@@ -3,12 +3,12 @@ apply plugin: 'com.android.application'
 def packageName = "com.arm.wa.uiauto.benchmarkpi"
 
 android {
-    compileSdkVersion 25
-    buildToolsVersion "25.0.3"
+    compileSdkVersion 28
+    buildToolsVersion "28.0.3"
     defaultConfig {
         applicationId "${packageName}"
         minSdkVersion 18
-        targetSdkVersion 25
+        targetSdkVersion 28
         testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
     }
     buildTypes {

@@ -31,7 +31,7 @@ fi
 
 # Copy base class library from wa dist
 libs_dir=app/libs
-base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
+base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
 mkdir -p $libs_dir
 cp $base_class $libs_dir
 

@@ -6,7 +6,7 @@ android {
     defaultConfig {
         applicationId "com.arm.wa.uiauto.chrome"
         minSdkVersion 18
-        targetSdkVersion 25
+        targetSdkVersion 28
         testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
     }
     buildTypes {

@@ -81,12 +81,22 @@ public class UiAutomation extends BaseUiAutomation implements ApplaunchInterface
         // Activate the tab switcher
         tabSwitcher = mDevice.findObject(new UiSelector().resourceId(packageID + "tab_switcher_button")
                                          .className("android.widget.ImageButton"));
-        tabSwitcher.clickAndWaitForNewWindow(uiAutoTimeout);
-
-        // Click the New Tab button
-        newTab = mDevice.findObject(new UiSelector().resourceId(packageID + "new_tab_button")
-                                    .className("android.widget.Button"));
-        newTab.clickAndWaitForNewWindow(uiAutoTimeout);
+        if (tabSwitcher.exists()){
+            tabSwitcher.clickAndWaitForNewWindow(uiAutoTimeout);
+            // Click the New Tab button
+            newTab = mDevice.findObject(new UiSelector().resourceId(packageID + "new_tab_button")
+                                        .className("android.widget.Button"));
+            newTab.clickAndWaitForNewWindow(uiAutoTimeout);
+        }
+        // Support Tablet devices which do not have tab switcher
+        else {
+            UiObject menu_button = mDevice.findObject(new UiSelector().resourceId(packageID + "menu_button")
+                                                      .className("android.widget.ImageButton"));
+            menu_button.click();
+            newTab = mDevice.findObject(new UiSelector().resourceId(packageID + "menu_item_text")
+                                        .textContains("New tab"));
+            newTab.click();
+        }
     }
 
     public void followTextLink(String text) throws Exception {

@@ -31,7 +31,7 @@ fi
 
 # Copy base class library from wlauto dist
 libs_dir=app/libs
-base_class=`python -c "import os, wa; print os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar')"`
+base_class=`python3 -c "import os, wa; print(os.path.join(os.path.dirname(wa.__file__), 'framework', 'uiauto', 'uiauto.aar'))"`
 mkdir -p $libs_dir
 cp $base_class $libs_dir
 

@@ -153,7 +153,8 @@ class Dhrystone(Workload):
 
     @once
     def finalize(self, context):
-        self.target.uninstall('dhrystone')
+        if self.uninstall:
+            self.target.uninstall('dhrystone')
 
     def validate(self):
         if self.mloops and self.duration:  # pylint: disable=E0203
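The guard makes target clean-up opt-out rather than unconditional. A minimal sketch of the pattern for any binary workload, assuming the usual `uninstall` boolean parameter and standard WA imports (the class and names here are hypothetical, not a real WA workload):

from wa import Parameter, Workload
from wa.utils.exec_control import once

class ExampleBinary(Workload):  # illustrative only
    name = 'examplebinary'

    parameters = [
        Parameter('uninstall', kind=bool, default=True,
                  description='Remove the installed binary during finalization.'),
    ]

    @once
    def finalize(self, context):
        # Skip clean-up when the user opted to keep the binary on the target.
        if self.uninstall:
            self.target.uninstall(self.name)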

@@ -77,13 +77,13 @@ class ExoPlayer(ApkWorkload):
 
     video_directory = os.path.join(settings.dependencies_directory, name)
     package_names = ['com.google.android.exoplayer2.demo']
-    versions = ['2.4', '2.5', '2.6']
+    supported_versions = ['2.4', '2.5', '2.6']
     action = 'com.google.android.exoplayer.demo.action.VIEW'
     default_format = 'mov_720p'
     view = 'SurfaceView - com.google.android.exoplayer2.demo/com.google.android.exoplayer2.demo.PlayerActivity'
 
     parameters = [
-        Parameter('version', allowed_values=versions, default=versions[-1], override=True),
+        Parameter('version', allowed_values=supported_versions, override=True),
         Parameter('duration', kind=int, default=20,
                   description="""
                   Playback duration of the video file. This becomes the duration of the workload.
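Dropping `default=versions[-1]` changes behaviour, not just the attribute name: with only allowed_values set, the version stays unset by default, so the APK resolver can use whichever supported APK is available rather than insisting on 2.6. A side-by-side sketch of the two declarations (illustrative only):

from wa import Parameter

supported_versions = ['2.4', '2.5', '2.6']

# Before: always resolves to '2.6' unless the agenda overrides it.
old = Parameter('version', allowed_values=supported_versions,
                default=supported_versions[-1], override=True)

# After: unset by default, so any APK whose version is in the list is acceptable.
new = Parameter('version', allowed_values=supported_versions, override=True)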

@@ -23,6 +23,7 @@ from collections import defaultdict
 from wa import ApkUiautoWorkload, Parameter
 from wa.framework.exception import ConfigError, WorkloadError
 from wa.utils.misc import capitalize
+from wa.utils.types import version_tuple
 
 
 class Geekbench(ApkUiautoWorkload):
@@ -51,39 +52,16 @@ class Geekbench(ApkUiautoWorkload):
     http://support.primatelabs.com/kb/geekbench/interpreting-geekbench-scores
     """
     summary_metrics = ['score', 'multicore_score']
-    versions = {
-        '4.3.1': {
-            'package': 'com.primatelabs.geekbench',
-            'activity': '.HomeActivity',
-        },
-        '4.2.0': {
-            'package': 'com.primatelabs.geekbench',
-            'activity': '.HomeActivity',
-        },
-        '4.0.1': {
-            'package': 'com.primatelabs.geekbench',
-            'activity': '.HomeActivity',
-        },
-        # Version 3.4.1 was the final version 3 variant
-        '3.4.1': {
-            'package': 'com.primatelabs.geekbench',
-            'activity': '.HomeActivity',
-        },
-        '3.0.0': {
-            'package': 'com.primatelabs.geekbench3',
-            'activity': '.HomeActivity',
-        },
-        '2': {
-            'package': 'ca.primatelabs.geekbench2',
-            'activity': '.HomeActivity',
-        },
-    }
+    supported_versions = ['4.4.2', '4.4.0', '4.3.4', '4.3.2', '4.3.1', '4.2.0', '4.0.1', '3.4.1', '3.0.0', '2']
+    package_names = ['com.primatelabs.geekbench', 'com.primatelabs.geekbench3', 'ca.primatelabs.geekbench2']
 
     begin_regex = re.compile(r'^\s*D/WebViewClassic.loadDataWithBaseURL\(\s*\d+\s*\)'
                              r'\s*:\s*(?P<content>\<.*)\s*$')
     replace_regex = re.compile(r'<[^>]*>')
 
     parameters = [
-        Parameter('version', default=sorted(versions.keys())[-1], allowed_values=sorted(versions.keys()),
+        Parameter('version', allowed_values=supported_versions,
                   description='Specifies which version of the workload should be run.',
                   override=True),
         Parameter('loops', kind=int, default=1, aliases=['times'],
@@ -105,27 +83,12 @@ class Geekbench(ApkUiautoWorkload):
     requires_network = True
 
-    @property
-    def activity(self):
-        return self.versions[self.version]['activity']
-
-    @property
-    def package(self):
-        return self.versions[self.version]['package']
-
-    @property
-    def package_names(self):
-        return [self.package]
-
-    def __init__(self, *args, **kwargs):
-        super(Geekbench, self).__init__(*args, **kwargs)
+    def initialize(self, context):
+        super(Geekbench, self).initialize(context)
         self.gui.uiauto_params['version'] = self.version
         self.gui.uiauto_params['loops'] = self.loops
         self.gui.uiauto_params['is_corporate'] = self.is_corporate
         self.gui.timeout = self.timeout
-
-    def initialize(self, context):
-        super(Geekbench, self).initialize(context)
         if not self.disable_update_result and not self.target.is_rooted:
             raise WorkloadError(
                 'Geekbench workload requires root to collect results. '
@@ -135,12 +98,11 @@ class Geekbench(ApkUiautoWorkload):
 
     def setup(self, context):
         super(Geekbench, self).setup(context)
         self.run_timeout = self.timeout * self.loops
-        self.exact_apk_version = self.version
 
     def update_output(self, context):
         super(Geekbench, self).update_output(context)
         if not self.disable_update_result:
-            major_version = versiontuple(self.version)[0]
+            major_version = version_tuple(self.version)[0]
             update_method = getattr(self, 'update_result_{}'.format(major_version))
             update_method(context)
@@ -154,7 +116,7 @@ class Geekbench(ApkUiautoWorkload):
             score_calculator.update_results(context)
 
     def update_result_3(self, context):
-        outfile_glob = self.target.path.join(self.target.package_data_directory, self.package, 'files', '*gb3')
+        outfile_glob = self.target.path.join(self.target.package_data_directory, self.apk.package, 'files', '*gb3')
         on_target_output_files = [f.strip() for f in self.target.execute('ls {}'.format(outfile_glob),
                                                                          as_root=True).split('\n') if f]
         for i, on_target_output_file in enumerate(on_target_output_files):
@@ -176,7 +138,7 @@ class Geekbench(ApkUiautoWorkload):
                         section['multicore_score'])
 
     def update_result_4(self, context):
-        outfile_glob = self.target.path.join(self.target.package_data_directory, self.package, 'files', '*gb*')
+        outfile_glob = self.target.path.join(self.target.package_data_directory, self.apk.package, 'files', '*gb*')
         on_target_output_files = [f.strip() for f in self.target.execute('ls {}'.format(outfile_glob),
                                                                          as_root=True).split('\n') if f]
         for i, on_target_output_file in enumerate(on_target_output_files):
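With the `package` property gone, the result-file glob is built from the package of the APK that was actually resolved (self.apk.package). A standalone sketch of the path construction, assuming the conventional Android app-data location (the literal values below are illustrative assumptions):

import posixpath

package_data_directory = '/data/data'        # typical Android location (assumption)
apk_package = 'com.primatelabs.geekbench'    # whichever APK was resolved

# Mirrors update_result_4(): glob for Geekbench 4/5 result files in the app's data dir.
outfile_glob = posixpath.join(package_data_directory, apk_package, 'files', '*gb*')
print(outfile_glob)  # -> /data/data/com.primatelabs.geekbench/files/*gb*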
@@ -395,22 +357,14 @@ class GeekbenchCorproate(Geekbench):  # pylint: disable=too-many-ancestors
     name = "geekbench-corporate"
     is_corporate = True
     requires_network = False
-    versions = ['4.1.0', '5.0.0']
+    supported_versions = ['5.0.3', '5.0.1', '4.1.0', '4.3.4', '5.0.0']
+    package_names = ['com.primatelabs.geekbench4.corporate', 'com.primatelabs.geekbench5.corporate']
     activity = 'com.primatelabs.geekbench.HomeActivity'
-    package = 'com.primatelabs.geekbench4.corporate'
 
     parameters = [
-        Parameter('version',
-                  default=sorted(versions)[-1], allowed_values=versions,
-                  override=True)
+        Parameter('version', allowed_values=supported_versions, override=True)
     ]
 
 
 def namemify(basename, i):
     return basename + (' {}'.format(i) if i else '')
-
-
-def versiontuple(v):
-    return tuple(map(int, (v.split("."))))
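The module-local versiontuple() helper deleted above is superseded by the shared wa.utils.types.version_tuple imported earlier; update_output() only needs the major component to pick a parser. A sketch of the dispatch, using the removed helper's semantics for illustration (the shared WA utility may differ in detail):

def version_tuple(v):
    # Same behaviour as the deleted local helper.
    return tuple(map(int, v.split(".")))

version = '4.4.2'
major_version = version_tuple(version)[0]
# update_output() resolves the parser by name, e.g. update_result_4 for any 4.x build.
assert 'update_result_{}'.format(major_version) == 'update_result_4'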
