Closed
Bug 1443394
Opened 7 years ago
Closed 7 years ago
Intermittent Uncaught exception: Traceback (most recent call last): in mozharness/base/script.py
Categories
(Testing :: Talos, defect, P5)
Tracking
(Not tracked)
RESOLVED
DUPLICATE
of bug 1445580
People
(Reporter: intermittent-bug-filer, Unassigned)
Details
(Keywords: intermittent-failure, Whiteboard: [stockwell infra])
Filed by: csabou [at] mozilla.com
https://treeherder.mozilla.org/logviewer.html#?job_id=166143609&repo=mozilla-inbound
https://queue.taskcluster.net/v1/task/WbekmOoPTlGak1CEbYFArQ/runs/0/artifacts/public/logs/live_backing.log
21:32:12 INFO - Starting mitmproxy playback using env path: /home/cltbld/workspace/build/application/firefox
21:32:12 INFO - Starting mitmproxy playback using command: /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/mitmdump -k -s /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/alternate-server-replay.py /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/mitmproxy-recording-google.mp /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/mitmproxy-recording-youtube.mp /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/mitmproxy-recording-amazon.mp /home/cltbld/workspace/build/tests/talos/talos/mitmproxy/mitmproxy-recording-facebook.mp
21:32:12 INFO - Error starting proxy server: OSError(98, 'Address already in use')
21:32:22 INFO - Aborting: mitmproxy playback process failed to start, poll returned: 1
21:32:22 INFO - Return code: 0
21:32:22 CRITICAL - PERFHERDER_DATA was seen 0 times, expected 1.
21:32:22 CRITICAL - Error copying results /home/cltbld/workspace/build/local.json to upload dir /home/cltbld/workspace/build/blobber_upload_dir/perfherder-data.json
21:32:22 INFO - Running post-action listener: _package_coverage_data
21:32:22 INFO - Running post-action listener: _resource_record_post_action
21:32:22 INFO - [mozharness: 2018-03-06 05:32:22.693602Z] Finished run-tests step (failed)
21:32:22 FATAL - Uncaught exception: Traceback (most recent call last):
21:32:22 FATAL - File "/home/cltbld/workspace/mozharness/mozharness/base/script.py", line 2059, in run
21:32:22 FATAL - self.run_action(action)
21:32:22 FATAL - File "/home/cltbld/workspace/mozharness/mozharness/base/script.py", line 1998, in run_action
21:32:22 FATAL - self._possibly_run_method(method_name, error_if_missing=True)
21:32:22 FATAL - File "/home/cltbld/workspace/mozharness/mozharness/base/script.py", line 1938, in _possibly_run_method
21:32:22 FATAL - return getattr(self, method_name)()
21:32:22 FATAL - File "/home/cltbld/workspace/mozharness/mozharness/mozilla/testing/talos.py", line 755, in run_tests
21:32:22 FATAL - self._artifact_perf_data(dest)
21:32:22 FATAL - File "/home/cltbld/workspace/mozharness/mozharness/mozilla/testing/talos.py", line 646, in _artifact_perf_data
21:32:22 FATAL - parser.update_worst_log_and_tbpl_levels(CRITICAL, TBPL_FAILURE)
21:32:22 FATAL - NameError: global name 'parser' is not defined
21:32:22 FATAL - Running post_fatal callback...
21:32:22 FATAL - Exiting -1
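For context on the failure above: the uncaught exception is raised from the error-reporting path itself. _artifact_perf_data in talos.py references a name 'parser' that is not defined in that scope, so the attempt to record the missing PERFHERDER_DATA crashes with a NameError and masks the original problem (the mitmproxy startup failure). A minimal, self-contained sketch of that pattern follows; the class and helper names are illustrative, not the actual mozharness code:

# Reproduces the NameError pattern from the traceback above.
CRITICAL = 'CRITICAL'
TBPL_FAILURE = 'TBPL_FAILURE'

class TalosLike(object):
    def _artifact_perf_data(self, dest):
        # 'parser' was never assigned in this scope or at module level, so
        # this line raises a NameError ("global name 'parser' is not defined"
        # on Python 2) instead of reporting the copy failure it was meant to.
        parser.update_worst_log_and_tbpl_levels(CRITICAL, TBPL_FAILURE)

if __name__ == '__main__':
    try:
        TalosLike()._artifact_perf_data('/tmp/perfherder-data.json')
    except NameError as e:
        print('caught: %s' % e)

A likely direction for a fix is to reference an attribute that is known to exist (for example self.parser, if the harness stores one) or to define parser before using it; the actual fix belongs in mozharness/mozilla/testing/talos.py.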
Comment hidden (Intermittent Failures Robot)
Comment hidden (Intermittent Failures Robot)
Comment 3•7 years ago
There have been 50 total failures in the last 7 days. This bug was filed 10 days ago. According to Orange Factor, the number of failures seems to have dropped in the last day.
Summary: Intermittent Uncaught exception: Traceback (most recent call last): in mozharness/base/script.py
Failures per platform:
-android-4-3-armv7-api16: 30
-windows10-64: 15
-windows10-64-qr: 3
-Linux x64: 2
Failures per build type:
-opt: 44
-pgo: 6
-debug: 1
Here is a recent relevant log file and a snippet with the failure:
https://treeherder.mozilla.org/logviewer.html#?repo=mozilla-central&job_id=168293490&lineNumber=717
[task 2018-03-15T18:49:20.900Z] 18:49:20 INFO - Reading from file /builds/worker/workspace/build/target.test_packages.json
[task 2018-03-15T18:49:20.900Z] 18:49:20 INFO - Running post-action listener: _resource_record_post_action
[task 2018-03-15T18:49:20.901Z] 18:49:20 INFO - Running post-action listener: find_tests_for_verification
[task 2018-03-15T18:49:20.901Z] 18:49:20 INFO - Running post-action listener: set_extra_try_arguments
[task 2018-03-15T18:49:20.901Z] 18:49:20 INFO - [mozharness: 2018-03-15 18:49:20.901301Z] Finished download-and-extract step (failed)
[task 2018-03-15T18:49:20.904Z] 18:49:20 FATAL - Uncaught exception: Traceback (most recent call last):
[task 2018-03-15T18:49:20.905Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/base/script.py", line 2076, in run
[task 2018-03-15T18:49:20.905Z] 18:49:20 FATAL - self.run_action(action)
[task 2018-03-15T18:49:20.906Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/base/script.py", line 2015, in run_action
[task 2018-03-15T18:49:20.907Z] 18:49:20 FATAL - self._possibly_run_method(method_name, error_if_missing=True)
[task 2018-03-15T18:49:20.908Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/base/script.py", line 1955, in _possibly_run_method
[task 2018-03-15T18:49:20.909Z] 18:49:20 FATAL - return getattr(self, method_name)()
[task 2018-03-15T18:49:20.909Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/scripts/android_emulator_unittest.py", line 684, in download_and_extract
[task 2018-03-15T18:49:20.910Z] 18:49:20 FATAL - suite_categories=self._query_suite_categories())
[task 2018-03-15T18:49:20.911Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/mozilla/testing/testbase.py", line 566, in download_and_extract
[task 2018-03-15T18:49:20.912Z] 18:49:20 FATAL - self._download_test_packages(suite_categories, extract_dirs)
[task 2018-03-15T18:49:20.913Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/mozilla/testing/testbase.py", line 428, in _download_test_packages
[task 2018-03-15T18:49:20.914Z] 18:49:20 FATAL - package_requirements = self._read_packages_manifest()
[task 2018-03-15T18:49:20.915Z] 18:49:20 FATAL - File "/builds/worker/workspace/mozharness/mozharness/mozilla/testing/testbase.py", line 394, in _read_packages_manifest
[task 2018-03-15T18:49:20.915Z] 18:49:20 FATAL - package_requirements = json.load(fh)
[task 2018-03-15T18:49:20.916Z] 18:49:20 FATAL - File "/usr/lib/python2.7/json/__init__.py", line 291, in load
[task 2018-03-15T18:49:20.917Z] 18:49:20 FATAL - **kw)
[task 2018-03-15T18:49:20.918Z] 18:49:20 FATAL - File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
[task 2018-03-15T18:49:20.919Z] 18:49:20 FATAL - return _default_decoder.decode(s)
[task 2018-03-15T18:49:20.919Z] 18:49:20 FATAL - File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
[task 2018-03-15T18:49:20.920Z] 18:49:20 FATAL - obj, end = self.raw_decode(s, idx=_w(s, 0).end())
[task 2018-03-15T18:49:20.921Z] 18:49:20 FATAL - File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
[task 2018-03-15T18:49:20.922Z] 18:49:20 FATAL - raise ValueError("No JSON object could be decoded")
[task 2018-03-15T18:49:20.923Z] 18:49:20 FATAL - ValueError: No JSON object could be decoded
[task 2018-03-15T18:49:20.923Z] 18:49:20 FATAL - Running post_fatal callback...
[task 2018-03-15T18:49:20.924Z] 18:49:20 FATAL - Exiting -1
Flags: needinfo?(rwood)
Whiteboard: [stockwell needswork]
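Regarding the ValueError in the snippet above: json.load() raises "No JSON object could be decoded" (Python 2.7) when the file it is given is empty or truncated, which is what target.test_packages.json looks like after an interrupted download. A hedged sketch of reading the manifest defensively so the error is reported rather than escaping as an uncaught FATAL (this is not the mozharness implementation):

import json

def read_packages_manifest(path):
    """Return the parsed manifest, or None if the file is missing or invalid."""
    try:
        with open(path) as fh:
            return json.load(fh)
    except (IOError, OSError) as e:
        print('manifest not readable: %s' % e)
    except ValueError as e:
        # Raised for empty/truncated files; json.JSONDecodeError on Python 3
        # is a ValueError subclass, so this works on both interpreters.
        print('manifest is not valid JSON (truncated download?): %s' % e)
    return None

if __name__ == '__main__':
    print(read_packages_manifest('target.test_packages.json'))

A caller that gets None back could retry the download or fail the step with a clearer message.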
Comment 4•7 years ago
This is not a talos-specific issue; it is affecting all kinds of test suites across the board. It looks like a failure downloading the test package. I'm guessing it was a network issue, but I don't know. Since the failures are dropping, let's see if it resolves on its own.
Flags: needinfo?(rwood)
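If the root cause is indeed a transient network failure during download-and-extract, one common mitigation is to retry the fetch and validate the payload before accepting it. A sketch under that assumption (the URL handling is illustrative; this is not the mozharness download code):

import json
import time
import urllib.error
import urllib.request

def fetch_json(url, attempts=3, delay=5):
    """Fetch and parse a JSON resource, retrying on network or decode errors."""
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=60) as resp:
                # Parsing before writing anything to disk ensures a truncated
                # response is rejected here rather than failing later.
                return json.loads(resp.read().decode('utf-8'))
        except (urllib.error.URLError, ValueError) as e:
            print('attempt %d/%d failed: %s' % (attempt, attempts, e))
            if attempt < attempts:
                time.sleep(delay)
    raise RuntimeError('could not fetch a valid manifest from %s' % url)

(The failing jobs ran Python 2.7, where the equivalent would use urllib2.)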
Comment hidden (Intermittent Failures Robot)
Updated•7 years ago
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → DUPLICATE
Whiteboard: [stockwell needswork] → [stockwell infra]
Comment hidden (Intermittent Failures Robot)
Comment hidden (Intermittent Failures Robot)
Comment hidden (Intermittent Failures Robot)
Comment 10•7 years ago
Comment hidden (Intermittent Failures Robot)