Created September 1, 2025 22:41
| cpython main 🐚 docker run --platform linux/amd64 -it @$(docker build --platform linux/amd64 -q .) | |
| The necessary bits to build these optional modules were not found: | |
| _dbm _decimal _gdbm | |
| _zstd | |
| To find the necessary bits, look in configure.ac and config.log. | |
| Checked 114 modules (37 built-in, 72 shared, 1 n/a on linux-x86_64, 0 disabled, 4 missing, 0 failed on import) | |
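A quick sanity check for the optional modules reported as missing above; this is an illustrative sketch (not part of the captured log) and assumes it is run with the freshly built `./python` inside the same container:

```python
# Confirm that the optional extension modules the build reported as
# "not found" (_dbm, _decimal, _gdbm, _zstd) really are unavailable
# in this interpreter.
for name in ("_dbm", "_decimal", "_gdbm", "_zstd"):
    try:
        __import__(name)
    except ImportError as exc:
        print(f"{name}: missing ({exc})")
    else:
        print(f"{name}: importable")
```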
| ./python -E ./Tools/build/generate-build-details.py `cat pybuilddir.txt`/build-details.json | |
| ./python -E -m test --fast-ci -u-gui --timeout= | |
| + ./python -u -W error -bb -E -m test --fast-ci -u-gui --timeout= --dont-add-python-opts | |
| == CPython 3.15.0a0 (heads/main:0d383f8, Sep 1 2025, 17:08:52) [GCC 8.5.0 20210514 (Red Hat 8.5.0-28)] | |
| == Linux-6.10.14-linuxkit-x86_64-with-glibc2.28 little-endian | |
| == Python build: debug LTO | |
| == cwd: /home/buildbot/cpython/build/test_python_worker_235 | |
| == CPU count: 8 | |
| == encodings: locale=UTF-8 FS=utf-8 | |
| == resources: all,-cpu,-gui | |
| Using random seed: 3845732249 | |
| 0:00:00 load avg: 1.12 Run 492 tests in parallel using 10 worker processes (timeout: 10 min, worker timeout: 15 min) | |
| 0:00:00 load avg: 1.84 [ 1/492] test.test_future_stmt.test_future_multiple_imports passed | |
| 0:00:00 load avg: 1.84 [ 2/492] test_dbm_ndbm skipped | |
| test_dbm_ndbm skipped -- No module named '_dbm' | |
| 0:00:00 load avg: 1.84 [ 3/492] test_weakset passed | |
| 0:00:00 load avg: 1.84 [ 4/492] test_ttk_textonly passed | |
| 0:00:01 load avg: 1.84 [ 5/492] test_hmac passed | |
| 0:00:01 load avg: 1.84 [ 6/492] test_winapi skipped | |
| test_winapi skipped -- No module named '_winapi' | |
| 0:00:01 load avg: 1.84 [ 7/492] test.test_asyncio.test_futures2 passed | |
| 0:00:01 load avg: 1.84 [ 8/492] test_base64 passed | |
| 0:00:02 load avg: 1.84 [ 9/492] test_sysconfig passed | |
| 0:00:02 load avg: 1.84 [ 10/492] test_pprint passed | |
| 0:00:03 load avg: 1.84 [ 11/492] test_range passed | |
| 0:00:04 load avg: 1.84 [ 12/492] test_urllib2 passed | |
| 0:00:04 load avg: 1.84 [ 13/492] test_hash passed | |
| 0:00:04 load avg: 1.84 [ 14/492] test_statistics passed | |
| 0:00:04 load avg: 1.84 [ 15/492] test_subclassinit passed | |
| 0:00:05 load avg: 1.84 [ 16/492] test_ast passed | |
| 0:00:05 load avg: 1.84 [ 17/492] test_locale passed | |
| 0:00:05 load avg: 1.84 [ 18/492] test_trace passed | |
| 0:00:05 load avg: 2.49 [ 19/492] test_grp passed | |
| 0:00:05 load avg: 2.49 [ 20/492] test_optparse passed | |
| 0:00:06 load avg: 2.49 [ 21/492] test_tuple passed | |
| 0:00:06 load avg: 2.49 [ 22/492] test_zipapp passed | |
| 0:00:06 load avg: 2.49 [ 23/492] test.test_asyncio.test_timeouts passed | |
| 0:00:07 load avg: 2.49 [ 24/492] test_dbm_sqlite3 passed | |
| 0:00:07 load avg: 2.49 [ 25/492] test_httplib passed | |
| 0:00:08 load avg: 2.49 [ 26/492] test_bz2 passed | |
| 0:00:08 load avg: 2.49 [ 27/492] test_kqueue skipped | |
| test_kqueue skipped -- test works only on BSD | |
| 0:00:09 load avg: 2.49 [ 28/492] test_contains passed | |
| 0:00:09 load avg: 2.49 [ 29/492] test_support passed | |
| 0:00:09 load avg: 2.49 [ 30/492] test_pkgutil passed | |
| 0:00:09 load avg: 2.49 [ 31/492] test_univnewlines passed | |
| 0:00:10 load avg: 2.49 [ 32/492] test_codecencodings_tw passed | |
| 0:00:10 load avg: 2.85 [ 33/492] test_argparse passed | |
| 0:00:10 load avg: 2.85 [ 34/492] test_remote_pdb passed | |
| 0:00:11 load avg: 2.85 [ 35/492] test.test_asyncio.test_runners passed | |
| 0:00:12 load avg: 2.85 [ 36/492] test_poll passed | |
| 0:00:12 load avg: 2.85 [ 37/492] test_unicode_identifiers passed | |
| 0:00:12 load avg: 2.85 [ 38/492] test_zipimport_support passed | |
| 0:00:13 load avg: 2.85 [ 39/492] test_interpreters passed | |
| 0:00:13 load avg: 2.85 [ 40/492] test_termios passed | |
| 0:00:13 load avg: 2.85 [ 41/492] test_codeccallbacks passed | |
| 0:00:14 load avg: 2.85 [ 42/492] test_long passed | |
| 0:00:14 load avg: 2.85 [ 43/492] test_mimetypes passed | |
| 0:00:14 load avg: 2.85 [ 44/492] test_cmath passed | |
| 0:00:14 load avg: 2.85 [ 45/492] test_apple skipped | |
| test_apple skipped -- Apple-specific | |
| 0:00:14 load avg: 2.85 [ 46/492] test_extcall passed | |
| 0:00:15 load avg: 2.85 [ 47/492] test__opcode passed | |
| 0:00:15 load avg: 2.85 [ 48/492] test.test_gdb.test_cfunction_full skipped | |
| test.test_gdb.test_cfunction_full skipped -- Couldn't find gdb program on the path: [Errno 2] No such file or directory: 'gdb' | |
| 0:00:15 load avg: 3.18 [ 49/492] test_perfmaps passed | |
| 0:00:15 load avg: 3.18 [ 50/492] test.test_multiprocessing_fork.test_manager passed | |
| 0:00:15 load avg: 3.18 [ 51/492] test_stat passed | |
| 0:00:16 load avg: 3.18 [ 52/492] test_string_literals passed | |
| 0:00:16 load avg: 3.18 [ 53/492] test.test_concurrent_futures.test_thread_pool passed | |
| 0:00:16 load avg: 3.18 [ 54/492] test_codecencodings_kr passed | |
| 0:00:17 load avg: 3.18 [ 55/492] test_sys_settrace passed | |
| 0:00:17 load avg: 3.18 [ 56/492] test_netrc passed | |
| 0:00:18 load avg: 3.18 [ 57/492] test_pyexpat passed | |
| 0:00:18 load avg: 3.18 [ 58/492] test_math_property passed | |
| 0:00:18 load avg: 3.18 [ 59/492] test_tty passed | |
| 0:00:19 load avg: 3.18 [ 60/492] test_runpy passed | |
| 0:00:19 load avg: 3.18 [ 61/492] test_strtod passed | |
| 0:00:20 load avg: 3.18 [ 62/492] test_isinstance passed | |
| 0:00:20 load avg: 3.18 [ 63/492] test_fnmatch passed | |
| 0:00:20 load avg: 3.65 [ 64/492] test_pwd passed | |
| 0:00:20 load avg: 3.65 [ 65/492] test_robotparser passed | |
| 0:00:21 load avg: 3.65 [ 66/492] test_pep646_syntax passed | |
| 0:00:21 load avg: 3.65 [ 67/492] test_userlist passed | |
| 0:00:22 load avg: 3.65 [ 68/492] test_unittest passed | |
| 0:00:22 load avg: 3.65 [ 69/492] test_setcomps passed | |
| 0:00:23 load avg: 3.65 [ 70/492] test_codecmaps_cn passed | |
| 0:00:23 load avg: 3.65 [ 71/492] test_print passed | |
| 0:00:23 load avg: 3.65 [ 72/492] test_str passed | |
| 0:00:24 load avg: 3.65 [ 73/492] test_http_cookiejar passed | |
| 0:00:24 load avg: 3.65 [ 74/492] test_fstring passed | |
| 0:00:24 load avg: 3.65 [ 75/492] test__colorize passed | |
| 0:00:24 load avg: 3.65 [ 76/492] test_codecmaps_hk passed | |
| 0:00:25 load avg: 3.65 [ 77/492] test_secrets passed | |
| 0:00:25 load avg: 3.65 [ 78/492/1] test_faulthandler failed (9 failures) | |
| test_cancel_later_without_dump_traceback_later (test.test_faulthandler.FaultHandlerTests.test_cancel_later_without_dump_traceback_later) ... ok | |
| test_disable (test.test_faulthandler.FaultHandlerTests.test_disable) ... FAIL | |
| test_disable_windows_exc_handler (test.test_faulthandler.FaultHandlerTests.test_disable_windows_exc_handler) ... skipped 'specific to Windows' | |
| test_disabled_by_default (test.test_faulthandler.FaultHandlerTests.test_disabled_by_default) ... ok | |
| test_dump_c_stack (test.test_faulthandler.FaultHandlerTests.test_dump_c_stack) ... ok | |
| test_dump_c_stack_file (test.test_faulthandler.FaultHandlerTests.test_dump_c_stack_file) ... ok | |
| test_dump_ext_modules (test.test_faulthandler.FaultHandlerTests.test_dump_ext_modules) ... FAIL | |
| test_dump_traceback (test.test_faulthandler.FaultHandlerTests.test_dump_traceback) ... ok | |
| test_dump_traceback_fd (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_fd) ... ok | |
| test_dump_traceback_file (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_file) ... ok | |
| test_dump_traceback_later (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later) ... ok | |
| test_dump_traceback_later_cancel (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later_cancel) ... ok | |
| test_dump_traceback_later_fd (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later_fd) ... ok | |
| test_dump_traceback_later_file (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later_file) ... ok | |
| test_dump_traceback_later_repeat (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later_repeat) ... ok | |
| test_dump_traceback_later_twice (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_later_twice) ... ok | |
| test_dump_traceback_threads (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_threads) ... ok | |
| test_dump_traceback_threads_file (test.test_faulthandler.FaultHandlerTests.test_dump_traceback_threads_file) ... ok | |
| test_enable_fd (test.test_faulthandler.FaultHandlerTests.test_enable_fd) ... FAIL | |
| test_enable_file (test.test_faulthandler.FaultHandlerTests.test_enable_file) ... FAIL | |
| test_enable_single_thread (test.test_faulthandler.FaultHandlerTests.test_enable_single_thread) ... FAIL | |
| test_enable_without_c_stack (test.test_faulthandler.FaultHandlerTests.test_enable_without_c_stack) ... FAIL | |
| test_env_var (test.test_faulthandler.FaultHandlerTests.test_env_var) ... ok | |
| test_fatal_error (test.test_faulthandler.FaultHandlerTests.test_fatal_error) ... ok | |
| test_fatal_error_c_thread (test.test_faulthandler.FaultHandlerTests.test_fatal_error_c_thread) ... ok | |
| test_fatal_error_without_gil (test.test_faulthandler.FaultHandlerTests.test_fatal_error_without_gil) ... ok | |
| test_free_threaded_dump_traceback (test.test_faulthandler.FaultHandlerTests.test_free_threaded_dump_traceback) ... skipped 'only meaningful if the GIL is disabled' | |
| test_gc (test.test_faulthandler.FaultHandlerTests.test_gc) ... FAIL | |
| test_gil_released (test.test_faulthandler.FaultHandlerTests.test_gil_released) ... FAIL | |
| test_ignore_exception (test.test_faulthandler.FaultHandlerTests.test_ignore_exception) ... skipped 'specific to Windows' | |
| test_is_enabled (test.test_faulthandler.FaultHandlerTests.test_is_enabled) ... ok | |
| test_raise_exception (test.test_faulthandler.FaultHandlerTests.test_raise_exception) ... skipped 'specific to Windows' | |
| test_raise_nonfatal_exception (test.test_faulthandler.FaultHandlerTests.test_raise_nonfatal_exception) ... skipped 'specific to Windows' | |
| test_register (test.test_faulthandler.FaultHandlerTests.test_register) ... ok | |
| test_register_chain (test.test_faulthandler.FaultHandlerTests.test_register_chain) ... ok | |
| test_register_fd (test.test_faulthandler.FaultHandlerTests.test_register_fd) ... ok | |
| test_register_file (test.test_faulthandler.FaultHandlerTests.test_register_file) ... ok | |
| test_register_threads (test.test_faulthandler.FaultHandlerTests.test_register_threads) ... ok | |
| test_sigabrt (test.test_faulthandler.FaultHandlerTests.test_sigabrt) ... ok | |
| test_sigbus (test.test_faulthandler.FaultHandlerTests.test_sigbus) ... ok | |
| test_sigfpe (test.test_faulthandler.FaultHandlerTests.test_sigfpe) ... ok | |
| test_sigill (test.test_faulthandler.FaultHandlerTests.test_sigill) ... ok | |
| test_sigsegv (test.test_faulthandler.FaultHandlerTests.test_sigsegv) ... FAIL | |
| test_stack_overflow (test.test_faulthandler.FaultHandlerTests.test_stack_overflow) ... ok | |
| test_stderr_None (test.test_faulthandler.FaultHandlerTests.test_stderr_None) ... ok | |
| test_sys_xoptions (test.test_faulthandler.FaultHandlerTests.test_sys_xoptions) ... ok | |
| test_truncate (test.test_faulthandler.FaultHandlerTests.test_truncate) ... ok | |
| test_unregister (test.test_faulthandler.FaultHandlerTests.test_unregister) ... ok | |
| ====================================================================== | |
| FAIL: test_disable (test.test_faulthandler.FaultHandlerTests.test_disable) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 387, in test_disable | |
| self.assertNotEqual(exitcode, 0) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^ | |
| AssertionError: 0 == 0 | |
| ====================================================================== | |
| FAIL: test_dump_ext_modules (test.test_faulthandler.FaultHandlerTests.test_dump_ext_modules) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 404, in test_dump_ext_modules | |
| self.fail(f"Cannot find 'Extension modules:' in {stderr!r}") | |
| ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: Cannot find 'Extension modules:' in '' | |
| ====================================================================== | |
| FAIL: test_enable_fd (test.test_faulthandler.FaultHandlerTests.test_enable_fd) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 342, in test_enable_fd | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<5 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| fd=fd) | |
| ^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 4 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_enable_file (test.test_faulthandler.FaultHandlerTests.test_enable_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 326, in test_enable_file | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<5 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| filename=filename) | |
| ^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 154, in check_error | |
| output, exitcode = self.get_output(code, filename=filename, fd=fd) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 96, in get_output | |
| self.assertEqual(output, '') | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| AssertionError: "<sys>:0: ResourceWarning: unclosed file [63 chars]'>\n" != '' | |
| - <sys>:0: ResourceWarning: unclosed file <_io.BufferedWriter name='/tmp/test_python_hl86tp47/tmpp65b5w6f'> | |
| ====================================================================== | |
| FAIL: test_enable_single_thread (test.test_faulthandler.FaultHandlerTests.test_enable_single_thread) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 354, in test_enable_single_thread | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<4 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| all_threads=False) | |
| ^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nStack\\ \\(most\\ recent\\ call\\ first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_enable_without_c_stack (test.test_faulthandler.FaultHandlerTests.test_enable_without_c_stack) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 365, in test_enable_without_c_stack | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<4 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| c_stack=False) | |
| ^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>' not found in '' | |
| ====================================================================== | |
| FAIL: test_gc (test.test_faulthandler.FaultHandlerTests.test_gc) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 182, in test_gc | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<28 lines>... | |
| function='__del__', | |
| ^^^^^^^^^^^^^^^^^^^ | |
| garbage_collecting=True) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n Garbage-collecting\n File "<string>", line 9 in __del__\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in 'exit' | |
| ====================================================================== | |
| FAIL: test_gil_released (test.test_faulthandler.FaultHandlerTests.test_gil_released) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 315, in test_gil_released | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<3 lines>... | |
| 3, | |
| ^^ | |
| 'Segmentation fault') | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_sigsegv (test.test_faulthandler.FaultHandlerTests.test_sigsegv) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 171, in test_sigsegv | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<3 lines>... | |
| 3, | |
| ^^ | |
| 'Segmentation fault') | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ---------------------------------------------------------------------- | |
| Ran 48 tests in 23.731s | |
| FAILED (failures=9, skipped=5) | |
| test test_faulthandler failed | |
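The failures above share a theme: a child process that was expected to crash and emit a faulthandler report either exits cleanly (test_disable) or produces no report at all, so the expected regex is matched against empty output. Below is a minimal, hypothetical repro sketch (not part of the captured log) to check whether a deliberate segfault produces the fault report at all in this environment; it uses the private helper `faulthandler._sigsegv()`, the same trigger the test suite uses, and assumes it is run with the built `./python`:

```python
# Trigger a segfault in a child process with faulthandler enabled and
# inspect stderr for the "Fatal Python error: Segmentation fault" report
# that the failing tests expect to see.
import subprocess
import sys

code = "import faulthandler; faulthandler.enable(); faulthandler._sigsegv()"
proc = subprocess.run([sys.executable, "-c", code],
                      capture_output=True, text=True)
print("exit code:", proc.returncode)   # typically -11 (killed by SIGSEGV) on Linux
print("stderr:", proc.stderr or "<empty, matching the failures above>")
```

If stderr also comes back empty here, that would point at the container environment (signal delivery or stderr handling under linux/amd64 emulation) rather than at the tests themselves.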
| 0:00:25 load avg: 4.16 [ 79/492/1] test_type_comments passed | |
| 0:00:26 load avg: 4.16 [ 80/492/1] test_rlcompleter passed | |
| 0:00:26 load avg: 4.16 [ 81/492/1] test_enum passed | |
| 0:00:26 load avg: 4.16 [ 82/492/1] test_module passed | |
| 0:00:27 load avg: 4.16 [ 83/492/1] test.test_asyncio.test_events passed | |
| 0:00:27 load avg: 4.16 [ 84/492/1] test_sqlite3 passed | |
| 0:00:27 load avg: 4.16 [ 85/492/1] test_popen passed | |
| 0:00:27 load avg: 4.16 [ 86/492/1] test_pyclbr passed | |
| 0:00:28 load avg: 4.16 [ 87/492/1] test_xml_etree_c passed | |
| 0:00:28 load avg: 4.16 [ 88/492/1] test_global passed | |
| 0:00:29 load avg: 4.16 [ 89/492/1] test_baseexception passed | |
| 0:00:34 load avg: 6.55 [ 90/492/1] test_doctest passed | |
| 0:00:35 load avg: 6.55 [ 91/492/1] test.test_multiprocessing_forkserver.test_threads passed | |
| 0:00:35 load avg: 6.55 [ 92/492/1] test___all__ passed | |
| 0:00:36 load avg: 7.07 [ 93/492/1] test__interpchannels passed -- running (1): test.test_multiprocessing_forkserver.test_processes (30.4 sec) | |
| 0:00:37 load avg: 7.07 [ 94/492/1] test.test_asyncio.test_staggered passed -- running (1): test.test_multiprocessing_forkserver.test_processes (31.5 sec) | |
| 0:00:38 load avg: 7.07 [ 95/492/1] test.test_concurrent_futures.test_init passed -- running (1): test.test_multiprocessing_forkserver.test_processes (32.3 sec) | |
| 0:00:38 load avg: 7.07 [ 96/492/1] test_complex passed -- running (1): test.test_multiprocessing_forkserver.test_processes (33.1 sec) | |
| 0:00:39 load avg: 7.07 [ 97/492/1] test_sched passed -- running (1): test.test_multiprocessing_forkserver.test_processes (34.2 sec) | |
| 0:00:40 load avg: 7.54 [ 98/492/1] test_venv passed -- running (1): test.test_multiprocessing_forkserver.test_processes (35.1 sec) | |
| 0:00:41 load avg: 7.54 [ 99/492/1] test_utf8source passed -- running (1): test.test_multiprocessing_forkserver.test_processes (35.6 sec) | |
| 0:00:41 load avg: 7.54 [100/492/1] test_copy passed -- running (1): test.test_multiprocessing_forkserver.test_processes (36.1 sec) | |
| 0:00:43 load avg: 7.54 [101/492/1] test_wait4 passed -- running (2): test.test_concurrent_futures.test_shutdown (30.6 sec), test.test_multiprocessing_forkserver.test_processes (37.6 sec) | |
| 0:00:43 load avg: 7.54 [102/492/1] test.test_asyncio.test_tasks passed -- running (2): test.test_concurrent_futures.test_shutdown (30.6 sec), test.test_multiprocessing_forkserver.test_processes (37.6 sec) | |
| 0:00:46 load avg: 11.34 [103/492/1] test_codecencodings_cn passed -- running (2): test.test_concurrent_futures.test_shutdown (33.6 sec), test.test_multiprocessing_forkserver.test_processes (40.6 sec) | |
| 0:00:50 load avg: 10.99 [104/492/1] test_tracemalloc passed -- running (2): test.test_concurrent_futures.test_shutdown (38.0 sec), test.test_multiprocessing_forkserver.test_processes (45.0 sec) | |
| 0:00:51 load avg: 10.99 [105/492/1] test_bool passed -- running (2): test.test_concurrent_futures.test_shutdown (38.4 sec), test.test_multiprocessing_forkserver.test_processes (45.4 sec) | |
| 0:00:51 load avg: 10.99 [106/492/1] test.test_concurrent_futures.test_shutdown passed (38.4 sec) -- running (1): test.test_multiprocessing_forkserver.test_processes (45.4 sec) | |
| 0:00:51 load avg: 10.99 [107/492/1] test_decorators passed -- running (1): test.test_multiprocessing_forkserver.test_processes (45.8 sec) | |
| 0:00:51 load avg: 10.99 [108/492/1] test_xml_dom_minicompat passed -- running (1): test.test_multiprocessing_forkserver.test_processes (45.9 sec) | |
| 0:00:54 load avg: 10.99 [109/492/1] test_gzip passed -- running (1): test.test_multiprocessing_forkserver.test_processes (49.0 sec) | |
| 0:00:55 load avg: 10.99 [110/492/1] test_sys_setprofile passed -- running (1): test.test_multiprocessing_forkserver.test_processes (49.4 sec) | |
| 0:00:55 load avg: 10.67 [111/492/1] test_metaclass passed -- running (1): test.test_multiprocessing_forkserver.test_processes (49.8 sec) | |
| 0:00:56 load avg: 10.67 [112/492/1] test_strptime passed -- running (1): test.test_multiprocessing_forkserver.test_processes (50.3 sec) | |
| 0:00:56 load avg: 10.67 [113/492/1] test_mmap passed -- running (1): test.test_multiprocessing_forkserver.test_processes (50.8 sec) | |
| 0:00:57 load avg: 10.67 [114/492/1] test_descrtut passed -- running (1): test.test_multiprocessing_forkserver.test_processes (51.3 sec) | |
| 0:00:57 load avg: 10.67 [115/492/1] test_tcl passed -- running (1): test.test_multiprocessing_forkserver.test_processes (51.8 sec) | |
| 0:00:57 load avg: 10.67 [116/492/1] test_keyword passed -- running (2): test.test_multiprocessing_forkserver.test_processes (52.2 sec), test_regrtest (30.2 sec) | |
| 0:00:58 load avg: 10.67 [117/492/1] test.test_multiprocessing_forkserver.test_processes passed (52.3 sec) -- running (2): test.test_multiprocessing_forkserver.test_misc (30.1 sec), test_regrtest (30.3 sec) | |
| 0:00:58 load avg: 10.67 [118/492/1] test_flufl passed -- running (2): test.test_multiprocessing_forkserver.test_misc (30.3 sec), test_regrtest (30.5 sec) | |
| 0:00:58 load avg: 10.67 [119/492/1] test_contextlib passed -- running (2): test.test_multiprocessing_forkserver.test_misc (30.4 sec), test_regrtest (30.7 sec) | |
| 0:00:58 load avg: 10.67 [120/492/1] test_contextlib_async passed -- running (2): test.test_multiprocessing_forkserver.test_misc (30.7 sec), test_regrtest (31.0 sec) | |
| 0:00:58 load avg: 10.67 [121/492/1] test_sundry passed -- running (2): test.test_multiprocessing_forkserver.test_misc (30.8 sec), test_regrtest (31.1 sec) | |
| 0:00:59 load avg: 10.67 [122/492/1] test_msvcrt skipped -- running (2): test.test_multiprocessing_forkserver.test_misc (31.1 sec), test_regrtest (31.4 sec) | |
| test_msvcrt skipped -- windows related tests | |
| 0:00:59 load avg: 10.67 [123/492/1] test_syntax passed -- running (2): test.test_multiprocessing_forkserver.test_misc (31.3 sec), test_regrtest (31.6 sec) | |
| 0:00:59 load avg: 10.67 [124/492/1] test_memoryio passed -- running (2): test.test_multiprocessing_forkserver.test_misc (31.6 sec), test_regrtest (31.9 sec) | |
| 0:00:59 load avg: 10.67 [125/492/1] test_bigaddrspace passed -- running (2): test.test_multiprocessing_forkserver.test_misc (31.9 sec), test_regrtest (32.2 sec) | |
| 0:01:00 load avg: 10.54 [126/492/1] test.test_asyncio.test_context passed -- running (2): test.test_multiprocessing_forkserver.test_misc (32.5 sec), test_regrtest (32.8 sec) | |
| 0:01:00 load avg: 10.54 [127/492/1] test_signal passed -- running (2): test.test_multiprocessing_forkserver.test_misc (32.9 sec), test_regrtest (33.2 sec) | |
| 0:01:01 load avg: 10.54 [128/492/1] test_queue passed -- running (2): test.test_multiprocessing_forkserver.test_misc (33.2 sec), test_regrtest (33.5 sec) | |
| 0:01:01 load avg: 10.54 [129/492/1] test_unary passed -- running (2): test.test_multiprocessing_forkserver.test_misc (33.3 sec), test_regrtest (33.6 sec) | |
| 0:01:02 load avg: 10.54 [130/492/1] test_math passed -- running (2): test.test_multiprocessing_forkserver.test_misc (34.3 sec), test_regrtest (34.6 sec) | |
| 0:01:03 load avg: 10.54 [131/492/1] test_readline passed -- running (2): test.test_multiprocessing_forkserver.test_misc (35.3 sec), test_regrtest (35.6 sec) | |
| 0:01:03 load avg: 10.54 [132/492/1] test_descr passed -- running (2): test.test_multiprocessing_forkserver.test_misc (35.7 sec), test_regrtest (35.9 sec) | |
| 0:01:04 load avg: 10.54 [133/492/1] test_linecache passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.1 sec), test_regrtest (36.4 sec) | |
| 0:01:04 load avg: 10.54 [134/492/1] test.test_multiprocessing_spawn.test_threads passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.2 sec), test_regrtest (36.5 sec) | |
| 0:01:04 load avg: 10.54 [135/492/1] test_weakref passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.2 sec), test_regrtest (36.5 sec) | |
| 0:01:04 load avg: 10.54 [136/492/1] test_listcomps passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.3 sec), test_regrtest (36.6 sec) | |
| 0:01:04 load avg: 10.54 [137/492/1] test_textwrap passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.5 sec), test_regrtest (36.7 sec) | |
| 0:01:04 load avg: 10.54 [138/492/1] test_resource passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.8 sec), test_regrtest (37.0 sec) | |
| 0:01:04 load avg: 10.54 [139/492/1] test_ipaddress passed -- running (2): test.test_multiprocessing_forkserver.test_misc (36.9 sec), test_regrtest (37.2 sec) | |
| 0:01:05 load avg: 10.54 [140/492/1] test_peepholer passed -- running (2): test.test_multiprocessing_forkserver.test_misc (37.1 sec), test_regrtest (37.3 sec) | |
| 0:01:05 load avg: 10.26 [141/492/1] test_winreg skipped -- running (3): test.test_multiprocessing_forkserver.test_misc (37.5 sec), test_regrtest (37.8 sec), test.test_multiprocessing_spawn.test_manager (30.4 sec) | |
| test_winreg skipped -- No module named 'winreg' | |
| 0:01:06 load avg: 10.26 [142/492/1] test_set passed -- running (3): test.test_multiprocessing_forkserver.test_misc (38.5 sec), test_regrtest (38.8 sec), test.test_multiprocessing_spawn.test_manager (31.4 sec) | |
| 0:01:06 load avg: 10.26 [143/492/1] test_exception_variations passed -- running (3): test.test_multiprocessing_forkserver.test_misc (38.9 sec), test_regrtest (39.2 sec), test.test_multiprocessing_spawn.test_manager (31.8 sec) | |
| 0:01:07 load avg: 10.26 [144/492/1] test.test_asyncio.test_server passed -- running (3): test.test_multiprocessing_forkserver.test_misc (39.7 sec), test_regrtest (40.0 sec), test.test_multiprocessing_spawn.test_manager (32.6 sec) | |
| 0:01:07 load avg: 10.26 [145/492/1] test.test_asyncio.test_sendfile passed -- running (3): test.test_multiprocessing_forkserver.test_misc (39.7 sec), test_regrtest (40.0 sec), test.test_multiprocessing_spawn.test_manager (32.6 sec) | |
| 0:01:08 load avg: 10.26 [146/492/1] test.test_multiprocessing_forkserver.test_misc passed (40.9 sec) -- running (2): test_regrtest (41.2 sec), test.test_multiprocessing_spawn.test_manager (33.8 sec) | |
| 0:01:09 load avg: 10.26 [147/492/1] test_winsound skipped -- running (2): test_regrtest (41.5 sec), test.test_multiprocessing_spawn.test_manager (34.1 sec) | |
| test_winsound skipped -- No module named 'winsound' | |
| 0:01:09 load avg: 10.26 [148/492/1] test_logging passed -- running (2): test_regrtest (42.1 sec), test.test_multiprocessing_spawn.test_manager (34.7 sec) | |
| 0:01:09 load avg: 10.26 [149/492/1] test_file_eintr passed -- running (2): test_regrtest (42.1 sec), test.test_multiprocessing_spawn.test_manager (34.7 sec) | |
| 0:01:09 load avg: 10.26 [150/492/1] test_structseq passed -- running (2): test_regrtest (42.3 sec), test.test_multiprocessing_spawn.test_manager (34.9 sec) | |
| 0:01:10 load avg: 10.32 [151/492/1] test_tstring passed -- running (2): test_regrtest (42.8 sec), test.test_multiprocessing_spawn.test_manager (35.4 sec) | |
| 0:01:10 load avg: 10.32 [152/492/1] test_functools passed -- running (2): test_regrtest (42.9 sec), test.test_multiprocessing_spawn.test_manager (35.5 sec) | |
| 0:01:10 load avg: 10.32 [153/492/1] test_getpass passed -- running (2): test_regrtest (43.0 sec), test.test_multiprocessing_spawn.test_manager (35.6 sec) | |
| 0:01:10 load avg: 10.32 [154/492/1] test_errno passed -- running (2): test_regrtest (43.2 sec), test.test_multiprocessing_spawn.test_manager (35.8 sec) | |
| 0:01:11 load avg: 10.32 [155/492/1] test_bigmem passed -- running (2): test_regrtest (43.4 sec), test.test_multiprocessing_spawn.test_manager (36.0 sec) | |
| 0:01:11 load avg: 10.32 [156/492/1] test_binop passed -- running (2): test_regrtest (43.5 sec), test.test_multiprocessing_spawn.test_manager (36.1 sec) | |
| 0:01:11 load avg: 10.32 [157/492/1] test_symtable passed -- running (2): test_regrtest (43.8 sec), test.test_multiprocessing_spawn.test_manager (36.4 sec) | |
| 0:01:11 load avg: 10.32 [158/492/1] test_dis passed -- running (2): test_regrtest (43.8 sec), test.test_multiprocessing_spawn.test_manager (36.4 sec) | |
| 0:01:11 load avg: 10.32 [159/492/1] test_ensurepip passed -- running (2): test_regrtest (44.0 sec), test.test_multiprocessing_spawn.test_manager (36.6 sec) | |
| 0:01:12 load avg: 10.32 [160/492/1] test.test_future_stmt.test_future passed -- running (2): test_regrtest (44.7 sec), test.test_multiprocessing_spawn.test_manager (37.3 sec) | |
| 0:01:13 load avg: 10.32 [161/492/1] test_gc passed -- running (3): test.test_multiprocessing_spawn.test_misc (30.2 sec), test_regrtest (45.8 sec), test.test_multiprocessing_spawn.test_manager (38.4 sec) | |
| 0:01:13 load avg: 10.32 [162/492/1] test_mailbox passed -- running (3): test.test_multiprocessing_spawn.test_misc (30.3 sec), test_regrtest (45.8 sec), test.test_multiprocessing_spawn.test_manager (38.4 sec) | |
| 0:01:13 load avg: 10.32 [163/492/1] test.test_multiprocessing_spawn.test_manager passed (38.5 sec) -- running (2): test.test_multiprocessing_spawn.test_misc (30.3 sec), test_regrtest (45.9 sec) | |
| 0:01:13 load avg: 10.32 [164/492/1] test_exception_hierarchy passed -- running (2): test.test_multiprocessing_spawn.test_misc (30.6 sec), test_regrtest (46.2 sec) | |
| 0:01:14 load avg: 10.32 [165/492/1] test_graphlib passed -- running (2): test.test_multiprocessing_spawn.test_misc (31.1 sec), test_regrtest (46.7 sec) | |
| 0:01:14 load avg: 10.32 [166/492/1] test.test_asyncio.test_ssl passed -- running (2): test.test_multiprocessing_spawn.test_misc (31.4 sec), test_regrtest (46.9 sec) | |
| 0:01:15 load avg: 10.32 [167/492/1] test_type_annotations passed -- running (2): test.test_multiprocessing_spawn.test_misc (31.8 sec), test_regrtest (47.4 sec) | |
| 0:01:15 load avg: 10.32 [168/492/1] test.test_asyncio.test_proactor_events passed -- running (2): test.test_multiprocessing_spawn.test_misc (32.0 sec), test_regrtest (47.6 sec) | |
| 0:01:16 load avg: 10.53 [169/492/1] test_platform passed -- running (2): test.test_multiprocessing_spawn.test_misc (33.0 sec), test_regrtest (48.6 sec) | |
| 0:01:16 load avg: 10.53 [170/492/1] test.test_asyncio.test_threads passed -- running (2): test.test_multiprocessing_spawn.test_misc (33.5 sec), test_regrtest (49.1 sec) | |
| 0:01:19 load avg: 10.53 [171/492/1] test_importlib passed -- running (2): test.test_multiprocessing_spawn.test_misc (36.0 sec), test_regrtest (51.6 sec) | |
| 0:01:19 load avg: 10.53 [172/492/1] test.test_asyncio.test_windows_events skipped -- running (2): test.test_multiprocessing_spawn.test_misc (36.5 sec), test_regrtest (52.1 sec) | |
| test.test_asyncio.test_windows_events skipped -- Windows only | |
| 0:01:20 load avg: 10.53 [173/492/1] test_largefile passed -- running (2): test.test_multiprocessing_spawn.test_misc (36.9 sec), test_regrtest (52.5 sec) | |
| 0:01:20 load avg: 10.53 [174/492/1] test_time passed -- running (2): test.test_multiprocessing_spawn.test_misc (37.1 sec), test_regrtest (52.7 sec) | |
| 0:01:20 load avg: 10.65 [175/492/1] test.test_gdb.test_misc skipped -- running (2): test.test_multiprocessing_spawn.test_misc (37.3 sec), test_regrtest (52.9 sec) | |
| test.test_gdb.test_misc skipped -- Couldn't find gdb program on the path: [Errno 2] No such file or directory: 'gdb' | |
| 0:01:21 load avg: 10.65 [176/492/1] test_generators passed -- running (2): test.test_multiprocessing_spawn.test_misc (38.0 sec), test_regrtest (53.6 sec) | |
| 0:01:21 load avg: 10.65 [177/492/1] test.test_asyncio.test_selector_events passed -- running (2): test.test_multiprocessing_spawn.test_misc (38.3 sec), test_regrtest (53.9 sec) | |
| 0:01:21 load avg: 10.65 [178/492/1] test_eintr passed -- running (2): test.test_multiprocessing_spawn.test_misc (38.4 sec), test_regrtest (54.0 sec) | |
| 0:01:21 load avg: 10.65 [179/492/1] test.test_gdb.test_pretty_print skipped -- running (2): test.test_multiprocessing_spawn.test_misc (38.5 sec), test_regrtest (54.1 sec) | |
| test.test_gdb.test_pretty_print skipped -- Couldn't find gdb program on the path: [Errno 2] No such file or directory: 'gdb' | |
| 0:01:22 load avg: 10.65 [180/492/1] test_colorsys passed -- running (2): test.test_multiprocessing_spawn.test_misc (38.9 sec), test_regrtest (54.5 sec) | |
| 0:01:22 load avg: 10.65 [181/492/1] test_tkinter skipped (resource denied) -- running (2): test.test_multiprocessing_spawn.test_misc (39.0 sec), test_regrtest (54.6 sec) | |
| test_tkinter skipped -- Use of the 'gui' resource not enabled | |
| 0:01:22 load avg: 10.65 [182/492/1] test_quopri passed -- running (2): test.test_multiprocessing_spawn.test_misc (39.1 sec), test_regrtest (54.7 sec) | |
| 0:01:22 load avg: 10.65 [183/492/1] test_nturl2path passed -- running (2): test.test_multiprocessing_spawn.test_misc (39.5 sec), test_regrtest (55.0 sec) | |
| 0:01:23 load avg: 10.65 [184/492/1] test_multibytecodec passed -- running (2): test.test_multiprocessing_spawn.test_misc (40.0 sec), test_regrtest (55.6 sec) | |
| 0:01:23 load avg: 10.65 [185/492/1] test_turtle passed -- running (2): test.test_multiprocessing_spawn.test_misc (40.1 sec), test_regrtest (55.6 sec) | |
| 0:01:24 load avg: 10.65 [186/492/1] test_uuid passed -- running (2): test.test_multiprocessing_spawn.test_misc (41.1 sec), test_regrtest (56.6 sec) | |
| 0:01:24 load avg: 10.65 [187/492/1] test_marshal passed -- running (2): test.test_multiprocessing_spawn.test_misc (41.6 sec), test_regrtest (57.2 sec) | |
| 0:01:25 load avg: 10.65 [188/492/1] test_codecmaps_kr passed -- running (2): test.test_multiprocessing_spawn.test_misc (42.1 sec), test_regrtest (57.7 sec) | |
| 0:01:25 load avg: 10.65 [189/492/1] test_crossinterp passed -- running (2): test.test_multiprocessing_spawn.test_misc (42.1 sec), test_regrtest (57.7 sec) | |
| 0:01:25 load avg: 11.08 [190/492/1] test_fileutils passed -- running (2): test.test_multiprocessing_spawn.test_misc (42.3 sec), test_regrtest (57.9 sec) | |
| 0:01:26 load avg: 11.08 [191/492/1] test_wmi skipped -- running (2): test.test_multiprocessing_spawn.test_misc (42.8 sec), test_regrtest (58.4 sec) | |
| test_wmi skipped -- No module named '_wmi' | |
| 0:01:26 load avg: 11.08 [192/492/1] test_codeop passed -- running (2): test.test_multiprocessing_spawn.test_misc (43.4 sec), test_regrtest (59.0 sec) | |
| 0:01:26 load avg: 11.08 [193/492/1] test_fileio passed -- running (2): test.test_multiprocessing_spawn.test_misc (43.7 sec), test_regrtest (59.2 sec) | |
| 0:01:27 load avg: 11.08 [194/492/1] test_stringprep passed -- running (2): test.test_multiprocessing_spawn.test_misc (44.2 sec), test_regrtest (59.8 sec) | |
| 0:01:27 load avg: 11.08 [195/492/1] test_shutil passed -- running (2): test.test_multiprocessing_spawn.test_misc (44.2 sec), test_regrtest (59.8 sec) | |
| 0:01:27 load avg: 11.08 [196/492/1] test.test_future_stmt.test_future_multiple_features passed -- running (2): test.test_multiprocessing_spawn.test_misc (44.7 sec), test_regrtest (1 min) | |
| 0:01:28 load avg: 11.08 [197/492/1] test_finalization passed -- running (2): test.test_multiprocessing_spawn.test_misc (45.3 sec), test_regrtest (1 min) | |
| 0:01:29 load avg: 11.08 [198/492/2] test_regrtest failed (2 failures) (1 min 2 sec) -- running (1): test.test_multiprocessing_spawn.test_misc (46.4 sec) | |
| test_add_python_opts (test.test_regrtest.ArgsTestCase.test_add_python_opts) ... ok | |
| test_cleanup (test.test_regrtest.ArgsTestCase.test_cleanup) ... ok | |
| test_coverage (test.test_regrtest.ArgsTestCase.test_coverage) ... ok | |
| test_crashed (test.test_regrtest.ArgsTestCase.test_crashed) ... FAIL | |
| test_doctest (test.test_regrtest.ArgsTestCase.test_doctest) ... ok | |
| test_env_changed (test.test_regrtest.ArgsTestCase.test_env_changed) ... ok | |
| test_failing_test (test.test_regrtest.ArgsTestCase.test_failing_test) ... ok | |
| test_forever (test.test_regrtest.ArgsTestCase.test_forever) ... ok | |
| test_fromfile (test.test_regrtest.ArgsTestCase.test_fromfile) ... ok | |
| test_huntrleaks (test.test_regrtest.ArgsTestCase.test_huntrleaks) ... ok | |
| test_huntrleaks_bisect (test.test_regrtest.ArgsTestCase.test_huntrleaks_bisect) ... ok | |
| test_huntrleaks_fd_leak (test.test_regrtest.ArgsTestCase.test_huntrleaks_fd_leak) ... ok | |
| test_huntrleaks_mp (test.test_regrtest.ArgsTestCase.test_huntrleaks_mp) ... ok | |
| test_ignorefile (test.test_regrtest.ArgsTestCase.test_ignorefile) ... ok | |
| test_interrupted (test.test_regrtest.ArgsTestCase.test_interrupted) ... ok | |
| test_leak_tmp_file (test.test_regrtest.ArgsTestCase.test_leak_tmp_file) ... ok | |
| test_list_cases (test.test_regrtest.ArgsTestCase.test_list_cases) ... ok | |
| test_list_tests (test.test_regrtest.ArgsTestCase.test_list_tests) ... ok | |
| test_matchfile (test.test_regrtest.ArgsTestCase.test_matchfile) ... ok | |
| test_multiprocessing_timeout (test.test_regrtest.ArgsTestCase.test_multiprocessing_timeout) ... ok | |
| test_no_test_ran_some_test_exist_some_not (test.test_regrtest.ArgsTestCase.test_no_test_ran_some_test_exist_some_not) ... ok | |
| test_no_tests_ran (test.test_regrtest.ArgsTestCase.test_no_tests_ran) ... ok | |
| test_no_tests_ran_multiple_tests_nonexistent (test.test_regrtest.ArgsTestCase.test_no_tests_ran_multiple_tests_nonexistent) ... ok | |
| test_no_tests_ran_skip (test.test_regrtest.ArgsTestCase.test_no_tests_ran_skip) ... ok | |
| test_nonascii (test.test_regrtest.ArgsTestCase.test_nonascii) ... ok | |
| test_pgo_exclude (test.test_regrtest.ArgsTestCase.test_pgo_exclude) ... ok | |
| test_print_warning (test.test_regrtest.ArgsTestCase.test_print_warning) ... ok | |
| test_python_command (test.test_regrtest.ArgsTestCase.test_python_command) ... ok | |
| test_random (test.test_regrtest.ArgsTestCase.test_random) ... ok | |
| test_random_seed (test.test_regrtest.ArgsTestCase.test_random_seed) ... ok | |
| test_random_seed_workers (test.test_regrtest.ArgsTestCase.test_random_seed_workers) ... ok | |
| test_rerun_async_setup_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_async_setup_hook_failure) ... ok | |
| test_rerun_async_teardown_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_async_teardown_hook_failure) ... ok | |
| test_rerun_fail (test.test_regrtest.ArgsTestCase.test_rerun_fail) ... ok | |
| test_rerun_setup_class_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_setup_class_hook_failure) ... ok | |
| test_rerun_setup_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_setup_hook_failure) ... ok | |
| test_rerun_setup_module_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_setup_module_hook_failure) ... ok | |
| test_rerun_success (test.test_regrtest.ArgsTestCase.test_rerun_success) ... ok | |
| test_rerun_teardown_class_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_teardown_class_hook_failure) ... ok | |
| test_rerun_teardown_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_teardown_hook_failure) ... ok | |
| test_rerun_teardown_module_hook_failure (test.test_regrtest.ArgsTestCase.test_rerun_teardown_module_hook_failure) ... ok | |
| test_resources (test.test_regrtest.ArgsTestCase.test_resources) ... ok | |
| test_skip (test.test_regrtest.ArgsTestCase.test_skip) ... ok | |
| test_slowest (test.test_regrtest.ArgsTestCase.test_slowest) ... ok | |
| test_slowest_interrupted (test.test_regrtest.ArgsTestCase.test_slowest_interrupted) ... ok | |
| test_success (test.test_regrtest.ArgsTestCase.test_success) ... ok | |
| test_threading_excepthook (test.test_regrtest.ArgsTestCase.test_threading_excepthook) ... ok | |
| test_uncollectable (test.test_regrtest.ArgsTestCase.test_uncollectable) ... ok | |
| test_unicode_guard_env (test.test_regrtest.ArgsTestCase.test_unicode_guard_env) ... ok | |
| test_unload_tests (test.test_regrtest.ArgsTestCase.test_unload_tests) ... ok | |
| test_unraisable_exc (test.test_regrtest.ArgsTestCase.test_unraisable_exc) ... ok | |
| test_verbose3 (test.test_regrtest.ArgsTestCase.test_verbose3) ... ok | |
| test_wait (test.test_regrtest.ArgsTestCase.test_wait) ... ok | |
| test_worker_decode_error (test.test_regrtest.ArgsTestCase.test_worker_decode_error) ... ok | |
| test_worker_output_on_failure (test.test_regrtest.ArgsTestCase.test_worker_output_on_failure) ... FAIL | |
| test_xml (test.test_regrtest.ArgsTestCase.test_xml) ... ok | |
| test_finds_expected_number_of_tests (test.test_regrtest.CheckActualTests.test_finds_expected_number_of_tests) | |
| Check that regrtest appears to find the expected set of tests. ... ok | |
| test_arg (test.test_regrtest.ParseArgsTestCase.test_arg) ... ok | |
| test_arg_option_arg (test.test_regrtest.ParseArgsTestCase.test_arg_option_arg) ... ok | |
| test_bisect (test.test_regrtest.ParseArgsTestCase.test_bisect) ... ok | |
| test_coverage_mp (test.test_regrtest.ParseArgsTestCase.test_coverage_mp) ... ok | |
| test_coverage_sequential (test.test_regrtest.ParseArgsTestCase.test_coverage_sequential) ... ok | |
| test_coverdir (test.test_regrtest.ParseArgsTestCase.test_coverdir) ... ok | |
| test_dont_add_python_opts (test.test_regrtest.ParseArgsTestCase.test_dont_add_python_opts) ... ok | |
| test_exclude (test.test_regrtest.ParseArgsTestCase.test_exclude) ... ok | |
| test_failfast (test.test_regrtest.ParseArgsTestCase.test_failfast) ... ok | |
| test_fast_ci (test.test_regrtest.ParseArgsTestCase.test_fast_ci) ... ok | |
| test_fast_ci_python_cmd (test.test_regrtest.ParseArgsTestCase.test_fast_ci_python_cmd) ... ok | |
| test_fast_ci_resource (test.test_regrtest.ParseArgsTestCase.test_fast_ci_resource) ... ok | |
| test_forever (test.test_regrtest.ParseArgsTestCase.test_forever) ... ok | |
| test_fromfile (test.test_regrtest.ParseArgsTestCase.test_fromfile) ... ok | |
| test_header (test.test_regrtest.ParseArgsTestCase.test_header) ... ok | |
| test_help (test.test_regrtest.ParseArgsTestCase.test_help) ... ok | |
| test_huntrleaks (test.test_regrtest.ParseArgsTestCase.test_huntrleaks) ... ok | |
| test_long_option__partial (test.test_regrtest.ParseArgsTestCase.test_long_option__partial) ... ok | |
| test_match (test.test_regrtest.ParseArgsTestCase.test_match) ... ok | |
| test_memlimit (test.test_regrtest.ParseArgsTestCase.test_memlimit) ... ok | |
| test_multiprocess (test.test_regrtest.ParseArgsTestCase.test_multiprocess) ... ok | |
| test_nocoverdir (test.test_regrtest.ParseArgsTestCase.test_nocoverdir) ... ok | |
| test_nowindows (test.test_regrtest.ParseArgsTestCase.test_nowindows) ... ok | |
| test_option_and_arg (test.test_regrtest.ParseArgsTestCase.test_option_and_arg) ... ok | |
| test_option_with_empty_string_value (test.test_regrtest.ParseArgsTestCase.test_option_with_empty_string_value) ... ok | |
| test_quiet (test.test_regrtest.ParseArgsTestCase.test_quiet) ... ok | |
| test_randomize (test.test_regrtest.ParseArgsTestCase.test_randomize) ... ok | |
| test_randseed (test.test_regrtest.ParseArgsTestCase.test_randseed) ... ok | |
| test_rerun (test.test_regrtest.ParseArgsTestCase.test_rerun) ... ok | |
| test_runleaks (test.test_regrtest.ParseArgsTestCase.test_runleaks) ... ok | |
| test_single (test.test_regrtest.ParseArgsTestCase.test_single) ... ok | |
| test_single_process (test.test_regrtest.ParseArgsTestCase.test_single_process) ... ok | |
| test_slow_ci (test.test_regrtest.ParseArgsTestCase.test_slow_ci) ... ok | |
| test_slowest (test.test_regrtest.ParseArgsTestCase.test_slowest) ... ok | |
| test_start (test.test_regrtest.ParseArgsTestCase.test_start) ... ok | |
| test_testdir (test.test_regrtest.ParseArgsTestCase.test_testdir) ... ok | |
| test_threshold (test.test_regrtest.ParseArgsTestCase.test_threshold) ... ok | |
| test_timeout (test.test_regrtest.ParseArgsTestCase.test_timeout) ... ok | |
| test_two_options (test.test_regrtest.ParseArgsTestCase.test_two_options) ... ok | |
| test_unknown_option (test.test_regrtest.ParseArgsTestCase.test_unknown_option) ... ok | |
| test_unrecognized_argument (test.test_regrtest.ParseArgsTestCase.test_unrecognized_argument) ... ok | |
| test_use (test.test_regrtest.ParseArgsTestCase.test_use) ... ok | |
| test_verbose (test.test_regrtest.ParseArgsTestCase.test_verbose) ... ok | |
| test_verbose3 (test.test_regrtest.ParseArgsTestCase.test_verbose3) ... ok | |
| test_verbose3_huntrleaks (test.test_regrtest.ParseArgsTestCase.test_verbose3_huntrleaks) ... ok | |
| test_wait (test.test_regrtest.ParseArgsTestCase.test_wait) ... ok | |
| test_module_autotest (test.test_regrtest.ProgramsTestCase.test_module_autotest) ... ok | |
| test_module_from_test_autotest (test.test_regrtest.ProgramsTestCase.test_module_from_test_autotest) ... ok | |
| test_module_regrtest (test.test_regrtest.ProgramsTestCase.test_module_regrtest) ... ok | |
| test_module_test (test.test_regrtest.ProgramsTestCase.test_module_test) ... ok | |
| test_pcbuild_rt (test.test_regrtest.ProgramsTestCase.test_pcbuild_rt) ... skipped 'Windows only' | |
| test_script_autotest (test.test_regrtest.ProgramsTestCase.test_script_autotest) ... ok | |
| test_script_regrtest (test.test_regrtest.ProgramsTestCase.test_script_regrtest) ... ok | |
| test_tools_buildbot_test (test.test_regrtest.ProgramsTestCase.test_tools_buildbot_test) ... skipped 'Windows only' | |
| test_test_result_get_state (test.test_regrtest.TestColorized.test_test_result_get_state) ... ok | |
| test_format_duration (test.test_regrtest.TestUtils.test_format_duration) ... ok | |
| test_format_resources (test.test_regrtest.TestUtils.test_format_resources) ... ok | |
| test_match_test (test.test_regrtest.TestUtils.test_match_test) ... ok | |
| test_normalize_test_name (test.test_regrtest.TestUtils.test_normalize_test_name) ... ok | |
| test_sanitize_xml (test.test_regrtest.TestUtils.test_sanitize_xml) ... ok | |
| ====================================================================== | |
| FAIL: test_crashed (test.test_regrtest.ArgsTestCase.test_crashed) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 1344, in test_crashed | |
| output = self.run_tests("-j2", *tests, exitcode=EXITCODE_BAD_TEST) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 933, in run_tests | |
| return self.run_python(cmdargs, **kw) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 780, in run_python | |
| proc = self.run_command(cmd, **kw) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 768, in run_command | |
| self.fail(msg) | |
| ~~~~~~~~~^^^^^ | |
| AssertionError: Command ['/home/buildbot/cpython/python', '-X', 'faulthandler', '-I', '-m', 'test', '--testdir=/tmp/test_python_ccxgf7hf/tmps_36c4m8', '-j2', 'test_regrtest_crash'] failed with exit code 4, but exit code 2 expected! | |
| stdout: | |
| --- | |
| Using random seed: 316769754 | |
| 0:00:00 load avg: 6.55 Run 1 test in parallel using 1 worker process | |
| 0:00:00 load avg: 6.55 [1/1] test_regrtest_crash ran no tests | |
| == Tests result: NO TESTS RAN == | |
| 1 test run no tests: | |
| test_regrtest_crash | |
| Total duration: 497 ms | |
| Total tests: run=0 | |
| Total test files: run=1/1 run_no_tests=1 | |
| Result: NO TESTS RAN | |
| --- | |
| ====================================================================== | |
| FAIL: test_worker_output_on_failure (test.test_regrtest.ArgsTestCase.test_worker_output_on_failure) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 2245, in test_worker_output_on_failure | |
| output = self.run_tests("-j1", testname, | |
| exitcode=EXITCODE_BAD_TEST, | |
| env=env) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 933, in run_tests | |
| return self.run_python(cmdargs, **kw) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 780, in run_python | |
| proc = self.run_command(cmd, **kw) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 768, in run_command | |
| self.fail(msg) | |
| ~~~~~~~~~^^^^^ | |
| AssertionError: Command ['/home/buildbot/cpython/python', '-X', 'faulthandler', '-I', '-m', 'test', '--testdir=/tmp/test_python_ccxgf7hf/tmpf8lwlfjr', '-j1', 'test_regrtest_noop51'] failed with exit code 0, but exit code 2 expected! | |
| stdout: | |
| --- | |
| Using random seed: 2150737714 | |
| 0:00:00 load avg: 10.65 Run 1 test in parallel using 1 worker process | |
| 0:00:00 load avg: 10.65 [1/1] test_regrtest_noop51 passed | |
| just before crash! | |
| == Tests result: SUCCESS == | |
| 1 test OK. | |
| Total duration: 489 ms | |
| Total tests: run=1 | |
| Total test files: run=1/1 | |
| Result: SUCCESS | |
| --- | |
| ---------------------------------------------------------------------- | |
| Ran 117 tests in 61.400s | |
| FAILED (failures=2, skipped=2) | |
| test test_regrtest failed | |
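Both test_regrtest failures involve a child regrtest run that was expected to report a crashed test (exit code 2, the EXITCODE_BAD_TEST referenced in the tracebacks) but instead finished with NO TESTS RAN or SUCCESS, which looks consistent with the faulthandler behaviour above. A hypothetical sketch (not part of the captured log) for re-running just the two failing suites in isolation:

```python
# Re-run only the two failing suites, verbosely, to reproduce the failures
# outside the full --fast-ci run. Run this with the freshly built ./python.
import subprocess
import sys

cmd = [sys.executable, "-m", "test", "-v", "test_faulthandler", "test_regrtest"]
result = subprocess.run(cmd)
print("regrtest exit code:", result.returncode)
```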
| 0:01:29 load avg: 11.08 [199/492/2] test_pickletools passed -- running (1): test.test_multiprocessing_spawn.test_misc (46.5 sec) | |
| 0:01:30 load avg: 11.08 [200/492/2] test_type_cache passed -- running (1): test.test_multiprocessing_spawn.test_misc (47.0 sec) | |
| 0:01:31 load avg: 12.11 [201/492/2] test_json passed -- running (1): test.test_multiprocessing_spawn.test_misc (48.1 sec) | |
| 0:01:32 load avg: 12.11 [202/492/2] test_tabnanny passed -- running (1): test.test_multiprocessing_spawn.test_misc (49.3 sec) | |
| 0:01:32 load avg: 12.11 [203/492/2] test_index passed -- running (1): test.test_multiprocessing_spawn.test_misc (49.6 sec) | |
| 0:01:33 load avg: 12.11 [204/492/2] test_type_aliases passed -- running (1): test.test_multiprocessing_spawn.test_misc (50.0 sec) | |
| 0:01:33 load avg: 12.11 [205/492/2] test_picklebuffer passed -- running (1): test.test_multiprocessing_spawn.test_misc (50.4 sec) | |
| 0:01:34 load avg: 12.11 [206/492/2] test_yield_from passed -- running (1): test.test_multiprocessing_spawn.test_misc (50.8 sec) | |
| 0:01:34 load avg: 12.11 [207/492/2] test_httpservers passed -- running (2): test.test_multiprocessing_spawn.test_misc (51.5 sec), test_socket (30.3 sec) | |
| 0:01:35 load avg: 11.62 [208/492/2] test_opcache passed -- running (2): test.test_multiprocessing_spawn.test_misc (52.2 sec), test_socket (31.0 sec) | |
| 0:01:38 load avg: 11.62 [209/492/2] test.test_multiprocessing_spawn.test_misc passed (55.2 sec) -- running (1): test_socket (34.0 sec) | |
| 0:01:39 load avg: 11.62 [210/492/2] test_exception_group passed -- running (1): test_socket (35.0 sec) | |
| 0:01:39 load avg: 11.62 [211/492/2] test.test_future_stmt.test_future_single_import passed -- running (1): test_socket (35.3 sec) | |
| 0:01:41 load avg: 11.73 [212/492/2] test_wait3 passed -- running (1): test_socket (36.7 sec) | |
| 0:01:41 load avg: 11.73 [213/492/2] test_multiprocessing_main_handling passed -- running (1): test_socket (37.4 sec) | |
| 0:01:41 load avg: 11.73 [214/492/2] test.test_multiprocessing_fork.test_processes passed -- running (1): test_socket (37.4 sec) | |
| 0:01:41 load avg: 11.73 [215/492/2] test_syslog passed -- running (1): test_socket (37.5 sec) | |
| 0:01:42 load avg: 11.73 [216/492/2] test_genericalias passed -- running (2): test.test_concurrent_futures.test_process_pool (30.1 sec), test_socket (38.0 sec) | |
| 0:01:42 load avg: 11.73 [217/492/2] test_positional_only_arg passed -- running (2): test.test_concurrent_futures.test_process_pool (30.6 sec), test_socket (38.5 sec) | |
| 0:01:43 load avg: 11.73 [218/492/2] test_clinic passed -- running (2): test.test_concurrent_futures.test_process_pool (30.6 sec), test_socket (38.6 sec) | |
| 0:01:43 load avg: 11.73 [219/492/2] test_abstract_numbers passed -- running (2): test.test_concurrent_futures.test_process_pool (31.0 sec), test_socket (38.9 sec) | |
| 0:01:43 load avg: 11.73 [220/492/2] test_compare passed -- running (2): test.test_concurrent_futures.test_process_pool (31.3 sec), test_socket (39.3 sec) | |
| 0:01:44 load avg: 11.73 [221/492/2] test_selectors passed -- running (2): test.test_concurrent_futures.test_process_pool (31.9 sec), test_socket (39.9 sec) | |
| 0:01:44 load avg: 11.73 [222/492/2] test_getopt passed -- running (2): test.test_concurrent_futures.test_process_pool (32.0 sec), test_socket (39.9 sec) | |
| 0:01:44 load avg: 11.73 [223/492/2] test.test_asyncio.test_waitfor passed -- running (2): test.test_concurrent_futures.test_process_pool (32.0 sec), test_socket (40.0 sec) | |
| 0:01:44 load avg: 11.73 [224/492/2] test_numeric_tower passed -- running (2): test.test_concurrent_futures.test_process_pool (32.6 sec), test_socket (40.5 sec) | |
| 0:01:45 load avg: 11.73 [225/492/2] test.test_asyncio.test_windows_utils skipped -- running (2): test.test_concurrent_futures.test_process_pool (33.0 sec), test_socket (41.0 sec) | |
| test.test_asyncio.test_windows_utils skipped -- Windows only | |
| 0:01:45 load avg: 11.35 [226/492/2] test_memoryview passed -- running (2): test.test_concurrent_futures.test_process_pool (33.1 sec), test_socket (41.0 sec) | |
| 0:01:46 load avg: 11.35 [227/492/2] test.test_asyncio.test_pep492 passed -- running (2): test.test_concurrent_futures.test_process_pool (33.7 sec), test_socket (41.6 sec) | |
| 0:01:46 load avg: 11.35 [228/492/2] test_opcodes passed -- running (2): test.test_concurrent_futures.test_process_pool (34.0 sec), test_socket (42.0 sec) | |
| 0:01:47 load avg: 11.35 [229/492/2] test_external_inspection passed -- running (2): test.test_concurrent_futures.test_process_pool (35.6 sec), test_socket (43.5 sec) | |
| 0:01:48 load avg: 11.35 [230/492/2] test_tools passed -- running (2): test.test_concurrent_futures.test_process_pool (36.1 sec), test_socket (44.0 sec) | |
| 0:01:48 load avg: 11.35 [231/492/2] test_longexp passed -- running (2): test.test_concurrent_futures.test_process_pool (36.2 sec), test_socket (44.1 sec) | |
| 0:01:48 load avg: 11.35 [232/492/2] test_peg_generator skipped (resource denied) -- running (2): test.test_concurrent_futures.test_process_pool (36.4 sec), test_socket (44.4 sec) | |
| test_peg_generator skipped -- Use of the 'cpu' resource not enabled | |
| 0:01:49 load avg: 11.35 [233/492/2] test_socket passed (45.0 sec) -- running (1): test.test_concurrent_futures.test_process_pool (37.1 sec) | |
| 0:01:49 load avg: 11.35 [234/492/2] test_email passed -- running (1): test.test_concurrent_futures.test_process_pool (37.3 sec) | |
| 0:01:49 load avg: 11.35 [235/492/2] test.test_concurrent_futures.test_interpreter_pool passed -- running (1): test.test_concurrent_futures.test_process_pool (37.3 sec) | |
| 0:01:50 load avg: 11.35 [236/492/2] test_int_literal passed -- running (1): test.test_concurrent_futures.test_process_pool (37.7 sec) | |
| 0:01:50 load avg: 11.35 [237/492/2] test_frozen passed -- running (1): test.test_concurrent_futures.test_process_pool (37.8 sec) | |
| 0:01:50 load avg: 11.35 [238/492/2] test.test_asyncio.test_base_events passed -- running (1): test.test_concurrent_futures.test_process_pool (37.9 sec) | |
| 0:01:50 load avg: 11.16 [239/492/2] test_stable_abi_ctypes passed -- running (1): test.test_concurrent_futures.test_process_pool (38.2 sec) | |
| 0:01:50 load avg: 11.16 [240/492/2] test_unpack_ex passed -- running (1): test.test_concurrent_futures.test_process_pool (38.3 sec) | |
| 0:01:50 load avg: 11.16 [241/492/2] test_pkg passed -- running (1): test.test_concurrent_futures.test_process_pool (38.3 sec) | |
| 0:01:50 load avg: 11.16 [242/492/2] test.test_concurrent_futures.test_wait passed -- running (1): test.test_concurrent_futures.test_process_pool (38.3 sec) | |
| 0:01:51 load avg: 11.16 [243/492/2] test_iter passed -- running (1): test.test_concurrent_futures.test_process_pool (39.0 sec) | |
| 0:01:51 load avg: 11.16 [244/492/2] test_audit passed -- running (1): test.test_concurrent_futures.test_process_pool (39.0 sec) | |
| 0:01:51 load avg: 11.16 [245/492/2] test_posixpath passed -- running (1): test.test_concurrent_futures.test_process_pool (39.3 sec) | |
| 0:01:51 load avg: 11.16 [246/492/2] test.test_gdb.test_backtrace skipped -- running (1): test.test_concurrent_futures.test_process_pool (39.5 sec) | |
| test.test_gdb.test_backtrace skipped -- Couldn't find gdb program on the path: [Errno 2] No such file or directory: 'gdb' | |
| 0:01:52 load avg: 11.16 [247/492/2] test_itertools passed -- running (1): test.test_concurrent_futures.test_process_pool (39.6 sec) | |
| 0:01:52 load avg: 11.16 [248/492/2] test_glob passed -- running (1): test.test_concurrent_futures.test_process_pool (39.8 sec) | |
| 0:01:52 load avg: 11.16 [249/492/2] test_startfile skipped -- running (1): test.test_concurrent_futures.test_process_pool (40.2 sec) | |
| test_startfile skipped -- object <module 'os' from '/home/buildbot/cpython/Lib/os.py'> has no attribute 'startfile' | |
| 0:01:52 load avg: 11.16 [250/492/2] test_pstats passed -- running (1): test.test_concurrent_futures.test_process_pool (40.3 sec) | |
| 0:01:52 load avg: 11.16 [251/492/2] test_super passed -- running (1): test.test_concurrent_futures.test_process_pool (40.4 sec) | |
| 0:01:53 load avg: 11.16 [252/492/2] test_dtrace passed -- running (1): test.test_concurrent_futures.test_process_pool (40.7 sec) | |
| 0:01:53 load avg: 11.16 [253/492/2] test_py_compile passed -- running (1): test.test_concurrent_futures.test_process_pool (41.2 sec) | |
| 0:01:54 load avg: 11.16 [254/492/2] test_epoll passed -- running (1): test.test_concurrent_futures.test_process_pool (41.9 sec) | |
| 0:01:54 load avg: 11.16 [255/492/2] test_zlib passed -- running (1): test.test_concurrent_futures.test_process_pool (41.9 sec) | |
| 0:01:54 load avg: 11.16 [256/492/2] test_coroutines passed -- running (1): test.test_concurrent_futures.test_process_pool (42.6 sec) | |
| 0:01:55 load avg: 11.16 [257/492/2] test_getpath passed -- running (1): test.test_concurrent_futures.test_process_pool (42.7 sec) | |
| 0:01:57 load avg: 11.47 [258/492/2] test_profile passed -- running (1): test.test_concurrent_futures.test_process_pool (45.1 sec) | |
| 0:01:57 load avg: 11.47 [259/492/2] test_tokenize passed -- running (1): test.test_concurrent_futures.test_process_pool (45.6 sec) | |
| 0:01:58 load avg: 11.47 [260/492/2] test_samply_profiler passed -- running (1): test.test_concurrent_futures.test_process_pool (45.6 sec) | |
| 0:01:58 load avg: 11.47 [261/492/2] test.test_concurrent_futures.test_as_completed passed -- running (1): test.test_concurrent_futures.test_process_pool (45.8 sec) | |
| 0:01:58 load avg: 11.47 [262/492/2] test_openpty passed -- running (1): test.test_concurrent_futures.test_process_pool (46.3 sec) | |
| 0:01:59 load avg: 11.47 [263/492/2] test_richcmp passed -- running (1): test.test_concurrent_futures.test_process_pool (46.9 sec) | |
| 0:01:59 load avg: 11.47 [264/492/2] test_warnings passed -- running (1): test.test_concurrent_futures.test_process_pool (47.3 sec) | |
| 0:01:59 load avg: 11.47 [265/492/2] test_csv passed -- running (1): test.test_concurrent_futures.test_process_pool (47.4 sec) | |
| 0:02:00 load avg: 11.47 [266/492/2] test_type_params passed -- running (1): test.test_concurrent_futures.test_process_pool (47.9 sec) | |
| 0:02:01 load avg: 11.75 [267/492/2] test__locale passed -- running (1): test.test_concurrent_futures.test_process_pool (48.7 sec) | |
| 0:02:01 load avg: 11.75 [268/492/2] test_idle passed -- running (1): test.test_concurrent_futures.test_process_pool (49.4 sec) | |
| 0:02:02 load avg: 11.75 [269/492/2] test__interpreters passed -- running (1): test.test_concurrent_futures.test_process_pool (49.8 sec) | |
| 0:02:02 load avg: 11.75 [270/492/2] test_operator passed -- running (1): test.test_concurrent_futures.test_process_pool (49.9 sec) | |
| 0:02:02 load avg: 11.75 [271/492/2] test_abc passed -- running (1): test.test_concurrent_futures.test_process_pool (50.5 sec) | |
| 0:02:03 load avg: 11.75 [272/492/2] test_pickle passed -- running (1): test.test_concurrent_futures.test_process_pool (50.7 sec) | |
| 0:02:03 load avg: 11.75 [273/492/2] test_dbm_dumb passed -- running (1): test.test_concurrent_futures.test_process_pool (51.0 sec) | |
| 0:02:03 load avg: 11.75 [274/492/2] test_asdl_parser passed -- running (1): test.test_concurrent_futures.test_process_pool (51.6 sec) | |
| 0:02:04 load avg: 11.75 [275/492/2] test_calendar passed -- running (1): test.test_concurrent_futures.test_process_pool (52.3 sec) | |
| 0:02:04 load avg: 11.75 [276/492/2] test_call passed -- running (1): test.test_concurrent_futures.test_process_pool (52.4 sec) | |
| 0:02:05 load avg: 11.75 [277/492/2] test_unicode_file_functions passed -- running (1): test.test_concurrent_futures.test_process_pool (53.0 sec) | |
| 0:02:06 load avg: 11.61 [278/492/2] test_dbm passed -- running (1): test.test_concurrent_futures.test_process_pool (53.7 sec) | |
| 0:02:06 load avg: 11.61 [279/492/2] test_sys passed -- running (1): test.test_concurrent_futures.test_process_pool (53.9 sec) | |
| 0:02:06 load avg: 11.61 [280/492/2] test_dict passed -- running (1): test.test_concurrent_futures.test_process_pool (54.0 sec) | |
| 0:02:06 load avg: 11.61 [281/492/2] test_dbm_gnu skipped -- running (1): test.test_concurrent_futures.test_process_pool (54.1 sec) | |
| test_dbm_gnu skipped -- No module named '_gdbm' | |
| 0:02:06 load avg: 11.61 [282/492/2] test.test_asyncio.test_subprocess passed -- running (1): test.test_concurrent_futures.test_process_pool (54.2 sec) | |
| 0:02:07 load avg: 11.61 [283/492/2] test_code_module passed -- running (1): test.test_concurrent_futures.test_process_pool (54.7 sec) | |
| 0:02:07 load avg: 11.61 [284/492/2] test_enumerate passed -- running (1): test.test_concurrent_futures.test_process_pool (54.8 sec) | |
| 0:02:07 load avg: 11.61 [285/492/2] test.test_asyncio.test_queues passed -- running (1): test.test_concurrent_futures.test_process_pool (54.8 sec) | |
| 0:02:07 load avg: 11.61 [286/492/2] test_raise passed -- running (1): test.test_concurrent_futures.test_process_pool (55.2 sec) | |
| 0:02:08 load avg: 11.61 [287/492/2] test.test_asyncio.test_sslproto passed -- running (1): test.test_concurrent_futures.test_process_pool (55.8 sec) | |
| 0:02:08 load avg: 11.61 [288/492/2] test_lltrace passed -- running (1): test.test_concurrent_futures.test_process_pool (55.9 sec) | |
| 0:02:08 load avg: 11.61 [289/492/2] test_zoneinfo passed -- running (1): test.test_concurrent_futures.test_process_pool (56.5 sec) | |
| 0:02:09 load avg: 11.61 [290/492/2] test.test_asyncio.test_tools passed -- running (1): test.test_concurrent_futures.test_process_pool (57.0 sec) | |
| 0:02:12 load avg: 11.56 [291/492/2] test_urllib2_localnet passed -- running (1): test.test_concurrent_futures.test_process_pool (59.8 sec) | |
| 0:02:12 load avg: 11.56 [292/492/2] test_file passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min) | |
| 0:02:13 load avg: 11.56 [293/492/2] test_minidom passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min) | |
| 0:02:14 load avg: 11.56 [294/492/2] test_types passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min 1 sec) | |
| 0:02:14 load avg: 11.56 [295/492/2] test_profiling passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min 2 sec) | |
| 0:02:15 load avg: 11.56 [296/492/2] test_urllib passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min 2 sec) | |
| 0:02:17 load avg: 11.52 [297/492/3] test_os failed (1 failure) -- running (1): test.test_concurrent_futures.test_process_pool (1 min 4 sec) | |
| test_blocking (test.test_os.BlockingTests.test_blocking) ... ok | |
| test_compare_to_walk (test.test_os.BytesFwalkTests.test_compare_to_walk) ... ok | |
| test_dir_fd (test.test_os.BytesFwalkTests.test_dir_fd) ... ok | |
| test_fd_finalization (test.test_os.BytesFwalkTests.test_fd_finalization) ... ok | |
| test_fd_leak (test.test_os.BytesFwalkTests.test_fd_leak) ... ok | |
| test_file_like_path (test.test_os.BytesFwalkTests.test_file_like_path) ... ok | |
| test_walk_above_recursion_limit (test.test_os.BytesFwalkTests.test_walk_above_recursion_limit) ... ok | |
| test_walk_bad_dir (test.test_os.BytesFwalkTests.test_walk_bad_dir) ... ok | |
| test_walk_bad_dir2 (test.test_os.BytesFwalkTests.test_walk_bad_dir2) ... ok | |
| test_walk_bottom_up (test.test_os.BytesFwalkTests.test_walk_bottom_up) ... ok | |
| test_walk_named_pipe (test.test_os.BytesFwalkTests.test_walk_named_pipe) ... ok | |
| test_walk_named_pipe2 (test.test_os.BytesFwalkTests.test_walk_named_pipe2) ... ok | |
| test_walk_prune (test.test_os.BytesFwalkTests.test_walk_prune) ... ok | |
| test_walk_symlink (test.test_os.BytesFwalkTests.test_walk_symlink) ... ok | |
| test_walk_topdown (test.test_os.BytesFwalkTests.test_walk_topdown) ... ok | |
| test_yields_correct_dir_fd (test.test_os.BytesFwalkTests.test_yields_correct_dir_fd) ... ok | |
| test_file_like_path (test.test_os.BytesWalkTests.test_file_like_path) ... ok | |
| test_walk_above_recursion_limit (test.test_os.BytesWalkTests.test_walk_above_recursion_limit) ... ok | |
| test_walk_bad_dir (test.test_os.BytesWalkTests.test_walk_bad_dir) ... ok | |
| test_walk_bad_dir2 (test.test_os.BytesWalkTests.test_walk_bad_dir2) ... ok | |
| test_walk_bottom_up (test.test_os.BytesWalkTests.test_walk_bottom_up) ... ok | |
| test_walk_many_open_files (test.test_os.BytesWalkTests.test_walk_many_open_files) ... ok | |
| test_walk_named_pipe (test.test_os.BytesWalkTests.test_walk_named_pipe) ... ok | |
| test_walk_named_pipe2 (test.test_os.BytesWalkTests.test_walk_named_pipe2) ... ok | |
| test_walk_prune (test.test_os.BytesWalkTests.test_walk_prune) ... ok | |
| test_walk_symlink (test.test_os.BytesWalkTests.test_walk_symlink) ... ok | |
| test_walk_topdown (test.test_os.BytesWalkTests.test_walk_topdown) ... ok | |
| test_cpu_count (test.test_os.CPUCountTests.test_cpu_count) ... ok | |
| test_process_cpu_count (test.test_os.CPUCountTests.test_process_cpu_count) ... ok | |
| test_process_cpu_count_affinity (test.test_os.CPUCountTests.test_process_cpu_count_affinity) ... ok | |
| test_chown_gid (test.test_os.ChownFileTests.test_chown_gid) ... skipped 'test needs at least 2 groups' | |
| test_chown_uid_gid_arguments_must_be_index (test.test_os.ChownFileTests.test_chown_uid_gid_arguments_must_be_index) ... ok | |
| test_chown_with_root (test.test_os.ChownFileTests.test_chown_with_root) ... skipped 'test needs root privilege and more than one user' | |
| test_chown_without_permission (test.test_os.ChownFileTests.test_chown_without_permission) ... ok | |
| test_devnull (test.test_os.DevNullTests.test_devnull) ... ok | |
| test_bad_fd (test.test_os.DeviceEncodingTests.test_bad_fd) ... ok | |
| test_device_encoding (test.test_os.DeviceEncodingTests.test_device_encoding) ... ok | |
| test___repr__ (test.test_os.EnvironTests.test___repr__) | |
| Check that the repr() of os.environ looks like environ({...}). ... ok | |
| test_bool (test.test_os.EnvironTests.test_bool) ... ok | |
| test_constructor (test.test_os.EnvironTests.test_constructor) ... ok | |
| test_environb (test.test_os.EnvironTests.test_environb) ... ok | |
| test_get (test.test_os.EnvironTests.test_get) ... ok | |
| test_get_exec_path (test.test_os.EnvironTests.test_get_exec_path) ... ok | |
| test_getitem (test.test_os.EnvironTests.test_getitem) ... ok | |
| test_ior_operator (test.test_os.EnvironTests.test_ior_operator) ... ok | |
| test_ior_operator_invalid_dicts (test.test_os.EnvironTests.test_ior_operator_invalid_dicts) ... ok | |
| test_ior_operator_key_value_iterable (test.test_os.EnvironTests.test_ior_operator_key_value_iterable) ... ok | |
| test_items (test.test_os.EnvironTests.test_items) ... ok | |
| test_iter_error_when_changing_os_environ (test.test_os.EnvironTests.test_iter_error_when_changing_os_environ) ... ok | |
| test_iter_error_when_changing_os_environ_items (test.test_os.EnvironTests.test_iter_error_when_changing_os_environ_items) ... ok | |
| test_iter_error_when_changing_os_environ_values (test.test_os.EnvironTests.test_iter_error_when_changing_os_environ_values) ... ok | |
| test_key_type (test.test_os.EnvironTests.test_key_type) ... ok | |
| test_keys (test.test_os.EnvironTests.test_keys) ... ok | |
| test_keyvalue_types (test.test_os.EnvironTests.test_keyvalue_types) ... ok | |
| test_len (test.test_os.EnvironTests.test_len) ... ok | |
| test_or_operator (test.test_os.EnvironTests.test_or_operator) ... ok | |
| test_os_popen_iter (test.test_os.EnvironTests.test_os_popen_iter) ... ok | |
| test_pop (test.test_os.EnvironTests.test_pop) ... ok | |
| test_popitem (test.test_os.EnvironTests.test_popitem) ... ok | |
| test_putenv_unsetenv (test.test_os.EnvironTests.test_putenv_unsetenv) ... ok | |
| test_putenv_unsetenv_error (test.test_os.EnvironTests.test_putenv_unsetenv_error) ... ok | |
| test_read (test.test_os.EnvironTests.test_read) ... ok | |
| test_reload_environ (test.test_os.EnvironTests.test_reload_environ) ... ok | |
| test_ror_operator (test.test_os.EnvironTests.test_ror_operator) ... ok | |
| test_setdefault (test.test_os.EnvironTests.test_setdefault) ... ok | |
| test_update (test.test_os.EnvironTests.test_update) ... ok | |
| test_update2 (test.test_os.EnvironTests.test_update2) ... ok | |
| test_values (test.test_os.EnvironTests.test_values) ... ok | |
| test_write (test.test_os.EnvironTests.test_write) ... ok | |
| test_eventfd_initval (test.test_os.EventfdTests.test_eventfd_initval) ... ok | |
| test_eventfd_select (test.test_os.EventfdTests.test_eventfd_select) ... ok | |
| test_eventfd_semaphore (test.test_os.EventfdTests.test_eventfd_semaphore) ... ok | |
| test_execv_with_bad_arglist (test.test_os.ExecTests.test_execv_with_bad_arglist) ... ok | |
| test_execve_invalid_env (test.test_os.ExecTests.test_execve_invalid_env) ... ok | |
| test_execve_with_empty_path (test.test_os.ExecTests.test_execve_with_empty_path) ... skipped 'Win32-specific test' | |
| test_execvpe_with_bad_arglist (test.test_os.ExecTests.test_execvpe_with_bad_arglist) ... ok | |
| test_execvpe_with_bad_program (test.test_os.ExecTests.test_execvpe_with_bad_program) ... ok | |
| test_internal_execvpe_str (test.test_os.ExecTests.test_internal_execvpe_str) ... ok | |
| test_os_all (test.test_os.ExportsTests.test_os_all) ... ok | |
| test_fds (test.test_os.ExtendedAttributeTests.test_fds) ... ok | |
| test_lpath (test.test_os.ExtendedAttributeTests.test_lpath) ... ok | |
| test_simple (test.test_os.ExtendedAttributeTests.test_simple) ... ok | |
| test_dup (test.test_os.FDInheritanceTests.test_dup) ... ok | |
| test_dup2 (test.test_os.FDInheritanceTests.test_dup2) ... ok | |
| test_dup_nul (test.test_os.FDInheritanceTests.test_dup_nul) ... skipped 'win32-specific test' | |
| test_dup_standard_stream (test.test_os.FDInheritanceTests.test_dup_standard_stream) ... ok | |
| test_get_inheritable_cloexec (test.test_os.FDInheritanceTests.test_get_inheritable_cloexec) ... ok | |
| test_get_set_inheritable (test.test_os.FDInheritanceTests.test_get_set_inheritable) ... ok | |
| test_get_set_inheritable_badf (test.test_os.FDInheritanceTests.test_get_set_inheritable_badf) ... ok | |
| test_get_set_inheritable_o_path (test.test_os.FDInheritanceTests.test_get_set_inheritable_o_path) ... ok | |
| test_open (test.test_os.FDInheritanceTests.test_open) ... ok | |
| test_pipe (test.test_os.FDInheritanceTests.test_pipe) ... ok | |
| test_set_inheritable_cloexec (test.test_os.FDInheritanceTests.test_set_inheritable_cloexec) ... ok | |
| test_identity (test.test_os.FSEncodingTests.test_identity) ... ok | |
| test_nop (test.test_os.FSEncodingTests.test_nop) ... ok | |
| test_access (test.test_os.FileTests.test_access) ... ok | |
| test_closerange (test.test_os.FileTests.test_closerange) ... ok | |
| test_copy_file_range (test.test_os.FileTests.test_copy_file_range) ... ok | |
| test_copy_file_range_invalid_values (test.test_os.FileTests.test_copy_file_range_invalid_values) ... ok | |
| test_copy_file_range_offset (test.test_os.FileTests.test_copy_file_range_offset) ... ok | |
| test_fdopen (test.test_os.FileTests.test_fdopen) ... ok | |
| test_large_read (test.test_os.FileTests.test_large_read) ... skipped 'not enough memory: 2.0G minimum needed' | |
| test_large_readinto (test.test_os.FileTests.test_large_readinto) ... skipped 'not enough memory: 2.0G minimum needed' | |
| test_open_keywords (test.test_os.FileTests.test_open_keywords) ... ok | |
| test_read (test.test_os.FileTests.test_read) ... ok | |
| test_readinto (test.test_os.FileTests.test_readinto) ... ok | |
| test_readinto_badarg (test.test_os.FileTests.test_readinto_badarg) ... ok | |
| test_readinto_non_blocking (test.test_os.FileTests.test_readinto_non_blocking) ... ok | |
| test_rename (test.test_os.FileTests.test_rename) ... ok | |
| test_replace (test.test_os.FileTests.test_replace) ... ok | |
| test_splice (test.test_os.FileTests.test_splice) ... ok | |
| test_splice_invalid_values (test.test_os.FileTests.test_splice_invalid_values) ... ok | |
| test_splice_offset_in (test.test_os.FileTests.test_splice_offset_in) ... ok | |
| test_splice_offset_out (test.test_os.FileTests.test_splice_offset_out) ... ok | |
| test_symlink_keywords (test.test_os.FileTests.test_symlink_keywords) ... ok | |
| test_write (test.test_os.FileTests.test_write) ... ok | |
| test_write_windows_console (test.test_os.FileTests.test_write_windows_console) ... skipped 'test specific to the Windows console' | |
| test_fork (test.test_os.ForkTests.test_fork) ... ok | |
| test_fork_at_finalization (test.test_os.ForkTests.test_fork_at_finalization) ... ok | |
| test_fork_warns_when_non_python_thread_exists (test.test_os.ForkTests.test_fork_warns_when_non_python_thread_exists) ... ok | |
| test_compare_to_walk (test.test_os.FwalkTests.test_compare_to_walk) ... ok | |
| test_dir_fd (test.test_os.FwalkTests.test_dir_fd) ... ok | |
| test_fd_finalization (test.test_os.FwalkTests.test_fd_finalization) ... ok | |
| test_fd_leak (test.test_os.FwalkTests.test_fd_leak) ... ok | |
| test_file_like_path (test.test_os.FwalkTests.test_file_like_path) ... ok | |
| test_walk_above_recursion_limit (test.test_os.FwalkTests.test_walk_above_recursion_limit) ... ok | |
| test_walk_bad_dir (test.test_os.FwalkTests.test_walk_bad_dir) ... ok | |
| test_walk_bad_dir2 (test.test_os.FwalkTests.test_walk_bad_dir2) ... ok | |
| test_walk_bottom_up (test.test_os.FwalkTests.test_walk_bottom_up) ... ok | |
| test_walk_named_pipe (test.test_os.FwalkTests.test_walk_named_pipe) ... ok | |
| test_walk_named_pipe2 (test.test_os.FwalkTests.test_walk_named_pipe2) ... ok | |
| test_walk_prune (test.test_os.FwalkTests.test_walk_prune) ... ok | |
| test_walk_symlink (test.test_os.FwalkTests.test_walk_symlink) ... ok | |
| test_walk_topdown (test.test_os.FwalkTests.test_walk_topdown) ... ok | |
| test_yields_correct_dir_fd (test.test_os.FwalkTests.test_yields_correct_dir_fd) ... ok | |
| test_getrandom0 (test.test_os.GetRandomTests.test_getrandom0) ... ok | |
| test_getrandom_nonblock (test.test_os.GetRandomTests.test_getrandom_nonblock) ... ok | |
| test_getrandom_random (test.test_os.GetRandomTests.test_getrandom_random) ... ok | |
| test_getrandom_type (test.test_os.GetRandomTests.test_getrandom_type) ... ok | |
| test_getrandom_value (test.test_os.GetRandomTests.test_getrandom_value) ... ok | |
| test_link (test.test_os.LinkTests.test_link) ... ok | |
| test_link_bytes (test.test_os.LinkTests.test_link_bytes) ... ok | |
| test_unicode_name (test.test_os.LinkTests.test_unicode_name) ... ok | |
| test_getlogin (test.test_os.LoginTests.test_getlogin) ... skipped 'Skip due to platform/environment differences on *NIX buildbots' | |
| test_exist_ok_existing_directory (test.test_os.MakedirTests.test_exist_ok_existing_directory) ... ok | |
| test_exist_ok_existing_regular_file (test.test_os.MakedirTests.test_exist_ok_existing_regular_file) ... ok | |
| test_exist_ok_s_isgid_directory (test.test_os.MakedirTests.test_exist_ok_s_isgid_directory) ... ok | |
| test_makedir (test.test_os.MakedirTests.test_makedir) ... ok | |
| test_mode (test.test_os.MakedirTests.test_mode) ... ok | |
| test_win32_mkdir_700 (test.test_os.MakedirTests.test_win32_mkdir_700) ... skipped 'requires Windows' | |
| test_memfd_create (test.test_os.MemfdCreateTests.test_memfd_create) ... ok | |
| test_getcwd (test.test_os.MiscTests.test_getcwd) ... ok | |
| test_getcwd_long_path (test.test_os.MiscTests.test_getcwd_long_path) ... Tested current directory length: 2000 | |
| ok | |
| test_getcwdb (test.test_os.MiscTests.test_getcwdb) ... ok | |
| test_directory_link_nonlocal (test.test_os.NonLocalSymlinkTests.test_directory_link_nonlocal) | |
| The symlink target should resolve relative to the link, not relative ... ok | |
| test_oserror_filename (test.test_os.OSErrorTests.test_oserror_filename) ... ok | |
| test_path_t_converter (test.test_os.PathTConverterTests.test_path_t_converter) ... ok | |
| test_path_t_converter_and_custom_class (test.test_os.PathTConverterTests.test_path_t_converter_and_custom_class) ... ok | |
| test_listdir (test.test_os.Pep383Tests.test_listdir) ... ok | |
| test_open (test.test_os.Pep383Tests.test_open) ... ok | |
| test_stat (test.test_os.Pep383Tests.test_stat) ... ok | |
| test_statvfs (test.test_os.Pep383Tests.test_statvfs) ... ok | |
| test_getppid (test.test_os.PidTests.test_getppid) ... ok | |
| test_waitpid (test.test_os.PidTests.test_waitpid) ... ok | |
| test_waitpid_windows (test.test_os.PidTests.test_waitpid_windows) ... skipped 'win32-specific test' | |
| test_waitstatus_to_exitcode (test.test_os.PidTests.test_waitstatus_to_exitcode) ... ok | |
| test_waitstatus_to_exitcode_kill (test.test_os.PidTests.test_waitstatus_to_exitcode_kill) ... ok | |
| test_waitstatus_to_exitcode_windows (test.test_os.PidTests.test_waitstatus_to_exitcode_windows) ... skipped 'win32-specific test' | |
| test_setegid (test.test_os.PosixUidGidTests.test_setegid) ... ok | |
| test_seteuid (test.test_os.PosixUidGidTests.test_seteuid) ... ok | |
| test_setgid (test.test_os.PosixUidGidTests.test_setgid) ... ok | |
| test_setregid (test.test_os.PosixUidGidTests.test_setregid) ... ok | |
| test_setregid_neg1 (test.test_os.PosixUidGidTests.test_setregid_neg1) ... ok | |
| test_setreuid (test.test_os.PosixUidGidTests.test_setreuid) ... ok | |
| test_setreuid_neg1 (test.test_os.PosixUidGidTests.test_setreuid_neg1) ... ok | |
| test_setuid (test.test_os.PosixUidGidTests.test_setuid) ... ok | |
| test_set_get_priority (test.test_os.ProgramPriorityTests.test_set_get_priority) ... ok | |
| test_open_via_ptsname (test.test_os.PseudoterminalTests.test_open_via_ptsname) ... ok | |
| test_openpty (test.test_os.PseudoterminalTests.test_openpty) ... ok | |
| test_pipe_spawnl (test.test_os.PseudoterminalTests.test_pipe_spawnl) ... Traceback (most recent call last): | |
| File "/home/buildbot/cpython/build/test_python_11666æ/@test_11666_tmpæ", line 31, in <module> | |
| raise Exception("dup must fail") | |
| Exception: dup must fail | |
| FAIL | |
| test_posix_pty_functions (test.test_os.PseudoterminalTests.test_posix_pty_functions) ... ok | |
| test_bytes (test.test_os.ReadlinkTests.test_bytes) ... ok | |
| test_missing_link (test.test_os.ReadlinkTests.test_missing_link) ... ok | |
| test_not_symlink (test.test_os.ReadlinkTests.test_not_symlink) ... ok | |
| test_pathlike (test.test_os.ReadlinkTests.test_pathlike) ... ok | |
| test_pathlike_bytes (test.test_os.ReadlinkTests.test_pathlike_bytes) ... ok | |
| test_remove_all (test.test_os.RemoveDirsTests.test_remove_all) ... ok | |
| test_remove_nothing (test.test_os.RemoveDirsTests.test_remove_nothing) ... ok | |
| test_remove_partial (test.test_os.RemoveDirsTests.test_remove_partial) ... ok | |
| test_nowait (test.test_os.SpawnTests.test_nowait) ... ok | |
| test_spawnl (test.test_os.SpawnTests.test_spawnl) ... ok | |
| test_spawnl_noargs (test.test_os.SpawnTests.test_spawnl_noargs) ... ok | |
| test_spawnle (test.test_os.SpawnTests.test_spawnle) ... ok | |
| test_spawnle_noargs (test.test_os.SpawnTests.test_spawnle_noargs) ... ok | |
| test_spawnlp (test.test_os.SpawnTests.test_spawnlp) ... ok | |
| test_spawnlpe (test.test_os.SpawnTests.test_spawnlpe) ... ok | |
| test_spawnv (test.test_os.SpawnTests.test_spawnv) ... ok | |
| test_spawnv_noargs (test.test_os.SpawnTests.test_spawnv_noargs) ... ok | |
| test_spawnve (test.test_os.SpawnTests.test_spawnve) ... ok | |
| test_spawnve_bytes (test.test_os.SpawnTests.test_spawnve_bytes) ... ok | |
| test_spawnve_invalid_env (test.test_os.SpawnTests.test_spawnve_invalid_env) ... ok | |
| test_spawnve_noargs (test.test_os.SpawnTests.test_spawnve_noargs) ... ok | |
| test_spawnvp (test.test_os.SpawnTests.test_spawnvp) ... ok | |
| test_spawnvpe (test.test_os.SpawnTests.test_spawnvpe) ... ok | |
| test_spawnvpe_invalid_env (test.test_os.SpawnTests.test_spawnvpe_invalid_env) ... ok | |
| test_15261 (test.test_os.StatAttributeTests.test_15261) ... skipped 'Win32 specific tests' | |
| test_1686475 (test.test_os.StatAttributeTests.test_1686475) ... skipped 'Win32 specific tests' | |
| test_access_denied (test.test_os.StatAttributeTests.test_access_denied) ... skipped 'Win32 specific tests' | |
| test_file_attributes (test.test_os.StatAttributeTests.test_file_attributes) ... skipped 'st_file_attributes is Win32 specific' | |
| test_stat_attributes (test.test_os.StatAttributeTests.test_stat_attributes) ... ok | |
| test_stat_attributes_bytes (test.test_os.StatAttributeTests.test_stat_attributes_bytes) ... ok | |
| test_stat_block_device (test.test_os.StatAttributeTests.test_stat_block_device) ... skipped 'Win32 specific tests' | |
| test_stat_result_pickle (test.test_os.StatAttributeTests.test_stat_result_pickle) ... ok | |
| test_statvfs_attributes (test.test_os.StatAttributeTests.test_statvfs_attributes) ... ok | |
| test_statvfs_result_pickle (test.test_os.StatAttributeTests.test_statvfs_result_pickle) ... ok | |
| test_does_not_crash (test.test_os.TermsizeTests.test_does_not_crash) | |
| Check if get_terminal_size() returns a meaningful value. ... skipped 'failed to query terminal size' | |
| test_stty_match (test.test_os.TermsizeTests.test_stty_match) | |
| Check if stty returns the same results ... ok | |
| test_windows_fd (test.test_os.TermsizeTests.test_windows_fd) | |
| Check if get_terminal_size() returns a meaningful value in Windows ... skipped 'Windows specific test' | |
| test_uninstantiable (test.test_os.TestDirEntry.test_uninstantiable) ... ok | |
| test_unpickable (test.test_os.TestDirEntry.test_unpickable) ... ok | |
| test_blocking (test.test_os.TestInvalidFD.test_blocking) ... ok | |
| test_closerange (test.test_os.TestInvalidFD.test_closerange) ... ok | |
| test_dup (test.test_os.TestInvalidFD.test_dup) ... ok | |
| test_dup2 (test.test_os.TestInvalidFD.test_dup2) ... ok | |
| test_dup2_negative_fd (test.test_os.TestInvalidFD.test_dup2_negative_fd) ... ok | |
| test_fchdir (test.test_os.TestInvalidFD.test_fchdir) ... ok | |
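(Editor's note, not part of the captured log.) The only FAIL in the test_os output above is test_pipe_spawnl: a spawned child script raised `Exception("dup must fail")`, meaning a `dup()` call the script expected to fail succeeded instead. Assuming the expected failure is the usual one for a closed descriptor, the sketch below only illustrates that baseline behaviour (dup of a closed fd raising `OSError` with `EBADF`); it is not the script generated by the test.

```python
# Illustrative sketch of the behaviour the failing child script appears to
# expect: duplicating a file descriptor that has already been closed should
# raise OSError(EBADF). In the log above, the equivalent dup() succeeded,
# so the child raised Exception("dup must fail").
import errno
import os

r, w = os.pipe()
os.close(w)
os.close(r)
try:
    os.dup(r)  # expected to fail: r was closed above
except OSError as exc:
    assert exc.errno == errno.EBADF
else:
    raise Exception("dup must fail")  # mirrors the message in the log
```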
| 0:02:18 load avg: 11.52 [298/492/3] test_typing passed -- running (1): test.test_concurrent_futures.test_process_pool (1 min 6 sec) | |
| 0:02:20 load avg: 11.52 [299/492/3] test.test_concurrent_futures.test_process_pool passed (1 min 7 sec) | |
| 0:02:20 load avg: 11.52 [300/492/3] test_userstring passed | |
| 0:02:21 load avg: 11.24 [301/492/3] test_socketserver passed | |
| 0:02:21 load avg: 11.24 [302/492/3] test_htmlparser passed | |
| 0:02:21 load avg: 11.24 [303/492/3] test_select passed -- running (1): test.test_multiprocessing_spawn.test_processes (30.1 sec) | |
| 0:02:22 load avg: 11.24 [304/492/3] test_embed passed -- running (1): test.test_multiprocessing_spawn.test_processes (30.2 sec) | |
| 0:02:22 load avg: 11.24 [305/492/3] test_urlparse passed -- running (1): test.test_multiprocessing_spawn.test_processes (30.4 sec) | |
| 0:02:22 load avg: 11.24 [306/492/3] test_html passed -- running (1): test.test_multiprocessing_spawn.test_processes (30.4 sec) | |
| 0:02:22 load avg: 11.24 [307/492/3] test_float passed -- running (1): test.test_multiprocessing_spawn.test_processes (30.9 sec) | |
| 0:02:23 load avg: 11.24 [308/492/3] test_pulldom passed -- running (1): test.test_multiprocessing_spawn.test_processes (31.3 sec) | |
| 0:02:23 load avg: 11.24 [309/492/3] test_ftplib passed -- running (1): test.test_multiprocessing_spawn.test_processes (31.5 sec) | |
| 0:02:24 load avg: 11.24 [310/492/3] test_ordered_dict passed -- running (1): test.test_multiprocessing_spawn.test_processes (32.1 sec) | |
| 0:02:24 load avg: 11.24 [311/492/3] test_atexit passed -- running (1): test.test_multiprocessing_spawn.test_processes (32.3 sec) | |
| 0:02:24 load avg: 11.24 [312/492/3] test_ssl passed -- running (1): test.test_multiprocessing_spawn.test_processes (33.0 sec) | |
| 0:02:25 load avg: 10.74 [313/492/3] test_codecmaps_jp passed -- running (1): test.test_multiprocessing_spawn.test_processes (33.9 sec) | |
| 0:02:25 load avg: 10.74 [314/492/3] test_difflib passed -- running (1): test.test_multiprocessing_spawn.test_processes (34.1 sec) | |
| 0:02:26 load avg: 10.74 [315/492/3] test_wave passed -- running (1): test.test_multiprocessing_spawn.test_processes (34.4 sec) | |
| 0:02:28 load avg: 10.74 [316/492/3] test_datetime passed -- running (1): test.test_multiprocessing_spawn.test_processes (36.3 sec) | |
| 0:02:28 load avg: 10.74 [317/492/3] test_random passed -- running (1): test.test_multiprocessing_spawn.test_processes (36.6 sec) | |
| 0:02:28 load avg: 10.74 [318/492/3] test_context passed -- running (1): test.test_multiprocessing_spawn.test_processes (37.0 sec) | |
| 0:02:28 load avg: 10.74 [319/492/3] test_sort passed -- running (1): test.test_multiprocessing_spawn.test_processes (37.1 sec) | |
| 0:02:29 load avg: 10.74 [320/492/3] test_ttk skipped (resource denied) -- running (1): test.test_multiprocessing_spawn.test_processes (37.4 sec) | |
| test_ttk skipped -- Use of the 'gui' resource not enabled | |
| 0:02:30 load avg: 10.74 [321/492/3] test_poplib passed -- running (1): test.test_multiprocessing_spawn.test_processes (38.5 sec) | |
| 0:02:30 load avg: 10.44 [322/492/3] test_perf_profiler passed -- running (1): test.test_multiprocessing_spawn.test_processes (38.7 sec) | |
| 0:02:31 load avg: 10.44 [323/492/3] test_ntpath passed -- running (1): test.test_multiprocessing_spawn.test_processes (39.4 sec) | |
| 0:02:31 load avg: 10.44 [324/492/3] test_augassign passed -- running (1): test.test_multiprocessing_spawn.test_processes (39.7 sec) | |
| 0:02:32 load avg: 10.44 [325/492/3] test.test_multiprocessing_forkserver.test_manager passed -- running (1): test.test_multiprocessing_spawn.test_processes (40.7 sec) | |
| 0:02:33 load avg: 10.44 [326/492/3] test_bytes passed -- running (1): test.test_multiprocessing_spawn.test_processes (41.3 sec) | |
| 0:02:34 load avg: 10.44 [327/492/3] test_thread passed -- running (1): test.test_multiprocessing_spawn.test_processes (43.0 sec) | |
| 0:02:35 load avg: 10.44 [328/492/3] test_fractions passed -- running (1): test.test_multiprocessing_spawn.test_processes (43.5 sec) | |
| 0:02:35 load avg: 10.44 [329/492/3] test.test_inspect.test_inspect passed -- running (1): test.test_multiprocessing_spawn.test_processes (43.5 sec) | |
| 0:02:35 load avg: 10.24 [330/492/3] test_site passed -- running (1): test.test_multiprocessing_spawn.test_processes (43.6 sec) | |
| 0:02:35 load avg: 10.24 [331/492/3] test_xxlimited passed -- running (1): test.test_multiprocessing_spawn.test_processes (43.9 sec) | |
| 0:02:36 load avg: 10.24 [332/492/3] test_cmd passed -- running (1): test.test_multiprocessing_spawn.test_processes (44.5 sec) | |
| 0:02:36 load avg: 10.24 [333/492/3] test.test_asyncio.test_unix_events passed -- running (1): test.test_multiprocessing_spawn.test_processes (44.8 sec) | |
| 0:02:36 load avg: 10.24 [334/492/3] test_configparser passed -- running (1): test.test_multiprocessing_spawn.test_processes (44.8 sec) | |
| 0:02:37 load avg: 10.24 [335/492/3] test_timeit passed -- running (1): test.test_multiprocessing_spawn.test_processes (45.3 sec) | |
| 0:02:37 load avg: 10.24 [336/492/3] test_monitoring passed -- running (2): test_io (30.0 sec), test.test_multiprocessing_spawn.test_processes (45.3 sec) | |
| 0:02:37 load avg: 10.24 [337/492/3] test_charmapcodec passed -- running (2): test_io (30.4 sec), test.test_multiprocessing_spawn.test_processes (45.7 sec) | |
| 0:02:38 load avg: 10.24 [338/492/3] test__osx_support passed -- running (2): test_io (30.8 sec), test.test_multiprocessing_spawn.test_processes (46.1 sec) | |
| 0:02:38 load avg: 10.24 [339/492/3] test_collections passed -- running (3): test_subprocess (30.0 sec), test_io (31.0 sec), test.test_multiprocessing_spawn.test_processes (46.3 sec) | |
| 0:02:38 load avg: 10.24 [340/492/3] test_devpoll skipped -- running (4): test_subprocess (30.4 sec), test_io (31.4 sec), test.test_multiprocessing_spawn.test_processes (46.7 sec), test_posix (30.2 sec) | |
| test_devpoll skipped -- test works only on Solaris OS family | |
| 0:02:38 load avg: 10.24 [341/492/3] test_wsgiref passed -- running (4): test_subprocess (30.5 sec), test_io (31.4 sec), test.test_multiprocessing_spawn.test_processes (46.7 sec), test_posix (30.3 sec) | |
| 0:02:39 load avg: 10.24 [342/492/3] test_gettext passed -- running (4): test_subprocess (31.2 sec), test_io (32.2 sec), test.test_multiprocessing_spawn.test_processes (47.5 sec), test_posix (31.0 sec) | |
| 0:02:39 load avg: 10.24 [343/492/3] test_dynamicclassattribute passed -- running (4): test_subprocess (31.6 sec), test_io (32.5 sec), test.test_multiprocessing_spawn.test_processes (47.9 sec), test_posix (31.4 sec) | |
| 0:02:40 load avg: 10.24 [344/492/3] test.test_asyncio.test_protocols passed -- running (4): test_subprocess (32.0 sec), test_io (32.9 sec), test.test_multiprocessing_spawn.test_processes (48.2 sec), test_posix (31.8 sec) | |
| 0:02:41 load avg: 9.82 [345/492/3] test_io passed (34.4 sec) -- running (3): test_subprocess (33.5 sec), test.test_multiprocessing_spawn.test_processes (49.7 sec), test_posix (33.3 sec) | |
| 0:02:41 load avg: 9.82 [346/492/3] test.test_gdb.test_cfunction skipped -- running (3): test_subprocess (33.8 sec), test.test_multiprocessing_spawn.test_processes (50.1 sec), test_posix (33.6 sec) | |
| test.test_gdb.test_cfunction skipped -- Couldn't find gdb program on the path: [Errno 2] No such file or directory: 'gdb' | |
| 0:02:43 load avg: 9.82 [347/492/3] test_unicodedata passed -- running (3): test_subprocess (35.4 sec), test.test_multiprocessing_spawn.test_processes (51.7 sec), test_posix (35.3 sec) | |
| 0:02:45 load avg: 9.44 [348/492/3] test_list passed -- running (3): test_subprocess (37.5 sec), test.test_multiprocessing_spawn.test_processes (53.7 sec), test_posix (37.3 sec) | |
| 0:02:45 load avg: 9.44 [349/492/3] test_capi passed -- running (3): test_subprocess (37.8 sec), test.test_multiprocessing_spawn.test_processes (54.0 sec), test_posix (37.6 sec) | |
| 0:02:46 load avg: 9.44 [350/492/3] test_genericclass passed -- running (3): test_subprocess (38.1 sec), test.test_multiprocessing_spawn.test_processes (54.4 sec), test_posix (38.0 sec) | |
| 0:02:46 load avg: 9.44 [351/492/3] test_dictcomps passed -- running (3): test_subprocess (38.5 sec), test.test_multiprocessing_spawn.test_processes (54.8 sec), test_posix (38.3 sec) | |
| 0:02:47 load avg: 9.44 [352/492/3] test_modulefinder passed -- running (3): test_subprocess (38.9 sec), test.test_multiprocessing_spawn.test_processes (55.2 sec), test_posix (38.7 sec) | |
| 0:02:48 load avg: 9.44 [353/492/3] test_pdb passed -- running (3): test_subprocess (40.6 sec), test.test_multiprocessing_spawn.test_processes (56.9 sec), test_posix (40.4 sec) | |
| 0:02:49 load avg: 9.44 [354/492/3] test.test_asyncio.test_futures passed -- running (3): test_subprocess (40.9 sec), test.test_multiprocessing_spawn.test_processes (57.2 sec), test_posix (40.7 sec) | |
| 0:02:49 load avg: 9.44 [355/492/3] test_genexps passed -- running (3): test_subprocess (41.0 sec), test.test_multiprocessing_spawn.test_processes (57.3 sec), test_posix (40.8 sec) | |
| 0:02:49 load avg: 9.44 [356/492/3] test_generator_stop passed -- running (3): test_subprocess (41.2 sec), test.test_multiprocessing_spawn.test_processes (57.5 sec), test_posix (41.0 sec) | |
| 0:02:49 load avg: 9.44 [357/492/3] test_timeout passed -- running (3): test_subprocess (41.5 sec), test.test_multiprocessing_spawn.test_processes (57.8 sec), test_posix (41.4 sec) | |
| 0:02:50 load avg: 9.44 [358/492/3] test_zipfile passed -- running (3): test_subprocess (42.1 sec), test.test_multiprocessing_spawn.test_processes (58.4 sec), test_posix (41.9 sec) | |
| 0:02:50 load avg: 9.08 [359/492/3] test_dataclasses passed -- running (3): test_subprocess (42.4 sec), test.test_multiprocessing_spawn.test_processes (58.7 sec), test_posix (42.2 sec) | |
| 0:02:50 load avg: 9.08 [360/492/3] test_iterlen passed -- running (3): test_subprocess (42.7 sec), test.test_multiprocessing_spawn.test_processes (59.0 sec), test_posix (42.5 sec) | |
| 0:02:50 load avg: 9.08 [361/492/3] test_smtplib passed -- running (3): test_subprocess (42.7 sec), test.test_multiprocessing_spawn.test_processes (59.0 sec), test_posix (42.6 sec) | |
| 0:02:51 load avg: 9.08 [362/492/3] test_deque passed -- running (3): test_subprocess (43.6 sec), test.test_multiprocessing_spawn.test_processes (59.9 sec), test_posix (43.4 sec) | |
| 0:02:51 load avg: 9.08 [363/492/3] test_fcntl passed -- running (3): test_subprocess (43.7 sec), test.test_multiprocessing_spawn.test_processes (60.0 sec), test_posix (43.5 sec) | |
| 0:02:52 load avg: 9.08 [364/492/3] test_fork1 passed -- running (3): test_subprocess (43.9 sec), test.test_multiprocessing_spawn.test_processes (1 min), test_posix (43.7 sec) | |
| 0:02:52 load avg: 9.08 [365/492/3] test.test_asyncio.test_streams passed -- running (3): test_subprocess (44.0 sec), test.test_multiprocessing_spawn.test_processes (1 min), test_posix (43.8 sec) | |
| 0:02:52 load avg: 9.08 [366/492/3] test_cext passed -- running (3): test_subprocess (44.3 sec), test.test_multiprocessing_spawn.test_processes (1 min), test_posix (44.2 sec) | |
| 0:02:53 load avg: 9.08 [367/492/3] test_shelve passed -- running (3): test_subprocess (45.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 1 sec), test_posix (44.9 sec) | |
| 0:02:53 load avg: 9.08 [368/492/3] test_xmlrpc passed -- running (3): test_subprocess (45.4 sec), test.test_multiprocessing_spawn.test_processes (1 min 1 sec), test_posix (45.2 sec) | |
| 0:02:53 load avg: 9.08 [369/492/3] test_codecs passed -- running (3): test_subprocess (45.7 sec), test.test_multiprocessing_spawn.test_processes (1 min 2 sec), test_posix (45.6 sec) | |
| 0:02:53 load avg: 9.08 [370/492/3] test_launcher skipped -- running (3): test_subprocess (45.7 sec), test.test_multiprocessing_spawn.test_processes (1 min 2 sec), test_posix (45.6 sec) | |
| test_launcher skipped -- test only applies to Windows | |
| 0:02:54 load avg: 9.08 [371/492/3] test_android skipped -- running (3): test_subprocess (46.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 2 sec), test_posix (45.9 sec) | |
| test_android skipped -- Android-specific | |
| 0:02:54 load avg: 9.08 [372/492/3] test_struct passed -- running (3): test_subprocess (46.7 sec), test.test_multiprocessing_spawn.test_processes (1 min 3 sec), test_posix (46.6 sec) | |
| 0:02:55 load avg: 9.08 [373/492/3] test.test_asyncio.test_free_threading passed -- running (3): test_subprocess (47.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 3 sec), test_posix (46.9 sec) | |
| 0:02:55 load avg: 8.91 [374/492/3] test_http_cookies passed -- running (3): test_subprocess (47.6 sec), test.test_multiprocessing_spawn.test_processes (1 min 3 sec), test_posix (47.4 sec) | |
| 0:02:56 load avg: 8.91 [375/492/3] test_scope passed -- running (3): test_subprocess (47.9 sec), test.test_multiprocessing_spawn.test_processes (1 min 4 sec), test_posix (47.8 sec) | |
| 0:02:56 load avg: 8.91 [376/492/3] test_fileinput passed -- running (3): test_subprocess (48.4 sec), test.test_multiprocessing_spawn.test_processes (1 min 4 sec), test_posix (48.3 sec) | |
| 0:02:57 load avg: 8.91 [377/492/3] test_copyreg passed -- running (3): test_subprocess (48.9 sec), test.test_multiprocessing_spawn.test_processes (1 min 5 sec), test_posix (48.7 sec) | |
| 0:02:57 load avg: 8.91 [378/492/3] test_docxmlrpc passed -- running (3): test_subprocess (49.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 5 sec), test_posix (48.9 sec) | |
| 0:02:57 load avg: 8.91 [379/492/3] test_decimal passed -- running (3): test_subprocess (49.2 sec), test.test_multiprocessing_spawn.test_processes (1 min 5 sec), test_posix (49.0 sec) | |
| 0:02:57 load avg: 8.91 [380/492/3] test.test_asyncio.test_transports passed -- running (3): test_subprocess (49.7 sec), test.test_multiprocessing_spawn.test_processes (1 min 5 sec), test_posix (49.5 sec) | |
| 0:02:57 load avg: 8.91 [381/492/3] test_generated_cases passed -- running (3): test_subprocess (49.8 sec), test.test_multiprocessing_spawn.test_processes (1 min 6 sec), test_posix (49.6 sec) | |
| 0:02:58 load avg: 8.91 [382/492/3] test_named_expressions passed -- running (3): test_subprocess (50.3 sec), test.test_multiprocessing_spawn.test_processes (1 min 6 sec), test_posix (50.1 sec) | |
| 0:02:59 load avg: 8.91 [383/492/3] test_exceptions passed -- running (3): test_subprocess (50.9 sec), test.test_multiprocessing_spawn.test_processes (1 min 7 sec), test_posix (50.7 sec) | |
| 0:02:59 load avg: 8.91 [384/492/3] test_repl passed -- running (3): test_subprocess (51.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 7 sec), test_posix (50.9 sec) | |
| 0:02:59 load avg: 8.91 [385/492/3] test_eof passed -- running (3): test_subprocess (51.2 sec), test.test_multiprocessing_spawn.test_processes (1 min 7 sec), test_posix (51.1 sec) | |
| 0:02:59 load avg: 8.91 [386/492/3] test.test_asyncio.test_buffered_proto passed -- running (3): test_subprocess (51.3 sec), test.test_multiprocessing_spawn.test_processes (1 min 7 sec), test_posix (51.1 sec) | |
| 0:02:59 load avg: 8.91 [387/492/3] test_pty passed -- running (3): test_subprocess (51.8 sec), test.test_multiprocessing_spawn.test_processes (1 min 8 sec), test_posix (51.6 sec) | |
| 0:03:00 load avg: 8.91 [388/492/3] test.test_asyncio.test_graph passed -- running (3): test_subprocess (52.2 sec), test.test_multiprocessing_spawn.test_processes (1 min 8 sec), test_posix (52.0 sec) | |
| 0:03:00 load avg: 9.00 [389/492/3] test_xxtestfuzz passed -- running (3): test_subprocess (52.6 sec), test.test_multiprocessing_spawn.test_processes (1 min 8 sec), test_posix (52.4 sec) | |
| 0:03:01 load avg: 9.00 [390/492/3] test_free_threading skipped -- running (3): test_subprocess (52.9 sec), test.test_multiprocessing_spawn.test_processes (1 min 9 sec), test_posix (52.8 sec) | |
| test_free_threading skipped -- GIL enabled | |
| 0:03:02 load avg: 9.00 [391/492/3] test_frame passed -- running (3): test_subprocess (54.2 sec), test.test_multiprocessing_spawn.test_processes (1 min 10 sec), test_posix (54.0 sec) | |
| 0:03:02 load avg: 9.00 [392/492/3] test_except_star passed -- running (3): test_subprocess (54.6 sec), test.test_multiprocessing_spawn.test_processes (1 min 10 sec), test_posix (54.4 sec) | |
| 0:03:04 load avg: 9.00 [393/492/3] test_genericpath passed -- running (3): test_subprocess (56.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 12 sec), test_posix (55.9 sec) | |
| 0:03:05 load avg: 9.00 [394/492/3] test.test_asyncio.test_eager_task_factory passed -- running (3): test_subprocess (57.0 sec), test.test_multiprocessing_spawn.test_processes (1 min 13 sec), test_posix (56.8 sec) | |
| 0:03:05 load avg: 9.00 [395/492/3] test_funcattrs passed -- running (3): test_subprocess (57.0 sec), test.test_multiprocessing_spawn.test_processes (1 min 13 sec), test_posix (56.8 sec) | |
| 0:03:05 load avg: 9.00 [396/492/3] test.test_asyncio.test_taskgroups passed -- running (3): test_subprocess (57.1 sec), test.test_multiprocessing_spawn.test_processes (1 min 13 sec), test_posix (56.9 sec) | |
| 0:03:05 load avg: 9.00 [397/492/3] test_tomllib passed -- running (3): test_subprocess (57.4 sec), test.test_multiprocessing_spawn.test_processes (1 min 13 sec), test_posix (57.2 sec) | |
| 0:03:06 load avg: 9.00 [398/492/3] test_threading_local passed -- running (3): test_subprocess (58.2 sec), test.test_multiprocessing_spawn.test_processes (1 min 14 sec), test_posix (58.0 sec) | |
| 0:03:07 load avg: 9.00 [399/492/3] test_buffer passed -- running (4): test.test_concurrent_futures.test_deadlock (31.3 sec), test_subprocess (59.5 sec), test.test_multiprocessing_spawn.test_processes (1 min 15 sec), test_posix (59.3 sec) | |
| 0:03:08 load avg: 9.00 [400/492/3] test_zipimport passed -- running (4): test.test_concurrent_futures.test_deadlock (32.1 sec), test_subprocess (1 min), test.test_multiprocessing_spawn.test_processes (1 min 16 sec), test_posix (1 min) | |
| 0:03:09 load avg: 9.00 [401/492/3] test_heapq passed -- running (4): test.test_concurrent_futures.test_deadlock (33.4 sec), test_subprocess (1 min 1 sec), test.test_multiprocessing_spawn.test_processes (1 min 17 sec), test_posix (1 min 1 sec) | |
| 0:03:10 load avg: 9.00 [402/492/3] test_typechecks passed -- running (4): test.test_concurrent_futures.test_deadlock (33.8 sec), test_subprocess (1 min 2 sec), test.test_multiprocessing_spawn.test_processes (1 min 18 sec), test_posix (1 min 1 sec) | |
| 0:03:10 load avg: 8.92 [403/492/3] test_compile passed -- running (4): test.test_concurrent_futures.test_deadlock (34.2 sec), test_subprocess (1 min 2 sec), test.test_multiprocessing_spawn.test_processes (1 min 18 sec), test_posix (1 min 2 sec) | |
| 0:03:11 load avg: 8.92 [404/492/3] test_bisect passed -- running (4): test.test_concurrent_futures.test_deadlock (34.7 sec), test_subprocess (1 min 2 sec), test.test_multiprocessing_spawn.test_processes (1 min 19 sec), test_posix (1 min 2 sec) | |
| 0:03:11 load avg: 8.92 [405/492/3] test_pow passed -- running (4): test.test_concurrent_futures.test_deadlock (35.3 sec), test_subprocess (1 min 3 sec), test.test_multiprocessing_spawn.test_processes (1 min 19 sec), test_posix (1 min 3 sec) | |
| 0:03:11 load avg: 8.92 [406/492/3] test_string passed -- running (4): test.test_concurrent_futures.test_deadlock (35.4 sec), test_subprocess (1 min 3 sec), test.test_multiprocessing_spawn.test_processes (1 min 19 sec), test_posix (1 min 3 sec) | |
| 0:03:12 load avg: 8.92 [407/492/3] test_thread_local_bytecode passed -- running (4): test.test_concurrent_futures.test_deadlock (35.8 sec), test_subprocess (1 min 4 sec), test.test_multiprocessing_spawn.test_processes (1 min 20 sec), test_posix (1 min 3 sec) | |
| 0:03:12 load avg: 8.92 [408/492/3] test_import passed -- running (4): test.test_concurrent_futures.test_deadlock (36.2 sec), test_subprocess (1 min 4 sec), test.test_multiprocessing_spawn.test_processes (1 min 20 sec), test_posix (1 min 4 sec) | |
| 0:03:12 load avg: 8.92 [409/492/3] test_codecencodings_jp passed -- running (4): test.test_concurrent_futures.test_deadlock (36.3 sec), test_subprocess (1 min 4 sec), test.test_multiprocessing_spawn.test_processes (1 min 20 sec), test_posix (1 min 4 sec) | |
| 0:03:12 load avg: 8.92 [410/492/3] test_cmd_line_script passed -- running (4): test.test_concurrent_futures.test_deadlock (36.5 sec), test_subprocess (1 min 4 sec), test.test_multiprocessing_spawn.test_processes (1 min 21 sec), test_posix (1 min 4 sec) | |
| 0:03:13 load avg: 8.92 [411/492/3] test_userdict passed -- running (4): test.test_concurrent_futures.test_deadlock (36.9 sec), test_subprocess (1 min 5 sec), test.test_multiprocessing_spawn.test_processes (1 min 21 sec), test_posix (1 min 4 sec) | |
| 0:03:13 load avg: 8.92 [412/492/3] test_code passed -- running (4): test.test_concurrent_futures.test_deadlock (37.0 sec), test_subprocess (1 min 5 sec), test.test_multiprocessing_spawn.test_processes (1 min 21 sec), test_posix (1 min 5 sec) | |
| 0:03:13 load avg: 8.92 [413/492/3] test_format passed -- running (4): test.test_concurrent_futures.test_deadlock (37.3 sec), test_subprocess (1 min 5 sec), test.test_multiprocessing_spawn.test_processes (1 min 21 sec), test_posix (1 min 5 sec) | |
| 0:03:14 load avg: 8.92 [414/492/3] test_curses passed -- running (4): test.test_concurrent_futures.test_deadlock (37.7 sec), test_subprocess (1 min 5 sec), test.test_multiprocessing_spawn.test_processes (1 min 22 sec), test_posix (1 min 5 sec) | |
| 0:03:14 load avg: 8.92 [415/492/3] test_class passed -- running (4): test.test_concurrent_futures.test_deadlock (38.1 sec), test_subprocess (1 min 6 sec), test.test_multiprocessing_spawn.test_processes (1 min 22 sec), test_posix (1 min 6 sec) | |
| 0:03:14 load avg: 8.92 [416/492/3] test_strftime passed -- running (4): test.test_concurrent_futures.test_deadlock (38.3 sec), test_subprocess (1 min 6 sec), test.test_multiprocessing_spawn.test_processes (1 min 22 sec), test_posix (1 min 6 sec) | |
| 0:03:15 load avg: 8.92 [417/492/3] test_keywordonlyarg passed -- running (4): test.test_concurrent_futures.test_deadlock (38.7 sec), test_subprocess (1 min 6 sec), test.test_multiprocessing_spawn.test_processes (1 min 23 sec), test_posix (1 min 6 sec) | |
| 0:03:15 load avg: 8.92 [418/492/3] test.test_asyncio.test_sock_lowlevel passed -- running (4): test.test_concurrent_futures.test_deadlock (39.0 sec), test_subprocess (1 min 7 sec), test.test_multiprocessing_spawn.test_processes (1 min 23 sec), test_posix (1 min 7 sec) | |
| 0:03:15 load avg: 8.93 [419/492/3] test_source_encoding passed -- running (4): test.test_concurrent_futures.test_deadlock (39.5 sec), test_subprocess (1 min 7 sec), test.test_multiprocessing_spawn.test_processes (1 min 23 sec), test_posix (1 min 7 sec) | |
| 0:03:15 load avg: 8.93 [420/492/3] test_array passed -- running (4): test.test_concurrent_futures.test_deadlock (39.6 sec), test_subprocess (1 min 7 sec), test.test_multiprocessing_spawn.test_processes (1 min 24 sec), test_posix (1 min 7 sec) | |
| 0:03:16 load avg: 8.93 [421/492/3] test_plistlib passed -- running (4): test.test_concurrent_futures.test_deadlock (39.7 sec), test_subprocess (1 min 7 sec), test.test_multiprocessing_spawn.test_processes (1 min 24 sec), test_posix (1 min 7 sec) | |
| 0:03:16 load avg: 8.93 [422/492/4] test.test_concurrent_futures.test_deadlock failed (3 errors, 12 failures) (39.9 sec) -- running (3): test_subprocess (1 min 8 sec), test.test_multiprocessing_spawn.test_processes (1 min 24 sec), test_posix (1 min 7 sec) | |
| test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_at_task_unpickle) ... FAIL | |
| 0.10s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=0, dangling=1 | |
| Warning -- after: thread count=2, dangling=3 | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-1, started 140737365382912)> | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737287026432)> | |
| test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_big_data) ... FAIL | |
| 0.11s test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_during_func_exec_on_worker) ... FAIL | |
| 0.03s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=2, dangling=3 | |
| Warning -- after: thread count=4, dangling=5 | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-1, started 140737365382912)> | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737287026432)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-3, started 140737274435328)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737266042624)> | |
| test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) ... ERROR | |
| 0.05s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=4, dangling=5 | |
| Warning -- after: thread count=6, dangling=7 | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-1, started 140737365382912)> | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737287026432)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-3, started 140737274435328)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737266042624)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737249257216)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-4, started 140737257649920)> | |
| test_error_at_task_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_error_at_task_pickle) ... 0.02s ok | |
| test_error_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_error_at_task_unpickle) ... 0.02s ok | |
| test_error_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_error_during_func_exec_on_worker) ... 0.02s ok | |
| test_error_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_error_during_result_pickle_on_worker) ... 0.02s ok | |
| test_error_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_error_during_result_unpickle_in_result_handler) ... 0.02s ok | |
| test_exit_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_exit_at_task_unpickle) ... 0.02s ok | |
| test_exit_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_exit_during_func_exec_on_worker) ... 0.02s ok | |
| test_exit_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_exit_during_result_pickle_on_worker) ... 0.02s ok | |
| test_exit_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_exit_during_result_unpickle_in_result_handler) ... 0.02s ok | |
| test_gh105829_should_not_deadlock_if_wakeup_pipe_full (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_gh105829_should_not_deadlock_if_wakeup_pipe_full) ... 3.06s ok | |
| test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_shutdown_deadlock) ... FAIL | |
| 0.12s test_shutdown_deadlock_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_shutdown_deadlock_pickle) ... 0.02s ok | |
| test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_at_task_unpickle) ... FAIL | |
| 0.43s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=0, dangling=1 | |
| Warning -- after: thread count=2, dangling=3 | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737232471808)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-17, started 140737240864512)> | |
| test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_big_data) ... FAIL | |
| 0.33s test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_during_func_exec_on_worker) ... FAIL | |
| 0.27s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=2, dangling=3 | |
| Warning -- after: thread count=4, dangling=5 | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737232471808)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-17, started 140737240864512)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737257649920)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-19, started 140737365382912)> | |
| test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) ... ERROR | |
| 0.29s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=4, dangling=5 | |
| Warning -- after: thread count=6, dangling=7 | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737232471808)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-20, started 140737287026432)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737278633728)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-17, started 140737240864512)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737257649920)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-19, started 140737365382912)> | |
| test_error_at_task_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_error_at_task_pickle) ... 0.22s ok | |
| test_error_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_error_at_task_unpickle) ... 0.26s ok | |
| test_error_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_error_during_func_exec_on_worker) ... 0.26s ok | |
| test_error_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_error_during_result_pickle_on_worker) ... 0.26s ok | |
| test_error_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_error_during_result_unpickle_in_result_handler) ... 0.26s ok | |
| test_exit_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_exit_at_task_unpickle) ... 0.28s ok | |
| test_exit_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_exit_during_func_exec_on_worker) ... 0.28s ok | |
| test_exit_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_exit_during_result_pickle_on_worker) ... 0.27s ok | |
| test_exit_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_exit_during_result_unpickle_in_result_handler) ... 0.26s ok | |
| test_gh105829_should_not_deadlock_if_wakeup_pipe_full (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_gh105829_should_not_deadlock_if_wakeup_pipe_full) ... 3.23s ok | |
| test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_shutdown_deadlock) ... FAIL | |
| 0.36s test_shutdown_deadlock_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_shutdown_deadlock_pickle) ... 0.27s ok | |
| test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_at_task_unpickle) ... FAIL | |
| 0.67s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=0, dangling=1 | |
| Warning -- after: thread count=2, dangling=3 | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737249257216)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-33, started 140737270241024)> | |
| test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_big_data) ... FAIL | |
| 0.87s test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_during_func_exec_on_worker) ... FAIL | |
| 0.70s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=2, dangling=3 | |
| Warning -- after: thread count=4, dangling=5 | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737249257216)> | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-33, started 140737270241024)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737365382912)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-35, started 140737287026432)> | |
| test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) ... ERROR | |
| 0.65s Warning -- threading_cleanup() failed to clean up threads in 1.0 seconds | |
| Warning -- before: thread count=4, dangling=5 | |
| Warning -- after: thread count=6, dangling=7 | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737249257216)> | |
| Warning -- Dangling thread: <_MainThread(MainThread, started 140737463471104)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-33, started 140737270241024)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737365382912)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-35, started 140737287026432)> | |
| Warning -- Dangling thread: <Thread(QueueFeederThread, started daemon 140737261848320)> | |
| Warning -- Dangling thread: <_ExecutorManagerThread(Thread-36, started 140737278633728)> | |
| test_error_at_task_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_error_at_task_pickle) ... 1.43s ok | |
| test_error_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_error_at_task_unpickle) ... 1.62s ok | |
| test_error_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_error_during_func_exec_on_worker) ... 0.72s ok | |
| test_error_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_error_during_result_pickle_on_worker) ... 0.77s ok | |
| test_error_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_error_during_result_unpickle_in_result_handler) ... 0.77s ok | |
| test_exit_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_exit_at_task_unpickle) ... 0.87s ok | |
| test_exit_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_exit_during_func_exec_on_worker) ... 0.79s ok | |
| test_exit_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_exit_during_result_pickle_on_worker) ... 0.76s ok | |
| test_exit_during_result_unpickle_in_result_handler (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_exit_during_result_unpickle_in_result_handler) ... 0.69s ok | |
| test_gh105829_should_not_deadlock_if_wakeup_pipe_full (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_gh105829_should_not_deadlock_if_wakeup_pipe_full) ... 3.73s ok | |
| test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_shutdown_deadlock) ... FAIL | |
| 0.87s test_shutdown_deadlock_pickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_shutdown_deadlock_pickle) ... 0.76s ok | |
| ====================================================================== | |
| ERROR: test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) | |
| ---------------------------------------------------------------------- | |
| concurrent.futures.process._RemoteTraceback: | |
| """ | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/process.py", line 210, in _sendback_result | |
| result_queue.put(_ResultItem(work_id, result=result, | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| exception=exception, exit_pid=exit_pid)) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/queues.py", line 391, in put | |
| obj = _ForkingPickler.dumps(obj) | |
| File "/home/buildbot/cpython/Lib/multiprocessing/reduction.py", line 51, in dumps | |
| cls(buf, protocol).dump(obj) | |
| ~~~~~~~~~~~~~~~~~~~~~~~^^^^^ | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| """ | |
| The above exception was the direct cause of the following exception: | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 177, in test_crash_during_result_pickle_on_worker | |
| self._check_error(BrokenProcessPool, _return_instance, CrashAtPickle) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 132, in _check_error | |
| res.result(timeout=self.TIMEOUT) | |
| ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 450, in result | |
| return self.__get_result() | |
| ~~~~~~~~~~~~~~~~~^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 395, in __get_result | |
| raise self._exception | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| ====================================================================== | |
| ERROR: test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) | |
| ---------------------------------------------------------------------- | |
| concurrent.futures.process._RemoteTraceback: | |
| """ | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/process.py", line 210, in _sendback_result | |
| result_queue.put(_ResultItem(work_id, result=result, | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| exception=exception, exit_pid=exit_pid)) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/queues.py", line 391, in put | |
| obj = _ForkingPickler.dumps(obj) | |
| File "/home/buildbot/cpython/Lib/multiprocessing/reduction.py", line 51, in dumps | |
| cls(buf, protocol).dump(obj) | |
| ~~~~~~~~~~~~~~~~~~~~~~~^^^^^ | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| """ | |
| The above exception was the direct cause of the following exception: | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 177, in test_crash_during_result_pickle_on_worker | |
| self._check_error(BrokenProcessPool, _return_instance, CrashAtPickle) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 132, in _check_error | |
| res.result(timeout=self.TIMEOUT) | |
| ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 450, in result | |
| return self.__get_result() | |
| ~~~~~~~~~~~~~~~~~^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 395, in __get_result | |
| raise self._exception | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| ====================================================================== | |
| ERROR: test_crash_during_result_pickle_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_during_result_pickle_on_worker) | |
| ---------------------------------------------------------------------- | |
| concurrent.futures.process._RemoteTraceback: | |
| """ | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/process.py", line 210, in _sendback_result | |
| result_queue.put(_ResultItem(work_id, result=result, | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| exception=exception, exit_pid=exit_pid)) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/queues.py", line 391, in put | |
| obj = _ForkingPickler.dumps(obj) | |
| File "/home/buildbot/cpython/Lib/multiprocessing/reduction.py", line 51, in dumps | |
| cls(buf, protocol).dump(obj) | |
| ~~~~~~~~~~~~~~~~~~~~~~~^^^^^ | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| """ | |
| The above exception was the direct cause of the following exception: | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 177, in test_crash_during_result_pickle_on_worker | |
| self._check_error(BrokenProcessPool, _return_instance, CrashAtPickle) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 132, in _check_error | |
| res.result(timeout=self.TIMEOUT) | |
| ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 450, in result | |
| return self.__get_result() | |
| ~~~~~~~~~~~~~~~~~^^ | |
| File "/home/buildbot/cpython/Lib/concurrent/futures/_base.py", line 395, in __get_result | |
| raise self._exception | |
| _pickle.PicklingError: __reduce__ must return a string or tuple, not NoneType | |
| when serializing test.test_concurrent_futures.test_deadlock.CrashAtPickle object | |
| when serializing dict item 'result' | |
| when serializing concurrent.futures.process._ResultItem state | |
| when serializing concurrent.futures.process._ResultItem object | |
| ====================================================================== | |
| FAIL: test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_at_task_unpickle) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 158, in test_crash_at_task_unpickle | |
| self._check_error(BrokenProcessPool, id, CrashAtUnpickle()) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_big_data) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 257, in test_crash_big_data | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_crash_during_func_exec_on_worker) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 163, in test_crash_during_func_exec_on_worker | |
| self._check_error(BrokenProcessPool, _crash) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolForkExecutorDeadlockTest.test_shutdown_deadlock) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 215, in test_shutdown_deadlock | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_at_task_unpickle) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 158, in test_crash_at_task_unpickle | |
| self._check_error(BrokenProcessPool, id, CrashAtUnpickle()) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_big_data) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 257, in test_crash_big_data | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_crash_during_func_exec_on_worker) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 163, in test_crash_during_func_exec_on_worker | |
| self._check_error(BrokenProcessPool, _crash) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolForkserverExecutorDeadlockTest.test_shutdown_deadlock) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 215, in test_shutdown_deadlock | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_at_task_unpickle (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_at_task_unpickle) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 158, in test_crash_at_task_unpickle | |
| self._check_error(BrokenProcessPool, id, CrashAtUnpickle()) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_big_data (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_big_data) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 257, in test_crash_big_data | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_crash_during_func_exec_on_worker (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_crash_during_func_exec_on_worker) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 163, in test_crash_during_func_exec_on_worker | |
| self._check_error(BrokenProcessPool, _crash) | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 130, in _check_error | |
| with self.assertRaises(error): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ====================================================================== | |
| FAIL: test_shutdown_deadlock (test.test_concurrent_futures.test_deadlock.ProcessPoolSpawnExecutorDeadlockTest.test_shutdown_deadlock) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_concurrent_futures/test_deadlock.py", line 215, in test_shutdown_deadlock | |
| with self.assertRaises(BrokenProcessPool): | |
| ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: BrokenProcessPool not raised | |
| ---------------------------------------------------------------------- | |
| Ran 48 tests in 39.476s | |
| FAILED (failures=12, errors=3) | |
| test test.test_concurrent_futures.test_deadlock failed | |
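| Note on the failures above: the three ERRORs show the worker's return value failing to pickle ("__reduce__ must return a string or tuple, not NoneType"), while all twelve FAILs assert that BrokenProcessPool was never raised, i.e. the pool never registered a dead worker. Below is a minimal sketch of both behaviours using only stock concurrent.futures; the names Unpicklable, return_unpicklable and die_in_worker are illustrative stand-ins, not the test suite's own helpers. | |
| # Minimal sketch (assumed names; standard library only) | |
| import os | |
| import pickle | |
| from concurrent.futures import ProcessPoolExecutor | |
| from concurrent.futures.process import BrokenProcessPool | |
|  | |
| class Unpicklable: | |
|     """Return value whose __reduce__ is invalid, as in the PicklingError above.""" | |
|     def __reduce__(self): | |
|         return None  # pickle requires a str or tuple here, hence the error message | |
|  | |
| def return_unpicklable(): | |
|     return Unpicklable()  # the call itself succeeds; sending the result back cannot | |
|  | |
| def die_in_worker(): | |
|     os._exit(1)  # abrupt worker death is what normally breaks the pool | |
|  | |
| if __name__ == "__main__": | |
|     with ProcessPoolExecutor(max_workers=1) as pool: | |
|         try: | |
|             pool.submit(return_unpicklable).result(timeout=60) | |
|         except pickle.PicklingError as exc: | |
|             # The worker forwards the pickling failure to the parent; the pool stays usable. | |
|             print("result could not be pickled:", exc) | |
|     with ProcessPoolExecutor(max_workers=1) as pool: | |
|         try: | |
|             pool.submit(die_in_worker).result(timeout=60) | |
|         except BrokenProcessPool: | |
|             # Documented behaviour when a worker process terminates abruptly. | |
|             print("pool reported BrokenProcessPool") | |
| Read against the log, the repeated "BrokenProcessPool not raised" assertions suggest the test's crash helpers did not actually kill the worker processes in this environment; that is an inference from the assertions above, not something the log states directly. | |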
| 0:03:16 load avg: 8.93 [423/492/4] test_xml_etree passed -- running (3): test_subprocess (1 min 8 sec), test.test_multiprocessing_spawn.test_processes (1 min 24 sec), test_posix (1 min 8 sec) | |
| 0:03:17 load avg: 8.93 [424/492/4] test_unparse passed -- running (3): test_subprocess (1 min 8 sec), test.test_multiprocessing_spawn.test_processes (1 min 25 sec), test_posix (1 min 8 sec) | |
| 0:03:17 load avg: 8.93 [425/492/4] test_smtpnet passed -- running (3): test_subprocess (1 min 9 sec), test.test_multiprocessing_spawn.test_processes (1 min 25 sec), test_posix (1 min 8 sec) | |
| 0:03:17 load avg: 8.93 [426/492/4] test_bufio passed -- running (3): test_subprocess (1 min 9 sec), test.test_multiprocessing_spawn.test_processes (1 min 25 sec), test_posix (1 min 9 sec) | |
| 0:03:17 load avg: 8.93 [427/492/4] test_reprlib passed -- running (3): test_subprocess (1 min 9 sec), test.test_multiprocessing_spawn.test_processes (1 min 25 sec), test_posix (1 min 9 sec) | |
| 0:03:17 load avg: 8.93 [428/492/4] test_dynamic passed -- running (3): test_subprocess (1 min 9 sec), test.test_multiprocessing_spawn.test_processes (1 min 25 sec), test_posix (1 min 9 sec) | |
| 0:03:17 load avg: 8.93 [429/492/4] test_patma passed -- running (3): test_subprocess (1 min 9 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 9 sec) | |
| 0:03:18 load avg: 8.93 [430/492/4] test_c_locale_coercion passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 9 sec) | |
| 0:03:18 load avg: 8.93 [431/492/4] test_script_helper passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 10 sec) | |
| 0:03:18 load avg: 8.93 [432/492/4] test_codecencodings_iso2022 passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 10 sec) | |
| 0:03:18 load avg: 8.93 [433/492/4] test_pathlib passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 10 sec) | |
| 0:03:18 load avg: 8.93 [434/492/4] test_build_details passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 26 sec), test_posix (1 min 10 sec) | |
| 0:03:18 load avg: 8.93 [435/492/4] test.test_future_stmt.test_future_flags passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 27 sec), test_posix (1 min 10 sec) | |
| 0:03:19 load avg: 8.93 [436/492/4] test_optimizer passed -- running (3): test_subprocess (1 min 10 sec), test.test_multiprocessing_spawn.test_processes (1 min 27 sec), test_posix (1 min 10 sec) | |
| 0:03:19 load avg: 8.93 [437/492/4] test_property passed -- running (3): test_subprocess (1 min 11 sec), test.test_multiprocessing_spawn.test_processes (1 min 27 sec), test_posix (1 min 10 sec) | |
| 0:03:19 load avg: 8.93 [438/492/4] test_ucn passed -- running (3): test_subprocess (1 min 11 sec), test.test_multiprocessing_spawn.test_processes (1 min 27 sec), test_posix (1 min 11 sec) | |
| 0:03:19 load avg: 8.93 [439/492/4] test_zstd skipped -- running (3): test_subprocess (1 min 11 sec), test.test_multiprocessing_spawn.test_processes (1 min 28 sec), test_posix (1 min 11 sec) | |
| test_zstd skipped -- No module named '_zstd' | |
| 0:03:20 load avg: 8.93 [440/492/4] test_urllibnet passed -- running (3): test_subprocess (1 min 11 sec), test.test_multiprocessing_spawn.test_processes (1 min 28 sec), test_posix (1 min 11 sec) | |
| 0:03:20 load avg: 8.93 [441/492/4] test_compiler_codegen passed -- running (3): test_subprocess (1 min 12 sec), test.test_multiprocessing_spawn.test_processes (1 min 28 sec), test_posix (1 min 12 sec) | |
| 0:03:20 load avg: 8.53 [442/492/4] test_dictviews passed -- running (3): test_subprocess (1 min 12 sec), test.test_multiprocessing_spawn.test_processes (1 min 29 sec), test_posix (1 min 12 sec) | |
| 0:03:21 load avg: 8.53 [443/492/4] test_hashlib passed -- running (3): test_subprocess (1 min 12 sec), test.test_multiprocessing_spawn.test_processes (1 min 29 sec), test_posix (1 min 12 sec) | |
| 0:03:21 load avg: 8.53 [444/492/4] test.test_concurrent_futures.test_future passed -- running (3): test_subprocess (1 min 13 sec), test.test_multiprocessing_spawn.test_processes (1 min 29 sec), test_posix (1 min 12 sec) | |
| 0:03:21 load avg: 8.53 [445/492/4] test_defaultdict passed -- running (3): test_subprocess (1 min 13 sec), test.test_multiprocessing_spawn.test_processes (1 min 29 sec), test_posix (1 min 12 sec) | |
| 0:03:21 load avg: 8.53 [446/492/4] test_osx_env passed -- running (3): test_subprocess (1 min 13 sec), test.test_multiprocessing_spawn.test_processes (1 min 29 sec), test_posix (1 min 13 sec) | |
| 0:03:21 load avg: 8.53 [447/492/4] test_sax passed -- running (3): test_subprocess (1 min 13 sec), test.test_multiprocessing_spawn.test_processes (1 min 30 sec), test_posix (1 min 13 sec) | |
| 0:03:24 load avg: 8.53 [448/492/4] test_threadsignals passed -- running (3): test_subprocess (1 min 16 sec), test.test_multiprocessing_spawn.test_processes (1 min 32 sec), test_posix (1 min 16 sec) | |
| 0:03:24 load avg: 8.53 [449/492/4] test.test_multiprocessing_fork.test_misc passed -- running (3): test_subprocess (1 min 16 sec), test.test_multiprocessing_spawn.test_processes (1 min 32 sec), test_posix (1 min 16 sec) | |
| 0:03:25 load avg: 8.53 [450/492/4] test_builtin passed -- running (3): test_subprocess (1 min 17 sec), test.test_multiprocessing_spawn.test_processes (1 min 33 sec), test_posix (1 min 16 sec) | |
| 0:03:25 load avg: 8.41 [451/492/4] test.test_pydoc.test_pydoc passed -- running (3): test_subprocess (1 min 17 sec), test.test_multiprocessing_spawn.test_processes (1 min 33 sec), test_posix (1 min 17 sec) | |
| 0:03:25 load avg: 8.41 [452/492/4] test_annotationlib passed -- running (3): test_subprocess (1 min 17 sec), test.test_multiprocessing_spawn.test_processes (1 min 34 sec), test_posix (1 min 17 sec) | |
| 0:03:26 load avg: 8.41 [453/492/4] test_urllib_response passed -- running (3): test_subprocess (1 min 17 sec), test.test_multiprocessing_spawn.test_processes (1 min 34 sec), test_posix (1 min 17 sec) | |
| 0:03:26 load avg: 8.41 [454/492/4] test_binascii passed -- running (3): test_subprocess (1 min 18 sec), test.test_multiprocessing_spawn.test_processes (1 min 34 sec), test_posix (1 min 17 sec) | |
| 0:03:26 load avg: 8.41 [455/492/4] test_zipfile64 skipped (resource denied) -- running (3): test_subprocess (1 min 18 sec), test.test_multiprocessing_spawn.test_processes (1 min 34 sec), test_posix (1 min 18 sec) | |
| test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run | |
| 0:03:26 load avg: 8.41 [456/492/4] test_threadedtempfile passed -- running (3): test_subprocess (1 min 18 sec), test.test_multiprocessing_spawn.test_processes (1 min 35 sec), test_posix (1 min 18 sec) | |
| 0:03:27 load avg: 8.41 [457/492/4] test_cppext passed -- running (3): test_subprocess (1 min 19 sec), test.test_multiprocessing_spawn.test_processes (1 min 35 sec), test_posix (1 min 18 sec) | |
| 0:03:27 load avg: 8.41 [458/492/4] test_unpack passed -- running (3): test_subprocess (1 min 19 sec), test.test_multiprocessing_spawn.test_processes (1 min 35 sec), test_posix (1 min 19 sec) | |
| 0:03:29 load avg: 8.41 [459/492/4] test_urllib2net passed -- running (3): test_subprocess (1 min 21 sec), test.test_multiprocessing_spawn.test_processes (1 min 37 sec), test_posix (1 min 21 sec) | |
| 0:03:30 load avg: 8.41 [460/492/4] test_ctypes passed -- running (3): test_subprocess (1 min 22 sec), test.test_multiprocessing_spawn.test_processes (1 min 38 sec), test_posix (1 min 22 sec) | |
| 0:03:30 load avg: 8.54 [461/492/5] test_subprocess failed (2 failures) (1 min 22 sec) -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 38 sec), test_posix (1 min 22 sec) | |
| test_noshell_sequence_with_spaces (test.test_subprocess.CommandsWithSpaces.test_noshell_sequence_with_spaces) ... skipped 'Windows-specific tests' | |
| test_noshell_string_with_spaces (test.test_subprocess.CommandsWithSpaces.test_noshell_string_with_spaces) ... skipped 'Windows-specific tests' | |
| test_shell_sequence_with_spaces (test.test_subprocess.CommandsWithSpaces.test_shell_sequence_with_spaces) ... skipped 'Windows-specific tests' | |
| test_shell_string_with_spaces (test.test_subprocess.CommandsWithSpaces.test_shell_string_with_spaces) ... skipped 'Windows-specific tests' | |
| test_broken_pipe_cleanup (test.test_subprocess.ContextManagerTests.test_broken_pipe_cleanup) | |
| Broken pipe error should not prevent wait() (Issue 21619) ... ok | |
| test_communicate_stdin (test.test_subprocess.ContextManagerTests.test_communicate_stdin) ... ok | |
| test_invalid_args (test.test_subprocess.ContextManagerTests.test_invalid_args) ... ok | |
| test_pipe (test.test_subprocess.ContextManagerTests.test_pipe) ... ok | |
| test_returncode (test.test_subprocess.ContextManagerTests.test_returncode) ... ok | |
| test__all__ (test.test_subprocess.MiscTests.test__all__) | |
| Ensure that __all__ is populated properly. ... ok | |
| test_call_keyboardinterrupt_no_kill (test.test_subprocess.MiscTests.test_call_keyboardinterrupt_no_kill) ... ok | |
| test_context_manager_keyboardinterrupt_no_kill (test.test_subprocess.MiscTests.test_context_manager_keyboardinterrupt_no_kill) ... ok | |
| test_getoutput (test.test_subprocess.MiscTests.test_getoutput) ... ok | |
| test_run_keyboardinterrupt_no_kill (test.test_subprocess.MiscTests.test_run_keyboardinterrupt_no_kill) ... ok | |
| test_CalledProcessError_str_non_zero (test.test_subprocess.POSIXProcessTestCase.test_CalledProcessError_str_non_zero) ... ok | |
| test_CalledProcessError_str_signal (test.test_subprocess.POSIXProcessTestCase.test_CalledProcessError_str_signal) ... ok | |
| test_CalledProcessError_str_unknown_signal (test.test_subprocess.POSIXProcessTestCase.test_CalledProcessError_str_unknown_signal) ... ok | |
| test_args_string (test.test_subprocess.POSIXProcessTestCase.test_args_string) ... ok | |
| test_bytes_program (test.test_subprocess.POSIXProcessTestCase.test_bytes_program) ... ok | |
| test_call_string (test.test_subprocess.POSIXProcessTestCase.test_call_string) ... ok | |
| test_close_fd_0 (test.test_subprocess.POSIXProcessTestCase.test_close_fd_0) ... ok | |
| test_close_fd_1 (test.test_subprocess.POSIXProcessTestCase.test_close_fd_1) ... ok | |
| test_close_fd_2 (test.test_subprocess.POSIXProcessTestCase.test_close_fd_2) ... ok | |
| test_close_fds (test.test_subprocess.POSIXProcessTestCase.test_close_fds) ... ok | |
| test_close_fds_0_1 (test.test_subprocess.POSIXProcessTestCase.test_close_fds_0_1) ... ok | |
| test_close_fds_0_1_2 (test.test_subprocess.POSIXProcessTestCase.test_close_fds_0_1_2) ... ok | |
| test_close_fds_0_2 (test.test_subprocess.POSIXProcessTestCase.test_close_fds_0_2) ... ok | |
| test_close_fds_1_2 (test.test_subprocess.POSIXProcessTestCase.test_close_fds_1_2) ... ok | |
| test_close_fds_after_preexec (test.test_subprocess.POSIXProcessTestCase.test_close_fds_after_preexec) ... ok | |
| test_close_fds_when_max_fd_is_lowered (test.test_subprocess.POSIXProcessTestCase.test_close_fds_when_max_fd_is_lowered) | |
| Confirm that issue21618 is fixed (may fail under valgrind). ... ok | |
| test_communicate_BrokenPipeError_stdin_close (test.test_subprocess.POSIXProcessTestCase.test_communicate_BrokenPipeError_stdin_close) ... ok | |
| test_communicate_BrokenPipeError_stdin_close_with_timeout (test.test_subprocess.POSIXProcessTestCase.test_communicate_BrokenPipeError_stdin_close_with_timeout) ... ok | |
| test_communicate_BrokenPipeError_stdin_flush (test.test_subprocess.POSIXProcessTestCase.test_communicate_BrokenPipeError_stdin_flush) ... ok | |
| test_communicate_BrokenPipeError_stdin_write (test.test_subprocess.POSIXProcessTestCase.test_communicate_BrokenPipeError_stdin_write) ... ok | |
| test_communicate_repeated_call_after_stdout_close (test.test_subprocess.POSIXProcessTestCase.test_communicate_repeated_call_after_stdout_close) ... ok | |
| test_exception_bad_args_0 (test.test_subprocess.POSIXProcessTestCase.test_exception_bad_args_0) | |
| Test error in the child raised in the parent for a bad args[0]. ... ok | |
| test_exception_bad_executable (test.test_subprocess.POSIXProcessTestCase.test_exception_bad_executable) | |
| Test error in the child raised in the parent for a bad executable. ... ok | |
| test_exception_cwd (test.test_subprocess.POSIXProcessTestCase.test_exception_cwd) | |
| Test error in the child raised in the parent for a bad cwd. ... ok | |
| test_exception_errpipe_bad_data (test.test_subprocess.POSIXProcessTestCase.test_exception_errpipe_bad_data) | |
| Test error passing done through errpipe_write where its not ... ok | |
| test_exception_errpipe_normal (test.test_subprocess.POSIXProcessTestCase.test_exception_errpipe_normal) | |
| Test error passing done through errpipe_write in the good case ... ok | |
| test_extra_groups (test.test_subprocess.POSIXProcessTestCase.test_extra_groups) ... skipped 'setgroup() EPERM; this test may require root.' | |
| test_extra_groups_empty_list (test.test_subprocess.POSIXProcessTestCase.test_extra_groups_empty_list) ... skipped 'setgroup() EPERM; this test may require root.' | |
| test_extra_groups_invalid_gid_t_values (test.test_subprocess.POSIXProcessTestCase.test_extra_groups_invalid_gid_t_values) ... ok | |
| test_fork_exec (test.test_subprocess.POSIXProcessTestCase.test_fork_exec) ... ok | |
| test_fork_exec_sorted_fd_sanity_check (test.test_subprocess.POSIXProcessTestCase.test_fork_exec_sorted_fd_sanity_check) ... ok | |
| test_group (test.test_subprocess.POSIXProcessTestCase.test_group) ... ok | |
| test_group_error (test.test_subprocess.POSIXProcessTestCase.test_group_error) ... skipped 'setregid() available on platform' | |
| test_invalid_args (test.test_subprocess.POSIXProcessTestCase.test_invalid_args) ... ok | |
| test_kill (test.test_subprocess.POSIXProcessTestCase.test_kill) ... ok | |
| test_kill_dead (test.test_subprocess.POSIXProcessTestCase.test_kill_dead) ... ok | |
| test_leak_fast_process_del_killed (test.test_subprocess.POSIXProcessTestCase.test_leak_fast_process_del_killed) ... ok | |
| test_pass_fds (test.test_subprocess.POSIXProcessTestCase.test_pass_fds) ... ok | |
| test_pass_fds_inheritable (test.test_subprocess.POSIXProcessTestCase.test_pass_fds_inheritable) ... ok | |
| test_pass_fds_redirected (test.test_subprocess.POSIXProcessTestCase.test_pass_fds_redirected) | |
| Regression test for https://bugs.python.org/issue32270. ... FAIL | |
| test_pipe_cloexec (test.test_subprocess.POSIXProcessTestCase.test_pipe_cloexec) ... FAIL | |
| test_pipe_cloexec_real_tools (test.test_subprocess.POSIXProcessTestCase.test_pipe_cloexec_real_tools) ... ok | |
| test_preexec (test.test_subprocess.POSIXProcessTestCase.test_preexec) ... ok | |
| test_preexec_at_exit (test.test_subprocess.POSIXProcessTestCase.test_preexec_at_exit) ... ok | |
| test_preexec_errpipe_does_not_double_close_pipes (test.test_subprocess.POSIXProcessTestCase.test_preexec_errpipe_does_not_double_close_pipes) | |
| Issue16140: Don't double close pipes on preexec error. ... ok | |
| test_preexec_exception (test.test_subprocess.POSIXProcessTestCase.test_preexec_exception) ... ok | |
| test_preexec_fork_failure (test.test_subprocess.POSIXProcessTestCase.test_preexec_fork_failure) ... ok | |
| test_preexec_gc_module_failure (test.test_subprocess.POSIXProcessTestCase.test_preexec_gc_module_failure) ... ok | |
| test_process_group_0 (test.test_subprocess.POSIXProcessTestCase.test_process_group_0) ... ok | |
| test_remapping_std_fds (test.test_subprocess.POSIXProcessTestCase.test_remapping_std_fds) ... ok | |
| test_restore_signals (test.test_subprocess.POSIXProcessTestCase.test_restore_signals) ... ok | |
| test_run_abort (test.test_subprocess.POSIXProcessTestCase.test_run_abort) ... ok | |
| test_select_unbuffered (test.test_subprocess.POSIXProcessTestCase.test_select_unbuffered) ... ok | |
| test_send_signal (test.test_subprocess.POSIXProcessTestCase.test_send_signal) ... ok | |
| test_send_signal_dead (test.test_subprocess.POSIXProcessTestCase.test_send_signal_dead) ... ok | |
| test_send_signal_race (test.test_subprocess.POSIXProcessTestCase.test_send_signal_race) ... ok | |
| test_send_signal_race2 (test.test_subprocess.POSIXProcessTestCase.test_send_signal_race2) ... ok | |
| test_shell_sequence (test.test_subprocess.POSIXProcessTestCase.test_shell_sequence) ... ok | |
| test_shell_string (test.test_subprocess.POSIXProcessTestCase.test_shell_string) ... ok | |
| test_small_errpipe_write_fd (test.test_subprocess.POSIXProcessTestCase.test_small_errpipe_write_fd) | |
| Issue #15798: Popen should work when stdio fds are available. ... Exception ignored while flushing sys.stdout: | |
| OSError: [Errno 9] Bad file descriptor | |
| ok | |
| test_specific_shell (test.test_subprocess.POSIXProcessTestCase.test_specific_shell) ... ok | |
| test_start_new_session (test.test_subprocess.POSIXProcessTestCase.test_start_new_session) ... ok | |
| test_stderr_stdin_are_single_inout_fd (test.test_subprocess.POSIXProcessTestCase.test_stderr_stdin_are_single_inout_fd) ... ok | |
| test_stdout_stderr_are_single_inout_fd (test.test_subprocess.POSIXProcessTestCase.test_stdout_stderr_are_single_inout_fd) ... ok | |
| test_stdout_stdin_are_single_inout_fd (test.test_subprocess.POSIXProcessTestCase.test_stdout_stdin_are_single_inout_fd) ... ok | |
| test_stopped (test.test_subprocess.POSIXProcessTestCase.test_stopped) | |
| Test wait() behavior when waitpid returns WIFSTOPPED; issue29335. ... ok | |
| test_surrogates_error_message (test.test_subprocess.POSIXProcessTestCase.test_surrogates_error_message) ... ok | |
| test_swap_fds (test.test_subprocess.POSIXProcessTestCase.test_swap_fds) ... ok | |
| test_swap_std_fds_with_one_closed (test.test_subprocess.POSIXProcessTestCase.test_swap_std_fds_with_one_closed) ... ok | |
| test_terminate (test.test_subprocess.POSIXProcessTestCase.test_terminate) ... ok | |
| test_terminate_dead (test.test_subprocess.POSIXProcessTestCase.test_terminate_dead) ... ok | |
| test_umask (test.test_subprocess.POSIXProcessTestCase.test_umask) ... ok | |
| test_undecodable_env (test.test_subprocess.POSIXProcessTestCase.test_undecodable_env) ... ok | |
| test_user (test.test_subprocess.POSIXProcessTestCase.test_user) ... ok | |
| test_user_error (test.test_subprocess.POSIXProcessTestCase.test_user_error) ... skipped 'setreuid() available on platform' | |
| test_vfork_used_when_expected (test.test_subprocess.POSIXProcessTestCase.test_vfork_used_when_expected) ... skipped 'Requires working strace' | |
| test_wait_when_sigchild_ignored (test.test_subprocess.POSIXProcessTestCase.test_wait_when_sigchild_ignored) ... ok | |
| test_zombie_fast_process_del (test.test_subprocess.POSIXProcessTestCase.test_zombie_fast_process_del) ... ok | |
| test_bufsize_equal_one_binary_mode (test.test_subprocess.ProcessTestCase.test_bufsize_equal_one_binary_mode) ... ok | |
| test_bufsize_equal_one_text_mode (test.test_subprocess.ProcessTestCase.test_bufsize_equal_one_text_mode) ... ok | |
| test_bufsize_is_none (test.test_subprocess.ProcessTestCase.test_bufsize_is_none) ... ok | |
| test_bytes_executable (test.test_subprocess.ProcessTestCase.test_bytes_executable) ... ok | |
| test_bytes_executable_replaces_shell (test.test_subprocess.ProcessTestCase.test_bytes_executable_replaces_shell) ... ok | |
| test_call_kwargs (test.test_subprocess.ProcessTestCase.test_call_kwargs) ... ok | |
| test_call_seq (test.test_subprocess.ProcessTestCase.test_call_seq) ... ok | |
| test_call_timeout (test.test_subprocess.ProcessTestCase.test_call_timeout) ... ok | |
| test_check_call_nonzero (test.test_subprocess.ProcessTestCase.test_check_call_nonzero) ... ok | |
| test_check_call_zero (test.test_subprocess.ProcessTestCase.test_check_call_zero) ... ok | |
| test_check_output (test.test_subprocess.ProcessTestCase.test_check_output) ... ok | |
| test_check_output_input_arg (test.test_subprocess.ProcessTestCase.test_check_output_input_arg) ... ok | |
| test_check_output_input_none (test.test_subprocess.ProcessTestCase.test_check_output_input_none) | |
| input=None has a legacy meaning of input='' on check_output. ... ok | |
| test_check_output_input_none_encoding_errors (test.test_subprocess.ProcessTestCase.test_check_output_input_none_encoding_errors) ... ok | |
| test_check_output_input_none_text (test.test_subprocess.ProcessTestCase.test_check_output_input_none_text) ... ok | |
| test_check_output_input_none_universal_newlines (test.test_subprocess.ProcessTestCase.test_check_output_input_none_universal_newlines) ... ok | |
| test_check_output_nonzero (test.test_subprocess.ProcessTestCase.test_check_output_nonzero) ... ok | |
| test_check_output_stderr (test.test_subprocess.ProcessTestCase.test_check_output_stderr) ... ok | |
| test_check_output_stdin_arg (test.test_subprocess.ProcessTestCase.test_check_output_stdin_arg) ... ok | |
| test_check_output_stdin_with_input_arg (test.test_subprocess.ProcessTestCase.test_check_output_stdin_with_input_arg) ... ok | |
| test_check_output_stdout_arg (test.test_subprocess.ProcessTestCase.test_check_output_stdout_arg) ... ok | |
| test_check_output_timeout (test.test_subprocess.ProcessTestCase.test_check_output_timeout) ... ok | |
| test_class_getitems (test.test_subprocess.ProcessTestCase.test_class_getitems) ... ok | |
| test_communicate (test.test_subprocess.ProcessTestCase.test_communicate) ... ok | |
| test_communicate_eintr (test.test_subprocess.ProcessTestCase.test_communicate_eintr) ... ok | |
| test_communicate_epipe (test.test_subprocess.ProcessTestCase.test_communicate_epipe) ... ok | |
| test_communicate_epipe_only_stdin (test.test_subprocess.ProcessTestCase.test_communicate_epipe_only_stdin) ... ok | |
| test_communicate_errors (test.test_subprocess.ProcessTestCase.test_communicate_errors) ... ok | |
| test_communicate_pipe_buf (test.test_subprocess.ProcessTestCase.test_communicate_pipe_buf) ... ok | |
| test_communicate_pipe_fd_leak (test.test_subprocess.ProcessTestCase.test_communicate_pipe_fd_leak) ... ok | |
| test_communicate_returns (test.test_subprocess.ProcessTestCase.test_communicate_returns) ... ok | |
| test_communicate_stderr (test.test_subprocess.ProcessTestCase.test_communicate_stderr) ... ok | |
| test_communicate_stdin (test.test_subprocess.ProcessTestCase.test_communicate_stdin) ... ok | |
| test_communicate_stdout (test.test_subprocess.ProcessTestCase.test_communicate_stdout) ... ok | |
| test_communicate_timeout (test.test_subprocess.ProcessTestCase.test_communicate_timeout) ... ok | |
| test_communicate_timeout_large_output (test.test_subprocess.ProcessTestCase.test_communicate_timeout_large_output) ... ok | |
| test_cwd (test.test_subprocess.ProcessTestCase.test_cwd) ... ok | |
| test_cwd_with_absolute_arg (test.test_subprocess.ProcessTestCase.test_cwd_with_absolute_arg) ... ok | |
| test_cwd_with_bytes (test.test_subprocess.ProcessTestCase.test_cwd_with_bytes) ... ok | |
| test_cwd_with_pathlike (test.test_subprocess.ProcessTestCase.test_cwd_with_pathlike) ... ok | |
| test_cwd_with_relative_arg (test.test_subprocess.ProcessTestCase.test_cwd_with_relative_arg) ... ok | |
| test_cwd_with_relative_executable (test.test_subprocess.ProcessTestCase.test_cwd_with_relative_executable) ... ok | |
| test_double_close_on_error (test.test_subprocess.ProcessTestCase.test_double_close_on_error) ... ok | |
| test_empty_env (test.test_subprocess.ProcessTestCase.test_empty_env) | |
| Verify that env={} is as empty as possible. ... ok | |
| test_env (test.test_subprocess.ProcessTestCase.test_env) ... ok | |
| test_executable (test.test_subprocess.ProcessTestCase.test_executable) ... ok | |
| test_executable_replaces_shell (test.test_subprocess.ProcessTestCase.test_executable_replaces_shell) ... ok | |
| test_executable_takes_precedence (test.test_subprocess.ProcessTestCase.test_executable_takes_precedence) ... ok | |
| test_executable_with_cwd (test.test_subprocess.ProcessTestCase.test_executable_with_cwd) ... ok | |
| test_executable_without_cwd (test.test_subprocess.ProcessTestCase.test_executable_without_cwd) ... skipped 'need an installed Python. See #7774' | |
| test_failed_child_execute_fd_leak (test.test_subprocess.ProcessTestCase.test_failed_child_execute_fd_leak) | |
| Test for the fork() failure fd leak reported in issue16327. ... ok | |
| test_file_not_found_includes_filename (test.test_subprocess.ProcessTestCase.test_file_not_found_includes_filename) ... ok | |
| test_file_not_found_with_bad_cwd (test.test_subprocess.ProcessTestCase.test_file_not_found_with_bad_cwd) ... ok | |
| test_handles_closed_on_exception (test.test_subprocess.ProcessTestCase.test_handles_closed_on_exception) ... ok | |
| test_invalid_args (test.test_subprocess.ProcessTestCase.test_invalid_args) ... ok | |
| test_invalid_bufsize (test.test_subprocess.ProcessTestCase.test_invalid_bufsize) ... ok | |
| test_invalid_cmd (test.test_subprocess.ProcessTestCase.test_invalid_cmd) ... ok | |
| test_invalid_env (test.test_subprocess.ProcessTestCase.test_invalid_env) ... ok | |
| test_io_buffered_by_default (test.test_subprocess.ProcessTestCase.test_io_buffered_by_default) ... ok | |
| test_io_unbuffered_works (test.test_subprocess.ProcessTestCase.test_io_unbuffered_works) ... ok | |
| test_issue8780 (test.test_subprocess.ProcessTestCase.test_issue8780) ... ok | |
| test_leaking_fds_on_error (test.test_subprocess.ProcessTestCase.test_leaking_fds_on_error) ... skipped "resource 'cpu' is not enabled" | |
| test_list2cmdline (test.test_subprocess.ProcessTestCase.test_list2cmdline) ... ok | |
| test_no_leaking (test.test_subprocess.ProcessTestCase.test_no_leaking) ... ok | |
| test_nonexisting_with_pipes (test.test_subprocess.ProcessTestCase.test_nonexisting_with_pipes) ... skipped 'need msvcrt.CrtSetReportMode' | |
| test_one_environment_variable (test.test_subprocess.ProcessTestCase.test_one_environment_variable) ... ok | |
| test_pathlike_executable (test.test_subprocess.ProcessTestCase.test_pathlike_executable) ... ok | |
| test_pathlike_executable_replaces_shell (test.test_subprocess.ProcessTestCase.test_pathlike_executable_replaces_shell) ... ok | |
| test_pipesize_default (test.test_subprocess.ProcessTestCase.test_pipesize_default) ... ok | |
| test_pipesizes (test.test_subprocess.ProcessTestCase.test_pipesizes) ... ok | |
| test_poll (test.test_subprocess.ProcessTestCase.test_poll) ... ok | |
| test_repr (test.test_subprocess.ProcessTestCase.test_repr) ... ok | |
| test_stderr_devnull (test.test_subprocess.ProcessTestCase.test_stderr_devnull) ... ok | |
| test_stderr_filedes (test.test_subprocess.ProcessTestCase.test_stderr_filedes) ... ok | |
| test_stderr_fileobj (test.test_subprocess.ProcessTestCase.test_stderr_fileobj) ... ok | |
| test_stderr_none (test.test_subprocess.ProcessTestCase.test_stderr_none) ... ok | |
| test_stderr_pipe (test.test_subprocess.ProcessTestCase.test_stderr_pipe) ... ok | |
| test_stderr_redirect_with_no_stdout_redirect (test.test_subprocess.ProcessTestCase.test_stderr_redirect_with_no_stdout_redirect) ... ok | |
| test_stdin_devnull (test.test_subprocess.ProcessTestCase.test_stdin_devnull) ... ok | |
| test_stdin_filedes (test.test_subprocess.ProcessTestCase.test_stdin_filedes) ... ok | |
| test_stdin_fileobj (test.test_subprocess.ProcessTestCase.test_stdin_fileobj) ... ok | |
| test_stdin_none (test.test_subprocess.ProcessTestCase.test_stdin_none) ... ok | |
| test_stdin_pipe (test.test_subprocess.ProcessTestCase.test_stdin_pipe) ... ok | |
| test_stdout_devnull (test.test_subprocess.ProcessTestCase.test_stdout_devnull) ... ok | |
| test_stdout_filedes (test.test_subprocess.ProcessTestCase.test_stdout_filedes) ... ok | |
| test_stdout_filedes_of_stdout (test.test_subprocess.ProcessTestCase.test_stdout_filedes_of_stdout) ... ok | |
| test_stdout_fileobj (test.test_subprocess.ProcessTestCase.test_stdout_fileobj) ... ok | |
| test_stdout_none (test.test_subprocess.ProcessTestCase.test_stdout_none) ... ok | |
| test_stdout_pipe (test.test_subprocess.ProcessTestCase.test_stdout_pipe) ... ok | |
| test_stdout_stderr_file (test.test_subprocess.ProcessTestCase.test_stdout_stderr_file) ... ok | |
| test_stdout_stderr_pipe (test.test_subprocess.ProcessTestCase.test_stdout_stderr_pipe) ... ok | |
| test_threadsafe_wait (test.test_subprocess.ProcessTestCase.test_threadsafe_wait) | |
| Issue21291: Popen.wait() needs to be threadsafe for returncode. ... ok | |
| test_timeout_exception (test.test_subprocess.ProcessTestCase.test_timeout_exception) ... ok | |
| test_universal_newlines_and_text (test.test_subprocess.ProcessTestCase.test_universal_newlines_and_text) ... ok | |
| test_universal_newlines_communicate (test.test_subprocess.ProcessTestCase.test_universal_newlines_communicate) ... ok | |
| test_universal_newlines_communicate_encodings (test.test_subprocess.ProcessTestCase.test_universal_newlines_communicate_encodings) ... ok | |
| test_universal_newlines_communicate_input_none (test.test_subprocess.ProcessTestCase.test_universal_newlines_communicate_input_none) ... ok | |
| test_universal_newlines_communicate_stdin (test.test_subprocess.ProcessTestCase.test_universal_newlines_communicate_stdin) ... ok | |
| test_universal_newlines_communicate_stdin_stdout_stderr (test.test_subprocess.ProcessTestCase.test_universal_newlines_communicate_stdin_stdout_stderr) ... ok | |
| test_wait (test.test_subprocess.ProcessTestCase.test_wait) ... ok | |
| test_wait_negative_timeout (test.test_subprocess.ProcessTestCase.test_wait_negative_timeout) ... skipped 'need subprocess._winapi' | |
| test_wait_timeout (test.test_subprocess.ProcessTestCase.test_wait_timeout) ... ok | |
| test_win32_duplicate_envs (test.test_subprocess.ProcessTestCase.test_win32_duplicate_envs) ... skipped 'Windows only issue' | |
| test_win32_invalid_env (test.test_subprocess.ProcessTestCase.test_win32_invalid_env) ... skipped 'Windows only issue' | |
| test_writes_before_communicate (test.test_subprocess.ProcessTestCase.test_writes_before_communicate) ... ok | |
| test_bufsize_equal_one_binary_mode (test.test_subprocess.ProcessTestCaseNoPoll.test_bufsize_equal_one_binary_mode) ... ok | |
| test_bufsize_equal_one_text_mode (test.test_subprocess.ProcessTestCaseNoPoll.test_bufsize_equal_one_text_mode) ... ok | |
| test_bufsize_is_none (test.test_subprocess.ProcessTestCaseNoPoll.test_bufsize_is_none) ... ok | |
| test_bytes_executable (test.test_subprocess.ProcessTestCaseNoPoll.test_bytes_executable) ... ok | |
| test_bytes_executable_replaces_shell (test.test_subprocess.ProcessTestCaseNoPoll.test_bytes_executable_replaces_shell) ... ok | |
| test_call_kwargs (test.test_subprocess.ProcessTestCaseNoPoll.test_call_kwargs) ... ok | |
| test_call_seq (test.test_subprocess.ProcessTestCaseNoPoll.test_call_seq) ... ok | |
| test_call_timeout (test.test_subprocess.ProcessTestCaseNoPoll.test_call_timeout) ... ok | |
| test_check_call_nonzero (test.test_subprocess.ProcessTestCaseNoPoll.test_check_call_nonzero) ... ok | |
| test_check_call_zero (test.test_subprocess.ProcessTestCaseNoPoll.test_check_call_zero) ... ok | |
| test_check_output (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output) ... ok | |
| test_check_output_input_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_input_arg) ... ok | |
| test_check_output_input_none (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_input_none) | |
| input=None has a legacy meaning of input='' on check_output. ... ok | |
| test_check_output_input_none_encoding_errors (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_input_none_encoding_errors) ... ok | |
| test_check_output_input_none_text (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_input_none_text) ... ok | |
| test_check_output_input_none_universal_newlines (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_input_none_universal_newlines) ... ok | |
| test_check_output_nonzero (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_nonzero) ... ok | |
| test_check_output_stderr (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_stderr) ... ok | |
| test_check_output_stdin_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_stdin_arg) ... ok | |
| test_check_output_stdin_with_input_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_stdin_with_input_arg) ... ok | |
| test_check_output_stdout_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_stdout_arg) ... ok | |
| test_check_output_timeout (test.test_subprocess.ProcessTestCaseNoPoll.test_check_output_timeout) ... ok | |
| test_class_getitems (test.test_subprocess.ProcessTestCaseNoPoll.test_class_getitems) ... ok | |
| test_communicate (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate) ... ok | |
| test_communicate_eintr (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_eintr) ... ok | |
| test_communicate_epipe (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_epipe) ... ok | |
| test_communicate_epipe_only_stdin (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_epipe_only_stdin) ... ok | |
| test_communicate_errors (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_errors) ... ok | |
| test_communicate_pipe_buf (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_pipe_buf) ... ok | |
| test_communicate_pipe_fd_leak (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_pipe_fd_leak) ... ok | |
| test_communicate_returns (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_returns) ... ok | |
| test_communicate_stderr (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_stderr) ... ok | |
| test_communicate_stdin (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_stdin) ... ok | |
| test_communicate_stdout (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_stdout) ... ok | |
| test_communicate_timeout (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_timeout) ... ok | |
| test_communicate_timeout_large_output (test.test_subprocess.ProcessTestCaseNoPoll.test_communicate_timeout_large_output) ... ok | |
| test_cwd (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd) ... ok | |
| test_cwd_with_absolute_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd_with_absolute_arg) ... ok | |
| test_cwd_with_bytes (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd_with_bytes) ... ok | |
| test_cwd_with_pathlike (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd_with_pathlike) ... ok | |
| test_cwd_with_relative_arg (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd_with_relative_arg) ... ok | |
| test_cwd_with_relative_executable (test.test_subprocess.ProcessTestCaseNoPoll.test_cwd_with_relative_executable) ... ok | |
| test_double_close_on_error (test.test_subprocess.ProcessTestCaseNoPoll.test_double_close_on_error) ... ok | |
| test_empty_env (test.test_subprocess.ProcessTestCaseNoPoll.test_empty_env) | |
| Verify that env={} is as empty as possible. ... ok | |
| test_env (test.test_subprocess.ProcessTestCaseNoPoll.test_env) ... ok | |
| test_executable (test.test_subprocess.ProcessTestCaseNoPoll.test_executable) ... ok | |
| test_executable_replaces_shell (test.test_subprocess.ProcessTestCaseNoPoll.test_executable_replaces_shell) ... ok | |
| test_executable_takes_precedence (test.test_subprocess.ProcessTestCaseNoPoll.test_executable_takes_precedence) ... ok | |
| test_executable_with_cwd (test.test_subprocess.ProcessTestCaseNoPoll.test_executable_with_cwd) ... ok | |
| test_executable_without_cwd (test.test_subprocess.ProcessTestCaseNoPoll.test_executable_without_cwd) ... skipped 'need an installed Python. See #7774' | |
| test_failed_child_execute_fd_leak (test.test_subprocess.ProcessTestCaseNoPoll.test_failed_child_execute_fd_leak) | |
| Test for the fork() failure fd leak reported in issue16327. ... ok | |
| test_file_not_found_includes_filename (test.test_subprocess.ProcessTestCaseNoPoll.test_file_not_found_includes_filename) ... ok | |
| test_file_not_found_with_bad_cwd (test.test_subprocess.ProcessTestCaseNoPoll.test_file_not_found_with_bad_cwd) ... ok | |
| test_handles_closed_on_exception (test.test_subprocess.ProcessTestCaseNoPoll.test_handles_closed_on_exception) ... ok | |
| test_invalid_args (test.test_subprocess.ProcessTestCaseNoPoll.test_invalid_args) ... ok | |
| test_invalid_bufsize (test.test_subprocess.ProcessTestCaseNoPoll.test_invalid_bufsize) ... ok | |
| test_invalid_cmd (test.test_subprocess.ProcessTestCaseNoPoll.test_invalid_cmd) ... ok | |
| test_invalid_env (test.test_subprocess.ProcessTestCaseNoPoll.test_invalid_env) ... ok | |
| test_io_buffered_by_default (test.test_subprocess.ProcessTestCaseNoPoll.test_io_buffered_by_default) ... ok | |
| test_io_unbuffered_works (test.test_subprocess.ProcessTestCaseNoPoll.test_io_unbuffered_works) ... ok | |
| test_issue8780 (test.test_subprocess.ProcessTestCaseNoPoll.test_issue8780) ... ok | |
| test_leaking_fds_on_error (test.test_subprocess.ProcessTestCaseNoPoll.test_leaking_fds_on_error) ... skipped "resource 'cpu' is not enabled" | |
| test_list2cmdline (test.test_subprocess.ProcessTestCaseNoPoll.test_list2cmdline) ... ok | |
| test_no_leaking (test.test_subprocess.ProcessTestCaseNoPoll.test_no_leaking) ... ok | |
| test_nonexisting_with_pipes (test.test_subprocess.ProcessTestCaseNoPoll.test_nonexisting_with_pipes) ... skipped 'need msvcrt.CrtSetReportMode' | |
| test_one_environment_variable (test.test_subprocess.ProcessTestCaseNoPoll.test_one_environment_variable) ... ok | |
| test_pathlike_executable (test.test_subprocess.ProcessTestCaseNoPoll.test_pathlike_executable) ... ok | |
| test_pathlike_executable_replaces_shell (test.test_subprocess.ProcessTestCaseNoPoll.test_pathlike_executable_replaces_shell) ... ok | |
| test_pipesize_default (test.test_subprocess.ProcessTestCaseNoPoll.test_pipesize_default) ... ok | |
| test_pipesizes (test.test_subprocess.ProcessTestCaseNoPoll.test_pipesizes) ... ok | |
| test_poll (test.test_subprocess.ProcessTestCaseNoPoll.test_poll) ... ok | |
| test_repr (test.test_subprocess.ProcessTestCaseNoPoll.test_repr) ... ok | |
| test_stderr_devnull (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_devnull) ... ok | |
| test_stderr_filedes (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_filedes) ... ok | |
| test_stderr_fileobj (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_fileobj) ... ok | |
| test_stderr_none (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_none) ... ok | |
| test_stderr_pipe (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_pipe) ... ok | |
| test_stderr_redirect_with_no_stdout_redirect (test.test_subprocess.ProcessTestCaseNoPoll.test_stderr_redirect_with_no_stdout_redirect) ... ok | |
| test_stdin_devnull (test.test_subprocess.ProcessTestCaseNoPoll.test_stdin_devnull) ... ok | |
| test_stdin_filedes (test.test_subprocess.ProcessTestCaseNoPoll.test_stdin_filedes) ... ok | |
| test_stdin_fileobj (test.test_subprocess.ProcessTestCaseNoPoll.test_stdin_fileobj) ... ok | |
| test_stdin_none (test.test_subprocess.ProcessTestCaseNoPoll.test_stdin_none) ... ok | |
| test_stdin_pipe (test.test_subprocess.ProcessTestCaseNoPoll.test_stdin_pipe) ... ok | |
| test_stdout_devnull (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_devnull) ... ok | |
| test_stdout_filedes (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_filedes) ... ok | |
| test_stdout_filedes_of_stdout (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_filedes_of_stdout) ... ok | |
| test_stdout_fileobj (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_fileobj) ... ok | |
| test_stdout_none (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_none) ... ok | |
| test_stdout_pipe (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_pipe) ... ok | |
| test_stdout_stderr_file (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_stderr_file) ... ok | |
| test_stdout_stderr_pipe (test.test_subprocess.ProcessTestCaseNoPoll.test_stdout_stderr_pipe) ... ok | |
| test_threadsafe_wait (test.test_subprocess.ProcessTestCaseNoPoll.test_threadsafe_wait) | |
| Issue21291: Popen.wait() needs to be threadsafe for returncode. ... ok | |
| test_timeout_exception (test.test_subprocess.ProcessTestCaseNoPoll.test_timeout_exception) ... ok | |
| test_universal_newlines_and_text (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_and_text) ... ok | |
| test_universal_newlines_communicate (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_communicate) ... ok | |
| test_universal_newlines_communicate_encodings (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_communicate_encodings) ... ok | |
| test_universal_newlines_communicate_input_none (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_communicate_input_none) ... ok | |
| test_universal_newlines_communicate_stdin (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_communicate_stdin) ... ok | |
| test_universal_newlines_communicate_stdin_stdout_stderr (test.test_subprocess.ProcessTestCaseNoPoll.test_universal_newlines_communicate_stdin_stdout_stderr) ... ok | |
| test_wait (test.test_subprocess.ProcessTestCaseNoPoll.test_wait) ... ok | |
| test_wait_negative_timeout (test.test_subprocess.ProcessTestCaseNoPoll.test_wait_negative_timeout) ... skipped 'need subprocess._winapi' | |
| test_wait_timeout (test.test_subprocess.ProcessTestCaseNoPoll.test_wait_timeout) ... ok | |
| test_win32_duplicate_envs (test.test_subprocess.ProcessTestCaseNoPoll.test_win32_duplicate_envs) ... skipped 'Windows only issue' | |
| test_win32_invalid_env (test.test_subprocess.ProcessTestCaseNoPoll.test_win32_invalid_env) ... skipped 'Windows only issue' | |
| test_writes_before_communicate (test.test_subprocess.ProcessTestCaseNoPoll.test_writes_before_communicate) ... ok | |
| test_capture_output (test.test_subprocess.RunFuncTestCase.test_capture_output) ... ok | |
| test_capture_stderr (test.test_subprocess.RunFuncTestCase.test_capture_stderr) ... ok | |
| test_capture_stdout (test.test_subprocess.RunFuncTestCase.test_capture_stdout) ... ok | |
| test_check (test.test_subprocess.RunFuncTestCase.test_check) ... ok | |
| test_check_output_input_arg (test.test_subprocess.RunFuncTestCase.test_check_output_input_arg) ... ok | |
| test_check_output_stdin_arg (test.test_subprocess.RunFuncTestCase.test_check_output_stdin_arg) ... ok | |
| test_check_output_stdin_with_input_arg (test.test_subprocess.RunFuncTestCase.test_check_output_stdin_with_input_arg) ... ok | |
| test_check_output_timeout (test.test_subprocess.RunFuncTestCase.test_check_output_timeout) ... ok | |
| test_check_zero (test.test_subprocess.RunFuncTestCase.test_check_zero) ... ok | |
| test_encoding_warning (test.test_subprocess.RunFuncTestCase.test_encoding_warning) ... ok | |
| test_returncode (test.test_subprocess.RunFuncTestCase.test_returncode) ... ok | |
| test_run_kwargs (test.test_subprocess.RunFuncTestCase.test_run_kwargs) ... ok | |
| test_run_with_an_empty_env (test.test_subprocess.RunFuncTestCase.test_run_with_an_empty_env) ... skipped 'Maybe test trigger a leak on Ubuntu' | |
| test_run_with_bytes_path_and_arguments (test.test_subprocess.RunFuncTestCase.test_run_with_bytes_path_and_arguments) ... ok | |
| test_run_with_pathlike_path (test.test_subprocess.RunFuncTestCase.test_run_with_pathlike_path) ... ok | |
| test_run_with_pathlike_path_and_arguments (test.test_subprocess.RunFuncTestCase.test_run_with_pathlike_path_and_arguments) ... ok | |
| test_run_with_shell_timeout_and_capture_output (test.test_subprocess.RunFuncTestCase.test_run_with_shell_timeout_and_capture_output) | |
| Output capturing after a timeout mustn't hang forever on open filehandles. ... ok | |
| test_stderr_with_capture_output_arg (test.test_subprocess.RunFuncTestCase.test_stderr_with_capture_output_arg) ... ok | |
| test_stdout_stdout (test.test_subprocess.RunFuncTestCase.test_stdout_stdout) ... ok | |
| test_stdout_with_capture_output_arg (test.test_subprocess.RunFuncTestCase.test_stdout_with_capture_output_arg) ... ok | |
| test_timeout (test.test_subprocess.RunFuncTestCase.test_timeout) ... ok | |
| test_call_string (test.test_subprocess.Win32ProcessTestCase.test_call_string) ... skipped 'Windows specific tests' | |
| test_close_fds (test.test_subprocess.Win32ProcessTestCase.test_close_fds) ... skipped 'Windows specific tests' | |
| test_close_fds_with_stdio (test.test_subprocess.Win32ProcessTestCase.test_close_fds_with_stdio) ... skipped 'Windows specific tests' | |
| test_creationflags (test.test_subprocess.Win32ProcessTestCase.test_creationflags) ... skipped 'Windows specific tests' | |
| test_empty_attribute_list (test.test_subprocess.Win32ProcessTestCase.test_empty_attribute_list) ... skipped 'Windows specific tests' | |
| test_empty_handle_list (test.test_subprocess.Win32ProcessTestCase.test_empty_handle_list) ... skipped 'Windows specific tests' | |
| test_invalid_args (test.test_subprocess.Win32ProcessTestCase.test_invalid_args) ... skipped 'Windows specific tests' | |
| test_issue31471 (test.test_subprocess.Win32ProcessTestCase.test_issue31471) ... skipped 'Windows specific tests' | |
| test_kill (test.test_subprocess.Win32ProcessTestCase.test_kill) ... skipped 'Windows specific tests' | |
| test_kill_dead (test.test_subprocess.Win32ProcessTestCase.test_kill_dead) ... skipped 'Windows specific tests' | |
| test_send_signal (test.test_subprocess.Win32ProcessTestCase.test_send_signal) ... skipped 'Windows specific tests' | |
| test_send_signal_dead (test.test_subprocess.Win32ProcessTestCase.test_send_signal_dead) ... skipped 'Windows specific tests' | |
| test_shell_encodings (test.test_subprocess.Win32ProcessTestCase.test_shell_encodings) ... skipped 'Windows specific tests' | |
| test_shell_sequence (test.test_subprocess.Win32ProcessTestCase.test_shell_sequence) ... skipped 'Windows specific tests' | |
| test_shell_string (test.test_subprocess.Win32ProcessTestCase.test_shell_string) ... skipped 'Windows specific tests' | |
| test_startupinfo (test.test_subprocess.Win32ProcessTestCase.test_startupinfo) ... skipped 'Windows specific tests' | |
| test_startupinfo_copy (test.test_subprocess.Win32ProcessTestCase.test_startupinfo_copy) ... skipped 'Windows specific tests' | |
| test_startupinfo_keywords (test.test_subprocess.Win32ProcessTestCase.test_startupinfo_keywords) ... skipped 'Windows specific tests' | |
| test_terminate (test.test_subprocess.Win32ProcessTestCase.test_terminate) ... skipped 'Windows specific tests' | |
| test_terminate_dead (test.test_subprocess.Win32ProcessTestCase.test_terminate_dead) ... skipped 'Windows specific tests' | |
| ====================================================================== | |
| FAIL: test_pass_fds_redirected (test.test_subprocess.POSIXProcessTestCase.test_pass_fds_redirected) | |
| Regression test for https://bugs.python.org/issue32270. | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_subprocess.py", line 3086, in test_pass_fds_redirected | |
| self.assertEqual(fds, {0, 1, 2} | frozenset(pass_fds), f"output={output!a}") | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: Items in the first set but not the second: | |
| 3 | |
| 4 | |
| 5 : output=b'0,1,2,3,4,5,6,7,9\n' | |
| ====================================================================== | |
| FAIL: test_pipe_cloexec (test.test_subprocess.POSIXProcessTestCase.test_pipe_cloexec) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_subprocess.py", line 2822, in test_pipe_cloexec | |
| self.assertFalse(result_fds & unwanted_fds, | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| "Expected no fds from %r to be open in child, " | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| "found %r" % | |
| ^^^^^^^^^^^^ | |
| (unwanted_fds, result_fds & unwanted_fds)) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: {8, 7} is not false : Expected no fds from {8, 10, 7} to be open in child, found {8, 7} | |
| ---------------------------------------------------------------------- | |
| Ran 343 tests in 81.944s | |
| FAILED (failures=2, skipped=42) | |
| test test_subprocess failed | |
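Both test_subprocess failures above have the same shape: the child process reports descriptors open that the assertion does not expect (3-5 in test_pass_fds_redirected, 7 and 8 in test_pipe_cloexec), i.e. more file descriptors survive into the child than the close_fds/cloexec handling should allow in this environment. The snippet below is a minimal sketch of that kind of check, not the CPython test code itself; it assumes a Linux-style /proc filesystem and simply asks a child interpreter which descriptors it actually has open.

import os
import subprocess
import sys

# The child lists its own open descriptors; the handle used to read
# /proc/self/fd also appears in the listing and should be discounted.
child_code = (
    "import os; "
    "print(sorted(int(fd) for fd in os.listdir('/proc/self/fd')))"
)

r, w = os.pipe()
try:
    proc = subprocess.run(
        [sys.executable, "-c", child_code],
        pass_fds=(w,),        # deliberately let one extra descriptor through
        close_fds=True,       # everything else should be closed for the child
        capture_output=True,
        text=True,
    )
finally:
    os.close(r)
    os.close(w)

# Expected entries: 0, 1, 2, the passed write end, and the /proc/self/fd handle.
# Any further entries were inherited from outside this snippet.
print(proc.stdout.strip())

Running such a probe in the same container as the failing run would show directly which extra descriptors the child inherits there, without involving the test harness.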
| 0:03:31 load avg: 8.54 [462/492/5] test_re passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 39 sec), test_posix (1 min 23 sec) | |
| 0:03:32 load avg: 8.54 [463/492/5] test_asyncgen passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 40 sec), test_posix (1 min 23 sec) | |
| 0:03:32 load avg: 8.54 [464/492/5] test_winconsoleio skipped -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 40 sec), test_posix (1 min 23 sec) | |
| test_winconsoleio skipped -- test only relevant on win32 | |
| 0:03:33 load avg: 8.54 [465/492/5] test_lzma passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 41 sec), test_posix (1 min 24 sec) | |
| 0:03:34 load avg: 8.54 [466/492/5] test_utf8_mode passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 42 sec), test_posix (1 min 25 sec) | |
| 0:03:34 load avg: 8.54 [467/492/5] test_compiler_assemble passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 42 sec), test_posix (1 min 26 sec) | |
| 0:03:34 load avg: 8.54 [468/492/5] test_traceback passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 42 sec), test_posix (1 min 26 sec) | |
| 0:03:34 load avg: 8.54 [469/492/5] test.test_asyncio.test_locks passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 42 sec), test_posix (1 min 26 sec) | |
| 0:03:34 load avg: 8.54 [470/492/5] test_with passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 43 sec), test_posix (1 min 26 sec) | |
| 0:03:35 load avg: 8.54 [471/492/5] test_filecmp passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 43 sec), test_posix (1 min 26 sec) | |
| 0:03:35 load avg: 8.54 [472/492/5] test_slice passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 43 sec), test_posix (1 min 27 sec) | |
| 0:03:35 load avg: 8.57 [473/492/5] test_int passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 43 sec), test_posix (1 min 27 sec) | |
| 0:03:35 load avg: 8.57 [474/492/5] test_grammar passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 43 sec), test_posix (1 min 27 sec) | |
| 0:03:35 load avg: 8.57 [475/492/5] test_imaplib passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 44 sec), test_posix (1 min 27 sec) | |
| 0:03:36 load avg: 8.57 [476/492/5] test_webbrowser passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 44 sec), test_posix (1 min 27 sec) | |
| 0:03:36 load avg: 8.57 [477/492/5] test_xml_dom_xmlbuilder passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 44 sec), test_posix (1 min 28 sec) | |
| 0:03:36 load avg: 8.57 [478/492/5] test_codecmaps_tw passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 44 sec), test_posix (1 min 28 sec) | |
| 0:03:37 load avg: 8.57 [479/492/5] test_bdb passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 45 sec), test_posix (1 min 28 sec) | |
| 0:03:37 load avg: 8.57 [480/492/5] test_ioctl passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 45 sec), test_posix (1 min 28 sec) | |
| 0:03:37 load avg: 8.57 [481/492/5] test_unicode_file passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 45 sec), test_posix (1 min 29 sec) | |
| 0:03:37 load avg: 8.57 [482/492/5] test_shlex passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 46 sec), test_posix (1 min 29 sec) | |
| 0:03:38 load avg: 8.57 [483/492/5] test_codecencodings_hk passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 46 sec), test_posix (1 min 29 sec) | |
| 0:03:38 load avg: 8.57 [484/492/5] test_tempfile passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 46 sec), test_posix (1 min 30 sec) | |
| 0:03:45 load avg: 8.21 [485/492/5] test_tarfile passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 53 sec), test_posix (1 min 36 sec) | |
| 0:03:46 load avg: 7.95 [486/492/5] test_pyrepl passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 54 sec), test_posix (1 min 38 sec) | |
| 0:03:48 load avg: 7.95 [487/492/6] test_cmd_line failed (4 failures) -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 56 sec), test_posix (1 min 39 sec) | |
| test_argv0_normalization (test.test_cmd_line.CmdLineTest.test_argv0_normalization) ... skipped 'bpo-32457 only applies on Windows' | |
| test_builtin_input (test.test_cmd_line.CmdLineTest.test_builtin_input) ... ok | |
| test_closed_stdout (test.test_cmd_line.CmdLineTest.test_closed_stdout) ... ok | |
| test_cmd_dedent (test.test_cmd_line.CmdLineTest.test_cmd_dedent) ... ok | |
| test_cmd_dedent_failcase (test.test_cmd_line.CmdLineTest.test_cmd_dedent_failcase) ... ok | |
| test_coding (test.test_cmd_line.CmdLineTest.test_coding) ... ok | |
| test_cpu_count (test.test_cmd_line.CmdLineTest.test_cpu_count) ... ok | |
| test_cpu_count_default (test.test_cmd_line.CmdLineTest.test_cpu_count_default) ... ok | |
| test_del___main__ (test.test_cmd_line.CmdLineTest.test_del___main__) ... ok | |
| test_directories (test.test_cmd_line.CmdLineTest.test_directories) ... ok | |
| test_disable_thread_local_bytecode (test.test_cmd_line.CmdLineTest.test_disable_thread_local_bytecode) ... skipped 'PYTHON_TLBC and -X tlbc only supported in Py_GIL_DISABLED builds' | |
| test_displayhook_unencodable (test.test_cmd_line.CmdLineTest.test_displayhook_unencodable) ... ok | |
| test_empty_PYTHONPATH_issue16309 (test.test_cmd_line.CmdLineTest.test_empty_PYTHONPATH_issue16309) ... ok | |
| test_enable_thread_local_bytecode (test.test_cmd_line.CmdLineTest.test_enable_thread_local_bytecode) ... skipped 'PYTHON_TLBC and -X tlbc only supported in Py_GIL_DISABLED builds' | |
| test_env_var_frozen_modules (test.test_cmd_line.CmdLineTest.test_env_var_frozen_modules) ... ok | |
| test_hash_randomization (test.test_cmd_line.CmdLineTest.test_hash_randomization) ... ok | |
| test_help (test.test_cmd_line.CmdLineTest.test_help) ... ok | |
| test_help_all (test.test_cmd_line.CmdLineTest.test_help_all) ... ok | |
| test_help_env (test.test_cmd_line.CmdLineTest.test_help_env) ... ok | |
| test_help_xoptions (test.test_cmd_line.CmdLineTest.test_help_xoptions) ... ok | |
| test_import_time (test.test_cmd_line.CmdLineTest.test_import_time) ... ok | |
| test_int_max_str_digits (test.test_cmd_line.CmdLineTest.test_int_max_str_digits) ... ok | |
| test_invalid_thread_local_bytecode (test.test_cmd_line.CmdLineTest.test_invalid_thread_local_bytecode) ... skipped 'PYTHON_TLBC and -X tlbc only supported in Py_GIL_DISABLED builds' | |
| test_invalid_utf8_arg (test.test_cmd_line.CmdLineTest.test_invalid_utf8_arg) ... ok | |
| test_isolatedmode (test.test_cmd_line.CmdLineTest.test_isolatedmode) ... ok | |
| test_large_PYTHONPATH (test.test_cmd_line.CmdLineTest.test_large_PYTHONPATH) ... ok | |
| test_no_std_streams (test.test_cmd_line.CmdLineTest.test_no_std_streams) ... FAIL | |
| test_no_stderr (test.test_cmd_line.CmdLineTest.test_no_stderr) ... FAIL | |
| test_no_stdin (test.test_cmd_line.CmdLineTest.test_no_stdin) ... FAIL | |
| test_no_stdout (test.test_cmd_line.CmdLineTest.test_no_stdout) ... FAIL | |
| test_non_ascii (test.test_cmd_line.CmdLineTest.test_non_ascii) ... ok | |
| test_non_interactive_output_buffering (test.test_cmd_line.CmdLineTest.test_non_interactive_output_buffering) ... ok | |
| test_optimize (test.test_cmd_line.CmdLineTest.test_optimize) ... ok | |
| test_osx_android_utf8 (test.test_cmd_line.CmdLineTest.test_osx_android_utf8) ... skipped 'test specific to Mac OS X and Android' | |
| test_output_newline (test.test_cmd_line.CmdLineTest.test_output_newline) ... ok | |
| test_parsing_error (test.test_cmd_line.CmdLineTest.test_parsing_error) ... ok | |
| test_python_asyncio_debug (test.test_cmd_line.CmdLineTest.test_python_asyncio_debug) ... ok | |
| test_python_basic_repl (test.test_cmd_line.CmdLineTest.test_python_basic_repl) ... ok | |
| test_python_dump_refs (test.test_cmd_line.CmdLineTest.test_python_dump_refs) ... skipped 'Requires --with-trace-refs build option' | |
| test_python_dump_refs_file (test.test_cmd_line.CmdLineTest.test_python_dump_refs_file) ... skipped 'Requires --with-trace-refs build option' | |
| test_python_executable (test.test_cmd_line.CmdLineTest.test_python_executable) ... skipped 'PYTHONEXECUTABLE only works on macOS' | |
| test_python_gil (test.test_cmd_line.CmdLineTest.test_python_gil) ... ok | |
| test_python_legacy_windows_fs_encoding (test.test_cmd_line.CmdLineTest.test_python_legacy_windows_fs_encoding) ... skipped 'Test only applicable on Windows' | |
| test_python_legacy_windows_stdio (test.test_cmd_line.CmdLineTest.test_python_legacy_windows_stdio) ... skipped 'Test only applicable on Windows' | |
| test_python_malloc_stats (test.test_cmd_line.CmdLineTest.test_python_malloc_stats) ... ok | |
| test_python_perf_jit_support (test.test_cmd_line.CmdLineTest.test_python_perf_jit_support) ... skipped 'Requires HAVE_PERF_TRAMPOLINE support' | |
| test_python_user_base (test.test_cmd_line.CmdLineTest.test_python_user_base) ... ok | |
| test_pythondevmode_env (test.test_cmd_line.CmdLineTest.test_pythondevmode_env) ... ok | |
| test_pythonmalloc (test.test_cmd_line.CmdLineTest.test_pythonmalloc) ... ok | |
| test_relativedir_bug46421 (test.test_cmd_line.CmdLineTest.test_relativedir_bug46421) ... ok | |
| test_run_code (test.test_cmd_line.CmdLineTest.test_run_code) ... ok | |
| test_run_module (test.test_cmd_line.CmdLineTest.test_run_module) ... ok | |
| test_run_module_bug1764407 (test.test_cmd_line.CmdLineTest.test_run_module_bug1764407) ... ok | |
| test_set_pycache_prefix (test.test_cmd_line.CmdLineTest.test_set_pycache_prefix) ... ok | |
| test_showrefcount (test.test_cmd_line.CmdLineTest.test_showrefcount) ... ok | |
| test_site_flag (test.test_cmd_line.CmdLineTest.test_site_flag) ... ok | |
| test_stdin_readline (test.test_cmd_line.CmdLineTest.test_stdin_readline) ... ok | |
| test_stdout_flush_at_shutdown (test.test_cmd_line.CmdLineTest.test_stdout_flush_at_shutdown) ... ok | |
| test_sys_flags_set (test.test_cmd_line.CmdLineTest.test_sys_flags_set) ... ok | |
| test_unbuffered_input (test.test_cmd_line.CmdLineTest.test_unbuffered_input) ... ok | |
| test_unbuffered_output (test.test_cmd_line.CmdLineTest.test_unbuffered_output) ... ok | |
| test_undecodable_code (test.test_cmd_line.CmdLineTest.test_undecodable_code) ... ok | |
| test_unknown_options (test.test_cmd_line.CmdLineTest.test_unknown_options) ... ok | |
| test_unmached_quote (test.test_cmd_line.CmdLineTest.test_unmached_quote) ... ok | |
| test_verbose (test.test_cmd_line.CmdLineTest.test_verbose) ... ok | |
| test_version (test.test_cmd_line.CmdLineTest.test_version) ... ok | |
| test_warnings_filter_precedence (test.test_cmd_line.CmdLineTest.test_warnings_filter_precedence) ... ok | |
| test_xdev (test.test_cmd_line.CmdLineTest.test_xdev) ... ok | |
| test_xoption_frozen_modules (test.test_cmd_line.CmdLineTest.test_xoption_frozen_modules) ... ok | |
| test_xoptions (test.test_cmd_line.CmdLineTest.test_xoptions) ... ok | |
| test_ignore_PYTHONHASHSEED (test.test_cmd_line.IgnoreEnvironmentTest.test_ignore_PYTHONHASHSEED) ... ok | |
| test_ignore_PYTHONPATH (test.test_cmd_line.IgnoreEnvironmentTest.test_ignore_PYTHONPATH) ... ok | |
| test_sys_flags_not_set (test.test_cmd_line.IgnoreEnvironmentTest.test_sys_flags_not_set) ... ok | |
| test_decoding_error_at_the_end_of_the_line (test.test_cmd_line.SyntaxErrorTests.test_decoding_error_at_the_end_of_the_line) ... ok | |
| test_tokenizer_error_with_stdin (test.test_cmd_line.SyntaxErrorTests.test_tokenizer_error_with_stdin) ... ok | |
| ====================================================================== | |
| FAIL: test_no_std_streams (test.test_cmd_line.CmdLineTest.test_no_std_streams) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 554, in test_no_std_streams | |
| self._test_no_stdio(['stdin', 'stdout', 'stderr']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stderr (test.test_cmd_line.CmdLineTest.test_no_stderr) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 551, in test_no_stderr | |
| self._test_no_stdio(['stderr']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stdin (test.test_cmd_line.CmdLineTest.test_no_stdin) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 545, in test_no_stdin | |
| self._test_no_stdio(['stdin']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stdout (test.test_cmd_line.CmdLineTest.test_no_stdout) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 548, in test_no_stdout | |
| self._test_no_stdio(['stdout']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ---------------------------------------------------------------------- | |
| Ran 75 tests in 19.880s | |
| FAILED (failures=4, skipped=11) | |
| test test_cmd_line failed | |
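All four test_cmd_line failures come from the same helper (_test_no_stdio, Lib/test/test_cmd_line.py line 542): a child interpreter is started with the named standard streams unavailable and is expected to finish with exit status 42, but here it finishes with status 1. The sketch below is a hypothetical, POSIX-only reproduction of that scenario, not the test's own helper: it closes the chosen descriptors in the child between fork() and exec() and reports the exit status of a script that simply calls sys.exit(42).

import os
import subprocess
import sys

def exit_status_without(streams):
    # Run `sys.exit(42)` in a child whose chosen standard streams are closed.
    fd_for = {"stdin": 0, "stdout": 1, "stderr": 2}

    def close_selected():
        # Executed in the child between fork() and exec(); POSIX only.
        for name in streams:
            os.close(fd_for[name])

    proc = subprocess.Popen(
        [sys.executable, "-c", "import sys; sys.exit(42)"],
        preexec_fn=close_selected,
    )
    return proc.wait()

# A healthy interpreter tolerates missing standard streams and still reports
# 42; any other status means it never got as far as running the -c code.
for combo in (["stdin"], ["stdout"], ["stderr"], ["stdin", "stdout", "stderr"]):
    print(combo, "->", exit_status_without(combo))

preexec_fn is used here only because it is the most direct way to drop descriptors right before exec on POSIX; the real helper's mechanics may differ from this sketch.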
| 0:03:51 load avg: 7.63 [488/492/6] test_compileall passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 59 sec), test_posix (1 min 42 sec) | |
| 0:03:51 load avg: 7.63 [489/492/6] test.test_multiprocessing_fork.test_threads passed -- running (2): test.test_multiprocessing_spawn.test_processes (1 min 59 sec), test_posix (1 min 42 sec) | |
| 0:04:00 load avg: 7.26 [490/492/6] test_threading passed -- running (2): test.test_multiprocessing_spawn.test_processes (2 min 8 sec), test_posix (1 min 51 sec) | |
| 0:04:25 load avg: 5.07 [491/492/7] test.test_multiprocessing_spawn.test_processes failed (1 error) (2 min 33 sec) -- running (1): test_posix (2 min 17 sec) | |
| test_array (test.test_multiprocessing_spawn.test_processes.WithProcessesTestArray.test_array) ... ok | |
| test_array_from_size (test.test_multiprocessing_spawn.test_processes.WithProcessesTestArray.test_array_from_size) ... ok | |
| test_getobj_getlock_obj (test.test_multiprocessing_spawn.test_processes.WithProcessesTestArray.test_getobj_getlock_obj) ... ok | |
| test_invalid_typecode (test.test_multiprocessing_spawn.test_processes.WithProcessesTestArray.test_invalid_typecode) ... ok | |
| test_rawarray (test.test_multiprocessing_spawn.test_processes.WithProcessesTestArray.test_rawarray) ... ok | |
| test_atexit (test.test_multiprocessing_spawn.test_processes.WithProcessesTestAtExit.test_atexit) ... ok | |
| test_abort (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_abort) | |
| Test that an abort will put the barrier in a broken state ... ok | |
| test_abort_and_reset (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_abort_and_reset) | |
| Test that a barrier can be reset after being broken. ... ok | |
| test_action (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_action) | |
| Test the 'action' callback ... ok | |
| test_barrier (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_barrier) | |
| Test that a barrier is passed in lockstep ... ok | |
| test_barrier_10 (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_barrier_10) | |
| Test that a barrier works for 10 consecutive runs ... ok | |
| test_default_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_default_timeout) | |
| Test the barrier's default timeout ... ok | |
| test_reset (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_reset) | |
| Test that a 'reset' on a barrier frees the waiting threads ... ok | |
| test_single_thread (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_single_thread) ... ok | |
| test_thousand (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_thousand) ... ok | |
| test_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_timeout) | |
| Test wait(timeout) ... ok | |
| test_wait_return (test.test_multiprocessing_spawn.test_processes.WithProcessesTestBarrier.test_wait_return) | |
| test the return value from barrier.wait ... ok | |
| test_notify (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_notify) ... ok | |
| test_notify_all (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_notify_all) ... ok | |
| test_notify_n (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_notify_n) ... ok | |
| test_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_timeout) ... ok | |
| test_wait_result (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_wait_result) ... ok | |
| test_waitfor (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_waitfor) ... ok | |
| test_waitfor_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestCondition.test_waitfor_timeout) ... ok | |
| test_connection (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_connection) ... ok | |
| test_context (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_context) ... ok | |
| test_duplex_false (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_duplex_false) ... ok | |
| test_fd_transfer (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_fd_transfer) ... ok | |
| test_large_fd_transfer (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_large_fd_transfer) ... ok | |
| test_missing_fd_transfer (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_missing_fd_transfer) ... ok | |
| test_sendbytes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_sendbytes) ... ok | |
| test_spawn_close (test.test_multiprocessing_spawn.test_processes.WithProcessesTestConnection.test_spawn_close) ... ok | |
| test_event (test.test_multiprocessing_spawn.test_processes.WithProcessesTestEvent.test_event) ... ok | |
| test_repr (test.test_multiprocessing_spawn.test_processes.WithProcessesTestEvent.test_repr) ... ok | |
| test_finalize (test.test_multiprocessing_spawn.test_processes.WithProcessesTestFinalize.test_finalize) ... ok | |
| test_thread_safety (test.test_multiprocessing_spawn.test_processes.WithProcessesTestFinalize.test_thread_safety) ... skipped "resource 'cpu' is not enabled" | |
| test_free_from_gc (test.test_multiprocessing_spawn.test_processes.WithProcessesTestHeap.test_free_from_gc) ... ok | |
| test_heap (test.test_multiprocessing_spawn.test_processes.WithProcessesTestHeap.test_heap) ... ok | |
| test_abstract_socket (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListener.test_abstract_socket) ... ok | |
| test_context (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListener.test_context) ... ok | |
| test_empty_authkey (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListener.test_empty_authkey) ... ok | |
| test_multiple_bind (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListener.test_multiple_bind) ... ok | |
| test_issue14725 (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListenerClient.test_issue14725) ... ok | |
| test_issue16955 (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListenerClient.test_issue16955) ... ok | |
| test_listener_client (test.test_multiprocessing_spawn.test_processes.WithProcessesTestListenerClient.test_listener_client) ... ok | |
| test_lock (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_lock) ... ok | |
| test_lock_context (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_lock_context) ... ok | |
| test_lock_locked_2processes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_lock_locked_2processes) ... ok | |
| test_repr_lock (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_repr_lock) ... ok | |
| test_repr_rlock (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_repr_rlock) ... ok | |
| test_rlock (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_rlock) ... ok | |
| test_rlock_context (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_rlock_context) ... ok | |
| test_rlock_locked_2processes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLock.test_rlock_locked_2processes) ... ok | |
| test_enable_logging (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLogging.test_enable_logging) ... ok | |
| test_filename (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLogging.test_filename) ... ok | |
| test_level (test.test_multiprocessing_spawn.test_processes.WithProcessesTestLogging.test_level) ... ok | |
| test_rapid_restart (test.test_multiprocessing_spawn.test_processes.WithProcessesTestManagerRestart.test_rapid_restart) ... ok | |
| test_access (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPicklingConnections.test_access) ... ok | |
| test_pickling (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPicklingConnections.test_pickling) ... ok | |
| test_boundaries (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoll.test_boundaries) ... ok | |
| test_dont_merge (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoll.test_dont_merge) ... ok | |
| test_empty_string (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoll.test_empty_string) ... ok | |
| test_strings (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoll.test_strings) ... ok | |
| test_poll_eintr (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPollEintr.test_poll_eintr) ... ok | |
| test_apply (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_apply) ... ok | |
| test_async (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_async) ... ok | |
| test_async_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_async_timeout) ... ok | |
| test_context (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_context) ... ok | |
| test_empty_iterable (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_empty_iterable) ... ok | |
| test_enter (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_enter) ... ok | |
| test_imap (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_imap) ... ok | |
| test_imap_handle_iterable_exception (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_imap_handle_iterable_exception) ... ok | |
| test_imap_unordered (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_imap_unordered) ... ok | |
| test_imap_unordered_handle_iterable_exception (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_imap_unordered_handle_iterable_exception) ... ok | |
| test_make_pool (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_make_pool) ... ok | |
| test_map (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map) ... ok | |
| test_map_async (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_async) ... ok | |
| test_map_async_callbacks (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_async_callbacks) ... ok | |
| test_map_chunksize (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_chunksize) ... ok | |
| test_map_handle_iterable_exception (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_handle_iterable_exception) ... ok | |
| test_map_no_failfast (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_no_failfast) ... ok | |
| test_map_unplicklable (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_map_unplicklable) ... ok | |
| test_release_task_refs (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_release_task_refs) ... ok | |
| test_resource_warning (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_resource_warning) ... ok | |
| test_starmap (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_starmap) ... ok | |
| test_starmap_async (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_starmap_async) ... ok | |
| test_terminate (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_terminate) ... ok | |
| test_traceback (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_traceback) ... ok | |
| test_wrapped_exception (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPool.test_wrapped_exception) ... ok | |
| test_async_error_callback (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerErrors.test_async_error_callback) ... ok | |
| test_unpickleable_result (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerErrors.test_unpickleable_result) ... ok | |
| test_pool_maxtasksperchild_invalid (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerLifetime.test_pool_maxtasksperchild_invalid) ... ok | |
| test_pool_worker_lifetime (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerLifetime.test_pool_worker_lifetime) ... ok | |
| test_pool_worker_lifetime_early_close (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerLifetime.test_pool_worker_lifetime_early_close) ... ok | |
| test_worker_finalization_via_atexit_handler_of_multiprocessing (test.test_multiprocessing_spawn.test_processes.WithProcessesTestPoolWorkerLifetime.test_worker_finalization_via_atexit_handler_of_multiprocessing) ... ok | |
| test_active_children (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_active_children) ... ok | |
| test_args_argument (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_args_argument) ... skipped "resource 'cpu' is not enabled" | |
| test_child_fd_inflation (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_child_fd_inflation) ... ok | |
| test_close (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_close) ... ok | |
| test_cpu_count (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_cpu_count) ... ok | |
| test_current (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_current) ... ok | |
| test_daemon_argument (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_daemon_argument) ... ok | |
| test_error_on_stdio_flush_1 (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_error_on_stdio_flush_1) ... ok | |
| test_error_on_stdio_flush_2 (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_error_on_stdio_flush_2) ... ok | |
| test_forkserver_auth_is_enabled (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_forkserver_auth_is_enabled) ... skipped 'forkserver start method specific' | |
| test_forkserver_sigint (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_forkserver_sigint) ... skipped 'test not appropriate for spawn' | |
| test_forkserver_sigkill (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_forkserver_sigkill) ... skipped 'test not appropriate for spawn' | |
| test_forkserver_without_auth_fails (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_forkserver_without_auth_fails) ... skipped 'forkserver start method specific' | |
| test_interrupt (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_interrupt) ... ERROR | |
| test_interrupt_no_handler (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_interrupt_no_handler) ... ok | |
| test_kill (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_kill) ... ok | |
| test_lose_target_ref (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_lose_target_ref) ... ok | |
| test_many_processes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_many_processes) ... ok | |
| test_parent_process (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_parent_process) ... ok | |
| test_parent_process_attributes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_parent_process_attributes) ... ok | |
| test_process (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_process) ... ok | |
| test_process_mainthread_native_id (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_process_mainthread_native_id) ... ok | |
| test_recursion (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_recursion) ... ok | |
| test_sentinel (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_sentinel) ... ok | |
| test_set_executable (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_set_executable) ... ok | |
| test_terminate (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_terminate) ... ok | |
| test_wait_for_threads (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_wait_for_threads) ... ok | |
| Warning -- Dangling processes: {<Process name='Process-152' pid=15663 parent=11526 started daemon>} | |
| test_closed_queue_empty_exceptions (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_closed_queue_empty_exceptions) ... ok | |
| test_closed_queue_put_get_exceptions (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_closed_queue_put_get_exceptions) ... ok | |
| test_fork (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_fork) ... ok | |
| test_get (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_get) ... ok | |
| test_no_import_lock_contention (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_no_import_lock_contention) ... ok | |
| test_put (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_put) ... ok | |
| test_qsize (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_qsize) ... ok | |
| test_queue_feeder_donot_stop_onexc (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_queue_feeder_donot_stop_onexc) ... ok | |
| test_queue_feeder_on_queue_feeder_error (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_queue_feeder_on_queue_feeder_error) ... ok | |
| test_task_done (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_task_done) ... ok | |
| test_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestQueue.test_timeout) ... ok | |
| test_bounded_semaphore (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSemaphore.test_bounded_semaphore) ... ok | |
| test_semaphore (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSemaphore.test_semaphore) ... ok | |
| test_timeout (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSemaphore.test_timeout) ... ok | |
| test_copy (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedCTypes.test_copy) ... ok | |
| test_sharedctypes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedCTypes.test_sharedctypes) ... ok | |
| test_synchronize (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedCTypes.test_synchronize) ... ok | |
| test_invalid_shared_memory_creation (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_invalid_shared_memory_creation) ... ok | |
| test_shared_memory_ShareableList_basics (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_ShareableList_basics) ... ok | |
| test_shared_memory_ShareableList_pickling (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_ShareableList_pickling) ... ok | |
| test_shared_memory_ShareableList_pickling_dead_object (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_ShareableList_pickling_dead_object) ... ok | |
| test_shared_memory_SharedMemoryManager_basics (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_SharedMemoryManager_basics) ... ok | |
| test_shared_memory_SharedMemoryManager_reuses_resource_tracker (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_SharedMemoryManager_reuses_resource_tracker) ... ok | |
| test_shared_memory_SharedMemoryServer_ignores_sigint (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_SharedMemoryServer_ignores_sigint) ... ok | |
| test_shared_memory_across_processes (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_across_processes) ... ok | |
| test_shared_memory_basics (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_basics) ... ok | |
| test_shared_memory_cleaned_after_process_termination (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_cleaned_after_process_termination) ... ok | |
| test_shared_memory_name_with_embedded_null (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_name_with_embedded_null) ... ok | |
| test_shared_memory_pickle_unpickle (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_pickle_unpickle) ... ok | |
| test_shared_memory_pickle_unpickle_dead_object (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_pickle_unpickle_dead_object) ... ok | |
| test_shared_memory_recreate (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_recreate) ... ok | |
| test_shared_memory_tracking (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_tracking) ... ok | |
| test_shared_memory_untracking (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSharedMemory.test_shared_memory_untracking) ... ok | |
| test_child_sys_path (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSpawnedSysPath.test_child_sys_path) ... ok | |
| test_forkserver_preload_imports_sys_path (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSpawnedSysPath.test_forkserver_preload_imports_sys_path) ... skipped 'forkserver specific test.' | |
| test_std_streams_flushed_after_preload (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSpawnedSysPath.test_std_streams_flushed_after_preload) ... skipped 'forkserver specific test' | |
| test_stderr_flush (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSubclassingProcess.test_stderr_flush) ... ok | |
| test_subclassing (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSubclassingProcess.test_subclassing) ... ok | |
| test_sys_exit (test.test_multiprocessing_spawn.test_processes.WithProcessesTestSubclassingProcess.test_sys_exit) ... ok | |
| test_getobj_getlock (test.test_multiprocessing_spawn.test_processes.WithProcessesTestValue.test_getobj_getlock) ... ok | |
| test_invalid_typecode (test.test_multiprocessing_spawn.test_processes.WithProcessesTestValue.test_invalid_typecode) ... ok | |
| test_rawvalue (test.test_multiprocessing_spawn.test_processes.WithProcessesTestValue.test_rawvalue) ... ok | |
| test_value (test.test_multiprocessing_spawn.test_processes.WithProcessesTestValue.test_value) ... ok | |
| Warning -- Dangling processes: {<Process name='Process-152' pid=15663 parent=11526 started daemon>} | |
| Process Process-152: | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 320, in _bootstrap | |
| self.run() | |
| ~~~~~~~~^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 108, in run | |
| self._target(*self._args, **self._kwargs) | |
| ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 524, in _sleep_some_event | |
| time.sleep(100) | |
| ~~~~~~~~~~^^^^^ | |
| KeyboardInterrupt | |
| Warning -- reap_children() reaped child process 15663 | |
| ====================================================================== | |
| ERROR: test_interrupt (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_interrupt) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 596, in test_interrupt | |
| exitcode = self._kill_process(multiprocessing.Process.interrupt) | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 577, in _kill_process | |
| self.assertEqual(join(), None) | |
| ~~~~^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 250, in __call__ | |
| return self.func(*args, **kwds) | |
| ~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 156, in join | |
| res = self._popen.wait(timeout) | |
| File "/home/buildbot/cpython/Lib/multiprocessing/popen_fork.py", line 44, in wait | |
| return self.poll(os.WNOHANG if timeout == 0.0 else 0) | |
| ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/popen_fork.py", line 28, in poll | |
| pid, sts = os.waitpid(self.pid, flag) | |
| ~~~~~~~~~~^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 573, in handler | |
| raise RuntimeError('join took too long: %s' % p) | |
| RuntimeError: join took too long: <Process name='Process-152' pid=15663 parent=11526 started daemon> | |
| ---------------------------------------------------------------------- | |
| Ran 165 tests in 152.999s | |
| FAILED (errors=1, skipped=8) | |
| test test.test_multiprocessing_spawn.test_processes failed | |
| 0:04:55 load avg: 3.53 running (1): test_posix (2 min 47 sec) | |
| 0:05:25 load avg: 2.51 running (1): test_posix (3 min 17 sec) | |
| 0:05:55 load avg: 1.79 running (1): test_posix (3 min 47 sec) | |
| 0:06:25 load avg: 1.92 running (1): test_posix (4 min 17 sec) | |
| 0:06:55 load avg: 1.61 running (1): test_posix (4 min 47 sec) | |
| 0:07:15 load avg: 1.52 [492/492/8] test_posix failed (2 errors, 1 failure) (5 min 6 sec) | |
| test_unshare_setns (test.test_posix.NamespacesTests.test_unshare_setns) ... ok | |
| test_initgroups (test.test_posix.PosixGroupsTester.test_initgroups) ... skipped 'not enough privileges' | |
| test_setgroups (test.test_posix.PosixGroupsTester.test_setgroups) ... skipped 'not enough privileges' | |
| testNoArgFunctions (test.test_posix.PosixTester.testNoArgFunctions) ... ok | |
| test_access (test.test_posix.PosixTester.test_access) ... ok | |
| test_chdir (test.test_posix.PosixTester.test_chdir) ... ok | |
| test_chflags (test.test_posix.PosixTester.test_chflags) ... skipped 'test needs os.chflags()' | |
| test_chmod_dir (test.test_posix.PosixTester.test_chmod_dir) ... ok | |
| test_chmod_dir_symlink (test.test_posix.PosixTester.test_chmod_dir_symlink) ... ok | |
| test_chmod_file (test.test_posix.PosixTester.test_chmod_file) ... ok | |
| test_chmod_file_symlink (test.test_posix.PosixTester.test_chmod_file_symlink) ... ok | |
| test_chown (test.test_posix.PosixTester.test_chown) ... ok | |
| test_cld_xxxx_constants (test.test_posix.PosixTester.test_cld_xxxx_constants) ... ok | |
| test_confstr (test.test_posix.PosixTester.test_confstr) ... ok | |
| test_dup (test.test_posix.PosixTester.test_dup) ... ok | |
| test_dup2 (test.test_posix.PosixTester.test_dup2) ... ok | |
| test_environ (test.test_posix.PosixTester.test_environ) ... ok | |
| test_fchmod_file (test.test_posix.PosixTester.test_fchmod_file) ... ok | |
| test_fchown (test.test_posix.PosixTester.test_fchown) ... ok | |
| test_fexecve (test.test_posix.PosixTester.test_fexecve) ... ERROR | |
| test_fs_holes (test.test_posix.PosixTester.test_fs_holes) ... ok | |
| test_fstat (test.test_posix.PosixTester.test_fstat) ... ok | |
| test_fstatvfs (test.test_posix.PosixTester.test_fstatvfs) ... ok | |
| test_ftruncate (test.test_posix.PosixTester.test_ftruncate) ... ok | |
| test_get_and_set_scheduler_and_param (test.test_posix.PosixTester.test_get_and_set_scheduler_and_param) ... ok | |
| test_getcwd_long_pathnames (test.test_posix.PosixTester.test_getcwd_long_pathnames) ... ok | |
| test_getgrouplist (test.test_posix.PosixTester.test_getgrouplist) ... ok | |
| test_getgroups (test.test_posix.PosixTester.test_getgroups) ... ok | |
| test_getresgid (test.test_posix.PosixTester.test_getresgid) ... ok | |
| test_getresuid (test.test_posix.PosixTester.test_getresuid) ... ok | |
| test_initgroups (test.test_posix.PosixTester.test_initgroups) ... ok | |
| test_lchflags_regular_file (test.test_posix.PosixTester.test_lchflags_regular_file) ... skipped 'test needs os.lchflags()' | |
| test_lchflags_symlink (test.test_posix.PosixTester.test_lchflags_symlink) ... skipped 'test needs os.lchflags()' | |
| test_lchmod_dir (test.test_posix.PosixTester.test_lchmod_dir) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_dir_symlink (test.test_posix.PosixTester.test_lchmod_dir_symlink) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_file (test.test_posix.PosixTester.test_lchmod_file) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_file_symlink (test.test_posix.PosixTester.test_lchmod_file_symlink) ... skipped 'test needs os.lchmod()' | |
| test_lchown (test.test_posix.PosixTester.test_lchown) ... ok | |
| test_link_follow_symlinks (test.test_posix.PosixTester.test_link_follow_symlinks) ... ok | |
| test_listdir (test.test_posix.PosixTester.test_listdir) ... ok | |
| test_listdir_bytes (test.test_posix.PosixTester.test_listdir_bytes) ... ok | |
| test_listdir_bytes_like (test.test_posix.PosixTester.test_listdir_bytes_like) ... ok | |
| test_listdir_default (test.test_posix.PosixTester.test_listdir_default) ... ok | |
| test_listdir_fd (test.test_posix.PosixTester.test_listdir_fd) ... ok | |
| test_lockf (test.test_posix.PosixTester.test_lockf) ... ok | |
| test_makedev (test.test_posix.PosixTester.test_makedev) ... ok | |
| test_mkfifo (test.test_posix.PosixTester.test_mkfifo) ... ok | |
| test_mknod (test.test_posix.PosixTester.test_mknod) ... ok | |
| test_oscloexec (test.test_posix.PosixTester.test_oscloexec) ... ok | |
| test_osexlock (test.test_posix.PosixTester.test_osexlock) ... skipped 'test needs posix.O_EXLOCK' | |
| test_osshlock (test.test_posix.PosixTester.test_osshlock) ... skipped 'test needs posix.O_SHLOCK' | |
| test_path_error2 (test.test_posix.PosixTester.test_path_error2) | |
| Test functions that call path_error2(), providing two filenames in their exceptions. ... ok | |
| test_path_with_null_byte (test.test_posix.PosixTester.test_path_with_null_byte) ... ok | |
| test_path_with_null_character (test.test_posix.PosixTester.test_path_with_null_character) ... ok | |
| test_pidfd_open (test.test_posix.PosixTester.test_pidfd_open) ... skipped 'pidfd_open unavailable' | |
| test_pipe (test.test_posix.PosixTester.test_pipe) ... ok | |
| test_pipe2 (test.test_posix.PosixTester.test_pipe2) ... ok | |
| test_pipe2_c_limits (test.test_posix.PosixTester.test_pipe2_c_limits) ... ok | |
| test_posix_fadvise (test.test_posix.PosixTester.test_posix_fadvise) ... ok | |
| test_posix_fadvise_errno (test.test_posix.PosixTester.test_posix_fadvise_errno) ... ok | |
| test_posix_fallocate (test.test_posix.PosixTester.test_posix_fallocate) ... ok | |
| test_posix_fallocate_errno (test.test_posix.PosixTester.test_posix_fallocate_errno) ... ok | |
| test_pread (test.test_posix.PosixTester.test_pread) ... ok | |
| test_preadv (test.test_posix.PosixTester.test_preadv) ... ok | |
| test_preadv_flags (test.test_posix.PosixTester.test_preadv_flags) ... ok | |
| test_preadv_overflow_32bits (test.test_posix.PosixTester.test_preadv_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_putenv (test.test_posix.PosixTester.test_putenv) ... ok | |
| test_pwrite (test.test_posix.PosixTester.test_pwrite) ... ok | |
| test_pwritev (test.test_posix.PosixTester.test_pwritev) ... ok | |
| test_pwritev_flags (test.test_posix.PosixTester.test_pwritev_flags) ... skipped 'test needs os.RWF_SYNC' | |
| test_pwritev_overflow_32bits (test.test_posix.PosixTester.test_pwritev_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_readv (test.test_posix.PosixTester.test_readv) ... ok | |
| test_readv_overflow_32bits (test.test_posix.PosixTester.test_readv_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_register_at_fork (test.test_posix.PosixTester.test_register_at_fork) ... ok | |
| test_rtld_constants (test.test_posix.PosixTester.test_rtld_constants) ... ok | |
| test_sched_getaffinity (test.test_posix.PosixTester.test_sched_getaffinity) ... ok | |
| test_sched_param (test.test_posix.PosixTester.test_sched_param) ... ok | |
| test_sched_priority (test.test_posix.PosixTester.test_sched_priority) ... ok | |
| test_sched_rr_get_interval (test.test_posix.PosixTester.test_sched_rr_get_interval) ... ok | |
| test_sched_setaffinity (test.test_posix.PosixTester.test_sched_setaffinity) ... ok | |
| test_sched_yield (test.test_posix.PosixTester.test_sched_yield) ... ok | |
| test_setresgid (test.test_posix.PosixTester.test_setresgid) ... ok | |
| test_setresgid_exception (test.test_posix.PosixTester.test_setresgid_exception) ... ok | |
| test_setresuid (test.test_posix.PosixTester.test_setresuid) ... ok | |
| test_setresuid_exception (test.test_posix.PosixTester.test_setresuid_exception) ... ok | |
| test_stat (test.test_posix.PosixTester.test_stat) ... ok | |
| test_statvfs (test.test_posix.PosixTester.test_statvfs) ... ok | |
| test_strerror (test.test_posix.PosixTester.test_strerror) ... ok | |
| test_sysconf (test.test_posix.PosixTester.test_sysconf) ... ok | |
| test_truncate (test.test_posix.PosixTester.test_truncate) ... ok | |
| test_umask (test.test_posix.PosixTester.test_umask) ... ok | |
| test_utime (test.test_posix.PosixTester.test_utime) ... ok | |
| test_utime_nofollow_symlinks (test.test_posix.PosixTester.test_utime_nofollow_symlinks) ... ok | |
| test_utime_with_fd (test.test_posix.PosixTester.test_utime_with_fd) ... ok | |
| test_waitid (test.test_posix.PosixTester.test_waitid) ... ok | |
| test_writev (test.test_posix.PosixTester.test_writev) ... ok | |
| test_writev_overflow_32bits (test.test_posix.PosixTester.test_writev_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_access_dir_fd (test.test_posix.TestPosixDirFd.test_access_dir_fd) ... ok | |
| test_chmod_dir_fd (test.test_posix.TestPosixDirFd.test_chmod_dir_fd) ... ok | |
| test_chown_dir_fd (test.test_posix.TestPosixDirFd.test_chown_dir_fd) ... ok | |
| test_link_dir_fd (test.test_posix.TestPosixDirFd.test_link_dir_fd) ... ok | |
| test_mkdir_dir_fd (test.test_posix.TestPosixDirFd.test_mkdir_dir_fd) ... ok | |
| test_mkfifo_dir_fd (test.test_posix.TestPosixDirFd.test_mkfifo_dir_fd) ... ok | |
| test_mknod_dir_fd (test.test_posix.TestPosixDirFd.test_mknod_dir_fd) ... ok | |
| test_open_dir_fd (test.test_posix.TestPosixDirFd.test_open_dir_fd) ... ok | |
| test_readlink_dir_fd (test.test_posix.TestPosixDirFd.test_readlink_dir_fd) ... ok | |
| test_rename_dir_fd (test.test_posix.TestPosixDirFd.test_rename_dir_fd) ... ok | |
| test_stat_dir_fd (test.test_posix.TestPosixDirFd.test_stat_dir_fd) ... ok | |
| test_symlink_dir_fd (test.test_posix.TestPosixDirFd.test_symlink_dir_fd) ... ok | |
| test_unlink_dir_fd (test.test_posix.TestPosixDirFd.test_unlink_dir_fd) ... ok | |
| test_utime_dir_fd (test.test_posix.TestPosixDirFd.test_utime_dir_fd) ... ok | |
| test_bad_file_actions (test.test_posix.TestPosixSpawn.test_bad_file_actions) ... ok | |
| test_close_file (test.test_posix.TestPosixSpawn.test_close_file) ... ERROR | |
| test_dup2 (test.test_posix.TestPosixSpawn.test_dup2) ... ok | |
| test_empty_file_actions (test.test_posix.TestPosixSpawn.test_empty_file_actions) ... ok | |
| test_multiple_file_actions (test.test_posix.TestPosixSpawn.test_multiple_file_actions) ... ok | |
| test_no_such_executable (test.test_posix.TestPosixSpawn.test_no_such_executable) ... ok | |
| test_none_file_actions (test.test_posix.TestPosixSpawn.test_none_file_actions) ... ok | |
| test_open_file (test.test_posix.TestPosixSpawn.test_open_file) ... ok | |
| test_resetids (test.test_posix.TestPosixSpawn.test_resetids) ... ok | |
| test_resetids_explicit_default (test.test_posix.TestPosixSpawn.test_resetids_explicit_default) ... ok | |
| test_returns_pid (test.test_posix.TestPosixSpawn.test_returns_pid) ... ok | |
| test_setpgroup (test.test_posix.TestPosixSpawn.test_setpgroup) ... ok | |
| test_setpgroup_wrong_type (test.test_posix.TestPosixSpawn.test_setpgroup_wrong_type) ... ok | |
| test_setscheduler_only_param (test.test_posix.TestPosixSpawn.test_setscheduler_only_param) ... ok | |
| test_setscheduler_with_policy (test.test_posix.TestPosixSpawn.test_setscheduler_with_policy) ... ok | |
| test_setsid (test.test_posix.TestPosixSpawn.test_setsid) ... ok | |
| test_setsigdef (test.test_posix.TestPosixSpawn.test_setsigdef) ... ok | |
| test_setsigdef_wrong_type (test.test_posix.TestPosixSpawn.test_setsigdef_wrong_type) ... ok | |
| test_setsigmask (test.test_posix.TestPosixSpawn.test_setsigmask) ... ok | |
| test_setsigmask_wrong_type (test.test_posix.TestPosixSpawn.test_setsigmask_wrong_type) ... ok | |
| test_specify_environment (test.test_posix.TestPosixSpawn.test_specify_environment) ... ok | |
| test_bad_file_actions (test.test_posix.TestPosixSpawnP.test_bad_file_actions) ... ok | |
| test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) ... ERROR | |
| test_dup2 (test.test_posix.TestPosixSpawnP.test_dup2) ... ok | |
| test_empty_file_actions (test.test_posix.TestPosixSpawnP.test_empty_file_actions) ... ok | |
| test_multiple_file_actions (test.test_posix.TestPosixSpawnP.test_multiple_file_actions) ... ok | |
| test_no_such_executable (test.test_posix.TestPosixSpawnP.test_no_such_executable) ... ok | |
| test_none_file_actions (test.test_posix.TestPosixSpawnP.test_none_file_actions) ... ok | |
| test_open_file (test.test_posix.TestPosixSpawnP.test_open_file) ... ok | |
| test_posix_spawnp (test.test_posix.TestPosixSpawnP.test_posix_spawnp) ... ok | |
| test_resetids (test.test_posix.TestPosixSpawnP.test_resetids) ... ok | |
| test_resetids_explicit_default (test.test_posix.TestPosixSpawnP.test_resetids_explicit_default) ... ok | |
| test_returns_pid (test.test_posix.TestPosixSpawnP.test_returns_pid) ... ok | |
| test_setpgroup (test.test_posix.TestPosixSpawnP.test_setpgroup) ... ok | |
| test_setpgroup_wrong_type (test.test_posix.TestPosixSpawnP.test_setpgroup_wrong_type) ... ok | |
| test_setscheduler_only_param (test.test_posix.TestPosixSpawnP.test_setscheduler_only_param) ... ok | |
| test_setscheduler_with_policy (test.test_posix.TestPosixSpawnP.test_setscheduler_with_policy) ... ok | |
| test_setsid (test.test_posix.TestPosixSpawnP.test_setsid) ... ok | |
| test_setsigdef (test.test_posix.TestPosixSpawnP.test_setsigdef) ... ok | |
| test_setsigdef_wrong_type (test.test_posix.TestPosixSpawnP.test_setsigdef_wrong_type) ... ok | |
| test_setsigmask (test.test_posix.TestPosixSpawnP.test_setsigmask) ... ok | |
| test_setsigmask_wrong_type (test.test_posix.TestPosixSpawnP.test_setsigmask_wrong_type) ... ok | |
| test_specify_environment (test.test_posix.TestPosixSpawnP.test_specify_environment) ... ok | |
| test_access (test.test_posix.TestPosixWeaklinking.test_access) ... skipped 'test weak linking on macOS' | |
| test_chmod (test.test_posix.TestPosixWeaklinking.test_chmod) ... skipped 'test weak linking on macOS' | |
| test_chown (test.test_posix.TestPosixWeaklinking.test_chown) ... skipped 'test weak linking on macOS' | |
| test_link (test.test_posix.TestPosixWeaklinking.test_link) ... skipped 'test weak linking on macOS' | |
| test_listdir_scandir (test.test_posix.TestPosixWeaklinking.test_listdir_scandir) ... skipped 'test weak linking on macOS' | |
| test_mkdir (test.test_posix.TestPosixWeaklinking.test_mkdir) ... skipped 'test weak linking on macOS' | |
| test_mkfifo (test.test_posix.TestPosixWeaklinking.test_mkfifo) ... skipped 'test weak linking on macOS' | |
| test_mknod (test.test_posix.TestPosixWeaklinking.test_mknod) ... skipped 'test weak linking on macOS' | |
| test_open (test.test_posix.TestPosixWeaklinking.test_open) ... skipped 'test weak linking on macOS' | |
| test_ptsname_r (test.test_posix.TestPosixWeaklinking.test_ptsname_r) ... skipped 'test weak linking on macOS' | |
| test_pwritev (test.test_posix.TestPosixWeaklinking.test_pwritev) ... skipped 'test weak linking on macOS' | |
| test_readlink (test.test_posix.TestPosixWeaklinking.test_readlink) ... skipped 'test weak linking on macOS' | |
| test_rename_replace (test.test_posix.TestPosixWeaklinking.test_rename_replace) ... skipped 'test weak linking on macOS' | |
| test_stat (test.test_posix.TestPosixWeaklinking.test_stat) ... skipped 'test weak linking on macOS' | |
| test_symlink (test.test_posix.TestPosixWeaklinking.test_symlink) ... skipped 'test weak linking on macOS' | |
| test_unlink_rmdir (test.test_posix.TestPosixWeaklinking.test_unlink_rmdir) ... skipped 'test weak linking on macOS' | |
| test_utime (test.test_posix.TestPosixWeaklinking.test_utime) ... skipped 'test weak linking on macOS' | |
| ====================================================================== | |
| ERROR: test_fexecve (test.test_posix.PosixTester.test_fexecve) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 201, in test_fexecve | |
| posix.execve(fp, [sys.executable, '-c', 'pass'], os.environ) | |
| ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: 6 | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawn.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_12328_tmpæ' | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_12328_tmpæ' | |
| ---------------------------------------------------------------------- | |
| Ran 171 tests in 7.592s | |
| FAILED (errors=3, skipped=34) | |
| Warning -- cwd was modified by test_posix | |
| Warning -- Before: /home/buildbot/cpython/build/test_python_12328æ | |
| Warning -- After: /home/buildbot/cpython | |
| Warning -- files was modified by test_posix | |
| Warning -- Before: [] | |
| Warning -- After: ['@test_12328_tmpæ'] | |
| test test_posix failed | |
| FAIL | |
| test_fs_holes (test.test_posix.PosixTester.test_fs_holes) ... ok | |
| test_fstat (test.test_posix.PosixTester.test_fstat) ... ok | |
| test_fstatvfs (test.test_posix.PosixTester.test_fstatvfs) ... ok | |
| test_ftruncate (test.test_posix.PosixTester.test_ftruncate) ... ok | |
| test_get_and_set_scheduler_and_param (test.test_posix.PosixTester.test_get_and_set_scheduler_and_param) ... ok | |
| test_getcwd_long_pathnames (test.test_posix.PosixTester.test_getcwd_long_pathnames) ... ok | |
| test_getgrouplist (test.test_posix.PosixTester.test_getgrouplist) ... ok | |
| test_getgroups (test.test_posix.PosixTester.test_getgroups) ... ok | |
| test_getresgid (test.test_posix.PosixTester.test_getresgid) ... ok | |
| test_getresuid (test.test_posix.PosixTester.test_getresuid) ... ok | |
| test_initgroups (test.test_posix.PosixTester.test_initgroups) ... ok | |
| test_lchflags_regular_file (test.test_posix.PosixTester.test_lchflags_regular_file) ... skipped 'test needs os.lchflags()' | |
| test_lchflags_symlink (test.test_posix.PosixTester.test_lchflags_symlink) ... skipped 'test needs os.lchflags()' | |
| test_lchmod_dir (test.test_posix.PosixTester.test_lchmod_dir) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_dir_symlink (test.test_posix.PosixTester.test_lchmod_dir_symlink) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_file (test.test_posix.PosixTester.test_lchmod_file) ... skipped 'test needs os.lchmod()' | |
| test_lchmod_file_symlink (test.test_posix.PosixTester.test_lchmod_file_symlink) ... skipped 'test needs os.lchmod()' | |
| test_lchown (test.test_posix.PosixTester.test_lchown) ... ok | |
| test_link_follow_symlinks (test.test_posix.PosixTester.test_link_follow_symlinks) ... ok | |
| test_listdir (test.test_posix.PosixTester.test_listdir) ... ok | |
| test_listdir_bytes (test.test_posix.PosixTester.test_listdir_bytes) ... ok | |
| test_listdir_bytes_like (test.test_posix.PosixTester.test_listdir_bytes_like) ... ok | |
| test_listdir_default (test.test_posix.PosixTester.test_listdir_default) ... ok | |
| test_listdir_fd (test.test_posix.PosixTester.test_listdir_fd) ... ok | |
| test_lockf (test.test_posix.PosixTester.test_lockf) ... ok | |
| test_makedev (test.test_posix.PosixTester.test_makedev) ... ok | |
| test_mkfifo (test.test_posix.PosixTester.test_mkfifo) ... ok | |
| test_mknod (test.test_posix.PosixTester.test_mknod) ... ok | |
| test_oscloexec (test.test_posix.PosixTester.test_oscloexec) ... ok | |
| test_osexlock (test.test_posix.PosixTester.test_osexlock) ... skipped 'test needs posix.O_EXLOCK' | |
| test_osshlock (test.test_posix.PosixTester.test_osshlock) ... skipped 'test needs posix.O_SHLOCK' | |
| test_path_error2 (test.test_posix.PosixTester.test_path_error2) | |
| Test functions that call path_error2(), providing two filenames in their exceptions. ... ok | |
| test_path_with_null_byte (test.test_posix.PosixTester.test_path_with_null_byte) ... ok | |
| test_path_with_null_character (test.test_posix.PosixTester.test_path_with_null_character) ... ok | |
| test_pidfd_open (test.test_posix.PosixTester.test_pidfd_open) ... skipped 'pidfd_open unavailable' | |
| test_pipe (test.test_posix.PosixTester.test_pipe) ... ok | |
| test_pipe2 (test.test_posix.PosixTester.test_pipe2) ... ok | |
| test_pipe2_c_limits (test.test_posix.PosixTester.test_pipe2_c_limits) ... ok | |
| test_posix_fadvise (test.test_posix.PosixTester.test_posix_fadvise) ... ok | |
| test_posix_fadvise_errno (test.test_posix.PosixTester.test_posix_fadvise_errno) ... ok | |
| test_posix_fallocate (test.test_posix.PosixTester.test_posix_fallocate) ... ok | |
| test_posix_fallocate_errno (test.test_posix.PosixTester.test_posix_fallocate_errno) ... ok | |
| test_pread (test.test_posix.PosixTester.test_pread) ... ok | |
| test_preadv (test.test_posix.PosixTester.test_preadv) ... ok | |
| test_preadv_flags (test.test_posix.PosixTester.test_preadv_flags) ... ok | |
| test_preadv_overflow_32bits (test.test_posix.PosixTester.test_preadv_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_putenv (test.test_posix.PosixTester.test_putenv) ... ok | |
| test_pwrite (test.test_posix.PosixTester.test_pwrite) ... ok | |
| test_pwritev (test.test_posix.PosixTester.test_pwritev) ... ok | |
| test_pwritev_flags (test.test_posix.PosixTester.test_pwritev_flags) ... skipped 'test needs os.RWF_SYNC' | |
| test_pwritev_overflow_32bits (test.test_posix.PosixTester.test_pwritev_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_readv (test.test_posix.PosixTester.test_readv) ... ok | |
| test_readv_overflow_32bits (test.test_posix.PosixTester.test_readv_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_register_at_fork (test.test_posix.PosixTester.test_register_at_fork) ... ok | |
| test_rtld_constants (test.test_posix.PosixTester.test_rtld_constants) ... ok | |
| test_sched_getaffinity (test.test_posix.PosixTester.test_sched_getaffinity) ... ok | |
| test_sched_param (test.test_posix.PosixTester.test_sched_param) ... ok | |
| test_sched_priority (test.test_posix.PosixTester.test_sched_priority) ... ok | |
| test_sched_rr_get_interval (test.test_posix.PosixTester.test_sched_rr_get_interval) ... ok | |
| test_sched_setaffinity (test.test_posix.PosixTester.test_sched_setaffinity) ... ok | |
| test_sched_yield (test.test_posix.PosixTester.test_sched_yield) ... ok | |
| test_setresgid (test.test_posix.PosixTester.test_setresgid) ... ok | |
| test_setresgid_exception (test.test_posix.PosixTester.test_setresgid_exception) ... ok | |
| test_setresuid (test.test_posix.PosixTester.test_setresuid) ... ok | |
| test_setresuid_exception (test.test_posix.PosixTester.test_setresuid_exception) ... ok | |
| test_stat (test.test_posix.PosixTester.test_stat) ... ok | |
| test_statvfs (test.test_posix.PosixTester.test_statvfs) ... ok | |
| test_strerror (test.test_posix.PosixTester.test_strerror) ... ok | |
| test_sysconf (test.test_posix.PosixTester.test_sysconf) ... ok | |
| test_truncate (test.test_posix.PosixTester.test_truncate) ... ok | |
| test_umask (test.test_posix.PosixTester.test_umask) ... ok | |
| test_utime (test.test_posix.PosixTester.test_utime) ... ok | |
| test_utime_nofollow_symlinks (test.test_posix.PosixTester.test_utime_nofollow_symlinks) ... ok | |
| test_utime_with_fd (test.test_posix.PosixTester.test_utime_with_fd) ... ok | |
| test_waitid (test.test_posix.PosixTester.test_waitid) ... ok | |
| test_writev (test.test_posix.PosixTester.test_writev) ... ok | |
| test_writev_overflow_32bits (test.test_posix.PosixTester.test_writev_overflow_32bits) ... skipped 'test is only meaningful on 32-bit builds' | |
| test_access_dir_fd (test.test_posix.TestPosixDirFd.test_access_dir_fd) ... ok | |
| test_chmod_dir_fd (test.test_posix.TestPosixDirFd.test_chmod_dir_fd) ... ok | |
| test_chown_dir_fd (test.test_posix.TestPosixDirFd.test_chown_dir_fd) ... ok | |
| test_link_dir_fd (test.test_posix.TestPosixDirFd.test_link_dir_fd) ... ok | |
| test_mkdir_dir_fd (test.test_posix.TestPosixDirFd.test_mkdir_dir_fd) ... ok | |
| test_mkfifo_dir_fd (test.test_posix.TestPosixDirFd.test_mkfifo_dir_fd) ... ok | |
| test_mknod_dir_fd (test.test_posix.TestPosixDirFd.test_mknod_dir_fd) ... ok | |
| test_open_dir_fd (test.test_posix.TestPosixDirFd.test_open_dir_fd) ... ok | |
| test_readlink_dir_fd (test.test_posix.TestPosixDirFd.test_readlink_dir_fd) ... ok | |
| test_rename_dir_fd (test.test_posix.TestPosixDirFd.test_rename_dir_fd) ... ok | |
| test_stat_dir_fd (test.test_posix.TestPosixDirFd.test_stat_dir_fd) ... ok | |
| test_symlink_dir_fd (test.test_posix.TestPosixDirFd.test_symlink_dir_fd) ... ok | |
| test_unlink_dir_fd (test.test_posix.TestPosixDirFd.test_unlink_dir_fd) ... ok | |
| test_utime_dir_fd (test.test_posix.TestPosixDirFd.test_utime_dir_fd) ... ok | |
| test_bad_file_actions (test.test_posix.TestPosixSpawn.test_bad_file_actions) ... ok | |
| test_close_file (test.test_posix.TestPosixSpawn.test_close_file) ... ERROR | |
| test_dup2 (test.test_posix.TestPosixSpawn.test_dup2) ... ok | |
| test_empty_file_actions (test.test_posix.TestPosixSpawn.test_empty_file_actions) ... ok | |
| test_multiple_file_actions (test.test_posix.TestPosixSpawn.test_multiple_file_actions) ... ok | |
| test_no_such_executable (test.test_posix.TestPosixSpawn.test_no_such_executable) ... ok | |
| test_none_file_actions (test.test_posix.TestPosixSpawn.test_none_file_actions) ... ok | |
| test_open_file (test.test_posix.TestPosixSpawn.test_open_file) ... ok | |
| test_resetids (test.test_posix.TestPosixSpawn.test_resetids) ... ok | |
| test_resetids_explicit_default (test.test_posix.TestPosixSpawn.test_resetids_explicit_default) ... ok | |
| test_returns_pid (test.test_posix.TestPosixSpawn.test_returns_pid) ... ok | |
| test_setpgroup (test.test_posix.TestPosixSpawn.test_setpgroup) ... ok | |
| test_setpgroup_wrong_type (test.test_posix.TestPosixSpawn.test_setpgroup_wrong_type) ... ok | |
| test_setscheduler_only_param (test.test_posix.TestPosixSpawn.test_setscheduler_only_param) ... ok | |
| test_setscheduler_with_policy (test.test_posix.TestPosixSpawn.test_setscheduler_with_policy) ... ok | |
| test_setsid (test.test_posix.TestPosixSpawn.test_setsid) ... ok | |
| test_setsigdef (test.test_posix.TestPosixSpawn.test_setsigdef) ... ok | |
| test_setsigdef_wrong_type (test.test_posix.TestPosixSpawn.test_setsigdef_wrong_type) ... ok | |
| test_setsigmask (test.test_posix.TestPosixSpawn.test_setsigmask) ... ok | |
| test_setsigmask_wrong_type (test.test_posix.TestPosixSpawn.test_setsigmask_wrong_type) ... ok | |
| test_specify_environment (test.test_posix.TestPosixSpawn.test_specify_environment) ... ok | |
| test_bad_file_actions (test.test_posix.TestPosixSpawnP.test_bad_file_actions) ... ok | |
| test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) ... ERROR | |
| test_dup2 (test.test_posix.TestPosixSpawnP.test_dup2) ... ok | |
| test_empty_file_actions (test.test_posix.TestPosixSpawnP.test_empty_file_actions) ... ok | |
| test_multiple_file_actions (test.test_posix.TestPosixSpawnP.test_multiple_file_actions) ... ok | |
| test_no_such_executable (test.test_posix.TestPosixSpawnP.test_no_such_executable) ... ok | |
| test_none_file_actions (test.test_posix.TestPosixSpawnP.test_none_file_actions) ... ok | |
| test_open_file (test.test_posix.TestPosixSpawnP.test_open_file) ... ok | |
| test_posix_spawnp (test.test_posix.TestPosixSpawnP.test_posix_spawnp) ... ok | |
| test_resetids (test.test_posix.TestPosixSpawnP.test_resetids) ... ok | |
| test_resetids_explicit_default (test.test_posix.TestPosixSpawnP.test_resetids_explicit_default) ... ok | |
| test_returns_pid (test.test_posix.TestPosixSpawnP.test_returns_pid) ... ok | |
| test_setpgroup (test.test_posix.TestPosixSpawnP.test_setpgroup) ... ok | |
| test_setpgroup_wrong_type (test.test_posix.TestPosixSpawnP.test_setpgroup_wrong_type) ... ok | |
| test_setscheduler_only_param (test.test_posix.TestPosixSpawnP.test_setscheduler_only_param) ... ok | |
| test_setscheduler_with_policy (test.test_posix.TestPosixSpawnP.test_setscheduler_with_policy) ... ok | |
| test_setsid (test.test_posix.TestPosixSpawnP.test_setsid) ... ok | |
| test_setsigdef (test.test_posix.TestPosixSpawnP.test_setsigdef) ... ok | |
| test_setsigdef_wrong_type (test.test_posix.TestPosixSpawnP.test_setsigdef_wrong_type) ... ok | |
| test_setsigmask (test.test_posix.TestPosixSpawnP.test_setsigmask) ... ok | |
| test_setsigmask_wrong_type (test.test_posix.TestPosixSpawnP.test_setsigmask_wrong_type) ... ok | |
| test_specify_environment (test.test_posix.TestPosixSpawnP.test_specify_environment) ... ok | |
| test_access (test.test_posix.TestPosixWeaklinking.test_access) ... skipped 'test weak linking on macOS' | |
| test_chmod (test.test_posix.TestPosixWeaklinking.test_chmod) ... skipped 'test weak linking on macOS' | |
| test_chown (test.test_posix.TestPosixWeaklinking.test_chown) ... skipped 'test weak linking on macOS' | |
| test_link (test.test_posix.TestPosixWeaklinking.test_link) ... skipped 'test weak linking on macOS' | |
| test_listdir_scandir (test.test_posix.TestPosixWeaklinking.test_listdir_scandir) ... skipped 'test weak linking on macOS' | |
| test_mkdir (test.test_posix.TestPosixWeaklinking.test_mkdir) ... skipped 'test weak linking on macOS' | |
| test_mkfifo (test.test_posix.TestPosixWeaklinking.test_mkfifo) ... skipped 'test weak linking on macOS' | |
| test_mknod (test.test_posix.TestPosixWeaklinking.test_mknod) ... skipped 'test weak linking on macOS' | |
| test_open (test.test_posix.TestPosixWeaklinking.test_open) ... skipped 'test weak linking on macOS' | |
| test_ptsname_r (test.test_posix.TestPosixWeaklinking.test_ptsname_r) ... skipped 'test weak linking on macOS' | |
| test_pwritev (test.test_posix.TestPosixWeaklinking.test_pwritev) ... skipped 'test weak linking on macOS' | |
| test_readlink (test.test_posix.TestPosixWeaklinking.test_readlink) ... skipped 'test weak linking on macOS' | |
| test_rename_replace (test.test_posix.TestPosixWeaklinking.test_rename_replace) ... skipped 'test weak linking on macOS' | |
| test_stat (test.test_posix.TestPosixWeaklinking.test_stat) ... skipped 'test weak linking on macOS' | |
| test_symlink (test.test_posix.TestPosixWeaklinking.test_symlink) ... skipped 'test weak linking on macOS' | |
| test_unlink_rmdir (test.test_posix.TestPosixWeaklinking.test_unlink_rmdir) ... skipped 'test weak linking on macOS' | |
| test_utime (test.test_posix.TestPosixWeaklinking.test_utime) ... skipped 'test weak linking on macOS' | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawn.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_12328_tmpæ' | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_12328_tmpæ' | |
| ====================================================================== | |
| FAIL: test_fexecve (test.test_posix.PosixTester.test_fexecve) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 203, in test_fexecve | |
| support.wait_process(pid, exitcode=0) | |
| ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/support/__init__.py", line 2281, in wait_process | |
| raise AssertionError(f"process {pid} is still running " | |
| f"after {dt:.1f} seconds") | |
| AssertionError: process 12353 is still running after 300.3 seconds | |
| ---------------------------------------------------------------------- | |
| Ran 171 tests in 306.513s | |
| FAILED (failures=1, errors=2, skipped=34) | |
| test test_posix failed | |
| == Tests result: FAILURE == | |
| 10 slowest tests: | |
| - test_posix: 5 min 6 sec | |
| - test.test_multiprocessing_spawn.test_processes: 2 min 33 sec | |
| - test_subprocess: 1 min 22 sec | |
| - test.test_concurrent_futures.test_process_pool: 1 min 7 sec | |
| - test_regrtest: 1 min 2 sec | |
| - test.test_multiprocessing_spawn.test_misc: 55.2 sec | |
| - test.test_multiprocessing_forkserver.test_processes: 52.3 sec | |
| - test_socket: 45.0 sec | |
| - test.test_multiprocessing_forkserver.test_misc: 40.9 sec | |
| - test.test_concurrent_futures.test_deadlock: 39.9 sec | |
| 23 tests skipped: | |
| test.test_asyncio.test_windows_events | |
| test.test_asyncio.test_windows_utils test.test_gdb.test_backtrace | |
| test.test_gdb.test_cfunction test.test_gdb.test_cfunction_full | |
| test.test_gdb.test_misc test.test_gdb.test_pretty_print | |
| test_android test_apple test_dbm_gnu test_dbm_ndbm test_devpoll | |
| test_free_threading test_kqueue test_launcher test_msvcrt | |
| test_startfile test_winapi test_winconsoleio test_winreg | |
| test_winsound test_wmi test_zstd | |
| 4 tests skipped (resource denied): | |
| test_peg_generator test_tkinter test_ttk test_zipfile64 | |
| 8 tests failed: | |
| test.test_concurrent_futures.test_deadlock | |
| test.test_multiprocessing_spawn.test_processes test_cmd_line | |
| test_faulthandler test_os test_posix test_regrtest test_subprocess | |
| 457 tests OK. | |
| 0:07:15 load avg: 1.52 Re-running 8 failed tests in verbose mode in subprocesses | |
| 0:07:15 load avg: 1.52 Run 8 tests in parallel using 8 worker processes (timeout: 10 min, worker timeout: 15 min) | |
| 0:07:16 load avg: 2.04 [1/8/1] test_os failed (1 failure) | |
| Re-running test_os in verbose mode (matching: test_pipe_spawnl) | |
| test_pipe_spawnl (test.test_os.PseudoterminalTests.test_pipe_spawnl) ... Traceback (most recent call last): | |
| File "/home/buildbot/cpython/build/test_python_20400æ/@test_20400_tmpæ", line 31, in <module> | |
| raise Exception("dup must fail") | |
| Exception: dup must fail | |
| FAIL | |
| ====================================================================== | |
| FAIL: test_pipe_spawnl (test.test_os.PseudoterminalTests.test_pipe_spawnl) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/test_os.py", line 4959, in test_pipe_spawnl | |
| self.assertEqual(exitcode, 0) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^ | |
| AssertionError: 1 != 0 | |
| ---------------------------------------------------------------------- | |
| Ran 1 test in 0.279s | |
| FAILED (failures=1) | |
| test test_os failed | |
| 0:07:16 load avg: 2.04 [2/8/2] test_subprocess failed (2 failures) | |
| Re-running test_subprocess in verbose mode (matching: test_pass_fds_redirected, test_pipe_cloexec) | |
| test_pass_fds_redirected (test.test_subprocess.POSIXProcessTestCase.test_pass_fds_redirected) | |
| Regression test for https://bugs.python.org/issue32270. ... FAIL | |
| test_pipe_cloexec (test.test_subprocess.POSIXProcessTestCase.test_pipe_cloexec) ... FAIL | |
| ====================================================================== | |
| FAIL: test_pass_fds_redirected (test.test_subprocess.POSIXProcessTestCase.test_pass_fds_redirected) | |
| Regression test for https://bugs.python.org/issue32270. | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_subprocess.py", line 3086, in test_pass_fds_redirected | |
| self.assertEqual(fds, {0, 1, 2} | frozenset(pass_fds), f"output={output!a}") | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: Items in the first set but not the second: | |
| 3 | |
| 4 | |
| 5 : output=b'0,1,2,3,4,5,6,7,9\n' | |
| ====================================================================== | |
| FAIL: test_pipe_cloexec (test.test_subprocess.POSIXProcessTestCase.test_pipe_cloexec) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_subprocess.py", line 2822, in test_pipe_cloexec | |
| self.assertFalse(result_fds & unwanted_fds, | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| "Expected no fds from %r to be open in child, " | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| "found %r" % | |
| ^^^^^^^^^^^^ | |
| (unwanted_fds, result_fds & unwanted_fds)) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: {8, 7} is not false : Expected no fds from {8, 10, 7} to be open in child, found {8, 7} | |
| ---------------------------------------------------------------------- | |
| Ran 2 tests in 0.301s | |
| FAILED (failures=2) | |
| test test_subprocess failed | |
| 0:07:16 load avg: 2.04 [3/8/3] test_cmd_line failed (4 failures) | |
| Re-running test_cmd_line in verbose mode (matching: test_no_std_streams, test_no_stderr, test_no_stdin, test_no_stdout) | |
| test_no_std_streams (test.test_cmd_line.CmdLineTest.test_no_std_streams) ... FAIL | |
| test_no_stderr (test.test_cmd_line.CmdLineTest.test_no_stderr) ... FAIL | |
| test_no_stdin (test.test_cmd_line.CmdLineTest.test_no_stdin) ... FAIL | |
| test_no_stdout (test.test_cmd_line.CmdLineTest.test_no_stdout) ... FAIL | |
| ====================================================================== | |
| FAIL: test_no_std_streams (test.test_cmd_line.CmdLineTest.test_no_std_streams) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 554, in test_no_std_streams | |
| self._test_no_stdio(['stdin', 'stdout', 'stderr']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stderr (test.test_cmd_line.CmdLineTest.test_no_stderr) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 551, in test_no_stderr | |
| self._test_no_stdio(['stderr']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stdin (test.test_cmd_line.CmdLineTest.test_no_stdin) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 545, in test_no_stdin | |
| self._test_no_stdio(['stdin']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ====================================================================== | |
| FAIL: test_no_stdout (test.test_cmd_line.CmdLineTest.test_no_stdout) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 548, in test_no_stdout | |
| self._test_no_stdio(['stdout']) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_cmd_line.py", line 542, in _test_no_stdio | |
| self.assertEqual(p.returncode, 42) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^ | |
| AssertionError: 1 != 42 | |
| ---------------------------------------------------------------------- | |
| Ran 4 tests in 0.501s | |
| FAILED (failures=4) | |
| test test_cmd_line failed | |
| 0:07:16 load avg: 2.04 [4/8/4] test_faulthandler failed (9 failures) | |
| Re-running test_faulthandler in verbose mode (matching: test_disable, test_dump_ext_modules, test_enable_fd, test_enable_file, test_enable_single_thread, test_enable_without_c_stack, test_gc, test_gil_released, test_sigsegv) | |
| test_disable (test.test_faulthandler.FaultHandlerTests.test_disable) ... FAIL | |
| test_dump_ext_modules (test.test_faulthandler.FaultHandlerTests.test_dump_ext_modules) ... FAIL | |
| test_enable_fd (test.test_faulthandler.FaultHandlerTests.test_enable_fd) ... FAIL | |
| test_enable_file (test.test_faulthandler.FaultHandlerTests.test_enable_file) ... FAIL | |
| test_enable_single_thread (test.test_faulthandler.FaultHandlerTests.test_enable_single_thread) ... FAIL | |
| test_enable_without_c_stack (test.test_faulthandler.FaultHandlerTests.test_enable_without_c_stack) ... FAIL | |
| test_gc (test.test_faulthandler.FaultHandlerTests.test_gc) ... FAIL | |
| test_gil_released (test.test_faulthandler.FaultHandlerTests.test_gil_released) ... FAIL | |
| test_sigsegv (test.test_faulthandler.FaultHandlerTests.test_sigsegv) ... FAIL | |
| ====================================================================== | |
| FAIL: test_disable (test.test_faulthandler.FaultHandlerTests.test_disable) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 387, in test_disable | |
| self.assertNotEqual(exitcode, 0) | |
| ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^ | |
| AssertionError: 0 == 0 | |
| ====================================================================== | |
| FAIL: test_dump_ext_modules (test.test_faulthandler.FaultHandlerTests.test_dump_ext_modules) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 404, in test_dump_ext_modules | |
| self.fail(f"Cannot find 'Extension modules:' in {stderr!r}") | |
| ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| AssertionError: Cannot find 'Extension modules:' in '' | |
| ====================================================================== | |
| FAIL: test_enable_fd (test.test_faulthandler.FaultHandlerTests.test_enable_fd) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 342, in test_enable_fd | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<5 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| fd=fd) | |
| ^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 4 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_enable_file (test.test_faulthandler.FaultHandlerTests.test_enable_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 326, in test_enable_file | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<5 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| filename=filename) | |
| ^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 154, in check_error | |
| output, exitcode = self.get_output(code, filename=filename, fd=fd) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 96, in get_output | |
| self.assertEqual(output, '') | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^ | |
| AssertionError: "<sys>:0: ResourceWarning: unclosed file [63 chars]'>\n" != '' | |
| - <sys>:0: ResourceWarning: unclosed file <_io.BufferedWriter name='/tmp/test_python_702c7nwp/tmpijzq9o4a'> | |
| ====================================================================== | |
| FAIL: test_enable_single_thread (test.test_faulthandler.FaultHandlerTests.test_enable_single_thread) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 354, in test_enable_single_thread | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<4 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| all_threads=False) | |
| ^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nStack\\ \\(most\\ recent\\ call\\ first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_enable_without_c_stack (test.test_faulthandler.FaultHandlerTests.test_enable_without_c_stack) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 365, in test_enable_without_c_stack | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<4 lines>... | |
| 'Segmentation fault', | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| c_stack=False) | |
| ^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>' not found in '' | |
| ====================================================================== | |
| FAIL: test_gc (test.test_faulthandler.FaultHandlerTests.test_gc) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 182, in test_gc | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<28 lines>... | |
| function='__del__', | |
| ^^^^^^^^^^^^^^^^^^^ | |
| garbage_collecting=True) | |
| ^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n Garbage-collecting\n File "<string>", line 9 in __del__\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in 'exit' | |
| ====================================================================== | |
| FAIL: test_gil_released (test.test_faulthandler.FaultHandlerTests.test_gil_released) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 315, in test_gil_released | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<3 lines>... | |
| 3, | |
| ^^ | |
| 'Segmentation fault') | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ====================================================================== | |
| FAIL: test_sigsegv (test.test_faulthandler.FaultHandlerTests.test_sigsegv) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 171, in test_sigsegv | |
| self.check_fatal_error(""" | |
| ~~~~~~~~~~~~~~~~~~~~~~^^^^ | |
| import faulthandler | |
| ^^^^^^^^^^^^^^^^^^^ | |
| ...<3 lines>... | |
| 3, | |
| ^^ | |
| 'Segmentation fault') | |
| ^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 163, in check_fatal_error | |
| self.check_error(code, line_number, fatal_error, **kw) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_faulthandler.py", line 156, in check_error | |
| self.assertRegex(output, regex) | |
| ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| AssertionError: Regex didn't match: '(?m)^Fatal Python error: Segmentation fault\n\nCurrent thread 0x[0-9a-f]+( \\[.*\\])? \\(most recent call first\\):\n File "<string>", line 3 in <module>\nCurrent thread\'s C stack trace \\(most recent call first\\):\n( Binary file ".+"(, at .*(\\+|-)0x[0-9a-f]+)? \\[0x[0-9a-f]+\\])|(<.+>)' not found in '' | |
| ---------------------------------------------------------------------- | |
| Ran 9 tests in 1.206s | |
| FAILED (failures=9) | |
| test test_faulthandler failed | |
| 0:07:16 load avg: 2.04 [5/8/5] test_regrtest failed (2 failures) | |
| Re-running test_regrtest in verbose mode (matching: test_crashed, test_worker_output_on_failure) | |
| test_crashed (test.test_regrtest.ArgsTestCase.test_crashed) ... FAIL | |
| test_worker_output_on_failure (test.test_regrtest.ArgsTestCase.test_worker_output_on_failure) ... FAIL | |
| ====================================================================== | |
| FAIL: test_crashed (test.test_regrtest.ArgsTestCase.test_crashed) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 1344, in test_crashed | |
| output = self.run_tests("-j2", *tests, exitcode=EXITCODE_BAD_TEST) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 933, in run_tests | |
| return self.run_python(cmdargs, **kw) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 780, in run_python | |
| proc = self.run_command(cmd, **kw) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 768, in run_command | |
| self.fail(msg) | |
| ~~~~~~~~~^^^^^ | |
| AssertionError: Command ['/home/buildbot/cpython/python', '-X', 'faulthandler', '-I', '-m', 'test', '--testdir=/tmp/test_python_cs5eiwg4/tmpmgz4ra01', '-j2', 'test_regrtest_crash'] failed with exit code 4, but exit code 2 expected! | |
| stdout: | |
| --- | |
| Using random seed: 2489974870 | |
| 0:00:00 load avg: 2.04 Run 1 test in parallel using 1 worker process | |
| 0:00:00 load avg: 2.04 [1/1] test_regrtest_crash ran no tests | |
| == Tests result: NO TESTS RAN == | |
| 1 test run no tests: | |
| test_regrtest_crash | |
| Total duration: 317 ms | |
| Total tests: run=0 | |
| Total test files: run=1/1 run_no_tests=1 | |
| Result: NO TESTS RAN | |
| --- | |
| ====================================================================== | |
| FAIL: test_worker_output_on_failure (test.test_regrtest.ArgsTestCase.test_worker_output_on_failure) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 2245, in test_worker_output_on_failure | |
| output = self.run_tests("-j1", testname, | |
| exitcode=EXITCODE_BAD_TEST, | |
| env=env) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 933, in run_tests | |
| return self.run_python(cmdargs, **kw) | |
| ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 780, in run_python | |
| proc = self.run_command(cmd, **kw) | |
| File "/home/buildbot/cpython/Lib/test/test_regrtest.py", line 768, in run_command | |
| self.fail(msg) | |
| ~~~~~~~~~^^^^^ | |
| AssertionError: Command ['/home/buildbot/cpython/python', '-X', 'faulthandler', '-I', '-m', 'test', '--testdir=/tmp/test_python_cs5eiwg4/tmpgshgphpe', '-j1', 'test_regrtest_noop1'] failed with exit code 0, but exit code 2 expected! | |
| stdout: | |
| --- | |
| Using random seed: 915784820 | |
| 0:00:00 load avg: 2.04 Run 1 test in parallel using 1 worker process | |
| 0:00:00 load avg: 2.04 [1/1] test_regrtest_noop1 passed | |
| just before crash! | |
| == Tests result: SUCCESS == | |
| 1 test OK. | |
| Total duration: 283 ms | |
| Total tests: run=1 | |
| Total test files: run=1/1 | |
| Result: SUCCESS | |
| --- | |
| ---------------------------------------------------------------------- | |
| Ran 2 tests in 1.159s | |
| FAILED (failures=2) | |
| test test_regrtest failed | |
| 0:07:46 load avg: 1.67 running (3): test.test_concurrent_futures.test_deadlock (31.6 sec), test.test_multiprocessing_spawn.test_processes (31.6 sec), test_posix (31.6 sec) | |
| 0:08:16 load avg: 1.28 running (3): test.test_concurrent_futures.test_deadlock (1 min 1 sec), test.test_multiprocessing_spawn.test_processes (1 min 1 sec), test_posix (1 min 1 sec) | |
| 0:08:46 load avg: 1.42 running (3): test.test_concurrent_futures.test_deadlock (1 min 31 sec), test.test_multiprocessing_spawn.test_processes (1 min 31 sec), test_posix (1 min 31 sec) | |
| 0:08:56 load avg: 1.43 [6/8/6] test.test_multiprocessing_spawn.test_processes failed (1 error) (1 min 40 sec) -- running (2): test.test_concurrent_futures.test_deadlock (1 min 40 sec), test_posix (1 min 40 sec) | |
| Re-running test.test_multiprocessing_spawn.test_processes in verbose mode (matching: test_interrupt) | |
| test_interrupt (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_interrupt) ... ERROR | |
| Warning -- Dangling processes: {<Process name='Process-1' pid=20434 parent=20407 started daemon>} | |
| Warning -- Dangling processes: {<Process name='Process-1' pid=20434 parent=20407 started daemon>} | |
| Process Process-1: | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 320, in _bootstrap | |
| self.run() | |
| ~~~~~~~~^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 108, in run | |
| self._target(*self._args, **self._kwargs) | |
| ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 524, in _sleep_some_event | |
| time.sleep(100) | |
| ~~~~~~~~~~^^^^^ | |
| KeyboardInterrupt | |
| Warning -- reap_children() reaped child process 20434 | |
| ====================================================================== | |
| ERROR: test_interrupt (test.test_multiprocessing_spawn.test_processes.WithProcessesTestProcess.test_interrupt) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 596, in test_interrupt | |
| exitcode = self._kill_process(multiprocessing.Process.interrupt) | |
| File "/home/buildbot/cpython/Lib/contextlib.py", line 85, in inner | |
| return func(*args, **kwds) | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 577, in _kill_process | |
| self.assertEqual(join(), None) | |
| ~~~~^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 250, in __call__ | |
| return self.func(*args, **kwds) | |
| ~~~~~~~~~^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/process.py", line 156, in join | |
| res = self._popen.wait(timeout) | |
| File "/home/buildbot/cpython/Lib/multiprocessing/popen_fork.py", line 44, in wait | |
| return self.poll(os.WNOHANG if timeout == 0.0 else 0) | |
| ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/multiprocessing/popen_fork.py", line 28, in poll | |
| pid, sts = os.waitpid(self.pid, flag) | |
| ~~~~~~~~~~^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/_test_multiprocessing.py", line 573, in handler | |
| raise RuntimeError('join took too long: %s' % p) | |
| RuntimeError: join took too long: <Process name='Process-1' pid=20434 parent=20407 started daemon> | |
| ---------------------------------------------------------------------- | |
| Ran 1 test in 100.484s | |
| FAILED (errors=1) | |
| test test.test_multiprocessing_spawn.test_processes failed | |
| 0:09:26 load avg: 1.17 running (2): test.test_concurrent_futures.test_deadlock (2 min 10 sec), test_posix (2 min 10 sec) | |
| 0:09:56 load avg: 1.03 running (2): test.test_concurrent_futures.test_deadlock (2 min 40 sec), test_posix (2 min 40 sec) | |
| 0:10:26 load avg: 1.08 running (2): test.test_concurrent_futures.test_deadlock (3 min 10 sec), test_posix (3 min 10 sec) | |
| 0:10:56 load avg: 1.01 running (2): test.test_concurrent_futures.test_deadlock (3 min 40 sec), test_posix (3 min 40 sec) | |
| 0:11:26 load avg: 1.34 running (2): test.test_concurrent_futures.test_deadlock (4 min 10 sec), test_posix (4 min 10 sec) | |
| 0:11:56 load avg: 1.20 running (2): test.test_concurrent_futures.test_deadlock (4 min 41 sec), test_posix (4 min 40 sec) | |
| 0:12:17 load avg: 1.00 [7/8/7] test_posix failed (2 errors, 1 failure) (5 min 1 sec) -- running (1): test.test_concurrent_futures.test_deadlock (5 min 1 sec) | |
| Re-running test_posix in verbose mode (matching: test_close_file, test_close_file, test_fexecve) | |
| test_fexecve (test.test_posix.PosixTester.test_fexecve) ... ERROR | |
| test_close_file (test.test_posix.TestPosixSpawn.test_close_file) ... ERROR | |
| test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) ... ERROR | |
| ====================================================================== | |
| ERROR: test_fexecve (test.test_posix.PosixTester.test_fexecve) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 201, in test_fexecve | |
| posix.execve(fp, [sys.executable, '-c', 'pass'], os.environ) | |
| ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: 6 | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawn.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_20411_tmpæ' | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_20411_tmpæ' | |
| ---------------------------------------------------------------------- | |
| Ran 3 tests in 0.349s | |
| FAILED (errors=3) | |
| Warning -- cwd was modified by test_posix | |
| Warning -- Before: /home/buildbot/cpython/build/test_python_20411æ | |
| Warning -- After: /home/buildbot/cpython | |
| Warning -- files was modified by test_posix | |
| Warning -- Before: [] | |
| Warning -- After: ['@test_20411_tmpæ'] | |
| test test_posix failed | |
| FAIL | |
| test_close_file (test.test_posix.TestPosixSpawn.test_close_file) ... ERROR | |
| test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) ... ERROR | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawn.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_20411_tmpæ' | |
| ====================================================================== | |
| ERROR: test_close_file (test.test_posix.TestPosixSpawnP.test_close_file) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 2133, in test_close_file | |
| with open(closefile, encoding="utf-8") as f: | |
| ~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
| FileNotFoundError: [Errno 2] No such file or directory: '@test_20411_tmpæ' | |
| ====================================================================== | |
| FAIL: test_fexecve (test.test_posix.PosixTester.test_fexecve) | |
| ---------------------------------------------------------------------- | |
| Traceback (most recent call last): | |
| File "/home/buildbot/cpython/Lib/test/test_posix.py", line 203, in test_fexecve | |
| support.wait_process(pid, exitcode=0) | |
| ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^ | |
| File "/home/buildbot/cpython/Lib/test/support/__init__.py", line 2281, in wait_process | |
| raise AssertionError(f"process {pid} is still running " | |
| f"after {dt:.1f} seconds") | |
| AssertionError: process 20423 is still running after 300.9 seconds | |
| ---------------------------------------------------------------------- | |
| Ran 3 tests in 301.257s | |
| FAILED (failures=1, errors=2) | |
| test test_posix failed | |
| 0:12:47 load avg: 1.05 running (1): test.test_concurrent_futures.test_deadlock (5 min 31 sec) | |
| 0:13:17 load avg: 0.86 running (1): test.test_concurrent_futures.test_deadlock (6 min 1 sec) | |
| Timeout (0:05:00)! | |
| Thread 0x00007ffff3fff700 [Thread-14] (most recent call first): | |
| File "/home/buildbot/cpython/Lib/subprocess.py", line 2057 in _wait | |
| File "/home/buildbot/cpython/Lib/subprocess.py", line 1277 in wait | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 194 in _run_process | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 299 in run_tmp_files | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 363 in _runtest | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 403 in run | |
| File "/home/buildbot/cpython/Lib/threading.py", line 1074 in _bootstrap_inner | |
| File "/home/buildbot/cpython/Lib/threading.py", line 1036 in _bootstrap | |
| Thread 0x00007ffffe844c00 [python] (most recent call first): | |
| File "/home/buildbot/cpython/Lib/threading.py", line 366 in wait | |
| File "/home/buildbot/cpython/Lib/queue.py", line 210 in get | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 541 in _get_result | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/run_workers.py", line 610 in run | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 455 in _run_tests_mp | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 288 in _rerun_failed_tests | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 309 in rerun_failed_tests | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 569 in _run_tests | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 598 in run_tests | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 767 in main | |
| File "/home/buildbot/cpython/Lib/test/libregrtest/main.py", line 775 in main | |
| File "/home/buildbot/cpython/Lib/test/__main__.py", line 2 in <module> | |
| File "/home/buildbot/cpython/Lib/runpy.py", line 88 in _run_code | |
| File "/home/buildbot/cpython/Lib/runpy.py", line 198 in _run_module_as_main | |
| make: *** [Makefile:2432: test] Error 1 | |
| cpython main [2] 🐚 |