ERROR: Test not found; the issue occurs with a test that uses random data in pytest.mark.parametrize. #22526
Comments
Also reproducing with the following addition to conftest.py:
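A minimal sketch of the kind of conftest.py hook that reproduces this, assuming the random data is injected via pytest_generate_tests; the fixture name value and the 4-character strings are illustrative, not the reporter's original code:

```python
# conftest.py - hypothetical reproduction sketch
import random
import string


def pytest_generate_tests(metafunc):
    # Parametrize any test that requests "value" with a freshly generated
    # random string, so the node ID changes on every collection.
    if "value" in metafunc.fixturenames:
        random_value = "".join(random.choices(string.ascii_letters, k=4))
        metafunc.parametrize("value", [random_value])
```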
Hi! Yes, this seems related to a few issues we have where generated names are not working on the rewrite. I will circle back when I start work on this, or with another issue to close against. Thanks
Because we have not heard back with the information we requested, we are closing this issue for now. If you are able to provide the info later on, then we will be happy to re-open this issue to pick up where we left off. Happy Coding!
Hi @akachurin93, following up on this. On investigation this seems like a more complex issue, since these tests have no stable ID. Pytest has no single way of identifying them, and therefore there is no way for me to create an ID that will stay constant throughout the discovery/run cycle. A potential fix could be that when you click a parent node, like the file node which contains these tests, the request string it sends includes only the file name and not test IDs; that would work for running, since the file ID is more general. Is this the type of fix you would be looking for?
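For illustration only, and assuming the run request is ultimately handed to pytest, the two request shapes being discussed would look roughly like this (the node ID is taken from the error message below; this is a sketch, not the extension's actual code):

```python
import pytest

# Running by the exact node ID recorded at discovery fails once the randomly
# generated ID has changed on re-collection:
pytest.main(["test_example.py::TestExample::test_1[sTlq]"])

# Running by the file only (the proposed behavior for the file node) does not
# depend on any parametrized ID, so it still resolves:
pytest.main(["test_example.py"])
```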
Hi @eleanorjboyd! I mean collecting test identifiers the way it worked in the case of this code:
I'm not sure I would want the extension to be making changes at the level of renaming tests, as this feels like overreach into a user's project. What would running the tests individually look like? Would this be running each parametrized test instance, or running the single parent parametrized test object? i.e.
What would running the tests individually look like? Would this be running each parameterize test instance or the run the single parent parameterize test object? |
using the same example: do you want to be able to click to run |
Because we have not heard back with the information we requested, we are closing this issue for now. If you are able to provide the info later on, then we will be happy to re-open this issue to pick up where we left off. Happy Coding!
Ideally, the user should see test_ex[return_data0], test_ex[return_data1], but it's also acceptable to display random information in the title, as long as it still triggers test_ex[return_data0], test_ex[return_data1]. |
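For context, test_ex[return_data0] / test_ex[return_data1] is the scheme pytest itself falls back to when parametrize values are arbitrary objects: the argument name plus an index, which stays stable across collections. A small sketch (the ReturnData class is made up for illustration):

```python
import pytest


class ReturnData:
    def __init__(self, payload):
        self.payload = payload


# pytest cannot derive a readable ID from these objects, so the generated node
# IDs fall back to test_ex[return_data0] and test_ex[return_data1].
@pytest.mark.parametrize("return_data", [ReturnData(1), ReturnData(2)])
def test_ex(return_data):
    assert return_data.payload in (1, 2)
```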
Hello! Reviewing this issue again, I think it is still out of scope and too challenging to implement. It would require renaming users' test cases and somehow matching those names to the pytest run results. If anyone feels strongly about this issue and wants to write a PR I will definitely take a look, but I am going to close this issue as I do not see it as very feasible. Thanks
Type: Bug
Behaviour
Expected vs. Actual
Expected:
Clicking on a test in the TEST EXPLORER bar, the test starts successfully
Actual:
Clicking on a test in the TEST EXPLORER bar gives: ERROR: not found: test_example.py::TestExample::test_1[sTlq]
(no name in any of [])
Steps to reproduce:
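A minimal test file consistent with the error message above; the random-string helper is an assumption for illustration, not necessarily the reporter's actual code:

```python
# test_example.py - hypothetical sketch matching the error shown above
import random
import string

import pytest


def random_suffix():
    # A new 4-character string on every collection, e.g. "sTlq".
    return "".join(random.choices(string.ascii_letters, k=4))


class TestExample:
    @pytest.mark.parametrize("value", [random_suffix()])
    def test_1(self, value):
        assert value
```

Discovery records an ID such as test_example.py::TestExample::test_1[sTlq]; clicking Run re-collects the file, the suffix changes, and pytest reports the stale ID as not found. Running pytest test_example.py directly in a terminal works every time, because the ID is generated and consumed within a single collection.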
Diagnostic data
python.languageServer setting: Default
Output for Python in the Output panel (View → Output, change the drop-down in the upper-right of the Output panel to Python)
User Settings
Extension version: 2023.21.13261010
VS Code version: Code 1.84.2 (1a5daa3a0231a0fbba4f14db7ec463cf99d7768e, 2023-11-09T10:51:52.184Z)
OS version: Windows_NT x64 10.0.22621
Modes:
System Info
canvas_oop_rasterization: enabled_on
direct_rendering_display_compositor: disabled_off_ok
gpu_compositing: enabled
multiple_raster_threads: enabled_on
opengl: enabled_on
rasterization: enabled
raw_draw: disabled_off_ok
video_decode: enabled
video_encode: enabled
vulkan: disabled_off
webgl: enabled
webgl2: enabled
webgpu: enabled