
Tech-Audit #2: Code Tests #194

Closed
TomDonoghue opened this issue Jun 22, 2020 · 1 comment
Labels
2.1 Updates to go into a 2.1.0 release discussion Issue for discussing a topic / idea.

Comments

TomDonoghue commented Jun 22, 2020

This issue is part of a 'technical audit' of neurodsp, with @elybrand.

Our current code tests are, broadly speaking, 'smoke tests': they execute the code in the module and check that nothing breaks, but they do very little to verify the accuracy of the outputs.
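To illustrate the distinction, here is a minimal sketch using scipy's `detrend` as a stand-in target function (not neurodsp's own API):

```python
import numpy as np
from scipy.signal import detrend

def test_detrend_smoke():
    """Smoke test: run the function and check only that nothing breaks."""
    sig = np.random.randn(1000)
    out = detrend(sig)
    assert out.shape == sig.shape  # output exists with the right shape; values unchecked

def test_detrend_accuracy():
    """Accuracy test: check the output values against a known answer."""
    # A pure linear trend should be removed (almost) entirely
    sig = np.linspace(0, 1, 1000)
    out = detrend(sig)
    assert np.allclose(out, 0, atol=1e-8)

test_detrend_smoke()
test_detrend_accuracy()
```

The smoke test would pass even if `detrend` returned garbage of the right shape; only the second test constrains what the function actually computes.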

As part of the technical audit, it would be nice to update / extend the tests we have to do more accuracy checking. This might help to discover if there are any issues with our current implementations, as well as offering a better test suite to guide future development.

If you are using simulations to check expected outputs in #193, then this issue can broadly be addressed by collecting those examples and adding them to our code tests, in so far as we can do so effectively and efficiently.

All our tests currently work on simulated data, for which approximate results can be predicted. Testing analysis functions is quite difficult - so I think the goal here is to add more accuracy checking, if & where we can, but without "over-doing it" (as in, what can we add to be more confident in our implementations, and to be safer w.r.t. future development, without falling too deep into the rabbit hole of broadly difficult questions about how to precisely test code that computes estimations).
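One common pattern for accuracy checks on simulated data is to simulate a signal with a known property and assert that the analysis recovers it. A sketch, using scipy's `periodogram` for illustration (neurodsp's own simulation and spectral functions would slot in analogously):

```python
import numpy as np
from scipy.signal import periodogram

# Simulate a pure oscillation at a known frequency
fs, n_seconds, freq = 500, 10, 10
times = np.arange(0, n_seconds, 1 / fs)
sig = np.sin(2 * np.pi * freq * times)

# The estimated spectral peak should land on the simulated frequency
freqs, powers = periodogram(sig, fs)
peak_freq = freqs[np.argmax(powers)]
assert abs(peak_freq - freq) < 0.5  # within the frequency resolution
```

Tests of this form check approximate correctness without demanding exact agreement, which keeps them robust to implementation details of the estimator.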

@TomDonoghue TomDonoghue added the discussion Issue for discussing a topic / idea. label Jun 22, 2020
@TomDonoghue TomDonoghue added the 2.1 Updates to go into a 2.1.0 release label Jul 9, 2020
TomDonoghue (Member, Author) commented:

@elybrand updated some of our tests to be more stringent in #204

Overall, in terms of the audit, I think this was checked off, and there are no current ToDos on this. Closing.
