
Beta testing


Results from preliminary evaluation of SODA by our beta testers

Review procedure

Our beta testers reviewed our software each time a new feature was implemented, testing it and providing feedback for improvement. Once most major features were included, we asked them to evaluate the performance and usability of SODA for the entire SPARC curation and sharing process. We provided a sample non-curated dataset with enough instructions about its provenance that anyone could curate it according to the SPARC standards. We asked them to curate and share the sample dataset without SODA for SPARC (Task A) and with SODA for SPARC (Task B). Half of them were asked to complete Task A first and then Task B, while the other half completed the tasks in the reverse order.

Results

Based on their responses, all of them had already gone through the entire curation and sharing process without SODA (between one and ten times), while it was their first time doing so with SODA. Even so, we found that SODA for SPARC reduced the time required to curate and share a dataset by a factor of three on average, and made the requirements noticeably easier to understand and implement. After evaluating the shared datasets, we also found that SODA for SPARC significantly reduced human errors in the curation and sharing process (no errors at all when users followed the user manual's instructions properly).

Beta testers

The list of our ten beta testers, all SPARC-funded researchers from different research groups, is available here.
