Transformation that converts design matrix into records #2847
For reference, this is the corresponding code for ERT2: https://github.com/equinor/semeio/blob/main/semeio/jobs/design2params/design2params.py
Transformation multiplexing/singleplexing

There's clearly a need for multiplexing transformations, i.e. transformations that create multiple records from one file, or vice versa, or both. Transformations already do singleplexing with … This increases the complexity of the transformation API. So, to make this livable, we need very strict rules for how *plexing is dealt with, e.g. after configuration and creation on the …

Further, I initially thought that the interface would be something like this:

```python
async def to_record(self, root_path = Path()) -> Record: ...

async def to_records(self, root_path = Path()) -> RecordCollection: ...
```

but a significant usage of design matrices is to create only one group (which we can call …). So …

For DOE there are also not vectors, but scalars, so #2934 blocks this. TBC…
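To make the *plexing discussion concrete, here is a minimal sketch of a multiplexing design-matrix transformation, assuming simplified stand-ins for `Record` and `RecordCollection` (the real ert3 record classes and transformation base class differ; the class and file names below are illustrative only):

```python
# Minimal sketch: one CSV file multiplexes into many records
# (one record per realization/row, one parameter per column).
import asyncio
import csv
from pathlib import Path


class Record(dict):
    """Stand-in: a mapping of parameter name -> scalar value."""


class RecordCollection(list):
    """Stand-in: one Record per realization."""


class DesignMatrixTransformation:
    def __init__(self, location: Path):
        self._location = location

    async def to_records(self, root_path: Path = Path()) -> RecordCollection:
        # Read the whole design matrix and emit one record per row,
        # i.e. a single file multiplexes into a collection of records.
        with open(root_path / self._location, newline="") as f:
            rows = list(csv.DictReader(f))
        return RecordCollection(
            Record({name: float(value) for name, value in row.items()})
            for row in rows
        )


if __name__ == "__main__":
    records = asyncio.run(
        DesignMatrixTransformation(Path("design_matrix.csv")).to_records()
    )
    print(records)
```

A singleplexing `to_record` would instead return a single record for one realization, which is where the strict *plexing rules mentioned above would have to decide which variant a given transformation exposes.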
Closing as related to ert3, which is no longer the direction taken by the project. Feel free to reopen if still relevant.
This one relates to code in the …
Closing in favor of #4656
When working on DOE (design of experiments) it is currently quite cumbersome to read the parameters from the design matrix file (e.g. by means of an external job) and then create the parameter JSON representations (one by one) in order to load them later on explicitly as records. Therefore it would be meaningful to have a transformation that reads such a design matrix CSV file and creates the parameter records automatically.
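For illustration, a rough sketch of the manual workflow described above, assuming a design matrix CSV with one row per realization and one column per parameter (file names and output layout are illustrative, not an ert convention):

```python
# Sketch of the manual step the requested transformation would replace:
# read the design matrix and write one parameter JSON file per realization.
import json
from pathlib import Path

import pandas as pd

design = pd.read_csv("design_matrix.csv")
out_dir = Path("parameters")
out_dir.mkdir(exist_ok=True)

for realization, row in design.iterrows():
    # One JSON file per realization, later loaded explicitly as a record.
    with open(out_dir / f"coefficients_{realization}.json", "w") as f:
        json.dump(row.to_dict(), f, indent=2)
```

A dedicated transformation would produce the parameter records directly from the CSV, without these intermediate per-realization JSON files.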