-
Really nice wrap-up of our discussion, thank you so much! I would argue that another con of adding parameters directly to the class is that it is necessary to instantiate the adapter, e.g.

```python
adapter = ArcsAdapter(parameters)
adapter.run(concs)
```

versus

```python
ArcsAdapter.run(concs, parameters)
```

The former creates a fictitious split between parameters and concs, while they should be on the same level. Furthermore, within the run function there would be two different ways of accessing them:

```python
def run(self, concs):
    internal_run_model(concs=concs, temperature=self.temperature)
```

versus

```python
@classmethod
def run(cls, concs, settings):
    internal_run_model(concs=concs, temperature=settings.temperature)
```

Meaning that the latter could be a classmethod, while the former requires initialization. It would be nice to explicitly avoid any form of state.
-
Although I agree, this is true for anyone using pydantic, isn't it? Hence this argument would go against using pydantic in any form or shape?
-
Interesting thought. I mean, we are still going to review any adapters coming into the repo; it's not like it's a free-for-all. If we see a specific use case that causes issues for us, we could always amend the usage of pydantic, or disallow it.
-
There are also use cases for the ones implementing the BaseAdapter (that would be us) when using pydantic, as any built-in validators do not need to be tested, and we get a schema out of the box.
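A quick sketch of that point, assuming pydantic v2; the `Parameters` model and its field are made up for illustration:

```python
from pydantic import BaseModel, Field, ValidationError


class Parameters(BaseModel):
    temperature: float = Field(298.15, ge=0.0, description="Temperature in K")


# Built-in validation: nothing to hand-write, nothing extra to test
try:
    Parameters(temperature=-5.0)
except ValidationError as err:
    print(err)  # reports that temperature must be greater than or equal to 0

# And a JSON schema comes out of the box
schema = Parameters.model_json_schema()
# schema["properties"]["temperature"] carries the default, minimum and description
```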
-
I do like dataclasses as well; they are simpler, have less overhead, and you don't add more than you need. I just see that there are many things that would be solved if we use pydantic.
-
## Parameters (née Settings)

Part of the discussion regards how we should design the way parameters are defined for a given model. The two versions that we have are using pydantic's `BaseModel`, and rolling our own system that's similar to dataclasses/pydantic but satisfies our needs exactly.

We want to achieve the following: `Parameters.validate_model(parameters).temperature`, whereas the incorrect `parameters["temperature"]` is shorter and nicer, but will not validate anything.

### Using Pydantic
The user can write a parameter model like so:
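A minimal sketch of what such a model could look like, assuming pydantic v2 and the `Parameter`/`BaseParameters` helpers described below; the field names, defaults and units are purely illustrative:

```python
# Assumes the Parameter/BaseParameters helpers sketched further down in this post
class ModelParameters(BaseParameters):
    temperature: float = Parameter(298.15, min=0.0, max=1000.0, label="Temperature", unit="K")
    method: str = Parameter("fast", choices=["fast", "accurate"], label="Solver method")
```

Validated access would then look something like `ModelParameters.model_validate(parameters).temperature` (using pydantic v2's `model_validate`; the `validate_model` call above may be a thin wrapper around it).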
Here `BaseParameters` subclasses `pydantic.BaseModel`, and `Parameter` is a function that returns `pydantic.Field` (which itself returns a `pydantic.FieldInfo`).

We can't just use `Field` because it's not possible to directly set the fields that we want, like `min`, `max`, `label`, `unit` and `choices`. `Parameter` does so by wrapping `Field` and adding them to `json_schema_extra`.

We must also subclass `BaseModel` so that we can add our own validations automatically, like for `choices`.
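A rough sketch of how `Parameter` and `BaseParameters` might be implemented, assuming pydantic v2; the keyword names and the wildcard-validator approach are assumptions rather than a settled design:

```python
from typing import Any

from pydantic import BaseModel, Field, field_validator


def Parameter(default: Any = ..., *, min: float | None = None, max: float | None = None,
              label: str | None = None, unit: str | None = None,
              choices: list[Any] | None = None, **kwargs: Any) -> Any:
    """Wrap pydantic.Field so callers can set min, max, label, unit and choices directly."""
    extra = {"label": label, "unit": unit, "choices": choices}
    return Field(default, ge=min, le=max,
                 json_schema_extra={k: v for k, v in extra.items() if v is not None},
                 **kwargs)


class BaseParameters(BaseModel):
    """Subclass of BaseModel so that our own validations (e.g. choices) run automatically."""

    @field_validator("*")
    @classmethod
    def _check_choices(cls, value: Any, info: Any) -> Any:
        extra = cls.model_fields[info.field_name].json_schema_extra or {}
        choices = extra.get("choices")
        if choices is not None and value not in choices:
            raise ValueError(f"{info.field_name} must be one of {choices}")
        return value
```

A side effect of going through `json_schema_extra` is that the `label`, `unit` and `choices` metadata also end up in the generated JSON schema.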
Pros:

- Our own custom validations can be added with `@pydantic.field_validator`
Cons:
### Rolling our own

Pydantic is essentially an extension of `@dataclass` with added validations; we can do the same ourselves. It's also possible to define parameters directly in the main adapter class. It's then possible to make use of the parameters by accessing them via `self` (a sketch of this style follows after the pros and cons below). Because `BaseAdapter` is designed to be simple, we avoid an overload of suggestions for methods that are unneeded.

Pros:
Cons:
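For comparison, a minimal sketch of the roll-our-own style with parameters defined directly on the adapter class; `BaseAdapter` and `internal_run_model` are stand-ins here, not the repo's actual code:

```python
from dataclasses import dataclass


class BaseAdapter:
    """Stand-in for the repo's BaseAdapter, whose real definition is not shown in this thread."""


def internal_run_model(concs, temperature):
    """Stand-in for the real model call."""
    return {"concs": concs, "temperature": temperature}


@dataclass
class MyAdapter(BaseAdapter):
    # Parameters live directly on the adapter class, dataclass-style
    temperature: float = 298.15  # illustrative default, in K

    def run(self, concs):
        # Parameters are plain attributes, so they are reached via self
        return internal_run_model(concs=concs, temperature=self.temperature)


# The adapter has to be instantiated with its parameters before it can run
adapter = MyAdapter(temperature=310.0)
result = adapter.run(concs=[0.1, 0.2])
```

This is the instantiate-before-run pattern discussed in the first comment above.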
### Difficulty of implementation

Both methods would require us to write code which validates parameters. For Pydantic we'll mostly be using their API, while when rolling our own we'd use the standard Python metaprogramming machinery; a sketch of what that could look like follows below. I'd say both are of equivalent difficulty to understand.
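To make the comparison concrete, here is one possible shape the roll-our-own validation could take, using a plain Python descriptor; this is an assumption about the approach, not a proposal from the thread:

```python
from typing import Any


class Parameter:
    """Descriptor that validates min/max/choices on assignment, without pydantic."""

    def __init__(self, default: Any, *, min: Any = None, max: Any = None,
                 choices: Any = None, label: str | None = None, unit: str | None = None):
        self.default = default
        self.min, self.max, self.choices = min, max, choices
        self.label, self.unit = label, unit

    def __set_name__(self, owner: type, name: str) -> None:
        # Called automatically when the descriptor is assigned in a class body
        self.name = name

    def __get__(self, obj: Any, objtype: type | None = None) -> Any:
        if obj is None:
            return self
        return obj.__dict__.get(self.name, self.default)

    def __set__(self, obj: Any, value: Any) -> None:
        if self.min is not None and value < self.min:
            raise ValueError(f"{self.name} must be >= {self.min}")
        if self.max is not None and value > self.max:
            raise ValueError(f"{self.name} must be <= {self.max}")
        if self.choices is not None and value not in self.choices:
            raise ValueError(f"{self.name} must be one of {self.choices}")
        obj.__dict__[self.name] = value


class ModelParameters:
    temperature = Parameter(298.15, min=0.0, max=1000.0, unit="K")
    method = Parameter("fast", choices=["fast", "accurate"])


params = ModelParameters()
params.temperature = 500.0   # fine
# params.method = "sloppy"   # would raise ValueError: method must be one of ['fast', 'accurate']
```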