Does it really make sense to officially support both pandasai LLMs and langchain's ? #331
mspronesti
started this conversation in
General
Replies: 1 comment
@mspronesti thanks a lot for the great point! I totally agree. Just a quick explanation of why we have currently implemented it like this. We have our own model wrappers because:
On the other hand, we also support langchain models (as a non-default option) because:
This is the reason why, at the moment, we have both options:
What do you think?
Hi @gventuri, this is more of a philosophical thread for discussion (hence not really an issue).
I've just noticed we now support any langchain client LLM in PandasAI, and the first thing that came to mind was "what do we need our LLMs for, then?"
As a matter of fact, langchain supports more clients, and the clients we have in common often offer more features than ours. So why don't we simply drop support for ours and rely on langchain's?
My original thought process was that having our own was better for the following 2 reasons:
A direct consequence of not respecting the first bullet is that we are implicitly saying that any client available in langchain works with PandasAI, which is not the case.
To sum up, the point of this discussion is to raise a "coherency" issue and propose 2 alternatives:
`PandasAI` object. One only needs to make sure that `type` is implemented, and that's pretty much it.

What do you think? :)
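To make the "only `type` needs to be implemented" idea concrete, here is a minimal, hypothetical sketch (this is not pandasai's actual API; the class and function names are invented for illustration) of an adapter that lets any langchain-style LLM satisfy that single requirement:

```python
# Hypothetical sketch: accept any LLM-like object as long as it exposes
# a `type` property. Names here are illustrative, not pandasai's real API.

class LangchainLLMWrapper:
    """Adapts an arbitrary langchain-style LLM so it exposes the `type`
    property that downstream code is assumed to rely on."""

    def __init__(self, langchain_llm):
        self._llm = langchain_llm

    @property
    def type(self) -> str:
        # Derive an identifier from the wrapped client's class name.
        return f"langchain_{self._llm.__class__.__name__.lower()}"

    def call(self, prompt: str) -> str:
        # Delegate generation to the wrapped client; here we assume the
        # client is callable, as classic langchain LLMs are.
        return self._llm(prompt)


def is_supported_llm(llm) -> bool:
    """The 'coherency' check: an object qualifies iff it implements `type`."""
    return hasattr(llm, "type")


# Usage with a stand-in for a langchain client:
class FakeLangchainOpenAI:
    def __call__(self, prompt):
        return f"echo: {prompt}"


wrapped = LangchainLLMWrapper(FakeLangchainOpenAI())
print(wrapped.type)              # langchain_fakelangchainopenai
print(is_supported_llm(wrapped)) # True
```

Under this scheme the explicit per-client wrappers and the generic langchain path collapse into one rule, which is exactly the coherency argument above.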