Micro Services #533
Replies: 2 comments 1 reply
-
This is looking good from a high level, @kavitharaju! I think we should do some more thinking about what our direction would be for APIs for fine-tuning models vs. using them for inference only. We might have some modifications (e.g. data flow, hardware requirements, etc.) based on that.
-
@kavitharaju I tried KrakenD. It is a flexible API gateway that can be used to implement a variety of features for your API. It uses a JSON configuration file to define your API endpoints and their associated middleware, for example:

```json
{
  "version": 3,
  "endpoints": [
    {
      "endpoint": "/users",
      "method": "GET",
      "backend": [
        {
          "url_pattern": "/users",
          "host": ["http://localhost:8080"]
        }
      ]
    }
  ]
}
```

Here the microservice runs on port 8080 and exposes the endpoint `/users`; KrakenD proxies matching requests to it. KrakenD also supports a number of other features, such as:
And many more, which we can find here.
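To make the gateway idea concrete, here is a minimal Python sketch (not KrakenD itself, just an illustration) of what a gateway does with an endpoint-to-backend mapping: match an incoming method and path against the configured endpoint list and resolve the backend URL to proxy to. The config keys below follow KrakenD's v3-style schema; the `resolve` helper is hypothetical and only shows the routing step, not aggregation, rate limiting, or the other middleware a real gateway adds.

```python
import json

# Illustrative endpoint config, loosely following KrakenD v3-style keys.
CONFIG = json.loads("""
{
  "endpoints": [
    {
      "endpoint": "/users",
      "method": "GET",
      "backend": [
        {"url_pattern": "/users", "host": ["http://localhost:8080"]}
      ]
    }
  ]
}
""")

def resolve(method, path, config=CONFIG):
    """Return the backend URL a request would be proxied to, or None."""
    for ep in config["endpoints"]:
        if ep["method"] == method and ep["endpoint"] == path:
            backend = ep["backend"][0]
            return backend["host"][0] + backend["url_pattern"]
    return None

print(resolve("GET", "/users"))   # -> http://localhost:8080/users
print(resolve("POST", "/users"))  # -> None (no route configured)
```

The point of the indirection is that clients only ever see the gateway's routes; the host and port of each microservice stay an internal detail that can change without breaking API consumers.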
-
With the requirement to add new services for AI APIs in VE, it becomes necessary to make some changes to the architecture so that the app can be extended module-wise, and its modules deployed and developed separately as needed.
Here is a Miro board where I have tried to capture the various points we have discussed across different teams and meetings, along with some new thoughts.
Consider this board and discussion as the place to continue and collaborate on the discussions already started.