Similar to the askgpt action, it would be very useful for me, and for anyone running a local LLM on Ollama (https://ollama.ai/), to have an equivalent action and connector. Right now I have Ollama set up running several models, which are selected by name.
I know I can currently work around this with plain API requests, but I don't think it would be too complex to repurpose the askgpt action/connector to also allow connecting to an Ollama host/port; the chat endpoints are the same.
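For anyone who wants the API-request workaround in the meantime, here is a minimal sketch of calling Ollama's `/api/chat` endpoint with only the Python standard library. The host, port, and model name are assumptions about the local setup (Ollama listens on port 11434 by default):

```python
import json
import urllib.request

# Default Ollama address/port -- adjust to your own setup (assumption).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build the JSON body that Ollama's /api/chat endpoint expects."""
    return {
        "model": model,  # pass the model by name, e.g. "llama3" (assumption)
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON response instead of a stream
    }

def ask_ollama(model: str, prompt: str) -> str:
    """Send the chat request and return the assistant's reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Since the model is just a field in the request body, switching between the locally installed models is a matter of changing one string.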
Hi Andre,
I think it would be really helpful to have a function similar to the "lookup function" for local LLMs. I use LM Studio and I'd love to connect it to EasyMorph. It would be a game-changer if the EasyMorph team could develop this feature, as it would let everyone easily connect their data to an LLM.
I guess one way to do it is to generalize to any local LLM server that exposes GPT-like API endpoints such as /v1/chat/completions, /v1/completions, and /v1/embeddings, where you would only have to specify the local address and port.
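To illustrate why this generalization works: since LM Studio, Ollama, and similar servers all expose OpenAI-compatible routes, a single client only needs a configurable base address/port. Below is a sketch assuming LM Studio's default port 1234 (the port and model name are assumptions about the local setup):

```python
import json
import urllib.request

# Base address/port of a local OpenAI-compatible server.
# LM Studio defaults to 1234, Ollama to 11434 (assumptions about your setup).
BASE_URL = "http://localhost:1234"

def chat_completion_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the OpenAI-style /v1/chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send(req: urllib.request.Request) -> str:
    """Send the request and extract the reply text from the OpenAI-style response."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

The same two functions would work against any of these servers just by changing `BASE_URL`, which is exactly the "pinpoint the local address and port" configuration a generalized connector would need.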
It would be incredibly helpful to be able to work with and modify individual rows, cells, or entire columns of data. Plus, the option to decide where the results go, like replacing existing information or adding new columns, would be fantastic. Being able to specify API endpoints, addresses, and ports would also be a game-changer. And the ability to see what models are available would be the icing on the cake.