Added LM Studio way of serving open-source models (#377)

* Added LM Studio way of serving open-source models

Works on Windows too. The currently suggested way, 'modelz', works on UNIX only.

* Update open_source_language_model_example.ipynb
Yogesh Haribhau Kulkarni 2023-10-25 23:00:05 +05:30 committed by GitHub
parent 1dd77d8b81
commit 7dad875f43
1 changed file with 3 additions and 1 deletion


@@ -10,7 +10,9 @@
"AutoGen is compatible with the OpenAI API library in Python for executing language models. Consequently, it can work with any models employing a similar API without the necessity to modify your AutoGen code.\n",
"\n",
"\n",
"In this guide, we will utilize the [modelz-llm](https://github.com/tensorchord/modelz-llm) package to illustrate how to locally serve a model and integrate AutoGen with the served model.\n"
"In this guide, we will utilize the [modelz-llm](https://github.com/tensorchord/modelz-llm) package to illustrate how to locally serve a model and integrate AutoGen with the served model.\n",
"Actually there are multiple ways to serve the local model in Open AI API compatible way. At this point in time [modelz-llm](https://github.com/tensorchord/modelz-llm) is not usable on Unix.\n",
"For Windows, [LM Studio](https://lmstudio.ai/) works. It allows downloading, checking and serving the local Open Source models. More details of how to use are [here](https://medium.com/p/97cba96b0f75).\n"
]
},
{
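For context, here is a minimal sketch of how AutoGen can be pointed at a model served through a local OpenAI-compatible endpoint such as the one LM Studio exposes. It is not part of this commit: the endpoint URL, model name, and config keys are assumptions and may need adjusting for your AutoGen and openai versions (newer releases use `base_url` instead of `api_base`).

```python
# Minimal sketch (not from this commit): pointing AutoGen at a locally served,
# OpenAI-compatible endpoint such as the one LM Studio can expose.
import autogen

# Assumed values: LM Studio's local server commonly listens on http://localhost:1234/v1,
# and local servers usually ignore the API key, though the field still has to be set.
config_list = [
    {
        "model": "local-model",                  # placeholder; LM Studio serves whichever model you load
        "api_base": "http://localhost:1234/v1",  # use "base_url" on newer AutoGen/openai versions
        "api_key": "not-needed",
    }
]

assistant = autogen.AssistantAgent(
    "assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)
user_proxy.initiate_chat(assistant, message="Say hello in one sentence.")
```

The same config shape should apply to modelz-llm or any other OpenAI-compatible server; only the endpoint URL and model name change.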