Commit Graph

42 Commits

Author SHA1 Message Date
ZYinNJU 16f410c788
Ollama chat model listener (#1765)
## Issue
Closes #1756
Closes #1750

## Change
1. `OllamaChatModel` and `OllamaStreamingChatModel` now support
`ChatListener` (see the sketch below)
2. Fix `OllamaStreamingLanguageModel` throwing an `EOFException` when the
response content is too long.
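
A minimal sketch of wiring a listener into the Ollama chat model, assuming the
`ChatModelListener` interface from `langchain4j-core` (the listener referred to
above) and a `listeners(...)` builder method; the base URL and model name are
placeholders:

```java
import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.chat.listener.ChatModelRequestContext;
import dev.langchain4j.model.chat.listener.ChatModelResponseContext;
import dev.langchain4j.model.ollama.OllamaChatModel;

import java.util.List;

public class OllamaListenerExample {

    public static void main(String[] args) {

        // A listener that simply logs request and response events.
        ChatModelListener listener = new ChatModelListener() {

            @Override
            public void onRequest(ChatModelRequestContext context) {
                System.out.println("request: " + context.request());
            }

            @Override
            public void onResponse(ChatModelResponseContext context) {
                System.out.println("response: " + context.response());
            }
        };

        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // placeholder: local Ollama instance
                .modelName("llama3")               // placeholder: any locally pulled model
                .listeners(List.of(listener))      // attach the listener
                .build();

        System.out.println(model.generate("Why is the sky blue?"));
    }
}
```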

## General checklist
<!-- Please double-check the following points and mark them like this:
[X] -->
- [x] There are no breaking changes
- [x] I have added unit and integration tests for my change
- [x] I have manually run all the unit and integration tests in the
module I have added/changed, and they are all green
- [x] I have manually run all the unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules, and they are all green
<!-- Before adding documentation and example(s) (below), please wait
until the PR is reviewed and approved. -->
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added/updated [Spring Boot
starter(s)](https://github.com/langchain4j/langchain4j-spring) (if
applicable)
2024-09-17 11:12:43 +02:00
ZYinNJU 7c5e351486
langchain4j-ollama get rid of lombok (#1658)
## Issue
#1636 

## Change
Remove `lombok` from the `langchain4j-ollama` module.

## General checklist
- [x] There are no breaking changes
- [ ] I have added unit and integration tests for my change
- [x] I have manually run all the unit and integration tests in the
module I have added/changed, and they are all green
- [x] I have manually run all the unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules, and they are all green
<!-- Before adding documentation and example(s) (below), please wait
until the PR is reviewed and approved. -->
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added/updated [Spring Boot
starter(s)](https://github.com/langchain4j/langchain4j-spring) (if
applicable)
2024-09-10 13:49:53 +02:00
LangChain4j 21d35e4434 changed version to 0.35.0-SNAPSHOT 2024-09-09 10:11:09 +02:00
LangChain4j b0a8e6f45b
Release 0.34.0 (#1711) 2024-09-05 16:49:39 +02:00
bidek b818baa1e3
OllamaModels - list running models `api/ps` (#1562)
## Change
Add a new method to `OllamaModels` to list currently loaded/running
models.
Modify the old DTOs `OllamaModel` and `OllamaModelCard` to add missing
fields.
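
For context, `api/ps` is the Ollama HTTP endpoint that reports the models
currently loaded into memory. A minimal sketch of calling it directly with
`java.net.http` (the name of the new `OllamaModels` method is not shown in this
description, so the raw endpoint is used here):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaPsExample {

    public static void main(String[] args) throws Exception {
        // GET /api/ps lists the models currently loaded into memory.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/ps")) // placeholder base URL
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Raw JSON containing a "models" array (name, size, expires_at, ...)
        System.out.println(response.body());
    }
}
```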

## General checklist
- [X] There are no breaking changes
- [X] I have added unit and integration tests for my change
- [X] I have manually run all the unit and integration tests in the
module I have added/changed, and they are all green
- [ ] I have manually run all the unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules, and they are all green
<!-- Before adding documentation and example(s) (below), please wait
until the PR is reviewed and approved. -->
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added/updated [Spring Boot
starter(s)](https://github.com/langchain4j/langchain4j-spring) (if
applicable)
2024-09-05 10:43:49 +02:00
PrimosK c4b6ed5c78
re #1506 Enable Maven enforcer plugin and fix dependency conflict introduced by `okhttp` dependency in 19 modules. (#1645)
## Issue

#1506

## Change

Resolved version conflict:
```
[ERROR] Rule 0: org.apache.maven.enforcer.rules.dependency.DependencyConvergence failed with message:
[ERROR] Failed while enforcing releasability.
[ERROR]
[ERROR] Dependency convergence error for org.jetbrains.kotlin:kotlin-stdlib-jdk8:jar:1.9.10 paths to dependency are:
[ERROR] +-dev.langchain4j:langchain4j-ollama:jar:0.34.0-SNAPSHOT
[ERROR]   +-com.squareup.okhttp3:okhttp:jar:4.12.0:compile
[ERROR]     +-com.squareup.okio:okio:jar:3.6.0:compile
[ERROR]       +-com.squareup.okio:okio-jvm:jar:3.6.0:compile
[ERROR]         +-org.jetbrains.kotlin:kotlin-stdlib-jdk8:jar:1.9.10:compile
[ERROR] and
[ERROR] +-dev.langchain4j:langchain4j-ollama:jar:0.34.0-SNAPSHOT
[ERROR]   +-com.squareup.okhttp3:okhttp:jar:4.12.0:compile
[ERROR]     +-org.jetbrains.kotlin:kotlin-stdlib-jdk8:jar:1.8.21:compile
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :langchain4j-ollama
```

... caused by the `okhttp` dependency, and enabled the Maven enforcer plugin in
the following modules:

- LangChain4j :: Integration :: Anthropic
- LangChain4j :: Integration :: ChatGLM
- LangChain4j :: Integration :: Chroma
- LangChain4j :: Integration :: CloudFlare Workers AI
- LangChain4j :: Integration :: Cohere
- LangChain4j :: Integration :: DashScope
- LangChain4j :: Integration :: Hugging Face
- LangChain4j :: Integration :: Jina
- LangChain4j :: Integration :: Judge0
- LangChain4j :: Integration :: MistralAI
- LangChain4j :: Integration :: Nomic
- LangChain4j :: Integration :: OVHcloud AI
- LangChain4j :: Integration :: Ollama
- LangChain4j :: Integration :: Qianfan
- LangChain4j :: Integration :: Vearch
- LangChain4j :: Integration :: Vespa
- LangChain4j :: Integration :: Zhipu AI
- LangChain4j :: Web Search Engine :: SearchApi
- LangChain4j :: Web Search Engine :: Tavily

## Note

Please note that an [issue](https://github.com/square/okhttp/issues/8288)
for this was already created in the `okhttp` repository, but it will not be
fixed in 4.x. It's reportedly already tackled in version 5.x.

With that in mind, I suggest we apply the temporary changes proposed in this
PR. After upgrading to `okhttp` 5.x we will be able to remove them.

## Tests

`mvn clean test` passed
2024-08-27 17:44:06 +02:00
LangChain4j 1cccfdfa65 changed version to 0.34.0-SNAPSHOT 2024-07-26 15:12:26 +02:00
LangChain4j 822f09cb1c
Release 0.33.0 (#1514) 2024-07-25 10:12:20 +02:00
Shawn Pang c8fe669931
Fix ollama client missing path issue (#1456)
## Issue
Closes #1455

## General checklist
<!-- Please double-check the following points and mark them like this:
[X] -->
- [X] There are no breaking changes
- [X] I have added unit and integration tests for my change
- [X] I have manually run all the unit and integration tests in the
module I have added/changed, and they are all green
- [X] I have manually run all the unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules, and they are all green
<!-- Before adding documentation and example(s) (below), please wait
until the PR is reviewed and approved. -->
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added/updated [Spring Boot
starter(s)](https://github.com/langchain4j/langchain4j-spring) (if
applicable)
2024-07-16 11:11:31 +02:00
LangChain4j ee8f976782 Migrate ollama to jackson (#1072)
`PropertyNamingStrategies.SnakeCaseStrategy` requires Jackson 2.12+
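
For background, a minimal sketch of the naming strategy mentioned above; the
`Options` class and its fields are illustrative, not the actual Ollama DTOs:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategies;

public class SnakeCaseExample {

    // Illustrative DTO with camelCase fields, serialized as snake_case.
    static class Options {
        public double topP = 0.9;
        public int numPredict = 128;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                // PropertyNamingStrategies is only available from Jackson 2.12 onwards
                .setPropertyNamingStrategy(PropertyNamingStrategies.SNAKE_CASE);

        // prints {"top_p":0.9,"num_predict":128}
        System.out.println(mapper.writeValueAsString(new Options()));
    }
}
```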
2024-07-16 10:02:25 +02:00
ZYinNJU bf4d2cb2f0
Migrate ollama to jackson (#1072)
## Issue
Closes #1042 


## Change
Migrate Ollama module from Gson to Jackson.
<img width="999" alt="image"
src="https://github.com/langchain4j/langchain4j/assets/77151639/723e622a-506f-47ae-9f98-56e761ae69ed">

Note that there are three things that confused me (these issues already exist
in the original Ollama module):

1. I observed that some tests with an inputTokenCount assertion (such as
`assertThat(tokenUsage.inputTokenCount()).isEqualTo(35)`) failed because of a
different count of input tokens. I don't know whether it's due to an Ollama
update. I have corrected it now.
2. `should_propagate_failure_to_handler_onError` in
`Streaming{xxx}ModelIT` failed locally because the
`NullPointerException` does not have any message. Is it a problem with my
local environment?
3. Sometimes, if I run the tests **individually**, they all pass. But if
I run them at the same time (such as from the command line rather than the
IDE), they fail for strange reasons (e.g. input token usage will be null).
Maybe it's a network problem or a `Testcontainers` problem.

I'm not sure if these problems are due to my local environment, so if you
have any suggestions or solutions, please let me know!

## General checklist
<!-- Please double-check the following points and mark them like this:
[X] -->
- [x] There are no breaking changes
- [ ] I have added unit and integration tests for my change
- [x] I have manually run all the unit and integration tests in the
module I have added/changed, and they are all green
- [x] I have manually run all the unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules, and they are all green
<!-- Before adding documentation and example(s) (below), please wait
until the PR is reviewed and approved. -->
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
2024-07-16 09:43:10 +02:00
LangChain4j fe50c88e77 changed version to 0.33.0-SNAPSHOT 2024-07-08 14:47:07 +02:00
LangChain4j c2366a226c
Release 0.32.0 (#1409) 2024-07-04 12:04:29 +02:00
LangChain4j a1b733d96d bumped version to 0.32.0-SNAPSHOT 2024-05-24 16:25:13 +02:00
LangChain4j d9cb1e9b81
Release 0.31.0 (#1151) 2024-05-23 17:40:52 +02:00
LangChain4j 66c338c135 changed version to 0.31.0-SNAPSHOT 2024-04-29 11:21:00 +02:00
LangChain4j 1a340893ec
Release 0.30.0 (#945) 2024-04-16 18:21:01 +02:00
LangChain4j d1d9b45adc bumped to 0.30.0-SNAPSHOT 2024-04-08 17:36:52 +02:00
LangChain4j 45b58ac993
released 0.29.1 (#857) 2024-03-28 16:42:45 +01:00
LangChain4j d1e3cc1693
Release 0.29.0 (#830) 2024-03-26 11:54:43 +01:00
LangChain4j fbced4e70e Ollama: test that OpenAI API (OpenAiChatModel) works 2024-03-22 11:46:28 +01:00
LangChain4j da816fd491
Fix #756: Allow blank content in AiMessage, propagate failures into streaming handler (Ollama) (#782)
<!-- Thank you so much for your contribution! -->

## Context
See https://github.com/langchain4j/langchain4j/issues/756

## Change
- Allow creating `AiMessage` with blank ("", " ") content. `null` is
still prohibited.
- In `OllamaStreamingChat/LanguageModel`: propagate failures from the
`onResponse()` method into the `StreamingResponseHandler.onError()` method
(see the sketch below)
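
A minimal sketch of the propagation pattern described in the second bullet;
the `handleChunk` method and its parsing step are illustrative, not the actual
`OllamaStreamingChatModel` code:

```java
import dev.langchain4j.model.StreamingResponseHandler;

public class PropagateFailureSketch {

    // Any exception thrown while handling a streamed chunk is routed into
    // StreamingResponseHandler.onError() instead of escaping the HTTP callback.
    static void handleChunk(String partialJson, StreamingResponseHandler<String> handler) {
        try {
            String token = parse(partialJson); // illustrative parsing step
            handler.onNext(token);
        } catch (Exception e) {
            handler.onError(e); // propagate the failure to the caller's handler
        }
    }

    private static String parse(String partialJson) {
        if (partialJson == null || partialJson.isBlank()) {
            throw new IllegalArgumentException("empty chunk");
        }
        return partialJson;
    }

    public static void main(String[] args) {
        StreamingResponseHandler<String> handler = new StreamingResponseHandler<>() {
            @Override
            public void onNext(String token) {
                System.out.println("token: " + token);
            }

            @Override
            public void onError(Throwable error) {
                System.out.println("error: " + error);
            }
        };

        handleChunk("hello", handler); // forwarded via onNext()
        handleChunk("", handler);      // failure ends up in onError() instead of being lost
    }
}
```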

## Checklist
Before submitting this PR, please check the following points:
- [X] I have added unit and integration tests for my change
- [X] All unit and integration tests in the module I have added/changed
are green
- [X] All unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules are green
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added my new module in the
[BOM](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-bom/pom.xml)
(only when a new module is added)
2024-03-22 09:39:48 +01:00
Eddú Meléndez Gonzales 3fabe0ed66
Use Testcontainers Ollama module (#702)
Testcontainers 1.19.7 offers an Ollama module. It also configures the GPU if
available.
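
A rough sketch of how the module can be used, assuming the
`org.testcontainers.ollama.OllamaContainer` class shipped with Testcontainers
1.19.7; the image tag and model name are placeholders:

```java
import org.testcontainers.ollama.OllamaContainer;
import org.testcontainers.utility.DockerImageName;

public class OllamaContainerExample {

    public static void main(String[] args) throws Exception {
        // Requires the org.testcontainers:ollama dependency (1.19.7+); image tag is a placeholder.
        try (OllamaContainer ollama = new OllamaContainer(DockerImageName.parse("ollama/ollama:0.1.26"))) {
            ollama.start();

            // Pull a small model inside the container, then talk to it over HTTP.
            ollama.execInContainer("ollama", "pull", "orca-mini");

            String baseUrl = "http://" + ollama.getHost() + ":" + ollama.getMappedPort(11434);
            System.out.println("Ollama available at " + baseUrl);
        }
    }
}
```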
2024-03-19 13:05:24 +01:00
LangChain4j 91db3d354a bumped to 0.29.0-SNAPSHOT 2024-03-14 13:31:28 +01:00
LangChain4j 90fe3040b9
released 0.28.0 (#735) 2024-03-11 20:08:55 +01:00
LangChain4j 197b4af9d1 bumped version to 0.28.0-SNAPSHOT 2024-02-09 15:11:52 +01:00
LangChain4j c1462c087f
release 0.27.1 (#621) 2024-02-09 15:00:42 +01:00
LangChain4j ad2fd90f32 bumped version to 0.28.0-SNAPSHOT 2024-02-09 08:12:28 +01:00
LangChain4j a22d297104
Release 0.27.0 (#615) 2024-02-09 08:00:34 +01:00
Antonio Goncalves baac759766
Beautifying Maven output (#572)
Looking at the Maven output I thought it could benefit from a little
renaming. I just changed the `<name>` in the `pom.xml`, nothing more.
The output is like this at the moment:

![Screenshot 2024-01-30 at 16 26 53](https://github.com/langchain4j/langchain4j/assets/729277/940886d1-565e-416f-a58e-91f609fc0c00)

It could look like this if this PR is merged:

![Screenshot 2024-01-30 at 16 42 38](https://github.com/langchain4j/langchain4j/assets/729277/f8787af2-b869-4e95-90bd-72bce5622737)

Just a personal taste. Let me know if you like it or not (or want to
change it). If not, just discard it, it's fine ;o)
2024-01-30 16:54:54 +01:00
LangChain4j fca8ca48f7 bump version to 0.27.0-SNAPSHOT 2024-01-30 16:18:40 +01:00
LangChain4j 3958e01738
release 0.26.1 (#570) 2024-01-30 16:11:21 +01:00
LangChain4j 469699b944 bump version to 0.27.0-SNAPSHOT 2024-01-30 08:07:45 +01:00
LangChain4j a8ad9e48d9
Automate release (#562) 2024-01-30 07:20:20 +01:00
LangChain4j 7e5e82b7b2 updated to 0.26.0-SNAPSHOT 2023-12-22 18:08:19 +01:00
LangChain4j 2a5308b794 released 0.25.0 2023-12-22 18:02:04 +01:00
LangChain4j 7181633dcc
Ollama: add OllamaStreamingChatModel, "format" (json) and other parameters (#373)
- added `OllamaStreamingChatModel`
- added a `format` parameter to all models; valid JSON can now be requested
with `format="json"`
- added `top_k`, `top_p`, `repeat_penalty`, `seed`, `num_predict` and
`stop` parameters to all models (see the sketch below)
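
A minimal sketch of how these parameters might be set via the builder; the
builder method names are assumed to mirror the parameter names listed above,
and the base URL and model name are placeholders:

```java
import java.util.List;

import dev.langchain4j.model.ollama.OllamaChatModel;

public class OllamaJsonModeExample {

    public static void main(String[] args) {
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // placeholder: local Ollama instance
                .modelName("llama3")               // placeholder: any locally pulled model
                .format("json")                    // ask Ollama to return valid JSON
                .topK(40)
                .topP(0.9)
                .repeatPenalty(1.1)
                .seed(42)
                .numPredict(128)
                .stop(List.of("\n\n"))
                .build();

        System.out.println(model.generate("List three colors as a JSON array."));
    }
}
```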
2023-12-21 12:14:11 +01:00
LangChain4j e1dddb33a2
bumped version to 0.25.0-SNAPSHOT (#369) 2023-12-19 13:03:48 +01:00
Eddú Meléndez Gonzales 00db7557ce
Use Testcontainers in Ollama IT (#315)
Currently, integration tests in the Ollama module are disabled because
they need a running Ollama instance in order to execute.
Testcontainers provides this infrastructure, not only by running
the Ollama container but also by automating the model pull step.

This commit uses the Singleton Container approach to reuse a single
instance across multiple ITs (see the sketch after this description). Also,
the pull step is only executed when the image is `ollama/ollama`.

The behavior is:
1st execution:
1. Pull `ollama/ollama` image
2. Start the container based on `ollama/ollama` image
3. Download the `orca-mini` model
4. Create an image based on the current state (with the model in it)
5. Declare the container ready to use
6. Run test

Next executions:
1. Look for the local image created in the 1st execution
2. Start the container based on the local image
3. Declare the container ready to use
4. Run test

The 1st execution is expected to take longer because of the model download
(3 GB). Subsequent executions are much faster.
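
A rough sketch of the Singleton Container approach, simplified to the
shared-instance part (the actual IT also pulls the `orca-mini` model and
commits a reusable local image, as described above):

```java
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.utility.DockerImageName;

// Singleton Container pattern: the container is started once in a static
// initializer and shared by every integration test that extends this class.
abstract class AbstractOllamaIT {

    static final GenericContainer<?> OLLAMA =
            new GenericContainer<>(DockerImageName.parse("ollama/ollama"))
                    .withExposedPorts(11434);

    static {
        OLLAMA.start(); // started once per test run, reused across all ITs
        // (the actual IT also pulls the orca-mini model and commits a local
        //  image so that later runs skip the 3 GB download)
    }

    static String baseUrl() {
        return "http://" + OLLAMA.getHost() + ":" + OLLAMA.getMappedPort(11434);
    }
}
```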
2023-12-05 10:31:02 +01:00
deep-learning-dynamo 16f60dbef9 reducing duplication of *EmbeddingStoreIT 2023-11-18 16:23:29 +01:00
deep-learning-dynamo 21dfc8b317 released 0.24.0 2023-11-12 18:58:31 +01:00
ZYinNJU 677cf26bca
Ollama integration (#249)
@langchain4j Hi! I've made some progress integrating with Ollama, see
#244. `retrofit` is used to define the REST API from the [Ollama
doc](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion).

Local tests need a local deployment of `Ollama`. Looking forward to
your code review!
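
A minimal sketch of what a Retrofit-based client for the `api/generate`
endpoint could look like; the interface and DTO names here are illustrative,
not the actual module code:

```java
import retrofit2.Call;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
import retrofit2.http.Body;
import retrofit2.http.POST;

public class OllamaRetrofitSketch {

    // Illustrative request/response shapes for Ollama's /api/generate endpoint.
    static class CompletionRequest {
        String model = "orca-mini";
        String prompt;
        boolean stream = false;
    }

    static class CompletionResponse {
        String response;
    }

    // Retrofit turns this interface into an HTTP client for the Ollama REST API.
    interface OllamaApi {
        @POST("api/generate")
        Call<CompletionResponse> generate(@Body CompletionRequest request);
    }

    public static void main(String[] args) throws Exception {
        OllamaApi api = new Retrofit.Builder()
                .baseUrl("http://localhost:11434/") // placeholder: local Ollama deployment
                .addConverterFactory(GsonConverterFactory.create())
                .build()
                .create(OllamaApi.class);

        CompletionRequest request = new CompletionRequest();
        request.prompt = "Why is the sky blue?";

        System.out.println(api.generate(request).execute().body().response);
    }
}
```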

---------

Co-authored-by: Heezer <33568148+Heezer@users.noreply.github.com>
2023-11-07 21:39:40 +01:00