This PR adds basic support for Gemini (text).
Images and tools will be added next.
---------
Co-authored-by: kuraleta <digital.kuraleta@gmail.com>
- added `OllamaStreamingChatModel`
- added `format` parameter to all models; valid JSON can now be obtained with
`format="json"`
- added `top_k`, `top_p`, `repeat_penalty`, `seed`, `num_predict`,
`stop` parameters to all models (see the sketch below)
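A minimal usage sketch, assuming the builder methods mirror the new parameters; the endpoint and model name below are placeholders for a local Ollama install, not part of this PR:
```java
import dev.langchain4j.model.ollama.OllamaChatModel;

public class OllamaJsonExample {

    public static void main(String[] args) {
        // Hedged sketch: builder method names are assumed to mirror the new parameters.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // assumed local Ollama endpoint
                .modelName("llama2")               // any model pulled into Ollama
                .format("json")                    // request valid JSON output
                .topK(40)
                .topP(0.9)
                .repeatPenalty(1.1)
                .seed(42)
                .numPredict(256)
                .build();

        System.out.println(model.generate("Describe the weather in Paris as a JSON object."));
    }
}
```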
This uses the same LangChain classes as
https://github.com/langchain4j/langchain4j/pull/298 by @Heezer
- langchain4j-core/src/main/java/dev/langchain4j/data/image/Image.java
- langchain4j-core/src/main/java/dev/langchain4j/model/image/ImageModel.java
I copy/pasted those 2 files here as #298 isn't merged yet, and my goal
is that we use the same API in the end.
Support [chatglm](https://github.com/THUDM/ChatGLM-6B), which was
mentioned in #267.
The [chatglm2](https://github.com/THUDM/ChatGLM2-6B) and
[chatglm3](https://github.com/THUDM/ChatGLM3) APIs are compatible with
OpenAI, so it is enough to support ChatGLM (see the sketch below).
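To illustrate that compatibility claim only (this is not what this PR implements), one could presumably point the existing OpenAI model at a ChatGLM2/3 server's OpenAI-compatible endpoint; the base URL and model name below are placeholders:
```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class ChatGlm2ViaOpenAiApiExample {

    public static void main(String[] args) {
        // Hypothetical sketch: ChatGLM2/3 expose an OpenAI-compatible API,
        // so the existing OpenAI model can presumably target their endpoint.
        OpenAiChatModel model = OpenAiChatModel.builder()
                .baseUrl("http://localhost:8000/v1")      // placeholder local server
                .apiKey("not-needed-for-a-local-server")  // placeholder
                .modelName("chatglm2-6b")                 // placeholder
                .build();

        System.out.println(model.generate("你好"));
    }
}
```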
Because ChatGLM does not have an official Docker image, I don't know how
to use `Testcontainers` for the tests. (I'm not familiar with
`Testcontainers`, so for now I copied the tests from the other modules,
lol.) I will update the tests to use `Testcontainers` after I learn about it
in a few days.
Removed the generic `AiMessage` constructor to make it clear which states
`AiMessage` can be in (it can contain either `text` OR
`toolExecutionRequests`, but not both at the same time).
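A minimal sketch of the two states, assuming the static factory methods (`AiMessage.from(...)`) are the intended replacement for the removed constructor:
```java
import dev.langchain4j.agent.tool.ToolExecutionRequest;
import dev.langchain4j.data.message.AiMessage;

public class AiMessageStatesExample {

    public static void main(String[] args) {
        // State 1: a plain text answer, no tool execution requests.
        AiMessage textAnswer = AiMessage.from("The capital of France is Paris.");

        // State 2: a request to execute a tool, no text.
        ToolExecutionRequest request = ToolExecutionRequest.builder()
                .name("getWeather")
                .arguments("{\"city\": \"Paris\"}")
                .build();
        AiMessage toolCall = AiMessage.from(request);

        System.out.println(textAnswer.text());
        System.out.println(toolCall.toolExecutionRequests());
    }
}
```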
A new `ChatLanguageModel` implementation, `OllamaChatModel`, is added to
handle interactions with Ollama, together with an associated integration
test. It includes the necessary configuration and message-generation
methods. This increases the project's modularity and provides a more
convenient, encapsulated way of interfacing with Ollama.
Fixes #241: Added support for Neo4j Vector Index
This commit brings support for Neo4j graph database in general, and uses
the vector index functionality, generally available since version 5.13.
Mostly aligned with the existing WeaviateEmbeddingStoreImpl
implementation and tests.
The tests have some additional Neo4j node assertions to check that the
nodes involved are correctly created.
If needed, the module creates the indexes for the vector search, i.e. `CALL
db.index.vector.createNodeIndex(<indexName>, <label>,
<embeddingProperty>, <dimension>, <distanceType>)`.
The required configurations are:
- the Neo4j index dimension parameter
- the Neo4j Java Driver connection instance
- as an alternative to the Neo4j Java Driver, we can create a
`Neo4jEmbeddingStore.builder().withBasicAuth(<url>, <username>,
<password>)`, which will create a Driver connection instance under the
hood
It is possible to customize, via the builder (see the sketch after this list):
- the index name (with default `langchain-embedding-index`)
- the Neo4j node label (with default `Document`)
- the Neo4j property key which saves the embeddings (with default
`embeddingProp`)
- the Neo4j index distanceType parameter
- the metadata prefix (with default `metadata.`)
- the text property key (with default `text`), which stores the text
field of `TextSegment`
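A hypothetical configuration sketch based on the options listed above; apart from `withBasicAuth`, the builder method names and the import path below are assumptions, not the definitive API:
```java
import dev.langchain4j.store.embedding.neo4j.Neo4jEmbeddingStore;

public class Neo4jEmbeddingStoreExample {

    public static void main(String[] args) {
        // Hypothetical sketch: setter names are inferred from the options above.
        Neo4jEmbeddingStore store = Neo4jEmbeddingStore.builder()
                .withBasicAuth("bolt://localhost:7687", "neo4j", "password")
                .dimension(384)                          // required: vector index dimension
                .indexName("langchain-embedding-index")  // default index name
                .label("Document")                       // default node label
                .embeddingProperty("embeddingProp")      // default embedding property key
                .metadataPrefix("metadata.")             // default metadata prefix
                .textProperty("text")                    // default text property key
                .build();
    }
}
```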
Created an example PR as well, on `langchain4j-examples` repo:
https://github.com/langchain4j/langchain4j-examples/pull/23
Allow the organizationId to be specified in the configuration. See
https://platform.openai.com/docs/api-reference/organization-optional
Right now there is no way to specify this.
I added `.organizationId(System.getenv("OPENAI_ORGANIZATION_ID"))` to
the builders in a bunch of the tests. This has no effect unless you want
the tests to pass an organizationId, in which case you can just set the
`OPENAI_ORGANIZATION_ID` environment variable; the `OpenAiClient` won't
do anything if the `organizationId` is `null`.
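A minimal sketch of how a builder picks it up; everything except `organizationId` is just illustrative:
```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class OrganizationIdExample {

    public static void main(String[] args) {
        // organizationId is optional: when the env var is unset it stays null
        // and the OpenAiClient simply ignores it.
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .organizationId(System.getenv("OPENAI_ORGANIZATION_ID"))
                .build();

        System.out.println(model.generate("Hello!"));
    }
}
```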
Happy for questions/comments.
Fixes #344
In the GitHub Actions workflow:
- Update actions/checkout to the latest version
- Update actions/setup-java to the latest version (Java 21 already works
but is undocumented; it will be documented in the next version thanks to
https://github.com/actions/setup-java/pull/538 😀)
Azure OpenAI 1.0.0-beta.6 is out:
- [x] Update to the new SDK
- [x] Add unit tests
- [x] Do integration tests on my account (as I have access to Azure
OpenAI)
This module is confusing people as its intent is not reflected by its
name.
Moreover, this module is simply a copy of the previous version of the
Azure OpenAI code that I will just copy into
quarkus-langchain4j-azure-openai.
Fix the thread-safety issue in `InMemoryEmbeddingStore`.
Log and stack trace:
```
java.util.ConcurrentModificationException: null
at java.util.ArrayList$Itr.checkForComodification(ArrayList.java:911)
at java.util.ArrayList$Itr.next(ArrayList.java:861)
at dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore.findRelevant(InMemoryEmbeddingStore.java:122)
at dev.langchain4j.store.embedding.EmbeddingStore.findRelevant(EmbeddingStore.java:67)
```
https://github.com/langchain4j/langchain4j/issues/350
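For illustration only (not necessarily the fix in this PR), a minimal sketch of the failure mode, assuming the entries live in a plain `ArrayList` that one thread iterates while another adds to it, plus one common remedy:
```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class ConcurrentIterationDemo {

    public static void main(String[] args) throws InterruptedException {
        // A plain ArrayList: iterating it while another thread adds entries
        // can fail with ConcurrentModificationException, as in the trace above.
        List<String> entries = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            entries.add("entry-" + i);
        }

        Thread writer = new Thread(() -> {
            for (int i = 0; i < 10_000; i++) {
                entries.add("added-" + i); // structural modification during iteration
            }
        });
        writer.start();

        try {
            long total = 0;
            for (String e : entries) { // may throw ConcurrentModificationException
                total += e.length();
            }
            System.out.println("iteration finished, total=" + total);
        } catch (ConcurrentModificationException e) {
            System.out.println("caught: " + e);
        }
        writer.join();

        // One common remedy: a copy-on-write list, whose iterators work on a
        // stable snapshot and never throw during concurrent writes.
        List<String> snapshotSafe = new CopyOnWriteArrayList<>(entries);
        System.out.println("snapshot size: " + snapshotSafe.size());
    }
}
```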
The Maven wrapper currently has 2 issues:
- The executable doesn't have the executable bit set
- The `.mvn` directory is ignored in Git
Fixing those allows running `./mvnw package` directly in Codespaces.