cb61a5e284
This should fix test failures on aarch64.

```
expected to be less than: 0.0 but was : 0.0
    at app//io.grpc.benchmarks.driver.LoadWorkerTest.assertWorkOccurred(LoadWorkerTest.java:198)
    at app//io.grpc.benchmarks.driver.LoadWorkerTest.runUnaryBlockingClosedLoop(LoadWorkerTest.java:90)
```

runUnaryBlockingClosedLoop() has been failing while the other tests succeed. The failure is complaining that getCount() == 0, which means no RPCs completed. The slowest successful test has a mean RPC time of 226 ms (the unit was logged incorrectly), and compared to the x86 tests runUnaryBlockingClosedLoop() is ~2x as slow because it executes first. So this is probably _barely_ failing, and 4 attempts instead of 3 would be sufficient.

While the test tries to wait for 10 RPCs to complete, it seems likely it is stopping early even for the successful runs on aarch64. There are 4 concurrent RPCs, so to get 10 RPCs we need to wait for 3 batches of RPCs to complete, which would be 1346 ms (5 loops) assuming a 452 ms mean latency. Bumping the timeout by 10x to give lots of headroom.
README.md
gRPC-Java - An RPC library and framework
gRPC-Java works with JDK 8. gRPC-Java clients are supported on Android API levels 19 and up (KitKat and later). Deploying gRPC servers on an Android device is not supported.
TLS usage typically requires using Java 8, or Play Services Dynamic Security Provider on Android. Please see the Security Readme.
Homepage: grpc.io
Mailing List: grpc-io@googlegroups.com
Getting Started
For a guided tour, take a look at the quick start guide or the more explanatory gRPC basics.
The examples and the Android example are standalone projects that showcase the usage of gRPC.
Download
Download the JARs. Or for Maven with non-Android, add to your pom.xml:
```xml
<dependency>
  <groupId>io.grpc</groupId>
  <artifactId>grpc-netty-shaded</artifactId>
  <version>1.45.1</version>
  <scope>runtime</scope>
</dependency>
<dependency>
  <groupId>io.grpc</groupId>
  <artifactId>grpc-protobuf</artifactId>
  <version>1.45.1</version>
</dependency>
<dependency>
  <groupId>io.grpc</groupId>
  <artifactId>grpc-stub</artifactId>
  <version>1.45.1</version>
</dependency>
<dependency> <!-- necessary for Java 9+ -->
  <groupId>org.apache.tomcat</groupId>
  <artifactId>annotations-api</artifactId>
  <version>6.0.53</version>
  <scope>provided</scope>
</dependency>
```
Or for Gradle with non-Android, add to your dependencies:
```gradle
runtimeOnly 'io.grpc:grpc-netty-shaded:1.45.1'
implementation 'io.grpc:grpc-protobuf:1.45.1'
implementation 'io.grpc:grpc-stub:1.45.1'
compileOnly 'org.apache.tomcat:annotations-api:6.0.53' // necessary for Java 9+
```
For Android client, use grpc-okhttp instead of grpc-netty-shaded and grpc-protobuf-lite instead of grpc-protobuf:
```gradle
implementation 'io.grpc:grpc-okhttp:1.45.1'
implementation 'io.grpc:grpc-protobuf-lite:1.45.1'
implementation 'io.grpc:grpc-stub:1.45.1'
compileOnly 'org.apache.tomcat:annotations-api:6.0.53' // necessary for Java 9+
```
Development snapshots are available in Sonatype's snapshot repository.
Generated Code
For protobuf-based codegen, you can put your proto files in the src/main/proto and src/test/proto directories along with an appropriate plugin.
For protobuf-based codegen integrated with the Maven build system, you can use protobuf-maven-plugin (Eclipse and NetBeans users should also look at os-maven-plugin's IDE documentation):
```xml
<build>
  <extensions>
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.6.1</version>
      <configuration>
        <protocArtifact>com.google.protobuf:protoc:3.19.2:exe:${os.detected.classifier}</protocArtifact>
        <pluginId>grpc-java</pluginId>
        <pluginArtifact>io.grpc:protoc-gen-grpc-java:1.45.1:exe:${os.detected.classifier}</pluginArtifact>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>compile-custom</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```
For non-Android protobuf-based codegen integrated with the Gradle build system, you can use protobuf-gradle-plugin:
```gradle
plugins {
  id 'com.google.protobuf' version '0.8.17'
}

protobuf {
  protoc {
    artifact = "com.google.protobuf:protoc:3.19.2"
  }
  plugins {
    grpc {
      artifact = 'io.grpc:protoc-gen-grpc-java:1.45.1'
    }
  }
  generateProtoTasks {
    all()*.plugins {
      grpc {}
    }
  }
}
```
The prebuilt protoc-gen-grpc-java binary uses glibc on Linux. If you are compiling on Alpine Linux, you may want to use the Alpine grpc-java package which uses musl instead.
For Android protobuf-based codegen integrated with the Gradle build system, also use protobuf-gradle-plugin but specify the 'lite' options:
```gradle
plugins {
  id 'com.google.protobuf' version '0.8.17'
}

protobuf {
  protoc {
    artifact = "com.google.protobuf:protoc:3.19.2"
  }
  plugins {
    grpc {
      artifact = 'io.grpc:protoc-gen-grpc-java:1.45.1'
    }
  }
  generateProtoTasks {
    all().each { task ->
      task.builtins {
        java { option 'lite' }
      }
      task.plugins {
        grpc { option 'lite' }
      }
    }
  }
}
```
API Stability
APIs annotated with @Internal are for internal use by the gRPC library and should not be used by gRPC users. APIs annotated with @ExperimentalApi are subject to change in future releases, and library code that other projects may depend on should not use these APIs.

We recommend using the grpc-java-api-checker (an Error Prone plugin) to check for usages of @ExperimentalApi and @Internal in any library code that depends on gRPC. It may also be used to check for @Internal usage or unintended @ExperimentalApi consumption in non-library code.
How to Build
If you are making changes to gRPC-Java, see the compiling instructions.
High-level Components
At a high level there are three distinct layers to the library: Stub, Channel, and Transport.
Stub
The Stub layer is what is exposed to most developers and provides type-safe bindings to whatever datamodel/IDL/interface you are adapting. gRPC comes with a plugin to the protocol-buffers compiler that generates Stub interfaces out of .proto files, but bindings to other datamodel/IDL are easy and encouraged.
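As a minimal sketch of what using a generated stub looks like, the following assumes the GreeterGrpc, HelloRequest, and HelloReply classes that protoc-gen-grpc-java emits for the hello-world example proto; the host, port, and class name here are illustrative only, not part of the library itself.

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
// Assumed: generated classes from the hello-world example proto (Greeter service).
import io.grpc.examples.helloworld.GreeterGrpc;
import io.grpc.examples.helloworld.HelloReply;
import io.grpc.examples.helloworld.HelloRequest;
import java.util.concurrent.TimeUnit;

public final class GreeterClientSketch {
  public static void main(String[] args) throws InterruptedException {
    // Channels are heavyweight and should be reused; stubs are cheap to create.
    ManagedChannel channel = ManagedChannelBuilder.forAddress("localhost", 50051)
        .usePlaintext() // plaintext only for local experimentation
        .build();
    try {
      GreeterGrpc.GreeterBlockingStub stub = GreeterGrpc.newBlockingStub(channel);
      HelloReply reply = stub.sayHello(HelloRequest.newBuilder().setName("world").build());
      System.out.println(reply.getMessage());
    } finally {
      channel.shutdownNow().awaitTermination(5, TimeUnit.SECONDS);
    }
  }
}
```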
Channel
The Channel layer is an abstraction over Transport handling that is suitable for interception/decoration and exposes more behavior to the application than the Stub layer. It is intended to be easy for application frameworks to use this layer to address cross-cutting concerns such as logging, monitoring, auth, etc.
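As an illustration of interception at this layer, here is a minimal sketch of a logging ClientInterceptor attached to a channel via ManagedChannelBuilder.intercept(); the class name and target address are assumptions made for the example.

```java
import io.grpc.CallOptions;
import io.grpc.Channel;
import io.grpc.ClientCall;
import io.grpc.ClientInterceptor;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.MethodDescriptor;
import java.util.logging.Logger;

/** Logs the full method name of every outgoing call before delegating to the next Channel. */
public final class LoggingInterceptor implements ClientInterceptor {
  private static final Logger logger = Logger.getLogger(LoggingInterceptor.class.getName());

  @Override
  public <ReqT, RespT> ClientCall<ReqT, RespT> interceptCall(
      MethodDescriptor<ReqT, RespT> method, CallOptions callOptions, Channel next) {
    logger.info("Starting call to " + method.getFullMethodName());
    return next.newCall(method, callOptions);
  }

  public static void main(String[] args) {
    // The interceptor decorates every call made on this channel; the address is illustrative.
    ManagedChannel channel = ManagedChannelBuilder.forAddress("localhost", 50051)
        .usePlaintext()
        .intercept(new LoggingInterceptor())
        .build();
    // ... create stubs on this channel, make calls, then shut it down.
    channel.shutdownNow();
  }
}
```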
Transport
The Transport layer does the heavy lifting of putting and taking bytes off the wire. The interfaces to it are abstract just enough to allow plugging in of different implementations. Note the transport layer API is considered internal to gRPC and has weaker API guarantees than the core API under package io.grpc.
gRPC comes with three Transport implementations:
- The Netty-based transport is the main transport implementation based on Netty. It is for both the client and the server.
- The OkHttp-based transport is a lightweight transport based on OkHttp. It is mainly for use on Android and is for client only.
- The in-process transport is for when a server is in the same process as the client. It is useful for testing, while also being safe for production use.
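As a sketch of the in-process transport in a test-style setup, the following starts a server and client inside the same JVM and connects them by name. It assumes the generated Greeter classes from the hello-world example proto; the service implementation and names are illustrative only.

```java
import io.grpc.ManagedChannel;
import io.grpc.Server;
import io.grpc.inprocess.InProcessChannelBuilder;
import io.grpc.inprocess.InProcessServerBuilder;
import io.grpc.stub.StreamObserver;
// Assumed: generated classes from the hello-world example proto (Greeter service).
import io.grpc.examples.helloworld.GreeterGrpc;
import io.grpc.examples.helloworld.HelloReply;
import io.grpc.examples.helloworld.HelloRequest;

public final class InProcessSketch {
  public static void main(String[] args) throws Exception {
    // A unique name links the in-process server and channel; no network sockets involved.
    String name = InProcessServerBuilder.generateName();
    Server server = InProcessServerBuilder.forName(name)
        .directExecutor()
        .addService(new GreeterGrpc.GreeterImplBase() {
          @Override
          public void sayHello(HelloRequest req, StreamObserver<HelloReply> responseObserver) {
            responseObserver.onNext(
                HelloReply.newBuilder().setMessage("Hello " + req.getName()).build());
            responseObserver.onCompleted();
          }
        })
        .build()
        .start();
    ManagedChannel channel = InProcessChannelBuilder.forName(name).directExecutor().build();
    try {
      System.out.println(
          GreeterGrpc.newBlockingStub(channel)
              .sayHello(HelloRequest.newBuilder().setName("in-process").build())
              .getMessage());
    } finally {
      channel.shutdownNow();
      server.shutdownNow();
    }
  }
}
```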