
Commit 456c857

rmalara authored and rodrigomalara committed
[GH-2609] Fix the thread leak issue in VertexAiTextEmbeddingModel
The PredictionServiceClient was not being closed. Connections were kept open, preventing resources from being disposed properly.

Signed-off-by: Rodrigo Malara <rodrigomalara@gmail.com>
Signed-off-by: rmalara <rmalara@interactions.com>
1 parent 2294c5a commit 456c857
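For context, here is a minimal, hypothetical sketch of the pattern this commit adopts, not the actual Spring AI code: PredictionServiceClient implements AutoCloseable, so creating it in a try-with-resources block shuts down its gRPC channel and background threads when the block exits. The class name and the regional endpoint below are placeholders; the real model obtains its client via its own createPredictionServiceClient() factory method.

import com.google.cloud.aiplatform.v1.PredictionServiceClient;
import com.google.cloud.aiplatform.v1.PredictionServiceSettings;

public class ClientCloseSketch {

	public static void main(String[] args) throws Exception {
		// Placeholder endpoint; the model derives the real one from its connection details.
		PredictionServiceSettings settings = PredictionServiceSettings.newBuilder()
			.setEndpoint("us-central1-aiplatform.googleapis.com:443")
			.build();

		// Pre-fix shape (leaks): the client was created but never closed, so its
		// channel and executor threads lingered after every embedding call.
		//   PredictionServiceClient client = PredictionServiceClient.create(settings);

		// Post-fix shape: try-with-resources closes the client when the block exits,
		// releasing the underlying gRPC channel and its background threads.
		try (PredictionServiceClient client = PredictionServiceClient.create(settings)) {
			// ... issue predict() calls and build the embedding response here ...
		}
	}
}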

File tree

1 file changed: +23 −22 lines changed

models/spring-ai-vertex-ai-embedding/src/main/java/org/springframework/ai/vertexai/embedding/text/VertexAiTextEmbeddingModel.java

Lines changed: 23 additions & 22 deletions
@@ -128,37 +128,38 @@ public EmbeddingResponse call(EmbeddingRequest request) {
 		.observation(this.observationConvention, DEFAULT_OBSERVATION_CONVENTION, () -> observationContext,
 				this.observationRegistry)
 		.observe(() -> {
-			PredictionServiceClient client = createPredictionServiceClient();
+			try (PredictionServiceClient client = createPredictionServiceClient()) {
 
-			EndpointName endpointName = this.connectionDetails.getEndpointName(finalOptions.getModel());
+				EndpointName endpointName = this.connectionDetails.getEndpointName(finalOptions.getModel());
 
-			PredictRequest.Builder predictRequestBuilder = getPredictRequestBuilder(request, endpointName,
-					finalOptions);
+				PredictRequest.Builder predictRequestBuilder = getPredictRequestBuilder(request, endpointName,
+						finalOptions);
 
-			PredictResponse embeddingResponse = this.retryTemplate
-				.execute(context -> getPredictResponse(client, predictRequestBuilder));
+				PredictResponse embeddingResponse = this.retryTemplate
+					.execute(context -> getPredictResponse(client, predictRequestBuilder));
 
-			int index = 0;
-			int totalTokenCount = 0;
-			List<Embedding> embeddingList = new ArrayList<>();
-			for (Value prediction : embeddingResponse.getPredictionsList()) {
-				Value embeddings = prediction.getStructValue().getFieldsOrThrow("embeddings");
-				Value statistics = embeddings.getStructValue().getFieldsOrThrow("statistics");
-				Value tokenCount = statistics.getStructValue().getFieldsOrThrow("token_count");
-				totalTokenCount = totalTokenCount + (int) tokenCount.getNumberValue();
+				int index = 0;
+				int totalTokenCount = 0;
+				List<Embedding> embeddingList = new ArrayList<>();
+				for (Value prediction : embeddingResponse.getPredictionsList()) {
+					Value embeddings = prediction.getStructValue().getFieldsOrThrow("embeddings");
+					Value statistics = embeddings.getStructValue().getFieldsOrThrow("statistics");
+					Value tokenCount = statistics.getStructValue().getFieldsOrThrow("token_count");
+					totalTokenCount = totalTokenCount + (int) tokenCount.getNumberValue();
 
-				Value values = embeddings.getStructValue().getFieldsOrThrow("values");
+					Value values = embeddings.getStructValue().getFieldsOrThrow("values");
 
-				float[] vectorValues = VertexAiEmbeddingUtils.toVector(values);
+					float[] vectorValues = VertexAiEmbeddingUtils.toVector(values);
 
-				embeddingList.add(new Embedding(vectorValues, index++));
-			}
-			EmbeddingResponse response = new EmbeddingResponse(embeddingList,
-					generateResponseMetadata(finalOptions.getModel(), totalTokenCount));
+					embeddingList.add(new Embedding(vectorValues, index++));
+				}
+				EmbeddingResponse response = new EmbeddingResponse(embeddingList,
+						generateResponseMetadata(finalOptions.getModel(), totalTokenCount));
 
-			observationContext.setResponse(response);
+				observationContext.setResponse(response);
 
-			return response;
+				return response;
+			}
 		});
 }
