
feat: Add responseMimeType option in vertexAiGeminiChatOptions #1185


Closed
wants to merge 2 commits

Conversation

jyami-kim
Contributor

The Gemini model provides the responseMimeType parameter, as documented in the Gemini Model Reference.

However, when attempting to call the Gemini model using Spring AI, there is no direct option to set this parameter.
To work around this, I used reflection in my project:

        import com.google.cloud.vertexai.VertexAI
        import com.google.cloud.vertexai.api.GenerationConfig
        import org.springframework.ai.vertexai.gemini.VertexAiGeminiChatModel

        val chatModel = VertexAiGeminiChatModel(
            VertexAI(System.getenv("GEMINI_PROJECT_ID"), System.getenv("GEMINI_LOCATION"))
        )

        val generationConfig = GenerationConfig.newBuilder()
            .setResponseMimeType("application/json") // responseMimeType cannot be set via the chat options, so it is configured here and injected below
            .setTemperature(0.8f)
            .build()

        // Overwrite the model's private generationConfig field via reflection
        val generationConfigField = VertexAiGeminiChatModel::class.java.getDeclaredField("generationConfig")
        generationConfigField.isAccessible = true
        generationConfigField.set(chatModel, generationConfig)

Therefore, it would be beneficial if the responseMimeType parameter were officially supported. The default model used by VertexAiGeminiChatModel is GEMINI_1_5_PRO, which already supports this option, so it could be exposed through the chat options without changing the defaults.
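
A minimal sketch of how this could look once exposed through VertexAiGeminiChatOptions. The withResponseMimeType builder method and the exact signatures shown here are assumptions for illustration, not the merged API:

        // Hypothetical usage sketch: withResponseMimeType is an assumed builder
        // method name; the rest mirrors the existing options-based construction.
        val options = VertexAiGeminiChatOptions.builder()
            .withModel("gemini-1.5-pro")
            .withResponseMimeType("application/json")
            .build()

        val chatModel = VertexAiGeminiChatModel(
            VertexAI(System.getenv("GEMINI_PROJECT_ID"), System.getenv("GEMINI_LOCATION")),
            options
        )

This would remove the need for the reflection workaround above, since the generation config would be derived from the options.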

@markpollack markpollack added this to the 1.0.0-M2 milestone Aug 22, 2024
@markpollack
Member

Thanks! Important feature indeed. I updated the docs. Merged in bc55bc7
