
ChatVertexAI does not support gpt-oss-120b-maas model #9150

@0xwata


Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { ChatVertexAI } from '@langchain/google-vertexai';

(async () => {
  // Initialize ChatVertexAI
  const chat = new ChatVertexAI({
    model: 'gpt-oss-120b-maas', // I also tried `openai/gpt-oss-120b-maas`, but it didn't work
    temperature: 0.7,
    maxOutputTokens: 2048,
  });

  // Simple test prompt
  const testPrompt = 'Hello, how are you?';

  console.log('Sending request...');

  // Invoke the model
  const response = await chat.invoke(testPrompt);

  console.log('\n=== Response ===');
  console.log(response.content);
  console.log('\n=== Done ===');
})();

Error Message and Stack Trace (if applicable)

throw new Error(`Unable to verify model params: ${JSON.stringify(params)}`);
Error: Unable to verify model params: {"lc":1,"type":"constructor","id":
["langchain","chat_models","chat_integration","ChatVertexAI"],
"kwargs":{"model":"gpt-oss-120b-maas","temperature":0.7,"platform_type":"gcp"}}

at validateModelParams (/[REDACTED_ROOT]/node_modules/.pnpm/@langchain+google-common@0.2.18_@langchain+core@0.3.78_@opentelemetry+api@1.9.0_@opente_eeff1b37dcb0d97159f515e53381d955/node_modules/@langchain/google-common/dist/utils/common.cjs:226:19)
    at copyAndValidateModelParamsInto (/[REDACTED_ROOT]/node_modules/.pnpm/@langchain+google-common@0.2.18_@langchain+core@0.3.78_@opentelemetry+api@1.9.0_@opente_eeff1b37dcb0d97159f515e53381d955/node_modules/@langchain/google-common/dist/utils/common.cjs:231:5)
    at new ChatGoogleBase (/[REDACTED_ROOT]/node_modules/.pnpm/@langchain+google-common@0.2.18_@langchain+core@0.3.78_@opentelemetry+api@1.9.0_@opente_eeff1b37dcb0d97159f515e53381d955/node_modules/@langchain/google-common/dist/chat_models.cjs:263:56)
    at new ChatGoogle (/[REDACTED_ROOT]/node_modules/.pnpm/@langchain+google-gauth@0.2.18_@langchain+core@0.3.78_@opentelemetry+api@1.9.0_@opentel_013af39eb4d2fb5aa8468ec7e96195c2/node_modules/@langchain/google-gauth/dist/chat_models.cjs:15:9)
    at new ChatVertexAI (/[REDACTED_ROOT]/node_modules/.pnpm/@langchain+google-vertexai@0.2.18_@langchain+core@0.3.78_@opentelemetry+api@1.9.0_@open_3483a2f1b6c8dce7d90d5491425c72fa/node_modules/@langchain/google-vertexai/dist/chat_models.cjs:292:9)
    at <anonymous> (/[REDACTED_ROOT]/scripts/test-langchain.ts:5:16)
    at <anonymous> (/[REDACTED_ROOT]/scripts/test-langchain.ts:22:1)
    at Object.<anonymous> (/[REDACTED_ROOT]/scripts/test-langchain.ts:22:4)
    at Module._compile (node:internal/modules/cjs/loader:1554:14)
    at Object.transformer (/[REDACTED_ROOT]/node_modules/.pnpm/tsx@4.20.5/node_modules/tsx/dist/register-D46fvsV_.cjs:3:1104)

Node.js v22.14.0

Description

I'm trying to use the gpt-oss-120b-maas model with ChatVertexAI. The model is available in Google Cloud's Vertex AI, so I expect initialization to succeed. Instead, the constructor throws "Unable to verify model params" and initialization fails.

Current Behavior

When initializing ChatVertexAI with model: "gpt-oss-120b-maas", the following error is
thrown:

  Error: Unable to verify model params: {"lc":1,"type":"constructor","id":
  ["langchain","chat_models","chat_integration","ChatVertexAI"],
  "kwargs":{"model":"gpt-oss-120b-maas","temperature":0.7,"platform_type":"gcp"}}

The error originates from validateModelParams in
@langchain/google-common/dist/utils/common.cjs:226.
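To illustrate the suspected cause, here is a hypothetical sketch (not the actual library source) of how a validator like this fails: it appears to infer the model family from the model-name prefix and throws for any name it does not recognize, such as `gpt-oss-120b-maas`. The family names and prefixes below are assumptions for illustration only.

```typescript
// Hypothetical sketch of the suspected validation behavior in
// @langchain/google-common (NOT the real implementation): the model family
// is guessed from the name prefix, and unknown prefixes are rejected.
type ModelFamily = "gemini" | "gemma" | "palm";

function guessModelFamily(model: string): ModelFamily | undefined {
  if (model.startsWith("gemini-")) return "gemini";
  if (model.startsWith("gemma")) return "gemma";
  if (model.startsWith("text-bison") || model.startsWith("chat-bison")) {
    return "palm";
  }
  // Unrecognized prefix, e.g. "gpt-oss-120b-maas"
  return undefined;
}

function validateModelParamsSketch(params: { model: string }): void {
  if (guessModelFamily(params.model) === undefined) {
    throw new Error(`Unable to verify model params: ${JSON.stringify(params)}`);
  }
}
```

Under this sketch, `gemini-2.5-pro` passes validation while `gpt-oss-120b-maas` throws, which matches the observed behavior.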

Expected Behavior

ChatVertexAI should initialize successfully with the gpt-oss-120b-maas model, just as it does with gemini-2.5-pro. I have verified that the example code above runs without errors when gpt-oss-120b-maas is replaced with gemini-2.5-pro: the model initializes and executes correctly.
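As a possible interim workaround until ChatVertexAI accepts this model name: Vertex AI Model-as-a-Service models are documented to expose an OpenAI-compatible chat/completions endpoint, which can be called directly with a bearer token. The exact URL path and the `openai/gpt-oss-120b-maas` model id below are assumptions based on the Model Garden docs, so please verify them against your project's documentation.

```typescript
// Workaround sketch (assumptions, not a verified fix): call the
// OpenAI-compatible Vertex AI MaaS endpoint directly. The URL shape and the
// model id are assumptions; check the Model Garden docs for your region.
function maasChatCompletionsUrl(project: string, region: string): string {
  return (
    `https://${region}-aiplatform.googleapis.com/v1beta1` +
    `/projects/${project}/locations/${region}/endpoints/openapi/chat/completions`
  );
}

async function callMaas(project: string, region: string, accessToken: string) {
  // accessToken: e.g. the output of `gcloud auth print-access-token`
  const res = await fetch(maasChatCompletionsUrl(project, region), {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-oss-120b-maas", // assumed MaaS model id
      messages: [{ role: "user", content: "Hello, how are you?" }],
    }),
  });
  return res.json();
}
```

The same endpoint could presumably also be passed to ChatOpenAI as a custom `baseURL`, but that has not been verified here.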

System Info

System Information

OS: macOS
OS Version: macOS Tahoe 26.0.1
Node version: v22.14.0
pnpm version: 10.5.0

Package Information

langchain: 0.3.35
langsmith: 0.3.2
@langchain/core: 0.3.78
@langchain/google-vertexai: 0.2.18

Metadata

Labels: bug (Something isn't working)