Long Conversation Variable Descriptions Cause "Internal Server Error" When Editing Chatflow #25540

@QuantumGhost

Description

Self Checks

  • I have read the Contributing Guide and Language Policy.
  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report, otherwise it will be closed.
  • [Chinese & Non-English Users] Please submit in English, otherwise the issue will be closed :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

1.8.1

Cloud or Self Hosted

Cloud

Steps to reproduce

Import the following DSL and edit it.

DSL
app:
  description: ''
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: conversation variable desc length
  use_icon_as_answer_icon: false
dependencies:
- current_identifier: null
  type: marketplace
  value:
    marketplace_plugin_unique_identifier: langgenius/openai:0.1.0@539dc0dde8ecaa84e61b8eeb154c3bb7fbc9caa80afd83763dd5860a725d2742
kind: app
version: 0.4.0
workflow:
  conversation_variables:
  - description: =====94a7f93b7d03ac3405c3fb7da00705b65265f3f01620976a2fb22dbeb8e26ea77f1c4a54b052e6c02a5f03cf0a2dcc809dd80771bfe1401a4e7cf99aeae1f8fbf8692750b414ea541caa0c0816cce948f54812a1c33ce48b6d1f98024f103b9bcff6aaca2af723a6cc1c90fd8446db60588ff22a2ea1b443993fec7200a5f46474a9===
    id: 091bfe21-8e0e-4f2d-b1ef-5a9429d6b5b5
    name: test
    selector:
    - conversation
    - test
    value: '1234'
    value_type: string
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions:
      - .JPG
      - .JPEG
      - .PNG
      - .GIF
      - .WEBP
      - .SVG
      allowed_file_types:
      - image
      allowed_file_upload_methods:
      - local_file
      - remote_url
      enabled: false
      fileUploadConfig:
        audio_file_size_limit: 50
        batch_count_limit: 5
        file_size_limit: 15
        image_file_size_limit: 10
        video_file_size_limit: 100
        workflow_file_upload_limit: 10
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 3
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        sourceType: start
        targetType: llm
      id: 1757581230524-llm
      source: '1757581230524'
      sourceHandle: source
      target: llm
      targetHandle: target
      type: custom
    - data:
        sourceType: llm
        targetType: answer
      id: llm-answer
      source: llm
      sourceHandle: source
      target: answer
      targetHandle: target
      type: custom
    nodes:
    - data:
        desc: ''
        selected: false
        title: Start
        type: start
        variables: []
      height: 54
      id: '1757581230524'
      position:
        x: 80
        y: 282
      positionAbsolute:
        x: 80
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        context:
          enabled: false
          variable_selector: []
        desc: ''
        memory:
          query_prompt_template: '{{#sys.query#}}


            {{#sys.files#}}'
          role_prefix:
            assistant: ''
            user: ''
          window:
            enabled: false
            size: 10
        model:
          completion_params:
            temperature: 0.7
          mode: chat
          name: gpt-3.5-turbo-0125
          provider: langgenius/openai/openai
        prompt_template:
        - role: system
          text: ''
        selected: false
        title: LLM
        type: llm
        variables: []
        vision:
          enabled: false
      height: 90
      id: llm
      position:
        x: 380
        y: 282
      positionAbsolute:
        x: 380
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: '{{#llm.text#}}'
        desc: ''
        selected: false
        title: Answer
        type: answer
        variables: []
      height: 105
      id: answer
      position:
        x: 680
        y: 282
      positionAbsolute:
        x: 680
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: 0
      y: 0
      zoom: 1

Problem Description

Currently, conversation variables in Dify have no length limit for their descriptions, allowing users to create variables with arbitrarily long descriptions. However, draft variables are restricted to a maximum of 256 characters.

When draft variables are fetched for conversation variables during the drafting process, the system copies the conversation variables into draft variables, descriptions included. Because of this length mismatch, saving the draft variables can fail with an "Internal Server Error".

Additionally, excessively long descriptions may cause performance issues since conversation variable descriptions are stored inline as part of the workflow graph.
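The failing copy step described above can be sketched as follows. This is a minimal illustration, not Dify's actual code: the names `DRAFT_DESCRIPTION_MAX_LEN` and `to_draft_description` are hypothetical, and the 256-character cap is the limit reported in this issue.

```python
# Hypothetical sketch of the conversation-variable -> draft-variable copy.
# Names are illustrative, not Dify's actual API.

DRAFT_DESCRIPTION_MAX_LEN = 256  # assumed cap on draft variable descriptions


def to_draft_description(description: str) -> str:
    """Truncate a conversation variable description so it fits within the
    draft variable limit instead of failing with a server error."""
    if len(description) <= DRAFT_DESCRIPTION_MAX_LEN:
        return description
    return description[:DRAFT_DESCRIPTION_MAX_LEN]


# The 300+ character description from the DSL above would exceed the cap:
long_desc = "=" * 300
print(len(to_draft_description(long_desc)))  # 256
```

Copying the description as-is, without a check like this (or validation when the conversation variable is created), is what lets the oversized value reach the draft variable save and trigger the error.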

✔️ Expected Behavior

No error occurs while editing the chatflow.

❌ Actual Behavior

An "Internal Server Error" popup appears while editing the chatflow.

Metadata

Labels

  • cloud: When the version is cloud and it is a bug report
  • 🐞 bug: Something isn't working
