
fix(robot): optimize the example, add sessions, and adjust the UI. #3484


Merged — 1 commit, Jun 5, 2025
2 changes: 1 addition & 1 deletion examples/sites/demos/pc/app/grid/ai-agent/basic-usage.vue
@@ -87,7 +87,7 @@ export default {
await client.connect(clientTransport)
const { sessionId } = await createSseProxy({
client,
url: 'http://39.108.160.245/sse'
url: 'https://39.108.160.245/sse'

💡 Verification agent

❓ Verification inconclusive

Good security improvement by switching to HTTPS.

The change from HTTP to HTTPS enhances security by encrypting the SSE connection. However, please verify that the endpoint is accessible via HTTPS and consider externalizing this URL to a configuration file for better maintainability.


🏁 Script executed:

#!/bin/bash
# Verify the HTTPS endpoint is accessible
curl -I https://39.108.160.245/sse --connect-timeout 10

Length of output: 698


HTTPS endpoint fails certificate verification

The switch to HTTPS is a great security improvement, but curl -I https://39.108.160.245/sse is failing with an SSL error indicating a self-signed certificate. Please address this before merging:

  • Replace the self-signed certificate with a valid, CA-signed certificate so clients can verify it out of the box.
  • If a self-signed cert is required (e.g., for internal testing), ensure the client trusts it (e.g., import the cert or configure your HTTP client accordingly).
  • Externalize the SSE URL (and any related cert-trust settings) into a configuration file or environment variable instead of hard-coding the IP.
🤖 Prompt for AI Agents
In examples/sites/demos/pc/app/grid/ai-agent/basic-usage.vue at line 90, the SSE
URL is hardcoded with an HTTPS endpoint using a self-signed certificate that
fails SSL verification. To fix this, replace the self-signed certificate on the
server with a valid CA-signed certificate to ensure proper SSL validation.
Additionally, externalize the SSE URL into a configuration file or environment
variable to improve maintainability and allow easier updates without code
changes. If a self-signed certificate must be used, configure the client to
trust it explicitly by importing the certificate or adjusting the HTTP client
settings accordingly.
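One way to act on the externalization advice is a small resolver that prefers an environment-provided URL and falls back to the current default. This is a minimal sketch with illustrative names (`VITE_SSE_URL` is an assumption, not a variable from the PR):

```javascript
// Hypothetical config helper: resolve the SSE endpoint from an
// environment object (e.g. import.meta.env in a Vite app) and fall
// back to the hard-coded default only when no override is set.
function resolveSseUrl(env, fallback = 'https://39.108.160.245/sse') {
  const candidate = env && env.VITE_SSE_URL
  // Accept only non-empty string overrides.
  return typeof candidate === 'string' && candidate.length > 0 ? candidate : fallback
}

// With an override configured, the hard-coded IP is never used:
resolveSseUrl({ VITE_SSE_URL: 'https://sse.example.com/sse' })
// With no override, the default applies:
resolveSseUrl({})
```

This keeps the demo working out of the box while letting deployments swap the endpoint (and its certificate trust story) without a code change.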

})

this.sessionID = sessionId
6 changes: 3 additions & 3 deletions examples/sites/package.json
@@ -29,9 +29,9 @@
"@docsearch/react": "npm:@docsearch/css",
"@modelcontextprotocol/sdk": "1.12.1",
"@opentiny/next": "0.1.2",
"@opentiny/tiny-robot": "0.2.0-alpha.8",
"@opentiny/tiny-robot-kit": "0.2.0-alpha.8",
"@opentiny/tiny-robot-svgs": "0.2.0-alpha.8",
"@opentiny/tiny-robot": "0.2.1",
"@opentiny/tiny-robot-kit": "0.2.1",
"@opentiny/tiny-robot-svgs": "0.2.1",
Comment on lines +32 to +34

💡 Verification agent

🧩 Analysis chain

Good dependency updates from alpha to stable versions.

The consistent update of all tiny-robot packages from alpha versions to stable 0.2.1 indicates improved stability and bug fixes.


🌐 Web query:

What is the latest version of @opentiny/tiny-robot package on npm?

💡 Result:

I couldn't find a package named @opentiny/tiny-robot on npm. There may be a typo, or the package may be listed under a different name. However, there is a related package called @opentiny/tiny-engine-plugin-robot, which is currently at version 2.0.0-rc.2. (libraries.io)

If this isn't the package you're referring to, could you please provide more details or verify the package name? That would help me assist you more effectively.

Verify package names and published versions

The packages @opentiny/tiny-robot, @opentiny/tiny-robot-kit, and @opentiny/tiny-robot-svgs at version 0.2.1 could not be found on the public npm registry. Please confirm that:

  • The package names are correct and match your published artifacts.
  • Version 0.2.1 has been published (either publicly or to your private registry).
  • If these are private packages, ensure your CI/npm config points to the correct registry.

Once confirmed or corrected, the dependency bump from alpha to stable can be finalized.

🤖 Prompt for AI Agents
In examples/sites/package.json around lines 32 to 34, the dependencies
"@opentiny/tiny-robot", "@opentiny/tiny-robot-kit", and
"@opentiny/tiny-robot-svgs" at version "0.2.1" do not appear to exist on the
public npm registry. Verify that these package names are correct and correspond
to your published packages. Also confirm that version "0.2.1" is published
either publicly or in your private registry. If these are private packages,
ensure your npm configuration points to the correct registry. Update the package
names or versions accordingly to reflect valid and accessible dependencies.

"@opentiny/tiny-schema-renderer": "1.0.0-beta.5",
"@opentiny/tiny-vue-mcp": "0.0.1-beta.1",
"@opentiny/utils": "workspace:~",
13 changes: 11 additions & 2 deletions examples/sites/src/tools/appData.js
@@ -1,4 +1,4 @@
import { reactive, computed } from 'vue'
import { reactive, computed, watch } from 'vue'
import { useAutoStore } from './storage'
import { useMediaQuery } from './useMediaQuery'
import { ZH_CN_LANG, EN_US_LANG, LANG_KEY, LANG_PATH_MAP } from '../const'
@@ -8,7 +8,8 @@ const enPath = LANG_PATH_MAP[EN_US_LANG]
const appData = reactive({
lang: useAutoStore('local', LANG_KEY, ZH_CN_LANG),
theme: useAutoStore('local', '_theme', 'light'),
bpState: useMediaQuery([640, 1024, 1280]).matches // 3点4区间, bp0,bp1,bp2,bp3
bpState: useMediaQuery([640, 1024, 1280]).matches, // 3点4区间, bp0,bp1,bp2,bp3
showTinyRobot: false
})
const isZhCn = computed(() => appData.lang === ZH_CN_LANG)
const appFn = {
@@ -27,4 +28,12 @@ const appFn = {
}
// 为了和tiny-vue共享同一个响应变量
window.appData = appData

watch(
() => appData.showTinyRobot,
(value) => {
document.body.classList.toggle('docs-on-robot-show', value)
}
)

Comment on lines +32 to +37

🛠️ Refactor suggestion

Add safety checks for DOM manipulation.

The watcher implementation is good for centralizing the robot visibility side effects, but consider adding safety checks for SSR compatibility and document availability.

 watch(
   () => appData.showTinyRobot,
   (value) => {
-    document.body.classList.toggle('docs-on-robot-show', value)
+    if (typeof document !== 'undefined' && document.body) {
+      document.body.classList.toggle('docs-on-robot-show', value)
+    }
   }
 )
🤖 Prompt for AI Agents
In examples/sites/src/tools/appData.js around lines 32 to 37, the watcher
toggles a class on document.body without checking if the document object is
available, which can cause issues during server-side rendering (SSR). Add a
safety check to ensure that the code only runs if the document and document.body
exist before attempting to toggle the class. This will prevent errors in
environments where the DOM is not present.
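The suggested guard can be factored into a small helper so the watcher body stays one line. A sketch, assuming the helper name is new (not from the PR); it is safe to call in Node, where `document` is undefined:

```javascript
// SSR-safe class toggle: no-op when there is no DOM.
function toggleBodyClass(className, on) {
  if (typeof document === 'undefined' || !document.body) {
    return false // no DOM available (e.g. server-side render)
  }
  document.body.classList.toggle(className, on)
  return true
}

// In a plain Node process this is a harmless no-op:
toggleBodyClass('docs-on-robot-show', true)
```

The watcher would then call `toggleBodyClass('docs-on-robot-show', value)` and behave identically in the browser.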

export { appData, appFn, isZhCn }
8 changes: 3 additions & 5 deletions examples/sites/src/views/components-doc/common.vue
@@ -90,7 +90,7 @@
</div>
<div id="footer"></div>
</div>
<robotChat v-if="show"></robotChat>
<robotChat v-show="appData.showTinyRobot"></robotChat>
</template>

<script setup lang="ts">
@@ -107,6 +107,7 @@ import ComponentHeader from './components/header.vue'
import ComponentContributor from './components/contributor.vue'
import ApiDocs from './components/api-docs.vue'
import useTasksFinish from './composition/useTasksFinish'
import { appData } from '../../tools/appData'

import robotChat from './tiny-robot-chat.vue'

@@ -445,13 +446,10 @@ const handleAnchorClick = (e, data) => {

defineExpose({ loadPage })

const show = ref(false)
onMounted(() => {
// tiny-robot 通过路由参数存在 mcp-robot, 则弹出对话容器
const hasRobot = router.currentRoute.value.fullPath.includes('grid-ai-agent')
show.value = !!hasRobot

document.body.classList.toggle('docs-on-robot-show', show.value)
appData.showTinyRobot = !!hasRobot
})
</script>

@@ -3,7 +3,8 @@ import type { AIModelConfig } from '@opentiny/tiny-robot-kit'
import type { ChatCompletionResponse } from '@opentiny/tiny-robot-kit'
import type { StreamHandler } from '@opentiny/tiny-robot-kit'
import { BaseModelProvider } from '@opentiny/tiny-robot-kit'
import { handleSSEStream } from './utils.js'
import { globalConversation, handleSSEStream } from './utils.js'
import type { Ref } from 'vue'

/**
* 对接AIClient的自定义 Dify 大模型服务
@@ -14,6 +15,8 @@ import { handleSSEStream } from './utils.js'
* });
*/
export class DifyModelProvider extends BaseModelProvider {
_messages: Ref<ChatCompletionRequest['messages']> = []

⚠️ Potential issue

Fix the _messages property initialization.

The _messages property is incorrectly initialized as an empty array instead of a proper Vue Ref. This will cause runtime errors when the SSE handler tries to access .value.

Apply this diff to fix the initialization:

-  _messages: Ref<ChatCompletionRequest['messages']> = []
+  _messages: Ref<ChatCompletionRequest['messages']> = ref([])

Note: You'll also need to import ref from Vue.

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In examples/sites/src/views/components-doc/composition/DifyModelProvider.ts at
line 18, the _messages property is initialized as a plain empty array instead of
a Vue Ref, which will cause runtime errors when accessing .value. Fix this by
importing ref from Vue and initializing _messages using ref([]) to create a
proper reactive reference.
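The failure mode is easy to demonstrate without Vue. Below, `ref` is a tiny stand-in that only mimics the `{ value }` shape of a Vue Ref (the real code would import `ref` from `vue`):

```javascript
// Minimal shim mimicking the shape of Vue's ref(); illustration only.
function ref(initial) {
  return { value: initial }
}

const wrong = []       // plain array, as initialized in the PR
const right = ref([])  // proper Ref-like wrapper

// Reading `.value` off the plain array yields undefined, so a later
// `messages.value.slice(...)` in the SSE handler throws at runtime.
wrong.value                  // undefined
Array.isArray(right.value)   // true: the wrapped array is reachable
```

With `_messages: Ref<...> = ref([])` the SSE handler's `.value` accesses resolve to the underlying array as intended.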


constructor(config: AIModelConfig) {
super(config)
}
@@ -45,11 +48,12 @@ export class DifyModelProvider extends BaseModelProvider {
response_mode: 'streaming',
inputs: {
sessionId: window.$sessionId
}
},
conversation_id: globalConversation.id
})
})

await handleSSEStream(response, handler, signal)
await handleSSEStream(response, handler, this._messages, signal)
} catch (error) {
if (signal && signal.aborted) {
console.warn('Request was aborted:', error)
81 changes: 10 additions & 71 deletions examples/sites/src/views/components-doc/composition/useTinyRobot.ts
@@ -1,7 +1,7 @@
import type { AIModelConfig } from '@opentiny/tiny-robot-kit'
import { AIClient, useConversation } from '@opentiny/tiny-robot-kit'
import { IconAi, IconUser } from '@opentiny/tiny-robot-svgs'
import { h, nextTick, onMounted, reactive, ref, toRaw, watch } from 'vue'
import { h, nextTick, onMounted, ref, watch } from 'vue'
import { DifyModelProvider } from './DifyModelProvider.js'
import type { SuggestionItem } from '@opentiny/tiny-robot'

@@ -11,8 +11,9 @@ const difyConfig: AIModelConfig = {
apiKey: 'app-H0VJI4LqZ4KskdcA5a07pjXf'
}
export function useTinyRobot() {
const difyModelProvider = new DifyModelProvider(difyConfig)
const client = new AIClient({
providerImplementation: new DifyModelProvider(difyConfig),
providerImplementation: difyModelProvider,
...difyConfig
})

@@ -27,25 +28,21 @@
{
label: '识别网页的内容',
description: '公司人员表有多少列,有多少行数据,帮我统计一下!',
icon: h('span', { style: { fontSize: '18px' } }, '🧠')
icon: h('span', { style: { fontSize: '18px' } }, '💡')
},
{
label: '智能操作网页',
description: '请勾选公司人员表最后一行”!',
icon: h('span', { style: { fontSize: '18px' } }, '🤔')
description: '请勾选公司人员表最后一行',
icon: h('span', { style: { fontSize: '18px' } }, '🕹')
}
]
const handlePromptItemClick = (ev, item) => {
sendMessage(item.description)
}

const { messageManager, createConversation } = useConversation({ client })

const randomId = () => Math.random().toString(36).substring(2, 15)

const currentMessageId = ref('')

const { messageManager } = useConversation({ client })
const { messages, messageState, inputMessage, sendMessage, abortRequest } = messageManager
difyModelProvider._messages = messages

const roles = {
assistant: {
@@ -61,20 +58,17 @@
}
}

const showHistory = ref(false)
const historyData = reactive([])

// 建议按钮组,设置对话的模板
const suggestionPillItems = [
{
id: '1',
text: '公司人员表',
icon: h('span', { style: { fontSize: '18px' } }, '🔍')
icon: h('span', { style: { fontSize: '18px' } }, '🏢')
},
{
id: '2',
text: '城市下拉框',
icon: h('span', { style: { fontSize: '18px' } }, '🛠️')
icon: h('span', { style: { fontSize: '18px' } }, '🌐')
}
]

@@ -86,60 +80,11 @@
senderRef.value.setTemplate(templateText, currentInitialValue)
}
}
watch(
() => messages.value[messages.value.length - 1]?.content,
() => {
if (!messages.value.length) {
return
}

if (messages.value.length === 1) {
currentMessageId.value = randomId()
}

const allSessions = historyData.flatMap((item) => item.items)
const currentSession = allSessions.find((item) => item.id === currentMessageId.value)

const data = toRaw(messages.value)
if (!currentSession) {
const today = historyData.find((item) => item.group === '今天')
if (today) {
today.items.unshift({ title: messages.value[0].content, id: currentMessageId.value, data })
} else {
historyData.unshift({
group: '今天',
items: [{ title: messages.value[0].content, id: currentMessageId.value, data }]
})
}
} else {
currentSession.data = data
}
}
)

const handleHistorySelect = (item) => {
currentMessageId.value = item.id
messages.value = item.data
showHistory.value = false
}

const senderRef = ref(null)
const currentTemplate = ref('')
const suggestionOpen = ref(false)

// 设置指令
const handleFillTemplate = (templateText) => {
setTimeout(() => {
currentTemplate.value = templateText
inputMessage.value = ''

// 等待DOM更新后激活第一个字段
setTimeout(() => {
senderRef.value?.activateTemplateFirstField()
}, 100)
}, 300)
}

// 清除当前指令
const clearTemplate = () => {
// 清空指令相关状态
@@ -207,23 +152,17 @@
promptItems,

messageManager,
createConversation,
messages,
messageState,
inputMessage,
sendMessage,
abortRequest,
roles,
showHistory,
historyData,
currentMessageId,
handlePromptItemClick,
handleHistorySelect,

senderRef,
currentTemplate,
suggestionOpen,
handleFillTemplate,
clearTemplate,
handleSendMessage,
handleMessageKeydown,
21 changes: 19 additions & 2 deletions examples/sites/src/views/components-doc/composition/utils.ts
@@ -4,13 +4,23 @@
*/

import type { ChatMessage, ChatCompletionResponse, StreamHandler } from '@opentiny/tiny-robot-kit'
import type { ChatCompletionRequest } from '@opentiny/tiny-robot-kit'
import type { Ref } from 'vue'

export const globalConversation = {
id: ''
}
Comment on lines +10 to +12

🛠️ Refactor suggestion

Consider thread safety for global conversation state.

The globalConversation object is a shared global state that could lead to race conditions if multiple chat sessions are active simultaneously.

Consider one of these approaches:

  1. Pass conversation context through function parameters instead of global state
  2. Use a Map to track multiple conversations by session ID
  3. Add proper synchronization mechanisms

Example for approach 1:

export async function handleSSEStream(
  response: Response,
  handler: StreamHandler,
  message: Ref<ChatCompletionRequest['messages']>,
  conversationContext: { id: string },
  signal?: AbortSignal
): Promise<void>
🤖 Prompt for AI Agents
In examples/sites/src/views/components-doc/composition/utils.ts around lines 10
to 12, the globalConversation object is a shared global state that risks race
conditions with concurrent chat sessions. To fix this, avoid using this global
object by passing the conversation context explicitly through function
parameters, or alternatively use a Map keyed by session IDs to manage multiple
conversations separately. Implement one of these approaches to ensure thread
safety and prevent state conflicts.
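Approach 2 from the comment can be sketched as a Map keyed by session id. Names here are illustrative, not from the PR; the first-write-wins guard mirrors the `if (!globalConversation.id)` check in `handleSSEStream`:

```javascript
// Per-session conversation registry instead of one mutable global.
const conversations = new Map()

function getConversationId(sessionId) {
  return conversations.get(sessionId) ?? ''
}

function setConversationId(sessionId, conversationId) {
  // Record only the first conversation id seen for a session.
  if (!conversations.has(sessionId)) {
    conversations.set(sessionId, conversationId)
  }
}

setConversationId('session-1', 'conv-a')
setConversationId('session-1', 'conv-b') // ignored: session-1 already bound
setConversationId('session-2', 'conv-c') // independent of session-1
```

Two concurrent chat sessions then track their own conversation ids without clobbering each other.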

/**
* 处理SSE流式响应
* @param response fetch响应对象
* @param handler 流处理器
*/
export async function handleSSEStream(response: Response, handler: StreamHandler, signal?: AbortSignal): Promise<void> {
export async function handleSSEStream(
response: Response,
handler: StreamHandler,
message: Ref<ChatCompletionRequest['messages']>,
signal?: AbortSignal
): Promise<void> {
// 获取ReadableStream
const reader = response.body?.getReader()
if (!reader) {
@@ -64,6 +74,7 @@ export async function handleSSEStream
if (!dataMatch) continue

const data = JSON.parse(dataMatch[1])
// console.log('SSE data:', data)
if (data?.event === 'workflow_started') {
handler.onData({
id: data.workflow_run_id,
@@ -73,14 +84,18 @@ export async function handleSSEStream(response: Response, handler: StreamHandler
index: messageIndex++,
delta: {
role: 'assistant',
content: '**Workflow started** \r\n'
content: ' '
},
finish_reason: null
}
],
object: '',
model: ''
})
message.value.slice(-1)[0].loading = true

⚠️ Potential issue

Add safety checks for message array access.

The code directly accesses the last message in the array without checking if the array is empty, which could cause runtime errors.

Apply this diff to add safety checks:

-            message.value.slice(-1)[0].loading = true
+            const lastMessage = message.value.slice(-1)[0]
+            if (lastMessage) {
+              lastMessage.loading = true
+            }

Apply similar fixes to lines 118 and 137 where loading = false is set.

Also applies to: 118-118, 137-137

🤖 Prompt for AI Agents
In examples/sites/src/views/components-doc/composition/utils.ts at lines 95,
118, and 137, the code accesses the last element of the message array without
checking if the array is empty, which can cause runtime errors. Add safety
checks to ensure the message array is not empty before accessing its last
element and setting the loading property. Use conditional checks to verify the
array length is greater than zero before performing these operations.
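The three call sites can share one guarded helper; a sketch with an illustrative name (not from the PR):

```javascript
// Set `loading` on the last message only if one exists.
function setLastMessageLoading(messages, loading) {
  const last = messages[messages.length - 1]
  if (!last) return false // empty array: nothing to update
  last.loading = loading
  return true
}

const msgs = [{ role: 'assistant', content: '', loading: false }]
setLastMessageLoading(msgs, true)  // updates the single message
setLastMessageLoading([], true)    // safe no-op on an empty array
```

`handleSSEStream` would then call `setLastMessageLoading(message.value, true)` at `workflow_started` and `setLastMessageLoading(message.value, false)` at `message` and `message_end`, with no risk of reading a property off `undefined`.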

if (!globalConversation.id) {
globalConversation.id = data.conversation_id
}
}

if (data?.event === 'message' && data?.answer) {
@@ -100,6 +115,7 @@ export async function handleSSEStream(response: Response, handler: StreamHandler
object: '',
model: ''
})
message.value.slice(-1)[0].loading = false
}
if (data?.event === 'message_end') {
handler.onData({
@@ -118,6 +134,7 @@ export async function handleSSEStream(response: Response, handler: StreamHandler
object: '',
model: ''
})
message.value.slice(-1)[0].loading = false
}
// handler.onData(data)
} catch (error) {