Not only does this one print the actual LLM response instead of "Instance of Future", but it prints it in pieces so you can see the streaming in action:
Stream<String> _logMessage(
  String prompt, {
  required Iterable<Attachment> attachments,
}) async* {
  // log the message and attachments
  debugPrint('# Sending Message');
  debugPrint('## Prompt\n$prompt');
  debugPrint('## Attachments\n${attachments.map((a) => a.toString())}');

  // forward the message on to the provider
  final response = _provider.sendMessageStream(
    prompt,
    attachments: attachments,
  );

  // log the response, one chunk at a time, as it streams in
  debugPrint('## Response');
  var i = 1;
  yield* response.map((text) {
    debugPrint('$i. "$text"');
    ++i;
    return text;
  });
}
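
For completeness, here's a minimal sketch of what consuming that stream could look like on the calling side. The _sendPrompt name and the empty attachments list are just placeholders for this example; they're not part of the original snippet:

// hypothetical caller in the same class that holds _logMessage and _provider
Future<void> _sendPrompt(String prompt) async {
  final buffer = StringBuffer();
  // each chunk arrives as the LLM streams it, so the log fills in piece by piece
  await for (final chunk in _logMessage(prompt, attachments: const [])) {
    buffer.write(chunk);
  }
  debugPrint('## Full response\n$buffer');
}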