This repository was archived by the owner on Dec 18, 2024. It is now read-only.

fixing the logging sample #83

@csells

Description


Not only does this one print the actual LLM response instead of "instance of Future", but it also prints the response in pieces so you can see the streaming in action:

  Stream<String> _logMessage(
    String prompt, {
    required Iterable<Attachment> attachments,
  }) async* {
    // log the message and attachments
    debugPrint('# Sending Message');
    debugPrint('## Prompt\n$prompt');
    debugPrint('## Attachments\n${attachments.map((a) => a.toString())}');

    // forward the message on to the provider
    final response = _provider.sendMessageStream(
      prompt,
      attachments: attachments,
    );

    // log the response
    debugPrint('## Response');
    var i = 1;
    yield* response.map((text) {
      debugPrint('$i. "$text"');
      ++i;
      return text;
    });
  }
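For anyone who wants to try the pattern in isolation, here's a minimal, self-contained sketch in plain Dart (no Flutter, so print stands in for debugPrint; fakeProviderStream and its canned chunks are made up for illustration and stand in for _provider.sendMessageStream):

// Hypothetical stand-in for a provider's streaming call: it yields a canned
// response in small pieces, the way a streaming LLM provider would.
Stream<String> fakeProviderStream(String prompt) async* {
  for (final piece in ['Hello', ', ', 'world', '!']) {
    await Future<void>.delayed(const Duration(milliseconds: 100));
    yield piece;
  }
}

// Same pattern as the sample above: log each chunk as it passes through and
// forward it unchanged, so the caller still sees the full streamed response.
Stream<String> logMessage(String prompt) async* {
  print('# Sending Message');
  print('## Prompt\n$prompt');

  final response = fakeProviderStream(prompt);

  print('## Response');
  var i = 1;
  yield* response.map((text) {
    print('$i. "$text"');
    ++i;
    return text;
  });
}

Future<void> main() async {
  // Collect the forwarded chunks to show the stream is passed through intact.
  final full = StringBuffer();
  await for (final chunk in logMessage('greet me')) {
    full.write(chunk);
  }
  print('Full response: $full');
}

Because the logging happens inside map, each numbered line is printed as the corresponding chunk arrives, before the full response has finished.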

Metadata

    Labels

    bug (Something isn't working)
