
Using expo-file-system to download and cache the model: passing the file URI crashes the app #13

@nadeem-portico

Description


The app crashes when initializing with modelPath after downloading the model file from the network. Both of the following fail:

const llmInference = useLlmInference({
  storageType: 'file',
  modelPath: '/data/user/0/com.offlinellmpoc/files/gemma-2b-it-cpu-int4.bin',
});

or

const llmInference = useLlmInference({
  storageType: 'file',
  modelPath: 'file:///data/user/0/com.offlinellmpoc/files/gemma-2b-it-cpu-int4.bin',
});
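For context, this is roughly the expo-file-system download-and-cache step that produces the path used above. It is a minimal sketch, assuming a hypothetical MODEL_URL; the helper name ensureModelDownloaded is illustrative and not part of the library. Note that FileSystem.documentDirectory already resolves to a file:// URI under the app's files directory, which is why both path variants above were tried.

import * as FileSystem from 'expo-file-system';

// Hypothetical model URL; substitute the real Gemma download location.
const MODEL_URL = 'https://example.com/gemma-2b-it-cpu-int4.bin';

// documentDirectory is a file:// URI ending in a slash, e.g.
// file:///data/user/0/com.offlinellmpoc/files/
const MODEL_FILE = `${FileSystem.documentDirectory}gemma-2b-it-cpu-int4.bin`;

// Download the model once and reuse the cached copy on later launches.
async function ensureModelDownloaded(): Promise<string> {
  const info = await FileSystem.getInfoAsync(MODEL_FILE);
  if (!info.exists) {
    await FileSystem.downloadAsync(MODEL_URL, MODEL_FILE);
  }
  return MODEL_FILE; // file:// URI; strip the scheme if a plain path is needed
}

The download itself completes; the crash happens once useLlmInference is given the resulting path, with or without the file:// scheme.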
