You don't usually need to tokenize the input text yourself - the [`Generation` code](https://github.com/huggingface/swift-transformers/blob/17d4bfae3598482fc7ecf1a621aa77ab586d379a/Sources/Generation/Generation.swift#L82) will take care of it.
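If you do need tokens directly, the `Tokenizers` module can be used on its own. A minimal sketch, assuming network access to the Hub (the checkpoint id is just an example):

```swift
import Tokenizers

func testTokenizer() async throws {
    // Downloads the tokenizer configuration from the Hub and instantiates it.
    let tokenizer = try await AutoTokenizer.from(pretrained: "pcuenq/Llama-2-7b-chat-coreml")

    // Encode text into token ids, then decode back to a string.
    let inputIds = tokenizer("Today she took a train to the West")
    print(tokenizer.decode(tokens: inputIds))
}
```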
- `Hub`: Utilities for interacting with the Hugging Face Hub! Download models, tokenizers, and other config files. Usage example:
```swift
import Hub

func testHub() async throws {
    let repo = Hub.Repo(id: "mlx-community/Qwen2.5-0.5B-Instruct-2bit-mlx")
    let filesToDownload = ["config.json", "*.safetensors"]
    // Download the matching files to a local snapshot directory.
    let modelDirectory: URL = try await Hub.snapshot(from: repo, matching: filesToDownload)
    print("Files downloaded to: \(modelDirectory.path)")
}
```
- `Generation`: Algorithms for text generation. Handles tokenization internally. Currently supported: greedy search, top-k sampling, and top-p sampling.
- `Models`: Language model abstraction over a Core ML package.
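The `Models` and `Generation` modules are typically used together: `Models` wraps a Core ML package, and `Generation` drives decoding. A hypothetical sketch, assuming a compiled model on disk — the file path is illustrative, and API names such as `loadCompiled`, `GenerationConfig`, and `generate` are assumptions that may differ between releases:

```swift
import Foundation
import Models
import Generation

func testGeneration() async throws {
    // Load a compiled Core ML model package (.mlmodelc) from disk.
    let modelURL = URL(fileURLWithPath: "Llama-2-7b-chat.mlmodelc")
    let model = try LanguageModel.loadCompiled(url: modelURL)

    // Tokenization is handled internally by the generation code.
    var config = GenerationConfig(maxNewTokens: 20)
    config.doSample = false  // greedy search
    let output = try await model.generate(config: config, prompt: "Hello, I'm a language model,")
    print(output)
}
```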
## Usage via SwiftPM
And then, add the Transformers library as a dependency to your target:
```swift
targets: [
    .target(
        name: "YourTargetName",
        dependencies: [
            .product(name: "Transformers", package: "swift-transformers")
        ]
    )
]
```
## Projects that use swift-transformers ❤️
- [WhisperKit](https://github.com/argmaxinc/WhisperKit): A Swift Package for state-of-the-art speech-to-text systems from [Argmax](https://github.com/argmaxinc).
- [MLX Swift Examples](https://github.com/ml-explore/mlx-swift-examples): A Swift Package for integrating MLX models in Swift apps.
Using `swift-transformers` in your project? Let us know and we'll add you to the list!
## Supported Models
You can run inference on Core ML models with `swift-transformers`. Note that Core ML is not required to use the `Tokenizers` or `Hub` modules.
This package has been tested with autoregressive language models such as:
- GPT, GPT-NeoX, GPT-J.
- SantaCoder.
- StarCoder.
- Falcon.
- Llama 2.
Encoder-decoder models such as T5 and Flan are currently _not supported_.
## Other Tools
- [`swift-chat`](https://github.com/huggingface/swift-chat), a simple app demonstrating how to use this package.
- [`exporters`](https://github.com/huggingface/exporters), a Core ML conversion package for transformers models, based on Apple's [`coremltools`](https://github.com/apple/coremltools).
- [`transformers-to-coreml`](https://huggingface.co/spaces/coreml-projects/transformers-to-coreml), a no-code Core ML conversion tool built on `exporters`.
## Contributing
Swift Transformers is a community project and we welcome contributions. Please
check out [Issues](https://github.com/huggingface/swift-transformers/issues)
tagged with `good first issue` if you are looking for a place to start!

Please ensure your code passes the build and test suite (`swift build` and
`swift test`) before submitting a pull request.