Add `remote_decode` to `remote_utils` (#10898)
@@ -0,0 +1,5 @@

# Hybrid Inference API Reference

## Remote Decode

[[autodoc]] utils.remote_utils.remote_decode
@@ -0,0 +1,54 @@
<!--Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
-->
# Hybrid Inference

**Empowering local AI builders with Hybrid Inference**

> [!TIP]
> Hybrid Inference is an [experimental feature](https://huggingface.co/blog/remote_vae).
> Feedback can be provided [here](https://github.com/huggingface/diffusers/issues/new?template=remote-vae-pilot-feedback.yml).
## Why use Hybrid Inference?

> **Review comment:** So here we basically listed the types of endpoints we offer; I think we can create a section for that later.
Hybrid Inference offers a fast and simple way to offload local generation requirements.

- 🚀 **Reduced Requirements:** Access powerful models without expensive hardware.
- 💎 **Without Compromise:** Achieve the highest quality without sacrificing performance.
- 💰 **Cost Effective:** It's free! 🤑
- 🎯 **Diverse Use Cases:** Fully compatible with Diffusers 🧨 and the wider community.
- 🔧 **Developer-Friendly:** Simple requests, fast responses.

---
## Available Models

* **VAE Decode 🖼️:** Quickly decode latent representations into high-quality images without compromising performance or workflow speed.
* **VAE Encode 🔢 (coming soon):** Efficiently encode images into latent representations for generation and training.
* **Text Encoders 📃 (coming soon):** Compute text embeddings for your prompts quickly and accurately, ensuring a smooth and high-quality workflow.
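To make the VAE Decode offload concrete: the client holds a latent tensor, serializes it into a request body, and the remote endpoint decodes it into an image. The sketch below shows only the serialization round trip of that pattern using numpy; the function names (`pack_latent`, `unpack_latent`) are hypothetical stand-ins, and the real `remote_decode` helper added in this PR handles the serialization and HTTP round trip for you.

```python
# Illustrative sketch only: the real helper lives in
# diffusers.utils.remote_utils.remote_decode and may differ in detail.
import io

import numpy as np


def pack_latent(latent: np.ndarray) -> bytes:
    """Serialize a latent array into a request body for a remote decoder."""
    buf = io.BytesIO()
    np.save(buf, latent)
    return buf.getvalue()


def unpack_latent(data: bytes) -> np.ndarray:
    """Deserialize a latent array, as a remote decode endpoint would."""
    return np.load(io.BytesIO(data))


# A typical Stable Diffusion latent: batch x 4 channels x 64 x 64.
latent = np.random.randn(1, 4, 64, 64).astype(np.float32)
payload = pack_latent(latent)      # bytes sent to the endpoint
restored = unpack_latent(payload)  # what the server reconstructs

assert np.array_equal(latent, restored)
```

Only the small latent crosses the network; the heavy decoder weights stay on the server, which is what keeps local hardware requirements low.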
---

> **Review comment:** Let's add a section to showcase real-world use cases! We can link to Comfy nodes etc.
## Integrations

* **[SD.Next](https://github.com/vladmandic/sdnext):** All-in-one UI with built-in support for Hybrid Inference.
* **[ComfyUI-HFRemoteVae](https://github.com/kijai/ComfyUI-HFRemoteVae):** ComfyUI node for Hybrid Inference.
## Contents

The documentation is organized into two sections:

* **VAE Decode:** Learn the basics of how to use VAE Decode with Hybrid Inference.
* **API Reference:** Dive into task-specific settings and parameters.