# Docs: Installing validators concepts page with in-code examples #1010

Merged 5 commits on Aug 15, 2024
86 additions & 0 deletions in `docs/concepts/validators.md`:

```bash
git clone git@github.com:guardrails-ai/validator-template.git
```
Once the repository is cloned and the validator is created, you can register the validator via this [Google Form](https://forms.gle/N6UaE9611niuMxZj7).


## Installing Validators

### Guardrails Hub

Validators can be combined into Input and Output Guards that intercept the inputs and outputs of LLMs. A large collection of validators is available on the [Guardrails Hub](https://hub.guardrailsai.com/).

<div align="center">
<img src="https://raw.githubusercontent.com/guardrails-ai/guardrails/main/docs/img/guardrails_hub.gif" alt="Guardrails Hub gif" width="600px" />
</div>
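
Once installed (installation is covered below), multiple validators can be combined into a single guard. Here is a minimal sketch, assuming the `ToxicLanguage` and `DetectPII` validators have already been installed from the hub:

```python
from guardrails import Guard
from guardrails.hub import DetectPII, ToxicLanguage  # assumes both validators are installed

# Chain two validators into a single output guard;
# each validator runs against the text in turn.
guard = Guard().use_many(
    ToxicLanguage(threshold=0.5, validation_method="sentence", on_fail="exception"),
    DetectPII(pii_entities=["EMAIL_ADDRESS", "PHONE_NUMBER"], on_fail="exception"),
)

guard.validate("Please reach out during business hours.")  # passes both checks
```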

Once you have found a validator on the hub, you can open its `README` to find the install command.

### Using the CLI

You can install a validator using the Guardrails CLI. For example, the [Toxic Language](https://hub.guardrailsai.com/validator/guardrails/toxic_language) validator can be installed with:

```bash
guardrails hub install hub://guardrails/toxic_language
```

> This will not download local models if you opted into remote inferencing during `guardrails configure`.

> If you want to control whether associated models are downloaded, pass the `--install-local-models` or `--no-install-local-models` flag to `guardrails hub install`.
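
For example, to install the validator without downloading its local models:

```bash
guardrails hub install hub://guardrails/toxic_language --no-install-local-models
```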

After installing the validator with the CLI, you can use it in your guards:

```python
from guardrails.hub import ToxicLanguage
from guardrails import Guard

guard = Guard().use(
    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
)

guard.validate("My landlord is an asshole!")
```
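
Because `on_fail="exception"` is set, a failing validation raises an error that you can catch. A minimal sketch, assuming `ValidationError` is exported from `guardrails.errors`:

```python
from guardrails.errors import ValidationError  # assumed import path

try:
    guard.validate("My landlord is an asshole!")  # fails the toxic language check
except ValidationError as e:
    print(f"Validation failed: {e}")
```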

### In Code Installs

You can also install validators using the Guardrails SDK, which simplifies development, particularly when working in Jupyter notebooks.

```python
from guardrails import install

install(
    "hub://guardrails/toxic_language",
    install_local_models=True,  # defaults to `None`, which skips downloading local models if you opted into remote inferencing
    quiet=False,  # defaults to `True`
)
```

### In Code Installs - Pattern A

After an `install` invocation, you can import the validator as you typically would:

```python
from guardrails import Guard, install

install("hub://guardrails/toxic_language")

from guardrails.hub import ToxicLanguage

guard = Guard().use(
    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
)

guard.validate("My landlord is an asshole!")
```

### In Code Installs - Pattern B

You can also extract the validator directly from the module returned by `install`:

```python
from guardrails import Guard, install

ToxicLanguage = install("hub://guardrails/toxic_language").ToxicLanguage

guard = Guard().use(
    ToxicLanguage, threshold=0.5, validation_method="sentence", on_fail="exception"
)

guard.validate("My landlord is an asshole!")
```


> Note: Invoking the `install` function always installs the validator module, even if it is already present, so it's recommended to put the install in a separate code block when working in notebooks.