New landing page #5

Merged — 8 commits, Jan 9, 2025
1 change: 1 addition & 0 deletions .gitignore
@@ -8,3 +8,4 @@
**/.benchmarks
**/.pytest_cache
**/.ruff_cache
.DS_Store
147 changes: 139 additions & 8 deletions docs/docs-demo/readme.mdx
@@ -1,19 +1,150 @@
---
title: Demo-dc-fabric
---

import CodeBlock from '@theme/CodeBlock';
import block1 from '!!raw-loader!./code_snippets/0001_graphql_add_repo.gql';
import block2 from '!!raw-loader!./code_snippets/0002_shell_run_generator.sh';

This repository demonstrates key Infrahub features using an example data center running VXLAN/EVPN and firewalls. It shows how Infrahub can be used with Arista AVD and Containerlab: Infrahub generates configurations that AVD deploys to a Containerlab topology.

![infrahub-demo-dc-fabric drawing](./infrahub-demo-dc-fabric.excalidraw.svg)

## Running the demo

### Clone the repository

Clone the GitHub repository to the server you will run this demo on:

```shell
git clone https://github.com/opsmill/infrahub-demo-dc-fabric.git
```

### Requirements

Please ensure you have installed the [Infrahub docker-compose requirements](https://docs.infrahub.app/guides/installation#docker-compose).

### Set environment variables

```shell
export INFRAHUB_ADDRESS="http://localhost:8000"
export INFRAHUB_API_TOKEN="06438eb2-8019-4776-878c-0941b1f1d1ec"
export CEOS_DOCKER_IMAGE="registry.opsmill.io/external/ceos-image:4.29.0.2F"
export LINUX_HOST_DOCKER_IMAGE="registry.opsmill.io/external/alpine-host:v3.1.1"
```

### Install the Infrahub SDK

```shell
poetry install --no-interaction --no-ansi --no-root
```

### Start Infrahub

```shell
poetry run invoke start
```

### Load schema and data into Infrahub

The following `invoke` command will create:

- Basic data (Account, organization, ASN, Device Type, and Tags)
- Location data (Locations, VLANs, and Prefixes)
- Topology data (Topology, Topology Elements)
- Security data (Policies, rules, objects)

```shell
poetry run invoke load-schema load-data
```

## Demo flow

### 1. Add the repository into Infrahub via GraphQL

> [!NOTE]
> Reference the [Infrahub documentation](https://docs.infrahub.app/guides/repository) for the multiple ways this can be done.

<!-- markdownlint-disable -->
<CodeBlock language="graphql">{block1}</CodeBlock>
<!-- markdownlint-restore -->

### 2. Generate a topology (devices, interfaces, cabling, BGP sessions, ...)

> [!NOTE]
> The example below creates the topology fra05-pod1

<!-- markdownlint-disable -->
<CodeBlock language="bash">{block2}</CodeBlock>
<!-- markdownlint-restore -->

### 3. Generate a network service in a Topology

> [!NOTE]
> The example below creates a Layer2 network service and a Layer3 network service on topology fra05-pod1

```shell
poetry run infrahubctl run generators/generate_network-services.py topology=fra05-pod1 type=layer2
poetry run infrahubctl run generators/generate_network-services.py topology=fra05-pod1 type=layer3 vrf=production
```

### 4. Render artifacts

Artifact generation is not currently available in the UI, but you can try it out locally:

> [!NOTE]
> This command renders the artifact defined by the `device_arista` transformation for the `fra05-pod1-leaf1` device

```shell
poetry run infrahubctl render device_arista device=fra05-pod1-leaf1
```

### 5. Try out our pytest plugin

> [!NOTE]
> This command uses our Infrahub pytest plugin to run the tests in the `tests` folder. Those tests include:
>
> - Syntax checks for all the GraphQL queries
> - Syntax checks for the Checks
> - Syntax checks for all the Jinja files used in `templates`
> - Rendering tests that use the input/output files to confirm there is no unexpected missing piece

```shell
pytest -v ./tests
```

### 6. Create a new Branch

Create a new branch `test` directly in the UI, or use our SDK from the CLI:

```shell
poetry run infrahubctl branch create test
```

### 7. Create new network services and regenerate artifacts in your branch

> [!NOTE]
> In the branch diff you will see changes not only to the data but to the artifacts as well.
> You can also go back in time to view the branch diff as it was before you created the new services (and you can do the same on `main` after merging the proposed change)

### 8. Create a proposed change

Using your new branch `test`, you will be able to see the diff in the proposed change, along with the checks and tests running in the CI pipeline.

### 9. Try out the topology check

- Modify an element in a topology (for example, increase or decrease the number of leaf switches in fra05-pod1)

- The checks will run in the proposed change, and `check_device_topology` will fail.

### 10. Deploy your environment to Containerlab

The Containerlab generator automatically generates a Containerlab topology artifact for every topology. Every device has its startup configuration as an artifact.

```shell
# Download all artifacts automatically to ./generated-configs/
poetry run python3 scripts/get_configs.py

# Start Containerlab
sudo -E containerlab deploy -t ./generated-configs/clab/fra05-pod1.yml --reconfigure
```
48 changes: 48 additions & 0 deletions docs/docs/convert_links.py
@@ -0,0 +1,48 @@
import os
import re
from pathlib import Path


def convert_absolute_to_relative(source_path: Path, absolute_path: str) -> str:
"""Convert an absolute path to a relative path based on the source file location."""
# Remove leading slash to make it relative to the root
target_path = Path(absolute_path.lstrip("/"))

# Get the relative path from source directory to target
relative_path = os.path.relpath(target_path, source_path.parent)

return relative_path


def process_file(file_path: Path) -> None:
"""Process a single markdown file and convert absolute links to relative."""
content = file_path.read_text()

# Regular expression to find markdown links
pattern = r"\[([^\]]+)\]\((/[^)]+)\)"

def replace_link(match):
link_text = match.group(1)
absolute_path = match.group(2)
relative_path = convert_absolute_to_relative(file_path, absolute_path)
return f"[{link_text}]({relative_path})"

new_content = re.sub(pattern, replace_link, content)

# Only write if content has changed
if new_content != content:
print(f"Updating {file_path}")
file_path.write_text(new_content)


def main():
# Find all markdown files in current directory and subdirectories
root_dir = Path(".")
markdown_files = list(root_dir.rglob("*.md")) + list(root_dir.rglob("*.mdx"))

for file_path in markdown_files:
process_file(file_path)


if __name__ == "__main__":
main()
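As a sketch of what this conversion does end to end (the file path and link target below are hypothetical, not taken from this repository), here is the same pattern and replacement logic applied to a single line of Markdown:

```python
import os
import re
from pathlib import Path

# The same link pattern used in convert_links.py: [text](/absolute/path)
PATTERN = r"\[([^\]]+)\]\((/[^)]+)\)"

source = Path("docs/guide.md")  # hypothetical markdown file
content = "See the [schema reference](/reference/schema) for details."

def replace_link(match: re.Match) -> str:
    # Rebuild the absolute target as a path relative to the source file's directory
    relative = os.path.relpath(match.group(2).lstrip("/"), source.parent)
    return f"[{match.group(1)}]({relative})"

converted = re.sub(PATTERN, replace_link, content)
print(converted)  # See the [schema reference](../reference/schema) for details.
```

Because `os.path.relpath` computes the path from the source file's directory, a link from `docs/guide.md` to `/reference/schema` gains the `../` prefix needed to climb out of `docs/`.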
131 changes: 131 additions & 0 deletions docs/docs/development/backend.mdx
@@ -0,0 +1,131 @@
---
title: Backend guide
---

# Backend guide

To start developing on the Infrahub backend, it is recommended to have a decent knowledge of topics such as Docker, Python, and UNIX systems in general. Tools such as Docker and Python virtual environments help isolate the development work from the system itself. In this guide, we will use:

* [Python](https://www.python.org/) to be able to run the code
* [Invoke](https://www.pyinvoke.org/) to run some commands bundled with Infrahub
* [Poetry](https://python-poetry.org/) to manage our Python virtual environment
* [Docker](https://www.docker.com/) and its Compose extension to run dependencies such as the database, cache and queueing system

To fetch Infrahub's code, we will use Git and work on the `develop` branch (the default):

```shell
git clone --recursive git@github.com:opsmill/infrahub.git
cd infrahub
```

## Basic settings

Infrahub and the tools around it rely on a number of settings, generally provided as environment variables. Managing many of these by hand can be hard to maintain, so a tool such as [direnv](https://direnv.net/) can help: it lets you define environment variables (or pretty much anything bash can make sense of) in a file that is interpreted when entering a given directory. Here is an example of a `.envrc` file providing development-friendly setting values:

```shell
export INFRAHUB_PRODUCTION=false
export INFRAHUB_SECURITY_SECRET_KEY=super-secret
export INFRAHUB_USERNAME=admin
export INFRAHUB_PASSWORD=infrahub
export INFRAHUB_TIMEOUT=20
export INFRAHUB_METRICS_PORT=8001
export INFRAHUB_DB_TYPE=neo4j # Accepts neo4j or memgraph
export INFRAHUB_INITIAL_ADMIN_TOKEN="${ADMIN_TOKEN}" # Random string which can be generated using: openssl rand -hex 16
export INFRAHUB_STORAGE_LOCAL_PATH="${HOME}/Development/infrahub-storage"
export INFRAHUB_API_CORS_ALLOW_ORIGINS='["http://localhost:8080"]' # Allow frontend/backend communications without CORS issues
```

These environment variables must be set before moving on to the next steps; without them, you will likely run into errors or issues later.

## Required services

Infrahub uses several external services to work:

* A Neo4j database
* A Redis in-memory store
* A RabbitMQ message broker

To run all these services we will use Docker, but for local development some container ports need to be bound to local ones. A very basic Docker Compose override file for this is provided in the `development` directory; it has a `tmp` extension, which makes Compose ignore it by default, so we copy it to a new file without that extension. In a development environment, running only the services in Docker while Infrahub runs on local Python is convenient, as it takes advantage of the server's auto-reload feature when making changes.

```shell
cp development/docker-compose.dev-override.yml.tmp development/docker-compose.dev-override.yml
```

Now we need a compatible version of Python for Infrahub to run on, Poetry to create the virtual environment, and Invoke to run commands. Invoke can be installed in many ways, but we recommend using `pipx` to make it available user-wide without touching the system Python. Assuming these utilities are ready, we can build a proper Python environment with:

```shell
cd infrahub # or the directory of your choice
poetry install
```

Some tests require these services in order to work. By default, pytest starts them automatically before the tests run.
You can also disable this automatic startup and rely on existing services by setting an environment variable:

```bash
export INFRAHUB_USE_TEST_CONTAINERS=false
```

The required services now need to be started using dedicated commands:

```shell
invoke dev.destroy dev.deps
```

This runs two tasks: one to destroy any remains of a previous run and one to start the services, effectively bringing up clean services without leftovers. We can see which services are running with:

```shell
poetry run invoke dev.status
```

This should yield Docker output like the following:

```shell
NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
infrahub-cache-1 redis:7.2 "docker-entrypoint.s…" cache 2 hours ago Up 2 hours (healthy) 0.0.0.0:6379->6379/tcp
infrahub-database-1 memgraph/memgraph:2.13.0 "/usr/lib/memgraph/m…" database 2 hours ago Up 2 hours (healthy) 0.0.0.0:7444->7444/tcp, 0.0.0.0:7474->7474/tcp, 0.0.0.0:7687->7687/tcp
infrahub-message-queue-1 rabbitmq:3.12-management "docker-entrypoint.s…" message-queue 2 hours ago Up 2 hours (healthy) 4369/tcp, 5671/tcp, 0.0.0.0:5672->5672/tcp, 15671/tcp, 15691-15692/tcp, 25672/tcp, 0.0.0.0:15672->15672/tcp
```

When following a guide such as the [installation guide](../guides/installation.mdx), you may see the `demo.start` command mentioned. It is slightly different from the `dev.deps` task used here: `demo.start` brings up a whole demo environment, including the services and Infrahub itself, while `dev.deps` only starts the services shown in the code block above.

## Running Infrahub test suite

With the required services running and the Python virtual environment properly set up, we can now run the Infrahub test suite to make sure the code works as intended.

```shell
INFRAHUB_LOG_LEVEL=CRITICAL pytest -v backend/tests/unit
```

The environment variable at the beginning of the command gives a much cleaner output when running tests.

## Running Infrahub server

We can run the Infrahub server with the built-in command:

```shell
infrahub server start --debug
```

The `--debug` flag makes the server reload when a change is detected in the source code.

Note that this only makes the backend service usable; the frontend will not be available. The Swagger documentation should be reachable at http://localhost:8000/api/docs, while the GraphQL sandbox is only available through the frontend.

For running the frontend, please refer to its [dedicated documentation section](./frontend).

## Loading a new schema via CLI

For testing code changes, you may want to load a new schema from a YAML file. This can be performed in the development environment using:

```shell
poetry run infrahubctl schema load ${PATH_TO_SCHEMA_FILE}
```

## Code format

Formatting code in the backend relies on [Ruff](https://docs.astral.sh/ruff/) and [yamllint](https://yamllint.readthedocs.io/en/stable/). To ensure all files are as close as possible to the expected format, it is recommended to run the `format` command:

```shell
poetry run invoke format
```
42 changes: 42 additions & 0 deletions docs/docs/development/changelog.mdx
@@ -0,0 +1,42 @@
---
title: Changelog guide
---
# Changelog guide

Infrahub utilizes a tool called [`towncrier`](https://towncrier.readthedocs.io/) for Changelog management and generation.

The Changelog is maintained in a `CHANGELOG.md` file in the main directory of each package in this repository.

Multiple Changelogs can be merged into a single Release Note for the overall project.

Infrahub follows the best practices from [Keep a Changelog](https://keepachangelog.com/), and our categories of change are:

- **Added** for new features
- **Changed** for changes in existing functionality
- **Deprecated** for soon-to-be removed features
- **Removed** for now removed features
- **Fixed** for any bug fixes
- **Security** in case of vulnerabilities

## Creating changelog entries

What this means in practice for contributing to Infrahub is:

1. Any PR to the `develop` (or `stable`) branch that closes an issue should contain at least one Markdown formatted "Newsfragment" file in the related `changelog` directory
- `changelog/` for Infrahub changes
- `python-sdk/changelog` for Infrahub SDK changes
1. This file should be named with the format `<github issue id>.<change type>.md`. For example:
- A Newsfragment file named `1234.fixed.md` represents a bug fix PR closing GitHub Issue #1234
- This allows `towncrier` to populate the correct section of the Changelog with the relevant information
    - *Note*: If a PR doesn't close (or at least reference) a specific issue, you can use plain text instead of an issue ID by prefixing the filename with `+`.
- For example: `+this_is_not_a_github_issue.fixed.md`
1. Available Newsfragment suffixes are:
- `added`
- `changed`
- `deprecated`
- `removed`
- `fixed`
- `security`
1. Upon release of a new version, maintainers (and eventually CI) will execute `towncrier build` in each package directory, which will:
   1. Consolidate all Newsfragment files from each `changelog` directory into the corresponding `CHANGELOG.md`, automatically prepending the changes to the relevant release section
   1. `git rm` all individual Newsfragment files
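The filename convention above can be split apart mechanically. The helper below is an illustrative sketch of that convention, not towncrier's actual parser:

```python
def parse_newsfragment(filename: str) -> tuple[str, str]:
    """Split a newsfragment filename into (issue-or-label, category).

    A leading '+' marks a plain-text label with no associated GitHub issue.
    """
    label, category, ext = filename.rsplit(".", 2)
    if ext != "md":
        raise ValueError("newsfragments are Markdown files")
    return label, category

print(parse_newsfragment("1234.fixed.md"))           # ('1234', 'fixed')
print(parse_newsfragment("+not_an_issue.fixed.md"))  # ('+not_an_issue', 'fixed')
```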