Merged
27 commits
722394a
Temp file
j-rivero Feb 26, 2025
395a34e
WIP
j-rivero Feb 26, 2025
a16498d
Relocate python dependant code
Mar 6, 2025
111253f
Basic documentation for the script
Mar 12, 2025
9a1d908
Use single var expansion
Mar 12, 2025
21a23b7
Rework shell-hook
Mar 12, 2025
c0f667f
Improve usability
Mar 12, 2025
6078b34
Improve documentation
Mar 12, 2025
cd353a9
Remove temp file
Mar 12, 2025
a2eb690
Merge branch 'master' into jrivero/conda_local_build
j-rivero Mar 12, 2025
d98eac9
Merge branch 'master' into jrivero/conda_local_build
j-rivero Mar 13, 2025
7791b96
Merge branch 'master' into jrivero/conda_local_build
j-rivero Mar 13, 2025
77b88f4
Merge branch 'master' into jrivero/conda_local_build
j-rivero Mar 14, 2025
4cdde84
Fix bad merge of code
j-rivero Mar 14, 2025
449795c
Remove comment
j-rivero Mar 14, 2025
dcb74c3
Replace build script with Python
j-rivero Mar 14, 2025
46f4629
Need to be permissive
j-rivero Mar 14, 2025
569c110
Fix and color for the messages
j-rivero Mar 14, 2025
34c5975
Update local_build.py
j-rivero Mar 14, 2025
fe305e6
Fix format in markdown
j-rivero Mar 17, 2025
8cd55cb
Merge branch 'master' into jrivero/conda_local_build
j-rivero Mar 17, 2025
e253b72
local_build.py needs to be called from a cmd.exe
j-rivero Apr 14, 2025
bb2a460
Merge branch 'master' into jrivero/conda_local_build
j-rivero Apr 14, 2025
bc0f2c4
Merge branch 'master' into jrivero/conda_local_build
j-rivero May 12, 2025
a4d1e9a
-j is now supported by the script
j-rivero May 12, 2025
f74a492
Merge branch 'master' into jrivero/conda_local_build
j-rivero May 19, 2025
5d8b96e
Merge branch 'master' into jrivero/conda_local_build
j-rivero Oct 13, 2025
104 changes: 88 additions & 16 deletions jenkins-scripts/README.md
@@ -1,38 +1,110 @@
# Jenkins scripts

The release-tools repository uses the [DSL Jenkins plugin](https://plugins.jenkins.io/job-dsl/) to allow us to programmatically generate the job configuration (configuration as code). You can find the different job configs under the [`dsl`](./dsl/) folder.

## Conda local builder for Windows

### Prerequisites

Visual Studio 2019 needs to be installed before using the local builder.
The same [ROS 2 instructions](https://docs.ros.org/en/jazzy/Installation/Windows-Install-Binary.html#install-visual-studio) are valid here.
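
A quick, optional way to verify the installation is to query `vswhere`, which ships with the Visual Studio Installer (the path and version range below assume a default VS 2019 setup):

```bat
:: List the installation path of any Visual Studio 2019 (16.x) instance found
"%ProgramFiles(x86)%\Microsoft Visual Studio\Installer\vswhere.exe" -version "[16.0,17.0)" -property installationPath
```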


### Usage of `local_build.py`

The `local_build.py` script is used to reproduce Jenkins builds for Windows, specifically supporting Pixi builds.

### Running the script

> [!IMPORTANT]
> The script needs to be run from a Windows command prompt (`cmd.exe`). It does not work under PowerShell,
> Git Bash, or other shells.

To run the script, use the following command:

```bat
python3 local_build.py <jenkins-bat-script> <gz-sources> [--reuse-dependencies-environment] [-j <make_jobs>]
```

### Arguments

- `jenkins-bat-script`: The script to run, chosen from the `gz_*.bat` files in `release-tools/jenkins-scripts/`
- `sources`: Local checkout of the Gazebo library sources
- `--reuse-dependencies-environment` (optional): Reuse the Pixi build environment created in the initial run (useful for testing code changes).
- `-j`, `--jobs` (optional): Value to use for the `MAKE_JOBS` variable (number of build threads; defaults to 8). Both optional flags can be combined, as shown in the example further below.

### Example

Use case: reproducing a gz-math pull request for the branch `my-testing-branch`.

```bat
:: clone the branch to test (use your fork's URL if the branch lives there)
git clone -b my-testing-branch https://github.com/gazebosim/gz-math C:\Users\foo\code\gz-math
python3 local_build.py gz_math-default-devel-windows-amd64.bat C:\Users\foo\code\gz-math
```

This command will run `gz_math-default-devel-windows-amd64.bat` using the sources from `C:\Users\foo\code\gz-math`. It will install all the system dependencies
using Pixi (this can take up to 10 minutes) and build all the Gazebo dependencies from source
using colcon. In a second step, it builds gz-math with its tests using colcon.

When it finishes, you can modify the code in `C:\Users\foo\code\gz-math` and re-run the script
with the `--reuse-dependencies-environment` flag to reuse the environment
prepared with the external dependencies.

```bat
python3 local_build.py gz_math-default-devel-windows-amd64.bat C:\Users\foo\code\gz-math --reuse-dependencies-environment
```
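
Both optional flags can be combined in a single invocation, for instance (an illustrative sketch reusing the paths from the example above):

```bat
:: Reuse the dependency environment and limit the build to 4 parallel jobs
python3 local_build.py gz_math-default-devel-windows-amd64.bat C:\Users\foo\code\gz-math --reuse-dependencies-environment -j 4
```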

The script will also generate a `.debug_last_build.bat` file that sources the generated Pixi
environment and the colcon `install\setup.bat`, and leaves the user in the colcon workspace root inside
`%TMP%`. This allows direct debugging: nothing more than colcon is needed to edit and rebuild
the code in the colcon workspace.

```bat
call .debug_last_build.bat
:: ignore errors related to vs2019 if using other version of MSVC
C:\Users\josel\AppData\Local\Temp\12853\ws> colcon list
gz-cmake4 ws\src\gz-cmake (ros.cmake)
gz-plugin3 ws\src\gz-plugin (ros.cmake)
gz-tools2 ws\src\gz-tools (ros.cmake)
gz-utils3 ws\src\gz-utils (ros.cmake)
...
:: edit code in C:\Users\josel\AppData\Local\Temp\12853\ws\src\gz-sim
C:\Users\josel\AppData\Local\Temp\12853\ws> colcon build --packages-select gz-sim9
```
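
For reference, the generated `.debug_last_build.bat` is roughly of this shape (the paths are illustrative; the actual ones depend on `%TMP%` and the random workspace id picked for the run):

```bat
call C:\Users\josel\AppData\Local\Temp\12853\ws\install\setup.bat
call C:\Users\josel\AppData\Local\Temp\pixi\project\hooks.bat
cd C:\Users\josel\AppData\Local\Temp\12853\ws
```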

## DSL related

### Useful links
- [List of installed plugins in Jenkins](https://github.com/osrf/chef-osrf/blob/latest/cookbooks/osrfbuild/attributes/plugins.rb)
- [Jenkins DSL API docs](https://jenkinsci.github.io/job-dsl-plugin/)
- [Jenkins DSL Wiki](https://github.com/jenkinsci/job-dsl-plugin/wiki)

### Local testing

To test the build of the different `dsl` jobs locally, you need the following:

1. Run the `dsl/tools/setup_local_generation.bash` script to produce the necessary jar files
2. In the terminal execute:
```bash
java -jar <path-to-dsl-tools>/jobdsl.jar <file.dsl>
```
For more information go [here](https://github.com/jenkinsci/job-dsl-plugin/wiki/User-Power-Moves#run-a-dsl-script-locally).

### Development workflow

1. Make changes locally and test that they build correctly.
2. Push the changes to a specific branch in `release-tools`.
3. Go to the seed job for the job you want to test (usually you would use [`_dsl_test`](https://build.osrfoundation.org/job/_dsl_test/) so that jobs in production are not affected) and build it with the `RTOOLS_BRANCH` parameter pointing to your new custom branch.
4. After it builds correctly, you will have generated jobs with the changes you implemented. You can use and modify the generated jobs.

> WARNING: Running the _dsl job for a specific job that is not `test` will modify the production configuration. You should always aim to use `test` jobs.

### :arrow_forward: Playbook

#### XML injection into DSL
How to deal with plugins that do not implement the DSL layer and don't provide a DSL API?

There is a feature called [configure blocks](https://github.com/jenkinsci/job-dsl-plugin/wiki/The-Configure-Block) that allows us to represent xml job configs as DSL. [Here](https://github.com/gazebo-tooling/release-tools/blob/9fbfe60133d2b7b8b280b92f7c563dc64c8367a5/jenkins-scripts/dsl/_configs_/OSRFUNIXBase.groovy#LL83C1-L92C10) is an example of its usage with the retryBuild plugin, where `checkRegexp(true)` gets converted into `<checkRegexp>true</checkRegexp>` and the hierarchy of the definition is respected, so `checkRegexp` exists as a child of `com.chikli.hudson.plugin.naginator.NaginatorPublisher` in the XML definition.

To find the corresponding names for the XML tags, you can refer to the plugin documentation or, as an alternative, manually modify the job to add the information you want, then go to `https://build.osrfoundation.com/job/myjob/config.xml` and match the XML there with the DSL config.
2 changes: 0 additions & 2 deletions jenkins-scripts/lib/colcon-default-devel-windows.bat
@@ -49,8 +49,6 @@ set DXDIAG_FILE=%WORKSPACE%\dxdiag.txt
if "%GPU_SUPPORT_NEEDED%" == "true" (
echo # BEGIN SECTION: dxdiag info
dxdiag /t %DXDIAG_FILE%
:: found that locally this works in Win11
if errorlevel 1 ( dxdiag \t %DXDIAG_FILE%)
type %DXDIAG_FILE%
echo Checking for correct NVIDIA GPU support %DXDIAG_FILE%
findstr /C:"Manufacturer: NVIDIA" %DXDIAG_FILE%
4 changes: 3 additions & 1 deletion jenkins-scripts/lib/windows_env_vars.bat
@@ -1,6 +1,8 @@
set "PIXI_VERSION=0.44.0"
set "PIXI_URL=https://github.com/prefix-dev/pixi/releases/download/v%PIXI_VERSION%/pixi-x86_64-pc-windows-msvc.exe"
set "PIXI_PROJECT_PATH=%TMP%\pixi\project"
if NOT DEFINED PIXI_PROJECT_PATH (
set "PIXI_PROJECT_PATH=%TMP%\pixi\project"
)
set "PIXI_TMPDIR=%TMP%\pixi"
set "PIXI_TMP=%PIXI_TMPDIR%\pixi.exe"
set "CONDA_ENVS_DIR=%SCRIPT_DIR%\..\conda\envs\"
117 changes: 117 additions & 0 deletions jenkins-scripts/local_build.py
@@ -0,0 +1,117 @@
#!/usr/bin/env python3
import os
import sys
import shutil
import tempfile
import random
import subprocess
from pathlib import Path
import argparse

def main():
    # Get script directory
    script_dir = Path(__file__).parent.absolute()

    # Parse arguments
    parser = argparse.ArgumentParser(
        description="Local build script",
        formatter_class=argparse.RawTextHelpFormatter
    )
    parser.add_argument("script", help="The script to run")
    parser.add_argument("sources", help="Local checkout of sources directory")
    parser.add_argument(
        "--reuse-dependencies-environment",
        action="store_true",
        help="Reuse pixi build environment (useful for testing code changes)"
    )
    parser.add_argument(
        "-j", "--jobs",
        type=int,
        default=8,
        help="Number of building threads (default: 8)"
    )
    args = parser.parse_args()

    script_path = Path(script_dir) / Path(args.script)
    src_directory = args.sources
    reuse_dependencies_environment = args.reuse_dependencies_environment
    make_jobs = args.jobs

    # Check if the shell is a DOS Windows shell and not powershell or bash
    if not os.environ.get('COMSPEC', '').lower().endswith('cmd.exe'):
        print("This script is designed to be run in a DOS Windows shell (cmd.exe).")
        sys.exit(1)

    # Create temp workspace with random name
    workspace = Path(os.environ["TMP"]) / str(random.randint(0, 1000000))

    # Unset workspace-related variables that could leak in from a previous run
    for var in ["LOCAL_WS", "LOCAL_WS_BUILD", "LOCAL_WS_SOFTWARE_DIR", "VCS_DIRECTORY", "WORKSPACE"]:
        os.environ.pop(var, None)

    # Set environment variables
    os.environ["WORKSPACE"] = str(workspace)
    os.environ["MAKE_JOBS"] = str(make_jobs)  # Customize the number of building threads
    pixi_project_path = Path(os.environ["TMP"]) / "pixi" / "project"
    os.environ["PIXI_PROJECT_PATH"] = str(pixi_project_path)

    # Check if script exists
    if not Path(script_path).exists():
        print(f"Script {script_path} does not exist", file=sys.stderr)
        sys.exit(1)

    # Check if sources exist
    if not Path(src_directory).exists():
        print(f"Sources {src_directory} does not exist", file=sys.stderr)
        sys.exit(1)

    # Set additional environment variables
    os.environ["KEEP_WORKSPACE"] = "1"  # Help with debugging and re-run compilation only

    # Create debug last build file path
    dbg_last_build_file = Path(".debug_last_build.bat")
    if dbg_last_build_file.exists():
        dbg_last_build_file.unlink()

    # Create workspace and copy files
    workspace.mkdir(exist_ok=True)
    dest_dir = workspace / Path(src_directory).name
    print(f"Copying {src_directory} to {dest_dir}")
    shutil.copytree(src_directory, dest_dir, dirs_exist_ok=True)

    if reuse_dependencies_environment:
        os.environ["REUSE_PIXI_INSTALLATION"] = "1"
    else:
        os.environ.pop("REUSE_PIXI_INSTALLATION", None)

    # Run the script
    result = subprocess.run([script_path], shell=True, check=False)

    print("\n\033[1;34m Local build finished \033[0m\n")

    # Check for errors
    if result.returncode != 0:
        print("\033[1;31m FAILED \n\033[0m")
        sys.exit(1)

    # Create debug last build file for reproduction
    local_ws = workspace / "ws"
    with open(dbg_last_build_file, "w") as f:
        f.write(f"call {Path(local_ws) / 'install' / 'setup.bat'}\n")
        f.write(f"call {Path(pixi_project_path) / 'hooks.bat'}\n")
        f.write(f"cd {local_ws}\n")

    # Print fancy and colorful message
    print("\033[1;32m SUCCESS \n\033[0m")
    print(" - Build root is", workspace)
    print(" - Build workspace is", local_ws)
    print("\n\033[1;34mReproduce the call to the last build:\033[0m")
    print(" - Only reusing pixi environment:")
    print(f" \033[1;36m- run 'python3 local_build.py {args.script} {src_directory} --reuse-dependencies-environment'\033[0m")
    print(" - Preparing pixi and colcon and go to the colcon workspace:")
    print(f" \033[1;36m- run 'call {dbg_last_build_file}'\033[0m")


if __name__ == "__main__":
    main()