Issue with folder names containing spaces when downloading the Dashboard using GDG following v0.8.1 release #466
Replies: 15 comments
-
0.8.1 should allow for a more flexible pattern, especially with nested folders becoming a standard feature, but it does make spaces a bit troublesome, mostly because they get encoded with a + separator, which then needs to be escaped since + is a valid special character in a regex. I thought I had added proper tests for this and the CRUD operations. Can you tell me when you're getting the inconsistencies? For my part, given this regex in the config:

```yaml
watched:
  - ES\+net/LHC\+Data\+Challenge
```

it consistently writes the output to test/data/org_main-org/dashboards/ES+net/LHC+Data+Challenge/firefly-details.json and lists it as:

┌────┬─────────────────┬─────────────────┬────────────────────┬───────────────────────────┬───────────┬──────┬───────────────────────────────────────────────────┐
│ ID │ TITLE │ SLUG │ FOLDER │ NESTEDPATH │ UID │ TAGS │ URL │
├────┼─────────────────┼─────────────────┼────────────────────┼───────────────────────────┼───────────┼──────┼───────────────────────────────────────────────────┤
│ 3 │ Firefly Details │ firefly-details │ LHC Data Challenge │ ES+net/LHC+Data+Challenge │ JywX0Qt7k │ │ http://localhost:3000/d/JywX0Qt7k/firefly-details │
└────┴─────────────────┴─────────────────┴────────────────────┴───────────────────────────┴───────────┴──────┴───────────────────────────────────────────────────┘

Can you tell me a bit more about what you're doing? Any sequence of operations? Also, can you share the output of:
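As an aside on the escaping point above, here is a minimal Go sketch. It is not GDG's actual matching code; it assumes the watched entries are compiled as Go regular expressions (which the hand-escaped config entry above suggests) and simply shows why a literal + in an on-disk folder name has to be escaped in the pattern:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// On disk, the folder "LHC Data Challenge" is written as "LHC+Data+Challenge",
	// so the watched pattern has to match the encoded name.
	encoded := "ES+net/LHC+Data+Challenge"

	// Unescaped, '+' is a regex quantifier ("one or more"), so the raw name would
	// not match itself as a pattern. QuoteMeta produces the escaped form that the
	// watched entry above spells out by hand.
	pattern := regexp.QuoteMeta(encoded)
	fmt.Println(pattern) // ES\+net/LHC\+Data\+Challenge

	fmt.Println(regexp.MustCompile(pattern).MatchString(encoded)) // true
}
```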
-
@techuser12345 final poke before I close this out. Any updates/feedback?
-
Thanks for following up. Yes, I'm still seeing the issue; it seems to be caused by recursive encoding of folder names containing spaces. For example, when I export dashboards from UAT using GDG (v0.8.1), a folder like "Monthly Incidents" gets re-encoded on every export (Monthly+Incidents, then Monthly%2BIncidents, and so on).

I'm using GDG in UAT to extract dashboards and then push only the extracted JSON into Git (Stash), which is later provisioned in production. Because the folder name keeps changing, it creates inconsistencies and breaks the provisioning setup. I had to downgrade the version and fix all the folder names. Let me know if you'd like any further information.
-
Thanks for the follow-up. Just to confirm, when I was using GDG v0.8.1 my config included:

```yaml
context_name: grafana-sync
storage_engine:
  any_label:
    bucket_name: ""
    cloud_type: s3
    kind: cloud
  custom_cloud:
    access_id: ""
    bucket_name: mybucket
    cloud_type: s3
    endpoint: http://localhost:9000
    init_bucket: "true"
    kind: cloud
    prefix: dummy
    secret_key: ""
    ssl_enabled: "false"
contexts:
  grafana-sync:
    token: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXx
    connections: null
    dashboard_settings:
      nested_folders: true
      ignore_filters: true
    watched:
      - General
    watched_folders_override: []
    organization_name: "Main Org."
    secure_location: ""
    output_path: /path/to/grafana/gdg/output
    password: ""
    storage: ""
    url: https://sdfsdfsdsds.sdfs.sdf.dsfsd.dfs:3000/grafana
    user_name: ""
    user: null
global:
  debug: true
  api_debug: false
  ignore_ssl_errors: false
  retry_count: 3
  retry_delay: 5s
  clear_output: true
```

I don't use cloud/S3, just a Linux server.
I'm only exporting dashboards (not datasources, folders, or anything else). I was originally looking for a solution for library panels #286 (reply in thread), which is why I upgraded to the latest GDG. Please advise, given my current setup.
-
I'll try to test this on a Linux server next week. For what it's worth, what steps are you following? So you have an instance running that's UAT; I need to translate your pipeline into GDG steps.

1. Runs on the UAT server using GDG to export dashboards only
-
Thanks for the reply and clarification. Yes, my GDG context is named grafana-sync, and to clarify the workflow: users create or modify dashboards directly in the Grafana UI in UAT. When they're ready to promote to production, they manually trigger a Rundeck job, passing a Jira ticket ID as an argument. That job:

This path is used for Grafana provisioning in both UAT and Prod (via dashboard.yml in the Grafana config). I use GDG only on the UAT server, and only for exporting dashboards. By "pushes to Stash," I mean the exported dashboards are committed and pushed to Git (Bitbucket/Stash); there is no GDG context switching or push between instances.

While I originally upgraded to the latest GDG version to resolve the library panel issue (#286), I haven't been able to address that yet, as I'm first trying to sort out the encoding issue. Specifically, with clear_output: true enabled, I was still seeing recursive encoding of folder names containing spaces or special characters: "Monthly Incidents" → Monthly+Incidents → Monthly%2BIncidents → Monthly%252BIncidents
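For illustration only, here is a minimal Go sketch (not GDG's actual code) showing how repeatedly query-escaping a folder name reproduces exactly that chain:

```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	name := "Monthly Incidents"
	// First pass: the space becomes '+'.
	// Second pass: the '+' is itself escaped to %2B.
	// Third pass: the '%' from %2B is escaped to %25, giving %252B.
	for i := 0; i < 3; i++ {
		name = url.QueryEscape(name)
		fmt.Println(name)
	}
	// Output:
	// Monthly+Incidents
	// Monthly%2BIncidents
	// Monthly%252BIncidents
}
```

If any step in the pipeline escapes names that are already escaped on disk, this is exactly the progression you would see.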
-
I'm going to set up a basic Linux box to validate your issue. Thank you for sharing all of the workflow... so, if I understand this correctly:

At this point, some patterns that would be nice to try would be:

Basically, as far as I can tell, you have a host/VM where you're only pulling from UAT using GDG, writing to the same path, and only exporting to Prod. So I'm really confused about how you're getting different encodings. Are you saying that just running ... ?

If that's the case, the problem is clear. It doesn't make sense why it would work that way, but if I can recreate the issue, I'll get a fix out. If you can confirm the assumptions I've made, that'd be great. I assume this would not matter much, but if you can give me an exact OS version, that'd be great: x86 vs x86_64, etc. Also, the folder encodings you're seeing: I'm assuming those issues are on the file system exports, not in Grafana itself, is that correct?
-
Thanks again for the detailed response and for taking the time to validate this on Linux. To confirm your assumptions and add full context on my end:

> If I'm not mistaken you have ignore_filters set to true in your GDG contexts, correct?

Yes, that's right. I've supplied the full file in one of my previous replies.

Workflow Summary

```sh
./gdg tools ctx set grafana-sync
./gdg tools ctx list
./gdg backup dash download
cp -rp ./gdg/output/org_main-org/dashboards ./gdg/output/grafana/
```

a. First to UAT

Provisioning Setup

Datasource provisioning:

```yaml
apiVersion: 1
deleteDatasources:
  - name: Prometheus
    orgId: 1
datasources:
  - name: Prometheus
    type: prometheus
    uid: prometheus
    access: proxy
    orgId: 1
    url: https://${host}:${port}/prometheus
    version: 1
    editable: false
```

Dashboard provisioning (dashboard.yml):

```yaml
apiVersion: 1
providers:
  - name: 'bitbucket provider'
    orgId: 1
    folder: ''
    folderUid: ''
    type: file
    disableDeletion: false
    updateIntervalSeconds: 10
    allowUiUpdates: true
    options:
      path: grafana/dashboards
      foldersFromFilesStructure: true
```

Key Issue Recap

Environment Info

Please let me know if you require any additional information.
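Since the breakage only shows up once re-encoded names reach provisioning, a small hypothetical helper like the sketch below (not part of GDG; the export path is taken from the Rundeck workflow shared here) could flag folder names that look double-encoded before they are committed:

```go
// checkencoding.go: walk the GDG export tree and flag directory names that
// contain a '%', which indicates a name that has been URL-encoded a second
// time (e.g. Monthly%2BIncidents instead of Monthly+Incidents).
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	// Assumed export path, matching the cp source used in the workflow above.
	root := "./gdg/output/org_main-org/dashboards"
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, walkErr error) error {
		if walkErr != nil {
			return walkErr
		}
		if d.IsDir() && strings.Contains(d.Name(), "%") {
			fmt.Printf("possibly re-encoded folder name: %s\n", path)
		}
		return nil
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

Running it right after the download step and before copying into the repo clone would surface the re-encoding before a PR gets created.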
-
I don't have the cycles to look at this right now, but one question: have you tried "rm -fr ./gdg/output/grafana/" before you copy the data over? The whole reason for doing the ... If you can try that, it would be nice. As for the recap, it's very clear now what you're doing and where you're seeing the issues. I'll set up a test VM at some point and will try to recreate the behavior.
-
Thanks for the suggestion. For clarity, here's exactly what my Rundeck job does each run:

```sh
# 1) Fresh clone of repo into ./gdg/output/grafana
git clone <repo-url> ./gdg/output/grafana

# 2) Wipe the dashboards subtree inside the clone
rm -rf ./gdg/output/grafana/dashboards

# 3) Set and verify context
./gdg tools ctx set grafana-sync
./gdg tools ctx list

# 4) Export ALL dashboards from UAT to the GDG export path
./gdg backup dashboard download

# 5) Move General dashboards into Common
mv ./gdg/output/org_main-org/dashboards/General/* \
   ./gdg/output/org_main-org/dashboards/Common/

# 6) Create feature branch (Bitbucket API later creates the PR)

# 7) Copy the fresh export into the repo clone
cp -rp ./gdg/output/org_main-org/dashboards \
   ./gdg/output/grafana/

# 8) Clean up the GDG export source tree (so it doesn't linger)
rm -rf ./gdg/output/org_main-org/dashboards

# 9) Commit and push from inside the repo clone
cd ./gdg/output/grafana
git add -A
git commit -m "Sync dashboards from UAT"
git push origin <feature-branch>

# 10) Create PR via Bitbucket API token (automated)

# 11) Final cleanup of working dirs
cd -
rm -rf ./gdg/output/grafana
rm -rf ./gdg/output/.git
```

Key points:
-
Okay, I set this up on an Ubuntu VM. Initial setup:
A few notes:
My bug testing config:
gdg backup dashboards download

┌───────────┬───────────────────────────────────────────────────────────────────────────────────────────────────┐
│ TYPE │ FILENAME │
├───────────┼───────────────────────────────────────────────────────────────────────────────────────────────────┤
│ dashboard │ test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge/firefly-details.json │
│ dashboard │ test/bug/org_main-org/dashboards/Monthly+Incidents/bandwidth-dashboard.json │
│ dashboard │ test/bug/org_main-org/dashboards/Monthly+Incidents/bandwidth-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/individual-flows-per-country.json │
│ dashboard │ test/bug/org_main-org/dashboards/Monthly+Incidents/individual-flows.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/loss-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/other-flow-stats.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/science-discipline-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/top-talkers-over-time.json │
│ dashboard │ test/bug/org_main-org/dashboards/Ignored/latency-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-circuits.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-projects.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/dashboard-makeover-challenge.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/flow-analysis.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-country.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-organization.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-information.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flows-by-science-discipline.json │
└───────────┴───────────────────────────────────────────────────────────────────────────────────────────────────┘

Okay, the name is Monthly+Incidents, which is correct. Repeating just in case it's being odd: same behavior.

test/bug/
test/bug/org_main-org
test/bug/org_main-org/dashboards
test/bug/org_main-org/dashboards/ES+net
test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge
test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge/firefly-details.json
test/bug/org_main-org/dashboards/Monthly+Incidents
test/bug/org_main-org/dashboards/Monthly+Incidents/bandwidth-dashboard.json
test/bug/org_main-org/dashboards/Monthly+Incidents/bandwidth-patterns.json
test/bug/org_main-org/dashboards/Monthly+Incidents/individual-flows.json
test/bug/org_main-org/dashboards/General
test/bug/org_main-org/dashboards/General/individual-flows-per-country.json
test/bug/org_main-org/dashboards/General/loss-patterns.json
test/bug/org_main-org/dashboards/General/other-flow-stats.json
test/bug/org_main-org/dashboards/General/science-discipline-patterns.json
test/bug/org_main-org/dashboards/General/top-talkers-over-time.json
test/bug/org_main-org/dashboards/Ignored
test/bug/org_main-org/dashboards/Ignored/latency-patterns.json
test/bug/org_main-org/dashboards/linux%2Fgnu
test/bug/org_main-org/dashboards/linux%2Fgnu/Others
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-circuits.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-projects.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/dashboard-makeover-challenge.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/flow-analysis.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-country.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-organization.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-information.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flows-by-science-discipline.json

So far so good. I still think you're missing a few steps to help debug this, unless I'm misreading it. You're basically creating dashboards in your grafana-sync instance, using gdg to set the context and pull all of them down, then creating a git commit. As far as I can tell, the only GDG commands you're running are:

gdg tools ctx set grafana-sync
gdg tools ctx list
gdg backup dashboard download

All of that seems to be working fine. I even tried to put "Monthly Incidents" in a different folder to make it nested, with the same behavior:

┌───────────┬───────────────────────────────────────────────────────────────────────────────────────────────────┐
│ TYPE │ FILENAME │
├───────────┼───────────────────────────────────────────────────────────────────────────────────────────────────┤
│ dashboard │ test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge/firefly-details.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/individual-flows-per-country.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/loss-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/other-flow-stats.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/samir.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/science-discipline-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/General/top-talkers-over-time.json │
│ dashboard │ test/bug/org_main-org/dashboards/Ignored/latency-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/bandwidth-dashboard.json │
│ dashboard │ test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/bandwidth-patterns.json │
│ dashboard │ test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/individual-flows.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-circuits.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-projects.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/dashboard-makeover-challenge.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/flow-analysis.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-country.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-organization.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flow-information.json │
│ dashboard │ test/bug/org_main-org/dashboards/linux%2Fgnu/flows-by-science-discipline.json │
└───────────┴───────────────────────────────────────────────────────────────────────────────────────────────────┘

And the files:

find test/bug/
test/bug/
test/bug/org_main-org
test/bug/org_main-org/dashboards
test/bug/org_main-org/dashboards/ES+net
test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge
test/bug/org_main-org/dashboards/ES+net/LHC+Data+Challenge/firefly-details.json
test/bug/org_main-org/dashboards/General
test/bug/org_main-org/dashboards/General/individual-flows-per-country.json
test/bug/org_main-org/dashboards/General/loss-patterns.json
test/bug/org_main-org/dashboards/General/other-flow-stats.json
test/bug/org_main-org/dashboards/General/samir.json
test/bug/org_main-org/dashboards/General/science-discipline-patterns.json
test/bug/org_main-org/dashboards/General/top-talkers-over-time.json
test/bug/org_main-org/dashboards/Ignored
test/bug/org_main-org/dashboards/Ignored/latency-patterns.json
test/bug/org_main-org/dashboards/Testing+Bug
test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents
test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/bandwidth-dashboard.json
test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/bandwidth-patterns.json
test/bug/org_main-org/dashboards/Testing+Bug/Monthly+Incidents/individual-flows.json
test/bug/org_main-org/dashboards/linux%2Fgnu
test/bug/org_main-org/dashboards/linux%2Fgnu/Others
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-circuits.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/flow-data-for-projects.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/dashboard-makeover-challenge.json
test/bug/org_main-org/dashboards/linux%2Fgnu/Others/n%2B_%3D23r/flow-analysis.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-country.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-data-per-organization.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flow-information.json
test/bug/org_main-org/dashboards/linux%2Fgnu/flows-by-science-discipline.json

The output all seems to be working as expected, is valid, and is repeatable. Every time I run the download it re-creates the same files. No extra encoding or anything of the sort. You may have a bug elsewhere, maybe something in how you're creating the PR with the data via the Bitbucket API?
-
Can you also make sure that whatever you have on your local file system after download actually matches the git PR? @techuser12345
-
Moving this to a discussion, as I'm leaning towards an issue with the encoding of your request to the Bitbucket API.
-
@techuser12345 if you figure anything out, let me know, but for my part I'm not able to reproduce the issue on any OS, so there's nothing else to do on my side.
-
Issue with folder names containing spaces when downloading dashboards using GDG.

GDG encodes the space in the folder name differently each time I download the same original folder, so the folder name changes on every export.

GDG v0.8.1
Grafana v11.2.0

I think it's related to 79f8985, "Minor tweaks dealing with folders containing spaces" (#455).

Example: "Monthly Incidents" → "Monthly+Incidents" → "Monthly%2BIncidents" → "Monthly%252BIncidents"