
Conversation

LucasSte

@LucasSte LucasSte commented Oct 7, 2025

Problem

Downloads directly from the browser URL may fail on slow connections because GitHub now restricts them to a five-minute window. One of the alternatives discussed in https://github.com/orgs/community/discussions/169381#discussioncomment-14105326 is using the GitHub REST API for the download.

This PR should fix anza-xyz/platform-tools#108 and anza-xyz/platform-tools#107.

Summary of Changes

  1. Bump the solana-file-download crate.
  2. Query GitHub for the object ID and download URL.
  3. Download via the GitHub REST API.
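
For illustration, a minimal sketch of that two-step flow, assuming the reqwest (blocking and json features) and serde_json crates; this is not the actual solana-file-download code:

```rust
use std::{fs::File, io::copy};

// Sketch: resolve a release asset's object ID via the GitHub REST API,
// then download the bytes from the asset endpoint.
fn download_release_asset(
    owner: &str,
    repo: &str,
    tag: &str,
    asset_name: &str,
    dest: &str,
) -> Result<(), Box<dyn std::error::Error>> {
    // GitHub's API rejects requests without a User-Agent header.
    let client = reqwest::blocking::Client::builder()
        .user_agent("solana-file-download")
        .build()?;

    // Step 1: query the release metadata for the asset's object ID.
    let release: serde_json::Value = client
        .get(format!(
            "https://api.github.com/repos/{owner}/{repo}/releases/tags/{tag}"
        ))
        .send()?
        .error_for_status()?
        .json()?;
    let asset_id = release["assets"]
        .as_array()
        .and_then(|assets| {
            assets.iter().find(|a| a["name"].as_str() == Some(asset_name))
        })
        .and_then(|asset| asset["id"].as_i64())
        .ok_or("asset not found in release")?;

    // Step 2: download through the REST API asset endpoint. Asking for
    // `application/octet-stream` makes GitHub return the raw bytes
    // instead of the asset's JSON metadata.
    let mut response = client
        .get(format!(
            "https://api.github.com/repos/{owner}/{repo}/releases/assets/{asset_id}"
        ))
        .header("Accept", "application/octet-stream")
        .send()?
        .error_for_status()?;
    copy(&mut response, &mut File::create(dest)?)?;
    Ok(())
}
```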

@codecov-commenter

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 83.2%. Comparing base (e75d32f) to head (f383ac4).

Additional details and impacted files
@@           Coverage Diff           @@
##           master    #8370   +/-   ##
=======================================
  Coverage    83.2%    83.2%           
=======================================
  Files         838      838           
  Lines      368496   368496           
=======================================
+ Hits       306684   306728   +44     
+ Misses      61812    61768   -44     

@LucasSte LucasSte requested review from Lichtso and joncinque October 7, 2025 19:52
@LucasSte LucasSte marked this pull request as ready for review October 7, 2025 19:52
@t-nelson

t-nelson commented Oct 7, 2025

pretty sure we used to do this for the release binary tarball and at some point gh instituted quotas so we had to switch to cloud buckets. did we confirm that this interface provides the bandwidth that we need here?

@LucasSte
Copy link
Author

LucasSte commented Oct 8, 2025

> pretty sure we used to do this for the release binary tarball and at some point gh instituted quotas so we had to switch to cloud buckets. did we confirm that this interface provides the bandwidth that we need here?

According to the GitHub docs, for unauthenticated users like us:

> Primary rate limit for unauthenticated users
> You can make unauthenticated requests if you are only fetching public data. Unauthenticated requests are associated with the originating IP address, not with the user or application that made the request. The primary rate limit for unauthenticated requests is 60 requests per hour.

I think on that end we are good: each install only makes a couple of requests (one metadata query plus one asset download).

There are also secondary limits:

> Make too many concurrent requests. No more than 100 concurrent requests are allowed. This limit is shared across the REST API and GraphQL API.
>
> Make too many requests to a single endpoint per minute. No more than 900 points per minute are allowed for REST API endpoints.

I don't think users download the tools that often, do they?

See the points in https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28#calculating-points-for-the-secondary-rate-limit.

> Make too many requests per minute. No more than 90 seconds of CPU time per 60 seconds of real time is allowed. No more than 60 seconds of this CPU time may be for the GraphQL API. You can roughly estimate the CPU time by measuring the total response time for your API requests.

I don't know whether "response time" includes the download time. When I download with curl and limit the download rate to 500 kbit/s, the download takes 15 minutes and finishes successfully.

> Make too many requests that consume excessive compute resources in a short period of time.
> Create too much content on GitHub in a short amount of time.

Not our use case.
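
If we ever did trip a secondary limit anyway, GitHub answers with a 403 or 429, usually with a retry-after header. A defensive retry sketch, assuming the same reqwest blocking client as above (this is not something the PR implements):

```rust
use std::{thread::sleep, time::Duration};

// Retry a GET a few times when GitHub signals rate limiting.
fn get_with_rate_limit_retry(
    client: &reqwest::blocking::Client,
    url: &str,
) -> Result<reqwest::blocking::Response, Box<dyn std::error::Error>> {
    for _ in 0..3 {
        let response = client.get(url).send()?;
        let status = response.status();
        if status == 403 || status == 429 {
            // Secondary rate limits come with a `retry-after` hint (seconds).
            let wait = response
                .headers()
                .get("retry-after")
                .and_then(|value| value.to_str().ok())
                .and_then(|value| value.parse().ok())
                .unwrap_or(60);
            sleep(Duration::from_secs(wait));
            continue;
        }
        return Ok(response.error_for_status()?);
    }
    Err("still rate limited after retries".into())
}
```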

@joncinque joncinque left a comment

Looks great to me! Do get Trent's approval on the rate limiting bit, just to make sure we aren't wandering into another problem.

@t-nelson

t-nelson commented Oct 8, 2025

don't think it was a rate limit last time. rather a bandwidth quota. i can't find anything clearly documented atm though. i guess fafo. should probably have a backup plan tho


this is just a concept ack. didn't review the code

@LucasSte
Author

LucasSte commented Oct 9, 2025

Given the potential quota Trent is worried about, we could instead adopt another strategy.

We could maintain the existing URL for automatic downloads, when someone simply invokes cargo-build-sbf, and add another CLI flag, --install-only, which would download from the GitHub REST API. We would also hint users to try --install-only if the automatic download fails.
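
Roughly, the dispatch could look like the stubs below (function names are illustrative, not actual cargo-build-sbf code):

```rust
// Illustrative stubs; the real download paths would live in solana-file-download.
fn download_via_browser_url() -> Result<(), String> {
    Err("stub: existing direct browser-URL download".into())
}

fn download_via_rest_api() -> Result<(), String> {
    Ok(()) // stub: REST API download, as sketched earlier in this thread
}

// Automatic installs keep the old URL; `--install-only` opts into the
// REST API path, and a failure on the old path hints at the new one.
fn install_tools(install_only: bool) -> Result<(), String> {
    if install_only {
        return download_via_rest_api();
    }
    download_via_browser_url().map_err(|err| {
        format!("{err}; if the download keeps failing, rerun with --install-only")
    })
}
```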

Any opinions?

@joncinque

> should probably have a backup plan tho

Hm, that's a good point. How about a flag to use the old download method and URL? e.g. --browser-url-tools-install or --old-tools-install-url, or whatever else might be better

@joncinque
Copy link

Sorry, I posted my message without seeing yours!

> We could maintain the existing URL for automatic downloads, when someone simply invokes cargo-build-sbf, and add another CLI flag, --install-only, which would download from the GitHub REST API. We would also hint users to try --install-only if the automatic download fails.

Works for me! I've wanted an install-only command for a long time 😄

@LucasSte
Author

Closing in favor of #8461

@LucasSte LucasSte closed this Oct 13, 2025
@LucasSte LucasSte deleted the github-api branch October 13, 2025 23:22