To benchmark a URL cleaning tool I'm building, I collected 9000 URLs from Reddit via `reddit.com/domain/$DOMAIN` and want to pass them all into a bash script that's effectively a fancy wrapper around hyperfine. The problem is that passing even the first 3000 URLs into a `--parameter-list` fails with an "argument list too long" error, and passing in all 9000 somehow crashes my laptop.

What I'd like is to be able to write the URLs to a file and then say, for example, `--parameters-from-file-lines url urls.txt`. In fact, this could skip the file entirely via process substitution: `--parameters-from-file-lines url <(url-getting-command)`.
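For reference, a workaround sketch under current hyperfine, assuming the URLs contain no commas (since the existing `--parameter-list <name> <values>` flag takes a comma-separated list): split the URL file into chunks small enough to stay under `ARG_MAX` and invoke hyperfine once per chunk. The invocation is echoed rather than executed here so the sketch is self-contained; `./clean-url` is a hypothetical placeholder for the tool under test.

```shell
# Build a small demo URL list (stand-in for the real 9000-line urls.txt).
printf '%s\n' https://example.com/a https://example.com/b https://example.com/c > urls.txt

# Split into fixed-size chunks; 2 lines per chunk for the demo, ~500 in practice.
split -l 2 urls.txt chunk.

for f in chunk.*; do
    params=$(paste -sd, "$f")   # join the chunk's lines with commas
    # Echoed instead of run; drop the `echo` to actually benchmark.
    echo hyperfine --parameter-list url "$params" "'./clean-url {url}'"
done
```

This keeps each argument list bounded, but it also resets hyperfine's warmup and summary per chunk, which is exactly why a `--parameters-from-file-lines`-style option reading one value per line would be nicer.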