A FlareSolverr alternative using Scrappey.com
scrappey_proxy is a drop-in replacement for FlareSolverr. It works with the exact same API as FlareSolverr, ensuring compatibility with your existing setup.
Instead of implementing Cloudflare bypass logic itself, it leverages Scrappey.com's bypassing capabilities while handling cookies and user-agent consistency.
The script makes roughly one cookie request every 2 hours (the typical cf_clearance cookie lifespan), which keeps it both reliable and affordable.
The script has only been tested with Prowlarr; feel free to try it with Jackett or any other tool and share your feedback.
Not all FlareSolverr functionality has been implemented. It currently supports GET and POST requests, which should cover the majority of use cases.
scrappey_proxy sits between your applications (like Prowlarr) and websites protected by Cloudflare, handling the challenge-solving process:
1. Your application sends a request to scrappey_proxy
2. scrappey_proxy checks if the site is protected by Cloudflare
3. If protected, scrappey_proxy forwards the request to Scrappey.com
4. Scrappey.com solves the challenge through its service
5. The request gets routed through your configured proxy to maintain IP consistency
6. scrappey_proxy receives the solution, saves the cookies, and returns the results
7. Future requests reuse the saved cookies until they expire
This architecture ensures that both IP address and User-Agent remain consistent, which is crucial for Cloudflare bypass.
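As an illustration, the detection step above can be sketched with a simple heuristic. This is not the project's actual implementation; the function name and the specific markers are assumptions based on how Cloudflare challenge pages commonly look:

```python
def looks_like_cloudflare_challenge(status_code, headers, body):
    """Heuristic check for a Cloudflare challenge page (illustrative only).

    Cloudflare challenge responses typically use HTTP 403 or 503, identify
    themselves via the Server header, and embed challenge markers in the HTML.
    """
    if status_code not in (403, 503):
        return False
    if headers.get("Server", "").lower() != "cloudflare":
        return False
    # Strings commonly found on Cloudflare interstitial pages (assumed markers)
    markers = ("cf-chl", "Just a moment", "challenge-platform")
    return any(marker in body for marker in markers)

# A normal 200 response is not flagged:
print(looks_like_cloudflare_challenge(200, {"Server": "nginx"}, "<html>ok</html>"))  # False
```

Only when a response is flagged as a challenge does a request to Scrappey.com become necessary; everything else is served with the cached cookies.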
The system consists of three main components working together:

**scrappey_proxy**
- Acts as a FlareSolverr-compatible API
- Detects Cloudflare protection
- Manages cookies and request routing
- Communicates with Scrappey.com

**Scrappey.com**
- Online Cloudflare bypass service
- Returns cookies and content

**Forward proxy**
- Provides a consistent external IP
- Routes Scrappey.com requests
- Maintains IP-based cookie validity
```bash
docker run -d \
  --name scrappey_proxy \
  -p 8191:8191 \
  -e PROXY_USERNAME="your_proxy_username" \
  -e PROXY_PASSWORD="your_proxy_password" \
  -e PROXY_INTERNAL_IP="proxy_internal_ip" \
  -e PROXY_EXTERNAL_IP="proxy_external_ip" \
  -e PROXY_INTERNAL_PORT="proxy_internal_port" \
  -e PROXY_EXTERNAL_PORT="proxy_external_port" \
  -e SCRAPPEY_API_KEY="your_scrappey_api_key" \
  anthonyraffy/scrappey_proxy:latest
```
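For reference, here is an equivalent Docker Compose sketch using the same image and environment variables as the `docker run` command above (adjust the placeholder values to your setup):

```yaml
services:
  scrappey_proxy:
    image: anthonyraffy/scrappey_proxy:latest
    container_name: scrappey_proxy
    ports:
      - "8191:8191"
    environment:
      PROXY_USERNAME: "your_proxy_username"
      PROXY_PASSWORD: "your_proxy_password"
      PROXY_INTERNAL_IP: "proxy_internal_ip"
      PROXY_EXTERNAL_IP: "proxy_external_ip"
      PROXY_INTERNAL_PORT: "proxy_internal_port"
      PROXY_EXTERNAL_PORT: "proxy_external_port"
      SCRAPPEY_API_KEY: "your_scrappey_api_key"
    restart: unless-stopped
```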
Example manifests to come.
| Variable | Description |
|---|---|
| `PROXY_USERNAME` | Username for your forward proxy |
| `PROXY_PASSWORD` | Password for your forward proxy |
| `PROXY_INTERNAL_IP` | Internal IP of your proxy (how scrappey_proxy connects to it) |
| `PROXY_EXTERNAL_IP` | External IP of your proxy (how Scrappey.com connects to it) |
| `PROXY_INTERNAL_PORT` | Internal port of your proxy |
| `PROXY_EXTERNAL_PORT` | External port of your proxy |
| `SCRAPPEY_API_KEY` | Your API key from Scrappey.com |
- A working forward proxy (like Squid)
  - Must be accessible from both your server and the internet
  - Configured to allow authenticated access
- A Scrappey.com account with an API key
  - Obtain your API key from your account dashboard
1. Go to Settings → Indexers
2. Add two proxies:
   - **FlareSolverr Proxy:**
     - Host: IP of your scrappey_proxy instance
     - Port: 8191 (the default FlareSolverr port)
     - Tags: a tag like "scrappey"
   - **HTTP Proxy:**
     - Host: your PROXY_INTERNAL_IP
     - Port: your PROXY_INTERNAL_PORT
     - Username: your PROXY_USERNAME
     - Password: your PROXY_PASSWORD
     - Tags: a tag like "squid-proxy"
3. For each indexer that needs Cloudflare bypass:
   - Edit the indexer settings
   - Add both tags you created to the "Tags" field
Not tested with Jackett yet.
After installation, you can test if scrappey_proxy is working correctly:
```bash
curl -X POST http://localhost:8191/v1 \
  -H "Content-Type: application/json" \
  -d '{"cmd": "request.get", "url": "https://cloudflare-protected-site.com"}'
```
If successful, you should receive a JSON response with the page content and cookies.
The typical request structure follows FlareSolverr's format.

GET request:

```json
{
  "cmd": "request.get",
  "url": "https://your-target-site.com",
  "maxTimeout": 60000
}
```

POST request:

```json
{
  "cmd": "request.post",
  "url": "https://your-target-site.com",
  "postData": "a=b&c=d",
  "maxTimeout": 60000
}
```
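As an illustration, these payloads can be built and sent from Python using only the standard library. The helper names below (`build_payload`, `solve`) are hypothetical, not part of this project; the endpoint and field names follow FlareSolverr's v1 API as shown above:

```python
import json
import urllib.request

PROXY_URL = "http://localhost:8191/v1"  # adjust host/port to your deployment

def build_payload(cmd, url, post_data=None, max_timeout=60000):
    """Build a FlareSolverr-style request body."""
    payload = {"cmd": cmd, "url": url, "maxTimeout": max_timeout}
    if post_data is not None:
        payload["postData"] = post_data  # form-encoded string, e.g. "a=b&c=d"
    return payload

def solve(payload):
    """POST the payload to scrappey_proxy and return the parsed JSON response."""
    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example payloads (calling solve() requires a running scrappey_proxy instance):
get_payload = build_payload("request.get", "https://your-target-site.com")
post_payload = build_payload("request.post", "https://your-target-site.com",
                             post_data="a=b&c=d")
```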
This project is licensed under the GPL-3.0 License. For more details, refer to the LICENSE file.