Incompatibility with selenium-stealth library in multiprocessing on Linux #2264

@raye22

🐛 Bug Report: Linux-specific incompatibility with selenium-stealth

Summary

When using undetected_chromedriver together with the selenium-stealth library in multiprocessing on Linux, Chrome instances crash with ProtocolError. The exact same code works perfectly on macOS, indicating a Linux-specific issue in how these libraries interact.

macOS: uc + selenium-stealth + multiprocessing = ✅ WORKS
Linux: uc + selenium-stealth + multiprocessing = ❌ CRASHES


Environment

Working Environment (macOS):

  • OS: macOS (Darwin)
  • Python: 3.10
  • Chrome: 141.0.7390.107
  • undetected-chromedriver: 3.5.x
  • selenium-stealth: 1.0.6
  • Result: ✅ 8 concurrent instances work perfectly

Failing Environment (Linux Server):

  • OS: Ubuntu 22.04 LTS (Linux kernel 5.15.0)
  • Python: 3.10
  • Chrome: 141.0.7390.107
  • undetected-chromedriver: 3.5.x (same version)
  • selenium-stealth: 1.0.6 (same version)
  • Result: ❌ First instance crashes when second starts

System Resources (Not Limited):

  • File descriptors: 1,048,576
  • Shared memory: 246GB
  • RAM: 491GB total
  • CPUs: 64

The Issue

This is a platform-specific incompatibility: the combination of undetected_chromedriver + selenium-stealth only fails on Linux.

| Configuration | macOS | Linux |
| --- | --- | --- |
| uc alone + multiprocessing | ✅ Works | ✅ Works |
| uc + selenium-stealth + single process | ✅ Works | ✅ Works |
| uc + selenium-stealth + multiprocessing | ✅ Works | ❌ Crashes |

Since the exact same code works on macOS, this suggests a platform-specific difference in how undetected_chromedriver patches Chrome on Linux versus macOS, one that conflicts with selenium-stealth's own patches.
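One way to probe the patching hypothesis (a diagnostic sketch, not a confirmed cause): undetected_chromedriver's Patcher rewrites the `cdc_`-prefixed variable names inside the cached chromedriver binary. A check like the one below can tell you whether the driver binary each worker actually launched was patched, which would help confirm or rule out a platform difference in the patch step. The path in the comment is only an example of where uc caches drivers on Linux.

```python
# Sketch: verify that uc's Patcher actually rewrote the cdc_ strings
# in the chromedriver binary. A surviving b"cdc_" marker would suggest
# the patch step behaved differently on this platform.

def contains_cdc_marker(data: bytes) -> bool:
    """Return True if the tell-tale cdc_ variable prefix is still present."""
    return b"cdc_" in data

def driver_binary_is_unpatched(driver_path: str) -> bool:
    # e.g. on Linux uc caches drivers under
    # ~/.local/share/undetected_chromedriver/
    with open(driver_path, "rb") as f:
        return contains_cdc_marker(f.read())
```

Running `driver_binary_is_unpatched()` against the driver path each worker logs would show whether both platforms start from an equally patched binary.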


Code to Reproduce (Fails on Linux, Works on macOS)

```python
import multiprocessing as mp
import os
import time

from selenium.webdriver.chrome.options import Options
import undetected_chromedriver as uc
from selenium_stealth import stealth


def worker(worker_id):
    print(f"[Worker-{worker_id}] Starting (PID: {os.getpid()})")

    try:
        opts = Options()
        opts.binary_location = "/usr/bin/google-chrome-stable"  # Linux path
        opts.add_argument("--headless=new")
        opts.add_argument(f"--user-data-dir=/tmp/uc_test_{os.getpid()}")
        opts.add_argument("--no-sandbox")
        opts.add_argument("--disable-dev-shm-usage")
        opts.add_argument(f"--remote-debugging-port={9222 + worker_id}")

        driver = uc.Chrome(options=opts, version_main=None)

        # Adding selenium-stealth causes crashes on Linux only
        stealth(
            driver,
            languages=["en-US", "en"],
            vendor="Apple Computer, Inc.",
            platform="MacIntel",
            webgl_vendor="Apple Inc.",
        )

        driver.get("https://www.google.com")
        print(f"[Worker-{worker_id}] ✓ Success: {driver.title}")
        driver.quit()

    except Exception as e:
        print(f"[Worker-{worker_id}] ✗ FAILED: {e}")
        import traceback
        traceback.print_exc()


if __name__ == "__main__":
    mp.set_start_method('spawn', force=True)

    p1 = mp.Process(target=worker, args=(1,))
    p2 = mp.Process(target=worker, args=(2,))

    p1.start()
    time.sleep(2)
    p2.start()

    p1.join()
    p2.join()
```

Expected Behavior

Both workers should succeed on both macOS and Linux.


Actual Behavior

On macOS: ✅ Both workers succeed

On Linux: ❌ First worker crashes

```
[Worker-1] Starting
[Worker-1] Creating Chrome driver...
[Worker-1] Loading page...
[Worker-2] Starting
[Worker-2] Creating Chrome driver...
[Worker-2] Loading page...
[Worker-1] ✗ FAILED: ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
[Worker-2] ✓ Success: Google
```

Why selenium-stealth is Needed

Without selenium-stealth, undetected_chromedriver alone works in multiprocessing on Linux, but fails bot detection (all requests timeout at verification challenges). The stealth library is necessary for production use.
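A possible workaround worth testing (a sketch, not a confirmed fix): selenium-stealth works by injecting JavaScript overrides through the CDP command `Page.addScriptToEvaluateOnNewDocument`, via Selenium's `execute_cdp_cmd`. Applying a minimal override script directly, instead of through the library, would show whether the library itself is what triggers the Linux crash. The JS below is a simplified stand-in for the library's full payload, not a complete replacement.

```python
# Sketch: apply a stealth-style navigator override directly via CDP,
# bypassing selenium-stealth, to isolate whether the library triggers
# the Linux-only crash. The script is a simplified stand-in for the
# full selenium-stealth payload.
import json

def build_stealth_script(languages, vendor, platform):
    """Build a navigator-override script similar to what stealth injects."""
    return (
        f"Object.defineProperty(navigator, 'languages', "
        f"{{get: () => {json.dumps(languages)}}});\n"
        f"Object.defineProperty(navigator, 'vendor', "
        f"{{get: () => {json.dumps(vendor)}}});\n"
        f"Object.defineProperty(navigator, 'platform', "
        f"{{get: () => {json.dumps(platform)}}});"
    )

def apply_stealth(driver, languages, vendor, platform):
    # execute_cdp_cmd is the same Selenium API selenium-stealth uses
    # under the hood.
    driver.execute_cdp_cmd(
        "Page.addScriptToEvaluateOnNewDocument",
        {"source": build_stealth_script(languages, vendor, platform)},
    )
```

In `worker()`, replacing the `stealth(...)` call with `apply_stealth(driver, ["en-US", "en"], "Apple Computer, Inc.", "MacIntel")` would narrow the bug down to either the CDP injection path or selenium-stealth's additional patches.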


Things Tested (All Failed on Linux)

  1. ✅ Different --user-data-dir per instance
  2. ✅ Different --remote-debugging-port per instance
  3. ✅ Both fork and spawn modes
  4. ✅ 60+ second delays between workers
  5. ✅ Various Chrome flags
  6. ✅ Running as separate processes (not multiprocessing)
  7. ✅ Proxies per worker
  8. ✅ Verified resource limits are very high

Impact

This severely limits scalability for Linux production environments:

  • AWS/GCP/Azure (all primarily Linux)
  • Docker containers (Linux-based)
  • Most web scraping servers (Linux)

Many developers test on macOS (where everything works), then deploy to Linux, where instances start crashing under concurrency.


Request

Could the maintainers:

  1. Investigate why the patching mechanism works differently on Linux vs macOS?
  2. Test the reproduction code on Linux to verify?
  3. Either:
    • Fix the Linux compatibility, or
    • Document the incompatibility clearly

Suggested documentation:

## 🐧 Linux-Specific Issues

**selenium-stealth compatibility:** When using `selenium-stealth` with 
undetected_chromedriver in multiprocessing on Linux, instances may crash.
This works fine on macOS. 

Thank you for this amazing library! Happy to provide additional debugging information or test patches on Linux.

