Pre/beta #835


Merged: 16 commits, Dec 2, 2024
10 changes: 5 additions & 5 deletions .github/workflows/pylint.yml
@@ -9,15 +9,15 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Install the latest version of rye
uses: eifinger/setup-rye@v3
- name: Install uv
uses: astral-sh/setup-uv@v3
- name: Install dependencies
run: rye sync --no-lock
run: uv sync --frozen
- name: Analysing the code with pylint
run: rye run pylint-ci
run: uv run poe pylint-ci
- name: Check Pylint score
run: |
pylint_score=$(rye run pylint-score-ci | grep 'Raw metrics' | awk '{print $4}')
pylint_score=$(uv run poe pylint-score-ci | grep 'Raw metrics' | awk '{print $4}')
if (( $(echo "$pylint_score < 8" | bc -l) )); then
echo "Pylint score is below 8. Blocking commit."
exit 1
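The gate step above extracts the score from the task output and fails the job when it drops below 8. A minimal Python sketch of the same logic follows; the `Raw metrics score:` line format is an assumption for illustration, standing in for whatever `uv run poe pylint-score-ci` actually prints.

```python
# Sketch of the workflow's score gate: turn a pylint score line into a CI exit code.
# The exact score-line format is assumed; the threshold 8 mirrors the workflow.
def gate(score_line: str, threshold: float = 8.0) -> int:
    # Mirrors the awk '{print $4}' field extraction in the workflow step.
    score = float(score_line.split()[3])
    if score < threshold:
        print("Pylint score is below 8. Blocking commit.")
        return 1
    return 0

gate("Raw metrics score: 7.50")  # below threshold: blocks
gate("Raw metrics score: 9.10")  # at or above threshold: passes
```

Using `bc -l` in the shell version serves the same purpose: plain POSIX `[` cannot compare floating-point scores.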
8 changes: 4 additions & 4 deletions .github/workflows/release.yml
@@ -14,8 +14,8 @@ jobs:
run: |
sudo apt update
sudo apt install -y git
- name: Install the latest version of rye
uses: eifinger/setup-rye@v3
- name: Install uv
uses: astral-sh/setup-uv@v3
- name: Install Node Env
uses: actions/setup-node@v4
with:
@@ -27,8 +27,8 @@
persist-credentials: false
- name: Build app
run: |
rye sync --no-lock
rye build
uv sync --frozen
uv build
id: build_cache
if: success()
- name: Cache build
34 changes: 34 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,37 @@
## [1.32.0-beta.4](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.32.0-beta.3...v1.32.0-beta.4) (2024-12-02)


### Features

* add api integration ([8aa9103](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/8aa9103f02af92d9e1a780450daa7bb303afc150))
* add sdk integration ([209b445](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/209b4456fd668d9d124fd5586b32a4be677d4bf8))


### chore

* migrate from rye to uv ([5fe528a](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/5fe528a7e7a3e230d8f68fd83ce5ad6ede5adfef))

## [1.32.0-beta.3](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.32.0-beta.2...v1.32.0-beta.3) (2024-11-26)


### Bug Fixes

* improved links extraction for parse_node, resolves [#822](https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues/822) ([7da7bfe](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7da7bfe338a6ce53c83361a1f6cd9ea2d5bd797f))

## [1.32.0-beta.2](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.32.0-beta.1...v1.32.0-beta.2) (2024-11-25)


### Bug Fixes

* error on fetching the code ([7285ab0](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/7285ab065bba9099ba2751c9d2f21ee13fed0d5f))

## [1.32.0-beta.1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.31.1...v1.32.0-beta.1) (2024-11-24)


### Features

* revert search function ([faf0c01](https://github.com/ScrapeGraphAI/Scrapegraph-ai/commit/faf0c0123b5e2e548cbd1917e9d1df22e1edb1c5))

## [1.31.1](https://github.com/ScrapeGraphAI/Scrapegraph-ai/compare/v1.31.0...v1.31.1) (2024-11-22)


44 changes: 44 additions & 0 deletions examples/scrapegraph-api/smart_scraper_api.py
@@ -0,0 +1,44 @@
"""
Basic example of scraping pipeline using SmartScraper
"""
import os
import json
from dotenv import load_dotenv
from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import prettify_exec_info

load_dotenv()

# ************************************************
# Define the configuration for the graph
# ************************************************


graph_config = {
    "llm": {
        "model": "scrapegraphai/smart-scraper",
        "api_key": os.getenv("SCRAPEGRAPH_API_KEY")
    },
    "verbose": True,
    "headless": False,
}

# ************************************************
# Create the SmartScraperGraph instance and run it
# ************************************************

smart_scraper_graph = SmartScraperGraph(
    prompt="Extract all the articles",
    source="https://www.wired.com",
    config=graph_config
)

result = smart_scraper_graph.run()
print(json.dumps(result, indent=4))

# ************************************************
# Get graph execution info
# ************************************************

graph_exec_info = smart_scraper_graph.get_execution_info()
print(prettify_exec_info(graph_exec_info))
25 changes: 15 additions & 10 deletions pyproject.toml
@@ -3,7 +3,7 @@ name = "scrapegraphai"



version = "1.31.1"
version = "1.32.0b4"



@@ -43,7 +43,8 @@ dependencies = [
"transformers>=4.44.2",
"googlesearch-python>=1.2.5",
"simpleeval>=1.0.0",
"async_timeout>=4.0.3"
"async_timeout>=4.0.3",
"scrapegraph-py>=0.0.4"
]

license = "MIT"
@@ -91,7 +92,7 @@ other-language-models = [
"langchain-anthropic>=0.1.11",
"langchain-huggingface>=0.0.3",
"langchain-nvidia-ai-endpoints>=0.1.6",
"langchain_together>=1.2.9"
"langchain_together>=0.2.0"
]

# Group 2: More Semantic Options
@@ -116,17 +117,21 @@ screenshot_scraper = [
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.rye]
managed = true
[dependency-groups]
dev = [
"burr[start]==0.22.1",
"sphinx==6.0",
"furo==2024.5.6",
]

[tool.uv]
dev-dependencies = [
"poethepoet>=0.31.1",
"pytest==8.0.0",
"pytest-mock==3.14.0",
"-e file:.[burr]",
"-e file:.[docs]",
"pylint>=3.2.5",
]

[tool.rye.scripts]
pylint-local = "pylint scrapegraphai/**/*.py"
[tool.poe.tasks]
pylint-local = "pylint scrapegraphai/**/*.py"
pylint-ci = "pylint --disable=C0114,C0115,C0116 --exit-zero scrapegraphai/**/*.py"
update-requirements = "python 'manual deployment/autorequirements.py'"
9 changes: 7 additions & 2 deletions requirements-dev.lock
@@ -353,7 +353,7 @@ pyasn1==0.6.0
# via rsa
pyasn1-modules==0.4.0
# via google-auth
pydantic==2.8.2
pydantic==2.10.1
# via burr
# via fastapi
# via fastapi-pagination
@@ -368,7 +368,8 @@
# via openai
# via pydantic-settings
# via qdrant-client
pydantic-core==2.20.1
# via scrapegraph-py
pydantic-core==2.27.1
# via pydantic
pydantic-settings==2.5.2
# via langchain-community
@@ -396,6 +397,7 @@ python-dateutil==2.9.0.post0
# via pandas
python-dotenv==1.0.1
# via pydantic-settings
# via scrapegraph-py
# via scrapegraphai
pytz==2024.1
# via pandas
@@ -424,6 +426,7 @@ requests==2.32.3
# via langchain-community
# via langsmith
# via mistral-common
# via scrapegraph-py
# via sphinx
# via streamlit
# via tiktoken
@@ -439,6 +442,8 @@
# via boto3
safetensors==0.4.5
# via transformers
scrapegraph-py==0.0.3
# via scrapegraphai
semchunk==2.2.0
# via scrapegraphai
sentencepiece==0.2.0
9 changes: 7 additions & 2 deletions requirements.lock
@@ -257,7 +257,7 @@ pyasn1==0.6.0
# via rsa
pyasn1-modules==0.4.0
# via google-auth
pydantic==2.8.2
pydantic==2.10.1
# via google-generativeai
# via langchain
# via langchain-aws
@@ -269,7 +269,8 @@
# via openai
# via pydantic-settings
# via qdrant-client
pydantic-core==2.20.1
# via scrapegraph-py
pydantic-core==2.27.1
# via pydantic
pydantic-settings==2.5.2
# via langchain-community
@@ -286,6 +287,7 @@
# via pandas
python-dotenv==1.0.1
# via pydantic-settings
# via scrapegraph-py
# via scrapegraphai
pytz==2024.1
# via pandas
@@ -313,6 +315,7 @@ requests==2.32.3
# via langchain-community
# via langsmith
# via mistral-common
# via scrapegraph-py
# via tiktoken
# via transformers
rpds-py==0.20.0
@@ -324,6 +327,8 @@
# via boto3
safetensors==0.4.5
# via transformers
scrapegraph-py==0.0.3
# via scrapegraphai
semchunk==2.2.0
# via scrapegraphai
sentencepiece==0.2.0
16 changes: 4 additions & 12 deletions scrapegraphai/docloaders/chromium.py
@@ -100,18 +100,11 @@ async def ascrape_undetected_chromedriver(self, url: str) -> str:
async def ascrape_playwright(self, url: str) -> str:
"""
Asynchronously scrape the content of a given URL using Playwright's async API.

Args:
url (str): The URL to scrape.

Returns:
str: The scraped HTML content or an error message if an exception occurs.
"""
from playwright.async_api import async_playwright
from undetected_playwright import Malenia

logger.info(f"Starting scraping with {self.backend}...")
results = ""
attempt = 0

while attempt < self.RETRY_LIMIT:
@@ -127,16 +120,15 @@ async def ascrape_playwright(self, url: str) -> str:
await page.wait_for_load_state(self.load_state)
results = await page.content()
logger.info("Content scraped")
break
return results
except (aiohttp.ClientError, asyncio.TimeoutError, Exception) as e:
attempt += 1
logger.error(f"Attempt {attempt} failed: {e}")
if attempt == self.RETRY_LIMIT:
results = f"Error: Network error after {self.RETRY_LIMIT} attempts - {e}"
raise RuntimeError(f"Failed to fetch {url} after {self.RETRY_LIMIT} attempts: {e}")
finally:
await browser.close()

return results
if 'browser' in locals():
await browser.close()

async def ascrape_with_js_support(self, url: str) -> str:
"""
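The rewritten loop returns the scraped content as soon as an attempt succeeds and raises `RuntimeError` once the retry budget is exhausted, instead of falling through with an error string. A self-contained sketch of that pattern (the `fake_fetch` coroutine is a hypothetical stand-in for the Playwright browser/page calls):

```python
import asyncio

RETRY_LIMIT = 3

async def fetch_with_retry(url: str) -> str:
    """Return content from the first successful attempt; raise after RETRY_LIMIT failures."""
    attempt = 0
    while attempt < RETRY_LIMIT:
        try:
            # Stand-in for launching the browser, navigating, and page.content().
            return await fake_fetch(url)
        except Exception as e:
            attempt += 1
            if attempt == RETRY_LIMIT:
                raise RuntimeError(f"Failed to fetch {url} after {RETRY_LIMIT} attempts: {e}")
    raise RuntimeError("unreachable")  # loop always returns or raises above

_calls = {"n": 0}

async def fake_fetch(url: str) -> str:
    """Fail on the first call, then succeed, to exercise the retry path."""
    _calls["n"] += 1
    if _calls["n"] == 1:
        raise TimeoutError("simulated network error")
    return "<html>ok</html>"

print(asyncio.run(fetch_with_retry("https://example.com")))  # prints <html>ok</html>
```

Raising instead of returning an error string lets callers distinguish real content from failure, which the old `results = f"Error: ..."` path conflated. The `if 'browser' in locals()` guard in the diff keeps the `finally` block from failing when the browser never launched.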
10 changes: 10 additions & 0 deletions scrapegraphai/graphs/smart_scraper_graph.py
@@ -13,6 +13,7 @@
ConditionalNode
)
from ..prompts import REGEN_ADDITIONAL_INFO
from scrapegraph_py import SyncClient

class SmartScraperGraph(AbstractGraph):
"""
@@ -59,6 +60,15 @@ def _create_graph(self) -> BaseGraph:
Returns:
BaseGraph: A graph instance representing the web scraping workflow.
"""
        if self.llm_model == "scrapegraphai/smart-scraper":
            sgai_client = SyncClient(api_key=self.config.get("api_key"))

            response = sgai_client.smartscraper(
                website_url=self.source,
                user_prompt=self.prompt
            )
            return response

fetch_node = FetchNode(
input="url| local_dir",
Expand Down