davidelng/webcrawler


Go Webcrawler

Usage

  • Make sure you have Go installed.
  • Build and run with `go build -o crawler && ./crawler <URL> <maxConcurrency> <maxPages>`

Description

This webcrawler recursively fetches all internal links starting from a given page. It crawls pages concurrently for speed; the `maxConcurrency` argument caps the number of simultaneous fetches, and `maxPages` caps the total number of pages visited.
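The concurrency pattern described above can be sketched in Go as follows. This is a minimal illustration, not the repository's actual code: the type, field, and function names (`crawler`, `visit`, `crawlAll`, etc.) are assumptions, and an in-memory link map stands in for real HTTP fetching and HTML parsing. A buffered channel acts as a semaphore to enforce `maxConcurrency`, and a mutex-guarded map enforces `maxPages` and deduplicates visits.

```go
package main

import (
	"fmt"
	"sync"
)

// crawler bundles the shared crawl state. These names are
// illustrative assumptions, not taken from the repository.
type crawler struct {
	pages    map[string]int
	maxPages int
	mu       sync.Mutex
	sem      chan struct{} // buffered channel used as a semaphore
	wg       sync.WaitGroup
}

// visit records a page and reports whether it should be crawled:
// true only on a first visit while the maxPages budget remains.
func (c *crawler) visit(url string) bool {
	c.mu.Lock()
	defer c.mu.Unlock()
	if len(c.pages) >= c.maxPages {
		return false
	}
	if _, seen := c.pages[url]; seen {
		c.pages[url]++
		return false
	}
	c.pages[url] = 1
	return true
}

// crawl acquires a concurrency slot, "fetches" the page from the
// in-memory link map (standing in for an HTTP GET plus HTML parse),
// and spawns one goroutine per discovered internal link.
func (c *crawler) crawl(url string, links map[string][]string) {
	c.sem <- struct{}{} // acquire a slot
	defer func() {
		<-c.sem // release the slot
		c.wg.Done()
	}()
	if !c.visit(url) {
		return
	}
	for _, next := range links[url] {
		c.wg.Add(1)
		go c.crawl(next, links)
	}
}

// crawlAll runs a bounded-concurrency crawl and returns visit counts.
func crawlAll(start string, links map[string][]string, maxConcurrency, maxPages int) map[string]int {
	c := &crawler{
		pages:    map[string]int{},
		maxPages: maxPages,
		sem:      make(chan struct{}, maxConcurrency),
	}
	c.wg.Add(1)
	go c.crawl(start, links)
	c.wg.Wait()
	return c.pages
}

func main() {
	// A tiny in-memory "site" for demonstration.
	site := map[string][]string{
		"example.com":   {"example.com/a", "example.com/b"},
		"example.com/a": {"example.com"},
		"example.com/b": {"example.com/a"},
	}
	pages := crawlAll("example.com", site, 2, 10)
	fmt.Println("pages crawled:", len(pages)) // prints "pages crawled: 3"
}
```

A key detail of this pattern: each goroutine releases its semaphore slot before its children run, so parents never block while holding a slot and the crawl cannot deadlock even with `maxConcurrency` set to 1.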
