A recursive web scraper that finds dead links on a website. It supports both parallel and sequential execution.
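The core idea is to fetch each discovered URL, treat it as dead if the request fails or the server answers with a 4xx/5xx status, and recurse into the links on any page that loads. The sketch below illustrates only the dead-link check; the function name and the exact status handling are assumptions for illustration, not this tool's actual code.

```go
package main

import (
	"fmt"
	"net/http"
)

// isDead reports whether a URL looks dead: the request fails outright
// or the server answers with a 4xx/5xx status. (Illustrative sketch,
// not the scraper's real implementation.)
func isDead(url string) bool {
	resp, err := http.Get(url)
	if err != nil {
		return true
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 400
}

func main() {
	fmt.Println(isDead("https://example.com/"))        // likely false
	fmt.Println(isDead("https://example.com/missing")) // true if the server returns 404
}
```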

## To Build

```
go build .
```

## Usage

```
./dead_links_scraper https://<your_link>
```

To start from a relative path, specify `-s`:

```
./dead_links_scraper -s /relative_path https://<your_link>
```

To execute in parallel, specify `-p`:

```
./dead_links_scraper -p https://<your_link>
```

## Caveats

This scraper only follows links found inside HTML `<a>` tags.
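In practice this means only `href` values on `<a>` elements are crawled; URLs referenced elsewhere (for example `<img src>` or `<link href>`) are ignored. The sketch below shows one common way to collect such links with the `golang.org/x/net/html` parser; it is illustrative only, and the names are not taken from this repository.

```go
package main

import (
	"fmt"
	"strings"

	"golang.org/x/net/html"
)

// extractLinks collects the href values of all <a> elements in the parsed
// HTML tree rooted at n. Links carried by other elements are skipped,
// mirroring the caveat above. (Illustrative sketch only.)
func extractLinks(n *html.Node, links []string) []string {
	if n.Type == html.ElementNode && n.Data == "a" {
		for _, attr := range n.Attr {
			if attr.Key == "href" {
				links = append(links, attr.Val)
			}
		}
	}
	for c := n.FirstChild; c != nil; c = c.NextSibling {
		links = extractLinks(c, links)
	}
	return links
}

func main() {
	doc, err := html.Parse(strings.NewReader(`<p><a href="/about">About</a><img src="/logo.png"></p>`))
	if err != nil {
		panic(err)
	}
	fmt.Println(extractLinks(doc, nil)) // prints [/about]; the <img> src is not collected
}
```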