An automated test of a website's caching strategy. These scripts crawl your website, generate a list of URLs to test, then run header tests to determine the HTTP status code of each page and whether your site's pages are being cached properly. The tests cover Varnish and Akamai configuration, and a report is generated for you to analyze.
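For a quick manual sanity check of the kind of thing these scripts automate, you can inspect a single page's headers with curl (placeholder URL; the actual checks performed by the scripts may differ):
curl -sI http://www.example.com/ | grep -iE 'HTTP/|cache-control|age:|x-cache|x-varnish'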
Clone the repo and make sure you have the following pre-reqs:
- wget
- imagemagick
- webkit2png
For Mac:
brew install webkit2png imagemagick wget
For Linux:
apt-get install webkit2png imagemagick wget
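To confirm the prerequisites are on your PATH (convert is the ImageMagick command-line tool), run:
command -v wget convert webkit2png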
- Clone the repo
- Run the URL Spider to generate the list of URLs to test
USAGE:
./urlspider.sh <site-to-crawl> <crawl-time-seconds> <urls-file>
Example 2-minute crawl:
./urlspider.sh http://www.example.com 120 urls.txt
Example 1-hour crawl:
./urlspider.sh http://www.example.com 3600 urls.txt
NOTE:
The longer the crawl time, the more URLs will be crawled and discovered.
Large sites may take a very long time to crawl.
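For reference, URL discovery along these lines can be sketched with wget's spider mode; this is only an illustration with assumed options and placeholder names, not the actual urlspider.sh:
# Rough sketch: spider for ~120 seconds, extract discovered URLs, dedupe, save.
# (timeout is GNU coreutils; on macOS install coreutils and use gtimeout.)
timeout 120 wget --spider --recursive --level=2 --no-verbose http://www.example.com 2>&1 | grep -oE 'https?://[^ ]+' | sort -u > urls.txt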
- Run the Web Test against the URLs generated by the spider
USAGE:
./webtest.sh <file-with-urls>
Example:
./webtest.sh urls.txt
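Conceptually, the per-URL header test can be pictured as a loop like the one below; this is a sketch of the idea only, not the actual webtest.sh logic:
# For each URL, record the HTTP status code and any caching-related headers.
while read -r url; do
  status=$(curl -s -o /dev/null -w '%{http_code}' "$url")
  headers=$(curl -sI "$url" | grep -iE 'cache-control|age:|x-cache|x-varnish')
  echo "$status $url"
  echo "$headers"
done < urls.txt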
- Generate the report
./generateReport.sh
- Example of running everything at once:
./urlspider.sh http://www.example.com 120 urls.txt && ./webtest.sh urls.txt && ./generateReport.sh
Edit the config file in etc/webtest.conf
vim etc/webtest.conf
Additional configuration is done by setting the following variables:
- URLS - the path to the file where URLs will be saved
- OUTDIR - the path where output will be generated
- CUSTOMER - the name for the report
- CUSTOM_COOKIE - custom cookie headers to include in the requests
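Assuming webtest.conf is read as plain shell variable assignments (the exact format is defined by the scripts, not this README), an example with placeholder values might look like:
URLS=urls.txt
OUTDIR=output
CUSTOMER="Example Customer"
CUSTOM_COOKIE="sessionid=abc123"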
Tested on Mac OS X Yosemite 10.10.3