One day I noticed that wget(1) has many more options than I expected.
As the example below shows, we can recursively download a target web site's contents.
    # Using Safari as the User-Agent string
    TARGET_URL="http://example.com/"
    USER_AGENT="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/603.1.30 (KHTML, like Gecko) Version/10.1 Safari/603.1.30"
    # -r: recurse into linked pages, -k: convert links for local viewing,
    # -p: fetch page requisites (images, CSS), -c: continue partial downloads
    nohup wget -rkpc --no-clobber --limit-rate 2.5k \
        --wait=0.5 --random-wait \
        --user-agent="${USER_AGENT}" \
        "${TARGET_URL}" &
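Since the command is detached with nohup and backgrounded with &, wget's progress output is appended to nohup.out in the working directory by default. One way to watch it (a minimal sketch, assuming the default log location):

    # Follow wget's progress as it is written to nohup.out
    tail -f nohup.out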
This command was verified with GNU Wget 1.18 built on darwin15.5.0.
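Option availability varies between wget releases, so it may be worth confirming the local build before running the command above:

    # Print the installed wget version string
    wget --version | head -n 1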
For more detail, the online manual is available as a Linux man page.
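On most systems the same manual also ships locally:

    # Read the manual page installed with wget
    man wget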