Download a webpage for offline reading with wget
While browsing the web, I often come across pages I want to save for later reading. In such situations, I usually bookmark the URL and return to the page in my free time.
But recently I had plenty of free time and no internet access! So I needed to find a way to download a website completely for offline reading.
There are many tools for this, like HTTrack. We can also use the standard wget command.
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://example.org
The above command downloads the example.org webpage and converts all links so they can be accessed offline.
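wget saves the mirror into a directory named after the host, so once the download finishes you can open the local copy directly in a browser. As an example (assuming the front page is index.html; xdg-open is the generic Linux desktop opener, so substitute your own browser command if needed):
xdg-open example.org/index.html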
We can also shorten the above command using the single-letter form of each flag.
wget -mkEpnp http://example.org
An explanation of the flags used in this command is given below.
--mirror – Makes (among other things) the download recursive.
--convert-links – Converts all the links (including links to things like CSS stylesheets) to relative links, so they are suitable for offline viewing.
--adjust-extension – Adds suitable extensions to filenames (html or css) depending on their content type.
--page-requisites – Downloads things like CSS stylesheets and images required to properly display the page offline.
--no-parent – When recursing, does not ascend to the parent directory. This is useful for restricting the download to only a portion of the site.
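If the site is large, it is also worth being polite to the server. As a sketch, the standard --wait and --limit-rate options pause between requests and cap the bandwidth used (the values below are just example settings, not recommendations):
wget -mkEpnp --wait=2 --limit-rate=200k http://example.org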