Download all the images from a website using wget
Wget is the "Swiss Army Knife" of command-line tools for downloading content from the web. It is non-interactive, meaning it can run in the background while you focus on other tasks. If you need to scrape every image from a specific website without clicking "Save As" a hundred times, Linux has you covered.
The "Golden" Command
The following command is a reliable way to grab every image from a page and the pages it links to:
wget -nd -r -P /home/user/downloads/images -A jpeg,jpg,bmp,gif,png https://www.example.com
What is happening here?
-nd (no directories): Prevents wget from creating a complex hierarchy of folders on your computer that mirrors the website's structure. It puts all images in one folder.
-r (recursive): Tells wget to follow links within the page (to a default depth of five levels) to find more images.
-P (prefix): Sets the local directory where the files will be saved.
-A (accept): A comma-separated list of file extensions to keep. Other files are skipped, and the HTML pages wget fetches along the way are deleted once their links have been scanned.
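One common wrinkle: many sites serve their images from a separate media subdomain or CDN, which a recursive crawl of the main host alone will never reach. A hedged variant that handles this (the host names below are placeholders for whatever your target site actually uses):

wget -nd -r -l 2 -H -D example.com,img.example.com -P /home/user/downloads/images -A jpeg,jpg,bmp,gif,png https://www.example.com

Here -H lets wget span hosts, -D limits that spanning to the listed domains so the crawl does not wander across the entire web, and -l 2 caps the recursion depth.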
How to Access Help in the Terminal
If you ever forget a switch or want to explore advanced features like authentication or proxy settings, there are two primary ways to get help directly in your Linux terminal:
1. The Short Help (Quick Reference)
Use the --help flag to see a categorized list of all available options.
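For example, to show the full listing, or to filter it down to the flags related to one topic:

wget --help
wget --help | grep -i accept

The second form pipes the output through grep, which is handy because the full listing runs to well over a hundred switches.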
2. The Manual Page (In-depth)
For a comprehensive guide, including examples and technical explanations, use the man command.
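For example:

man wget

This opens the full manual in a pager.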
Tip: While in the manual page, press q to exit and / to search for a specific keyword.
Online Tools
If you're not a Linux guru, or you would rather use a graphical interface to download all the images from a website, you can use an online tool such as https://imagedownloader.website