Download all .jpg files from a web page
wget -r -A .jpg http://site.with.images/url/
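The one-liner above can be tried end-to-end against a throwaway local site. Everything in this sketch (the sample files, port 8731, python3 as the test server, and the extra -nd/-np flags) is an assumption for the demo, not part of the original command:

```shell
# Hedged demo: the same recursive .jpg filter, run against a tiny local
# site (python3, wget, and a free port 8731 are assumed available).
mkdir -p site out
printf '<a href="a.jpg">a</a> <a href="b.txt">b</a>' > site/index.html
printf 'jpgdata' > site/a.jpg
printf 'txtdata' > site/b.txt
python3 -m http.server 8731 --directory site >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
# -r recurse, -nd flatten directories, -np never ascend, -A keep only .jpg
wget -q -r -nd -np -A .jpg -P out http://127.0.0.1:8731/
kill "$SERVER_PID"
ls out   # only a.jpg should survive the -A filter
```

wget fetches index.html to discover links, then deletes it because it fails the -A filter; b.txt is never downloaded at all.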
Gather all links on the page
First gather the needed links in the browser console ($$ is the DevTools shorthand for document.querySelectorAll):
$$('a.box').forEach(a => console.log(a.href));
or, in the case of a podcast RSS feed:
$$('[href$=".mp3"]').forEach(a => console.log(a.href));
then copy the output into a .txt file (remember to strip any quotes
the console adds around the strings). After that, wget can
download them all in one go.
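When opening a browser is overkill, a rough sketch of the same extraction can run over a saved copy of the page. The sample HTML and the regex below are assumptions, and grep is no substitute for a real HTML parser:

```shell
# Extract .mp3 hrefs from a saved page with grep/sed (crude regex,
# assumes double-quoted href attributes).
printf '<a href="http://x/ep1.mp3">1</a><a href="http://x/notes.html">n</a>' > page.html
grep -oE 'href="[^"]*\.mp3"' page.html | sed 's/^href="//; s/"$//' > links.txt
cat links.txt   # -> http://x/ep1.mp3
```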
Download all links from a file
wget -i links.txt
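A pasted link list often needs a quick cleanup before wget sees it. This sketch strips stray quotes, drops blank lines, and deduplicates; links-raw.txt is an assumed name for the pasted file:

```shell
# Sample pasted list with quotes, a blank line, and a duplicate
# (assumed input for the demo).
printf '"http://x/a.mp3"\n\n"http://x/a.mp3"\n"http://x/b.mp3"\n' > links-raw.txt
tr -d '"' < links-raw.txt | grep -v '^$' | sort -u > links.txt
cat links.txt
# wget -i links.txt -P downloads   # -P sets the target directory
```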