howto: wget and curl
rev 22 oct 2020

Both download files from the command line - a text file, an image, a video, a raw HTML page.
Both can send data to a website, like filling out a form (can make HTTP POST requests).
Both can be included in bash scripts.

wget
- Download something quickly without needing to worry about flags.
- Meant for quick downloads, and it's excellent at them.
- Simple and straightforward.
- Downloads recursively: everything on a page, an entire website, all of the files in an FTP directory.
- Intelligent defaults. Handles a lot of things a normal browser would, like cookies and redirects, without the need to add any configuration.
- Works out of the box. Single self-contained program; doesn't require any extra libraries.

cURL
- For doing something more complex.
- Like a stripped-down command-line web browser: supports just about every protocol you can think of and can access and interact with nearly all online content.
- Powered by a library: libcurl.
- More protocols:
  - Can access websites over HTTP and HTTPS.
  - Can handle FTP in both directions (upload and download).
  - Supports LDAP and even Samba shares.
  - Can send and retrieve email.
  - Has SSL/TLS support.
  - Supports Internet access via proxies, including SOCKS. That means you can use cURL over Tor.

.........................

central difference:
curl - transfers data from any server over to your computer. Not recursive. More protocols. Multiple platforms (libcurl).
wget - downloads the data as a file. Recursive. HTTP, HTTPS, FTP. Mostly Linux.

* https://www.linuxfordevices.com/tutorials/linux/wget-vs-curl
  summaries of each
* https://www.maketecheasier.com/curl-vs-wget/
  summaries of each, plus how to install, similarities. Differences as a table. Screenshots of output.

_______________________________________________________

begin 22 oct 2020

-- 0 --
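The wget points above (quick download, recursion, resuming) can be sketched as a few commands. The URLs are placeholders, not real endpoints; the flags are standard GNU Wget options.

```shell
# Quick single-file download, no flags needed:
wget https://example.com/file.txt

# Recursive download: follow links 2 levels deep, rewrite them
# for local browsing, and don't climb above the starting directory.
wget --recursive --level=2 --convert-links --no-parent https://example.com/docs/

# Resume a partially downloaded file instead of starting over:
wget --continue https://example.com/big.iso
```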
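A few cURL equivalents of the behaviors above - fetching, saving, POSTing a form, and going through a SOCKS proxy. The URLs and form fields are placeholders; the flags are standard curl options.

```shell
# Fetch a page; curl writes to stdout by default (it "transfers data
# over to your computer" rather than saving a file):
curl https://example.com/

# Save to a file named like the remote file:
curl -O https://example.com/file.txt

# Send data to a website, like filling out a form (HTTP POST):
curl -d "name=alice&message=hello" https://example.com/form

# Follow redirects (-L) and go through a SOCKS5 proxy,
# e.g. Tor listening on localhost:9050:
curl -L --socks5-hostname localhost:9050 https://example.com/
```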
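"Both can be included in bash scripts" - a minimal sketch of what that looks like with curl. The URL and filename are made up for illustration; -f makes curl exit non-zero on HTTP errors so the script can react, -sS silences the progress meter but keeps error messages.

```shell
#!/bin/sh
# Download a file and check whether it worked before using it.
url="https://example.com/data.json"

if curl -fsS -o data.json "$url"; then
    echo "downloaded data.json"
else
    echo "download failed" >&2
    exit 1
fi
```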