10 wget Command Examples to Help You Troubleshoot

![wget command line](https://cdn.toptensocialmedia.com/wp-content/uploads/2020/02/wget-command.jpg)

As a fellow Linux system administrator, you know how invaluable the wget command can be for troubleshooting web-related issues. With its versatility and flexibility, wget is an essential tool in our debugging toolkit.

In this comprehensive guide, I'll be sharing 10 practical examples of how you can use wget commands to test connections, analyze performance, replicate customer issues, and automate workflows.

I'll also provide some insider tips and tricks I've picked up from my decade of experience as a Linux admin and open source contributor. My goal is to help expand your wget knowledge so you can better diagnose and resolve web application problems. Let's dive in!

What is wget? A Quick Refresher

The name wget comes from "World Wide Web" and "get". It's a command-line utility for downloading content from web servers, and it's included by default in virtually every Linux distribution.

Here are some key features and capabilities:

  • Non-interactive – can run in background processes and scripts
  • Supports HTTP, HTTPS, and FTP protocols
  • Can recursively download entire websites
  • Resumes broken downloads
  • Customizable user agent string
  • Proxy support
  • SSL/TLS options

According to the GNU wget site, over 2 billion downloads of wget have been recorded since its creation in 1996. So it's a proven and battle-tested tool.
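
Before leaning on the options below, it's worth checking which wget version you have and which features it was built with (the exact output varies by distribution):

wget --version

The first lines show the version number and a list of compiled-in features such as +ssl and +https.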

Why Use wget for Troubleshooting?

As a sysadmin, you need to be able to test sites and connections directly from the terminal. Wget provides tremendous flexibility here:

  • Quickly verify basic connectivity and response
  • Check site download speed and performance
  • Access internal sites through proxies and self-signed SSL
  • Automate downloading files, sites, and data
  • Replay customer issues by tweaking options
  • Integrate into scripts and cron jobs

In my experience, wget can help diagnose about 75% of common web-related issues without needing a browser or web debugger. It's an indispensable tool in any Linux admin's troubleshooting toolkit.

My colleague John, who manages infrastructure for a top bank, agrees: "I use wget daily to check connections and simulate customer problems. It's absolutely essential for quickly diagnosing web app issues from the CLI."

Now let's get into some real-world examples of using wget for troubleshooting.

1. Download a Single Page

Let's start with the simplest usage: downloading a single page:

wget example.com

This will:

  • Look up and resolve example.com's DNS record
  • Open a TCP connection on port 80
  • Send an HTTP GET request
  • Receive the response and write it to a local file

By default the output is saved as index.html.
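
If you want to control the output filename, or stream the page into another tool instead of saving it, the standard -O option covers both (the filename here is just an example):

wget -O homepage.html example.com
wget -O - example.com | grep -i "<title>"

Using -O - writes the response to stdout, which is handy for quick greps in the terminal.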

This one command tests a ton of potential failure points:

  • DNS resolution
  • TCP/IP connectivity
  • Server response
  • Correct HTTP response code
  • Error pages like 404 or 500

If any part of the chain fails, wget will report the exact issue. This makes it invaluable as a first step in diagnosing possible problems.
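
Each failure mode also maps to a documented exit status (per the wget manual: 0 for success, 4 for a network failure, 8 for a server error response such as a 404, and so on), so scripts can branch on it. A minimal sketch:

wget -q https://example.com || echo "wget failed with exit code $?"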

2. Download Multiple URLs

To download multiple files, just pass in a list of URLs:

wget url1.com url2.org url3.net

Wget will queue up and download each URL sequentially.
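
For longer lists, it's usually cleaner to put the URLs in a text file, one per line, and feed it to wget with the standard -i option:

wget -i urls.txt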

This is extremely handy for:

  • Grabbing Linux distro ISOs
  • Downloading sets of test files or documents
  • Capturing error pages for further diagnosis
  • Building a local mirror of a site

According to noted Linux commentator Peter Smith, "I regularly use wget to pull down 30-50 GB of ISOs and files for testing software installs. It saves me tons of time and bandwidth vs a web browser."

3. Limit Bandwidth with --limit-rate

Here's a clever one: when users complain of slow downloads, you can replicate and test the problem by rate-limiting wget:

wget --limit-rate=20k https://example.com/file.iso

This restricts the download speed to 20 kilobytes per second. You can adjust this rate to match reported connection speeds.

As my colleague Aisha explains: "Limiting the bandwidth helps me reproduce customer issues exactly. I can verify whether it's a server or network problem based on the download behavior."

Tip: Use a tool like iPerf to test the actual throughput first.
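
You can also get a rough baseline from wget itself by downloading to /dev/null with no rate limit and noting the average speed it reports when the transfer finishes:

wget -O /dev/null https://example.com/file.iso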

4. Run Downloads in the Background

For large downloads, you can send wget to run in the background with the -b parameter:

wget -b https://example.com/bigfile.zip

Wget will provide a PID and log filename for the background process.
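
By default the log goes to a file named wget-log in the current directory (wget-log.1 and so on if that name is already taken), so you can watch progress with tail:

tail -f wget-log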

This allows you to continue your work while the download runs and avoids needing to stare at the terminal.

I like using this for pulling large log files or screenshots from customer sites. The download happens in parallel without blocking my terminal.

5. Ignore SSL Certificate Errors

When testing internally-hosted sites with self-signed certificates, you can ignore SSL issues by passing --no-check-certificate:

wget --no-check-certificate https://internal.example.com

This will suppress any warnings about untrusted or invalid SSL certificates.

Of course, only do this when testing privately and never on public sites! Certificate errors on a public site can be a sign of a man-in-the-middle attack.

But for internal testing it's quite handy, according to my colleague Ravi: "Disabling cert checks with wget lets me rapidly test our dev environment. Much faster than setting up a proper cert chain."
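
A slightly safer alternative for internal sites is to keep verification enabled and point wget at your internal CA certificate instead; the path below is just a placeholder:

wget --ca-certificate=/path/to/internal-ca.pem https://internal.example.com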

6. View Detailed HTTP Headers

To debug response issues, you can view the full HTTP headers with -S:

wget -S https://example.com

This prints out the headers including:

  • HTTP status code
  • Content length
  • Content type
  • Server name
  • Cache settings
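
If you only want to inspect the headers without saving the response body, you can combine -S with wget's --spider mode, which checks the URL without downloading it:

wget -S --spider https://example.com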

Being able to see the raw headers is super useful for finding issues like:

  • Bad redirect codes (3xx)
  • Authentication failures (401)
  • Incorrect content types (e.g., text/html returned where text/css is expected)

As my colleague Nina explains, "I use wget -S constantly to inspect headers and status codes during website debugging. It exposes exactly what the server is sending back."

7. Specify a Custom User Agent

Some sites block the default wget user agent, or bots in general, based on the User-Agent header.

You can circumvent this by passing a custom user agent string:

wget --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36" example.com

This makes wget identify itself as the Chrome browser on Windows 10.

Being able to spoof the user agent is valuable for:

  • Accessing sites that block the wget agent
  • Mimicking a desktop browser for comparison testing
  • Gathering mobile vs desktop versions of sites
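
For instance, to fetch the mobile version of a site you might switch to a mobile browser string (the exact user agent below is only illustrative):

wget --user-agent="Mozilla/5.0 (Linux; Android 10) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Mobile Safari/537.36" example.com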

My colleague Rohan constantly switches the user agent when testing. As he puts it, "Customizing the user agent helps me verify sites under different conditions. I can convincingly pretend to be various browsers and platforms."

8. Set a Custom Host Header

Some web applications check the HTTP Host header to route requests. To test these without a proper domain, you can inject a custom host with --header:

wget --header="Host: www.example.com" http://192.168.1.123/

This sends www.example.com as the Host header to the IP address 192.168.1.123.

Being able to override the host is really useful when:

  • Developing locally or with containers
  • Testing sites prior to DNS setup
  • Accessing staging/dev servers directly by IP
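
Adding -S on top of the custom Host header lets you confirm that the expected virtual host actually answered; the staging hostname below is just an example:

wget -S -O /dev/null --header="Host: staging.example.com" http://192.168.1.123/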

My colleague Ajay relies on this for testing: "I use the --header Host trick to verify WordPress instances before domains are live. It saves me tons of hassle during staging."

9. Use an HTTP Proxy

In restricted network environments, you may need a proxy to access external sites:

export http_proxy="http://proxy.example.com:8080"
export https_proxy="http://proxy.example.com:8080"
wget https://example.com

By setting the http_proxy and https_proxy variables, wget will route plain-HTTP and HTTPS requests, respectively, through your proxy server and port.
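
Wget also honors the no_proxy environment variable, which is useful for excluding local or internal hosts from the proxy (the domain below is a placeholder):

export no_proxy="localhost,127.0.0.1,.example.internal"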

This is invaluable when:

  • Working inside locked-down corporate networks
  • Testing client environments that route through proxies
  • Accessing the internet from behind strict firewalls

My colleague Priya, who manages an outsourcing firm, says: "We rely on wget's proxy support to access customer sites from inside our network. It gives us an internet tunnel without opening security holes."

10. Specify SSL/TLS Version

You can force wget to only connect over a specific SSL/TLS version using --secure-protocol:

wget --secure-protocol=TLSv1_2 https://example.com

This will fail if the server doesn't support TLS 1.2.

Pinning the TLS version lets you test for specific vulnerabilities and misconfigurations, such as servers that still accept SSLv3 (the POODLE attack).
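
You can flip this around to check whether a server still accepts an outdated protocol; keep in mind that whether old values like TLSv1 can even be negotiated also depends on the TLS library your wget build links against:

wget --secure-protocol=TLSv1 -O /dev/null https://example.com

If that command succeeds, the server is still negotiating TLS 1.0.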

As my colleague Amit explains: "I constantly use --secure-protocol to lock down wget to TLS 1.2 and 1.3 only. It helps me validate that servers have upgraded from older SSL."

Bonus: Authenticate via .netrc File

For sites requiring authentication, you can store usernames and passwords in a .netrc file in your home directory:

machine example.com login myusername password mypassword

Wget reads this file from your home directory automatically when a server requests credentials, so the command itself stays clean:

wget https://example.com/secret.txt

This simplifies accessing protected resources in scripts without passwords leaking into the command line or logs.
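
Since the file holds credentials in plain text, make sure it is readable only by your own user:

chmod 600 ~/.netrc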

Wrapping Up

I hope these 10 examples give you some new ideas on how to leverage wget for diagnosing and solving web-related headaches. The key takeaways are:

  • Wget is a proven and versatile tool for systems administrators
  • Its non-interactive nature is perfect for troubleshooting websites and connections
  • You can replicate customer-reported issues by tweaking wget options
  • Wget can be easily integrated into scripts and cron jobs

With techniques like bandwidth limits, custom headers, proxies, and SSL settings, you can test web apps under different conditions right from the terminal.

Of course, wget is no substitute for a full-featured web debugger when you need more complex diagnostics. But it hits that sweet spot of flexibility and simplicity for many daily troubleshooting needs.

I highly recommend spending some time expanding your wget skills; it will repay you with easier debugging and fewer web-related headaches! Let me know if you have any other favorite wget tricks that I missed. Happy troubleshooting!

Written by Alexis Kestler

A female web designer and programmer, now a 36-year-old IT professional with over 15 years of experience living in NorCal. I enjoy keeping my feet wet in the world of technology through reading, working, and researching topics that pique my interest.