How do I Retrieve URLs Using Native Tools at a Windows Command Prompt?

If you want to retrieve a file located at an HTTP URL and you’ve got a *nix variant for an OS, you’ve got some great options, namely wget and curl. If you’re on a Windows machine, you have some rather frustrating and limited options.

Side note: I try to take every platform that I work with at face value and work with its native tools. I do not like using Cygwin on Windows machines. I also know there’s a curl binary for Windows. Nevertheless, I like to look for solutions to problems using packages native to the OS I’m using, and then fan out from there only as absolutely necessary. I find things to be much more stable and portable that way. Moving on…

My Scenario

In my use case, I have a client who uses a hosted wiki from a major online office provider. The only ways that you can back up the wiki to a local file are either to manually log in to a dashboard and click a URL to download the backup file, or to script the retrieval of an HTTP URL to get the file. Of course, being a good-lazy admin, I want to script this.

My Options

There are some really ugly options that I’ll refer to briefly. One of them is to simply script the HTTP communication directly using telnet. Think telnet host 80 followed by GET /path HTTP/1.1 type stuff. Fortunately I’m not a substance abuser, and if I were I wouldn’t be able to afford the quantities of methamphetamines necessary to make that option seem rational.

Another option is to use .NET libraries directly from within PowerShell. For example (with a placeholder URL and path):

(New-Object System.Net.WebClient).DownloadFile("http://example.com/backup.zip", "C:\backups\backup.zip")

You know your Windows scripting efforts are hitting Saw-levels of grotesque when you’re routinely calling .NET namespaces in your scripts.

The saner option is to use tried-and-true BITS, but from within PowerShell itself. First, however, one has to import the BitsTransfer module. Under normal circumstances on a recent version of Windows, it will be an available module, just not imported. To check if it’s available, run the following command:

PS C:\Users\user> Get-Module -ListAvailable

ModuleType Name          ExportedCommands
---------- ----          ----------------
Manifest   BitsTransfer  {}
Manifest   PSDiagnostics {}

If you see the BitsTransfer module, you can import it thusly:

Import-Module bitstransfer
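
Putting the check and the import together, a defensive sketch might look like this (the error message wording is mine):

# Import BitsTransfer only if it's actually available on this machine
if (Get-Module -ListAvailable -Name BitsTransfer) {
    Import-Module BitsTransfer
} else {
    Write-Error "The BitsTransfer module is not available on this system."
}

That way a scheduled script fails loudly instead of erroring out on missing cmdlets later.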

To see the commands available for bits, check out:

PS C:\Users\user> get-command *bits* | Where-Object {$_.commandtype -eq "cmdlet"}
CommandType     Name                                                Definition
-----------     ----                                                ----------
Cmdlet          Add-BitsFile                                        Add-BitsFile [-BitsJob]  [-Source] <S...
Cmdlet          Complete-BitsTransfer                               Complete-BitsTransfer [-BitsJob]  [-V...
Cmdlet          Get-BitsTransfer                                    Get-BitsTransfer [[-Name] ] [-AllUsers...
Cmdlet          Remove-BitsTransfer                                 Remove-BitsTransfer [-BitsJob]  [-Ver...
Cmdlet          Resume-BitsTransfer                                 Resume-BitsTransfer [-BitsJob]  [-Asy...
Cmdlet          Set-BitsTransfer                                    Set-BitsTransfer [-BitsJob]  [-Displa...
Cmdlet          Start-BitsTransfer                                  Start-BitsTransfer [-Source]  [[-Desti...
Cmdlet          Suspend-BitsTransfer                                Suspend-BitsTransfer [-BitsJob]  [-Ve...

The command in question that can do our file-retrieval bidding is Start-BitsTransfer. In my case (with the actual backup URL in $backupUrl), it was as simple as this:

Start-BitsTransfer -Source $backupUrl -Destination C:\backups

I was able to schedule that to run at the right times and then get my automated backup file downloaded each night.
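
The scheduling itself can be done with the built-in Task Scheduler. A minimal sketch using schtasks (the task name, script path, and time are assumptions, not what I actually used):

schtasks /Create /TN "NightlyWikiBackup" /TR "powershell.exe -File C:\scripts\Get-WikiBackup.ps1" /SC DAILY /ST 02:00

Put the Import-Module and Start-BitsTransfer lines in that script and the download runs unattended every night.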

I will not pretend that Start-BitsTransfer or the rest of the BITS commands can suffice for every scenario that wget or curl would eat for lunch. However, at least consider those PowerShell commands before you go about shimming other tools into a Windows install. Have any better methods for retrieving URLs at the command line in Windows? Let me know below.


  1. furicle

    March 27, 2013 at 5:49 am

    I’ve started using BITS to pull DVDs down at our bandwidth-limited remote sites. (yeah, I get important stuff on DVDs still, sigh)

    A tip – you can’t use bits in a powershell remote session, but you can get around that by scheduling it using schtasks.

    So, you set it up using Start-BitsTransfer -Source $sourceURLs -Destination $destination -Asynchronous -Priority Low -Verbose -DisplayName $displayName -Suspended, then schedule it to start after hours using Resume-BitsTransfer -Asynchronous, and then check it the next morning and use Complete-BitsTransfer.

    Nice things – it’ll even throttle itself back if the same server has to download something else more important at the same time, it restarts/retries on errors or even server reboots.
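
    [Ed. note: the workflow Furicle describes can be sketched roughly like this; the source URLs, destination, and display name are placeholders:]

    # Create the job suspended so it doesn't start transferring immediately
    $job = Start-BitsTransfer -Source $sourceUrls -Destination $destination -Asynchronous -Priority Low -DisplayName "NightlyPull" -Suspended

    # After hours (e.g. from a scheduled task), let the job run in the background
    Resume-BitsTransfer -BitsJob $job -Asynchronous

    # The next morning, finalize the job, which writes the files to disk
    if ($job.JobState -eq "Transferred") {
        Complete-BitsTransfer -BitsJob $job
    }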


    • Wesley David

      March 27, 2013 at 8:57 am

      The limit on PS remote sessions seems… unusual, does it not? I can’t figure out the rationale yet. It’s strange, unexpected limits like that that make me less and less inclined to use Windows for my business and clients. =/


      • Furicle

        March 27, 2013 at 9:33 am

        It’s tied into the fact that PowerShell remoting isn’t ssh. It is not a remote shell; it’s more like a web service taking input and firing back output, kinda.


        • Wesley David

          March 27, 2013 at 9:39 am

          Ahhh, yes. I remember now. One of the last PowerShell user groups I went to talked all about that. I remember feeling terribly confused, disappointed, angry and slightly violated.


  2. jscott

    March 27, 2013 at 6:11 am

    Two quick things:

    1. Using .Net in PowerShell isn’t grotesque, it’s how things are done. PowerShell /is/ .Net. Your example is just creating a new object of the class WebClient, that’s completely sane. Now if you’re making static method calls via reflection, that syntax may look odd to you, but it’s fairly common.

    2. If you’ve got WMF 3, look at Invoke-WebRequest.
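
    [Ed. note: for reference, the Invoke-WebRequest approach jscott mentions (WMF 3 / PowerShell 3.0 and later) looks like this, with a placeholder URL and path:]

    Invoke-WebRequest -Uri "http://example.com/backup.zip" -OutFile "C:\backups\backup.zip"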


    • Wesley David

      March 27, 2013 at 8:55 am

      I recall over the years that at many a PowerShell user group, or reading many a PowerShell MVP’s blog, there was constant bragging about being able to access .NET from within PowerShell, and yes, that it really is .NET. That seemed to me to be a bit like saying “You can access C libraries from within bash! In fact, bash is made with C!!” Lawl hoocares?

      That’s a lot of power, and is pretty awesome, but at the end of the scripting day, PowerShell is a shell (okay, PS aficionados and MVPs tend to make the distinction that ‘it’s more of an automation engine than a shell’ but still). A shell that should have a litany of small tools that do many common tasks well. PowerShell is getting there, but when I frequently see encouragement to drop straight into the .NET namespace to perform tasks, I think something still isn’t thoroughly baked. (In fact, the first blog post I found about retrieving a URL was what showed the .NET method.)

      When I see scripts that commonly use .NET, I think that either the PowerShell ecosystem is emaciated, or the “script” maintainer is a true developer that just used PowerShell ISE as his IDE. =) The question that always comes to mind is something like “Do you really want a scripting environment to consistently encourage accessing development libraries / .NET namespaces to perform tasks?”

