NuGet PowerShell Downloader Update - Adding Failed Download Retries, Better Paging Support
I previously posted a NuGet PowerShell downloader script, which is handy for building a local NuGet package repository. There are several common uses:
- It's used in corporate environments where network policies prevent developers from accessing NuGet.org
- It's useful in cases where development teams want to build a customized feed with specific packages
- It's a great backup for presentations involving NuGet, especially on overloaded conference wi-fi
Note: Now, look, I'm happy if the first two help you out, but that's not what I wrote it for. I wrote this because I was tired of watching speakers at conferences have problems installing NuGet packages and complaining about the slow conference wi-fi. Of course it's slow, it's always slow (it's not lupus, it's never lupus!). If you're doing a presentation that involves you installing a NuGet package, part of your prep needs to be verifying that you have the required packages on your machine.
Matthew Podwysocki recently told me he was getting a call depth overflow error with the script. We looked at it, and the problem turned out to be that he was downloading significantly more packages than had been available when I first wrote the script. I'd originally tested against several hundred packages, but there are over 16,000 available now (3,900 unique). I updated the script to fix that, and while I was at it I added support for retrying failed package downloads. I'll describe the retry mechanism here and save the paging bit for the next post, since that's a bit more in-depth.
Handling Download Retries in PowerShell
Sometimes, downloads fail for any number of reasons. When you're downloading hundreds of files, chances of a failed download go up a bit, and I'd gotten a few requests to handle that better. The dumb-but-working workaround was to just run the script again, since it only downloads packages you don't have locally, but that's not elegant - plus, you might not notice that some of them had failed.
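The re-run workaround works because the script skips anything that's already on disk - essentially just a file-existence check before each download. A minimal sketch of that idea (variable names here are assumed for illustration, not lifted from the script):

    # Sketch only - $saveFileName and $url are assumed names for this example.
    # Skip the download if we already have the package on disk.
    if (-not (Test-Path $saveFileName)) {
        $webClient.DownloadFile($url, $saveFileName)
    }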
A slightly less dumb workaround is just to retry the download a few times, with a limit to prevent an infinite loop if the file just can't be downloaded (e.g. bad link or missing file). This is pretty easy to implement with a try/catch block inside a do/while loop, like this:
    [int]$trials = 0
    do {
        try {
            $trials += 1
            $webClient.DownloadFile($url, $saveFileName)
            break
        }
        catch [System.Net.WebException] {
            write-host "Problem downloading $url `tTrial $trials `n`tException: " $_.Exception.Message
        }
    } while ($trials -lt 3)
The general idea:
- Start a counter ($trials)
- Increment the counter
- Try to download the file
- If the download succeeds, break out of the retry loop - we're done here
- If we got a WebException, write out a message
- If the counter's less than 3, try again. Otherwise, give up.
I was conflicted on the exception handling, but decided that I really only want to retry if I know that the failure was due to a WebException. If there were an unanticipated DownloadHasSetTheBuildingOnFireException or a BotNetUnleashedSecurityException on the first try, I'd rather not blindly repeat it. Thoughts?
PowerShell 1.0? Forget about that try/catch part.
Try/Catch requires PowerShell 2.0 or later. If you're on PowerShell 1.0, you've got three options:
- Just use a trap and don't do retries (there's a quick sketch of this after the list)
- Rewrite this to use logic inside the trap block
- Use one of the clever solutions out there that simulate try/catch in PowerShell 1.0
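For the first option, here's a rough sketch - this isn't the solution we ended up with, just an illustration, and $url and $saveFileName are assumed to come from the surrounding download loop:

    # Sketch of option 1: trap the WebException and move on - no retries.
    # $url and $saveFileName are assumed from the surrounding download loop.
    trap [System.Net.WebException] {
        write-host "Problem downloading $url `n`tException: " $_.Exception.Message
        continue  # swallow the error and resume with the next statement
    }
    $webClient.DownloadFile($url, $saveFileName)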
If you are on PowerShell 1.0, as is the hapless Matt Podwysocki, you can see what we came up with here.
Converted to a gist
After making a couple of updates to this script and listing it in more than one place, it became obvious that this is exactly what a gist is for. If you haven't seen them, a gist is a single snippet of code posted on GitHub. It can be embedded in a blog post, but more importantly, other users can comment on it and fork it, and I can update it. I've updated the original post to reference this gist as well, so if people stumble across the older blog post they'll automatically get the latest, greatest version of this script.
Script and Usage Instructions
I wrote up a walkthrough on the original post; usage hasn't changed with this update.
The paging part is interesting in its own right. We'll talk about that in the next post.