Command-line website scanner to find broken links.

I made a little tool. It's called Linky.

I was inspired by a tool I saw on Twitter called Linkinator, and I thought it would be fun to try to replicate it using C# and .NET Core. It turned out well enough that I made the repo public.

It's really simple to use. All you need is a URL to check.


The page is downloaded and parsed for links. Each link is then checked, and you get a report on which ones don't work.
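Linky itself is written in C#, but the parse step can be sketched in Python with the standard library's `html.parser`. The `LinkExtractor` class and the sample HTML below are illustrative, not taken from Linky's source:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, the way a link checker
    parses a downloaded page before testing each URL."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="https://example.com/">ok</a> <a href="/about">about</a></p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['https://example.com/', '/about']
```

Each extracted URL would then be requested, and anything that comes back with an error status goes in the report.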

If you want to have some fun, and have a bit of time, add the -r flag to have it recursively parse all internal pages and check those links, too. It should avoid checking the same link multiple times.
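The recursive walk with de-duplication might look something like this Python sketch. Here `fetch_links` is a hypothetical stand-in for the download-and-parse step, and the `visited` set is what keeps the same URL from being checked twice:

```python
from urllib.parse import urljoin, urlparse

def crawl(start_url, fetch_links, visited=None):
    """Recursively walk internal pages, skipping URLs already seen."""
    if visited is None:
        visited = set()
    if start_url in visited:
        return visited
    visited.add(start_url)
    host = urlparse(start_url).netloc
    for link in fetch_links(start_url):
        url = urljoin(start_url, link)  # resolve relative links
        # Only recurse into pages on the same host.
        if urlparse(url).netloc == host and url not in visited:
            crawl(url, fetch_links, visited)
    return visited

# Tiny fake site standing in for real HTTP fetches.
site = {
    "https://s/": ["/a", "/b"],
    "https://s/a": ["/", "/b"],
    "https://s/b": [],
}
visited = crawl("https://s/", lambda url: site.get(url, []))
print(sorted(visited))
```

Restricting recursion to the start URL's host is what keeps the crawl from wandering off across the whole web.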

There are lots of improvements I'd like to make to it:

  • Parallelize it. URLs are checked one at a time, which is slow. It should be able to work in batches.
  • Fix issues with 301s and sites that don't seem to like being called from HttpClient. May need some user-agent trickery.
  • Show which page each URL was found on, to help identify where the broken link is.
    • Maybe a tree view?
  • Check for broken images and such, maybe?
  • Refactor it to make it less awful to look at.
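For the first item on that list, the batching idea can be sketched with a thread pool. In .NET the equivalent would be something like `Task.WhenAll` over a bounded set of requests; this Python version uses a fake `check` function as a hypothetical stand-in for the real HTTP request:

```python
from concurrent.futures import ThreadPoolExecutor

def check_all(urls, check, max_workers=8):
    """Run `check` over the URLs concurrently instead of one at a time.
    Returns a mapping of URL -> result of check(url)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(urls, pool.map(check, urls)))

# Fake checker: pretend one URL is broken instead of doing real HTTP.
broken = {"https://s/dead"}
results = check_all(["https://s/", "https://s/dead"], lambda u: u not in broken)
print(results)  # → {'https://s/': True, 'https://s/dead': False}
```

Capping the worker count matters: checking a site's links with unbounded concurrency is a good way to get rate-limited by the very server you're scanning.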

Even if I never get around to making any changes to it, I'm happy with Linky. It was fun to build and I've already fixed a bunch of links on my own site.