Introduction to Log Parser - Week 39

You can find this week’s video here.

Log Parser is a powerful tool that every web administrator should become familiar with. With Log Parser we can do deep troubleshooting and data mining of the IIS logs, the Event Viewer, the file system, other file types, and more.

Useful examples of parsing the IIS logs include: finding long-running pages, finding all error pages with a 500 status code, finding all requests from a particular IP address (a potential hacking attempt), and much more.
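
For example, here are sketches of those first three queries (these assume the standard IISW3C log format and the default u_ex*.log file naming; adjust the path and IP address for your server):

    logparser "SELECT TOP 25 cs-uri-stem, AVG(time-taken) AS AvgTime FROM u_ex*.log GROUP BY cs-uri-stem ORDER BY AvgTime DESC" -i:IISW3C
    logparser "SELECT date, time, c-ip, cs-uri-stem FROM u_ex*.log WHERE sc-status = 500" -i:IISW3C
    logparser "SELECT date, time, cs-uri-stem FROM u_ex*.log WHERE c-ip = '192.168.1.100'" -i:IISW3C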

This week is an introduction to Log Parser, with walkthroughs covering the Event Viewer, IIS logs, various kinds of filtering, different output formats, how to find your own way around, the Log Parser Lizard GUI, and more. Log Parser is a tool that no web administrator should be without.
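
To give a feel for the range, the same style of query works against the Event Viewer, and the results can be sent to different output formats (a sketch; DATAGRID pops up a grid window, CSV writes a file, and status.csv is just an example output name):

    logparser "SELECT TOP 20 SourceName, COUNT(*) AS Events FROM System GROUP BY SourceName ORDER BY Events DESC" -i:EVT -o:DATAGRID
    logparser "SELECT sc-status, COUNT(*) AS Hits INTO status.csv FROM u_ex*.log GROUP BY sc-status" -i:IISW3C -o:CSV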

This is now the 39th week of the entire series. You can view past and future weeks here: http://dotnetslackers.com/projects/LearnIIS7/

8 Comments

  • Hi Scott,

    Another useful tutorial.

    I would say that I would not run Log Parser on production boxes for long-running queries. You can quickly consume a lot of memory with large log files; I'm used to log files of 1 GB+ per server in a farm.
    So a word of warning there.

    For years I have only been using the Log Parser Lizard GUI - even for the very basic task of saving my queries - some of them get really complex.
    Still on my to-do list is to add some more useful queries to the default ones.

    :) Liked the matrix mode. Not seen that before. :)

  • Hey Rovastar!

    That's good feedback on the memory usage. You're right, we do need to be careful on a production machine. I will often do quick tasks on a production machine because of the time and disk I/O it takes to move large files around, but I should have offered your disclaimer about being careful with the types of queries you run. For example, if there is an ORDER BY clause then Log Parser has to load much of the data into memory.
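
    To illustrate (a sketch, assuming the standard IISW3C fields): the first query below can stream through the log, while the second has to buffer rows in memory to do the sort.

    logparser "SELECT date, time, cs-uri-stem FROM u_ex*.log WHERE time-taken > 10000" -i:IISW3C
    logparser "SELECT cs-uri-stem, COUNT(*) AS Hits FROM u_ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C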

    Good feedback on Lizard GUI. Sounds like a solid tool.

  • Hi Chung Lee,

    It's been a while since I've worked with the situation you described, but I have dealt with it at least once. I believe I went with a CSV file in the end. Log Parser supports generic log files like that.

    If it's a once-only situation, you can just copy the # header lines to the top of the file. Just copy them from an existing IIS log and then tweak the fields accordingly.
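
    For example, the header block at the top of a W3C log looks roughly like this - tweak the #Fields line to match your columns:

    #Software: Microsoft Internet Information Services 7.5
    #Version: 1.0
    #Date: 2012-05-01 00:00:00
    #Fields: date time c-ip cs-method cs-uri-stem sc-status time-taken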

    If you still battle with it, send me the first few lines using the contact link above and I'll help test it out with you.

    I run Log Parser against large log files on a remote system, not locally, and have no perf issues. At least in my experience.

  • Hi Steve,

    I'll go with that. You and Rovastar are right that it's safer to run Log Parser from another machine against the logs over a UNC path, and the performance is still very impressive.
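
    For example, that's just a matter of pointing the FROM clause at a UNC path (the server and share names here are only placeholders):

    logparser "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM \\webserver01\logs\W3SVC1\u_ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C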

  • Chung Lee:

    I was having the same issue with an error being thrown when converting. I resolved it by using Log Parser's conversion mode (-c) as below, in case it helps anyone:

    logparser -c -i:BIN -o:W3C \\inputfilepath \\outputfilepath

  • Any great information on creating a custom input format for Log Parser?

  • Hi Michael,

    There are a couple ways to do this. If the input is a standard text file with consistent columns then you can use one of the inputs like CSV. Here are the possible choices:

    IISW3C, NCSA, IIS, IISODBC, BIN, IISMSID,
    HTTPERR, URLSCAN, CSV, TSV, W3C, XML, EVT, ETW,
    NETMON, REG, ADS, TEXTLINE, TEXTWORD, FS, COM

    You can get detailed help on any of these with "logparser -h -i:" followed by the format name.
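
    For example, the first command below shows the CSV-specific parameters, and the second is a quick sanity check against a file (mycustom.csv is a placeholder name):

    logparser -h -i:CSV
    logparser "SELECT TOP 5 * FROM mycustom.csv" -i:CSV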

    The other way is to write a custom provider. Robert McMurray has a great walkthrough on that: http://bit.ly/JhuylR
