If you are preparing to do some serious analysis of your web server logs, it makes sense to transfer the logs to a database, so that you can easily extract information by running SQL queries. This article shows how to easily import IIS logs into MS SQL Server.

IIS logs each web server hit as a separate line in a log file. Information is logged in the W3C Extended Log format, but the set of fields that is logged is configurable.

At the top of your log file will be a section that specifies the fields which are being logged:

#Software: Microsoft Internet Information Services 6.0
#Version: 1.0
#Date: 2008-09-11 14:00:00
#Fields: date time s-sitename s-ip cs-method cs-uri-stem cs-uri-query cs-username c-ip cs(User-Agent) cs(Cookie) cs(Referer) sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
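
The #Fields directive makes the format self-describing: each space-separated value in a log line corresponds positionally to a name in the directive. As a quick illustration of that pairing (the sample log line below is invented for demonstration, not taken from a real log), a minimal Python sketch:

```python
# Pair the values of a W3C Extended Log line with the names from the
# "#Fields:" directive. Illustrative sketch only -- the sample data is
# made up, and real logs need more care (e.g. IIS encodes spaces in
# fields like the User-Agent as '+').

fields_line = ("#Fields: date time s-sitename s-ip cs-method cs-uri-stem "
               "cs-uri-query cs-username c-ip cs(User-Agent) cs(Cookie) "
               "cs(Referer) sc-status sc-substatus sc-win32-status "
               "sc-bytes cs-bytes time-taken")

# Field names follow the "#Fields:" prefix, separated by spaces.
field_names = fields_line.split()[1:]

# One hit = one line; values line up positionally with the field names.
log_line = ("2008-09-11 14:00:01 W3SVC1 10.0.0.5 GET /default.htm - - "
            "192.0.2.10 Mozilla/4.0 - - 200 0 0 5120 320 15")

record = dict(zip(field_names, log_line.split()))
print(record["cs-uri-stem"])   # /default.htm
print(record["sc-status"])     # 200
```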

First, let's create a database to store our logs.

-- Create a database called "WebLogs"
IF NOT EXISTS (SELECT name
	   FROM   master..sysdatabases 
	   WHERE  name = N'WebLogs')
	CREATE DATABASE WebLogs;


Now let's create a table with columns that match every field of the log file.

-- Create a table for logs.
USE WebLogs;
CREATE TABLE [JDSWEB01] (
	[date] [date] NULL,
	[time] [time] NULL,
	[s-sitename] [varchar] (32) NULL,
	[s-ip] [varchar] (16) NULL,
	[cs-method] [varchar] (8) NULL,
	[cs-uri-stem] [varchar] (255) NULL,
	[cs-uri-query] [varchar] (2048) NULL,
	[cs-username] [varchar] (16) NULL,
	[c-ip] [varchar] (16) NULL,
	[cs(User-Agent)] [varchar] (1024) NULL,
	[cs(Cookie)] [varchar] (2048) NULL,
	[cs(Referer)] [varchar] (4096) NULL,
	[sc-status] [int] NULL,
	[sc-substatus] [int] NULL,
	[sc-win32-status] [int] NULL,
	[sc-bytes] [bigint] NULL,
	[cs-bytes] [bigint] NULL,
	[time-taken] [int] NULL
);

SQL Server has a neat feature where you can do a bulk import of data from a character-delimited text file (like a CSV file). The fields in the log file are separated with a single space character, so the import query becomes:

-- Import data from web server log into table
USE WebLogs;
BULK INSERT [JDSWEB01] FROM 'c:\temp\ex080912.log'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n');

Now that you have imported all of your log data into SQL Server, you should read my Tech Tip on extracting useful information from it with SQL queries.

As a bonus for reading this far, here are some useful tips:

  • Web server logs quickly become quite large. A popular website will generate gigabytes of logs per day.
  • Fortunately, these log files contain a lot of redundant information, so they compress really well. 5 GB of log files may compress to 200 MB (a 25:1 ratio).
  • Never open a large log file in Notepad. Notepad loads files so inefficiently that it will peg an entire CPU and become unresponsive for a long time while the file opens. Use a good text editor like TextPad instead.
  • The time required for a BULK INSERT varies linearly with the number of lines in the log file. Expect a ballpark figure of 1 minute per gigabyte of log file.
  • Remove the headers at the top of the log file to avoid errors like: Bulk insert data conversion error (type mismatch) for row 1, column 1 (date).
  • The headers appear every time the web server is restarted, or when a new log file is created, so the headers may appear in the middle of a log file too.
  • You may need to tweak the size of the VARCHARS for some fields if you get an error message like: Bulk insert data conversion error (truncation) for row 8394, column 10 (cs(User-Agent)).
  • Don't make your VARCHARs too big, because SQL Server has a row-size limit of 8060 bytes. If the total size of all your columns exceeds this, you will have problems: INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.
  • If you cannot get rid of truncation errors entirely (and still stay within the 8060 byte limit for a row), you can add MAXERRORS=somebignumber to the WITH clause of the BULK INSERT.
  • Double check that your BULK INSERT worked correctly by counting the number of records in the table, and comparing with the number of lines in the log file.
  • Use a datatype bigger than an int for any field that you will SUM() (such as [sc-bytes]) to avoid overflow errors like "Arithmetic overflow error converting expression to data type int."
  • Don't bother using Log2SQL (a commercial software product); it tries to make all fields 255 characters (obvious truncation problems), and it is completely unnecessary given SQL Server's BULK INSERT feature.
  • Queries may take a long time if there is a lot of data in the database. You can speed things up considerably if you remove any log entries for static content (*.gif, *.js etc.). Note that this will make your hits-per-second and sent/received byte figures incorrect, but (for me anyway) the key metric is pages rather than raw hits.
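
Several of the tips above (stripping the '#' header lines before the import, and counting lines to verify the BULK INSERT) can be handled by a small preprocessing script. A minimal Python sketch, with placeholder file paths:

```python
# Strip W3C directive lines (they start with '#') from an IIS log so
# BULK INSERT doesn't hit type-mismatch errors -- remember that headers
# can appear mid-file whenever IIS restarts or rolls over to a new log.

def strip_headers(src_path, dst_path):
    """Copy src to dst, dropping '#' directive lines.

    Returns (kept, removed) line counts; 'kept' should match the row
    count in the table after a successful BULK INSERT.
    """
    kept = removed = 0
    with open(src_path, "r", encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            if line.startswith("#"):
                removed += 1      # directive line: #Software, #Fields, ...
            else:
                dst.write(line)   # real hit record: keep it
                kept += 1
    return kept, removed

# Example usage (paths are placeholders):
# strip_headers(r"c:\temp\ex080912.log", r"c:\temp\ex080912.clean.log")
```

After the import, compare the `kept` count against `SELECT COUNT(*)` on the table to confirm nothing was silently dropped.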

Tech tips from JDS


This solution will probably not fit most users. BULK INSERT doesn't have the ability to filter out rows based on string values; it only lets you start reading from a specific line in the file. But IIS writes headers each time the application starts, not only at the beginning of the log file. You can check my blog for a full solution using Transact-SQL and, more importantly, a full explanation (I think it is complete, but I am always open to feedback, comments, and ideas to improve it):
