Audit databases – Using DDL triggers

Posted by Deepak Kumar | Posted in Scripts, SQL DBA, Uncategorized | Posted on 05-06-2010


As a DBA, do you ever wonder what is going on inside your production database? Who is changing the database schema during production hours without informing you? You need not worry: you can set up DDL triggers to easily record all changes in a table, including what was changed, when, and by whom. Below are two sets of commands that let you put those worries aside. You can also build customized email alerts on top of the log table to notify you as soon as an event occurs.

First, create a table to store the audit records; in this example I am using the MSDB database to hold it. Then create a DDL trigger inside the database you want to audit. The trigger captures all DDL events and saves them to the table. Finally, test the process by creating a test procedure, table, etc.

1. Script to create table

USE [msdb] 
GO 
SET ANSI_NULLS ON 
GO 
SET QUOTED_IDENTIFIER ON 
GO 
SET ANSI_PADDING ON 
GO 
CREATE TABLE [dbo].[DatabaseChangeLog]( 
    [LogId] [int] IDENTITY(1,1) NOT NULL, 
    [DatabaseName] [varchar](256) NOT NULL, 
    [EventType] [varchar](50) NOT NULL, 
    [ObjectName] [varchar](256) NOT NULL, 
    [ObjectType] [varchar](25) NOT NULL, 
    [SqlCommand] [varchar](max) NOT NULL, 
    [EventDate] [datetime] NOT NULL CONSTRAINT [DF_DBChangeLog_EventDate]  DEFAULT (getdate()), 
    [LoginName] [varchar](256) NOT NULL 
) ON [PRIMARY] 
GO 
SET ANSI_PADDING OFF 
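
Optionally (my own suggestion, not part of the original scripts), a nonclustered index on [EventDate] and [ObjectName] keeps later searches and scheduled reports fast as the log grows:

-- Optional: speeds up date-range searches and per-object history lookups
CREATE NONCLUSTERED INDEX [IX_DatabaseChangeLog_EventDate]
ON [dbo].[DatabaseChangeLog] ([EventDate], [ObjectName])
GO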
 

 
2. Script to create database trigger

USE [SQLKnowledge] 
GO 
SET ANSI_NULLS ON 
GO 
SET QUOTED_IDENTIFIER ON 
GO 
CREATE trigger [TrgDBChangeLog] 
on database 
for DDL_DATABASE_LEVEL_EVENTS 
as 
set nocount on 
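-- Only write a row when the audit table exists in msdb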
If Object_ID('msdb.dbo.DatabaseChangeLog') IS NOT NULL 
BEGIN 
declare @data xml 
set @data = EVENTDATA() 
insert into msdb.dbo.DatabaseChangeLog(databasename, eventtype, objectname, objecttype, sqlcommand, loginname) 
values( 
@data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)'), 
@data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)'), 
@data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)'), 
@data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)'), 
@data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)'), 
@data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)') 
) 
END 
GO 
SET ANSI_NULLS OFF 
GO 
SET QUOTED_IDENTIFIER OFF 
GO 
ENABLE TRIGGER [TrgDBChangeLog] ON DATABASE 

 


3. Script to test the process and check audit records

-- Test table creation 
use SQLKnowledge 
go 
create table test (fname varchar(10), lname varchar(10)) 
go 
 
-- Test procedure creation (CREATE PROC must be the first statement in a batch) 
create proc test1 
as 
select * from msdb.dbo.DatabaseChangeLog 
go 
 
-- Check audit records in the msdb table 
select * from msdb.dbo.DatabaseChangeLog 
 

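The intro mentioned building customized email alerts on top of the log table. A minimal sketch is below; it assumes Database Mail is already configured, and the mail profile name and recipient address are placeholders you must adjust. Scheduled as a SQL Agent job, it mails any DDL events captured in the last hour:

-- Mail any DDL events from the last hour
-- Assumes Database Mail is configured; profile name and recipient are placeholders
DECLARE @body varchar(max)
SELECT @body = COALESCE(@body + CHAR(13) + CHAR(10), '')
    + CONVERT(varchar(20), EventDate, 120) + ' | '
    + LoginName + ' | ' + EventType + ' | ' + ObjectName
FROM msdb.dbo.DatabaseChangeLog
WHERE EventDate > DATEADD(hour, -1, GETDATE())

IF @body IS NOT NULL   -- send only when something actually changed
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBA',
        @recipients = 'dba@yourcompany.com',
        @subject = 'DDL changes in the last hour',
        @body = @body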

Please leave a comment if you have questions or suggestions.

Comments (3)

Why re-invent the wheel and put additional processing onto a server to obtain information that is already directly available?

Schema changes are recorded in the default trace and there is a standard report available from SSMS to determine any DDL changes per database.

It is also simple to extract the details from the default trace:

declare @logfilename sysname
select @logfilename = path from sys.traces where id = 1

SELECT EventClass,
    case
        when EventClass = 164 then 'Alter'
        when EventClass = 46 then 'Create'
        else 'Drop'
    end as [DDL Action],
    LoginName, SPID, StartTime, ObjectID, ObjectName, DatabaseName
FROM fn_trace_gettable(@logfilename, default)
where EventClass in (164, 46, 47)
and DatabaseID > 4
and ObjectName is not null

There can also be a problem with a subscriber in a replication scenario, as a DDL trigger on the subscriber will cause a snapshot to fail. This is something Microsoft is aware of but has not yet come up with a solution for.

Apologies for the delayed response; I missed your post.

In my opinion, using a table to store DDL changes is more efficient than relying on trace files alone. The reasons I see are:

– Trace files may get deleted over time.
– Trace files do not give you the actual T-SQL that was run.
– With a table, it is very easy to search years-old data to find multiple versions of an SP or the changes made to it (see the sample query after this list). Trace files, by contrast, are recreated with every SQL restart, which makes it hard to combine them all.
– With the audit table, you can easily add alerts or build daily, weekly, or monthly reports of DB schema changes for further analysis.
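
For example, pulling every captured version of one procedure out of the table is a single query (the procedure name below is just a placeholder):

-- Full change history of one object, newest first
SELECT EventDate, LoginName, EventType, SqlCommand
FROM msdb.dbo.DatabaseChangeLog
WHERE ObjectName = 'usp_MyProcedure'
ORDER BY EventDate DESC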

Let me know what you think…

What about data changes?
Can we have something like this for data changes in the DB?
