Audit databases – Using DDL triggers

Posted by Deepak Kumar | Posted in Scripts, SQL DBA, Uncategorized | Posted on 05-06-2010


As a DBA, do you ever wonder what is going on inside your production database? Who is changing the database schema during production hours without informing you? You need not worry any longer: you can set up a DDL trigger that records every schema change in a table — what was changed, when, and by whom. Below are two sets of commands that do exactly that. You can also build customized email alerts on top of the audit table to be notified as soon as an event occurs (a sketch appears at the end of this post).

First, create a table to store the audit records; in this example I am using the msdb database to hold it.
Then create a DDL trigger inside the database you want to audit. The trigger captures all database-level DDL events and saves them to the table. Finally, test the process by creating a test procedure, table, etc.

1. Script to create table

USE [msdb] 
GO 
SET ANSI_NULLS ON 
GO 
SET QUOTED_IDENTIFIER ON 
GO 
SET ANSI_PADDING ON 
GO 
CREATE TABLE [dbo].[DatabaseChangeLog]( 
    [LogId] [int] IDENTITY(1,1) NOT NULL, 
    [DatabaseName] [varchar](256) NOT NULL, 
    [EventType] [varchar](50) NOT NULL, 
    [ObjectName] [varchar](256) NOT NULL, 
    [ObjectType] [varchar](25) NOT NULL, 
    [SqlCommand] [varchar](max) NOT NULL, 
    [EventDate] [datetime] NOT NULL CONSTRAINT [DF_DBChangeLog_EventDate]  DEFAULT (getdate()), 
    [LoginName] [varchar](256) NOT NULL 
) ON [PRIMARY] 
GO 
SET ANSI_PADDING OFF 
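
One optional addition, not in the original script: since the trigger writes to this table on every DDL event and alert or report queries will typically filter by date, a small index on EventDate can help. A possible sketch (the index name is my own choice):

CREATE NONCLUSTERED INDEX [IX_DatabaseChangeLog_EventDate] 
ON [dbo].[DatabaseChangeLog] ([EventDate]) 
GO 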
 


 
2. Script to create database trigger

USE [SQLKnowledge] 
GO 
SET ANSI_NULLS ON 
GO 
SET QUOTED_IDENTIFIER ON 
GO 
CREATE TRIGGER [TrgDBChangeLog] 
ON DATABASE 
FOR DDL_DATABASE_LEVEL_EVENTS 
AS 
SET NOCOUNT ON 
IF OBJECT_ID('msdb.dbo.DatabaseChangeLog') IS NOT NULL 
BEGIN 
    DECLARE @data xml 
    SET @data = EVENTDATA() 
    INSERT INTO msdb.dbo.DatabaseChangeLog (DatabaseName, EventType, ObjectName, ObjectType, SqlCommand, LoginName) 
    VALUES ( 
        @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)'), 
        @data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)'), 
        @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)'), 
        @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)'), 
        @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)'), 
        @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)') 
    ) 
END 
GO 
SET ANSI_NULLS OFF 
GO 
SET QUOTED_IDENTIFIER OFF 
GO 
ENABLE TRIGGER [TrgDBChangeLog] ON DATABASE 
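
A practical note, not covered in the original post: during planned deployments you may prefer to pause auditing instead of dropping the trigger. Standard DDL trigger syntax supports this:

-- Pause auditing before a planned deployment 
DISABLE TRIGGER [TrgDBChangeLog] ON DATABASE 
GO 
-- ... run the planned schema changes ... 
-- Resume auditing afterwards 
ENABLE TRIGGER [TrgDBChangeLog] ON DATABASE 
GO 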

 


3. Script to test the process and check audit records

-- Test table creation 
use SQLKnowledge 
GO 
create table test (fname varchar(10), lname varchar(10)) 
GO 
 
-- Test procedure creation (CREATE PROC must start its own batch, hence the GO above) 
create proc test1 
as 
select * from msdb.dbo.DatabaseChangeLog 
GO 
 
-- Check the audit records captured in msdb 
select * from msdb.dbo.DatabaseChangeLog 

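As mentioned in the introduction, you can also send an email as soon as an event occurs. The snippet below is a sketch of what could be appended inside TrgDBChangeLog, right after the INSERT (it reuses the trigger's @data variable). It assumes Database Mail is configured; the profile name 'DBAMail' and the recipient address are placeholders you must replace.

-- Sketch: append inside TrgDBChangeLog after the INSERT (reuses @data). 
-- Assumes Database Mail is set up; profile/recipient below are placeholders. 
DECLARE @subject varchar(256), @body varchar(max) 
SET @subject = 'DDL change in ' + @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)') 
SET @body = @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)') 
    + ' ran: ' + @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)') 
EXEC msdb.dbo.sp_send_dbmail 
    @profile_name = 'DBAMail', 
    @recipients = 'dba-team@example.com', 
    @subject = @subject, 
    @body = @body 

Because sp_send_dbmail queues the message asynchronously via Service Broker, it returns quickly and will not hold up the DDL statement.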

Please leave a comment if you have questions or suggestions.

Database Logshipping issues & Considerations!

Posted by Deepak Kumar | Posted in SQL DBA, Uncategorized | Posted on 04-06-2010


While working with SQL Server log shipping, a DBA must consider the following points to keep the setup running smoothly over a long period of time. The log shipping architecture looks like this:

[Figure: log shipping architecture]

  • The transaction log backup, copy, and restore jobs must run constantly and stay in sync. If they fall out of sync due to a network, server, or other issue, the log shipping setup breaks and you may have to configure it again or restore a full backup once more. A DBA should configure alerts to fire as soon as a backup, copy, or restore job fails, and act immediately to resolve the failure (see the latency-check sketch after this list).
  • It is highly recommended to use a near-identical disk layout (sizes, drive letters, paths) on both servers. If the database log file grows to 100GB on the primary server, the secondary server must have 100GB of capacity as well. Similarly, if you add new data/log files to the primary database, the same locations should exist on the secondary server, or you may need to restore the transaction log backup manually on the secondary with the MOVE option.
  • The primary database can produce huge transaction log backups during ALTER INDEX / DBCC DBREINDEX maintenance tasks, which can be hard to move to the secondary server(s). To mitigate the delay and risk, you may choose third-party products with log shipping features, such as Red Gate or Idera, which compress the backups.
  • Use a customized script to transfer SQL Server logins to the secondary server(s); Microsoft's sp_help_revlogin script (KB 918992) is the usual starting point (see the login-transfer sketch after this list).
  • Log shipping failover is not fully automatic; manual DBA intervention is required to recover databases during a failure.
  • Log shipping is a good option for SQL Server 2000; if you are using SQL Server 2005 or 2008, consider database mirroring instead, which improves on log shipping by providing a hot standby.
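
Here is the latency-check sketch referenced in the first bullet above. It reads the standard log shipping monitor table in msdb on the secondary server; the 15-minute threshold is my own assumption and should match your restore threshold. Run it from a SQL Agent job step so a failure triggers the job's notification.

-- Sketch: run on the secondary; fails the job step if restores fall behind. 
-- The 15-minute threshold is an assumption; tune it to your environment. 
SELECT secondary_database, last_copied_date, last_restored_date, 
       DATEDIFF(minute, last_restored_date, GETDATE()) AS minutes_behind 
FROM msdb.dbo.log_shipping_monitor_secondary 
 
IF EXISTS (SELECT 1 FROM msdb.dbo.log_shipping_monitor_secondary 
           WHERE DATEDIFF(minute, last_restored_date, GETDATE()) > 15) 
    RAISERROR('Log shipping restore is more than 15 minutes behind.', 16, 1) 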
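
And the login-transfer sketch referenced above. The complete solution is Microsoft's sp_help_revlogin (KB 918992), which also carries over password hashes; the minimal idea is shown below: recreate each SQL login on the secondary with its original SID, so database users are not orphaned after a failover. The login name, password, and SID here are placeholders.

-- On the primary: list SQL logins with their SIDs. 
SELECT name, sid FROM sys.server_principals 
WHERE type = 'S' AND name NOT LIKE '##%' 
 
-- On the secondary: recreate the login with the SAME SID (values are placeholders). 
CREATE LOGIN [app_user] WITH PASSWORD = 'ChangeMe!1', 
    SID = 0xA1B2C3D4E5F60718293A4B5C6D7E8F90 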
© 2010 SQLKnowledge.com