Audit on SQL Server

The other day one of my clients asked me if there was any way to audit what users did with their data, in particular on some tables of importance. I had to look it up; I knew that there were some audit capabilities built into SQL Server, but I had never had the need for them before. And as I was expecting, SQL Server offers a rather thorough audit engine – you just have to enable and configure it.

I’ll show you what is needed to set up audit on tables and views in a given database, but first a little bit about the audit engine itself.

What is Audit on SQL Server

Audit is an engine that gives you the ability to log the use of and access to data in tables and views, the execution of stored procedures and functions, and much more.

Here is a short list of some of the possibilities for auditing at the database level:

      • SELECT
      • UPDATE
      • INSERT
      • DELETE
      • EXECUTE

Here is a short list of some of the events to audit at the server level (not an exhaustive list):

      • DATABASE_PRINCIPAL_IMPERSONATION_GROUP
      • DATABASE_ROLE_MEMBER_CHANGE_GROUP
      • DBCC_GROUP
      • FAILED_DATABASE_AUTHENTICATION_GROUP
      • FAILED_LOGIN_GROUP
      • FULLTEXT_GROUP
      • LOGIN_CHANGE_PASSWORD_GROUP
      • LOGOUT_GROUP

As you can see there is a vast variety of events to audit, and depending on the level of regulation in a client’s business, we can set up different levels of auditing on various objects.

The audit log can be stored in binary files, the Windows Security log or the Application log.

To enable audit in SQL Server, there are a few things that need to be set up first.

First, a new audit needs to be created at the server level. This can be accomplished either in SSMS or with a T-SQL script. If you go for SSMS, it is done under Security:

1. Right-click on Audits and then click on New Audit.
2. The new audit window opens.
3. It is now time to define how the audit should store the log, the name of the audit, file sizes and the like.
4. Either store the log in a file or in the Security or Application log.
5. If possible, you should reserve the disk space now, to avoid a failure if the disk runs out of space.

If you decide to do it with T-SQL, here is a script that does exactly that.

USE [master]
GO

/****** Object:  Audit [Select Audit AdventureWorks2014] ******/
/****** Create the Server audit for use in the database audit ******/
CREATE SERVER AUDIT [Select Audit AdventureWorks2014]
TO FILE 
(	FILEPATH = N'D:\MSSQL\2014\Audits\'
	,MAXSIZE = 500 MB
	,MAX_ROLLOVER_FILES = 20
	,RESERVE_DISK_SPACE = ON
)
WITH
(	QUEUE_DELAY = 1000
	,ON_FAILURE = CONTINUE
	,AUDIT_GUID = 'd1817028-0edc-49d3-bbad-0312291673c9'
)
ALTER SERVER AUDIT [Select Audit AdventureWorks2014] WITH (STATE = ON)
GO

The above script creates a file log in D:\MSSQL\2014\Audits\ with a maximum file size of 500 MB, a maximum of 20 rollover files, and the disk space reserved for the log.
Next we issue an ALTER statement on the server audit we just created, setting its state to ON – a newly created audit is disabled by default.

Now that the server audit is created and active, it is time for the actual database audit specification. Again we can create it in SSMS or with T-SQL – first SSMS:

1. In the database, expand Security and right-click on Database Audit Specifications.
2. Click on New Database Audit Specification, and the specification window will show.
3. Here you can choose which audit action types to audit – in this case DELETE – then select the object and the object schema, and click OK.
4. That is it, you’re now auditing DELETE on the specified object.

If you prefer T-SQL, here is a script that does the same, just for a few more objects.

USE [AdventureWorks2014]

GO

/****** Create the audit specification for these objects ******/
CREATE DATABASE AUDIT SPECIFICATION [DB Select Audit AdventureWorks2014]
FOR SERVER AUDIT [Select Audit AdventureWorks2014]
ADD (SELECT ON OBJECT::[HumanResources].[Department] BY [HumanResources]),
ADD (SELECT ON OBJECT::[HumanResources].[Employee] BY [HumanResources]),
ADD (SELECT ON OBJECT::[Person].[Person] BY [dbo])
WITH (STATE = ON)
GO

As you can see, we have defined an audit for SELECT on three different objects: Department, Employee and Person. The BY clause specifies which database principal is audited – use BY [public] if you want every user and application covered. Every time an audited principal issues a SELECT on any of these objects, an audit entry is made.
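If you want to verify what a specification actually covers, you can query the audit catalog views. Here is a minimal sketch – the catalog views are standard, only the database name comes from the example above:

USE [AdventureWorks2014]
GO

-- List the actions covered by the database audit specifications in this database.
SELECT	s.name AS specification_name,
	s.is_state_enabled,
	d.audit_action_name,
	OBJECT_SCHEMA_NAME(d.major_id) AS schema_name,
	OBJECT_NAME(d.major_id) AS object_name,
	d.audited_result
FROM sys.database_audit_specifications AS s
JOIN sys.database_audit_specification_details AS d
	ON d.database_specification_id = s.database_specification_id
GO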

If we then view the log (right-click the audit under Security > Audits in SSMS and choose View Audit Logs), we can see the entries.

As you can see, I have issued a SELECT * FROM HumanResources.Department, and you can easily see from which workstation and by which user the data was viewed.
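The log can also be read with T-SQL through sys.fn_get_audit_file. A minimal sketch, assuming the file path defined in the server audit above:

-- Read the audit files written to the path from the server audit.
SELECT	event_time,
	action_id,			-- e.g. SL for SELECT
	session_server_principal_name,	-- who issued the statement
	client_ip,
	object_name,
	statement			-- the audited statement
FROM sys.fn_get_audit_file(N'D:\MSSQL\2014\Audits\*.sqlaudit', DEFAULT, DEFAULT)
ORDER BY event_time DESC
GO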

So it is fairly easy to set up, and even easier to use – so my message to you is this: always set up some sort of audit on at least your sensitive data. Not everything, and not each and every audit action, but the ones that allow you to spot any malicious activity on your data.

Happy Auditing ;o)

Salesforce and integration with SQL Server – part 3

As described in the previous posts, I’ve been working on a project which integrates data from Salesforce (SF) into Navision (NAV), manipulates and creates new data, and then pushes it back to SF again.

To get all this to work together, there is a need for a local database on your SQL Server, and in this case we call it LOCALSF – this is where all the stored procedures from DBAmp reside. It is also the place for the local tables which are used to store data from the SF cloud storage.

The last post was about the creation of insert/update/delete tables for SF data, so in order to e.g. insert data into the table Account, we need a table named Account_Insert in the database called LOCALSF.

After the table has been populated with new values, the stored procedure SF_BulkOps needs to be executed – this procedure manages the data transfer between LOCALSF and SF in the cloud.
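A minimal sketch of populating the load table – the source table and the column list are hypothetical, and the Id and Error columns are left for DBAmp to fill in:

-- Populate the load table with new accounts.
-- NAVStaging.dbo.NewCustomers and the column names are assumptions for illustration.
INSERT INTO dbo.Account_Insert (Name, BillingCity, BillingCountry)
SELECT	c.CompanyName,
	c.City,
	c.Country
FROM NAVStaging.dbo.NewCustomers AS c

With the load table populated, the transfer itself is the SF_BulkOps call: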

exec SF_BulkOps @Operation = 'insert',
		@table_server = 'SFCLOUD',
		@load_table = 'Account_insert'

You need to specify which operation should be executed; in this example we are inserting values into Account from Account_Insert, so the operation is set to ‘insert’.
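The same pattern should apply to the other operations, assuming matching load tables (e.g. Account_Update, Account_Delete) have been created and populated – here sketched for an update:

-- Update existing accounts from Account_Update (the rows must carry the Salesforce Id).
exec SF_BulkOps @Operation = 'update',
		@table_server = 'SFCLOUD',
		@load_table = 'Account_Update'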

After SF_BulkOps has executed, we can have a look at the Error column on the insert table to see if everything went well – or if something needs attention.


In my run there was an attempt to insert 6 rows into Account; one was successful, and the last 5 were not inserted due to invalid email addresses. Someone could keep an eye on the table and resolve the issues manually, or some sort of event could send the failed rows to a supporter.
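That check can also be scripted. A minimal sketch, assuming DBAmp writes a text like 'Operation Successful' into the Error column for rows that went through (verify the exact text against your DBAmp version):

-- Rows that did not make it into Salesforce, with the reason reported by DBAmp.
SELECT	Name,
	Error
FROM dbo.Account_Insert
WHERE Error IS NULL
   OR Error NOT LIKE '%Operation Successful%'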

Campus Days 2014 – Session 1

Today was the day where Regis Baccaro ( T | B ) and I had the pleasure of speaking at Campus Days in Copenhagen. We started the day by making final changes to the slide decks, going through the slides together and making a plan for the session. Then we headed to the Speakers Lounge, a rather fine room in one of the smaller cinemas in the movie theatre where the event took place – had some coffee and a light breakfast.

The session started at 10.45 in room 10, a room that could accommodate around 200 people (unfortunately, we didn’t manage to attract that many) – but still a good number.

I started by talking about what an end-to-end big data solution on AZURE looks like, what is needed to succeed with such a project and how we set all the elements up. A short walkthrough of each of the elements, and then a tale about how they are created using the AZURE portal.

The elements were:

  • AZURE Account
  • Storage Account
  • SQL Server
  • SQL Databases
  • Firewall rules
  • HDInsight Cluster
  • Data
  • Hive Scripts
  • Machine Learning

Then I moved on to talk about how tiresome the creation and deletion of these elements really is – isn’t there a smarter way to achieve this? It turns out there is another way to do the same tasks, and I chose to use PowerShell for the job. I used a script based on one from Adam Jorgensen ( T | B ) and John Welch ( T | B ) that I have extended a bit, so that it also takes care of uploading data to AZURE and creating the tables used in HIVE queries.

The demo went well, if we don’t count the smaller errors that sometimes occur – this time because I by mistake had my PowerShell ISE running in the wrong directory – but I managed.

I think that the demo and the message about automating the entire creation of a big data project on AZURE was well received, but I also got the impression that many people have yet to see the use for an HDInsight solution, as some of the questions asked were:

  • Why would I put my data into HDInsight instead of an ordinary SQL Server?
  • The SQL Server could return an answer faster than my HIVE query, so why?
  • Are there any real-world implementations – we’ve only seen demos

After my demos and talk, Regis talked about another way of automating the process; his take was to have it all done in the ordinary ETL process using SSIS. Microsoft has not yet made any SSIS components that allow you to work with AZURE from SSIS – it would be nice if they did in the future, but I don’t know.

But there is someone else who thought it would be cool to manage your AZURE setup from SSIS: oh22data, a German company, has made a suite of components for SSIS to manage AZURE. Regis made a demo where all the tasks that I did in PowerShell were done in SSIS as part of a regular ETL flow.


If you would like to know more about these components, have a look at their blog, and if you need more information, get in contact with Oliver Engels ( T | L ) and I’m sure he’ll be helpful.

Here’s a link to my slides on slideshare.net, and my scripts can be found here on OneDrive.

Slidedeck and Demo – Campus Days 14

Making the finishing changes to my talk at Campus Days next week in Copenhagen.

The talk will be focused on an end-to-end big data solution running on AZURE, and the various methods that make the implementation easier, faster and smoother. I will start by showing how we can make it all work using the AZURE Portal, and then move on to automation of the process using PowerShell – in the end I have a good friend with me on stage, who will present an altogether new way of managing your big data project on AZURE.

 

So make sure that you’re in place Nov. 26th at Campus Days.

Post Pass Summit 14

This is the week after Pass Summit 2014, and what have I learned, gained and experienced during Pass? Well, for starters I’ve met a bunch of interesting people and sat through a whole lot of sessions – sessions that both inspired me and made me think about the way I implement solutions at my clients.

Automation

One of the things that was most emphasized was the ability to automate and script everyday tasks – in one session the speaker showed us how to automate the most trivial tasks on a SQL Server using PowerShell. This is something that I will take home and start using with my clients, or for starters in our internal IT setup.

SQL Server

For instance, why would you use the GUI in SQL Server Management Studio to make backups and restores of databases, when everything can be automated in about 20 lines of code? And this is only one aspect of the many things that can be automated using PowerShell.
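As an illustration of the commands such a script would wrap, here is a minimal T-SQL sketch of a backup and restore – the paths, database name and logical file names are placeholders (check them with RESTORE FILELISTONLY):

-- Back up a database to disk (path and file name are placeholders).
BACKUP DATABASE [AdventureWorks2014]
TO DISK = N'D:\MSSQL\Backup\AdventureWorks2014.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10

-- Restore it as a copy, moving the files to new locations.
RESTORE DATABASE [AdventureWorks2014_Copy]
FROM DISK = N'D:\MSSQL\Backup\AdventureWorks2014.bak'
WITH MOVE N'AdventureWorks2014_Data' TO N'D:\MSSQL\Data\AdventureWorks2014_Copy.mdf',
     MOVE N'AdventureWorks2014_Log' TO N'D:\MSSQL\Log\AdventureWorks2014_Copy.ldf',
     STATS = 10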

Azure and HDInsight

In another session – well, actually it was a whole-day precon – the speakers walked us through the rather tedious task of creating a Big Data application on AZURE using the GUI. It’s manageable, but wouldn’t it be nice if everything was automated? After all, the whole point of AZURE is the ability to spin up an application / server park when needed.

With that in mind, they then walked us through a PowerShell script that, by using a configuration file, was able to set up and deploy a Big Data application on AZURE with a single hit of a button – F5 for execution.

The script would then automatically set up:

  • A new Storage Account
  • A new SQL server
  • A new SQL Database for metadata storage
  • A new HDInsight Cluster
  • A new Blob Storage
  • Upload data to the storage

All this would take about 10 minutes, during which you as a developer could do something more meaningful.

After the deployment you had a setup with data, and you could now begin the interesting part of the project, namely querying the data in the HDInsight Cluster.

And the beauty of it is that once you’re finished using the Big Data application on AZURE, you can remove it all and deploy it again later when the need arises.

People

As with any conference, one of the most important things to remember is to develop your network, and by this I mean: just do it – talk to people, be open, ask what their work is about and where they’re from, and the conversation will soon flow smoothly. That way you’ll see that many have the same challenges as you, and many may even have solved them – and what better way to meet other people is there than to get a solution, or maybe help a fellow SQL man/woman with a solution to their problem?

So don’t be shy, just throw yourself into the battlefield, and make friends and acquaintances from across the world.

Twitter

Be on Twitter, both during PASS and when back home in your day job, and stay in contact with the many people and “celebrities” you’ve met during PASS. Many times I’ve found myself sitting across from people that I know very well on Twitter but not in real life, only to suddenly realize who they are. And likewise they’ve met me like that. So the value of using Twitter a few times a day is just mind-blowing.

Follow #SQLHelp, as this is the way to get help from the best in the field; if they have the answer and the time, people will often provide you with either a solution or at least a direction towards one.

And after a while, just start helping others. Remember to give back to the community – your experience will be ten times better ;o) and the community will flourish.

SSIS Connectors v3.0 for Oracle and Teradata

This Monday Microsoft announced the release of the new SSIS Connectors v3.0 for Oracle and Teradata by Attunity. This release fixes a few bugs and adds a number of improvements.

You can download the connector here http://www.microsoft.com/en-us/download/details.aspx?id=44582

In regards to Teradata, the release is considered a minor release, and the fixes are listed below.

  • Expose additional TPT Stream attributes (TD_PACK and TD_PACKMAXIMUM) to provide maximum tuning flexibility.
  • Support for loading tables with columns using reserved words.
  • Fix mapping of TIME(0) to the DT_STR SSIS data type.
  • Table names longer than 30 characters are now displayed correctly.
  • Support for block mode, which is now the default mode.
  • Expose TD_SPOOLMODE for TPT Export for faster extracts.
  • Support for Extended Object Names (EON), which allow UNICODE object names in the Data Dictionary tables.
  • New data types (TD_PERIOD_DATE, TD_PERIOD_TIME and TD_NUMBER).


Pass Summit 2014 – Tips & Tricks

One of the greatest things when attending a conference is that you meet a lot of people who instantly understand what you’re talking about. And during talks and sessions, we tend to share tips and tricks.

One of the better tricks I’ve come across was shared today by Robert Cain ( twitter | blog ), and it’s about making your life as a T-SQL developer a lot easier.

You know how it is: sometimes when you have to write a new statement, you right-click the table, script it as a SELECT, and use that to work further on.


Usually you need to join another table onto that one, so you add an alias to the table and then add that alias in front of all the attributes in the SELECT statement.


Go add an alias to the table.


Put your cursor at the last attribute in the SELECT statement.

Click and hold ALT+SHIFT and move your cursor up with the ARROW-UP key.


See there’s a light blue line now?

Now start typing the alias of the table, and as if by magic, the alias you type is written on all of the previously selected lines.


And this can also be used for copy-pasting.
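To make it concrete, here is roughly what a scripted SELECT looks like before and after the trick, using HumanResources.Department from AdventureWorks as the example:

-- Scripted SELECT as generated by SSMS (before).
SELECT [DepartmentID]
      ,[Name]
      ,[GroupName]
      ,[ModifiedDate]
  FROM [HumanResources].[Department]

-- After aliasing the table as d and typing d. on the block selection (after).
SELECT d.[DepartmentID]
      ,d.[Name]
      ,d.[GroupName]
      ,d.[ModifiedDate]
  FROM [HumanResources].[Department] AS d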

I bet this will be the most used nifty little trick – and it works in SSMS 2012 and 2014.

Pass Summit day 1

Sitting here waiting for the session to start, and what better way to use the time than to plan my schedule for today and tomorrow. I have to say it’s really hard to choose the right ones, due to the high quality of the speakers and abstracts.

But I think I made some good choices – I’ll know better tonight. Here are the sessions that I found most interesting.

I’ll write later about what I took with me from the sessions.

[AD-306] Make SQL Server Pop with PowerShell
Speaker : Robert Cain

PowerShell is everywhere, and that includes SQL Server. In this session, we’ll look at using PowerShell with both the SQL Server provider and the SMO (SQL Management Objects) library. You will learn how to handle common DBA tasks, such as checking the status of various services, validating backups, and checking for disk usage. You’ll also see how to perform developer tasks such as creating tables, inserting records, and querying data. Perhaps most importantly, you will learn how to read and understand the SMO object model, and translate it into PowerShell commands. Come see how to make your SQL Server Pop with PowerShell.

[BIA-206] Configuring SharePoint 2013 as a Business Intelligence Platform
Speaker : Edwin Sarmiento

With SharePoint as the business collaboration platform of choice for the enterprise, it has become an important piece in the overall Microsoft BI stack. However, deploying and configuring it to support Microsoft BI solutions has become a cumbersome task for the IT professional. In this session, you will learn what it takes to successfully architect and design SharePoint 2013 as a BI platform. We will cover multiple configuration scenarios and the different technologies necessary to build the infrastructure. At the end of the session, you can walk away feeling confident to run Power Pivot, Power View and Reporting Services in your SharePoint 2013 farm.

[DBA-407-M] Query Store – A New SQL Query Tuning Feature
Speaker : Conor Cunningham

Have you ever come in to work only to have the boss come tell you that your main site is down and the database is “broken”? Fixing query performance problems can be rough, especially in high-pressure situations. Microsoft has developed a feature to help customers gain significantly easier insight into production systems and to be able to quickly fix cases where a new plan choice from the query optimizer has undesired performance consequences. This talk introduces the Query Store, explains the architecture, and shows how it can be used to solve real-world performance problems. It will now be possible to ask questions like “show me what query plans have changed since yesterday” and to quickly ask the optimizer to “go back” to the query plan that was working fine for you previously.

[BIA-307] Continuous Delivery for Data Warehouses and Marts
Speaker : John Welch

Most data warehouses are in a constant state of flux. There are new requirements coming in from the business, updates and improvements to be made to existing data and structures, and new initiatives that drive new data requirements. How do you manage the complexity of keeping up with the changes and delivering new features to your business users in a timely manner, while maintaining high quality? Continuous delivery is an approach for managing this. It focuses on automation of many steps in the process, so that time is spent on adding new functionality, rather than repetitive steps. Attend this session and learn how Continuous Delivery can be applied to your data projects.

Pass Summit 14 – Preconf (in progress)

Attending the preconf session Big Data: Deploy, Design, and Manage Like a Pro, where Buck Woody (web), Adam Jorgensen (web | twitter) and John Welch (web | twitter) are doing their magic with Azure, HDInsight, PowerShell and everything in between.


Great questions from the attendees, and even greater answers.

Some key points from Buck are listed below – I think they’ve always been relevant, but now even more so in regards to Big Data.

  • Always ask the right questions
  • Never select the tech beforehand
  • Always select the TECHNOLOGIES after the questions have been asked and answered
  • Move 1 TB of data to Azure in one go? DON'T DO THAT
  • Send data in a trickle instead – incremental data loads

PowerShell in a nutshell:

  • Scripting language
  • Based on cmdlets
    • Verb-Noun
    • DIR becomes Get-ChildItem
  • Variables always start with a $
    • $Datasource
  • Everything is an OBJECT

John Welch is now talking about how to load data into your Azure storage; for this task we’re loading data from Twitter.com and LinkedIn.com.

John has a tool to download the XML feed from Twitter and LinkedIn; the data needs to be preprocessed one record at a time.


  • Text files need to be in UTF-8 without a BOM
  • Records are delimited by newlines
  • Formats
    • Several formats can be used
    • Delimited text
    • Sequence files (SEQ)
    • RCFile / ORC (Optimized Row Columnar) files

 

More to come

In exactly 1 week I’m at PASS Summit 14


In exactly 1 week from now I’ll be in Seattle for my first ever PASS Summit, and I for one cannot tell you how much I look forward to being there with all the amazing people from the SQL community. As it is my first time there, I’m planning to attend the different “first timer” events, as well as all the other events that are part of this summit.

We’re 3 guys travelling together this time, 2 from Rehfeld and 1 from Mærsk (who used to be at Rehfeld) – and instead of crashing each night in different hotel rooms, we’ve rented a whole apartment via Airbnb.

It seemed the smartest choice, and it’s actually cheaper than hotels. The only downside is that the house rules say “STRICTLY NO PARTIES” – gonna be a tough one to manage ;o)

I’m attending 2 full-day pre-conference workshops, Monday and Tuesday, and I’m really excited to learn from the best – and even pick up some tricks for use in my own presentations at SQLSaturdays around Europe.

I’m in the process of trying to plan my session schedule, but I assure you, it’s a tough task – trying to pick which sessions I can skip in order to go to another.

Well, until Sunday next week – have a nice week, catch me on Twitter, and maybe we’ll see each other in Seattle.