Power BI for Free – Yes, you heard right

A few days ago Microsoft announced that the basic Power BI suite, currently available with an Office 365 subscription, will be free to use in the future.

So, let's see what this will do for all the customers currently using Power BI and the ones still considering jumping on the bandwagon.

Power BI will come in at least two different flavours: a free version for small to mid-sized companies, and a paid service offering more corporate/enterprise features. The good thing is that users already using Power BI will start to save money, not only through the knowledge in their data, but right to the bone, as the service currently costs up to $33 per user/month plus an Office 365 Apps subscription.

That means that users can spend their money on other stuff and still get the knowledge in the data – which may lead to even more savings (which is what BI is all about).

The new Power BI service is able to extract data from a bunch of known sources, such as:

  • Microsoft products
  • Salesforce
  • GitHub
  • Marketo
  • Facebook
  • OData
  • Azure

And to top it all, Microsoft just released a new Power BI app for iPad, currently available in preview from the App Store, as well as a web-based edition. Later this year a version for iPhone will arrive, and I would not be surprised to see Android and Windows Phone versions as well.


One could find it strange that the first version to hit the market is the iPad one, but I think it has to do with the potential number of adopters in the market. Let's face it, the dominant mobile platform at the moment is iOS, so why not bet on that.

And what's even more interesting is the news about Microsoft acquiring the company Revolution Analytics, a move to further strengthen Microsoft's position in the big data analytics segment. This is also a great nod to the open-source language R; Microsoft is now embedding it into their products, and I really hope to see the effects in their future products.

So head over to www.powerbi.com and start getting to know it.

Pricing and feature plans:

Plans available at general availability                               Power BI (Free)   Power BI ($9.95 per user/month)
Data capacity limit                                                   1 GB/user         10 GB/user
Create, view and share personal dashboards and reports                x                 x
Power BI Designer for content authoring                               x                 x
Explore data with natural language*                                   x                 x
Power BI mobile apps for iOS and Windows                              x                 x
Content packs for services like Dynamics, Salesforce, and SendGrid    x                 x
Import data and reports from Excel files                              x                 x
Data refresh:
Scheduled data refresh                                                Daily             Hourly
Support for streaming data                                            10K rows/hour     1M rows/hour
Live interactive connectivity to data sources                                           x
Data Management Gateway                                                                 x
Share refreshable team dashboards and reports                                           x
Create and publish customized content packs                                            x
Use of Active Directory groups for sharing & managing access control                    x
Shared data queries through the Data Catalog                                            x

PASS SQLRally Nordic is back – bigger than ever

What, haven't you heard? SQLRally Nordic is back, and this time the event is being held in Copenhagen – more specifically at the Tivoli Congress Center from March 2-4.

Smack in the middle of Copenhagen's central district, a mere 5-minute walk from the central station – meaning that attendees travelling from abroad will have an easy time ;o)

The sessions and precon sessions look just amazing, so do yourself a favor – check out the list of speakers and sessions and sign up for a precon, as you will get great value for money at this event.

If you register before January 14th, you get an early bird discount of €100 off the regular price.

Perhaps we'll see each other at SQLRally – I am certainly going to this event, and you should not miss out either.

Speaking at SQL Saturday Vienna #374


What a great surprise to see in my Twitter feed this afternoon: lots of retweets indicating that I had been selected to speak at the next SQLSaturday in Vienna. I submitted 3 sessions, and the one that I actually think is the best got chosen. The session will be about how to automate trivial tasks in an Azure BI solution by using PowerShell.

Have a look at the great schedule the team in Vienna has put together; there are so many sessions that I will attend, and so should you ;o)


This will be my second talk at a SQL Saturday, and I look very much forward to meeting all the great people from the SQL Family, and also to seeing Vienna for the first time ever.

So this year seems to be getting started on the right track: lots of interesting clients at work, many topics to blog about, and the first event where I will speak is already in the books.

Looks promising, and we’re just ending day #6 of 2015.

SQL Server 2014 CU5 Master Data Service Excel Addin

This update was released yesterday, and it is a highly anticipated one, as it introduces a factor-4 performance improvement in the Excel Add-in.

This performance upgrade will come in very handy when working on larger datasets, and should speed up the user experience.

It is an update to the client tool, which the user installs on their local laptop, and not an update to be installed on the server.

It is possible to add an additional 2x–4x performance improvement, but this has to be done by an administrator on the server that hosts MDS, by turning on server-side dynamic content compression.

This is done by the following procedure.

  1. Open Control Panel
  2. Click Programs
  3. Click Turn Windows features on or off
  4. Expand Internet Information Services
  5. Expand World Wide Web Services
  6. Expand Performance Features
  7. Check Dynamic Content Compression and click OK

Doing this will result in the additional 2x–4x performance increase.
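If you would rather script the change than click through the dialog, the same feature can be enabled from an elevated command prompt. A minimal sketch, assuming the standard IIS feature and configuration section names on Windows Server:

```bat
:: Enable the dynamic compression feature of IIS (feature name per standard Windows naming)
dism /online /enable-feature /featurename:IIS-HttpCompressionDynamic

:: Turn dynamic compression on at server level
%windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/urlCompression /doDynamicCompression:"True" /commit:apphost
```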

You can download the update here

And remember to download the correct one, as there are two different ones: one for 32-bit and another for 64-bit. The MSI is backwards compatible with SQL Server 2012.

SQL Server 2014 CU5 ready

Yihaaa, another Cumulative Update for SQL Server 2014 is ready. Well, actually it has been ready for a while; I've just found it now and thought I would share it.

Here are some of the more interesting fixes.

  • FIX: "You do not have permission…" error when you run a report that connects to a .BISM file in PowerPivot gallery (Analysis Services)
  • FIX: Error when you upgrade a SQL Server 2012 SP2 instance with MDS to SQL Server 2014 (DQS)
  • FIX: Insufficient memory error when you run an XML Validation Control Flow task on a large input file in SSIS 2012 or SSIS 2014 (SQL service)
  • FIX: Exception when you select more than one hierarchy from a parent/child dimension in SSAS 2012 or SSAS 2014 (Analysis Services)

For a comprehensive list of the fixes in this update, have a look at this page, where you can also download the update.

SQLHangout #27 – Masterdata Management

Back in the early spring of 2014 I was speaking at SQLSaturday Copenhagen; I gave a talk about Master Data Management and automation of some of the processes in this area. While there, I ran into Boris Hristov ( t | b ) and we had a great talk about SQL Server and SQLSaturdays, and generally had a good time. Over the year we've kept in contact, and a week or so before Christmas we talked about doing a #SQLHangout when I arrived home after the Christmas holiday.

So yesterday we did just that: we talked about Master Data Management and how it is used in SQL Server, the differences between the 2008 R2 and 2012 editions, what could be wished for in future editions, and a whole lot more.


Hopefully I'll be able to join another in 2015. Until then, take a look at this one, and do give yourself the treat of watching all the other #SQLHangouts with some of the most amazing people from the #SQLFamily.

And now, all that’s left to say is – Happy New Year ;o)

Audit on SQL Server

The other day one of my clients asked me if there was any way to audit what users did with their data, and in particular with some tables of importance. I had to look it up; I knew that there were some audit capabilities built into SQL Server, but I had never had the need for them before. And as I expected, SQL Server offers a rather thorough audit engine – you just have to enable and configure it.

I'll show you what is needed to set up audit on tables and views in a given database, but first a little bit about the audit engine itself.

What is Audit on SQL Server?

Audit is an engine that gives you the ability to log the use of and access to data in tables and views, the execution of stored procedures and functions, and much more.

Here is a short list of some of the possibilities for auditing on database level:

      • SELECT
      • UPDATE
      • INSERT
      • DELETE
      • EXECUTE

Here is a short list of some of the events to audit at server level (not an exhaustive list):

      • DBCC_GROUP

As you can see there is a vast variety of events to audit, and depending on the level of regulation in a client's business, we can set up different levels of auditing on various objects.
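If you want to browse the complete catalogue of auditable actions on your own instance, SQL Server exposes it through a management view:

```sql
-- List every audit action and the scope (server/database/object) it applies to
SELECT name, class_desc, configuration_level
FROM sys.dm_audit_actions
ORDER BY class_desc, name;
```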

It gives you the option to store the log in binary files, the Windows Security log or the Application log.

To enable Audit in SQL Server, there are a few things that need to be set up first.

First, a new audit needs to be created at server level. This can be accomplished either in SSMS or by a T-SQL script; if you go for SSMS, it is done under Security.

1. Right-click on Audits and then click on New Audit.
2. You should now see the window below.
3. It is now time to define how the audit should store the log: the name of the audit, file sizes and the like.
4. Either store the log in a file or in the Security or Application log.
5. If possible, you should reserve the disk space now, to avoid a failure if the disk runs out of space.

If you decide to do it with T-SQL, here is a script that does exactly that.

USE [master]
GO

/****** Object:  Audit [Select Audit AdventureWorks2014] ******/
/****** Create the server audit for use in the database audit ******/
CREATE SERVER AUDIT [Select Audit AdventureWorks2014]
TO FILE
(	FILEPATH = N'D:\MSSQL\2014\Audits\'
	,MAXSIZE = 500 MB
	,MAX_ROLLOVER_FILES = 20
	,RESERVE_DISK_SPACE = ON
)
WITH
(	QUEUE_DELAY = 1000
	,AUDIT_GUID = 'd1817028-0edc-49d3-bbad-0312291673c9'
)
GO

ALTER SERVER AUDIT [Select Audit AdventureWorks2014] WITH (STATE = ON)
GO

The above script will create a file log in D:\MSSQL\2014\Audits\ with a max file size of 500 MB, 20 rollover files, and the disk space reserved for the log.
Next we issue an ALTER statement on the server audit just created, to enable it.

Now that the server audit is created and active, it is time for the actual database audit specification. Again we can create it in SSMS or with T-SQL – first SSMS:

1. Under the database's Security node, right-click Database Audit Specifications and click New Database Audit Specification, and the following window will show.
2. Here you can choose which audit action types to audit (here it is DELETE), then select the object, object schema and click OK.
3. That is it, you're now auditing for DELETE on the specified object.

If you would rather use T-SQL, here is a script that does the same, just for a few more objects.

USE [AdventureWorks2014]
GO

/****** Create the audit specification for these objects ******/
/****** (the specification name below is my choice - pick your own) ******/
CREATE DATABASE AUDIT SPECIFICATION [Select Audit Specification AdventureWorks2014]
FOR SERVER AUDIT [Select Audit AdventureWorks2014]
ADD (SELECT ON OBJECT::[HumanResources].[Department] BY [HumanResources]),
ADD (SELECT ON OBJECT::[HumanResources].[Employee] BY [HumanResources]),
ADD (SELECT ON OBJECT::[Person].[Person] BY [dbo])
WITH (STATE = ON)
GO

As you can see, we have defined audit for SELECT on three different objects: Department, Employee and Person. This means that every time an application or user issues a SELECT against any of these objects, an audit entry is made.

If we then view the log, this is what we will see.


As you can see, I have issued a "select * from humanresources.department", and you can easily see from which workstation and by which user the data was viewed.
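You don't have to use the log viewer in SSMS; the binary audit files can also be read straight from T-SQL. A minimal sketch, assuming the file path from the server audit created earlier:

```sql
-- Read all audit files matching the pattern (must match the FILEPATH of the server audit)
SELECT event_time,
       action_id,
       session_server_principal_name,
       [statement]
FROM sys.fn_get_audit_file(N'D:\MSSQL\2014\Audits\*.sqlaudit', DEFAULT, DEFAULT)
ORDER BY event_time DESC;
```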

So it is fairly easy to set up, and even easier to use – so my message to you is this: always set up some sort of audit on at least sensitive data. Not all data, and not each and every audit action, but the ones that allow you to spot any malicious activity on your data.

Happy Auditing ;o)

Salesforce and integration with SQL Server – part 3

As described in the previous posts, I've been working on a project which integrates data from Salesforce (SF) to Navision (NAV), manipulates it and creates new data, which then needs to be pushed back to SF again.

To get all this to work together, you need a local database on your SQL Server, which in this case we call LOCALSF – this is where all the stored procedures from DBAmp reside. It is also the place for the local tables that are used to store data from the SF cloud storage.

The last post was about the creation of Insert/Update/Delete tables for SF data, so in order to e.g. insert data into the table Account, we need a table named Account_Insert in the database called LOCALSF.

After the table has been populated with new values, the stored procedure SF_BulkOps needs to be executed – this procedure manages the data transfer between LOCALSF and SF in the cloud.

exec SF_BulkOps @Operation = 'insert',
		@table_server = 'SFCLOUD',
		@load_table = 'Account_insert'

You need to set which operation should be executed; in this example we are inserting values into Account from Account_insert, so the operation is set to 'insert'.

After SF_BulkOps has executed, we can have a look at the error attribute on the insert table to see if everything went well, or if something needs attention.

(Screenshot: the Account_insert table with the Error column after running SF_BulkOps.)

As you can see from the above screenshot, there was an attempt to insert 6 rows into Account; one was successful, and the last 5 were not inserted due to invalid email addresses. Someone could be monitoring the table and solving the issues, or some sort of event could send the issues on to a supporter.
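That kind of follow-up is easy to script. A minimal sketch of the check, assuming DBAmp's convention of writing 'Operation Successful' into the Error column of the load table for rows that made it across:

```sql
-- Rows that SF_BulkOps failed to push to Salesforce (Error column convention is an assumption)
SELECT Name, Error
FROM Account_insert
WHERE Error NOT LIKE '%Operation Successful%';
```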

Campus Days 2014 – Session 1

Today was the day where Regis Baccaro ( T | B ) and I had the pleasure of speaking at Campus Days in Copenhagen. We started the day by making final changes to the slide decks, going through the slides together and making a plan for the session. Then we headed to the Speakers Lounge, a rather fine room in one of the smaller cinemas in the movie theatre where the event took place, and had some coffee and a light breakfast.

The session started at 10.45 in room 10, a room that could accommodate around 200 people (unfortunately, we didn't manage to attract that many) – but still a good number.

I started by talking about an end-to-end big data solution on Azure: what is needed to succeed with such a project, and how do we set all the elements up? A short walkthrough of each of the elements, and then a tale about how they are created using the Azure portal.

The elements were:

  • Azure account
  • Storage account
  • SQL Server
  • SQL databases
  • Firewall rules
  • HDInsight cluster
  • Data
  • Hive scripts
  • Machine Learning

Then I moved on to talk about how tiresome the creation and deletion of these elements really is – isn't there a smarter way to achieve this? It turns out there is another way to do the same tasks, and I chose to use PowerShell for the job. I used a script based on one from Adam Jorgensen ( T | B ) and John Welch ( T | B ) that I have extended a bit, so that it also takes care of uploading data to Azure and creating tables to be used in Hive queries.
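To give a flavour of that automation, here is a minimal sketch using the classic Azure PowerShell module of the time; all names are placeholders, and the real script does considerably more (uploads the data, creates the Hive tables, and tears everything down again):

```powershell
# Placeholder names throughout - adjust to your own subscription
$storage   = "mybigdatastore"
$container = "bigdata-container"
$location  = "North Europe"

# Storage account and container to serve as the cluster's default storage
New-AzureStorageAccount -StorageAccountName $storage -Location $location
$key = (Get-AzureStorageKey -StorageAccountName $storage).Primary
$ctx = New-AzureStorageContext -StorageAccountName $storage -StorageAccountKey $key
New-AzureStorageContainer -Name $container -Context $ctx

# Provision the HDInsight cluster on top of the storage account
New-AzureHDInsightCluster -Name "mybigdatacluster" -Location $location `
    -DefaultStorageAccountName "$storage.blob.core.windows.net" `
    -DefaultStorageAccountKey $key `
    -DefaultStorageContainerName $container `
    -ClusterSizeInNodes 4 `
    -Credential (Get-Credential)
```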

The demo went well, if we don't count the smaller errors that sometimes occur – this time because I had my PowerShell ISE running in the wrong directory by mistake, but I managed.

I think that the demo and the message about automating the entire creation of a big data project on Azure were well received, but I also got the impression that many people have yet to see the use for an HDInsight solution, as some of the questions asked were:

  • Why would I put my data into HDInsight instead of an ordinary SQL Server?
  • A SQL Server could return an answer faster than my Hive query, so why?
  • Are there any real-world implementations? We've only seen demos.

After my demos and talk, Regis talked about another way of automating the process; his take was to have it all done in the ordinary ETL process using SSIS. Microsoft has not yet made any SSIS components that allow you to work with Azure from SSIS – it would be nice if they did in the future, but I don't know.

But someone else thought it would be cool to manage your Azure setup from SSIS: oh22data, a German company, has made a suite of SSIS components for managing Azure. Regis gave a demo where all the tasks I did in PowerShell were done in SSIS as part of a regular ETL flow.


If you would like to know more about these components, have a look at their blog, and if you need more information, contact Oliver Engels ( T | L ) – I'm sure he'll be helpful.

Here's a link to my slides on slideshare.net, and my scripts can be found here on OneDrive.

Slidedeck and Demo – Campus Days 14

Making the finishing changes to my talk at Campus Days next week in Copenhagen.

The talk will focus on an end-to-end big data solution running on Azure, and the various methods that make the implementation easier, faster and smoother. I will start by showing how we can make it all work using the Azure portal, and then move on to automating the process using PowerShell – in the end I have a good friend with me on stage, who will present an altogether new way of managing your big data project on Azure.


So make sure that you're in place on Nov. 26th at Campus Days.