Bring your own SQL licenses to Azure

A few days ago, Microsoft announced that Enterprise customers are now allowed to bring their own SQL licenses to Azure VMs. This means that if a customer already has a SQL license, it can be used on the SQL Server VM images from the Marketplace.

This means that they no longer need to build their own VM; instead, they can simply provision a server from the Marketplace and use the existing license.

Read more in the official blog post:

https://azure.microsoft.com/da-dk/blog/easily-bring-your-sql-server-licenses-to-azure-vms/

Found a possible bug in MDS

UPDATE: Upvote the Connect task

UPDATE: Tested on SQL 2016 CTP3.1

Today, working on Master Data Services 2014 (MDS) at a client, I ran into some strange behavior from both the MDS website and the MDS Excel Add-In.

For some reason there is a limit on the length of attributes in an entity. In my opinion there should not be a limit on attribute length; it should be up to the users to decide. Continue reading →

Microsoft acquires Datazen Mobile BI

Yesterday was a really good day for everyone doing Mobile BI on the Microsoft BI stack, as Microsoft acquired Datazen. Datazen has been in the mobile BI market for quite some time now, and their product is really stable and delivers a great front-end for data in any company. Continue reading →

Salesforce and integration with SQL Server – part 2

As described in the previous post, I’ve been working on a project that integrates data from Salesforce (SF) into Navision (NAV), manipulates and creates new data, and then pushes it back to SF again.

To get all this to work together, you need a local database on your SQL Server; in this case we call it LOCALSF. This is where all the stored procedures from DBAmp reside, and it is also the place for the local tables used to store data from the SF cloud storage.
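Setting that up is just an ordinary database; a minimal sketch (the name LOCALSF is simply the one used throughout this series):

-- Create the local staging database that will hold the DBAmp
-- stored procedures and the local copies of the SF tables.
CREATE DATABASE LOCALSF;
GO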

The previous post showed you how to generate the tables to hold SF data, but DBAmp is also capable of Insert, Update and Delete in the SF cloud storage. To manage this there is a need for either an Insert, Update or Delete table – these can be generated almost automatically by calling the stored procedure SF_Generate.

To do this we have to issue a statement that grabs the table structure from SF, containing all the attributes for which we can insert, update or delete values; the procedure will then render a table in the local database.

If we would like to create an insert table for Account, the statement is this:

EXEC SF_Generate @operation = 'insert', 
                 @table_server = 'SFCLOUD', 
                 @load_tablename = 'Account_insert'

This creates a table with ID and ERROR attributes as well as all the attributes needed to insert a new Account in SF. You then need to populate the table with values; ID and ERROR must be left empty, as DBAmp uses these to keep track of which records are deployed and which are not.
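As a minimal sketch of that population step (Name and Phone are just example Account attributes assumed here):

-- Populate the generated insert table; ID and ERROR are left NULL,
-- as DBAmp fills them in when the rows are pushed to SF.
-- Name and Phone are assumed example attributes on Account.
INSERT INTO dbo.Account_insert (Name, Phone)
VALUES ('Contoso', '+45 12 34 56 78'),
       ('Fabrikam', '+44 20 7946 0000');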

After the population of the Account_insert table, we need to push the data into SF; this is done by calling the following statement.

EXEC SF_BulkOps @operation = 'Insert',
                @table_server = 'SFCLOUD', 
                @table_name = 'Account_Insert'

What DBAmp then does is grab all rows from the insert table and insert them into the Account object in SF; afterwards it writes a status in the ERROR attribute of the insert table, so it’s possible to see whether any given record was successfully transferred.
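A quick way to inspect the outcome could be something like this (just a sketch – it assumes the status text DBAmp writes contains the word "success" for good rows):

-- List the rows that did not transfer successfully,
-- based on the status DBAmp wrote to the ERROR attribute.
SELECT ID, ERROR
FROM dbo.Account_insert
WHERE ERROR NOT LIKE '%success%';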

If you instead want to generate a table for deletes, you have to issue the following statement.

EXEC SF_Generate @operation = 'delete', 
                 @table_server = 'SFCLOUD', 
                 @load_tablename = 'Account_delete'

This statement will generate a table called ‘Account_delete’, which will contain an ID and an ERROR attribute – and that’s it.

Then you populate the ID column with the SF IDs of the records that need deletion, as sketched below.
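As a sketch, assuming the candidates can be found in a previously replicated dbo.Account table and that the filter is just an example criterion:

-- Copy the SF IDs of the accounts we want removed into the delete table.
INSERT INTO dbo.Account_delete (ID)
SELECT ID
FROM dbo.Account
WHERE Name LIKE 'Test%';

Then, as before, you issue a statement for deleting these records.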

EXEC SF_BulkOps @operation = 'Delete', 
                @table_server = 'SFCLOUD', 
                @table_name = 'Account_Delete'

Again, DBAmp will connect to SF in the cloud, delete the appropriate records, and save a status in the ERROR attribute.

In the next posts, I’ll be getting further into the stored procedure SF_BulkOps that we briefly looked at in this post.

Salesforce and integration with SQL Server

In a recent project I’ve been working on, we needed to transfer data from Salesforce (SF) to Navision (NAV), manipulate and create new data, and then push it back to SF again. To manage this we needed to persist the data from SF in our own on-premises SQL database. There are a few tools that allow that; we used DBAmp for the purpose. It’s a framework developed in T-SQL using stored procedures, and it allows you to replicate tables from the SF cloud to your on-premises database.

In the following series of posts I’ll go through the various procedures and how they could be implemented in Integration Services.

The first step is to make a linked server on your SQL Server to the SF database in the cloud (how to do this is not covered here). When that is in place, you should be able to make queries like this against your data in SF:

SELECT * FROM SFCLOUD...Account

In the above statement SFCLOUD is the name of the linked server; remember to use the four-part naming convention – the catalog and schema parts are left empty, hence the three dots.

These are the main procedures in the product; before we can push data to SF, we need to have a local copy of the table.

  1. SF_Replicate
  2. SF_Generate
  3. SF_ReplicateAll
  4. SF_BulkOps
  5. SF_Refresh

To do this we have to issue a statement that grabs the table structure from SF and then makes a create statement on our local server. If we would like to get the Account table, the statement is this:

EXEC SF_Replicate @linked_server = 'SFCLOUD',
                  @object_name = 'Account', 
                  @options = 'batchsize(50)'

What this does is create the table in the local database and transfer the data as well, in batches of 50 rows.

When the stored procedure is finished you should be able to select data from the local table in your database; so if your database is called LOCALSF, your SQL statement would look like this.

USE LOCALSF
GO
SELECT * FROM dbo.Account

That should result in exactly the same dataset as the one from the first statement, which was executed against the linked server.
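As a quick sanity check, one could compare the row counts on both sides (just a sketch; counting over the linked server can be slow on large objects):

-- Compare the number of rows in the SF cloud object
-- with the local copy created by SF_Replicate.
SELECT (SELECT COUNT(*) FROM SFCLOUD...Account) AS SFRows,
       (SELECT COUNT(*) FROM dbo.Account)       AS LocalRows;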

In the next posts, I’ll be getting into the stored procedure SF_Generate.