Category Archives for System Integration
SSIS Connectors v3.0 for Oracle and Teradata
This Monday Microsoft announced the release of the new SSIS Connectors v3.0 for Oracle and Teradata by Attunity; the release fixes a few bugs.
Salesforce and integration with SQL Server – part 2
As described in the previous post, I've been working on a project that integrates data from Salesforce (SF) into Navision (NAV), manipulates and creates new data, and then pushes it back to SF again.
To get all this to work together, you need a local database on your SQL Server; in this case we call it LOCALSF. This is where all the stored procedures from DBAmp reside, and it is also the place for the local tables used to store data from the SF cloud storage.
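As a minimal sketch, assuming nothing but a plain SQL Server instance, the starting point could look like this (DBAmp's own installation script is what actually creates its stored procedures in the database, which is not shown here):

-- Create the local staging database that will hold the DBAmp
-- stored procedures and the local copies of the SF tables.
CREATE DATABASE LOCALSF;
GO
USE LOCALSF;
GO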
The previous post showed you how to generate the tables that hold SF data, but DBAmp is also capable of performing Insert, Update and Delete operations against SF cloud storage. To manage this you need an Insert, Update or Delete table; these can be generated almost automatically by calling the stored procedure SF_Generate.
To do this we issue a statement that grabs the table structure from SF, containing all the attributes for which we can insert, update or delete values; the procedure will then render a table in the local database.
If we would like to create an insert table for Account, the statement is this:
EXEC SF_Generate @operation = 'insert', @table_server = 'SFCLOUD', @load_tablename = 'Account_insert'
This creates a table with ID and ERROR attributes as well as all the attributes needed to insert a new Account in SF. You then need to populate the table with values; ID and ERROR must be left empty, as DBAmp uses these to keep track of which records have been deployed and which have not.
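As a hedged sketch, staging a single row could look like this; Name is the standard required field on the SF Account object, while Phone is just an assumed extra column:

-- Stage one new Account row; ID and ERROR are left untouched (NULL)
-- so DBAmp can use them to record the outcome of the insert.
INSERT INTO dbo.Account_insert (Name, Phone)
VALUES ('Contoso A/S', '+45 12 34 56 78');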
After populating the Account_insert table, we need to push the data into SF, which is done by calling the following statement.
EXEC SF_BulkOps @operation = 'Insert', @table_server = 'SFCLOUD', @table_name = 'Account_Insert'
What DBAmp then does is grab all the rows from the insert table and insert them into the Account object in SF; afterwards it writes a status in the ERROR attribute of the insert table, so it is possible to see whether any given record was successfully transferred.
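A simple way to inspect the outcome is a query like the one below; the exact status text DBAmp writes is an assumption here, so adjust the pattern to what you actually see in the ERROR attribute:

-- List the rows that did not transfer cleanly; the 'Success'
-- pattern is an assumption about DBAmp's status text.
SELECT ID, ERROR
FROM dbo.Account_insert
WHERE ERROR IS NULL OR ERROR NOT LIKE '%Success%';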
If you instead want to generate a table for deletes, you have to issue the following statement.
EXEC SF_Generate @operation = 'delete', @table_server = 'SFCLOUD', @load_tablename = 'Account_delete'
This statement will generate a table called 'Account_delete', which contains just an ID and an ERROR attribute.
You can then populate the ID attribute with the SF IDs of the records that need deleting; a sketch of this follows, and as before you then issue a statement to delete the records.
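For illustration, the staging could be driven from the replicated local Account table from the previous post; the Name filter is purely a hypothetical business rule, and the Id column name in the local copy is assumed to follow the SF object:

-- Stage the SF IDs of the accounts we want removed from SF.
-- The filter on Name is a made-up example rule.
INSERT INTO dbo.Account_delete (ID)
SELECT Id
FROM dbo.Account
WHERE Name LIKE 'TEST%';

With the IDs staged, the deletion itself is pushed with the statement below.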
EXEC SF_BulkOps @operation = 'Delete', @table_server = 'SFCLOUD', @table_name = 'Account_Delete'
Again, DBAmp will connect to SF in the cloud, delete the appropriate records and save a status in the ERROR attribute.
In the next posts I'll get deeper into the stored procedure SF_BulkOps, which we looked at briefly in this post.
Salesforce and integration with SQL Server
In a recent project I've been working on, we needed to transfer data from Salesforce (SF) to Navision (NAV), manipulate and create new data, and then push it back to SF again. To manage this we needed to persist the data from SF in our on-premises SQL database. There are a few tools that allow this; we used DBAmp for the purpose. It's a framework developed in T-SQL using stored procedures, and it allows you to replicate tables from the SF cloud to your on-premises database.
In the following series of posts I'll go through the various procedures and how they can be implemented in Integration Services.
The first step is to create a Linked Server on your SQL Server pointing to the SF database in the cloud (how to do this is not covered here). When that is in place, you should be able to run queries like this against your data in SF:
SELECT * FROM SFCLOUD...Account
In the above statement SFCLOUD is the name of the linked server; remember to use the four-part naming convention (server.database.schema.object – with DBAmp the database and schema parts are left empty, hence the three consecutive dots).
These are the main procedures in the product; before we can push data to SF, we need a local copy of the table.
- SF_Replicate
- SF_Generate
- SF_ReplicateAll
- SF_BulkOps
- SF_Refresh
To do this we issue a statement that grabs the table structure from SF and then creates the table on our local server. If we would like to get the Account table, the statement is this:
EXEC SF_Replicate @linked_server = 'SFCLOUD', @object_name = 'Account', @options = 'batchsize(50)'
This creates the table in the local database and transfers the data as well, in batches of 50 rows.
When the stored procedure has finished, you should be able to select data from the local table in your database. So, if your database is called LOCALSF, your SQL statement would look like this:
USE LOCALSF
GO
SELECT * FROM dbo.Account
That should give a result set exactly like the one you got from the first statement, which was executed against the Linked Server.
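As a quick sanity check, you can compare the row counts of the two sources; both queries were shown above, so nothing new is assumed here:

-- The counts should match if the replication brought everything over.
SELECT (SELECT COUNT(*) FROM SFCLOUD...Account) AS cloud_rows,
       (SELECT COUNT(*) FROM dbo.Account)       AS local_rows;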
In the next posts I'll get into the stored procedure SF_Generate.