In a recent project I’ve been working on, we needed to transfer data from Salesforce (SF) to Navision (NAV), manipulate and create new data, and then push it back to SF again. To manage this we needed to persist the SF data in our on-premises SQL database. There are a few tools that allow that – we used DBAmp for the purpose. It is a framework developed in T-SQL using stored procedures, and it allows you to replicate tables from the SF cloud to your on-premises database.
In the following series of posts I’ll go through the various procedures and how they could be implemented in Integration Services.
The first step is to create a Linked Server on your SQL Server pointing to the SF database in the cloud (how to do this is not covered here). When that is in place, you should be able to query your data in SF like this:
SELECT * FROM SFCLOUD...Account
In the above statement SFCLOUD is the name of the linked server – remember to use the four-part naming convention.
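As a side note, if you would rather have the filtering happen on the SF side instead of pulling everything across the linked server first, a query like this can also be written with OPENQUERY. This is a sketch, assuming the linked server is still named SFCLOUD and that your provider accepts a SOQL pass-through query:

-- Pass the query through to SF so only matching rows come back (assumed setup)
SELECT *
FROM OPENQUERY(SFCLOUD, 'SELECT Id, Name FROM Account WHERE Name LIKE ''A%''')

The trade-off is that the inner query runs on the remote side, so it must be written in the remote system’s query dialect rather than plain T-SQL.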
These are the main procedures in the product, and before we can push data to SF we need to have a local copy of the table.
To do this we issue a statement that grabs the table structure from SF and then creates the table on our local server. If we would like to get the Account table, the statement is:
EXEC SF_Replicate @linked_server = 'SFCLOUD', @object_name = 'Account', @options = 'batchsize(50)'
This creates the table in the local database and transfers the data as well, in batches of 50 rows.
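If you need more than one table locally, you simply run SF_Replicate once per object. A small sketch – the object names beyond Account are just examples, and the batch size option can be left out to use the default:

-- Replicate a handful of SF objects to local tables (example object names)
EXEC SF_Replicate @linked_server = 'SFCLOUD', @object_name = 'Account'
EXEC SF_Replicate @linked_server = 'SFCLOUD', @object_name = 'Contact'
EXEC SF_Replicate @linked_server = 'SFCLOUD', @object_name = 'Opportunity'

Each call drops and recreates the local table from the current SF structure, so this is also the way to pick up schema changes made on the SF side.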
When the stored procedure has finished you should be able to select data from the local table in your database. If your database is called LOCALSF, the SQL statement would look like this:
USE LOCALSF
GO
SELECT * FROM dbo.Account
That should result in a dataset exactly like the one from the first statement, which was executed against the Linked Server.
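A quick way to sanity-check the replication is to compare row counts between the cloud table and the local copy. A sketch, again assuming SFCLOUD as the linked server name and that the replicated table lives in dbo:

-- Compare remote and local row counts after a replicate run
SELECT
    (SELECT COUNT(*) FROM SFCLOUD...Account) AS RemoteRows,
    (SELECT COUNT(*) FROM dbo.Account)       AS LocalRows

If rows are being created or deleted in SF while the replication runs, the two counts can differ slightly, so treat this as a rough check rather than an exact one.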
In the next posts, I’ll be getting into the stored procedure SF_Generate.