Azure Maps

At the moment I'm testing and reviewing some training material for internal use at Microsoft in February, and as you may have guessed, the service is Azure Maps.

This is the service that used to be part of the Bing API but has now been moved into the Azure service catalog.

Chris Pendleton just wrote me to let me know that Azure Maps is in fact a brand new service, written from scratch – the only things that moved over from Bing are the imagery and a couple of former Bing team members. Just wanted to let you all know this and correct the misunderstanding.

I must say it has been a really nice experience to work with over the last few days.

Using Azure Maps, you get a variety of options for integrating maps into your application. In headline form, these include:

  • Search: build applications that enable you and your users to search for addresses, points of interest, businesses, contact information and much more. You even have the option to get detailed information on what a road is used for, its speed limit and more.
  • Maps: use this to integrate the well-known quality maps from Bing into your website, application or mobile app to give the user a visual experience of the location.
  • Geocoding: convert latitude and longitude into addresses and vice versa (see the sketch right after this list).
  • IP to location: ever wanted an easy way to match an IP address to the country where it is in use? Here is a service that gives you that, but please be aware that it is in preview and subject to change.
  • Traffic: use this in your custom application to allow, for instance, your sales personnel to avoid traffic jams, reduce travel time and choose between several available routes.
  • Routing: use this to offer your users the shortest or fastest route, allow multiple waypoints along a route, and help solve the ever-recurring logistics problem known as “the travelling salesman”.
  • Timezone: enables you to implement time services in your application and look up times around the globe.
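
To give a feel for how these services are consumed over REST, here is a minimal sketch in Python of the geocoding scenario (coordinates to address). It assumes the search/address/reverse endpoint and an api-version of 1.0 as documented at the time of writing, plus the requests library – treat the exact parameter names as something to verify in the documentation.

import requests

# Minimal sketch: reverse geocoding (latitude/longitude -> address).
# Assumes the search/address/reverse endpoint and api-version 1.0;
# verify both against the current Azure Maps documentation.
SUBSCRIPTION_KEY = "<your-azure-maps-subscription-key>"

def reverse_geocode(lat, lon):
    url = "https://atlas.microsoft.com/search/address/reverse/json"
    params = {
        "api-version": "1.0",
        "query": f"{lat},{lon}",               # "latitude,longitude"
        "subscription-key": SUBSCRIPTION_KEY,
    }
    response = requests.get(url, params=params)
    response.raise_for_status()
    return response.json()

# Example: the coordinates returned by the address search later in this post
print(reverse_geocode(55.26562, 11.96339))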

The full documentation is available here, with a lot of examples and demo apps.

During the test and evaluation of the training material, I used the application Postman, which allows you to build a URL and headers for use against a REST API such as Azure Maps.

The application will then fetch the result and present it in a format of your choosing (raw, JSON, pretty, etc.), so you can inspect the response from the service before you write a single line of code in your preferred IDE. I suggest Visual Studio Code – a free, open-source code editor that runs on the operating system of your choice.

Postman

Start by downloading and installing Postman. Once it is installed and running, create a Collection for storing the requests and results you get.


Click the arrow next to New and select Collection.
Enter a name for the collection and click Create.

Now we’re ready to test the service. It’s a prerequisite that you have created an Azure Maps service in Azure and have your subscription key at hand.

  1. Start by entering the following into the URL field next to the GET verb: https://atlas.microsoft.com/search/fuzzy/json?
  2. Now we’re ready to fill in values for the keys that we will send to the REST API.
  3. For the first key, enter api-version with the value 1.
  4. For the next key, enter query, and as its value the address you want to search for. I entered Birkedommervej 8, Vester Egede, but I suggest you use your own 😉
  5. For the next key, enter subscription-key, and as its value the key from your Azure Maps service.
  6. You should now have a screen looking a bit like the one below.

Once all the keys and values have been entered, click the big blue SEND button, which initiates a call to the API and captures the result in the Postman app.

You have now called the Azure Maps API for the first time. In this example the result is shown as JSON, but you could just as easily show it as XML instead. Here is the JSON that my query returned.

{
    "summary": {
        "query": "birkedommervej 8 vester egede",
        "queryType": "NON_NEAR",
        "queryTime": 104,
        "numResults": 1,
        "offset": 0,
        "totalResults": 1,
        "fuzzyLevel": 1
    },
    "results": [
        {
            "type": "Street",
            "id": "DK/STR/p0/20146",
            "score": 5.785,
            "address": {
                "streetName": "Birkedommervej",
                "municipalitySubdivision": "Vester Egede",
                "municipality": "Haslev",
                "countrySubdivision": "Sjælland",
                "postalCode": "4690",
                "countryCode": "DK",
                "country": "Denmark",
                "countryCodeISO3": "DNK",
                "freeformAddress": "Birkedommervej, 4690 Haslev (Vester Egede)"
            },
            "position": {
                "lat": 55.26562,
                "lon": 11.96339
            },
            "viewport": {
                "topLeftPoint": {
                    "lat": 55.26486,
                    "lon": 11.96664
                },
                "btmRightPoint": {
                    "lat": 55.26602,
                    "lon": 11.96013
                }
            }
        }
    ]
}
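
If you want to take the same request from Postman into your own code – for instance in Visual Studio Code, as suggested above – it could look roughly like this in Python with the requests library. The endpoint and key names are the ones from the walkthrough; I use api-version 1.0 here as in the current documentation, so check which value your service expects.

import requests

# Sketch of the same fuzzy search call made from code instead of Postman.
# The endpoint and parameter names follow the walkthrough above; replace
# the subscription key and the query with your own values.
SUBSCRIPTION_KEY = "<your-azure-maps-subscription-key>"

url = "https://atlas.microsoft.com/search/fuzzy/json"
params = {
    "api-version": "1.0",                        # verify against the current docs
    "query": "Birkedommervej 8, Vester Egede",   # the address used in this post
    "subscription-key": SUBSCRIPTION_KEY,
}

response = requests.get(url, params=params)
response.raise_for_status()

result = response.json()
print("Results:", result["summary"]["numResults"])
for hit in result["results"]:
    print(hit["address"]["freeformAddress"], hit["position"])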

The next step would be to add more keys to the query to get even more information from the API about the address at hand – as mentioned above, we can get information about the speed limit, road usage and so on.
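
As a hypothetical example of that, the sketch below adds a couple of extra parameters to the fuzzy search from before. limit and countrySet are parameters I believe the endpoint supports, but verify the names and allowed values in the documentation before relying on them.

import requests

# Continuing the sketch above: the same fuzzy search with extra parameters.
# 'limit' and 'countrySet' are examples of additional keys; check the
# Azure Maps documentation for the full list and exact semantics.
SUBSCRIPTION_KEY = "<your-azure-maps-subscription-key>"

params = {
    "api-version": "1.0",
    "query": "Birkedommervej 8, Vester Egede",
    "subscription-key": SUBSCRIPTION_KEY,
    "limit": 5,            # maximum number of results to return
    "countrySet": "DK",    # restrict the search to Denmark
}

response = requests.get("https://atlas.microsoft.com/search/fuzzy/json", params=params)
response.raise_for_status()
print(response.json()["summary"]["totalResults"])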

I hope you got a little excited about this new service on Azure, and if so, I would encourage you to dive deep into the API and look at some of the more advanced features yourself.

I will publish another post on some of the advanced features in the upcoming weeks, so stay tuned – or head over to the documentation and get started yourself.

Blog reboot

New Year, new me – a sentence many of us have either used ourselves or seen online over the last few weeks.

Well, here there is no new me, as I will be the same as always 😉

But I will reboot this blog and try to publish at least one weekly post on topics such as:

  • Data
  • Artificial Intelligence
  • Machine Learning
  • Azure Services
  • And everything in between

So if you’re into these topics, please feel free to drop in once in a while and see if there’s something to your liking. Everything here will be cleaned of customer details, references and such, and will consist solely of reference architectures and my take on the topics – it is solely my view and personal opinion.

Best regards

Kenneth

MS Cloud Summit 2017 in Paris

There are only a few days until the next big event in my calendar – this time I am travelling to Paris, France, to attend and speak at the MS Cloud Summit 2017.

This is a conference covering a lot of Azure topics, and I will be speaking about Azure Data Lake Store and Analytics – spicing it up a bit with some Cognitive Services that we can use in our analytic scripts. I am hoping to see lots of friends and get new ones from the community around SQL Server and Azure Data Services.

A conference organized by AGILE.NET – aOS – AZUG FR – CMD – GUSS

  • 1 day of pre-conference workshops (Jan. 23rd)
  • 2 days of conference (Jan. 24th-25th)
  • 600 attendees expected
  • Passionate audience
  • 6 tracks, 60 sessions
  • Microsoft Cloud technologies (Azure, Office 365, Data Platform)
  • Microsoft Hybrid technologies (SQL Server, SharePoint, etc.)
  • Valuable international and French speakers

Register here

Be part of this great conference. Register now!

A participation fee of €15 is asked to help cover the conference costs.

Day 1 – Buy your tickets on the registration website – https://www.weezevent.com/ms-cloud-summit-jour-1

Day 2 – Buy your tickets on the registration website – https://www.weezevent.com/ms-cloud-summit-jour-2

Speaking at SQL Saturday Prague

Whoaaa, just here to tell you that I will be in Prague in December this year – giving a talk at SQL Saturday #569 on Dec. 2nd 2016. I’m looking very much forward to being there and spreading the word on Azure Data Lake Store and Azure Data Lake Analytics, and how they can be used in different scenarios. It will be a session filled with technical information and lots of demos.

This will be my first SQLSaturday in the Czech Republic, and actually also the very first SQLSaturday held there 😉

I really look forward to attending and meeting all of the SQL Family.

Biml Precon in Prague, December 2nd 2016

I will be delivering my first precon at SQL Saturday Prague on December 2nd. The precon will be centered around Biml and automating your ETL development process.

I will be joined by my good friend Regis Baccaro (T | B | L), and we are in the process of fine-tuning the material for the precon. It will be a good combination of slide decks, discussions and hands-on labs, and if you attend you will leave the day with a framework for an automated data warehouse made in Biml.

Speaking at SQL Saturday Denmark

Whoaaa, just here to tell you that I will be in Copenhagen later this year – giving a talk at SQL Saturday #541 in Copenhagen on Sep. 17th 2016. I’m looking very much forward to being there and spreading the word on Azure Data Lake Store and Azure Data Lake Analytics, and how they can be used in different scenarios. It will be a session filled with technical information and lots of demos.

 

Speaking at SQL Saturday Madrid

Whoaaa, just here to tell you that I will be in Madrid later this year – giving a talk at SQL Saturday #568 in Madrid on Sep. 24th 2016. I’m looking very much forward to being there and spreading the word on Azure Data Lake Store and Azure Data Lake Analytics, and how they can be used in different scenarios. It will be a session filled with technical information and lots of demos.

This will be my first SQLSaturday in Spain, and I really look forward to attending and meeting all of the SQL Family.
