Azure Maps

So at the moment, I'm testing and reviewing some training material for internal use at Microsoft in February, and as you may have guessed, the service is Azure Maps.

This is the service that used to be part of the Bing API but has now been moved into the Azure service catalog, and I must say it has been a really nice experience to work with over the last few days.

Using Azure Maps, you get a variety of options for integrating maps into your application. At a high level, these include the following (there is a small endpoint sketch right after the list):

  • Search: build applications that enable you and your users to search for addresses, points of interest, businesses, contact information and much more. You even have the option to get detailed information such as what a road is used for, the speed limit and more.
  • Maps: use this to integrate the well-known quality maps from Bing into your website, application or mobile app to give the user a visual experience of the location.
  • Geocoding: convert latitude and longitude into addresses and vice versa.
  • IP to location: ever wanted an easy way to match an IP address to the country where it is in use? Well, here is a service that gives you that, but please be aware that the service is in preview and is subject to change.
  • Traffic: use this in your custom application to allow, for instance, your sales personnel to avoid traffic jams, reduce travel time and choose between several available routes.
  • Routing: use this to offer the shortest or fastest route to your users and allow multiple points along a route. It can also help solve the ever-recurring logistics problem known as "the travelling salesman".
  • Timezone: enables you to implement a time service in your application and look up times around the globe.
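All of these capabilities are exposed over the same REST pattern: an HTTPS GET against atlas.microsoft.com with an api-version and a subscription-key parameter. The small Python sketch below lists a few endpoint paths to give you a feel for it; only the fuzzy search endpoint is the one used later in this post, so treat the routing and timezone paths as written from memory and verify them against the documentation.

# A rough map of a few Azure Maps REST endpoints (illustrative only)
ENDPOINTS = {
    # Address / point-of-interest search (used in the Postman walkthrough below)
    "search": "https://atlas.microsoft.com/search/fuzzy/json",
    # Routing between two or more points (path from memory - verify in the docs)
    "routing": "https://atlas.microsoft.com/route/directions/json",
    # Time zone lookup by coordinates (path from memory - verify in the docs)
    "timezone": "https://atlas.microsoft.com/timezone/byCoordinates/json",
}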

The full documentation is available here, with a lot of examples and demo apps.

During the test and evaluation of the training material, I used the application Postman, which allows you to build a URL and headers for use against a REST API such as Azure Maps.

The application will then get the result and present it for you in a format of your choosing (raw, JSON, pretty, etc.), and you can then inspect the response you get from the service even before you write a single line of code in your preferred IDE. I suggest you use Visual Studio Code, a free and open source code editor that runs on the operating system of your choice.

Postman

Start by downloading and installing Postman. Once it is installed and running, you should create a Collection for storing the results you get.


Click the arrow down beside New and select Collection.
Enter a name for the collection and click Create.

Now we're ready to test the service. It's a prerequisite that you have created an Azure Maps service on Azure and have your subscription key at hand.

  1. Start by entering the following into the URL field beside the GET function: https://atlas.microsoft.com/search/fuzzy/json?
  2. Now we're ready to fill in some values for the keys that we will send to the REST API.
  3. In the first key, enter api-version and set the value to 1.
  4. In the next key, enter query and, as the value, your query for an address. I entered Birkedommervej 8, Vester Egede, but I suggest you use your own 😉
  5. In the next key, enter subscription-key and, as the value, the key from your Azure Maps service.
  6. Now you should have a screen looking a bit like the one below.

Once all the keys and values have been entered, click the big blue SEND button, which initiates a call to the API and then catches the result in the Postman app.
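Put together, the keys and values above simply end up as the query string of the request URL. The final request will look something like the line below (Postman URL-encodes the address for you, and the subscription key is of course a placeholder):

https://atlas.microsoft.com/search/fuzzy/json?api-version=1&query=Birkedommervej%208%2C%20Vester%20Egede&subscription-key=<your-subscription-key>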

You have now called the Azure Maps API for the first time. In this example the result is shown as JSON, but you could just as easily request XML instead. Here is the result in JSON that my query returned.

{
    "summary": {
        "query": "birkedommervej 8 vester egede",
        "queryType": "NON_NEAR",
        "queryTime": 104,
        "numResults": 1,
        "offset": 0,
        "totalResults": 1,
        "fuzzyLevel": 1
    },
    "results": [
        {
            "type": "Street",
            "id": "DK/STR/p0/20146",
            "score": 5.785,
            "address": {
                "streetName": "Birkedommervej",
                "municipalitySubdivision": "Vester Egede",
                "municipality": "Haslev",
                "countrySubdivision": "Sjælland",
                "postalCode": "4690",
                "countryCode": "DK",
                "country": "Denmark",
                "countryCodeISO3": "DNK",
                "freeformAddress": "Birkedommervej, 4690 Haslev (Vester Egede)"
            },
            "position": {
                "lat": 55.26562,
                "lon": 11.96339
            },
            "viewport": {
                "topLeftPoint": {
                    "lat": 55.26486,
                    "lon": 11.96664
                },
                "btmRightPoint": {
                    "lat": 55.26602,
                    "lon": 11.96013
                }
            }
        }
    ]
}
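If you would rather skip Postman and go straight to code, the same call can be reproduced in a few lines of Python with the requests library. This is only a minimal sketch based on the request above; the subscription key is a placeholder and error handling is kept to a minimum.

import requests

# Placeholder - use the subscription key from your own Azure Maps service
SUBSCRIPTION_KEY = "<your-subscription-key>"

params = {
    "api-version": "1",                          # same version as in the Postman example
    "query": "Birkedommervej 8, Vester Egede",   # replace with your own address
    "subscription-key": SUBSCRIPTION_KEY,
}

# Call the fuzzy search endpoint used in the walkthrough above
response = requests.get("https://atlas.microsoft.com/search/fuzzy/json", params=params)
response.raise_for_status()

# Pick the coordinates out of the first result, as seen in the JSON above
result = response.json()["results"][0]
print(result["address"]["freeformAddress"])
print(result["position"]["lat"], result["position"]["lon"])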

The next step would be to add more keys to the query to get even more information from the API about the address at hand; as mentioned above, we can get information about the speed limit, road usage and so on.

I hope you got a little excited about this new service on Azure, and if so, I would encourage you to dive deep into the API and look at some of the more advanced features yourself.

I will publish another post on some of the advanced features in the upcoming weeks, so stay tuned, or head over to the documentation and get started yourself.

Blog reboot

New Year, New Me – a sentence many of us have either used ourselves or seen online over the last few weeks.

Well, there is no new me here, as I will be the same as always 😉

But I will reboot this blog and try to have at least one weekly post on topics such as:

  • Data
  • Artificial Intelligence
  • Machine Learning
  • Azure Services
  • And everything in between

So if you're into these topics, please feel free to drop in once in a while and see if there's something to your liking. Everything here will be cleaned of customer details, references and such, and will consist solely of reference architectures and my take on the topics; it is solely my own view and personal opinion.

Best regards

Kenneth

MS Cloud Summit 2017 in Paris

There are only a few days to the next big event in my calendar – this time I am travelling to Paris, France to attend and speak at the MS Cloud Summit 2017.

This is a conference covering a lot of Azure topics, and I will be speaking about Azure Data Lake Store and Analytics – spicing it up a bit with some Cognitive Services that we can use in our analytic scripts. I am hoping to see lots of friends and get new ones from the community around SQL Server and Azure Data Services.

A conference organized by AGILE.NET – aOS – AZUG FR – CMD – GUSS

  • 1 day of pre-conference workshops (Jan. 23rd)
  • 2 days of conference (Jan. 24th-25th)
  • 600 attendees expected
  • Passionate audience
  • 6 tracks, 60 sessions
  • Microsoft Cloud technologies (Azure, Office 365, Data Platform)
  • Microsoft Hybrid technologies (SQL Server, SharePoint, etc.)
  • Valuable international and French speakers

Register here

Be part of this great conference. Register now!

A participation fee of €15 is asked to help cover conference costs.

Day 1 – Buy your tickets on the registration website – https://www.weezevent.com/ms-cloud-summit-jour-1

Day 2 – Buy your tickets on the registration website – https://www.weezevent.com/ms-cloud-summit-jour-2

Get started using Data Science tools fast on AZURE

Did you know that Microsoft has made it really easy to get started with various data science tools? How, you ask? Well, Microsoft has compiled a few virtual machine images, ready to be spun up and provisioned on Azure – all you have to do is select the data science flavor of your choice. Continue reading →

Azure SQL Data Warehouse and Premium Storage

Today Microsoft announced that Azure SQL Data Warehouse will support Premium Storage, which will allow customers to see greater performance and predictability on queries. As of today, all newly created SQL Data Warehouses will be created with Premium Storage, at least in regions where Premium Storage is available. For the remainder of the preview period, billing will continue to be based on standard pricing.

 

To read the full blog post, go to https://azure.microsoft.com/da-dk/blog/azure-sql-data-warehouse-introduces-premium-storage-for-greater-performance/

Bring your own SQL licenses to Azure

A few days ago, we announced that Microsoft Enterprise customers are now allowed to bring their own SQL licenses to Azure VMs. This means that if a customer already has a SQL license, that license can be used on SQL Server VM images from the Marketplace.

This means that they no longer need to build their own VM; instead, they can simply provision a server from the Marketplace and use the existing license.

Read a lot more on the official blog post

https://azure.microsoft.com/da-dk/blog/easily-bring-your-sql-server-licenses-to-azure-vms/

My laptop is at the repair shop, what to do?

I recently got a new laptop, the renowned Lenovo X1 Carbon 3rd generation – a most amazing piece of hardware. The laptop is so light and blisteringly fast that I have not regretted opting for it instead of the Surface Pro 4.

Well, all was good for the first two months. Then one day, when I unplugged the power because I was off to a meeting, the laptop died on me. Just like that, the power went off. What to do? I tried all the tips I found on the net, but to no avail. Continue reading →
