
IBM Cloud platform Watson API – CLI Tools Error: NO CF API endpoint set

IBM Watson is a really good AI platform.
But because the Watson platform develops so quickly, IBM keeps pushing new updates and workspaces.
As a developer this can be quite time consuming, since you need to keep rebuilding your former workspaces (now called Skills) and API configurations.
The latest update, to V2, and the deprecation of the Bluemix environment gave me quite a few headaches.

To save you the hassle, here is an example of how you can rebuild a Watson Assistant and Watson Discovery API with the IBM Cloud CLI tool.

c:\Program Files\IBM\Cloud\bin>ibmcloud cf push
FAILED
No CF API endpoint set.
Use 'ibmcloud target --cf-api ENDPOINT [-o ORG] [-s SPACE]' to target Cloud Foundry, or 'ibmcloud target --cf' to target it interactively.

This happens because you are still pointing to the old Bluemix endpoint:
c:\Program Files\IBM\Cloud\bin>ibmcloud api https://api.eu-gb.bluemix.net
Setting api endpoint…
API endpoint https://api.eu-gb.bluemix.net is going to be deprecated. Use https://cloud.ibm.com.

Here is what you do:
c:\Program Files\IBM\Cloud\bin>ibmcloud api https://cloud.ibm.com
c:\Program Files\IBM\Cloud\bin>ibmcloud login
Now your API endpoint is set to cloud.ibm.com.
Now set the right Cloud Foundry environment for Discovery:
c:\Program Files\IBM\Cloud\bin>ibmcloud target --cf-api api.eu-gb.cf.cloud.ibm.com

Now you can (re-)do all your CF functions.
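
Putting it all together, the whole sequence looks roughly like this (YOUR_ORG and YOUR_SPACE are placeholders for your own Cloud Foundry org and space; eu-gb is the region used in this example):

ibmcloud api https://cloud.ibm.com
ibmcloud login
ibmcloud target --cf-api https://api.eu-gb.cf.cloud.ibm.com -o YOUR_ORG -s YOUR_SPACE
ibmcloud cf push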


Deploy linked Azure Resource Manager templates with a SAS token

ARM templates tend to get huge when your deployments get more complex.
With linking you can call an ARM template from another template and create a hierarchy of your templates, making it easier to adjust and reuse the templates. You can pass parameters from the master template to the linked template.

Linked templates are not very intuitive to use, however. In this blog post I will walk you through an example where I deploy a storage account with a linked template. I will also show you how to use the template in a CI/CD pipeline in Visual Studio Team Services.


A complete example is on my GitHub repository.

 

The linked storage template

Let's start with a regular template for storage, but without the variables: a linked template only has parameters.
These parameters are populated by the master template, where they can be hardcoded, filled from variables, or declared in a separate parameters file.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "storageAccountType": {
            "type": "string",
            "defaultValue": "Standard_LRS",
            "allowedValues": [
                "Standard_LRS",
                "Standard_GRS",
                "Standard_ZRS",
                "Premium_LRS"
            ]
        },
        "storageAccountTier": {
            "type": "string",
            "defaultValue": "Standard",
            "allowedValues": [
                "Standard",
                "Premium"
            ]
        }
    },
    "resources": [
        {
            "apiVersion": "2017-10-01",
            "name": "[concat('disk', uniqueString(resourceGroup().id))]",
            "type": "Microsoft.Storage/storageAccounts",
            "sku": {
                "name": "[parameters('storageAccountType')]",
                "tier": "[parameters('storageAccountTier')]"
            },
            "kind": "Storage",
            "location": "[resourceGroup().location]",
            "tags": {}
        }
    ]
}

Let’s call this template storage.json.
Now we are going to call this template from a master template that I will name template.json.

 

The master template

Let’s create a folder structure like this:
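
template.json
nestedtemplates\
    storage.json

The master template sits in the root and the linked template goes in a nestedtemplates subfolder, matching the relative path 'nestedtemplates/storage.json' that is referenced from the master template below.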

In template.json I need to make a reference to storage.json. I could put my ARM Templates on Github or GitLab and reference the public URI of storage.json. But what if you are in an enterprise and you need to keep your templates private? What if you want to run the templates from a private storage account?
Then you will want to protect them with a SAS Token. How that works will be described in the last part of this article.

This is what template.json will look like:

 {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "artifactsLocationSasToken": {
            "type": "string"
        },
        "artifactsLocationStorageAccount": {
            "type": "string"
        }
    },
    "variables": {
        "storageAccountType": "Standard_LRS",
        "storageAccountTier": "Standard",
        "nestedTemplates": {
            "storageTemplateUrl": "storageTemplateUrl": "[uri(deployment().properties.templateLink.uri, 'nestedtemplates/storage.json' )]"
        }
    },
    "resources": [
        {
            "name": "storageDeployment",
            "type": "Microsoft.Resources/deployments",
            "apiVersion": "2017-05-10",
            "dependsOn": [],
            "properties": {
                "mode": "Incremental",
                "templateLink": {
                    "uri": "[concat(variables('nestedTemplates').storageTemplateUrl, parameters('artifactsLocationSasToken'))]",
                    "contentVersion": "1.0.0.0"
                },
                "parameters": {
                    "storageAccountType": {
                        "value": "[variables('storageAccountType')]"
                    },
                    "storageAccountTier": {
                        "value": "[variables('storageAccountTier')]"
                    }
                }
            }
        }
    ],
    "outputs": {
    }
}

Some explanation: according to the Microsoft docs you can use deployment() to get the base URL for the current template, and use that to get the URL for other templates in the same location. The templateLink property is only returned when linking to a remote template with a URL. If you’re using a local template, that property isn’t available.

So we use the uri() function to combine deployment().properties.templateLink.uri with the relative path nestedtemplates/storage.json. That looks like this:

"nestedTemplates": {
"storageTemplateUrl": "storageTemplateUrl": "[uri(deployment().properties.templateLink.uri, 'nestedtemplates/storage.json' )]"
}

Then we append the SAS token, parameters('artifactsLocationSasToken'), to that URL in the resources section:

"nestedTemplates": {
"templateLink": {
"uri": "[concat(variables('nestedTemplates').storageTemplateUrl, parameters('artifactsLocationSasToken'))]",
"contentVersion": "1.0.0.0"
},

 

Pass the parameters

As already mentioned, you can pass parameters:

  • Hardcoded in the nested template (not recommended)
  • Hardcoded in the master template in parameters or variables (semi-recommended)
  • In a separate parameters file (recommended)

I recommend using the parameters file to set the values that are unique to your deployment. You can then use the concat function to build the other resource names in variables.

 

Nested templates and dependencies

Other resources in the master template can reference the nested deployment and wait for it like this:

"dependsOn": [
    "Microsoft.Resources/deployments/storageDeployment"
]

 

Deployment

Finally, the deployment. If you are in an enterprise and you need to keep your templates private you will want to run the templates from a private storage account. You can achieve this with a SAS Token.

The steps are as follows:

  • Create separate resource group with a storage account
  • Create a container in blob storage
  • Upload all templates and scripts to this container
  • Create a SAS Token for this container with a valid time of 2 hrs
  • Inject the SAS Token to your parameters.json file
  • Append the SAS Token to the nested template URI

Basically, this is what the PowerShell script does when you create an ARM Template in Visual Studio! However, I think it’s good to know what it actually does under the hood.
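
For reference, here is a rough AzureRM PowerShell sketch of those steps. It is not the full New-AzureDeploy.ps1 script from the repository, just the idea: the resource names come from the variables further down, and the westeurope location and the container name are assumptions.

# 1. Separate resource group with a storage account for the artifacts
New-AzureRmResourceGroup -Name 'my-artificats' -Location 'westeurope'
New-AzureRmStorageAccount -ResourceGroupName 'my-artificats' -Name 'mybeautifulartifacts' `
    -Location 'westeurope' -SkuName Standard_LRS

# 2. Create a container in blob storage and upload the templates
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'my-artificats' -Name 'mybeautifulartifacts')[0].Value
$ctx = New-AzureStorageContext -StorageAccountName 'mybeautifulartifacts' -StorageAccountKey $key
New-AzureStorageContainer -Name 'templates' -Context $ctx
Set-AzureStorageBlobContent -File .\template.json -Container 'templates' -Blob 'template.json' -Context $ctx -Force
Set-AzureStorageBlobContent -File .\nestedtemplates\storage.json -Container 'templates' -Blob 'nestedtemplates/storage.json' -Context $ctx -Force

# 3. Create a read-only SAS token for the container, valid for 2 hours
$sasToken = New-AzureStorageContainerSASToken -Name 'templates' -Permission r `
    -ExpiryTime (Get-Date).AddHours(2) -Context $ctx

# 4. Deploy the master template, appending the SAS token to its URI;
#    the token is also passed in as the artifactsLocationSasToken parameter
$templateUri = "https://mybeautifulartifacts.blob.core.windows.net/templates/template.json$sasToken"
New-AzureRmResourceGroupDeployment -ResourceGroupName 'azure-vm-poc' `
    -TemplateUri $templateUri `
    -artifactsLocationSasToken $sasToken `
    -artifactsLocationStorageAccount 'mybeautifulartifacts'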

I would suggest you create a service principal. Here is how.
We need the clientId, Secret, TenantId and SubscriptionId from the principal.

You can find the complete script here.

Then run the script:

$vars = @{
ClientId = ""
Secret = ""
TenantId = ""
SubscriptionId = ""
ResourceGroupName = "azure-vm-poc"
ArtifactsResourceGroup = 'my-artificats'
ArtifactsLocationStorageAccount = 'mybeautifulartifacts'
}

# modify path if needed
.\New-AzureDeploy.ps1 @vars -Verbose

 

Add the script to a build or release pipeline with VSTS

Simply add an Azure PowerShell task and call the script. Define the variables in VSTS.
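
In the task's script arguments you can then map the pipeline variables to the script parameters, for example (the variable names are just examples and must match what you define in VSTS):

-ClientId $(ClientId) -Secret $(Secret) -TenantId $(TenantId) -SubscriptionId $(SubscriptionId) -ResourceGroupName $(ResourceGroupName) -ArtifactsResourceGroup $(ArtifactsResourceGroup) -ArtifactsLocationStorageAccount $(ArtifactsLocationStorageAccount) -Verbose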

Troubleshoot

Sometimes the error messages in the PowerShell console are a bit cryptic. With this command you will get more verbose error messages:

(Get-AzureRmLog -Status "Failed" | Select-Object -First 1) | Format-List

Artificial Intelligence – Chat Bot Back to Basic part 3

We experimented a little with machine learning, but now we get to the part where it really gets interesting: creating a chatbot.

In this example, I use a simple tool called QnA Maker and the Azure QnA Maker resource. When you connect both services, you can integrate the bot with channels like Skype and Messenger, expose it as a Cortana service, or build a (mobile) app around it. Sign up at https://qnamaker.ai and create a new knowledge base.

I added some of my car data to the knowledge base and named it 'license check'. Based on a license plate we send to the bot, it will respond with the matching car brand. We need to train the bot and publish it, as you can see in this example: save and train your Q&A with the data, test it, and when you like the results, publish it.

When you publish the Q&A, a new screen opens with the URL details. You will need these details to associate your Q&A data with the Azure bot:

Go to the Azure Bot Service in the Azure portal and open the Application Settings page. In the App settings section, set the QnAKnowledgebaseId, QnAAuthKey, and QnAEndpointHostName values from the publish page and save the settings.
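
If you prefer to script this instead of clicking through the portal, a minimal AzureRM PowerShell sketch could look like this (the resource group and bot names are hypothetical; note that Set-AzureRmWebApp replaces the whole app settings collection, so the existing settings are copied first):

# Hypothetical resource group and web app (bot) names
$bot = Get-AzureRmWebApp -ResourceGroupName 'my-bot-rg' -Name 'my-qna-bot'

# -AppSettings replaces the entire collection, so start from the current settings
$settings = @{}
foreach ($s in $bot.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }

# Values copied from the QnA Maker publish page
$settings['QnAKnowledgebaseId']  = '<knowledge base id>'
$settings['QnAAuthKey']          = '<endpoint key>'
$settings['QnAEndpointHostName'] = '<endpoint host name>'

Set-AzureRmWebApp -ResourceGroupName 'my-bot-rg' -Name 'my-qna-bot' -AppSettings $settings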

Now try out the new bot in the Azure webchat.

This is still very basic, but we can teach our bot new skills, for instance to be more social. Since we are humans, we always greet someone when we meet. So, what if someone says hello? The bot will not recognize this as a license plate and will respond with an error. But we can teach the bot that whenever someone says Hi, Hello or Hoi, it responds with a friendly Hello!, as you can see in this clip. It even understands bad sentences and grammar:

This is just the very basics of a chatbot. You can also add LUIS (Language Understanding Intelligent Service) to your bot. LUIS is a language processor. You need to create a LUIS service in the Azure portal: add a new resource and browse for LUIS. Then sign up at https://www.luis.ai and create an intent to associate LUIS with your Q&A content. This makes your bot interesting to publish under Cortana: you can just tell your computer what license plate you are looking for.

Read my previous post on creating the car data for Machine learning.


Artificial Intelligence prepping data – Back to basic part 1

What did street-legal cars teach me about machine learning? It's all about the right data being available, and the RDW data can't be trusted!

I impulsively signed up for an Artificial Intelligence certification track two months ago, so I've been experimenting with Artificial Intelligence for a while now. In the beginning of the course it was one tough cookie! The formulas for interpreting the predictability of the data really freaked me out!

But once I got past the formulas and saw how the workspace resembles Microsoft products like SSIS and BI, I saw endless possibilities. This takes the data to a whole new level.

Preparing the data:

I did a test on all cars that are currently on the road in the Netherlands and combined it with performance data. I wanted to find the fastest street legal car. I guess I just wanted to find out what kind of cars I should fancy these days according to the performance stats.

I used an open dataset from the Dutch RDW (the Netherlands Vehicle Authority). It contained 14 million rows and is 7 GB in size, so I had to prepare the data in order to keep the experiment basic and performance high. I imported it into my SQL Server and filtered out the station wagons, campers, scooters and trailers, so I was left with a dataset of about 900,000 rows.

I used SQL Server 2017 and Microsoft Azure Machine Learning Studio to create a new experiment.

In order to make a prediction I needed to combine the brand data with the engine displacement data, because horsepower data was not available, to see which models are high performance based on engine capacity. Sadly, this means the smaller supercharged engines are not correctly represented in the prediction.

The calculation based on the above rules took a local SQL Server on an i5 laptop about 15 minutes. I needed more data preparation.

Based on engine displacement, a top 3 came up. But I didn't like the results at all. Sure, the engine displacement was high, but the cars are heavy and their performance isn't the best. Superchargers, turbos and gearing make all the difference, but they aren't properly represented in this result.

I had to filter out a lot of data. Next I added the weight of the car, but that wasn't trustworthy either. I then found a dataset containing the kW and top speed of the cars, joined it with my current results, and added a calculation in SQL on the kW column (kW * 1.362) to estimate the hp of each car. The hp outcome looks pretty accurate. After 4 hours of combining data and filtering the queries I gave up. Based on this data there is no way you can truly point out the fastest cars. I had to change my plans: too many uncertain variables to make a decent prediction, and not even close to the start of an AI project 🙁
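
To give an idea of that step, the kW-to-hp calculation looked roughly like this in SQL, run here through Invoke-Sqlcmd (the table and column names are made up for the example; the real RDW columns are named differently):

# Hypothetical table and column names, purely to illustrate the join and the conversion
$query = @"
SELECT v.Brand,
       v.Model,
       p.Kw,
       CAST(p.Kw * 1.362 AS int) AS Hp,  -- 1 kW is roughly 1.36 metric hp
       v.Weight
FROM   dbo.Vehicles v
JOIN   dbo.Performance p ON p.LicensePlate = v.LicensePlate
ORDER BY Hp DESC, v.Weight ASC;
"@
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'RDW' -Query $query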

Lots of NULL data
This No. 1 car can't be trusted!

After more data crunching, the results are still not really worth displaying. So here is a top 21 of "fastest" cars… based on… well, the obvious hp and weight sorting:

By the way, did you know there is only one Koenigsegg on the Dutch roads?

Ok, I got a little bit carried away with data prepping.

Now let's import it into an AI experiment. First you need to create a resource in the Azure portal for your workspace. I won't go into the details; we did this before!

Verify that you created the following new resources: a Machine Learning Workspace, a Machine Learning Plan and a Storage Account.

Browse to the Machine learning workspace you created and launch Machine Learning Studio. This opens a new browser page.

Go to Experiments and click NEW in the bottom-left corner.

Rename the experiment and add a dataset: Datasets –> NEW –> Select data to upload. Now that you have the dataset ready, you can drag it into your experiment and start running tests and variables on the data.

In my next post we will dive deeper into Artificial Intelligence.