PowerBI Fix: The import List.ConformToPageReader matches no exports. Did you miss a module reference?

Did you wake up this morning to find Power BI failure messages in your inbox?

Then you are probably not the only one. Your gateway is out of date and has left you with errors on your Power BI datasets.


Some datasets fail to refresh because a module cannot be found, yet when you test your connections everything seems to be working fine... or does it? Nothing clearly indicates something is wrong with your gateway; it just gives you a friendly suggestion to update, while all connections look A-OK!

What you need to do is install the recently released gateway, version 3000.66.4 (November 2020).

Once you update your on-premises gateway server and give it a restart, you will be able to refresh all your reports.

Thank you, Microsoft, for keeping us busy today!


PaaS taking over the world! Are DBAs a dying breed?

PaaS vs IaaS


It's Easter weekend, or as the Dutch say, 'PaaS weekend'. So it might be a good idea to talk a bit about PaaS. What is it, and what does it mean for you as a database administrator? Are DBAs a dying breed? Will they just shift over to more complex or broader tasks, or are they here to stay? It all depends on the company and the software it is running.


What is #IaaS? In short, it's a VM. Your database is running in a data center, and as a DBA you have the same job requirements as when running a database on-premises: updates, backups, patches, tuning, security, and account control.

But #PaaS is a different story: it runs the database as a service, so there is no need for a DBA. At least, that's what they tell you. But does PaaS solve all your performance and tuning needs? Is that faulty query suddenly fixed once it moves to PaaS? Nope. Machine learning is doing a great job so far, but it isn't the magical quick fix, yet.

In all honesty, most companies don't care about tuning a database. Not all applications have complex queries and tasks running on their SQL Server; most are fine running an Express edition. They don't even bother having a DBA: the database is taken care of by a system engineer, if it is looked at at all.

Where does this put you as a DBA? Don't sob, we still need you! The market is still flooded with old-school, high-maintenance MS SQL driven apps, and how nice it would be if these could be taken care of with PaaS. The only good thing coming out of these high-maintenance, splintered databases is the data itself. Do you really want to spend your time updating, patching, granting rights to users and saying no to SA account requests? No, you don't!

Second, in the real world companies don't evolve as fast as the IT world itself. Most applications, outdated or not, are not replaced overnight, and not all vendors are quite ready to run their applications in PaaS environments. So you just have to decide on which side you want to be: fast IT or slow IT. There is still a big playground available for both in the coming years.


PolyBase configuration on SQL Server 2017, Part II

Image: Microsoft

Nowadays your precious data can be stored everywhere: not just on several servers with different SQL versions, but most likely spread across the cloud as well. It's also a good idea to store data in the cloud with Stretch Database, to relieve your local disks of excessive data while still being able to query it, use it in your SSIS and BI environment, and keep an acceptable ETL window. With Microsoft's PolyBase you can access, import, and export any data, structured, semi-structured, or unstructured, on the Hadoop platform and Azure Blob Storage using the T-SQL language.

The best business knowledge comes from the data you collect, so it might be a good idea to put that data to good use. Businesses collect lots of data, but in most cases this is also where it ends. Those who have read my posts before know I am all about combining various sources with linked servers, and since SQL Server 2014 lots of new features have become available for using all your data on business intelligence platforms.

In my last post, we had a first look at a PolyBase installation and its troubleshooting. This time we are going to configure and use PolyBase in SQL Server 2017. I'm going to use Blob Storage on Azure to demonstrate how you can implement this solution in your (local) SQL database.

First things first: now that you've got your PolyBase installation ready, check that the services for PolyBase exist and are running.

Services: ‘SQL Server PolyBase Data Movement’ and ‘SQL Server PolyBase Engine’
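If you prefer a query window over the Services console, a minimal sanity check from T-SQL (this only confirms the feature is installed; the services themselves are Windows services):

SELECT SERVERPROPERTY('IsPolyBaseInstalled') AS IsPolyBaseInstalled; -- 1 = installed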

You need to configure PolyBase in order to start using it. Fire up SSMS and open a new query window. Type:

EXEC sp_configure 'hadoop connectivity', 4;
GO
RECONFIGURE;
GO

Option 4 is Azure Blob Storage (WASB[S]). For more info on the available PolyBase connectivity configurations, take a look here. Run the query and make sure you restart both PolyBase services on the machine to finish the configuration.
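After the restart you can verify that the setting took effect; a quick sketch against sys.configurations (value_in_use should show 4):

-- Check the configured Hadoop connectivity level
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'hadoop connectivity';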

In order to start using Blob Storage, make sure you have an Azure storage account; if you don't have an Azure account yet, create one here.

Log in to Azure and, on the left side, select and create a new storage account.

Give it some time; once the storage account is created, you also need to create a container on it.

To connect your local database to Azure storage, you need to get the storage key from the Azure storage account you just created and put it in the configuration file of your SQL installation.
Look for the core-site.xml file in the installation path of SQL Server.
The path looks similar to this: "C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Binn\Polybase\Hadoop\conf"
This is the config directory containing the core-site.xml file.

Open the file in Notepad and add the following property before the block of code mentioning Kerberos.

Fill in the storage name (in my case polybasedemo) and the storage key, then save the file. The property looks like this (the value is a placeholder for your actual storage key):

<property>
  <name>fs.azure.account.key.polybasedemo.blob.core.windows.net</name>
  <value>YourStorageAccountKey</value>
</property>

Now we have to create an external data source in SSMS. Replace containername@storagename with the names you created on Azure.

CREATE EXTERNAL DATA SOURCE PolyBaseDemo WITH
(
    TYPE = HADOOP,
    -- e.g. wasbs://containername@storagename.blob.core.windows.net/
    LOCATION = 'wasbs://containername@storagename.blob.core.windows.net/'
);

Next up, we create the external file format that defines the external data on Azure Blob Storage; this needs to be done before we can create the external table.

CREATE EXTERNAL FILE FORMAT PolybaseFormat
WITH
(
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS ( FIELD_TERMINATOR = ',' )
);

This creates two new server objects in SSMS, and now all that is left is to create the external table itself. In this demo I use a spreadsheet exported to CSV with some irrelevant data, just to have some test data available.
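If you'd rather confirm the new objects with a query than by browsing the object tree, the catalog views will show them; a small sketch:

-- The external data source and file format created above
SELECT name, location, type_desc FROM sys.external_data_sources;
SELECT name, format_type FROM sys.external_file_formats;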

USE [DemoPolybase]
GO
CREATE EXTERNAL TABLE [dbo].[Customers]
(
    [Name] VARCHAR(255) NULL,
    [adres] VARCHAR(255) NULL,
    [postalcode] VARCHAR(6) NULL
)
WITH
(
    LOCATION = N'/Customer_Export.csv',
    DATA_SOURCE = PolyBaseDemo,
    FILE_FORMAT = PolybaseFormat,
    REJECT_TYPE = VALUE,
    REJECT_VALUE = 10
);
GO

And to see if this worked, just query the data 🙂

SELECT * FROM [Customers]

Now put this knowledge into action yourself with some real data!


SQL 2016 Stretch Database. Yes, it's another linked server

Your manager always wants to keep all the data... and now that big data is a thing and algorithms (IoT) are used more efficiently, people want to hoard data even more. But data needs to be online to be valuable; if you have to restore it first, it is not very valuable. All this extra data also slows down your database performance... and gives the DBAs extra challenges... but not anymore! Well, at least if your boss has a big wallet and doesn't mind spending it on the SQL budget.

In SQL 2016 Microsoft introduced the Stretch Database, which gives you live access to 'archived' data and makes it feel like it's on-premises. And it's secretly just a linked server: a linked server between MSSQL and Azure.

How does this work, and is it safe?

Yes! Since it works on secure linked server technology and stores an encryption key on the on-premises SQL Server, it's suitable for all your cold data. Not hot data, because it does slow down query speed; but it will not clog your network, and your data will be available at all times. Even while the data is being stretched over to Azure, it remains fully queryable.

First ask yourself: do you really need to keep this data? If yes, how do you create a stretch database? Well, this is easy!

Hesitations? Use this link for info on the MS Stretch Database Advisor.

I use a simple test database, AdventureWorks2016 CTP3.

Before we can start, check whether the remote data archive function is enabled; if not (0), enable it.

-- Check the status of remote data archive
SELECT @@VERSION AS 'SQL Server Version';
EXEC sp_configure 'remote data archive';
GO
-- Activate remote data archive
EXEC sp_configure 'remote data archive', '1';
GO
RECONFIGURE;
GO

Next we can create a master key to encrypt all your data locally; this way, the data can't be read in Azure.
Note: you can also create a master key from the Stretch DB wizard, it's up to you. I'll show both options.

USE AdventureWorks2016CTP3; 
GO 

CREATE MASTER KEY ENCRYPTION BY PASSWORD='stretchdb2016'; 
GO

Next I will create a simple table and populate it with some data, to keep the demo quick and simple:

CREATE TABLE dbo.Stretchtable
(
    FirstName VARCHAR(50),
    LastName VARCHAR(50)
);
GO

USE [AdventureWorks2016CTP3]
GO
INSERT INTO dbo.Stretchtable (FirstName, LastName)
VALUES ('Reed', 'Richards'), ('Benjamin', 'Grimm'), ('Sue', 'Storm'),
       ('Johnny', 'Storm'), ('Victor', 'Von Doom'), ('Willie', 'Lumpkin');
GO

Keep your Azure account ready! I use the SQL logical server resource; you need to create the resource before you can stretch the database.

Azure resource


Select the table you will be stretching to Azure, then enable the table for stretching and the wizard will open up.

Enable stretch database

It's a hybrid archive: you can stretch the whole table or filter out the (c)old data from a massive table.
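The wizard handles this for you, but if you prefer T-SQL, enabling Stretch for a table looks roughly like this (a sketch, assuming the database itself is already enabled for Stretch; dbo.fn_stretchpredicate is a hypothetical example name for an inline predicate function):

ALTER TABLE dbo.Stretchtable
    SET ( REMOTE_DATA_ARCHIVE = ON (
        -- optionally add: FILTER_PREDICATE = dbo.fn_stretchpredicate(),
        -- to stretch only the (c)old rows instead of the whole table
        MIGRATION_STATE = OUTBOUND
    ) );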


Next you create a master key, if you didn't make one before; if you already did, the wizard will ask you to fill in the master password.

Fill in the IP range for a firewall rule.


Hey Ho, let's go, that was easy!

Now the table will start to stretch. If you want to see whether it has stretched yet, try a SELECT query and check the execution plan.
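You can also follow the migration itself from T-SQL; a minimal sketch using the Stretch migration DMV:

-- Each row describes a batch of rows migrated to Azure
SELECT * FROM sys.dm_db_rda_migration_status;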

Want your data back? No problem. You can bring the data back to your on-premises server if you don't want to use Azure, because when you delete your Azure account, the data will be gone too!

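In T-SQL, bringing the data back is the mirror image of stretching it out; a sketch (this copies the remote data back into the local table):

ALTER TABLE dbo.Stretchtable
    SET ( REMOTE_DATA_ARCHIVE ( MIGRATION_STATE = INBOUND ) );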


Note: data storage can be expensive on Azure, but this function gives us a positive outlook on ever-growing databases and their possibilities.


Query a database through a C# REST API with PowerShell (part 1)

It is probably well known that you can query a SQL database in PowerShell relatively easily, but wouldn't it be great to quickly write a REST API in front of the database? So that you can add business logic if you wish? And use PowerShell as a REST client? And then be able to code a decent frontend for the API for whatever device?

Let's get started!
In this series I will first create a WebApi from scratch. Of course, you can also use the templates in Visual Studio, but I prefer to have a bit of knowledge of the code that's in my project. It's not that hard, and you will end up with a clean code base.

Step 1. Get your dev environment ready

You can use a Vagrant box. If you use this Vagrantfile, an install.ps1 script will be copied to your desktop. Run it, then grab a coffee or go shopping, because we are on Windows and Windows apps can be huge.

Step 2. Get the VS project in place

Start Visual Studio
Create a new empty solution:


I named the empty solution BusinessApp (I'm lacking inspiration for a better name).

Then right-click the newly made solution in the Solution Explorer (the pane on the right), click Add, and then New Project:


I named the new project BusinessApp.Api. If you set your solution up like this, you can add more projects as you continue extending the app, for example an Angular (or whatever framework) frontend, or a separate data layer. You can also put your PowerShell client modules in a separate project if you wish.

Then open up the NuGet Package Manager Console and install the WebApi DLLs:

Install-Package Microsoft.AspNet.WebApi

Make sure to choose the correct Package source (Microsoft and .NET).

Step 3. Add routing

Add a new folder and name it App_Start.
Create a new class in the folder and name it WebApiConfig.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;

namespace BusinessApp.Api
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            config.MapHttpAttributeRoutes();

            // Serve JSON only: ignore reference loops when serializing,
            // and remove the XML formatter so JSON is the default.
            GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
            GlobalConfiguration.Configuration.Formatters.Remove(GlobalConfiguration.Configuration.Formatters.XmlFormatter);

            // Convention-based route: /api/{controller}/{id}, where id is optional
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}

In this class we configure our API to return and consume JSON. We also configure our routes to match the controller name, followed by an id, which is optional. E.g. http://example.com/api/employees/1 would match a controller named Employees, and it would return the employee with id 1.

Step 4. Enable CORS

We need to enable CORS, or else we won't be able to consume the API from another domain outside the domain from which the resource originated. In a production web environment you should configure this very carefully; here I will make CORS very permissive, because I want my code to work.

Install CORS via the NuGet console:

Install-Package Microsoft.AspNet.WebApi.Cors

Then modify the WebApiConfig.cs class as follows:

using System.Web;
using System.Web.Http;
using System.Web.Http.Cors;

namespace BusinessApp.Api
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // Allow any origin, header and method; fine for a demo,
            // but lock this down in production.
            var cors = new EnableCorsAttribute("*", "*", "*");
            config.EnableCors(cors);

            config.MapHttpAttributeRoutes();
            GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
            GlobalConfiguration.Configuration.Formatters.Remove(GlobalConfiguration.Configuration.Formatters.XmlFormatter);

            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}

Step 5. Add a Controller

  • Create a folder named 'Controllers'
  • Right click the Controllers folder and click Add and then Controller
  • Click Web API 2 Controller with read/write actions.


I named the controller TestController.

Step 6. Add a Global.asax file

We need to add a Global.asax file to call the WebApiConfig.cs methods at startup.

Right-click the project, click Add, then New Item, search for Global.asax, and add it.


Modify Global.asax so that it calls the WebApiConfig registration at startup:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Http;
using System.Web.Security;
using System.Web.SessionState;

namespace BusinessApp.Api
{
    public class Global : System.Web.HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Register the WebApi routing and formatter configuration at startup
            GlobalConfiguration.Configure(WebApiConfig.Register);
        }
    }
}


Step 7. Test the API

Hit F5 and browse to http://localhost:<port>/api/test (in my run the port was 53601).


And it works. You can also consume the API with PowerShell at this point:

((Invoke-WebRequest http://localhost:53601/api/test).content) | ConvertFrom-Json

It should return value1 and value2.

Done! Now let's query a database. This will be explained in Part 2.