
My Productivity System or How I Get Things Done

One could say: your productivity system can't be that good, because you haven't written a blog post in ages. True, and here is why: I deliberately put some activities on hold for the last six months. These activities happened to be the public ones: speaking, organizing meetups and blogging.

Why did I do that? Because I switched jobs AND made a career switch. From being an infrastructure architect, I decided to take the plunge and join a company that develops back-ends (APIs) and links and integrates them with all sorts of systems. I needed all my energy to make this switch and at the same time add value to customers, which is why all extracurricular activities were put on hold.

In the meantime I finally developed a GTD system which fits my needs and works for me. So let me share it with you.

Why a productivity system?
Because in this day and age there is so much a person has to process and remember. At home there are taxes, bookkeeping, laundry, sports, shopping and so on. At work the customers, the team and the management team should be kept happy (and vice versa, obviously). On top of that I want to spend as much time with my SO as possible, and to study and write code and blogs for fun.

With all this stuff going on, one could easily become overwhelmed, lose control and end up with a burnout. So I need to plan. I want to be able to concentrate on the tasks at hand and I want to improve my system over time. And obviously I need an app to help me remember and organise.

It turns out there is no single app that fits my needs, so I have to combine three: OmniFocus, Harvest and Focus.

My personal system

So here it is. My personal system is like a funnel: I start by collecting and organising all tasks, then I track time for a collection of tasks, and finally I focus on a single task ('deep work').

This is how it works:

1. Write down every single task you can think of.
2. Finish every task that will take only 2 minutes.
3. Put the remaining tasks into categories. E.g.: work, client x, sports, study.
4. Assign the tasks a start date or a due date.
5. Focus on your tasks for today and nothing else.
6. Track time for these tasks.
7. For the task at hand that requires deep thinking and concentration, use the Pomodoro Technique.
8. Once a week: review the week and see when and where you were less efficient.

OmniFocus (Mac and iOS only) fits this workflow well. It helps you to focus only on the tasks you want (or can) do today, in a given place, or when some condition is true.
This is a screenshot of the forecasting functionality of OmniFocus. For a given day you only get to see the tasks that you will be working on that day. And it has nice calendar integration as well.

OmniFocus does not do a good job with time tracking, so in comes Harvest. Harvest is a very cool time tracking tool, which helps you see how productive you were on a given day, week or month.

But Harvest does not help you focus. This is where I use the Pomodoro Technique. So in comes an app called Focus, which lets me focus on the task at hand for 25 minutes.

The reason why I picked Focus is because of its integration with OmniFocus.

Mac only and Expensive?
Mac only? Yes, unfortunately so (except for Harvest). I would love to see OmniFocus go cross-platform, but it's not going to happen.

Expensive? That depends. If you buy Asana or Todoist, you need to pay a monthly subscription to get at least similar functionality to OmniFocus. For the OmniFocus Pro version (and you are going to want that) you pay a whopping $79.99. So OmniFocus might be cheaper in the end. I use the free version of Harvest, and Focus costs around $7.99.

Takeaways
Developing a good productivity system that fits your needs can take months or even years. In my case there was no single app that supported my needs, so I needed three. But I'm quite happy now: I can see where I was less productive and why, and improve myself.

A big thank you to Asian Efficiency, by the way. Their podcast is awesome and inspiring. Have a listen!


Query a database through a C# REST API with Powershell (part 1)

It is probably well known that you can query an SQL database in PowerShell relatively easily, but wouldn't it be great to quickly write a REST API in front of the database? So that you can add business logic if you wish? And use PowerShell as a REST client? And then be able to code a decent frontend for the API for whatever device?
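For reference, querying the database directly from PowerShell can look like this. This is a minimal sketch using ADO.NET; the connection string, table and column names are made up for illustration:

```powershell
# Minimal sketch of a direct database query from PowerShell via ADO.NET.
# Connection string, table and column names are placeholders.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server=localhost;Database=Business;Integrated Security=True"
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT TOP 10 Id, Name FROM dbo.Employees"

$reader = $cmd.ExecuteReader()
while ($reader.Read()) {
    # Columns are accessible by name or ordinal
    "{0}: {1}" -f $reader["Id"], $reader["Name"]
}
$conn.Close()
```

A REST API in front of the database lets us move such queries (and any business logic) to one place, so clients only need to talk HTTP.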

Let's get started!
In this series I will first create a WebApi from scratch. Of course, you can also use the templates in Visual Studio, but I prefer to have a bit of knowledge of the code that's in my project. It's not that hard and you will end up with a clean code base.

Step 1. Get your dev environment ready

You can use a Vagrant box. If you use this Vagrantfile, an install.ps1 script will be copied to your desktop. Run it, grab a coffee or go shopping, because we are on Windows and Windows apps can be huge.

Step 2. Getting the VS Project in place

Start Visual Studio
Create a new empty solution:


I named the empty solution BusinessApp (I'm lacking inspiration for a better name).

Then right-click the newly made solution in the Solution Explorer (the pane on the right), click Add and then New Project:

I named the new project BusinessApp.Api. If you set your solution up like this you can add more projects as you continue extending the app, for example for an Angular (or whatever framework) frontend, or if you want to separate your data layer. You can also put your PowerShell client modules in a separate project if you wish.

Then open up the NuGet Package Manager Console and install the WebApi DLLs:

Install-Package Microsoft.AspNet.WebApi

Make sure to choose the correct Package source (Microsoft and .NET).

Step 3. Add routing

Add a new folder and name it App_Start.
Create a new class in the folder and name it WebApiConfig.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;


namespace BusinessApp.Api
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            config.MapHttpAttributeRoutes();
            GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
            GlobalConfiguration.Configuration.Formatters.Remove(GlobalConfiguration.Configuration.Formatters.XmlFormatter);

            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}

In this class we configure our API to return and consume JSON. We also configure our routes to match the controller name, followed by an id, which is optional. For example, http://example.com/api/employees/1 would match a controller named Employees, and it would return the employee with id 1.

Step 4. Enable CORS

We need to enable CORS, otherwise we won't be able to consume the API from another domain than the one from which the resource originated. In a production web environment you should configure this very carefully. I will configure CORS very permissively because I want my code to work.

Install CORS in the NuGet console:

Install-Package Microsoft.AspNet.WebApi.Cors

Then modify the WebApiConfig.cs class as follows:

using System.Web;
using System.Web.Http;
using System.Web.Http.Cors;

namespace BusinessApp.Api
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            var cors = new EnableCorsAttribute("*", "*", "*");
            config.EnableCors(cors);
            config.MapHttpAttributeRoutes();
            GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
            GlobalConfiguration.Configuration.Formatters.Remove(GlobalConfiguration.Configuration.Formatters.XmlFormatter);

            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}

Step 5. Add a Controller

  • Create a folder named 'Controllers'
  • Right click the Controllers folder and click Add and then Controller
  • Click Web API 2 Controller with read/write actions.


I named the controller TestController.
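For reference, the scaffolded controller should look roughly like this (the standard Web API 2 read/write template; treat it as a sketch rather than an exact copy):

```csharp
using System.Collections.Generic;
using System.Web.Http;

namespace BusinessApp.Api.Controllers
{
    public class TestController : ApiController
    {
        // GET api/test
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }

        // GET api/test/5
        public string Get(int id)
        {
            return "value";
        }

        // POST api/test
        public void Post([FromBody]string value)
        {
        }

        // PUT api/test/5
        public void Put(int id, [FromBody]string value)
        {
        }

        // DELETE api/test/5
        public void Delete(int id)
        {
        }
    }
}
```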

Step 6. Add a Global.asax file

We need to add a Global.asax file that calls the WebApiConfig.Register method at startup.

Right click the solution, click Add, click New Item and search for Global.asax, then Add it.


Modify Global.asax.cs so that Application_Start registers the Web API configuration:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Http;
using System.Web.Security;
using System.Web.SessionState;

namespace BusinessApp.Api
{
    public class Global : System.Web.HttpApplication
    {

        protected void Application_Start(object sender, EventArgs e)
        {
            GlobalConfiguration.Configure(WebApiConfig.Register);
        }
    }
}

Step 7. Test the API

Hit F5 and browse to http://localhost:<port>/api/test (IIS Express assigns the port; mine was 53601).


And it works. You can also consume the API with Powershell at this point:

((Invoke-WebRequest http://localhost:53601/api/test).content) | ConvertFrom-Json

It should return value1 and value2.
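Invoke-RestMethod can save a step here, since it deserializes the JSON response for you (assuming the same port number as above):

```powershell
# Equivalent call; Invoke-RestMethod parses the JSON response automatically
Invoke-RestMethod http://localhost:53601/api/test
```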

Done! Now let's query a database. This will be explained in Part 2.


Create and Export a bacpac file to Azure Storage

Next in my Azure database series is the bacpac file. Azure works with bacpac files, which we can upload to and download from Azure. We need storage space in order to transfer the files.

We start by creating a storage account. This also comes in handy whenever we need to transfer files, and again you can access the files from your local device.

To access your storage account, you need the access keys, which are generated when creating the storage account. In the footer of the page, where the red arrow is visible in the screenshot below, you can open up and access the keys.

You need this key for the next step: we will download a Windows Azure storage explorer to be able to access the files. Azure has a broad choice of downloads for this option. I downloaded the Azure Storage Explorer from CodePlex.

Now I will walk you through how to create a bacpac file of your local database. In SSMS, choose Export Data-tier Application.

Now we have two options: you can create the bacpac file locally and upload it with the storage explorer, or link it to Azure directly; we choose the latter. Fill in the storage credentials we just created by connecting the storage account with the access keys. Name a container and a filename, and Thunderbirds are go!
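If you prefer the command line over the SSMS wizard, the same local export can be scripted with SqlPackage.exe. The installation path, server, database and output file below are examples, so adjust them to your setup:

```powershell
# Export a local database to a bacpac file; you can then upload it
# with the storage explorer. Paths and names are examples.
& "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" `
    /Action:Export `
    /SourceServerName:localhost `
    /SourceDatabaseName:Business `
    /TargetFile:"C:\temp\Business.bacpac"
```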



Connect to your Azure database from your local SSMS #Error 40615

Of course we want to access our freshly imported database on Azure locally!

And this is so easy if you just follow these simple steps.

In order to be able to access the database located in your Azure cloud, you have to know the Azure address, which you can find in the Windows Azure Management Portal. Click the database icon and open up the database. Here you will find the connection string.


The next step is to make sure Azure knows your device by adding it to the trusted devices in Azure. If your device is unknown to Azure, it will show you the following message (error 40615):

You can add the device by going to the database icon in Azure and clicking the CONFIGURE tab. Here you add and save your device's IP.
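If you have the (classic) Azure PowerShell module set up, a firewall rule like this can also be added from the command line. The server name, rule name and IP address below are placeholders:

```powershell
# Requires the classic Azure PowerShell module and an active subscription.
# Server name, rule name and IP address are placeholders.
New-AzureSqlDatabaseServerFirewallRule -ServerName "myserver" `
    -RuleName "MyDevice" `
    -StartIpAddress "203.0.113.42" `
    -EndIpAddress "203.0.113.42"
```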

Now you can log in to your Azure cloud with the credentials you created while creating the database on Azure.


SQL database migration to Microsoft Azure with CodePlex

There are multiple ways to migrate your database to the Azure cloud. Today I am testing the SQL Azure Migration Wizard from CodePlex. I unzipped it to a local folder and ran it as admin.


Now fire up SQLAzureMW.exe and the script wizard starts up, which is very intuitive. Choose the available options; in my case, I will migrate from a SQL database to an Azure SQL Database. Fill in your database details and run the export. This might take a while depending on how big your SQL database is. You can also make a selection of tables/views you want to export, or just export the whole database.


Make sure you fill in the correct details for your target server (you can find your SQL connection data on your Azure database configure page).

Now the database will be migrated to the Azure cloud. Again, this might take some time, but you can see the progress on screen.


I'm a big fan of this SQL Azure Migration Wizard from CodePlex. Like I said, it's very intuitive and gives you several options for working with your local and Azure data.