Logic Apps What’s New

I’m following the Logic Apps live broadcast; below is a quick recap of the most important points.

1) the possibility to write JavaScript and C# script code inside a Logic App, essentially the same concept as the BizTalk scripting functoid


2) the possibility to download these API Apps from a new GitHub repository https://github.com/logicappsio

This is nice if you want to understand how to create or extend these API Apps; the purpose of this GitHub repository is to provide an open-source repository where the community can contribute.

3) What’s new

new features

4) What’s in progress


For more details…

the full video here

the Microsoft team blog here

Fast and easy deployment with WiX 3.10 for Visual Studio 2015

WiX is a great toolset that provides everything we need to create a solid deployment. I’m preparing the setup for jitGate, now in private beta test, and I’m using WiX 3.10; this version supports Visual Studio 2015, the installation is very simple and the 3.10 build is available here.

Essentially, WiX is a toolset built on top of Windows Installer and completely based on XML scripting, hence the name: Windows Installer XML Toolset, the WiX Toolset.

WiX is free and open source, and the framework covers the large number of features and options offered by Microsoft Windows Installer. It also provides a large number of tools that make it easy to create our deployment database, and WiX already offers many ready-to-use setup dialog forms that we can also customize.

WiX can also offer great WPF setup interfaces; the WiX setup itself uses a WPF interface.


WiX is very easy to extend because it is completely based on XML scripting, and it is essentially formed by four big areas.


1) The Product area, containing all the information about the product and the most important settings about the setup behaviour, such as upgrading, compression options, deployment restrictions and more.

2) Features, to manage the different deployment feature options, for example minimal, typical and complete installation.

3) Directory, to manage the source and destination deployment directories; it is very intuitive to use.

4) Component group, to manage the deployment files; essentially 1 Component = 1 File.
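To fix these four areas in mind, here is a minimal skeleton of a WiX source file; the product name, GUIDs and file names are placeholders, not taken from a real project:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <!-- 1) Product area: general information and setup behaviour -->
  <Product Id="*" Name="jitGate" Language="1033" Version="1.0.0.0"
           Manufacturer="MyCompany" UpgradeCode="PUT-GUID-HERE">
    <Package InstallerVersion="200" Compressed="yes" InstallScope="perMachine" />
    <MajorUpgrade DowngradeErrorMessage="A newer version is already installed." />
    <MediaTemplate EmbedCab="yes" />

    <!-- 2) Features: what the user can select to install -->
    <Feature Id="ProductFeature" Title="jitGate" Level="1">
      <ComponentGroupRef Id="ProductComponents" />
    </Feature>
  </Product>

  <!-- 3) Directory: source and destination layout -->
  <Fragment>
    <Directory Id="TARGETDIR" Name="SourceDir">
      <Directory Id="ProgramFilesFolder">
        <Directory Id="INSTALLFOLDER" Name="jitGate" />
      </Directory>
    </Directory>
  </Fragment>

  <!-- 4) Component group: essentially 1 Component = 1 File -->
  <Fragment>
    <ComponentGroup Id="ProductComponents" Directory="INSTALLFOLDER">
      <Component Id="MainExecutable" Guid="PUT-GUID-HERE">
        <File Id="jitGateExe" Source="jitGate.exe" KeyPath="yes" />
      </Component>
    </ComponentGroup>
  </Fragment>
</Wix>
```

Each area lives in its own element, and the Feature ties the component groups to what the user selects in the setup.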

Another great thing is the WiX Bootstrapper: we can use it to install our prerequisites before our installation; for example, a few simple lines will install the .NET Framework 4.5.
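The original snippet did not survive in this copy, so here is a minimal bundle sketch assuming the standard WixNetFxExtension package group (names and GUIDs are placeholders):

```xml
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Bundle Name="jitGate Setup" Version="1.0.0.0"
          Manufacturer="MyCompany" UpgradeCode="PUT-GUID-HERE">
    <Chain>
      <!-- Installs the .NET Framework 4.5 web installer when missing -->
      <PackageGroupRef Id="NetFx45Web" />
      <MsiPackage SourceFile="jitGateSetup.msi" />
    </Chain>
  </Bundle>
</Wix>
```

The project needs a reference to WixNetFxExtension for the NetFx45Web package group to resolve.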


WiX covers all the deployment option types, and we can also extend Windows Installer behaviour using a Custom Action project, which is really powerful.


There are many resources and courses available on the internet.

I definitely recommend the WiX Toolset for creating our deployment projects.



Event Hubs API App – fast and easy to do

Last evening I was working on a Logic App and I needed to send some messages to Event Hubs.
We have two different options: we can use the HTTP Connector API app, or we can create an API app able to do that; this is a good opportunity to understand the development productivity of API Apps.
I would like to use a very simple approach, and we can extend this sample as we want, using dynamic configurations, extended features and so on; I just want to demonstrate how simple it is to do that in a few simple steps.

1) Install the Windows Azure SDK for .NET – 2.5.

2) Create a new Visual Studio project, select Cloud and ASP.NET Web Application.

3) Select Azure API App; all the libraries will be automatically added.

4) Select Manage NuGet Packages.

5) Search for EventHub and select the EventProcessorHost package.

6) [OPTIONAL] Rename the ValuesController.cs class to EventHubController.cs.

Below is the simple code we have to use to send an event message to Event Hubs; copy and paste this code into the class file.

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Timers;
using System.Web.Http;

namespace EventHubConnector.Controllers
{
    public class EventHubController : ApiController
    {
        // GET api/values
        public string Get(string message)
        {
            string ehConnectionString = "Endpoint=sb://[EVENTHUB CONNECTION STRING]";

            //Create the connection string
            ServiceBusConnectionStringBuilder builder = new ServiceBusConnectionStringBuilder(ehConnectionString)
            {
                TransportType = Microsoft.ServiceBus.Messaging.TransportType.Amqp
            };

            //Create the EH sender
            string eventHubName = "[EVENTHUBNAME]";
            EventHubClient eventHubClient =
                EventHubClient.CreateFromConnectionString(builder.ToString(), eventHubName);

            //Send the event message
            EventData data = new EventData(Encoding.UTF8.GetBytes(message));
            eventHubClient.Send(data);

            return "Message sent to the Event Hub";
        }
    }
}
Now we enable the Swagger features; I would like to spend some time here because some guys asked me for more information about the Swagger side.
We have two different options to manage the Swagger contract: one is dynamic, opening SwaggerConfig.cs and uncommenting the usual .EnableSwaggerUi lines.
The second is static; this is useful if we want to drive our Swagger generation, and it is also quite simple to do.
Uncomment the usual EnableSwaggerUi lines, press F5 and execute the project in debug mode.
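For reference, the uncommented section of SwaggerConfig.cs looks roughly like this (the Swashbuckle default template; the exact lambda options may differ between package versions):

```csharp
public static void Register()
{
    GlobalConfiguration.Configuration
        .EnableSwagger(c => c.SingleApiVersion("v1", "EventHubConnector"))
        .EnableSwaggerUi();
}
```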

Navigate to http://localhost:[YOURPORT]/swagger/docs/v1 to get the raw JSON version of the API.

Open the file in Visual Studio, create a file named apiDefinition.swagger.json in the Metadata folder and save the content inside the file.

To enable the static feature we only have to open the apiapp.json file and delete the value of the endpoints node, as below.
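The screenshot is missing in this copy; the relevant fragment of apiapp.json should look roughly like this, with the apiDefinition value emptied (a sketch based on the default API App preview template):

```json
{
    "id": "EventHubConnector",
    "version": "1.0.0",
    "endpoints": {
        "apiDefinition": ""
    }
}
```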


Very easy and fast.
We are ready to publish: right-click on the project file and select Publish.
Select Microsoft Azure API Apps.


Insert the name of your API app and set all the usual subscription properties, such as service plan and resource group; we can also create them here.


Our API app is deployed and ready to use


This API App is able to send a message to Azure Event Hubs, very simple and fast to do, and we can now extend it to also receive messages from Event Hubs and create some other interesting features.

Logic Apps and API Apps under the hood

Solidsoft Reply is experiencing a growing number of customers interested in understanding Logic Apps and API Apps, and as a result of this increasing requirement, I decided to spend some dedicated time on this topic.

There are many articles and blog posts about this topic; in this article I would like to present my first impressions and feedback.
I would like to give a quick introduction, but before reading this article I recommend watching this video by Josh Twist.

The difference between API Apps and Logic Apps is that API Apps are single atomic applications we are able to develop and deploy in the Microsoft cloud; they provide us the possibility to write and deploy .NET code.


Logic Apps give us the possibility to orchestrate our API Apps in a logical flow, creating and centralizing our logical processes.


The first time I saw Logic Apps I tried to compare them with BizTalk, and I think that is quite normal for a person like me, but this is a mistake: they are two different things, with different architectures, different patterns and different approaches, yet both of them are able to cover the same scope.

Comparing Logic and API Apps with BizTalk is quite complicated because they use two different approaches and patterns: BizTalk provides different component layers and levels, while API Apps are unique containers of micro application blocks and we use Logic Apps to organize them.
We can’t create a new BizTalk orchestration shape, but we can do the equivalent by creating a new API App; in Logic Apps we don’t have the same concept as a BizTalk pipeline file, but we are free to organize our pipeline, formed by API app components, inside the Logic App as we prefer.
API Apps are application containers which execute actions; we can have two different types: simple application containers formed by our .NET code, or Triggers.
To start a Logic App we need to use a trigger, and a trigger can start in different ways: because it is called, because a polling rule is verified, or manually.

Essentially an API App is a different representation of a Web API. To create an API App we need to use Visual Studio 2013; Microsoft provides the Microsoft Azure SDK, which adds a new project template, the API App project template.
To create a new API App we select New Project -> Cloud -> ASP.NET Web Application -> OK.

Visual Studio proposes a new template, the Azure API App (Preview).
Logically the Web API checkbox is selected.

After the NuGet package installation we have everything we need to develop the API App, but I don’t want to go into detail in this first article; I want to discuss the concepts.
The directory organization is quite similar to a Web API.


This is a good idea because developers don’t need to upgrade their knowledge: we can see a Web API on the left and an API App on the right; they use the same ApiController abstract class inside the same assembly, System.Web.Http.dll, so what is the real difference?

An API App uses a metadata description completely based on JSON, and the Swashbuckle NuGet package provides automatic Swagger metadata generation.
For more information about Swagger you can go to the official site http://swagger.io/

The provisioning is different: the API App template provides a deployment completely focused on that.
Right-click on the project file, select Publish and click on Microsoft Azure API Apps.

After selecting the subscription we are able to define how to deploy the API App; what I like here is the idea of keeping the concept of project name separate from the API App name: we can define a different name, and multiple API App names.

Now select the canonical information and deploy the API App.
There are aspects of the internal architecture that I really like.
One is the Swagger integration inside the API App; we can activate this feature simply by uncommenting these lines of code inside the SwaggerConfig.cs file.

We can test our Swagger documentation by running the project: press F5.
Adding /swagger to the URL and selecting List Operations, we are able to see the documentation.

The difference between a general API App operation and a Trigger operation is a naming convention.
This week I’m developing an API App trigger able to integrate my RFID reader; to define a trigger we just need to append the word “Trigger” to the end of our methods.

My first question was: why not use a System.Attribute approach and enrich the class? But there is a good reason behind that.
This is a simple and common way for all the programming languages API Apps support, and API Apps now support .NET, Java, Node.js, Python and PHP.
The Azure platform recognizes triggers from the Swagger API definition rather than from the API app code itself; this is a really cross-platform approach.

To define the trigger mechanism we just need to define the push or poll in the Route System.Attribute.
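As a sketch of the convention (the controller, route and method names here are illustrative, not taken from my RFID connector):

```csharp
public class RfidController : ApiController
{
    // The "Trigger" suffix plus the poll segment in the route
    // tell the gateway this operation is a poll trigger.
    [HttpGet]
    [Route("api/rfid/poll/event")]
    public HttpResponseMessage RfidPollTrigger(string triggerState)
    {
        // Return 200 with data when an event is available,
        // 202 (Accepted) when there is nothing to process yet.
        return Request.CreateResponse(HttpStatusCode.Accepted);
    }
}
```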

After the deployment we are able to see and use it inside the Logic App: open the browser, navigate to https://portal.azure.com and select Browse and API Apps.

About Logic Apps and API Apps distribution, we have to consider two different sides, admin and developer; one is on premise, the second is in the cloud.

Logic Apps use triggers to integrate on-premise and, of course, cloud technologies such as SQL Server, the file system and more.
In the on-premise case the trigger logically needs to interact with the on-premise environment; this mechanism is provided by a service host application running in our on-premise environment and using a relay binding in the cloud.

In the next articles I would like to explain in more detail the settings on the on-premise side and on the cloud-side environments.

We have two different sides of settings: one is in the cloud and the other is on the developer side in Visual Studio; not many resources mention this, but I think it is important.
Open the Azure node in the Visual Studio Server Explorer and log in to the subscription; now we are able to see a lot of information about our API App, such as logging and tracing.

We have two different settings sides; one is the on-premise side inside Visual Studio.


The other one is on the cloud side.

We can also debug our code remotely.

Another interesting area to consider is the on-premise integration layer: when we use a trigger such as the SQL Connector to communicate between the cloud and our on-premise environment, the Azure platform creates a relay binding endpoint in the namespace.

This is the reason why we need to specify a namespace string during the API App trigger creation inside the Logic App.

On the on-premise side, a host process creates the service layer interface to communicate with the trigger in the cloud.

There are a lot of things to discuss about the Logic Apps and API Apps architecture, and a lot of things running “under the hood”; in this article I tried to collect the most important ones.
My feedback is: I like the approach and the idea; I think we were missing such a powerful platform inside Azure.

I hope that with this article other passionate developers like me will be able to understand more about Logic and API Apps; I will write in more detail about particular aspects in the next articles and videos.

Quick way to get all assemblies referenced in a BizTalk Server Solution

BizTalk Server uses a lot of assemblies, many of them referenced, and many times I need to understand how an assembly is used and how many assemblies are referenced by it, often to prepare a deployment or because I need to migrate an unknown environment.
Well, there are different ways to do that; one of these is querying the BizTalk management DB, but it doesn’t contain all the information we need, normally only the first inheritance level.
I think the best and quickest option is using the BizTalk Explorer object model; I wrote a simple console application which is able to create an output file containing all the information I need.

The approach is quite simple and it uses reflection to understand the assembly dependencies; below the code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.BizTalk.ExplorerOM;
using System.IO;
using System.Reflection;

namespace BizTalkAssemblyScanner
{
    class Program
    {
        static void Main(string[] args)
        {
            StringBuilder somethingToWrite = new StringBuilder();
            string fileName = "";
            try
            {
                if (args.Length != 2)
                    throw new NotImplementedException();

                Console.BackgroundColor = ConsoleColor.Black;
                Console.ForegroundColor = ConsoleColor.White;
                string servername = args[0];
                string dbname = "BizTalkMgmtDb";

                fileName = args[1];

                Console.WriteLine("Load BizTalk configuration");

                BtsCatalogExplorer btsCatalog1 = new BtsCatalogExplorer();
                btsCatalog1.ConnectionString = "server=" + servername + ";database=" + dbname + ";Integrated Security=SSPI";

                //TODO: add an exclusion filter for framework assemblies, e.g. Microsoft., System., mscorlib
                string starsep = "***********************************************************************";

                foreach (Application application in btsCatalog1.Applications)
                {
                    //if (application.Name != "FOR AFFINITY APP") continue;
                    Console.WriteLine("Looking into the application: " + application.Name);

                    somethingToWrite.AppendLine(starsep);
                    somethingToWrite.AppendLine(string.Format("Assemblies in the application name: {0}", application.Name));
                    foreach (BtsAssembly assembly in btsCatalog1.Assemblies)
                    {
                        //Assembly affinity?
                        //if (assembly.DisplayName.IndexOf("Microsoft", 0) == 0 || assembly.DisplayName.IndexOf("System.", 0) == 0) continue;
                        try
                        {
                            Assembly asmbase = Assembly.Load(assembly.DisplayName);
                            string text = string.Format("{0} - {1} - BTS: {2}", assembly.DisplayName, asmbase != null ? "Loaded..." : "Not loaded...", isbiztalkassemby(assembly.Name, application) ? "Yes" : "No");
                            somethingToWrite.AppendLine(text);
                            if (asmbase != null)
                                LoadAssemblyReferences(somethingToWrite, asmbase, application);
                        }
                        catch (Exception ex)
                        {
                            somethingToWrite.AppendLine(ex.Message);
                        }
                    }
                }

                File.WriteAllText(fileName, somethingToWrite.ToString());
            }
            catch (NotImplementedException)
            {
                Console.WriteLine("Arguments missing, -> BizTalkAssemblyScanner.exe servername filename");
            }
            catch (Exception)
            {
                File.WriteAllText(fileName, somethingToWrite.ToString());
            }
        }

        static void LoadAssemblyReferences(StringBuilder somethingToWrite, Assembly asmbase, Application application)
        {
            // get first level references
            AssemblyName[] referencedAssembly = asmbase.GetReferencedAssemblies();
            Console.WriteLine("References used by {0}:", asmbase.FullName);
            somethingToWrite.AppendLine(string.Format("References used by {0}:", asmbase.FullName));
            foreach (AssemblyName assemblyName in referencedAssembly)
            {
                string text = string.Format("{0} - BTS {1}", assemblyName.FullName, isbiztalkassemby(assemblyName.Name, application) ? "Yes" : "No");
                somethingToWrite.AppendLine(text);

                //Assembly affinity? Skip framework assemblies to keep the recursion manageable
                if (assemblyName.FullName.IndexOf("Microsoft.", 0) == 0 || assemblyName.FullName.IndexOf("System", 0) == 0 || assemblyName.Name == "mscorlib") continue;

                try
                {
                    Assembly asmreference = Assembly.Load(assemblyName.FullName);
                    AssemblyName[] referencedAssemblyRef = asmreference.GetReferencedAssemblies();

                    if (referencedAssemblyRef.Count() != 0)
                        LoadAssemblyReferences(somethingToWrite, asmreference, application);
                }
                catch (Exception ex)
                {
                    somethingToWrite.AppendLine(ex.Message);
                }
            }
        }

        static bool isbiztalkassemby(string assemblyname, Application application)
        {
            BtsAssembly btsasm = application.Assemblies[assemblyname];
            return btsasm != null;
        }
    }
}



I prefer to use a console application because it is easier to manage in a future scripting mechanism.

The console application command line is BizTalkAssemblyScanner.exe [Servername] [PathFileName]

and it produces an output as below:


You can download the .Net project HERE


How JiTGate engine will take advantage of the Azure of Things to provide a very simple way to integrate things

The BizTalk Summit in London is over, and after some critical weeks I’m finally able to be back on the battlefield.

What about the news? The first item is about the project I presented in London: its name is JiTGate (Just in Time Gate). What is the idea?
JiTGate is a light framework to integrate things in a very fast and easy way; I started the project six months ago and I’m still in the development phase.
After the BizTalk NoS Add-in I decided to do something to make it easier for me to integrate things.
Like many of you, I have used a lot of different technologies to solve my integration problems: FILE, ASMX, WCF, BizTalk, .NET, Java, SSIS, Microsoft Azure, Google Cloud, Amazon AWS, RabbitMQ, Tibco and more.
The evolution of integration is quite curious; we have passed through a lot of different patterns.

I remember, years ago we used files and then ASP, just sending an HTTP POST. Who remembers Transaction Server? If you do, it means you are a quite old integrator, but I think you also have a very deep vision of integration.
After ASMX we had the adapter (BizTalk Server), and later we extended the vision of contracts and metadata with WCF; with REST we started to understand that using Convention over Configuration we were able to drive our services, and later the large number of methods drove us to create representations such as Swagger and the Web API pattern.

I think that following my personal evolution we can understand a lot of things: 20 years ago I was quite more handsome, happy, blond and blue-eyed; with BizTalk Server I started a metamorphosis and people started to call me the BizTalk Grinch.

Now the situation is getting more complex than before; in the last three years I have seen so many new things that I’m not able to decide which is better than another.

Now you also know the meaning of my blog picture.

A lot of new terms and visions: MicroServices, On-Premise, Hexagonal, Polygonal, and who remembers Quadrant? But what is really curious is one thing: do you know which integration pattern is most commonly used by customers?


Yes, the file. I started thinking about why companies are still using this mechanism to integrate systems, and I think it is because the file is the most known and simple thing.
It is also simple to manage, fast to use, and polymorphic: you can adapt a file to contain everything you want and, of course, it is serializable, and more…


I’m not telling you that the file is the best way to integrate but, honestly, this is what I would like to see in an integration framework: a framework simple to use, fast to manage and, first of all, fully extensible.
Well, during the last MVP Summit I realized that Microsoft Azure now contains a lot of new things to use, and many of them are able to work together in a very easy way.
I started to imagine many different combinations for using them, and I started to change my vision of Azure from a flat one, as below.


To a more complex vision, as below.


This is because of the Azure of Things; I like to say that the power of Azure is proportional to the number of Azure things we are able to use together.

JiTGate is an engine able to use all of these things together and provide a very simple way to integrate things.

BizTalk London Summit 2015 Nino

The engine is quite complex in favour of the usability, which is very simple; I showed some demos during the BizTalk Summit and the feedback was really good.
JiTGate is based on a complex event propagation pattern which uses Triggers and Events, and internally it uses a complex mechanism to manage the relationships between all of them.

It is multi-transport-protocol and open to all the publishing scenarios; it uses technologies such as Event Hubs, and for this reason it is able to scale to a very high level.


An automatic synchronization mechanism provides a very easy deployment experience; the configuration uses JSON files, so it is very simple to extend for other tools or existing UIs such as Visual Studio or Windows Explorer.


Creating a trigger or event is really fast, and I can do it using different languages such as C# or PowerShell. I liked the idea of providing something good for the admin: creating a trigger using PowerShell is very simple, without any particularly strong development experience and without compiling code, and I think this is a good thing for administrators. I created a trigger which is able to get events from the event viewer in just ten minutes (thanks to my dear friend Sandro Pereira for the PowerShell script).


Internally it uses a rule engine based on Roslyn; this means that I’m able to create my rules directly using .NET, which is quite cool for me, and the log system is totally dynamic: it can be extended using Stream Analytics or any other log system.



The usability is simple: download, unpack and run. The versions will be an NT Service, an executable, a Worker Role and a DLL package; I’m also thinking of providing a package for Docker, so I will be able to use JiTGate on all operating systems, Unix, Linux and more…


The provisioning is totally automatic; a trigger or event can be created using C# or PowerShell, I just need to copy it into a folder and, if I mark it as shared, a synchronization engine will provide the same event to all the JiTPoints in the same group. The configuration is totally JSON based and I can use any tool I want to manage it; I also want to create the VSX package extension, but I prefer to think about a VSX library package extension, so developers will be able to extend it as they want.


Logically, the trigger activation is multi-pattern: I’m able to fire a trigger using a REST call, and I’m creating a Convention over Configuration to drive the triggers and events by REST. I tried to do that with BizTalk Server years ago, but in this case the code is mine and I can do everything I want in detail.
I can start a trigger using any scheduler system I want, or as a single instance to create, for example, a REST or HTTP point, or in a polling way like the BizTalk File adapter or SQL Server adapter; I was also happy to provide the event handling pattern, which for me is very useful when, for example, receiving notifications from external stream systems such as the event viewer, RFID and devices.
JiTGate is able to execute events everywhere, and in cascade too.


I’m completing some of the most important features and I would like to present some new interesting things during the Belgian BizTalk User Group Integration Day 2015; I’m sure I will have the opportunity to discuss with a lot of integration experts and to gather a lot of useful feedback.

I’m also happy to receive any technical contribution or advice.

{AoT} Azure Of Things – Queues and Topics

I spent the last six months studying and developing, and I like to say that we now have a lot of things in Azure which we can use to integrate a lot of things on the internet.
The world of integration is very complex, a universe of things; at first my problem was not how to create this or that in Azure, which is quite simple, but how to combine and use all of these things together.
For Ap as Azure power and At as Azure things, I can say that the power of Azure increases with the number of Azure things combined together.


The power of Azure is directly proportional to the summation of the things we combine together.

I like to think about Azure as a little universe full of things, where the collision between them is able to create energetic solutions.


I remember, just two years ago, how complicated it was to use stuff like queues, topics or blobs in Azure; now it is very fast and simple, and there are a lot more things: Event Hubs, Stream Analytics, a machine able to learn, powerful features for business intelligence, an API to write powerful services and more.
The latest Microsoft release is API Apps; the BizTalk Summit in London is a good opportunity to understand more about it, and I’m sure that many guys will start to write about it, me too.
In this last period I liked to play with all the possible things around, and I was really surprised by how simple they are to use.

Creating an Azure queue: no sooner said than done. A topic with multiple subscriptions: no sooner said than done. A blob or table storage for any kind of purpose: no sooner said than done, no more than ten minutes of coding.
For example, here is the code to manage an Azure queue; I added comments inside the code in case you need to reuse it.

private void buttonQueue_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED ACCESS KEY]";
    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the queue if it does not exist yet
    if (!namespaceManager.QueueExists("TestQueue"))
        namespaceManager.CreateQueue("TestQueue");

    QueueClient Client =
        QueueClient.CreateFromConnectionString(connectionString, "TestQueue");

    //If you want to use properties
    //message.Properties["TestProperty"] = "TestValue";
    //message.Properties["Message number"] = i;
    byte[] b = Encoding.UTF8.GetBytes("Text to Send");
    Client.Send(new BrokeredMessage(b));

    //Callback lambda approach, faster and easy
    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    // Callback to handle received messages
    Client.OnMessage((message) =>
    {
        try
        {
            // Process message from queue, here to change the type for custom class
            string bodymessage = message.GetBody<string>();
            string propertymessage = message.Properties["TestProperty"].ToString();

            // Remove message from queue
            message.Complete();
        }
        catch (Exception)
        {
            // Indicates a problem, unlock message in queue
            message.Abandon();
        }
    }, options);
}

What I like most about this code is the improvements Microsoft is making to its code patterns: using a lambda approach is the fastest way to manage callbacks and also the most readable way to handle these kinds of situations; I absolutely love it.

I tried to do the same with Amazon SQS (Simple Queue Service); in some aspects the approach is similar on the queue creation side but different on the receiving side.


Personally I prefer the Microsoft approach; I’m sure there is a way to use the same pattern with Amazon, but what I mean is that the base pattern proposed by the Microsoft framework is faster and simpler.

The Amazon receiving approach is closer to a “flat direct” pattern; the Microsoft approach is closer to an “event propagation” pattern, which is faster to use and also optimized for a high-threading approach.


The pattern to create a Microsoft topic is similar; this is a nice thing because the developer uses the same pattern approach across all the stacks.
Below you can read the complete sample with some useful comments.

private void buttonTopic_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED KEY]";

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the topic if it does not exist yet
    if (!namespaceManager.TopicExists("TestTopic"))
        namespaceManager.CreateTopic("TestTopic");

    TopicClient Client =
        TopicClient.CreateFromConnectionString(connectionString, "TestTopic");

    byte[] b = Encoding.UTF8.GetBytes("Message content");
    BrokeredMessage message = new BrokeredMessage(b);
    message.Properties["MessageNumber"] = 4;
    Client.Send(message);

    // Here to create a subscription
    //Create a filtered subscription
    SqlFilter highMessagesFilter =
        new SqlFilter("MessageNumber > 3");

    if (!namespaceManager.SubscriptionExists("TestTopic", "HighMessages"))
        namespaceManager.CreateSubscription("TestTopic", "HighMessages", highMessagesFilter);

    SubscriptionClient subscriptionClientHigh =
        SubscriptionClient.CreateFromConnectionString(connectionString, "TestTopic", "HighMessages");

    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    subscriptionClientHigh.OnMessage((brokerMessage) =>
    {
        try
        {
            // Process message from subscription
            string bodymessage = brokerMessage.GetBody<string>();
            string propertymessage = brokerMessage.Properties["MessageNumber"].ToString();

            // Remove message from subscription
            brokerMessage.Complete();
        }
        catch (Exception)
        {
            // Indicates a problem, unlock message in subscription
            brokerMessage.Abandon();
        }
    }, options);
}

What I really like is the natural and simple approach to creating stuff from scratch: it is fast and easy, and we can combine things like Topics, Queues, Event Hubs and more to quickly create very complex pub/sub scenarios.

Event Hubs are really simple to use; I’m going to put a personal laboratory on GitHub, I will share the content with you soon, and I also want to prepare a video about it.

I spent the last six months playing with all the Microsoft technologies; using all of them together we are really able to create amazing things.