Smart Cloud Integration with Azure and OpenSource


At the last London Summit I presented my new project, "An Azure of Things, a developer's perspective"; you can watch the video here.
Next week I will have the pleasure of speaking at WPC 2015. I want to thank Overnet for inviting me to this important event; WPC is the most important technical conference in Italy.

Everything I have started with Overnet has turned out well. For example, together with Overnet we organized the first BizTalk Innovation Event with the BizTalk Crew :). Here is a very good post from Sandro about it.

The title of my session is Smart Cloud Integration with Azure and OpenSource. It reflects my idea of smart integration: solving common, and sometimes complex, integration scenarios in a very simple and smart way.

I will also announce something I am very excited about: for the past two years, in my spare time on nights and weekends, I have been building a smart integration framework that I am now releasing as open source.

It was a big challenge: I wrote and rewrote the same code several times until I was able to distill my idea into a concrete, workable solution, and now I am very happy with the result.

The project will be hosted on GitHub to give the community the opportunity to collaborate and extend it.

See you at WPC 2015 and on GitHub next week :)

BizTalk Convention Over Configuration processing by REST

In 2012 I was looking for something really dynamic to drive the processes inside BizTalk Server. There are different options: one is to use the BRE from pipeline components and orchestrations and drive the processing with SQL Server tables and the message context, as the Microsoft ESB Toolkit does; another is to create pipeline components and orchestrations and use a database such as SQL Server, with a number of tables that instruct our code how to route the messages; and there are more options still.

What I was missing in BizTalk was the real feature of a real ESB: something able to adapt its processes using convention over configuration. When Microsoft provided the ability to expose REST endpoints from BizTalk, then BOOOM, here we go, and I started the project.

I wrote a post about it but I never published the code; now the code is available here.

The idea is simple: RESB makes it possible to drive BizTalk processes over HTTP REST using convention over configuration. Below are some of the slides I used during an event in Milan; all of them are included in the package.


The goal is to cover the most important and most commonly used communication patterns.


In the REST URI we specify which entities we want to invoke and in which order.


In this way we drive our process through HTTP REST.
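To make the convention concrete, here is a purely hypothetical example (not the actual RESB URI format, which is documented in the slides and in the package), where each segment names an entity to invoke and the segment order defines the processing order:

http://biztalkhost/RESB/Process.svc/ValidateOrder/EnrichOrder/RouteToErp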


The architecture uses the BRE together with several pipeline components and orchestrations.


The code uses different techniques to provide maximum flexibility and extensibility; below are some of the samples used inside the project.


Many people asked me to publish this code. The solution is a DRAFT of this idea, and it could be a good starter kit to create something similar.

You can find more information in my previous blog post, and you can download the code here.

C# Tip: Clone a .NET object

Cloning an object is a quite common operation in integration scenarios: integrating often means moving value objects, transforming data between references and so on, and it is very often useful to be able to duplicate our objects and "data containers".

One common approach is to use the MemberwiseClone method: it creates a shallow copy of the object and we can use the clone as a "copy", but it does not work in every scenario. Below are the exact limits, as described in the documentation:

The MemberwiseClone method creates a shallow copy by creating a new object, and then copying the nonstatic fields of the current object to the new object. If a field is a value type, a bit-by-bit copy of the field is performed. If a field is a reference type, the reference is copied but the referred object is not; therefore, the original object and its clone refer to the same object.

Essentially, all the object references still point to the same instances, so it becomes complicated to keep separate values across the different clones.
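A minimal sketch of the issue (the Customer and Address types here are invented purely for illustration):

public class Address { public string City; }

public class Customer
{
    public string Name;
    public Address Address = new Address();

    public Customer ShallowCopy()
    {
        // Copies the Name value, but only the Address reference
        return (Customer)MemberwiseClone();
    }
}

// Usage:
// var copy = customer.ShallowCopy();
// copy.Address.City = "Rome";   // customer.Address.City is now "Rome" too: the Address instance is shared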

Another option is to implement the ICloneable interface together with MemberwiseClone. This is a very good option to control the cloning in depth and to be sure we create a real new clone; obviously we need to write all the code that replicates our object in our Clone() method and use MemberwiseClone where we need it.
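A minimal sketch of that approach, reusing the hypothetical Customer and Address types above:

using System;

public class Customer : ICloneable
{
    public string Name;
    public Address Address = new Address();

    public object Clone()
    {
        // Start from the shallow copy, then replace every reference-type field by hand
        var clone = (Customer)MemberwiseClone();
        clone.Address = new Address { City = this.Address.City };
        return clone;
    }
}

// Usage:
// var deepCopy = (Customer)customer.Clone();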

An interesting tip I like to use is serializing the object. We can use any serialization pattern; the important thing is just to serialize the object. For example, below are two methods that serialize an object to a byte array and deserialize it back from a byte array.

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class SerializationEngine
{
    // Serializes any [Serializable] object into a byte array
    public static byte[] ObjectToByteArray(object objectData)
    {
        if (objectData == null)
            return null;

        var binaryFormatter = new BinaryFormatter();
        var memoryStream = new MemoryStream();
        binaryFormatter.Serialize(memoryStream, objectData);
        return memoryStream.ToArray();
    }

    // Deserializes a byte array back into an object
    public static object ByteArrayToObject(byte[] arrayBytes)
    {
        if (arrayBytes == null)
            return null;

        var memoryStream = new MemoryStream();
        var binaryFormatter = new BinaryFormatter();
        memoryStream.Write(arrayBytes, 0, arrayBytes.Length);
        memoryStream.Seek(0, SeekOrigin.Begin);
        return binaryFormatter.Deserialize(memoryStream);
    }
}
Then we clone the object simply by serializing and deserializing it:

byte[] byteobj = SerializationEngine.ObjectToByteArray(objecttoclone);
var newobjectclone = SerializationEngine.ByteArrayToObject(byteobj);

It works perfectly and is very simple to implement; logically, the object and its object graph need to be serializable, typically by marking the types with the [Serializable] attribute.
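A small generic wrapper on top of the two methods above (just a convenience sketch, not part of the original code) gives back a typed clone directly:

public static T DeepClone<T>(T source)
{
    // Serialize and deserialize: every [Serializable] object in the graph is duplicated
    byte[] bytes = SerializationEngine.ObjectToByteArray(source);
    return (T)SerializationEngine.ByteArrayToObject(bytes);
}

// Usage:
// var clonedOrder = DeepClone(order);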

BizTalk NOS Ultimate – BizTalk Assessment Power Up!

Lately I have been doing many different BizTalk assessments and health checks. These kinds of engagements are really complicated, especially if you need to review the source code, and BizTalk NOS supports me a lot. The best thing I can do is show you how I use BizTalk NOS in different situations; in this post I will show you one real case from an orchestration assessment.

I found a solution with many projects inside; obviously, the project and solution names are obfuscated.


The code is well written, but the number of projects inside the single solution is huge, and this creates several problems.

The first big problem is navigation: I cannot manually explore the solution just by browsing each single project, and the wild search feature helps me a lot to find what I need.
I want to understand where the solution implements the Oracle integration and which error messaging patterns it uses, so first of all I execute a wild search using the word "oracle.", ignoring the case.


I got one orchestration. First of all I like to execute the Refactor feature, which provides me with a lot of information, starting with size and complexity; I can also execute the same feature globally from the solution file and receive the same report but with a global result.


BizTalk NOS produces a very detailed report with a lot of useful information; at the top I can check the most important indicators:


Average complexity: an algorithm that calculates the complexity using different parameters such as branches, code used, loops and so on. A value above 1000 means the orchestration is hard to understand and contains too many artefacts; I will discuss that with the customer.

File Size: the size in KB that this orchestration contributes to the assembly. It is quite high… but the AGC is what I am looking for.

AGC, Average Global Cost: this algorithm calculates the global cost of the orchestration, also considering the external dependencies used, such as maps, schemas, pipelines, other orchestrations and so on. This is what I am really interested in: the result here is 11643 KB, which means this orchestration could potentially consume more than 11 megabytes of memory. That could be very expensive if the orchestration is called in multiple instances, so I will check this issue with the customer.

Number of warnings: this algorithm sums all the warnings found in the orchestration. A warning is something that does not follow a simple, basic best practice, for example a persistence point in a loop caused by a send port, extensive use of code inside a shape, a default name such as "Schema1" or "Map2" used inside it, and more. Over 50 warnings means it is better to have a look inside the orchestration; you will surely find several points for discussion.

For example, in this orchestration I found many dangerous persistence points created by send ports inside loops; I can see them in the report by looking for the database bomb icon. I will also check this issue with the customer.


To understand more I also use the dynamic navigation panel: when you open the orchestration, you can see a navigation panel on the right.


Using this panel I am able to navigate very quickly inside the orchestration and see all the messages, maps and dependencies used. But I would love to show you what I am able to find and understand during a BizTalk solution assessment using BizTalk NOS.
I am interested in the ExceptionNotificationMessage message: I think it is used inside the orchestration to manage the messaging error handling, and I want to understand whether the error messaging pattern is global across the solution or used only inside this orchestration.

To understand that I open the schema with a double click from the dynamic navigation panel. Now I need to locate the schema inside the solution; to do that, I right-click at the top of the schema toolbox panel and simply select the Locate It menu item.


Now I am able to use one of my favourite features, Troubleshooting – External Dependencies: right-click on the schema file and select Troubleshooting and then External Dependencies.


The result is that most of the orchestrations use this schema, which means the error messaging handling should be centralized. To be sure, I execute a wild search using the schema name and the result confirms it; I can also see where the schema is used and where it is constructed. Now I can check the library code related to the error handling.


As you can see, BizTalk NOS Ultimate is a very powerful tool. I use it in many different situations: assessment, troubleshooting, development, refactoring and more. There are a lot of features inside, and I will write more about it. I love it, it ROCKS!

You can download a free trial version here.
More info @

Logic Apps What’s New

I am following the Logic Apps live broadcast video; below is a quick recap of the most important points.

1) The possibility to write JavaScript and C# script code inside a Logic App, essentially the same concept as the BizTalk Scripting functoid.


2) The possibility to download these API Apps from a new GitHub repository.

This is nice if you want to understand how to create or extend these API Apps; the purpose of this GitHub repository is to provide an open source codebase where the community can contribute.

3) What's new

new features

4) What’s in progress


For more details…

The full video is here.

The Microsoft team blog post is here.

Fast and easy deployment with WiX 3.10 for Visual Studio 2015

WiX is a great toolset that provides everything we need to create a great deployment. I am preparing the setup for jitGate, now in private beta test, and I am using WiX 3.10: this version supports Visual Studio 2015, the installation is very simple and the 3.10 build is available here.

Essentially, WiX is a toolset built completely on top of Windows Installer and entirely based on XML scripting, hence the name Windows Installer XML Toolset: WiX Toolset.

WiX is free and open source, and the framework covers the large number of features and options offered by Microsoft Windows Installer. It also provides many tools that make it easy to create our deployment database, and it already offers a large number of ready-made setup dialog forms that we can also customize.

WiX can also offer great WPF setup interfaces; the WiX setup itself uses a WPF interface.


WiX is very easy to extend because it is completely based on XML scripting, and a setup project is essentially formed by four big areas, as in the sketch after the list below.


1) Product: contains all the information about the product and the most important settings about the setup behaviour, such as upgrading, compression options, deployment restrictions and more.

2) Feature: manages the different deployment feature options, for example minimal, typical and complete installation.

3) Directory: manages the source and destination directories of the deployment, and it is very intuitive to use.

4) ComponentGroup: manages the deployed files; essentially 1 Component = 1 File.
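A minimal, illustrative .wxs skeleton showing how the four areas fit together (product name, GUIDs and file names are placeholders, not the actual jitGate setup):

<?xml version="1.0" encoding="UTF-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <!-- 1) Product: identity and setup behaviour -->
  <Product Id="*" Name="MyProduct" Language="1033" Version="1.0.0.0"
           Manufacturer="MyCompany" UpgradeCode="PUT-GUID-HERE">
    <Package InstallerVersion="200" Compressed="yes" InstallScope="perMachine" />
    <MajorUpgrade DowngradeErrorMessage="A newer version is already installed." />
    <MediaTemplate EmbedCab="yes" />

    <!-- 2) Feature: what the user can choose to install -->
    <Feature Id="ProductFeature" Title="MyProduct" Level="1">
      <ComponentGroupRef Id="ProductComponents" />
    </Feature>
  </Product>

  <!-- 3) Directory: the destination folders -->
  <Fragment>
    <Directory Id="TARGETDIR" Name="SourceDir">
      <Directory Id="ProgramFilesFolder">
        <Directory Id="INSTALLFOLDER" Name="MyProduct" />
      </Directory>
    </Directory>
  </Fragment>

  <!-- 4) ComponentGroup: the deployed files, one component per file -->
  <Fragment>
    <ComponentGroup Id="ProductComponents" Directory="INSTALLFOLDER">
      <Component Id="MainExecutable" Guid="*">
        <File Id="MainExe" Source="MyProduct.exe" KeyPath="yes" />
      </Component>
    </ComponentGroup>
  </Fragment>
</Wix>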

Another great thing is the WiX Bootstrapper: we can use it to install our prerequisites before our installation. For example, a few simple lines in the bundle chain are enough to install the .NET Framework 4.5, roughly as in the sketch below.
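An illustrative bundle fragment, assuming the WixNetFxExtension is referenced (names and GUIDs are placeholders):

<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Bundle Name="MyProduct Setup" Version="1.0.0.0" UpgradeCode="PUT-GUID-HERE">
    <BootstrapperApplicationRef Id="WixStandardBootstrapperApplication.RtfLicense" />
    <Chain>
      <!-- Installs the .NET Framework 4.5 from the web if it is missing -->
      <PackageGroupRef Id="NetFx45Web" />
      <MsiPackage SourceFile="MyProductSetup.msi" />
    </Chain>
  </Bundle>
</Wix>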


WiX covers all the deployment option types, and we can also extend the Windows Installer behaviour using a Custom Action project, which is really powerful.


There are many resources and courses available on the internet.

I definitely recommend the WiX Toolset for creating deployment projects.



Event Hubs API App – fast and easy to do

Last evening I was working on a Logic App and I needed to send some messages to Event Hubs.
We have two different options: we can use the HTTP Connector API App, or we can create an API App able to do that. This is a good opportunity to understand the development productivity of API Apps.
I would like to use a very simple approach, and we can extend this sample as we want, with dynamic configuration, extra features and so on; I just want to demonstrate how simple it is to do in a few simple steps.

1) Install the Windows Azure SDK for .NET 2.5.

2) Create a new Visual Studio project and select Cloud, ASP.NET Web Application.

3) Select Azure API App; all the required libraries will be added automatically.

4) Select Manage NuGet Packages.

5) Search for EventHub and select the EventProcessorHost package.

6) [OPTIONAL] Rename the ValuesController.cs class to EventHubController.cs.

Below is the simple code we have to use to send an event message to Event Hubs; copy and paste this code into the class file.

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System.Text;
using System.Web.Http;

namespace EventHubConnector.Controllers
{
    public class EventHubController : ApiController
    {
        // GET api/EventHub?message=...
        public string Get(string message)
        {
            // Build the Event Hubs connection string, forcing the AMQP transport
            string ehConnectionString = "Endpoint=sb://[EVENTHUB CONNECTION STRING]";
            ServiceBusConnectionStringBuilder builder = new ServiceBusConnectionStringBuilder(ehConnectionString)
            {
                TransportType = Microsoft.ServiceBus.Messaging.TransportType.Amqp
            };

            // Create the Event Hub client (option 1: directly from the connection string)
            string eventHubName = "[EVENTHUBNAME]";
            EventHubClient eventHubClient = EventHubClient.CreateFromConnectionString(builder.ToString(), eventHubName);

            // Send the incoming message as an event
            EventData data = new EventData(Encoding.UTF8.GetBytes(message));
            eventHubClient.Send(data);

            return "Message sent to " + eventHubName;
        }
    }
}
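With the default Web API routing, once the project is running a simple GET request is enough to push an event; for example (host and message purely illustrative):

http://localhost:[YOURPORT]/api/EventHub?message=hello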

Now we enable the Swagger features. I would like to spend some time here, because some people have asked me for more information about the Swagger side.
We have two different options to manage the Swagger contract. The first is dynamic: open SwaggerConfig.cs and uncomment the usual .EnableSwaggerUi lines, roughly as in the sketch below.
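For reference, the relevant part of SwaggerConfig.cs, once uncommented, looks roughly like this (a simplified sketch; the generated file contains many more commented options, and the title here is just an assumption based on the project name):

using System.Web.Http;
using Swashbuckle.Application;

public class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c => c.SingleApiVersion("v1", "EventHubConnector"))
            .EnableSwaggerUi();
    }
}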
The second is static; this is useful if we want to drive our Swagger generation ourselves, and it is also quite simple to do.
Uncomment the usual EnableSwaggerUi lines, then press F5 and run the project in debug mode.

Navigate to http://localhost:[YOURPORT]/swagger/docs/v1 to get the raw JSON version of the API definition.

Open the JSON in Visual Studio, create a file named apiDefinition.swagger.json in the Metadata folder and save the content inside it.

To enable the static option we only have to open the apiapp.json file and delete the value of the endpoints node, as below.


Very easy and fast.
We are ready to publish: right-click on the project file and select Publish.
Select Microsoft Azure API Apps.


Insert the name of your API App and set all the usual subscription properties, such as the service plan and resource group; we can also create them from here.


Our API App is now deployed and ready to use.


This API App is able to send a message to Azure Event Hubs: very simple and fast to do. We can now extend this API App to also receive messages from Event Hubs and add other interesting features.