Quick way to get all assemblies referenced in a BizTalk Server Solution

BizTalk Server solutions use a lot of assemblies, and many of them reference each other. I often need to understand how an assembly is used and how many other assemblies it references, typically to prepare a deployment or to take over an unknown environment.
There are different ways to get this information. One of them is querying the BizTalk management database, but it doesn't contain everything we need, normally only the first level of the dependency chain.
I think the best and quickest option is using the BizTalk Explorer Object Model, so I wrote a simple console application which creates an output file containing all the information I need.

The approach is quite simple and uses reflection to walk the assembly dependencies; here is the code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.BizTalk.ExplorerOM;
using System.IO;
using System.Reflection;

namespace BizTalkAssemblyScanner
{
    class Program
    {
        static void Main(string[] args)
        {
            StringBuilder somethingToWrite = new StringBuilder();
            string fileName = "";
            try
            {
                if (args.Length != 2)
                    throw new NotImplementedException();

                Console.BackgroundColor = ConsoleColor.Black;
                Console.ForegroundColor = ConsoleColor.White;
                string servername = args[0];
                string dbname = "BizTalkMgmtDb";

                fileName = args[1];

                Console.WriteLine("Loading the BizTalk configuration");

                BtsCatalogExplorer btsCatalog = new BtsCatalogExplorer();
                btsCatalog.ConnectionString = "server=" + servername + ";database=" + dbname + ";Integrated Security=SSPI";

                //TODO: add an exclusion filter for common assemblies, e.g. Microsoft., System., mscorlib
                string starsep = "***********************************************************************";

                foreach (Application application in btsCatalog.Applications)
                {
                    Console.WriteLine("Looking into the application: " + application.Name);

                    somethingToWrite.AppendLine("");
                    somethingToWrite.AppendLine(starsep);
                    somethingToWrite.AppendLine(string.Format("Assemblies in the application: {0}", application.Name));
                    somethingToWrite.AppendLine(starsep);
                    somethingToWrite.AppendLine("");

                    foreach (BtsAssembly assembly in btsCatalog.Assemblies)
                    {
                        try
                        {
                            Assembly asmbase = Assembly.Load(assembly.DisplayName);
                            string text = string.Format("{0} - {1} - BTS: {2}",
                                assembly.DisplayName,
                                asmbase == null ? "Not loaded..." : "Loaded...",
                                IsBizTalkAssembly(assembly.Name, application) ? "Yes" : "No");
                            Console.WriteLine(text);
                            somethingToWrite.AppendLine(text);
                            if (asmbase != null)
                                LoadAssemblyReferences(somethingToWrite, asmbase, application);
                        }
                        catch (Exception ex)
                        {
                            somethingToWrite.AppendLine(ex.Message);
                        }
                    }
                }
            }
            catch (NotImplementedException)
            {
                Console.WriteLine("Arguments missing -> BizTalkAssemblyScanner.exe servername filename");
            }
            catch (Exception ex)
            {
                somethingToWrite.AppendLine(ex.Message);
            }
            finally
            {
                File.WriteAllText(fileName, somethingToWrite.ToString());
                Console.WriteLine("Done!");
                Console.ReadLine();
            }
        }

        static void LoadAssemblyReferences(StringBuilder somethingToWrite, Assembly asmbase, Application application)
        {
            // Get the first-level references
            AssemblyName[] referencedAssemblies = asmbase.GetReferencedAssemblies();
            Console.WriteLine("References used by {0}:", asmbase.FullName);
            somethingToWrite.AppendLine("");
            somethingToWrite.AppendLine(string.Format("References used by {0}:", asmbase.FullName));
            foreach (AssemblyName assemblyName in referencedAssemblies)
            {
                try
                {
                    string text = string.Format("{0} - BTS: {1}",
                        assemblyName.FullName,
                        IsBizTalkAssembly(assemblyName.Name, application) ? "Yes" : "No");
                    Console.WriteLine(text);
                    somethingToWrite.AppendLine(text);

                    // Walk down the reference chain
                    Assembly asmreference = Assembly.Load(assemblyName.FullName);
                    if (asmreference.GetReferencedAssemblies().Length != 0)
                        LoadAssemblyReferences(somethingToWrite, asmreference, application);
                }
                catch (Exception ex)
                {
                    somethingToWrite.AppendLine(ex.Message);
                }
            }
        }

        static bool IsBizTalkAssembly(string assemblyName, Application application)
        {
            BtsAssembly btsasm = application.Assemblies[assemblyName];
            return btsasm != null;
        }
    }
}
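One caveat with a recursive reference walk like this: it can revisit the same assemblies many times and waste a lot of work. Here is a small sketch of a visited-set guard, written as a standalone program that walks the running process's own assembly (the class and method names are mine, not part of the original tool):

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

class ReferenceWalker
{
    // Full names we have already expanded, so each assembly is walked only once
    static readonly HashSet<string> visited = new HashSet<string>();

    static void Walk(Assembly asm, int depth)
    {
        foreach (AssemblyName name in asm.GetReferencedAssemblies())
        {
            Console.WriteLine(new string(' ', depth * 2) + name.FullName);
            if (!visited.Add(name.FullName))
                continue; // already expanded: skip to avoid duplicates and cycles
            try
            {
                Walk(Assembly.Load(name.FullName), depth + 1);
            }
            catch (Exception ex)
            {
                // Some references (native or unresolved ones) cannot be loaded
                Console.WriteLine(new string(' ', depth * 2) + "  (not loadable: " + ex.Message + ")");
            }
        }
    }

    static void Main()
    {
        Walk(Assembly.GetExecutingAssembly(), 0);
        Console.WriteLine("Distinct assemblies expanded: " + visited.Count);
    }
}
```

The same guard can be dropped into LoadAssemblyReferences by passing the HashSet along with the StringBuilder.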

I prefer a console application because it is easier to plug into future scripting mechanisms.

The command line is: BizTalkAssemblyScanner.exe [ServerName] [OutputFilePath]

and it produces a text report listing, for each BizTalk application, the deployed assemblies and the references each one uses.

You can download the .Net project HERE

 

How JiTGate engine will take advantage of the Azure of Things to provide a very simple way to integrate things

The BizTalk Summit in London is over and, after some busy weeks, I'm finally back on the battlefield.

The first piece of news is about the project I presented in London: its name is JiTGate (Just in Time Gate). What is the idea?
JiTGate is a light framework to integrate things in a very fast and easy way. I started the project six months ago and I'm still in the development phase.
After the BizTalk NoS Add-in, I decided to do something to make it easier for me to integrate things.
Like many of you, I have used a lot of different technologies to solve my integration problems: FILE, ASMX, WCF, BizTalk, .Net, Java, SSIS, Microsoft Azure, Google Cloud, Amazon AWS, RabbitMQ, Tibco and more.
The evolution of integration is quite curious; we have passed through a lot of different patterns.

I remember, years ago, we used files and then ASP, just sending HTTP POSTs. Who remembers Transaction Server? If you do, it means you are quite an old integrator :) but, I think, you also have a very deep vision of integration :).
After ASMX we had adapters (BizTalk Server) and later we extended the vision of contracts and metadata with WCF. With REST we started to understand that using Convention over Configuration we were able to drive our services; later, the large number of methods drove us to create representations such as Swagger and the Web API pattern.

I think that following my personal evolution we can understand a lot of things: 20 years ago I was quite a bit more handsome, happy, blond and blue-eyed; with BizTalk Server I started a metamorphosis and people started to call me the BizTalk Grinch.

Now the situation is getting more complex than before; in the last three years I have seen so many new things that I'm not able to decide which one is better than another.

Now you also know the meaning of my blog picture :)

A lot of new terms and visions: microservices, on-premise, hexagonal, polygonal, and who remembers Quadrant? But what is really curious is one thing: do you know which integration pattern is most commonly used by customers?


Yes, the file, and I started thinking about why companies still use this mechanism to integrate systems. I think it is because the file is the most well-known and simple thing.
It is also simple to manage, fast to use and polymorphic: you can adapt a file to contain everything you want and, of course, it is serializable, and more…


I'm not telling you that the file is the best way to integrate but, honestly, this is what I would like to see in an integration framework: something simple to use, fast to manage and, above all, fully extensible.
Well, during the last MVP Summit I realized that Microsoft Azure now contains a lot of new things and many of them are able to work together in a very easy way.
I started to imagine many different combinations for using them, and my vision of Azure changed from a flat list of services to a more complex, interconnected one.

This is because of the Azure of Things: I like to say that the power of Azure is proportional to the number of Azure things we are able to use together.

JiTGate is an engine that is able to use all of these things together and provides a very simple way to integrate things.

BizTalk London Summit 2015 Nino

The engine is quite complex under the hood, in favour of a usability that is very simple. I showed some demos during the BizTalk Summit and the feedback was really good.
JiTGate is based on a complex Event Propagation pattern which uses Triggers and Events, and internally it uses a complex mechanism to manage the relationships between all of them.

It is multi-transport-protocol and open to all publishing scenarios; it uses technologies such as Event Hubs, and for this reason it is able to scale to very high levels.


An automatic synchronization mechanism provides a very easy deployment experience. The configuration uses JSON files, so it is very simple to extend from other tools or existing UIs such as Visual Studio or Windows Explorer.
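JiTGate is not public, so purely as a hypothetical sketch (every property name below is invented for illustration), a JSON trigger configuration along these lines would be easy to edit from any tool:

```json
{
  "TriggerName": "EventViewerTrigger",
  "Language": "PowerShell",
  "Shared": true,
  "Group": "ProductionJiTPoints",
  "Transport": "EventHub",
  "PollingIntervalSeconds": 10
}
```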


Creating a trigger or an event is really fast and I can do it using different languages such as C# or PowerShell. I liked the idea of providing something good for the admin: creating a trigger using PowerShell is very simple, without any particularly strong development experience and without compiling code, which I think is a good thing for administrators. I created a trigger able to get events from the Event Viewer in just ten minutes (thanks to my dear friend Sandro Pereira for the PowerShell script :D ).


Internally it uses a rule engine based on Roslyn, which means I'm able to create my rules directly in .Net, and this is quite cool for me. The log system is totally dynamic and can be extended using Stream Analytics or any other log system.


 

The usability is simple: download, unpack and run. It will be available as an NT service, an executable, a Worker Role and a DLL package. I'm also thinking of providing a Docker package, so I will be able to use JiTGate on all operating systems: Unix, Linux and more…

BizTalk London Summit 2015 Nino

The provisioning is totally automatic: a trigger or event can be created using C# or PowerShell, and I just need to copy it into a folder. If I mark it as shared, a synchronization engine will distribute the same event to all the JiTPoints in the same group. The configuration is totally JSON based and I can use any tool I want to manage it. I also want to create a VSX package extension, but I prefer to think of a VSX library package extension, so developers will be able to extend it as they want.


Logically, trigger activation is multi-pattern: I'm able to fire a trigger using a REST call, and I'm creating a Convention over Configuration approach to drive the triggers and events over REST. I tried to do that with BizTalk Server years ago, but in this case the code is mine and I can control every detail.
I can start a trigger using any scheduler I want, or as a single instance to create, for example, a REST or HTTP endpoint; in polling mode, like the BizTalk File or SQL Server adapters; and I was also happy to provide the event-handling pattern, which for me is very useful when, for example, receiving notifications from external stream systems such as the Event Viewer, RFID readers and devices.
JiTGate is able to execute events everywhere, and in cascade too.


I'm completing some of the most important features and I would like to present some new interesting things during the Belgian BizTalk User Group Integration Day 2015. I'm sure I will have the opportunity to discuss with a lot of integration experts and to gather a lot of useful feedback.

I'm also happy to receive any technical contribution or advice :).

{AoT} Azure Of Things – Queues and Topics

I spent the last six months studying and developing and, as I like to say, we now have a lot of things in Azure which we can use to integrate a lot of things on the Internet.
The world of integration is very complex, a universe of things. At first my problem was not how to create this or that in Azure, which is quite simple, but how to combine and use all of these things together.
With Ap as Azure power and At as Azure things, I can say that the power of Azure increases with the number of Azure things combined together:

Ap ∝ Σ At

The power of Azure is directly proportional to the summation of the things we combine together.

I like to think of Azure as a little universe full of things, where the collisions between them create energetic solutions.


I remember, just two years ago, how complicated it was to use things like queues, topics or blobs in Azure; now it is very fast and simple. There are also a lot more things now, such as Event Hubs, Stream Analytics, Machine Learning, powerful features for business intelligence, APIs to write powerful services and more.
The latest Microsoft release is API Apps; the BizTalk Summit in London is a good opportunity to understand more about it and I'm sure many guys will start to write about it, me too.
In this last period I enjoyed playing with all the possible things around, and I was really surprised by how simple they are to use.

Creating an Azure queue? No sooner said than done. A topic with multiple subscriptions? No sooner said than done. A blob or table storage for any kind of purpose? No sooner said than done, no more than ten minutes of coding.
For example, here is the code to manage an Azure queue; I added comments inside the code in case you need to reuse it.

private void buttonQueue_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED ACCESS KEY]";
    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the queue if it does not exist yet
    if (!namespaceManager.QueueExists("TestQueue"))
    {
        namespaceManager.CreateQueue("TestQueue");
    }
    QueueClient Client =
        QueueClient.CreateFromConnectionString(connectionString, "TestQueue");

    // Sending ***********************************
    BrokeredMessage message = new BrokeredMessage("Text to Send");
    // Context properties travel with the message
    message.Properties["TestProperty"] = "TestValue";
    Client.Send(message);

    // Receiving *********************************
    // Callback lambda approach, fast and easy
    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    // Callback to handle received messages
    Client.OnMessage((receivedMessage) =>
    {
        try
        {
            // Process the message from the queue; change the type here for a custom class
            string bodymessage = receivedMessage.GetBody<string>();

            string propertymessage = receivedMessage.Properties["TestProperty"].ToString();
            // Remove the message from the queue
            receivedMessage.Complete();
            MessageBox.Show(bodymessage);
        }
        catch (Exception)
        {
            // Indicates a problem: unlock the message in the queue
            receivedMessage.Abandon();
        }
    }, options);
}

What I like most about this code is the improvement Microsoft is making in the coding patterns: using a lambda is the fastest way to manage the callbacks and also the most readable way to handle these kinds of situations. I absolutely love it.

I tried to do the same with Amazon SQS (Simple Queue Service); in some respects the approach is similar for queue creation but different on the receiving side.


Personally I prefer the Microsoft approach. I'm sure there is a way to use the same pattern with Amazon; what I mean is that the base pattern proposed by the Microsoft framework is faster and simpler.

The Amazon receiving approach is closer to a "flat direct" pattern; the Microsoft approach is closer to an "event propagation" pattern, which is faster to use and also optimized for heavily threaded scenarios.


The pattern to create a Microsoft topic is similar, which is a nice thing because the developer uses the same pattern for all the stacks.
Below you can read the complete sample with some useful comments.

private void buttonTopic_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED KEY]";

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the topic if it does not exist yet
    if (!namespaceManager.TopicExists("TestTopic"))
    {
        namespaceManager.CreateTopic("TestTopic");
    }
    TopicClient Client =
        TopicClient.CreateFromConnectionString(connectionString, "TestTopic");

    BrokeredMessage message = new BrokeredMessage("Message content");
    message.Properties["MessageNumber"] = 4;

    Client.Send(message);
    MessageBox.Show("Sent");

    // Receiving *********************************
    // Create a filtered subscription
    SqlFilter highMessagesFilter =
        new SqlFilter("MessageNumber > 3");

    if (!namespaceManager.SubscriptionExists("TestTopic", "HighMessages"))
    {
        namespaceManager.CreateSubscription("TestTopic",
            "HighMessages",
            highMessagesFilter);
    }

    SubscriptionClient subscriptionClientHigh =
        SubscriptionClient.CreateFromConnectionString
                (connectionString, "TestTopic", "HighMessages");

    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    subscriptionClientHigh.OnMessage((brokerMessage) =>
    {
        try
        {
            // Process the message from the subscription
            string bodymessage = brokerMessage.GetBody<string>();
            string propertymessage = brokerMessage.Properties["MessageNumber"].ToString();

            // Remove the message from the subscription
            brokerMessage.Complete();
            MessageBox.Show(bodymessage);
        }
        catch (Exception)
        {
            // Indicates a problem: unlock the message in the subscription
            brokerMessage.Abandon();
        }
    }, options);
}

What I really like is the natural and simple approach to creating things from scratch: it is fast and easy, and we can combine things such as topics, queues, Event Hubs and more in a very fast way to create very complex pub/sub scenarios.

Event Hubs is really simple to use. I'm going to put a personal laboratory on GitHub and I will share the content with you soon; I also want to prepare a video about it.

I spent the last six months playing with all the Microsoft technologies; using all of them together, we are really able to create amazing things.

{AoT} Azure of Things – BizTalk London Summit 2015


The BizTalk London Summit is coming and there are many reasons why I'm so excited. Sometimes I like to think about my first BizTalk London Summit, two years ago: the atmosphere was magic, the event was full of integration experts and passionate people; honestly, I like to call them Integration Animals :)

Sandro wrote a great post about it HERE

During the last Summit I had the opportunity to share knowledge and to learn new things and, importantly, in front of a good beer :).
The number of attendees grows every year: last year there were 250+ and this year there will probably be 400+.

When I think that all of this started because a group of great friends began a magic adventure called the BizTalk Crew three years ago, organizing events around Europe out of passion and friendship, I think that's amazing and I feel so happy about it.


Now the event is international and there are many challenges in organizing it. The BizTalk360 team is doing a great job; these guys rock!!!

All the speakers are Microsoft MVPs and many Microsoft Product and Program Managers from Redmond will also be present. These two days will be full of intense moments; it will be an amazing technical experience.

Microsoft has released a lot of new things in Azure and I have invested all my free time in studying and developing. One of my favourite stacks is Event Hubs: I played a lot with it, next week I will be at Integration Monday and I have prepared some toys and code to show. I think it will be a nice technical moment.

The event will be around 19.30, yes, during dinner time, and I like the idea that most of the attendees will probably eat during my session; I will also eat something during the speech :D

HERE if you are interested in joining the Tech Dinner Session; the menu will be Event Hubs and good beer.

About the BizTalk Summit in London, well, it is not simple to explain what I'm going to show. In this last period I have been impressed by how many interesting things are possible by mixing things in Azure, and this is because of {AoT} Azure of Things. Microsoft Azure is now full of great stacks which we are able to mix together, in a very simple way, to solve most of the common integration problems and patterns. Months ago I started to investigate and I was impressed by the simplicity, by the opportunity to combine all of this so easily: Queues, Topics, Event Hubs, hybrid integration, applications, analytics, computing, data and more.

In this last month I have felt like a child having fun, playing with code and technologies; it had been a long time since I last had this feeling. Just by combining these stacks in an intelligent way it is possible to implement fantastic integration patterns.

I will write more about {AoT}; the BizTalk London Summit will be a great opportunity to discuss it.

You can find more details about the BizTalk London Summit HERE.

See you there :)

Development productivity – all my thoughts

Last month I started working on a new project idea. I love to write code and I also love painting, and I don't see many differences between these two activities: both are based on pure creativity, we can create everything we think and everything we want. Sometimes I love to use this quote of mine: I paint when I write code :)

Like a painter in front of a blank canvas, when I have an idea or an inspiration I start to write code in front of an empty Visual Studio project. Initially the activity is very expensive because, although I have a clear idea and a clear feeling in my mind, I'm not sure what I'm going to do to realize it; I write and delete the same code many times, changing it continuously until I'm satisfied.
Logically, I work on this in my free time, normally at night, so optimizing my time and improving my productivity is the first really critical aspect for me: the faster I am, the more quickly I will realize my idea, and the happier my wife will be :)

My personal strategy is to invest time in increasing my productivity: the more I invest up front, the faster and more optimized my development will be. But what is, in my opinion, development productivity?
I think about it in three main aspects: the first is how fast I can write code, the second is how much I can optimize the code I write and the third is motivation.
For the first, my best strategy is to study Visual Studio and the .Net Framework to understand the best options I can use to optimize my development life. I want to give two simple but powerful examples.

About the first one: think of how many times we need to rewrite the same lines of code. I like to create my best base code snippets, before and during development, and this is very simple and fast to do. Here is a real sample of mine.

Create a simple XML file with this content:

<?xml version="1.0" encoding="utf-8" ?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>LogMessage</Title>
      <Author>Nino</Author>
      <Shortcut>log</Shortcut>
      <Description>Insert log message</Description>
      <SnippetTypes>
        <SnippetType>SurroundsWith</SnippetType>
        <SnippetType>Expansion</SnippetType>
      </SnippetTypes>
    </Header>
    <Snippet>
      <Declarations>
        <Literal>
          <ID>message</ID>
          <Default>my function</Default>
        </Literal>
      </Declarations>
      <Code Language="CSharp">
        <![CDATA[
LogEngine.WriteLog(Configuration.General_Source,
    string.Format("Error in {0}", MethodBase.GetCurrentMethod().Name),
    Constant.Error_EventID_High_Critical_HYIS,
    Constant.TaskCategories_HYIS,
    ex,
    MessageLevel.Error);
        ]]>
      </Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>

Save the file with the .snippet extension and import it into Visual Studio using the Code Snippets Manager, which you will find under the Tools menu.

When I need to reuse my code I just press Ctrl+K, Ctrl+X.

Select the snippet


and press Enter


There are other ways to do this; you can check here.
This is a very simple example of what I do to optimize my productivity, but there are many other things we can do. I repeat: my best strategy is to keep myself up to date on Visual Studio features.

The second one is about how to optimize the code I write. As I said before, there are many features available, but I like some of them more than others and I would like to summarize the best ones for me; I found that they are able to increase my productivity by 500%.

I think the important thing is not just to study new features, which is not so complicated; the complicated thing is to understand how to use them in the best way. I spent time studying all of these and it was a bit stressful, but using this strategy I now write 60% less code, which is crazy but true.

The last one, motivation, is at the base of everything: if you are motivated you work faster and better, otherwise it is a big problem. You can be a programming guru, but without motivation nothing is possible.
I use a lot of different strategies to keep myself motivated; the two most important are:

I like to write code in an atomic way and try it first. I like to create a lab console project where I write all the base functions and classes first: if you see your code running, you keep yourself motivated. Later I put all of them into the real scenario and, when you see the result, the motivation is crazy.

Another important aspect for me is listening to music, different kinds depending on what I need to do: quiet music or silence when I need to think and be really focused, progressive rock or the radio during normal development, and hard rock when I'm tired or need to develop faster :)

BTW
I like to invest in my knowledge because it will help me to work better in the future.

Azure Event Hub – all my thoughts

One of the technologies I really love in Azure is Event Hubs (EH). I use EH for a lot of different purposes; I like to use technologies in different scenarios and EH is a really versatile framework.

Event Hubs is able to receive millions of messages per second, depending on the settings used in the Azure portal and on the message size. Thanks to this particular feature, EH is a perfect solution where we need to integrate/coordinate/receive millions of things (IoT, Internet of Things): for example receiving messages from millions of sensors, coordinating messages between thousands of trains or taxis and, why not, sending millions of logging messages to the cloud (on-premise tracking/monitoring/advertising centralization) and more.

Under the Hood recap.


EH lives inside the Service Bus namespace. This doesn't mean that it is the Service Bus, just one of its parts: inside Service Bus we now have Queues, Topics and Event Hubs as well. In some respects they share the same behaviours, but they also have some important differences.

First, some quick technical information we need in order to understand EH and its position in Azure. I like to map out the positioning of the technologies inside Azure; this is very important for deciding which one is best for you when you need it.

In the end EH is a mix between Queues and Topics, with some differences: with EH you can manage messages in a FIFO pattern like Queues and you can use a publish/subscribe pattern like Topics. The differences are that EH can persist a message for no more than 7 days, the maximum message size is 256 KB, and we have three ways to implement a publish/subscribe scenario: one is using different Event Hubs, another is using its Partitions and the last one is using different Consumer Groups per EH. Now some other important information we need to know.

  • The number of partitions can be between 8 and 32, but you can extend it up to 1024 by calling Microsoft support.
  • Each Partition receives messages as an ordered sequence of events.
  • An event is a message represented by the EventData class.
  • EH supports batched messaging, so we can send messages bigger than 256 KB in batches; logically we have to manage the mechanism ourselves, and EH provides the SequenceNumber property of the EventData class to do that.
  • An EventData object is a message formed by a body and other context properties such as:
    • Offset: the position of the message in the partition; it could be a timestamp or any other unique value you need. You then use the Checkpoint() method to inform EH about your last read.
    • Sequence Number: a sequence number of the event inside the Partition. This can be very useful to manage a batch of messages: yes, the limit is 256 KB per message, but we can imagine using a batch mechanism for larger messages with this property.
    • Body: we can write whatever we want inside the body and we can use three different approaches: EventData(Byte[]) for a byte array, EventData(Stream) for a stream and EventData(Object, XmlObjectSerializer) to take the content and serialize it.
    • User Properties: we can define all the context properties we want and need.
    • System Properties: used to set some EventData system properties such as Offset, PartitionKey and more, see here (http://msdn.microsoft.com/en-us/library/azure/microsoft.servicebus.messaging.eventdatasystempropertynames.aspx). The important thing to know is that we can set these properties in two different ways: using the context or the object properties.

 


 

  • EH needs to use a storage account; we can also configure a local storage account for development purposes using the Azure Storage Emulator.

 


 

  • The capacity of EH is measured in Throughput Units; by default you get 1 MB per second (or 1000 events per second) of ingress and 2 MB per second of egress.
  • We can modify the throughput in the Azure portal under Service Bus: select the Namespace and then Scale.
  • Publish/subscribe
    • As I said before, we can organize the pub/sub in different ways: we can send messages to a particular Partition and receive messages from that Partition, and we can create different Consumer Groups following a group URI naming convention, for example:
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #1>
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #2>
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #3>
  • EH receives messages over HTTPS and AMQP 1.0, and we can push a message using any language or client technology we want: .Net, Java, Apache Qpid and so on.
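To make the send side concrete, here is a minimal sketch using the Microsoft.ServiceBus.Messaging client of that era (the namespace, key and hub name are placeholders; treat the exact calls as a sketch against that SDK, not a definitive sample):

```csharp
using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;

class EventHubSender
{
    static void Main()
    {
        // Placeholder connection string: replace namespace and key with your own
        string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR KEY]";

        // Create a client bound to a specific Event Hub
        EventHubClient client = EventHubClient.CreateFromConnectionString(connectionString, "TestHub");

        // EventData carries the body plus context properties
        EventData eventData = new EventData(Encoding.UTF8.GetBytes("sensor reading"));
        eventData.Properties["DeviceId"] = "device-42";
        eventData.PartitionKey = "device-42"; // events with the same key land in the same partition

        client.Send(eventData);
    }
}
```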

 

Development and Challenges

The development side is pretty simple and you can find a lot of samples; the challenge is organizing the messaging patterns and the Hubs/Partitions/Groups. This is the most important thing, so my best advice is: before starting to write code, plan the hub architecture and the message exchange patterns carefully.
I created a lab to try the performance and I was totally amazed: with the default configuration in the Azure portal, I sent 40,000 messages in less than 30 seconds from my laptop, and the network was pretty bad.
Logically I used a multi-threading pattern, all the cores worked perfectly, and I sent all the messages in one batch; this means that for the EH it is really simple to work in multicast scenarios.
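A rough sketch of such a lab sender, again assuming the Microsoft.ServiceBus.Messaging client (connection string and hub name are placeholders, and the batch size is arbitrary):

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging; // WindowsAzure.ServiceBus NuGet package

class BatchSender
{
    static void Main()
    {
        // Placeholder connection string and hub name: replace with your own.
        string connectionString = "Endpoint=sb://<my namespace>.servicebus.windows.net/;...";
        var client = EventHubClient.CreateFromConnectionString(connectionString, "<event hub name>");

        const int total = 40000;
        const int batchSize = 500;

        // Send the messages in batches, spreading the work across all cores.
        Parallel.For(0, total / batchSize, batch =>
        {
            var events = new List<EventData>();
            for (int i = 0; i < batchSize; i++)
                events.Add(new EventData(Encoding.UTF8.GetBytes("message " + (batch * batchSize + i))));

            // The whole batch must stay under the 256 KB limit.
            client.SendBatch(events);
        });

        client.Close();
    }
}
```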


 

 

The most interesting things are the correlations and the different combinations we get using Event Hub together with the other Azure technologies such as Azure Stream Analytics, Azure Batch, Power BI or Machine Learning, and more, but these are other thoughts.

Below are the most important resources we need to start using it:

BizTalk MicroServices and INTEGRATE 2014: all my thoughts

Last week I was a speaker at Integrate 2014, one of the two most important Microsoft integration conferences in the world; the second most important is held in London.
The conference was organized by Microsoft and BizTalk360. First of all, I want to thank Microsoft and BizTalk360 for inviting me as a speaker; the Microsoft campus in Redmond is the “Everest” of any speaker :)

[Image: me and Steef-Jan]

About the conference: many of my friends and colleagues have already written a lot about it, and there is plenty of technical information in their blogs:

Integration and Microservices by Charles Young
Microsoft making big bets on Microservices by Saravana Kumar
INTEGRATE 2014 – Journey to next generation of integration by Steef-Jan Wiggers
Integrate 2014 Recap by Kent Weare
Integrate 2014 – Final Thoughts by Nick Hauenstein
Azure BizTalk Microservices, initial thoughts by Sam Vanhoutte

You know me and, as my blog's tagline says, I'm a man with his technology thoughts; so what are my thoughts on, and experience of, this event?

The event focused on presenting the new Azure Integration Layer, which is based on a microservices architecture. The question is: what happened to the old BizTalk Services framework?

Microsoft changed its strategy, and honestly I really like the new one, for a lot of reasons.
The previous strategy was a totally closed framework, containing everything you needed to integrate the technologies; the new framework is totally extensible and customizable, with a granularity that I couldn't have imagined.
In general, the integration frameworks we have are essentially products with built-in features we can use and that, if we are sometimes lucky and have really good expertise with them, we can customize and extend.

BizTalk MicroServices follows many interesting patterns based on many important principles, offering infinite possibilities and options.

First of all, the possibility to use your own Convention over Configuration and, in this case, to drive a business process dynamically. This is one of my favorite patterns; I like to use it in Azure, and I remember one of my blog posts, a long time ago, about how to use this pattern in BizTalk Server.

Another important thing offered by BizTalk MicroServices is the possibility to write atomic, extremely scalable components. At the moment in Azure we only have the Worker Role, a totally different architecture: a Windows NT service executing a .NET DLL component. A microservice will be an atomic REST service hosted in a cloud web service.

[Image: architecture]

I said atomic because the approach we will have to use is exactly that: we will implement the transformation component, the rule engine component, the validation component and so on; after that we will create our workflow, our business process, and we will be able to use different patterns to drive our process.

[Image: flow]

This is what we really missed in Azure: the possibility to create business processes using a real process choreography (http://en.wikipedia.org/wiki/Business_Process_Execution_Language). Each process will be started by a Trigger; the trigger could be a scheduled trigger or a message arriving via HTTP or any other kind of protocol. In the end, the Trigger in BizTalk MicroServices is the same concept as the BizTalk Server Adapter.

[Image: adapters]

In other words, Microsoft will provide many microservice components ready to use and we will be able to create our own; all of this offers a lot of interesting business opportunities and possibilities.

The Bridge concept we had before in BizTalk Services is now BizTalk MicroServices, but the difference is huge, and most of it lies in how they are used.

Before, we had a Bridge receiving data, transforming, validating, sending, etcetera; it was a pretty old-fashioned approach that we already knew and used. Microsoft microservices are components that we use to create a process choreography and that we can reuse across all the Azure technologies, and this is the most important point.

[Image: feature list]

I really like using all the Azure technologies we have; the secret, I think, is understanding which technology solves the problem in the best way, in terms of productivity and costs.

The world of integration is going to change completely: there is no longer a single technology to use for integration, there are many of them, and I think that Microsoft Azure is now in first position in that respect.

[Image: old pattern]

[Image: new pattern]

About the roadmap: a BizTalk Server update in 2015, and BizTalk Microservices in Q1 2015.

[Image: roadmap]

During the event Microsoft presented other new features; in particular, I was impressed by Power BI (Business Intelligence).

[Image: machine learning]
In one word, this is absolutely amazing: it is a very powerful BI framework. Power BI will be able to use all kinds of different data sources, the performance rocks, and we will be able to use an Excel add-on to carry out any kind of BI experience in a very powerful way. One of the features I appreciated most is the possibility to run queries in natural language; that's amazing, and I'm curious to use it ASAP.

I’m pretty sure that Power BI will replace our Business Activity Monitoring; I’m already thinking about using it.

Data Factory

A framework focused on data warehouse problems. Like Power BI, the new Data Factory will be able to use all kinds of different data sources; with Data Factory we will be able to acquire data and work with it using a pipeline pattern, across many different dataset sources.

Stream Analytics and Event HUB
I absolutely love these two and I use them a lot in my solutions: with Event Hub we are able to send millions of messages per second into the cloud, and with Stream Analytics we can receive them, for a great staging strategy.
The language used in Stream Analytics is T-SQL, very familiar and easy.
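As a sketch, a Stream Analytics query reading from an Event Hub input reuses familiar T-SQL constructs; here the input/output names and fields are hypothetical:

```sql
-- Average temperature per device over 10-second tumbling windows
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature,
    System.Timestamp AS WindowEnd
INTO
    [staging-output]
FROM
    [eventhub-input] TIMESTAMP BY EventTime
GROUP BY
    DeviceId,
    TumblingWindow(second, 10)
```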

For very intensive and complicated analysis, predictions, neural analysis and so on, there is Microsoft Machine Learning. I saw it during the Microsoft MVP Summit, and one of the first things I did when I got back home was start studying it; now I'm going to use it for one of my projects.

We have Queues for FIFO and Topics to create very powerful publish/subscribe patterns, plus Cache to improve performance; we don't miss anything now in Azure, we just have to study it and use it.

Many people asked me what I think about this change of strategy, well…

I think it is very complicated to change strategy for a company as huge as Microsoft, and this shows two things: first, that this company wants to find the best solution to solve a problem; and second, that it has the courage to do so, and not many companies in the world are really able to do that.

Proud to be part of that as Microsoft MVP.

[Image: MVP logo]