{AoT} Azure Of Things – Queues and Topics

I spent the last six months studying and developing, and I like to say that we now have a lot of things in Azure that we can use to integrate a lot of things on the Internet.
The world of integration is very complex, it is a universe of things; at first my problem was not how to create this or that in Azure, which is quite simple, but how to combine and use all of these things together.
With Ap as Azure power and At as Azure things, I can say that the power of Azure increases with the number of Azure things combined together.


The power of Azure is directly proportional to the sum of the things we combine together.

I like to think about Azure as a little universe full of things, where the collisions between them are able to create energetic solutions.


I remember, just two years ago, how complicated it was to use things like queues, topics or blobs in Azure; now it is very fast and simple, and there are also many more things: Event Hubs, Stream Analytics, a machine that is able to learn, powerful features for business intelligence, an API to write powerful services, and more.
The latest thing Microsoft released is API Apps; the BizTalk Summit in London is a good opportunity to understand more about it, and I'm sure that many people will start to write about it, me too.
In this last period I liked to play with all the possible things around, and I was really surprised by how simple they are to use.

Create an Azure queue? No sooner said than done. A topic with multiple subscriptions? No sooner said than done. A blob or table storage for any kind of purpose? No sooner said than done, no more than ten minutes of coding.
For example, here is the code to manage an Azure queue; I added comments inside the code in case you need to reuse it.

private void buttonQueue_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED ACCESS KEY]";
    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the queue if it does not exist yet
    if (!namespaceManager.QueueExists("TestQueue"))
        namespaceManager.CreateQueue("TestQueue");

    QueueClient Client =
        QueueClient.CreateFromConnectionString(connectionString, "TestQueue");

    // Send a message; set properties if you want to use them
    byte[] b = Encoding.UTF8.GetBytes("Text to Send");
    BrokeredMessage message = new BrokeredMessage(b);
    message.Properties["TestProperty"] = "TestValue";
    Client.Send(message);

    // Callback lambda approach, faster and easier
    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    // Callback to handle received messages
    Client.OnMessage((receivedMessage) =>
    {
        try
        {
            // Process the message from the queue; change the type for a custom class if needed
            string bodymessage = Encoding.UTF8.GetString(receivedMessage.GetBody<byte[]>());
            string propertymessage = receivedMessage.Properties["TestProperty"].ToString();

            // Remove the message from the queue
            receivedMessage.Complete();
        }
        catch (Exception)
        {
            // Indicates a problem, unlock the message in the queue
            receivedMessage.Abandon();
        }
    }, options);
}

What I like most about this code is the improvement Microsoft is making in the code patterns: the lambda approach is the fastest way to manage callbacks and also the most readable way to handle these kinds of situations. I absolutely love it.

I tried to do the same with Amazon SQS (Simple Queue Service); in some aspects the approach is similar on the queue creation side, but different on the receiving side.


Personally I prefer the Microsoft approach. I'm sure there is a way to use the same pattern with Amazon; what I mean is that the base pattern proposed by the Microsoft framework is faster and simpler.

The Amazon receiving approach is closer to a "flat direct" polling pattern, while the Microsoft approach is closer to an "event propagation" pattern, which is faster to use and also optimized for heavily threaded scenarios.
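For comparison, this is a minimal sketch of the SQS receiving side using the AWS SDK for .NET; the queue URL is a placeholder and credentials are assumed to come from configuration, so take it as an illustration of the polling pattern rather than production code. With SQS we explicitly ask the queue for messages instead of registering a callback:

```csharp
using System;
using Amazon.SQS;
using Amazon.SQS.Model;

class SqsPollingSample
{
    static void Main()
    {
        var sqs = new AmazonSQSClient(); // credentials come from config/environment
        string queueUrl = "https://sqs.[REGION].amazonaws.com/[ACCOUNT]/TestQueue"; // placeholder

        // "Flat direct" pattern: we poll the queue explicitly
        var response = sqs.ReceiveMessage(new ReceiveMessageRequest
        {
            QueueUrl = queueUrl,
            MaxNumberOfMessages = 10,
            WaitTimeSeconds = 20 // long polling
        });

        foreach (var message in response.Messages)
        {
            Console.WriteLine(message.Body);

            // Remove the message from the queue once processed
            sqs.DeleteMessage(new DeleteMessageRequest
            {
                QueueUrl = queueUrl,
                ReceiptHandle = message.ReceiptHandle
            });
        }
    }
}
```

Notice how the receive/delete loop lives in our code, while with Service Bus the OnMessage callback and the Complete/Abandon semantics are pushed to us by the framework.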


The pattern to create a Microsoft topic is similar, which is a nice thing because the developer uses the same pattern approach across all the stacks.
Below you can read the complete sample with some useful comments.

private void buttonTopic_Click(object sender, EventArgs e)
{
    string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED KEY]";

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // Create the topic if it does not exist yet
    if (!namespaceManager.TopicExists("TestTopic"))
        namespaceManager.CreateTopic("TestTopic");

    TopicClient Client =
        TopicClient.CreateFromConnectionString(connectionString, "TestTopic");

    byte[] b = Encoding.UTF8.GetBytes("Message content");
    BrokeredMessage message = new BrokeredMessage(b);
    message.Properties["MessageNumber"] = 4;

    // Create a filtered subscription
    SqlFilter highMessagesFilter =
        new SqlFilter("MessageNumber > 3");

    if (!namespaceManager.SubscriptionExists("TestTopic", "HighMessages"))
        namespaceManager.CreateSubscription("TestTopic", "HighMessages", highMessagesFilter);

    // Send the message to the topic
    Client.Send(message);

    SubscriptionClient subscriptionClientHigh =
        SubscriptionClient.CreateFromConnectionString(connectionString, "TestTopic", "HighMessages");

    // Configure the callback options
    OnMessageOptions options = new OnMessageOptions();
    options.AutoComplete = false;
    options.AutoRenewTimeout = TimeSpan.FromMinutes(1);

    subscriptionClientHigh.OnMessage((brokerMessage) =>
    {
        try
        {
            // Process the message from the subscription
            string bodymessage = Encoding.UTF8.GetString(brokerMessage.GetBody<byte[]>());
            string propertymessage = brokerMessage.Properties["MessageNumber"].ToString();

            // Remove the message from the subscription
            brokerMessage.Complete();
        }
        catch (Exception)
        {
            // Indicates a problem, unlock the message in the subscription
            brokerMessage.Abandon();
        }
    }, options);
}

What I really like is the natural and simple approach to creating stuff from scratch; it is fast and easy, and we can combine things like Topics, Queues, Event Hubs and more very quickly to create very complex pub/sub scenarios.

Event Hubs is really simple to use. I'm going to put a personal laboratory on GitHub and I will share the content with you soon; I also want to prepare a video about it.

I spent the last six months playing with all the Microsoft technologies; using all of them together, we are really able to create amazing things.

{AoT} Azure of Things – BizTalk London Summit 2015


The BizTalk London Summit is coming and there are many reasons why I'm so excited. Sometimes I like to think back to my first BizTalk London Summit, two years ago: the atmosphere was magic, and the event was full of integration experts and passionate people; honestly, I like to call them Integration Animals :)

Sandro wrote a great post about it HERE

During the last Summit I had the opportunity to share knowledge and learn new things, and, importantly, in front of a good beer :).
The number of attendees grows every year: last year there were 250+ and this year it will probably be 400+.

When I think that all of this happened because a group of great friends started a magic adventure called the BizTalk Crew three years ago, organizing events around Europe out of passion and friendship, I think that's amazing and I feel so happy about it.


Now the event is international and there are many challenges in organizing it; the BizTalk360 team is doing a great job, these guys rock!!!

All the speakers are Microsoft MVPs, and many Microsoft Product and Program Managers from Redmond will also be present; these two days will be full of intense moments, an amazing technical opportunity to live.

Microsoft released a lot of new things in Azure and I invested all my free time in studying and developing. One of my favourite stacks is Event Hubs, and I played a lot with it; next week I will be at Integration Monday, and I prepared some toys and code to show, so I think it will be a nice technical moment.

The event will be around 19:30; yes, it is during dinner time, and I like the idea that most of the attendees will probably eat during my session. I will also eat something during the speech :D

HERE if you are interested in joining the Tech Dinner Session; the menu will be Event Hubs and good beer.

About the BizTalk Summit in London, well, it is not simple to explain what I'm going to show. In this last period I was impressed by how many interesting things it is possible to do by mixing things in Azure; this is why {AoT} Azure of Things. Microsoft Azure is now full of great stacks which we are able to mix together, in a very simple way, to solve most of the common integration problems and patterns. Months ago I started to investigate, and I was impressed by the simplicity, by the opportunity to combine all of this so easily: Queues, Topics, Event Hubs, hybrid integration, applications, analytics, computing, data and more.

In this last month I felt like a child having fun, playing with code and technologies; it had been a long time since I missed this feeling. Just by combining these stacks in an intelligent way it is possible to create and solve fantastic integration patterns.

I will write more about {AoT}; the BizTalk London Summit will be a great opportunity to discuss it.

You can find more details about the BizTalk London Summit HERE.

See you there :)

Development productivity – all my thoughts

Last month I started working on a new project idea. I love to write code and I also love painting, and I don't see many differences between these two activities: both are based on our pure creativity, and we can create everything we think and everything we want. Sometimes I love to use this quote of mine: I paint when I write code :)

Like a painter in front of a white canvas, when I have an idea or an inspiration I start to write code in front of an empty Visual Studio project. Initially the activity is very expensive, because I have a clear idea, a clear feeling in my mind, but I'm not sure what I'm going to do to realize it; I write and delete the same code many times, changing it continuously, until I'm satisfied.
Logically I work during my free time, normally during the night, so optimizing my time and improving my productivity is the first real critical aspect for me: the faster I am, the more quickly I will realize my idea, and the happier my wife will be :)

My personal strategy is to invest time in increasing my productivity: the more I invest up front, the faster and more optimized my development will be. But what is, in my opinion, development productivity?
I can think about it in three main aspects: one is how fast I can write code, the second is how much I can optimize the code I write, and the third is motivation.
First, my best strategy is to study Visual Studio and the .NET Framework to understand the best options I can use to optimize my development life. I want to give two simple but powerful examples.

About the first one: think how many times we need to rewrite the same lines of code. I like to create my best base code snippets, before and during development, and this is very simple and fast to do. This is a real sample of mine.

Create a simple XML file with this content:

<?xml version="1.0" encoding="utf-8"?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>Insert log message</Title>
      <Description>Insert log message</Description>
    </Header>
    <Snippet>
      <Code Language="CSharp">
        <![CDATA[string.Format("Error in {0}", MethodBase.GetCurrentMethod().Name);]]>
      </Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>

Save the file with the .snippet extension and import it in Visual Studio using the Code Snippets Manager, which you will find under the Tools menu.

When I need to reuse my code I just press Ctrl + K and then X.

Select the snippet


and press Enter


There are other ways to do that; you can check here.
This is a very simple example of what I do to optimize my productivity, but there are many other things we can do. I repeat: my best strategy is to keep myself up to date on Visual Studio features.

The second one is about how to optimize the code I write. As I said before, there are many features to use, but I like some of them more than others and I would like to summarize the best ones for me; I have seen that the features below are able to increase my productivity by 500%:

I think the important thing is not just to study new features, which is not so complicated; the complicated thing is to understand how to use them in the best way. I spent time studying all of these, and it was a bit stressful, but using this strategy I now write 60% less code. This is crazy but true.

The last one, motivation, is at the base of everything: if you are motivated you work faster and better, otherwise it is a big problem. You can be a programming guru, but without motivation nothing is possible.
I use a lot of different strategies to keep myself motivated; the two most important are:

I like to write code in an atomic way first and try it: I like to create a lab console project where I write all the base functions and classes first, because if you see your code running you keep yourself motivated. Later I like to put all of them into the real scenario, and when you see the result, the motivation goes crazy.

Another important aspect for me is listening to music, different kinds of music depending on what I need to do: quiet music or silence when I need to think and be really focused, progressive rock or the radio during normal development, and hard rock when I'm tired or need to develop faster :)

I like to invest in my knowledge because it will help me to work better in the future.

Azure Event Hub – all my thoughts

One of the technologies I really love in Azure is Event Hubs (EH). I use EH for a lot of different scopes; I like to use technologies in different scenarios, and EH is a really versatile framework.

Event Hubs is able to receive millions of messages per second, depending on the settings used in the Azure portal and the message size. Thanks to this particular feature, EH is a perfect solution when we need to integrate, coordinate or receive millions of things (IoT, Internet of Things): for example, receiving messages from millions of sensors, coordinating messages between thousands of trains or taxis and, why not, sending millions of logging messages to the Cloud (on-premise tracking/monitoring/advertising centralization) and more.

Under the Hood recap.


EH lives inside the Service Bus namespace; this doesn't mean it is Service Bus, just one of its parts. Inside Service Bus we now have Queues, Topics and Event Hubs; in some aspects they share the same behaviours, but they also have some important differences.

First, some quick technical information we need to know to understand EH and its position in Azure. I like to organize the positioning of the technologies inside Azure; this is very important to decide which one is best for you when you need it.

In the end, EH is a mix between Queues and Topics, with some differences: with EH you can manage messages in a FIFO pattern like Queues, and you can use a publish/subscribe pattern like Topics. The differences are that EH can persist a message for no more than 7 days, the maximum message size is 256 KB, and we have three ways to manage a publish/subscribe scenario: one is using different Event Hubs, another is using its partitions, and the last one is using different consumer groups per EH. Now, some other important information we need to know.

  • The number of partitions can be between 8 and 32, but you can extend it up to 1024 by calling Microsoft support.
  • Each partition receives messages as an ordered sequence of events.
  • An event is a message represented by an EventData object.
  • EH supports batch messaging, so we can send payloads bigger than 256 KB split into batches; logically we have to manage the mechanism ourselves, and EH provides the SequenceNumber property of the EventData class to do that.
  • An EventData object is a message formed by a body and other context properties such as:
    • Offset: the position of the message in the partition; it could be a timestamp or any other unique value you need, and you will later use the Checkpoint() method to inform EH about your last read.
    • Sequence Number: a sequence number of the event inside the partition; this can be very useful to manage a batch of messages. Yes, the limit is 256 KB per message, but we can imagine using a batch mechanism for larger messages just by using this property.
    • Body: we can write anything we want inside the body, and we can use three different approaches: EventData(Byte[]) for a byte array, EventData(Stream) for a stream, and EventData(Object, XmlObjectSerializer) to take an object and serialize it.
    • User Properties: we can define all the context properties we want and need.
    • System Properties: we can use these to set some EventData system properties such as Offset, PartitionKey and more, check here (http://msdn.microsoft.com/en-us/library/azure/microsoft.servicebus.messaging.eventdatasystempropertynames.aspx). The important thing to know is that we can set these properties in two different ways: using the context or the object properties.




  • EH needs to use a storage account; we can also configure a local storage account for development purposes, using the Azure Storage Emulator.




  • The capacity of EH is measured in Throughput Units; by default you have, in ingress, 1 MB per second or 1000 events per second and, in egress, 2 MB per second.
  • We can modify this throughput in the Azure portal under Service Bus: select the namespace and then Scale.
  • Publish/subscribe
    • As I said before, we can organize pub/sub in different ways: we can send messages to a particular partition and receive messages from that particular partition, and we can create different consumer groups following a group URI naming convention, for example:
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #1>
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #2>
      • <my namespace>.servicebus.windows.net/<event hub name>/<Consumer Group #3>
  • EH receives messages using HTTPS and AMQP 1.0, and we can push a message using any language or client technology we want: .NET, Java, Apache Qpid and so on.
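To give an idea of what the points above look like in code, here is a minimal sketch of sending an event with the .NET Service Bus SDK (Microsoft.ServiceBus.Messaging); the connection string, hub name, partition key and property values are placeholders of mine, not a definitive implementation:

```csharp
using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;

class EventHubSendSample
{
    static void Main()
    {
        // Placeholders: replace with your namespace, key and Event Hub name
        string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED KEY]";
        EventHubClient client =
            EventHubClient.CreateFromConnectionString(connectionString, "TestHub");

        // The body can be a byte array, a stream or a serialized object
        EventData data = new EventData(Encoding.UTF8.GetBytes("sensor reading"));

        // Events with the same PartitionKey land in the same partition, in order
        data.PartitionKey = "device-42";

        // User properties: any context we want to attach to the event
        data.Properties["Type"] = "Temperature";

        client.Send(data);
    }
}
```

The PartitionKey is how the sender influences the partitioning described above without addressing a partition directly.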


Development and challenges

The development aspect is pretty simple and you can find a lot of samples; the challenge is to organize the messaging patterns and the hubs/partitions/groups, which is the most important thing. So my best advice is: before starting to write code, it is better to plan the hub architecture and the message exchange patterns carefully.
I created a lab to try the performance and I was totally amazed: with the default configuration in the Azure portal, I sent 40,000 messages in less than 30 seconds from my laptop, and the network was pretty bad.
Logically I used a multi-threading pattern, all the cores worked perfectly, and I sent all the messages in batches; this means that it is really simple for EH to work in multicast scenarios.
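The shape of that lab can be sketched like this, assuming the same SDK: batches of events sent in parallel across the cores. The connection string and hub name are placeholders and the numbers are just illustrative:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

class EventHubBatchSample
{
    static void Main()
    {
        // Placeholders: replace with your namespace, key and Event Hub name
        string connectionString = "Endpoint=sb://[YOUR NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR SHARED KEY]";
        EventHubClient client =
            EventHubClient.CreateFromConnectionString(connectionString, "TestHub");

        // 40 parallel workers x 1000 events = 40,000 messages
        Parallel.For(0, 40, i =>
        {
            IEnumerable<EventData> batch = Enumerable.Range(0, 1000)
                .Select(n => new EventData(
                    Encoding.UTF8.GetBytes("message " + (i * 1000 + n))));

            // SendBatch pushes the whole group in a single call;
            // the batch must stay under the 256 KB limit
            client.SendBatch(batch);
        });
    }
}
```

SendBatch is what makes the difference in throughput: one network call carries a whole group of events instead of one call per message.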




The most interesting things are the correlations and the different combinations we have using Event Hubs with the other Azure technologies, such as Azure Stream Analytics, Azure Batch, Power BI or Machine Learning and more, but those are other thoughts.

Below are the most important resources we need to start using it:

BizTalk MicroServices and INTEGRATE 2014 – all my thoughts

Last week I was a speaker at Integrate 2014, one of the two most important Microsoft integration conferences in the world; the second most important is in London.
The conference was organized by Microsoft and BizTalk360, and first of all I want to thank Microsoft and BizTalk360 for inviting me as a speaker; the Microsoft campus in Redmond is the "Everest" of any speaker :)

Me and Steef

About the conference: many of my friends and colleagues have already written a lot about it, and there is a lot of technical information in their blogs:

Integration and Microservices  Charles Young
Microsoft making big bets on Microservices by Saravana Kumar
INTEGRATE 2014 – Journey to next generation of integration Steef-Jan Wiggers
Integrate 2014 Recap Kent Weare
Integrate 2014 – Final Thoughts by Nick Hauenstein
Azure BizTalk Microservices, initial thoughts by Sam Vanhoutte

You know me and, as my blog quote says, I'm a man with his technology thoughts; so what about my thoughts and experience of this event?

The event focused on presenting the new Azure integration layer, which is based on a microservices architecture. The question is: what happened to the old BizTalk Services framework?

Microsoft changed the strategy, and honestly I really like the new one, for a lot of reasons.
The previous strategy was a totally closed framework, with everything you need to integrate the technologies inside it; the new framework is totally extensible and customizable, with a granularity I couldn't imagine.
In general, all the integration frameworks we have are essentially a product with features inside that we can use and, if we are lucky and have really good expertise, we can customize and extend.

BizTalk MicroServices follows many interesting patterns based on many important principles, offering infinite possibilities and options.

First, the possibility to use your own Convention over Configuration, and in this case the possibility to drive a business process dynamically; this is one of my favourite patterns, I like to use it in Azure, and I remember one of my blog posts, a long time ago, about how to use this pattern in BizTalk Server.

Another important thing offered by BizTalk MicroServices is the possibility to write atomic, extremely scalable components. At the moment in Azure we only have the Worker Role, a totally different architecture: a Windows NT service executing a .NET DLL component. A microservice will be an atomic REST service hosted in a cloud web service.


I said atomic because the approach we will have to use is exactly that: we will implement the transformation component, the rule engine component, the validation component and so on; after that we have to create our workflow, our business process, and we will be able to use different patterns to drive our process.

This is what we really missed in Azure: the possibility to create business processes using a real process choreography (http://en.wikipedia.org/wiki/Business_Process_Execution_Language). Each process will be started by a trigger; the trigger could be a scheduled trigger or a message arriving via HTTP or any other kind of protocol. In the end, the trigger in BizTalk MicroServices is the same concept as the BizTalk Server adapter.


In other words, Microsoft will provide many microservice components ready to use and we will be able to create our own; all of that offers a lot of interesting business opportunities and possibilities.

The concept we had before with the Bridge in BizTalk Services is now BizTalk MicroServices, but the difference is huge, and most of that difference is about how it is used.

Before, we had a Bridge receiving data, transforming, validating, sending and so on; it was a pretty old-fashioned approach that we already knew and used. Microsoft microservices are components that we use to create a process choreography that we can reuse across all the Azure technologies, and this is the most important point.


I really like to use all the Azure technologies we have; the secret, I think, is to understand which technology is best to solve the problem in the best way, in terms of productivity and costs.

The world of integration is going to change completely; there is not just one technology to integrate with but many of them, and I think Microsoft Azure is now in first position in that respect.



About the roadmap: a BizTalk Server update in 2015, and BizTalk Microservices in Q1 2015.


During the event Microsoft presented other new features; in particular I was impressed by Power BI (Business Intelligence).

Only one word: this is absolutely amazing. It is a very powerful BI framework; Power BI is able to use all kinds of different data sources, the performance rocks, and we will be able to use an Excel add-on to deliver any kind of BI experience in a very powerful way. One of the features I appreciated most is the possibility to query in natural language; that's amazing, and I am curious to use it ASAP.

I'm pretty sure that Power BI will replace our Business Activity Monitoring; I'm already thinking of using it.

Data Factory

A framework focused on data warehouse issues. Like Power BI, the new Data Factory is able to use all kinds of different data sources; with Data Factory we will be able to acquire data and work with it using a pipeline pattern, across many different dataset resources.

Stream Analytics and Event Hubs
I absolutely love these two guys and I use them a lot in my solutions: with Event Hubs we are able to send millions of messages per second into the Cloud and use Stream Analytics to receive them for a great staging strategy.
The language used in Stream Analytics is T-SQL, very familiar and easy.

For very intensive and complicated analysis, predictions, neural analysis and so on, there is Microsoft Machine Learning. I saw it during the Microsoft MVP Summit, and one of the first things I did when I got back home was start studying it; now I'm going to use it for one of my projects.

We have Queues for FIFO, Topics to create very powerful publish/subscribe patterns, and Cache to improve performance; we don't miss anything now in Azure, we just have to study and use it.

Many people asked me what I think about this change of strategy, well…

I think it is very complicated to change strategy for a company as huge as Microsoft, and this shows two things: the first is that this company wants to find the best solution to solve a problem, and the second is that it has the courage to do it, and not many companies in the world are really able to do that.

Proud to be part of that as Microsoft MVP.


BizTalk360 – a technical review across the board.

I'm going to start a series of articles about the best technologies I use in my job, and I would like to do a technical review across the board.

As I said before, in this last period I have had to use a lot of different interesting technologies, because the integration world is growing very fast and customer requirements get more specific every day: more performance, easier operations management, more monitoring and control in business solutions.

I like to use and play with products and technologies when they are well made and powerful. For example, I was really impressed by Microsoft Event Hubs; I played with it in these last days, one million messages in just one second and in a reliable way, which is awesome, and I will write something about that in upcoming posts. But I want to get back to BizTalk360.
A customer asked me about a solution to manage his BizTalk infrastructure, so I decided to install BizTalk360 to give him a demo.
I remember the first BizTalk360 demo I saw in Seattle, a couple of years ago: the product was in Silverlight, with some interesting features, but I couldn't imagine so many changes and features in just two years. I have only one word: impressive.

The installation is very fast and simple; I don't have many things to write about it: next, next, next, next... done.
After the installation, the portal opens and all the features are already running and configured. The number of features is impressive and most of them are really useful; this means the product is made by people who work with Microsoft BizTalk Server for real, every day.

It is impossible to describe all the features in one post, but I can speak about my impressions and my thoughts.
Many features are useful to solve any kind of problem you can have with a BizTalk infrastructure, but not only that: I'm going to use BizTalk360 during development to improve productivity.

The navigation inside the product is very well done: I mean that you start to use one feature and you end up using other features just by navigating through the artifacts.
The world of integration lives on the dependencies between artifacts and services, and intelligent navigation is the best strategy for quickly understanding possible problems or managing the artifacts in a fast way.
The menu on the left follows the user, offering a complete view at every moment.


The monitoring and notification is really detailed, and this is one of the most important aspects of a BizTalk solution: you can set alarms for a lot of different situations and areas.
Another important thing is that you can combine technology areas, for example BizTalk + Event Viewer + SQL Server, to create a lot of very powerful monitoring strategies.
The Graphical Flow is amazing; I tried to imagine solving any of the usual tracking problems, and I realized that with this feature you can.
I want to propose a classic situation and try to imagine how to solve it so easily without this feature; below is the sample:
You know you have a big problem with a send pipeline: it is a runtime error, and the problem is that you need to understand who is using this pipeline.
This is important because the error is about something inside the message, and the data is created by some orchestrations, so I need to reconstruct the message flow to understand where I need to operate.
Unfortunately, in 98% of situations we don't have any kind of good documentation; we have to be honest about that.
The steps to accomplish this are very well done: after opening the Graphical Flow, you make a query with any criteria you want and look for the pipeline.


Click on the pipeline and, very importantly, you can navigate forward or backward; in this way you reconstruct the whole messaging flow (in this case it was a send pipeline), and you can create multiple views and navigate as deep as you want.


The product covers all the most important aspects of a BizTalk solution, and some features are really specialized for advanced troubleshooting, such as the Throttling Analyzer and the Advanced Event Viewer.
Disaster recovery is the most crucial part of a BizTalk environment, because BizTalk depends entirely on SQL Server.
BizTalk uses a lot of different databases: the MessageBox for the messaging transactions, and many others covering a lot of different aspects: administration, SSO, BAM, BRE and so on.
The Backup/DR Monitor feature is very well done; you can manage all of these aspects in a very easy way.

There are so many features that I think the best way to understand the product is to go to the BizTalk360 site (http://www.biztalk360.com); the site is really full of documentation, and just by reading the product tour you will be able to use the product.

You can download the evaluation version and install it in a very easy and fast way. I'm sure I will write more about this product; I really love to play with well-made products, and now I'm starting to understand why BizTalk360 has more than 200 customers around the world.

BizTalk Mapping Patterns & Best Practices, the Bible of mapping.

I want to present to you one of the best books about BizTalk Server: BizTalk Mapping Patterns & Best Practices.
The author is Sandro Pereira, my great friend; he has been an MVP in Microsoft Application Integration since 2011 and is one of the most active people in the integration community.
Honestly, I have always been impressed by Sandro's blog, thousands of posts and advices, and I always thought: "what would happen if Sandro decided to write a book?" Well, it happened.
400 pages about mapping and transformations; this is the biggest work I can imagine about mapping and transformations.
I reviewed the book, and it was very hard work: the book is really full of information, demos, and all possible use cases about mapping in BizTalk Server.
I said BizTalk Server because the book is written for mapping with BizTalk Server, but a lot of the concepts I found in the book can be applied to any kind of mapping: MABS, XSLT, etcetera.
I'm sure this book will be considered the bible of mapping; every integration animal in the world will refer to this book in the future.

Another important thing: the book is free.

You can find the book here