I just want to apologize for not posting in a very long time. I was recently in Seattle, WA for business, but I am back now and ready to start talking technology. One topic that flustered me when I was learning BizTalk was debatching, so in this post I will show you how to set up debatching in a BizTalk solution. To be clear, I am assuming some BizTalk experience, such as working with schemas, Visual Studio and the BizTalk Administration Console.
First, a quick definition of debatching in my own words. For anybody who doesn't know, BizTalk loves XML, and for this example we will be using XML. Many times in BizTalk you will receive messages containing other messages. A good example is a PO, or Purchase Order. Instead of sending a vendor 100 separate messages or files for orders, a company will put all of those orders in one large PO. The figure below shows a small example of this. Here you can see that PO is the root node and has a Header tag that contains information about the company or entity making the order. What we want to focus on is the Orders node of the XML; this is where debatching comes into play. Inside the Orders node you will see three child Order nodes. These are the separate orders being placed by the ordering company or entity, and they need to be separated and processed individually. You may be thinking, "But how? They are in the same XML file!" This is where the magic of debatching comes in.
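As a concrete sketch of such a file (the namespace, company name and item values here are illustrative assumptions, not taken from the original figure):

```xml
<PO xmlns="http://Debatching.PO">
  <Header>
    <CompanyName>Contoso</CompanyName>
    <Street>123 Main St</Street>
    <City>Redmond</City>
    <State>WA</State>
    <Zip>98052</Zip>
  </Header>
  <Orders>
    <Order><OrderId>1001</OrderId><Item>Widgets</Item><Quantity>10</Quantity></Order>
    <Order><OrderId>1002</OrderId><Item>Gadgets</Item><Quantity>5</Quantity></Order>
    <Order><OrderId>1003</OrderId><Item>Sprockets</Item><Quantity>25</Quantity></Order>
  </Orders>
</PO>
```

The goal of debatching is to turn this one file into three separate Order messages.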
As you know, when taking in a file and processing it you must have a schema for BizTalk to match that file against. My question is: how many schemas do you think we need for debatching? If you answered two, you are correct! The first schema is what is called the Envelope. This envelope will strip off certain nodes or data in the XML file and allow you to read what is inside of that envelope, much like taking a letter out of an envelope and reading the letter. The second schema will define what you want to read; in this case it will be the Order node. To be clear, you do not need a schema for each Order node. You will see why later. So based on the XML file above, we can create the two schemas that make up the debatching solution.
First is the Envelope schema. The figure below shows what we need: an Envelope that will allow BizTalk to strip away what we don't need and expose the node that contains our Order nodes.
Take note of the Orders element and how it contains a child element called <Any>. This child element means we don't have to explicitly define the contents of the Orders node in the Envelope; that structure will come from the Document schema. As a side note, you can use the <Any> element wherever you please. I could have used the <Any> element to replace the street, city, state and zip. You will have to evaluate your needs and apply the <Any> element appropriately. There are also two properties in Visual Studio we need to set. Click on Schema at the top of the Envelope schema, go to Properties, and set Envelope to Yes. It is relatively obvious why we have to set this property.
Lastly, we need to set the Body XPath to point to the Orders node. Click on PO in the Envelope schema, and in the Properties set the Body XPath to the Orders node. A window will pop up when you click the ellipsis; all you have to do is drill down, highlight the Orders node and click OK.
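Put together, the Envelope schema described above might look roughly like the following XSD. This is a sketch, not a file generated by Visual Studio: the target namespace and field names are assumptions, and the exact form of the BizTalk annotations (which is where the Envelope and Body XPath property values end up) can differ from what the editor emits.

```xml
<xs:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003"
           xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://Debatching.PO"
           xmlns="http://Debatching.PO"
           elementFormDefault="qualified">
  <xs:annotation>
    <xs:appinfo>
      <!-- Written when you set the Envelope property to Yes on the Schema node -->
      <b:schemaInfo is_envelope="yes" />
    </xs:appinfo>
  </xs:annotation>
  <xs:element name="PO">
    <xs:annotation>
      <xs:appinfo>
        <!-- Written when you set the Body XPath property on the PO node;
             the xpath form here is simplified for illustration -->
        <b:recordInfo body_xpath="/*[local-name()='PO']/*[local-name()='Orders']" />
      </xs:appinfo>
    </xs:annotation>
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Header">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="CompanyName" type="xs:string" />
              <xs:element name="Street" type="xs:string" />
              <xs:element name="City" type="xs:string" />
              <xs:element name="State" type="xs:string" />
              <xs:element name="Zip" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element name="Orders">
          <xs:complexType>
            <xs:sequence>
              <!-- The <Any> child: no explicit structure here; the Document schema
                   defines what each Order looks like -->
              <xs:any processContents="lax" minOccurs="0" maxOccurs="unbounded" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```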
Below is the Document Schema.
The Document schema allows BizTalk to recognize and parse the Order nodes that are exposed once the nodes defined in the Envelope schema are stripped away. Again you see we have the <Any> element. I was simply being lazy and did not want to explicitly define the schema. Explicitly defining the schema will only allow you to read one structure of message; the <Any> element allows BizTalk to read anything inside the Order node, whether it be elements, records or attributes. You will have to decide what best fits your needs in your environment. One major property you need to set in the Document schema is Max Occurs. By default it is set to 1, meaning only one item can be matched, and you know there are more. Either type in "unbounded" or *. In the simplest terms, this tells BizTalk what to do with each Order it encounters and matches to the Document schema.
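For reference, the Document schema as described might look roughly like this XSD. The target namespace is an illustrative assumption (it needs to match the namespace the Order nodes actually live in for BizTalk to resolve the message type):

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://Debatching.PO"
           xmlns="http://Debatching.PO"
           elementFormDefault="qualified">
  <xs:element name="Order">
    <xs:complexType>
      <xs:sequence>
        <!-- <Any> with Max Occurs set to unbounded: BizTalk will accept
             whatever elements, records or attributes appear inside each Order -->
        <xs:any processContents="lax" minOccurs="0" maxOccurs="unbounded" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```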
All that's left to do is deploy to BizTalk, create a Receive Location and a Send Port, point them to two test folders on your computer, and drop a file into the Receive Location folder. You can recreate the file to drop from the first picture at the top of this post. With that exact file you should get three separate files in the out folder (the Send Port file location), one for each Order. Just so everyone knows, I am going to do an advanced debatching post sometime soon.
Well thanks again for tuning in. If you have any questions please comment and I will answer the best I can. See you soon!
What if your IT admin gets hit by a bus? What if a server goes down, or worse, your backups don't work? Do you have a plan? This is not so much a problem in large enterprises, but many small and medium businesses tend to overlook the "what-ifs" of their network. The three aforementioned scenarios are very real (except maybe the first one) situations that must be addressed. So here are the solutions I will go over:
1. Documentation: when you leave, will someone else be able to implement, fix or administer what you did?
2. Backups & Redundancy: why would you not do this? Backups are your lifeline when something goes wrong.
3. Testing: when you do something, test it. This is done not only before implementing but also while your solution is in use.
So, documentation first. Every business uses some sort of office suite now, and a simple Word document can go a long way. Your first goal is documenting your research: why did you pick this solution, and why did you rule out the others? Including finances will go a long way with upper management and shows you have their interests in mind too. Next, document your steps for implementing your solution. This will be the road map for future employees to learn what is going on, and also to re-implement your solution in the event of a failure. Lastly, your documentation should include the people involved in your solutions. This may be outside businesses that connect and their employees, people inside your company using different technologies, or people on your own team. Your documentation should allow you to make well-educated decisions and not leave anybody in the dark. Where I work we use Microsoft Word for our "Work Instructions" and Microsoft SharePoint Server to organize all our systems, system owners, relations, divisions and how they interact. When something goes wrong, we have all the information at our fingertips to make well-educated decisions.
Backups & Redundancy
Just do it. Hard drives today are so inexpensive there is no reason not to have a backup, or more than one. Your backup should also have a backup. I don't mean you should have three hard drives with three backups; invest in a SAN. This gives you redundancy in your storage in the event of a mechanical failure, and you can replace hard drives on the fly using hot-swappable bays. Downtime hurts badly when people cannot access storage because you had to take a server or SAN offline. Microsoft's line of server OSs offers very good storage solutions and allows you to use iSCSI, which also provides much faster data transfer. Most businesses already run Windows Server, so this will appeal to anybody concerned about finances.
Redundancy is also much more doable because hardware prices have fallen drastically. If you don't know much about redundancy, here it is in layman's terms: when one server, hard drive or other piece of hardware fails, another picks up the load; this is called "fail-over". The most common redundancy is in servers, especially with more businesses utilizing virtualization. The first step is to find out what redundancy your current solution supports. Next, find out what programs support redundancy, and also which items don't need it. A good example of built-in redundancy is Active Directory, which will replicate itself as long as there is another Domain Controller in the forest. Something to keep in mind when deploying a redundant solution is the hardware you're going to stock for maintenance. Your servers, whether SAN, application, Active Directory or whatever, should all have the same hardware. This makes hardware easier to obtain and keeps your solution less complicated by avoiding multiple vendors. And this is a requirement when utilizing redundancy: have spare parts! When a server goes down you no longer have redundancy, so you want to get the failed hardware back up and running as soon as possible.
I will use a story to paint a picture of how redundancy can save a business. A company I worked for had just installed a new server, and that same server recently failed. It was the only server; it was their basket, and all the eggs were in it. When it failed there was no other server to take over, and their business just about stopped. The person who implemented the server thought a second one would cost too much and that the server would never fail because it was new. Guess what... it failed. A thousand dollars for a second server is much cheaper than potentially losing thousands of dollars in business.
I will add to the story above to paint a picture of why you need to test. The same company had just installed that new server and thought everything was fine. When the server failed, they actually had four backups. The first backup they applied failed; so did the second and the third. Luckily, the fourth backup worked and they were able to keep doing business without much impact. Notice I used the word "luckily". I used that word because they should not have had to try backup after backup. Backups need to be tested for errors. Enough said.
So the moral of this whole entry is that you need to think through what you are doing. Thanks!
As probably many of you know, PowerShell is a very powerful tool for any programmer or IT administrator. Here is my scenario, and basically what this post is about. Currently at my job we are testing BizTalk Services in Windows Azure. Our problem is that we have to create all the necessary services and databases, which takes time, and then bring them back down to make sure we do not incur charges for them. This takes precious time and just gets monotonous after a while.
The solution to this is PowerShell. With PowerShell and the Azure module we can script pretty much all of the common tasks of standing up VMs, ACS namespaces, databases, virtual networks and services. So all we have to do before sitting down to test is double-click a script file and away we go. The first part of this is simply setting up PowerShell to use the Azure module and getting your Publish Settings file into PowerShell the correct way. Some blogs go through this but skip a major step. I will assume you have some idea of what PowerShell is and how to at least open it. Let's begin!
1. First we need to download Microsoft's Web Platform Installer. This is a great tool for installing Microsoft products and developer tools. The download is at http://www.microsoft.com/web/downloads/platform.aspx
2. After the Web Platform Installer is installed you should see a screen similar to the following.
3. In the top right hand corner search for Windows Azure PowerShell and add it to your downloads. Then click Install at the bottom right.
4. The installer will walk you through installation.
5. The installation not only includes the cmdlets but also a Windows Azure PowerShell program. This is nice because all the Azure cmdlets are loaded for you.
6. Now issue the command Get-AzurePublishSettingsFile. This opens your browser and takes you to a site to download your Azure subscription's Publish Settings file. Log in with your Live ID and you will be prompted to download the file. I recommend saving it somewhere you will be able to find it later.
7. Next issue the command Import-AzurePublishSettingsFile -PublishSettingsFile <fileLocation>. The file location is a string and will need to be in quotes. This imports all the settings for your subscription.
8. To test your settings, issue the command Get-AzureSubscription. The output is pictured below.
9. In the above picture, the most important thing to check is the IsDefault setting toward the bottom of the output. Here is the one thing most blogs forget to have you check: if this is not set to True, your commands will fail. Importing your publish settings file should take care of this, but when I first started with Azure PowerShell, for some unknown reason, it did not. It caused me a lot of headache until a co-worker pointed it out.
10. To set a subscription as the default, issue the command Set-AzureSubscription -SubscriptionName <subscriptionName>. The subscription name argument is a string and should be in quotes.
11. Issue Get-AzureSubscription again to make sure the subscription you want is the default.
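Steps 6 through 11 can be collected into one small setup script you run once per machine. The file path and subscription name below are placeholders you would replace with your own:

```powershell
# Step 6: opens a browser so you can download the subscription's publish settings file
Get-AzurePublishSettingsFile

# Step 7: import the downloaded file (the path here is a placeholder)
Import-AzurePublishSettingsFile -PublishSettingsFile "C:\Azure\MySubscription.publishsettings"

# Step 10: if IsDefault was not True, set the subscription you want (name is a placeholder)
Set-AzureSubscription -SubscriptionName "My Subscription"

# Steps 8 and 11: check the output for IsDefault : True
Get-AzureSubscription
```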
OK, well, this covers the Azure subscription setup. As soon as I have the whole script working for my project I will be sure to write about it. I hope this works for you and opens new doors in your development. Thanks!