Monday, April 30, 2007

Developing .Net Service Oriented Applications - Contract-First Approach

SO (Service Oriented) architecture, which nowadays generally implies Web Service architecture, is in vogue, and every vendor clamours to produce easy-to-use tools and IDEs. But SO, or Web Services, is more than authoring some classes and spitting out a service. In my research, I came across the following highly recommended articles by Aaron Skonnard; slightly dated but every bit as relevant today, they contain extremely valuable lessons on designing SO/Web Services that are worth remembering.

In the first article of a series introducing the concept of a "contract-first approach", he presents it by drawing a parallel with COM programming, where contract-first meant authoring the COM IDL. Contrasting this is the approach promoted by Visual Basic, which he calls "code-first": it encouraged developers to author classes which were then translated into a COM type library.

He went on to cite the following disadvantages of code-first vs contract-first:
The code-first approach made it much easier for developers using Visual Basic to build COM applications, but it came with its own set of problems.

Although it increased productivity, this code-first approach complicated versioning.
[...]
The other problem with the Visual Basic code-first approach was that it hurt interoperability in mixed-language environments.
In contrast, for the contract-first approach, he cited that
...they were more aware of the issues given their focus on the contract. An important aspect to designing a shared contract is considering the needs of all possible implementation environments before writing code.

Many developers didn't realize that they too could enjoy the benefits of contract-first development by authoring the IDL first, compiling the IDL into a type library, and then referencing the type library to bring the interface definition into the project.
He concluded his introduction with the following warning that everyone serious about developing SO should heed:
Now, not all organizations using COM were concerned with mixed-language support or long-term versioning and were more interested in improving short-term development productivity. Such organizations were perfectly content, and even successful, taking the Visual Basic code-first route despite some of the issues I've outlined. Ironically, the new world of SO is heading down the same path, dealing with a very similar set of issues.
It was drummed into me, in my early days of COM programming, that the proper way to develop COM (and, to the same extent, CORBA) was to author the IDL first, and I have never forgotten this valuable advice. Unfortunately, I have lost the reference to that article.

In dealing with Web Services,
most of today's Web services frameworks make it possible to design services using either code-first or contract-first techniques; however, most Web service development frameworks push developers towards the former. Frameworks that provide a code-first approach provide a translation layer to map between the native programming language and XSD and WSDL.
While the code-first ASMX programming technique improves productivity and reduces the learning curve, his warning should be heeded:
Experience has shown that services implemented using code-first techniques, like the default ASMX model, are less likely to interoperate cleanly in mixed-language environments. However, the difference with SO is that mixed environments are much more common—practically the norm—making interoperability a much greater design concern. The problem is compounded by the fact there are more programming languages and type systems involved in this new world than before in the component days. Now it's even more important to consider all possible implementation environments during the contract design stage.

Code-first can also complicate versioning because the developer isn't in direct control of the contract, but must rely on the translation layer throughout development. However, unlike Visual Basic 6.0, ASMX doesn't provide any fancy compatibility modes to help the developer know when making changes to the ASMX class may break contract compatibility with existing clients, which makes the situation even worse.

This is how to do the contract-first approach:
To use a contract-first approach, you start by authoring the XSD definitions to model the messages used by the service. Next you author the WSDL to define the service operations, specifying which XSD messages are sent and received. There are a variety of applications you can use to author these documents, including Notepad. These first few steps are the hardest for most developers because they aren't familiar with XSD and WSDL and because there isn't a great deal of tool support.
So how do you choose between the code-first and contract-first approaches? Here are his guidelines:
The answer is any time there is a strong emphasis on the messages exchanged or you need to ensure conformance to an existing message format. [...] Contract-first is, without question, the most natural approach in a message-oriented world, such as that of BizTalk.

Contract-first is also especially important when you require conformance to a particular contract from third parties, or when you're collaborating with other groups to define a shared contract throughout an organization. Using contract-first here allows you to solidify the contract up front while numerous parties implement the contract in parallel. All of these situations are common in industry today.
[...]
So when you own the contract, it is common sense to strive to make the contract as easy as possible for your consumers to use, which also begs for a contract-first model.

The biggest obstacle to contract-first design today is the lack of tool support, which hurts productivity. When productivity is the overriding concern, code-first can be an acceptable approach, especially when you're not particularly concerned with mixed-environment interoperability.
In his second article of this series, he describes the techniques and toolsets to support this contract-first approach.

There are not many tools integrated with VS2005 to produce the WSDL, which in his mind is the toughest of the steps; the first step, constructing the XSD schema representing the XML messages, can be accomplished with the XSD designer in Visual Studio.

The one he finds extremely useful and competent is from thinktecture and is called Web Services Contract First (WSCF). It is a free tool for VS2003 and VS2005.

These two articles are must-read material for anyone serious about developing SO or Web Services applications.

Sunday, April 29, 2007

A long journey to meet the Penguin

Since I have lost interest in Windows ME II, alias Vista, and am totally disgusted with its activation scheme and DRM turning the OS literally into a police outpost, I have decided to turn my attention to something different. I have decided to re-acquaint myself with the long-lost friend with which I began my computing career.

I have heard so much about Linux: how few resources it demands, that it is free (as an OS should be), and that it has a much stronger security system (a claim that has been questioned). Linux is a strong contender to ME II, so I decided to check out Ubuntu, a distribution or variant of Linux.

Below is my experience of meeting and installing Ubuntu:

I happen to have an old discarded laptop lying around, a Toshiba Satellite 1800 series with a Celeron 800 and 256M of RAM and Windows XP Pro installed, that I can use to check out Linux, since Ubuntu supports a multi-boot option.

Well, it is an old machine but I have been told that Linux does not require much to run. So it sounds like a perfect machine to test Ubuntu out. I am sure ME II will choke on this specification.

The first attempt used a downloaded CD image of the desktop version 7.04. I popped the CD into the machine on Wednesday evening, naively believing that I could use it to configure an ADSL modem. How wrong I was.

The Ubuntu Live CD fired up reasonably quickly, though not without drama, allowing me to experiment with the OS. While it was booting the graphical environment, an error message box popped up complaining that the GNOME Settings applet could not start. All I could do was close it; somehow it did not hold back the start-up process.

It looks very nice, and a graphical interface was definitely not part of the Unix on which I cut my teeth learning C programming back in the '70s. It's a nice addition.

Running totally from the CD was painfully slow, and you could hear the poor drive going crazy. So I decided to bite the bullet and install it onto my hard drive, which still had 6-7G spare. This was the beginning of the fun bit, revealing the not-so-polished side of Ubuntu.

I opened the Install applet on the desktop. For a period that felt like an eternity, and was indeed an eternity in computer time, the CD-ROM drive went berserk, zipping back and forth, lights blinking, while all the screen showed was a frame with the title "Install". After an hour or more, a list of locales began to appear in a combo box, but more was to come.

Not knowing what lay ahead, and not totally comfortable with risking the machine at that moment, I aborted the installation, an action I came to regret. The installation process did not give any instructions on aborting and shutting down, so I simply powered the machine down forcefully. For some reason, I did not see the red button in the top right corner of the screen that was designed to power it down.

From that moment on, I could not recover that list of locales. Running the installation from a CD-ROM should guarantee that no previous installation state is recorded, yet somehow there were variations between attempts, indicating that something was being recorded - but where? In some attempts, I left it running for hours and all I saw was an empty frame. It was not a freeze by any definition, because the CD was still being read on and off. One had to wonder what it was doing; some progress message would have been greatly appreciated. It couldn't just be reading that hundred or so locale names. Some attempts ended with an error message complaining about a failure to load the panel applet "OAFIID:GNOME_MixedApplet".

After so many failed attempts, I suspected the image might be faulty or that the machine could not tolerate version 7.04. So I tried version 6.06 (alternate).

This ISO image produced far worse outcomes. It consistently froze my machine while trying to populate the list of locales. It might have been doing some memory scanning or something, but there was definitely no CD drive activity and no hard disk activity. I even left it "running" for hours just out of curiosity.

After several attempts, I gave up on this and even considered trying Xubuntu, as it is meant for lower-spec machines. Before doing that, I decided to give 7.04 one final attempt using the "alternate" ISO image. I had no idea what "alternate" meant, and I also selected the download for machines with less than 256M, which sounded more suitable for mine.

This ISO image differed from the others in that it did not start a Live CD; it jumped straight into installation, in text mode. The best parts were that it was fast and informative compared to the graphical installation.

I was a bit apprehensive when it guided me through resizing the NTFS partition and creating the new one, as the on-screen messages were not clear enough. What the heck? It was late on a Friday night, edging into Saturday, so I went ahead.

It finally completed the installation and, after removing the CD from the drive, I pressed the button to reboot. Instead of coming up with the familiar graphical screen, after displaying some messages that quickly vanished, I was left staring at a black screen with no disk activity and no progress message.

I thought it was frozen, so I forcefully shut it down. When it restarted, it went through a recovery process akin to NTFS's Chkdsk /F. When that completed, the same black screen came back. Once again I powered it down. On the next reboot, I decided to play with it and ran the memtest. This went through some lengthy tests, and after it completed one full cycle, I decided to reboot.

Not knowing what would come next, I went off to do something else. When I returned to the Toshiba, the Ubuntu login screen awaited me. Finally, success, and the Penguin was there to welcome me!

Being a keen command-line practitioner, one of my first missions was to check out the command prompt, which is now called the Terminal. The next was to determine what kind of privileges I had been given. The installation screen should have advised me that I had been given root access rights; I only discovered this on the Ubuntu web page.

While I was exploring, I stumbled onto a panel that required elevated privileges, and I was challenged. This is very similar to ME II's UAC. The only difference is that Ubuntu required me to supply my password, even though I have root access rights, whereas UAC would only require me to click the accept button on the consent box.

I began to feel confident and tried my luck pinging the Internet. No such luck: it turned out that Ubuntu did not recognise my old PCMCIA network card, a Xircom RealPort2. So now I have the most secure Ubuntu in the world, and I was happy with the progress so far.

Many of my failed attempts might have been the result of a lack of communication from the installer. The installation program was definitely not as slick as Microsoft's; there is room for improvement. But I can put up with that, since it is free and it did not treat me as a thief by constantly demanding, in effect, "prove to me that you bought it", the most offensive customer treatment Microsoft has ever invented.

What I saw impressed me, and it's now time to explore - a spirit I have lost, for the first time, with any Microsoft OS, particularly ME II.

Passing an array of items in Delphi.Net - revisited

I have blogged before about the worrying result of passing an array of items in Delphi.Net using the "array of" syntax.

Apparently, there is another way to specify an array of items to a function that generates the same parameter type in the IL but causes the compiler to generate slightly different code inside the function, all without warning the user.

The technique is akin to C/C++'s use of typedef, except that in C/C++, typedef does not cause the compiler to generate different code; it is merely a form of shorthand notation.
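
For instance, this little C/C++ fragment (with illustrative names) compiles to exactly the same code whichever spelling of the parameter is used:

#include <stdio.h>

typedef double Vector3[3];      /* merely a new name for double[3] */

/* Both declarations name the same function; the typedef introduces
   no hidden copying and no different code in the body. */
double Sum(double v[3]);
double Sum(Vector3 v)           /* definition via the typedef */
{
    return v[0] + v[1] + v[2];
}

int main(void)
{
    double xs[3] = { 1.0, 2.0, 3.0 };
    printf("%.1f\n", Sum(xs));  /* prints 6.0 */
    return 0;
}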

Not so in Delphi.Net.

The other form of specifying an array of items in a function's argument is by means of a type alias, like this:

type
  TArrayOfString = array of String;

procedure TSample.DoSomething(ar: TArrayOfString);
begin
  // With the type alias, the compiler does not clone the array on
  // entry; ar refers directly to the caller's array.
  // ....
end;

This form of declaration instructs the compiler not to overwrite the ar argument with a cloned copy. If one examines the IL, the method's signature is identical to that produced by the "array of String" syntax, and the body is now identical to the code generated by any non-Delphi.Net language.

This is a dangerous practice because both Delphi.Net Pascal syntaxes generate the same method signature in the IL but very different code in the function body, and the developer is not told of the difference; there is nothing to see or feel. The only time a developer can sense that anything is different is when passing a nil array to the function. It is even worse if that assembly is to be used from another .Net assembly, where it can result in a difference in behaviour; see my previous blog.

The moral of the story: do not trust any assembly developed in Delphi.Net that takes an "array of" parameter. Always check the code, with the help of Lutz Roeder's Reflector or ILDasm, to determine whether the argument has been overwritten with its clone. No non-Delphi.Net language will generate code like that without explicit action from the developer.

You can test this by passing nil in any argument where you see "array of" in a method signature. If you get a NullReferenceException on entry to the method, before your source code runs, you can bet your dollar that it is a Delphi.Net assembly using an "array of" parameter.

Borland is more interested in Delphi.Win32 compatibility than in producing a compiler that generates safe, well-behaved, ECMA-conforming assemblies.

Wednesday, April 25, 2007

An unscientific way to resuscitate a Netcomm NB5 Modem/Router

Recently, I was helping a friend sort out their Internet connection when they were changing ISPs, and they wanted to see if they could reuse their modem/router, a Netcomm NB5, which they actually suspected was slightly faulty.

Of course, I needed to get into the modem to alter the settings to work with the different ISP. Unfortunately, they had forgotten the password, which had been changed from the factory default.

To cut a long story short, they also wanted to get a WiFi router if they could reuse this modem, so I decided to take it home and use it to replace the NB1300 Plus4 in my home network infrastructure.

After a factory reset, I managed to fire up the modem/router and connect to my ISP. But when I switched it over to operate as a bridge providing modem service to my NetGear router, I accidentally put the NB5 into a state in which I could no longer acquire an IP address from it. I could not ping it and obviously could not reach it with a web browser. It was dead for all intents and purposes.

I had met this kind of situation - a comatose modem/router - before, but with a different brand. In that case, since it was still under warranty, I sent it back.

Since I could not use the comatose modem any more, I left it powered down for a day or two and tried again, with the same result.

Suddenly, I vaguely recalled that people had tried to hack into or bypass PIN-protected car stereo units by putting the device into a freezer. Out of desperation, and seeing that it couldn't do any more harm, I gave this a try.

I stuck the NB5 into the freezer for about 10 to 15 minutes (I nearly forgot about it) and then took it out, nice and cold like a block of ice, to thaw, making sure that I wiped away the moisture as it thawed.

When it reached a temperature still somewhat below room temperature (I did say this was not very scientific, didn't I?), I powered it up, did another factory reset, and plugged in my PC. A minute or so later, the PC managed to acquire an IP address and I was able to browse to the settings page.

That boosted my confidence, and I went on to change the NB5 into a bridge providing connectivity to my NetGear router. It will be my modem for the next few days, to determine whether it is faulty.

How Security Companies Sucker Us With Lemons

Recently, Bruce Schneier wrote an article of this title in which he attempts to explain why so many bad security products are available on the market.

He rationalises the situation using a theory proposed by George Akerlof in his paper "The Market for Lemons".

While Bruce directs his attention to security devices like secure USB drives and firewalls, much of his argument applies equally well to most software packages on the market.

According to Akerlof's theory, the producer of the goods (in this case software) knows the product far better than the buyers do, and hence the buyers are always placed at a disadvantage. Most software buyers are suckers for the buzzwords used by software producers.

I know intimately of one case in which the producer of a piece of software proclaims that it uses a relational database. But of course, the vendor never tells the buyers how well it actually exploits this technology; if the buyers knew, they would definitely be fuming.

In another case, when multi-tier architecture became more fashionable than the old client-server style, the vendor changed its tune to proclaim a multi-tier design when in reality the product was still a two-tier client-server architecture.

Akerlof went on to suggest breaking this vicious circle with signals from sources such as knowledgeable mechanics who can check used cars on behalf of buyers, or someone else providing an honest assessment of the goods.

I have long been an advocate of software advocacy on behalf of software consumers. At the moment, the playing field is severely tilted in favour of the software producer; no wonder so much badly written software is peddled on the market.

Wednesday, April 18, 2007

A simple way to build multiple STA COM server

In .Net, particularly when building a Web Service provider, the preferred apartment type is MTA. In this situation, if you have to use functionality provided by a legacy COM in-proc server with ThreadingModel=Apartment, the performance of the resulting mixed-model solution is severely impaired.

The problem is fully documented in MSDN under the Mixed Model Development section, and interested readers should consult that article.

The way to overcome this is to construct a multi-STA server, with each STA hosting just one STA COM object. That way, each client living in an MTA can call its STA COM object independently of other invocations.

This kind of construction requires some administrative work and hence is not as simple as one might think.

However, for most COM in-proc servers there is a much easier way around the problem: employ COM aggregation to host the STA COM component that the MTA clients need to use.

A step-by-step guide to aggregating another COM component using ATL can be found here.
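
To give a feel for the shape of the code, below is a rough ATL sketch of such an aggregating class. Treat it as an outline of the technique under assumed names, not working code for any particular component: CAggregater, IAggregater, CLSID_Aggregater, CLSID_LegacyStaComponent and IID_ILegacy all stand in for whatever your own IDL generates.

#include <atlbase.h>
#include <atlcom.h>

class ATL_NO_VTABLE CAggregater :
    public CComObjectRootEx<CComSingleThreadModel>,
    public CComCoClass<CAggregater, &CLSID_Aggregater>,
    public IAggregater                    // the unused placeholder interface
{
public:
    CAggregater() : m_pUnkInner(NULL) {}

    DECLARE_GET_CONTROLLING_UNKNOWN()     // provides GetControllingUnknown()
    DECLARE_PROTECT_FINAL_CONSTRUCT()     // guards the refcount in FinalConstruct

    BEGIN_COM_MAP(CAggregater)
        COM_INTERFACE_ENTRY(IAggregater)
        // Forward QueryInterface for the legacy interface to the inner object.
        COM_INTERFACE_ENTRY_AGGREGATE(IID_ILegacy, m_pUnkInner)
    END_COM_MAP()

    // Create the legacy STA component as an aggregated inner object,
    // passing our controlling IUnknown as its outer unknown.
    HRESULT FinalConstruct()
    {
        return ::CoCreateInstance(CLSID_LegacyStaComponent,
                                  GetControllingUnknown(),
                                  CLSCTX_INPROC_SERVER, IID_IUnknown,
                                  reinterpret_cast<void**>(&m_pUnkInner));
    }

    void FinalRelease()
    {
        if (m_pUnkInner) m_pUnkInner->Release();
    }

    IUnknown* m_pUnkInner;                // the inner (aggregated) unknown
};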

The construction recipe for using COM aggregation to build a multiple-STA server is given below:
1) Create a standard COM local (executable) server that contains a COM component with an interface of your choosing. This interface need not contain any useful methods and is not used at all. If you are developing this with Delphi, beware of the problem.

I recommend the use of ATL as it is a nice and skinny framework.

2) Then use the COM development framework to aggregate the legacy STA COM in-proc server, along the lines of the sketch above. It is often extremely difficult to tell whether a COM component supports aggregation; simply building and testing it is a good way to find out, since aggregation is a construction technique and leaves no detectable COM registration information. Incidentally, all managed COM components automatically support COM-style aggregation, and this can be used to construct a .Net COM local server.

3) Make sure the COM local server contains only STAs by having each thread call CoInitialize(0) or CoInitializeEx(0, COINIT_APARTMENTTHREADED).

4) When the COM server registers its class object, make sure it uses the REGCLS_SINGLEUSE flag, forcing each server process to support only one COM object, and thus one STA in-proc server by virtue of aggregation. (A sketch of steps 3 and 4 follows this list.)
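
To make steps 3 and 4 concrete, here is a minimal sketch of the server's start-up code, again with hypothetical names (the class factory for CAggregater is assumed to exist elsewhere) and with error handling and lifetime management trimmed:

#include <windows.h>
#include <objbase.h>

extern const CLSID CLSID_Aggregater;    // from the project's IDL (assumed)
extern IClassFactory* g_pFactory;       // class factory for CAggregater (assumed)

int WINAPI WinMain(HINSTANCE, HINSTANCE, LPSTR, int)
{
    // Step 3: the main thread joins an STA, so the objects it creates
    // live in a single-threaded apartment.
    ::CoInitializeEx(0, COINIT_APARTMENTTHREADED);

    // Step 4: REGCLS_SINGLEUSE satisfies only one activation request;
    // the next activation launches a fresh server process, giving one
    // STA-hosted object per process.
    DWORD dwRegister = 0;
    ::CoRegisterClassObject(CLSID_Aggregater, g_pFactory,
                            CLSCTX_LOCAL_SERVER, REGCLS_SINGLEUSE,
                            &dwRegister);

    // Standard message loop until the server decides to shut down.
    MSG msg;
    while (::GetMessage(&msg, 0, 0, 0) > 0)
        ::DispatchMessage(&msg);

    ::CoRevokeClassObject(dwRegister);
    ::CoUninitialize();
    return 0;
}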

This technique has one great advantage: you can leave the client source code unchanged except for the server instantiation code. Instead of creating the in-proc STA server directly, you create the COM object belonging to the aggregater server.

Because you are interested only in the interface belonging to the aggregated component, you can immediately cast the result to the target interface, as sketched below.
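
Roughly, the client-side change looks like this (using the same hypothetical names as above):

#include <objbase.h>

// Instead of instantiating CLSID_LegacyStaComponent in-proc, activate
// the aggregater out-of-proc and ask straight away for the legacy interface.
ILegacy* pLegacy = 0;
HRESULT hr = ::CoCreateInstance(CLSID_Aggregater, 0,
                                CLSCTX_LOCAL_SERVER, IID_ILegacy,
                                reinterpret_cast<void**>(&pLegacy));
if (SUCCEEDED(hr))
{
    // pLegacy is a proxy to the STA-hosted inner object; calls made
    // from this MTA are marshalled into the aggregater's STA.
    // ... use pLegacy ...
    pLegacy->Release();
}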

This technique is not without costs, and they are:
* It trades the serialisation of calls from the MTA into STA components for context-switching and data-marshalling costs. Typically the STA component exposes a fine-grained object model, and any navigation will incur these costs; hence one should minimise the use of dotted expressions (see the sketch at the end of this post).

* It uses more processes than otherwise required.

* The involvement of a COM local server could result in a situation where the server does not terminate.

* Memory consumption can remain relatively high if the client is a .Net solution. This is due to the involvement of two object life-cycle management schemes: the .Net RCW, which relies on the GC to reclaim resources, and COM, which relies on reference counting. In particular, if dotted expressions are used to navigate down the object model, they generate a lot of internal RCWs, with each level of the dotted expression corresponding to a COM object.

These internal RCWs are only disposed of when the GC collects them, and until they are, the corresponding COM objects are not released. This increases memory demand above what it would be if both client and server were unmanaged code. The way around this is to avoid using dotted expressions.
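
As an illustration of minimising dotted navigation, here is a hedged sketch for an unmanaged client, using a hypothetical ILegacyOrders collection on the ILegacy model from above: fetch an intermediate interface once, reuse it, and release it promptly, rather than re-navigating the dotted path on every call.

// Hypothetical object model: ILegacy exposes an Orders collection.
ILegacyOrders* pOrders = 0;
if (SUCCEEDED(pLegacy->get_Orders(&pOrders)))
{
    long count = 0;
    pOrders->get_Count(&count);         // reuse pOrders for every call,
    for (long i = 0; i < count; ++i)    // rather than walking
    {                                   // pLegacy->Orders->Item(i) each time
        // ... pOrders->Item(i) etc., one cross-process call per step ...
    }
    pOrders->Release();                 // deterministic release; no RCW or
}                                       // GC is involved in an unmanaged client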