A site devoted to discussing techniques that promote quality and ethical practices in software development.

Monday, December 6, 2010

Are bookshops ripping us off with eBooks?

At the moment I am looking for a couple of technical books, and since the fad these days is eBooks, I am going to evaluate whether or not eBooks offer real advantages.

Remember, a physical book needs paper, ink, a printing press to produce it, bookbinding to bind the pages, warehouse storage facilities and transportation. The costs associated with these processes are not insignificant.

eBooks, on the other hand, do not require any of those processes. In fact, it is rare to produce a physical book without an electronic manuscript in one form or another that can readily be converted to an eBook format, if that has not already been done. It is well known that for electronic products - be they software or computer games - the reproduction cost is almost zero. I have left out the royalty to the author and the merchant's profit. It is not unrealistic to assume that the royalty is the same for an eBook and a physical book, so any cost savings must come from the above costs of producing the physical book.

Hence, an eBook should be a lot cheaper than a physical book. Not quite: the comparisons below demonstrate that eBooks in fact offer negative economic benefits. The tables show the book prices without handling and shipping costs.

Web Service Contract Design and Versioning for SOA
Bookshop         | eBook Price | eBook Protection | Book Price | Savings | Used Price
Amazon           | $28.79      | DRM              | $42.89     | $14.10  | $31.52
Barnes & Noble   | N/A         | N/A              | $43.99     | $0.00   | $33.99
informIT         | $39.59      | Watermarked      | $49.49     | $9.90   | N/A

SOA with .NET & Windows Azure: Realizing Service-Orientation with the Microsoft Platform
Bookshop         | eBook Price | eBook Protection | Book Price | Savings | Used Price
Amazon           | $35.19      | DRM              | $42.89     | $7.70   | $42.06
Barnes & Noble   | $43.99      | DRM              | $46.71     | $2.72   | $33.05
informIT         | $39.59      | Watermarked      | $49.49     | $9.90   | N/A

The savings are not uniform; they vary from book to book and from bookshop to bookshop, ranging from a pittance of under $3 to $14.

Some physical books, despite all these expensive production and storage costs, are cheaper than eBooks. Amazon's price for the physical copy of the second book is cheaper than Barnes & Noble's eBook version. The 1's and 0's used by Barnes & Noble must be more expensive than the ink and paper used by Amazon!

Now clearly someone is taking advantage of the latest fad to charge for something that costs zero dollars to produce and reproduce.

The above figures are just a pure price comparison; now consider the real economic disadvantages of eBooks:
  1. For those using DRM, even the owner cannot print out a diagram or pages - after all, it is something you have paid for. At least you can run a physical book over a photocopier or scanner. Is that pittance in savings enough to compensate for the loss of freedom to use the things you have bought?
  2. Loss of the second-hand book market. While the above used-book prices are indicative, there is nonetheless a market for used copies in which you can participate without restriction. All eBooks bar you from that market, and their form of compensation is only a pittance. Even allowing a discount of 50% for a used copy, you can sell it for more than the savings you can get from an eBook.
  3. You can't even share part of the book or form a syndicate to buy a book. With a physical book, no one forbids you from tearing the book apart to share it. You can tear out pages to lend to others or to give away. You have total freedom. Not so with an eBook. Amazon generously allows you to have 6 devices, but they all need the access code to your account. Barnes & Noble has a lending facility, but only for 14 days.
  4. No one can take a physical book away from you as long as you have paid for it. Not so with an eBook. Even if the bookshop does not have the rights to sell a physical book, so long as you have paid for it and walked out of their doors, that book is yours and they can't take it away.
  5. How soon do the eBook savings recoup the investment cost of an eBook reader? The only outlay for a physical book is your bookshelf, if you want to store it that way. The floor is perfectly good free storage space for books.
I have to acknowledge that there are benefits of eBooks that physical books are not capable of offering, such as the ability to carry almost a library-ful of eBooks with you, enhanced searchability, and almost instantaneous delivery with no delivery charges. Then again, Amazon does not charge for delivery of purchases over a certain amount to a US address.

In my opinion, eBook merchants are definitely taking advantage of the current hype and fad to overcharge customers. To my mind, an eBook offers more restrictions than a physical book and prevents the owner from disposing of it on the second-hand market, which can return a considerable amount to the owner.

In fact, it is not uncommon to find an eBook costing more than its hardcover version. The debate over whether eBooks should cost so much is raging on the Internet, with no end in sight as long as consumers are willing to pay for the hype and fad.

Sunday, December 5, 2010

Using WCF to produce Web Service Contract documents that must use a supplied schema

This post describes a very common scenario in the SOA/Web Service world. To avoid chaos in the SOA world, practitioners are encouraged to use "Standardization of Service Data Representation" so that services and clients communicate using a standardized or common vocabulary. These may be standard data representations specified within an enterprise or by trade groups like MIMOSA, and are not necessarily standards endorsed by the W3C.

The problem at hand is to produce service contract documents that a system needs to interact with abstractly, where the data interchanges must use a standardized or common representation. For ease of discussion, let's assume the standardized data is from ACME Enterprise and supplied in a schema file called ACMEEnterprise.xsd.

One way to do this is to use a process called Contract-First or Schema-First. This gives the designer maximum control over what to put into the Service Contract. The Service Contract, including the WSDL and XSD, is not just for machines to execute; it also contains information that is useful to service producers and consumers. However, it is not for the faint-hearted; it is only for the most determined soul.

Instead of using Contract-First, in this post I describe the necessary steps and settings for using WCF to generate Service Contract documents that use the prescribed data representation. In the frequently described scenarios, the developer is responsible for specifying the data and service contracts. But in the problem at hand, much of the data representation, or data contract, is predetermined.


Designing a Service Contract using WCF conforming to supplied data representation

As stated previously, the data representation is supplied in ACMEEnterprise.xsd and the targetNamespace is "http://ACMEEnterprise.org/2010/12/ACMEEnterprise.xsd". All data or message exchanges must use types specified in this schema file. The Service Contract may specify other data contracts as it sees fit, but when it describes data for ACME Enterprise, it must use the types specified in ACMEEnterprise.xsd.

Step 1 - Produce the .Net classes

The first step is to convert the types specified in ACMEEnterprise.xsd into .Net classes that we can use in WCF constructs. For illustration purposes, I use C#, but you can use any other .Net language.

This can be achieved by using XSD.exe or SvcUtil.exe. Most WCF materials recommend using SvcUtil.exe, but if the schema is specified using the full set of W3C XSD Schema syntax, the chance that SvcUtil can process your schema file is slim. The reason is that SvcUtil is designed to work with the DataContractSerializer, which maps CLR data types to XSD types and vice versa, and the Data Contract model only supports a limited subset of the W3C Schema specification.

You have a better chance of a successful conversion by using XSD.exe, provided that you follow the caveat. Which CLR namespace you choose is immaterial and does not affect the wire format; it is a local artifact affecting only your .Net solutions.
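As a rough sketch, and assuming the classes are to live in an ACME.Enterprise CLR namespace (the namespace and output directory are arbitrary choices), the conversion command looks something like this:
xsd.exe ACMEEnterprise.xsd /classes /language:CS /namespace:ACME.Enterprise /outputdir:Generated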

Step 2 - Incorporate the generated file into your project

The next step is to incorporate the generated file into your project and begin using the types in that file to design your Service Contract, as if you were dealing with normal WCF DataContract types.

The main thing to note is that each generated class is adorned with an XmlTypeAttribute declaring the Namespace corresponding to the targetNamespace in the ACME schema, like this:
[System.Xml.Serialization.XmlTypeAttribute
        (Namespace="http://ACMEEnterprise.org/2010/12/ACMEEnterprise.xsd")]
    [System.Xml.Serialization.XmlRootAttribute("employee",
          Namespace="http://ACMEEnterprise.org/2010/12/ACMEEnterprise.xsd", 
          IsNullable=false)]
    public partial class Employee : Person {
        // . . . 
    }

It is vitally important that this namespace is maintained when we generate the Service Contract documents for this type.

Step 3 - Mark the Service Contract with XmlSerializerFormatAttribute

This is a very important point. You may apply this attribute only to those Contract Operations that need to use the XmlSerializer. In my case, since every operation uses this serializer, I apply the attribute to the entire service contract. If you do not apply this attribute, types that came from ACMEEnterprise.xsd will be placed under the targetNamespace "http://schemas.datacontract.org/2004/07/ACME.Enterprise", like this:
<?xml version="1.0" encoding="utf-8"?>
<xs:schema xmlns:tns="http://schemas.datacontract.org/2004/07/ACME.Enterprise" 
 elementFormDefault="qualified" 
 targetNamespace="http://schemas.datacontract.org/2004/07/ACME.Enterprise" 
 xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:complexType name="Person">
    <xs:sequence>
      <xs:element name="ageField" type="xs:int" />
      <xs:element name="firstNameField" nillable="true" type="xs:string" />
      <xs:element name="genderField" type="tns:Gender" />
      <xs:element name="hobbyField" nillable="true" type="xs:string" />
      <xs:element name="lastNameField" nillable="true" type="xs:string" />
      <xs:element name="secretNumberField" nillable="true" type="xs:long" />
    </xs:sequence>
  </xs:complexType>
This effectively produces a different type on the wire. Using XmlSerializerFormatAttribute retains the original targetNamespace.
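As a minimal sketch - the contract name and operation here are hypothetical, the attribute usage is the point - applying it at the contract level looks like this:
[ServiceContract]
[XmlSerializerFormat]   // forces the XmlSerializer so the ACME targetNamespace is preserved
public interface IEmployeeService {
    [OperationContract]
    Employee GetEmployee( string employeeId );   // Employee comes from the XSD.exe generated file
}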

Step 4 - Build the WCF Service Library and Generate the Contract documents

After you have finished building the WCF Service Library, you can use SvcUtil.exe to produce the Service Contract documents. The process generates the WSDL as well as the companion XSD files. While this process, with the aid of XmlSerializerFormatAttribute, preserves the targetNamespace, as shown below, for the types used in this service library and places them in an XSD file resembling the original one, the process is at best of low fidelity. That is, it loses information present in the original documents that is deemed not needed by WCF, such as the <annotation>, <documentation>, and <restriction> elements. The generated schema file also contains only a subset of the types described in the original schema; it includes only the types used in the service.
<?xml version="1.0" encoding="utf-8"?>
<xs:schema xmlns:tns="http://ACMEEnterprise.org/2010/12/ACMEEnterprise.xsd" 
 elementFormDefault="qualified" 
 targetNamespace="http://ACMEEnterprise.org/2010/12/ACMEEnterprise.xsd" 
 xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:complexType name="Person">
    <xs:sequence>
      <xs:element minOccurs="0" maxOccurs="1" name="FirstName" type="xs:string" />
      <xs:element minOccurs="0" maxOccurs="1" name="LastName" type="xs:string" />
      <xs:element minOccurs="1" maxOccurs="1" name="Age" type="xs:int" />
      <xs:element minOccurs="1" maxOccurs="1" name="Gender" type="tns:Gender" />
      <xs:element minOccurs="1" maxOccurs="1" name="SecretNumber" nillable="true" type="xs:long" />
      <xs:element minOccurs="1" maxOccurs="1" name="Hobby" nillable="true" type="xs:string" />
    </xs:sequence>
  </xs:complexType>

You may replace the regenerated schema file for the ACME Enterprise types with the original one without problems, so that the contract retains all the valuable information.
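For reference, a hedged sketch of the export step itself; the assembly name and output directory are assumptions:
svcutil.exe /directory:Contracts ACMEServiceLibrary.dll
This emits the WSDL and the companion XSD files into the Contracts directory.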

The process described here is a low fidelity process. It does not allow the designer to include annotations in the WSDL file. The only way to create a full fidelity Service Contract is to use the Contract-First process, authoring the messages first and then the WSDL. This will be reported in a full post.


"SOA Principle of Service Design" Section 6.3 "Types of Service Contract Standardization"

Wednesday, December 1, 2010

Caveat in using xsd.exe

Here is a trap that many can fall into when using xsd.exe:
1) If your XSD schema file uses <xsd:import> to pull in schemas from different target namespaces, xsd.exe ignores the schemaLocation attribute value.

In this case, you need to specify those imported xsd files on the command line, as the sketch after this list shows.

2) If the document uses <xsd:include>, xsd uses the schemaLocation attribute value.
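For case 1, the command might look something like this; the file names are purely illustrative:
xsd.exe MainSchema.xsd ImportedTypesA.xsd ImportedTypesB.xsd /classes /language:CS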

While it is great to use the latest and greatest, in many situations, particularly when you are given an XSD authored on another platform, SvcUtil.exe /dcOnly will frequently be unable to handle XSD Schema syntax that XSD.exe can handle.

In that situation, you can alternatively use /importXmlTypes (/ixt) and end up with types that implement IExtensibleDataObject, something that exists only in the Microsoft world and may even impact interoperability; a J2EE Web Service certainly does not have this.

Sunday, November 14, 2010

Beware when using AppDomain.BaseDirectory in NUnit

In my previous post I accused Uri() of exhibiting some quirky behavior:
The reason is that depending on whether the base uri for file protocol is your current execution directory, that constructor will drop the last component of your base uri if the current directory is not your base uri  when forming the final uri.
This is indeed a wrong accusation. System.Uri() functions correctly in accordance with RFC 3986. For example:
Assert.AreEqual( @"C:\A\B\test.txt",
                 (new Uri( new Uri(@"C:\A\B\C"), "test.txt")).LocalPath );
Assert.AreEqual( @"C:\A\B\C\test.txt",
                 (new Uri( new Uri(@"C:\A\B\C\"), "test.txt")).LocalPath );

In a normal application, like a console or WinForms application, the following assertion holds:
Debug.Assert( AppDomain.CurrentDomain.BaseDirectory.EndsWith( @"\" ) );

But when executing the above statement in a TestFixture class in NUnit, the assert fails. This is because when NUnit defines AppDomainSetup.ApplicationBase, it fails to include a trailing '\'. NUnit simply uses Environment.CurrentDirectory, which does not have a trailing '\'. Incidentally, the sample code makes the mistake of assuming Environment.CurrentDirectory ends with a '\', resulting in an incorrect path to the application configuration file.

Normally, this discrepancy does not matter if you use System.IO.Path.Combine() to form path names. However, when log4net is used inside NUnit it becomes an issue, because log4net uses Uri( Uri, string ) to form the Uri to the log4net configuration file from the value of AppDomain.CurrentDomain.BaseDirectory. The end result is a wrong Uri to the configuration file. To overcome this problem in NUnit, it is therefore wise to specify a full path name for the configuration file in NUnit's project configuration file, as recommended.
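A minimal sketch of the defensive normalization one could apply before using the base directory as a base Uri; the configuration file name here is illustrative:
string baseDir = AppDomain.CurrentDomain.BaseDirectory;
if ( !baseDir.EndsWith( @"\" ) )
    baseDir += @"\";                 // keeps Uri(Uri, string) from dropping the last path segment
Uri configUri = new Uri( new Uri( baseDir ), "log4net.config" );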

After all these experiments, I am none the wiser about the official convention for AppDomain.BaseDirectory and Environment.CurrentDirectory. At least with Uri, the behavior is unambiguously defined in the RFC. Perhaps someone from Microsoft reading this would care to comment and let the world know if there is such a convention at all.

Saturday, November 13, 2010

Log4Net configuration issue when using with NUnit - Solutions & Explanation

Log4Net is a widely used logging framework, and an application has to configure it in order for your instructions - such as where to log your messages, what to ignore, etc. - to be carried out. However, there are several quirks one has to be aware of, particularly when using it with NUnit. This post uses NUnit version 2.5.7 and log4net version 1.2.10, and hence the comments below are relevant to this version combination.

However, when used with NUnit, a report has been filed that it does not generate any log file. In other words, the combination fails to find the required configuration file. The report accused NUnit of being the culprit, but I can assure readers that it is not entirely to blame. This has triggered a number of unnecessary workarounds.

In this post I will give you a number of solutions first, and then go into a detailed explanation of what causes this quirky behavior. It is neither party's fault; rather, it exposes an issue with how log4net loads its configuration, which is not a fault either.

If you are not interested in the technical discussion, just skip the second section.

Solutions

Solution 1 - No need to write code

I prefer this solution because it fits perfectly into "The Most Beautiful Code I Never Wrote" [1].

In order to use NUnit with your test assemblies - which may or may not contain log4net logging calls, and which call into the deployed assemblies being tested that do contain log4net calls - do the following:
1) Create an NUnit project config file, which usually lives in the same directory as your NUnit project file, if one is not already present.
2) In the config file, define the log4net.Config key:
<configuration>
  <appSettings>
    <add key="log4net.Config" 
   value="E:\Libraries Tests\Log4NetDemo\Unit Tests\SimpleLog4net-UnitTest.config"/>
  </appSettings>
</configuration>
The value of that key defines the log4net configuration file that will be used to initialize the default repository.

That is all there is to it! Nice and simple.

Now when you run your NUnit, all your log4net calls will follow the instructions specified in that configuration file.
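In case it helps, here is a minimal sketch of what such a configuration file might contain; the appender name and log file name are purely illustrative:
<log4net>
  <appender name="TestFileAppender" type="log4net.Appender.FileAppender">
    <file value="unit-test.log" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %-5level %logger - %message%newline" />
    </layout>
  </appender>
  <root>
    <level value="DEBUG" />
    <appender-ref ref="TestFileAppender" />
  </root>
</log4net>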

It is recommended to use a full path name when specifying the configuration file, due to some quirky behavior of the System.Uri( Uri, String ) constructor. This will be explained in the Technical Discussion section.

Solution 2 - Use work around

While workarounds generally work, they are not very scalable, particularly when you have a collection of test fixture classes and/or assemblies. You need to ensure that no matter which test assemblies are invoked, the configuration file is still used. Here is the recipe for a better workaround:
1) Create a class NUnitLog4NetLogManager which has a method called GetLogger( Type type ) and package it inside a helper assembly:
using System;
using System.Reflection;
using log4net;

static public class NUnitLog4NetLogManager {
  public static ILog GetLogger( Type type ) {
    // Capture the caller's assembly so its XmlConfiguratorAttribute can be honoured.
    return GetLogger( Assembly.GetCallingAssembly(), type );
  }
  // ...
}

2) NUnitLog4NetLogManager.GetLogger() uses the given assembly object to attempt to retrieve any custom attribute of type log4net.Config.XmlConfiguratorAttribute. Once found, it uses the attribute to retrieve the config file name and the watch state. With these pieces of information it forms the full pathname of the configuration file using AppDomain.CurrentDomain.BaseDirectory, if the config file is not already a full path.

It then forms a System.IO.FileInfo with which it calls XmlConfigurator.Configure() or XmlConfigurator.ConfigureAndWatch() to load the configuration into log4net.

3) Once step 2 is completed, it calls LoggerManager.GetLogger(Assembly, Type) to grab the logger and return it to the caller.
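A hedged sketch of that assembly-aware overload follows. It is only one way to implement steps 2 and 3 (it additionally needs a using directive for System.IO, and it returns the ILog via LogManager.GetLogger(Assembly, Type)); error handling is deliberately minimal:
public static ILog GetLogger( Assembly assembly, Type type ) {
  object[] attrs = assembly.GetCustomAttributes(
      typeof( log4net.Config.XmlConfiguratorAttribute ), false );
  if ( attrs.Length > 0 ) {
    var cfgAttr = (log4net.Config.XmlConfiguratorAttribute) attrs[0];
    string configFile = cfgAttr.ConfigFile;
    if ( !Path.IsPathRooted( configFile ) )
      configFile = Path.Combine( AppDomain.CurrentDomain.BaseDirectory, configFile );
    FileInfo fileInfo = new FileInfo( configFile );
    // Honour the Watch flag declared on the attribute.
    if ( cfgAttr.Watch )
      log4net.Config.XmlConfigurator.ConfigureAndWatch( fileInfo );
    else
      log4net.Config.XmlConfigurator.Configure( fileInfo );
  }
  return LogManager.GetLogger( assembly, type );
}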

This technique allows users to specify the preferred configuration file in the familiar way, like this:
// AssemblyInfo.cs
[assembly: log4net.Config.XmlConfigurator(ConfigFile="MyTest.config")]

and to create the logger in a TestFixture class like this:
// MySimpleTest.cs
[TestFixture]
public class MySimpleTest {
  static ILog log = NUnitLog4NetLogManager.GetLogger( typeof(MySimpleTest) );
  [Test]
  public void SimpleTest() {
  // ....
  }
} 

This has an advantage over simpler workarounds in that any class included in this assembly can use the familiar syntax to get a logger and, in doing so, indirectly load the configuration file.

Technical Discussion

Typically, an assembly can include an assembly-level log4net custom attribute, XmlConfiguratorAttribute, to specify the configuration information. When a situation arises that requires loading the configuration file, log4net's internal logic looks for an instance of this custom attribute in a given assembly to find the configuration file, etc. This class is located in src\Config\XmlConfiguratorAttribute.cs.

In theory, each assembly can use this to specify a different configuration file. log4net does not process or configure its system until the first reference that creates a logger or an appender. But in practice, the first logger created determines the configuration file used, and subsequent calls to create different loggers will not load the config files specified in their assemblies.

There is really no need to have separate configuration files as one can control all loggers in one configuration file.

Bearing this piece of information in mind will help in understanding the issue at hand.

When NUnit is launched, and before your test assemblies are run, code inside NUnit causes NUnit.Core.Log4NetCapture.StartCapture() to execute. This is an override of its parent abstract class TextCapture. The result is that it causes a log4net.Appender.TextWriterAppender to be created.

When this happens, it somehow triggers a call to log4net.Config.BasicConfigurator() for this appender. This in turn causes the creation of a logger repository for the assembly nunit.core, which in turn starts looking for repository custom attributes.

Since these custom attributes are rarely used and nothing is found, it then looks for an instance of the custom attribute log4net.Config.ConfiguratorAttribute or log4net.Config.XmlConfiguratorAttribute in nunit.core to configure this default repository. Note that it is not your assembly that is used to create the logger repository; it is nunit.core.

If no such custom attribute is found, it will then look for a value for the log4net.Config key. If a value is returned, it uses it to configure this default repository (see DefaultRepositorySelector.ConfigureRepository(Assembly, ILoggerRepository) in src\Core\DefaultRepositorySelector.cs). The logic eventually calls XmlConfigurator.Configure() to load and configure the default repository.

If nothing is found, it simply leaves this default repository in its default initialized state.

After all this is completed, NUnit begins to execute the first TestFixture class it can find. When that happens, assuming the logger is created in static-initializer style, NUnit calls into log4net to create a logger identified by the given type or string. log4net grabs the calling assembly to begin creation of the logger.

Instead of looking for the custom attribute you have injected into the AssemblyInfo.cs file, for example, to specify the log4net config file, log4net now has a default repository and no longer needs to go through the motions of locating those custom attributes.

The result causes people to believe NUnit or log4net is responsible for not finding their configuration file. This is because they do not realize NUnit is the culprit that causes the creation of a default repository in its default initialized state. It is not that log4net can't find your log4net config file, no matter how conveniently you have placed it. It is simply that by then log4net does not need to look for it.

log4net then creates a logger and stashes it into the default repository - which happens to be in its default initialized state - and that is the repository your logger will use when called upon. It then returns this logger back to your code.

When you call the logger (ILogger) methods to log messages, the repository is found to be uninitialized and hence it will not generate the expected log file (if you use a FileAppender, for example), leading to the submission of the bug report against NUnit.

In reality, NUnit version 2.5.7 perhaps happens, by accident, to trip the load-once behavior of log4net described above, and hence when it comes to your test assembly, log4net simply ignores the directive placed inside your config file.

Using Solution 1 causes the creation of the default repository to use that file for initialization, thereby ensuring the logger obeys your instructions.

With respect to Solution 1, I mentioned that there is a quirky issue with System.Uri(Uri, string) that makes me recommend the use of a full path name for the value of the key. The reason is that the constructor drops the last component of the base uri when forming the final uri if the base uri does not end with a path separator, and whether it does depends on where the base directory came from. The following code fragment illustrates the issue:
// The current directory in MyTestAssembly is:
String curDir = @"C:\A\Test\MyTestAssembly\bin\debug";
Debug.Assert( Environment.CurrentDirectory == curDir );
Uri x = new Uri( new Uri(@"C:\A\Test\MyTestAssembly"), "hello.txt" );
Debug.Assert( x.LocalPath == @"C:\A\Test\hello.txt" );  // last segment dropped - no trailing '\'

Internally, log4net uses AppDomain.CurrentDomain.BaseDirectory as the base Uri, and this may or may not be the current directory. The uncertainty is very hard to detect, since log4net does not throw any exception when the config file does not exist. For this reason, it is best just to use a full path name for your configuration file; this causes Uri(Uri, string) to ignore the base uri.

The workaround specifies a full path to the custom log4net config file when constructing the FileInfo object passed to XmlConfigurator.Configure(). This is not strictly necessary, because at that point in time your current directory is the target directory of your test assembly, and FileInfo() will always use the current directory to fully qualify the supplied file name if it is not an absolute path.



[1] "Beautiful Code - Leading Programmers Explain How They Think" Chapter 3 "The Most Beautiful Code I never wrote".

Sunday, October 24, 2010

GMail mishandles e-mail addresses - '.' is insignificant in GMail

Tonight I received a spam mail sent to A.BCD.HelloWorld@gmail.com and was intrigued how it could arrive in the inbox of ABCD.HelloWorld@gmail.com, my proper GMail account.

By the way, the above e-mail addresses are fictitious, containing only structural information - like the presence or absence of a '.' - to illustrate how GMail mishandles e-mail addresses.

So I did some experiments. I sent an e-mail from my Hotmail account to A.BCD.HelloWorld@gmail.com and, lo and behold, it arrived in ABCD.HelloWorld@gmail.com.

I repeated this with several other GMail accounts, some with no '.' in the address; I could add as many '.' characters as I liked and the messages were obediently delivered to the address without the '.'.

I took another of my GMail accounts, say OneBrownFox@gmail.com, and sent e-mail to O.n.e.B.r.o.w.n.F.o.x@gmail.com, and without fail it ended up in OneBrownFox@gmail.com.

In other words, GMail tries to guess e-mail addresses, and that kind of dangerous practice can increase the spam mail you receive. In the e-mail address format, the '.' is significant: OneBrownFox@gmail.com and One.BrownFox@gmail.com are two distinct e-mail addresses with distinct inboxes. But in the eyes of GMail, they are not.

So far GMail is the only e-mail service that seems to mishandle e-mail addresses in this manner.

Monday, October 4, 2010

Management never learns

Let's wind the clock back to the year 2000, not long after Y2K had been dismissed as the hoax of the millennium. I was working as a senior software developer in a software company called Mincom, which had a product called LinkOne in its product suite. At that time I was not associated with this business unit but was being enticed to join it to redevelop the product to exploit component technology.

Suddenly, the lead developer resigned, and next the product manager. The tide continued and eventually the whole team resigned en masse, including the clerical assistant. Left behind was a pile of undocumented C/C++ Win32 SDK code. Business units in Mincom were, and to a large extent still are, very disjointed groups with very little cross-pollination of knowledge. In the meantime, the product had a large user base worldwide, and those users obviously weren't happy being left with an unsupported product after heavy investment in entrusting their data to it. It is not like swapping Microsoft Word for OpenOffice.

The management at the time speedily mounted a rescue mission, which I call Rescue ver 1, to deal with this product. I was one of the two senior engineers brought in to take charge, rebuild the team, and take control of the product. At that time, Mincom had the luxury of available resources to mount an effective rescue mission, and the process went smoothly, even though it took a long time for the team to get on top of the product.

It was never exactly clear what spooked the team so badly that they resigned en masse, but one rumor had it that they did not like the management initiative and that the pasture was greener outside. Consultative processes and listening to staff/developers were not the forte of the company, and still are not. Hence this rumor has credence.

Now fast forward to the post-GFC era in 2008, by which time I had parted company with this business unit for over 4 years. The company used the GFC as an excuse to begin shedding staff in rounds of redundancies. LinkOne was not treated any differently, even though it was pulling in a respectable income for the company.

Once again, people in this team gradually left the company, whether through redundancy, disenchantment, or simply tossing it in before the ship sank. By 2010, the team, including the key architect who transformed the product from a Win32 product to a respectable .Net product, had suffered a bout of anorexia, reducing its number down to 1 person plus a manager! To this observer, it is a case of déjà vu.

The remaining person was a product support engineer, and obviously they desperately needed a 'team'. While keeping the news quiet from their customers to prevent the riotous reaction of the first episode, the company tried to mount a rescue, Rescue ver 2, albeit a vain attempt given that the company was now depleted of resources; they easily had more managers than developers. It could only mount a token mission using time-shared resources. The ineffectiveness of this management style has been well documented by DeMarco. Having worked on this product and knowing what resources remained in the organisation, the future looks bleak, not only for supporting the product but for enhancing it. The product is in a precarious position because they cannot afford to lose anyone else.

Perhaps it is their desire to wind up this product by natural attrition without telling the users.

Once again, management could have stemmed the loss of this unquantifiable resource had they been more consultative and treated their staff with respect. There are other well-known cases of people doing the right thing only to end up being sacked when the company is not exactly flush with resources. Truly a last act of a desperate death throe.

When we mounted the ver 1 rescue, we had the luxury of a pool of resources and knowledgeable long-term users to guide us and show us how the product was supposed to perform. Not anymore. All that knowledge has gone out the window. The only consolation they now have is that the team that left were keen methodologists, leaving them with product source code in much better shape than what I inherited in the ver 1 rescue. I wonder how long this will last before it degenerates into a mess as management presses the poor soul to rush out fixes because his time slice is overspent.

From this vantage point, and from my personal involvement, it is a case of management never learning and never knowing how to manage. I wonder, if the loss could be translated into an entry in their profit and loss statement, whether their board of directors or shareholders would still take such a quiet, sheepish position.

Saturday, October 2, 2010

Wondering which tablets are the underachievers

The quality of technical reporting on the Internet has sunk with each passing year, particularly when it comes to analyzing products like tablet computers. In addition to the previous one, another piece has just surfaced, by Dan Ackerman, making unsubstantiated statements such as:
"these Windows tablets have not been very good. Some are slate-style devices, others are convertible laptops with swiveling screens--but all have been underachievers, to put it mildly."
He fails to define what criteria he uses to call them underachievers. In fact, the current crop of so-called tablets are nothing more than the same kind of device used at check-out counters or information kiosks, and should be termed touch-sensitive devices.

They cannot take handwriting like a Windows Tablet, which stores the scribbles as searchable ink. They cannot convert handwriting to text as you write. They rely on a touch-sensitive keyboard, which was found even in the good old PDA, and which is an integral part of Tablet Windows. In fact, Tablet Windows has 2 touch-sensitive keyboards - a fixed one and a floating one. The current crop of non-Windows touch-sensitive devices cannot give the user the ability to annotate, scribble on, and mark up documents.

Windows applications run without modification on tablets; my Firefox can use the tablet input, and my Thunderbird, with GeckoTIP, works flawlessly. The platform offers developers the ability to develop ink-aware applications so that you can scribble on, say, a PDF or Word document just as you would on a printed document.

Sure, the current crop of touch-sensitive devices have gestures to open, zoom, flick or rotate objects. Windows 7 touch-sensitive netbooks and Tablets have that functionality in addition to using a stylus to write. That kind of gimmicky stuff is nothing new, but it is good bait for technical bloggers and reporters.

The only thing working against the current crop of Windows Tablets is the price. A properly functioning tablet supporting a stylus is a lot more expensive than a touch-sensitive-only device. Until the screen cost can be reduced further, this handicap will continue to exist. No amount of software can overcome that.

As a long-term Windows Tablet user, still using one to compose this blog, I have found the current crop of touch-sensitive devices - iPad, Dell Streak, even Windows touch-sensitive netbooks - lacking. The ability to write or scribble notes, or to annotate a pre-prepared document in a meeting or presentation, provides a dimension these touch-sensitive devices cannot match.

The convertible type - the one with a hardware keyboard, denigrated by Dan despite his obvious lack of usage over any period of time - is the most versatile of the lot. While one can, with some practice, comfortably write a reasonably long document using the stylus, often the keyboard gives you that much more speed and precision. Of course, the current crop of touch-sensitive devices like the iPad are not composing devices for "business-oriented" operations; they are more presentation devices, more like a large-size iPod Touch. As a result, there is no need for a keyboard with feedback. In a convertible or hybrid, the keyboard complements the stylus.

So who is lacking and underachieving? Show me how to annotate a PDF on an iPad, or how to run something like Excel, Photoshop or another productivity suite on iPad-like devices.

Friday, September 24, 2010

A review of Advanced Registry Optimizer 2010

I have been asked for an opinion of a tool called "Advanced Registry Optimizer 2010" (ARO) from Sammsoft. Since I have never used anything like it, despite using Windows since 3.0, I decided to give it a try.

The copy so 'generously' made available from its web site was only a nobbled version allowing one to correct up to 100 errors. This wasn't made clear in the download. Anyway, the version I tested was 6.0.743.796, and all tests were performed in two Virtual Machines whose guest OSs were Windows XP SP3 and Windows 7 Ultimate. Furthermore, all tests other than installations were carried out in an LUA (Limited User Account) and disconnected from the network, just in case the program performs call-home operations.

I also deliberately created a key in HKCR\CLSID with incomplete and incorrect information. While the GUID was correct, the ProgID contained incorrect or missing data. This was to test how good this program was.

Although the installation was relatively straightforward, there were two areas that were not very satisfactory:
  • The acceptance of the askToolbar installation should be made much more obvious.
  • After successful installation, it should not execute an automatic scan without the user's instruction. It ran, and I had to use tools like Process Explorer to terminate it immediately. Not nice.
As mentioned, all runs were carried out in LUA, and the program was found to be faulty when running under this kind of account. Since all programs in Windows 7 and Vista run with a standard user security token by default, ARO's end result there was identical to that of running in XP LUA mode.

This observation was substantiated by comparing the run-time logs captured using Process Monitor when ARO was executed in XP SP3 and Windows 7.

It did not crash when running in LUA. It simply lied to the users, telling them their system had fewer issues than it really had.

Take, for example, the specially planted registry key: it was not identified when ARO was running in LUA. The reason was that ARO tried to open the registry key HKCR\CLSID\<My-Test-GUID> with Read/Write access and failed due to insufficient rights when executed in LUA.

The program simply swallowed the error and moved on. This is a very bad programming mistake. The return code was very specific about the problem, and when the program was told access was denied, it should have immediately ceased processing and informed the user to run with administrative rights.

Instead, the program just treated that as a non-event and moved on. This resulted in a misdiagnosis.

You can easily test this without these tools. Run ARO in LUA, run it again with "Run as Administrator", and compare the findings. If the program is functioning correctly, it should report identical errors. In this version, they differ drastically.

In fact, go to the settings, de-select everything except "ActiveX and COM", and run it. The chances are that you will get zero errors when running in LUA and a non-zero count with administrator's rights. This is because when it opens HKCR it demands Read/Write access even in scanning mode. This indicates a possible programming error - perhaps it is an innocent victim of the ATL registry library problem.

As a result, the product is simply not functioning correctly.

Tuesday, September 7, 2010

Who creates the ASPNET account in XP?

I have been using a couple of XP VMs for development, and suddenly I had to develop some ASP.Net. The VMs did not have IIS installed and only had .Net Framework 2 SP2, 3.0, and 3.5, as well as VS2008.

After installing IIS, I had trouble launching ASP.Net applications. Upon investigation, ASPNET, the default account for IIS 5.1 in XP, was not there!

So who was responsible for creating it?

It turns out that one needs to install .Net Framework 1.1 to create that account, irrespective of whether IIS 5.1 is installed or not. What distracted me was the presence of the v1.1.4322 sub-directory in the Framework directory; as it turned out, it was placed there when I installed .Net Framework 2.0 SP1.

Its presence is not the same as having run the installation for .Net Framework 1.1.

Tuesday, August 24, 2010

Gpg4Win Fails in TChinese Windows

Further to my discovery of a problem in Gpg4Win when the "Language for non-Unicode programs" setting is not set to English, I decided to test it in a Traditional Chinese Windows with "Language for non-Unicode programs" set both to the same as the Unicode language (TChinese HK SAR) and to English US.

Sadly, Gpg4Win would not allow me to enter a passphrase when generating my key:

The captured screen shot does not show the mouse caret, but it was actually inside the Passphrase edit box, and no matter what I typed, nothing appeared.

The strange thing is that I could enter my name and e-mail address, albeit with very poor focus handling; only in the passphrase entry dialog did the program misbehave. This kind of misbehavior in one part but not in another is common in this program.

Not deterred by this, my next test was to import a key that was generated in an English Windows XP. The import process worked fine.

But once again the Windows Explorer plug-in failed when I used the context menu to encrypt a small text file, with the same misbehavior reported previously.

The next test was to use the File Manager (a rather clunky and clumsy user interface - they should simply make a Windows API call to invoke the familiar UI) from GPA (GNU Privacy Assistant) to see if I could encrypt and decrypt the text file the long way, since it could not be done via the Explorer plug-in.

Once again, like other features in Gpg4Win, parts work and other parts fail. The annoying thing is that the operations that fail aren't some exotic, rarely used ones. I could encrypt a text file, but when I tried to decrypt it, I was met with this familiar dialog box:
The content showed the correct armor text. To prove that the file was correctly encrypted, I took it to an English Windows and it decrypted fine. This clearly shows another bug in Gpg4Win.

Conclusion:
Gpg4Win 2.0.4 does not work in a non-English Windows, nor in an English Windows with a non-English "Language for non-Unicode programs" setting.

Sunday, August 15, 2010

Gpg4Win 2.0.4 Windows Explorer Context menu still fails to work

This is my pet project: to see how long it takes Gpg4Win to produce a Windows Explorer context menu that is capable of encrypting and decrypting files.

My test environment is XP Pro SP3 (English Windows) with HK SAR as the language setting for non-Unicode programs. Gpg4Win's Explorer context menu fails to encrypt and decrypt a file, producing the following familiar, dreaded message box:

To get this feature working, one has to set the "Language for non-Unicode programs" to English. This is an unnecessary demand that clearly indicates a lack of internationalization programming prowess. It presents a great inconvenience to non-English-speaking Windows users. Sad to see this bug still lingering on for so long.

It is another case of using the 'it-works-here' development methodology.

Tuesday, August 3, 2010

Some people just write rubbish on the Internet

Consider this totally inaccurate and ignorant statement by Jason Hiner on ZDNet:
No Windows 7 tablets have hit the market, or even been officially announced.
Jason has either lived in the wilderness for the last few years or is ignorant enough to make such a statement. No Windows 7 tablet?

Is he joking? I was using one, and it is listed on Fujitsu's product site. There are plenty of Windows 7 touch-screen-only devices too, if he cares to investigate. All of them can do those silly gestures to open and flick. The T4310 can do both - it is touch sensitive as well as supporting a stylus to write in ink - something the iPad can't do.

Just because Apple brought out a touch-sensitive device, just like those in use at check-out counters, so many so-called tech journalists are totally confused.

It is like saying a vehicle is not a car if it does not look like a Mini, while accusing Toyota and Land Rover of having yet to produce a car!

Monday, August 2, 2010

Important to set up a subversion repository correctly

To use Subversion effectively, one needs to sit down and plan the organization of the repository layout. The "Version Control with Subversion" book, page 16, recommends:
While Subversion's flexibility allows you to lay out your repository in any way that you choose, we recommend that you create a trunk directory to hold the “main line” of development, a branches directory to contain branch copies, and a tags directory to contain tag copies. For example:
$ svn list file:///var/svn/repos
/trunk
/branches
/tags
Failure to adhere to this recommendation can bring lots of grief later on. Often, when you start a project, you may not anticipate needing tags or branches, and hence you start importing files directly into the root of the repository, like this:

Then one day you decide to start using tags to record interesting events. Where do you put them now? Let's create a directory called Tags and put them there, like this:

Now you may think this fixes the problem. Not quite. If someone decides to check out the project (the repository root), this is what they will end up with:

Now you get not only the files for the trunk but also the files associated with every tag you have in the repository. In this demo, I have only one tag.

While it is not a dead loss, you can still fix this situation. First, get everybody to check in everything.

Then create three separate directories - Trunk, Tags and Branches, two of which are of immediate need - directly below the root. Staying with the recommended names and structure benefits everybody and avoids misunderstanding.

Select all the existing project directories, excluding the new Branches, Trunk, and Tags folders, and move them into the Trunk sub-directory.

Check the Trunk out to a different directory on your local drive; it is best not to use your old working copy anymore. Then tag it immediately so that you record this important event of correcting the repository directory structure. Of course, you store the tag in the Tags sub-directory.
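Done against the repository URLs, the repair might look something like the sketch below; the repository path, the project directory name (src) and the tag name are only examples:
$ svn mkdir file:///var/svn/repos/Trunk file:///var/svn/repos/Tags \
      file:///var/svn/repos/Branches -m "Create standard repository layout"
$ svn move file:///var/svn/repos/src file:///var/svn/repos/Trunk/src \
      -m "Move project into Trunk"
$ svn copy file:///var/svn/repos/Trunk file:///var/svn/repos/Tags/layout-fixed \
      -m "Tag the repository immediately after fixing the layout"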

These steps are totally unnecessary if the layout is created immediately after the creation of the repository.

Friday, July 30, 2010

Why no portable Office from Microsoft? OpenOffice is portable.

When .Net was introduced to the world, one of the features touted was the XCopy deployment mode. That is, one can simply copy a software suite from one location (a remote server), drop it onto a local drive, and it just works - none of these heavyweight installation programs requiring administrative rights. In .Net 2, Microsoft further enhanced this with ClickOnce deployment.

Sadly, there is still not one product from Microsoft that utilizes this XCopy deployment - a classic case of "do as I preach and not as I do". There are plenty of products from other parties that have achieved this 'XCopy deployment' mode; they fall into a class called portable applications. Notice the complete absence of any Microsoft contribution in this area.


Just about every competitor to Microsoft has a portable version: browser - Portable Firefox; mail client/Outlook - Portable Thunderbird; Office - Portable OpenOffice; messenger - several portable ones, such as Portable Pidgin; the list goes on. Even poor old Microsoft WordPad has been made portable and enhanced, but not by Microsoft.


Portable OpenOffice is so convenient - there is no need to install the 500lb gorilla called Microsoft Office just to write some documents. So why has Microsoft preached the features of XCopy/ClickOnce while refraining from using them itself?

Sure, the portable version does not support Object Linking and Embedding or OLE Automation. In most cases, people do not need them. Besides, there are portable applications that, when installed onto the hard drive, can provide that kind of feature. I am sure Microsoft can figure that out.

It is not a technical impediment. I think the main reason Microsoft has not dared to venture into this area is MONEY. How can you force someone to activate when it is a portable application? If Microsoft cannot force people to activate their 'portable' application, it can't force people to pay. The only way Microsoft's money tree continues to thrive is to force users to cement the applications deep into the machine's hard drive. To hell with users' convenience.

Back to Portable OpenOffice (version 3.2.0) and its word-processing module. I am extremely impressed with its capability; pound for pound it matches the expensive MS Office. Sure, there is none of the eye candy of MS Office 2007, but who cares. Sure, there is quirky stuff in MS Office that OpenOffice can't do, but is it among the frequently used features demanded by the majority of users? The best part is that I do not have to install it. If I am working on a Virtual Machine and need something with more capability than WordPad, I simply run it from a USB drive or drag the suite onto the virtual hard drive. No need to go through the pain of installation followed by the dreaded activation, which disturbs the machine's environment. Often activation will fail because it has already been activated previously!

Previously I was rather skeptical of the performance and reliability of OpenOffice, but after having spent days on it writing a lengthy document recording my experiments, I am mightily impressed. It is free and it does not bother me with activation. That's how software should be deployed.

If it can work with the Tablet PC's TIP (Tablet Input Panel), I will install it on my Tablet PC and ditch the 500lb gorilla.

Monday, July 26, 2010

One would expect an online bank to know QIF file format? Don't count on www.ingdirect.com.au

INGDirect has previously been caught corrupting downloaded transaction data; now it even fails to encode the downloaded transactions in correct QIF format.

Consider the following excerpt of the downloaded transactions in QIF format:

!Type:Bank
D30/06/2010
PDeposit - Interest Credit
T
T10.88
T42345.94
^
D27/06/2010
PDeposit - Deposit from linked bank account
T
T1000.00
T43345.94
^
The offending bits - the bare 'T' lines with no value after them - should not be there. I have to run a filter to get rid of those lines before the file can be imported properly.
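For what it is worth, a minimal sketch of such a filter; the file names are illustrative and it needs a using directive for System.Linq:
var cleaned = System.IO.File.ReadAllLines( "ingdirect.qif" )
                 .Where( line => line.Trim() != "T" );   // drop the bare 'T' lines
System.IO.File.WriteAllLines( "ingdirect-cleaned.qif", cleaned.ToArray() );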

It is sad to find an online bank that does not know such basic stuff.

Thursday, July 8, 2010

No need to use VPC2007 or Windows Virtual PC in Windows 7

If you are using a home edition of Vista or Windows 7 and are discouraged by Microsoft's money-grabbing exercise of trying to frighten you into buying a Pro or higher edition just so that you can run your VPC Virtual Machine in peace, don't worry.

There is a better & free way to run your VPC VM by following these steps:
1) Get a copy of the free VMPlayer 3.0 and install it on your machine. The installation does not contain any frightening language like that seen when installing VPC2007 SP1 on Vista or Windows 7.

2) Then download the Converter and install it. This converter can convert your VPC VM to VMPlayer format without loss of data. You don't even need to have VPC installed on your machine to convert; just the VMC and VHD files will do. After that you can ditch Microsoft's VPC for good.

This converter does more than convert a VPC VM to a VMPlayer one; it can also convert a physical machine to a virtual machine, something even Microsoft's tool can't do.

I have just converted one of my VPC VMs on my Windows 7 Home Edition to VMPlayer and am thoroughly enjoying it. I don't know why anyone would bother with Microsoft's VPC, which is hostile to Home Edition owners.

Friday, July 2, 2010

KeePass v1 or KeePass v2

The pros and cons of KeePass V2 and V1 have been discussed previously.

Finally I have decided to switch allegiance to V1. What sways me is V2's requirement for .Net Framework 2. While progressively more machines are running Windows Vista or Windows 7, there are still plenty of WinXP machines out there.

In fact, I was using one that did not have .Net Framework 2, and I was about to use it to configure my modem/router. In that situation, I could not use my KeePass V2 database, and I did not feel like installing .Net Framework 2 just to run this program. That kind of defeats the portability advantage of KeePass.

With XP, you cannot count on .Net Framework 2 being present. Without it, your KeePass V2 database is as good as corrupted.

To avoid being left out in the cold, I exported the KeePass V2 kdbx database to the KeePass V1 database format and now use V1.17 instead.

Performance-wise, it starts a lot faster. When the .Net Framework is so widespread that it is no longer an issue, or when the KeePass organization stops maintaining V1, I will upgrade to V2.

Wednesday, June 30, 2010

No joy being (mis)classified as a bogon!

About a week ago, I tried to access http://www.translink.com.au, the integrated public transport site for South East Queensland; my browser timed out, unable to reach the site, and since then I have not been able to access it. Over this period and preceding it, I have not changed any router settings.

A tracert brought me this typical result:
Tracing route to www.translink.com.au [202.58.101.51]

over a maximum of 30 hops:
1 2 ms 1 ms 1 ms 192-168-1-1.tpgi.com.au [192.168.1.1]
2 24 ms 24 ms 24 ms 10.20.21.36
3 24 ms 24 ms 24 ms 202-7-165-1.tpgi.com.au [202.7.165.1]
4 25 ms 24 ms 24 ms bri-sot-wic-crt2-po1.tpgi.com.au [202.7.171.41]
5 41 ms 42 ms 41 ms syd-sot-ken-crt1-TG-7-0-0.tpgi.com.au [202.7.171.125]
6 42 ms 41 ms 41 ms 202-7-162-246.tpgi.com.au [202.7.162.246]
7 59 ms 59 ms 59 ms 149.59.194.203.static.comindico.com.au [203.194.59.149]
8 59 ms 59 ms 59 ms 149.59.194.203.static.comindico.com.au [203.194.59.149]
9 * * * Request timed out.
10 * * * Request timed out.
11 * * * Request timed out.
12 * * * Request timed out.
13 * * * Request timed out.
14 * * * Request timed out.
15 * * * Request timed out.
16 * * * Request timed out.
17 * * * Request timed out.
18 * * * Request timed out.
19 * * * Request timed out.
20 * * * Request timed out.
21 * * * Request timed out.
22 * * * Request timed out.
23 * * * Request timed out.
24 * * * Request timed out.
25 * * * Request timed out.
26 * * * Request timed out.
27 * * * Request timed out.
28 * * * Request timed out.
29 * * * Request timed out.
30 * * * Request timed out.

Trace complete.

Initially I thought the site might have been down, but asking several friends to test it told me the site was up and running, although some friends using the same ISP as mine, TPG, met with the same fate. A search on the Internet turned up a thread in a broadband forum discussing this very issue for users on TPG.

My fate, apparently, is the result of me, by virtue of my IP address, being classified as a bogon. It is no joy, because essentially I cannot do anything, as I do not control the IP address allocation. Furthermore, when things like this happen, there is no one-stop shop where you can go to get help, and no single authority with the power to arbitrate. The ISP giving me the IP address is not really responsible for my fate, though I guess in some way they are responsible for allocating a once-bogon IP to me. The destination party may be slow to apply the bogon classification update announcements that would correct the filtering problem causing my misclassification. That party may not even be aware that they are doing bogon filtering using outdated data. Have you ever tried to discuss this kind of technical detail on a site's feedback page, if one even exists?

This site is of great importance to me, as I routinely perform legitimate e-commerce transactions on it to top up my transport stored-value smart card. Failure to access it means that I have to top up by queuing at top-up stations, denying me the convenience e-commerce brings.

Thankfully, at the moment I have two ways to reach this site, beating the alleged bogon classification. One is to use a proxy server like http://freeproxyserver.net. The other is to use the Tor Browser, which is my preferred way to access this site.

Just to prove that there is nothing wrong with my Internet connection and router settings, here is a screen shot showing a successful connection to Translink on the left, using the Tor Browser, and the failure via a direct connection using my ISP-allocated IP address on the right. The two browsers are running simultaneously.

When using IE8, the message returned is that it cannot display the page, like this:


Perhaps there is another explanation other than me being a bogon! Checking my IP address against a list of blacklist servers using whatismyipaddress.com tells me that I am not blacklisted. So what is the reason?

At the moment I am in limbo, with two lifelines at the mercy of Internet magic! If you are contemplating implementing bogon filtering, be prepared to do frequent updates as IP ranges go in and out of the bogon range. Failure to do so will cause unwitting, innocent victims, like me, great problems. Think hard before you use a bogon filter.

Saturday, June 26, 2010

Caveat in using NTBackup in Windows 7

My evaluation of several backup utilities to replace the terrible Windows 7 "Backup and Restore" facility was reported in my previous blog post.

The conclusion was that NTBackup, until a more formidable free alternative comes around, is still the best backup utility. What you need to run it in Windows 7 has been documented here, and that article also mentions that you can safely disregard a warning message box.

However, that disclosure fails to mention one very important option you need to enable: 'Disable volume shadow copy' in the 'Advanced backup options' dialog box, shown here:

Make sure you check this option, otherwise the backup will abort with this message:
Error returned while creating the volume shadow copy:Catastrophic failure

Aborting Backup.
A test of backing up a program while it was still in use seemed to be unaffected by this option.
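For what it is worth, if you drive NTBackup from the command line rather than the GUI, the switch that appears to correspond to this option is /SNAP:off. A minimal sketch, where the selection file, job name and target path are purely illustrative:

ntbackup backup "@C:\Backups\daily.bks" /J "Daily backup" /F "D:\Backups\daily.bkf" /SNAP:off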

Tuesday, June 22, 2010

In search of a Backup Utility for Windows 7

People reading this title may be wondering whether I have missed the much-touted "Backup and Restore" feature in Windows 7, and why I need to search for a (better) backup utility.

Let me describe the deficiencies in Windows 7's "Backup and Restore" utility before I describe my search results.

Microsoft has succeeded in taking a highly capable program called NTBackup and destroying it for the sake of some eye-candy screens. The Win7 backup is so slow that any ZIP program will beat it hands down. Not only that, the eye-candy interface lacks any really useful progress information. It does not even bother to tell you how many files it has picked up, which file it is processing, the total size of the files being backed up (until the whole thing is finished), or the expected time remaining, even as a best guess. Apart from some pretty-looking screens, the user interface is totally unintuitive and lacking in functionality.

If you have not seen an industrial-strength backup utility, I suggest you fire up a copy of NTBackup in XP Pro, or install it in XP Home Edition.

One of the basic needs of a backup utility is the ability to add new backup data to what is already there, keeping revisions so that you can go back several generations when restoring.

Moreover, a backup utility is only as good as its restoration capability. It has to be able to restore data accurately and precisely, including restoring the original ACLs of the NTFS files and folders. Failure to do so can cause major problems and security risks. Imagine backing up the profile areas of a machine, then having to do a disaster recovery, only to discover that the resulting ACLs are all wrong!

A good backup utility should also ensure that the person running it is a backup operator, or one holding SeBackupPrivilege (to perform backups) and SeRestorePrivilege (to perform restoration).
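You can check whether the account you are using actually holds these privileges from an elevated command prompt, for example:

whoami /priv | findstr /i "SeBackupPrivilege SeRestorePrivilege"

Members of the Administrators and Backup Operators groups normally hold both.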

Another feature commonly found in industrial-strength backup utilities is the ability to perform incremental backups so as to reduce the backup volume. This is important when you are doing daily backups.

Armed with these demands, I evaluated the following free backup programs: NTBackup, FBackup, and Comodo Backup.

Sadly, only NTBackup performed flawlessly, replicating the ACLs perfectly. Here is a screen shot of the source (left-hand window) and the restored folder's ACLs (right-hand window):
I restored the material to a different directory to check the restoration process. As a consequence, NTBackup is used as the benchmark against which the other utilities are compared.
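If you would rather not eyeball security dialogs, a quick way to make the same comparison on Windows 7 is to dump the ACLs of the original and restored trees with icacls and compare the two dumps; the paths below are illustrative only:

icacls "C:\Users\SomeUser" /save original-acls.txt /T /C
icacls "D:\Restored\SomeUser" /save restored-acls.txt /T /C
fc original-acls.txt restored-acls.txt

Any differences reported by fc indicate that the restore did not bring back the original security settings.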

FBackup4 Version 4.4 Build 207
This is a very simple-to-use free 'backup' utility. The good part is its ability to manage several sets of backup instructions as jobs within the program. It automatically increments the backup volume file name, the backup volume is in ZIP format, and it is relatively fast.

However, it is a naive implementation of a backup utility. At best, it can only be classified as a ZIP program with a purpose-built user interface.

Here is the screen shot of restoring the user's profile to an alternate location, as a simulation of profile restoration. Once again, the left-hand window shows the security settings of the original top-level user profile folder and the right-hand window shows those of the restored material:
This clearly shows that the restored material has fewer privileges than the original, which disqualifies FBackup as a competent backup utility.

Comodo Backup Version 2.2.127000.12
The user interface is prettier than FBackup's, but it is also more confusing and lacking in functionality. From the user interface alone, a user cannot work out how to get the program to remember backup instructions for reuse. There are, however, several ways this program can do that, albeit none as intuitive as in FBackup.

When composing the backup instructions with the wizard, you can use the schedule definition to remember your instructions even if you do not want a scheduled backup; you simply change the type to manual backup. Strange logic.

The other way is to export your backup instructions to a file containing the command-line arguments corresponding to your instructions. With this file you can then use the /script command-line directive to supply the script file when launching the backup utility, as sketched below.
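Assuming the exported instructions are in C:\Backups\myjob.txt, the launch would look something like the following; note that the executable name and install path here are my assumption, so check your own installation:

REM executable name and path below are illustrative - check your own install
"C:\Program Files\COMODO\COMODO BackUp\CBU.exe" /script "C:\Backups\myjob.txt"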

This utility has a slick facility that lets you define the format of the backup volume file name, including the date, time, etc. They call these macros, and they are not available in FBackup. It also has a facility to define the level of compression used during backup.

The performance of this program is very good. However, it suffers the same problem as FBackup in that it fails to reproduce the ACLs precisely, as shown below:

Conclusion
These backup utilities are not really backup tools but specialized ZIP programs. It is not just that they use the ZIP format; they also leave out the ACLs they would need in order to restore the data properly.

While NTBackup is available officially in XP Pro and optionally in XP Home Edition, Microsoft has provided a cut-down version of this tool for Vista/Windows 7, called "Windows NT Backup - Restore Utility", so that users can restore from NTBackup volumes.

However, NTBackup has been known to run fine in Vista and in Windows 7 (and here). Since the 'installation' of NTBackup is so low-impact, I will definitely give it a try. The big remaining question is: at what point will future Windows development render our trusty friend inoperable?
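For reference, the commonly described low-impact 'installation' amounts to copying three files from an XP machine into a folder of their own on the Windows 7 box, with no installer or registration involved. A sketch, where \\XP-MACHINE is a placeholder for your own XP box:

mkdir C:\NTBackup
copy \\XP-MACHINE\C$\Windows\system32\ntbackup.exe C:\NTBackup
copy \\XP-MACHINE\C$\Windows\system32\ntmsapi.dll C:\NTBackup
copy \\XP-MACHINE\C$\Windows\system32\vssapi.dll C:\NTBackup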

Sunday, June 20, 2010

Virtual Machine software on Windows 7

Ever since Vista introduced the multi-tier licensing scheme - Home Edition, Professional/Business, Ultimate - Microsoft has been flagging certain programs as "not supported" on the cheaper editions.

This money-grabbing exercise continues with Windows 7. Not content with that, Microsoft turns the confusion up another notch by introducing Windows Virtual PC as opposed to Virtual PC 2007 SP1.

Windows Virtual PC works fine with VMs created in VPC2007. One thing to remember, though: you cannot run Windows Virtual PC together with VPC2007 SP1 on the same Windows installation.

When you install VPC2007 SP1 on Windows 7 Home Edition (just make sure Windows Virtual PC is uninstalled first - go to 'Programs and Features' and then 'Turn Windows features on or off'), Windows will warn you that the program is not supported on this edition of Windows. Just ignore this; it works fine. It is Microsoft's way of nudging you towards a more expensive edition. Read on for other virtual machines that do not practice this kind of scare tactic.
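Incidentally, you can jump straight to that 'Turn Windows features on or off' dialog by running the following from the Start menu search box or a command prompt:

optionalfeatures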

While running Windows Virtual PC, drag-and-drop from the host OS to the guest OS (or vice versa) does not seem to work, whereas VPC2007 SP1 handles this operation without trouble. If you like the drag-and-drop feature, just use VPC2007 SP1, which incidentally supports a wider range of guest OSes.

When starting the VPC2007 SP1 console, it will display this lame warning message box:
Just ignore it.

If you are put off by this, check out this comprehensive review of other virtual machine products that do not care which edition of Windows 7 or Vista you are running.

It is actually better to run a non-Microsoft virtual machine, because that way you are guaranteed an uninterrupted migration path to a non-Microsoft host OS. It is also interesting to note that VirtualBox can run the 'XP Mode' VM.

After exploring these confusing situations, I have settled on using VMware Player for serious work, to benefit from its much wider guest OS support, leaving VPC2007 SP1 to deal with the experimental VPC VMs I have accumulated.

Saturday, May 15, 2010

How does NBN pricing compare with other countries?

I am presently holidaying in HK and have just visited a friend who has recently had a fibre-to-the-home service installed. It is interesting to compare what people in HK pay with what iiNet and Exetel will charge for an NBN service.

HK is charging HK$99 (about AUD14) per month. Australia is one of the few countries that still imposes a download limit; the HK service has none.

With such a low cost, I am surprised to find that in a building with 28 apartments, only 2 have taken up this kind of service. The installation fee is only HK$300 (about AUD43), with a contract period of 2 years.

Friday, May 14, 2010

Firefox kills Windows Journal Writer in Fujitsu T4310 Windows 7 Tablet

I am rather shocked and annoyed to discover that Firefox 3.6.3 causes Windows Journal Note Writer to crash on a Fujitsu T4310 Tablet PC running Windows 7. Initially I suspected the Adobe Flash plug-in in Firefox, but after many hours of painful recovery cycles back to factory settings I managed to nail it down to Firefox itself.

The machine is brand new, and hence I can afford to reset it back to factory settings when things go wrong.

For those new to Tablet technology, Journal is a key component that allows the user to scribble electronic ink onto a document. You can start with a blank page, or print an existing document to the Journal so that you can scribble notes on it or highlight parts. Without it, operating a Tablet is pretty ordinary.

The machine was installed from the distributed image and then connected to the Internet by WiFi to retrieve all the Windows updates current to May 13, 2010. At each stage, Windows Journal was fired up to ensure that it could render the blank page and that I could print from, say, Adobe Reader or IE8 to a Journal file. Windows Journal still worked even after IE8's Adobe Flash plug-in was installed. During my experiment, neither Microsoft Security Essentials nor any antivirus software was installed.

The installation of Firefox 3.6.3 proceeds without any trouble. But the moment Firefox is installed, I can no longer launch Windows Journal or print to the Journal Note Writer from any program that previously worked. Printing from Firefox causes the Journal to crash.

All the crashes raise Event ID 1000 and point mostly to "C:\Windows\System32\msvcrt.dll", or occasionally to "C:\Program Files\Windows Journal\NBDoc.dll". In every case, the exception code is 0xC0000005, which means Access Violation, and the fault offset for MSVCRT is the same. Here is a typical event message:
Log Name:      Application
Source:        Application Error
Date:          5/13/2010 11:55:08 AM
Event ID:      1000
Task Category: (100)
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      ABCD-PC7
Description:
Faulting application name: journal.exe, version: 6.1.7600.16385, time stamp: 0x4a5bc80a
Faulting module name: msvcrt.dll, version: 7.0.7600.16385, time stamp: 0x4a5bda6f
Exception code: 0xc0000005
Fault offset: 0x00009c7f
Faulting process id: 0x14d4
Faulting application start time: 0x01caf25012163ca9
Faulting application path: C:\Program Files\windows journal\journal.exe
Faulting module path: C:\windows\system32\msvcrt.dll
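If you want to pull these crash records out of the event log yourself, the following, run from an elevated command prompt, should list the most recent Event ID 1000 entries in the Application log (adjust the count to taste):

wevtutil qe Application /q:"*[System[(EventID=1000)]]" /c:5 /rd:true /f:text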

Finally, after uninstalling Firefox, I launched a command prompt with "Run as administrator" and applied the following steps to recover Windows Journal:
In the "C:\Program Files\Windows Journal" directory:
1) Run
Regsvr32 /u NBDoc.dll
Regsvr32 NBDoc.dll

2) Run
Journal /repairnotewriter

If you happen to run the last command in a command prompt running as the logged-in user, a message box will inform you that it cannot find the specified file.

3) Then reboot.

When the machine has restarted, try to bring up Journal. It may still crash, or it may bring up the Windows Journal Recovery dialog box, because the last document you tried to print from Firefox did not complete. Do not choose to recover; this should bring up the blank page. You can then close the Journal Writer to complete the recovery process.

You may also choose to launch the Journal Writer with an existing JNT file if launching it by itself fails.

After Windows Journal is restored (with Firefox uninstalled, of course), you can print from IE or any other program.

I have been a long-time Tablet user and am still using my P1510 running XP with Firefox, and I have never seen this bizarre interference there.

If anyone has any idea what exactly is happening in the background to make these two hostile to each other, or has seen similar bizarre interference, I would like to hear from you. If you have a way to fix it, that is even more welcome.

Tuesday, May 11, 2010

A review of two Windows 7 Tablet PCs with touch support

I was looking to buy a Windows 7 tablet PC that was more than a touch-sensitive notebook. I am not a novice to Tablets: I am still using a T1510 bought when it was first released several years ago. The primary requirement was that any candidate must take handwriting like a proper tablet PC, meaning you can input without the keyboard by means of the TIP (Tablet Input Panel); touch support was a secondary requirement.

I narrowed it down to two: the Acer Multi-Touch and the Fujitsu T4310L, both running Windows 7 Home Edition. The processor specification was of secondary concern to me; how well it functioned as a tablet was more important.

Because of the price difference and the weight, the Acer was the first to attract my attention. In all respects this machine had every tablet tool and support - TIP, Journal Writer, Snipping Tool and Sticky Notes - except for one thing.

This machine does not come with a pen and relies solely on touch operation, which is currently in vogue. This decision by Acer defies natural human instinct: no human I know of writes with a finger, otherwise the quill would never have been invented. But Acer, in its quest to be in vogue, failed to deal with this properly, most likely counting on buyers' ignorance of, and misconceptions about, the difference between a touch computer and a tablet. I also place some of the blame on the level of competence of the sales people.

A quick test of writing my name on the TIP was a real struggle, and I challenge anyone to write more complicated text, such as Chinese characters, on it. Even the sales person had a real struggle. Sure, you could flick through image files with ease with a finger, but you could not flick out a sentence or even a few basic words.

Seeing several people struggle with it, I concluded that the Acer was not only an unsuitable Tablet PC but bordering on not being a Tablet PC at all. It is basically a touch-sensitive notebook with all the Tablet tools that no one can really use. Its bias towards touch robs it of market share. It was more than a touch (pardon the pun) of disappointment.

I would definitely not recommend this to anyone as a Tablet PC. If your sole purpose is the flicking and touching actions, then you would do a lot better with a purely touch-sensitive notebook rather than paying for an imitation Tablet PC.

With this shocking disappointment, I moved over to the Fujitsu T4310L with Windows 7 Home Edition. This machine comes with a special pen and also reacts to touch, such as flicking. Writing on the TIP with that pen instantly showed the difference between this and the Acer. It was like comparing a Harley-Davidson with a bicycle.

It was pure joy to write with the pen on this machine, even for a novice bystander. Coupled with the smarter recognizer, which was also available on the Acer but impossible to exploit properly there, the accuracy was orders of magnitude better than my T1510. The special pen has programmable buttons as well as an eraser at the top.

The only disappointment was that it uses a special pen that is expensive to replace. Without it, one has to resort to using a finger, which brings it back down to the level of incompetence of the Acer.

With respect to some of the messages from Microsoft about the combinations of handwriting recognizers one could have in a Home Edition, I was pleasantly surprised to see the presence of both Traditional and Simplified Chinese recognizers in an English Windows 7 Home Edition.

I was so disappointed with the Acer that I did not bother to check whether it had this support. The other reason was that I could not use a finger to hit that tiny down arrow to drop down the language options in the TIP. No such frustration on the T4310.

Hence I cannot be certain whether the reported policy has been quietly repealed for the sake of good common sense, so that an English Home Edition can now have any other language recognizers one needs, just like previous editions of Windows. But I can be certain that they are available on my T4310 running Windows 7 Home Edition.

The T4310 was the clear winner, even though it was dearer. What is the use of buying a cheaper Tablet PC imitation?
