August 2003 Blog Posts

Having spent more time in the last couple of months than I might otherwise have chosen on a project using VB.NET, it seems to me that the same problems keep cropping up again and again, problems you'd really expect a modern compiler to help you deal with.

I've lost track of the amount of time we've spent tracking down problems that turned out to be a method where someone forgot to actually return the result of a computation, or worse, where one code path forgot to return a result while all the others did. The C# compiler would tell me straight away that I'd done something wrong, and I don't understand how the compiler for a language historically targeted at an audience that might appreciate a bit more hand-holding can happily continue churning out the code.

I read Keith Ballinger's excellent Web Services: Architecture and Implementation with .NET a few months ago, and it set out a fascinating way of thinking about XML Serialization. Keith suggests that the feature was really misnamed and that the term XML Mapping would be more appropriate, because it is not designed to serialize any CLR object to XML, but rather to map any XML onto some set of CLR objects. Once you take this view and start to get a handle on the various XML mapping code attributes, XML serialization suddenly becomes much more powerful.
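To make that view concrete, here is a minimal sketch of the mapping idea (the Book type and its XML are an invented example of mine, not taken from Keith's book): the attributes describe how some existing XML maps onto CLR fields, rather than how the class serializes out.

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical type: the attributes map an existing XML shape
// (a <book> element with an isbn attribute and a <title> child)
// onto plain CLR fields.
[XmlRoot("book")]
public class Book
{
    [XmlAttribute("isbn")]
    public string Isbn;

    [XmlElement("title")]
    public string Title;
}

public static class Demo
{
    // Deserialize a fragment of XML straight into a Book instance.
    public static Book Load(string xml)
    {
        XmlSerializer serializer = new XmlSerializer(typeof(Book));
        using (StringReader reader = new StringReader(xml))
        {
            return (Book)serializer.Deserialize(reader);
        }
    }
}
```

Calling Demo.Load on a matching fragment gives you strongly typed access to the values with no DOM navigation code at all.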

For this reason, many of the cases where I would have used the DOM (XmlDocument) or XPathDocument in the past can now be handled with far less code using XML Serialization. This also makes the code much more accessible for people with limited XML knowledge to use and extend in an intuitive manner. I've recently written a base class for dealing with custom configuration sections using this approach and thought I'd share it here.

There is an excellent article by Rockford Lhotka that explains what XML configuration files are and what they're used for so I won't repeat any of that here. In that article, Rockford demonstrates how to write the code for a custom configuration section by writing a class that implements the IConfigurationSectionHandler interface. If you want several config sections in your application, however, you end up writing a lot of code that is almost identical.

By using a base class for the configuration section handler, we can write a minimal amount of code describing simply what settings we want in our handler. Here is an example configuration settings class:

using System;
using System.Xml.Serialization;

// The CustomConfigHandler attribute gives the base class the
// XML node name ("myConfig") and the section group ("sampleConfig").
[CustomConfigHandler("myConfig", "sampleConfig")]
public class MyConfig : BaseConfigHandler {
    [XmlAttribute("myString")]
    public string MyString;

    [XmlAttribute("myInt")]
    public int MyInt;

    public static MyConfig Current {
        get { return (MyConfig)GetCurrent(typeof(MyConfig)); }
    }
}

This class will parse a configuration section that looks like this:

    <myConfig myString="abc" myInt="123" />

The key points to take from this are that we use the CustomConfigHandler attribute on the class to indicate the name of the XML node (myConfig) and the group name (sampleConfig), and then we use standard XML Serialization attributes for the fields. This gives us strongly typed access to our settings. To access a setting in code, we simply refer to the Current property: MyConfig.Current.MyString.

Finally, we need to indicate in the .config file where the code resides to process the custom section. We do this by including a pointer to the handler code at the top of the .config file:

    <configSections>
        <sectionGroup name="sampleConfig">
            <section name="myConfig" type="MyConfig,ConfigTest" />
        </sectionGroup>
    </configSections>

This indicates that the class is called MyConfig in the assembly ConfigTest.

The code for the BaseConfigHandler is here and you can download a sample project to try out the code.

Clay Shirky: Process is an embedded reaction to prior stupidity. When I was CTO of a web design firm, I noticed in staff meetings that we only ever talked about process when we were avoiding talking about people. [via Sam Ruby]

Clay Shirky makes such a perceptive observation about how process gets created that I'm surprised that I hadn't thought about it in quite those terms before. While talking about the lack of process in Wikis, he explains why we end up with "layers of gunk that keep [the] best employees from doing interesting work, because they too have to sign The Form Designed to Keep You From Doing The Stupid Thing That One Guy Did Three Years Ago".

Apologies to those people seeing a string of old posts from me in their RSS readers. I've updated my RSS generating code to (hopefully) correctly handle ETag and Last-Modified headers and to return a 304 if no changes have been made. I've also taken the opportunity to change my permalinks so that they point to one page per post. All old permalinks should still be supported. This will allow me to add-in per post tracking, etc. If anyone spots a problem, please e-mail me and let me know. Thanks.
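For anyone curious, the conditional-GET decision boils down to something quite small. This is a sketch of the shape of the logic, not my actual feed code; the class and method names are invented:

```csharp
using System;

// Sketch of conditional-GET handling for a feed: return true when the
// client's cached copy is still valid and a 304 Not Modified (with no
// body) can be sent instead of the full feed.
public static class ConditionalGet
{
    public static bool IsNotModified(string ifNoneMatch, DateTime? ifModifiedSince,
                                     string currentETag, DateTime lastModified)
    {
        // An ETag comparison takes precedence over the date check.
        if (ifNoneMatch != null)
            return ifNoneMatch == currentETag;

        // Fall back to Last-Modified: the cached copy is still fresh
        // if the feed hasn't changed since the client last fetched it.
        if (ifModifiedSince.HasValue)
            return ifModifiedSince.Value >= lastModified;

        // No validators sent: return the full response.
        return false;
    }
}
```

When this returns true, the handler sets the status code to 304 and sends no body; otherwise it writes the feed as normal, along with fresh ETag and Last-Modified headers.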

Brad describes using the Application Error event to catch application-wide exceptions and handle them gracefully. This is pretty much what I've been doing, although I am just using the code behind global.asax rather than a separate module for the time being. My event handler e-mails details of the exception to me (which allowed me to spot the slight coding error yesterday that meant a blog post between 201 and 209 characters in length broke my RSS feed - oops!).

CraigBlog: This is where it pays to be friends with Fritz Onion. :) I pinged him, and he suggested a clever little hack (in the good sense of that word) to put all my error handling code in one place.

Scott Watermasysk: In part 1 of the interview, Anders mentions an internal wiki that is used for the design process. I would love to read that some day.

Reading this sparked off some more thoughts this week about the use of a Wiki for documenting a design/development project. I played around with OpenWiki a few weeks ago, which is implemented using ASP and is a piece of cake to get up and running.

The pages upon pages of notes I have written down for the projects I'm currently working on can become difficult to manage, and in amongst them is a list of tasks that are waiting not only for me to complete them but also for information from other people. These tend to be little things that don't need to show up on the overall project plan. Using a Wiki might be a way of allowing other people to see the notes I'm taking and also see which tasks are outstanding. If everyone did that, information would be shared much more effectively. However, the Wiki Backlash debate made me wonder if all that would be manageable.

I've also experimented with Microsoft OneNote this week. I had originally written it off as something that really needed a Tablet PC (and much as I'd like one, the price/performance factor doesn't seem to make them worth it yet), but a few comments I've read recently suggested that it is still a fantastic tool for keeping your notes even if you just type into it. My evaluation suggested just that, and if it weren't for the fact that the beta kept crashing on me, I might dispense with a paper notebook, something that would previously have been anathema to me.

The outcome of this is that I think in the very near future, I'll be managing much more of the day-to-day information that I write down in an electronic (and hence searchable and sharable) medium. How, exactly, I'm not yet sure: possibly a mixture of Wiki and OneNote from the looks of things...

Paul Vick provides a bit of history about where the short-circuiting logical operators got their names. One of the projects I'm currently working on is written in VB.NET, and I have to make a conscious effort to regularly search through my code and replace And and Or with AndAlso and OrElse, since it is rare (never?) that I don't want the short circuit to be taken if possible.

After reading Simon's recommendation for POPFile a week or so ago, I decided that I would give it a go after all. I'd looked at it previously but, for whatever reason, decided not to bother installing it. Instead of using it to proxy my calls to my POP3 mailboxes, I have installed Outclass, an add-in for Outlook that uses the POPFile engine from within Outlook itself.

So far, I'm impressed. It is filtering out spam with amazing accuracy - it has probably only mis-classified 4 or 5 messages in the last few days, and for a couple of those, it's debatable whether they really count as spam anyway. I've disabled all the Outlook rules that I had in place to try to capture spam - this does a much better job.

The last week or so has been a depressing one for advocates of the Windows platform, with lots of publicity surrounding the W32.Blaster.Worm that spread like wildfire amongst Windows PCs that hadn't been kept up to date with security patches.

Whilst it was to be expected that many home users, particularly those on slow modems, might not have downloaded the 30-odd patches awaiting them on Windows Update, it was depressing to see and hear of publicly quoted companies losing mission-critical servers to this worm. This is frankly unacceptable. It might be a "problem" to install patches that require system reboots, but that is nothing compared to the havoc created when systems are taken down. There really can be no excuse for systems departments in reasonably sized companies not monitoring the security patch situation and planning deployment of critical updates when they are released. After all, in this case people had plenty of time to get updated, and what do these people do all day anyway?

With all the furore surrounding the Windows worm, not much mention was made in the mainstream media of the GNU FTP server compromise, and so little was done to quieten the crowing from the "open source" (here meaning non-Windows) community.

Following on the heels of the original MSBlaster worm, a new version was released that patches the machines it infects with the Microsoft patch, leading to a situation of "good" worms fighting "bad".

Some security experts were puzzled as to why users couldn't seem to deworm their own machines. MSBlaster is not especially difficult to remove.

But some users said that it was difficult to find any understandable information about removing MSBlaster.

"These virus and worm removal advice I see are obviously written by nerds for nerds," said Paul Pacifico, a beauty supply salesman in Brooklyn. "Most of the time I can't ever figure out what the hell they're on about."

Pacifico also said his computer was running perfectly today, and a scan shows that it, too, was infected with the new worm.

Another critical update was highlighted in the past couple of days with an updated security bulletin. I have to say that I found it a little difficult to work through all the technical blurb to find out which version applied to my PC in order to confirm that I already had the patch installed. It is unsurprising, then, that some end users struggle with the "by nerds for nerds" descriptions.

Microsoft clearly realise this and, aside from "working on better ways to release patches", have created a more Joe Public-friendly site explaining how users can better protect themselves.

"But I actually had one of our secretaries tell me today, after I warned the staff about this antiworm, that she'd rather let the new worm fix her home machine than to 'have to fuss with all this security stuff.'"

"I didn't know whether to laugh or cry," confessed Godell.

Hopefully the lesson taken away from this will be that people pay more attention to keeping up to date with patches, especially on mission-critical systems. I might say, though, that I'll believe it when I see it.

This is another gotcha that I ran into this week. The documentation pages generated by the ASMX engine don't seem to handle derived/base class types correctly. To illustrate this problem, create a demo .asmx file with the following contents:

<%@ WebService Language="c#" Class="MyService" %>

using System.Web.Services;

public class MyBase {
    public int baseVal = 5;
}

public class MyDerived : MyBase {
    public int derivedVal = 10;
}

public class MyService {
    [WebMethod]
    public MyDerived MyTest() {
        return new MyDerived();
    }
}
The generated WSDL correctly contains the type definitions for MyBase and MyDerived:

<s:complexType name="MyDerived">
  <s:complexContent mixed="false">
    <s:extension base="s0:MyBase">
      <s:sequence>
        <s:element minOccurs="1" maxOccurs="1" name="derivedVal" type="s:int" />
      </s:sequence>
    </s:extension>
  </s:complexContent>
</s:complexType>
<s:complexType name="MyBase">
  <s:sequence>
    <s:element minOccurs="1" maxOccurs="1" name="baseVal" type="s:int" />
  </s:sequence>
</s:complexType>

And the implementation of the method returns the correct result:

<?xml version="1.0" encoding="utf-8" ?>
<MyDerived xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <baseVal>5</baseVal>
  <derivedVal>10</derivedVal>
</MyDerived>

But the examples in the generated documentation page only show the members from the derived class:

HTTP/1.1 200 OK
Content-Type: text/xml; charset=utf-8
Content-Length: length

<?xml version="1.0" encoding="utf-8"?>
<MyTestResponse xmlns="">
  <MyTestResult>
    <derivedVal>int</derivedVal>
  </MyTestResult>
</MyTestResponse>

Note that only <derivedVal> is shown in the documentation and <baseVal> is missing. I spent a little while looking at this before I realised that it was just the documentation page and that the schema was correct.

Rob MacFadyen writes:

Ok.. so to automate signcode so that it does not prompt for a password, the steps are as follows:

  1. Get the keyimprt.exe tool.
    Note: This is a self-extracting exe that expands into another self-extracting exe (named the same), which then extracts to an installer. The installer then installs 2 files (again, the .exe name is the same).
  2. Import your .SPC and .PVK using the tool and instructions from #1. This will ask for your password. Pick the store explicitly, and pick the "Personal" store.
  3. Use the "certmgr.exe" tool to view your new key and determine what its common name (cn) is. You can start certmgr.exe from IE via Tools->Internet Options, then on the "Content" tab click the "Certificates..." button.
  4. Use signcode.exe as follows (line breaks added for clarity):
      signcode.exe
      -s my
      -cn "Your Cert CN"
      YourFileToSign

    Note: If you specify the "YourFileToSign" incorrectly, you get a cryptic message: "One or more input parameters are invalid." instead of a more useful "file not found" message (there's 2 hours I'll never get back).

That's it... you may also want to include the "-info" switch to add a url that is displayed to the user as "more info".

One of the projects that I'm working on involves putting together a web service interface to a booking system so that business partners can integrate our inventory into their business systems. This involves exposing a fairly flexible schema so that different partners can supply different information. This means that for some requests some information is optional while for requests from other partners that information might be required.

Using ASP.NET web services, we implement this by having a class to represent the whole request, which contains object references to other classes representing the optional information. In the schema, each of these gets minOccurs="0" and maxOccurs="1", and this works well.

One of the things that I learnt this week, however, and that hadn't been immediately apparent, is that there is no way to insist that a sub-object must be present: an [XmlRequired] attribute, if you will. This means that all reference types are optional.
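To illustrate with a hypothetical pair of types (the names are invented for this post, not our real schema):

```csharp
using System;
using System.Xml.Serialization;

// A hypothetical optional sub-object carrying partner-specific data.
public class PartnerDetails
{
    [XmlAttribute("accountCode")]
    public string AccountCode;
}

public class BookingRequest
{
    [XmlAttribute("bookingId")]
    public string BookingId;

    // As a reference type, this maps to minOccurs="0" maxOccurs="1"
    // in the generated schema; there is no attribute we can apply to
    // force minOccurs="1". Details is simply null when the partner
    // omits the element, so required-ness has to be checked in code.
    [XmlElement("details")]
    public PartnerDetails Details;
}
```

The upshot is that "required for this partner" becomes a runtime validation rule in the service, not something the schema can express for us.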

According to Douglas Purdy, this behaviour is by design:

If you already know what you want on the wire, why don't you define the WSDL first? You can use the wsdl.exe tool with the /server option to generate the code.
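A wsdl.exe invocation along those lines might look like this (the file names are hypothetical):

```shell
# Generate abstract server-side stub classes from a hand-written WSDL,
# so the contract drives the code rather than the other way round.
wsdl.exe /server /language:CS /out:BookingServiceStub.cs BookingService.wsdl
```

You then derive your service class from the generated abstract class and fill in the method bodies.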

He says this Usenet post will be taken as a feature request - I presume nothing has been done yet and that this therefore may or may not exist in v2.0. It sure would be useful. Keeping our WSDL in sync with our code manually would be a pain, but we'd like the WSDL schema to represent most of the contract to our customers.

Scott Watermasysk: Anyone who needs to switch between Source Code Providers can not be without Harry's latest utility: SccSwitch. VSS to GotDotNet to Vault ...and so on and so on. Very handy.

Oh thank goodness. I was about to write just this project this weekend. Excellent!

Alan Flaherty: Try iTextSharp, I've been using it for creating rtf documents and it's a really nice piece of work (and it's free :)). It's a C# implementation of the iText Java PDF library.

Microsoft Data Access Components (MDAC) 2.8 contains core Data Access components such as the Microsoft SQL Server OLE DB provider and ODBC driver. [ Microsoft Download Center]

Jeff Smith: There is a webalizer port for Win32. Works great. Make sure you turn off the built in reverse DNS lookup or it will take days to analyze any decent size logfile. Also make sure you go into IIS and turn on logging for Referrer, User Agent, and Bytes Sent because those are usually turned off and that's important info!