December 2002 Blog Posts

I had a problem with my Cassiopeia E-105 (Windows CE 2.11): I let the batteries run down and it lost all my settings. I did have a backup made with ActiveSync, but when I tried to restore it, it said "Restore cannot be completed successfully" and complained that I was trying to restore a backup for a different device (which I wasn't). Apparently this is a common problem; to fix it, before doing the restore you have to set the regional settings location on the palmtop to the same one it had when the backup was made.

Having done the restore and resynced all the data back in from Outlook, I next discovered that the clock setting was out by one hour - probably something to do with daylight saving, but I couldn't see what. After another half an hour of digging, it turns out this one is also common. This time you have to set the world clock home location to Seattle, resync the clock, and then set the location back to your actual location. And guess what - I bet this all works fine all the time for those Redmondians.

Hopefully, now all I need to do is get a new backup battery tomorrow and all will be well again.

Apparently AOL was just awarded a patent (filed in 1997) on Instant Messaging software. This is insanity. "Chat" applications have been around as long as the Internet itself. [Managed Space]

Exactly. finger/talk, anyone? I used these to determine whether people were online and then to talk to them over a decade ago. Didn't the old pre-Internet AOL and CompuServe include network user chats too?

Team Development Guide. This document provides development and procedural guidance for project teams building .NET applications with Visual Studio .NET and Visual SourceSafe. It discusses the processes, disciplines, and .NET development techniques that team members must adopt in a team development environment. It also describes how to create the necessary development infrastructure, which includes source control databases, development workstations, and build servers.

The Biggest Risk. The focus of every manager/executive's concern has to be risk. Every technology project has risks, and some have more than others. Here is a sampler of risk types...

Interesting read.

Also re-visited my trusty stack of Gerald Weinberg's "Quality Software Management" books (all 4 volumes). If you're a developer who is constantly trying to understand why people in bad situations constantly do stupid things ("The project is late! We'll fix that by adding more people!"), you owe it to yourself to read his books. I think Jerry's work is timeless, and you shouldn't overlook the fact that modern "luminaries" in software development techniques derive a LOT of their work from Jerry's. [InkBlog : The Random Musings of David Weller]

Looks interesting.

Brad (The .NET Guy) @ 12/11/2002 12:37 PM. I think that selecting the compression ratio for ZIP files is a different thing, for a few reasons: 1. It's mostly heritage, from when we had slow CPUs and small hard disks. 2. A ZIP is an archive of multiple files, not compression of a single file. 3. Selecting different options actually used different compression systems, which yielded different results.

1. I agree with this, but presumably GDI+ will be available on portable devices with less CPU since it forms the basis for System.Drawing.

2. Well, gzip works on one file and offers the same facility. IIRC, PNG uses the same algorithm as gzip.

3. I'm not sure about this - I think the different options affect the size of the lookahead buffer for the deflate algorithm, where a bigger buffer may give better compression but requires more processing.

Interesting discussion for a cold Thursday morning. :o) Brrrr...just started trying to snow too.

[Managed Space] Your 30 day trial subscription has ended

Overall, Radio has been great. It does everything I want it to and some things I didn't know I wanted but am now addicted to. It's not perfect, however - there are some annoyances, including:

No way to manually update News! This one drives me nuts. I use Radio on a laptop. It's not on all the time and often I have to use dial-up Internet. When I open it up in the morning or dial in from the airport or something, I want to be able to tell Radio to please go fetch news now. As it is, I set up Radio to fetch news on startup and just restart Radio when I want to refresh. This (combined with criticism 2 below) means that on dial-up I'm unsure just when Radio has finished updating.

Hear, hear. This one drives me mad too.

Radio does not start in a new browser window - Double-clicking the Radio taskbar icon launches Radio in some random IE window. More than once I've found a site, double-clicked Radio so I could link to it, then got confused when the site's not up anymore.

This was one of the things that drove me to change the setting in IE for whether or not it should reuse browser windows. I had the same thing from Outlook too. I'm not sure how Radio launches URLs, because it's the only application I have that doesn't interact with IE correctly - it always opens a browser window, but it only navigates to the Radio page if there is another IE window open already. Otherwise, it just goes to the home page (about:blank).

Buggy HTML editor. If I enter a link for some text and then try to edit the link later, the editor gets confused. And Ctrl-Z (undo) is just broken - for instance, if I accidentally paste in some text with Ctrl-V and then press Ctrl-Z, the editor doesn't delete the text back out.

Yep, but unfortunately this is just HTML editing with IE and not really anything to do with Radio.

Browsing RSS updates - I love having the RSS updates come in, but the presentation of them feels very 1.0 to me. It would be pretty cool to browse them NNTP-style, like, say, Forte Agent. (In fact, it should be pretty easy to literally convert RSS feeds into NNTP and feed them to Agent.) I've thought about looking at other RSS browser tools, but I really like just being able to click-and-post, and using an external tool wouldn't let me do that. At the very least I really would like a button like Hotmail has that selects all the checkboxes on the page with one click. I sometimes dread subscribing to a new blog, knowing that I'm gonna have to go click all those checkboxes.

There is a preference to check everything by default, and then you only have to uncheck the things you want to keep before pressing delete.

Multiple blogs? - It might be nice to maintain several independent blogs in different places. Radio seems geared toward maintaining only a single blog. Categories help, but what if I want a music blog on a different server than my professional blog? Seems easy enough to fix [though it might increase pressure on the poor UserLand server that provides server space].

You can create a #upstream.xml file in your category folder to make it get uploaded to a different location.

Flakey Comments - Hey, what do I want for free? The comment server seems to be down as much as it is up.

There are alternatives around - I use YACCS, but it appears that they are no longer accepting new registrations. Try here?

Overall I'm having a great time with Radio, so I sent Dave his 40 bucks and expect to be using it for some time to come...

I've used Radio for 9 months now and I'm mostly happy with it. I'd like to move to adding some more dynamic features to the site though and so in time I'm probably going to move to a more home-grown blog using ASP.NET. I'm undecided whether Radio will play a part in that - I'm a bit worried about the problems people have reported when Radio goes bad.

The XML Documentation Tool (XML Documentation Tool.exe) gives Visual Basic .NET developers the ability to author XML documentation files for their library projects. This is useful when it is desirable to distribute a built assembly but impractical to distribute the associated source code. Consumers of the referenced libraries see the XML information in IntelliSense in the Visual Studio code editor and the Object Browser.

Note: The tool can be used for assemblies built in languages other than Visual Basic .NET (such as C#), as long as they are CLS-compliant managed code assemblies.
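For reference, the documentation file the tool produces is the same format the C# compiler emits with /doc. A minimal illustration (the assembly and member names here are made up):

```xml
<?xml version="1.0"?>
<doc>
  <assembly>
    <name>MyLibrary</name>
  </assembly>
  <members>
    <!-- "T:" prefixes a type, "M:" a method -->
    <member name="T:MyLibrary.Widget">
      <summary>A hypothetical class.</summary>
    </member>
    <member name="M:MyLibrary.Widget.Spin">
      <summary>Spins the widget.</summary>
    </member>
  </members>
</doc>
```

Drop a file like this next to the assembly (named MyLibrary.xml) and Visual Studio picks it up for IntelliSense.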

Brad (The .NET Guy) @ 12/11/2002 08:31 AM. The reason you cannot select the compression is because PNG uses a lossless compression system. There's nothing to tweak, unlike JPEG (which is lossy, and therefore you may want to balance size vs. quality).

That's certainly true, but zip files use lossless compression (fortunately ;o)) and I can tweak their compression speed/size when creating them:

-0   store only
-1   compress faster
-9   compress better

So I went looking for something that said either compression on/off or level of compression. Of course, I don't mind one way or the other, but I was surprised not to get any compression when using 24 bpp (which I was using since it ought to be smaller than 32 bpp) and I didn't see this documented in MSDN which is the only real reason this is of note (to me, at least).

I've been using GDI+ to do some basic image processing to add PNG output support to an application that previously only handled writing BMP files. I selected the 24bpp RGB format because I didn't need an alpha channel but the image files weren't being compressed. The sample code in MSDN shows that the PNG encoder doesn't have any parameters and it seems compression is dependent upon the bit-depth. Once I selected 32bpp ARGB (the default for the Bitmap class apparently), the compression was enabled and my 150K test file went down to 1K.
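The fix boiled down to picking the right pixel format before saving. A minimal sketch of what worked for me, assuming Windows and the .NET Framework (the file name is a placeholder):

```csharp
using System.Drawing;
using System.Drawing.Imaging;

class PngSaveSketch
{
    static void Main()
    {
        // Format32bppArgb (the Bitmap default) got deflate compression
        // in my tests; Format24bppRgb did not, despite the smaller depth.
        using (Bitmap bmp = new Bitmap(256, 256, PixelFormat.Format32bppArgb))
        {
            using (Graphics g = Graphics.FromImage(bmp))
            {
                g.Clear(Color.White);
            }
            // The PNG encoder takes no EncoderParameters, so there is
            // nothing else to tweak - just pick the format and save.
            bmp.Save("test.png", ImageFormat.Png);
        }
    }
}
```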

Another annoying VS.NET "feature" is that when I've got a C++ project open, the solution explorer lists the files with a case-sensitive sort. This means that the source files that happen to have been added with a lower-case first character (by some previous incarnation of MSVC++) are right at the bottom, and it takes me a while to remember to look down there. Especially bad when there are a few hundred files all told.

I like VS.NET but things like this make me wish I'd paid more attention during the beta process and reported a few more issues. Unfortunately, I don't have the bandwidth (or the few days it would take) to download the VS.NET 2003 beta to see if that is better. It took the better part of a day to download the .NET 1.1 SDK and runtime :o(.

Q306149 - INFO: How to Display an Assembly in the Add Reference Dialog Box

This is the missing link I've been wondering about: it's all done with HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\AssemblyFolders.
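Based on the KB article, you add a subkey under AssemblyFolders whose default value is the folder containing your assemblies. A hypothetical .reg fragment (the "MyLibrary" subkey name and path are placeholders):

```
Windows Registry Editor Version 5.00

; Register a folder so its assemblies appear in the Add Reference dialog
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\AssemblyFolders\MyLibrary]
@="C:\\Libraries\\MyLibrary"
```

After merging this (and restarting VS.NET), the assemblies in that folder show up in the .NET tab of the Add Reference dialog.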

Visual Basic Reference Counting vs. .NET Garbage Collection.

My first article. "I work with a few developers that are in the process of migrating from Visual Basic 6 to .NET. The transition for them is huge, but ultimately, there are a few issues that rear their heads more than any others. One of these issues is resource acquisition, initialization and cleanup."[ClrMnd -- Matt Kennedy]

Matt showed up in my Referer logs today. We exchanged email and it turns out that myself and others here have inspired him to start his own. Cool! Welcome. Rss-subscribed. [Sam Gentile's Radio Weblog]

I listened to most of Chris Sells on .NET Rocks this morning. It was mostly just interesting chat but the one thing I did pick up that I hadn't considered before was the idea of putting System.Diagnostics.Debug.Assert(false) in the finaliser for your IDisposable classes. This means that if you're using a debug build and you forget to call Dispose, you'll ultimately get an assertion failure to remind you when the GC cleans up. I think this is helpful when moving to the GC paradigm.
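As I understood the idea, it looks something like this sketch (the class name is made up):

```csharp
using System;
using System.Diagnostics;

public class DatabaseThing : IDisposable
{
    private bool disposed = false;

    public bool IsDisposed
    {
        get { return disposed; }
    }

    public void Dispose()
    {
        disposed = true;
        // We cleaned up properly, so the finaliser need not run.
        GC.SuppressFinalize(this);
    }

    ~DatabaseThing()
    {
        // Only reached if Dispose was forgotten: complain in debug builds.
        Debug.Assert(false, "DatabaseThing was not disposed");
    }
}
```

Because Dispose calls GC.SuppressFinalize, the assert only ever fires for instances that were collected without being disposed, which is exactly the reminder you want.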

I've uploaded a PDF document about my SqlTx class library as promised. This is a first draft and I do need to put a little more effort into it, but I wanted to post what I'd done so far because I won't have much spare time this week to go further. I've tried to explain the justification for writing the library and the assumptions that went into it. I've also described how the code works to a degree, but you should probably read this whilst also looking at the source code.

One thing I didn't mention before is that the SQL scripts and the C# code for the data components in the unit test code were generated automatically by a tool from an XML description of the data and methods. I plan to release this VS.NET code generator too once I've tidied up the source. It is amazing how much more productive you feel when you can implement T-SQL stored procedures and a C# data layer, complete with transaction support, just by describing the data and operations you want, leaving you free to get on with the business logic.

As always I would appreciate any feedback you have about this project.

John St. Clair @ 12/04/2002 03:17 PM. I'm not sure I agree with you on VSS integration problems -- I've seen a few, but generally I'm pretty happy. But your include problem seems fairly easy to fix: In VSS, create a subproject (subfolder) of your current solution. Share the files from the other folder into it, and then do a get latest from VS.NET. Since the files are shared, changes made in your project will be propagated back. Kind of a hack, but it should work.

Don't get me wrong - I am generally pretty happy with VS.NET in terms of the full product, but some parts do feel like a retrograde step. Yes, I could share the files and change the location of the relevant files in the project to use the integration. There are a couple of hundred of them organised in subfolders below the main include directory, so it would be a bit of a pain. I'd also probably end up having to do this with the other projects that use the common include files, and then I'd need to keep ensuring that I've got the latest version for each project before I build it, rather than just once for all of them. I just feel a bit let down because this worked fine before.

Automatic Managed SQL Server Transactions

I've uploaded a setup package for my SqlTx code. This includes the source code, NDoc documentation, and binaries including an NUnit test harness.

To build the source you will need the following:

  • VS.NET to build the solution
  • NDoc (I am using version 1.1c) - the solution references NDoc in "C:\Program Files\NDoc" so you'll need to make some adjustments if you have it elsewhere
  • NUnit 2.0 if you want to build and run the unit tests - again this is referenced in "C:\Program Files\NUnit V2.0"
  • You will need to create a key file VTSql.snk in the src folder (I didn't include mine)
  • To run the unit tests you will need a database configured with the two scripts in the database project and, if the connection string is different, you will need to update it in the ConnectionString class file

I will post my draft documentation later today once I've made a few more changes.

I'm convinced that the SourceSafe support in VS.NET is worse than it was in VS6. Leaving aside the bug where, if I'm using a code generation tool and I edit the source file without manually checking it out first, VS.NET 'forgets' to check out the underlying generated file and things get messed up, I'm having problems with basic things too.

I have a project that started with the first 32-bit version of Visual C++ (v 1.1?) and MFC2 in about 1993. We gradually ported through the v2 releases, v4 releases, v5 and v6. With VS6, we started using VSS for source control (yes, it was a long time coming). The project has include files drawn from a common folder that different tools share, which is above the project folder in the file system. In VS6 this was no problem, presumably because it just looked at the .scc files to figure out where to look in the SourceSafe database. I moved this project to MFC7/VS.NET several months ago, but because the include file folder isn't below the project home folder, it refuses to accept that they might be under source control. The few project-specific includes are controlled, but if I want to check out/check in the other include files then I have to go through the main SourceSafe .EXE and do it manually (i.e. without integration). This is a real pain.

Today, I've been working on putting together the sample projects for my Declarative SQL Server Transactions sample and it includes a deployment project to generate an MSI install. This is all fine - Microsoft obviously expected people to do this because one of the file sets I can choose is "Source Files from XXXX" so that I can install the source files. Now obviously I'll want to include the solution .sln file so I added that to the deployment project but now SourceSafe gets itself in a mess. It seems to get confused because the .sln file is managed by SourceSafe already. Grrrr.

I've been thinking about this further:

Surely the argument about DataReaders isn't the cost of opening the connection, it's the fact that the connection lives much longer and hence reduces scalability?

One could make the argument that getting a connection, getting your data, and getting off as quickly as possible is in fact much more important than the relative efficiency of DataSets or otherwise in the middle tier. It's generally a lot easier to scale out the middle tier than to improve the scalability of the database.
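The "get in, get your data, get off" pattern with a DataReader looks something like this sketch (the connection string, table, and column are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class ReaderSketch
{
    static void GetNames()
    {
        SqlConnection conn = new SqlConnection(
            "server=.;database=Test;Integrated Security=SSPI");
        SqlCommand cmd = new SqlCommand("SELECT Name FROM Customers", conn);
        conn.Open();
        // CloseConnection ties the connection's lifetime to the reader,
        // so closing the reader releases the (pooled) connection at once.
        SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.CloseConnection);
        try
        {
            while (reader.Read())
            {
                string name = reader.GetString(0);
                // ...use the row immediately; don't hold the reader open
                // while doing slow work, or the connection is tied up.
            }
        }
        finally
        {
            reader.Close(); // returns the connection to the pool
        }
    }
}
```

The scalability worry is entirely about that window between Open and Close; keep it short and the pooled connection is back in circulation almost immediately.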

ADO.NET DataSets.

Thomas Wagner has some interesting things to say about ADO.NET DataSets. I recall there was a long and drawn out discussion on the DM DOTNET list about the performance of using DataReaders vs. DataSets, the end result of which was that Microsoft's admonishment to always use DataSets is bullshit. [The .NET Guy]

Yes, this is a long running discussion, but I'm not convinced that this supposed Microsoft policy stands up to much scrutiny. I've been at an MSDN presentation at Microsoft's UK base where we were told, "...but if you want to get data out quickly, don't use a DataSet, always use a DataReader."

Thomas Wagner says:

Over and over and over I read that one ought to prefer a DataSet over a DataReader because of the cost of opening and closing a db connection when using a reader object. Can we make a mountain out of a molehill to promote our latest achievements, please? Connections are pooled in most .NET / SQL Server configs and are therefore faster to open and close. Has anyone mentioned that fact whilst they demote the DataReader? No! Every other author espouses the party line that you should cache DataSets whenever feasible and eschew DataReaders because of the cost of opening a connection. I can see on a daily basis how developers confuse the concept of inheriting DataSets with traditional OO programming.

EnterpriseServices and transaction promotion.

Greg didn't really like my previous example and says that one could easily pass the SqlTransaction to a method, thereby eliminating the need for implicit distributed transactions [...]
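As I read it, Greg's suggestion is just explicit local transaction plumbing, something like this sketch (the tables and method names are invented for illustration):

```csharp
using System.Data.SqlClient;

class TxSketch
{
    static void AddCustomer(SqlConnection conn)
    {
        SqlTransaction tx = conn.BeginTransaction();
        try
        {
            // Each worker method enlists by taking the transaction
            // as a parameter rather than relying on COM+/DTC.
            InsertCustomer(conn, tx, "Fred");
            InsertAuditRow(conn, tx, "Added Fred");
            tx.Commit();
        }
        catch
        {
            tx.Rollback();
            throw;
        }
    }

    static void InsertCustomer(SqlConnection conn, SqlTransaction tx, string name)
    {
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO Customers (Name) VALUES (@name)", conn, tx);
        cmd.Parameters.Add("@name", name);
        cmd.ExecuteNonQuery();
    }

    static void InsertAuditRow(SqlConnection conn, SqlTransaction tx, string text)
    {
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO Audit (Text) VALUES (@text)", conn, tx);
        cmd.Parameters.Add("@text", text);
        cmd.ExecuteNonQuery();
    }
}
```

The downside, as the discussion below notes, is that every method in the chain has to carry the SqlTransaction parameter, and nothing stops a method deep in the chain from committing or rolling back prematurely.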

This EnterpriseServices discussion has been timely for me. One of the projects that I've been working on over the last couple of months involves building declarative SQL Server transactions to give you the advantage of attributed programming that you get with ES while avoiding the DTC.

I spent some time over the last week writing up a description of my implicit transaction code so I will try my best to make time to publish the first part of this tomorrow.

Definitely true - at first look. But what happens if you later decide that you need distributed TX, perhaps because another method wants to integrate the addition of a new customer with a post to a message queue? This wouldn't be possible using the code you've shown. (And I'm not even talking about what happens if some method somewhere deep down the call chain screws up the transaction logic by prematurely committing a TX.)

For a whole class of web applications that I've been involved in over the last few years using primarily VB6/ASP, we only used COM+ for the simplification of the transaction processing model. Although in 99% of the cases we only ever used SQL Server, along the way, perhaps with diminishing frequency, I've oft cited the fact that one day we'll need the DTC and we'll be glad we used COM+ transactions then but I don't think it's ever happened to me yet.

On the other hand, your point regarding the overhead of distributed TX is well understood. Wouldn't it be great if EnterpriseServices would allow for transactions to start as local ones and later be turned into distributed TX "on demand" - as soon as you access a secondary resource manager? In fact, this feature is planned for the future and the only reason I didn't talk about it before was that I thought it was NDA information. I was wrong - it is already in public and can be found at  (next to some other great information about this topic).

In the future, Enterprise Services will support the concept of promotable transactions which will begin as a local transaction and will be promoted to a distributed transaction when required.

Pretty cool, huh? Hearing about this feature definitely played a part in triggering my love for this technology. [Ingo Rammer's DotNetCentric]

This is something that I didn't know. Yes, it is pretty cool, and I can certainly understand how it might cast ES in a different light. I need to think carefully about how this affects my thought processes thus far :o).

Managed and Unmanaged Contexts.

What exactly is the relation, and the interactions, between unmanaged and managed contexts? Now, I do know the basics, but not much more. Managed contexts are very undocumented, and I'm not sure I know much about them (I'd particularly like to understand when and why they are created by the runtime), but how they relate to unmanaged contexts is even more mysterious (to me, at least).

So, for example, as I understand it, a ContextBoundObject-derived object will be bound to a managed context, while a ServicedComponent-derived object is furthermore constrained to be bound to an unmanaged COM+ context. However, that doesn't say all that much. For example, is there always a managed context for every unmanaged context? Or can an unmanaged context contain multiple managed contexts? Or are managed ContextBoundObject-derived objects context-agile with respect to unmanaged contexts? [Commonality]

I found this document the other day when looking for Enterprise Services context related stuff:

Understanding Enterprise Services in .NET

It talks about the contexts and how they relate to unmanaged COM(+) stuff.