May 2003 Blog Posts

I decided last November that I was going to find the time to develop a system to replace Radio Userland as my blogging software. Lots of people had been reporting problems with it and although that never happened to me, I wanted a platform to which I could add more dynamic features. I gave myself until the end of March to make the change, but in the end it took an extra two months to find the time to sort something out.

Using ASP.NET, I've finally got around to putting a basic system together. It has been a useful learning experience putting to use all the best-practice information that I've learned over the past few months, and I plan to develop it further. At present, all I have is an unsophisticated XML/RSS item store which displays as HTML, provides an RSS feed, and uses the Blogger API for posting. No comments and no support for ETag/Last-Modified yet, but that'll come soon.

The old RSS feed URL will redirect for the time being, but the new URL that you should use is

I've done my best to ensure that old links are mapped to the new pages but if you find anything broken, please let me know.

It is often useful to be able to track a user session through your ASP.NET application whilst avoiding the use of server-side session state. The following code generates a session cookie on the first request; the browser then sends it with subsequent requests until you either clear the cookie or the user closes their browser.

using System;
using System.Web;

namespace Sample {
 public class SessionModule : IHttpModule {
  #region IHttpModule Members

  public void Init(HttpApplication context) {
   context.AuthenticateRequest +=
    new EventHandler(OnAuthenticateRequest);
  }

  public void Dispose() { }

  #endregion

  #region Event Handler

  // Issue a GUID-valued session cookie if the request
  // doesn't already carry one.
  private void OnAuthenticateRequest(object sender, EventArgs e) {
   HttpContext ctx = HttpContext.Current;
   HttpCookie cookie = ctx.Request.Cookies["SampleSession"];
   if (cookie == null) {
    ctx.Response.Cookies.Add(
     new HttpCookie("SampleSession", Guid.NewGuid().ToString()));
   }
  }

  #endregion
 }
}

You then need to register the HttpModule in your web.config file as follows:

<httpModules>
 <add name="SessionModule"
  type="Sample.SessionModule, SampleSession" />
</httpModules>

where SampleSession is the name of the assembly that the class is compiled into.
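With the module in place, application code can read the generated id back from the request, for example to correlate trace output. This is just a sketch; the page class name is my own invention:

```csharp
using System;
using System.Web;

namespace Sample {
 public class LoggingPage : System.Web.UI.Page {
  protected void Page_Load(object sender, EventArgs e) {
   // Note: the cookie is absent on the very first request of a
   // session, because the module only adds it to that request's
   // response - the browser sends it back from the second request on.
   HttpCookie cookie = Request.Cookies["SampleSession"];
   string sessionId = (cookie == null) ? "(new session)" : cookie.Value;
   Trace.Write("Session", sessionId);
  }
 }
}
```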

From time to time I've considered making a call from a SQL Server stored proc into a .NET assembly via COM Interop, using the sp_OA stored procedures as a mechanism to perform some database-triggered activity (say, from a scheduled job). I've always found a different solution in the end, and it turns out to be lucky that I did:

Q322884 - INF: Using Extended Stored Procedures or SP_OA Stored Procedures to Load CLR in SQL Server Is Not Supported

To increase performance, the CLR internally uses thread-local storage (TLS), which isn't allowed inside the SQL Server memory space. In addition, SQL Server may be configured to use fibers, which the CLR also doesn't support.
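For context, the pattern that the KB article warns against looks roughly like this (the ProgID Sample.Worker and method name are hypothetical; if Sample.Worker were a registered .NET COM-visible class, this would load the CLR into the SQL Server process):

```sql
DECLARE @obj int;
EXEC sp_OACreate 'Sample.Worker', @obj OUT;  -- instantiates the COM object
EXEC sp_OAMethod @obj, 'DoWork';             -- invokes a method on it
EXEC sp_OADestroy @obj;                      -- releases the object
```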

Don Box favours using <group> and <attributeGroup> over named complexTypes in XSD. I've been having problems getting Intellisense to work correctly in VS.NET with schemas using <extension>, so this approach looks like a good idea, and I can nest <attributeGroup>s.
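A sketch of the idea: rather than deriving a named complexType by <extension>, you factor the shared attributes into an <attributeGroup> and reference it wherever it's needed. The names and namespace here are placeholders of my own:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

 <xs:attributeGroup name="commonAttrs">
  <xs:attribute name="id" type="xs:string"/>
  <xs:attribute name="version" type="xs:string"/>
 </xs:attributeGroup>

 <xs:element name="item">
  <xs:complexType>
   <!-- reuse the shared attributes instead of extending a base type -->
   <xs:attributeGroup ref="commonAttrs"/>
  </xs:complexType>
 </xs:element>

</xs:schema>
```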

Craig points to a cool Microsoft tool called Windows Application Verifier, which will watch a malfunctioning app and tell you what it's doing wrong. Perfect for the developer or QA person running as non-admin, especially if you're debugging your own misbehaving app. [The .NET Guy]

Sometimes it is difficult to discern when a seemingly everyday activity requires an element of skill to get the best results. Yesterday, Jon Udell talked about indexing and searching his Outlook mail. He writes:

The Web has trained us, rightly, to expect that we just type in a word or two and get the "right" answer.

I'm not so sure that this is the case. Many times I've seen people search with Google and get fed up because their query comes back with "1-10 of about 65,500,000 results". Search for "windows", for example, and you won't find many results about those glass things we look out of.

Using a web search engine is a skill that can be learned. You need to understand that common words won't necessarily point you in the right direction and that the common usage of a particular word on the web is sometimes not the meaning that you're looking for. Over time you get a feel for how much specificity you need to include to bring the results you want to the top of the list and still avoid the "No pages were found containing..." message.

Back in 1995, I remember it being rare for me to turn to the web when I wanted some information. It came as a sort of "oh yeah, I could look for that on the web" moment. These days, it's almost a given that if I want to find something, it's off to Google I go. In fact, with the Google Toolbar installed, it's right there when I open my browser. However, sometimes it still takes me a while to realise the power that's available.

Yesterday, I was battling with XSLT and input documents containing multiple namespaces that I wished to manipulate in a particular way. After trying various approaches for an hour or so, it struck me that I couldn't be the first to run into this problem. Sure enough, a quick search of Google Groups brought me the solution, complete with sample source :o). If I'd thought to look an hour earlier it would have saved me much head scratching. The key thing, though, is that it was the careful choice of keywords that meant I found my answer in the top 5 results and didn't end up with 5000 vaguely related threads that would be no help.
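The general shape of the fix, for anyone hitting the same thing: declare a prefix on the stylesheet for each namespace that appears in the input, then use those prefixes in your match patterns and select expressions; the prefixes don't have to match the ones in the source document, only the namespace URIs do. The URIs and element names below are just placeholders:

```xml
<xsl:stylesheet version="1.0"
 xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
 xmlns:a="urn:example:orders"
 xmlns:b="urn:example:items">

 <!-- match the element by its namespace-qualified name -->
 <xsl:template match="a:order">
  <xsl:apply-templates select="b:lineItem"/>
 </xsl:template>

 <xsl:template match="b:lineItem">
  <item><xsl:value-of select="@sku"/></item>
 </xsl:template>

</xsl:stylesheet>
```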