Friday, October 3, 2008

Are Certifications Worth It?

For our training this year, because of budget limitations, we were restricted from traveling, and we all ended up getting CBTs (computer-based training courses) that should prepare us for some Microsoft certifications. I'm about to start my training, and I've been wondering whether it will be worth it.

On the one hand, the studying necessary to earn a certification will definitely expose me to more of the .NET Framework and the C# language than my day-to-day programming does. Of course, a lot of it may be things I don't really need to know, since certification exams tend to cover such a wide range of topics.

On the other hand, it's been my experience in the past that getting a certification doesn't usually buy you anything at your current employer; they're more useful when looking for something new. I'm not currently looking for a job, nor am I expecting to (though you never know, especially with the economy going the way it is).

I'm not going to turn down the chance to take a week and study, and I'll definitely take the exams and get the certifications, but I'm still not sure exactly how much benefit I will gain from this.


Tuesday, September 16, 2008

ORMs and Generated Code

At my job, we've been using a code generator for our main applications since we originally started writing them over 5 years ago. It implements Object-Relational Mapping (ORM) and generates most of the code necessary to read objects from the database and write them back, even including some rudimentary business logic. Since we started, we've also extended it to generate some things I didn't think could even be generated, like Windows dropdowns with enumerations and a custom stored procedure for keeping a permission table up to date.

Overall, it has been a good system, and it has certainly saved us a lot of time over the years. It is very nice to be able to put a bunch of XML into a file, run the code generator, and get virtually everything we need to work with a new table or set of tables.

Still, it has some limitations, and some other issues that have made me reconsider whether using it has been a good idea 100% of the time.

First, the tool we use to generate the code was written by a consulting company that our company used for many years before the IT department became as fully staffed as it is now. Unfortunately, a couple of years ago we severed ties with that company, which means we no longer get updated versions of the tool. It was written in .NET 1.1, and the C# parser still only understands 1.1 constructs. That means if we use anything introduced in 2.0 or later, like generics, LINQ, etc., we get errors when we run the tool. We have source for most of the tool, but we are missing it for the C# parser, which is exactly what we would need to modify to fix this.

Next, I feel like in some ways, it has kept me from learning everything I could about ADO.NET. Unless we have some custom stuff we need to do in the database that the tool can't generate, we just end up calling generated methods to read and write from the database. Now, I do understand how it works, and in fact, I extended it to include the .NET 2.0 transaction model, but I still wonder if I would know more about ADO.NET if it weren't for this tool.
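
To illustrate what I mean, here's a sketch (the names are hypothetical, not our actual generated API). Day to day, we write a one-liner like the first comment shows; the method below it is roughly the raw ADO.NET that a generated method does for us.

// What we actually write:
//     Customer customer = CustomerData.SelectByID(customerID);

// Roughly what that generated method does internally.
// Requires: using System.Data.SqlClient; "Customer" stands in for a generated business object.
private Customer LoadCustomer(int customerID, string connectionString)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(
        "SELECT CustomerID, Name FROM Customer WHERE CustomerID = @CustomerID",
        connection))
    {
        command.Parameters.AddWithValue("@CustomerID", customerID);
        connection.Open();

        using (SqlDataReader reader = command.ExecuteReader())
        {
            if (!reader.Read())
            {
                return null;
            }

            Customer customer = new Customer();
            customer.ID = reader.GetInt32(0);
            customer.Name = reader.GetString(1);
            return customer;
        }
    }
}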

Finally, it has introduced somewhat of a learning curve as we have brought new developers onto the project, especially less experienced ones. Since it generates so much of the business logic and data access layers, developers have to be trained not to just jump in and start writing or changing code that touches those areas; they need to understand that some of their changes may be wiped out by the tool if they aren't done correctly. So far, most have been positive about this and have caught on quickly, but it is still different from what most people are used to.

So, looking back, would I have done anything differently if I could have? It's hard to say. Probably the biggest change I would have liked to make would be to somehow keep the generated code separate from any custom code we wrote. If we had had .NET 2.0 back then, we could have used partial classes (and in fact, if I had the source to the C# parser, that would be the first thing I would add). As it is, the code is all mixed together, and it can sometimes be hard to tell what is generated and what is hand-coded. But I would not have chosen to write everything by hand; I still believe generating code like this is beneficial.
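
To make that concrete, here is a rough sketch of the separation partial classes would give us (the file and member names are made up for illustration):

// Customer.Generated.cs -- owned by the code generator, safe to regenerate at any time.
public partial class Customer
{
    private int id;

    public int ID
    {
        get { return id; }
        set { id = value; }
    }
}

// Customer.cs -- hand-written logic the generator never touches.
public partial class Customer
{
    public bool IsValid()
    {
        return id > 0;
    }
}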

Friday, September 5, 2008

ASP.NET MVC First Impressions

There's been a lot of discussion online lately about Microsoft's new ASP.NET MVC framework. If you don't know what it is, it's an alternative framework for web development on .NET. It isn't meant to replace ASP.NET Web Forms, but to be another choice alongside it.

I've been reading about it and playing with the preview releases a bit, so I thought I would write up my brief first impressions. Later on, after I've had a chance to work with it a bit more, and maybe build something useful with it, I'll come back and write something else.

Things I like:

  • The separation of logic. I like how it breaks up the presentation (View), logic (Controller) and data (Model) into separate files. It would make it easy to have alternate views of the same data. For example, if you were writing a blog engine, you could have one view be the normal text view, and another view be the RSS feed.

  • URL handling. I wrote recently about URL rewriting using ASP.NET web forms. If we were using ASP.NET MVC, we would have been able to have more friendly URLs with no "tricky" code to intercept the calls and rewrite them to what we already had.

  • Cleaner HTML. One of the things I've run into with web forms that has bugged me is the way it renames form elements to things like "ctl00$Leftnavigation1$productSearch". Since you're now responsible for generating form elements yourself, you no longer get this.

  • Testing. Since the model and controller logic are separated from the view, you can now write unit tests against them (see the sketch after this list).

  • Lots of community support. As I mentioned above, a lot of people are using this and talking about it, so before long, there will be plenty of places to go for answers to questions.
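
To show what I mean about testing, here's a minimal sketch. The controller, repository, and view names are all mine, and the preview APIs are still changing, so treat this as the general idea rather than working code for any particular preview.

// Requires: using System.Web.Mvc; using NUnit.Framework;
public class ProductController : Controller
{
    public ActionResult Details(int id)
    {
        // ProductRepository and Product are hypothetical application classes.
        Product product = ProductRepository.Load(id);
        return View("Details", product);
    }
}

[TestFixture]
public class ProductControllerTests
{
    [Test]
    public void Details_Renders_The_Details_View()
    {
        ProductController controller = new ProductController();

        ViewResult result = controller.Details(123) as ViewResult;

        Assert.IsNotNull(result);
        Assert.AreEqual("Details", result.ViewName);
    }
}

No web server, no browser; the controller is just a class you can instantiate and call.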


Things I'm not wild about:

  • Tag soup. There may be other ways to do this, but most of the examples I've seen put C# or VB code right in the .aspx file. It feels like a return to the bad old classic ASP days.

  • Lack of view state. Again, I'm not sure this is the only way, but so far everything I've seen indicates that you have to manually repopulate form fields, just like in classic ASP. That said, this is getting better: I noticed in Scott Guthrie's recent post that Preview 5 now automatically repopulates fields in an error condition.

  • It is a completely different model from what I (and the other devs on my team) are used to. This is obviously not a huge complaint, since I enjoy learning new things, but if we decide to use this, it will take some time to get everyone up to speed.

  • Sparse official documentation. I realize it is still in preview stage, so hopefully this will get better over time.


Overall, this is an interesting framework, and it will be nice to have a choice when developing new projects. Having said that, I don't think we will be rushing to rewrite all our existing code into MVC. We just have far too much time and knowledge invested in what we already have.

Wednesday, August 27, 2008

Using URL Rewriting For More Friendly Links

As part of an application we're developing at work, I recently did some research on whether it was possible to make the links to items, categories, etc., more friendly to search engines. The links we currently have are something like this: /website/productdetails.aspx?productid=123456. Basically, we pass in a product ID to the productdetails.aspx page. However, it would be nicer to have something more like /website/item/123456/rewritten-item-description.aspx.

After some experimenting, I came up with code that works, isn't too intrusive to the rest of the application, and doesn't require any changes in IIS. First, we'll add a property to the items that returns a link in the correct format, so that we don't have to duplicate that code in the various places we show the link. To convert the item description to something that looks like a filename, I wrote the following method:

// Requires: using System.Text.RegularExpressions;
private string CreateLink(string prefix, string id, string description)
{
    String format = "{0}/{1}/{2}.aspx";

    // Replace spaces and slashes with dashes, and ampersands with the word "and".
    String fixedDescription = description.Trim().Replace(" ", "-").Replace("&", "and").Replace("/", "-");

    // Strip out anything that isn't a word character or a dash.
    Regex nonWord = new Regex(@"[^\w\-]", RegexOptions.IgnoreCase);
    fixedDescription = nonWord.Replace(fixedDescription, "");

    // Convert accented characters to their non-accented counterparts.
    fixedDescription = DeAccentCharacters(fixedDescription);

    return String.Format(format, prefix, id, fixedDescription);
}

This strips out any non-word characters, replaces spaces and slashes with dashes, ampersands with the word "and" and converts accented characters to their non-accented counterparts (that part is done in DeAccentCharacters(), which I won't post, since it just uses a pre-made lookup table, and isn't that interesting). It then prefixes the name with the prefix and id, basically coming up with what I showed above.
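
For example, the property on the item class might look something like this (the property name and the "/website/item" prefix are placeholders for our real ones):

public string FriendlyLink
{
    get
    {
        // ID and Description are the item's existing properties.
        return CreateLink("/website/item", this.ID.ToString(), this.Description);
    }
}

That produces links like /website/item/123456/Some-Item-Description.aspx.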

Now that we have links in the correct format, we need to interpret them. Note that it looks like we have a couple of directories there ("item" and "123456") that don't actually exist. We need to intercept the request for this page before ASP.NET has a chance to complain about it not existing. To do that, we add some code in the Global.cs class in the App_Code directory. First, in the Application_Start() method, add this, anywhere in the method:

SetupRewriteRules();

This will set up a list of URL rewriting rules, using regular expressions. Then, in Application_BeginRequest(), add this as the first line, before anything else you may be doing:

RewriteURL();

This calls a new method that interprets the rules created in SetupRewriteRules() and rewrites the URL in the request accordingly.
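
Put together, the two event handlers end up looking like this:

protected void Application_Start(object sender, EventArgs e)
{
    // Build the rule list once, when the application starts.
    SetupRewriteRules();
}

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Rewrite the URL before anything else sees the request.
    RewriteURL();
}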

Next, add this class to the file (at the end is preferable):

private class RewriteRule
{
    private Regex rule;
    private String rewrite;

    public RewriteRule(String ruleRegex, String rewriteText)
    {
        rule = new Regex(ruleRegex, RegexOptions.Compiled | RegexOptions.IgnoreCase);
        rewrite = rewriteText;
    }

    public String Process(String path)
    {
        Match match = rule.Match(path);

        if (match.Success)
        {
            return rule.Replace(path, rewrite);
        }

        return string.Empty;
    }
}

Next, add a static list to hold these and the SetupRewriteRules() method:

private static List<RewriteRule> rules = new List<RewriteRule>();

private void SetupRewriteRules()
{
    // Note the escaped dot before "aspx"; an unescaped dot would match any character.
    rules.Add(new RewriteRule(@"Item/([^/]*)/(.*)\.aspx", "/ProductDetails.aspx?productID=$1"));
}

We have more rules, but I'll just show this one. Note that the first parameter is a regular expression that matches "Item/item ID/filename.aspx". The second parameter is what to replace that with. In this case, it takes the first match (the item ID, enclosed in the first set of parentheses), and puts it where the "$1" is in the URL.
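
Just to illustrate the pattern, a hypothetical rule for category pages (not one of our real ones) would have the same shape:

// Hypothetical: sends Category/<categoryID>/<name>.aspx requests to CategoryList.aspx,
// with $1 carrying the category ID.
rules.Add(new RewriteRule(@"Category/([^/]*)/(.*)\.aspx", "/CategoryList.aspx?categoryID=$1"));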

Finally, add the RewriteURL() method:

private void RewriteURL()
{
    foreach (RewriteRule rule in rules)
    {
        String subst = rule.Process(HttpContext.Current.Request.Path);
        if (subst.Length > 0)
        {
            HttpContext.Current.RewritePath(subst);
            break;
        }
    }
}

This code iterates through each rule, and if it finds one that matches, it calls RewritePath() using the rewritten URL. This is what actually translates the "friendly" URL into something that works with our application. The great part is that it is totally transparent to the user; they never see the rewritten URL. Postbacks still work fine as well.

I realize there are other ways to do this, like ASP.NET MVC, but our application is pretty much already written, and I'm not wild about going back and redoing it in a totally new technology. This can be retrofitted onto the app without too much pain, and could even be turned off or on with a config file setting if needed.

Wednesday, August 20, 2008

Red Gate Buys .NET Reflector

In a post on his blog today, Lutz Roeder announced that he has sold .NET Reflector to Red Gate Software. This is an indispensable tool for any .NET developer (I can't believe I left it off my list earlier!). I hope Red Gate will continue to support it, especially the free version. We use several of their products at work, and they are a good company, so I have high hopes for it.

Tuesday, August 12, 2008

Favorite Computer Books

I believe all good developers should keep up to date with technology, and one good way to do this is to read technical books. I realize there is a lot of information available on the internet these days, and many people think they can get everything they need from there. However, for some topics, a well-written book is a much better option. With that, here's a list of books that have been useful and/or influential in my programming career:

Note that several of these books are not focused on a specific technology, and thus won't go out of date as quickly as many computer books do. I think that's important when deciding which books to buy, especially if you're on a budget. Try to find a book that will last longer than the year or so that many tightly focused books do.

Friday, August 1, 2008

Debugging ASP.Net AJAX Pages

If you've done any ASP.Net AJAX development, you've probably run into this error message while debugging: Sys.WebForms.PageRequestManagerTimeoutException: The server request timed out.

This happens because the default timeout for asynchronous processing is 90 seconds. This is a reasonable default, and really, nothing in an asynchronous request should take that long. However, if you're stepping through code of any complexity, you can easily exceed that limit. You can fix this by setting the timeout on your script manager, like this:

<asp:ScriptManager ID="MainScriptManager" runat="server" EnablePartialRendering="true" AsyncPostBackTimeout="3600" />

However, you probably don't want that in production. There is also a way to set this in your code-behind, conditionally, based on whether your pages are running in debug mode. If you've been doing ASP.Net development for any amount of time, you are probably aware that you shouldn't run your production site with debug enabled in the web.config file. So, how do you tell if it is enabled? Scott Guthrie mentions in a blog post that you can use HttpContext.Current.IsDebuggingEnabled. So, the code would look like this:

// if we're running debug mode, increase the async timeout to 1 hour.
// this allows stepping through code without timing out on the front end.
if (HttpContext.Current.IsDebuggingEnabled)
{
MainScriptManager.AsyncPostBackTimeout = 3600;
}

This only increases the timeout if you're running in debug mode. I've implemented this in one of our products, and it has worked great.
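
For reference, the switch that IsDebuggingEnabled reflects is the debug attribute on the compilation element in web.config; a minimal example:

<configuration>
  <system.web>
    <compilation debug="true" />
  </system.web>
</configuration>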