Sunday, December 7, 2008

Castle Windsor factory method support

After using Ninject for a short project (some modifications to a Subtext blog), I came back to Windsor and found myself missing ToMethod(). ToMethod() in Ninject gives you factory method support: you pass it a Func and the container calls it whenever it needs to instantiate the component.

Now, Windsor has had factory support for years, but it requires the factory to be a class, registered as a component in the container. That's fine, as it's the most flexible approach, but sometimes all you need is a factory method, and creating a class and then registering it seems like overkill. It would be nicer to write something like:

Container.Register(Component.For<HttpServerUtilityBase>()
   .FactoryMethod(() => new HttpServerUtilityWrapper(HttpContext.Current.Server))
   .LifeStyle.Is(LifestyleType.Transient));

With this little extension method, you can do just that:

    public static class ComponentRegistrationExtensions {
        // set this once at startup; the generated factory components get registered here
        public static IKernel Kernel { private get; set; }

        public static ComponentRegistration<T> FactoryMethod<T, S>(this ComponentRegistration<T> reg, Func<S> factory) where S : T {
            // register a component wrapping the factory delegate...
            var factoryName = typeof(GenericFactory<S>).FullName;
            Kernel.Register(Component.For<GenericFactory<S>>().Named(factoryName).Instance(new GenericFactory<S>(factory)));
            // ...and point the registration's configuration at it so the FactorySupportFacility uses it
            reg.Configuration(Attrib.ForName("factoryId").Eq(factoryName), Attrib.ForName("factoryCreate").Eq("Create"));
            return reg;
        }

        private class GenericFactory<T> {
            private readonly Func<T> factoryMethod;

            public GenericFactory(Func<T> factoryMethod) {
                this.factoryMethod = factoryMethod;
            }

            public T Create() {
                return factoryMethod();
            }
        }
    }

 

Of course, this needs the FactorySupportFacility to be installed:

Container.AddFacility("factory.support", new FactorySupportFacility());

And since this isn't really a part of the Kernel, it needs a reference to the container's kernel:

ComponentRegistrationExtensions.Kernel = Container.Kernel;

UPDATE 5/7/2009: I submitted a patch which Ayende applied in revision 5650, so this is now part of the Windsor trunk; only instead of FactoryMethod() it's called UsingFactoryMethod().

UPDATE 7/30/2010: as of Windsor 2.5, UsingFactoryMethod() no longer requires the FactorySupportFacility.
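
With a recent Windsor the original registration needs none of this plumbing; roughly (from memory, check the fluent API of the Windsor version you're on):

Container.Register(Component.For<HttpServerUtilityBase>()
   .UsingFactoryMethod(() => new HttpServerUtilityWrapper(HttpContext.Current.Server))
   .LifeStyle.Is(LifestyleType.Transient));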

Monday, December 1, 2008

Google Chrome caches 301 redirects

Ever since the release of Chrome I've been alternating between it and Firefox for casual browsing. At work, I mainly use Chrome for information seeking because of its sheer speed, but nothing beats Firefox + Firebug for web development and javascript debugging. Sometimes, however, I mix my browsers and run some non-javascript stuff on Chrome (of course I test javascript-heavy stuff on all the main browsers).

So a couple of days ago I was checking a seemingly innocent piece of code on Chrome, something like:

Response.StatusCode = 301;
Response.RedirectLocation = newUrl;

and I messed up a querystring parameter on newUrl. I fixed it, ran it again, and it still pointed to the wrong URL. Hmm, was I using the right port (sometimes I run multiple branches on different ports)? Yes. So I set a breakpoint on Response.RedirectLocation, and it didn't fire. WTF?! After some more hair-pulling and cursing (Spanish is a really wonderful language for cursing, you know, there's even a whole mathematical theory behind it(1)) I ran it on Firefox and it worked as expected. And then it hit me: it is a permanent redirect after all, isn't it? So why not cache it? I ran it on every other browser I had (IE, Safari, Opera); they all behaved like Firefox. The only one caching the redirect was Chrome.

I couldn't believe it, so I wrote a simple "proof": an app that gives you a link; when you click the link it creates a cookie and redirects (301) to another page, which shows the content of the cookie and deletes it. Source code is here (nothing interesting) and you can run it here. Point your browser to http://localhost:12345/. Click the link. You'll see "Original=/first" as the response. Ok, now go back to the root. Click the link again. Now, if your browser cached the redirect, you'll see "Original=", since the /first URL wasn't requested again. Go back to the root, hit F5, and it clears the cache.
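
For the curious, the test boils down to something like this (a sketch with hypothetical handler names, not the actual source):

using System;
using System.Web;

// /first sets a cookie and 301-redirects to /second; /second prints the cookie and deletes it.
// If the browser serves the 301 from its cache, /first never runs, so /second prints "Original=".
public class FirstHandler : IHttpHandler {
    public void ProcessRequest(HttpContext context) {
        context.Response.SetCookie(new HttpCookie("Original", context.Request.Path));
        context.Response.StatusCode = 301;
        context.Response.RedirectLocation = "/second";
    }
    public bool IsReusable { get { return true; } }
}

public class SecondHandler : IHttpHandler {
    public void ProcessRequest(HttpContext context) {
        var cookie = context.Request.Cookies["Original"];
        context.Response.Write("Original=" + (cookie == null ? "" : cookie.Value));
        // expire the cookie so the next click starts from a clean slate
        context.Response.SetCookie(new HttpCookie("Original") { Expires = DateTime.Now.AddDays(-1) });
    }
    public bool IsReusable { get { return true; } }
}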

I don't mean to say this is wrong behavior; it's just the first browser I've seen that implements 301 redirect caching, and it's something to be aware of while developing and testing.

(1) For the non-Spanish-speaking readers: it's just a joke ;-)

Sunday, November 9, 2008

Playing around with Visual Studio 2010 CTP

I read the news about VS2010 CTP and couldn't resist downloading it. So here are some impressions:

  • The use of WPF not only makes it "prettier", it opens up a lot of possibilities like the ones shown here. I'm more text-oriented than visual-oriented, though, so I won't be using that much. Highlighting looks really cool, check it out:

[screenshot: vs2010_highlight]

 

  • ASP.NET MVC is not included... well, it's just a CTP, I'm sure it's gonna be included in the next preview...
  • The sample app, DinnerNow, looks very messy and incomplete right now. There are lots of directories and many solutions and projects, and many projects look duplicated. The installer seemed to work just fine, but then I went on to browse it and got some errors like this one:

    An error occurred while parsing EntityName

    But it's obviously still work in progress and it seems they're integrating a lot of technologies (including CardSpace!), I'm looking forward to it.
  • As you can see at the bottom of the previous stack trace, it's a whole new CLR. This means no more people wondering about the framework version on IIS, as happens with v3.0 and v3.5 :-)
  • Parallel Extensions is built right into the core of the new framework, just as announced. Apparently it's a September CTP that we never got for v3.5; I hope the June CTP is not the last one we get for v3.5 :-(
    I've been using ParallelFX for the last few months and it really simplifies the development of multi-threaded code; its API is so simple and clean that you almost forget you're running in parallel, until you get burned :-) You still have to know the principles of multi-threading and be aware of what runs in parallel and what doesn't, but that's a topic for another post...
  • I opened a VS2008 solution with a couple of v2.0 and v3.5 projects. The conversion wizard showed up, just like the one in VS2008, but with a difference: it converted all of the projects to v4.0 without asking. Ok, so I open a file to try the new dynamic feature that's been causing so much controversy over the last few days (apparently it's a love-it-or-hate-it kind of thing), write a quick dynamic instantiation, and I get this error:

    One or more types required to compile a dynamic expression cannot be located

    Luckily it turned out to be an easy fix: it was a v2.0 project converted to v4.0, but it was missing a reference to System.Core (the one from v4.0). There's a sketch of the kind of code involved right after this list.
  • It was also my first time using SQL Server 2008. Finally there's intellisense in Management Studio! Maybe it's not as complete as the latest version of Red Gate's SQL Prompt, but at least it's something.
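
About that dynamic error above: the code involved is nothing fancy; a minimal sketch (my own example, not the project's code):

dynamic sb = new System.Text.StringBuilder();
sb.Append("hello");              // bound at runtime; compiling this needs types that live in System.Core, hence the error
Console.WriteLine(sb.ToString());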

Some other features I'll be checking out:

  • The new System.Web.DataVisualization
  • Classic ASP.NET can now define static, predictable IDs (i.e. no more "ListView1_ctrl0_ProductIDLabel"). Great for javascript integration.
  • Playing around with IDynamicObject to implement method_missing (Chris Burrows' series of posts about this looks great). On the surface, it looks quite complicated compared to Ruby's method_missing or Boo's IQuackFu. It would be pretty cool to use it to implement finders on top of ActiveRecord à la RoR, or DSLs, although it looks like it still won't be as malleable as Boo for DSL creation.

Wednesday, October 8, 2008

CRUD API for Google Spreadsheets

For a little project I'm coding, I needed to programmatically post new rows to a Google spreadsheet. Luckily, there are .NET bindings for the Google Data APIs. But they're still too low-level if you want to use the spreadsheet as a database. I looked around the source repo for a while and found this little gem in Python. In the words of its creator:

Make the Google Documents API feel more like using a database.

This module contains a client and other classes which make working with the
Google Documents List Data API and the Google Spreadsheets Data API look a
bit more like working with a hierarchical database. Using the DatabaseClient,
you can create or find spreadsheets and use them like a database, with
worksheets representing tables and rows representing records.

Just what I needed! So I ported it to .NET, and here is the result. The biggest difference from the Python version is that tables in .NET are strongly typed: when you define a Table<Entity>, the Entity's public properties are serialized when posting to the spreadsheet.

The sample app is pretty much self-explanatory. You can do CRUD operations on the rows stored in a worksheet. The structured query operators are limited, but I didn't need more. I even threw in a LINQ provider (base classes courtesy of Matt Warren), so you can do strongly typed queries:

class Entity {
  public int Amount { get; set; }
}
...
Table<Entity> t = ...
var rows = from e in t.AsQueryable()
           where e.Amount == 5
           select e;

It's still a bit rough around the edges, but usable.

UPDATE 11/03/2010: released GDataDB 0.2

Basic Quartz.Net-Windsor integration

A couple of months or so ago, I had to code some periodic maintenance jobs to be run in-process. So I was faced with the same options Ayende had a year ago (except the out-of-process ones). I finally decided on Quartz.NET, since I needed the dynamic update capabilities (that is, it watches the config file for changes and re-schedules accordingly), which Castle.Components.Scheduler doesn't have yet as far as I know... (BTW, Castle.Components.Scheduler is now part of the trunk).

So I set out to code the wrappers needed to make it play with Windsor, and the resulting code is here. There's a sample app that shows how to set up job and trigger listeners (both global and job-specific). The actual scheduling configuration is managed by Quartz.NET, of course. In the sample I used the external quartz_jobs.xml config with the default RAMJobStore, but you could easily change that to an ADOJobStore or anything else, just by setting the props dictionary on the QuartzNetScheduler component; that dictionary is passed as-is to Quartz.NET.
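
To give an idea, this is how Quartz.NET itself consumes such a properties dictionary (standard Quartz.NET API; the keys and values here are illustrative, not taken from the sample):

using System.Collections.Specialized;
using Quartz;
using Quartz.Impl;

var props = new NameValueCollection {
    { "quartz.scheduler.instanceName", "MaintenanceScheduler" },
    { "quartz.jobStore.type", "Quartz.Simpl.RAMJobStore, Quartz" } // swap for an AdoJobStore type to persist jobs
};
ISchedulerFactory factory = new StdSchedulerFactory(props);
IScheduler scheduler = factory.GetScheduler(); // synchronous API in the Quartz.NET versions of that era
scheduler.Start();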

Why did I write "basic" integration in the title? Well, I just needed the basic features of Quartz.NET, so I didn't even try to integrate stuff like clustering, remote servers, etc. I have no idea if those will work with my code. If anyone gives it a try, I'd love to hear about it :-)

UPDATE 4/3/2009: I wrapped the components in a facility for easier usage and configuration.

Saturday, September 6, 2008

Changing Windsor components configuration at runtime

Most of the time you want your services/websites/systems running non-stop. I mean, why would you ever want to be restarting them every 4 minutes? You want to minimize downtime. One cause of app restarts in .NET is having to modify web.config / app.config. To partially address that, I wrote a DynamicConfigurationSettings class some time ago, which basically mimics a web.config but is freely modifiable at runtime. Now, let's say you have an app with hundreds of components registered in a Windsor container, all configured in your web.config or windsor.boo file. How could you change a component's properties without restarting your app? Even if you put your container config in another file, nothing happens when you change it until the file is re-parsed and re-processed, which basically means recycling the app.

Example

A sample case could be an SMTP mail sender component, which could look like this:

public class SmtpMailSender : IEmailSender
{
    public SmtpMailSender(int port, string host)
    {
        ...
    }  
    ...
}

and configured like this:

<configuration>
    <components>
        <component id="smtp.sender" 
            service="Namespace.IEmailSender, AssemblyName"
            type="Namespace.SmtpMailSender, AssemblyName">
            <parameters>
                <port>10</port>
                <host>smtp.mycompany.com</host>
            </parameters>
        </component>
    </components>
</configuration>

Now suddenly, the mail server goes down! The ops team, a.k.a. you :-), quickly sets up a second mail server at smtp6.myothercompany.com:25, but you have to point the smtp component to the new server while you investigate what happened to the other. If you touched the config, the site would be recycled, and bad things could happen:

  • If you use InProc sessions (which is the default), all your users lose their sessions.
  • If you use inproc cache (which is the default), you lose all your cache.
  • If any user was in the middle of a payment transaction (using a payment gateway like VeriSign/PayPal), you could lose the transaction.

Solution

But Windsor is very flexible, so you could tie your components' configuration to a dynamic.web.config using this simple SubDependencyResolver:

public class DynamicAppSettingsResolver : ISubDependencyResolver {
    // appSettings key convention: <implementation full name>.<parameter name>
    public string Key(ComponentModel model, DependencyModel dependency) {
        return string.Format("{0}.{1}", model.Implementation, dependency.DependencyKey);
    }

    public object Resolve(CreationContext context, ISubDependencyResolver parentResolver, ComponentModel model, DependencyModel dependency) {
        var key = Key(model, dependency);
        return Convert.ChangeType(DynamicConfigurationSettings.AppSettings[key], dependency.TargetType);
    }

    public bool CanResolve(CreationContext context, ISubDependencyResolver parentResolver, ComponentModel model, DependencyModel dependency) {
        // only kicks in for primitive parameters that have an override in the dynamic appSettings
        var key = Key(model, dependency);
        return dependency.DependencyType == DependencyType.Parameter &&
               !string.IsNullOrEmpty(DynamicConfigurationSettings.AppSettings[key]);
    }
}
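
Hooking the resolver into the container is one line (standard Windsor API):

container.Kernel.Resolver.AddSubResolver(new DynamicAppSettingsResolver());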

With the resolver in place, you can override component parameters in the appSettings section of your dynamic.web.config:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>    
    <appSettings>
        <add key="Namespace.SmtpMailSender.host" value="smtp6.myothercompany.com"/>
        <add key="Namespace.SmtpMailSender.port" value="25"/>
    </appSettings>
</configuration>

Caveats

This solution has some caveats:

  • The component modifiable with this method (let's call it component A) has to be transient (the default is singleton)! Otherwise the subresolver never gets a chance to set the new values...
    Also, as a consequence, other components (let's call them group B) that depend on component A have to be transient as well, or they will be stuck with a SmtpMailSender holding the old values! And yet other components (group C), depending on any component of group B, have to be transient for the same reason, and so on.
  • This subresolver only works with basic types; ints and strings are all I have needed so far. Of course, you could swap that Convert.ChangeType() for TypeConverters or some other conversion mechanism, or store the settings in a configSection of their own instead of appSettings for more flexibility.
  • It does not support service overriding (although it wouldn't be very difficult to add); it only does parameter overriding.

Wednesday, August 20, 2008

SolrNet with faceting support

I finally added support for facets in SolrNet. There are basically two kinds of facet queries:

  1. querying by field
  2. arbitrary facet queries

Querying by field

ISolrOperations<TestDocument> solr = ...
ISolrQueryResults<TestDocument> r = solr.Query("brand:samsung", new QueryOptions {
    FacetQueries = new ISolrFacetQuery[] {
        new SolrFacetFieldQuery("category")
    }
});

Yeah, kind of verbose, right? The DSL makes it shorter:

ISolrQueryResults<TestDocument> r = Solr.Query<TestDocument>().By("brand").Is("samsung").WithFacetField("id").Run();

To get the facet results, you use the FacetFields property of ISolrQueryResults<T>, which is an IDictionary<string, ICollection<KeyValuePair<string, int>>>. The key of this dictionary is the facet field you queried. The value is a collection of pairs where the key is the value found and the value is the count of occurrences. Sounds complex? It's not. Let's see an example:

Let's assume that eBay used SolrNet to do its queries (please bear with me :-) ). Let's say a user enters the category Maps, Atlases & Globes, so you want the items within that category, as well as the item count for each subcategory ("Europe", "India", etc.) that shows up under "Narrow your results". You could express such a query like this:

var results = Solr.Query<EbayItem>().By("category").Is("maps-atlases-globes")
    .WithFacetField("subcategory")
    .Run();

Now to print the subcategory count:

foreach (var facet in results.FacetFields["subcategory"]) {
    Console.WriteLine("{0}: {1}", facet.Key, facet.Value);
}

Which would print something like this:

United States (Pre-1900): 2123
Europe: 916
World Maps: 650
...

and so on. See? Told you it wasn't hard :-)

Note that by default, Solr orders facet field results by count (descending), which makes sense since most of the time you want the most populated/important terms first. If you want to override that:

ISolrQueryResults<EbayItem> results = Solr.Query<EbayItem>().By("category").Is("maps-atlases-globes")
    .WithFacetField("subcategory").DontSortByCount()
    .Run();

There are other options for facet field queries; I copied the docs from the official Solr documentation.

Arbitrary facet queries

Support for arbitrary queries is not very nice at the moment, but it works:

var priceLessThan500 = "price:[* TO 500]";
var priceMoreThan500 = "price:[500 TO *]";
var results = Solr.Query<TestDocument>().By("category").Is("maps-atlases-globes")
    .WithFacetQuery(priceLessThan500)
    .WithFacetQuery(priceMoreThan500)
    .Run();

Then results.FacetQueries[priceLessThan500] and results.FacetQueries[priceMoreThan500] get you the respective result counts.

Code is hosted at googlecode.

Sunday, July 20, 2008

Facebook login WTF

Yesterday I was about to log into Facebook and I noticed it was taking too long to load the page... I opened Firebug and saw this:

 

[screenshot: facebook_login - Firebug request timeline]

 

Wow. Nice stairs, but it really hurts performance.

Tuesday, July 15, 2008

Castle Facility for SolrSharp

Some months ago I wrote SolrNet, an interface to Solr, mainly because I thought SolrSharp (the standard Solr interface for .NET) was too verbose and IoC-unfriendly. I still think that way, but I always wondered what it would take to integrate it with Windsor (or any other IoC container, for that matter).

And the answer is... 241 lines of code (as per "wc -l *.cs"). That's it. That's all it took to write some interfaces and adapters and then wrap them in a Castle facility. That should teach me to try a bit harder next time! :-)

SolrNet was designed with IoC in mind, so integration is just a matter of registering a component:

<component id="solr" service="SolrNet.ISolrOperations`1, SolrNet" type="SolrNet.SolrServer`1, SolrNet">
    <parameters>
        <serverURL>http://localhost:8983/solr</serverURL>
    </parameters>
</component>

Code is here. Note that it's just a proof of concept, I barely tested it!

Monday, July 14, 2008

Y-combinator and LINQ

Today I finally found an excuse to use Wes Dyer's implementation of the Y-combinator in C#:

public delegate Func<A, R> Recursive<A, R>(Recursive<A, R> r);

public static Func<A, R> Y<A, R>(Func<Func<A, R>, Func<A, R>> f) {
    Recursive<A, R> rec = r => a => f(r(r))(a);
    return rec(rec);
}

I used it to recursively get all the files in a given directory, in a one-liner:

var RecGetFiles = Y<string, IEnumerable<string>>(f => d => Directory.GetFiles(d).Concat(Directory.GetDirectories(d).SelectMany(f)));

Nice, isn't it? Sample usage:

foreach (var f in RecGetFiles(Directory.GetCurrentDirectory()))
    Console.WriteLine(f);

It's probably not a good idea, though, to use this in production code, as it's not really necessary (just use regular recursion) and it harms readability.
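
For comparison, the plain recursive version (what I'd actually use) would look something like this:

// uses System.Collections.Generic, System.IO and System.Linq
static class FileWalker {
    public static IEnumerable<string> GetFilesRecursively(string dir) {
        // same traversal as the one-liner above, no Y-combinator involved
        return Directory.GetFiles(dir)
            .Concat(Directory.GetDirectories(dir).SelectMany(GetFilesRecursively));
    }
}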

A few less strings in MonoRail


In the same spirit as my previous post, I wrote a couple of extension methods to avoid strings when redirecting in Castle MonoRail.
Let's say you have this controller:
public class HomeController : SmartDispatcherController
{
    public void Index()
    {
    }

    public void SaveInformation(String name, int age, DateTime dob)
    {
        // work work work

        // Send the user back to the index
        RedirectToAction("index");
    }
}

Now you can write the redirect like this:

this.RedirectToAction(c => c.Index());

It also works with parameters, i.e. if you have an action like this:

public void Index(int a) {}

you can do:

this.RedirectToAction(c => c.Index(1));

and it will redirect to /home/index.castle?a=1 (or whatever extension you're using)
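
Under the hood it's just expression tree inspection: pull the action name (and arguments) out of the lambda and fall back to the regular string-based redirect. A rough sketch of the idea (not the actual code, which also handles the databinding attributes and querystring building):

using System;
using System.Linq.Expressions;
using Castle.MonoRail.Framework;

public static class RedirectToActionExtensions {
    public static void RedirectToAction<TController>(this TController controller,
            Expression<Action<TController>> action) where TController : Controller {
        var call = (MethodCallExpression) action.Body;
        // the called method's name is the action; its arguments would be evaluated
        // and serialized into querystring parameters here
        controller.RedirectToAction(call.Method.Name);
    }
}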

It even works with objects and [DataBind] (with some limitations, see below):

public class Properties {
    public int SomeValue { get; set; }
}

public class SmartTest2Controller : SmartDispatcherController {
    public void DataBinding([DataBind("prefi")] Properties prop) {}

    public void Index() {
        this.RedirectToAction(c => c.DataBinding(new Properties {SomeValue = 22}));
    }
}

Which will redirect to /SmartTest2/DataBinding.castle?prefi.SomeValue=22

There's also support for [ARDataBind] and [ARFetch]:

public class ARController : ARSmartDispatcherController {
    public void Index() {
        this.RedirectToAction(c => c.ARFetchAction(new Entity { PK = 5, Name = "Robert Duvall" }));
        this.RedirectToAction(c => c.ARAction(new Entity { PK = 4, Name = "John Q" }));
    }

    public void ARAction([ARDataBind("pre")] Entity e) {}
    public void ARFetchAction([ARFetch("pre")] Entity e) {}
}

Which would respectively redirect to /AR/ARFetchAction.castle?pre=5 and /AR/ARAction.castle?pre.PK=4&pre.Name=John+Q
(I was watching John Q while writing this :-) )

ActiveRecord support is implemented in another assembly, to cleanly separate the dependencies. When using the ActiveRecord extensions, you have to initialize it in your Application_Start():

Setup.ForAll(new DefaultExtensionsProvider {ExtUtils = new ARUtils()});

This instructs the extension methods to use the ActiveRecord helpers.

Code is here, let me know your opinion...

Caveats / limitations (for now):
  • Doesn't work with nested object databinding
  • Doesn't work with arrays of objects
  • Works on Castle RC3. Trunk has been refactored a lot regarding controllers since RC3, so this code would have to be adapted (extending IRedirectSupport instead of Controller and such)

UPDATE 5/16/2009: I just found out there already was a similar project on castle contrib.

Tuesday, March 25, 2008

Strongly typed NHibernate Criteria with C# 3

I, like many others, hate string literals. Of course, I mean string literals that have some meaning to the code itself, not the "hello, world" kind of strings. They are not type-safe, not easy to refactor, etc. The drawbacks have been mentioned a lot.

In particular, it always bothered me that NHibernate suffers from this problem, especially in the Criteria API. Well, now with C# 3 AST manipulation we can easily fix this. I wrote a simple set of helper classes and extension methods that provide an Expression parameter wherever there was a string parameter describing a property name. Let's see an example:

Instead of writing this:

IList cats = sess.CreateCriteria(typeof(Cat))
    .Add( Expression.Like("Name", "F%") )
    .AddOrder( Order.Asc("Name") )
    .AddOrder( Order.Desc("Age") )
    .SetMaxResults(50)
    .List();

You can write this:

IList cats = sess.CreateCriteria(typeof(Cat))
    .Add( ExpressionEx.Like((Cat c) => c.Name, "F%") )
    .AddOrder( OrderEx.Asc((Cat c) => c.Name) )
    .AddOrder( OrderEx.Desc((Cat c) => c.Age) )
    .SetMaxResults(50)
    .List();

which is a bit longer, but refactorable.
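
The helpers themselves are small; the whole trick is digging the member name out of the expression tree. A sketch of the idea (shown against NHibernate 2.x's Restrictions class; in 1.2 the same factory methods live on NHibernate.Expression.Expression):

using System;
using System.Linq.Expressions;
using NHibernate.Criterion;

public static class ExpressionEx {
    public static ICriterion Like<T, TProp>(Expression<Func<T, TProp>> property, object value) {
        return Restrictions.Like(MemberName(property), value);
    }

    private static string MemberName(LambdaExpression expr) {
        var body = expr.Body;
        var unary = body as UnaryExpression; // value-type properties come wrapped in a Convert node
        if (unary != null)
            body = unary.Operand;
        return ((MemberExpression) body).Member.Name;
    }
}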

Code is here (NHibernate 1.2)

UPDATE 2/21/2009: Dan Miser has kindly submitted a patch to port these extensions to NHibernate 2.0.1GA. Code is here. Thanks Dan!

UPDATE 3/28/2010: Two years have passed now since I originally published this and several similar solutions have popped up. In particular, I recommend NH Lambda extensions for NHibernate 2.x which seems to be more complete, less verbose, and better maintained than my own solution. NHibernate 3 will have a new official API similar to this, named QueryOver. If you're stuck with NHibernate 1.2 the only solution available is the one presented in this article.

Related projects:

Sunday, March 16, 2008

Injectable file adapters

Someone must have done this, but I really couldn't find it. I'm talking about an IoC-friendly System.IO.File replacement. You know, unit tests shouldn't touch the file system, etc. So if your code writes a file and you want to unit-test it, you pretty much have to mock the writing of the file. Except there's a known problem: System.IO.File is a static class, and those are not mockable. Even TypeMock can't mock System.IO.File since it's part of mscorlib. So the only solution left is to build an interface and a wrapper around File. So, here's the source, probably the most boring code I've ever written. I've also included a static locator that gets the IFile implementation using Ayende's IoC static accessor to the Windsor container, so you can write code like this:

[Test]
public void Copy() {
    var mocks = new MockRepository();
    var container = mocks.CreateMock<IWindsorContainer>();
    var fileImpl = mocks.CreateMock<IFile>();
    IoC.Initialize(container);
    With.Mocks(mocks).Expecting(delegate {
        SetupResult.For(container.Resolve<IFile>()).Return(fileImpl);
        Expect.Call(() => fileImpl.Copy(null, null))
            .IgnoreArguments()
            .Repeat.Once();
    }).Verify(delegate {
        FileEx.Copy("source", "dest");                
    });
}

So you only have to replace your calls to System.IO.File with FileEx and that's it. You get the benefits of testability and extensibility (yes, sometimes you need to provide a different behavior for File.WriteAllText()) and you don't have to deal with interfaces, implementations, etc. once you have set up the container. Personally, I prefer to make the dependency on IFile explicit in my components.
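
For reference, the wrapper is exactly as boring as advertised: an interface mirroring System.IO.File plus a pass-through implementation. A sketch with just a couple of methods (the implementation class name here is made up, and the real one covers more of File's surface):

public interface IFile {
    void Copy(string source, string dest);
    void WriteAllText(string path, string contents);
}

public class FileAdapter : IFile {
    public void Copy(string source, string dest) {
        System.IO.File.Copy(source, dest);
    }

    public void WriteAllText(string path, string contents) {
        System.IO.File.WriteAllText(path, contents);
    }
}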

Like I said, I was very surprised that I couldn't find a working implementation of this... especially because there was a big debate a year ago about maintainability, YAGNI, dependency injection, coupling and more, and Anders Norås commented on this solution in one of his own posts.

Well, I hope someone finds this useful.

Friday, March 14, 2008

ReSharper 4 helps learning C# 3 syntax

So, ReSharper 4 nightlies have been available for download for around a month now, and I've been using it successfully at work with VS2008 since the first published build. Exceptions are uncommon, and when they happen nothing really breaks, i.e. your code won't get messed up. Progress from one build to the next is amazing: lots of bugs get fixed and it just gets better and better. I won't get into the specific new features since that's been covered a lot.

IMO, the best feature is that it actually speeds up your learning of C# 3 features. When I first started coding in VS2008, my code came out naturally in C# 2 style, something like:

Dictionary<string, string> dict = new Dictionary<string, string>();
dict["one"] = "1";
dict["two"] = "2";

Although I did know about the C# 3 features, I had to consciously make an effort to use them. It's like making an expression jump from your receptive to your productive vocabulary. You know the words, but they just won't come out!

But I'm a Ctrl+Alt+F (code reformat) junkie, and if you add some Alt+Enter magic you get:

var dict = new Dictionary<string, string> {
    {"one", "1"},
    {"two", "2"},
};

And after a while of constantly reading your own code automatically converted to C# 3 by ReSharper, you gradually get used to it and it just starts coming out like that naturally. Now it's part of your "productive vocabulary"!

So don't be afraid of the nightly status and give ReSharper 4 a try!

Tuesday, February 5, 2008

Migrating to TeamCity

So I finally got some time to migrate to the free TeamCity Professional (actually, I did this like three weeks ago but never blogged about it :-) ), and I'm loving it. Installation and configuration are a breeze, and the comet-y UI always tells you the current status. There's a ton of features I have yet to discover, but one thing I did try and really liked is Jabber notifications. We have a private Jabber server at work, so I created an account for TeamCity, and now it IMs me when the build fails. Plus, I get an RSS feed of all the builds.

What I'd really like to do now is put a status widget on our Trac wiki, but so far I haven't been able to. I have a bit of Confluence envy now :-)

Trying out ASP.NET MVC

A couple of weeks ago I wrote a web front-end for managing faxes on a fax server using Castle.Facilities.WindowsFax and ASP.NET MVC, mostly to get a feel for the latter. Although it's a very simple app (a couple of features are still missing), I got some impressions:

  • Windsor integration is a breeze thanks to MVCContrib.
  • I missed MonoRail's AccessibleThrough property. There is another option, setting up the routing... but I kind of like the property better. Matter of taste, I guess.
  • Ajax is trivial thanks to the normal MVC routing, just like in MonoRail. I'm not a big fan of JS generation; I prefer to cut out the middleman and use jQuery directly, so I used the great taconite plugin to render ajax responses.
  • I like the possibility of having public methods on my controllers without them being exposed to the client. It would make testing easier in some cases. I know there's quite a controversy about testing private methods, or breaking encapsulation in order to enhance testability, but what if I wanted to test the GetRecords() method on this controller? Sometimes it is convenient. No, I'm NOT saying this is good practice, just that it's nice to have the choice. Choice is good. And if you don't like [ControllerAction], you can always use Phil Haack's ConventionController. See? Choice :-)

Code is here.

Tuesday, January 29, 2008

More fun with Yahoo Pipes + Ohloh + Google charts

UPDATE 3/16/2008: Apparently the project commits XML I was getting from ohloh wasn't an official API call, so now all pipes that depend on that are broken :-(
Hopefully the guys at ohloh will implement it soon!

The more, the merrier, right? This time I took the pipes from my previous post and graphed the commit activity using Google Charts. For example, this is the resulting graph of the latest commit history of the Subversion code (at least the commits processed by ohloh):

Note: if you're reading this from RSS, the graph will most probably not show up.

The graph above is generated on the fly; it's not a static image. In fact, it's contained in an iframe. The page that generates the graph is customizable via querystring, so it's kind of a widget. It's all javascript, so you can drop it anywhere. These are the parameters it takes:

  • api_key: ohloh api key (required)
  • project_id: ohloh project id (required)
  • page_count: how much history to get (roughly 25 commits per page) (default 1)
  • interval: shows date on the x axis of the graph only every x days, to avoid overlapping (default 5)
  • width (default 600)
  • height (default 150)

In addition to the pipes from my previous post, I had to build a couple more:

For date parsing, I used the awesome datejs.

PS: Please respect the Ohloh API terms, get your own API key and link back to their site.
PS2: Let me know if you use this! Or if you find any bugs!

Wednesday, January 23, 2008

Fun with Ohloh + Yahoo Pipes

UPDATE 3/16/2008: Apparently the project commits XML I was getting from ohloh wasn't an official API call, so now all pipes that depend on that are broken :-(
Hopefully the guys at ohloh will implement it soon!

A couple of days ago I was looking around ohloh and thought it would be cool to publish my OSS stack. They don't provide a feed for that, but there's Yahoo Pipes and ohloh's API, so I got to work on it. Pipes wouldn't read ohloh's XML since it didn't have an XML header, but I asked and the ohloh team promptly fixed it. So here are the pipes I built:

All of them need an ohloh api key.

Thursday, January 3, 2008

Castle Facility for Windows Fax Services

Some months ago, at work I was facing the need to send faxes from the main application server. "No problem", you might say, "just use Windows Fax Services". But the fax server was far, far away, only reachable through the internet. So I wrote a wrapper Castle facility around the fax API and published it as a SOAP web service. Interfaces and implementation are in separate assemblies, so the web service client doesn't have to depend on fxscomex.dll et al. (the Windows Fax DLLs). It is also possible to send faxes via email, by setting up a couple of components outside the facility. These components periodically check an IMAP account, taking the destination fax number from the subject and the actual fax from an attachment. Usually you would send a TIF or PDF for faxing. Plain text also works, but I guess nobody would use that...

Ok, here's some more formal documentation:

Usage

The facility (Castle.Facilities.WindowsFax.dll) should be installed and configured on the machine that is the direct client of the fax server.

Required assemblies:

  • Castle.Facilities.WindowsFax.dll
  • Castle.Components.Fax.WindowsFaxService.dll
  • Castle.Components.Fax.dll
  • Castle.Core.dll
  • Castle.MicroKernel.dll
  • Interop.FAXCOMEXLib.dll
  • Windows Fax installed

Those are the bare minimum assemblies required; however, you probably also have Castle.Windsor, so I'm going to assume that for the rest of this documentation. If you don't have Windsor for some reason, you'll have to configure things manually using the MicroKernel or provide your own configuration.

Configuration

<facility id="fax" type="Castle.Facilities.WindowsFax.WindowsFaxFacility, Castle.Facilities.WindowsFax">
  <serverName>localhost</serverName>
</facility>

<serverName> refers to the machine that has the actual Windows Fax Service running, the one with the actual fax hardware. If not specified, localhost is assumed.

To perform a fax operation, ask the Windsor container for Castle.Components.Fax.IFaxService; it has all the operations.
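
For example (plain Windsor resolution; see the IFaxService interface for the actual operation names):

var faxService = container.Resolve<Castle.Components.Fax.IFaxService>();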

Scenario 1: Local fax server

If your fax server is on the web server's network, just drop the configuration above in your windsor config, setup the fax server name and you're good to go.

Scenario 2: Fax server is on another network

Here you can use the SOAP web services provided (client proxy here) or alternatively use remoting, WCF, etc. Leaving firewalls aside, it looks like this:

 

[diagram: Dibujo3 - the web app server calling the fax web server over SOAP]

 

Required assemblies on "Web App Server":

  • Castle.Components.Fax.dll
  • Castle.Components.Fax.WebServiceClient.dll

On "Fax Web Server", just install the web service provided.

Sending faxes by email

The web service mini-app also comes with the optional components to send faxes via mail. It works by periodically polling an IMAP account for mails whose subject has the format "<fax number>, <fax subject>".

  • IMAPClientWrapper is just a wrapper around Lumisoft's excellent IMAP Client.
  • MailFaxSender is the one that does the actual work of checking the mail account and sending any pending faxes found.
    This component takes the following configuration:
    • Username, Password: to access the IMAP account
    • Server, ServerPort: to the IMAP server
    • FolderInbox: IMAP folder where to look for new faxes
    • FolderError: IMAP folder where faxes processed with errors are stored.
    • FolderSent: IMAP folder where sent faxes are stored.
    • FolderNotFax: IMAP folder where mails not related to the fax service (i.e. that don't comply with the subject format specified above) are stored.
    These IMAP folders are automatically created when needed.
  • MailSenderScheduler is a basic scheduler that executes the IMAP polling every x minutes. It relies on the startable facility to be...er... started.
    Configuration: SleepingMinutes: pretty much self-explanatory.

Troubleshooting

  • Problem: I can't perform any fax operation, I always get some COM Operation Failed exception.
    Solution: Make sure the fax service is running. If it's not that, it's probably a matter of permissions. Open the Fax Console, Tools -> Fax Service Manager, right-click on Fax (Local), Properties, Security tab, allow all operations to the user running the facility (probably NETWORK SERVICE or ASPNET). Also check the permissions on the fax printer: from the Fax Console, Tools -> Fax Printer Configuration, Security tab, etc.
  • Problem: I can't send PDF files.
    Solution: Install Acrobat 4, then modify the path on the registry file acrobat_printto.reg, then apply that reg. UPDATE: also try HKCR\AcroExch.Document.x\shell\Printto\command (where x is the acrobat version)
    Explanation: Windows Fax Services sends different file types by printing them internally to the fax printer. This registry modification tells Windows to use Acrobat 4 to print the PDF. Why Acrobat 4 and not the latest version? Because Adobe has removed from Reader the support for printing a PDF directly. UPDATE: turns out this still works with Acrobat 8.1. Couldn't get it working with Acrobat 9 though.
  • Problem: I can't send DOC (Word) files.
    Solution: DOC files are not supported (if you find something that prints docs on the server reliably (i.e. NOT Microsoft Word), please tell me).

TODO

  • Generalize it to support Hylafax and other fax providers.
  • Support POP3

Project homepage at SourceForge.

SVN repository.