Wednesday, November 5, 2008

Prototyping AJAX with Jaxer and jQuery

So where did October go? Amazon kindly delivered a bunch of great books. So far I'm partway through C# in Depth and jQuery in Action, both Manning books and both good reading. As a developer it can be difficult to find books pitched at the right level, and at least with these two Manning got pretty close to the sweet spot.


Fortunately the biggest impact on my tech time, apart from family, is the three months' work I've begun for a local SaaS outfit, TenderLink.com (which is involved in tendering, not online dating). The work I'm doing is confidential but I wanted to talk about the tools I'm using for prototyping: Aptana Studio, Jaxer, and jQuery.


So who isn't using jQuery? Hell, even Microsoft is getting in on the fun. If you're using jQuery you may have heard of Aptana. From their website:


Aptana is about cutting-edge infrastructure for modern, Ajax-based web applications. We already have the leading Ajax IDE in the market (roughly 2 million downloads).

Aptana Studio is based on Eclipse, and you can either download the full application or, if you're already using Eclipse, just download the plugin. Aptana Studio has great support for writing JavaScript code. If you've ever had to write JavaScript in the likes of older versions of Visual Studio, you'll know that JavaScript, despite being the dynamic part of DHTML, has been a second-class citizen for a long time. Aptana Studio really gives you the tools you would expect in order to be productive writing JavaScript, including support for all the major JavaScript frameworks. Aptana Studio is free open source software.


Aptana also integrates its Jaxer server product into Aptana Studio, which opens up some really great possibilities. Jaxer is billed as the world's first AJAX server. That's not necessarily a misnomer, but the buzzword compliance doesn't begin to describe what a cool product it is. It uses the Mozilla engine, so you get the latest Mozilla JavaScript, and by tagging your script element with the runat="server" attribute your familiar JavaScript code gets executed at the server. For the work I'm doing I'm leaning heavily on jQuery to handle the UX. The server side will ultimately be provided by the Jade enterprise object database, but for prototyping my jQuery client user experience the Aptana Studio/Jaxer combination is exceptional.
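To give a flavour of how that works, here's a minimal sketch of a Jaxer page (the markup, ids, and script path are mine, purely for illustration). The first script is executed by Jaxer on the server before the page is delivered; the second is ordinary client-side jQuery:


<html>
  <body>
    <p id="greeting"></p>
    <script runat="server">
      // Runs at the server; Jaxer exposes a server-side DOM,
      // so the page can be modified before it is sent to the browser.
      document.getElementById("greeting").innerHTML =
        "Generated on the server at " + new Date();
    </script>
    <script src="jquery.js"></script>
    <script>
      // Ordinary client-side JavaScript, runs in the browser as usual.
      jQuery(function($) { $("#greeting").addClass("from-server"); });
    </script>
  </body>
</html>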

Wednesday, September 17, 2008

Flex and .Net

I'm currently investing time in learning how to connect a Flex RIA to a .Net server. Having worked for many years on web applications in the finance and banking sector, I am well acquainted with the trials of rendering forms and implementing a consistent user experience across a variety of modern and vintage web browsers. The ability to design and code for a single runtime environment has strong appeal. The stumbling blocks for me have been a loathing for Flash-based advertising and poorly constructed Flash-based websites. Actually my loyalty to Firefox is due in part to the ability to selectively enable Flash via the FlashBlock extension. And did I mention that I also dislike Adobe Reader? How did Adobe turn something so simple into such an unwieldy behemoth?


However the biggest stumbling block is the lack of detail surrounding Flash/AIR session handling and caching. This is well understood in browsers: an online banking site will typically store a session cookie in your browser, and you can feel confident that cookie will be erased when the session ends. I believe this isn't necessarily the case for Flash, although I can't just now put my hands on the particular URL where the issue was being discussed. At the time it was enough to shelve my interest, and it's an issue I need to revisit.


So there are a number of ways to connect from an AIR application to a server application. An obvious one is Web services. This has an enormous benefit on the server side: having written and published your Web service, it is available to any client-side or peer-to-peer technology that knows how to speak your particular flavour of Web service (SOAP/REST). A disadvantage of even a well-designed SOAP Web service can be the amount of XML data being shipped to the client, which is going to impact the responsiveness of an application even over a broadband connection. For AIR the alternative is to use Flash Remoting, the benefits of which, according to this Mark Piller tutorial, "include significant performance improvements, reduced development costs, and a more natural client/server integration". At this stage I'm just going to accept those claims; I don't have any good reason to dispute them.


Mark's tutorial turned out to be a reasonable place to start. There is a commercial Flash Remoting product, but Mark's company gives away the previously commercial WebOrb Flash Remoting application. The tutorial had one unfortunate omission that had me scratching my head: I was consistently getting the error message “Destination 'GenericDestination' either does not exist or the destination has no channels defined (and the application does not define any default channels.)”. I don't know if this occurred because I'm using Flex Builder 3 rather than version 2 as in Mark's tutorial. I found the resolution on one of the forum pages: I needed to provide the additional Flex Builder compiler argument -services "services-config.xml" and copy services-config.xml, remoting-config.xml, and data-management.xml from C:\inetpub\wwwroot\weborb30\WEB-INF\flex into the src directory of my Flex Builder project.


The need I have at the moment is to feel confident I can replicate this quickly in order to pitch for some work. That work entails integrating a Flash/Flex application with a .Net server side and some form of repository, all in a package I can distribute to a hosting environment. Trying to tie those strings together is where the tutorial unravels. GenericDestination is not something I want to use in production: it allows access to every deployed assembly in the .Net application, and the tutorial even indicates it shouldn't be used in a production environment. On the Web, what is easy is often used by default, particularly by inexperienced developers. To identify the alternative configuration I need to dig deeper; I'm sure many wouldn't bother. This is a peeve of mine: if you're going to go to the trouble of writing a tutorial, show me how to do the job correctly. In this case the easy way has serious security implications that may be ignored by developers who just want to get something out there.


So now I want to start filling in the gaps. The above-mentioned GenericDestination needs to be replaced with something appropriate. I need to understand what is required to deploy my application into a hosted environment. I need to integrate my application with some form of repository and I'm leaning towards NHibernate largely because I had some recent exposure to its Java forebear Hibernate and I'm curious.
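For what it's worth, my current (unverified) understanding is that the replacement is an explicit destination in remoting-config.xml that exposes a single named class instead of GenericDestination's catch-all source. A sketch, with a hypothetical destination id and .Net class name:


<destination id="ExampleService">
  <properties>
    <!-- Expose only this one .Net class to remoting clients,
         rather than the catch-all source GenericDestination uses. -->
    <source>Example.Services.ExampleService</source>
  </properties>
</destination>

A client-side RemoteObject pointing at destination "ExampleService" should then be able to invoke that class and nothing else.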

Wednesday, September 3, 2008

Installing Subversion on Leopard

Having previously installed subversion on Linux without issue ($ sudo yum install subversion and $ sudo chkconfig --add svnserve) I was expecting a similarly happy experience with Mac OS X 10.5. Perhaps I wasn't paying as much attention the second time around, which contributed to my woes. The various instructions I used as reference all got me part of the way, for which I'm grateful.


As an aside, I'm also partway through the extraordinary Flow: The Psychology of Optimal Experience. In this case installing and configuring subversion was definitely not a flow activity for me. One of the precursors to achieving flow is being able to concentrate. Trying to find useful information on the Internet can be extremely frustrating. I know the information I'm looking for, I drop some search terms into Google, middle-click some likely links to open in new tabs, and then wade through the tabs looking for the good stuff. Effectively I've just broken my concentration to spend unproductive time looking for the answer to a question that can be difficult to phrase without trawling all manner of dead herrings (so to speak). Think about how good the MSDN documentation is and how easy it is to find not only the API you're looking for but also example code. Think about the code completion features in Visual Studio. These all contribute to getting you into a state of flow and keeping you there.


Getting back to my skirmish with subversion, here's how it all panned out. The first step is to install subversion (I know there are options that don't include compiling from source but I like to see the machinery working):



$ mkdir subversion
$ cd subversion
$ curl -O http://subversion.tigris.org/downloads/subversion-1.5.1.tar.gz
$ curl -O http://subversion.tigris.org/downloads/subversion-deps-1.5.1.tar.gz
$ # that's the letter O btw, not a number.
$ tar xzf subversion-1.5.1.tar.gz
$ tar xzf subversion-deps-1.5.1.tar.gz
$ # build and install the bundled zlib first (see the note below)
$ cd subversion-1.5.1/zlib
$ ./configure
$ sudo make install
$ # then build subversion itself, pointing it at that zlib
$ cd ..
$ ./configure --prefix=/usr/local --with-ssl --with-zlib=/usr/local
$ sudo make install

Actually I was in something of a hurry, and some of the suggested options were causing the configure step to fail. The instructions I was following didn't include the step to build and install zlib, nor the option specifying the location of the library; configure would fail trying to locate it, and googling turned up a lot of information but not much of it useful. YMMV.


So that takes care of the $ sudo yum install subversion part. The next step is to get subversion to load at boot, which on Mac OS X is handled by launchd. I've previously written a launchd plist for PostgreSQL under OS X 10.4 where the data was stored on a firewire-attached RAID, and I have a strong recollection that it was a PITA trying to get it to wait for the RAID to be initialised before starting postgres. I went back to the freyside and used his sample plist. It was pretty close, but it didn't specify a username to run subversion under. It turns out a subversion account, _svn, already exists on Leopard. The modified /Library/LaunchDaemons/subversion.svnserve.plist is below, and the new service can be loaded without rebooting via $ sudo launchctl load /Library/LaunchDaemons/subversion.svnserve.plist.



<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN"
"http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
  <dict>
    <key>Disabled</key>
    <false />
    <key>UserName</key>
    <string>_svn</string>
    <key>Label</key>
    <string>subversion.svnserve</string>
    <key>ProgramArguments</key>
    <array>
      <string>/usr/bin/svnserve</string>
      <string>--inetd</string>
      <string>--root=/Users/dave/develop/source</string>
    </array>
    <key>ServiceDescription</key>
    <string>Subversion Standalone Server</string>
    <key>Sockets</key>
    <dict>
      <key>Listeners</key>
      <array>
        <dict>
          <key>SockFamily</key>
          <string>IPv4</string>
          <key>SockServiceName</key>
          <string>svn</string>
          <key>SockType</key>
          <string>stream</string>
        </dict>
        <dict>
          <key>SockFamily</key>
          <string>IPv6</string>
          <key>SockServiceName</key>
          <string>svn</string>
          <key>SockType</key>
          <string>stream</string>
        </dict>
      </array>
    </dict>
    <key>inetdCompatibility</key>
    <dict>
      <key>Wait</key>
      <false />
    </dict>
  </dict>
</plist>

Nearly there. I'm using TortoiseSVN in Parallels to access my subversion repository (an arrangement that is really not going to work if I use Boot Camp). I created my repository via



$ svnadmin create /Users/dave/develop/source/

Trying to access the repository using the TortoiseSVN Repo-browser from Parallels resulted in the error “expected FS format '2'; found format '3'”. It turns out subversion 1.5 has a new repository format to support some new features, and for some reason TortoiseSVN (1.5.3, Build 13783 - 32 Bit, 2008/08/30 20:59:46) on my system didn't want to talk to the new format. At this stage my repository is for personal use and I'm really OK with using the older format, so I deleted my repository and started over. Note also that the _svn account needs to be able to write to the repository; it seemed reasonable to make _svn the owner:



$ svnadmin create --pre-1.5-compatible ~/develop/source/
$ sudo chown -R _svn ~/develop/source

Now to create the root for my C# code.



$ svn mkdir svn://localhost/csharp --username dave -m "c# source"
svn: Authorization failed

Duh. After uncommenting the line # password-db = passwd in ~/develop/source/conf/svnserve.conf and adding an appropriate entry to ~/develop/source/conf/passwd, my subversion repository is alive and I can access it from Parallels using TortoiseSVN.
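For completeness, the two files end up looking something like this (the password here is obviously a placeholder):


# ~/develop/source/conf/svnserve.conf
[general]
password-db = passwd

# ~/develop/source/conf/passwd
[users]
dave = notmyrealpassword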

Wednesday, August 27, 2008

The Great Migration

We moved house 12 days ago; the house we were renting that was so great over summer turned into a cold and damp hole in winter. Moving is hard. My big mistake was assuming, given I knew there was an existing broadband connection at the new house, that it would be a simple task to switch my broadband connection from the old house to the new one. So after a week without internet connectivity, wanting to be connected to aid the Great Job Hunt, I was getting severe withdrawals and some filthy looks from my partner Anne, who had earlier suggested I should perhaps make some arrangements to move the connection. Anne's a project manager in the real world.


At the same time I had ordered a couple of gigs of RAM (to better run Vista under Parallels) and a 7200RPM 320GB drive for my MacBook Pro from Other World Computing, plus a copy of Leopard so I can run Boot Camp. Moving is hard. My big mistake was assuming I had a screwdriver suitable for removing the screws from my laptop. After locating the screwdriver in the chaos of small children and boxes, and establishing that it was not in fact suitable, I had to wait another day before tackling the delicate task of dismantling my MBP. The screws aren't really a problem as long as you get all of them out, including the two Torx screws hiding beneath the RAM cover. The problem is the clips: knowing when it's OK to keep pulling and when you should stop and have a look for those two screws you missed.


Ultimately it's all good and I'm enjoying the feeling of a lot of free space, much better performance, and a clean install. And with the restoration of our broadband connection I've been able to read an excellent article on scalability on InfoQ.

Tuesday, August 5, 2008

C# 3.0 Extension Methods

I'm getting stuck into C# 3.0 and really enjoying some of the new features. How often do you go looking for a class method to perform a specific task, only to discover the method doesn't exist, or exists but only does part of the job? Inevitably my programs contain methods I've created specifically to address this. One of the downsides of this approach is that there's often no obvious place to put such a method; it just gets stuck close to where it is used, or in a utility class if it is used in more than one location. And that is bad because it increases coupling and lowers cohesion. Extension methods are a great way to address this situation.


I have an ASP.Net application that began life as a real application. The version I use to explore new stuff now bears little resemblance to the original, but when I wrote it I was surprised to find that System.Web.UI.Control.FindControl() doesn't recurse down through the child controls. As a result I wrote the following method in the code-behind for a MasterPage:



// Depth-first search for the control with the given id, starting at Ctrl.
private static Control LocateControl(Control Ctrl, string Id)
{
   Control ctrlRet = Ctrl.FindControl(Id);
   if (ctrlRet == null)
   {
      // FindControl doesn't recurse, so search each child subtree until found.
      for (int i = 0; i < Ctrl.Controls.Count && ctrlRet == null; i++)
      {
         ctrlRet = LocateControl(Ctrl.Controls[i], Id);
      }
   }
   return ctrlRet;
}


Using extension methods I can now instead code a new class ControlExtensions:



public static class ControlExtensions
{
   // The 'this' modifier on the first parameter makes this an extension
   // method, callable as anyControl.LocateControl(id).
   public static Control LocateControl(this Control Ctrl, string Id)
   {
      Control ctrlRet = Ctrl.FindControl(Id);
      if (null == ctrlRet)
      {
         // Not found at this level; recurse into the child controls.
         foreach (Control childCtrl in Ctrl.Controls)
         {
            ctrlRet = childCtrl.LocateControl(Id);
            if (null != ctrlRet)
               break;
         }
      }
      return ctrlRet;
   }
}

Calling the original method is clunky —


Control ctrl = LocateControl(parentControl, controlId);

when compared to the new extension method —


Control ctrl = parentControl.LocateControl(controlId);

Very nice.

Saturday, July 26, 2008

Quality, Security, and Risk

Providing a quality service or application is often a sign of an organisation that pays attention to detail, and as a result there is reason to suspect it is also paying attention to security. A bit general perhaps, but if you're looking for smells, I believe the quality of an organisation's public channels can provide some indication of the quality of its internal systems and processes.


From the inside, a good indication that attention is being paid to detail is the existence of a risk program. Assessing risk in information systems is a prerequisite for both quality and security. Assessing risk is really only expensive in time, and mostly at initialisation; once established, it requires a relatively small effort to maintain. The tool I've used previously is a spreadsheet, and the process involves a round-table discussion with all involved parties to thrash out what risks can be identified in operations/systems/applications. The great bonus of this exercise is that everyone gets an opportunity to have some input in a neutral forum, and issues that have been held close can be exposed to the light.


Once an agreed list of risks has been composed, the next step is to assign values representing the impact and likelihood of each risk on a scale from 1 to 5:


Impact              Likelihood            Gross Risk
5 (catastrophic)    5 (almost certain)    E  extreme/critical
4 (major)           4 (likely)            H  high risk
3 (moderate)        3 (moderate)          M  moderate risk
2 (minor)           2 (unlikely)          L  low risk
1 (insignificant)   1 (rare)

Another table then provides a value for the gross risk assessment (see the previous table for the Gross Risk legend):


                 Likelihood
Impact      5    4    3    2    1
  5         E    E    E    H    H
  4         E    E    H    H    M
  3         H    H    M    M    L
  2         M    H    M    L    L
  1         L    H    M    L    L

The spreadsheet has a single row for each identified risk, and within that row there are columns for a description of the risk, the impact, the likelihood, and the calculated overall risk. Other columns cover accountability, the current risk mitigation measures, and management comment. It then becomes fairly obvious which issues require immediate attention and which can be treated as routine. It is also a great way of highlighting to management issues that might otherwise not get the attention they deserve, and of ensuring that if something does go wrong, management had previously been made aware of the risk, accepted it, and agreed to the mitigation measures that had been put in place: the principle of no surprises. Plus, once it is in place it only requires a regular review to continue to be effective.
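As a sketch of what that calculated overall risk column does, here is the lookup from the two tables above in JavaScript (the function and variable names are mine, purely for illustration):


// Gross risk matrix, reproducing the table above.
// Rows are impact 5 down to 1; columns are likelihood 5 down to 1.
var grossRisk = [
  ['E', 'E', 'E', 'H', 'H'],  // impact 5
  ['E', 'E', 'H', 'H', 'M'],  // impact 4
  ['H', 'H', 'M', 'M', 'L'],  // impact 3
  ['M', 'H', 'M', 'L', 'L'],  // impact 2
  ['L', 'H', 'M', 'L', 'L']   // impact 1
];

// Both arguments are on the 1 (lowest) to 5 (highest) scale.
function assessRisk(impact, likelihood) {
  return grossRisk[5 - impact][5 - likelihood];
}

assessRisk(5, 4);  // 'E' - extreme/critical, needs immediate attention
assessRisk(2, 2);  // 'L' - low risk, treat as routine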


This is not something I cooked up myself, it was a process established as part of a risk program and introduced by a third party organisation. I don't know where that third party sourced the process. I do know it is simple and effective.

Tuesday, July 22, 2008

Dropping IE 6

So 37signals is reporting on Apple's decision to drop support for IE 6 in MobileMe. I'm sure there are many who would like to drop IE 6 from a great height. I will confess that in my current position as a web developer for a NZ bank, I have users connecting with IE 4 and NN 4 browsers. This is a situation I inherited and one I'm happy to relinquish when my notice period finishes in a couple of weeks' time. One of the major issues is that there is no way to correlate the user with the browser. This means there is no way to contact users of older browsers and assist them to upgrade, and as a result the bank's sites are all lowest-common-denominator HTML. Of course, to get into this situation in the first place required a hefty portion of mediocrity, which included very little cross-browser testing. The average user experience is likely to be fairly average, but only the user knows.


Which brings me to my prior position, also at a NZ financial organisation, where we did take a lot of care with the sites we published. There we also had users connecting with older browsers, but we kept a record of which browsers each user connected with. As usage of older browsers dropped away we found we were able to contact the typically small (<100) number of users affected and arrange an upgrade. Feedback was generally really good; some folk were happy to have an excuse to upgrade, some folk had access to a supported browser at work (and sadly one old chap conveniently popped his clogs).


Which left us still supporting users of IE 5.0 and Navigator 6.2. There are some inconvenient issues with DHTML on those browsers and they require a little extra work. Fortunately there is an enormous body of knowledge around what those issues are and how to work around or avoid them. We didn't need to be at the cutting edge, which is where a lot of cross-browser issues can be crippling. I personally wanted to provide a good user experience for all our users, and with a small team that was very doable. Looking to the future, I mandated the use of web standards and high quality JavaScript, and as the usage of older browsers drops away it all just gets easier.


I can appreciate that Apple wants absolute control of the user interface design, and requiring a modern browser must make that easier. I had a quick look at the page info for MobileMe; there's a lot of stuff coming down the pipe to your browser, far more than I would allow for my web applications. Most of my users are still on dial-up and the user experience would be dreadful.