Monday, 28 April 2008

My personal list of must-install apps

For the umpteenth time, I reinstalled Windows today on armitage (my desktop). For the first time I did it from a flash drive, since my DVD drive was on the fritz. I considered buying a new one, but after spending quite a lot upgrading the computer, I felt some effort could be spent trying out the USB method. Much to my amazement, it turned out to be quite easy. This link explained everything, and it's actually quite straightforward. But that's not the point here; this is:

Synergy

A small application which allows one computer to control many others over the network. A godsend for those of us with both a laptop and a desktop, and a desire to have more screens than a stock market analyst.

ClipX

A kill ring for Windows. Before, I had to remember to open a Notepad window, paste whatever was on the clipboard that I'd need later, copy something else, go back to Notepad, rinse, repeat. How in the nine circles of hell I lived like that, I don't know. But that way is no more. ClipX remembers the copy operations you make, and has a pretty intuitive way to go back and forth between them.

Launchy

I understand the need for a start menu. I just don't understand why I must use it every single time I want to open a browser, Visual Studio or anything else. This one takes care of that. Not as good as Quicksilver, but it's good enough for me not to rip my eyeballs out while trying to get stuff done.

Dexpot

The sanest virtual desktop manager for Windows. Works extremely well almost all the time, and stays out of your way.

Friday, 25 April 2008

Dual core ftw!

One of my small gripes with games is that you usually can't alt-tab out of them. This means lost time whenever you happen to press a key that brings up the start menu, or a modal dialog pops up (yes, I'm looking at you, Windows Firewall), or you just need to pause out of a game to check on something else.

Well, I think that's a problem no more!

[Screenshot: Task Manager]

Given a dual-core processor and the propensity of games to be single-threaded, I can now alt-tab out of a game, do what I must and return to it without having to spend half an hour waiting for it. On the other hand, 2 GB of RAM almost seems too little nowadays...

Wednesday, 23 April 2008

And for my next trick, let's Mingle

 

ThoughtWorks has recently released version 2.0 of their agile project management tool, Mingle. Apart from some new features and ways to manage the large amount of information a project usually produces and consumes, version 2.0 brings one very interesting capability: a REST API, accessed over HTTP.

After spending about an hour figuring out that "basic_auth_enabled: true" is different from "basic_auth_enabled:true" (turns out the space after the colon matters), I was at last ready to work on a small app to use as a dashboard for a small "game" we play at weListen.

And what better way to do it than to try out the new features of .NET 3.5 (and beyond), including LINQ and the ASP.NET MVC framework? Since this will be an internal project, it's an excellent chance to try them out.

For reference, here's some example code to fetch the list of users from Mingle:

public static List<User> getUsers() {
    String url = "http://mingle.example.com/users.xml";
    HttpWebRequest req = (HttpWebRequest) WebRequest.CreateDefault(new Uri(url));

    String username = "john.doe";
    String password = "secret";

    // Mingle's REST API uses HTTP basic auth, so build the
    // "Authorization: Basic <base64(user:pass)>" header by hand.
    byte[] credentials = Encoding.ASCII.GetBytes(username + ":" + password);
    String base64Credentials = Convert.ToBase64String(credentials);
    req.Headers.Add("Authorization",
                    string.Format("Basic {0}", base64Credentials));

    // Load the response XML and project each <user> element into a User object.
    using (StreamReader reader = new StreamReader(req.GetResponse().GetResponseStream())) {
        var users = from node in XDocument.Load(reader).Descendants("user")
                    select new User
                               {
                                   Username = node.Element("login").Value,
                                   Id = int.Parse(node.Element("id").Value),
                                   Name = node.Element("name").Value,
                                   Email = node.Element("email").Value
                               };
        return new List<User>(users);
    }
}

Most of it is rather straightforward, taking advantage of LINQ to iterate through the XML, fetch each user's data and transform it into a business object. The User class here is just a container for the information, for now, and I use C#'s new object initializer syntax mainly to get a feel for it. LINQ seems to me like a variant on Python's list comprehensions, which is a good thing. It creates terse code, and abstracts away the loop to focus on what you do with the information. To me this is a huge gain in C#'s expressiveness.

For me, the trickiest part was getting the authorization right. I now wonder whether the problem was really with the code or with Mingle not being correctly configured. I'll try a couple of other approaches to see if we can move away from encoding the header manually and use .NET's credentials API instead.
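For what it's worth, the alternative I plan to try looks roughly like this: an untested sketch (reusing the url, username, password and req variables from the example above), assuming Mingle plays along with the standard basic auth handshake:

// Untested sketch: let .NET handle basic auth instead of building the header by hand.
CredentialCache cache = new CredentialCache();
cache.Add(new Uri(url), "Basic", new NetworkCredential(username, password));
req.Credentials = cache;
// Ask for the credentials to be sent up front instead of waiting for a 401 challenge.
req.PreAuthenticate = true;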

Tuesday, 22 April 2008

Page fault accessing Mocks

I realised how little I know about mocking, stubbing et al. as soon as I read the quickstart for Moq. The main problem was the domain language.


Expect? What the hell... Does the mock expect anything? Or do we expect something from the mock? Perhaps I can gain some more insight by reading about other mocking frameworks...


Clicking a link about Rhino Mocks showed me some more code, and a clearer mental image started to form. Still, I only gained a faint idea of what Expect means, and picked up two more keywords I knew nothing about: ReplayAll and VerifyAll.
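As far as I can piece it together from the Rhino Mocks examples (so this is a sketch of my current understanding rather than gospel, and the IUserService interface is invented for the example), the record/replay cycle goes something like this:

public interface IUserService {
    string GetUserName(int id);
}

// Inside a test:
MockRepository mocks = new MockRepository();
IUserService service = mocks.CreateMock<IUserService>();

// Record: declare which calls we expect and what they should return.
Expect.Call(service.GetUserName(42)).Return("john.doe");

// Replay: from here on, calls are checked against the recorded expectations.
mocks.ReplayAll();

// This call would normally happen inside the code under test.
string name = service.GetUserName(42);

// Verify: the test fails if any recorded expectation was never met.
mocks.VerifyAll();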

Previously I had encountered and used the concept of a mock when dealing with tests and database connections. Not wanting to set up a whole database with test data just to make sure a method is implemented correctly, I ended up writing custom mocks which implement an in-memory version of the database, returning test values when certain methods are called. Useful? Yes. Practical? No, not really. Different tests had different needs, and the choice became either to implement different mocks, or to have one mock support a bunch of tests. Not the most efficient way to do it, as is now a bit clearer to me.
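To make that a bit more concrete, the kind of thing I was hand-rolling looked roughly like this (the interface and class names are invented for the example, and User is just a plain data class):

public interface IUserRepository {
    User FindById(int id);
    void Save(User user);
}

// Hand-rolled in-memory stand-in for the real, database-backed repository.
public class InMemoryUserRepository : IUserRepository {
    private readonly Dictionary<int, User> users = new Dictionary<int, User>();

    public void Save(User user) {
        users[user.Id] = user;
    }

    public User FindById(int id) {
        return users.ContainsKey(id) ? users[id] : null;
    }
}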

So, here's to a cup of juice, feet up and some light reading! I feel it is just the beginning...

Monday, 7 April 2008

Linux sysadmin'ing tip of the day (again)

On the theme of clocks, virtual machines and just not getting it.

So you've got a spankin' new server, one with humongous disk space, lots of RAM and CPUs in the double digits. What's the first thing you think? "We can put all sorts of machines here and there'll still be space for a fool-around Linux box to try new things."

Yes, the joys of virtualization. In this case, a CentOS machine running Xen. Installation was simple enough, and within the first hour I already had a virtualized Linux and was starting the process of installing a Windows machine to serve as a development playground for everyone. By the second day I had already forgotten about the Xen installation, since its purpose was being fully served as the host for the many (okay, just 3...) virtual machines installed on top.

Fast forward a few weeks, and here I am trying to figure out why the bloody server is one hour ahead. ntpd is running, I can see the messages telling me "yes, the time was a bit ahead, and we've got it right now", but the time was still one hour ahead and it was not right. Damned time zones, I think! And so (naively, as is clear to me now) I set the time zone from Europe/Lisbon to GMT in an attempt to make the system think it's one hour behind. And it works.

Until today.

After committing a couple of files to the server, and before packing up for home, I check the integration server to see if everything is okay and the build is on its way to a green icon. Nope, still green. Last build time... yesterday. Strange. Picking through the logs, I find out the server isn't picking up the latest change in the source. And that's when I notice that the commit emails fired from Subversion come out one hour ahead. Again. Back I go, logging in to the server and trying to figure out why the hell the clock kept going back to being an hour ahead.

Perhaps ntpd isn't working properly? Was it syncing to a bad server? Was the daemon not running correctly?

A peek at the logs shows ntpd trying to sync the time, time and time again, without success and with no hint as to why it wasn't working. Okay, let's try setting the date by hand. Good, it works. And now it's back to normal. Huh?! Why... did the date... change by itself? Hardware clock?

Oh, wait... hardware clock... in a xen environment?

Yes, ladies and gentlemen, the answer was right before our silly little noses. The hardware clock, in a Xen virtualized environment, is managed by the host (dom0 in Xen parlance), and unless told otherwise via an obscure flag it stays that way, not allowing changes from the guest environments.

And so, all that was needed was for the host machine to have the correct time, and all was well in the land. Setting the correct date and time on the host also sets the correct date and time on the guests.

Sunday, 6 April 2008

Linux sysadmin'ing tip of the day

When you install a brand new Linux machine, one of the things that is usually set up for you is a daily report on the system, sent by email to the root user.

If you're like me, you only notice it a month later when you've got a spool file with 30 emails or so.

And now, for the tip.

Two steps to solve the spool problem. First, set up a .forward file in root's home directory with your email address.

echo (your email) > ~/.forward

will take care of that quite nicely.

Test the config by sending a mail to root.

mail -s test root

will send the message as soon as you press Ctrl+d.

With the mail going to the correct address, the next step is to flush all those mail messages back through the mail pipe so you receive them on your email account.

formail -s /usr/lib/sendmail (your email) < /var/spool/mail/root

will do that for you. All that's missing is to clean up the queue, since you've got those messages out of the system.

cp /dev/null /var/spool/mail/root

is what you're looking for.

And that's all for today!

Saturday, 5 April 2008

Useful surprise of the day

With PuTTY you can actually create new SSH tunnels on an existing connection by changing the settings of the connection and pressing Apply. And here I was, dropping and recreating the connection like an idiot...