Sunday, 7 December 2008

2 more apps I end up installing on all computers I use.

Foldershare, set to sync my utils folder, which includes putty, teracopy, some gnu command line utils, a couple of scripts to start and stop services, .net reflector, keepass binary, notepad2, and some other portable apps that I end up collecting over time.

Dropbox, mostly for the public folder. The explorer/finder integration is a wonderful piece of magic. Dropping a file in the public folder and using the "Copy the public link" item on the context menu ended up being my favorite way to share files with the world.

Wednesday, 26 November 2008

First shot at a couchdb installer on windows

Disclaimer: this is for the brave folks who wish to try things out. So far I've only tested the installer on my own machine and on a clean Windows XP vmware image.

Now for the running bits: here's a first try at getting a workable couchdb installer on Windows.

There were some instructions on the wiki on how to get all the bits and bobs running together, but parts of it required a C compiler and some rather tedious setup on the part of the user. It was also a bit fragile, as evidenced by my spending a couple of afternoons trying to get the unit tests to pass.

The biggest problem was that Spidermonkey must be compiled in a specific way (one that wasn't apparent to me at first), a step I'd initially dismissed because the blog containing the instructions was down. Also, one must use lowercase paths in the config file, otherwise erlang will choke when trying to open the database files.

The next steps would be to create a shortcut on the start menu, get some sensible text on the installer, do some error checking to find out whether the prerequisites are installed, and create an uninstaller.

For now, here's the installer for the world to try. Leave a comment if you find a problem or have a suggestion.

Tuesday, 28 October 2008

Thank you guys!

My great friends and colleagues at weListen just offered me these two wonderful sets of DVDs for my birthday:

birthday presents

These are Q&A sessions Kevin Smith does at packed theaters with thousands of people, where he talks about everything and anything. They include some inside jokes for his Askewniverse fans, and each disk is about an hour and a half of him just having fun with the audience. And that's the brilliance of it. It really is just a dude with some great stories having fun with people who dig his work, and he just goes with the flow, bringing people from the audience up on stage for whatever reason strikes him at the time, or going on a binge discussing Prince's flamboyance.

For anyone who's a fan of Kevin Smith's movies, these DVDs are a must-see. And I really thank the guys at work for the gift. :D

Friday, 17 October 2008

Unexpected change

Got home at 11pm. General slouch mode before bedtime engaged. Open firefox for one last news feed round and find a large change on iGoogle.

My iGoogle page is "designed" to fit on one 20" screen at 1680x1050 or thereabouts. I want to open it, scan it quickly for any big news and move on. Now it takes two screens, with a short description for each feed item (as most of the widgets are based on newspapers' and tv channels' feeds), which kind of defeats the purpose. Also, the tabs moved to the left, which may be a good option for today's wide screens, but it looks a bit odd having lots of negative space on a vertical bar.

Thankfully a quick trip to the tab's settings showed me a way to remove the new feed behavior and return the page to a more sane, no-scroll, quick-scan mode. It also revealed a manual backup for the iGoogle page, plus settings export. It may pay someday to add these settings to the backup jobs at home.

Sunday, 12 October 2008

My first Visual Studio Plugin - CullWindows

okay, that was hell...

In the .net world I usually work with the wonder tool that is Resharper. For this particular story, the main points are the hugely useful navigation shortcuts: Go to symbol, Go to file, Go to type, Navigate to implementation, Navigate to base, you get the picture.

So, the Solution Explorer window usually doesn't get much use, but the tab strip on top gets cluttered beyond recognition after just half an hour or two of work. And while most of the source navigation is done via shortcuts, sometimes it's handy to just click on the respective tab. Just try and find it amidst 20 other files.

Hence, my idea to extend Visual Studio with the simple ability to keep open just the top X files I use.

Meet CullWindows. Direct download link here, google code home page here. A simple solution to a simple problem. It was the implementation that was hell.

The Plugin

The plugin's logic is quite simple. Keep track of which files are opened for editing. When the user views a file, see if we've hit the limit. If we have, close documents, from last to first, until we're back on the limit. Ignore unsaved files, and things that are not files.
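
In python-flavored pseudocode (the plugin itself is C#, and these names are mine, not the VS API's), the idea is roughly:

```python
# A sketch of the culling logic: an MRU list of open documents, culled
# from the least recently used end whenever the limit is exceeded.
# `is_dirty` and `close` stand in for the Visual Studio calls.

def view_document(mru, doc, limit, is_dirty, close):
    """Record a view of `doc` and close least recently used documents."""
    if doc in mru:
        mru.remove(doc)
    mru.insert(0, doc)  # most recently used first
    for candidate in reversed(mru[:]):  # oldest first, over a snapshot
        if len(mru) <= limit:
            break
        if is_dirty(candidate):
            continue  # never close unsaved files
        close(candidate)
        mru.remove(candidate)
```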

The limit is configured on the Visual Studio Options panel, under Cull Windows.

The installer is "dumb", meaning it won't give you any feedback; it just installs and goes away. I'm sorry for that.

For now the plugin is Visual Studio 2008 only; there's no version for 2005, mostly because I don't use it. If there's any demand, I may give it a try.

The Implementation

Firstly, I created an add-in. Seeing an example that came with the SDK, I managed to get a reference to the Running Documents Table, which contains all currently opened documents, and registered an event sink to listen for changes in the table.

This was all good until I got home and tried the same code on another computer. There, the table was nowhere to be found, and I could find no error. I could've tried to debug it for some time, but it seemed better to just start over.

Next up, a VSPackage. From what I had gathered, it is a "new" and more powerful way to extend VS, and the API and samples seemed to be a bit more OO. I managed to get it to work, again, but now the debugging is a bit stranger than when it was an add-in, since the package now needs to be installed in the registry hive for the IDE.

Most of the interaction is a bit nasty for someone used to clean, object oriented APIs. It seems most of the extensibility points in Visual Studio are exposed via COM, which is a beast I've encountered few times before, and I don't keep fond memories of those occasions. It may be that I'm too new to working with it, but the resulting code wasn't exactly... pleasant. Feel free to browse the code. It isn't pretty, especially the first iterations, but I'm open to suggestions and criticisms.

Now, the API is HUGE. There are hundreds of interfaces in each of the Visual Studio namespaces, and trying to get from one point to another is somewhat tiring. I have a document cookie from the event handler; now how do I get the name of the file? Okay, I have the name; now how can I find out if it is modified? Where do I keep the preferences, and how do I access them? There were some concepts I lacked, but mostly it was just me being a newbie at it and not knowing where to look.

The solution ended up being simply 3 files: the package, the document monitor and the options page.

The package was built mostly by the VSPackage template, and the only things I changed were to remove the menu item, add the options page and build the document monitor. Some of the attribute values were a matter of faith in the documentation, and I didn't dare touch most of the generated code.

The document monitor does the grunt work: maintaining a document list ordered by access date, listening to the running document table events, and closing documents when the limit is reached. The first implementation fetched the window frames from the UI shell and iterated through them to find the one with the file to close. Not optimal by a long shot, but it worked. A second (and final) iteration used the IsDocumentOpen method from VsShellUtilities to get the window frame. This was not intuitive: I expected a method starting with Is to just return true or false, not to return more information.

The options page was actually the easiest part, with all of the work taken care of by the DialogPage class, including persistence. It's a bit too much magic for my tastes, but it works. I was supposed to access the options via DTE, but it kept throwing an invalid cast exception, so I gave up and just passed the options page object to the monitor.

The Next Steps

The first implementation is done, but I think I can make it a bit smarter if I have the time:

  • Ponder the amount of time spent on a given file when picking which one to remove. This would prevent newer files opened by mistake from staying open while earlier, more heavily used files go away.
  • Keep a background timer to cull windows after some time with no visits to the document.
  • Have the installer say something instead of just installing and going away.

I guess that's all for now, thanks for tuning in.

Monday, 29 September 2008

More Buxfer goodies (this time, a backup script)

I just realized that I'm putting a fair amount of effort (and information) into Buxfer. I've been tracking my expenses rather faithfully for a month, and I'm rather happy with the service.

But what if it someday fails, or disappears? What about the data? It has a data export facility, but it's for Pro accounts only. Okay, it seems either I misread the membership plans, or they changed them. Either way, the data export is good for when you want to take a look at the statements in excel, but as far as backups go, it's no good for me.

A backup must be something I don't have to think about for it to work. It should do its job in the background as much as possible, and only warn me if something goes wrong. Thankfully Buxfer's API has enough functionality to cover this.

This is a small script that fetches all transactions from your Buxfer accounts (or one particular account) to disk. It has some smarts to "continue" a previous backup, so you can set up a scheduled task to fetch the transactions periodically, and after the first run it should only fetch the new ones.
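
The "continue" logic boils down to something like this (a simplified sketch, not the script itself; fetch_page stands in for the Buxfer API call, and transactions are assumed to arrive newest first):

```python
import json

def new_transactions(fetch_page, saved_lines):
    """Page through the API, stopping once we reach already-saved data."""
    seen = set(json.loads(line)["id"] for line in saved_lines)
    fresh, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        unseen = [t for t in batch if t["id"] not in seen]
        fresh.extend(unseen)
        if len(unseen) < len(batch):
            break  # hit a transaction from a previous run; we're done
        page += 1
    return [json.dumps(t) for t in fresh]
```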

So far it stores the info as pure json, which was the easiest way to implement it. Should I need to actually act on this in the future, I can parse the json again and convert it to csv, or any other format I need.

The source is here, and a skeleton configuration file is here.

To use it you'll need Python (I tested it with 2.5, but I think it should work with 2.4) and simplejson.

Configuration should be self-explanatory: it needs you to fill in your username, password, and a filename for the backup.

To run it, you can pass the path to the configuration file (this way you can have several different configurations, one for each account, or for different users).

I guess this is it. Any questions, feedback or general chat, just leave a comment.

That's all, folks!

Friday, 26 September 2008

v1.2 of BuxferSubmit

Version 1.2 is up at the project's page.

Direct link


  • Uses jQuery for the async request.
  • Stores the user's password in the user's keychain (thanks to Stan Lemon for the code and plugin)
  • Minor interface tweak (pressing enter when on the tags text field submits the transaction)

That's all for now. I'd love to get some tag auto-complete, but getting a good interface for it has proven difficult.

Wednesday, 10 September 2008


on the topic of forced rest time

I actually like waiting for someone at the airport. Of course, this may have something to do with the fact that it only happens once in a while, that when it does I've got my trusty macbook with me, and that I'm not so eternally busy that I can't spare some time to sit at a coffee shop with wireless.

Not that I do much work, or read all that much, or have a positive contribution to give the world while I'm sitting here. I actually spend most of the time just enjoying people going to and fro or milling about while waiting, like me, for someone else. The coffee shop I'm sitting in has about 32 tables, and it's directly in front of the arrival gate. There are 12 people here, all of them sitting the same way, having a drink, chatting on the phone or just waiting with a bit of a blank stare.

And I'm talking about a mildly peripheral airport like the Lisbon one, which I bet doesn't have one tenth of the cultural cross section of a Heathrow, or an LAX. On the other hand, because it's a smallish one, I can sit at the coffee shop and view most of the arrivals, and it doesn't sound hectic or too crowded. It's just a bunch of people. Waiting.

And you can play a small game while waiting. Try to think of all these people, every single one of them, as a distinct individual, not just "people". With needs, thoughts, desires, passions, pains and hungers. All of them have a single shared purpose here, but that may be the only thing they have in common. Imagine the diversity. The blonde lady, dressed in a white short sleeve, who passes you? Is she coming or going? Where to? The couple (or so you assume; it's a man and a woman, they could be siblings, or just friends) with the large backpacks. Travelling the world, vacations or volunteer work? The black family, waiting for a relative? Perhaps a son, gone abroad to meet some friends. Of course, the suit, lugging his laptop luggage, blackberry in hand, talking and looking just a bit lost, almost (just almost) as if it's the first time he's here.

And then it starts thinning out. More people are leaving than arriving. There are fewer flights. And now it's my turn to go. My charge has arrived.

Tuesday, 2 September 2008

v1.1 of BuxferSubmit

For the impatient:

New version. Should not freeze if Buxfer is slow to respond or if you forget to plug in the cable. Also some better handling of special characters while talking to the server.

Download here.

For the curious:

Buxfer seems to have had a bit of trouble with site slowness, which exposed my cluelessness in developing widgets that connect to web services (as in, I forgot to consider timeouts on XMLHttpRequests).

I was a bit amazed to find little info either on Apple's site or on the web about how best to handle these problems. I ended up just setting a timeout before the request, clearing it on success, and canceling the request should the timeout fire. Perhaps I should look into a javascript library such as mochikit or jquery to help with the heavy lifting.

Also, most of the code feels to me a bit fragile. The error handling is primitive, and the fact that the request is async makes me a bit queasy. I'll chalk that up to immaturity on my part in developing in javascript, but I hope to have the time to refactor it a bit.

As I said before, the code is up on google code, so feel free to browse around, all three of you who are reading this :)

Saturday, 30 August 2008

Google Code page for the widget

Well, instead of just having a zip file containing the source code, I decided to put it up on google code. The page is here and I've updated the homepage to include a link to it. The releases will live there too, and I think I'll look into getting the widget to autoupdate or something; with the files on google code I think I can get an rss feed of the releases.

I'll keep y'all posted.

Buxfer widget

If you follow my twitter feed, you may have noticed that I’ve created and published a small widget to use with buxfer here. Right now it’s just two forms, one to submit and one to configure, and it isn’t all that smart at what it does.

My intent is to keep the interface simple, but to improve on three points:
- Usability-wise, I want to be faster at adding the transaction, and perhaps have some suggestion or completion for the tags. Right now it is a bit dumb, and doesn’t provide much help besides being a form.
- As far as the interface goes, I want it to occupy a bit less space, and to be prettier. As you can see on the screenshot, the widget uses the standard parts and the labels have the default size. I think I can do better than that and still keep it usable.
- The configuration is stored on the preferences cache for widgets Apple provides, which isn’t quite the best choice as far as storing sensitive data. I should put it in the keychain, and perhaps even use the same as Safari.

I was quite impressed with the ease with which you can produce widgets. Having Dashcode helps a lot, both with the layout of the forms and the debugging of the code. Widgets being based on javascript and html gives them a low entry cost if one is already used to client-side web programming, and opens up the opportunity for some rather interesting moves, such as using JQueryUI or Moo on the widget.

If you’ve tried the widget and want to leave feedback, please, feel free to contact me or to drop a comment here :)

Monday, 21 July 2008

Perhaps extra keys on a keyboard aren't that bad an idea...

on the newfound use of useless keys

Whilst my work is nowadays mostly done on my macbook (running windows, much to my lament), at home I usually drop the laptop beside my desktop, boot it into mac os x, and use it as a communications central with email and IM open while working on something else on the desktop. Call it a strange dual-head, quad-core computing station.

Instead of switching keyboards when I need to pass from one computer to the other, I use (and profoundly adore) synergy, which allows one keyboard+mouse combo to control more than one computer via the network. But until now, to change from one computer to another (say, to talk to someone on IM) my hand had to leave the home keys and reach for the mouse, since synergy uses the edges of the monitor to cross control from one computer to the other.

But no more! Remembering a button labeled "hotkeys" in the configuration manager, I managed to bind two keys, one to move left, the other to move right. But then the dilemma... which keys?

I didn't want a hard, multiple-key, emacs-like binding. I was looking for a short stroke to allow quick back-and-forth, since that was the most common scenario (coding on the desktop, going to the laptop to spout nonsense on im, and back again to serious business).

Enter the most useless keys on my current keyboard, a logitech internet navigator. A fine keyboard all in all, with a wheel on the left side (which I tend to use quite often, as a matter of fact) and two keys below it, go and back.

Previously useless keys

Bind go to move left, back to move right, and we've got a pan-computer Alt-Tab.


Saturday, 5 July 2008

Using capistrano to deploy a python application

At weListen we have a small python web application that we use as a time tracker and status keeper. Something akin to twitter, but with less social and more tracking. It lives on a Linux server together with a couple of our other services, and tends to be something I work on from time to time to improve small details or fix small bugs.

Now, most of the time the development of this app is rather iterative. I think up something new to try or a small improvement, code it on my local workstation, test it with live data taken from the production environment and, if I'm happy with the results, commit the change and deploy it on the production server. It's a small app, with a couple of services running, so the upgrade protocol is direct. It's still a couple of steps, though, and sometimes I tend to forget one of them (usually the one where I refresh the source code from subversion).

So, that looks like a good excuse to try out capistrano, of which I've heard many things, mostly coming from the ruby on rails community.

In a nutshell, what I wanted was for a library to take care of the connection and execution of commands on remote servers with minimal fuss. And that's what I got.

The Good

After installing the one-click version, ruby was set up and ready to go. The gems library was already included, and installing capistrano was a matter of invoking the stanza featured on the project's home page.

Creating a script to perform the same steps I previously did manually was straightforward enough, and rather "obvious", apart from the trick to get sudo not to complain:

task :update, :hosts => "<server>" do
  default_run_options[:pty] = true # required so that sudo doesn't complain
  run "svn update ~/<directory>"
  sudo "/etc/init.d/<service 1> restart"
  sudo "/etc/init.d/<service 2> restart"
end

The server can use the standard username@host:port format, and the ssh framework plays nice with pageant, making passwordless update runs real easy.

The Bad

No documentation, apart from a getting started tutorial, which is completely geared to deploying ruby on rails applications according to their methodology. Both seem to be known problems, as a quick Google search for "capistrano documentation" gives some pages with a call for help and a mailing list post about it, and the getting started page itself warns that it's devoted to ruby on rails.

Still, it leaves the rest of the world in the dark about how to use cap to deploy other kinds of applications. Also, there isn't much in the way of explaining what their methodology is, or how a user can skip or customize some of the steps. I ended up just running the commands with no automatic error checking. Which works for now, but leaves me a bit unsure as to how solid it is.

The Ugly

Since I use a private/public key combo to login to the server with a non-root user (standard Linux sysadmin practice), and sudo requires a password by default, the script didn't work out as well as I wanted it to on the first try. I could either type the password each time I wanted to deploy the app, get ssh to pass through my private key from pageant to sudo, or tell sudo that restarting the services doesn't require a password.

I ended up compromising and going with the last option. It means that an attacker who gains access to the server via the normal user can restart the services and do a bit of damage, but if he's already inside, then a couple of services going down is the least of my worries.


Had I already been using ruby, adding capistrano to the list of dependencies would not have been a big deal. The library is small and easy to install using gems. The problem was simple, and with the right tools the solution was equally direct. The lack of documentation wasn't that big a deal, but mostly because the problem was small.

Sadly, capistrano is the only reason for me to have ruby installed, so far, making the dependency a large and difficult one to explain. I might have a look in the future for python based alternatives.

Also, I'm starting to wonder if I could use the idea behind capistrano to deploy windows based applications. The biggest problem would be the remote connection, but I believe that Windows Server 2008 already has some support for console based remote connections. If this were possible, deployments could be more easily automated, which would most likely reduce the overhead of getting a new version of a web based application live.

Wednesday, 7 May 2008

The biggest disadvantage to a coffee machine at home...

... is the period while you adapt to having a coffee machine at home.

The one where you learn when not to drink coffee.

The same one where you end up awake at 6am because you just had to get a cup of delicious, tasteful, awesomely scented java after dinner.

Live and learn...

Friday, 2 May 2008

Python, windows, sockets and Ctrl-c

One of the minor annoyances of python on windows has to do with the sockets' blocking behavior.

Python's libs have several implementations of simple server loops, where the server listens for connections or specific requests and passes them on to a handler function in your code. To do so, the server blocks on an accept or read. Which, on Windows, means you can't ctrl-c out of a server process to test a code change.

To get around this little quirk I found a small workaround. We need two tools:

  • PsKill to kill the python process
  • A second command prompt to run it from (I use Console to manage those)

Wrap the python script in an infinite loop outside of it (a batch file that runs the script and loops back with a :loop/goto loop pair, say). Now, when you want to reload the server, just run "pskill python" from the other prompt. Sadly this kills all python sessions currently running, which may not be what you want. But for now it works well enough.
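
For completeness, a code-side alternative I've been pondering (my own sketch, not something from python's docs or from the workaround above): instead of killing the process, give the blocking accept a timeout so the loop wakes up periodically and Ctrl-C gets a chance to be processed:

```python
import socket

def serve_interruptible(server, handle, poll=0.5):
    """Accept loop that wakes every `poll` seconds so Ctrl-C can get in."""
    server.settimeout(poll)
    while True:
        try:
            conn, addr = server.accept()
        except socket.timeout:
            continue  # nothing yet; looping here lets KeyboardInterrupt fire
        handle(conn, addr)
```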

Monday, 28 April 2008

My personal list of must-install apps

For the umpteenth time I re-installed windows today on armitage (my desktop). For the first time I did it from a flash drive, since my dvd reader was on the fritz. I considered buying a new one, but after spending quite a lot upgrading the computer, I felt some effort could be spent in trying out the usb method. Much to my amazement it turned out to be quite easy. This link explained everything, and is actually quite straightforward. But that's not the point here, this is:


Synergy

Small application which allows one computer to control many via the network. A godsend for those of us with both a laptop and a desktop, and a desire to have more screens than a stock market analyst.


ClipX

A kill ring for windows. Before, I had to remember to open a notepad window, paste the clipboard content I'd want later, copy something else, go back to notepad, rinse, wash, repeat. How in the 9 circles of hell I lived like that I don't know. But that way is no more. ClipX remembers the copy operations you make, and has a pretty intuitive way to go back and forth.


I understand the need for a start menu. I just don't understand why I must use it every single time I want to open a browser, visual studio or anything else. This one takes care of that. Not as good as Quicksilver, but it's good enough for me not to rip my eyeballs out while trying to get stuff done.


The sanest virtual desktop manager for windows. It works extremely well almost all the time, and stays out of your way.

Friday, 25 April 2008

Dual core ftw!

One of my small gripes with games is that you usually can't alt-tab out of them. This results in lost time when you happen to press a key which shows the start menu, or a modal dialog pops up (yes, I'm looking at you, windows firewall), or when you just need to pause out of a game to check something else.

Well, I think that's a problem no more!

task manager.jpg

Given a dual-core processor and the propensity of games to be single-threaded, I can now alt-tab out of a game, do what I must and return to it without having to spend half an hour waiting. On the other hand, 2 GB of ram almost seems too little nowadays...

Wednesday, 23 April 2008

And for my next trick, lets Mingle


Thoughtworks has recently released version 2.0 of their Agile Project Management tool, called Mingle. Apart from some new features and ways to manage the large amount of information a project usually produces and consumes, version 2.0 brings about one very interesting capability: REST APIs, accessed via http.

After spending about an hour figuring out that "basic_auth_enabled: true" is different from "basic_auth_enabled:true", I was ready at last to work on a small app to use as a dashboard for a small "game" we play at weListen.

And what better way to do it than to try out the new features of .net 3.5 (and beyond), including LINQ and the ASP.Net MVC framework. Since this will be an internal project, it is an excellent chance to try them out.

For reference, here's some example code to fetch the list of users from mingle:

public static List<User> getUsers() {
    String url = "";
    HttpWebRequest req = (HttpWebRequest) WebRequest.CreateDefault(new Uri(url));
    String username = "john.doe";
    String password = "secret";
    byte[] credentials = Encoding.ASCII.GetBytes(username + ":" + password);
    String base64Credentials = Convert.ToBase64String(credentials);
    req.Headers.Add("Authorization",
                    string.Format("Basic {0}", base64Credentials));
    StreamReader reader = new StreamReader(req.GetResponse().GetResponseStream());
    var users = (from node in XDocument.Load(reader).Descendants("user")
                 select new User {
                     Username = node.Element("login").Value,
                     Id = int.Parse(node.Element("id").Value),
                     Name = node.Element("name").Value,
                     Email = node.Element("email").Value
                 });
    return new List<User>(users);
}

Most of it is rather straightforward, taking advantage of linq to iterate through the xml, fetch each user's data and transform it into a business object. The User class here is just a container for the information, for now, and I use c#'s new object initializer syntax mainly to get a feel for it. Linq seems to me like a variant on python's list comprehensions, which is a good thing. It creates terse code, and abstracts away the loop to focus on what you do with the information. To me this is a huge gain in c#'s expressiveness.
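
As an aside, here's the python comprehension analogue I have in mind, just to show the resemblance (ElementTree stands in for XDocument, and User is reduced to a namedtuple for illustration):

```python
from collections import namedtuple
import xml.etree.ElementTree as ET

User = namedtuple("User", "username id name email")

def get_users(xml_text):
    # Same shape as the linq query: iterate the <user> nodes and
    # project each one into a business object.
    return [User(node.findtext("login"),
                 int(node.findtext("id")),
                 node.findtext("name"),
                 node.findtext("email"))
            for node in ET.fromstring(xml_text).iter("user")]
```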

For me, the trickiest part was getting the authorization correct. I now wonder if the problem was really with the code, or with mingle not being correctly configured. I'll try a couple of other approaches to see if I can move away from manually encoding the header and use .net's credentials api instead.

Tuesday, 22 April 2008

Page fault accessing Mocks

I realised how little I know about mocking, stubbing et al. as soon as I read the quickstart for moq. The main problem was the domain language.

Expect? What the hell... Does the mock expect anything? Or do we expect something from the mock? Perhaps I can gain some more insight by reading about other mocking frameworks...

Clicking a link about Rhino showed me some more code, and a clearer mental image started to form. Still, I'd only gained a faint grasp of the meaning of Expect, and now had two more keywords I knew nothing about: ReplayAll and VerifyAll.

Previously I had encountered and used the concept of a mock when dealing with tests and database connections. Not wanting to set up a whole database with test data just to make sure a method is implemented correctly, I ended up writing custom mocks which implement an in-memory version of the database, returning test values when certain methods are called. Useful? Yes. Practical? No, not really. Different tests had different needs, and the choice became to either implement different mocks, or to have one mock support a bunch of tests. Not the most efficient way to do it, as it is now a bit clearer to me.
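
To pin the vocabulary down for myself, here's a toy version in python (my reading of the idea, nothing to do with moq's or Rhino's actual internals): expect records the calls the mock should receive, and verify checks that they actually happened.

```python
class ToyMock(object):
    """A hand-rolled illustration of expect/verify, not a real framework."""

    def __init__(self):
        self.expected = []
        self.received = []

    def expect(self, name):
        self.expected.append(name)  # a call we require to happen

    def verify(self):
        # the VerifyAll idea: blow up if the expectations weren't met
        assert self.received == self.expected, \
            "expected %r, got %r" % (self.expected, self.received)

    def __getattr__(self, name):
        def call(*args, **kwargs):
            self.received.append(name)  # record whatever gets called
        return call
```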

So, here's to a cup of juice, feet up and some light reading! I feel it is just the beginning...

Monday, 7 April 2008

Linux sysadmin'ing tip of the day (again)

On the theme of clocks, virtual machines and just not getting it.

So you've got a spankin' new server, one with humongous disk space, lots of ram and CPUs in the double digits. What's the first thing you think? "We can put all sorts of machines here and there'll still be space for a fool-around linux box to try new things."

Yes, the joys of virtualization. In this case, a centos machine running xen. Installation was simple enough, and after the first hour I already had a virtualized linux and was starting the process of installing a windows machine to serve as a development playground for everyone. By the second day I had already forgotten about the xen installation, since its purpose was being fully served as the host for the many (okay, just 3...) virtual machines installed on top.

Fast forward some weeks, and here I am trying to figure out why the bloody server is one hour ahead of time. NTPd is running, and I can see the messages telling me "yes, the time was a bit ahead, and we've got it right now", but still the time was one hour ahead, and it was not right. Damned time zones, I think! And so (naively, as is clear to me now) I set the time zone from Europe/Lisbon to GMT in an attempt to make the system think it's one hour behind. And it works.

Until today.

After committing a couple of files to the server, and before packing up for home, I check the integration server to see if everything is okay and the build is on its way to a green icon. Nope, still green. Last build time... yesterday. Strange. Picking through the logs I find out the server's not picking up the latest change in the source. And that's when I notice that the commit emails fired from subversion come out one hour ahead. Again. Back I go, logging in to the server and trying to think up why the hell the hour kept drifting.

Perhaps ntpd isn't working properly? Was it syncing to a bad server? Was the daemon not running correctly?

A peek in the logs shows ntpd trying to sync the time, time and time again, without success and with no reason as to why it wasn't working. Okay, let's try to set the date by hand. Good, it works. And now it's back to normal. Huh?! Why... did the date... change by itself? Hardware clock?

Oh, wait... hardware clock... in a xen environment?

Yes, ladies and gentlemen, the answer was right before our silly little noses. The hardware clock, in a xen virtualized environment, is managed by the host (dom0 in xen parlance), and unless specified otherwise via an obscure flag it stays that way, with the guests unable to change it.

And so, all that was needed was for the host machine to have the correct time and all was well in the land. Setting the correct date and time on the host also sets it on the guests.
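For the record, the obscure flag in question is the "independent wallclock" setting for paravirtualized guests. A sketch of how you'd flip it, assuming a Linux domU running a Xen-aware kernel (the sysctl path is from memory and may not exist on other setups):

```shell
# On the guest (domU): stop tracking dom0's clock so the guest
# keeps its own time. Only available on Xen-aware kernels.
echo 1 > /proc/sys/xen/independent_wallclock

# Make the setting survive reboots:
echo "xen.independent_wallclock = 1" >> /etc/sysctl.conf

# With the flag set, ntpd (or a manual `date`) on the guest sticks.
```

In my case the simpler fix was the right one: leave the guests tracking dom0 and just fix the clock on the host.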

Sunday, 6 April 2008

Linux sysadmin'ing tip of the day

When you install a brand new linux machine, one of the things that is usually set up for you is a daily report of the system, sent by email to the root user.

If you're like me, you only notice it a month later when you've got a spool file with 30 emails or so.

And now, for the tip.

Two steps to solve the spool problem. First, set up a .forward file in root's home directory containing your email address.

echo (your email) > ~/.forward

will take care of that quite nicely.

Test the config by sending a mail to root.

mail -s test root

will send the message as soon as you press Ctrl+d.

With the mail going to the correct address, the next step is to flush all those mail messages back through the mail pipe so you receive them on your email account.

formail -s /usr/lib/sendmail (your email) < /var/spool/mail/root

will do that for you. All that's missing is to clean up the spool, since you've got those messages out of the system.

cp /dev/null /var/spool/mail/root

is what you're looking for.
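The whole thing can be rehearsed on scratch files before touching the real spool. A sketch, where admin@example.com and the paths are placeholders (the formail resend is left as a comment since it needs a working sendmail):

```shell
# Scratch-directory walkthrough of the cleanup above. Real paths
# would be /root/.forward and /var/spool/mail/root.
home=$(mktemp -d)
spool="$home/mail-root"

echo "admin@example.com" > "$home/.forward"   # step 1: forward future mail
printf 'old daily report\n' > "$spool"        # stand-in for the backlog
# step 2 would be: formail -s /usr/lib/sendmail admin@example.com < "$spool"
cp /dev/null "$spool"                         # step 3: empty the spool

cat "$home/.forward"    # the forwarding address
wc -c < "$spool"        # 0 bytes left in the spool
```

Swap in the real paths once you're happy with what each step touches.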

And that's all for today!

Saturday, 5 April 2008

Useful surprise of the day

With putty you can actually create new ssh tunnels on an existing connection by changing the settings of the connection and pressing apply. And here I was dropping and recreating the connection like an idiot...

Thursday, 10 January 2008

Good customer service pays off

Monday I went to a sports superstore to get some clothes as additional motivation for getting in shape. Expecting a supermarket of sports goods, I did most of the shopping by myself, getting a pair of pants, some sweatshirts, and eyeballing a GPS thingamajig which I set as a reward for a couple of months from now, should I be good and reach my goals.

But when I came to the shoes I was a bit undecided. Most of the other articles were rather straightforward: I had a clear picture of what I wanted, and the prices matched what I was willing to pay. Not so for the running shoes, which went from 20/30€ all the way to 120€. The informational flyers said the lower priced ones were for short walks or runs, once a week, in good conditions; the more expensive ones talked about competitive running, more frequent and in all conditions. So instead of simmering over which to pick, I called over someone from the shop to ask what the difference really was, and why some were marked for once-in-a-while jogging and others for more frequent running.

To my (somewhat dumb) amazement, the clerk dropped what she was doing and started explaining. The whole frequency thing has to do with durability: if I'm going to run once in a while, less expensive shoes will do for a couple of years, but ramp up the pace and those same shoes last only months. I should also be on the lookout for a softer sole, which is usually better for jogging since it molds to the motion of the foot. After she fetched 3 or 4 pairs, we discussed the advantages of each and the main points of how a pair of running shoes should be picked. I ended up buying her first and foremost recommendation, from a brand I didn't know about, in spite of trying both an adidas and a reebok pair.

All in all, 10 minutes and 5 pairs later, a client wishing to spend 30€ was convinced to spend 60€. And was happy about it. I got to try the different options, and was given good reasons for a more expensive buy by someone willing to spend the time to inform me. The main point? Good, qualified, knowledgeable and, most of all, willing customer-facing employees are worth their price, for the shop and for the client.