Monday 4 October 2010

Zero-friction deployment: from local to github in one powershell command

Short version:

Code is here; it depends on a dll here and here. It lets a powershell script upload a file to github.

Long version:

There is one core belief powering this post:

- There should be zero friction in getting a software release out in the open. In the age of build scripts and APIs and everything over http, there is no reason a human should be involved in cutting a version except in the decision to make it so.

The main side effect of having little to no friction in cutting a release is that releases happen far more often: if the code is good to go, there is no reason not to ship it.

In this particular case, all I lacked was a way to upload a file to github. The packaging was already done, courtesy of psake+write-zip. Having found nothing in the powershell world to do it, I ended up “porting” part of a ruby script.
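For context, the packaging side really is the easy part. A hypothetical psake task using write-zip (from the PowerShell Community Extensions) might look like this; the paths and names are made up for illustration:

task package {
    # zip everything the build produced into a single release artifact
    get-childitem "build\*" | write-zip -OutputPath "release\MyProject-1.0.zip"
}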

Github stores files in Amazon's S3, so there were two steps to uploading a file:

- Telling github that the file exists and getting a token to pass to S3.
- Uploading the file itself to S3 using the token from step 1.

The biggest issue ended up being that WebClient doesn't handle POSTing multipart/form-data (MIME-encoded values) to a url, which is what S3 expects in this scenario. Using the underlying HttpWebRequest directly from powershell would be a bit harder and more prone to errors, so I just used an existing helper to upload the file correctly.
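For the curious, here's a rough, untested sketch of what the manual HttpWebRequest route would look like, reusing the $post, $file and $filename variables from the function below. It should make clear why I reached for a helper instead:

# Hand-rolled multipart/form-data POST: every boundary, CRLF and header
# has to be exactly right, and S3 wants the file field last.
$boundary = [System.Guid]::NewGuid().ToString()
$request = [System.Net.WebRequest]::Create("http://github.s3.amazonaws.com/")
$request.Method = "POST"
$request.ContentType = "multipart/form-data; boundary=$boundary"

$body = new-object System.IO.MemoryStream
# ascii writer: the default utf-8 writer would sneak a byte order mark into the body
$writer = new-object System.IO.StreamWriter($body, [System.Text.Encoding]::ASCII)
foreach($key in $post.Keys){
    $writer.Write("--$boundary`r`n")
    $writer.Write("Content-Disposition: form-data; name=`"$key`"`r`n`r`n")
    $writer.Write($post[$key] + "`r`n")
}
$writer.Write("--$boundary`r`n")
$writer.Write("Content-Disposition: form-data; name=`"file`"; filename=`"$filename`"`r`n")
$writer.Write("Content-Type: application/octet-stream`r`n`r`n")
$writer.Flush()
$bytes = [System.IO.File]::ReadAllBytes($file.FullName)
$body.Write($bytes, 0, $bytes.Length)
$writer.Write("`r`n--$boundary--`r`n")
$writer.Flush()

$request.ContentLength = $body.Length
$stream = $request.GetRequestStream()
$body.WriteTo($stream)
$stream.Close()
[void]$request.GetResponse()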

The powershell code that uses it is available here, with one extra dependency on the Html Agility Pack to scrape the downloads page for info on existing downloads.
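For completeness, a minimal sketch of that scraping side with the Html Agility Pack; the XPath is an assumption about the downloads page's markup, not something to trust blindly:

# Hypothetical: list the hrefs of existing downloads on the page.
[void][System.Reflection.Assembly]::LoadFrom((get-item "lib\HtmlAgilityPack.dll").FullName)
$wc = new-object net.webclient
$doc = new-object HtmlAgilityPack.HtmlDocument
$doc.LoadHtml($wc.DownloadString("http://github.com/mylogin/myrepo/downloads"))
$doc.DocumentNode.SelectNodes("//a[contains(@href,'/downloads/')]") |
    foreach-object { $_.GetAttributeValue("href", "") }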

The function itself (not yet a full cmdlet due to lack of time) is upload_file_to_github:

function upload_file_to_github($login, $repo, $api_key, $file, $filename, $description){
    # The Krystalware helper does the multipart/form-data POST that WebClient can't.
    [void][System.Reflection.Assembly]::LoadFrom((get-item "lib\Krystalware.UploadHelper.dll").FullName)

    $full_repo = $login+"/"+$repo
    $downloads_path = "http://github.com/"+$full_repo+"/downloads"

    # Step 1: tell github the file exists; the xml response carries the S3 upload token.
    $post = new-object System.Collections.Specialized.NameValueCollection
    $post.Add('login',$login)
    $post.Add('token',$api_key)
    $post.Add('file_size',$file.Length)
    $post.Add('content_type',"application/octet-stream")
    $post.Add('file_name',$filename)
    $post.Add('description',$description)
    $wc = new-object net.webclient
    $upload_info = [xml][System.Text.Encoding]::ASCII.GetString($wc.UploadValues($downloads_path, $post))

    # Step 2: POST the file to S3, passing along the values github handed back.
    $post = new-object System.Collections.Specialized.NameValueCollection
    $post.Add('FileName',$filename)
    $post.Add('policy',$upload_info.hash.policy)
    $post.Add('success_action_status',"201")
    $post.Add('key',$upload_info.hash.prefix+$file.Name)
    $post.Add('AWSAccessKeyId',$upload_info.hash.accesskeyid)
    $post.Add('signature',$upload_info.hash.signature)
    $post.Add('acl',$upload_info.hash.acl)

    $upload_file = new-object Krystalware.UploadHelper.UploadFile $file.FullName, "file", "application/octet-stream"
    [void][Krystalware.UploadHelper.HttpUploadHelper]::Upload("http://github.s3.amazonaws.com/", $upload_file, $post)
}
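Calling it is straightforward; note that $file is expected to be a FileInfo (from get-item, say), not a plain path. The values below are placeholders:

$file = get-item "release\MyProject-1.0.zip"
upload_file_to_github "mylogin" "myrepo" "your-github-api-token" $file $file.Name "MyProject 1.0 binaries"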

As you can see, it's just a normal form POST to notify github of a new file, with the response interpreted as xml to pull out the values the S3 POST needs. That second POST uses the helper to pass the parameters as MIME-encoded (multipart/form-data) values.
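That dot-notation access to the response is the nice part: casting any string to [xml] gives you an XmlDocument you can walk as if it were a plain object. A toy example with the same shape as the real response (which carries more fields than shown here):

$response = [xml]'<hash><policy>abc123</policy><prefix>downloads/mylogin/myrepo/</prefix></hash>'
$response.hash.policy   # abc123
$response.hash.prefix   # downloads/mylogin/myrepo/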

I have to say, using xml in powershell was a lot easier than I thought it would be. I wonder if JSON is equally well supported…