Monthly Archives: October 2010

October 2010 Dublin GTUG

Another interesting night at the Dublin GTUG. We had a very full house last night, with about 50 people in attendance – we made an effort to reach out more to students, and there was an excellent response.

The evening started with Nick Johnson giving an overview of how Google built App Engine – what problem it was intended to solve, ie making it easy to scale web apps, and what solutions Google arrived at. Nick covered the different components of App Engine and explained how the Datastore works in some detail. There were quite a few questions from the crowd, ranging from video serving via App Engine (me!), to the costs of App Engine, to standardization of App Engine interfaces and Google’s support of App Engine going forward. A very interesting discussion indeed. Slides here.

Following the break, there were two short talks. The first, in a series we’re calling Frontline Dispatches – where folks share their war stories – focused on using the Maps JS API (v2 and v3) in a real application. Fintan Fairmichael talked about how they had used Maps to support their routing application, the problems they hit, eg issues with street name autocompletion, and the work involved in moving from v2 to v3 of the API. Slides here.

The second of the short talks was in the Student Showcase series, in which a student showcases their work. Note that the work does not have to be complete (although we do feel that it is important to have something to demo!). Mark Ahern from UCD Computer Science showed his work to date on realizing a web-accessible EC2-hosted Android emulator for remote testing of apps. Slides to be supplied.

I’m told the usual suspects went off to the Schoolhouse for a swift half, but I couldn’t make it on this occasion.

We’re hoping to have a talk on App Marketplace next month – stay tuned to the group for more info.



jQuery, JSON and GAE…and then a little REST

Working with jQuery, JSON and GAE proved to be a little less plug-and-play than I had envisaged. I’m working on a small project in which I’m using client-side jQuery to make Ajax calls to a GAE backend – I’m using JSON as the lingua franca between front and back ends, for no other reason than that I like it better.

My requirements to date have been:

  • Client-side jQuery makes an Ajax call to the backend, which returns a list of objects from the data store;
  • Backend makes REST interface available to provide flexible access to data store

Some issues arose with each of the above, which I’m noting here.

Getting GAE to return a list of objects serialized into JSON was not so difficult. There is a discussion on stackoverflow which focuses on exactly this issue – it provides a JSON serializer for GqlQuery objects which can be dropped into the GAE app and imported. I did have to extend the serializer as it doesn’t cover all data types: in my case, I had to add handlers for a couple of additional types, db.Query among them. The file should probably be put on github so that anyone can add extensions to it.
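To show the general shape of such an extension (this is a minimal sketch of the approach, not the stackoverflow serializer itself – db.Query isn’t available outside App Engine, so datetime stands in as the unhandled type here):

```python
import datetime
import json

class DatastoreishEncoder(json.JSONEncoder):
    """Sketch: extend the stock JSON encoder with handlers for types it
    can't serialize natively. A real GAE version would add a branch for
    db.Query (iterating it into a list of entities) in the same way."""
    def default(self, obj):
        if isinstance(obj, (datetime.datetime, datetime.date)):
            return obj.isoformat()
        # e.g. on App Engine one might add:
        # if isinstance(obj, db.Query):
        #     return list(obj)
        return json.JSONEncoder.default(self, obj)

print(json.dumps({"created": datetime.date(2010, 10, 1)},
                 cls=DatastoreishEncoder))   # -> {"created": "2010-10-01"}
```

Each new type just needs another `isinstance` branch in `default()`, which is why a shared copy of the file would accrete handlers nicely.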

Getting the REST stuff working was quite a bit more complicated. I chose the appengine-rest-server library as it seemed to have been around for a while, to have pretty good functionality – JSON support in particular – and to be seeing a significant number of downloads.

The library is easy to use – just create a rest directory in the project, drop the __init__.py file into it and import it into the main project. It requires a few configuration parameters to be set, eg the endpoint which handles the REST requests – I did run afoul of some GAE persistence issues here: the parameters need to be set in code that only runs once in a given runtime.

Once I had the server running, retrieving XML from it proved simple. Retrieving JSON, however, raised two further problems. Firstly, the library expects the HTTP request header to include an Accept field which is exactly ‘application/json’. Setting this was a bit fiddly – jQuery does provide for it, using either the ajaxSetup() call or the beforeSend: parameter of the ajax() call. However, browsers (and jQuery itself) may append additional MIME types to the Accept field rather than sending exactly what was specified, which makes the process more complicated.

The second, related, problem arose in the way the REST library deals with the Accept field. Its MIME type pattern matching algorithm behaves such that an Accept field of the form ‘application/json, */*’ is interpreted as requesting XML rather than JSON. This seemed a little surprising to me, and I made some small modifications to the library to just return JSON, as that was what I was most interested in.
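To see how that can happen (this is an illustrative sketch, not appengine-rest-server’s actual code): if the matcher walks the server’s supported types in its own preferred order and returns the first one that any Accept entry matches, the ‘*/*’ entry matches the server’s default before ‘application/json’ ever gets a look-in.

```python
def choose_content_type(accept_header,
                        supported=("application/xml", "application/json")):
    """Naive matcher: walk the server's supported types in the server's
    preference order (XML first) and return the first one matched by any
    entry in the Accept header. Wildcards therefore favour the server's
    default, not the client's first choice."""
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    for mime in supported:
        for pattern in accepted:
            if pattern == "*/*" or pattern == mime:
                return mime
    return None

print(choose_content_type("application/json, */*"))   # -> application/xml
print(choose_content_type("application/json"))        # -> application/json
```

Which is exactly why the library only behaves as hoped when the Accept field is exactly ‘application/json’.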

Having resolved those issues, the basic JSON plumbing seemed to be largely working as one would expect. The process was a little more niggly than I had expected, which leads me to believe that there aren’t many folks out there using these technologies together right now.

Simple Javascript HTTP Request Generator

I wanted a simple HTTP request generator for some work that I’m doing. I did not want the HTTP generated by a standalone application; rather, I wanted it generated from Javascript running within a browser. I could not easily find an existing tool which did this, so I decided to roll my own.

The generator is here. It’s a simple jQuery-based Javascript app embedded in an HTML5(ish)/CSS3 page. It’s been validated by the W3C HTML Validator, so it might work across browsers; I tested it on Chrome 6.0 on Mac OS X Leopard and it did what I wanted it to do.

One issue which I had not really thought about before starting this small piece of work was the browser’s same-origin policy (the restriction that underpins Cross Site Scripting (XSS) defences) – Javascript downloaded from one server cannot request an arbitrary web page from another origin. This imposes limitations on what’s possible – essentially, I’ll have to add this page to any web app that I need to use it in.

The system comprises a single page. At present, it’s housed in an AppEngine application, but as all the functionality is in a single web page, it can simply be downloaded, copied onto any server and used to send requests to that server. (My AppEngine app has some extra backend functionality to make sure the client side is doing what it should be doing.)

What functionality does it provide? It can generate GETs and POSTs (my main motivation was to have a simple way to generate POSTs), it can emulate different user agents, parameters can be added to the request, and the response is shown on the page.

The functionality is certainly quite limited – it’s easy to envisage lots of ways to extend it – but it was a reasonable test tool for some of the web app development I was doing.

I may extend it in the future – improve the UI, add more control over the HTTP generated, improve the error checking, provide a way to make it easy to update, load jQuery via the Google Libraries API, fix the ‘URI’ terminology as it’s inaccurate, etc – but for now, that’s all he wrote.

Comments/suggestions/reasons why this is redundant(!) welcome.

The blogging students…

The four students I’m working with on final year projects have put up their blogs.

They’re good kids, so there should be some interesting stuff appearing on those blogs in the coming months.


PWOnView – a window into ProgrammableWeb

I wrote a small tool for looking at the ProgrammableWeb dataset – it’s here. This post provides some background on the work.

For the uninitiated, ProgrammableWeb is a repository which stores information on publicly available APIs and mashups of those APIs. The repository is rich, currently comprising over 2000 APIs and over 5000 mashups, along with quite a lot of useful information pertaining to both.

Obviously, this is not an enormous data set, but even so, understanding what it looks like is non-trivial. I wanted to develop some simple means of exploring it, to get a feel for what types of mashups people are creating. I asked the friendly guys at PW if they would give me an API key and they kindly obliged.

I had some ideas on what to do, but they changed somewhat as I got some experience working with the data. My initial objective was to look at the classifications that the PW guys have developed and see how mashups relate to classifications; in particular, to understand which combinations of classifications give rise to large numbers of mashups.

I got some basic animation up and running: I extracted the 10 classifications with the most APIs, as I judged these the most important. Clicking on one of these classifications gives information on the APIs that fall into it, ordered by the number of mashups that use each API. So, for example, the first four APIs returned when the ‘Social’ classification is clicked are twitter, facebook, foursquare and LinkedIn. It’s then possible to click on each of these in turn to obtain a list of mashups which use those APIs, ordered by popularity.
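The ordering itself is straightforward once the mashup-to-API records are in hand. A sketch with toy data (the API names come from the example above; the mashup records are made up for illustration, not real ProgrammableWeb figures):

```python
from collections import Counter

# Toy mashup -> APIs records (illustrative only).
mashups = {
    "mashup-a": ["twitter", "facebook"],
    "mashup-b": ["twitter", "foursquare"],
    "mashup-c": ["twitter", "LinkedIn", "facebook"],
}

# Count, for each API, how many mashups use it, then rank by that count.
counts = Counter(api for apis in mashups.values() for api in apis)
ranked = [api for api, _ in counts.most_common()]
print(ranked)   # twitter first: it appears in the most mashups
```

The per-classification view is the same computation restricted to the APIs in one classification.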

(Note that the animation of the different classifications is not so important – originally, I had designs on making these moving circles that would move the selected classification into the centre and link it to the other classifications, with the thickness of each link reflecting how many mashups use APIs from both classifications.)

Once I had developed this somewhat rudimentary visualization tool, I was able to navigate the PW data set a bit more easily. After playing with it a little, it became apparent that the dataset is somewhat skewed – over 2000 mashups use the Google Maps API alone, while youtube, flickr and twitter all come in around the 500 mark. Further, some classifications have very few mashups.

This had implications for the work – in particular, I decided not to go ahead with enhancing the tool to reflect the linkages between the classifications, as the significant non-uniformity in the data is reasonably easy to detect without it.

There are still some interesting things which could be done with this and I may come back to revisit it at some future date. For now, it’s parked here.

Happy to have any comments/feedback on the tool in the handy comment box below!