On Saturday I had the great pleasure of attending ScraperWiki's Hacks and Hackers Hack Day, held at the University of Ulster, Belfast.
ScraperWiki is an award-winning data mining tool, funded by 4iP. Hacks and Hackers Hack Day (the Twitter hashtag was #hhhbel) was a one-day event where web developers and designers got stuck in with journalists and bloggers to produce projects and stories based on public data. It was sponsored by the School of Media, Film and Journalism at the University of Ulster and The Guardian (many thanks for the opportunity!), and the Belfast visit was all part of the ScraperWiki UK and Ireland Hacks and Hackers tour.
Personally, I was totally unsure of what to expect. I don't code (so I am not a hacker) and I only vaguely make sense on this blog, sometimes (so I could only loosely claim to be a hack). So I felt I was being a bit of a cheat when I arrived. Thankfully I found Alan in Belfast and cimota to vaguely hang around with and use to deflect any pertinent questions as to what I do. ScraperWiki's goal was to attract 'hacks' and 'hackers' from all different types of backgrounds: people from big media organisations, as well as individual online publishers and freelancers. They certainly accomplished that. There were more Hacks than Hackers (Hacks For The Win in turning up!!) but I think it actually worked out OK, as the technical side could only happen once the ideas had been suitably worked out. As with any project, brainstorming and creating a plan of action is just as important as the doing. You need to know where you are going in order to get there.
There was a nice introduction from the ScraperWiki team and a bit of background on what ScraperWiki was. Now, as I said, I am not a hacker and most of the language went over my head; however, I began to see the concept. It would help find information that is already publicly available but not very easy to find. The spin for hacks and hackers was that the aim of the event was to show journalists how to use programming and design techniques to create online news stories and features, and vice versa, to show programmers how to find, develop, and polish up stories and features.
The ScraperWiki team pushed this forward on the day by brainstorming 'datasets', or themes to the likes of me, and then teams would form to create a real piece of datamining software (it would find stuff on the web in a trice which would take you or me a very long time to dig up). Hmm, I think I have just decided I now like the word datamining. It makes me sound like I know what I am doing.
So armed with our mandatory laptops and free WiFi, four intrepid teams of journalists and developers began to develop their chosen themes into ideas and then into a final project to be published and shared. Each team would present their project to the whole group. There was also a bit of a competition, with a prize at the end of the day.
I maintained my limpet-like attachment to Alan and Matt whilst grasping onto poor Rob Moore from Learning Pool - the sucker who would be our coding monkey.
We eventually got down to something vaguely approximating working on the project theme, that of politics (shock horror). In particular we were minded to do something around voting. Through a bit of idea making, in between watching RSAnimate on changing education paradigms, we started a creation journey towards voting patterns and perhaps how they would fit with socio-economic levels within constituencies. This could provide a better picture of, for instance, where political disengagement was highest and the economic background behind it. We used a number of websites for initial research, including the Northern Ireland Neighbourhood Information Service and ARK, and we refined the idea to an initial step of pulling down the information on the ARK website to first create a visual of voting patterns.
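For the non-hackers among us: the kind of scraper the team built boils down to pulling a page, picking the results table out of the HTML, and turning the rows into data you can chart. A minimal sketch in Python, using only the standard library and a made-up sample table (the real ARK pages are laid out differently, so treat the column names and values as hypothetical):

```python
# Sketch of a results-table scraper, stdlib only.
# SAMPLE stands in for HTML fetched from a results page;
# the layout and numbers here are invented for illustration.
from html.parser import HTMLParser

SAMPLE = """
<table>
<tr><th>Constituency</th><th>Electorate</th><th>Turnout</th></tr>
<tr><td>Belfast North</td><td>65419</td><td>37411</td></tr>
<tr><td>Foyle</td><td>68459</td><td>39922</td></tr>
</table>
"""

class ResultsParser(HTMLParser):
    """Collects table cell text, grouped row by row."""
    def __init__(self):
        super().__init__()
        self.rows = []      # completed rows
        self._row = []      # cells of the row being read
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = []

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

parser = ResultsParser()
parser.feed(SAMPLE)

# First row is the header; zip it with each data row into dicts.
header, *data_rows = parser.rows
results = [dict(zip(header, row)) for row in data_rows]
for r in results:
    print(r["Constituency"], r["Turnout"])
```

In practice you would fetch the live page instead of a sample string, and ScraperWiki handled the fetching, scheduling, and storage for you - the point here is just how little code sits between a government web page and a usable dataset.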
There was even pizza!!!
Dear help Rob and Alan (who can also code), who struggled with the coding and the government websites. Rob would look up every so often muttering 'why do they use ...?? Grrrrrr'. I thought it best not to ask and let him get on with it, feeding him some caffeine every now and again. In the end we decided that the project was a goer, but we would only look at the voting statistics first, as that was easier and would test the practical coding. Lo and behold, it worked. And it showed something we had not seen before. Or at least not as clearly before. The lack of people voting.
At the last Westminster election, we could see quite starkly, 17 out of 18 constituencies had Minority MPs. That is to say, the number of non-voters, who we called Mr No Vote, outstripped the number of those who did vote for the winner, and outdid the winning candidate by a mile. This caused us to discuss the issue of legitimacy. Yes, over 50% in each constituency voted, in some cases only just, but the winner had nowhere near the number of votes that Mr No Vote represented. So apathy is winning out at the minute. The 18th constituency, where the winner beat the number of no votes? Fermanagh and South Tyrone. This raised our eyebrows: was the only motivator to get bigger numbers out to vote sectarianism? It looked like it, as a riled, divided community seemed determined not to let the other side in.
OK, this is a bit simplistic but the graphics were quite stark.
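The sums behind it are very simple, which is rather the point. With entirely made-up numbers (the real figures came from the ARK data), the Mr No Vote comparison looks like this:

```python
# Toy illustration of the "Mr No Vote" test - all numbers invented.
# A "Minority MP" is one whose vote tally is beaten by the
# stay-at-homes in their own constituency.
constituencies = [
    # (name, electorate, winner's votes, total votes cast)
    ("Example West", 60000, 18000, 32000),
    ("Example South", 55000, 30000, 41000),
]

minority_mp = {}
for name, electorate, winner_votes, votes_cast in constituencies:
    mr_no_vote = electorate - votes_cast        # Mr No Vote's "tally"
    minority_mp[name] = winner_votes < mr_no_vote
    print(f"{name}: winner {winner_votes}, Mr No Vote {mr_no_vote}, "
          f"minority MP: {minority_mp[name]}")
```

Run that over 18 real constituencies and you get the stark picture we presented: in our data, 17 of the 18 came out True.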
So coding done? Check. Visuals done? Check. Ready to present the findings? Er, really? We had forgotten about that.
So in we went, a bit nervous about it all, and the presentations began. One group created a scraper to glean NI Court decisions to investigate how effective the judicial process was; one group created a scraper for the NI Jobfinder website that summarises public/private/voluntary pay levels; and another group created a scraper for contracts awarded by the NI Housing Executive for repairing buildings and so on (I did not think of a red sky at this point at all, no no no. Not at all.). Then we presented our project, MrNoVote. After the presentations a number of judges went out to discuss the projects and came back with a winner. And guess what? It was us that done it!! I could not believe it at all. I was still asking Matt afterwards if we had actually won. He quietly insisted in his usual MacDaddy way that yes, it was true, now stop asking.
Overall, after seeing it in practical use, I really think ScraperWiki has huge potential and a myriad of applications. It even makes me want to relive my coding days and learn to work with things like Python (I was disappointed to learn this is a programming language and not Eric Idle and co) or PHP.
A great big well done to the developers of Scraperwiki and I hope the madness continues into the future. As for our little project I think I saw a twinkle in Alan's eyes so MrNoVote will continue on and, unfortunately for politicians, will be very much alive to help assess the Assembly and Council elections next May.