So what's been going on?

Yesterday I was given a reason to revisit my site and update some of my work. I was also horrified to realise it had been over nine months since my last blog post. 

But boy, 2016 has gone fast. It's the same for most people I know - all of a sudden I am thinking about summer school holidays, end-of-year assignments, and what we may be doing on NYE.

I have been doing my master's full time, which has...well, taken up a lot of time, but it's been great. I also just saw that my profile has been put up in the student profile section - have a look here (and yes, I had my hair done that day...why can't it be like that every day?)


I've been having a great time working at Fairfax for the AFR. Survived the election and the BRW Rich List, and I'm about to launch into the next round of the AFR Executive Salary series. I have also been doing the odd freelance project, but have had to essentially stop doing this - uni and work have meant there is not much extra time to pursue or complete these projects.

I managed to fit in a half marathon in the early stages of 2016, plus attend a machine knitting course, though running has slowed down a lot over the past few months. We had a family holiday in Malaysia and Singapore and it was great. Uni still provides me with lots of stimulation, and this semester I am doing two great subjects. I hope to be able to share some of the outputs I am working on - one on gender diversity and how it relates to the financial performance of companies, and another on the speeches of Australian Prime Ministers.

Generally life is full and it is good!

Hand crafted: How skipping the computer can give you a visualisation creativity jolt

For this first assignment I wanted to take a very handmade approach to data visualisation. Giorgia Lupi is a creator of immersive, static visualisations, and recently completed a fifty-two week project called Dear Data with Stefanie Posavec. Lupi writes in her essay “The new aesthetic of Data narrative” about the immersive possibilities of print. She asserts that readers also use visualisations for pleasure, meandering and entertainment, not just for quickly taking in information. Her generous and elegant designs experiment with novel shapes and forms, and they require deep immersion.

An example of Accurat Studio's work for La Lettura magazine. Lupi is the creative director there.

I took her essay as a basis for how to proceed with my design, as well as being inspired by the intimate and hand-made nature of the Dear Data project.

Data capture and selection

Posavec and Lupi appeared as guests on the FiveThirtyEight podcast “What’s the Point”, and set a challenge for listeners: capture data about their podcast listening habits over a seven-day period, visualise the data in whatever way they chose, inscribe it on a postcard and mail it to the show. I decided to take part, and captured data from March 14 to March 20. I used the Reporter app to record information about each podcast as I was listening to it. This data was then extracted as a CSV file. I sorted the data and viewed it on my screen. I did not conduct any analysis on the data; I simply used the variables captured as a basis to start drawing.
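That sorting step is trivial but worth sketching. The snippet below uses a toy inline sample standing in for the Reporter export - the real export's column names will likely differ:

```python
import csv
import io
from datetime import datetime

# A toy stand-in for the Reporter app's CSV export; the real
# export's column names and timestamp format will likely differ.
sample = """timestamp,podcast
2016-03-15 08:10,What's the Point
2016-03-14 17:45,Dear Data Extras
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Sort the listening moments chronologically so each day's
# entries sit together before drawing from them.
rows.sort(key=lambda r: datetime.strptime(r["timestamp"], "%Y-%m-%d %H:%M"))

print([r["podcast"] for r in rows])
```

From here the sorted rows can just sit open on screen as a reference while drawing - no further analysis needed.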

Design Process:

I started off by sketching out ideas, using Lupi as inspiration to treat this style of visualisation as a way to experiment and play. No bar or line charts! I used time as my main axis (Monday through to Sunday), with each day split into AM and PM. I wished to use organic shapes, so began to play with different forms using ideas of “pods” or “waves”.

Once I decided on a form I would use, I used the spreadsheet to capture each podcast and the data that sat around that listening moment. I was tired, so I found this type of “deep work” quite hard - I had to concentrate on what podcast I was working on, and move between the screen where my data sat and the page. This took me an evening, but the process was very enjoyable.

The next stage was to create the legend. Normally legends are generated by the package I am working with, and usually cover only two to four variables. In this case I visualised nine variables - a lot of work, but once I began it took about one hour. For a process I normally don’t have to think about, this was interesting.

As I progressed, I posted images to Instagram, and received a nice comment from Lupi herself, which was a little thrilling! (Yes, I am a tragic fan-girl!)

Opportunities and Challenges:

The opportunities of an approach like this are immense. I got a deep sense of satisfaction completing it - much more than from the work I do at a day-to-day level - and I enjoy my job! It’s not drudgery for me, but this was a different level of satisfaction. It was nice to experiment and layer lots of information. Normally I think about reduction in my daily tasks: what is redundant, what can be removed, what is unnecessary. There was something quite decadent about including everything. I hope that people have an explore and see what they can discover about each podcast.

(This post was originally submitted as a blog post for an exercise in experimenting with different visualisation tools for the subject Data Visualisation and Narratives, part of the Master of Data Science and Innovation at the University of Technology Sydney.)

That time we predicted the election

So it's been a few months since the election, but for some of us the scars are still visible. It was looooong - a slow and boring slow-motion car crash. Yet it was fun in a kind of demented way, as I played a lot with shapefiles and the ABS, and generally learnt to dislike the way the AEC handles a lot of things. (The rude-ish emails about information requests - I get that in an election you must feel like you are under siege, but I spoke to many nice ABS people during this period who were probably fielding the same sorts of desperate questions from newsrooms about electorates, and boundaries, and postcodes, and merging regional data sets with new electoral boundary information and postcodes and COULD IT BE DONE? ...etc.)

Anyhow, there was this weird moment a day or two before the election where I searched for every seat-based poll and updated a file that Greg Earl had started. One of the scenarios that came out was a hung parliament, which seemed really wild - but if you looked at state and seat-based swings instead of a single national swing, it wasn't in the realm of impossibility. It was a real outcome. On election night, I did a small fistpump - not for the ALP, but for the brief moment that we picked it.
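The core logic of that seat-by-seat approach is simple enough to sketch: rather than applying one uniform national swing to every seat, apply each state's swing to that state's seats and count which margins go negative. Every seat name and number below is invented purely for illustration:

```python
# Toy seat margins (sitting party's two-party-preferred margin, in
# percentage points) and invented state-level swings against it.
seats = [
    {"seat": "Seat A", "state": "NSW", "margin": 2.0},
    {"seat": "Seat B", "state": "QLD", "margin": 1.5},
    {"seat": "Seat C", "state": "VIC", "margin": 6.0},
]
state_swing = {"NSW": -3.1, "QLD": -2.0, "VIC": -1.0}

# A seat flips when the state swing wipes out its margin.
flipped = [s["seat"] for s in seats
           if s["margin"] + state_swing[s["state"]] < 0]

print(flipped)  # Seats A and B change hands; Seat C holds.
```

With real seat-based polling plugged in instead of a uniform swing, a knife-edge result like a hung parliament stops looking wild and starts looking plausible.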

So I really wanted to post this here as proof that, for a little while, we actually predicted the election outcome ...until all the postal votes rolled in, of course, and the LNP got a majority ;-)

This was team work: the interactive was designed by Les Hewart, data gathering and initial graphic prototyping were by me, and analysis and commentary by Ed Tadros and Greg Earl. Nice also to be in a graphic in a Laura Tingle article.


99% cleaning and compiling with 1% of some other stuff

There are times when you get a data set, and you can use it straight away. Actually, I can't remember when this has ever happened, so let's assume there are actually no times when you get a data set and you can use it straight away. 

There are times when you get data, and you have to do a lot of wrangling, cleaning, reformatting, and just generally bashing the thing into shape. It's an iterative process, and visualisation comes in handy for spotting outliers and errors.
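Even something as crude as a text bar chart can do the outlier-spotting job. Here's a minimal sketch with invented numbers, where one row immediately looks wrong:

```python
# Invented yearly figures -- note the suspicious 2013 value.
values = {"2010": 41, "2011": 44, "2012": 43, "2013": 430, "2014": 45}

# A one-line "visualisation": bar length is proportional to the
# value, so a mis-keyed extra zero leaps off the screen.
for year, v in sorted(values.items()):
    print(f"{year} {'#' * (v // 10)} {v}")
```

In practice this would be a proper plot, but the principle is the same: look at the data before trusting it, because the eye catches a tenfold data-entry slip faster than a scan down a column of numbers does.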

Then there are the times when there is actually no data, and you have to combine existing data sets with photocopies of old magazines. Thankfully the newsroom keeps a good record of those old magazines. So this is what you sometimes have to do - look for old records where there are gaps and hand-enter the numbers. Or just check what's going on...

It was heartening to read that Tim Sherratt has done the same thing with his PM speeches repository (although on a much grander, leather-bound scale - mine were crappy photocopies of old BRW mags).

So here is the story, and the graphic, in all its digital glory.