
Metadata is the new art direction

Tom Barnett —

As our content creation tools get smarter they can take on more and more of the manual tasks, letting us concentrate on the hardest part: storytelling.

'Metadata is the new art direction' was somewhat cheekily proclaimed by Ethan Resnick three years ago, and refers to our content being designed less by us and more by machines. Although the concept isn't without its failings, there's no doubt that it's increasingly becoming a reality.

Once upon a time every book and newspaper article was painstakingly arranged, or 'typeset', letter by letter, word by word, column by column. It was a highly specialised and technical craft that still lives on today in the terminology we use when we refer to typography, both online and offline. For example, 'leading' (pronounced 'ledding') is still the term we designers use for the vertical space between lines of text, and gets its name from the little strips of lead that were used to space out the rows of letters on old printing presses.

In the 80s we got personal computers and laser printers in our homes, schools and workplaces. These devices, combined with a brand new software category, led to a desktop publishing revolution. Now all the laborious manual tasks around typesetting, graphic design and photography could be done by one person on a computer and stored digitally. Of course, when it came time to distribute the content we still had to print and package it into physical magazines, newsletters and books. All of which could take weeks, months or even years to get into the hands of the audience.

Now, with the web and today's software, we've narrowed the time from content creation to content consumption so much that it's practically immediate. As a result, our audiences increasingly expect constant updates, so anything we can do to streamline the journey from the first words typed on a laptop to the finished article in the hands of the consumer is a win.

The speed, ease and frequency of online publishing means that there simply isn't time to design every story with unique typography, layout and graphical treatment. Plus, even if we had the inclination, we couldn't possibly manually adapt our content for every device type and screen size that our audience may use now or in the future. 

That's the environment that's forced us to consider a new form of art direction; design by metadata. 

Put simply, 'metadata' is the collection of little bite-sized pieces of information that attach to and describe certain aspects of a piece of content. For example, in Hail when you upload a photo you give it a caption, a photographer and tags, and, depending on the camera it was taken with, it may carry other metadata for date/time, location and camera model. So if you were writing an article on yesterday's school sports day and wanted to give it a hero image, you could quickly filter for photos tagged with 'sport' from yesterday, or search for the sports ground across all the location metadata.
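Hail's internal API isn't public, but that tag-and-date filtering can be sketched in a few lines of Python. The `Photo` record and its field names here are purely illustrative, not Hail's actual schema:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical photo record; the field names are illustrative only.
@dataclass
class Photo:
    caption: str
    photographer: str
    taken: date
    tags: set = field(default_factory=set)
    location: str = ""

def find_hero_candidates(photos, tag, taken_on):
    """Return photos carrying the given tag that were taken on the given day."""
    return [p for p in photos if tag in p.tags and p.taken == taken_on]

photos = [
    Photo("Relay finish", "d_pham", date(2016, 3, 14), {"sport", "students"}),
    Photo("Science fair", "d_pham", date(2016, 3, 10), {"science"}),
]
hits = find_hero_candidates(photos, "sport", date(2016, 3, 14))
print([p.caption for p in hits])  # ['Relay finish']
```

The same shape of query works for any metadata field, which is why a search across location metadata for the sports ground is just another one-line filter.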

Another clever metadata example from Hail is our recently introduced 'face detection' feature. Now when you upload an image the system automatically scans it for faces. If it detects one, it adds metadata to the image recording the relative position of the face within the photo. When you then select one of those images as an article's hero shot, it's automagically positioned within the design to centre the face and make sure the head isn't cut off, all without any human interaction. It just works.*
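The positioning maths behind a feature like this is simple once the face location is stored as metadata. A minimal sketch, assuming the face position is kept as relative (0 to 1) coordinates as described above, and that the design shows the photo through a fixed-size frame:

```python
# Centre a detected face in a fixed-size hero frame, clamping the crop
# so it never runs past the edges of the source image.

def hero_offset(face_x, face_y, img_w, img_h, frame_w, frame_h):
    """Return (left, top) pixel offsets into the image that centre the
    face in the frame, given the face's relative (0..1) position."""
    left = face_x * img_w - frame_w / 2
    top = face_y * img_h - frame_h / 2
    # Clamp so the frame stays within the image bounds.
    left = max(0, min(left, img_w - frame_w))
    top = max(0, min(top, img_h - frame_h))
    return left, top

# Face detected 70% across and 30% down a 1200x800 photo, shown in a 600x400 frame.
print(hero_offset(0.7, 0.3, 1200, 800, 600, 400))  # (540.0, 40.0)
```

The clamping is what keeps a face near the edge of a photo from pulling the crop off the image entirely; in that case the face sits off-centre but the head still stays in frame.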

These are some simple examples of using metadata to speed up and improve content creation and display, but there's plenty more exciting potential to come in the future if we start to cross-reference and remix all this rich metadata with other streams on the web. If we know the date, time and location of the sports day, then we could also find out if it was sunny and hot or chilly and pouring with rain. Maybe we could automatically display that weather data in a creative and engaging way alongside our article to help build out and tell the story. We could also include a map and related content from similar tags or timeframe.
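That kind of cross-referencing amounts to joining two data streams on shared metadata keys. A toy sketch of the sports-day weather idea, where every name, date and weather value is invented for illustration:

```python
from datetime import date

# Invented weather feed keyed by (date, location); in practice this would
# come from an external weather service, not a hard-coded dict.
weather = {
    (date(2016, 3, 14), "School grounds"): "sunny, 24°C",
}

def enrich(article):
    """Attach weather data to an article by joining on its date and
    location metadata; 'unknown' when no match is found."""
    key = (article["date"], article["location"])
    article["weather"] = weather.get(key, "unknown")
    return article

story = {"title": "Sports day", "date": date(2016, 3, 14), "location": "School grounds"}
print(enrich(story)["weather"])  # sunny, 24°C
```

The same join-on-metadata pattern covers the map and related-content ideas too: a map is a join on location, and related content is a join on overlapping tags or timeframe.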

Smart software will never replace the need for the creative people driving it, but the more data points we build into our content now, the more possibilities there are for streamlining and automating parts of the deep and personalised content experiences that our audiences will come to expect.

*Don't worry, we're not scanning photos to identify individuals. It's simply to look for face shapes in order to optimise their position in the design.
Hero image by d_pham.