Thursday 31 October 2013

Interactive Media Awards: what do they tell us?

The range and type of awards offered in the industry reflect its current activities and trends. They’re a snapshot of what’s going on and what is considered ‘good’. With this in mind, we’ve taken a quick look at some of the major players in the UK.

What’s readily apparent is the massive increase in categories you can enter, which demonstrates the growth of interactive media use across traditional sectors. Imagine: The Interactive Media Awards now has 100 categories (and growing), divided into 25 per quarter of the year. If you can’t find a category there for your projects, we’d be amazed. The categories are mostly arranged around traditional business content sectors, such as Bridal/Weddings, Government, Financial Services and Animals/Wildlife, although some, like e-commerce and web design/development, seem more about the medium. Other awarding bodies choose to define their categories in other ways.

BIMA (British Interactive Media Association) has recently held its 2013 awards event, and its categories are divided into Sector, Discipline and Premium, with 24 subcategories in all:
  • Sector: Battle of the Brands, Business to Business, Corporate, Education and Outreach, Entertainment, Leisure and Culture, and Public Life.
  • Discipline: Community Building, Content Marketing, Engagement, Games, Integrated Campaign, Mashups and Data Visualisation, Mobile, Multi-platform, Self Promotion, Social Media, Student, Tablet, and User Experience.
  • Premium: Innovation, Minor Miracle, Agency of the Year, and Grand Prix.

FWA (Favourite Website Awards) is based in Cambridge and runs daily, monthly and annual awards for web sites based on the judging criteria of creativity, originality, design, content and personality. The daily and monthly judging leads to a public vote for the Winner of the Year. Their site is updated daily and they have a large worldwide following.

Only you will know what angle your company favours: creativity or a transparent interface, well-structured content or in-your-face bombardment. So much depends on your clients, their needs, the proposed audience and so on. That’s why defining categories for awards is no easy feat. Just think about what you’d put as criteria and categories. How would you set criteria that judges would agree on? It’s easy to write off these awards as irrelevant to you, but in doing what they do they pin down standards – and you shouldn’t ignore that.

Look at some of the winners in the categories that suit your type of work. Do they inspire or frustrate? A good way to change perceptions if you don’t agree with the results is to back your own work and enter. Good luck.

Thursday 24 October 2013

Interactive media testing: who, what, where, how

As with all other areas of interactive development, testing has exploded into a specialism of its own, but this comes with some baggage because specialists expect some kind of career progression. The tradition seems to have been for testers to transfer sideways into development or project management after a couple of years. With clients putting the emphasis on proven performance of interactive products, test results of every kind have become important in demonstrating your company’s worth, so a drain on your experienced testers is less tolerable now.

What do we know about testers? This is exactly what Chris George addresses in his posting at The Ministry of Testing, Testing: an Obvious Career Choice (20.10.13). He’s realised that running the many automated tests breeds boredom, and so testers move on. His answer is to educate testers, using a skills map to demonstrate that there is a skills progression across the sectors of testing. He has reached other conclusions too, so it’s worth a look at the map and the analysis of his findings.

Rob Lambert at The Social Tester (18.10.13) seems to agree about boredom for testers and automated testing, but his answer is to throw in challenges. This begins a dialogue on ‘who are testers and what is their position in iMedia development?’

If you think you know about testing categories and what tests to recommend to your clients, take a look at Website Testing and the tools they review across 19 categories! We’ll let their list blow your mind. Their checklists for testing are handy too. We point you to their ecommerce checklist here, but there are others. These cover the ‘what to test’ and the ‘where’.

Well, there’s just the ‘how’ to address now. Automation is rife, as we’ve understood from ‘Testing Web Sites’, but some aspects of testing need a mixed solution because user testing involves real people and real situations/cases. This has become partially automated too, which can point you to faster solutions for management and clients while taking you outside the purely automated box. UserTesting.com specialise in online user testing that gives you fast feedback. Yes, they are pushing themselves and their service, but their client reaction – including from large brands – is enviable. See www.usertesting.com/buzz.
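
To make the ‘how’ a little more concrete, here is a minimal sketch (in Python) of the sort of automated smoke test that sits alongside user testing. The URL, the three-second threshold and the expected text are hypothetical placeholders rather than anything from a real project, so treat it as a starting point only.

    # Minimal automated smoke test. The URL, response-time threshold and
    # expected text below are hypothetical placeholders, not real values.
    import unittest

    import requests

    BASE_URL = "https://www.example.com"  # hypothetical site under test


    class SmokeTest(unittest.TestCase):
        def test_homepage_responds(self):
            # The page should load successfully and within a sensible time.
            response = requests.get(BASE_URL, timeout=10)
            self.assertEqual(response.status_code, 200)
            self.assertLess(response.elapsed.total_seconds(), 3.0)

        def test_homepage_has_expected_content(self):
            # A crude content check: the brand name should appear in the HTML.
            response = requests.get(BASE_URL, timeout=10)
            self.assertIn("Example", response.text)


    if __name__ == "__main__":
        unittest.main()

Checks like these can run unattended on every build, but they can’t tell you whether a page makes sense to a real visitor – that’s the gap services such as UserTesting.com aim to fill.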

Just to end on a bit of humour, Gerald Thulbourn berates PayPal (calling them ‘muppets’ repeatedly) for neglecting to test some of their own sample code while warning others that changes they are making might affect their previous coding!

Friday 18 October 2013

Funding and competitions for you

Last month saw the launch of a new set of funding opportunities for the creative industries from the UK Technology Strategy Board (TSB). The question has always been, ‘Yes, but what are the creative industries, and are we considered part of them?’. Interesting point. In the past these initiatives have appeared to favour traditional visually-driven sectors. The definition debate apparently continues across countries. (See Wikipedia’s take on the definition debate as an introduction).

But because of convergence - and maybe even dominance - across the digital media sectors, the aspects that the strategy addresses have finally crossed over to more purist digital media. If you have got lost in the debate before, as we have, take a deep breath and be prepared to look afresh.

Listed on the TSB web site are:
  • £15 million for cross-platform production in digital media (although the slant appears to be towards film, TV, online video, animation, video games and special effects).
  • £2.5 million for ‘Frictionless Commerce’ – collaborative projects covering digital transaction environments for content industries.
  • £2.5 million for hyperlocal media demonstrators for geographical communities.
  • £4 million for location-based services for businesses to get their customers into the ‘here and now’ context.
  • £2.5 million for valuing and pricing digital assets in digital transactions.
  • £3.5 million for ‘big data exploration’ meaning finding new ways of extracting value from data.
  • £1 million to help the Manchester creative industries cluster.
Well, it is in the right general direction, don’t you think? This funding and the competitions across these areas begin in late 2013 and continue into 2014. For a stake in the first (and largest) and the second, you need to register your interest quite quickly, in November. The others weigh in during late 2013 and early 2014. Now, don’t forget that the project management of funded projects like these is heavy, because there’s usually far more admin than on strictly commercial projects. Also, UK government bodies often ask for Prince 2 qualified project management as a basis. Do you have people qualified in this? We’ve covered Prince 2 aspects and developments before, so check out previous blogs.

If you’re tempted, good luck. Remember, you learn a lot along the journey which increases your skills and experience. We all need that.

Tuesday 8 October 2013

Evaluation of web sites – what is this?

Evaluation is such a problematic word because it implies testing according to criteria but covers wider, ‘softer’ issues as well. Everyone who looks at a web site, tablet or mobile app, or any electronically delivered information, is subconsciously judging what they encounter. Our problem as developers is to be on top of general evaluation criteria as well as aware of more specialised, market-driven information. Some aspects of evaluation, such as accessibility, are linked to legal requirements, so we have to conform to those anyway. We’ll not be looking at accessibility specifically here, as we tend to treat evaluation for that separately.

Perhaps we should start with a definition of evaluation: it is about judging or assessing the value of something in a structured way. Under a general evaluation appraisal, we can find guides for assessing web sites that emphasise looking out for authority, accuracy, reliability, being up to date, relevance to you and feel (University of Reading). UC Berkeley’s guide, Evaluating Web Pages, offers similar advice covering techniques and questions to pose about web pages when evaluating them. So there may be graduates worldwide who have been trained to evaluate our pages. It might be useful for your clients to understand these general criteria too, as some of them relate more to actions on their side, such as keeping the information current.
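
Criteria such as authority, accuracy and ‘feel’ remain human judgements, but the more mechanical ones lend themselves to a quick script. Here is a rough sketch, in Python, of partially automating the ‘up to date’ check; the URLs and the 180-day staleness threshold are arbitrary examples, and many sites won’t send a Last-Modified header at all.

    # Rough sketch: flag pages that look stale, using the Last-Modified header.
    # The URLs and the 180-day threshold are arbitrary, illustrative values.
    from datetime import datetime, timezone
    from email.utils import parsedate_to_datetime

    import requests

    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/about",
    ]
    MAX_AGE_DAYS = 180  # what counts as 'stale' will vary by site and sector

    for url in PAGES:
        response = requests.head(url, allow_redirects=True, timeout=10)
        last_modified = response.headers.get("Last-Modified")
        if last_modified is None:
            # Many dynamic sites omit this header, so fall back to a manual check.
            print(f"{url}: no Last-Modified header - check freshness manually")
            continue
        age = datetime.now(timezone.utc) - parsedate_to_datetime(last_modified)
        status = "possibly stale" if age.days > MAX_AGE_DAYS else "ok"
        print(f"{url}: last modified {age.days} days ago ({status})")

A report like this won’t tell you whether the content is right, but it does give clients something concrete when the conversation turns to keeping information current.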

But the great majority of people do not employ a common set of criteria when assessing information. They apply subconscious criteria according to their age, needs, behaviour trends and more. This is where market research extends the measures for evaluation: it offers insights into how to tailor the information and experience better for a particular group.

Then we get more precise guidelines on how to engineer information electronically for particular needs. Take, for example, Kantar Media’s September 2013 publication, Over 50s in the digital age, where they define the over-50s’ behaviour trends with electronic information, giving valuable pointers on how to reach this group and engage them better. If you remain sceptical, can you argue with Webcredible’s success results (if they are true, of course!)?
  • 36% increase in made-to-order in online revenues for Laura Ashley
  • 50% reduction in mobile homepage drop-offs for Macmillan Cancer Research
  • 44% conversion improvement and 168% uplift in leads for Propertywide
  • 80% increase in hotel ‘look-to-book’ conversions for Thomson
Perhaps stats like these will persuade your clients to agree to some market research, or they might help you define criteria for successful evaluation of your own sites.