How much does your website cost? Part two
5 May 2011
Lessons from the UK government’s spending review – and content strategy
Part two: Measuring costs, quality and value
The metrics and standards the UK government uses
How are the costs for UK government websites broken down? And how do they show value for money?
Areas of spending
The Central Office of Information (COI) divided non-staff costs into the following five areas of spending. The figures are for the 46 department-run, central government sites listed in 2009/10:
- Strategy and planning: £14.1 million
- Design and build: £22.7 million
- Hosting and infrastructure: £23.8 million
- Content provision: £24.1 million
- Testing and evaluation: £9.7 million
Of this £94.4 million total, the area with the highest spend is content provision. But only 24 of the 46 sites provided figures for content provision, so presumably the other 22 departments used only their own staff.
‘Reported staff costs’, making up a further £33.5 million, were not broken down by area of spending. Content provision is likely to account for a large share of this.
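As a quick check, the five areas above do sum to the reported total, and each area's share of spend can be computed directly. A minimal sketch, using only the figures given in the report:

```python
# Non-staff spending by area for the 46 central government sites, 2009/10 (£ million)
spend = {
    "Strategy and planning": 14.1,
    "Design and build": 22.7,
    "Hosting and infrastructure": 23.8,
    "Content provision": 24.1,
    "Testing and evaluation": 9.7,
}

total = sum(spend.values())  # 94.4
print(f"Total: £{total:.1f}m")

# Each area's share of the total, largest first
for area, cost in sorted(spend.items(), key=lambda kv: -kv[1]):
    print(f"{area}: £{cost}m ({cost / total:.0%})")
```

Content provision comes out at roughly a quarter of non-staff spend, with design and build, and hosting and infrastructure, close behind.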
But while 40 sites listed non-staff costs under design and build, and hosting and infrastructure, only 28 did so under testing and evaluation. And only 16 did for strategy and planning.
What exactly does content provision mean here? Does it include any form of content planning, review and governance?
COI provides an extensive list which includes style guides, copy, images, audio and video content, content migration, content maintenance, content moderating, reviewing, updating and syndicating – as well as terms and conditions, translations, accessibility, metadata and preparing PDFs for online use.
Strategy and planning refers to the strategic work before the website is built or redeveloped, plus ongoing planning and project management. Here there is no explicit mention of content except for ‘rationalising content’ under ‘convergence’. It does, though, include ‘specification of requirements’, ‘systems architecture planning’, ‘ongoing site management’ and so on.
Design and build means the creative and technical work in producing or updating the website. It includes the costs of building website components such as text, images, video, audio, animation, blogs and wikis – but excludes the costs of creating the content. It also includes search engine optimisation.
The developing discipline of content strategy can influence all five areas of spending, as I show in Part three.
The cost of content
How does COI’s metric deal with the cost of staff providing content? Well, it’s a bit of a balancing act.
The problem, as COI points out, is that ‘Many staff within an organisation may be involved in content provision as a relatively small part of their role. Attempting to measure how many people, and how much of their time, are involved in content provision is likely to be a difficult task with inaccurate results.’
As a result, staff who spend more than half their time working on the website – in any capacity – must be reported as ‘full-time equivalents’ at each grade level. But, rather than excluding staff spending less than half their time on the website, wherever possible proportional costs should be estimated.
Are we just talking about salaries – or overheads as well?
Only salaries need be reported – but COI will add an ‘uplift figure’. This will be based on ‘an average uplift for salary costs (covering eg National Insurance, pension contributions etc) and a further uplift for overheads (covering eg desk space, electricity, IT etc)’.
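The uplift arithmetic is straightforward once the rates are known. A minimal sketch – note that COI's guidance describes the two uplifts but does not publish the percentages, so the rates below are invented for illustration:

```python
def fully_loaded_cost(salary, salary_uplift=0.20, overhead_uplift=0.30):
    """Apply COI-style uplifts to a reported salary.

    The two uplift rates are illustrative assumptions, not COI's actual
    figures: one for salary on-costs (National Insurance, pension
    contributions), then a further one for overheads (desk space,
    electricity, IT).
    """
    with_on_costs = salary * (1 + salary_uplift)
    return with_on_costs * (1 + overhead_uplift)

# A £30,000 reported salary becomes a much larger fully loaded cost
print(f"£{fully_loaded_cost(30_000):,.0f}")
```

With these assumed rates, a £30,000 salary carries a fully loaded cost of £46,800 – which is why reporting salaries alone would understate the true staff cost.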
Working out the real cost of overheads is no easy task for any organisation – but it is essential to getting a true picture.
Web-only v cross-channel content
The guidance separates ‘web-only content’ and ‘cross-channel content’ on government websites. Full costs must be reported for content created just for the website (eg text, images, audio, video).
But excluded is content created for:
- the website and other channels (eg a video shot for a TV advertisement and online use)
- other channels and then added to the website ‘subsequent to its primary use’ (eg a leaflet distributed by direct mail, then added to the website for reference)
Costs of converting offline content for online use must be reported because they relate specifically to the website. For example, the cost of preparing PDFs for use online by adding accessibility features is included. But the costs of typesetting, proofing and printing PDFs are excluded.
Also excluded are costs of transactional content, such as self-assessment tax returns that require ‘end user input’ and ‘result in a state change’. The rationale is that if the website did not exist, they would be provided through other channels such as post offices and call centres.
Costs of providing the online interface to the transaction must be reported because this is a cost incurred by the website specifically. For example, for a self-assessment tax return, the costs of creating and hosting the interface must be reported – but not the back-end systems that process the data, nor offline support for users.
Costs versus quality and value
So we’ve worked out all the detailed costs for a site. How do we now measure its overall quality and value?
COI’s mandatory guidelines require every government site to track the number of visits and to carry out online ‘user satisfaction surveys’.
In the 2009/10 survey of department-run, central government websites, all 46 sites kept track of website usage by recording:
- average number of unique users/browsers per month
- total number of page impressions
- total number of visits
- bounce rate
- average visit duration
All site traffic was verified through an independent audit by the non-profit organisation ABCe.
The total number of visits in the year was 568.3 million, with the highest number per month going to direct.gov.uk (8.6 million), followed by NHS Choices (6.2 million) and HM Revenue and Customs (3.6 million).
While volume alone isn’t always a reliable metric, comparing each site’s total annual visits with its costs gives a cost per visit. The three most expensive websites, as ZDNet reported, were uktradeinvest.gov.uk (£11.78 per visit), businesslink.gov.uk (£2.15 per visit) and research4development.info (87p per visit). The cheapest was the Department for Environment, Food and Rural Affairs’ site, defra.gov.uk, at 2p per visit. Thirteen of the websites surveyed cost less than 10p per visit.
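The cost-per-visit figure is simple division: a site’s annual running cost over its annual visits. A sketch with hypothetical figures, since the report’s per-site cost and visit totals are not reproduced here:

```python
def cost_per_visit(annual_cost_gbp, annual_visits):
    """Annual running cost divided by annual visits, in pounds."""
    return annual_cost_gbp / annual_visits

# Hypothetical example: a site costing £500,000 a year with 10 million visits
print(f"£{cost_per_visit(500_000, 10_000_000):.2f} per visit")
```

At these assumed figures the site would cost 5p per visit – comfortably inside the sub-10p band that thirteen of the surveyed sites achieved.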
Value to the user
In a Digigov blog, David Pullinger of the COI emphasises that ‘cost per visit is not related in any way to the cost per use, nor indeed the value to the user’.
He explains: ‘It’s made harder by some public bodies doing lots of syndication, placing information onto other websites where people regularly go, and early adoption of re-usable information and data, so that people can present it in new ways. Both these result in people using the information, but not being recorded as visiting the website to do so.’
Another good reason for Martha Lane Fox’s recommendations on controlling content that I referred to in Part one.
User satisfaction surveys
User satisfaction surveys ensure the volume metric doesn’t stand on its own. These ask users whether they felt they got what they wanted from the website, rated in terms of:
- ease of use
- ease of finding info/services
- editorial quality
- content accuracy
- search tool
COI provides sample surveys containing mandatory core questions, but departments are free to add to them.
A further survey on ‘purpose of visit achieved’ was carried out by 17 sites, with very mixed scores, under:
- got everything I wanted
- got most of what I wanted
- got some of what I wanted
- got none of what I wanted
Output v outcomes
In a blog post for Econsultancy, Alec Cochrane points out a useful measure that the American government recognised: the difference between ‘outputs’ and ‘outcomes’.
Outputs are the direct effects of the website – the things people do on the site. Outcomes are what users go on to achieve with what the website gave them.
Referring to his work measuring the costs of businesslink.gov.uk, Alec showed how you should ideally continue to measure value by looking at what visitors gained from their visit. Has it helped them improve their own businesses, and save time in the process – and can you measure this exactly?
Commenting on the post, David Pullinger of the COI wrote that an ‘output/out-take/outcome model’ would soon be used for media evaluation across government. It would go ‘a long way towards focusing developers and managers’ attention to what really matters’.
Testing with users
Summarising COI’s guidelines in ’149 steps to a better government website’, Steph Gray raised the question: ‘If you do all these things, will you have a good website?’
In reply, Adam Bailin of COI wrote:
‘The answer is, “no”. There’s one test that trumps all of these: test with users. No amount of good management practice or standards compliance monitoring will provide insight or generate site improvements to the extent that usability testing will.
‘We’ve tried to emphasise this in our blog articles, seminars and workshops, and the usability toolkit. We could probably do more to promote user-centred design within the guidance documents themselves.’
And content strategy…
Leave a comment below or go to the next part:
Part three: How content strategy can help reduce costs
Extra ways of making websites and intranets more cost effective
Part one: The story so far
The spectacular rise and fall of UK government websites, 1994-2011