Open Government Briefing: Transparency in public finance – the role of good data

Ben Wray

Common Weal has published a new briefing on an Open Government approach to public finance data to improve its usability and transparency for citizens. A PDF version of this briefing can be accessed here

COMMON WEAL works with a broad range of statistical information, of which financial information is one of the most significant categories. But we also work with many other groups and organisations which do not have our capacity to deal with numbers, and part of this briefing is drawn from our experience of working with them. We have looked at some of the social impacts of information and data in other work, and that issue will be picked up here too. Finally, we have been involved in public opinion research which has thrown up a number of relevant findings about how data is perceived by the public, and these will also inform what follows.


There are two primary issues with data for the public and for transparency – trust and usability. The former of these is crucial. There has been an enormous amount of relentless back-and-forth over public data in Scotland as a result of the two referendums we’ve recently experienced. This has left the public feeling somewhat paralysed by public sector data which has been used by competing sides to prove apparently opposite things. The result is that the public is tuning out from this data altogether. Given some of the misuse of data in recent years (by virtually all sides in public debate) it is hard to see this as an entirely irrational response. The public therefore requires more trust in data and a better ability to derive ‘meaning’ from it (more on ‘meaning’ below).

One possible response to this is an independent, national statistics agency, as the Office for National Statistics (ONS) is at a UK level. There are genuine reasons for scepticism about whether 'a single source' of statistics can ever really be independent, and there have been plenty of legitimate criticisms of, for example, the Office for Budget Responsibility. But a body broadly seen as independent of the fluctuating interests of any given government could help, particularly if it operated a 'kite mark' giving certain kinds of data a clear stamp of approval. Indeed, such an agency could give secondary 'kite mark' approval to data produced by other sources such as academia or think tanks. That would mean that Scotland could grow a substantial body of meaningful data which, within given caveats, could be trusted by the public.


There is one very important proviso: public statistics must be given much more clarity through description and framing. What the public is looking for from data is not 'numbers' but 'meaning', and a sharp clarification of what 'meaning' might reasonably be derived from any body of data would be very helpful indeed.

For example, if each body of data were presented with three short statements easily understandable by the public, misuse of data could be substantially reduced. The first would answer 'how was this data produced?': a simple description of methodology would help the public understand how the data was put together and therefore its limitations. Stating 'this was produced from a survey of businesses with a 32 per cent response rate, leading to a total of 116 businesses participating' makes clear where the limitations lie. The second would be a simple statement of 'what it is reasonable to conclude from this data', explaining what would generally be accepted as a legitimate conclusion to draw from it, and the third a simple statement of 'what it is not reasonable to conclude'.
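The three-statement scheme described above is essentially a small metadata record attached to each dataset. A minimal sketch of what such a record might look like, using the survey example from the text (the structure and field names are illustrative assumptions, not any agency's actual schema):

```python
from dataclasses import dataclass

@dataclass
class DatasetStatements:
    """Three plain-language statements to accompany a published dataset."""
    how_produced: str                 # simple description of methodology
    reasonable_to_conclude: str       # legitimate conclusions
    not_reasonable_to_conclude: str   # conclusions the data cannot support

# Illustrative record built from the business-survey example in the text;
# the two conclusion statements are invented for illustration.
business_survey = DatasetStatements(
    how_produced=("Produced from a survey of businesses with a 32 per cent "
                  "response rate, leading to a total of 116 businesses "
                  "participating."),
    reasonable_to_conclude="Broad trends among the responding businesses.",
    not_reasonable_to_conclude=("Precise sector-wide totals, given the "
                                "limited, self-selected sample."),
)

print(business_survey.how_produced)
```

Publishing the record alongside the figures would let any reuse of the data carry its limitations with it.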

To give a specific example, had the Government Expenditure and Revenue Scotland (GERS) data been presented with statements such as 'this data is based on a complex combination of data sources, some based on very accurate accounting, others based on survey work, and some of that extrapolated from small sample sizes in UK surveys' and 'it is reasonable to infer from this data, within its given limitations, how Scotland performs in public finances as a region of the UK, but it is not reasonable to infer that this is the same as the financial position of Scotland as a notionally independent country', this would have helped to prevent the abuse of this data perpetrated by both sides in the Scottish constitutional debate.

Making numbers have meaning for people should be the primary public-facing aim of public data. But this should also carry an additional element: regular work should be done to systematically identify what information the public wants from public data. By identifying what meaning people are seeking to derive, it becomes easier to ensure that the right data is being prepared and presented. For example, perhaps people want to compare their economic position with the population of Scotland more widely (a task that is just about possible from existing data sources but requires an inordinate amount of work), or to benchmark public spending against other countries, or to see how progress has been made on a particular social issue through time sequences of data. The assumption at the moment is that what is counted is what statisticians and those working in public policy deem important. Given that slant, it is again unsurprising that the public expresses frustration.


There should be a single portal for Scottish public sector data. This would clearly be aided by a single statistics agency. However, practice varies very widely in how data is presented to the public – and clearly, for openness, the better and easier the presentation, the more likely people are to feel the data is usable.

An online front end to help people interrogate data is crucial. It will have been achieved when a citizen with a genuine interest can go to the page, find something out, and navigate an interface which gets them to the data they want without specialist knowledge. And this should not only mean 'find pre-prepared data' but also interrogating that data through comparison. For example, if someone wishes to know how the level of public expenditure on a given public service compares between rural and urban areas, they should be able to get to that data through a series of simple choices: click 'expenditure on public services', select the service required, click compare, in the compare box click 'cost centres' and then specify 'those with a postcode identified as rural' (or whatever the process would require). That makes data genuinely open. To see a case study example of how this is done well, look at
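Behind the click-through interface described above sits a simple filter-and-aggregate query. A hedged sketch of how a portal back end might resolve the rural/urban expenditure comparison, using invented records and an invented rural flag purely for illustration (spend figures in £000s):

```python
# Invented example records; a real portal would draw these from the
# published cost-centre data, with rurality derived from postcode.
spending = [
    {"service": "libraries", "cost_centre": "A", "rural": True,  "spend": 12},
    {"service": "libraries", "cost_centre": "B", "rural": False, "spend": 34},
    {"service": "libraries", "cost_centre": "C", "rural": True,  "spend": 9},
]

def compare(service, records):
    """Total spend on a service, split by rural vs urban cost centres."""
    rural = sum(r["spend"] for r in records
                if r["service"] == service and r["rural"])
    urban = sum(r["spend"] for r in records
                if r["service"] == service and not r["rural"])
    return {"rural": rural, "urban": urban}

print(compare("libraries", spending))  # {'rural': 21, 'urban': 34}
```

The point is that every click in the interface corresponds to one filter or grouping step; no specialist knowledge is needed to compose them.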

Consistency, quality and comprehensiveness

As a substantial user of public data, Common Weal regularly comes across problems of comprehensiveness and consistency. Put very simply, Scotland often lacks crucial data because it is collected piecemeal from across government rather than being coordinated by a central body. While there are strengths and weaknesses to each approach, the current outcome in Scotland is that if someone somewhere in the public realm doesn't actively decide to count something, it doesn't get counted. Sometimes, just as problematically, what has been counted may appear to use the same base information and therefore be directly comparable, but has in fact been collected by two different parts of government using entirely different methodologies; the two bits of data are therefore not comparable and cannot be used together. To say this is limiting and problematic is probably an understatement.

If there were a national statistics agency, it should publish full methodologies to make it easy to understand which data is comparable and which is not. And it should have the statutory power to compel the release of data so it is able to fill any gaps it feels should be filled. Ensuring consistent methodology for similar datasets would make public information much more usable.

Finally, the current practice whereby some data is confused and of low quality while other data isn't published at all should come to an end. If there were a statistics agency, it should be required to 'vouch for' the quality and consistency of data and use its powers (statutory if necessary) to improve that quality. At the moment it is simply too easy for a department of a public body or of government to cobble together data which isn't 'wrong' but isn't of a quality to make it 'right'. It is also too easy simply to dump data in unusable ways or to fail to publish it altogether.

End obstructionism

Almost everyone involved with public data will have had some experience of 'obstructionism' in gaining access to it. Whether it is failure to publish, denial of Freedom of Information requests, use of 'commercial confidentiality' clauses, refusal to disaggregate data or provide methodologies, or claims that 'personal data' can be inferred from larger data sources, there has simply been far too much 'that's for us to know and you to not know' in practice.

Commercial confidentiality during a tendering process is perfectly reasonable. However, once a contract has been issued there is little legitimate reason for people not to know how public money is being spent. If there are mild commercial disadvantages to a private company from clear public interest in knowing, the company must suffer the disadvantage. Public contracts are lucrative and incredibly safe (this is a client guaranteed to pay its bills) and the price for this must be more transparency. The public is not wrong to suspect that the secrecy only benefits wealthy individuals.

The rest of these issues might best be addressed by a statistics agency able to compel data. The Scottish Government must improve its track record on FoI and take steps to strengthen the legislation. The publicly available lobbying register should also include financial data for transparency purposes.

Economic impact data

There is no more grievous misuse of publicly available data than the 'economic impact' report. As often as not these are simply lobbying documents, produced by one side of a debate or the other with a methodology designed for a predetermined outcome. While this kind of data may be legitimate as part of a campaigning strategy, it has no place being presented as meaningful or trustworthy. The Scottish Government has a somewhat alarming recent track record of using words in the form 'according to lobby group X, this policy will create Y new jobs and Z amount of economic growth'. This appears to lend government credence to claims that should have no place in official public discourse, and this kind of practice should stop immediately.

The only legitimate use of impact assessments by government should be those either produced using or checked against the Treasury ‘Green Book’ methodology. This is a rigorous assessment of what is and is not a legitimate claim (for example, if you claim ‘X jobs created’ you must then calculate ‘Y jobs lost’ using the same methodology and subtract the latter from the former). There is a very big difference between government wanting something to be true and it actually being true; the job of data is to be on the public’s side in helping to identify what is closest to being ‘actually true’.
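The netting principle described above can be shown in miniature. A minimal sketch of the created-minus-displaced calculation, with figures invented purely for illustration (this is the general principle, not the Green Book's full appraisal method):

```python
def net_jobs(jobs_created, jobs_displaced):
    """Net employment effect: jobs claimed as created, offset by jobs
    lost elsewhere, both calculated on the same methodology."""
    return jobs_created - jobs_displaced

# A hypothetical lobby claim of 500 new jobs; if 320 jobs would be
# displaced elsewhere on the same basis, the legitimate claim nets to 180.
print(net_jobs(500, 320))  # 180
```

The headline 'X jobs created' figure is only the first term; it is the subtraction that separates an honest assessment from a lobbying document.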

Who benefits?

One of the biggest issues of trust and data relates to 'who benefits?'. There is an enormous amount of public expenditure, but it is not always easy to identify (a) whose pockets it ends up in and (b) whose interests the expenditure serves. This should be addressed by attaching to every piece of legislation or spending decision (over a certain threshold) a statement of who benefits (directly and indirectly) and by how much. Which private companies receive direct payment for delivery? This can then be checked against the lobbying register to identify whether they've sought to influence the decision that led to them receiving the money. Which interest groups benefit and by how much? Every sectoral lobby seeks to persuade us that what is good for them is good for everyone – so this should be publicly counted. How good is it for you in hard money terms? How good is it for everyone else in hard money terms? If government is making public-good decisions in an open and honest way, it should not only have no objection to this information but should positively welcome it.
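The cross-check proposed above – a 'who benefits' statement matched against the lobbying register – is mechanically very simple. An illustrative sketch, in which every name, field, and figure is invented:

```python
# Hypothetical 'who benefits' statement attached to a spending decision.
decision = {
    "title": "Road maintenance contract",        # invented example
    "value_gbp": 2_000_000,
    "direct_beneficiaries": ["ExampleCo Ltd"],
    "indirect_beneficiaries": ["haulage sector"],
}

# Hypothetical lobbying register: organisations recorded as having
# lobbied on decisions of this kind.
lobbying_register = {"ExampleCo Ltd", "OtherFirm plc"}

# Flag any direct beneficiary that also appears on the register.
flagged = [b for b in decision["direct_beneficiaries"]
           if b in lobbying_register]
print(flagged)  # ['ExampleCo Ltd']
```

The hard part is not the matching but requiring the beneficiary statement to exist in the first place; once both datasets are published, the check is a one-line intersection.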