“Board papers are a little bit like a bikini; what they reveal is suggestive, but what they conceal is vital.”

With apologies to Irving R Levine


In theory, the difficult bit about being a NED is supposed to be weighing the numerous, important and complex decisions that have to be taken, having first read the relevant, informative Board papers.

But in practice, figuring out what the Board papers are trying to tell you is frequently half the problem. 

Although Board and Committee papers have improved in recent years, familiar challenges remain – too lengthy and you are swamped; too brief and you have to read between the lines to see what’s not there.

And then there is the problem of papers that inadvertently encourage confirmation bias or group-think; there may be deliberate “steering” on the part of the author(s), and so on.

These challenges make it hard for NEDs to interrogate the thinking behind papers, which can create a Corporate Governance issue and, in serious cases in some sectors, lead to a Section 166 report.

Inevitably, there’s a bit of an art in spotting the traps in Board papers, so let’s look at some of the most common, and possible solutions.


Data, in its various forms, is supposed to be unambiguous and represent a form of truth. If only!

Data, data everywhere

It’s not unknown for the Board to be working from one set of data while individual parts of the business work from others (often due to the proliferation of legacy systems); it’s not always obvious that this is the case, or that the data sets may diverge to some degree.

Some organisations create a single database of information across the enterprise, so that everyone is working from the same material, but this sort of enterprise-wide solution is a major undertaking.

In the absence of this, politely asking the CoSec or CFO to attest that there are no inconsistencies in information across the enterprise can lead to an affirmation or, in other cases, to an embarrassed response that could develop into an enlightening line of enquiry.

The perfect fit

Sometimes the data in reports seems to fit too precisely to whatever corporate narrative is being presented. 

For example, financials that show a perkily consistent improvement, month-on-month, year-on-year. Or tiny customer complaint numbers; or high production yield figures that always flat-line at a satisfactory level; or employee feedback that shows a remarkably contented workforce. It’s easy to see what we want to believe (also known as confirmation bias).

The solution here – and it’s useful across a range of issues – is to don the most sceptical state of mind that you can (as opposed to an enquiring frame of mind) when looking at rosy data, and interrogate those who presented the data, including asking to see the raw information. You don’t have to do this very often before the message gets through that you do not simply accept good news at face value.

Every number tells a story (or two)

Confirmation bias can infect summarised numbers too, which may be presented with an explanatory narrative that tells an encouraging story but is actually inconsistent with the picture the base numbers show, through over-generous interpretation.

And of course, these raw numbers may be missing from the reports sent to NEDs, or inaccessible by their sheer volume (a “snow job”). 

If there’s any possibility that explanatory narratives and their data may be divergent, then it can be helpful to insist the data be placed in an Appendix (or made otherwise accessible). 

If NEDs occasionally review the underlying numbers with that highly sceptical mindset and, as a result, are able to make the numbers tell a different story from the one reported, then you know that confirmation bias (or worse) is at work.


We all know the trap of steering by looking in the rear-view mirror, but forward-looking metrics can sometimes obscure, rather than illuminate, the route ahead.

Imagine you are reading a Board or Committee paper, covering progress on a significant, but slightly nebulous project. Launching a new marketing campaign; developing a new product line – that sort of thing.

How do you know whether the project is genuinely on track, and whether management’s proposed next actions are the right ones, especially if it’s one of those initiatives with a “hockey stick” payoff some way into the future?

Of course, metrics – both current and projected – should help, but you may not have all the data you feel you require; or the pieces of information given may seem disconnected from one another, each throwing light on a different aspect of the project, but with no overall illumination.

Crucially, there may be nothing to show the link between management actions, metrics, and project outcomes, in a cause-and-effect sense.

For example, with a new advertising campaign, you might be given material that covers: opportunities to see; focus group research; digital engagement; spontaneous recall; likelihood to consider purchase; inbound contacts via digital channels, branches and call centres; and the increase in products purchased over time.

But, if the links that managers assume exist between their actions, these metrics, and the project outcomes are not explained; if there’s an absence of interim targets; if there’s limited historic data to help gauge the accuracy of management’s current estimates; it’s just not possible to take informed decisions on the matter.

To exaggerate, but slightly: if spontaneous recall is up X percentage points; if digital engagement is up Y percentage points, how much money do you spend on the next phase of the campaign? It’s impossible to answer properly, without more data and insight into management’s thinking.

In the end, you may have to cross your fingers and trust the experience and expertise of the executive, but that’s hardly an auspicious basis on which to be spending millions, is it?

So before the next, nebulous project starts, maybe it warrants a more detailed discussion with the Executive, to gain a shared understanding of cause and effect links, of data to be used to measure progress, of the sorts of interim targets that should be considered and what the areas of ignorance and uncertainty are.

And if the Executive cannot provide the level of detail you are looking for, that’s a whole different conversation in itself.


The use of Red / Amber / Green to flag progress is popular in Board and Committee reports. It’s simple and intuitive. 

However, there are three traps to be wary of, and one means of making the traffic lights system more informative.

Trap one is that there may be confusion over whether the traffic lights are indicating the state of affairs at the time reported (midway through a project, for example), or are signalling what the end state might be.

So a project that’s starting to go off the rails half-way through could still be reported as Green by the author(s) – not Red or Amber – if they are convinced they will bring the project back on track, in the end.

The scope for confusion is obvious, so in order to avoid errors in reporting, make clarity on this point a part of your reporting guidelines.

The next trap is also one of interpretation. The use of R / A / G appears to be universally understood. But what’s Amber to me might be Red to you, and my Green might be your Amber, so unless you have definitions of the triggers for each colour, this useful signalling mechanism might actually be leading readers astray.

Finally, there’s an aspect of human nature, which means that when faced with a choice between three alternatives, we often plump for the nice, safe, middle one. In part that explains why so many Board and Committee R / A / G reports show Amber when reporting on status; and often the status sticks there month in, month out.

That’s not just uninformative; worse, the apparent stability might mask underlying changes until, too late, matters start flashing Red.

I suggest that in order to avoid these problems you split the Amber by adopting the Red / Red-Amber / Green-Amber / Green format.

This has the effect of forcing matters out of the nice, safe Amber middle ground. It makes the reporting more descriptive, as executives have to think harder about the status of what’s being reported upon, and gives earlier warning of any movement in status, which might otherwise be masked in a warm, Amber glow.


Sometimes reliance on the nice, safe, middle choice spreads beyond RAG reporting.

The best Board papers offer the reader a choice of potential courses of action (e.g. should we close that plant, mothball it, or take on subcontracted production?); few challenges have only a single correct solution.

It’s not unknown for authors with a pet solution to a challenge to bookend it with two alternatives they believe to be unpalatable, knowing that human nature makes the middle choice the most likely to be accepted.

I think the trick here is not to be tempted to simply focus on the middle option, but to interrogate the others to see how realistic they are, and to ask what other alternatives were considered and discarded, and why.


The streamlining of packs through templates, exhortation, self-censorship or editorial decision can lead to the omission of significant outliers and important counter-narrative data and opinion.

The danger is that group-think can creep in, as a result. If a particular matter appears to have been green-lighted through the organisation’s entire reporting chain; if there seems to be no data that gives a counter narrative, then it’s “obvious” that all is going swimmingly. 

In which case it’s easy for the NEDs reading the papers to go with the flow. Which could be a terrible mistake.

And annoyingly, some measures meant to improve Board and Committee packs can inadvertently have the opposite effect. The popular “reporting by exception only” can create a contextual void for readers not as steeped in the minutiae of the business as the Executive, and thus significant, unreported secular trends may be missed.

So it may be worth reviewing your organisation’s reporting guidelines, to check that these are not inadvertently contributing to the problem, by causing authors to self-censor material to fit templates and document length restrictions. 

And NEDs should ask for counter-narrative information to be given for their consideration. If there truly is none, then either the matter is so simple the Board should not be dealing with it, or there’s a potential problem somewhere inside the business (people too afraid to speak out, for example).


One of the quickest ways to be inadvertently misled is the use of vague terms in Board papers, instead of precise terminology.

Technically known as ‘the curse of knowledge’, the difficulty arises because the writer’s knowledge of a term’s meaning may differ from the reader’s interpretation, leading to misunderstanding. 

To take one recent example from outside the Boardroom: you may remember that the former International Development Secretary (now Home Secretary), Priti Patel, was recalled to the UK and resigned, having had unauthorised meetings with Israeli politicians whilst on a family holiday there.

She is reported to have said she attended only a “few” meetings. 

It transpired there were 12 meetings in as many days. Whether that’s a “few” is open to interpretation; just stating the numbers leaves no room for doubt.

Here are some examples of commonplace vagueness, where precision could be used instead:

  • A number of
  • Broadly
  • About
  • A few
  • Many
  • A variety
  • Significant
  • Insignificant

In each of these cases, just use the actual figure.

So in your Board Paper Guidelines, it may be worth insisting on precise terminology, as just one small step in the drive to improve the drafting of Board Papers.


How do you know when something is too short to give you a full picture? 

It’s when you find yourself reading between the lines as much as reading the text, or when your experience tells you something’s not quite right – the picture painted is too bland, or the sub-text of the document seems to be ‘nothing to see here, move along.’

Often there will be an absence of hard KPIs (particularly interim KPIs on a lengthy project) and an abundance of imprecise language – e.g. “…the project is broadly on track…”, “…a small number of matters were noted as requiring improvement…”, etc.

In order to flush out problems with slender documents, it’s sometimes helpful to read them with our old friend, that highly sceptical mindset, as that often throws shortcomings into relief. 

And it never hurts to ask yourself: what’s missing from the document? If you were writing it, what else would you put in? How does the document compare with similar papers written elsewhere that you have come across?

Spotting what appears to be absent can give you a helpful line of enquiry…


The biggest trap is reading papers that cover the wrong topics, and / or ignore the truly important issues completely. 

Having a calendar for Board content – what’s to be discussed, when – is sometimes the culprit. 

Whilst often essential in large, complex, regulated businesses (and just plain helpful in others), these calendars can sometimes become immutable, and in a rapidly changing environment – like now – that can lead to a misalignment between the Board’s timetabled topics and the interests of the organisation’s stakeholders, one that may go unnoticed in the short term.

And of course, the volume of fiduciary and regulatory reporting may cause backward-looking papers to become the norm, so those which deal with the growth/value challenge can become thin on the ground, and operational reporting can displace matters of strategy.

For some entities, it could be insightful for NEDs to ask for “unconventional” data, such as Net Promoter Scores from customers / patients / etc.; rankings produced by third parties (e.g. Dr Foster, Glassdoor); and summaries of analyst notes on the sector and of investor calls from competitors.

These can capture information that conventional reporting does not, and can point to issues the Board should perhaps be considering.


It may be worth revisiting your firm’s guidelines, to check they are up-to-date and that the deadlines shown allow the Secretariat enough time to review and reject material that’s not up to scratch. 

Requiring sub-standard papers to be redone is never popular, but it’s important if a culture of creating quality Board documents is to be cemented, and if NEDs are to produce their highest-quality thinking.

But the best way to avoid traps is for NEDs to stay on their toes, and speak up when they see shortcomings in the documents they are using.

As always, there’s much to do and time is short, so good luck and get cracking.