Inequality, mental models, and counterfactuals

09/09/2013

In the recent book Inequality: A New Zealand Crisis, Jonathan Boston has a really useful chapter. It probably should have started the book as a way of laying some intellectual groundwork, instead of being Chapter 5. It raises the same point as Matt Nolan does at TVHE, quoting Amartya Sen, but with more detail. Here’s Boston:

As highlighted in this chapter, there is almost universal acceptance that equality matters. Yet there is no consensus on what kind of equality should be championed.

And here’s Nolan:

 Everyone, especially those who are more extreme in any given “political dimension” cares about equality of something – and the underlying reason why there are trade-offs stems from (as Sen discusses in the book) the heterogeneity of individuals!

Boston tries to cope with the different equality targets in two ways. First, he says that some equalities are more important than others, arguing that there is an inequality of equalities. With this, I believe he steps right back from his initial understanding — that reasonable people may reasonably disagree. Second, he suggests a certain pragmatism, or specific egalitarianism. While this seems attractive, it papers over the real conflict amongst theories of egalitarianism with a poorly defined set of concrete goals.

In doing this, Boston shows how hard it really is to have evidence-based policies. When it comes time to make a decision, we have to take values and preferences into account. A particular set of evidence can be made to suit a range of policies, once you introduce different values. I’ve not read Peter Gluckman’s new report on evidence-based policies (pdf), but the reporting I’ve seen suggests that it is insufficiently attuned to this problem.

There is a second, less well understood problem. When we make judgements about inequality, we are using mental models of social systems to create counterfactuals. For example, if one says, ‘they wouldn’t be poor if they weren’t lazy’, there is a mental model of society that underpins the judgement. That model has a weighting on ‘effort’, as well as weightings on other things like ‘education’, ‘social network’, ‘ethnicity’, ‘gender’, and more. The ‘effort’ weighting is sufficiently large to counteract any negatives from the other factors. We could, through interviews and surveys, estimate the parameters that people apply to those factors.
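To make that concrete, here is a purely illustrative sketch: the factor names, weights, and threshold below are invented for the example, not estimated from any survey.

```python
# Illustrative only: a mental model of 'who ends up poor' written as a
# weighted sum of factors. The names and weights are invented, not
# estimated from real interview or survey data.

def predicts_not_poor(factors, weights, threshold=0.0):
    """Return True if the weighted sum of factors clears the threshold."""
    score = sum(weights[name] * value for name, value in factors.items())
    return score > threshold

# A model in which 'effort' is weighted heavily enough to outweigh
# disadvantages on every other factor.
weights = {
    "effort": 5.0,
    "education": 1.0,
    "social_network": 1.0,
    "ethnicity_penalty": -1.0,
    "gender_penalty": -1.0,
}

# High effort, disadvantaged on everything else (factor values in [0, 1]).
person = {
    "effort": 1.0,
    "education": 0.0,
    "social_network": 0.0,
    "ethnicity_penalty": 1.0,
    "gender_penalty": 1.0,
}

print(predicts_not_poor(person, weights))  # this model says effort wins
```

Interviews and surveys would, in effect, be estimating those weights for each person — and different people would carry very different parameter sets.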

We all have these models. They are all wrong. I say that with conviction because ‘all models are wrong.’ They are always partial — they have missing variables — and the parameters are estimated from a sample of observations rather than the population. So, in my interpretation of my experience, it may be that ‘effort’ is sufficient to make a person not-poor. That doesn’t make it so, either for my experience (which suffers from observer error) or for the wider world.

In economic analysis, counterfactuals are hard to construct. You have to decide which factors are important and how important and over what time. If I’m being cynical, I might say that the counterfactual is the most important part of any cost-benefit analysis, and it is literally something we just make up. With data and evidence, mind you. But we make it up just the same.

How much harder is it, then, to understand the counterfactuals that people create from poorly-specified mental models of complex social systems?

More evidence would help, so it’s good to see Gluckman encouraging the government to find and use more. The findings will rarely be conclusive, however, as evidenced by Boston’s discussion of equality.


The dreaded spreadsheet error

22/04/2013

I posted flippantly last week about the Reinhart and Rogoff (R&R) re-assessment by Herndon, Ash, and Pollin. There’s been more bytes spilled since then. The Economist says it’s not such a big deal, because  ‘Ms Reinhart and Mr Rogoff acknowledge in their academic work that this conundrum “has not been fully resolved”, but have sometimes been less careful in media articles.’ Paul Krugman counters that, yes, it is a big deal and provides some links. Matt Nolan at TVHE provides more links and more perspective:

it has been used as an inconsistent marketing tool by people for selling their own unrelated ideological policies….

I’m going to be careful here. Media interviews are not the same as academic writing. Keeping my thoughts straight while listening to someone else’s questions, controlling the random thoughts that spring to mind (ask my poor students — I don’t censor digressions quite the same way in lectures), all while not babbling — hey, it’s fun and energising but only approximately accurate. So, I’m not going to pile on.

I’m fascinated that it was a spreadsheet error, at least in part. Most economists I know proudly and loudly avoid Excel for anything analytical. Grunty programmes like Stata, sure, and nerdy open-source stuff like R (thanks, Auckland!), absolutely. I mean, these are guys (yes, guys) who sneer at SPSS. To find out that R&R were relying on Excel is like, I don’t know, seeing a celebrity chef eating at Burger King.

There’s a lesson for consultants here. Excel is the sort of programme that gave rise to this:

To err is human, but to really foul things up requires a computer.

Nevertheless, I like Excel a lot. Despite all the stupid and paranoid security controls that Microsoft has added, it is still a portable way to give clients the analytical details of what I’ve done. It also allows me to build dynamic tools to help clients tweak the analysis for their own questions. And, I can show them exactly which number is multiplied by which other number, and then transform it all into pretty pictures clearly and transparently. Throw in some macros and buttons, and it’s really powerful.

The best advice I’ve heard about building those sorts of files is to treat them like programming tasks. You are essentially writing a new bit of software, and software development has established protocols for version control and code review — that’s a good place to pick up tips on sound design processes.

It’s the best advice, but I’ve generally ignored it (like a lot of good advice). It’s just too hard. So, let me offer my own advice:

  • do it differently — there are always multiple ways to make calculations. I like to make calculations two different ways, and then check whether they have the same values (‘=A3=B3’ will give a TRUE or FALSE; or, use an IF statement)
  • back-of-the-envelope — just the other day, we were looking at a spreadsheet model (again, portability is important), and we did some back-of-the-envelope calculations to check whether they were sensible. It’s similar to the idea of an elevator pitch — can I explain in simple language and logic why we get these results?
  • have someone check — give it to someone else. Let them see everything, get them to check everything. Make sure they have the chops, too, to do it right. Now, that can be expensive, several hours of work. So ask yourself: do I feel lucky today? Is it worth it for the job or the client? I mean, if I’m going to recommend unemployment for a few million people, I want to make sure my cell references are right. But not all clients warrant that level of scrutiny.
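Outside of Excel, the ‘do it differently’ check looks like this (an illustrative sketch; the prices, quantities, and tax rate are invented for the example):

```python
# Illustrative sketch of the 'do it differently' check: compute the same
# quantity two independent ways and flag any disagreement, the same idea
# as '=A3=B3' in an Excel cell. All figures here are invented.

prices = [12.50, 8.00, 15.25]
quantities = [100, 250, 40]
gst = 0.15

# Method 1: total the line items, then apply GST to the total.
revenue_a = sum(p * q for p, q in zip(prices, quantities)) * (1 + gst)

# Method 2: apply GST to each line item, then total them.
revenue_b = sum(p * q * (1 + gst) for p, q in zip(prices, quantities))

# The cross-check: fail loudly instead of silently trusting one path.
if abs(revenue_a - revenue_b) > 1e-6:
    raise ValueError(f"cross-check failed: {revenue_a} != {revenue_b}")

print(round(revenue_a, 2))
```

The point is not the arithmetic — it is that the two routes are computed independently, so a mistake in one path shows up as a disagreement rather than a silently wrong answer.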

After all that, though, mistakes will happen. The best thing to do is be a mensch — I’m not sure what the New Zild translation is. Own up, walk the client through the impacts, and do as much work as you need to do with the client to restore some credibility.

And then, add it to your bag of tricks. You’ve just learned an expensive lesson.

The fragility of superannuation

11/07/2012

Matt Nolan over at TVHE picked up a theme dear to my heart: blood-sucking Baby Boomers and their selfish demands for everything. Oops, wait, did I say that out loud? It’s an overstatement, sure, but look at it from a Gen X perspective. Boomers were provided lots of stuff by prior generations, and then decided that those same things would just make the kids soft. They benefited from well-funded infrastructure and education (and healthcare in NZ), and then pulled the ladder up after them. And yes, Millennials have it even worse — go ask Wellygnome.

Nolan suggests that because superannuation will become unaffordable at current levels, it needs reforming. We can do this the easy way or the hard way:

Say that, when they are retired it will be the next generation in charge.  The next generation won’t be willing to increase taxes, and so will cut them off – forcing them to leech off their children or live an impoverished existence.  If the younger generations show this degree of bloody-mindedness now then older generations will definitely cut back on consumption, and start saving for their retirement.

They might even be willing to “make a deal” regarding the retirement age.

But not everyone believes that the younger taxpayers can make the threat credible. The argument is that the Boomers are just too numerous. Janet comments:

It’s an unrealistic threat – no-one will believe it. It’ll get translated as ‘starting in 2035 the rug will be pulled out for Gen X, because there’s less of them’. At least that’s how I’d read it, given there’s so many baby boomers who all have and exercise a vote.

I wondered whether this is true, so I went to Statistics NZ. Here are the percentages in each age group, using a middle demographic projection (series 4):

Projected population by age group (percent of total), 1991–2061 (2009-base)
Year 0–14 15–39 40–64 65+
2011 20 34 32 13
2016 20 33 32 15
2021 19 33 31 17
2026 18 32 30 19
2031 18 32 29 21
2036 17 31 29 23
2041 17 30 29 23
2046 17 30 29 24
2051 17 29 30 24
2056 17 29 29 25
2061 16 29 29 26

In no year does the population of those 65+ reach more than 50% of the adult population. They are never the majority. They always depend on younger people to maintain their superannuation.

One objection is that older people have higher rates of voting. Yep, we can correct for that. A Ministry of Social Development report tells us that

People aged 65 years and over had the highest reported turnout (94 percent), followed by people aged 45–64 years (89 percent) and those aged 25–44 years (77 percent). Fewer than half of 15–24 year olds (46 percent) said they had voted….

Applying those turnout percentages and some adjustments for voting age, the voters in 2061 will be roughly: 18 to 39 — 25%, 40 to 64 — 39%, and 65+ — 37% (totalling 101% due to rounding). Retirees are still not a majority of the electorate.
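One way to reproduce a calculation like this — noting that the exact adjustments are not spelled out above, so the blending of the mismatched MSD age bands below is my own assumption and the results differ slightly:

```python
# Rough reproduction of the turnout-weighted voter shares for 2061.
# Population shares come from the projection table above; turnout rates
# are the MSD figures, blended across age bands by years of age (my own
# assumption about how to map the mismatched bands).

# 2061 projected population shares (percent of total population).
pop = {"15-39": 29, "40-64": 29, "65+": 26}

# Drop ages 15-17 from the 15-39 band (3 of its 25 years of age).
pop_voting_age = {
    "18-39": pop["15-39"] * 22 / 25,
    "40-64": pop["40-64"],
    "65+": pop["65+"],
}

# MSD turnout: 15-24: 46%, 25-44: 77%, 45-64: 89%, 65+: 94%.
# Blend by years of age to fit the projection's bands.
turnout = {
    "18-39": (7 * 0.46 + 15 * 0.77) / 22,   # ages 18-24 and 25-39
    "40-64": (5 * 0.77 + 20 * 0.89) / 25,   # ages 40-44 and 45-64
    "65+": 0.94,
}

voters = {g: pop_voting_age[g] * turnout[g] for g in pop_voting_age}
total = sum(voters.values())
shares = {g: round(100 * v / total) for g, v in voters.items()}
print(shares)  # close to the 25 / 39 / 37 split quoted in the post
```

However the bands are blended, the 65+ share of the electorate stays well under half.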

Let’s try a comparison. One line is the percentage of the population 65+, which is a proxy for the cost of superannuation. We could just as well use superannuation as a percentage of GDP or government spending. A second line shows the percentage of voters likely to vote as a bloc on superannuation. Let’s assume that everyone born before 1960 votes as a bloc (Boomers look after each other), and that anyone 60+ will vote to preserve the superannuation status quo (because they will soon be retired).

A few comments:

  • The voting bloc I’ve constructed is never a majority. Retirees, nearly-retirees, and Boomers depend on younger voters to maintain the desired level of superannuation. If this does turn into an intergenerational conflict, the superannuitants will lose.
  • The weakest time for Boomers’ superannuation, and the greatest potential for conflict, comes in about ten years, when the voting bloc is shrinking but the cost is increasing.
  • After that point, the voting bloc grows faster than the retired population and the cost of superannuation. This happens for two reasons. First, those 60+ are more likely to vote. Second, the bloc shifts from being Boomers to being retirees and nearly-retirees, so its proportion of the population stops falling.

I expect this issue to become more important and more divisive over the next ten years. If Boomers are going to survive with their retirements mostly intact, they need to make nice with some of the rest of us.

You are currently browsing entries tagged with Matt Nolan at Groping towards Bethlehem.