John Kay on Mervyn King’s “The End of Alchemy”

Having already expressed my doubts about Mervyn King’s ideas on credit in his recent book, “The End of Alchemy”, I’m moved to comment on John Kay’s very positive review of it yesterday for the FT,

The enduring certainty of radical uncertainty

where for him the most important message was the stress on radical uncertainty. He describes his and Mervyn King’s parallel intellectual careers where they were together at the start, receiving and passing on what now seems an absurd idea of Milton Friedman’s – that there is no “sharp distinction between risk, as referring to events subject to a known or knowable probability distribution, and uncertainty, as referring to events for which it was not possible to specify numerical probabilities.”

Up to a point, I don’t think Friedman is being absurd, although that point is reached only a few lines further on in the text John Kay refers to (Price Theory, p. 282), of which more later. Of course there is a distinction to be drawn between these types of risk, but I don’t think it’s that sharp, because almost all financial risks have a degree of uncertainty in the sense of the distribution not being known. Conjuring up the idea of radical uncertainty, however, suggests a qualitative boundary beyond which quantification is pointless – and, given the pervasiveness of uncertainty, that most financial risk lies beyond it.  Perhaps mundanely, I’d suggest

  • there is no such boundary,
  • that pretty well all financial risks have a degree of uncertainty in this sense, but
  • that in general this can be reduced with some risk analysis (preferable not just collecting data about past performance), and
  • a subjective estimate of the remaining uncertainty is useful for a financial risk taker.

Where I’d say Friedman crosses the border to absurdity is when he writes

Sometimes people will agree – we then may designate the probabilities as “objective”; sometimes they will not – we may then designate the probabilities as “subjective”.

In this, I think Friedman is claiming that by observing how such risk takers act on their subjective estimates – generating a market price for the risk if there are enough of them – it will be possible to calculate the probability implied by that price, and that this probability is “objective”.  The scare quotes are still there, but so are they round “subjective”, for no reason I can see.  Combine this with the dogma that markets are the best possible calculating device – rather than just a signal worth having on the radar – and the intellectual path is open to the justly derided excesses of financial quantdom.
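To make concrete what a market-implied, “objective” probability would be: the price of a contract paying a fixed amount if an event occurs can be read as a probability, once enough risk takers have traded it. A minimal sketch (in Python, purely illustrative – it ignores discounting and risk premia, which is part of why the “objective” label is doubtful):

```python
def implied_probability(price, payout=1.0):
    """Probability implied by the market price of a contract that pays
    `payout` if the event occurs and nothing otherwise.
    Ignores discounting and any risk premium, so this is at best a
    first approximation to an 'objective' probability."""
    return price / payout

# A contract paying 1 that trades at 0.25 implies a 25% probability.
print(implied_probability(0.25))  # 0.25
```

The gap between this number and any individual trader’s subjective estimate is exactly where the scare quotes belong.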

The example of Gillian Tett’s insights into CDO traders, gained from her training as a social anthropologist, is now well known, and in retrospect we can see that these would have kept our estimates of the uncertainty – and expected value – of such products more objective – no scare quotes needed here.  But so would insights from other disciplines – an understanding of IT systems security, for example – and maybe even some of what still seems like financial quantdom will have a role in understanding uncertainty and reducing it.

Mervyn King on credit

John Plender gives a warm review to Mervyn King’s recent book, The End of Alchemy: Money, Banking and the Future of the Global Economy

Uncertainty principles: ‘The End of Alchemy’, by Mervyn King


It is rare to encounter a book on economics quite as intellectually exhilarating as The End of Alchemy

I’m not convinced, at least by the review.

The aim of the book is said to be to “put an end to the alchemy that has made financial crises a permanent feature of the landscape and allowed money — a public good — to become the by product of credit creation by private-sector banks”.

This is odd, because there are numerous public goods which are created to a significant extent as by-products of private-sector activities.  A typical pattern is that private sector agents engage in transactions where both parties benefit – which, if you insist, can be expressed as both seeing an increase in their utility – and that by such transactions becoming routine, helped by the emergence of institutions to support them, a public good is created.  Very often a public sector agency is needed to support such goods – judges, for example, to support a system for enforcing and interpreting contracts, paid as indirectly as possible – by general taxation – by the parties to the transactions.

It’s how money and banking systems developed, and while in a sense they are inherently unstable, being based on a degree of trust which may suddenly evaporate, on balance I would expect a central banker to see private-sector credit creation as a strength of a banking system – something to be supported impartially for the public good it creates, not something to be eliminated.

The starting point for the argument is that we live in what economists now call “radical uncertainty”, where it is not always possible to compute the expected utility of any action, and the probabilities of all future events cannot be identified, so no set of economists’ equations can describe people’s attempts to cope with that uncertainty.

I wonder if it was ever possible to calculate the probabilities of all future events, which raises some doubts about any such set of equations.  So what have bankers and monetary economists been doing all along?  Doing the best they can, and if we accept that banking systems have on balance been beneficial, then the policy focus should be on helping them do better in serving the public. This would suggest greater transparency, more public provision of data which will help bankers and their customers identify individual risk, and the prevention of structures which divorce individual risk from public risk (the problem of “too big to fail”).

Instead, the concept of radical, or Knightian, uncertainty asserts that some risks can never be calculated, and so are qualitatively different from the sorts of risk which it might be possible to understand better.  I can accept that a theoretical distinction can be made between some risks which it is possible to quantify – the result of tossing a coin, for example – and others which are not, such as whether a potential client is actually an extraordinarily accomplished conman, or whether a technology which depends on cutting-edge science – quantum computing, anyone? – is actually going to deliver. Put like that, I’d suggest bankers have always dealt with radical uncertainty, an impression supported when I google to look for examples of such uncertainty.

Explained: Knightian uncertainty

Mervyn King’s response to accepting the world as it is, with such uncomfortable uncertainty to be found not only in the businesses modelled by central banks but in the financial sector itself, is to give central banks the authority to perform various calculations

he offers an elegant refinement of the concept of “narrow banking”, which seeks to ensure that all deposits are covered by safe, liquid assets. In his system, banks would decide how much of their asset base to lodge in advance at the central bank to be available for use as collateral. For each asset, the central bank would calculate a haircut to decide how much to lend against it.

What is more, it would not just apply to the current financial system, but to any other business which emerged performing an equivalent role

The system would apply to all financial intermediaries, including those now outside full banking regulation, to avoid the “boundary” problem whereby depositors evacuate in search of higher yielding outlets.

I struggle to see how this is elegant – unless the word carries with it the suggestion of intensely learned or technical discussion, with subtleties beyond the appreciation of those not trained as economists.  It doesn’t look that different, to me, from the system of risk weighting for bank capital.
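The arithmetic of the scheme, as described in the review, seems simple enough to sketch – which is part of why “elegant” seems a stretch. A toy illustration (Python; the asset names and haircut figures are invented for the example, not taken from the book):

```python
def lending_capacity(assets, haircuts):
    """Maximum central-bank lending against pre-positioned collateral:
    each asset contributes its value reduced by its haircut.
    A sketch of the scheme as described in the review, not of any
    actual central bank methodology."""
    return sum(value * (1.0 - haircuts[name]) for name, value in assets.items())

# Hypothetical bank balance sheet: 100 of gilts (2% haircut),
# 200 of mortgage assets (30% haircut).
assets = {"gilts": 100.0, "mortgages": 200.0}
haircuts = {"gilts": 0.02, "mortgages": 0.30}
print(lending_capacity(assets, haircuts))  # 98 + 140 = 238.0
```

Replace the haircuts with risk weights and the lending cap with a capital requirement, and the resemblance to the existing system is hard to miss.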

What is rather clearer, however, is that, in a world just admitted to be characterised by radical uncertainty, the central banks will somehow be in a class apart from everyone else in the economy, able to get their calculations right.

It is fairly clear this is nonsense.

Does @TalkTalkCare?

This is a final attempt to get TalkTalk to contact me, and provide my father with the broadband connection he is paying them for.

I struggle to find the words to describe how frustrating it is to talk to TalkTalk, but their customers seem to be locked into a Catch-22 – until the connection is working, you can only communicate by phone – they will not use email.  So, if I try to connect from my father’s home, I get no further than this:


Broadband service not enabled

Contact provider for activation date

(Note to the right the wireless router, with TalkTalk branding, with all lights working; there is nothing wrong with this router, or connections to it from the PC)

but if I try to contact customer service from my own PC, using this URL, it seems I have to be using a home TalkTalk connection:


So for months now I have been on the phone to TalkTalk call centres, sometimes getting through quickly, sometimes hanging on for what seems like hours – probably no more than 20 minutes, but it feels like hours – and each time being asked the same questions, and giving the same answers. Yes, I have switched the router on and off, more times than I care to remember.

To begin at the beginning, in December last year my father moved house, to an address two doors away from his old address.  Before the move, TalkTalk had been providing both phone and broadband, and I assumed changing would be completely straightforward.  Eventually the phone service came through, but not the broadband.  I got on the phone, and after a lot of questions, I was assured that things had been sorted, and the broadband would resume at midnight; it did not.

So that he would at least be able to read emails, I approached a next door neighbour, who kindly shared the password to his wireless router, which could be detected through the wall, but explained that usage on his contract was limited.  I didn’t suppose that, for the few days it would take to sort things out, our usage would cause a problem.  So this other router was set up on my father’s PC as an alternative way to connect to the internet.

Perhaps all TalkTalk customers should keep a log of their phone calls, because here I can’t say for sure when I next spoke to them.  I would expect that TalkTalk have some system for logging calls from customers, but they do not seem to be able to share this.  Anyway, in due course, with the problem still not fixed, TalkTalk arranged to send an engineer to visit, explaining – over the phone – that the problem had to be either something to do with the router, or something to do with the wiring of my father’s new home.  I was a bit sceptical, since it was the same router as had worked fine in the old house, and the new house had recently been rewired.  But they gave a date and time slot when the engineer would arrive, and I made it down there in time, hoping to be able to ask the engineer what he found.  I rang TalkTalk to get a more exact estimate of when the engineer would be arriving – but they didn’t know.  Later my father came back from his shopping, and happily said – “so it’s all fixed?”.  Apparently the engineer had called him to ask if it was OK to come early, and my father had agreed, but had not asked what he had done, or what problems he had found.  In fact, it seems he found no problem, so felt able to tell my father that everything was fine.

Well, when I switched the wireless connection back to his router, it wasn’t working, so I got on to TalkTalk, again, and again was told it was just a matter of waiting till midnight, and then the service would be there.  So, I sent him an email, and asked him to check his emails next morning.  Good news – he was able to read my email, and I thought everything was fixed.

I started to worry, however, when other family members visited, and reported being unable to connect to the internet via his router, and the suspicion crossed my mind that when he was checking emails now, or looking at websites, his PC was connecting via the neighbour’s router, having it on a list of alternative routers to try if the first one failed.  So, when I visited again yesterday, I had a look via the Control Panel – and yes, that was what was happening.  So now I need to ask the kind neighbour if we have been running up unexpected bills for him – embarrassing.  Of course I got on to TalkTalk – by phone – and was asked all the same questions, and gave all the same answers.

In retrospect, we should probably have switched from TalkTalk long ago, but I don’t want to have an interruption to the phone service while moving to another supplier, and up to a point, it seems more reasonable to suppose any supplier – even TalkTalk – will be able to work out what might be causing a problem, and fix it.  I guess that’s why I’m writing this – in the hope that by putting something in the public domain, I can get TalkTalk to try to sort out the problem, rather than just leave it as some hassle which goes no further than a poor call centre underling somewhere in India.

I’m also curious to know how or why TalkTalk’s service can be so bad.  Somewhere within the organisation there must be the information which will identify the problem.  Do they simply economise on staff with sufficient training to deal with problems, and accept that customers who encounter problems will just shrug their shoulders, and move on to another supplier?  I guess that is the basic business model justifying – to shareholders – the provision of a poor service to customers.

More recently it struck me that TalkTalk must also log the actual broadband usage of its customers, since otherwise they would not know how to charge them.  So they could identify how much broadband my father has used – which I suspect now is none.

Cycling for Shelter with Annie

Years ago I used to go for long cycle rides, and do the occasional marathon or mountain marathon.  I’d think up the routes I wanted to cycle, and generally just do them, without any support, or specific training.  Working in an investment bank, I’d sometimes use them as ways of raising money for good causes, making a nuisance of myself, taking a sponsorship sheet round the trading floor. Actually, I’d have more than one sheet, keeping back one to start with the people I knew to be more generous, so that when I put it in front of others who I felt ought to be generous, they would feel obliged to match their colleagues.

That’s all in the past; my last long ride was more than ten years ago, when, not very well prepared, I dragged my daughter across Ireland.  We still made it to Mt Brandon, though.

But now she is going to do this year’s Ride London-Surrey 100 for Shelter

Shelter logo

and suggested I should do so too.  So I am, and like her, I’ll be taking the training much more seriously; at my age now, it probably pays to.  The fundraising will be more 21st century too, with a page for donations, which will be anonymous, and tax deductible.  Here’s the link

Tim Lund’s page

Without a trading floor to pester, I’ll be looking to everyone to help me make my funding target.  I had thought of reusing one of my favourite fund raising ideas from back then – a sweepstake on how long it will take me – but that would make donations non tax-deductible, or just too complicated.  But I’m interested to know how long people think it will take me, at my advanced age of 59.  As of now – March 30th – I think around seven hours.  So here’s a rather simpler form than my first attempt

Your Name (required)

Your Email (required)

So - how long do you think it will take me?

Any other comments?

Just to prevent spam bots, here's a simple quiz - if you're not sure of the answer, there's a clue in the title for the ride!

Gene Gini

Alternatively, how much inequality would there be if all human lives were played out on an economically levelled playing field?

I know it’s never going to happen, not least because economic winners, whether thanks to luck or talent, like to tilt the odds for the next generation by giving their children the benefit of their experience while alive, and their money when they die.  In earlier ages they also constructed ideological systems restricting access to certain roles – being High Priest of the Temple, Caliph, or allowed to vote in elections – to those with some arbitrary genetic inheritance, or born or living within some administrative region.

But there is a more modern ideology out there in favour of equal opportunities, which has had some success in getting measures implemented in public policy.

Chart of the Week: How two decades of globalization have changed the world

This chart derives from the work of Branko Milanovic – no fan of inequality, nor a neo-liberal cheerleader – but the implication is that globally inequality has declined thanks to globalisation, and inequality only appears to increase if seen through a nationalist prism.  The big picture, I’d say, is that on balance globalisation has levelled the economic playing field, and the developed world middle classes are now obliged to compete with equally talented people from emerging economies such as India and China.  Meanwhile members of elites, individually benefiting most from globalisation, very often buy assets in safer developed economies, and go to live there for at least some of the time, so boosting inequality as perceived in these countries.
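For reference, the Gini coefficient alluded to in the title – the standard summary measure behind charts like Milanovic’s – runs from 0 (perfect equality) to 1 (one person has everything). A minimal sketch of the calculation (Python; illustrative only):

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference:
    G = sum over all pairs |x_i - x_j| / (2 * n^2 * mean)."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(a - b) for a in incomes for b in incomes)
    return diff_sum / (2 * n * n * mean)

print(gini([1, 1, 1, 1]))    # 0.0  - perfect equality
print(gini([0, 0, 0, 100]))  # 0.75 - the maximum for n = 4 is (n-1)/n
```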

That chart was just for the period up to 2008, so what happens now, post-crash, will be interesting. From a simple Marxist point of view, one would expect the development among the middle classes of the developed world of ideologies opposing globalisation.

Let’s not talk about Ayaan Hirsi Ali

I got on a train yesterday, for a short break from the work of helping my Dad move out of his home of 65 years, with the prospect of at last getting to read some more of Ayaan Hirsi Ali’s “Nomad” on my kindle.  But then the seat next to me was taken by a young woman wearing a headscarf, and reading Edward Said’s “Orientalism”, with tabs on various pages in the way of someone studying a text.

Housing policy as if we really believed in it

The immediate trigger for this post is a series of comments I made today on a blog by the Green Party spokesperson on housing, Tom Chance, which can be found here

Building homes, not false hope, in London

In my comments I refer to “housing policy as if we really believed in it” in the context of hoping one day to make housing affordable again, in areas of high demand among people in their 20s and 30s, such as London.  I can accept other aims for housing policy, in particular the need for an environmentally sustainable housing stock, but affordability concerns me here, and is I think of greatest wider concern as well.

Because policy doesn’t look at prices

I don’t think most housing policy is made as if this aim is really believed in, because, if it was, policy would be framed in terms of making housing affordable, i.e. in terms of price, rather than in terms of how many houses need to be built.  There are occasional exceptions, such as this recent comment from Priced Out calling for zero house price inflation, but given the current level of unaffordability, this is not exactly ambitious.  Instead, the debate is cast in terms of quantitative targets.

It would make far more sense to have core policy expressed as requiring the building of enough new homes, and the infrastructure which goes with them, to cap the proportion of income which median earners, at the point of household formation, end up spending on housing, whether as renters or buyers.  This proportion is currently around 50% in the areas of highest demand, about twice what it was 30 years ago.  Such a policy would be interpreted at the city level – assuming this, as in the case of London, to constitute an effective housing market – and might still mean quantitative directives from City Hall to lower-level planning authorities, i.e. boroughs, which are not managing to get enough development done.  But it would have the twin advantages of the numbers being set at the city level, so more locally than by Whitehall, and of requiring increased supply when prices get out of kilter.

It would be a rejection of current ‘predict and provide’ targets, based on models of rates of household formation and other factors which estimate how much housing people will want.  It seems clear that this approach has failed over the years, but the greater wonder is why it was ever thought likely to succeed; in the long run, what people want will be reflected in pricing, and a refusal to respond to such price signals amounts to telling people they should live in some way other than they wish to, for no good reason when sufficient good, sustainable homes can be built.  An insidious effect of the current system is that high demand means people squeeze into what housing is available, and unless policy makers look at the price signals, they can think, institutionally, that somehow or other people are finding somewhere to live, so it’s not such a problem.

and because looking at prices is too scary

The other reason for thinking housing policy makers don’t really think they will make housing more affordable is that if they did, they would be developing policy for how any housing gets built during a period in which peak unaffordability is unwound.  This will not be a good time, financially, to be a first time buyer, speculative developer or buy-to-let landlord, so if the additional housing we need is going to get built, it will have to be supported by the tax payer.

Given the decades over which the housing crisis has developed, it is hard to call this a bubble, except that bubbles can develop in very viscous liquids, and I think this is one – but we don’t know how viscous it is, or if, like glass, which is sometimes described as a very viscous liquid, it might also be capable of fracture.

All we can say is that if housing affordability is to return, it will happen along a path somewhere on a spectrum between these two scenarios

  • Quickly – with enormous political pain from home owners in negative equity, and destabilisation of the financial systems; or
  • Slowly – with a long period in which private sector housing construction stagnates in anticipation of falling prices, and requiring the increasing supply which will drive down prices to come from the public sector.

All we have at the moment from policy makers are the macroprudential concerns of the Bank of England, aimed at avoiding the risks of an adjustment happening quickly.

I will leave it there, because predicting what might happen is so speculative, but it is clear there is a problem, and a £2.3 billion fund to subsidise first time buyers, as announced in the recent spending review, is an aggravation of the problem, not part of the solution.

A Random Choropleth

A few days ago an Intergenerational Foundation colleague showed me an Excel workbook published by the London Datastore with which it was possible to produce maps like this

and wondered if I could extend this to maps beyond the M25.

The answer to such questions is almost always “yes” – what varies is how long it takes, and how neatly it can be done. Click here to download what I have managed, which can produce maps such as this – called choropleths – and which is discussed below.

How this works

This workbook has two sheets, one for the choropleth, the other for data, which need to be named “choropleth” and “data” respectively.  On the choropleth sheet, there has to be a cell with the text “Legend”, and also cells with the text “Colour0”, “Colour1” and “Boxes”


which define the colour range for the choropleth.  To change the colours, use the normal Excel fill colour for those cells.
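The mapping from a data value to a fill colour is presumably an interpolation between Colour0 and Colour1, discretised into the number of bands given by “Boxes”. A sketch of that logic (in Python rather than the workbook’s VBA, and a guess at the details – the actual macro may differ):

```python
def band_colour(value, colour0, colour1, boxes):
    """Map a value in [0, 1) to one of `boxes` discrete colours
    interpolated between colour0 and colour1 (RGB tuples).
    A guess at the workbook's logic, for illustration only."""
    band = min(int(value * boxes), boxes - 1)     # which band the value falls in
    t = band / (boxes - 1) if boxes > 1 else 0.0  # interpolation fraction
    return tuple(round(c0 + t * (c1 - c0)) for c0, c1 in zip(colour0, colour1))

# The lowest band gets Colour0, the highest gets Colour1.
print(band_colour(0.0, (255, 255, 255), (0, 0, 128), 5))   # (255, 255, 255)
print(band_colour(0.99, (255, 255, 255), (0, 0, 128), 5))  # (0, 0, 128)
```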

The data sheet has to have a table with one column headed “ShapeNames” and another “Data” – and any number of other columns.


Here the data are just random numbers between 0 and 1 – but this is where the values to be shown in the choropleth are entered.

The emboldened cells in this workbook are ones which the user should not change – although there is no protection attempting to prevent this.

Under the bonnet, it works by the map in the choropleth sheet comprising a collection of Excel shapes – of type “Freeform” – named according to the values in the column headed “ShapeNames” in the data sheet.  Change these names, and the “Run” macro fails.
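The core loop of the “Run” macro can be sketched like this (Python stand-ins for the Excel shape collection and data table; the real code is VBA, so this shows only the shape of the logic):

```python
class Shape:
    """Stand-in for an Excel Freeform shape; only the fill colour matters here."""
    colour = None

def run_macro(shapes, data, fill):
    """Fill each named shape with the colour for its data value.
    If a name in the data has no matching shape - e.g. because a
    shape was renamed - the run fails, as the text warns."""
    for name, value in data.items():
        if name not in shapes:
            raise KeyError(f"no shape named {name!r} on the choropleth sheet")
        shapes[name].colour = fill(value)

shapes = {"Camden": Shape(), "Hackney": Shape()}
grey = lambda v: (int(255 * (1 - v)),) * 3  # toy greyscale fill
run_macro(shapes, {"Camden": 0.2, "Hackney": 0.9}, fill=grey)
print(shapes["Camden"].colour)  # (204, 204, 204)
```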

How this was constructed

The first step was to reverse engineer the workbook published by the London Datastore, to see how its macro to fill a collection of Excel shapes with varying colours worked.

The next step was to obtain a collection of Excel shapes for a different geography. Standard shapefiles, with the extension .shp, can be obtained from many public sources – in this case the UK ONS.  Such a file can be read with the free software QGIS, exported as an .emf file, and then read into Excel as an image.  The Excel image can then be ungrouped, which turns it into the desired collection of Freeform shapes.  Unfortunately, the information linking these shapes to the names of the regions in the shapefile is lost, and a manual process is needed to restore the link.  This manual process can be greatly accelerated by copying and pasting the attributes table for the QGIS map layer into Excel, since the Excel Freeform shapes come in the same order as the region names in the attributes table.  The process cannot be completely automated, because when a region is geographically broken up – which happens with islands – there are multiple shapes for the same region.  In this case – e.g. with the Isles of Scilly – it is necessary to pick one, e.g. the largest.  There are also several degenerate shapes, with either height or width zero – which can be eliminated with a macro.
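The order-based matching, together with the clean-up of degenerate shapes, can be sketched like this (Python rather than VBA, with dicts standing in for the Excel shape objects; the multiple-shapes-per-island case is left as the manual step described above):

```python
def match_names(shapes, attribute_names):
    """Pair ungrouped Freeform shapes with region names from the QGIS
    attribute table, relying on both arriving in the same order.
    Degenerate shapes (zero height or width) are dropped first,
    mirroring the clean-up macro mentioned in the text."""
    real = [s for s in shapes if s["height"] > 0 and s["width"] > 0]
    return list(zip(attribute_names, real))

shapes = [
    {"name": "Freeform 1", "height": 10, "width": 8},
    {"name": "Freeform 2", "height": 0, "width": 5},  # degenerate: dropped
    {"name": "Freeform 3", "height": 7, "width": 7},
]
for region, shape in match_names(shapes, ["Cornwall", "Devon"]):
    print(region, "->", shape["name"])
```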

How this could be extended

The main limitations of this are:

  • there are only shapes for one set of regions
  • the Excel shapes have to be obtained from a copy of a workbook, with all the Excel risk of something getting changed when it shouldn’t be
  • the VBA code is contained within the workbook, so would need to be copied to new workbooks

It would be nice to have definitive datasets which define the Excel Freeform shapes available from some web site for download.  If this has not already been done, it could be achieved with some VBA code to read KML files, as exportable from QGIS, and convert their latitudes and longitudes first to Eastings and Northings, and then to displacements from the top and left of a document in the construction of a Freeform shape in Excel.  The first – and hardest – part of how to do this is described in a workbook downloadable from the Ordnance Survey here.
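The second, easier step – turning Eastings and Northings into displacements from the top-left of the drawing area – is just a linear rescaling with the vertical axis flipped, since Northings increase upwards while document coordinates increase downwards. A sketch (Python; the projection step itself is left to the Ordnance Survey workbook mentioned above):

```python
def to_page(easting, northing, bounds, page_width, page_height):
    """Convert an Easting/Northing pair to (x, y) displacements from
    the top-left of a drawing area of the given size.
    `bounds` is (min_e, min_n, max_e, max_n) for the whole map."""
    min_e, min_n, max_e, max_n = bounds
    x = (easting - min_e) / (max_e - min_e) * page_width
    y = (max_n - northing) / (max_n - min_n) * page_height  # flip the vertical axis
    return x, y

bounds = (0.0, 0.0, 700000.0, 1300000.0)  # roughly the extent of the GB grid
print(to_page(350000.0, 650000.0, bounds, 400, 743))  # (200.0, 371.5) - page centre
```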

If sets of Excel shapes could be constructed on request from authoritative publicly available files, it would be better to move the VBA code here into an add-in, which, once installed, would allow the user to construct the shapes which comprise the map in the sheet “Choropleth” and the entries in the column “ShapeNames” in the sheet “Data”.

There are always ways a user interface can be improved, especially if the code is wrapped up in an add-in, but there is also always an element of judgement / taste in this, so no observations on that here.