I listened to the eminent evolutionary biologist and New Atheist Richard Dawkins promoting his new book, Science in the Soul: Selected Writings of a Passionate Rationalist, on the radio. He discussed the role of scientific method and evidence, particularly in relation to the Brexit vote. He began by saying that nothing so important as staying in or leaving the EU should hinge on a binary yes/no vote. But he then went on to extol the virtues of scientific method, which, in his radio interview and in the introduction to the book, he argues should be the preeminent method for making decisions about the world, including Brexit. We should seek out the evidence, public and private, and make our decision according to that. For Dawkins, scientific method is predicated on removing prejudice and gut feeling, indeed all feelings, from rational decision-making, and is as relevant to political decision-making as it is to discovering more about the natural world. The best example of a method which does this is the double-blind randomised controlled trial, the gold standard of medical research. He declared that he didn’t want his politicians to be emotional, but rather he wanted them to make the best possible decision, rationally, and on the basis of the best possible evidence.
I came across two recent stories from organisational life which reminded me of how attempts to rationalise it can often bring about irrational consequences, and how schemes to systematise and order can disrupt even as they try to make things cohere.
One educational institution I know has decided to streamline and control its purchasing with the noble intention of cutting costs. This means that any member of staff wanting to buy something has to fill in a form justifying the purchase and explaining why they are using the particular provider they have chosen. The form designed by purchasing colleagues clearly has in mind those members of staff ordering stationery or perhaps chairs, since there are questions to be answered about which catalogue and which page the item is on, which colour you are ordering, and where you want the item delivered. Be that as it may, all members of staff have to fill in the same form.
In anticipation of next year’s centenary commemoration of the start of WWI, staff in the History department encouraged students to undertake a creative project and make a short film about the war. They found a local historian who had turned his garden into a mock-up of a WWI trench. In return for the use of his facility, and in order to maintain it, the man asked for a fee of £500. To get the man paid, staff in the History department had to convince purchasing colleagues by filling in the form explaining why they had not used the institution’s ‘preferred suppliers’ of WWI trenches and why they had not sought competitive bids.
In another organisation a recent restructuring was used as a way of both centralising and decentralising control. In terms of centralisation, all financial responsibility was pulled upwards to the rank of directors of departments, so members of staff who previously had had financial responsibility for, say, signing off their teams’ expenses no longer could. And as for decentralisation, colleagues at the newly configured ‘centre’ of the organisation were told that they couldn’t speak to colleagues in the regions because, now the organisation was devolved, requests for help should come from the periphery to the centre, rather than the other way round. Otherwise it would seem as though the centre was dictating terms to the regions, rather than strategy emerging responsively and ‘bottom up’.
I was recently invited to fill out a questionnaire for a colleague who was being assessed in a 360-degree appraisal of her leadership abilities, although I did not work for her organisation. I was being invited to offer an ‘outsider’s’ perspective. To the best of my knowledge this colleague does not lead a large team, although she has a very senior position. I understand this questionnaire to be a reflection of many organisations’ preoccupation with leadership and their need to quantify and assess the leadership potential of their employees, whether they are in leadership positions or not. It is part of a much wider discourse about leadership, and of a widely accepted supposition that leadership is a critical determinant of organisational success.
This particular questionnaire comprised 40 or so Likert-scale questions, with four discursive questions at the end asking about the colleague’s principal strengths and weaknesses. The questions divided roughly into eight main themes.
In previous posts I have been exploring the ways in which conventional management theory tries to overcome organisational paradoxes by introducing logic models, idealisations, producing double-binds or separating the paradox out into temporal or spatial phases. In this post I will treat those scholars who recognise paradox, but nonetheless suggest that somehow it can be mastered for organisational improvement and ‘excellence’. In doing so their writing can tend towards the esoteric, sometimes suggesting that leaders and managers can develop a special skill or insight that allows them to ‘master’ paradox and ‘unleash’ its creativity in the organisation. Sensitive to the complexifying potential of the coincidence of one thing and its opposite, they are tempted nonetheless to suggest that it is possible to instrumentalise contradictions for the good of the company. This is a familiar trope among many people writing about the complexity sciences, who on the one hand express an interest in uncertainty and unpredictability, and on the other hand suggest that both can be harnessed for the good.
I was rung up the other week by someone who worked in a management team in a development organisation, which wanted to try some new initiatives in three ‘fragile states’. It had become clear to them that traditional ways of working, adopting and following logical planning instruments, were inadequate in these particular dynamic and fast-moving contexts, and they were keen to try a different approach. I began to discuss the possibility of working experimentally with the teams already working in-country: why not start with what they would like to do? Take the first steps, reflect on how they had gone, and then take the next steps. Repeat the process over again. The programme would evolve as new possibilities emerged, although it would take a good deal of discussion and judgement. Programme coherence would build up with retrospective sense-making over time. ‘Yes, but can you prove that this way of working is effective?’, my correspondent asked.
In a recent journal article I described the way in which staff in an organisation I had a great deal of experience with had tried over time to reflect systematically on the way they were working. This involved acting with intention, but regularly being open to puncturing and questioning these intentions through discussion, reflection and involving the subjects of their intentions by asking them what they thought of the work. It often involved taking two steps forward and one step back, and seeing the process of reflection and discussion not as an adjunct to the work, but as the work itself. The staff often had to work to tight deadlines and cut short their deliberations to meet them, so they were not in any way paralysed by talking rather than doing. Talking was a form of doing. One of the reviewers of the article commented that this was all very well, but what had I actually said about working differently? What would an ideal model of working actually look like?
I was supporting an organisation to think about how they might assess work they were doing in East and West Africa, where they had made an explicit commitment to their donor that they would focus on what they thought would be sustainable ways of working. That is to say, instead of providing services or materials as such, they would support local stakeholders, central and local government officers, local organisations, politicians and local councillors to work out what their problems were and what they wanted to do about them. The staff in the organisation I was supporting were clear that they had expertise to offer, but the problems were not theirs to ‘solve’. They would support, cajole, facilitate, discuss, offer training if necessary, or seed initiatives. But since the inception of the programme the relations with the donor had changed, partly owing to a change in personnel at the donor. Now the donor required ‘objective evidence’ that this way of working produced results, and that these results would be transferable elsewhere. Exactly which kinds of ‘instruments’ were they using to encourage local discussion, and how could they be validated?
In each of these three examples I would argue that there is an illusory quest for certainty.
In an INGO where I was working recently, one of the newer members of staff proudly told me that he was Prince2 trained. This was mentioned in relation to the conversation we were having about what he considered to be the ‘lack of systems’, I think implying a lack of rigour, that he perceived in the organisation he had just joined. As someone who once worked as a systems analyst, operating at the interface between software developers and end users, I was prompted into thinking about why my colleague might believe that a project management method originating in software development, and contested even there as to its usefulness, might also be suitable for managing social development projects. One would hardly look to the domain of IT for examples of projects delivered on time and to budget, without even considering the other, obvious differences between the two fields of activity. Nevertheless, Prince2 is a good example of the kinds of tools, frameworks and methods which increasingly pervade the management of social development, and which are taken to be signs of professionalisation in the sector.
The Health Ombudsman in the UK, Ann Abraham, recently published a report documenting the ways in which some elderly patients over the age of 65 had been poorly treated in the NHS. These were some of the examples:
• Alzheimer’s sufferer Mrs J, 82, whose husband was denied the chance to be with her when she died at Ealing hospital in west London because he had been “forgotten” in a waiting room.
• Mrs R, a dementia patient, who was not given a bath or shower during 13 weeks at Southampton University Hospitals NHS trust. She was not helped to eat, despite being unable to feed herself, and suffered nine falls, only one of which was recorded in her notes.
• “Feisty and independent” Mrs H, who had lived alone until she was 88, was taken from Heartlands hospital in Birmingham to a care home in Tyneside but, when she arrived, was bruised, soaked in urine, dishevelled, and wearing someone else’s clothes, which were held up with large paper clips.
Abraham’s report prompted much hand-wringing on the part of the Royal College of Nursing, government ministers and the press. The care services minister Paul Burstow saw the report as further proof that ‘modernisation’ of the NHS was needed, which presumably means the major ‘reforms’ that his own government is proposing. He added that ‘leadership’ was needed in the NHS to ‘drive out poor practice’, and mentioned a forthcoming initiative of the NHS regulator, the Care Quality Commission, to carry out spot checks to identify malnutrition and dignity in older patients. Inspecting older patients for dignity is an interesting proposition.