The brain is (mostly) not a computer

I recently had my attention drawn to this essay from May 2016 – The Empty Brain – written by psychologist Robert Epstein (thanks Andrew). In it, Epstein argues that the dominant information processing (IP) model of the brain is wrong. He states that human brains do not use symbolic representations of the world and do not process information like a computer. Instead, the IP model is one chained to our current level of technological sophistication. It is just a metaphor, with no biological validity.

Epstein points out that no-one now believes that the human brain works like a hydraulic system. However, this was the dominant model of intelligence from 300 BCE to the 1300s. It was based on the technology of the times. Similarly, no-one now argues that the brain works like a telegraph. This model was popularised by physicist Hermann von Helmholtz in the mid 1800s. The IP model of the brain can be traced back to the mid 20th century. Epstein cites John von Neumann (mathematician) and George Miller (psychologist) as being particularly influential in its development. His conclusion is that it is as misguided as the hydraulic and telegraphy models of earlier times.

If Epstein is correct, his argument has significant implications for the world of artificial intelligence. If humans are not information processors, with algorithms, data, models, memories and so on, then how could computing technology be programmed to become artificially intelligent? Is it even possible with current computing architectures? (*) There has been no successful ‘human brain project’ so far using such a model. I’m convinced (as both a computer scientist and psychologist) that there never will be.

However, I disagree with what I interpret as Epstein’s (applied) behaviourist view of human intelligence. The argument that we act solely on combinations of stimuli, reinforced by the rewards or punishments that follow, has been thoroughly debunked (+). There is a difference between explaining something and explaining it away. The behaviourist obsession with explaining away, rather than attempting explanations of, mental events is a serious blind spot that hinders progress – one at least as serious as the obsession with the IP model, to the exclusion of other possibilities, exhibited by many cognitive scientists.

Living together in perfect harmony on my bookshelf – some of the many psychological traditions.

Just because we can’t currently say how the brain changes in response to learning something, or how we later re-use this knowledge, doesn’t mean that the task will always be impossible. It certainly doesn’t mean that our brains don’t have biological analogues of memories or rules. Declarative and procedural knowledge exists, even if there isn’t a specific collection of neurons assigned to each fact or process we know.

Furthermore, the limits of our current understanding of brain architecture don’t invalidate the IP paradigm per se – at least as a partial explanation of human intelligence. We shouldn’t be surprised at this. After all, blood circulates around the body – and brain – using hydraulics. That earlier model of how the brain functions therefore isn’t completely invalid – at least at a low level. It may turn out that the IP model of intelligence is at least partly correct too.

Epstein finishes his essay by asserting “We are organisms, not computers. Get over it.” He’s right – up to a point. But the explanations (or explaining away) he offers are partial at best. Psychologists from all traditions have something to add to the debate about human intelligence. Discarding one approach solely on the grounds that it can’t explain everything that makes up human intelligence is just silly. And that’s something which Epstein definitely needs to get over.

 

(*) I asked the same question at the end of Brainchildren – Exploding robots and AI. I’m still not ready to answer it!

(+) For example, see Dennett’s essay Skinner Skinned in Brainstorms.

Thriving at work – #DOPconf 2019 review

Shortly after I’d been discharged from hospital last September, I made a decision to attend the British Psychological Society’s Division of Occupational Psychology annual conference (DOPconf to its friends) in Chester. It was held last week, 9th to 11th January 2019, so it was a good recovery milestone to aim for. Fortunately I just about made my target – physically and mentally – even though I didn’t manage to attend all of the sessions I’d optimistically put into my diary at the start of the week.

It was particularly good to meet a number of Leicester and OU psychology alumni again. One of the media sensations of the week was the study published about the benefits of singing at work, carried out by Joanna Foster for her Leicester MSc. However, I get the feeling that if I joined a workplace choir other people may not find my dulcet tones beneficial …

The sessions I did attend at the conference were excellent. These were a few of my personal highlights.

Evidence-based (change) management

The first keynote of the conference was given by Professor Denise Rousseau of Carnegie Mellon University. Evidence-based management (EBMgt) is defined as the practice of making organisational decisions, in relation to a claim or hypothesis, based on the combination of:

  • Scientific principles and knowledge
  • Valid / relevant organisational and business facts
  • Professional expertise and critical thinking
  • Stakeholder concerns, implications and ethics

Denise argues that few organisations pay attention to the quality of the data on which they base their decisions. Fewer still assess the impact of the decisions they take. Denise suggests that the 6A decision-making process seen in medicine (ask, acquire, appraise, aggregate, apply, assess) should be used – on both the problem and solutions – to improve outcomes.

Professor Denise Rousseau explains why developing any management expertise is so difficult. Unlike surgeons, change managers operate in unpredictable environments. They are project-centred, so have little opportunity to develop their skills with the same group of people for long periods. Because of the lack of assessed outcomes, they rarely receive feedback useful for their development.

Solving the right problem(s) and considering multiple solutions (rather than asking “should we do x or nothing?”) is more likely to result in effective change. Furthermore, systematic reviews demonstrate that implementing a bundle of interventions, rather than a single “silver bullet”, produces the best outcomes.

The Center for Evidence-Based Management has a wealth of resources available to support organisations in adopting this approach.

Work engagement in cancer survivors

A paper presented by Andrew Parsons from the University of Hertfordshire. It was of personal interest to me as I’m in the process of returning to work after treatment. The study used self-report questionnaires to measure work engagement, quality of working life and psychological capital, plus semi-structured interviews analysed using interpretative phenomenological analysis. It was unsurprising, if comforting, that measures of psychological capital were strongly correlated with quality of working life scores.

Of most interest to me were the participants’ accounts of the importance of developing a “new model of me”, and of the resources that either helped or hindered their response to events as they returned to work. The “new normal or new me” theme is one I’ve heard many MCL survivors talk about. However, I’m not convinced that the experience of treatment has changed me all that much – at least, not yet.

The influence of work on personality development and change through life

This keynote was presented by Professor Stephen Woods of the University of Surrey. I became familiar with some of his work while studying for my masters and it was good to put a face to the name. He presented evidence which questions the long-held view of many psychologists that personality traits remain fixed and stable during adulthood. Instead, he suggested that they were dynamic and contingent on the work context. The social constructionist and critical psychologist in me grinned broadly as he concluded his talk.

The evidence base is growing – personality changes as we learn and develop over our lives.

Cynicism in organisations – the antithesis of thriving?

Having confessed that I’m not convinced by personality psychometrics, I also admit that I’m not convinced by so-called authentic leadership. I once wrote that adopting authentic leadership would lead to a highly dysfunctional organisation and burned-out individuals. I still stand by every word of my argument.

It was therefore fascinating to hear Zoe Sanderson from Bristol University compare the traditional view of organisational cynicism with that from critical management theory. Traditional organisational psychology usually constructs cynicism at work as being wholly negative and coming from the individual. “Cynicism can take down an entire organisation”.

Critical management studies takes a different perspective and argues that cynicism is a predictable outcome of many working environments. Furthermore, cynicism can be seen as employees protecting their identity. This helps to reduce any cognitive dissonance stemming from organisational propaganda, enabling them to remain engaged and productive. Zoe’s work on cynicism is at an exploratory stage and I look forward to seeing it progressing.

How do you spot an organisational psychopath … and what do you do next?

Having written earlier that I’m not much of a fan of personality psychometrics, I do love ‘dark triad’ papers. Lorraine Falvey said that the literature suggests an increasing level of malevolent behaviour is being reported at work. Her personal frustration is that most studies of organisational psychopathy either use students as participants, or cover a very narrow workforce, such as police officers. Her study used a qualitative, thematic analysis of interviews with 15 experienced participants drawn from across industry sectors. It suggests that there is a spectrum of potentially malevolent behaviours – from influencing, through manipulation, to verbal and physical threats. Lorraine argued that organisational leaders need to:

  • Be aware of the shadow you cast as a leader – it is an important factor in what others consider to be acceptable behaviour.
  • Think about the unintended consequences of (poorly designed) rewards.
  • Be clear about individual roles and responsibilities, as clarity seems to mitigate poor behaviour. Matrix organisations are therefore seen as being at particular risk.

Leading with purpose: How to lift people, performance and the planet, profitably

An excellent interactive workshop to end the conference, run by Sarah Rozenthuler and Victoria Hurth. We were given an overview of what purpose in business is, and how purpose is distinct from corporate social responsibility, sustainability, mission and vision. Command-and-control vs purpose-led leadership paradigms were discussed, and the four capacities necessary for purpose-led leadership were defined. From my own business value consulting perspective, the tangible benefits claimed for this approach look extraordinary and are worthy of urgent further investigation.

As I was flagging by this point on the Friday afternoon, I’ve been particularly glad of the handouts provided. One of them, “The what, the why and the how of purpose: a guide for leaders”, published by the Chartered Management Institute, has been especially useful in enabling my reflections.

Graphology: Mere wishful thinking?

A graphologist (handwriting analyst) was interviewed on BBC Breakfast this morning. Resisting the urge to immediately rant on Twitter about the pseudoscience of graphology, I headed upstairs to my study instead. Since then I’ve spent some time refreshing myself on the arguments for and against the art. My own interest is in its use at work, so I’m not that concerned whether Donald Trump’s handwriting indicates if he’s a narcissist or not (*).

Two claims are commonly made by graphologists. The first claim is that graphology can be used to accurately assess personality traits. The second claim is that graphology is an effective personnel selection method. Naturally, for a selection method to be effective, it should be predictive of eventual job performance.

These are extraordinary claims and therefore require extraordinary evidence, as Carl Sagan used to say. Unfortunately for people who use the services of graphologists, the evidence in the peer-reviewed personality and occupational psychology literature does not bear these claims out.

For brevity, I’ve naturally been selective in the papers I quote from in the rest of this article. They are, however, broadly representative of the scientific consensus on graphology over the last 30 or 40 years.

Is graphology a good predictor of personality?

No. It isn’t.

Leaving aside the arguments about whether any instrument can (a) measure personality and (b) extrapolate job performance from those measurements (**), handwriting analysis is not a good predictor of personality.

The gold standard of personality assessment is widely regarded to be instruments that measure the ‘Big Five’ traits – Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. If graphology could replicate the outcomes of a Big Five questionnaire, then it could claim to be predictive of personality.

However, Dazzi & Pedrabissi (2009) found that results from a Big Five questionnaire did not correlate with the assessments of graphologists. Furthermore, agreement between graphologists was poor. These results are in line with the earlier meta-analysis by Neter & Ben-Shakhar (1989), which concluded that graphologists are worse than laypeople at predicting personality traits from handwriting.
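As an aside for the statistically minded, both of these claims – poor convergent validity and poor inter-rater agreement – boil down to simple correlations once you have the raw scores. Here is a minimal sketch of the two checks; the numbers are entirely hypothetical and are there only to show the calculation, not to reproduce either study:

```python
# Two checks discussed above, on made-up data:
# (1) convergent validity   - do graphologists' ratings track questionnaire scores?
# (2) inter-rater agreement - do two graphologists agree with each other?
from scipy.stats import pearsonr

# Hypothetical Extraversion ratings for ten writers
questionnaire  = [3.2, 4.1, 2.8, 3.9, 4.5, 2.1, 3.3, 4.0, 2.6, 3.7]
graphologist_a = [2.9, 3.1, 4.2, 2.5, 3.0, 3.8, 2.7, 3.4, 4.1, 2.8]
graphologist_b = [3.6, 2.4, 3.9, 3.1, 2.2, 4.0, 3.5, 2.9, 3.2, 3.8]

r_validity, p_validity = pearsonr(questionnaire, graphologist_a)
r_agreement, p_agreement = pearsonr(graphologist_a, graphologist_b)

print(f"Convergent validity:   r = {r_validity:.2f} (p = {p_validity:.2f})")
print(f"Inter-rater agreement: r = {r_agreement:.2f} (p = {p_agreement:.2f})")
```

For graphology to stand up, both correlations would need to be consistently high across well-designed studies; the literature cited above suggests neither is.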

Simply put, a BBC article from 2005, quoting experts, says:

The British Psychological Society ranks graphology alongside astrology – giving them both “zero validity” in determining someone’s character. Dr Rowan Bayne, a psychologist who tested top graphologists against their claims, says the practice is “useless… absolutely hopeless”.

Is graphology a good predictor of job performance?

No. It isn’t.

Robertson & Smith’s (2001) review of personnel selection studies reports that the best predictors of eventual job performance are a candidate’s cognitive ability (intelligence) and integrity. Structured interviews also score well. Common elements of typical CVs, for example years spent in education and years of job experience, score poorly, but still fare far better than graphology. Indeed, the only worse predictor of eventual job performance they report is age. Even something as ephemeral as personal popularity at work correlates positively with job performance (Garden, Xu, Zhan & Wei, 2018) at a level higher than graphology.

Accuracy of personnel selection methods – graphology has a lower accuracy than every method reported by Robertson & Smith (2001) except age (negative correlation).

Predicting future job performance during the selection process is hard. Even the best methods aren’t infallible. But graphology is not the answer.

Conclusion

Based on all I’ve read today, I find it impossible not to agree with this statement.

There is no doubt that when one carefully selects studies in terms of their methodological robustness, the evidence [for the efficacy of graphology] is overwhelmingly negative (Dazzi & Pedrabissi, 2009).

Graphology as practised today is mere wishful thinking.

Footnotes

(*) I did find a paper on narcissism and career success. It concludes that narcissism impacts success through increased occupational self-efficacy beliefs and career engagement – but has only a weak relationship to standard measures of career success, including job satisfaction and salary.

(**) See my earlier post – Does measuring personality make sense?

 

References

Dazzi, C. & Pedrabissi, L. (2009). Graphology and Personality: An Empirical Study on Validity of Handwriting Analysis. Psychological Reports, 105(3), 1255-1268.

Garden, R., Xu, H., Zhan, Y. & Wei, F. (2018). The Role of Workplace Popularity: Links to Employee Characteristics and Supervisor-Rated Outcomes. Journal of Leadership and Organizational Studies, 25(1), 19-29.

Neter, E. & Ben-Shakhar, G. (1989). The predictive validity of graphological inferences: a meta-analytic approach. Personality and Individual Differences, 10, 737-745.

Robertson, I.T. & Smith, M. (2001). Personnel Selection. Journal of Occupational and Organizational Psychology, 74, 441-472.

Poll: 73% say ‘Brexit dividend’ is a lie

A poll conducted on Monday 18th June 2018 found that 73% of those asked said the claim of a ‘Brexit dividend’ was a lie. 11% of respondents said that there would be a Brexit dividend, with the remaining 16% undecided. The sample size was 1,003, with a margin of error of +/-3% (*).
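Incidentally, +/-3% is roughly the margin of error you would expect for a sample of about 1,000, assuming simple random sampling and a 95% confidence level. A quick sketch of the standard calculation:

```python
# Standard 95% margin of error for a simple random sample,
# using the worst case p = 0.5 (maximum variance).
from math import sqrt

n = 1003   # sample size quoted above
p = 0.5    # worst-case proportion
z = 1.96   # z-score for 95% confidence

margin = z * sqrt(p * (1 - p) / n)
print(f"Margin of error: +/-{margin:.1%}")   # roughly +/-3.1%
```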

 

A fake graph to demonstrate confirmation bias

If you’ve read this far, your initial reaction to this ‘poll’ is likely to have been determined by your existing beliefs about Brexit. If you oppose Brexit, you were probably more likely to have seen this as further evidence that your view is right. If you support Brexit you probably haven’t even read this far, but will have dismissed or ignored this article on the basis of the headline itself.

A psychological explanation often offered for this effect is confirmation bias (Darley and Gross, 1983). Confirmation bias is the tendency to seek evidence to confirm your existing beliefs rather than look for evidence that might counter them. Regardless of the actual truth of the information, finding support for your beliefs boosts your confidence in them. Crucially, this makes it less likely that people holding these beliefs will alter them.

Many people on the pro-EU side of the debate are placing a lot of faith in calling for a ‘people’s vote’ on the final EU exit deal. They express confidence (often citing the way that opinion has subsequently changed on the Iraq War pursued by the Blair government) that people won’t be fooled again.

I remain unconvinced that the outcome of any such referendum would be different.

Although opinion pollsters YouGov claim there has been a slight drift towards people thinking that the decision to leave the EU is a bad one, the difference is nothing like as pronounced as the shift over the Iraq War.

There’s also another important difference compared with the Iraq War – Brexit is a current issue. On both sides of the argument, people still have a lot of psychological capital invested in their beliefs. Much of the shift in opinion over the Iraq War seems to have happened afterwards, when it was seen both as a disaster and as having been based on a lie.

The challenge for those of us who want no truck with Brexit is to overcome the confirmation bias of the leavers. If I was well enough to attend, I’d be at the march in London on the 23rd June. But no matter how large and well organised it is, it’s unlikely to have much impact in shifting opinion.

What’s needed as well are emotional, media attention-grabbing demonstrations of the benefits of remaining in the EU. The equivalent of the Farage/Rees-Mogg fish throwing incident, if you will.

 

 

(*) For the avoidance of doubt, these figures are completely made up. Sorry. (But that doesn’t mean they bear no resemblance to the truth and that the Brexit dividend isn’t a lie, naturally).

 

Update 18th June – 2200: Sky News has published a genuine poll in the last few minutes that does indeed indicate that the majority of those asked say the ‘Brexit dividend’ claim is a lie.

 

References

Darley, J.M. & Gross, P.H. (1983). A hypothesis-confirming bias in labelling effects. Journal of Personality and Social Psychology, 44, 20-33.

Yes, we are all individuals!

Josh Friedman’s recent article for Time, “It’s Okay to Be a Coward About Cancer”, is an interesting piece about the language that surrounds the disease. It’s written from the perspective of someone who has experienced cancer for himself. In it, he takes issue with the dominant interpretative repertoires (*) of “fighting” and “surviving” the disease.

When I was first diagnosed with MCL, I initially adopted positions from the “fighting” repertoire. After all, it seems the logical thing to do. No-one wants to die from cancer – and not many people want to die, ever! “Fighting” is how I perceived that the majority of people were talking about the disease, and I started to talk about it in that way too.

However, over time, I started to think of myself as being more of a survivor than a fighter. This was because I found it difficult to declare war on my own body, regardless of its faults. But even that phase didn’t last long. These days, given my current non-treatment status, I feel more comfortable with the idea that I’m “living” with the condition rather than fighting or surviving it. My Twitter and Facebook biographies have reflected this progression over the three years since my diagnosis.

While understanding and respecting Josh’s position, I think that rejecting the dominant fighting and surviving repertoires as cowardice undersells his own strength. Coming to terms with cancer by rejecting the culturally dominant discourses is definitely not cowardice. Taking a position against what the majority believe to be commonsense is always hard.

I wish him and all other cancer patients well, regardless of their approach to coming to terms with the disease and their own mortality. After all, in the words of Brian, “You are all individuals, you don’t need to follow anybody!”

 

(*) For those of you who aren’t discursive psychologists, interpretative repertoires provide commonsense and relatively coherent ways of talking about a topic, providing a basis for shared understandings to be reached. They are culturally and historically situated – for example, it is unlikely that a Victorian would have talked about cancer in the same way as a citizen of the 21st century.

What makes an accomplished negotiator?

There are few empirical studies outside of academia that have looked into what makes an accomplished negotiator. However, in 1978 (*), Neil Rackham and John Carlisle of the Huthwaite Group conducted one that went beyond game playing. Their work compared the behaviour of a number of accomplished negotiators with that of negotiators rated merely average by their peers. They found that accomplished negotiators:

  • Spent twice as much time asking questions (20% vs 10%), and so presumably more time listening to the other party
  • Talked more about their feelings
  • Spent twice as much time ensuring that a common understanding had been reached
  • Used fewer arguments to support their proposals
  • Made half as many counter-proposals when responding to a proposal

In addition, average negotiators made six times more statements that annoyed the other party than an accomplished negotiator.

Yesterday afternoon we got another glimpse of Theresa May’s preferred negotiating behaviour. Will giving her even more power on 8th June end well for anyone in the UK?

 

(*) The Rackham & Carlisle study is referenced in Hal Movius’ 2008 paper “The effectiveness of negotiation training”.

Choosing your tribe – them and us

“In Ireland you must choose your tribe. Reason has nothing to do with it.” 

 

So wrote J.G. Farrell in his 1970 novel Troubles. While much of what has happened politically in 2016 has felt both tribal and irrational to me, psychology suggests that we don’t even need big issues to persuade us to pick our tribe. Developed at much the same time that Farrell published his novel, Henri Tajfel’s minimal group experiments show how easy – how frighteningly easy – it is for us to do this.

Minimal groups can be formed using arbitrary criteria. A coin toss can be used to divide people randomly into two groups. A simple task, such as distributing small amounts of money, results in people favouring members of their own group. This happens even when there is no objective difference between group members and the distribution is performed anonymously.

This result led Tajfel, with others including John Turner, to develop Social Identity Theory (SIT). SIT can be used as a way of explaining the minimal group results but, more importantly, it can perhaps shed light on what happens in everyday life outside of laboratory experiments.

SIT argues that we categorise ourselves and others into different groups. A process of social identification occurs over time, where we decide which groups we identify with. Our decisions on group membership are influenced by others already in a particular group whose attitudes and beliefs we wish to emulate. Finally, our self-esteem is boosted by positive comparisons of our own group against others. It can also be damaged if other groups are held in higher regard by society than ours.

The need to pick our tribe, regardless of how rational or irrational that choice may seem to others, would therefore seem to be an inbuilt characteristic of humanity. Which of the tribes that we belong to is most important to us at any point in time depends on how salient the social identity it embodies becomes. If that identity feels threatened, then we often cling to it even harder.

It would seem to me that the events of the last week demonstrate that the most salient political identity in the UK at the moment is how pro-EU (or anti-EU) we feel. How else would you explain the truly wonderful result for Sarah Olney in the Richmond Park by-election if that was not the case? How else would you explain the willingness of many people to work across traditional party political divides to make sure that we don’t drive our economy off a cliff? Or how else would you explain a large slice of the electorate still voting for the ‘independent’ ex-incumbent anyway?

Long may the country’s new-found passion for the EU continue. I have chosen my tribe and for the first time in some years, I feel rather good about being a member.

My response to the recent Post40Bloggers writing prompt number 104: “Them and Us”.

(Probably) the end

Hello! *Blows away the cobwebs and dusts furiously* I bet you thought that I’d forgotten about you all as I haven’t written anything here since May. Well, after my excellent attempts at procrastination earlier on in the year, I finally decided to buckle down and sort my dissertation out. It’s been quite a journey, which is why I’ve been so uncharacteristically quiet – both here, and on my own blog.

I’m glad to report that after many, many more hours of work than I’d originally estimated, resulting in the production of 22 drafts for the research paper and 7 for the executive summary, I successfully submitted the dissertation last month. I’m now basking in the knowledge that I’ve passed not only the dissertation component of the MSc, but the MSc itself.

Naturally, I have a number of pieces of advice to pass on to future part-time, distance learners undertaking the Occupational Psychology MSc at Leicester. The most important of these relate to the dissertation.

Firstly, don’t undertake a piece of qualitative research simply because you’re not keen on statistics. Only do it if you’re really committed to your research question and a qualitative methodology is the only way you’ll be able to answer it. Qualitative research is definitely not an easy option, particularly if you’re looking to demonstrate it’s been performed rigorously and transparently. And you should be, of course.

Secondly, make good use of your dissertation supervisor. Keep them updated with your progress, tell them what you’re thinking about doing … and when they question you, listen to their advice and act on it. They know what they’re talking about! For example, I would have had a much worse question schedule had I not listened carefully to my supervisor’s advice at the start of the process. The quality of the questions that I eventually came up with resulted (I believe) in a far more coherent set of data when it came to analysis than I otherwise would have had. Good data certainly makes analysis more enjoyable, and it made generating evidence-based conclusions easier too.

Thirdly, find ways to enjoy the process. If you’re a distance learner, feelings of isolation and self-doubt seem to haunt most of us at some stage. Talk about your concerns to others – a Facebook group of fellow students in my first year and an email list in my delayed second year certainly helped me when I needed to sound off. The other way I found to enjoy myself was to deliberately argue for controversial positions that I didn’t necessarily hold (backed by evidence, naturally) in the assessments we were set. I seem to remember the ergonomics module being a particularly fruitful one for this approach. In occupational psychology, as in life, there are no completely right or wrong answers – simply positions you can justify based on evidence.

This is probably the end of my academic adventures at Leicester (or anywhere else for that matter). I’m looking forward to presenting my dissertation findings at the British Psychological Society’s Division of Occupational Psychology conference as well as my graduation ceremony in January. I certainly hope to stay in touch with many of my fellow students and the academic staff who have encouraged me over the last three years. Your efforts have been hugely appreciated.

 

This article was originally published at the University of Leicester Student Blogs, 30th October 2016.


Discursive strategies used by sales leaders in value co-creation

Today, this amazing thing happened.
DOP Conference 2017 Programme

A short paper based on my MSc research into the discursive strategies used by sales leaders has been included in the programme for the 2017 BPS Division of Occupational Psychology conference. It’s being held in Liverpool from 4th to 6th January. I have a 9am slot on the morning after the gala dinner. I can see that I may need to find innovative ways of encouraging people to attend …

Anyway, I’m absolutely thrilled, excited, chuffed … you get the picture … to be able to speak at the conference. I really hope to see some of you who have read this blog over the years there too.

10,000 steps a day – day 11 – dissertation done!

My dissertation is officially finished. Yay! Well, almost: I will proofread it again tomorrow before submitting all 9,000 lovingly crafted words. But, done. Which brings me to the end of my MSc, too. Here it is in all of its front cover glory.

Dissertation front cover

My advice to future students is simple. No matter how tempting it seems, if you’re going to do a qualitative study purely because you’re scared of statistics, think again. Qualitative research is far more time-consuming and the analysis process far more onerous than anything SPSS can throw at you. Trust me – I’ve done both now. Only do a qualitative piece of research if the question you devise demands it, you have masochistic tendencies and are completely committed to your ontological approach. Otherwise you’ll hate it. And even if you meet all of these criteria, you’ll still hate it at some point during the process. I know I did, but I got through it. There is hope for us all.

I believe that I deserve a beer, before I return my final library book.

Beer

Naturally, I walked several thousand steps more than I needed to before I bought the beer. Day 11 and still on track. Only 19 left to go.

 

If you’d like to sponsor me to walk all over cancer during September, my donations page is here. Thank you to all of my sponsors who have helped me to raise £280 so far. Please join them if you can. It will make me feel like my dissertation has some real value (don’t groan).