I’m a couple of days late, but the good news is that bloggers can now embed pictures for non-commercial use from Getty Images without charge. The image below is reproduced just because I can …
One major advance in Victorian medicine was the realisation that cholera was spread through contaminated drinking water, rather than through airborne “cholera clouds”. In J.G. Farrell’s novel The Siege of Krishnapur, a heated debate is held between Dr Dunstaple, who believes in cholera clouds, and Dr McNab, who is able to demonstrate that contaminated drinking water is the real reason for the spread of cholera.
Despite the overwhelming evidence that he is able to muster, Dr McNab loses the debate.
I am worried that even though evidence and logic suggest that EU membership confers many more advantages than disadvantages for the UK, the ability of UKIP to capture and express the illogical fears and prejudices held by many may yet hold sway.
In The Siege of Krishnapur, Dr Dunstaple is so confident in his cholera cloud arguments that he drinks a bottle of contaminated water. The consequences of this decision are fatal, and Dr McNab’s view is proved correct. But as Farrell notes, even this turn of events still fails to swing public opinion behind Dr McNab.
For Nick to win the debate and ensure that we retain a strong Liberal voice for the UK in the EU, it seems to me that rather more than pure logic will be required. I wish Nick and the team who will be helping him to prepare all the best because, rationally or irrationally, I fear for the future of our country should the electorate decide that UKIP and its isolationist policies are the correct prescription for the challenges ahead.
One of the constants of my working life has been the periodic need to submit to performance appraisals. It doesn’t matter where I’ve worked; the ‘a’ word has invariably come up at one point or another. I’ve often found that being appraised is a disheartening experience. When I’ve been employed as a manager I’ve also had to conduct appraisals. Sadly, that’s often been an even more disheartening experience than being an appraisee, so I can just imagine how the people I was appraising felt. If you have ever been one of them, I apologise. Until I started the personnel selection and assessment module on my MSc, I’d never really thought about what good practice should look like.
For example, regardless of which side of the desk I’ve been sitting on, I’ve often thought of appraisals as being a total waste of time that got in the way of the “real work”. This is probably because most of the organisations that I’ve been a part of have seemed to use them as little more than tick-box exercises to justify the existence of HR(*). Needless to say, that’s not a good reason for putting together an appraisal system! You can easily tell when that’s the case – neither you nor your manager ever refers to the information that’s been painstakingly gathered, written up and agreed to. Rather than being a living framework that drives the behaviour of both the employee and employer, bad appraisal processes simply result in documents that gather dust in the bottom of a filing cabinet somewhere.
Implemented well, appraisals should be able to motivate, develop and reward employees while enabling the employer to understand the potential of the people they have invested in. Perhaps my experiences have been unfortunate in that I’ve rarely worked for organisations where an occupational psychologist has had any input into the design and conduct of the appraisal process – and just as importantly, into the training of the people conducting appraisals. Yet the insights offered by occupational psychology into motivation, development and systems of reward are fascinating and organisations that take these insights seriously often seem to perform better than those that don’t.
(*) Not my current organisation as it happens – we seem to have a system in place that does add real value!
This article was originally written for the University of Leicester Student Blogs, 1st March 2014.
Bill Bryson in his book The Lost Continent lists seven rules of restaurant dining. For those of my readers who are unfamiliar with his work, these are:
- Never eat in a restaurant that displays photographs of the food it serves. (But if you do, never believe the photographs.)
- Never eat in a restaurant with flock wallpaper.
- Never eat in a restaurant attached to a bowling-alley.
- Never eat in a restaurant where you can hear what they are saying in the kitchen.
- Never eat in a restaurant that has live entertainers with any of the following words in their titles: Hank, Rhythm, Swinger, Trio, Combo, Hawaiian, Polka.
- Never eat in a restaurant that has blood-stains on the walls.
- Never go into a restaurant ten minutes before closing time.
As someone who spends a significant proportion of his life away from home on business, I find it hard to disagree with any of them. You’ll be able to spot me easily if I’m in a town that I’m unfamiliar with as I spend an inordinate amount of time pacing up and down the high street, trying to find establishments that break none of the first six rules. Sometimes I spend so much time doing this that I fall foul of the seventh.
A couple of recent experiences have suggested to me that Bryson has missed a couple of the more important rules, so I’m offering them up here by way of public service, hoping to add to the collective knowledge of long-suffering travelling consultants everywhere.
My first new rule is:
- Never eat in a restaurant where you have to pay for the food before you can eat it.
I’d argue that this is self-evident. The only places that make you pay up-front are the ubiquitous burger and chicken palaces (and they barely qualify as restaurants in the first place, despite what they might claim on their signs) and carveries. Carveries always have a chef on duty wielding sharp implements who seems to take great delight in cutting meat so thinly that you can still see your plate through the slices. Take my advice – when you’re faced with a disappointing portion of food in a carvery, you should never complain to the chef wielding the knife. I’ve often thought that Bryson avoids these places anyway as perhaps they’re the ones that are most likely to have blood-stains on the walls from customers who did, following the lead of Oliver Twist, have the temerity to ask for more.
My second new rule is:
- Never eat in a restaurant where the chef won’t let you take pictures of the food.
This seems to me to be a nice counterpoint to Bryson’s first rule. In 1989, when The Lost Continent was published, digital camera phones, Twitter, Facebook, Snapchat and all of those other seemingly indispensable features of life today didn’t exist, so I can forgive him for overlooking this rule. The thing is, if you’re eating on business expenses and you happen to find yourself in such a place, the chances are it may have a few Michelin stars to its name. If this is the case, there’s no way your manager will ever authorise your expense claim.
I suspect that there may be some other rules of restaurant dining that I’m not aware of, so I’d be very happy to know what yours are! As I do much of my studying with a book or iPad propped up on a restaurant table somewhere in the country, I take these rules rather seriously. Perhaps a little too seriously if I’m honest. But who can blame me – after all, a poor restaurant decision might make all the difference between a good and a bad assignment mark …
This article was originally written for the University of Leicester Student Blogs, 23rd February 2014.
I remember vividly when I first became interested in psychology. It was 1985 and I was a final year computer science undergraduate. One of the modules I took covered artificial intelligence and expert systems. The module was aimed at two distinct audiences – people like me, who’d spent the previous two and a half years being taught the fundamentals of programming, computational theory, electronics and robotics, and MSc psychology students who’d spent their academic careers studying … well, I had absolutely no idea what they’d been studying if I’m honest.
At the end of the first lecture, with all the computer scientists sat on one side of the room and the psychologists on the other, the lecturers (one from each department) asked if there were any questions about the module. One of the psychologists asked if we would be using or writing a computer program that simulated the way that the human brain worked. Remember, this question was being asked in 1985, when the most powerful computer in the university probably had less computing power than the iPhone you’re trying to read this on. Worse, that amount of raw computing power was considered sufficient to support 20 or 30 people writing and running code at the same time. But I digress…
There were two very distinct reactions to the question. All of the psychologists thought that this would be a really good thing to do, whereas all of the computer scientists (me included) thought that this was a ludicrous idea. We all laughed heartily. Why would anyone want to write a computer program that imitated the way the human brain worked? How inefficient! If humanity was ever going to get anything useful from artificial intelligence, surely it had to be precisely that – artificial – and using efficient algorithms to make the best use of scarce computing resources.
By the end of the module there was rather more understanding between the two camps and I’d filed away a note in my brain to have a proper look at this cognitive psychology stuff when I had some time (which didn’t happen until 2007, but life has this habit of getting in the way of study intentions of course). It no longer seemed silly to suggest that there was value in creating computer programs to mimic the way the human brain tackled problems.
Fast forward to today – and even with the vast increase in computing power available, artificial intelligence still seems to be mired in the mid-1980s. Even when it does surface in the media it seems to be treated as a joke item. For example, Radio 4’s Today programme last Friday morning had a truly toe-curling interview between the researcher behind Cleverbot and John Humphrys. The image below is a conversation I’ve just had with it. Cleverbot speaks first. Its output is rather unimpressive, frankly, even if the technology and algorithms behind it are very impressive indeed.

For computer programs to be truly first-class citizens in a conversation, simply passing a stimulus-and-response test like the one Turing envisaged isn’t anything like enough. Using language properly is about so much more than combing through large databases of previous conversations and finding responses that appear to make syntactic and semantic sense. As discursive psychology points out, we use language strategically to achieve particular social outcomes. We do this by selecting an appropriate discursive repertoire, taking a position within it and then applying language to justify our actions or to blame others. It’s hard to see how a computer program could ever be so in tune with the subtleties of discourse, and with how it is used in social interactions, as to be truly convincing.
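For what it’s worth, the “database combing” approach I’m dismissing here can be captured in a toy sketch like the one below. This is purely illustrative Python of my own devising – the miniature corpus and the crude word-overlap matching are invented for the example, and Cleverbot’s real data and algorithms are, as I say, far more sophisticated – but it shows how pattern-matched replies can look superficially plausible while the program has no stake whatsoever in the conversation.

```python
# A toy sketch of retrieval-based chat: reply with the canned response
# whose stored prompt shares the most words with the user's input.
# (Hypothetical mini-corpus and scoring, purely for illustration.)

def tokenise(text):
    """Split text into a set of lower-case words."""
    return set(text.lower().split())

# Invented corpus of (prompt, response) pairs.
CORPUS = [
    ("hello how are you", "I'm very well, thank you."),
    ("do you like music", "I like all kinds of music."),
    ("what is the weather like", "It looks like rain to me."),
]

def reply(user_input):
    """Return the response whose prompt overlaps the input the most –
    no memory, no context, no social purpose behind the words."""
    words = tokenise(user_input)
    best_response, best_score = "I don't understand.", 0
    for prompt, response in CORPUS:
        score = len(words & tokenise(prompt))
        if score > best_score:
            best_response, best_score = response, score
    return best_response

if __name__ == "__main__":
    print(reply("Hello! How are you today?"))  # -> "I'm very well, thank you."
```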
Perhaps my initial scepticism about modelling the human brain with computer programs (or ‘apps’ as I suppose we have to call them these days … how language changes!) wasn’t so far short of the mark after all. Why should we bother to create apps that try to simulate the way that the brain works when other ways of solving problems with a computer are so much more efficient and reliable?
This article was originally written for the University of Leicester Student Blogs, 16th February 2014.
The problem with writing a blog, particularly when it concerns your study plans and ambitions, is that it creates hostages to fortune. For example, in December I wrote the following words:
As an experienced distance learner, I’ve always found it absolutely essential to use this time of year to get ahead of the schedule, so that the inevitable issues that crop up in my working and home life don’t totally derail the study effort.
Well, that didn’t work out terribly well. I’m currently less than 48 hours away from my module assignment deadline with only the smaller of the two parts completed (500 words) and suffering a complete crisis of confidence about what I’ve written for the first part of it (2,500 words). I also remember writing this in January:
I’m now working my way through the second module of the Occupational Psychology MSc – Personnel Selection and Assessment. I also appear to be on track as far as my own personal schedule is concerned …
How long ago that seems! I’ll be back here again when I’ve finally stopped procrastinating and either finished off the assignment … or possibly, when it’s finished *me* off.
This article was originally written for the University of Leicester Student Blogs, 8th February 2014. I survived the experience. But only just …