“Real Artists Ship”

Colin Johnson’s blog

Archive for the ‘Education’ Category

“That employability shit”

Wednesday, April 26th, 2017

It is depressing, yet informative, that the end result of no-doubt endless meetings and careful planning and strategy documents and analyses of employability results in the NSS and all that waffle was the following fragment of conversation between two students on the bus t’other week, discussing the assessments that they had to finish by the end of term:

“…and then there’s [whatever it was], but it’s just that employability shit, so it doesn’t matter.”

(Meta-lesson. You learn a lot by getting the bus up to campus.)

Professional Practice

Friday, February 17th, 2017

Is there such a thing as a set of skills that apply across all of the professions? When I first started to come across (still rather rare) university departments of “professional practice”, I was bemused. Professional practice in what? Is there really enough in common to being a nurse, barrister, dentist, accountant, town planner, occupational therapist, etc. etc. to call all of their activities “professional practice”? These seem, at least initially, to consist almost entirely of a lot of profession-specific skills/knowledge/understanding.

But, over time, I’ve started to wonder. Perhaps we are at the stage with professional practice schools that we were at with business schools a few decades ago. There was certainly a cynicism at one point that “business” could be taught generically. What business? Is there really enough in common to running a bassoon factory, a chain of gyms, an online career consultancy, an au pair agency, etc. etc. to call all of their activities “business”? At one point, these would have been seen as needing radically different skill-sets, but over time we have started to realise that some common understanding of finance, accountancy, PR, marketing, project management, strategy, staff appraisal, etc. are useful in all areas of business, alongside a knowledge of the specific business domain.

Perhaps there is something to be gained by bringing together dental nurses, architects, and solicitors for part of their education, and having some common core of education in e.g. dealing with clients. Perhaps the idea of a generic professional practice school isn’t such a ludicrous idea after all.

Ironic (2)

Monday, January 16th, 2017

“For years now, I’ve been doing the same presentation on change for …”

(actually from quite an interesting article: Lessons from the A47 and the University Bubble).

Personal Practice (1)

Tuesday, April 26th, 2016

My colleague Sally Fincher has pointed out that one interesting aspect of architecture and design academics is that the vast majority of them continue with some kind of personal practice in their discipline alongside carrying out their teaching and research work. This contrasts with computer science, where such a combination is rather unusual. It might be interesting to do a pilot scheme that gave some academic staff a certain amount of time to do this in their schedule, and see what influence it has on their research and teaching.

Interestingly, a large proportion of computer science students have a personal practice in some aspect of computing/IT: it is striking quite how many of our students are running a little web design business or similar on the side, alongside their studies.

Highs and Lows

Wednesday, February 24th, 2016

The highs and lows of work. Spent 2 hours in a meeting on Monday discussing items that were flagged on the agenda as “not for discussion”. Then spent 4 hours yesterday working with students on our new Computational Creativity module; they were really engaged with the material, willing to discuss it, and had clearly read the papers in detail before the class—proper “flipped classroom” stuff. I wonder what today will bring?

Get Yerself an Edderkation

Saturday, February 6th, 2016

A news item from my former school’s website:

This is a reminder to parents of Year 11 students that the Year  11 Commnedation Eveming due to take place tonight has been postponed (ref letter sent home last wek).

Agility 17, Wisdom 8

Wednesday, September 9th, 2015

Software engineering education needs to give students a more nuanced understanding of software development processes than one which causes students to say, in effect “There are two kinds of software development: waterfall, which is noisy and old fashioned and so we won’t use it, and agile, which we will use because it means that we can do what we like.”

Ironic (1)

Thursday, July 2nd, 2015

Email. Subject: “What is the future for university staff unions?” Body: “This message has no content.”

Best One Evar

Thursday, December 4th, 2014

I had always thought that these were a myth, but I found one in the wild yesterday:


Quality Assurance? (1)

Thursday, November 6th, 2014

It seems that one unfortunate side effect of “quality assurance” as it is currently constituted in many organisations is to ensure that real work cannot happen in committees as it is meant to. Because committee minutes become the primary means of evidence that an organisation is running as it claims to, there is a reluctance to show anything in those minutes that analyses how things are really happening. As a result, these sorts of discussions—discussions about quality enhancement, natch!—happen in an undocumented shadow system. This is of particular detriment to attempts to involve stakeholders (for example, student representatives in universities) in the process, because they are rarely involved in these shadow systems.

Incomprehension (1)

Wednesday, October 22nd, 2014

A while ago I had a conversation with a colleague, that went something like this:

Me: “I’ve come across a new book that would be really useful to you for the module you’re teaching next term.”

Colleague: “I don’t really think I need that.”

Me: “No, it’s really good, you will find it really useful.”

Colleague (rather angry): “I appreciate your suggestions, but I REALLY DON’T NEED A BOOK ON THE SUBJECT.”

It eventually transpired that my colleague was interpreting “you will find this book useful” as “because you don’t know the subject of the course very well, you will need a book to help you learn the subject before you teach it to the students”. By contrast, I meant “you will find it useful as a book to recommend to your students”.

This subtle slippage, between “you” taken literally and “you” used in a slightly elided way to mean “something you are responsible for”, is easily misunderstood. Another example comes up frequently when I am discussing with students some work that they have to do on a project. I will say something like “you need to make an index of the terms in the set of documents”, using the common elision in software development of “you need to” to mean “you need to write code to”, not “you need to do this by hand”. Most of the time the students get this, but on a significant minority of occasions there is a look of incomprehension on the students’ faces as they think I have asked them to do the whole damn tedious thing by themselves.

Teaching Specialities

Saturday, September 20th, 2014

University research often works well when there is a critical mass in some area. University degrees, by contrast, usually aim to give a balanced coverage of the different topics within the subject. This mismatch is usually seen as a problem: how can a set of staff with narrow research specialities deliver such a broad programme of studies?

One solution to this is to encourage staff to develop teaching specialities. That is, to develop a decent knowledge of some syllabus topic that is (perhaps) completely contrasted with their research interests.

One problem is that we are apologetic with staff about asking them to teach outside of their research area. Perhaps a little bit of first year stuff? Okay, but teaching something elsewhere in the syllabus? We tend to say to people “would you possibly, in extenuating circumstances, just for this year, pretty, pretty, please teach this second year module”. This is completely the wrong attitude to be taking. By making it sound like an exception, we are encouraging those staff to treat it superficially. A better approach would be to be honest about the teaching needs in the department, and to say something more like “this is an important part of the syllabus, no-one does research in this area, but if you are prepared to teach this area then we will (1) give you time in the workload allocation to prepare materials and get up to a high level of knowledge in the subject and (2) commit, as much as is practical, to making this topic a major part of your teaching for the next five years or more”.

In practice, this just makes honest the practice that ends up happening anyway. You take a new job, and, as much as the university would like to offer you your perfect teaching, you end up taking over exactly what the person who retired/died/got a research fellowship/moved somewhere else/got promoted to pro vice chancellor/whatever was doing a few weeks earlier. Teaching is, amongst other things, a pragmatic activity, and being able to teach anything on the core syllabus seems a reasonable expectation for someone with pretensions to being a university lecturer in a subject.

Is this an unreasonable burden? Hell no! Let’s work out what the “burden” of learning material for half a module is. Let’s assume—super-conservatively—that the person hasn’t any knowledge of the subject; e.g. they have changed disciplines between undergraduate studies and their teaching career, or didn’t study it as an option in their degree, or it is a new topic since their studies. We expect students, who are coming at this with no background, and (compared to a lecturer) comparatively weak study skills, to be able to get to grips with four modules each term. So, half a module represents around a week-and-a-half of study. Even that probably exaggerates the amount of time a typical student spends on the module; a recent study has shown that students put about 900 hours each year into their studies, in contrast with university assertions that 1200 hours is a sensible number. So, we are closer to that half-module representing around a week’s worth of full-time work.

Would it take someone who was really steeped in the subject that long to get to grips with it? Probably not; we could probably halve that figure. On the other hand, we are expecting a level of mastery considerably higher than the student, so let’s double the figure. We are still at around a week of work; amortised over five years, around a day per year. Put this way, this approach seems very reasonable, and readily incorporable into workload allocation models.
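The back-of-envelope arithmetic above can be made explicit. This is just a sketch, and it relies on figures the post does not state: eight modules a year (four per term over two terms), a 40-hour working week, and an 8-hour working day.

```python
# Rough check of the half-module "burden", under assumed figures:
# 8 modules per year (4 per term, 2 terms), a 40-hour working week,
# and an 8-hour working day. None of these are stated in the post.

study_hours_per_year = 900      # the figure from the study cited above
modules_per_year = 8            # assumption: 4 modules in each of 2 terms

# Hours of study represented by half a module.
hours_per_half_module = study_hours_per_year / modules_per_year / 2

# Express that as full-time weeks, and amortise it over five years.
weeks_of_work = hours_per_half_module / 40
days_per_year_amortised = hours_per_half_module / 8 / 5

print(f"Half a module: {hours_per_half_module:.0f} hours "
      f"(~{weeks_of_work:.1f} weeks of full-time work)")
print(f"Amortised over five years: ~{days_per_year_amortised:.1f} days per year")
```

Under these assumptions the half-module comes out at roughly 56 hours, about 1.4 full-time weeks, and a little over a day per year when spread over five years; changing the assumed module count or week length shifts these figures, but they stay in the same ballpark as the rough estimates above.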

Justice (1)

Friday, September 12th, 2014

My heart sinks whenever I speak to a student who says “I thought that something was wrong (e.g. with marking), but I didn’t want to offend the lecturers by suggesting it”. Sometimes the implication is worse: “I don’t want to bias lecturers against me in future classes by being seen to be a troublemaker”, or “I didn’t want to challenge the accusation of plagiarism, even though I had a good explanation, because I don’t want the lecturers to mark me down on future assessments”.

My impression is that universities are, on the whole, not like this. Indeed, the idea that we have time to pursue grudges like this, even if we had the inclination (and we don’t), seems risible from where I sit. Nonetheless, we have a genuine problem here; one of “justice being seen to be done” as well as justice being done.

Universities try to deal with complaints, plagiarism cases, problems with marking, etc. by having a clear, unbiased system—as much as there is a model at all, it is the judicial system. But, some students don’t see it like that. However much we emphasise that the process is neutral, there is always a fear of those exhortations being seen as a smokescreen to hide an even deeper bias. The same, of course, is true in the broader world—disadvantaged groups believe (in some cases correctly) that the justice system is set up against them, and no amount of exhortation that it is a neutral system will help.

What can we do? Firstly, I wonder if we need to explain more. In particular, we need to explain that things are different from school, that students are treated as adults at university, and that a university review process consists in a neutral part of the university making a fair judgement between the part of the university that is making the accusation and the student. Students entering the university system have only the school system to base their idea of an educational disciplinary/judicial system on, and that is a very different model. Certainly when I was at school, it was a rather whimsical system, which could have consequences for other aspects of school life. In particular, something which wound me up at the time was the reluctance of teachers to treat issues as substantive; if someone hit you over the head, and you put your hands up to stop them, then you were both seen as “fighting” and had to do detention. Universities are not like this, and perhaps we need to emphasise this difference more.

A second thing is to recruit student unions to play a greater role in the process. I’ve been on dozens of student disciplinary and appeal panels over the years, and the number of students who exercise their right to bring someone with them is tiny. If I were in their shoes, I’d damn well want a hard-headed union representative sat next to me. Speaking as someone who wants the best for everyone in these situations, I’d like them to be as nonconfrontational as possible; but, I wonder if making them slightly more adversarial would give a stronger reassurance that they were working fairly.

Thirdly, I wonder about the role of openness in these systems. One way that national judicial systems increase confidence in their workings is by transacting their business in public; only the rarest of trials are held in private. There is clearly a delicate issue around student and staff privacy here. Nonetheless, I wonder if there is some way in which suitably anonymised cases could be made public; or, whether we might regard the tradeoff of a little loss of privacy to be worth it in the name of justice being seen to be done. Certainly, the cases that go as far as the Office of the Independent Adjudicator are largely public.

Continuous Improvement

Wednesday, July 30th, 2014

Is there anything that we can get continuously and consistently better at through extensive and sustained work lasting longer than a decade or so?

A question that has been asked a number of times on discussion boards is “could someone of decent fitness reach Olympic standard at some sport if they started at the age of 25?”. The usual response is that there are some examples—equestrian sports, sailing, archery, shooting—where there are serious international competitors aged in their 40s and 50s, and so it doesn’t seem unreasonable that someone could get there starting at 25. A further strand to this argument is that those sports aren’t populated only by competitors in that age range: there are perfectly competitive people in, say, their twenties competing against the older competitors. It isn’t as if you need to start at 5 years old and put in 50 years of practice before you stand a chance of being up there with the best. You plateau out—whether at local club level or Olympic gold medal level—after a number of years of sustained effort. You don’t just continue, Duracell-bunny-like, to get better and better as you put in the effort over the years.

So, experience might not be a disadvantage in these activities, but beyond a certain (rather advanced!) point it isn’t an actual advantage either. Are there any areas where it is almost necessary to have put in the years to be any good? I struggle to think of anything. Let us consider some other areas of human endeavour.

In science and maths, there doesn’t seem to be anything like this. The rather addleheaded idea that “mathematicians are burned out at 25/30/35” is on the wane. Nonetheless, it seems that with the right combination of study and focus and talent you can get to a research-frontier understanding of most areas of science and maths in about ten years of hard study, from a fairly standing start. Some topics have gotten pretty complex, but not so much that you need to spend ten years learning the basics, then another ten years learning how to use those basics, then another ten years learning about the real frontier stuff.

Craft skills similarly seem to need a number of years to reach professional standards, after which there isn’t really a notable advance in skill. There might be more diversity of practice, richer application of skill, etc., but it isn’t as if we only regard as world-class the craft-work of makers in their 60s, say. We would probably make a distinction in basic skill between the work of a potter with one year’s experience and one with ten years’. But we wouldn’t make the same distinction between one with twenty-one years’ experience and one with thirty; we would talk instead of the ideas that they use their skills to execute, not say that the thirty-year one was better at handling the materials.

The arts are more complicated. It is possible to be a child-genius performer. Less so a creator. With the exception of the occasional high-concept work, the number of writers/composers/painters who gain recognition equal to that of the established practitioners at the age of 15 is nugatory. Novelists in particular are generally older. This is presumably something to do with the sheer length of novels. To bash through a few hundred mediocre poems, songs or drawings is just part of the process of becoming a practitioner in those areas; to bash out a few hundred novels whilst getting to grips with the medium is impossible. In music the 10,000 hour “rule” seems to hold sway, overall. A top-ten band might seem to be full of fresh-faced youths, but probably fresh-faced youths who have been practising guitar in the garage every spare hour since they were 11. Again, the high-concept exception applies, with punk as a clear example. But, again, once we are past the 10,000 hour mark, we aren’t really into “improvement” any more, we are into depth and diversity. Orchestral conductors are usually older, but that is probably a “second job” phenomenon: you probably don’t become a conductor until you have spent a good number of years studying an instrument and being a player. A similar argument applies to football managers, another wunderkind-free zone.

Talking about second jobs, there are the areas in which a certain amount of relevant lived experience is appropriate. There aren’t going to be any whizzy 12-year-old marriage guidance wunderkinder. But the relevant experience isn’t in the job as such; it is that the job builds on reflection on life experience.

Perhaps parenting? Parents are said to be much more relaxed with their second and subsequent children, and I’ve met the occasional parent of four or five kids who basically seems to have “a system” after the first two or three, but it doesn’t seem like the tenth would be any better parented than the fourth (indeed, sheer weight of numbers might make it harder). Similarly, the advice of well-intentioned grandparents doesn’t seem obviously better than that of the parents.

So, are there any examples? Any area where the second decade or more of work gets you to a different level of achievement, such that people at the end of that first decade are regarded as amateurs/students? I struggle to think of one.

Dilemma (1)

Wednesday, July 16th, 2014

Here’s an interesting situation. Several times a year, I take part in university open days, where I sit behind a desk answering questions about courses from prospective students. Typically, at the undergraduate open days, the punters consist of a shy 16- or 17-year-old and one or two rather more confident parents.

Here’s my problem. I don’t want to make the assumption that the older person is the accompanying parent and the younger person the prospective student. I’d be mortified if I made that assumption on the day that a parent, bringing their child with them for moral support or lack of childcare, was the prospective student. But, this happens so rarely that the parents and student just sit down assuming that I am going to read the situation as the obvious stereotype.

How should I react in this situation? Asking “which of you is the prospective student?” is treated as a joke or, more troublingly, as evidence of density or weirdness on my part. But I still feel uncomfortable making the assumption. I’ve taken to starting with a broad, noncommittal statement like “So, what can I do for you?” or “What’s the background here then?” and hoping that it will become obvious. That isn’t too bad, but there might be a better way.

More abstractly: we try to avoid stereotypes and making assumptions about people and situations based on initial appearance. But, what do you do when the stereotype is so commonly true that even the people who fit it are expecting you to react using the stereotype as context?

PhD Supervision

Wednesday, February 26th, 2014

PhD supervision is really easy. All you have to do is muddle your way through supervising your first 30 students and after that the rest are really straightforward.

Job Titles (2)

Tuesday, January 21st, 2014

Two real job titles from UK universities:

Deputy Pro Vice-Chancellor

Associate Executive Pro-Vice-Chancellor

The hyphenation is very variable. One poor sod is described as an “Associate Executive Pro-Vice Chancellor”, which is just about the worst possible of the large number of hyphen placements possible between those five words.

I think this reflects the idea that “pro-vice-chancellor” is now seen as a single unit concept, not as a deputy-of-a-deputy-of. Indeed, spelling out the phrase is very rare; they are almost invariably referred to as “PVCs”, which provides much confusion and amusement outwith the academy.

Academic Spaces are Consulting Rooms

Thursday, November 21st, 2013

What are academics’ workspaces about? There is sometimes a view, commonly shared among administrators and architects, that they are “offices”, and that the vast majority of work is desk-based, working at computers or with books or papers. There is also a vague idea that this is a bad thing, and that things would be in some vague way better if people weren’t “siloed” in offices, and were instead in some kind of open-plan space where they might “communicate” better with each other (about what, is usually unstated). This might just be a half-arsed excuse for money-saving.

The idea of desk-based work is also emphasised by a view that these rooms are “studies”. This is often offered as a counter-narrative to the open-plan idea, individual workspaces being seen as important for that kind of concentrated work.

I don’t recognise either of these models. Rather, my room is more a “consulting room”. Looking through my diary for the last few weeks, I am in my room for about half the time; the rest of the time I am teaching, in meetings, at interviews, at lunch, etc. In a typical week I have about:

  • 4-5 hours of meetings with PhD students and postdocs;
  • 2-3 hours of meetings with project students;
  • a few short meetings with students about coursework, progress, or staff-student liaison issues; say about 2-3 hours a week. A number of these are rather confidential;
  • a couple of formally arranged meetings with colleagues for an hour or so each;
  • a few shorter meetings with colleagues; similarly, a number of these have confidentiality issues;
  • a couple of Skype discussions with colleagues elsewhere for 2-3 hours total.

So, a total of around 15 hours a week of being in my office talking to people.

The idea that I could work in an open-plan space and “book meeting rooms” for occasional meetings is risible. Academics’ workspaces are closer to GPs’ consulting rooms than to offices or studies, and we would regard it as ludicrous to say that GPs should “book a consulting room” on the odd occasion that they see a patient.

Provocative Thoughts (1)

Friday, October 18th, 2013

“MOOCs are Mensa for the 21st Century.”

Sub-disciplines (1)

Friday, October 11th, 2013

When I first started to meet humanities academics, it surprised me how many defined their interests in terms of nations: they were “Scottish historians”, their subject was “British cinema”, etc. There were even things like “American philosophy”—the idea that something as abstract as this can be influenced by something as concrete as nationhood still discombobulates me.

This struck me as rather odd. I’d assumed that this sort of characterisation would be super-naive. I would not have dreamed of asking a historian which country they specialised in: I assumed this would be like asking a mathematician which of the four basic arithmetic operations they specialised in, or (more controversial, this) a computer scientist which programming language they use.

I suppose I thought that at the research level, humanities would be characterised by larger, more abstract problems: the relationship between expressiveness and language, the common features of political systems throughout world history, the interplay between economic forces and art produced, etc., etc.

Of course, the humanities do deal with questions at this level of abstraction; but, largely through the lens of a particular example. There is a similarity here with biology. Biologists will characterise themselves as being experts in fruitflies or large primates or whatever; I have just about gotten over a sense of mild amusement at seeing signs on campuses for things like the “British Yeast Symposium”. Of course, they are using these as a means of investigating deeper issues about gene expression, development, virus transmission, or whatever. It is easier to focus on one organism, as the techniques vary so much for different organism types. It is similar with history, but perhaps the issue is less one of techniques than one of accumulated knowledge.

Why did I make this assumption that this characterisation was naive? I suppose I am used to this from studying mathematics, where we leave behind concreteness at a dizzying rate. But, then, it is possible to study mathematics in abstraction; once you have defined a mathematical concept formally, you can deal with it as a formal object, rather than through concrete examples. This isn’t so possible in the humanities; theoretical points are usually argued via the concrete examples. Perhaps there is scope, in some areas, for “big data” methods to change this—for example, having tools that allow historians to take a concept and a database of its realisations in different historical periods and ask questions about that mass of realisations, rather than give a couple of examples and a vague hint that this is a large scale phenomenon.