MITx 6.002x: post mortem 2
As promised, I’m now going to spend a little time reflecting on the experience of 6.002x as a whole.
1. Course content
As I said in my first post-mortem post, I was a little disappointed that there wasn’t more digital and more practical electronics in the course. There were a couple of weeks where I honestly thought I’d joined a maths course rather than an electronics one. However, having survived the whole experience, I believe the maths really was necessary to gain a proper insight into the subject. The content delivered was coherent and usually interesting (but please, the overuse and misuse of “fun” and “cool” is really annoying to English ears of my age), so I think it was my perception of the course before I took it that was to blame for my (very) slight disappointment with its scope.
2. Teaching method
6.002x attempted to re-create the classroom experience by presenting lectures in sequences. The vast majority of these consisted of voiced-over whiteboards (with what was sometimes a hilariously inaccurate scrolling transcript) combining legible, pre-drawn information with, by the end, almost illegible handwriting. If this approach is to be retained in future, legibility has to be a key area of focus. There’s a big difference between the meanings of vi, vI, Vi and VI in large/small-signal analysis, and while the distinction is obvious when you’re listening to the soundtrack, it’s definitely not easy to unpick if you review the lecture slides afterwards. The transcript rarely got these differences right either.
The lecture segments of each sequence were interspersed with tests and video demos. The demos were interesting, if amateurishly shot and produced. I gave up doing most of the test problems after the first couple of weeks as I didn’t have enough time to devote to them as well as to understanding the material presented and completing the labs/homeworks. That observation says more about the way I like to learn than about any problem per se with tests or quizzes as a teaching method. I always used to largely ignore those kinds of exercises in OU courses too. At least they were optional on 6.002x!
Sadly, I didn’t get much opportunity to watch the tutorial videos. The few that I did watch were excellent and had far clearer handwriting than the main lecture segments.
The recommended textbook for the course, at a little over 1,000 pages, was a monster! While incredibly comprehensive, it was rather dull and worthy in tone – very unlike the personality of at least one of its authors! It really made me appreciate how good OU textbooks are. Even the monster that was the DD303 Cognitive Psychology textbook was more digestible. However, the textbook was made useful by the excellent signposting of the chapters in the lecture sequences. This alone made it rather more approachable and less daunting than it otherwise would have been.
However, my main criticism of the teaching approach was its slavish attempt to replicate the classroom and university experience. Frankly, it doesn’t translate at all well and becomes intensely irritating very quickly. The course team would do well to examine the lessons learned by the OU over the last 40+ years about how to present degree-level material to distance-learning students. I honestly don’t think that voiced-over handwritten lecture slides, presented in real time with the lecturer’s mistakes being corrected on the fly, work as a teaching technique in this environment. It certainly didn’t enhance my learning experience.
3. The MITx virtual learning environment (VLE)
Beautifully minimalist, well organised and easy to navigate and find material in, it makes me wonder what on earth the OU and other HE institutions see in bloated monsters like Moodle. This VLE framework will be a real asset to edX in the future.
The discussion (question) forums have a couple of excellent features – such as the ability to tag posts (I found it incredibly useful, and it was usually well done by contributors) and the ability to easily identify staff contributions. However, the karma system (whereby students earn points from peers and can eventually become forum moderators) is ridiculous. There is rarely any correlation between being a valuable contributor and being a good forum moderator. Indeed, during the second half of the course a number of karma-appointed moderators appeared to positively relish their power, and in some cases I believe it was abused – bullying behaviour is never acceptable, in real life or online.
In future presentations I hope that at least two forums are set up – one for on-topic questions and the other for general chit-chat. I also think that it would be better to appoint student moderators and have them adhere to a published code of conduct – a bit like the OU(SA) forums’ ‘spirit of conference’ charter – rather than using a karma system that can be easily subverted.
I can’t remember if I’ve ranted about wikis in the past. The one created for the course was OK as far as wikis go, but they are useless as a learning tool. Psychological studies show that it’s far better to spend time actively making notes in a way that suits your learning style than to try to make them conform to some arbitrary ‘best practice’ format. You can always share them ‘as-is’ afterwards if you really want to – and a couple of students did just that. Their notes (alongside my own) were far more valuable to me than the sterile environment of the wiki.
The star of the VLE, though, was undoubtedly the circuit sandbox – even with its continually irritating little quirk of deleting a component when I was trying to input a value for it! Very, very good indeed, and I hope that more use is made of it as a teaching tool in future 6.002x presentations.
I liked the way that labs and homeworks were provided every week and contributed to the overall course grade. I even think that the ‘exam’ format used on 6.002x (more akin to an OU EMA rather than a traditional exam) is a reasonable way of proceeding with assessment in the future, rather than having to incur the costs of travelling to and taking exams in a testing centre – provided that something can be done to deal with plagiarism and cheating.
However, I really struggle to see how the totally automated ‘right/wrong’ assessment format used could be extended to other science or social science courses (such as psychology) that rely on the construction of well-argued and evidenced essays – and I suspect it would never transfer as an assessment format for humanities and arts courses.
From the MITx 6.002x course information page:
6.002x had 154,763 registrants. Of these, 69,221 people looked at the first problem set, and 26,349 earned at least one point on it. 13,569 people looked at the midterm while it was still open, 10,547 people got at least one point on the midterm, and 9,318 people got a passing score on the midterm. 10,262 people looked at the final exam while it was still open, 8,240 people got at least one point on the final exam, and 5,800 people got a passing score on the final exam. Finally, after completing 14 weeks of study, 7,157 people have earned the first certificate awarded by MITx, proving that they successfully completed 6.002x.
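To put those figures in perspective, here is a quick back-of-the-envelope calculation of the completion funnel as percentages of the registrant total. This is just a sketch over the numbers quoted above; the stage labels are my own shorthand.

```python
# Completion funnel for 6.002x, using the figures quoted above.
# Each stage is expressed as a percentage of the 154,763 registrants.

funnel = [
    ("registered",                  154_763),
    ("scored on first problem set",  26_349),
    ("passed the midterm",            9_318),
    ("passed the final exam",         5_800),
    ("earned a certificate",          7_157),
]

registrants = funnel[0][1]
for stage, count in funnel:
    print(f"{stage:28s} {count:>7,} ({count / registrants:5.1%})")
```

So roughly one registrant in twenty earned a certificate – a drop-out pattern that, given the numbers involved, still left a very large cohort at the end. (Note that the certificate figure exceeds the final-exam pass figure; the course information page does not explain the overlap, presumably because certification depended on the overall grade rather than the final alone.)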
I’ve had a great time taking this course (it’s released the inner geek in me) and I’ll certainly take future edX courses should they appeal to my interests. I didn’t spend as much time as I really should have done on the course – probably a maximum of 5 or 6 hours per week (with the exception of the midterm and final exams, both of which took me around 8 hours to complete). Leaving aside the nonsensical hype about Massive Open Online Courses (MOOCs) and the way that they will change education forever (they won’t, but they’re a valuable addition to the overall HE landscape) I wish the team at edX all the best for the future.