A Zambretti weather forecaster

When I was recovering from my stem cell transplant last year, I built a weather forecaster. It uses a Raspberry Pi, a BME280 sensor and a 20×4 character LCD screen. The forecasting algorithm I’d written for it was rudimentary, to say the least. However, earlier this year I came across a device known as a Zambretti forecaster. These were made by Negretti and Zambra for the UK market in the 1920s.

The Zambretti device uses air pressure, the direction of pressure change, the season and the wind direction to make a forecast. Depending on what you believe on t’internet, a forecast accuracy of 90% is possible. You can buy replicas from a popular forest-based e-commerce site if you want to. I didn’t, but with the help of a search engine and a number of people who’ve been down this route before, I wrote my own Zambretti forecasting algorithm. In FORTRAN 77, naturally.
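For anyone curious about the mechanics, here’s a rough sketch of the approach in Python (my real implementation is the FORTRAN 77 one). The constants below are the approximations to the original dial that circulate online, and the forecast texts are a tiny subset with my own wording, so treat both as assumptions rather than gospel:

```python
def zambretti_z(pressure_hpa, trend):
    """Approximate Zambretti number from sea-level pressure (hPa) and
    pressure trend ('falling', 'rising' or 'steady'). Lower numbers
    indicate more settled weather. The constants are commonly
    circulated approximations, not verified Negretti and Zambra
    originals."""
    if trend == "falling":
        z = 130 - pressure_hpa / 8.1
    elif trend == "rising":
        z = 179 - 2 * pressure_hpa / 12.9
    else:  # steady
        z = 147 - 5 * pressure_hpa / 37.6
    return round(z)

# A tiny, illustrative subset of the forecast table (wording is mine):
FORECASTS = {
    5: "Fine, becoming less settled",
    12: "Fine, possibly showers",
    22: "Fairly fine, improving",
}

z = zambretti_z(1013, "falling")
print(z, "-", FORECASTS.get(z, "Unsettled"))
```

A fuller implementation also nudges the number with the season and the wind direction, which is what the moveable scales on the original brass device did.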

The results so far have been encouraging. However, I’m of the opinion that the accuracy I’m perceiving may be due to the Forer effect, rather than the quality of the algorithm. It’s true that different barometric conditions do produce different forecasts. However, I remain unsure as to the real difference between “Fine : showers possible”, “Fair : showers likely” and “Fairly fine : showers”. Not much, I suspect.

Anyway, it was producing good enough results to justify investing a few more pounds in a second LCD display. This retrieves the forecast made by the Raspberry Pi and sensor, covered in cobwebs in the garage, and displays it in more comfortable surroundings. This time I’ve stuck to C as my language of choice.

A Raspberry Pi 4B plus a 20×4 LCD screen, showing a Zambretti weather forecast all the way from my garage.

The current release of my Zambretti forecaster with remote display screen, with instructions, is available on GitHub. Some (most) of the code could definitely do with improvement …

Raspberry Pi – camera box

This weekend, I was finally happy that I’d managed to implement a reasonable temperature and humidity project as well as a motion detecting camera for my Raspberry Pi. I decided to invest £19 in a ModMyPi camera box to consolidate them onto my Pi 3. It arrived today, and after an evening’s fun this is the result.

Lid open for testing. The DHT22 temperature and humidity sensor is at the front, with the motion sensor and camera mounted in the lid.

In situ, installed in the garage.

I think it looks much better than my original attempt, even if the rather fiddly assembly took a couple of hours (with testing) rather than the 10 minutes claimed by the manufacturer! It also means that as my camera is now mounted the correct way up, I no longer need to rotate the image by 180 degrees in my code …

Update: After I’d installed this in the garage, I started to get a large number of false positives. A change back to my Pi 2 made little difference (although the original version I’d put together without the DHT22 had worked well). Finally, soldering a 10k pull-down resistor between the data and ground wires of the PIR detector seems to have resolved the issue of the data pin going high without it sensing movement.

Humidity and temperature monitoring with a Raspberry Pi

Continuing my quest to protect my Caterham 7 using my Raspberry Pi, I bought a bargain box of 37 assorted sensors for £21. One of the devices supplied was a DHT11, capable of monitoring temperature and humidity. A quick internet search led me to discover the very useful pigpio library and daemon. Wiring up the DHT11 to the Pi’s GPIO pins was simple – three wires: one to the 3V3 supply, one to a ground pin, and the data pin of the device to GPIO pin 17. The author of the pigpio library also provides some example code for the sensor.

This was enough to demonstrate that I could get it to work, but the accuracy of the DHT11 was woeful. It’s advertised as having an accuracy of +/- 2 Celsius, but my experiments suggested that the one I had acquired for the princely sum of 57p had an accuracy of around +/- 5 Celsius. As I also wanted the sensor to work in an unheated garage, another limitation was that the DHT11 is unable to read values below freezing.

Encouraged, I decided to invest a further £6 in a plug-compatible DHT22 sensor. This has an operating range of -40 Celsius to 80 Celsius, with an accuracy of +/- 0.5 degrees. Initial tests suggested that it was far more accurate than the DHT11. While the sensor produces a slightly different data stream to the DHT11, the example code had been written in a way that meant it still worked.

As part of my experiments, I made a small modification to the example code (testDHT.c) to calculate the dew point from the temperature and relative humidity readings. I’ve also changed the way it outputs data, to write it to a file rather than the screen.

void cbf(DHTXXD_data_t r)
{
   FILE *fp;
   char buff[100];
   time_t now;
   float rdwpt, rtemp, rhum;
   extern float rdewpt_(float *, float *); /* FORTRAN dew point function */

   if (r.status == 0) {
       rtemp = r.temperature;
       rhum = r.humidity;
       rdwpt = rdewpt_(&rtemp, &rhum); /* parameters passed by address */
       now = time(0);
       strftime(buff, 100, "%Y%m%d-%H%M", localtime(&now));
       fp = fopen("readings.txt", "a");
       fprintf(fp, "%s ", buff);
       fprintf(fp, "%.1f ", rtemp);
       fprintf(fp, "%.1f ", rhum);
       fprintf(fp, "%.1f\n", rdwpt);
       fclose(fp);
   }
}

While it would have been simple enough to write a dew point calculation in C, I decided to write it in FORTRAN instead. Hey, the more people who get to love FORTRAN the better. Here’s the function I wrote, which uses the Magnus formula. It’s declared in the C fragment above as an external function that returns a float. Parameters are passed to FORTRAN by address, rather than by value.

      REAL FUNCTION RDEWPT(RTEMP, RHUM)
C     DEW POINT VIA THE MAGNUS FORMULA
C     INPUTS: AIR TEMPERATURE   - RTEMP
C             RELATIVE HUMIDITY - RHUM
C     AUTHOR: TJH 28-01-2017
      REAL RTEMP, RHUM, RH
      RH = (LOG10(RHUM)-2)/0.4343+(17.62*RTEMP)/(243.12+RTEMP)
      RDEWPT = 243.12*RH/(17.62-RH)
      RETURN
      END
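As a sanity check, the same Magnus calculation can be expressed in a few lines of Python (purely an illustration, not part of the project); it reproduces the dew point column in the data file shown further down:

```python
import math

def dew_point(temp_c, rel_humidity):
    """Dew point via the Magnus formula, using the same constants
    as the FORTRAN function (b = 17.62, c = 243.12)."""
    gamma = math.log(rel_humidity / 100.0) + 17.62 * temp_c / (243.12 + temp_c)
    return 243.12 * gamma / (17.62 - gamma)

# First logged reading: 6.9 C at 72.0% relative humidity
print(round(dew_point(6.9, 72.0), 1))  # 2.2, matching the log
```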

Compiling and linking the C and FORTRAN code using gfortran:

gfortran -Wall -pthread -o DHTXXD *.c *.f -lpigpiod_if2

produces an executable that can create a data file. This example was created using the command:

./DHTXXD -g17 -i600

which reads the data from GPIO pin 17 every 10 minutes.

20170128-2009 6.9 72.0 2.2
20170128-2019 6.9 71.7 2.1
20170128-2029 6.8 71.3 2.0
20170128-2039 6.8 71.4 2.0
20170128-2049 6.8 71.7 2.0
20170128-2059 6.8 71.4 2.0
20170128-2109 6.7 71.1 1.8

This provides data in a suitable format for some relatively simple gnuplot commands to create charts. The first example plots the relative humidity readings over time; the larger spikes in the relative humidity data correlate with the garage door being opened while it was raining.

set xdata time
set xlabel "Time"
set ylabel "Relative Humidity %"
set timefmt '%Y%m%d-%H%M'
set format x '%H%M'
plot 'readings.txt' using 1:3 title 'Relative Humidity' with linespoints

The second example graphs the air temperature reading directly from the DHT22 with the dew point temperature calculated from the Magnus formula.

set xdata time
set xlabel "Time"
set ylabel "Temperature in degrees Celsius"
set timefmt '%Y%m%d-%H%M'
set format x '%H%M'
plot 'readings.txt' using 1:2 title 'Air Temperature' with linespoints, \
     'readings.txt' using 1:4 title 'Dew Point' with linespoints

I’m going to be keeping an eye on the data, to understand whether it might be beneficial to seal the garage door more effectively than at present and invest in a dehumidifier.

Raspberry Pi motion sensitive camera

Other than messing around with a few FORTRAN benchmarks and learning how to code using Python, I haven’t really used my Raspberry Pi computers for very much that’s been practical. However, having bought a Raspberry Pi camera to play with over Christmas, I decided to have a go at building a motion sensitive camera for the garage. It’s cheap and easy to find passive infrared detectors these days, so I acquired three for the princely sum of £5.

The passive infrared detector
PIR detector

The first challenge was working out the function of the three pins in the foreground. A little bit of searching led me to the conclusion that the top pin is ground, the bottom pin the 5V supply, and the middle the status pin. If the middle pin goes high, it means that motion has been detected. The sensitivity of the device, and the length of time the status pin stays high for, can be adjusted using the two potentiometers.

I connected the power pins to a couple of the available 5V supply and ground pins on a Raspberry Pi 2. I used physical pin 26 (GPIO pin 7) to connect to the status pin.

The code

The next challenge was writing some code to detect changes in the status pin and take a photograph when motion is detected. Fortunately, there are plenty of code snippets available that made this task relatively straightforward. The current version of my code is below.

import RPi.GPIO as GPIO
import time
from picamera import PiCamera

# Initialise the camera settings
camera = PiCamera()

# Use GPIO pin 7 (physical pin 26) for the PIR detector
GPIO_PIR = 7
GPIO.setmode(GPIO.BCM)
GPIO.setup(GPIO_PIR, GPIO.IN)

# Variables used to determine when a picture should be taken.
# GPIO pin 7 => high (ts==1) from low (qs==0) 
# triggers the camera.
ts = 0
qs = 0

try:
  # Wait until PIR GPIO pin is low (0)  
  print "Waiting ..."
  while GPIO.input(GPIO_PIR)==1:
    time.sleep(0.1)
  print "... detector is ready"     
  # Loop until quit signal
  while 1:
    # Read PIR state
    ts = GPIO.input(GPIO_PIR)
    # DEBUG print ts
    if ts==1 and qs==0:
      # Create unique filename with timestamp and set qs high
      timestamp = time.strftime("%Y%m%d-%H%M%S")
      filename = ("img" + timestamp + ".jpg")
      camera.capture(filename)
      qs = 1
      print "Movement detected - ",filename," created"
    elif ts==0 and qs==1:
      # GPIO pin 7 has returned to low, therefore set qs low
      qs = 0
    # Wait for a second
    time.sleep(1)
except KeyboardInterrupt:
  # Cleanup GPIO
  GPIO.cleanup()
  print "PIR-PiCamera program terminated"
The results

My Raspberry Pi 2 is now set up in the garage with the motion detector and camera. At the moment it’s simply saving the images onto a drive available to my home network, but I’m probably going to experiment with sending email alerts as well.
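For the email alerts, a minimal sketch of what I have in mind is below. The addresses are placeholders and the actual send via smtplib is deliberately omitted, so treat it as an outline rather than working alert code:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_alert(filename, sender, recipient):
    """Build a motion-alert email for a captured image. Sending it
    would use smtplib.SMTP(...).send_message(msg) with real server
    details, which are left out of this sketch."""
    msg = MIMEMultipart()
    msg["Subject"] = "Motion detected: " + filename
    msg["From"] = sender
    msg["To"] = recipient
    msg.attach(MIMEText("Image " + filename + " captured by the garage Pi."))
    return msg

msg = build_alert("img20170101-1200.jpg", "pi@example.com", "me@example.com")
print(msg["Subject"])
```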

I’m pleased to report my motion sensitive camera has already caught an intruder …

C7 thief!

edX 6.00x week 6 – going up a gear

Whew. Suddenly, after five relatively straightforward weeks, 6.00x has kicked up into a higher gear. I’ve just got to the end of this week’s lectures, finger exercises and problem set and it’s been far more taxing than anything, including the midterm exam, that preceded it. The main theme of the week has been an introduction to object-oriented programming, with various concepts (exceptions, classes, instances, inheritance and so on) being used for the first time.

This part of the course, while not totally new to me, is the part that I’m least familiar with. All of the ‘production’ code I ever cut in my career was definitely not object-oriented – and what little OO code I have created has been for the purposes of demonstrating other software packages, rather than being something coded to form an integral part of such a package. There’s a difference – if your code doesn’t need to go into production you start to get a little sloppy about things – and the edX grader definitely won’t let you get away with that!

This week’s problem set involved writing a number of classes to complete a program which selects and displays content from RSS feeds if particular trigger words, or combinations of them, appear in its configuration file. There was definitely some subtlety required to complete the task successfully, and the very final part of it took me ages because I’d made a silly error. However, I appeared to be in good company, as at least two other people on the edX forum had made exactly the same error. (Hint: if you end up with the error message ‘str’ object has no attribute ‘evaluate’ for the final part of problem set 6, have a look at what you’re passing to the boolean triggers. It should be the actual object from the triggerMap dictionary, not its constructor.)
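To make the hint concrete, here’s a stripped-down illustration of the mistake (the class and variable names are my own, not the problem set’s):

```python
class WordTrigger:
    """A minimal stand-in for the problem set's trigger classes."""
    def __init__(self, word):
        self.word = word
    def evaluate(self, story):
        return self.word in story

# Names mapped to trigger objects, as built from the configuration file
triggerMap = {"t1": WordTrigger("rain")}

name = "t1"
# Wrong: passing the name itself means later code ends up calling
# name.evaluate(...), which fails because strings have no evaluate()
# method -- hence the error message above.
assert not hasattr(name, "evaluate")

# Right: pass the actual object stored in the dictionary
trigger = triggerMap[name]
print(trigger.evaluate("rain expected later"))  # True
```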

At the time of writing (Sunday afternoon), neither of the graders for the penultimate and final parts of problem set 6 is up and running, which is a little frustrating. Adding to the frustration this week has been the bug in the problem set that becomes apparent when doing filtering (the code expects methods like get_guid() instead of the getGuid() required by the grader earlier on) – but this is simple to fix, of course.

There was also a documentation issue in one of the earlier finger exercises. In it, the grader tests whether your isPrime() function can handle an input value of 0, whereas the problem definition says that the function only needs to consider numbers greater than or equal to 1. Fortunately, the diagnostic output from the grader is very useful in ironing out such wrinkles. Being able to debug other people’s code and documentation is an important skill for any programmer to pick up!

However, these minor problems, the grader outages, the availability of only the current week’s material and the pushing back of the release date for each week’s material from Mondays to Wednesdays all give the impression that the 6.00x course team are a little overstretched – much more so than was apparent on 6.002x earlier this year. I don’t think anyone on the course this time around minds being a guinea pig (after all, the content is excellent and it’s free to participate), but it’s clearly something that will need addressing if the plan to charge for completion certificates is to come to fruition.

After all, one of the benefits of online or distance education should be that the material is available for you to work on when you have the time to study it. Getting ahead of the timetable was something I always tried to do on my OU courses, as you never know when real life is going to get in the way. And for me, I think real life is just about to do exactly that. But for the moment, I’m just waiting impatiently for week 7 to start.

Update 12th November

The graders for the final two parts of problem set 6 are now up and running. But guess what. Despite the docstring for makeTrigger() being explicit that it returns a value of None, it won’t pass the grader unless you return triggerMap[name]. Sighs again.

MITx 6.002x: post mortem 2

As promised, I’m now going to spend a little time reflecting on the experience of 6.002x as a whole.

1. Course content

As I said in my first post-mortem post, I was a little disappointed that there wasn’t more digital and more practical electronics in the course. There were a couple of weeks where I honestly thought I’d joined a maths course rather than an electronics one. However, having survived the whole experience I believe that the maths really was necessary to gain a proper insight into the subject. The content delivered was coherent and usually interesting (but please, the overuse and misuse of “fun” and “cool” is really annoying to English ears of my age), so I think it was my perception of the course before I took it that was to blame for my (very) slight disappointment with its scope.

2. Teaching method

6.002x attempted to re-create the classroom experience by presenting lectures in sequences. The vast majority of these consisted of voiced-over whiteboards (with what was sometimes a hilariously inaccurate scrolling transcript), combining pre-drawn, legible information with handwriting that was almost illegible by the time each segment finished. If this approach is to be retained in future, legibility has to be a key area of focus. There’s a big difference between the meaning of vi, vI, Vi and VI in large/small signal analysis, and while it’s obvious when you’re listening to the soundtrack, it’s definitely not easy to unpick if you review the lecture slides afterwards. The transcript rarely got the differences correct either.

The lecture segments of each sequence were interspersed with tests and video demos. The demos were interesting, if amateurishly shot and produced. I gave up doing most of the test problems after the first couple of weeks, as I didn’t have enough time to devote to those as well as to understanding the material presented and completing the labs and homeworks. That observation is more of a feature of the way I like to learn than any problem per se with the idea of tests or quizzes as a teaching method. I always used to largely ignore those kinds of exercises in OU courses too. At least they were optional on 6.002x!

Sadly, I didn’t get much opportunity to watch the tutorial videos. The few that I did watch were excellent and had far clearer handwriting than the main lecture segments.

The recommended textbook for the course, at a little over 1,000 pages, was a monster! While incredibly comprehensive, it was rather dull and worthy in tone – very unlike the personality of at least one of its authors! It really made me appreciate how good OU textbooks are. Even the monster that was the DD303 Cognitive Psychology textbook was more digestible. However, the textbook was made useful by the excellent signposting of the chapters in the lecture sequences. This factor alone made it rather more approachable and less daunting than it otherwise would have been.

However, my main criticism of the teaching approach was its slavish attempt to replicate the classroom and university experience. Frankly, it doesn’t translate at all well and becomes intensely irritating very quickly. The course team would do well to examine the lessons learned by the OU over the last 40+ years about how to present degree level material to distance learning students. I honestly don’t think that voiced-over handwritten lecture slides presented in real-time, with all the mistakes being made by the lecturer being corrected on the fly works as a teaching technique in this environment. It certainly didn’t enhance my learning experience.

3. The MITx virtual learning environment (VLE)

Beautifully minimalist, well organised and easy to navigate and find material in, it makes me wonder what on earth the OU and other HE institutions see in bloated monsters like Moodle. This VLE framework will be a real asset to edX in the future.

The discussion (question) forums have a couple of excellent features – such as the ability to tag posts (I found it incredibly useful and it was usually well done by the contributors) and the ability to easily identify staff contributions. However, the karma system (whereby students earn points from peers and can eventually become forum moderators) is ridiculous. There is rarely any correlation between being a valuable contributor and making a good forum moderator. Indeed, during the second half of the course a number of karma-induced moderators appeared to positively relish their power, and in some cases I believe it was abused – bullying behaviour is never acceptable in real life or online.

In future presentations I hope that at least two forums are set up – one for on-topic questions and the other for general chit-chat. I also think that it would be better to appoint student moderators and have them adhere to a published code of conduct – a bit like the OU(SA) forums ‘spirit of conference’ charter, rather than using a karma system that can be easily subverted.

I can’t remember if I’ve ranted about wikis in the past. The one created on the course was OK as far as these things go, but wikis are useless as a learning tool. Psychological studies show that it’s far better to spend time actively making notes in a way that suits your learning style rather than trying to make them conform to some arbitrary ‘best practice’ format. You can always share them ‘as-is’ afterwards if you really want to – and there were a couple of students who did this. Their notes (alongside my own) were far more valuable to me than the sterile environment of the wiki.

The star of the VLE though was undoubtedly the circuit sandbox. Even with its continually irritating little quirk of deleting a component when I was trying to input a value for it! Very, very good indeed, and I hope that more use is made of it as a teaching tool in future 6.002x presentations.

4. Assessment

I liked the way that labs and homeworks were provided every week and contributed to the overall course grade. I even think that the ‘exam’ format used on 6.002x (more akin to an OU EMA rather than a traditional exam) is a reasonable way of proceeding with assessment in the future, rather than having to incur the costs of travelling to and taking exams in a testing centre – provided that something can be done to deal with plagiarism and cheating.

However, I really struggle to see how the totally automated ‘right/wrong’ assessment format used could be extended to other science or social science courses (such as psychology) that rely on the construction of well argued and evidenced essays – and I suspect it would never be able to transfer as an assessment format for humanities and arts courses.

5. Statistics

From the MITx 6.002x course information page:

6.002x had 154,763 registrants. Of these, 69,221 people looked at the first problem set, and 26,349 earned at least one point on it. 13,569 people looked at the midterm while it was still open, 10,547 people got at least one point on the midterm, and 9,318 people got a passing score on the midterm. 10,262 people looked at the final exam while it was still open, 8,240 people got at least one point on the final exam, and 5,800 people got a passing score on the final exam. Finally, after completing 14 weeks of study, 7,157 people have earned the first certificate awarded by MITx, proving that they successfully completed 6.002x.

6. Overall

I’ve had a great time taking this course (it’s released the inner geek in me) and I’ll certainly take future edX courses should they appeal to my interests. I didn’t spend as much time as I really should have done on the course – probably a maximum of 5 or 6 hours per week (with the exception of the midterm and final exams, both of which took me around 8 hours to complete). Leaving aside the nonsensical hype about Massive Open Online Courses (MOOCs) and the way that they will change education forever (they won’t, but they’re a valuable addition to the overall HE landscape) I wish the team at edX all the best for the future.

MITx 6.002x: post mortem 1

Now that the final exam has finished and the only thing that appears to be happening in the 6.002x forum is a lot of bickering about certificates, I thought I’d write a couple of posts to finish my journey off. This first post will simply look at how I did in the final exam. In the second post, I’ll reflect more generally on the experience and document what I thought was good / bad / indifferent about 6.002x as a whole.

The final exam had 10 questions, each with a number of different parts (1 mark per part), with 47 marks available in total. As I’d written in an earlier post, I needed to score 2/47 to ensure a pass and 34/47 to obtain an ‘A’ grade. In the end, I finished with 32/47 – a comfortable grade ‘B’ pass. Most of the questions I had difficulties with covered material from the first half of the course rather than the second half. I put that down to the second half of the course having become a little more practical in focus – in short, it contained the more interesting material!

Question by question:

1. Strain (5/5) – a nice simple resistive circuit problem to solve just to get into the swing of things.

2. Logic circuit (7/7) – another “gimme” as far as I was concerned. If I had a single criticism of the course content (and it’s not really the course’s fault, more my own expectations of it when I started) it’s that there wasn’t nearly enough digital in it. But at least there was a question on what little there was on the topic!

3. Switched capacitor (5/5) – straightforward stuff involving the calculation of a couple of different time constants. It took me two attempts to get all of the parts correct, as I hadn’t originally spotted that I’d need to re-calculate the time constant when the circuit was switched through the second capacitor … durr.

4. Bipolar Junction Transistor (0/9) – this is where I lost any chance of an ‘A’. It wasn’t really anything to do with a BJT – more a large signal analysis of a couple of voltage sources and a voltage controlled current source, followed by a small signal analysis. I got hopelessly lost and didn’t have the time to go back to first principles to sort it out. I still don’t think that it was a difficult or unfair question – I simply messed it up. Oh well.

5. Op amp with an RL filter (2/3) – straightforward, but I still managed to get the final part wrong as I’d missed out the RS resistor in the algebraic expression – I must learn to write more clearly.

6. Op amp FET (0/2) – I’ve no idea even now about how to solve this one! Hopefully the course team will publish a worked solution at a later date.

7. Trapping noise (3/4) – I have no idea what the part I missed out was asking for – otherwise, it wasn’t too bad a question.

8. Increasing Q (6/6) – no real difficulties with this one, apart from inexplicably multiplying one frequency by 2pi and not the other when working out the bandwidth on my first submission. I sorted that mistake out second time around.

9. Scope probe (4/4) – a repeat of one of the homework questions from a few weeks ago, just with slightly different values this time. Straightforward therefore.

10. Triode amplifier (0/2) – the course team weren’t joking when they said it was “intended to stretch you beyond the material that we explicitly taught in this class” and “do not work on it until you have finished with the other problems.” More of a “WTF” moment than an “Aha” moment, I think.

So 68% on the final exam; 86% overall. A very solid B and I’m pleased that I managed to stick with the course all of the way through.

MITx 6.002x final score: close, but not close enough

MITx 6.002x – final score

The chart shows I gained a total of 86% overall – meaning I miss an ‘A’ by 1%. Rats!

As always, the questions I answered on the final exam seemed fairly straightforward and the ones I didn’t attempt seemed impossible! Even if I’d not made a silly mistake on one of the parts of the questions I did answer, I’d have still been short of the 87% mark by around 0.2% – and there definitely wasn’t another part question anywhere on the paper that I could have answered.

As the exam doesn’t formally close for some late starters until 1200 GMT tomorrow (11th June) I’ll leave my post-mortem until a later post. I’ll also reflect on the course as a whole. One of my original motivations for taking 6.002x in the first place was that I wanted to understand the strengths and weaknesses of the MITx approach as applied to teaching and having survived the experience over the last 14 weeks I think I’m in a much better place to write about that now.

All the best to anyone still grappling with the exam; congratulations to those who have passed and commiserations to those who have just missed out.

MITx 6.002x – almost time for the final exam

I’ve just completed the last homework (week 12) for 6.002x, so I’m running rather behind schedule as I haven’t even looked at the lectures for week 13 yet. Normally, this wouldn’t be much of a problem, but as week 13 contains the final two sequences that could be tested in the final exam, I need to get a move on. Fortunately, in line with MIT practice, there’s no homework or lab for week 13. Equally fortunately, I’m very pleased that the Jubilee celebrations mean that I have Monday and Tuesday off next week too!

All being well, these lucky breaks mean that I should get to the final exam in reasonable shape.

It’s going to be made available on June 6th from 2200 GMT (2300 BST) and will close at 1200 GMT (1300 BST) on June 11th. As with the midterm exam everyone has 24 hours from opening the paper to complete it, with 3 attempts per question permitted.

I’m currently 1.2% off the ‘C’ passing grade – which means I need to score just 3% on the final exam to gain my certificate. To obtain a ‘B’ I’d have to score 28%, with an ‘A’ available if I manage to achieve 72%.

Best wishes to everyone who’s going to attempt the 6.002x final exam next week. I wonder how many of the 120,000+ who originally registered for the course have made it this far?

MITx 6.002x week 10: (sine) waving but not yet drowning

This week’s lectures have been about the response of networks to a sinusoidal drive and how to analyse them. First of all, this involved using an incredibly difficult method based on solving differential equations (so difficult that the attempt terminates part way through after much baffling mathematics), a “sneaky” approach based on complex algebra and finally a “super sneaky” approach based on the impedance model.

This final method turns all of the steady-state sinusoidal circuit analysis problems, which seemed pretty difficult using the first two methods, into problems which can be rather more simply solved by the application of Ohm’s Law, along, of course, with all of the usual circuit analysis techniques based on the node method, Thevenin, Kirchhoff et al.

I think if we hadn’t been warned that the conclusion of the week was going to be relatively straightforward I might have been tempted to cut my losses and plough straight on into week 11 – but I’m glad that I didn’t. In the end, the lab and homework problems seemed to be fairly tractable once I’d thought about them properly – and been guided by the odd hint or seven from the discussion forum of course!

I now have the magic 59% mark showing up on my profile page – which, even if I complete the next two weeks’ homework and labs, I won’t be able to improve on until the final exam. Despite many pleas from students to the course team on the discussion forum, they still appear to be keeping silent about the form the final exam will take, when it will appear, how long we’ll have to complete it and so on.

Not knowing when the final exam will appear is pretty frustrating, as one of the “joys” of being a part-time distance learner is that the rest of life tends to get in the way of study, in exactly the way it doesn’t when you’re full-time at a brick university.

One of the lessons therefore that the MITx/edX team ought to take from this first run of 6.002x is that certainty over the time windows for assessments at or very near the start of the course is essential. Without such certainty, it’s difficult to see how edX would ever get future students to pay for assessment, even if the delivery of course content remains free.