There is no usable blogging client that works across Linux distributions, leaving me to use the Vim editor and the GoogleCL command line interface.
I have been using the Blogger.com web interface to write posts for several years now, and have found it usable but limiting. It works well enough, offering three views: the raw HTML, the HTML as it might be seen on the web, and a hybrid composing view. I switched between the HTML and web views and found the arrangement worked reasonably well, as long as the internet connection held.
But since most of the free time I have for writing posts involves a notebook computer without an internet connection, I started looking for an offline blogging client. All it had to do was let me edit the text and upload it, along with a title, to the Blogger website. I am not big on images or embedded videos, so I didn’t think I was asking for much.
First up was Gnome Blog. Presented as a bare-bones client, it was just that. It functioned well for simple blog entries, with the ability to format text and include images. But it could not access Blogger with the Atom 2.0 interface, so it dropped back to Atom 1.0. That was a deal breaker, as Atom 1.0 does not seem to support post titles. So, out with Gnome Blog, and ‘aptitude install drivel’.
Drivel looks good and has more features than Gnome Blog, but it was buggier and less stable. It, too, could not supply post titles to Blogger, or retrieve posts for editing. Since it works well with other blog hosts, the problem appears to lie with Blogger, but I am not keen on transferring this blog to another host just yet. So, onwards.
Blogilo (previously called Bilbo, until the name attracted legal problems) was the most promising, having been rated highly in a Linux Format magazine review. It was the most feature-complete and polished of the three, with all the bells and whistles you could want.
But it turned out to be heavily integrated with the KDE desktop. Since I wasn’t running KDE, it pulled in a number of KDE dependencies. That would not normally be a problem: I had plenty of RAM, and the extra libraries could be loaded without slowing the system. But it required KDE Wallet, an encrypted keyring, and having no need for another keyring I disabled it. Then I discovered that KDE had started taking over my system! The fonts, window decorations and icons, the file browser and a bunch of defaults were all KDE’d.
I started to de-KDEify one problem at a time, but ended up removing the whole invading KDE horde. All the problems went at once, along with Blogilo.
The most reliable way of uploading pre-written posts to Blogger turned out to be the command-line GoogleCL. This is a set of tools for the Google Data APIs, which cover the Google-run Blogger.com, so there are no Atom 1.0 / 2.0 shenanigans. I just type in a terminal:
google blogger post --tags "taglist" --title "title" blogpost.html
GoogleCL is really aimed at developers who want to integrate Google services with their own software, but it is easy to use at the command line. And since none of the clients I tried reliably handled uploads, paragraph HTML tags or post titles, GoogleCL is a good stopgap until something better turns up. I am using Vim as an editor, so an added bonus, if you can call it that, is that I am forced to learn the simpler HTML tags.
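Since every post goes up the same way, the command is easy to wrap in a tiny shell function. This is only a sketch: the `blogpost` name and the `DRY_RUN` switch are my own invention, but the `google blogger post` invocation is exactly the one above.

```shell
#!/bin/sh
# blogpost: upload an HTML file to Blogger via GoogleCL.
# Usage: blogpost "Post title" "tag1,tag2" file.html
# Set DRY_RUN=1 to print the command instead of running it.
blogpost() {
    title=$1
    tags=$2
    file=$3
    cmd="google blogger post --tags \"$tags\" --title \"$title\" $file"
    if [ -n "$DRY_RUN" ]; then
        printf '%s\n' "$cmd"
    else
        eval "$cmd"
    fi
}
```

A dry run such as `DRY_RUN=1 blogpost "My title" "linux,blogging" post.html` just prints the command, which is handy for checking the quoting before anything touches the network.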
One Thing Well
The Linux philosophy is for each program to do one thing well, and for programs to be linked together for more complex tasks. You would think that a dedicated blogging client would be able to do that one thing well, but you’d be wrong. Vim will always be a good HTML editor, and GoogleCL will always be able to upload to Blogger. Who needs a blogging client anyway?
Now that the summer exam results circus is over, and the usual suspects have made their usual criticisms of the examination system, I would like to suggest to OFSTED (and to school managers) what they should be looking for when they assess teaching.
Currently, most schools run formal lesson observations in which a manager sits in on a lesson and fills in a pro forma. This sheet lists the key observations to make, along with a four-level grading system based on OFSTED’s procedures, so that the school can defend it when inspectors arrive. (The grades range from 1 = very good to 4 = poor, with 3 = satisfactory being the new poor.) Teachers are, of course, carefully trained in the system, so that formally observed lessons tick one box after another.
Particularly important to the watched are the lesson features deemed good practice. These can be set as hurdles, capping the grade of an otherwise good lesson if a box is not ticked. For example: ‘are the lesson aims written on the board?’, not ‘do the students understand what they are learning?’ or ‘were the students swept along?’. Another example: ‘was ICT used in the lesson?’, even if that use was no better than the non-computer alternative, because there is a government target on ICT use in the classroom.
So what would be better? There is plenty of research evidence as to which techniques work in classrooms. Rather than writing off a teacher on the strength of their annual observation because they did not use ICT in that lesson and the class exam average was below the ‘benchmark’, or because the teacher was idiosyncratic, observers should be checking whether the teacher was doing what objectively works.
Most educational interventions have some positive effect on students’ achievements, so what is needed is a ranking of the most effective ones, since we all have limited time and energy. Several literature reviews summarise the evidence for different interventions, for example here and here.
Effect sizes show quickly which interventions are worth the time and effort, and which can safely be given lower priority. A list from the first link gives these effect sizes:
Feedback / 1.13
Prior Ability / 1.04
Instructional Quality / 1.00
Direct instruction / 0.82
Remediation feedback / 0.65
Student disposition / 0.61
Class environment (culture) / 0.56
Challenge/goals / 0.52
Peer tutoring / 0.50
Mastery learning / 0.50
Team teaching / 0.06
Behavioural objectives / 0.12
Finances/money / 0.12
Individualisation / 0.14
Audio visual aids / 0.16
Ability grouping / 0.18
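For reference, the effect sizes in lists like this are usually standardised mean differences (Cohen’s d): the average gain of the students receiving the intervention over the control group, divided by the pooled standard deviation of the scores.

```latex
d = \frac{\bar{x}_{\mathrm{intervention}} - \bar{x}_{\mathrm{control}}}{s_{\mathrm{pooled}}}
```

So an effect size of 1.0 means the average student in the intervention group finished a full standard deviation ahead of the control average.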
Effect sizes do not tell you what is good, but they do indicate what actually improves student outcomes. An effect size of 1.0 is well worth achieving, being roughly equivalent to one year of advancement; 0.5 is well worth a try. Requiring your teachers to include interventions with the lower effect sizes may be counter-productive: some of your better teachers may start to quietly rebel.
Lesson observations have the power to make teachers do what the Principal or Head Teacher wants them to do. It is essential that those demands are informed by the best educational research, and not by political or bureaucratic considerations.