Thursday, 31 December 2015

The joy of printers ...

My venerable Lexmark E120N needs a new drum kit - something between $70 and $80 bought online from the usual suspects.

It's also not wireless, which is not a great problem at the moment as our router still has ethernet ports, but we're in the throes of buying a new house, and that might mean the printer living somewhere other than next to the router.

What brought the whole problem to a head was that we had to print out draft copies of the purchase contract - take one to the bank, two to the lawyer and so on - and the photoconductor unit in the Lexmark had decided it was time to stop playing.

One of the big box stores had a pre-Christmas special on the FujiXerox DocuPrint P115W for less than a Lexmark drum kit, and I could get it there and then. So I did, and we printed nice clean copies and got the papers filed on the last day before the holidays.

Now for a cheapie I didn't expect much. I didn't expect inbuilt Google Cloud Print support, but given it plays nicely with our Macs I thought it might work with Linux as well - after all, both use CUPS as their print management solution.

Well, no.

FujiXerox don't distribute a ppd file - something that they must have in order to provide Mac support via CUPS. The obvious solution would be the generic PCL driver, which certainly submits the job, and that's about it. (PostScript's worse - it's a great way to push a lot of blank pages through the machine to test paper handling.)

I'm guessing that there's something weird in the job initialisation sequence that is missing from the generic driver output - a language select statement or something like that needs baking into the generic ppd - and certainly there are some hints of this in the printer configuration if you dig around the internal system settings.

The question is which will be easier - hack the generic ppd, or try and extract the vendor one from the Mac printer driver distribution?

Hacking the generic one means it could be released back into the wild, but life would be easier if FujiXerox just released the ppd as an unsupported driver ...
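If the vendor route looks easier, the Mac driver package is a reasonable place to start - on OS X, CUPS keeps its vendor PPDs as gzipped files under /Library/Printers/PPDs/Contents/Resources/. A rough sketch, with the caveat that the PPD filename and printer address below are guesses - list the directory and substitute your own:

```shell
# On the Mac: find and unpack the vendor PPD (the filename is a guess)
ls /Library/Printers/PPDs/Contents/Resources/ | grep -i p115
gunzip -c "/Library/Printers/PPDs/Contents/Resources/FX DocuPrint P115 w.ppd.gz" > p115w.ppd

# On the Linux box: hand the extracted PPD to CUPS
# (queue name and printer IP are examples)
lpadmin -p DocuPrintP115w -E -v socket://192.168.1.50:9100 -P p115w.ppd
```

There's no guarantee the Mac PPD doesn't lean on a vendor filter binary that isn't present on Linux - if it does, the queue will accept jobs and then fail quietly, which is worth checking for in the CUPS error log.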

Friday, 11 December 2015

Dear Tim, Chromebooks are as much a real computer as an iPad ...

I'm a Mac user, a linux user and a Chromebook user.

Tim Cook, the boss of Apple, has been very disparaging of Chromebooks, describing them as 'test machines', as in machines for computer based testing rather than carrying out software tests.

Not true - you can do real work on a Chromebook, like email, spreadsheets and documents, as well as surfing the web. Something I proved quite conclusively yesterday when our building was closed due to a rather dramatic water main leak.

When I turned up to work the building was already closed off and, you guessed it, my work machines were on the other side of the safety cordon. I started off with a keyboard-equipped Android tablet at a picnic table under a tree while I emailed and called people, shifted meetings and so on.

But useful as the tablet was, it was showing some of its limitations. So I went home.

I started off with my Chromebook. Again, email, calendaring and so on were all available, because they all have web based interfaces, and Google Docs is pretty good at displaying Word, odt and pdf files these days.

Even when we had a server failure and someone had to remote in, I could deal with the service desk incident purely because it's a web based service.

In fact the only reason I stopped using my Chromebook was that by about three-thirty in the afternoon the battery was down to 8%.

So I swapped over to the old linux netbook I take travelling and finished my day.

So, why do I have a MacBook as well ?


The Chromebook needs a good internet connection. I've learned from experience that while it will work on a 4G connection or even a good 3G connection, if the internet's in the least bit crappy, you might as well go home.

And travelling's about slow, crappy, unreliable internet. It's why sometimes I take my own 3G network box with me as it's often cheaper (and more reliable) than using hotel wifi.

So, with my old linux netbook, I can work offline, and upload material and documents at the end of a session, especially at conferences, which nowadays seem to involve a lot of frenzied typing and overloaded wifi connections.

I could live in a totally linux based world except for the need to occasionally use some of the software the rest of the world uses, and for that reason I have a MacBook ...

Friday, 27 November 2015

Huayra 3.1

Since it's a bit more than eighteen months since I last downloaded and played with Huayra, I thought I'd have a go at building the latest version, Huayra 3.1.

It's a 3.4GB download for the iso image. Building it on VirtualBox was reasonably simple - it's Debian based, and while some knowledge of Spanish helps, if you've installed Debian based distros before it's really just a case of following the bouncing ball.

The only snafu is that you end up with a 9.5GB VirtualBox disk image - which is larger than the default virtual hard disk size of 8GB. The installation script does quite a bit of clean up towards the end of the installation, so I'd be generous when creating the virtual hard disk - 20GB seems about right.
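If you're creating the disk from the command line rather than the GUI, something like this would do - the filename is a placeholder, and note that VBoxManage wants the size in megabytes:

```shell
# 20GB expressed in MB, as VBoxManage expects - generous enough
# that the ~9.5GB installed image fits with room to spare
SIZE_MB=$((20 * 1024))
echo "creating ${SIZE_MB}MB disk"

# Guarded so the snippet is harmless on a machine without VirtualBox
command -v VBoxManage >/dev/null 2>&1 &&
  VBoxManage createmedium disk --filename huayra31.vdi --size "$SIZE_MB" --format VDI ||
  echo "VBoxManage not installed - skipping"
```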

It's a fairly standard environment, and is obviously targeted towards education and programming with such nice extras as iPython and the Arduino development tools installed by default, but it has all the standard tools as well.

All in all faster and slicker than the old version, and interesting to take a gander at if you're interested in the use of linux in education ...

Thursday, 26 November 2015

Teaching Robotics

Yesterday, I tweeted a link to an Argentine initiative, part of the Huayra initiative, encouraging students to build a simple robot out of (mostly) recycled parts and an Arduino board.

On the same theme, a few days previously I tweeted a link to a news story about how some Mexican high school students had won a robotics competition in Romania.

There's a story here. Recently the news has been full of stories about how robots are going to take our jobs. Possibly true, possibly not. Past experience of the IT revolution suggests that the changes will be different to those predicted, but equally disruptive.

But let's assume for a moment that robotic devices become a lot more common. That means that, initially at least, we'll need a lot more technicians to set them up and configure them, do a little bit of tweaking, carry out field upgrades and the like.

Not advanced cybernetics, but good solid run of the mill technician work. The robotics equivalent of the early 20th century workshop 'fitter'.

When I look at our education programs here in Australia, I don't see a whole lot of evidence that we're headed in the same direction - rather more we're still stuck on teaching kids to use Microsoft apps and webmail.

Useful, but not necessarily the best thing to equip them for the future.

Maybe I'd better start working on my Spanish ...

Tuesday, 3 November 2015

Riding the wily werewolf

Having upgraded my Mac to El Capitan, a few days ago I upgraded my Linux laptop to Ubuntu 15.10, aka Wily Werewolf.

I'd been running the 14.04 LTS (long term support) version very successfully - it was stable, booted considerably faster than my Mac, and apart from very occasionally getting in a stupid state where it didn't come out of hibernation properly, problem free.

But some of the applications packaged with it were beginning to be out of step with the newer versions available on my other machines running Debian and the like, so I decided to upgrade.

Upgrading involved going via 15.04 - I actually thought about stopping at 15.04 and not upgrading to 15.10 as it was so new the paint wasn't fully dry, but 15.04 felt unstable on my laptop.

Nothing I could put my finger on, it just didn't feel as rock solid as 14.04.

Going from 15.04 to 15.10 gave me a few more patches and the new versions of the applications I wanted; more importantly, it felt more stable, and it's proved to be fairly solid in use.

The upgrade process was fairly mechanical, just a case of following the prompts, and while it wasn't quite as slick as a Mac or a Windows upgrade, it was all there and it all made sense. Strangely, this is the first Ubuntu upgrade I've done in a long time - mostly I just rebuild the machine from scratch - but this machine had a slew of files and extra applications that would be a pain to reinstall, so an upgrade it was. And everything still worked afterwards.

What it showed is that Ubuntu is mainstream quality, does what you want, and works well. No complaints so far ...

Thursday, 29 October 2015

Google calendar urls and orage synchronisation

On a hot sticky afternoon in 2007 I wrote a little script to import a google calendar file into orage.

For some reason it became quite popular, mostly due to the lack of a suitable alternative import mechanism.

Google have recently announced that the URL used by Google Calendar will change.

If you've been using my script with an old style url, this will of course break, but the fix is relatively straightforward - simply update the calendar url used by the wget command ...
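For what it's worth, the change amounts to swapping the old www.google.com/calendar path for the new calendar.google.com one. The calendar id below is a made-up placeholder - substitute the one shown on your own calendar's settings page:

```shell
# Hypothetical calendar id - replace with your own
CAL_ID="yourname%40gmail.com"

# Old style: https://www.google.com/calendar/ical/${CAL_ID}/public/basic.ics
NEW_URL="https://calendar.google.com/calendar/ical/${CAL_ID}/public/basic.ics"
echo "$NEW_URL"

# then in the import script:
# wget -q -O gcal.ics "$NEW_URL"
```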

Wednesday, 21 October 2015

El Capitan ...

While I was having my twitter discussion on 'What is a repository?' I wasn't idle.

I upgraded my five year old MacBook to the latest version of OS X, El Capitan. I'd previously upgraded it to Yosemite, and while it was fine in use, it did tend to slow down over the working week and was at times tediously slow to boot after a shutdown.

It's a little too early to tell how good it'll be after a period of sustained use but I think the upgraded system is a little faster.

The upgrade process was fairly slick, as you'd expect with Apple. I did have a moment of panic half way through when the system rebooted, and instead of simply flashing the power light on and off, made it flicker along with a horrid buzzing noise that reminded me of a system with corrupt firmware, but that was just me being paranoid.

Initial login and configuration seemed to take forever, giving me plenty of time to appreciate the aesthetics of the redesigned spinning beach ball of death but we got there in the end.

As usual with a Mac, everything just seems to work and there's no playing with settings (well apart from reconnecting to iCloud).

We'll see how things are in a week or so ...

Monday, 19 October 2015

No one wants an ereader

A few days ago I blogged about peak ereader and how one big English bookshop chain had stopped stocking Kindles.

So, is it really the end for the ereader?

I think so. A search on ebay reveals a lot of second hand Kobos, Sonys and Kindles, but no new devices or cheap third party Asian devices.

The cheapest I could find was a refurbished Kobo for fifty bucks - as a comparison, on the same day I could get a new end-of-range 7" android tablet for an extra ten bucks - from Telstra of all people.

The market has voted - no one wants them ...

Wednesday, 14 October 2015

ipython notebooks and text cleaning

Twice in the last few days I've had an interactive text cleaning session - once with Ernesta's diary and once with a 125 page listing of journal editor strings that lacked some quality control - leading spaces, double spaces between some entries and so on.

All easy enough to fix with sed to get a file where the names were nicely comma separated and hence easy to split into the individual names.

None of this is rocket science - mostly it's just

s/ and /,/
s/ & /,/
s/ ,/,/

and of course each time you do it the steps you follow are slightly different.

Most times I don't use sed - mostly I use gedit, which includes search and replace functionality. It could also be done interactively from the command line using perl or python, as I did cleaning up Ernesta when I felt lazy and raided Stack Overflow rather than doing it myself.
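The same three substitutions could just as easily live in a small Python function - which, unlike a gedit session, leaves a re-runnable record of exactly what was done. A minimal sketch, using a made-up editor string as input:

```python
import re

def clean(line):
    """Mirror the sed substitutions above: unify 'and'/'&' separators
    to commas, then tidy the stray space before each comma."""
    line = re.sub(r" and ", ",", line)
    line = re.sub(r" & ", ",", line)
    line = re.sub(r" ,", ",", line)
    return line

print(clean("J. Smith and K. Jones & L. Brown"))
# -> J. Smith,K. Jones,L. Brown
```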

The crucial thing is of course that I don't actually have a record of what I did. I have notes of what I think I did, but these are reconstructed from a screenscrape of a terminal session and emails to colleagues. If you use a tool like gedit, you don't get a record of what you've done at all.

The same goes for work done in R, such as my experiments to make a Middle Scots stopword list - while I'm sure I've archived my script somewhere, I don't have a record of what I did.

While it might be overkill, the answer is to use something like ipython notebooks as an interactive work environment - and of course they're not just for python anymore - they're increasingly language agnostic.

So my little self improvement project is to get to grips with ipython notebooks, which if nothing else should improve my python skills ...

Thursday, 8 October 2015

Peak e-reader ?

Waterstone's, the big UK bookshop chain, has stopped reselling the Kindle. At the same time, their competitor, who resells the Nook, reports that sales are flat, with few people buying an e-reader for the first time, and those sales that they do have being people replacing failed devices.

This shouldn't surprise us. E-readers are conceptually simple devices that do one or two things very well. There's no pressure to upgrade or replace unless the device breaks or is left out in a summer storm.

For example, while I use a Kindle for recreational reading, I still use my 2009 vintage Cool-er for reading public domain epubs and cleaned up texts such as Ernesta Drinker's diary. Despite being totally unsupported my Cool-er still works fine - the only problem being that the paint has scuffed off some of the arrow keys.

So that's one problem. The devices are reliable. The other problem is the multiple device problem. A lot of reading takes place on public transport, and if you've already got a tablet with you why carry a second device when you can just as easily read your book on your tablet?

So it's probably legitimate to say that the e-reader market is saturated, at least in the developed, English speaking world. Given the cheapness of tablets these days, less developed countries may never do the e-reader thing, especially as the tablet is considerably more flexible as a resource - after all, if you had a choice between a $100 tablet and a $100 ereader, which would you choose?

None of this says anything about e-book adoption rates.

E-books remain a versatile distribution medium. There will always be people who prefer paper books, and those books that simply aren't available in an electronic format. And there's definitely a role for e-books as reference material.

But e-books are here to stay.

Wednesday, 7 October 2015

Ernesta Drinker and surfaces, macbooks and the rest

Yesterday I posted about my quick and dirty clean up of Ernesta Drinker's journal.

A few hours later, on the other side of the planet, Microsoft announced a slew of devices, including the Surface Book, which is being touted by some journalists as a MacBook Pro killer.

Well I have a MacBook Pro (well, work bought it for me, and it's actually 5 years old and overdue for replacement), but all my work fiddling with Ernesta Drinker's book was carried out on an even more elderly Dell Latitude running Linux.

It's Linux that made it possible, because of its rich toolset, though I could have done it on my MacBook via a terminal window because of OS X's BSD heritage.

Windows? - well given I used perl for a lot of it I could have done it by running the scripts from the command line but it would have been a bit of a hassle.

And that's something that tends to be forgotten. There are those of us who use machines for work, and quite often what we have on our desks is driven by our software requirements for work, and how effective that makes us.

If I was being cynical, I'd say the only reason I have Microsoft Office is because I once had to write a set of grant proposals using a template that didn't work in Libre Office.

Necessity is the mother of software choice, not how fast or sexy your hardware is ...

Tuesday, 6 October 2015

Fixing Ernesta

Fixing Ernesta Drinker's book turned out to be easier than expected.

First of all I used gedit to remove the front matter from the text file, and then used cat -s to suppress the double blank lines introduced by the digitisation process, to get a halfway clean file.

I then used sed to replace the header and footer strings

sed 's/header_string//g'

with null strings, which gave me a reasonably clean text. The only problem was that the file had hard coded end of line markers, and paragraphs were mostly separated by double end of line markers. Here perl was my friend

perl -pi -0 -w -e 's/\n\n/ qq9 /g' infile.txt

to replace the paragraph breaks with qq9 - a string that did not occur in the document. Then I used

perl -p -e 's/\n//g' infile.txt > outfile.txt

to take out the end of line markers, and then

perl -p -e 's/qq9/\n /' infile.txt > outfile.txt

to put back the paragraph breaks. (And yes, I used Stack Overflow.) I could have wrapped all of this up in a script, but working out the best order of operations was a bit iterative, and consequently I ran the individual operations in a terminal window.

At this point I opened the text with Libre Office to check the format and remove a couple of headers garbled in the OCR process. If I was being pedantic I could then have spell checked the document but what I had was good enough to read and take notes from, so I simply used CloudConvert to make an epub file from the saved file.

Not perfect, but good enough.

Reading old digitised books

Over the long weekend I caught up on some of my reading backlog, including a biography of Louise Bryant.

Louise Bryant was at one time married to John Reed (he of 'Ten Days that Shook the World' fame) and after his death married William Bullitt, who was later US ambassador to the Soviet Union.

Louise Bryant's story is of someone who desperately wanted to be someone, rather than a serious revolutionary. While she had her fifteen minutes of fame as a journalist, she was ultimately a tragic figure dying in obscurity. To quote Emma Goldman's cynical remark 'she was never a communist, she only slept with a communist'.

William Bullitt had a diplomatic career before he met Louise Bryant.
His first wife, Ernesta Drinker, accompanied him on a diplomatic mission to the central powers (Germany, Austria-Hungary) before the USA joined with Britain and France on the Western Front in 1917. Ernesta kept a diary of the trip and published it as a book afterwards.

Now one of my interests is the lead up to the Russian Revolution. There's plenty of material in English about the first world war, but that naturally concentrates on Gallipoli and the Western Front. There's actually very little available about how things were in Germany and Austria-Hungary, so I thought I'd try and track down a digitised copy.

Well, there's not a copy on Gutenberg, but it's been digitised as part of the Google Books initiative, and it's reasonably easy to obtain a copy of the scanned text via the Internet Archive as either a pdf or an epub. The text has scanning errors in it but it's not too bad, even if the structuring of the pages is a bit annoying with 'digitised by Google' added in at the bottom of each individual page image.

The text is however good enough for input to any text analysis program. Good enough for what people rather grandly call 'distant reading'.

It is however a pain to read. I could of course take the text and write a little python script to clean it up a bit and generate my own epub, and perhaps I should, but that does defer the instant gratification aspect of tracking down a book, so I went looking for a clean copy.

The various Indian print on demand operations offer to print a corrected version for around $15, and a couple of websites offer access to a corrected version for a modest fee which allows you to download the text. One of them offers a try-before-you-buy option to see a sample of the pages, and certainly they look reasonable. A quick search of AbeBooks turned up nothing other than the print on demand versions at a reasonable price - none of the original editions being offloaded for a dollar or two.

So it's back to the digitised text.

One of the problems with the text digitisation effort is that a lot of the scanning initiatives have been focused on producing the text either for input to some machine learning programs or in producing a page by page set of images. And if one is using the pdf version, having an added footer is not really a problem, providing that one views the page image screen by screen at the original page size.

But one never does that. The easiest way is to use a reflowable format such as epub which allows one to adapt the text display to the capabilities of the device being used, or on a pdf viewer coercing the page to A4. And this leads to the footers and original page breaks being scattered through the document.

And this is because the way the text has been digitised has been to scan the pages, add the footers, and OCR the page images to extract the text. Which is fine if one wants a digital representation of the original book, but rather less so if one wants to read the damn thing ...

Wednesday, 30 September 2015

Madoko ...

Earlier today I tweeted a link to Madoko, a Microsoft project to provide an online markdown editor. I was sufficiently intrigued to try it out:

Syntax is definitely more Markdown like than pure Markdown but you don't need to read the documentation to use it providing you know standard Markdown - something that's a bit of a contradiction in itself.

The program uses the split screen model adopted by StackEdit, where you type in one screen and see your text appear in the other, rather than the ReText style approach where you flip between an edit mode and a preview mode.

As a Microsoft product, the menus and document structuring tools are Word-like in style.

Responsiveness is similar to that of Zoho, Google Docs, or ReText running on RollApp - I'd categorise it as reasonable, but not amazing.
Like ReText on RollApp, while a web application it can link to other cloud stores such as Dropbox and OneDrive.

Unlike most other markdown editors it produces either PDF output or HTML to publish web pages, rather than offering any of the more conventional word processor output formats. And being positioned as a scientific writing tool, it will import LaTeX documents, as well as handling formulae nicely. PDF export goes via TeX, and you also get the output TeX document to modify as well.

Using Apache Tika to analyse the pdf output one gets

producer: xdvipdfmx (0.7.9)
resourceName: document.pdf
xmp:CreatorTool: LaTeX with hyperref package

which is fairly standard for the LaTeX world.

Would I use it?

Possibly, although I'm comfortable enough using either Kate or Gedit to create markdown directly. It's certainly an alternative to StackEdit if you're working on a Chromebook, or working in an environment without a decent text editor.

Monday, 21 September 2015

reading a paper book

Very strange.

For the last six months or so I've only read a book on a tablet or on a dedicated ereader. Didn't set out that way, it's just happenstance.

Recently, I had reason to read a thick legacy format paperback - ie one made of paper with pages that you turn.

And it's strangely odd to hold a book open, or even deal with the weight and bulk of the book ...

Tuesday, 15 September 2015

Printing from the cloud with RollApp

RollApp - the service that lets you run applications from a web browser - has just announced a facility to allow you to print locally from your remote application.

Reading between the lines, it works a little like Office 365 or Google Apps printing, by generating a pdf which is transferred to your local machine for printing.

But there's a couple of questions about this:

1) If you're using a chromebook or an android tablet with a keyboard as your desktop device you're going to have access to CloudPrint, so why the two step process - what would be cool would be being able to queue the file directly

2) If you're working on the train, the bus, or in a coffee shop the chances are you don't have a local printer - again queueing it to one of your cloud devices, or using something like Epson or HP's print via email service to a remote printer is probably what you want to use

So, good idea and a good first step, but not the complete answer ... 

Tuesday, 8 September 2015

Online galleries and the democratization of content

I’ve had an idea knocking around the back of my head for a few months or so now, ever since I was in Budapest and discovered the Hungarian National Gallery have their collection online.

Now it’s not a stellar collection, but it’s definitely competent and well curated.

At the same time I’ve been playing both with Pinterest and Omeka - Pinterest as a sort of visual research diary to collect and hold images, and Omeka as tool for assembling collections of material and putting them into context to tell a story.

Of course some items have an intrinsic structure - a scanned diary has a beginning, a middle and an end, just as a set of tax records from the 1700s has a beginning, a middle, and an end.

Others are just a collection of items that can be assembled in various different ways to tell different stories with the same content - it's what you do with them that's important. One fun example is the University of Reading's Collections tumblr page - a happenstance, athematic collection of oddities.

And then there are sites like Artsy that try to build sites around particular artists for all sorts of reasons - for example their Egon Schiele page provides visitors with Schiele's bio, over 25 of his works, exclusive articles, as well as up-to-date Schiele exhibition listings, and as such provide a service to people interested in the work of an artist or group of artists.

And interestingly, under all of this is what they call their art genome project, an attempt to evolve a classification model for art.

However, for the purposes of this post, what’s interesting about Artsy is how they have taken and reused content to make a different resource.

For quite a few years now there’s been discussion about digital repatriation - basically gathering together digitised content and representing (or more accurately making them available for re-presentation) as a whole - manuscripts that have been split up can be re-united, cultural material looted during nineteenth century colonial wars can be made available again to the original owners and collections of an artist’s work can be drawn together to show how his or her work and style evolved.

And of course we’re talking about the reuse of digital content, and the need to understand that once something is made available for reuse it can be used in lots of ways, and that you’ve basically lost control of the content.

And of course there’s fear element - make a high resolution image available and there’s nothing to stop someone else copying it and using it make a fridge magnet, and if it’s a popular and attractive item, a bit of money.

Inevitably that will happen, just as surely as people will make things of intellectual value - it's simply that when you democratize access to content, things change ...

Friday, 28 August 2015

Numbers, reputation, and worth

In the past week I've been shedding twitter followers, while at the same time my Klout score (Klout is a website that claims to measure the 'worth' or 'impact' of your tweets) has increased by a few points.

It's long been my view that metrics, rankings and the rest don't mean much individually in absolute terms, but that in aggregate, higher scores indicate a degree of worth.

And this is sort of demonstrated by this week's little event. Normally if the number of followers had gone down you would expect that my Klout score would go down as what was being said was seen to be less valuable.

On the other hand if what was being said was felt to be more valuable my Klout score would go up and probably the number of followers would increase - and certainly this has seemed to be the case in past months.

But of course twitter is populated by a host of inactive accounts, perhaps related to dead and stalled projects, and of course it's the end of August, the time when in the northern hemisphere, academic projects are typically wrapped up and closed down.

So I'm guessing what I've been shedding is a slew of low worth accounts.

And this is a learning experience - what's being said is more valuable than the numbers listening, ie measuring influence/impact is more complicated than the things we can easily count ...

Monday, 24 August 2015

The disruptive chimera of the digital humanities ...

Over the past year I've become more and more convinced that Digital Humanities is a chimera, much as Eresearch is also a chimera.

Many disciplines in the physical sciences have always dealt with large data sets and their manipulation. Many researchers in the social and health sciences have always carried out complex analyses of government statistical data to reveal both new trends and the impact of legislative changes.

Until recently the poster child for this was psychology - or more accurately the cluster of closely related behavioural sciences, from ethology through to neurobiology, that are usually lumped together as 'psychology'.

Psychologists have used computers since they became widely available to control experiments, present stimuli and illusions, and analyse data. Clever, innovative work that has become increasingly more innovative as technology has become cheaper and more and more off the shelf components have become available.

But none of it was more than a logical extension of previous research. And that is what the digital humanities are - a logical extension of preceding research. Yes, the easy access to large quantities of data and the availability of easy-to-use mapping systems and natural language toolkits has allowed a step change in the nature of the research, but not a fundamental change.

For example, in a moment of rash enthusiasm, I thought you could do something with the tax return data in the Domesday book to graph the harrying of the north - after all, the Domesday book is semi structured data and online as a queryable resource - until someone pointed out that someone had looked at exactly that question some thirty years before, with rather more traditional techniques.

In other words there was nothing special about applying digital techniques, they merely amplified what was already possible, and by extension there is nothing special about the digital humanities.

And because there is nothing special they need no special consideration in the provision of computing resources, they merely require consideration.

Where their disruptive effect comes from, and the thing that makes them look like something new and different, is the scale of the step change - the large scale digitisation of resources through projects such as Google Books, and the comparative cheapness of cloud based computing, has meant that a guy with a laptop and a good idea can make a significant difference for a low cost, and unlike in the sciences, the data collection cost is negligible.

However, even this difference will disappear as the various initiatives for the digitisation of legacy data in the sciences, and the open science movement with its emphasis on data publication, bear fruit. Just as in the humanities, someone with a laptop and a good idea should be capable of disruptive change.

And in both these cases there is nothing special in the resources required. The real disruptor is that the person with the laptop and the good idea need no longer be at one of the big research-centric institutions - meaning that research can spread outwards to smaller, and perhaps more nimble, institutions ...

Tuesday, 4 August 2015

No more playing with linux on old iMacs ...

Saturday was a sad day.

We bundled up all my old PPC iMacs and took them to recycling - I'd finally come to the realisation that I was never going to seriously play with PPC linux again. That's not to say I'm stopping playing with linux, because the one thing that the whole PPC linux on iMacs thing taught me was that older hardware can have its working life usefully extended by being sidegraded to a lower demand linux environment.

Great for cash strapped schools and libraries, but it's not a panacea.

It does require that the linux distribution you choose continues to be supported and maintained, simply because you need modern browsers, as well as all those security patches. It also means that the hardware you're working on has to have some upgrade potential - extra memory, more internal storage - simply because even the best linux implementation or application is not immune from bloat.

So, what now?

I'm personally convinced that linux on the desktop remains a viable alternative to proprietary operating systems. Things like printing and network configuration have got a lot easier over the years, the general robustness of LibreOffice makes it a viable alternative office suite, and you can run your life on Evolution just as you can on Outlook.

Using a linux based notebook I've found nothing I can't do on a machine with a commercial operating system, with the exception of working with documents created with odd templates and change tracking - something that's not quite as portable as it should be. In fact plain text editing is a lot more straightforward on linux. I will say though that you do need a decent browser, purely because increasingly one ends up using browser based applications (Evernote for example) rather than stand alone applications - however, Firefox rises admirably to the occasion.

So I'll keep on playing with linux - except it'll be i386 hardware from now on ...

Monday, 27 July 2015

Big phones and smartwatches ...

I got myself a new phone this weekend to replace my 3 year old Galaxy S2 – an S5.

My old S2 was basically running out of puff – it would occasionally crash or flatten its battery and even with an extra SD card to boost the space for data and applications it was always a little tight for updates, making it time to upgrade, especially as my mobile provider had a special deal on the S5.

Obviously it had been a popular offer – for when my phone arrived, instead of being Virgin branded it was Optus branded with the Optus extras. Virgin resells Optus bandwidth in Australia, and I'm guessing that Virgin had run out and Optus still had stock, and someone failed to reflash my phone.

Anyway, I have a new phone. Battery life is definitely better but it's also inconveniently large – too big to fit in a jacket or jeans pocket, meaning it'll have to live in my bag with my notebook and other gubbins. For the first time in a long time I've ordered a case for a phone, living in a bag it has more chance of getting scratched and banged. If a tablet or laptop needs protection in that sort of environment I'm going to guess a phone does as well.

The other thing is the realisation that the bluetooth based smartwatch concept makes some kind of sense. If your phone is stored somewhere difficult to get to – like your work bag, being able to do the Dick Tracy thing and answer your phone from your wrist brings back the convenience of a mobile phone, rather than desperately hunting for it when it rings. The same goes for email and text alerts, and of course you may simply not hear it because it's buried among a pile of other stuff.

However I'm not yet convinced that's $200 worth of convenience ...

Friday, 24 July 2015

The paperless office ...

I grew up in a world of paper.

There were no word processors, only typewriters. Memos were written by hand. Mail meant writing a letter putting it in an envelope, sticking a stamp on it and dropping it in a mailbox. Social media meant talking about a newspaper report over a drink with friends.

Later on there were things like troff and eventually LaTeX, but there wasn't anything like a proper word processor until the advent of WordStar. (For my sins I actually used to teach WordStar and can still remember the macro commands).

Even though eventually we all got access to word processors and email, storage was expensive - always the luddite, I used to just bump up students' filestore if they asked - so stuff tended to be printed out and filed just as it would have been in the nineteenth century.

Same with meeting paperwork, expenditure reports, and all the gubbins of system management and solution delivery.

And because I'm a creature of habit I ended up with a 3 drawer filing cabinet in my office full of paper that no one ever looked at.

Well, we're moving to a new open plan office next week. All the documents in that filing cabinet exist on my computer, on the various project sites and wikis, or archived in Evernote or OneNote.

So I took the three drawers of paper, dumped the non-confidential stuff in the paper recycling bin and shredded the rest.

I reckon I can find most things if required. Yes my online indexing might not be the most systematic, but it's no worse than searching through drawers full of stuff.

It's just possible I've finally achieved the paperless office ...

Thursday, 23 July 2015

Eduroam and public wi-fi networks ...

I've previously sung the praises of Eduroam, and it remains my default networking solution when visiting other institutions, but yesterday I had an experience which made me question whether Eduroam is the only solution.

I was at a meeting at the University of Canberra and I'd taken my Xubuntu netbook as a writing device. When I got there, I discovered that I'd forgotten to configure Eduroam on it. Major fail on my part.

So I pulled out my old 7 inch note taking tablet, only to discover that while it was fully charged and connected to eduroam, its certificate was out of date, meaning it wouldn't authenticate (it could of course just be that I'd stuffed up the eduroam configuration – but the middle of a conference on e-research is not the place for network debugging).

And this of course highlights one of the problems with eduroam – the configuration is tricky, and non standard – it's not like most public wifi systems where you get a private ip address, and then sign into the network, provide some identity data and tick the box agreeing to abide by the conditions of use and not do anything involving naughty Nora and her oscillating hamsters.

Setting up to use eduroam involves installing a certificate on your machine and configuring some settings. Not difficult, but fiddly and outside most non-geeks' experience.

The other problem with eduroam is that it assumes that you have a university internet account and can authenticate appropriately. Not all visitors to campus do, such as visitors from government research organisations, commercial bodies, and overseas academic institutions, particularly those in SE Asia.

Like all universities in Australia, UC have an eduroam service. But they also have a new experimental service called UC-visitor, where, you guessed it, you sign in just as you would to a public network in an airport, on a train, or in a shopping centre. I'm assuming that they do some rate limiting to prevent abuse and track usage to avoid people using it as a substitute for their 3G connections.

In use, the service was perfectly adequate for email, tweeting and syncing a file to dropbox, which basically is all you want to do, ie write stuff, show people stuff, tell people about stuff.

Eduroam is a service that has its roots in the days when internet access and particularly high speed internet access was expensive and therefore rationed. We're not living in that world any more.

In Croatia and Slovenia, even Sri Lanka, internet is everywhere - in Croatia, coffee shops and petrol stations offer it for free, without any need for authentication, and in one case a small coastal town (Drvenik to be precise) provided free connectivity on its beach strip. Interestingly, the University of York has recently brought the York city public wi-fi network onto campus, while also extending eduroam coverage to the city network.

In a world where free public internet is increasingly becoming the norm, does Eduroam require a reboot?

Tuesday, 14 July 2015

Cloudprint for linux ...

Last year, I upgraded my old EEEpc701SD to Crunchbang linux to make a distraction free writing machine – something that's worked out pretty well, especially since I started using Focuswriter as a basic writing application.

In the meantime my venerable HP PSC1200 inkjet upped and died – the scanner still works and it still sort of prints but quality is variable and sometimes print is intermittent – I suspect that the contacts between the cartridges and print mechanism are dirty or damaged. However, without going into the ins and outs our local big box office equipment supplier had an end of financial year special on Epson wifi workgroup inkjets so I bought one to replace the PSC1200.

Apart from being discounted, one of the inducements for going Epson was that it supported Google cloud print natively, allowing easy printing from Chromebooks and tablets, which is something that's becoming increasingly important to us.

At the same time as setting up cloudprint, I of course added the drivers to our windows machines as well as our increasingly venerable imac to allow them to print to it as if it were a normal local network printer.

Adding the drivers to my Linux netbook to achieve the same turned out to be a bit more complex than I thought it would - Epson don't distribute a PPD file as such; you need to install their print management utility, something that meant spending some time with dpkg -i and apt-get -f install.

After that little detour, installing the Epson drivers on the Eee seemed to be asking a little too much against the minimalist spirit of what I was trying to achieve here.

So I went googling to see if anyone had written a linux Google Cloudprint client.

And they have

One of the advantages is that once installed, all your Google Cloudprint printers become available, meaning that you can save stuff as a pdf to Google Drive - quite neat as an alternative to emailing stuff to Dropbox, which is what I've been doing up to now.

Installation onto Crunchbang wasn't quite as easy as it should be but the following script works (for me anyway, your mileage may vary):

dpkg -i cupscloudprint_20140814.2-1_all.deb
apt-get -f install

Obviously you need to run these commands with sudo (or as root) rather than as an ordinary user.

You also need to have a web browser installed on your machine. The install script will prompt you for the google account name you want to use and generate a magic url you need to paste into the browser url bar.

Google will prompt you to login and then generate a keycode you need then to copy and paste back into your terminal window to complete the authentication key.

And it works. For a writing machine it of course also means that you can work on something on the bus, and using the wifi, queue something for printing and proof reading at home ...

Friday, 10 July 2015

Not using a smartphone (sort of)

Sometime ago I wrote about not using a smartphone.

Well we took the Asha on our recent European trip, with a Go-Sim travel SIM to save us having to buy (and toss) multiple SIMS.

All in all it was a success. Great for making calls, great battery life. And when you're travelling, making calls is what you do – call taxis, call hotels and restaurants to say you're stuck in traffic, and the rest. And when it came to sending texts the Blackberry type keyboard was faster and more accurate than the usual glass smartphone keyboard.

And it did the job wonderfully.

The calls worked out pretty cheap as well – out of the $30 credit we started out with we came back with just under $20 credit.

There is of course a caveat – almost everywhere we went there was free zippy wi-fi, which meant I could use my tablet for google maps, we could look stuff up, check the weather, send tweets and emails. This of course meant carrying two devices, sometimes three, but given that we usually had a backpack for extra jumpers, rainjackets and so on this wasn't a big ask.

Now that of course is not quite the same as real life. But given that increasingly I take a tablet everywhere as a note taker one has to ask whether or not one needs a smartphone as well, and I have to say it's only the convenience factor.

If my current phone was to die, I could certainly live with the Asha while I sorted out a replacement, and to be honest, if your life revolves around phone calls and texts, I'd pick the Asha over a smartphone for one simple reason - battery life; three or four days without a recharge is pretty remarkable these days ...

Pinterest as a visual research diary ...

Recently, I've been spending some time with Pinterest.

For those of you who haven't played with it, Pinterest is a scrapbooking application that lets you save and organise images.

Now I have a deep interest in that nebulous period centred around the Russian revolution, which of course spills over into the events related to the end of the Austro-Hungarian empire and the reshaping of the European map into something like the one we know today.

So, this is a period that interests me, and unlike my other great love, the world of late antiquity, one that was documented in pictures. Often small scratchy pictures, taken on small simple Kodak cameras, but pictures nevertheless.

And over the last few years various digitisation initiatives to put World War 1 material online have had the indirect effect of putting a lot of photographic material relating to that period online.

And there's a lot. German soldiers who were amateur photographers were encouraged to take their cameras with them - something that was not the case with the British - and the Romanovs were inveterate picture takers. There's also a vast wealth of material from the old Habsburg lands and, more generally, from the successor states to the Soviet Union.

So the first problem I faced was how to archive the material – you never know, one day I might turn it from a hobby to something more serious. My first thought was Evernote which I use to organise and store print material.

The only problem I found is that while text is searchable, unless you tag images correctly and consistently, finding images is a tedious process. You can't look at a pile of images on the screen in order to select the image you want.

I then thought about using Omeka. It's very powerful, but it's more a tool to assemble information than one to capture content. It would definitely have a role in putting together and presenting material, but not in capturing it.

And then I thought about J's visual diaries – which are basically books full of doodles, images and written notes and how she spends a lot of time with iPhoto organising material and indeed archiving scanned sketches and drawings to iPhoto.

So the answer seemed to be a web application that allowed you to easily capture visual content. And Pinterest seems to fit the bill as a first pass capture tool. It's not about telling a story, it's about assembling the material to tell a story.

Obviously, I still need to extract the images that I saved to Evernote and load them into Pinterest, and to find a way to get the material out of Pinterest again - I can see myself building an Omeka exhibition eventually - but it seems to do the job with a minimum of fuss ...

Friday, 3 July 2015

Further thoughts on Lodlam 2015

The Lodlam 2015 event was pretty interesting, and I came away all enthusiastic about linked data and what you could do with it.

However, my bag is research infrastructure provision, not research itself. To be sure I have a couple of play projects to teach myself about stuff to better inform/help/advise clients, but they are just that - play projects.

So how to raise the profile of linked data in a research enablement context?

After all we're not funded to do projects (not strictly true - we can be but someone has to tell us to do it), and the experience of Project Bamboo suggests that building elaborate infrastructure is not the way.

Likewise simply providing storage and data management skills isn't going to provide that degree of enablement either.

Probably what it comes down to is talking to people, showing people examples, and perhaps showing the play projects - basically we need a showcase and some demo code.

The only infrastructure required first time around is an old laptop, a copy of Ubuntu, and the ability to use an editor ...

Wednesday, 1 July 2015

Lodlam 2015

I’ve just spent the last two days at the Lodlam summit in Sydney.

Lodlam - Linked Open Data in Libraries, Archives and Museums - was an invitation only event loosely linked to the Digital Humanities 2015 conference also on in Sydney at the same time and I was lucky enough to get an invitation to the LodLam event.

The event used the unconference format - rather than a formal agenda, it was a set of birds of a feather sessions where people proposed topics and groups met and discussed them. At its best it was pretty powerful, as it allowed discussion among people who were motivated and interested in the topic, and one could get some good discussion and insights - after all it's the discussion that often makes conferences valuable, rather than the presentations themselves.

At its worst it was rather less valuable - the unconference thing breaks down in two ways: when one or two loud talkers dominate a discussion - which didn't really happen this time - or alternatively when particular sessions get too large for informality and some structure and mediation is needed. Controlled anarchy is productive, but sometimes a little scaffolding helps.

I basically spent my time talking about disambiguation and entity recognition, topics near to my heart at the moment, but I was also inspired to revisit some of my experiments with R and text analysis, not to mention to play with some of the Python natural language toolkits.

Along the way I think I also found the ideal note taking solution - type brief notes into Markdrop on a tablet, sync them to Dropbox, clean them up, generate a pdf, and dump that in Evernote, from where it can be shared out as a link. I found my Samsung tablet had just enough battery life to last a day - something that was a bit of a problem sometimes with my older 7” tablet. It might be worth trying this on a larger format tablet, as the onscreen keyboard size was a little restricting for my fat fingers, leading to more typos than were strictly necessary.

All in all it was a good event, a little nerdy, but quite inspiring to see what people are doing with open linked data ...

Wednesday, 10 June 2015

Travel computing ...

I'm back from five weeks away, travelling to Vienna, Budapest, Slovenia and Croatia.

Over the years I've agonised about whether a tablet can truly replace a laptop for computing while travelling. I now have the definitive answer - it can (but with a couple of caveats).

I took a little seven inch Samsung tablet with me and the same netbook I took to Sri Lanka in 2013, albeit upgraded to Xubuntu from Windows 7.

The netbook spent most of the trip in its travel bag. I thought we might end up in a couple of hotels with only fixed internet, but in the event free wifi was everywhere - most strikingly in Croatia where cafes made it available by default, and hotels all had pretty zippy wifi (and for free).

The only time the netbook saw serious use was to back up camera SD cards to Dropbox.  I had thought I might do some writing while I was away, but in the event I didn't - which had been my main reason for taking the netbook in the first place.

I did still keep a travel journal, but as I've always done, I wrote my notes in longhand in a Moleskine notebook - incidentally the same one I've been using since my Laos trip in 2005.

I did end up using the netbook to book trains on both Deutsche Bahn and its Austrian equivalent OBB - purely because both companies' English language websites were easier to use with a mouse and a keyboard than via an onscreen tablet keyboard that covered half the screen.

Now I don't doubt that both DB and OBB have excellent applications, but as an occasional user - one who needs to make two bookings and amend another - I'm going to use the website rather than download and install the apps, especially as I can use it in English.

The same goes for booking a flight on Adria - I could have done it using Expedia's app, but it was cheaper to book directly, and again their English language site worked better on a laptop.

There is a follow up to this - I failed to find somewhere to print the booking confirmation docket, so at check in showed the clerk a pdf copy on my tablet which he was happy to accept.

So, for even quite a long trip, you can basically do everything you want with a tablet provided you've got wifi.

For a business trip it depends what you're doing. If you need spreadsheets and numbers, undoubtedly a laptop. For reviewing documents, a full size tablet and a decent hardback notebook would probably do ...

Tuesday, 21 April 2015

These days are past now ...

Microsoft has introduced this feature called Clutter to Office 365 - basically the system learns what you always ignore or delete, and moves it to a folder called Clutter, where you can set up an auto delete rule to get rid of the content after a decent interval.

Anyway, in my clutter folder were a pile of emails from various Jiscmail mailing lists, including a couple of lists I started myself some twenty or so years ago when I worked in the UK and was involved in the support of enduser computing - mainly windows and thin client (remember them?) stuff.

Well, the world has changed immensely since then.

Enduser computing is essentially a commodity - hardware is vastly simplified, with no need these days to specify video adapters or network hardware - basically you can go to any of the big box stores and just about anything you can buy will do the job.

Likewise operating systems and network configurations - it's become immensely simple and black arts such as building boot volumes or building network configurations are mostly behind us, and the plethora of file services on offer such as OneDrive, Google Drive, Dropbox and the rest make network storage provision increasingly irrelevant.

And as these things become simpler I've moved away from enduser support and now work principally in data management and archiving.

So, I dumped out the headers of these mailing lists, found the unsubscribe instructions, and did the necessary.

I did feel a momentary twinge though ...

Friday, 10 April 2015

Microsoft using WindowsUpdate to spruik Windows 10 ...

So Microsoft have decided to use the Windows Update mechanism to sneak adverts for Windows 10 onto PCs worldwide.

This is really bad. Basically it's a Snapfish moment for Windows Update and destroys trust in the update mechanism.

It flushes years of educating users to apply windows updates religiously down the toilet.

Debian anyone ?

Thursday, 9 April 2015

Netflix angst

There's been some angst recently due to the arrival of Netflix and its impact on Australia's shaky internet infrastructure. So far the consensus seems to be that Netflix (and its rivals Stan and Presto) are pushing a shaky house of cards over the edge.

This isn't surprising. For years at Chez Moncur we were troubled by unstable internet and service dropouts. Things got so bad I eventually bought myself a 3G router so that it would fail over to a 3G service whenever the ADSL did a walkabout.

Well, that worked a treat, and about a month after I set up the 3G router I found an alternative ISP who'd provide us with an ADSL service (quite a few of the major players declined to offer us a service as we lived in an ADSL not-spot).

For whatever reason, our new ISP's service has been incredibly stable, if a trifle slow at times - so much so that the router only failed over to 3G three or four times in the whole year.

Then came Netflix and its competitors.

Since then we've had as many flipovers in four weeks as in the previous year, and always in the early evening around six o'clock.

I can't of course prove it's due to Netflix but I'd say it was a reasonable guess.

The autumn school holidays start next week in Canberra - what happens to our link could be interesting ...

Tuesday, 7 April 2015

5 years of the iPad ...

Last weekend, as well as being the Easter holiday was the fifth anniversary of the iPad, a device which has undoubtedly changed the world.

The iPad wasn't the first such device - but earlier tablets had been slow, with clumsy stylus based interfaces - and they'd been heavy, with comparatively short battery life.

The iPad got it right - reasonable battery life, wifi available in lots of places, and suddenly one could carry a single device with all your meeting notes, photographs and the rest.

It could have been a flop. It wasn't. The success (or lack of it) of its various Android competitors shows just how effective Apple's marketing was.

The iPad changed things. Finnish paper manufacturers blame the decline of their industry on it. Airlines let you use them to access streaming media on flights.

Things have changed, and the iPad has been one of the engines moving us over into a truly online world ...

Monday, 30 March 2015


Not having a smartphone is apparently a thing.

And of course I'm well known as a luddite when it comes to phones, but I recently bought myself a new phone. Not exactly a smartphone but a bin-end unlocked Nokia Asha 302.

Not as my main phone, but as a phone to use when travelling overseas.

You see, I have one of these travel sims that allow you to make low cost calls and send texts for pennies when overseas without incurring punitive roaming charges. It also doesn't come with a data bundle, although I could add one, but in practice you can get everything you need over wi-fi.

So, the Nokia has:

  • wifi
  • basic browser capability
  • excellent battery life
  • keyboard for ease of texting (like in German to a taxi company)
  • good sound quality
  • lightweight
All in all, a very good phone.

It should have email, but Nokia had this system where they collected your mail, stripped, textified, and compressed it for you and then downloaded it to your phone, and Microsoft closed this service when they bought Nokia's phone division.

Tant pis! - the browser still works and is good enough to find an email and get that phone number you're looking for out of an email - and I've got wifi.

It's also got a reasonable camera and bluetooth, so it doesn't lack connectivity. And of course it was considerably cheaper than a smartphone.

The next thing is to see how it works out in practice I guess ...

Friday, 20 March 2015

Upgrading to Omeka 2.3 via the command line

Having got a nicely working Omeka install, I thought I'd see if I could break it by upgrading to the latest version. I'm glad to say it didn't break; here's what I did:
  1. Make sure your system is fully patched

    sudo apt-get update && sudo apt-get upgrade
  2. Backup your mysql database

    mysqldump -h localhost -u db_username -p omeka_db_name > omeka_backup_file.sql
  3. Deactivate any plugins you are using as per the official upgrade instructions

  4. copy db.ini somewhere safe (my omeka install is in /var/www)

    cd /var/www
    mkdir /home/username/omeka_backup
    sudo cp db.ini /home/username/omeka_backup/.
    if you've added any extra plugins you'll also need to back them up. It's a good idea to take a screenshot of the file listings so you remember the permissions.

  5. backup your omeka install just in case

    sudo tar -zcvf /home/username/omeka_backup/omeka_2.gz /var/www

  6. download the updated version of omeka from the Omeka site and unzip it in your home directory

    cd /home/username

  7. delete the old install - you only need to get rid of the subdirectories, in practice you will overwrite any existing files in the top level directory. Do not remove your files directory - this contains your content. If you have custom plugins or themes you may wish to leave these in place as well.

    cd /var/www
    sudo rm -rf install
    sudo rm -rf plugins
    sudo rm -rf themes
    sudo rm -rf admin
    sudo rm -rf application

  8. then copy the new release in place

    cd /home/username
    sudo mv omeka-2.3/* /var/www/
    sudo mv omeka-2.3/.htaccess /var/www/

  9. make sure that all the permissions are correct

    cd /var/www 
    sudo find . -type d | xargs sudo chmod 775
    sudo find . -type f | xargs sudo chmod 664
    sudo find files -type d | xargs sudo chmod 777
    sudo find files -type f | xargs sudo chmod 666

  10. copy back your db.ini file (and any extras you'd installed)

    cd /var/www
    sudo cp /home/username/omeka_backup/db.ini .

  11. restart apache
    sudo /etc/init.d/apache2 restart
Now point your web browser at your site. You may well get a message about the site being unavailable while the upgrade completes. Go to the admin page and click on the database upgrade button.

Once complete, your site should just work. As always your mileage may vary, but this procedure worked well on my test install.
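The permission fixes in step 9 are the part most likely to go wrong, and the find/xargs pattern can be rehearsed safely on a scratch directory first. A sketch (the /tmp/omeka_demo tree is just a stand-in for a real install, not part of the actual procedure):

```shell
# build a scratch tree that stands in for an Omeka install
mkdir -p /tmp/omeka_demo/files/sub
touch /tmp/omeka_demo/index.php /tmp/omeka_demo/files/sub/item.jpg
cd /tmp/omeka_demo

# directories 775 and files 664 across the tree ...
find . -type d | xargs chmod 775
find . -type f | xargs chmod 664
# ... then open up the writable files/ tree for the web server
find files -type d | xargs chmod 777
find files -type f | xargs chmod 666

stat -c '%a %n' index.php files/sub files/sub/item.jpg
# → 664 index.php
# → 777 files/sub
# → 666 files/sub/item.jpg
```

On the real install the same commands are run under sudo from /var/www, exactly as in step 9.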

Wednesday, 18 March 2015

Nixnote 1.6 and an OAuth error

When we go to Europe this winter I'll be taking my newly Xubuntu-ized netbook with me, as well as the usual tablet etc.

Now I've stored copies of the various plane and train dockets in Evernote, so I thought it would make sense to install Nixnote, the third party linux Evernote client that I've reviewed previously.

As always, I thought I'd test it by first upgrading the version on my work linux machine to the latest release, 1.6.

And it broke.

But fortunately there's an easy fix. The summary of what you need to do goes like this:

Download the latest 1.6 stable release from the SourceForge repository.

Install it with the command

sudo dpkg -i ~/Downloads/nixnote-1.6_something.deb

(replace the something with the version for your architecture eg i386)

Then, following the instructions, download the fixed nixnote.jar

Update the nixnote install as follows:

cd /usr/share/nixnote
sudo cp nixnote.jar nixnote.jar.old
sudo cp ~/Downloads/nixnote.jar .
nixnote &

Nixnote should start up and you should be prompted to authenticate against Evernote and authorise the Nixnote app to access your data.

As always your mileage may vary and the location of files, including the download folder may differ on your system.
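The jar swap itself is just the usual backup-then-replace pattern; it can be rehearsed without sudo on scratch files. A sketch (the file names mirror the real ones above but are stand-ins, and the directory is a scratch location):

```shell
# scratch directory standing in for /usr/share/nixnote
mkdir -p /tmp/nixnote_demo && cd /tmp/nixnote_demo
echo "broken jar" > nixnote.jar        # stands in for the 1.6 jar with the OAuth bug
echo "fixed jar" > nixnote.jar.new     # stands in for the downloaded fix

cp nixnote.jar nixnote.jar.old         # keep the old build, just in case
cp nixnote.jar.new nixnote.jar         # drop the fixed build into place

cat nixnote.jar
# → fixed jar
```

If the fixed jar turns out worse than the original, copying nixnote.jar.old back restores the starting point.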

Friday, 13 March 2015

Shiny things update

We had a power outage at lunchtime today that took out most of campus, and meant that all the buildings were evacuated while it was fixed.

However, the wifi network, being protected by a UPS, kept going, with the result that all the students who'd been ejected could keep on working by sitting on a bench, under a tree, or what have you.

So, as is my wont, I took a walk to make an informal count of what students are using.

And the answer's Apple.

A lot of MacBook Airs and quite a few MacBook Pros. There were a reasonable number of Windows laptops as well, which I'll guess were mostly Windows 8, purely because they all looked reasonably new - I had no way of sorting out the Windows 7 machines from the Windows 8 ones. Most of the Windows machines were 15" screen models.

Few if any students were using a tablet - I guess if you have to carry one device, you'd carry a laptop rather than a tablet.

In contrast, quite a few of the staff members who'd had the presence of mind to grab a device on the way out had grabbed older, and quite often well used looking, laptops (mostly Windows rather than Macs, although there were some Mac users). Again, not a lot of tablets in view, and no small format computers.

Of the students I'd guess 60-65% were Mac users, as opposed to 25% of staff.

Long term this might have implications for the future of Windows - if a lot of the people who will potentially become significant users of computing have migrated away from Windows, it suggests that that revenue stream may diminish.

But then of course, there's always Office, which still seems to retain its stranglehold on the wordprocessor and spreadsheet market ...

Bodhi Linux

Ever since the demise of Crunchbang I've been on the lookout for an alternative distro for the Eee in case I ever need to do a sidegrade.

Hopefully I won't, it's currently working well for me, but as we all know, it's the unknowns that get you, not the knowns.

So I thought I'd take a gander at Bodhi Linux, which is a lightweight distribution that uses Enlightenment as a window manager. I'd never used Enlightenment, so even though Bodhi is based on Ubuntu, I thought it might be an interesting experience.

As always with these tests, I used a virtual machine. This time around though, rather than build my own I used the VirtualBox image available from OSBoxes to save build time.

The image comes compressed with 7zip, and needs to be decompressed first but it loads and runs first time.
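Decompressing is a one-liner if you have the p7zip tools installed; the sketch below just guards against them being absent (the archive name in the usage example is illustrative - use whatever OSBoxes actually supplies):

```shell
# Sketch: unpack an OSBoxes .7z image before importing it into VirtualBox
extract_image() {
  # p7zip may not be installed by default on Debian-derived systems
  command -v 7z >/dev/null 2>&1 || {
    echo "7z not found - try: sudo apt-get install p7zip-full" >&2
    return 1
  }
  7z x "$1"   # extracts the disk image alongside the archive
}
```

So something like `extract_image BodhiLinux.7z` (hypothetical filename) would leave the `.vdi` ready to attach to a new VirtualBox machine.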

Bodhi Linux is however a little less impressive. The application set (or at least the set that comes with the pre-rolled image) is sparse to say the least, and the performance of the window manager is slow. Bear in mind that this is being run on a virtual machine, and that the virtual machine has been configured as per the OSBoxes instructions - as always your mileage may vary.

In comparison I can build a Debian VM on the same VirtualBox installation and get considerably sharper performance.

Just for fun (and I probably should get out more) I installed an old Fluxbuntu image, and that also gave considerably sharper performance, even if the repositories etc were in severe need of an update. This leads me to think that perhaps Bodhi isn't quite as lightweight as I'm looking for, and some other distro with either Fluxbox or Openbox as window manager might do the job ...

Tuesday, 10 March 2015

Easy three-step guide to building Omeka on Debian

Following my success with building Omeka on a Debian VM, I thought I'd write down how I did it before I forget.

It's not rocket science, all I did was follow the bouncing ball ...
  1. Obtain the Debian netinstall ISO – I used the 7.8.30 version
    1. this assumes that the device you are installing to has a network connection
    2. this install will download software from the internet
  2. Build Debian on the machine of your choice. You will need administrative access
    1. you will need to decide during the install process whether to be command line only or to install a desktop. You do not need a desktop to install Omeka
    2. some of the utilities required may already be installed
    3. remember to do sudo apt-get update followed by sudo apt-get upgrade at the end of the build process to ensure that all repositories and updates are in place. Using the netinstall ISO should mean that you have few, if any, updates
  3. Follow the Omeka command line install script at
    1. the latest version of Omeka is Omeka 2.2.2 – replace all references to omeka-2.0 with omeka-2.2.2 throughout the install script - for example the wget command should read wget
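The version substitution in step 3.1 can be done mechanically rather than by hand-editing every line. A minimal sketch, assuming you've saved a local copy of the install script (the filename in the usage example is made up):

```shell
# Rewrite every omeka-2.0 reference to omeka-2.2.2 (prints the result to
# stdout; add -i to sed to edit the file in place once you're happy with it)
update_version() {
  sed 's/omeka-2\.0/omeka-2.2.2/g' "$1"
}
```

So `update_version install-omeka.sh` would, for instance, turn a `wget .../omeka-2.0.zip` line into `wget .../omeka-2.2.2.zip`.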

It should just work, but as always, your mileage may vary ...