Wednesday, 25 January 2012

dropbox, evernote and the digital repository

Over lunch I watched a couple of videos from the DepositMo project, and one thing that grabbed my attention was the way one of the speakers referred to repositories as 'dropboxes'.

Conventionally they are quite different and do different things:

A dropbox is where you put stuff you either want to move or share between different machines, eg home and work, or share with a small number of other people. The files have no context, ie no metadata at all, except any you create by way of the file name, eg picture_of_lily.jpg. Only you of course know whether it's a picture of a lily or a person, dog, cat, hamster or whatever called Lily. In fact a quick look at wikipedia's disambiguation page for lily shows an even wider range of possibilities for 'lily'. And of course just because it's called .jpg doesn't mean it has to be a JPEG.

So dropbox items are context free. You may create a set of naming and directory conventions but they mean nothing to anyone but you. An example might be the mp3 of a presentation that you transfer to home, download to an mp3 player and listen to on the bus. You might give it a meaningful name or you might simply call it preso.mp3. As long as the name means something to you, that's all that matters.

Evernote, or indeed OneNote, is different. You could simply treat it as a dropbox, but in fact it lends itself to organising data, and the natural trend is to group data thematically. Therefore I have material grouped by project, so I can find anything that relates to DC7D. I can also add metadata as tags, eg 'invoices', so I can search for all invoices or indeed only the invoices referring to DC7D.

This is of course reliant on me being organised, but at least instead of picture_of_lily.jpg I have a pictures notebook, with an entry tagged 'Lily'.

If I've done my tagging right I end up with a pile of data organised as a de facto folksonomy. Thus in my pictures folder I have a picture of Wen Xiu, the mistress of Pu Yi, and it is tagged 'China' and 'Manchuria'. (I also have a folder of material related to Manchuria, some of which is tagged 'Russia' and 'Korea', which contains material relating to a personal project, which may or may not turn into something about writers and journalists in 1930's China - the point being it makes sense to me to classify things that way, not due to strict logic. A folksonomy is a contextual aid to organisation, not a substitute classification schema.)
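
To make the tag search concrete, here's a minimal sketch in Python (the note titles and tags are invented for illustration) of the sort of tag intersection lookup that Evernote does for you:

# a handful of notes, each with a free-form set of tags (a folksonomy)
notes = [
    {"title": "DC7D invoice March", "tags": {"DC7D", "invoices"}},
    {"title": "DC7D project plan", "tags": {"DC7D"}},
    {"title": "Wen Xiu photograph", "tags": {"China", "Manchuria"}},
]

def find(notes, *wanted):
    """Return the notes carrying every one of the wanted tags."""
    return [n for n in notes if set(wanted) <= n["tags"]]

print(find(notes, "invoices"))           # all the invoices
print(find(notes, "invoices", "DC7D"))   # only the DC7D invoices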

A repository is of course something else. In the classic model it is a collection of published documents about which we need to know a number of standard things: basically who, what, where, and the format. In a digital preservation system - say one for holding electronic versions of historic documents, such as early pictures and recordings of Yolngu ceremonial events - we never want to change things. In a repository of research preprints we may want to replace items with corrected versions of documents.
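
As a rough illustration, the 'who, what, where and format' of a classic repository record amounts to little more than a handful of fields - the field names below are mine, purely illustrative, and not drawn from any particular metadata schema:

# a minimal who/what/where/format record - illustrative field names only
record = {
    "creator": "...",         # who
    "title": "...",           # what
    "coverage": "...",        # where
    "date": "...",
    "format": "audio/x-wav",  # the format
    "identifier": "...",      # some sort of persistent identifier
}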

Of course we may want to transfer the content to a curated system such as that being developed by Project Bamboo and add value by creating a transcript of the sound recording and an English language translation of the transcript and annotations for the image data.

As this work is revisable we may of course want to put it in a transcripts and annotations repository, separate from the preservation repository.

Just to muddy the waters we could imagine a work in progress repository, where updates to a document are regularly submitted but the basic metadata remains the same. In fact we should probably just admit that repositories are really (just) content management systems: it's a repository when used by librarians, a preservation system when used by archivists and a CMS when used by everyone else. Architecturally they're the same, it's just that the workflows around how content is ingested, retrieved, displayed and disposed of differ.

However let's assume that when we say repository we mean a system that has the characteristics of using standard metadata and containing objects subject to little or no revision, as in any classic university research repository.

Digital asset management systems, or preservation repositories, are effectively the same. Not quite: they also need systems in place to maintain the integrity of the data, and more complex metadata and access control models.

This might lead you to think that a data repository was effectively the same. After all, if you digitise an audio recording using a specialised digitisation workstation such as a Quadriga, you capture some machine-generated technical information which is typically embedded in the technical metadata section of a WAV file, perhaps with some added vendor extension fields.

The preservation repository ingest process would typically both extract the technical metadata from the file and add some human-created metadata - the who, what, where component.
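
As a sketch of the machine half of that, Python's standard wave module will pull the basic technical parameters out of a WAV file (it only reads the core format information, so BWF or vendor extension fields would need a fuller parser), leaving the who/what/where to be supplied by a human; the file name here is just a placeholder:

import wave

def technical_metadata(path):
    """Read the basic format parameters embedded in a WAV file."""
    with wave.open(path, "rb") as w:
        return {
            "channels": w.getnchannels(),
            "sample_rate": w.getframerate(),
            "bit_depth": w.getsampwidth() * 8,
            "duration_seconds": w.getnframes() / w.getframerate(),
        }

# the human-created half of the record - who, what, where
descriptive = {"creator": "...", "title": "...", "location": "..."}

item = {**technical_metadata("recording.wav"), **descriptive}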

Functionally the process is the same as acquiring information from any other instrument, be it a seismometer, a radio telescope, or whatever.

Except it's not. When you are preserving data you are preserving 1's and 0's. Unlike TIFFs or WAVs there are no rules about data or format integrity checks; all you have is the metadata, either that entered by humans or acquired from instruments. Even though we pay lip service to using standard schemas, really it's much more like an Evernote notebook, with some tags and information that make sense to the user or groups of users, plus a human readable description of what all the columns in the data mean. Without that it's meaningless. Context is everything with data. At least with an image you can guess what it might be showing. A spreadsheet or set of spreadsheets can be utterly opaque.
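
Which is why a data set really needs to carry its own data dictionary. Something as simple as the sketch below would do - the column names and descriptions are invented for illustration:

# without something like this the spreadsheet is just numbers
data_dictionary = {
    "site_id": "survey quadrat identifier - see the readme for the numbering scheme",
    "species": "species name, as used on the field crib sheet",
    "abundance": "estimated percentage cover, 0-100",
    "date": "date of observation, YYYY-MM-DD",
}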

As a system a data repository looks a lot more like a software archive, such as that run by Mirrorservice.org, than a classic DSpace implementation. Yes, it needs to speak something standard such as RIF-CS to produce standard descriptions of the items, but unlike a print or image repository, where we know implicitly how to deal with the different media types, we have no idea how to deal with the data stored in the object.

A data repository is a collection of data objects stored according to a standard set of rules. So, just as we expect a software tar file to unpack and show a readme, a manifest, and perhaps a makefile we should expect a data set to unpack to contain the data, the technical metadata, and a description of the files, both their structure and significance, a bit like you get with either SEED or FITS.
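
A minimal sketch of that packaging idea, assuming a loosely BagIt-like layout of my own invention rather than any particular standard: walk the package, record a checksum for every file, and keep the readme and metadata alongside the data:

import hashlib
import os

def write_manifest(package_dir):
    """Walk a data package and record a SHA-256 checksum for every file in it."""
    lines = []
    for root, _, files in os.walk(package_dir):
        for name in sorted(files):
            if name == "manifest.txt":
                continue
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            lines.append(f"{digest}  {os.path.relpath(path, package_dir)}")
    with open(os.path.join(package_dir, "manifest.txt"), "w") as m:
        m.write("\n".join(lines) + "\n")

# hypothetical layout:
#   mydataset/readme.txt       - what the files are and what the columns mean
#   mydataset/metadata.xml     - the who/what/where record
#   mydataset/data/survey.csv  - the data itself
write_manifest("mydataset")

Unpack that anywhere and the readme tells you what you have, while the manifest tells you whether it arrived intact.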

So, when we build a data repository are we really building something more like an Evernote for data, rather than a DSpace for data?

And in that context should we simply use off the shelf CMS technology, such as Alfresco, rather than a dedicated repository application?



Tuesday, 24 January 2012

Travel writers and typewriters


Back at the start of the month I wrote a post inspired by a NYT article on Matthew Kirschenbaum's work on the literary history of word processing.

Last night, because I rode the bus home, I finally got around to listening to a podcast of a talk given by Matthew Kirschenbaum at the New York Public Library in the middle of December.

Listening to the talk I was suddenly struck by the thought that the advent of the portable typewriter made travel writing possible.

Certainly Ella Maillart and Peter Fleming took typewriters with them when they crossed the Takla Makan on the way to Kashgar, and Maillart writes of Annemarie Schwarzenbach and herself cramming their portable typewriters into the dickey seat of their Ford when they were driving to Afghanistan in the late 1930's.

Travel writers are essentially journalists. The typewriter liberated them from the need for legibility and, coupled with that great Victorian invention, the regular mail service, meant they could prepare and send their copy while en route. Importantly, by using such rudimentary techniques as carbon paper (which of course lives on in email's cc:) they could also ensure that they retained a copy should their draft get lost in the mail.

Just as the 35mm camera liberated photojournalists and made some of the innovative photography of the 1930’s possible.

Journalists of course have moved on from the typewriter - for a while the Tandy 100 and its successors were popular, due to their lightness and the fact that they could run off standard batteries if necessary, which was probably a great advantage out in the field.

Nowadays journalists use netbooks, or ultrabooks, and complain about the lack of power for charging and internet connectivity.

In fact it’s much like the fun of running field surveys in the days before decent battery life.

For example, in the mid eighties I was responsible for running a couple of small scale botanical surveys. Even though the data was entered into a database, the field surveys were done with squared paper, a pencil and a crib sheet of what species we were looking for and how to estimate abundance.

Lack of suitable portable devices, battery life and the rest. Now you'd use either a laptop or a tablet computer. The same goes for archaeological surveys: technology is adopted when the ease of data entry and battery life make it practical.

And in this there's a lesson. Elsewhere I've written about how some key technologies were what enabled the modern world. A decent reliable postal system, coupled with trains and steamships, enabled communication and commerce. The arrival in 1885 of the safety bicycle, which anyone could ride, meant that people could get somewhere in a day that would previously have been difficult to reach, be they historians documenting medieval buildings or botanists counting plant species.

Likewise the advent of the truck. It's notable that both Joseph Needham and Peter Fleming rode trucks to get to parts of 1930's China not connected to the train system, and Ella Maillart went 'lorry hopping' to Afghanistan, in much the same way as William Dalrymple did some fifty years later in search of Xanadu.

Like the wordprocessor, the typewriter was an enabling technology in that it allowed travel writers and journalists to work in a way that had previously been impossible, by being able to compile notes and prepare material in the field.

And any history of the modern needs to take account of these enabling technologies - even though the end points may have remained essentially the same, the process of getting there became simpler, more immediate, and was suitably flexible to be adapted to a range of needs.

Monday, 23 January 2012

Handwritten - an exhibition at the NLA

If you're in Canberra, as well as all the publicity for the NGA's summer blockbuster exhibition, you may have noticed that some of the buses are carrying adverts for 'Handwritten - ten centuries of manuscripts from the Staatsbibliothek zu Berlin' at the National Library.
Last Sunday, as well as taking an old film camera for a walk round Canberra (part of the photography project) I went to this.

For some inexplicable reason I have always been interested in old documents, something that got me started in digitisation as a means of preservation for the relics of a culture on the other side of the world.
This fascination goes back to when I was a child in Stirling, Scotland, and the main library used to have a display of the town records and transcriptions. And one day I realised that all this stuff about people being fined for having dungheaps where they shouldn't actually told you something about how the town was laid out and how it functioned - something that years later gave me a great deal of insight into the whys of digitisation, preservation and reuse.


The exhibition itself is quite small and unexpectedly popular, so you need to book online even though it's free. This can mean that you get a bunching effect around the hour breaks, so I'd suggest being fashionably late, by which time the crowd will have thinned a bit.


There are some quite nice medieval manuscripts, including a very plain, matter of fact ninth century copy of the Aeneid, as well as some late medieval examples and herbals, but no examples of correspondence or charters from before the mid 1400's (a letter to one of the Medici). From then on it's all the correspondence of the rich and famous: Michelangelo, Volta, Humboldt, Darwin, Cook, Einstein and the rest, not to mention Dostoevsky and Marx.
Most of the letters are fairly prosaic - Darwin, for example, writes about sea temperature - but what is interesting is to track how handwriting evolved, and also what crappy writing the prolific correspondents of any age developed through the need to write a lot, and quickly.


It's also not just text; there's also a selection of handwritten musical manuscripts which demonstrate that the great composers were just as messy as the rest of us.

Inevitably, being from the Staatsbibliothek zu Berlin, there's a focus on German authors and scientists, but even that demonstrates the internationalism of the day - for example the correspondence of the Forsters, who despite their name were solidly German, and who sailed with Cook.
Tellingly, the exhibition ends with a typewritten page.

All in all quite a nice little exhibition, and well worth spending 30-45 minutes enjoying.

There's an exhibition website with booking instructions, plus a blog giving useful background that's worth a read before visiting.

Thursday, 19 January 2012

the day the web went dark ...

Not really of course, but the SOPA black screen protest by Wikipedia and others has prompted an outpouring of angst plus the usual spate of articles about 'could we live without the web?'

One of the better articles was from the New Zealand Herald, which referenced the experience of Egypt during the Arab Spring, but even then failed to mention that one of the side effects of the Egyptian blackout was the loss of e-banking.

That's right, e-banking - paying people electronically - stopped working, and indeed so did the cash machines. That killed a lot of normal business stone dead.

We can manage without a lot of the internet, and certainly without social media such as facebook and twitter. Same goes for Skype, and at a pinch, email (you know it's amazing: you can still print out some text, put it in a wrapper, stick a special payment receipt on it, put it in a dedicated red collecting box and it gets to the other side of the world, and even Gundaroo, in three or four days).

The killer is banking. It's all electronic. No more traveller's cheques, no more cheques, no more bank statements, just track it all online. No more banks even, get your cash from a machine in a wall.

I know from personal experience travelling that you can last without the internet for a few days without difficulty, and being out of email contact can be a relief, but the real killer has always been banking: checking how you're doing, whether or not a particular credit card purchase has gone through, or indeed how much you've been stung for that multi currency ATM withdrawal in Dubai airport.

You even need access to e-banking to top up these special multi currency Visa cards you can get from Travelex.

In fact one of the reasons we started taking the Ookygoo with us was online banking. The other main one was checking flight details and hotel reservations; the only problem was not being able to print boarding passes and the like.

Nowadays most people are happy to scan your phone or tablet; after all it's only the bar code they're really interested in.

So yes, it is possible to go back to a non-internet way of life as long as the legacy services are still there. The moment that you are expected to do anything at all for yourself - online banking, flight confirmations, etc - the wheels come off. And that's the key. One could happily live in a hut in the bush and type letters on an old typewriter, but one couldn't buy anything except what was available locally and could be paid for with cash.

Allow yourself a debit card (remember, no more cheques) and you can order stuff by mail or by phone, and that's giving you all the functionality of the 1980's ...

Wednesday, 18 January 2012

I've published a book !

Well sort of.

If you're a regular reader you'll know that I've periodically ranted on about e-readers, e-books, espresso book machines and the like.

And then I read the recent Guardian article about Amanda Hocking and self publishing.

Curious as ever I got to wondering just how easy it is to self publish. To do that you obviously need something to publish, so I reverted to my account of our trip through Laos and Northern Thailand at the end of 2005.

It only comes out to 47 A4 pages, but on the other hand the free version was picked up by a number of people as a valuable background source, including at least one UN agency. So, we'll assume it has some merit.

Creating the book was simple. Register with Amazon's self publishing service, sort out the bank details so you can get paid, upload the file, do a little bit more work and 'hey presto!' - you're done.

Frighteningly easy.

Now I'll admit I was kind of sloppy. The upload and conversion service offered by Amazon lost some of the page breaks from the pdf version and so on, and I didn't bother fixing the format, but I did check the text and it's all there, and available to purchase as a Kindle book for the grand sum of US$0.99.

I don't seriously expect to get rich out of this; in fact I'd be gratified if I made any money at all out of this experiment.

The key learning is that the Amazon platform makes it incredibly easy to self publish, all you need to do is have something to publish.

The other learning is that while it's ok for hobby publishing, or perhaps for publishing obscure academic works (there are options to also have your book printed through CreateSpace, Amazon's print on demand service, though I haven't explored these), the degree of editing, proof reading and marketing required probably means that the Amanda Hockings of this world will be few and far between. It's telling that she has now opted to have her work handled by an agent and a publisher, in part because of the amount of time proof reading etc was taking up.

If I was a small university publishing house I'd be worried - on the other hand if I was one of the big boys I'd only be mildly concerned ...

Tuesday, 17 January 2012

Using the cheapo MP3 player

Well, I used my cheap MP3 player bought off of ebay for $17 today.

Basically it just works - select a file, click play and it plays it. When you're done you can delete it or keep it. Your choice.

The fact that you add content just by copying it to the player makes the business of listening to ad hoc mp3's of seminars and preso's simple.

In use the menu structure is a little odd, and the thing that looks like an iPod control switch doesn't work the same way, but essentially all the functionality is there, and the sound quality is pretty reasonable too. I've actually got to say I'm quietly impressed with the device. Not as slick as an iPod, but certainly a lot better than my original USB stick MP3 player.

That's the plus. The minus is my decision to use gPodder as a podcast management application. It works, it downloads, it syncs but it's by no means perfect. And it's slooow!

It lacks the comprehensiveness of some of the commercial applications, meaning you quite often have to track down the url of the rss feed for the podcast you are interested in, and then add the feed to your collection manually.

Now if you wanted to crowd source a database of podcast feeds this would be an interesting and perhaps innovative way to build content, and perhaps it's intended as such. Unfortunately the database is eccentric, perhaps reflecting the demographics and interests of the gPodder community.

The other problem is that it doesn't have a two stage sync mechanism. It downloads the sound files to a directory on your pc. It would be nice to have an on demand second stage where it syncs the podcast directory to a particular filesystem path, which, as the mp3 player presents itself as a filesystem, would be a fairly simple thing to do, with the device name and path stored locally in a configuration file. Given that it already has an 'export to local file' function, extending it to do synchronisation should not be too difficult ...
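
The missing second stage is really only a few lines of script. This is a sketch rather than a gPodder feature - it assumes you keep the download directory and the player's mount point in a little config of your own (hard coded here), and it simply copies across anything not already on the player:

import shutil
from pathlib import Path

# both paths are assumptions - in practice they'd live in a small config file
DOWNLOADS = Path.home() / "gPodder" / "Downloads"   # where the episodes get downloaded
PLAYER = Path("/media/MP3PLAYER/podcasts")          # the player, mounted as a disk

def sync():
    """Copy any downloaded episode that isn't already on the player."""
    PLAYER.mkdir(parents=True, exist_ok=True)
    for episode in DOWNLOADS.rglob("*.mp3"):
        target = PLAYER / episode.name
        if not target.exists():
            shutil.copy2(episode, target)
            print("copied", episode.name)

sync()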

However, grumbles aside, the set up basically works and lets me do what I want to do, and $17 plus some free software seems a hell of a lot better than an iPod or an iPhone ...

Wednesday, 11 January 2012

MP3 players and me

There are people who listen to music on an mp3 device almost every day.

I'm not one of them - and while I do like to listen to music at home, in the mornings and when commuting radio is my preferred drug (Newsradio and Classic if you're asking). Radio has never been just background to me and I've always listened to 'serious' radio as a form of entertainment, and over the years I've heard some great performances, some stunning plays and truly informative documentaries, not to mention almost running off the road once due to laughing so much at a political satire show ...

I've always had decent radios, and in the days when I used to go bush I always used to take a halfway decent radio with me, as part of the fun was sitting out in the dark by a fire listening to the radio.

However it's not just been radio; there's been recorded music as well, but that always came second. Over the years I've had various walkmen and so on, but it was always radio I valued, including good serious talk radio. In fact one of my favourite toys (I still have it) is a solar powered radio that I used to use on my walk from the bus to work. Leave it out on a sunny window sill during the day, and there it was, charged, for the journey home.

Life of course changes. Over the years life has got busier and somehow serious radio listening time got squeezed out. Not that I haven't tried to find space for it.

For example, when I started riding my bike regularly to work I bought myself a little USB stick sized MP3 player that had an FM radio. Listening while riding isn't that safe, but I discovered podcasts, so on days when I took the bus rather than rode I found myself taking the USB player and listening to podcasts of BBC talk radio.

The interface on the player was one of the old style, not very intuitive, two line displays, so I ended up replacing it with a 4GB iPod classic (bought from the Apple store in Cupertino, no less). That was truly superb, with excellent sound reproduction, and iTunes provided a wonderful sync mechanism.

However, after two or three years the rotary switch thingie on the front became unreliable, and I never got round to replacing it, and no, I didn't do the obvious and start listening to podcasts on my phone. Instead I stopped listening to podcasts altogether, as to tell the truth I was struggling to find listening time. Driving more and more, rather than using the bus or riding my bike, killed my listening time.

At the same time more and more interesting broadcasts ceased to be available over the web as podcasts, but instead were available as on demand content.

On demand content is fine - I can still listen to interesting shows from the BBC, NPR and RTE, but it does mean having to sit in front of a computer and doesn't have the convenience factor of being able to listen to them in the car via the aux cable, or while doing something else like waiting for a bus, weeding or pruning.

On the other hand I miss my periodic fix of intelligent talk radio, so, as a new year's resolution, I've decided to revisit podcasts. The good news is that old favourites like the BBC's From Our Own Correspondent are still available, so it has legs as an idea, but how to download and play them?

I could of course have gone out and bought myself an iPod and bought back into the whole iTunes thing, but with open source alternatives like gPodder still available, a better solution seemed to be a no-name 8GB MP3 player from ebay for less than $20 - cheap enough to lose, break, or whatever, but chargeable via a standard iPod style cable and mountable as a windows formatted disk. It also comes with an FM radio, meaning I can easily set up presets for Newsradio, Classic and Artsound (no FM ABC local or Radio National in Canberra) for those times when I just want to zone out, or have to ride the bus.

It's also simple to use - just like my old USB stick MP3 player, content is simply added by copying the files to the player. The other thing is that, late to the party as always, I now have a car with an aux socket on the sound system meaning I can plug the player in and listen safely while driving.

So player #3 - let's see if it makes it on to 2012's what worked post ...


Tuesday, 10 January 2012

Office on the iPad

The idea of providing access to Office from the iPad seems to be one of this week's themes, with both CloudOn and OnLive garnering some interest at CES this week.

I've not played with either of them, so I'm not going to pontificate on how well or how badly they work, but there are a couple of features worthy of note.

CloudOn uses dropbox to synchronise documents between your iPad and your main computer - this is actually very clever as (a) just about everyone on the planet has a dropbox account and (b) most people need Office on a tablet for document review and a bit of highlighting/commenting - realistically no one is going to use it to write a 27 page project report.

OnLive is interesting due to its use of thin client technology - something that should have had a lot of traction, but which through a combination of licensing restrictions and the near universal availability of cheap hardware never quite made it to the mainstream.

Now what is interesting is the use of thin client technology to get decent performance on a slow network connection (such as a 3G connection) meaning that you can get access to full featured environment without all the computation overhead implied.

Both products seem to assume an always on network connection - which is not always the case - and certainly don't seem to support offline use. What's also interesting is that they seem to have made little or no use of the Office 365/Windows Live/Skydrive type infrastructure - perhaps due to slow response over slow links, or the lack of an obvious api.

I find the focus on providing full access to Office puzzling, given that tablet pc's tend not to lend themselves to document editing - if you need to edit you need a proper keyboard, etc. If I were to develop an office client for a tablet pc, I think my first approach would be to write an app that essentially functions as a document viewer with some annotation and simple editing functions, and uses the Office 365 infrastructure with local caching and periodic writeback - allowing me to go offline or suffer network dropouts in a fairly seamless manner. In other words, something to give me more or less the functionality of google docs, but generating word format files by default, and accessible to Office on my home machine ....
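
The caching side of that idea is easy enough to sketch. This is purely illustrative - upload_to_server() stands in for whatever Office 365 (or other) API call you'd actually make - but it shows the pattern: edit the local copy immediately, and push changes back whenever the network happens to be there:

import time
from pathlib import Path

CACHE = Path.home() / ".doc_cache"   # local copies live here

def upload_to_server(path):
    """Hypothetical placeholder for the real writeback API call."""
    ...

def edit_locally(doc_name, new_text):
    """Edits always land in the local cache, whether or not we're online."""
    CACHE.mkdir(exist_ok=True)
    (CACHE / doc_name).write_text(new_text)

def writeback_loop(interval=300):
    """Every few minutes, try to push cached documents back to the server."""
    while True:
        for doc in CACHE.glob("*"):
            try:
                upload_to_server(doc)
            except OSError:
                pass   # offline or a dropout - keep the local copy and retry later
        time.sleep(interval)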

Xubuntu

Despite the fact that I no longer use it on a daily basis, I occasionally still dabble with linux.

Everyone tells me that Mint is the new black but I have singularly failed to get it to build a vm on virtualbox, so I resorted to going back to Ubuntu with the latest version of Unity.

Call me prejudiced, but I just plain don't like it - there is something in the user experience that jars. I am not a window manager freak, but in these days when Windows 7 is the most common desktop out there, closely followed by OS X, window managers need to give people a comparable experience - if it looks and feels like windows, 90% of people will never really notice, if all they do is fire up a browser to read mail.

Now there are two distributions that I've used for lightweight virtual machines, Crunchbang and Xubuntu, both of which use lighter weight window managers: OpenBox in the case of Crunchbang and Xfce in the case of Xubuntu.

Of the two, OpenBox is even more cheeseparing in its use of resources than Xfce, and for that reason I would certainly consider Crunchbang for use on an old netbook (or other old laptop), but with Xubuntu as a fall back position due to its better hardware support.

Crunchbang is a good stable distribution, but Xubuntu, in my experience, supports a wider range of hardware and, being built on Ubuntu, is generally slicker, and Xfce conforms more closely to the generic window manager meme than OpenBox.

Certainly building a Xubuntu vm was a fairly slick experience, and I was impressed by the way it picked up it was running on virtualbox and offered to install some extra drivers.

The interface is the interface, and the software manager is neater than it used to be - my usual test of installing kwrite just worked, without any confusing messages about extra libraries etc. The default set of tools is good - abiword, gnumeric etc - but I'd guess that a lot of people using Xubuntu will be doing most of their work in the google ecology.

Unlike Crunchbang, the browser is still firefox, rather than chromium, the open source version of chrome, but that's an easy fix if it's important to you. (Just to complicate matters, the latest distribution of Crunchbang has changed to iceweasel.)

Xubuntu still produces a version for ppc machines, and I'm tempted to try installing it on one of my old glass iMacs - perhaps not the original one, but the second one, which has a little bit more memory. I've always had this idea of having a machine available for visitors to use to check their email, blog or tweet from, and while smartphones have kind of taken over the email checking role, they're not yet universal ...

Sunday, 8 January 2012

magpies, 4-inch nails and precursors of tool use

Our main bathroom doesn't have windows; instead it has a skylight for natural light, and that means you can lie in the bath, see birds (and the occasional Qantas commuter jet) fly overhead, and on one occasion have the cat look down at you through the glass.

Yesterday morning I was wakened just after dawn by a tremendous repeated clatter - my first thought was that the cat had brought a rodent home and was chasing it round the kitchen or the laundry, but no, it was a juvenile Australian magpie repeatedly picking up and dropping a four inch nail on the glass.

I stared at it, torn between trying to get it to bugger off and fascination with the behaviour - the bird looked as if it was trying to break the glass with the nail, perhaps a miscuing of an instinctive response to ice.

By the time I snapped to and went to get my phone to take a picture, the bird had flown off.

Crows are of course clever birds, and some species such as the New Caledonian crow are known for tool use but this is the first time I've heard of or seen such a behaviour in an Australian magpie ...

Thursday, 5 January 2012

Lenovo Ideapad K1

Just before Christmas I bought J a tablet of her own - this time a Lenovo IdeaPad K1, rather than another zPad.

The Lenovo came preloaded with a whole pile of demo apps and games, most of which were completely useless and ended up being deleted. The one bundled app that took a trick was Drawing Pad, which J, as an artist at heart, absolutely loved.

The rest were crap, except for the printershare app which neatly solves the tablet printing problem by using Google Cloudprint - a non Lenovo branded version made it onto the zPad some 30 seconds after I discovered it.

Setup was slicker than the zPad - a gmail address and suddenly email was there. Add the Jorte calendar app and J's Google calendar was there as well.

Now my dear wife is somewhat of a refusenik with technology - she knows how to get it to do the things she wants, but she's not interested in it. The tablet however is something else - checking email, googling for stuff, wikipedia, not to mention news and weather - and again the experience was socialised: sit on the couch, surf, doodle or email, or whatever.

Performance doesn't seem any better than the zPad, and the custom Lenovo Android 3.1 skin doesn't seem quite as intuitive, in some vague intangible way, as the zPad's close copy of the iOS look and feel, but it's a tablet and does tablet things, and all the apps seem to work as reliably as anything else.

The Lenovo cost me $100 more than the zPad, but that still works out $200 less than an iPad, and unlike the zPad comes with a decent warranty.

Certainly worth a look if you're in the market for one, but it's a pity they don't provide a 'no crap' installation and configuration option ...

Tuesday, 3 January 2012

Partitioning Korea in 1896

As I've previously written, Russia has long desired to maintain a buffer state on the Korean peninsula, to counter Japanese influence in the area and help defend Primorye - the very east of Siberia around Vladivostok, and possession of which gives Russia access to the Pacific.

In 1894-5 Japan and the Qing state fought a short war over Korea and Manchuria, in which the Japanese sank most of what China possessed in the way of a modern navy and successfully invaded Manchuria. The result was a fairly humiliating defeat for China.

Imperial Russia took fright at this, rightly fearing long term Japanese aims to expand and colonise Manchuria and Korea, and thus threaten Primorye and eastern Siberia.

In an act of bare faced cheek, Russia proposed to Japan that they partition Korea between them along the 38th parallel, which is today roughly the cease fire line between the DPRK and South Korea.

At the same time, Russia, France and Germany pressured Japan to return territory in Manchuria seized during the 1894-5 war, at which point Russia itself took over the territory returned by Japan - hence the war of 1905, and the whole sorry tale of the DPRK as a Soviet buffer state.

The interesting thing about a lot of the commentary around the death of Kim Jong Il was that it emphasised the role of China in supporting the DPRK, rather than the role of Russia.

Historically it is Russia that has wanted the DPRK to continue to exist, as a buffer to protect the Primorye, first against American forces in Japan and Korea, and latterly against any pre-emptive move by China to regain the territories in the east of Siberia informally ceded by the Qing state to Russia from the 1860's onwards.

China's support is probably pragmatic. A hungry chaotic nuclear armed neighbour is not a nice prospect. One with a stable government, however repressive, is probably a comfortable neighbour, and there always remains the prospect of managed change...




Sunday, 1 January 2012

Literary word processing and empowerment

A few days ago I tweeted a link to a piece in the NYT about the literary history of wordprocessing.
This piqued my interest, as in the mid eighties I worked as a technology evangelist at the University of York in England and taught word processing (Wordstar no less) to undergraduates, while at the same time spending a lot of time doing document conversion between the myriad of different word processors, disk formats and sizes, not to mention delving into WPS+ on Dec Vaxes.
At the time we didn't have a set of public access pc classrooms and consequently were unable to offer a public access wordprocessing service. However the Vaxes came with WPS+, part of the Dec All-in-One office management suite, and a decision was taken to deploy this as a way of meeting the pent up demand for access to wordprocessing from the student body. In hindsight, a wonderfully wrong decision: it gave us a student word processing platform accessible from every timesharing terminal on campus, but one utterly incompatible with anything else on the planet. (We later converted to WordPerfect on VMS, which was compatible with the DOS and Mac versions, and as student PC labs and individually owned PCs became more common the students gradually migrated themselves over to DOS and Windows.)
However, this got me thinking.
People tend to equate the arrival of the PC with the arrival of wordprocessing. This is not the case. There were dedicated word processing systems from IBM, Wang and Dec, all of which were based on the minicomputer/timesharing terminal model, long before the personal desktop computer was anything but a plaything, and there were specialist mainframe type services like Runoff and TeX (which is still with us), originally designed to talk to various high end typesetters for the production of journal articles and the like.
But all of this was used by scientists, lawyers and banks for specialist purposes. For example while TeX supported templates for letters it was not exactly a user friendly free form writing tool.
The things which made wordprocessing were, first of all, escaping from character oriented terminals to something resembling a properly addressable screen – something which good old Wordstar and Wordperfect never quite did – so that the user could see what the text would look like (this was of course part of the appeal of the first Macs: suddenly WYSIWYG was a reality), and secondly the rise of individual computing power, meaning that people could have a computer of their own, at home, in the study, to use when they wanted, without having to go to some ugly concrete data centre, interact with the priesthood who administered the machines (and who tended to come from a scientific programming background and never quite saw the point of wordprocessing), get an account, and then fight for a terminal.
Personal Computers were exactly that - personal - and it was this that set productivity based computing free.
And that’s exactly what wordprocessing was – productivity based computing. No more days spent listening to the radio mindlessly retyping drafts to fix some spelling mistakes, or to restructure paragraphs, take text in, change the order, or take text out. Suddenly editing was easy.
Of course there was a downside - a plethora of wordprocessing applications, all with incompatible document formats. Wordstar, WordPerfect and Word were fairly mainstream, there were suddenly popular applications like AmiPro, and strange ones like NotaBene, preferred by humanities researchers. This meant that documents had to be converted between different formats to be shared, printed, edited and so on - which spawned a whole range of conversion tools and filters, perhaps the best of which was Word for Word, some of whose filters still live on in OpenOffice. Of course nowadays we have a monoculture of Word, with the odd weedy sprout of OpenOffice in the difficult to reach corner ...
Wordprocessing’s unique selling point was the ease of revision, not the ease of production. Making the text look nice was secondary – in a world of monospaced Courier even Computer Modern or Arial suddenly looked sexy.
In fact the appearance of sexy looking documents had to await the arrival of decent inkjet and laser printers in the late eighties, before then all you could hope for was some nicer looking daisywheel printer text.
The appearance of better printing technologies at an affordable price of course caught word processing by surprise – hence the desktop publishing phenomenon of the late eighties/early nineties where text was fed into a separate program to do complicated page layouts. Now of course, applications like Office do it all for you.
But again the DTP phenomenon was about empowerment – no longer was it the case that you had to take your draft and have it properly and expensively typeset for publication. (You may not believe me, but the number of publishers that rekeyboarded text into a typesetting system in the early nineties was phenomenal – the idea of converting between documents and correcting any introduced artefacts never seemed to gel with the professional typesetting community.)
DTP and wordprocessing meant that you could produce good looking text and text for filmsetting from your desktop – not exactly self publishing but it allowed authors and manual writers control of the publication process.
Likewise multi lingual, multi alphabet text was a breeze – I remember in the early nineties, after the fall of the Soviet Union, going to Nerja in Spain and being amazed by the number of real estate agents with badly spelled, badly handwritten Russian language adverts for villas (always with a swimming pool). For a moment I seriously thought of setting up in an office with a Mac and a Laserwriter producing Russian language real estate display ads.
So, wordprocessing was about empowerment. It made the production of text easier by making the tasks of revision and publication simpler and meant that the drudgery was taken out of the writing process. It still meant that the process of creation was the same, but it gave control to the author – no longer having to fight with the copy bureau or indeed pay for the quite considerable costs of having the final version of a draft, or a thesis professionally produced, or indeed the need to have a publisher’s advance to defray the costs of production.
Incidentally it could probably be argued that scientific journals’ publication model is purely a result of the costs of rekeyboarding and typesetting journals in the seventies and eighties – as arxiv.org and others in the open access journal business demonstrate, that’s no longer the case, but that’s an argument for a different day.
Literary wordprocessing probably lowered the barriers to publication – easier to produce a revised draft, easier to get published without the need for agents and substantial advances. In short it made it easier for people who wanted to write to write, just as nowadays anyone can produce an e-book from their desktop and distribute it themselves – which may be the saving of obscure and learned texts. More mainstream publication still benefits from advertising and promotion, but for how long is an open question – the music industry has been through this already ….