Thursday, 31 July 2008

medieval underwear and literacy

At the risk of attracting more spam trying to sell me chain mail knickers, I followed up the medieval underwear and literacy theory mentioned in my earlier post about Cuil. (I've always had a weakness for odd-sounding ideas.)

And it's actually more plausible than you might think. The logic goes something like this:

As more people moved to towns, more people started wearing underwear - and underwear made of better-quality materials produced by professional weavers, rather than homespun garments.

And we're being very general here - we know from marginalia in manuscripts and from other sources that the sight of peasants with their wedding tackle dangling below their tunic hemline was a source of ribaldry, as was the sight of fine ladies falling off horses and inadvertently showing their bums. So think chemises and shifts rather than knickers. Knickers only really became common when people started wearing trousers and caring about skidmarks.

At the same time as towns grew, there was an increased demand for records and documents as life became more complex. Coincident with increased urbanisation, papermaking became common in western Europe, and to make paper you need rags - preferably good-quality rags with long fibre lengths. And a ready source of these was worn-out linen shifts. And because there were more people in towns, it meant that there was enough good material available for rag collectors to operate profitably and sell the rags to the papermakers.

And what initially seemed a loony idea is suddenly sensible - it's not that the increased wearing of underwear resulted in increased literacy, it's that the increased use of finer materials for undergarments provided a ready source of raw material for papermakers when these garments were discarded.

Compare this with the Roman period, where a fair proportion of the population was literate but, lacking paper, wrote on wood slats and pottery fragments as well as papyrus.

Comment spam ...

Maybe it's a new trend, or maybe it's my fault for mentioning medieval undergarments on this blog, but in the last two days I've had two spam comments, one from technosnack, and one from a company wanting to sell me 'exotic' underwear.

At least technosnack had looked at the content ...

Tuesday, 29 July 2008

Cuil and medieval underpants ...

Like everyone else today, I've been trying out Cuil, the new search engine. And I've come away unconvinced.

First of all I searched for my name. Always a good one, given that I've a lot of detritus on the internet and share my slightly uncommon surname with an eminent jazz musician, a Jamaican athlete and a well-known Sydney restaurant.

Well, it didn't come up with an ordered list of articles ranked in some way, and there were a few funnies, like some aquatic plant websites that have nothing at all to do with me - or anyone else with my surname, as far as I can tell. Not a lot of page ranking going on.

So then I searched for "medieval underpants". There was method in my madness - at the Leeds medieval conference earlier this month there was a paper on the role the increased use of underwear played in the rise of literacy in later medieval Europe. (If you don't believe me, check out this report from the Guardian.)

This is the sort of story that news editors like as filler for light relief - loony historians with a knicker fascination etc - and as such should have been picked up on half a dozen news sites, which is exactly what you find with Google.

Cuil doesn't do this - the only reference was to a Chinese web site that refused to load.

Now I may be doing it a disservice, but for the moment I'd describe it as 'pretty but fairly useless'. Google remains a much better search engine.

Monday, 28 July 2008

Content re-use and refactoring

Now I know everyone feeds off each other, steals and refactors content, etc, etc, but I received a spurious comment from Technosnack. In part it reads:

I'm Susan, of the TechnoSnack's team and I wish to inform you that we are opening a new blog aggregator about Computers & Internet news. We put it on-line some hours ago and the link is: The main objective of this project is creation of a "virtual dashboard" of posts coming from many specialized blog and information about Computers & Internet world, with news about Linux, Windows, Mac, Open sources, Security, Graphics, Symbian and more on...
The key feature is that news come directly from blogosphere. We wish to show a preview of posts, with a link "Read more..." to signed blogs. If users are interested in news, they are redirected to your blog and can read entire post directly from your blog! So, the different signed blogs can increase their visibility and reach more visitors, all over the world! We think that in a little of time it can send more visitors to re gistered blogs, contributing to diffusion of know-how about Computer and Technology world. I visited your blog and I think it has very interesting and useful posts! So, are you interested in this idea, with your blog? If yes, then you can register your blog, using the specific "Registration Form"! REGISTRATION IS ABSOLUTELY FREE! The only thing we ask to you is to insert TechnoSNACK banner in your blog to promote this project. Or, if you prefer, you can insert a link in your blogroll. If you like (we whould be happy, but it is not mandatory :-), you can write a post regarding TechnoSNACK project in your blog, to promote this idea.

Which sounds fair enough. Flattering even. Until you think about it.

They want to take my content, re use it, sell advertising, make money, and at the same time act like they're doing me a favour.

Over the years I've had content ripped off - even had my resume stolen as an example of how to write a resume - to the extent that I don't care anymore. I've come to assume that everything published here is effectively public domain and written for my own purposes, and that if the content is reused the source will be acknowledged. I know people working for vendors I deal with read this and sometimes use the material for internal purposes, but by and large everyone has stuck to the rules as regards public re-use.

So I'm not signing up with Technosnack. If the content is that interesting they can harvest it and acknowledge the source, and even send me a few dollars if they want to. Search engines and page ranks will do the rest.

Thursday, 24 July 2008

Medieval warfare on the grid

Anyone who reads this regularly will know that, along with the computer-related stuff, there's a long-standing fascination for late Roman and Byzantine history, plus the very early medieval post-Roman world.

A part of me has always wanted to find a way to integrate the two, but never has. Such is life. However, a former colleague has managed to get himself on to the management board for a really interesting project trying to simulate the logistics of the Battle of Manzikert, using grid-based computing to run scenarios as to how you get people, horses and the rest to a particular location.

And this is a very interesting problem. In the early Roman empire period the army walked carrying their own gear and there was a system of granaries to stockpile grains and pulses. (There's a reconstruction at Lunt Roman Fort for the interested).

These stockpiles were sometimes used to help feed the populace during times of famine.

In the later Roman empire things were different. Large parts of the army rode, and horses need feed, and there were large baggage trains. It's been argued that one of the reasons the empire in the west came apart is that after the loss of North Africa, the tax base was too small to support the costs of acquiring grain to keep the army in the field.

And so Manzikert. Big heavy horse-based army with a great need for logistic support meets a horde of scruffy Turkic nomads who in the main lived off the land. Big heavy horse army loses. The result is the Turks get the Anatolian plateau, which can support their horses - grassland - and Byzantium loses an area which could grow grain to feed the city and the army.
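You can get a feel for why this needs scenario modelling with a bit of arithmetic. The sketch below is a toy model only - the ration sizes, army numbers and marching rate are my own illustrative assumptions, nothing to do with the actual project:

```python
# Toy model of field-army logistics: how long does carried grain last,
# and how far can the army march on it?
# All figures (rations, numbers, marching rate) are illustrative
# assumptions, not historical data.

def days_supported(men, horses, grain_carried_kg,
                   kg_per_man=1.4, kg_per_horse=4.5):
    """Days the carried grain lasts at the assumed daily rations."""
    daily_kg = men * kg_per_man + horses * kg_per_horse
    return grain_carried_kg / daily_kg

def max_range_km(men, horses, grain_carried_kg, km_per_day=25):
    """One-way range if the army must carry all its own supplies
    (ignoring the baggage train's own considerable consumption)."""
    return days_supported(men, horses, grain_carried_kg) * km_per_day

# A hypothetical cavalry-heavy army of 40,000 men and 20,000 horses
# carrying 2,000 tonnes of grain and fodder:
print(f"{days_supported(40_000, 20_000, 2_000_000):.1f} days of supply")
print(f"{max_range_km(40_000, 20_000, 2_000_000):.0f} km one-way range")
```

Even with these made-up numbers the army eats through getting on for 150 tonnes of grain and fodder a day, and can only march a couple of weeks on what it carries - which is why getting people, horses and the rest to somewhere like Manzikert is a genuinely hard problem to model.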

I'm envious ...

Wednesday, 23 July 2008

The CherryPal cloud pc ...

CherryPal have released a $249 computer that is designed to work with the cloud - little or no local storage, with most data stored elsewhere in google docs, zoho, flickr, or some S3-based storage service, but with some local wordprocessing capability.

I've been playing with an old PPC imac working in a similar mode - slow, low on memory and disk, but it works well enough as a web terminal to google docs, zoho and gmail, plus abiword for local document creation. It's all I need and use 99% of the time. And since I replaced the window manager with xfce it's been fine - in fact it's been surprising how useful it is.

Now such a stripped-down environment is getting damned close to a thin client - not much locally, but a lot of infrastructure behind. After all, if you can do it in a browser you can do it on anything, which means what is effectively a thin client for the web 2.0 web. Yet, as we know, because the responsiveness of remote services can vary as a function of load, it's good to provide a local wordprocessor/editor to give that good close-coupled response that even AJAX-based solutions don't give all the time, and synchronous editors like htmlarea or Xinha definitely don't give.

Providing cloud access plus local capability as a backup for when the internet is flaky, slow or just plain gone away is a good trick. I sometimes write my posts using kwrite running locally on my machine and cut and paste them into the blog editor - exactly the same way that 15 or so years ago I used to sit at home with my Mac Classic with its 9" b+w screen, write reports using microemacs, and then cut and paste them into a terminal window running a VMS edit/edt session on the VAX8650 at work.

Not because one was more sophisticated than the other, but because one was more responsive than the other ...

Wednesday, 16 July 2008

usenet news - rss for the 80s

I've been thinking about why usenet died while blogging seems successful:

Blogging has

  • tight control of authorship
  • inherent facilities to control comments and spurious posts

usenet has

  • a distributed architecture with no authorship control

which of course makes usenet news incredibly robust and difficult to (a) censor and (b) rid of spurious posts. Blogging, while seemingly as anarchic as usenet, does have tight individual control, and of course there are a lot more blogs than usenet groups - so people researching the wearing of socks in the later western roman empire create a blog rather than posting to a group, and the people who read the blog are only other researchers interested in the area, and possibly the odd sock fetishist - ie the potential audience is smaller.

The only question is why the loonies continue to post to usenet news - haven't they noticed that usage has dropped way off for the text groups?

Tuesday, 15 July 2008

usenet news

A few weeks ago we took the decision at work to kill our usenet news service. Given it had about six active users, we could no longer justify running it.

Now I hadn't read usenet news for about eek years, but I had to become a transient expert in it for a few minutes, finding those six active users free-to-air newsfeeds they could use to read news - which meant building and installing newsreader software to test the feeds and make sure that, among all the dross, they carried the requisite groups.
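For what it's worth, the 'do they carry the requisite groups' part of the exercise is easy to script once you have a server's group list. A minimal sketch in Python, with made-up group lists standing in for a real feed:

```python
# Check whether a news server carries the groups our readers need.
# The group lists below are made-up examples; in practice `available`
# would come from the server's response to an NNTP LIST command.

def missing_groups(available, required):
    """Return the required groups the server doesn't carry, sorted."""
    return sorted(set(required) - set(available))

available = ["sci.archaeology", "soc.history.medieval", "alt.assorted.dross"]
required = ["sci.archaeology", "soc.history.medieval", "comp.os.vms"]

gaps = missing_groups(available, required)
if gaps:
    print("server is missing:", ", ".join(gaps))
else:
    print("server carries everything we need")
```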

As I say, an instant sixty-second expert. But I also rediscovered the addiction. I've started reading sci.archaeology and soc.history.medieval, as the ratio of odd quirky posts to dreck is just good enough to be interesting.

Usenet news's demise is a sorry tale of how public fora can be hijacked by both spammers and loonies. Blogs allow communication too, with the ability to add comments via comment features and the like, and with authentication if required - although things like OpenID don't really provide much of a barrier.

So why do blogs work where usenet news didn't? Probably control by a single author or group of authors who have more engagement with keeping things right - unfortunately, shared facilities are prone to abuse ...

Monday, 14 July 2008

No more StarStuff ...

One of my favourite radio shows, ABC NewsRadio's StarStuff, is no more - cancelled by the program schedulers.

While I have no knowledge of the precise reasons for cancelling the program, other than that it was felt more effective to use the funding elsewhere, I think it's a great shame. StarStuff was unique as a weekly science and astronomy program with a decent science news update.

With the ending of the New Scientist weekly podcast last year we seem basically to be left with Diffusion and the Guardian science podcast as providers of digestible 30 minute chunks of science news - ideal for listening to on the bus or over lunch.

Maybe it's me, but I do feel it's a shame that no well-endowed public broadcaster funds a comparable show.

Failed upgrades and linux on the desktop ...

I have a very old PPC imac which runs Ubuntu 6.06. I originally built it as an experiment, but strangely enough it turned out to be incredibly useful and now lives in the study, where it's used almost daily. It ended up having desk space in the study as the compact design meant I could stick it on my desk where the old Toshiba laptop was. The $83 linux machine languishes in the garage for the moment, as it doesn't fit comfortably on the desk, and winter's cold means it is relatively little used.

So back to the ppc imac. Slow, low on memory and disk, but it works well enough as a web terminal to google docs, zoho and gmail, plus abiword for local document creation. It's all I need and use 99% of the time. And since I replaced the window manager with xfce it's been fine. So much so that I decided I should upgrade it to a more recent version of Ubuntu for PPC, even if it's no longer a mainstream project.

I decided to go for an in-place upgrade using synaptic, so I downloaded the required CDs, burned them, and let synaptic do its thing on Friday night while we did our Friday night thing - which basically consists of making an anchovy pizza, opening a bottle of red wine, and some tv while we unwind and talk about the week past and what we want to do on the weekend.

Well, it almost worked. It did preserve everything, but it did something funny to the fonts such that all the menus were in a mixture of rectangles and Glagolitic.

Not good.

As the machine was data-free - even local documents usually end up being uploaded - I decided to blow it away and rebuild with a later version of Xubuntu. So I started it off, told it what it needed to know and went to bed.

Come the morning I had a machine with a successful build, or so I thought. I let the machine restart and up it came - as far as the splash screen, and then it crashed back to a boot prompt.


So, while I fed the cat and got the paper in, I downloaded and burned an earlier version - after some trouble this one started to install.


Nope - it built a base image but failed to install software properly and aborted.

Tried the Hardy Heron 21 April release candidate. This failed as well - great base image, but it couldn't find the software.

Now this sounds like I spent the morning doing this. 

I didn't. 

I read the paper, let the cat out, let the cat back in, made breakfast, drank coffee, did Saturday things. But by mid morning I wasn't any further forward. 

So I decided to revert to what I had before and rebuild with 6.06. Booted it up from the cd, told the installer some things, and let it do its thing while we went off to the market and had lunch out. Sure enough, when we came back I had a working bootable installation.

Left it going for an hour while it applied all its software updates, and then installed the extra software I wanted - basically abiword, pan, and kate/kwrite. So I then had a working ubuntu machine. All I had left to do to recreate what I had before was to roll it over to xfce with the xubuntu-desktop package. 

Well, we were going out that evening so I didn't do that right then, but waited until Sunday afternoon for the final act.

Now there's a reason for this little story. 

Fixing the problem didn't take a lot of time really - it fitted round life. But it does show that, slick though Ubuntu and its derivatives are, the upgrade procedures are a bit so-so. Most things work, but the documentation and debugging information isn't that good.

And what there is tends to be terse and cryptic. Not a problem now, but as Linux moves out into the mainstream it may increasingly become one. One of the things the commercial vendors have is very good online documentation and upgrade procedures.

Community efforts tend not to have such good or comprehensive testing, nor the documentation. And it shows when things don't work. Linux is a viable replacement for commercial OS's. And until this little episode I'd had no more difficulty with linux than with other OS's. 

But this little drama showed that what the commercial vendors do well is end-user documentation and support resources. And that's always something that needs to be factored in to any decision to put Linux out on the desktop.

Sunday, 6 July 2008

AbiWord mini-review

I've started using AbiWord as an alternative to OpenOffice on the old ppc imac I have running linux, and I'm quietly impressed at its capabilities as a lightweight word processor on an older, slower machine.

It works, it's responsive, it's fast and it can export in a lot of formats - what more could one need? And yet it's a fully featured word processor.

Apart from the help key on the mac keyboard becoming the insert/overwrite toggle, I've come across no real idiosyncrasies.

With open source, everyone focuses on OpenOffice as a Microsoft Office killer. AbiWord gets ignored, but actually it has all the functionality 90% of the populace need, yet it needs a much less meaty machine - ideal for small, low-powered nano portables.

Language creation

Last week I was on vacation on the north coast. During that time I finished reading Derek Bickerton's book "Bastard Tongues", about his research into creoles, ie synthetic contact languages such as those created by the former slave populations in Surinam.

The book's an entertaining read - part linguistics, part autobiography - but in it he lets slip that he once tried to get funding for an experiment studying creole creation by putting together a group of people from disparate backgrounds, without a common language, in an isolation experiment.

The experiment never got funded, primarily for ethical reasons. At the end of his book Bickerton hypothesises that you might see a 'found' version of the experiment in day care centres and schools where there are migrant kids from all over - like the east end of London, where there are primary schools whose kids speak eighty different languages.

Actually, I don't think you'd find such a found experiment, as the kids are bathed in a dominant language - English - encouraged to speak it, and go home to parents who want to learn English, even if in some cases, for cultural reasons, mum stays at home and never really learns it.

What is interesting though is the slang of disaffected youth such as in the housing projects around Paris where this argot of French, Arabic, Berber, Lingala and the rest has become a badge of defiance. Does this have the characteristics of a creole or is it something else?

I suspect it's probably fairly creole like but I'm not enough of a linguist to do more than wave my hands and make vaguely sensible comments ...