Monday, 22 January 2007
final version of orage synchronisation script ...
and the final version of the orage synchronisation script is:
#!/bin/sh
# create an empty download target and note the time of this attempt in the log
touch ~/calendar/basic.ics
date >> ~/calendar/google_download.log
# keep retrying until we have a non-empty calendar file
# (wget occasionally gets a spurious 404 and writes a zero length file)
while test ! -s ~/calendar/basic.ics
do
    wget -rK -nH http://www.google.com/calendar/ical/your_calendar_path/basic.ics -O ~/calendar/basic.ics -a ~/calendar/google_download.log
    sleep 30
done
# only replace the orage calendar once we have a non-empty download
if test -s ~/calendar/basic.ics
then
    mv ~/.config/xfce4/orage/orage.ics ~/.config/xfce4/orage/orage_old.ics
    mv ~/calendar/basic.ics ~/.config/xfce4/orage/orage.ics
fi
Elegant it is not, but it works.
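To run it unattended, a rough sketch would be something like this (assuming the script is saved as ~/calendar/sync_google_calendar.sh - the filename is just an example):
# make the script executable
chmod +x ~/calendar/sync_google_calendar.sh
# then add a line like this with crontab -e to run it daily at 7am
0 7 * * * $HOME/calendar/sync_google_calendar.sh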
Thursday, 18 January 2007
orage synchronisation (take #3)
Orage synchronisation now works (90% of the time anyway). Two refinements would be to check that the file downloaded from google is > 0 bytes in size (wget occasionally connects but gets a spurious 404 error and writes a zero byte length output file) before replacing the old file with the new one, and to check whether orage is running, stopping and restarting it if it is.
The first part is easy - modify the script to read something like
wget -nH -rK your_calendar_file_url -O ~/calendar/google.ics -a ~/calendar/google.log
# only swap the files in if the download is non-empty
if test -s ~/calendar/google.ics
then
    mv ~/.config/xfce4/orage/orage.ics ~/.config/xfce4/orage/orage.ics.old
    mv ~/calendar/google.ics ~/.config/xfce4/orage/orage.ics
fi
Assuming you run this often enough, the odd failed download shouldn't be a problem; otherwise you'd need to add some logic to retry the download.
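The second part I haven't tried yet, but a rough sketch - assuming pgrep and pkill are available and that orage is happy to be killed and relaunched - might look like:
# untested sketch: stop orage (if it's running) before swapping the files,
# then restart it so it picks up the new orage.ics
ORAGE_WAS_RUNNING=no
if pgrep -x orage > /dev/null
then
    ORAGE_WAS_RUNNING=yes
    pkill -x orage
    sleep 2
fi
# ... swap the calendar files in here as above ...
if test "$ORAGE_WAS_RUNNING" = "yes"
then
    DISPLAY=:0 orage &   # run from cron you'd need DISPLAY set - adjust to suit
fi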
Wednesday, 17 January 2007
Orage synchronisation take #2
My Orage synchronisation bodge-up didn't really work as Orage doesn't remember which archive file it last had between sessions, so here's take 2:
wget -nH -rK your_calendar_file_url -O ~/calendar/google.ics -a ~/calendar/google.log
mv ~/.config/xfce4/orage/orage.ics ~/.config/xfce4/orage/orage.ics.old
mv ~/calendar/google.ics ~/.config/xfce4/orage/orage.ics
Each line is of course a single line of code. Basically what the script does is:
1) uses wget to retrieve the google calendar file and save it in the directory ~/calendar as google.ics. It appends a log of the transaction to ~/calendar/google.log. The directory ~/calendar must exist and you need to create google.log with the touch command (see the setup sketch after this list).
[If you do not want to append to the log file use -o in place of -a]
2) rename the existing orage.ics file to something else
3) move your downloaded google calendar to orage.ics
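For completeness, the one-off setup amounts to something like this (the directory name is just the one I happened to use):
# create the working directory and an empty log file for wget to append to
mkdir -p ~/calendar
touch ~/calendar/google.log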
This seems to work, though there seemed to be a delay before orage picked up the new ics file. My testing has been fairly minimal so far, so it might still be somewhat sub-optimal.
I'll see how it goes.
Tuesday, 16 January 2007
syncing orage with google calendar
This really came out of playing with Xubuntu on Parallels, the virtual machine software for the Mac. Xubuntu is a lightweight variant of Ubuntu built around the Xfce desktop and comes with orage as a basic calendar application. The question is how to sync it with google calendar - or more accurately how to set it up to provide an offline calendar service, so that you can have a recent calendar available (I'm so taken with Xubuntu I'm thinking about using it on an old machine that's not online all the time).
The trick actually is incredibly simple - google gives you the unique url of your calendar. This is an ics format file. Orage reads ics format files as archive files.
All you really need to do is run a
wget -nH -rK your_calendar_url
as a cron job and hey presto, you have a copy of your calendar that's as recent as your cron schedule allows.
Of course calendar files are kind of big, so you probably wouldn't want to do this too often over a slow link, but even so you could probably run it once a day even over dialup.
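As a sketch, the crontab entry (added with crontab -e) might be something like this, using the same placeholder for the calendar url:
# fetch the calendar once a day at 6am
0 6 * * * wget -nH -rK your_calendar_url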
Parallels ...
I've just been playing with Parallels, the virtual desktop software for the Macintosh, and it looks good. I built Xubuntu, the low demand version of Ubuntu, inside of it; the installation process was exactly the same (except faster) as building it on a real (old) pc, and so far it just works.
Totally odd to have a linux desktop inside a Mac window, but there you go ...
It could certainly be a solution to having both a Mac and a linux machine at home - something to play with further.
Wednesday, 10 January 2007
And another one bites the dust ...
BritishLibrary.Net, an initiative started by the British Library back in 1999 to provide a self-financing dialup service for scholars and anyone else who wanted an account, is closing at the end of March.
The service started during the 'free' dialup boom in the UK, with the intention of providing a free basic connection for scholars who wanted one and, by implication, an email address with a bit of class. In reality it was a virtual ISP reselling Easynet services, but logo'd by the BL.
My involvement came through York University. When we closed our dialup service, we needed to be able to offer an alternative. We settled on BritishLibrary.Net because it was free and reliable, and because (a) they would give us CDs to redistribute, and (b) they had good information for Mac, Linux and Windows users who wanted to set themselves up without the CD.
Well that was then, this is now. Broadband has basically killed dialup and they're not making the revenue to cover costs, so they're closing. Simple as that.
Being ethical I mailed former colleagues at York about the closure - I don't know what they've been doing about recommending dialup providers since I left, or even if they do, but I thought I'd better let them know in case they're still handing out instructions, or worse still have the information sitting in a collection of online guides somewhere.
Other than that, not much. I used wget some months ago to get back my web pages, and I haven't used my email account with them for years; it gets almost no traffic, suggesting that it's dropped out of everything, including the spammers' lists. The only thought I have is that the 'use down a wet hole' powerbook I got rid of last Sunday probably still has the BL dialup and email settings on it. Not much of a hazard - I don't see anyone using the UK dialup settings from Australia, and as for the email, there's nothing there. The machine is too old to become a spam bot, so not a problem.
Anyway I don't see anyone picking up my old powerbook - too old and I bet it ends up in landfill ...
Tuesday, 9 January 2007
Google Docs as an editor (but not a blog publisher)
It's good, but nine times out of ten I can't post directly from google docs ('invalid url for my blog' is the error message). The API seems incredibly flaky, yet I can post from flickr no problem.
Something ain't right ...
Getting rid of old machines ...
Sunday was a sad day. Actually it wasn't, but it kind of closed the book on retro computing. We had a couple of old laptops: Judi's old Windows 95 Toshiba that was going to have Damn Small Linux installed on it and never did, and an old but working Apple powerbook 520.
The Toshiba was old; the battery was shot but the machine still powered up, though it was clearly surplus to requirements. Wiped the disk with DBAN, didn't reinstall an OS (and forgot to mention this to the recyclers at Revolve - naughty!) and that was it. Something that cost $3000 eight years ago is converted to a doorstop. The Targus carry case that came with it is probably worth more.
The powerbook was something else. And it has a sort of history attached to it. Long long ago I did some work with archaeologists about taking computers into the field, and typically down wet holes and places that are generally unfriendly to hardware. This eventually turned into me doing an ad hoc presentation at a field archaeology conference in the early nineties.
It also turned into an interest in computing in 'rough' places that you might well travel to but where things were a bit iffy - combined with my interest in travel to exotic places.
About the time I discovered the palm pilot as the ideal note taker when away from base - something I still think holds - I came up with the idea that retro computing might be an answer to the computers and wet muddy holes problem.
The idea behind retro computing is that you use an old, low value computer with a modem (or even back up to floppy disk) for data capture, on the basis that if it gets lost or stolen there's no real loss. Hunting about, I came up with the Powerbook 520 as the ideal device. Slow and heavy it may have been, but with a mono (and hence low power) screen it had reasonable power consumption from its dual batteries, plus a modem and ethernet.
So I bought one from eBay, with the Apple ethernet to UTP gubbins, and tried it out. It was surprisingly good and, being a well made machine, reliable. The only two downsides were that the modem was only 14.4 - good enough for email and ftp but forget the web (mac 68k web browsers sucked anyway) - and that even on ethernet the web browsers weren't the best.
Still it taught me some valuable lessons about what is needed for productive retro and travel computing, and it doesn't just mean a laptop.
Then it sat in its case for three and a half years. I'd moved on and didn't have the same requirements for travel computing. So last week it was powered up (and it started first time), had all the documents wiped from it, and was powered down for the last time.
Pity. I almost felt sad. It had been a good machine that served its purpose well.
Monday, 8 January 2007
Science, blogs and web 2.0
Couple of interesting articles picked up by way of the New Scientist technology blog: one showing that instant messaging is increasingly important for brainstorming between researchers who are online, and one from Nature claiming to show that scientists are on the whole not that interested in blogging, for a number of reasons, including that blogs don't count for citation purposes and the lurking fear that someone might steal a march on your ideas as you are developing and discussing them, mainly because you can't control whom you communicate with.
This would be a concern for our development of a collaboration space based on Sakai, but it's really only designed to get rid of the tyranny of timezones and have people work on joint documents within a closed space, so it becomes an internal discussion forum for a closed research group, i.e. a way of exchanging and editing drafts of a paper rather than releasing works in progress to the world.
Friday, 5 January 2007
Khipu (or quipu)
Writing systems have fascinated me ever since I was a kid and used to make up pretend alphabets. It's this fascination that has led me to (a) learn Russian, and (b) spend time buggering around with installing non-latin fonts for people in the days when it didn't just happen automagically.
It's also got a lot to do with my interest in languages and user interface design.
I've known about the Inca coding system for some time but really didn't know much about it and thought that it was just a sophisticated tally system. Then I happened across a posting in Wired about a new model that suggests the system was really more complex than that and actually functioned as some sort of symbol based writing system. Doubly intriguing, as most other writing systems have followed the marks-on-a-surface model, be they scratched on ostraka, written on the shinbones of goats or even the good old black marks on white paper.
Couple of other references:
Article (originally from NY Times)
Khipu research site