WordPress is now set up on my new domain www.archaeogeek.com, so this will be my last post from this wordpress.com hosted blog. The next job is to export the existing posts to the new domain, after which the new feed should be available at http://www.archaeogeek.com/blog/?feed=rss (or rss2 etc., hopefully). I’m going to try to set up forwarding from the old address, but in the meantime, if you’d like to keep subscribing, that’s the way to go…
See you over in the new blog!
I’m going to be moving away from a wordpress.com hosted blog in the next few days. I’d like more flexibility with the layout, and I’ve been playing with some cool mapping that I would like to be able to show- what’s the point in blogging about it if I can’t actually let anyone see it!
So, this is a three step process. I’ve got to install wordpress on my shiny new http://www.archaeogeek.com domain, then import my existing posts into it. Then it’s a case of getting people to change their feed addresses, unless there’s a clever way of doing that…
In my spare time and lunch breaks I’m in the middle of a major project to update our site database. Without going into the gory details of how it ended up as three separate, totally unlinked databases, it is supposed to document the archaeological sites we’ve worked on since the 1970s, and to help with the administration of project archives, the location of finds within our finds store, and so on.
For a long time I’ve wanted to sort this out, pull everything into one place, display the site locations on a web-based GIS and so on. This is not rocket science, but the data had ended up in such a bad state that I couldn’t see past fixing that before getting to the good and fun stuff. Attempts a few years ago to use Mapserver as a mapping interface were abandoned because the positional data for the sites was wildly inaccurate (we seemed to do a lot of work in the Scilly Isles, which are as close to 0, 0 on the British National Grid as you can get), and connecting to Microsoft Access was quite difficult and unstable. If anyone using a Windows XP machine tried to open the database, my map would not display. None of those problems are Mapserver’s fault of course, but you can’t roll out a map that has inaccurate data on it, and might not always work.
So, after a few years of muttering about this, I have finally bitten the bullet. I am working, in stages, towards an integrated database for all of our finds and archive information, in a PostgreSQL database, with both an OpenOffice.org Base front-end for querying, and a web-based map interface. I’ll blog about each stage in this process, starting with integrating and data-cleansing in Microsoft Access.
The title of this post refers to the incredible feeling of achievement that I had when I got all this data together, followed by the difficulty I’m having moving that data into PostgreSQL. More later…
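To give a flavour of the data-cleansing stage, here’s a minimal sketch of the sort of sanity check I mean. It assumes the Access tables have been exported to CSV, and the column names (site_code, easting, northing) are my own invention for illustration; the near-origin test is exactly the “Scilly Isles” problem described above, since genuine British National Grid coordinates for our sites should be nowhere near 0, 0.

```python
import csv
import io

def clean_rows(csv_text):
    """Yield tidied rows, dropping any with implausible coordinates."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        # Normalise the site code: strip stray whitespace, force upper case
        site = row["site_code"].strip().upper()
        try:
            easting = float(row["easting"])
            northing = float(row["northing"])
        except ValueError:
            continue  # unparseable coordinates: set aside for manual checking
        # BNG eastings/northings near the (0, 0) origin put sites in the
        # sea off the Scilly Isles, so treat them as junk
        if easting < 1000 and northing < 1000:
            continue
        yield {"site_code": site, "easting": easting, "northing": northing}

# Hypothetical exported data: one good row, one "Scilly Isles" row
sample = """site_code,easting,northing
 ab01 ,354120,456780
XX02,0,0
"""
cleaned = list(clean_rows(sample))  # only the genuine site survives
```

The surviving rows could then be bulk-loaded into PostgreSQL with COPY, rather than fighting a live connection to Access.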
The science might be a little dodgy (No more methane at all? What about all that decaying plant matter?), but I think this image is great. You’ve got to wonder what alien archaeologists will make of our heavy metals and particularly stubborn plastics though, in a time as far ahead of now as we are ahead of the palaeolithic…
(Goes off in a fit of navel-gazing about the lack of permanence of anything…)
There have been a few heads-ups over the last week about 52°North, who have just announced an initiative for geospatial open source software. Of course, we’ve heard this before, but this time ESRI are on board.
So, we have Autodesk supporting OSGEO, and ESRI supporting 52North’s initiative (it needs a snappy title or acronym). The approach seems different though, because Autodesk chipped in right from the word go by open-sourcing one fork of MapGuide, which they freely admitted was not a core product. It appears however as if ESRI are not looking to open-source one of their existing products, but to help to develop new technologies, currently in the areas of Sensor Web Enablement (SWE), Security and Digital Rights Management (to quote from the press release).
Personally, I can’t decide if the big guys are squaring up for an opensource fight, or if we’re just seeing much more involvement in the opensource movement. Following on from the recent discussion about the use of opensource alternatives to ESRI’s main product lines, and further back, ESRI’s decision to include support for PostgreSQL as a database, hopefully this is all a sign of a gathering momentum. At first I thought this 52 North initiative was a bad thing- a dilution of focus, but maybe it’s just a sign that the snowball’s getting that bit bigger. More products, and some healthy competition has to be good, right?
About three weeks ago I decided to give my Sharp Zaurus a well-earned rest and try going back to a paper-based approach to project planning and time management. Well, I say “going back” but in all honesty I’ve never tried the paper-based approach, it’s simply that I have never managed to find exactly what I want in a PDA to-do list/calendar, and there is always that low-level worry of data loss and breakage.
I’d been interested in the Hipster PDA/DIY Paper Planner approach for a while, and decided to give it a go: the total investment (see below) was less than £20, so if it all went horribly wrong I wouldn’t have lost much! So, from Waterstones I purchased a Moleskine Pocket Memo Folder, from my local generic stationers I purchased some 3×5 inch blank index cards, from my local art store I bought a really nice propelling pencil and fine-nibbed biro, and then I downloaded the HipsterPDA DIY Planner template.
The Moleskine had too many pockets for the number of cards I wanted initially. To get around this, I cut some of the card dividers out (sorry Moleskine). There is no loss of integrity with this approach because the dividers are separate card inserts, stuck at the sides but not at the bottom. This left me with three large pockets, each of which held a reasonable-sized stack of paper/card (see next point).
Index cards seemed too thick for what I wanted, so I ended up using them for the items that needed to be robust and long-lasting, such as the calendar and important contact details. The rest I printed out on A4 paper, and spent a happy hour or so cutting the individual cards out with a craft knife. Initially I printed way more cards than I needed, and didn’t print on both sides, but when I get around to reprinting I will use both sides (and only print the sections I want).
After a few weeks:
I think I like it!
I find it easier to write ideas down freely on paper rather than on a PDA, even using a sketchpad programme, so I am tending to write down far more “speculative” ideas than I would with my Zaurus.
There are some great day-planner templates, of the type recommended in Time Management for System Administrators. I’m trying to get into the habit (after reading that book) of arriving at work a few minutes early and planning my day before I even check my email. The only thing I allow myself to do first is change the data backup tape in case I forget later on. The day-planner is a great tool for combining a daily todo list and time planner (hence the name, I guess) and it really works for me. On one side is a list for tasks and spaces to assign priorities (and a nice tick box to check when it’s completed, which is always a bonus) and on the other is the work day laid out in hours, so it’s a small job to map out roughly how the day should pan out.
Other useful templates are the project planner, agenda, notes (obviously), shopping list and weekly time-tracker. The weekly time-tracker in particular translates very well at the end of the week to my work time-sheet. I haven’t really used any of the others, and consequently when I refill my Moleskine I probably won’t include them.
Is it the be-all and end-all?
Not sure. I find the 3×5 cards slightly too small (and the bigger ones too big), although this may be mitigated by printing on both sides of the page and only including the templates that I really need. I also like having my todo list and calendar integrated, and that’s more of an effort to do on paper than digitally, or online.
Is it the end for my zaurus?
Absolutely not! This frees up my zaurus for other things, like experimenting with other operating systems or software (and filling it up with ebooks and mp3s). It also saves me from the worry of having to ensure everything is backed up, particularly if I am playing around with other packages.
In conclusion, I’m still trying to find the best approach for time/project management and todo lists but this does give me some flexibility and peace of mind that my previous pda-based approach did not. We’ll see…
I was at a workshop today in York, run by the Archaeological Data Service, entitled XML for archaeologists: Beyond the Hype. I went along because I felt that I didn’t really understand how XML works, and have to say I was very pleasantly surprised because I absolutely loved the workshop and came away feeling that I’d made the conceptual link that I needed in order to understand XML.
My problem was that I was confused by the idea of a “language”. I understood that in XML you defined your own markup tags, and that somewhere there was a schema that explained what those tags actually meant. My sticking point was that I couldn’t figure out how you explained what something was, without relying on a whole bunch of other things that you’d need to explain as well. How would you explain to a computer what an elephant, or should I say an <elephant>, is?
What I now see, or what makes sense to me (i.e. not necessarily the truth but good enough for me to work with), is that it’s actually more like a grammar than a language. It defines objects, but only in terms of rules and relationships. In a spoken or written language, that would be like knowing how to conjugate a verb or use it in a sentence without ever knowing what that verb meant. In XML, you don’t have to explain what an <elephant> is, but you can define that it has a <trunk>, four <legs>, two <ears> and so on. The computer doesn’t have to understand what any of those things actually are, but it understands the relationship between them, because you’ve defined that in your schema. Once I made that connection (sorry to those that think I’m terribly slow) I really enjoyed the workshop.
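The grammar-not-language idea can be sketched in a few lines of Python with the standard xml.etree module (the element names are just my illustration, not a real schema). Notice that the check never touches the *meaning* of elephant, trunk or leg; it only verifies the structural rules that a schema would encode.

```python
import xml.etree.ElementTree as ET

# A document using my made-up elephant markup
doc = """
<elephant>
  <trunk/>
  <leg/><leg/><leg/><leg/>
  <ear/><ear/>
</elephant>
"""

root = ET.fromstring(doc)

def is_valid_elephant(elem):
    """True if the element obeys the 'grammar': one trunk, four legs, two ears."""
    return (elem.tag == "elephant"
            and elem.find("trunk") is not None
            and len(elem.findall("leg")) == 4
            and len(elem.findall("ear")) == 2)

valid = is_valid_elephant(root)  # structurally sound, meaning irrelevant
```

A real schema language (a DTD or XML Schema) expresses exactly these kinds of rules declaratively, so a generic validator can enforce them without any code.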
What also became clear, is that archaeologists, and in fact anyone who has to classify objects and assign behaviours to them, fundamentally understand XML even if they don’t realise that they do. The key is to try and persuade people to codify those objects and their behaviours, and to get them all to use the same language/schema to describe them.
There are some schemas available in archaeology- in the UK the best known is MIDAS XML but there’s also ArchaeoML. Some people have argued that they don’t go far enough, partly because there isn’t enough formalisation of terms of reference such as for colours. To paraphrase a colleague’s analogy, the colour “black” has a completely different meaning if you are in greyscale, monochrome or full colour. However, I can’t believe that these issues haven’t been considered before, and in fact most can be avoided by ensuring that everyone uses and understands a common vocabulary. MIDAS XML, which is the schema I am most familiar with, comes with a set of thesauri and word lists that should help get around this issue, and if in doubt there are much larger models that we can call upon.
One of the most interesting presentations that I attended today was about the TEI- or Text Encoding Initiative, which aims to provide a toolkit for encoding literary texts and other information for online use. Pretty much everybody present at the talk could immediately see a use for this in dealing with archaeological grey literature. This is the huge mass of unpublished reports that commercial archaeological units produce each year. It is a notoriously difficult resource to quantify, let alone search, as most units have neither the time nor the money to make this information available in any sensible form. Currently the job of trying to quantify this resource falls to the Archaeological Investigations Project, based at Bournemouth University. I can quite understand why the AIP need to exist, but currently their main method of data collection is to send people to visit units, read all of their reports for a year and type the data into a database. They need to do this because they can then have a consistent method of recording the pertinent details about a site, and what was found, separately from the remit of the unit, which is to fulfil the terms of the archaeological brief. If units could be persuaded to adopt a common schema and methodology for marking up their reports as the TEI has demonstrated, then surely this need to actually go and visit every unit in the country could be avoided? After all, broadband is a lot cheaper than train tickets! Marking up content does take time and would add to the cost of each project, but (presumably) this could be offset somehow by the cost savings on the AIP project for English Heritage.
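To make the harvesting idea concrete, here’s a small sketch. The element and site names are entirely invented for illustration (a real scheme would use an agreed schema such as MIDAS XML or TEI), but it shows the principle: once reports are marked up consistently, the pertinent details can be pulled out by a script instead of being re-keyed on a site visit.

```python
import xml.etree.ElementTree as ET

# Two hypothetical marked-up grey-literature reports (invented markup)
reports = [
    "<report><site>Hungate</site><period>Medieval</period></report>",
    "<report><site>Heslington</site><period>Roman</period></report>",
]

def harvest(xml_reports):
    """Extract the site name and period from each marked-up report."""
    records = []
    for text in xml_reports:
        root = ET.fromstring(text)
        records.append({
            "site": root.findtext("site"),
            "period": root.findtext("period"),
        })
    return records

records = harvest(reports)  # one consistent record per report, no train tickets
```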
None of this is rocket science, I’m well aware of that, it’s just getting enough people on board and coming up with a way forward that we’re all happy with…
At FOSS4G last week, my colleagues and I got chatting with the folks from OSGEO. It was difficult not to, given that they played such a huge part in organising the conference. Anyhow, we identified that it would be a good idea to set up a UK Local Chapter, to provide a UK-specific focus and slant on the work that OSGEO are doing. The kinds of things we might look at include providing a first port-of-call for newcomers to the world of geomatics in the UK, with a particular focus on the open source tools available, and providing a focus for lobbying for public access to Geodata (you know, the stuff we’ve paid for with our taxes but have to pay again to use).
What we need at the moment is expressions of interest. Enough signatures will convince the board that such a chapter would be worth setting up. We only have a few names at the moment, mainly because we only started canvassing this week, so if you feel you could sign up then pop on over to the wiki and add your name. For more information on local chapters, here’s the place to look.
What we are also trying to do is come up with a manifesto for the group. If you have something to contribute to this, then please feel free! That’s what wikis are for, after all…
If you’re into archaeology, wherever you are based, and are interested in Open Source applications or Open Standards in archaeology, then we’re also investigating the level of interest in an Archaeology Special Interest Group. Again, at the moment we just need expressions of interest and ideas.
Back from Switzerland after the FOSS4G conference, and a weekend in Geneva. Whew! Geneva would perhaps have been more enjoyable if our hotel wasn’t on a street having an all-weekend party, complete with blaring music (Pink Floyd and Reggae mix one night, Slipknot or similar the next). Anyhow, we got about- went out to CERN and visited the United Nations, and even took in a little archaeology at St Peter’s Cathedral.
The final sessions of FOSS4G have been pretty well commented on elsewhere, and I don’t have much to add, except to offer my congratulations to Markus Neteler of GRASS for winning the Sol Katz award, and to say that the whole conference was extremely interesting, inspiring and enjoyable.
I came across some great examples of what we perhaps have to call Neogeography today, from the beautiful and slightly scary Information Aesthetics. They are all examples of using maps to display statistical information, and prove what all Geomatics-types know: maps are a very effective way of getting statistical information across with maximum impact, and they can be visually attractive too.