Sean McGrath points to an interesting piece by John Sowa (he of conceptual graphs fame) on The Law of Standards. Sowa's hypothesis is that successful standards are codifications of existing practice, whereas unsuccessful standards are by and large attempts to create new technology. He illustrates the point with examples from programming language design, such as the complexity of Ada's design driving the widespread adoption of C.
I'm not sure I entirely agree with the characterisation. Firstly, there are successful new standards for novel technologies - consider the CD-ROM and DVD formats, for example. A more significant difference, it seems to me, is that with the successful standards it is obvious what they are for. Nor does ratifying an existing practice always make a decent standard - by the time ISO had finished standardising Prolog, the developer community had by and large forgotten that the language existed.
However, if we take as a starting point that complex technical standards without a really obvious application are often replaced by a simpler de facto standard, it raises the question of what, if anything, will do this for the semantic web. We see a roughly bimodal distribution of users on the Jena list: some just want really simple taxonomies and properties, while others want complex DL reasoning with persistent data, or else want to do funky things with all the freedom inherent in OWL Full. Sowa claims that the complexity of OS/2 drove the adoption of the Windows API. The other school of thought is that Microsoft have always tried hardest to woo developers with the best tools. So, maybe the answer to the complexity of the semantic web is even more, even better semantic web tools.