New report commission: 100Gb Ethernet and the future of networking

Last week I attended the BCS Visions of Computer Science conference in London. We’ll be posting a more detailed look at the event later on, but I thought I’d just mention Vint Cerf’s speech as it relates to a TechWatch report we’ve just commissioned.

Vint Cerf is often referred to as the ‘father of the Internet’ (although he’s always quick to point out that he worked as part of a team and doesn’t deserve this moniker), thanks to his pioneering work on ARPANET and the protocols for the ‘network of networks’ idea that became the Internet. At the BCS conference he spoke about those days and the excitement of early experimentation. He also covered some of the issues that now face the Internet as the number of users passes the billion mark and the number of devices or ‘terminators’ attached to it passes two billion.

He outlined several areas where major developments are either under way or still needed:

• A new addressing standard, IPv6, which will be able to cope with the billions of mobile phones, PDAs and other devices that are rapidly being connected to the Net.

• Internationalisation of domain names to handle non-English and non-Latin character sets such as Chinese.

• A massive increase in the use of geo-spatial and location-based information, thanks to the growing use of mobile access to the Web.

• The preservation and longevity of Web-based materials. Vint warned that: “our century may well be invisible to historians”.

• The energy used to ‘drive’ the Internet, which is becoming a major concern, especially because of the large data centres operated by companies like Amazon, Google and Microsoft.

• The capacity of the Internet, which has been widely discussed in the press. With the widespread adoption of VoIP, Web 2.0 services and Web-based video (through the likes of YouTube), there has been growing concern that the Internet may not ‘cope’. Some commentators have referred to this as the “exabyte” Internet problem: how to build a network that can handle exabytes of information. One area of interest is the networking and routing equipment that forms part of the backbone of the Net.

The capacity issue was also the concern of the recent EARNEST foresight study into the future of the technologies used to build research and education networks, and TechWatch has now commissioned a report provisionally entitled 100Gb Ethernet and beyond: preparing for the exabyte Internet. More details on the BCS conference will follow over the next few days.
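
To give a rough sense of the scale involved, here is a back-of-the-envelope sketch of how long it would take to move a single exabyte over one 100 Gigabit Ethernet link. It assumes decimal units (1EB = 10^18 bytes) and a fully utilised link, purely for illustration:

    # Rough sense of scale for the 'exabyte Internet' figures above.
    # Assumptions: decimal units (1 EB = 10**18 bytes) and a single,
    # fully utilised 100 Gigabit Ethernet link -- an illustration only.

    exabyte_bits = 10**18 * 8            # one exabyte expressed in bits
    link_bits_per_second = 100 * 10**9   # 100 Gb/s

    seconds = exabyte_bits / link_bits_per_second
    days = seconds / (60 * 60 * 24)

    print(f"{seconds:.0f} seconds, i.e. roughly {days:.0f} days")
    # -> 80000000 seconds, i.e. roughly 926 days (about two and a half years)

In other words, a single 100Gb link would need around two and a half years to shift one exabyte, which is why the discussion centres on aggregate backbone capacity and routing equipment rather than any one connection.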

Open call: Data mash-ups and the future of mapping

In the Web 2.0 report we published last year (What is Web 2.0? Ideas, technologies and implications for education) we noted the importance of geo-spatial data in the development of mash-ups, and we’re now planning to return to this subject with a new open call.

The growing interest in the use of geospatial and geographical information in combination with Web-based information sources and services has been driven, at least in part, by the emergence of new and low-cost technologies such as high-spec digital cameras, handheld GPS location equipment and vehicular SatNav. Alongside this, increasingly popular Web-based mapping applications and 3-D mapping tools have supported the mash-up approach. On the social science research side there have been many developments in graphical visualisation and simulation using location-based data.
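
As a small illustration of the kind of data plumbing that sits behind such mash-ups, the sketch below packages a couple of location-tagged records as GeoJSON, a lightweight format that Web-based mapping tools can overlay directly on a base map. The place names and coordinates are invented for illustration:

    # A minimal sketch of the data side of a mapping mash-up: packaging
    # location-tagged records as GeoJSON so a Web mapping application can
    # plot them. The names and coordinates below are made up.
    import json

    records = [
        {"name": "Campus library", "lat": 52.2053, "lon": 0.1218},
        {"name": "Field survey point", "lat": 52.2100, "lon": 0.1300},
    ]

    feature_collection = {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                # GeoJSON coordinates are ordered longitude, latitude
                "geometry": {"type": "Point",
                             "coordinates": [r["lon"], r["lat"]]},
                "properties": {"name": r["name"]},
            }
            for r in records
        ],
    }

    with open("points.geojson", "w") as f:
        json.dump(feature_collection, f, indent=2)

The resulting file can be served alongside other Web content and combined with third-party base maps, which is essentially the mash-up pattern described above.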

After a number of discussions with experts across the education sector, it is clear that this will be a complex report to commission, but interested parties should take a look at the full open call on the TechWatch website.

Content Management Systems: why we’re not updating the TechWatch report

Back in September 2001 TechWatch published a report on Content Management Systems, written by Paul Browning and Mike Lowndes. It very quickly became one of the most popular TechWatch reports ever published, and it remains a steady favourite even today.

I can’t take any of the credit for the report as it was published before I took over TechWatch, but one of the first things people asked me for, back in 2004/2005, was an update to the CMS report, particularly the CMS product list. I did think about this, and even went as far as approaching the original authors to see if they were interested in a reunion. However, as I started to think about what the report would cover, I had a crisis of remit.

The most important factor for me was that this was no longer a ‘future’ issue. As well as being a very practical report, exploring the nitty-gritty of procuring a CMS, Paul and Mike’s report was also conceptual, outlining the bigger issues such as: “In reality a CMS is a concept rather than a product. It is a concept that embraces a set of processes”, and “… the boundaries of the CMS space are blurred. Substantial overlaps exist with document management systems, knowledge management systems, enterprise application integration systems, e-learning systems and portals”. In 2005 these concepts just hadn’t moved on far enough to warrant a new ‘future-facing’ report.

In addition, although there was a lot of demand for an update to the CMS product list, this isn’t within our remit. The purpose of the TechWatch reports is to start the ball rolling and hopefully stimulate interest/uptake elsewhere – we just don’t have the resources for ‘maintenance’.

However, this decision keeps coming back to haunt me, and when I was in Keele for the JISC Innovation Forum 2008 meeting last month, someone, once again, asked why there hadn’t been an update to the TechWatch CMS report. In fact, what this person was really saying was: “I want to procure a CMS but your report’s out of date”, and that’s really quite a different matter.

This got me thinking and revisiting the notes I made in 2005. My conclusion is that the answer is: “things have moved on a lot since then and perhaps you shouldn’t be buying a CMS”. I will elaborate on this in future blog items, but for now, suffice it to say that there is a paradigm shift brewing in institutional ICT provision, of which the institutional CMS is only one part. The future of the CMS is actually caught up in technological reinterpretations of the big concepts that Paul and Mike identified in 2001: processes rather than products, and blurring the boundaries between systems.

E-books: open standards déjà vu

TechWatch has recently been asked to contribute its thoughts on future technology developments that are likely to have the most significant impact on library and information services in higher education. It’s for Update, the journal of the Chartered Institute of Library & Information Professionals, and one of the interesting questions they’ve asked us is about developments that we didn’t initially anticipate or whose impact has been greater than might at first have been expected.

This is not actually a straightforward question – just because we don’t publish a report on something doesn’t mean we didn’t anticipate it – but it has prompted quite a bit of discussion. I think one of the things TechWatch may be in danger of missing is the whole e-reader development, which will present challenges in integrating e-books into academic library acquisition, discovery, and delivery systems.

At the moment there are three main devices squaring up for domination of the market: the Sony Reader (from Sony, of course), the iLiad (from iRex) and Kindle (from Amazon). One of the big issues for HE will be the document format standards used by each device.

Work is underway on an open, XML-based standard called EPUB through an organisation called the International Digital Publishing Forum. The other key standard is PDF, which is now an ISO standard. Sony’s reader supports PDF and the company has just announced that it will support EPUB in a forthcoming e-reader. By contrast, the Kindle only supports Amazon’s own formats, Mobipocket and AZW. It does not support Adobe’s PDF, although it provides an ‘experimental’ converter. The iLiad supports PDF and Mobipocket.
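
For a concrete sense of what ‘open, XML-based’ means here, the sketch below assembles a deliberately minimal EPUB 2 container in Python: a ZIP archive whose first entry declares the media type, with XML metadata pointing at XHTML content. It omits the NCX table of contents and other pieces a validator would insist on, and the title, identifier and filenames are invented for illustration:

    # A deliberately minimal sketch of the EPUB container format: a ZIP
    # archive wrapping XML metadata and XHTML content. Not a valid,
    # complete EPUB (no NCX, no titles in the XHTML) -- illustration only.
    import zipfile

    CONTAINER_XML = """<?xml version="1.0"?>
    <container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
      <rootfiles>
        <rootfile full-path="content.opf" media-type="application/oebps-package+xml"/>
      </rootfiles>
    </container>"""

    CONTENT_OPF = """<?xml version="1.0"?>
    <package version="2.0" xmlns="http://www.idpf.org/2007/opf" unique-identifier="bookid">
      <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:title>Example title</dc:title>
        <dc:language>en</dc:language>
        <dc:identifier id="bookid">example-0001</dc:identifier>
      </metadata>
      <manifest>
        <item id="chapter1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
      </manifest>
      <spine>
        <itemref idref="chapter1"/>
      </spine>
    </package>"""

    CHAPTER = """<?xml version="1.0"?>
    <html xmlns="http://www.w3.org/1999/xhtml"><body><p>Hello, e-reader.</p></body></html>"""

    with zipfile.ZipFile("example.epub", "w") as epub:
        # The 'mimetype' entry must come first and be stored uncompressed.
        epub.writestr("mimetype", "application/epub+zip",
                      compress_type=zipfile.ZIP_STORED)
        epub.writestr("META-INF/container.xml", CONTAINER_XML)
        epub.writestr("content.opf", CONTENT_OPF)
        epub.writestr("chapter1.xhtml", CHAPTER)

The point is that EPUB content is ordinary, inspectable XML and XHTML, whereas the proprietary formats used by the Kindle are not open to this kind of examination.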

There is more than a hint of déjà vu here. Last year TechWatch published a report on XML-based office document standards which focused on the arguments around open and proprietary standards and the difficulties that would be created by the imminent approval of a second ISO standard within the office document standards domain. You really need to read the report to get the picture, but my concern is that there may be another ODF/OOXML-type situation emerging, with Amazon taking on the role of Microsoft.

For those who would like to know more about e-readers there was a long piece in the Observer newspaper on the 27th of July, with an abridged version published online. A more technical look at matters relating to e-books, rather than the readers themselves, is provided by the team undertaking JISC’s own major investigation: the National e-books Observatory project. There is also an interesting paper, What Happened to the E-book Revolution?, by Lynn Silipigni Connaway and Heather L. Wicht, which looks at the history of e-books and some of the barriers to their widespread adoption.

Semantic Web Technologies – has their time come in education?

In 2005, TechWatch published a report on Semantic Web Technologies written by Brian Matthews who, at that time, was deputy manager of W3C’s UK office. As JISC has recently announced an open call for funding for a study on Semantic Web technologies in teaching and learning, we thought we’d use this opportunity to provide a bit of an update on Semantic Web developments.

The JISC call seeks to fund a study which looks at pragmatic aspects of the actual use of semantic applications in real-world scenarios. The successful applicant will review a number of case studies of real-world teaching and learning scenarios and look at the potential for use of semantic technologies. The key question they are asking is: “Can you convince us that semantic technologies offer one potential solution to some real problems?”

This is an interesting development. In 2005, the Semantic Web was still seen by many people as very much a computer science ‘Grand’ research project which would take years to reach fruition. Despite some very well worked out visions as to what it would deliver – a more automated Web in which some sense of ‘meaning’ or semantics had been imbued into data held within pages and their links – there were still plenty of practical doubts. Even in 2006, when Tim Berners-Lee and other researchers at Southampton provided an update (The Semantic Web Revisited) they admitted that: “this simple idea, however, remains largely unrealized” (page 96). However, they were optimistic, arguing that the key development was for standards that express meaning to become well established and they reported that this was “progressing steadily”.
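
As a small, concrete illustration of what ‘imbuing data with meaning’ looks like in practice, the sketch below builds a handful of RDF triples with the Python rdflib library and then queries them with SPARQL. The people and URIs are invented for illustration:

    # A small sketch of the Semantic Web's basic building block, the RDF
    # triple (subject, predicate, object), using the rdflib library.
    # The URIs and names are invented for illustration.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import FOAF, RDF

    EX = Namespace("http://example.org/people/")

    g = Graph()
    g.bind("foaf", FOAF)

    # Each statement is a (subject, predicate, object) triple.
    g.add((EX.alice, RDF.type, FOAF.Person))
    g.add((EX.alice, FOAF.name, Literal("Alice")))
    g.add((EX.alice, FOAF.knows, EX.bob))

    print(g.serialize(format="turtle"))

    # Because the 'meaning' is explicit, the data can be queried generically.
    for row in g.query(
            "SELECT ?name WHERE { ?person a foaf:Person ; foaf:name ?name }",
            initNs={"foaf": FOAF}):
        print(row.name)

Because the vocabulary (here FOAF) is shared and machine-readable, other sites and tools can link to, merge and query this data without any prior agreement with its publisher, which is the practical promise the report described.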

Despite this, there are still concerns, particularly from the business community, about the practical reality of putting the Semantic Web to work. Recent conferences such as Semantic Technologies have highlighted the question again: where are the practical examples? The Talis Semantic Web Gang have produced a useful round-up podcast reviewing their attendance at these conferences, in which they debate some of these issues. As they make clear, one of the key things to come out of these conferences is that venture capitalists and business development people are starting to ask quizzical questions: what exactly is the Semantic Web, what does it actually do for real users, and what is the ‘killer app’?

This reflects the original TechWatch report, which commented that: “people are still asking how they can be used in practical situations to solve real problems” (page 2). However, the report also concluded that higher education was likely to be at the forefront in the use of these technologies. Given that, the JISC open call seems particularly timely. There is an opportunity here for higher education to lead the way in making use of semantics with real users and I think it will be interesting to see the outcomes.

TechTracking: XML-based office document standards

Last summer TechWatch published a report on XML-based office document formats. The issue was one of standards: increasing pressure, particularly from the EU, was being brought to bear on public sector software procurement to buy only software that conforms to open standards. The argument is about how taxpayers’ money should be spent: in essence, members of the public should be able to access all electronic documents published by the public sector without being required to purchase a particular software product in order to view or edit them. The TechWatch report looked at the ongoing debate over the move away from the closed, proprietary document formats used by everyday applications such as word processing and spreadsheet software.

The problem was that although everyone agreed that XML was the way forward, there was considerable controversy and technical debate over two proposed XML-based formats: Open Document Format (ODF) and Office Open XML (OOXML). The former had already been formally ratified as an international standard by ISO, whereas the latter, promoted by ECMA and based heavily on work by Microsoft, was still undergoing a process of being ‘fast-tracked’ towards standardisation. The fact that a scenario was emerging where there would be two international standards was causing considerable consternation.
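
As a small, concrete illustration of why both formats are described as ‘XML-based’, the sketch below pulls the paragraph text out of an ODF and an OOXML word-processing file using nothing but the Python standard library; in each case the file is simply a ZIP archive wrapping XML. The filenames are placeholders:

    # A minimal sketch showing that ODF and OOXML documents are both ZIP
    # archives wrapping XML, so their text can be read with the standard
    # library alone. 'example.odt' and 'example.docx' are placeholders.
    import zipfile
    import xml.etree.ElementTree as ET

    ODF_TEXT = "{urn:oasis:names:tc:opendocument:xmlns:text:1.0}"
    OOXML_W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

    def odf_paragraphs(path):
        """Paragraph text from an OpenDocument text file (.odt)."""
        with zipfile.ZipFile(path) as z:
            root = ET.fromstring(z.read("content.xml"))
        return ["".join(p.itertext()) for p in root.iter(ODF_TEXT + "p")]

    def ooxml_paragraphs(path):
        """Paragraph text from an Office Open XML word-processing file (.docx)."""
        with zipfile.ZipFile(path) as z:
            root = ET.fromstring(z.read("word/document.xml"))
        return ["".join(t.text or "" for t in p.iter(OOXML_W + "t"))
                for p in root.iter(OOXML_W + "p")]

    if __name__ == "__main__":
        print(odf_paragraphs("example.odt"))
        print(ooxml_paragraphs("example.docx"))

The controversy was never about whether XML could be read in this way, but about which of the two competing schemas should carry the weight of an international standard.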

The report concluded at the time that it was extremely important for HE/FE to keep a watching brief on the development of these two standards and to start engaging with the debates. It also recommended that institutions should begin planning, pro-actively, for a switch to XML-based formats, arguing that:

“although the UK higher education sector has, for a long time, understood the interoperability benefits of open standards, it has been slow to translate this into easily understandable guidelines for implementation at the level of everyday applications such as office document formats. As far as higher education is concerned, the use of office document formats has now reached a watershed.”

In October 2007 Becta referred Microsoft to the Office of Fair Trading after talks had failed to secure the agreements needed in several key areas. In the interim, Becta’s advice to schools was that they should not move to Microsoft’s School Agreement subscription licensing model.

Since then things have moved on. Although ISO approved OOXML (a standard it refers to as ISO/IEC 29500) in April 2008, four countries (Brazil, South Africa, Venezuela and India) lodged appeals. This meant that OOXML was immediately ‘unratified’ pending the outcome of the appeals.

In addition to these formal appeals there have been a number of complaints about the standardisation process itself.

For example, in January 2008 the EU announced that it would investigate a number of suspected abuses of a dominant market position by Microsoft and that this would include the question as to whether: “Microsoft’s new file format Office Open XML, as implemented in Office, is sufficiently interoperable with competitors’ products”.

All in all, things are not looking good for OOXML. Also in January, Becta released formal advice to the effect that schools should not upgrade to Microsoft Vista or Office 2007 and that existing users of Office 2007 should not save in Microsoft’s OOXML format. In the short term, it recommended that existing Office 2007 users save files in the older .doc, .xls and .ppt formats.