We’re moving from PCs to mobile, he says, and this is rapidly changing the Internet. As of February 2012, 51% of Internet traffic is non-human, and 35 hours of video are uploaded to YouTube every minute. Traditionally we dealt with this kind of demand via data warehousing: put it all in one place for easy access. But that promise was never kept: we never really got it all in one place, accessible through one interface. Jeffrey Pollock says we should be talking not about data integration but about interoperability, because the latter implies a looser coupling.
He gives some use cases:
BBC wanted to have a Web presence for all of its 1,500 broadcasts per day. They couldn’t do it manually. So, they decided to grab data from the linked open data cloud and assemble the pages automatically. They hired fulltime editors to curate Wikipedia. RDF enabled them to assemble the pages.
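The idea behind this kind of automatic assembly can be sketched in a few lines. This is not the BBC’s actual stack or vocabulary — the URIs, property names, and data below are made up for illustration — but it shows the core RDF move: everything is a subject–predicate–object triple, so a page for any resource can be built by pattern-matching against a pool of triples gathered from many sources.

```python
# A toy RDF-style graph: just a set of (subject, predicate, object) triples.
# All identifiers here are illustrative, not real BBC or LOD-cloud URIs.
graph = {
    ("ex:EastEnders", "rdf:type", "ex:Broadcast"),
    ("ex:EastEnders", "ex:label", "EastEnders"),
    ("ex:EastEnders", "ex:description", "A long-running soap opera."),
    ("ex:Newsnight", "rdf:type", "ex:Broadcast"),
    ("ex:Newsnight", "ex:label", "Newsnight"),
}

def match(graph, s=None, p=None, o=None):
    """Return all triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in graph
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

def assemble_page(graph, subject):
    """Build a page (here, a dict) from whatever triples exist for a subject."""
    return {p: o for (_, p, o) in match(graph, s=subject)}

# Every resource typed as a broadcast gets a page, with no manual authoring:
for (s, _, _) in match(graph, p="rdf:type", o="ex:Broadcast"):
    page = assemble_page(graph, s)
```

The point is that the page template doesn’t need to know in advance which properties a broadcast has; it renders whatever triples turn up, which is what makes mixing data from external sources workable.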
O’Reilly Media switched to RDF reluctantly, for purely pragmatic reasons.
BestBuy did, too: they used RDFa to embed metadata in their pages and improve their SEO.
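For context, RDFa works by adding attributes to ordinary HTML, so the metadata rides along in the same page a human sees. A hypothetical product snippet might look like this — the vocabulary, property names, and values are illustrative, not BestBuy’s actual markup:

```html
<!-- Hypothetical RDFa fragment; vocabulary and values are made up. -->
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Example HD Television</span>
  <div property="offers" typeof="Offer">
    Price: $<span property="price">499.99</span>
    <meta property="priceCurrency" content="USD" />
  </div>
</div>
```

A crawler that understands RDFa can extract machine-readable triples (product, name, price) from this markup without any separate data feed.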
Elsevier uses Linked Data to manage their assets, from acquisition to delivery.
This is not science fiction, he says. It’s happening now.
Then two negative examples:
David says that Microsoft adopted RDF in the late 90s, but Netscape came out with a portal technology based on RDF that scared Microsoft out of the standards effort. Microsoft still needed the technology, though, so they have since reinvented it three times in proprietary ways.
Borders was too late in changing its tech.
Then he does a product pitch for Callimachus Enterprise, a content management system for enterprises.