Wherein the IngentaConnect Product Management, Engineering, and Sales Teams
ramble, rant, and generally sound off on topics of the day
 

Consortial Networks and Publishers: Partnering in a Sea of Competition

Monday, March 17, 2008

The Electronic Resources & Libraries conference begins tomorrow in Atlanta, GA. Now in its third year, the conference has sold out - no surprise, given a packed schedule with some strong speakers. Our own Jeff Downing (library relations manager) will take part in a panel discussion on the ways in which consortial networks can help libraries retain "market share" in an increasingly competitive landscape. Here's a précis of Jeff's paper (which we published in last week's eyetoeye newsletter).

It is no secret: libraries face daily and ever-increasing competition. Within this sea of competition, however, publishers and regionally based consortial networks are forging partnerships to develop creative, cost-effective, long-term business models for content delivery.

Where is the competition coming from?
Competition for traditional library services is coming from all directions, but most obviously from the web, where consumer information is widely available and in many cases freely accessible. Wikipedia, for all its faults, has become a destination reference resource, while other less well-branded sources of information are made easily discoverable by search services such as Google. Thus users are now able to self-serve much of the information that has historically been available only via the library or other paid services. But, of course, users are largely untrained in the skills of assessing found materials for authoritativeness, and in forgoing library assistance they risk not only missing out on valuable paid-for resources, but also basing their studies on incorrect data or ill-formed arguments. The convenience of internet research is increasingly substituting for the credible sources that the traditional library can provide.

What effect does this new competition have?
Historically, libraries have had the good fortune of being a monopoly; if you wanted access to information, especially authoritative information, you went to the library. Libraries had no competition and thus had no need to operate like a commercial business. As other resources become more prominent, libraries are having to re-envision and re-tool to operate in a more competitive environment. This is an attitudinal shift to which not all librarians are ready to adapt; the rigours of competition in a free market are not necessarily a welcome environment for those who have opted for an altruistic career assisting researchers in their information quest.

[Image: the end of Chain of Craters Road, where it meets the lava flow (Volcano National Park, Hawaii). Some people's reaction to the sea of competition?]

How can libraries reinforce their value in the information supply chain?
Researchers continue to need to access quality, peer-reviewed information, and in providing this the library is making itself an essential tool in the academic arsenal. Libraries should take advantage of regional networks like Amigos and Palinet that can help by promoting libraries as information providers and community leaders, and by facilitating sharing of resources and development of innovative services. Networks may also be able to negotiate discounts of which members can take advantage when purchasing scholarly content from publishers or aggregators.

If you are attending ER&L, be sure to attend this session in order to add your voice to the discussion. If you would like to arrange an appointment with Jeff Downing during the event, please contact jeff.downing@ingenta.com - or stop by the Ingenta table at the sponsors' reception tomorrow night.


posted by Charlie Rapple at 5:34 pm

 

Publishing Technology Trends: authoritative? What's that? And who says?

Friday, January 11, 2008

Remember December?
We held the first event in our new Publishing Technology Trends seminar series. The venue was Shakespeare's Globe theatre, on the banks of the river Thames in London. The session we held there was designed to communicate the latest developments in information industry technology to selected publishing industry executives. I made copious notes, which I will share with you in a series of postings over the next few days. First up is a review of the session by our own Chief Technology Officer, Leigh Dodds, entitled "Authoritative? What's that? And who says?"




Sated by lunch or fascinated by the topic? For whatever reason, you could have heard a pin drop among our audience as Leigh Dodds reviewed the ways in which we ascribe authority to content, explored the potential for crossover between traditional peer review and emerging Web 2.0 systems, and considered whether we can make processes more visible to end users.

The massive amounts of information available both through conventional publishing channels and on the web make it difficult for users to find reliable information. Particularly disturbing for publishers is that users are, ultimately, more concerned with finding an answer to their question than with issues of authority. Furthermore, users often understand authority very differently from publishers; consider, on the one hand, the widely accepted Google model, wherein subjective measures of popularity and relevance serve as a proxy for authority, and, on the other, publishers' expectation that authority denotes submission to, and acceptance by, a formal process.

Despite the well-documented cases of abuse in the last year, editorial control – and particularly peer review – remains the most effective way to filter research output, ensuring that published content is the most relevant, interesting and authoritative in its field. However, this formal publishing process is subject to pressures, including the costs of filtering out ultimately unsuitable content (average rejection rates are around 80%); the time it can take for content to undergo the process; and the constant need for new material, which attracts the most usage.

Web 2.0 publishing certainly reduces time-to-market, as the majority of processes take place post-publication. User-generated sites such as Wikipedia benefit from the speed and simplicity with which pages can be created, reviewed and edited - but even Wikipedia itself does not describe the content delivered through such "creative anarchy" as authoritative. One of its co-founders attempted to deal with some of its infamous problems in the business model for the second-attempt Citizendium, which allows editing only by registered users, incorporates marks of "quality", and is managed by subject editors. The success of sites such as Postgenomic suggests that the "wisdom of crowds" approach is even more effective within a subject silo, while their ability and tendency to include related material (conference programmes and reports, blog postings) brings them closer to traditional publishing.

Nature Publishing Group attempted to combine the traditional and emerging approaches with its open peer review trial. Although ultimately unsuccessful, the project was noted for its transparency - options included making identities of authors and reviewers public, publishing reviewer comments and even allowing end users to contribute. However, this transparency may have contributed to the project's lack of popularity, as academics will naturally be wary of publicly criticising one another's work. The lack of integration with other workflows was another factor preventing this concept from catching on at Nature, but one interesting observation was that posting content online earlier in the publication process did encourage authors to make it more presentable; a transfer of responsibility for some part of the copyediting process from publisher to author.

Leigh proposed taking forward the "open" concept in terms of openly demonstrating to users that content has been reviewed in some way, for example with a kitemark for peer review. As with Creative Commons licences, this would need to combine human-readable logos with machine-readable embedded metadata. Kitemarked content could therefore be searchable (as Creative Commons content currently is). A current example which shows promise is the BPR3 scheme, which enables bloggers to mark postings containing scholarly subject matter (in order to separate them from personal postings); BPR3 is being reviewed by CrossRef, which may carry out prototype work (in which Ingenta would participate).
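To give a flavour of how such machine-readable marks might work, here is a minimal sketch in Python. It assumes a hypothetical convention, modelled on Creative Commons' rel="license" links, in which a page declares its review status via a link with rel="review-status"; that rel value and the example URL are invented for illustration, not part of any published scheme.

```python
# Hypothetical sketch: finding a machine-readable "peer reviewed" kitemark
# embedded in a page, in the spirit of Creative Commons' rel="license" links.
# The rel value "review-status" is an invented convention for illustration.
from html.parser import HTMLParser

class KitemarkFinder(HTMLParser):
    """Collects the hrefs of links declaring a (hypothetical) review-status rel."""
    def __init__(self):
        super().__init__()
        self.marks = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "review-status" in attrs.get("rel", "").split():
            self.marks.append(attrs.get("href"))

def find_review_marks(html):
    """Return the review-status URLs declared in an HTML fragment."""
    finder = KitemarkFinder()
    finder.feed(html)
    return finder.marks

page = ('<p>Read the <a rel="review-status" '
        'href="http://example.org/peer-reviewed">peer-reviewed</a> article.</p>')
print(find_review_marks(page))  # -> ['http://example.org/peer-reviewed']
```

A search service could crawl for such declarations in the same way that Creative Commons-aware search already filters by licence, which is what would make kitemarked content searchable.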

Leigh has bookmarked further reading material at http://del.icio.us/ldodds/charleston-2007-11; please do share your comments below.


posted by Charlie Rapple at 4:11 pm

 
