I just watched an interesting webcast by David Gillikin, Chief of NLM’s Bibliographic Services, about the upcoming changes to the PubMed interface, followed by extensive Q&A. There was some confusion about how existing functionality would be mapped to the new interface, and understandable concern that the familiar interface would become dramatically less so. From an outsider’s perspective, the changes that were implemented looked reasonable, reducing the clutter of the existing design with some simplified controls and a more modern look and feel.
William Webber recently wrote an interesting analysis of the reports of the original Cranfield experiments that were so influential in establishing the primacy of evaluation in information seeking, and in particular a certain kind of evaluation methodology around recall and precision based on a ground truth. One reason that the experiments were so influential was that they provided strong evidence that previously-held assumptions about the effectiveness of various indexing techniques were unfounded. Specifically, the experiments showed that full-text indexing outperformed controlled vocabularies. While this result was shocking in the 1950s, 50 years later it seems banal. Or almost.
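The recall-and-precision methodology at the heart of the Cranfield experiments can be illustrated with a tiny sketch. The document IDs and relevance judgments below are invented for illustration; the point is just that both measures are computed against a fixed ground truth of human judgments:

```python
# Minimal sketch of Cranfield-style evaluation: compare a system's
# retrieved set against a human-judged ground truth for one query.
# All document IDs and judgments here are made up for illustration.

def precision_recall(retrieved, relevant):
    """Set-based precision and recall for a single query."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Ground truth: documents judged relevant to the query.
relevant = {"d1", "d4", "d5", "d9"}
# What the system actually returned.
retrieved = {"d1", "d2", "d4", "d7"}

p, r = precision_recall(retrieved, relevant)
print(p, r)  # 0.5 precision (2 of 4 retrieved), 0.5 recall (2 of 4 relevant)
```

Comparing full-text indexing against a controlled vocabulary then amounts to running both systems over the same queries and judgments and comparing these numbers.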
Having seen the recent news of gun-toting protesters at health reform meetings, I got into a discussion with my wife about gun control, and you know where that can lead. Yes, that’s right, to exploratory search. I had some hypotheses about the relationship between gun control and crime, and wanted to find some data to test them. I needed to find crime statistics by state, and to cross-reference them with various characteristics of states, including degree of urbanization, population density, laws, etc. While I thought the odds of finding a canned analysis of my hypotheses were small given the amount of time I was willing to devote to the problem, I did try a few obvious queries. No luck.
My declaration that the challenge posed by Eleanor was too difficult to solve was premature. The problem was difficult, but apparently not impossible. I wrote the previous post before Francine found a solution using classic berrypicking techniques, further confirming the utility of using more search engines than just Google to increase the diversity of results. Of course, now that she has linked to that page (particularly from such a prominent blog :-) ), Google may promote it in its ranking and make that result more findable. (I am not sure about the nofollow restriction on comment links, but nonetheless the likelihood of someone else linking to or bookmarking that page has just increased.) Francine’s discovery through exploratory search thereby increases the odds that others will now find that document through Social Search.
Yesterday Eleanor posted a great example of a difficult exploratory search. The goal was to answer a question, but not only was it difficult to figure out how to articulate the search effectively, it was also unclear whether the answer even existed. The difficulty of articulation stems from the fact that even in combination, the terms Eleanor used to characterize the information need retrieved documents that were similar to the desired information but lacked some key aspect.
The USE SIG of ASIS&T is celebrating its 10th anniversary at the annual ASIS&T conference in Vancouver, BC this year with a symposium on collaborative information seeking, followed by a plenary reception.
After some confusion and frustration, I have created an entry for the JCDL2008 workshop on Collaborative Information Seeking on arXiv.org. The entry includes an overview page with a summary of the workshop and a link to an HTML page that contains links to the papers that were presented. Some of the papers made the ingestion cutoff today; the rest should appear on Friday.
One of the few things I remember about electrical engineering from my undergraduate education is the joke
Q: How do you build an amplifier?
A: Design an oscillator.
Q: How do you build an oscillator?
A: Design an amplifier.
This dichotomy is funny to those who’ve tried (and often failed) to build these electrical circuits. But it also underscores a similarity in the technologies involved, and points out that subtle changes in the design can produce radically different effects. It also works well as an analogy to (you guessed it) collaborative search. Social search based on recommendations, whether inferred from user behavior or expressed through opinions, works like an amplifier: those signals (pages, documents, etc.) voted on by many people become featured more prominently (amplified) and thus are more likely to be retrieved.
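The amplifier effect can be sketched with a toy scoring function. The log-damped boost and all of the scores and vote counts below are my own invention for illustration, not a description of any real system:

```python
# Toy sketch of the "amplifier" effect in social search: items that many
# people have voted for get boosted above items with higher base relevance.
import math

def amplified_score(base_relevance, votes):
    """Boost a base relevance score by a log-damped function of vote count."""
    return base_relevance * (1 + math.log1p(votes))

# (id, base relevance, votes) -- all numbers invented for illustration.
results = [
    ("page-a", 0.70, 0),
    ("page-b", 0.60, 25),
    ("page-c", 0.65, 3),
]
ranked = sorted(results, key=lambda r: amplified_score(r[1], r[2]), reverse=True)
print([r[0] for r in ranked])  # ['page-b', 'page-c', 'page-a']
```

Note how page-b, despite the lowest base relevance, rises to the top on the strength of its votes: the crowd's signal is amplified.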
tribescape is a newish entry into the social search arena that allows searchers to share search results with their peers through Twitter. Rather than e-mailing URLs, you simply click a button next to each search result, pick the followers to whom you want to tweet that result, and you’re done. Convenient, yes. Collaborative? Maybe.
I’ve made some changes and additions to the Collaborative Search Engine page on Wikipedia to expand the entry and add more references. The entry is by no means complete! Please contribute more prose and references to make it a useful resource. In particular, more prose would be useful around the task vs. trait distinction and the division of labor vs. the sharing of knowledge.