Blog Category: collaborative search

CFP: 2nd Workshop on Collaborative Information Seeking


Jeremy and I have been blogging about collaborative search for a while, and it is our pleasure to announce that, together with Merrie Morris, we are organizing another workshop on Collaborative Information Seeking. The first workshop was held in conjunction with JCDL 2008; we had many interesting presentations and a lot of discussion about systems, algorithms, and evaluation. You can find the proceedings from the workshop on arXiv.org (metadata and papers) and on the workshop web site.

It’s time to revisit this topic, this time in conjunction with the CSCW 2010 conference. The workshop call for participation is here. Our goal is

to bring together researchers with backgrounds in CSCW, social computing, information retrieval, library sciences and HCI to discuss the research challenges associated with the emerging field of collaborative information seeking.

To participate, please submit a 2-4 page position paper in the ACM format by November 20th. The workshop will take place in February, in Savannah, Georgia. Hope to see you there!

Perhaps they measured the wrong thing…


Ian Soboroff commented on yesterday’s blog post that although mental models are important, they are insufficient. He cited a paper that found that legal staff had trouble using a full-text search engine to satisfy a recall-oriented information need over a collection of documents in a legal discovery scenario. The paper concludes that coming up with effective keyword searches is difficult for non-search experts. The paper is interesting and worth reading, but I believe the authors’ conclusions are not warranted by their methodology.


Amplifiers and Oscillators


One of the few things I remember about electrical engineering from my undergraduate education is the joke

Q: How do you build an amplifier?
A: Design an oscillator
Q: How do you build an oscillator?
A: Design an amplifier

This dichotomy is funny to those who’ve tried (and often failed) to build these electrical circuits. But it also underscores a similarity in the technologies involved, and points out that subtle changes in a design can produce radically different effects. It also works well as an analogy to (you guessed it) collaborative search. Social search based on recommendations, whether inferred from user behavior or expressed explicitly as opinions, works like an amplifier: those signals (pages, documents, etc.) voted on by many people are featured more prominently (amplified) and thus are more likely to be retrieved.
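The amplifier idea can be sketched in a few lines. This is a toy illustration, not any particular engine’s formula: the base relevance scores, vote counts, and the damping weight are all made up for the example.

```python
import math

def amplified_score(base_relevance, votes, weight=0.5):
    """Boost a document's base relevance score by the (log-damped)
    number of recommendation votes it has received."""
    return base_relevance + weight * math.log1p(votes)

# (url, base relevance, votes) -- illustrative numbers only
docs = [
    ("a.html", 0.70, 0),    # most relevant by text alone, never recommended
    ("b.html", 0.55, 40),   # less relevant, but heavily recommended
    ("c.html", 0.60, 5),
]

ranked = sorted(docs, key=lambda d: amplified_score(d[1], d[2]), reverse=True)
```

Here the heavily recommended page overtakes the one with the highest base relevance, which is exactly the amplification effect: popular signals get boosted, for better or worse.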


Tribal search


tribescape is a newish entry into the social search arena that allows searchers to share search results with their peers through Twitter. Rather than e-mailing URLs, you simply click a button next to each search result, pick the followers to whom you want to tweet the result, and you’re done. Convenient, yes. Collaborative? Maybe.


Which future of search?


Alex Iskold recently wrote on the ReadWriteWeb about potential improvements in search that could be derived from incorporating evidence from one’s social network to affect the ranking of documents. The idea is that people you know, people with similar interests, friends-of-friends, authorities, and “the crowd” could all contribute to changing the ranking of documents a search engine delivers to you, because the opinions and interests of all these people provide information that can help disambiguate queries.
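One way to picture the idea is as a weighted blend of evidence sources. The source names below follow the post; the weights, endorsement fractions, and blending factor are invented for the sketch.

```python
# Hypothetical weights on each social evidence source (made-up numbers).
SOURCE_WEIGHTS = {
    "friends": 0.4,
    "friends_of_friends": 0.2,
    "similar_interests": 0.25,
    "crowd": 0.15,
}

def social_boost(endorsements):
    """endorsements maps a source name to the fraction (0.0-1.0) of that
    group endorsing the document; returns a weighted boost."""
    return sum(SOURCE_WEIGHTS[s] * v for s, v in endorsements.items())

def rerank(results, endorsements_by_url, alpha=0.7):
    """Blend the engine's base score (weight alpha) with the social
    boost (weight 1 - alpha). results: list of (url, base_score)."""
    return sorted(
        results,
        key=lambda r: alpha * r[1]
        + (1 - alpha) * social_boost(endorsements_by_url.get(r[0], {})),
        reverse=True,
    )
```

A document your friends uniformly endorse can then outrank one with a slightly better base score, which is the disambiguation effect the post describes.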


The social cost of collaboration


Thanks to @davefauth I came across an interesting blog post by Naumi Haque on the diminishing returns of collaboration. His thesis is that as the number of explicit collaborators on a project increases past a certain point, overall utility decreases, due in part to the costs of maintaining the collaboration. This reminds me of the notion of group coherence that Morten Hertzum wrote about in his paper on Collaborative Information Seeking. He focused on the need for teams to devote resources to ground the collaboration to prevent loss of coherence (and thus shared goals and values).
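The diminishing-returns argument is easy to see in a toy model (my own illustration, not Haque’s actual figures): let benefit grow roughly logarithmically with the number of collaborators, while coordination cost grows with the number of pairwise communication links. Net utility then peaks at a modest team size and declines after that.

```python
import math

def net_utility(n, benefit_scale=10.0, cost_per_link=0.1):
    """Toy model of collaboration utility for a team of n people.
    Benefit is sublinear in n; coordination cost grows with the
    n*(n-1)/2 pairwise links that must be maintained."""
    benefit = benefit_scale * math.log1p(n)
    cost = cost_per_link * n * (n - 1) / 2
    return benefit - cost

# Find the team size that maximizes net utility in this model.
best = max(range(1, 51), key=net_utility)
```

With these particular constants the optimum is a small team, and a 50-person “collaboration” has strongly negative net utility; the constants are arbitrary, but the shape of the curve is the point.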


Call center collaboration


In their JCDL 2009 paper titled “Cost and Benefit Analysis of Mediated Enterprise Search,” Wu et al. described a cost-benefit analysis of call center activity. The goal was to understand when experts should help “consultants” who handle phone calls from customers. The idea was that experts could improve the search results of queries run by consultants by identifying useful documents; the challenge is to make effective use of the more expensive experts’ time.
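The core trade-off can be written as a back-of-the-envelope test. This is my own illustrative version, not the model from the Wu et al. paper: expert time pays off when the improved result will be reused across enough future calls to outweigh the expert’s cost.

```python
def expert_should_help(expected_reuse_count, benefit_per_call,
                       expert_minutes, expert_rate_per_minute):
    """Return True when the expected benefit of an expert-improved
    query result (value per call times number of calls it will serve)
    exceeds the cost of the expert's time. All parameters are
    hypothetical inputs for the sketch."""
    expected_benefit = expected_reuse_count * benefit_per_call
    cost = expert_minutes * expert_rate_per_minute
    return expected_benefit > cost
```

For example, ten minutes of expert time is worth spending on a query that will recur across twenty calls, but not on a one-off question, which matches the intuition that experts should focus on high-reuse queries.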

This seems like a great opportunity to implement a collaborative search interface that would mediate the collaboration between the people handling the phone calls and the technical experts. In addition to screen sharing (to help the expert understand the problem), the system might provide the expert with additional tools to facilitate searches and to reuse previously-found results.

Expanding query expansion


Looks like I missed a good paper at JCDL 2009: A Polyrepresentational Approach to Interactive Query Expansion by Diriye, Blandford and Tombros. As with many good ideas, the approach seems obvious once described, but it was one I had not come across before.

Manual query expansion can be useful when relevance feedback fails because the system doesn’t know why a person found a document relevant, but people are often reluctant to use the suggestions offered by information-seeking systems. This paper offers a new twist on these recommended terms: when suggesting terms for expanding a user’s query, the authors show each term with some representation of the context in which it occurs. Their evaluation showed that this contextual information helped users understand the suggested terms better, and improved their ability to make relevance judgments about documents containing those terms.
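One simple way to surface that kind of context is a keyword-in-context (KWIC) view of each suggested term. The sketch below is my own minimal take on the general idea, not the authors’ implementation; the window size and snippet format are arbitrary.

```python
import re

def term_in_context(term, documents, window=40):
    """Return KWIC snippets for a suggested expansion term: each match
    of the term in each document, with `window` characters of
    surrounding text, so a user can see how the term is actually used."""
    snippets = []
    pattern = re.compile(re.escape(term), re.IGNORECASE)
    for doc in documents:
        for m in pattern.finditer(doc):
            start = max(0, m.start() - window)
            end = min(len(doc), m.end() + window)
            snippets.append("..." + doc[start:end] + "...")
    return snippets
```

Shown next to a suggested term, a couple of these snippets let a user judge whether the term means what they think it means before adding it to the query.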

In Cerchiamo, we offered users term suggestions based on relevance judgments made by their search partners. While the suggested terms were useful for identifying other relevant documents, they weren’t always used. Term recommendation in collaborative search would likely benefit from these contextualization techniques even more than standalone search does, because in the collaborative case the recommended terms may come from documents the searcher has never seen.