[Elsnet-list] SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation - Submission Deadline June 7 (updated)

Emine Yilmaz eminey at microsoft.com
Wed May 19 16:48:00 CEST 2010

SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation
Please note the updated submission deadline!
Call for Papers
The SIGIR 2010<http://www.sigir2010.org> Workshop on Crowdsourcing for Search Evaluation (CSE2010) solicits submissions on topics including, but not limited to, the following areas:

    * Novel applications of crowdsourcing for evaluating search systems (see examples below)
    * Novel theoretical, experimental, and/or methodological developments advancing state-of-the-art knowledge of crowdsourcing for search evaluation
    * Tutorials on how the different forms of crowdsourcing might be best suited to or best executed in evaluating different search tasks
    * New software packages which simplify or otherwise improve general support for crowdsourcing, or particular support for crowdsourced search evaluation
    * Reflective or forward-looking vision on use of crowdsourcing in search evaluation as informed by prior and/or ongoing studies
    * How crowdsourcing technology or process can be adapted to encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in the area of search evaluation involving significant use of a crowdsourcing platform such as Amazon's Mechanical Turk, Crowdflower, LiveWork, etc. Novel applications of crowdsourcing are of particular interest. This includes but is not restricted to the following tasks:
    * cross-vertical search (video, image, blog, etc.) evaluation
    * local search evaluation
    * mobile search evaluation
    * realtime/news search evaluation
    * entity search evaluation
    * discovering representative groups of rare queries, documents, and events in the long tail of search
    * detecting/evaluating query alterations

For example, does the inherent geographic dispersal of crowdsourcing enable better assessment of a query's local intent, its locale-specific facets, or the diversity of returned results? Could crowdsourcing be employed in near real-time to better assess query intent for breaking news and relevant information?

Most Innovative Awards --- Sponsored by Microsoft Bing

As a further incentive to participate, authors of the most novel and innovative crowdsourcing-based search evaluation techniques (e.g. using Amazon's Mechanical Turk, LiveWork, Crowdflower, etc.) will be recognized with "Most Innovative Awards", as judged by the workshop organizers. Selection will be based on the creativity, originality, and potential impact of the described proposal, and we expect the winners to describe risky, ground-breaking, and unexpected ideas. The awards are made possible by generous support from Microsoft Bing; their number and nature will depend on the quality of the submissions and the overall availability of funds. All valid submissions to the workshop will be considered for the awards.

Submission Instructions

Submissions should report new (unpublished) research results or ongoing research. Long paper submissions (up to 8 pages) will primarily target oral presentations. Short paper submissions (up to 4 pages) will primarily target poster presentations. Papers should be formatted in the double-column ACM SIG proceedings format (http://www.acm.org/sigs/publications/proceedings-templates) and must be submitted as PDF files. Submissions should not be anonymized.

Papers should be submitted through the conference management system: http://www.easychair.org/conferences/?conf=cse2010. If you do not have an EasyChair account you will need to create one in order to submit a paper.

Important Dates

Submissions due: June 7, 2010

Notification of acceptance: June 21, 2010

Camera-ready submission: June 28, 2010

Workshop date: July 23, 2010


Email the organizers at cse2010 at ischool.utexas.edu<mailto:cse2010 at ischool.utexas.edu>


Organizers

Vitor Carvalho<http://www.cs.cmu.edu/~vitor>, Microsoft Bing

Matthew Lease<http://www.ischool.utexas.edu/~ml>, University of Texas at Austin

Emine Yilmaz<http://research.microsoft.com/en-us/people/eminey>, Microsoft Research

Program Committee

Eugene Agichtein, Emory University

Ben Carterette, University of Delaware

Charlie Clarke, University of Waterloo

Gareth Jones, Dublin City University

Michael Kaisser, University of Edinburgh

Jaap Kamps, University of Amsterdam

Gabriella Kazai, Microsoft Research

Mounia Lalmas, University of Glasgow

Winter Mason, Yahoo! Research

Don Metzler, University of Southern California

Stefano Mizzaro, University of Udine

Gheorghe Muresan, Microsoft Bing

Iadh Ounis, University of Glasgow

Mark Sanderson, University of Sheffield

Mark Smucker, University of Waterloo

Siddharth Suri, Yahoo! Research

Fang Xu, Saarland University
