[Elsnet-list] TREC Federated Web Search Track 2014, Dataset is Released

Ke Zhou Ke.Zhou at ed.ac.uk
Thu Jun 12 19:08:59 CEST 2014


*TREC Federated Web Search Track 2014*

     https://sites.google.com/site/trecfedweb/


The FedWeb 2014 Sample Dataset is Available!
It contains sampled search results and pages for 4000 queries
from 149 web search engines, millions of documents in total.
This new dataset was crawled between April and May 2014.

The updated license agreement file can be found on the website:
https://sites.google.com/site/trecfedweb/#2014dataset
Fill it out, send it in, and start doing experiments!


   Now                   Subscribe to TREC + Training data available
                         + New sample data released
   August 18, 2014       Vertical selection and resource selection runs due
   September 15, 2014    Results merging runs due
   November 18-21, 2014  TREC 2014 conference

The 2014 FedWeb track promotes research on federated search with realistic
web data. Federated search is the approach of querying multiple search
engines simultaneously, and combining their results into one coherent
search engine result page. The goal of the Federated Web Search (FedWeb)
track is to evaluate approaches to federated search at very large scale in
a realistic setting, by combining the search results of existing web search
engines. This year the track focuses on vertical selection (selecting the
right category/media type of data), resource selection (selecting the search
engines that should be queried), and results merging (combining the results
into a single ranked list):

    Task 1. Vertical Selection
    Task 2. Resource Selection
    Task 3. Results Merging

This year's new challenge is the Vertical Selection task, in which
participants must predict how well the different verticals (for instance
sports, news, or images) answer a particular query. A set of relevant
verticals should be selected for each test topic.
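
As a purely illustrative sketch (not an official baseline), one could score
each vertical by the term overlap between the query and the sampled snippets
of the engines mapped to that vertical; the input structures "samples" and
"engine_vertical" below are hypothetical, not the official dataset format:

from collections import defaultdict

def select_verticals(query, samples, engine_vertical, k=3):
    """Rank verticals by how well their engines' sampled snippets match the query.

    samples:         dict mapping engine id -> list of sampled snippet strings
    engine_vertical: dict mapping engine id -> vertical label (e.g. 'news')
    Returns the k highest-scoring verticals.
    """
    q_terms = set(query.lower().split())
    scores = defaultdict(float)   # summed query overlap per vertical
    counts = defaultdict(int)     # number of sampled snippets per vertical
    for engine, snippets in samples.items():
        vertical = engine_vertical[engine]
        for snippet in snippets:
            terms = set(snippet.lower().split())
            scores[vertical] += len(q_terms & terms) / max(len(q_terms), 1)
            counts[vertical] += 1
    # Normalise by sample size so heavily sampled verticals are not favoured.
    ranked = sorted(scores, key=lambda v: scores[v] / counts[v], reverse=True)
    return ranked[:k]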

The second task, Resource Selection, asks participants to predict the quality
of the individual resources (search engines) for the test topics, and to rank
all resources accordingly.
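
Again only as an illustration, a ReDDE-flavoured sketch could pool all
sampled snippets into one centralized ranking and credit each engine for its
snippets near the top; the input format is hypothetical and the scoring
deliberately simplistic:

from collections import Counter

def rank_resources(query, samples, top_n=100):
    """samples: dict mapping engine id -> list of sampled snippet strings.
    Returns all engine ids, best first (the task asks for a full ranking)."""
    q_terms = set(query.lower().split())

    def overlap(snippet):
        return len(q_terms & set(snippet.lower().split()))

    # Centralized sample index: pool every snippet, remembering its engine.
    pooled = [(overlap(s), engine)
              for engine, snippets in samples.items() for s in snippets]
    pooled.sort(key=lambda pair: pair[0], reverse=True)

    # Engines whose samples rank high are assumed to hold more relevant
    # documents; engines without votes still get ranked, with score 0.
    votes = Counter(engine for _, engine in pooled[:top_n])
    return sorted(samples, key=lambda engine: votes.get(engine, 0), reverse=True)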

The third and final task, Results Merging, aims at creating ranked lists of
result snippets by merging results from a limited number of resources. Not
only the relevance of individual results, but also diversity in terms of
verticals should be taken into account. Participants should also eliminate
duplicates from the final ranking.
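
One simple merging strategy, sketched below under the assumption that
"selected" maps each chosen engine to its ranked list of (url, snippet)
pairs (an assumption, not the official run format), is to interleave the
lists rank by rank and drop duplicate URLs; the interleaving spreads the
merged list across engines and hence across verticals:

from itertools import zip_longest

def merge_results(selected, depth=50):
    """Round-robin merge of the selected engines' result lists with URL dedup."""
    merged, seen = [], set()
    # zip_longest yields rank 1 of every engine first, then rank 2, and so on.
    for rank_slice in zip_longest(*selected.values()):
        for result in rank_slice:
            if result is None:        # a shorter list has run out of results
                continue
            url, snippet = result
            if url in seen:           # eliminate duplicates from the final ranking
                continue
            seen.add(url)
            merged.append((url, snippet))
            if len(merged) == depth:
                return merged
    return merged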

Track coordinators

     Djoerd Hiemstra - University of Twente, The Netherlands
     Thomas Demeester - Ghent University, Belgium
     Dolf Trieschnigg - University of Twente, The Netherlands
     Dong Nguyen - University of Twente, The Netherlands
     Ke (Adam) Zhou - University of Edinburgh, Scotland, UK

Subscribe to TREC 2014 now:
     http://trec.nist.gov/pubs/call2014.html
