The website "romip.narod.ru." is not registered with uCoz.
If you are absolutely sure your website must be here,
please contact our Support Team.
If you were searching for something on the Internet and ended up here, try again:

About uCoz web-service

Community

Legal information

 Web Adhoc Track
RIRES: Russian Information Retrieval Evaluation Seminar

Web Adhoc Track

Adhoc search in a Web collection

Overview

The purpose of this track is to evaluate ad hoc search methods on a Web collection. The dataset used for the track imitates real Web documents and Web queries.

This track follows the standard ROMIP evaluation procedure.

Test Collection

The source dataset is the union of the BY.web and KM.ru collections.

Task Description for Participating Systems

Each participant is granted access to the BY.web and KM.ru collections and to a set of 19,627 queries. The query set was formed as follows:

  • all queries from the "ad hoc search in a Web collection" tracks of previous ROMIP cycles
  • a selection of queries from the logs of the Yandex search engine.
    Selection procedure: start from a log of queries that returned at least one result document, remove all queries containing search operators as well as "adult" queries, then select every 100th query (see the sketch after this list).
  • a selection of queries from the logs of the KM.ru search engine
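
Below is a minimal sketch of the Yandex-log sampling step described above. It assumes a hypothetical log representation of (query, result_count) pairs; the operator character set and the adult-term list are placeholders, since neither is published in the track description.

    def sample_yandex_queries(log, step=100):
        """Keep queries with at least one result document, drop queries with
        search operators or "adult" content, then take every step-th query."""
        OPERATOR_CHARS = set('+-!&|~"()')      # assumed operator syntax
        ADULT_TERMS = {"porn", "xxx"}          # placeholder adult-term list

        def is_adult(query):
            return any(term in query.lower() for term in ADULT_TERMS)

        kept = []
        for query, result_count in log:        # assumed (query, hits) pairs
            if result_count < 1:
                continue                       # no result documents
            if any(ch in OPERATOR_CHARS for ch in query):
                continue                       # contains search operators
            if is_adult(query):
                continue                       # "adult" query
            kept.append(query)
        return kept[::step]                    # every 100th query by default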

The expected result is an ordered list of document URLs, with at most 100 documents per query.

If processing both collections is not possible, a participant may return results based on searching only one of the collections (i.e. either KM.ru or BY.web).
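
As an illustration of the size limit only, here is a sketch that truncates each query's ranking to 100 URLs. The tab-separated output is a placeholder, not the official submission format, which is defined in the Data Formats section below.

    def truncate_run(results, max_per_query=100):
        """results maps a query id to its ranked list of document URLs.
        Returns placeholder tab-separated lines: query id, rank, URL."""
        lines = []
        for qid, urls in results.items():
            for rank, url in enumerate(urls[:max_per_query], start=1):
                lines.append(f"{qid}\t{rank}\t{url}")
        return lines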

Evaluation Methodology

  • instructions for assessors:
    Assessors evaluate document relevance to a query based on an extended description of the user's information need.
  • evaluation method: pooling (pool depth is 50)
  • relevance scale:
    • yes / probably yes / perhaps yes / no / impossible to evaluate
    • yes / no / impossible to evaluate
  • official metrics (see the sketch after this list):
    • precision
    • recall
    • TREC 11-point precision/recall graph
    • bpref
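
A minimal sketch of two of the official metrics, assuming binary relevance judgments (True = relevant) and skipping unjudged documents, as is conventional for bpref. This is an illustration only, not the official evaluation code.

    def precision_recall(ranked, qrels):
        """ranked: ordered list of retrieved URLs; qrels: dict URL -> bool."""
        relevant_total = sum(1 for rel in qrels.values() if rel)
        retrieved_relevant = sum(1 for url in ranked if qrels.get(url, False))
        precision = retrieved_relevant / len(ranked) if ranked else 0.0
        recall = retrieved_relevant / relevant_total if relevant_total else 0.0
        return precision, recall

    def bpref(ranked, qrels):
        """Binary preference: each retrieved relevant document is penalised by
        the fraction of judged non-relevant documents ranked above it."""
        R = sum(1 for rel in qrels.values() if rel)        # judged relevant
        N = sum(1 for rel in qrels.values() if not rel)    # judged non-relevant
        if R == 0:
            return 0.0
        score, nonrel_seen = 0.0, 0
        for url in ranked:
            if url not in qrels:
                continue                                   # unjudged: skipped
            if qrels[url]:
                if N > 0:
                    score += 1.0 - min(nonrel_seen, R) / min(R, N)
                else:
                    score += 1.0                           # no judged non-relevant docs
            else:
                nonrel_seen += 1
        return score / R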

Data Formats