Dr. Ellen Voorhees of TREC at NIST sent me their most recent Call for Participation. To help with its dissemination, I am posting it in its entirety, so it will reach a wider audience through IRThoughts and MiIslita.com.



                 Call for Participation in TREC 2010
                  February 2010 – November 2010

                          Conducted by:
      National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages
research in information retrieval and related applications by
providing a large test collection, uniform scoring procedures,
and a forum for organizations interested in comparing their
results.  Now in its nineteenth year, the conference has become
the major experimental effort in the field.  Participants in
the previous TREC conferences have examined a wide variety
of retrieval techniques and retrieval environments,
including cross-language retrieval, retrieval of web documents,
multimedia retrieval, and question answering.  Details about TREC
can be found at the TREC web site,  http://trec.nist.gov.

You are invited to participate in TREC 2010.  TREC 2010 will
consist of a set of tasks known as “tracks”.  Each track focuses
on a particular subproblem or variant of the retrieval task as
described below.  Organizations may choose to participate in any or
all of the tracks.  Training and test materials are available from
NIST for some tracks; other tracks will use special collections that
are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly
available) conference proceedings is welcomed, but the conditions of
participation specifically preclude any advertising claims based
on TREC results.  All retrieval results submitted to NIST are
published in the Proceedings and are archived on the TREC web site.
The workshop in November is open only to participating groups that
submit retrieval results for at least one track and to selected
government personnel from sponsoring agencies.


  By February 18 — submit your application to participate in
        TREC 2010 as described below.  Submitting an application
        will add you to the active participants’ mailing list.
        On February 24, NIST will announce a new password for the
        “active participants” portion of the TREC web site.  This
        portion of the web site includes information regarding
        the permission forms needed to obtain the TREC document
        collections.

   Beginning March 2 — document disks used in some existing
        TREC collections will be distributed to participants who
        have returned the required forms.  Please note that no
        disks will be shipped before March 2.

   July–August — results submission deadline for most tracks.
        Specific deadlines for each track will be included in
        the track guidelines, which will be finalized in the spring.

   September 9  (estimated) — speaker proposals due at NIST.

   September 30 (estimated) — relevance judgments and individual
        evaluation scores due back to participants.

   November 16-19 — TREC 2010 conference at NIST in Gaithersburg, Md., USA

Task Description:

Below is a brief summary of the tasks.  Complete descriptions of
tasks performed in previous years are included in the Overview
papers in each of the TREC proceedings (in the Publications section
of the web site).

The exact definition of the tasks to be performed in each track for
TREC 2010 is still being formulated.  Track discussion takes place
on the track mailing list.  To be added to a track mailing list,
follow the instructions for contacting that mailing list as
given below.  For questions about the track, send mail to the
track coordinator (or post the question to the track mailing list
once you join).

TREC 2010 will contain seven tracks.  The blog, chemical IR,
entity, legal, relevance feedback, and web tracks will
continue from TREC 2009.  The million query track will be
incorporated into the web track.  TREC 2010 will also contain
a new “session” track.

Blog Track — The purpose of the blog track is to explore
    information-seeking behavior in the blogosphere.

    Track coordinators: Craig Macdonald, Iadh Ounis, Ian Soboroff
    Mailing list:  send a mail message to listproc@nist.gov
        such that the body consists of the line
        subscribe trec-blog <FirstName> <LastName>
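The same listproc convention is used for the other NIST-hosted track lists (e.g., the web track list below): the message body must consist of a single `subscribe` line. As a minimal sketch, composing such a message in Python might look like this; the sender address is a placeholder, not part of the call:

```python
from email.message import EmailMessage

def subscription_message(track_list, first_name, last_name, sender):
    """Compose a listproc subscription email whose body consists
    solely of the line 'subscribe <list> <FirstName> <LastName>'."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = "listproc@nist.gov"
    msg.set_content(f"subscribe {track_list} {first_name} {last_name}")
    return msg

# Hypothetical subscriber; send the result with any SMTP client.
msg = subscription_message("trec-blog", "Jane", "Doe", "jane.doe@example.org")
print(msg.get_content())
```

The message would then be handed to an SMTP client of your choice; the important part is that the body contains nothing besides the subscribe line.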

Chemical IR Track — The goal of the chemical IR track is to develop
    and evaluate technology for large scale search in chemical
    documents including academic papers and patents to better
    meet the needs of professional searchers: specifically patent
    searchers and chemists.

    Track coordinators: John Tait, john.tait@ir-facility.org
                         Jimmy Huang, jhuang@yorku.ca
                         Jianhan Zhu, j.zhu@adastral.ucl.ac.uk
                         Mihai Lupu, m.lupu@ir-facility.org

    Track Web Page: http://www.ir-facility.org/the_irf/trec_chem.htm
    Mailing List: follow the link on the web page to join the list

Entity Track — The overall aim of this track is to perform
    entity-related search on Web data.  These search tasks
    (such as finding entities and properties of entities) address
    common information needs that are not well modeled by
    ad hoc document search.

    Track coordinators: Krisztian Balog, k.balog@uva.nl
                        Paul Thomas, Paul.Thomas@csiro.au
                        Arjen P. de Vries, arjen@acm.org
                        Thijs Westerveld, thijs.westerveld@teezir.nl
    Track web page: http://ilps.science.uva.nl/trec-entity/
    Mailing list: visit http://groups.google.com/group/trec-entity
        to apply for membership.

Legal Track — The goal of the legal track is to develop search technology
    that meets the needs of lawyers to engage in effective discovery
    in digital document collections.

    Track coordinators: Gord Cormack, gvcormac@uwaterloo.ca
                        Maura Grossman, MRGrossman@wlrk.com
                        Bruce Hedin, bhedin@h5.com
                        Doug Oard, oard@umd.edu
    Track web page: http://trec-legal.umiacs.umd.edu
    Mailing list: contact oard@umd.edu to be added to the list.

Relevance Feedback Track — The goal of the relevance feedback track
   is to provide a framework for exploring the effects of different
   factors on the success of relevance feedback.

   Track coordinators: Chris Buckley, cabuckley@sabir.com
                       Matt Lease, ml@ischool.utexas.edu
                       Mark Smucker, msmucker@engmail.uwaterloo.ca
   Track web page:  http://groups.google.com/group/trec-relfeed
   Mailing list: follow the instructions given on the track web page
                 to join the email list

Session Track —  The Session track has two primary goals: (1) to test
        whether systems can improve their performance for a given query
        by using a previous query (and search results from the search
        session), and (2) to evaluate system performance over an entire
        query session instead of a single query.

    Track coordinators: Ben Carterette, carteret@cis.udel.edu
                        Paul Clough, p.d.clough@sheffield.ac.uk
                        Evangelos Kanoulas, ekanou@ccs.neu.edu
                        Mark Sanderson, m.sanderson@sheffield.ac.uk

    Track web page: http://ir.cis.udel.edu/sessions
    Mailing list: Use the link given on the track web page to
                  join the email list

Web Track —  The Web track explores Web-specific retrieval tasks,
    including diversity and efficiency tasks, over collections of
    up to 1 billion Web pages.

    Track coordinators: Nick Craswell, nickcr@microsoft.com
                        Charles Clarke, claclark@plg.uwaterloo.ca
    Mailing list:  send a mail message to listproc@nist.gov
        such that the body consists of the line
        subscribe trec-web <FirstName> <LastName>

Conference Format

The conference itself will be used as a forum both for presentation
of results (including failure analyses and system comparisons),
and for more lengthy system presentations describing retrieval
techniques used, experiments run using the data, and other issues
of interest to researchers in information retrieval.  As there
is a limited amount of time for these presentations, the TREC
program committee will determine which groups are asked to speak
and which groups will present in a poster session.  Groups that
are interested in having a speaking slot during the workshop
should submit a 200-300 word abstract in September describing
the experiments they performed.  The program committee will use
these abstracts to select speakers.

Many of the existing TREC English collections (documents, topics,
and relevance judgments) are available for training purposes and
may also be used in some of the tracks.  Parts of the training
collection (Disks 1-3) were assembled from Linguistic Data
Consortium (LDC) text, and a signed User Agreement will be required
from all participants.  The documents are an assorted collection
of newspapers, newswire, journals, and technical abstracts.
A second agreement is needed for Disks 4-5.

All documents are typical of those seen in a real-world situation
(i.e., there will not be arcane vocabulary, but there may be
missing pieces of text or typographical errors).  For most tracks,
the relevance judgments against which each system’s output will be
scored will be made by experienced relevance assessors based on the
output of all TREC participants using a pooled relevance methodology.
See the Overview paper in the TREC-8 proceedings (on the TREC
web site) for a detailed discussion of pooling.
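In brief, pooling takes the union of the top-k documents from each submitted run for a topic, and only the pooled documents are judged by the assessors. A minimal sketch of depth-k pool construction for a single topic (the run names and document IDs below are invented for illustration):

```python
def build_pool(runs, depth):
    """Form the judging pool for one topic: the union of the top
    `depth` documents across every system's ranked run."""
    pool = set()
    for ranked_docs in runs.values():
        pool.update(ranked_docs[:depth])
    return pool

# Three hypothetical ranked runs for one topic.
runs = {
    "sysA": ["d1", "d2", "d3", "d4"],
    "sysB": ["d2", "d5", "d1", "d6"],
    "sysC": ["d7", "d1", "d8", "d9"],
}

pool = build_pool(runs, depth=2)
# At depth 2 the pool is {'d1', 'd2', 'd5', 'd7'}; only these are judged.
print(sorted(pool))
```

Documents outside the pool are treated as unjudged (typically assumed non-relevant), which is why the methodology depends on many diverse runs contributing to the pool.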

Application Details:
Organizations wishing to participate in TREC 2010 should respond
to this call for participation by submitting an application.
Participants in previous TRECs who wish to participate
in TREC 2010 must submit a new application.

To apply, submit an online application by following the
instructions on the TREC web site.  The application system
will send an acknowledgement to the email address
supplied in the form once it has processed the form.

Any questions about conference participation should be sent
to the general TREC email address, trec@nist.gov.