
Stefan's Peer Evaluation activity

Downloads 409
Views 38
Followed by 2
Following... 1


    Stefan has...

    Trusted 0
    Reviewed 0
    Emailed 0
    Shared/re-used 0
    Discussed 0
    Invited 0
    Collected 0

     

    This was brought to you by:

    Stefan Trausan-Matu (Trusted member)

    Professor

    Computer Science Department, Politehnica University of Bucharest
    Research Institute for Artificial Intelligence

    Improving Topic Evaluation Using Conceptual Knowledge


    The growing number of statistical topic models has led to the need to better evaluate their output. Traditional evaluation means estimate a model's fitness to unseen data. It has recently been shown that human judgment can differ greatly from these measures, so there is a pressing need for methods that better emulate human judgment. In this paper we present a system that computes the conceptual relevance of individual topics from a given model, on the basis of information drawn from a given concept hierarchy, in this case WordNet. Conceptual relevance is regarded as the ability to attribute a concept to each topic and, based on that concept, to separate words related to the topic from unrelated ones. In multiple experiments we demonstrate the correlation between the automatic evaluation and the answers received from human evaluators, for various corpora and difficulty levels. By shifting the evaluation focus from a statistical one to a conceptual one, we were able to detect which topics are conceptually meaningful and rank them accordingly.
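
    The abstract only sketches the approach at a high level. As a rough, hypothetical illustration of the idea (not the authors' actual algorithm), the short Python example below scores a topic's conceptual relevance by looking for a single WordNet hypernym shared by as many of the topic's top words as possible and taking the covered fraction as the score; the NLTK WordNet interface is real, but the depth cutoff, the scoring rule, and the function names are assumptions made for this sketch.

# A toy sketch of conceptual topic relevance, not the method from the paper:
# find the WordNet hypernym that covers the most topic words and report the
# fraction of words it covers. MIN_DEPTH is an arbitrary cutoff used only to
# skip near-root concepts such as entity.n.01.
from collections import Counter
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet')

MIN_DEPTH = 4

def covering_concepts(word):
    """Synsets on any hypernym path of any noun sense of `word`
    (including the sense itself), kept only if at depth MIN_DEPTH or deeper."""
    concepts = set()
    for synset in wn.synsets(word, pos=wn.NOUN):
        for path in synset.hypernym_paths():
            concepts.update(s for s in path if s.min_depth() >= MIN_DEPTH)
    return concepts

def conceptual_relevance(topic_words):
    """Return (concept, score): the hypernym shared by the most topic words
    and the fraction of topic words it covers."""
    coverage = Counter()
    for word in topic_words:
        coverage.update(covering_concepts(word))  # each word counts once per concept
    if not coverage:
        return None, 0.0
    concept, covered = coverage.most_common(1)[0]
    return concept, covered / len(topic_words)

# A conceptually coherent topic should score higher than a mixed one.
print(conceptual_relevance(["dog", "cat", "horse", "cow", "sheep"]))
print(conceptual_relevance(["dog", "keyboard", "election", "tomato", "cloud"]))

    In this toy scoring, a topic whose top words share a reasonably specific common hypernym (for example, an animal-related topic) gets a score near 1, while a conceptually mixed topic will typically score much lower; the paper's actual concept selection and word-separation procedure is more involved.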




    Description

    Title : Improving Topic Evaluation Using Conceptual Knowledge
    Author(s) : Claudiu Cristian Musat, Julien Velcin, Stefan Trausan-Matu, Marian-Andrei Rizoiu
    Keywords : natural language processing

    Subject : unspecified
    Area : Other
    Language : English
    Year : 2011

    Affiliations : Computer Science Department, Politehnica University of Bucharest
    Journal : IJCAI
    Pages : 1866-1871
    Url : http://www.aaai.org/ocs/index.php/IJCAI/IJCAI11/paper/viewPDFInterstitial/3010/3754


    This contribution has not been reviewed yet.

    You may receive the Trusted member label after:

    • Reviewing 10 uploads, regardless of media type.
    • Being trusted by 10 peers.

    If you are blocked by 10 peers, the Trusted member label will be suspended from your page. We encourage you to contact the administrator to contest the suspension.

    Does this seem fair to you? Please make your suggestions.


