%0 Conference Paper
%B 13th International Semantic Web Conference (ISWC 2014)
%D 2014
%T Conference v2.0: An uncertain version of the OAEI Conference benchmark
%A Michelle Cheatham
%A Pascal Hitzler
%E Peter Mika
%E Tania Tudorache
%E Abraham Bernstein
%E Chris Welty
%E Craig A. Knoblock
%E Denny Vrandecic
%E Paul T. Groth
%E Natasha F. Noy
%E Krzysztof Janowicz
%E Carole A. Goble
%K benchmark
%K OAEI
%K Ontology Alignment
%X The Ontology Alignment Evaluation Initiative is a set of benchmarks for evaluating the performance of ontology alignment systems. In this paper we re-examine the Conference track of the OAEI, with a focus on the degree of agreement between the reference alignments within this track and the opinion of experts. We propose a new version of this benchmark that more closely corresponds to expert opinion and confidence in the matches. The performance of top alignment systems is compared on both versions of the benchmark. Additionally, a general method for crowdsourcing the development of more benchmarks of this type using Amazon’s Mechanical Turk is introduced and shown to be scalable, cost-effective, and in good agreement with expert opinion.
%S Lecture Notes in Computer Science
%I Springer
%C Riva del Garda, Italy
%V 8797
%P 148-163
%8 10/2014
%G eng