Revision as of 14 March 2017, 16:32
Crowdsourcing Linked Data Quality Assessment
Published: October 2013
Book title: The Semantic Web – ISWC 2013
Pages: 260–276
Publisher: Springer
Organization: International Semantic Web Conference
Refereed publication
Abstract
In this paper we look into the use of crowdsourcing as a means to handle Linked Data quality problems that are challenging to solve automatically. We analyzed the most common errors encountered in Linked Data sources and classified them according to the extent to which they are likely to be amenable to a specific form of crowdsourcing. Based on this analysis, we implemented a quality assessment methodology for Linked Data that leverages the wisdom of the crowds in different ways: (i) a contest targeting an expert crowd of researchers and Linked Data enthusiasts, complemented by (ii) paid microtasks published on Amazon Mechanical Turk. We empirically evaluated how this methodology could efficiently spot quality issues in DBpedia. We also investigated how the contributions of the two types of crowds could be optimally integrated into Linked Data curation processes. The results show that the two styles of crowdsourcing are complementary and that crowdsourcing-enabled quality assessment is a promising and affordable way to enhance the quality of Linked Data.
ISBN: 978-3-642-41337-7
Further information: Link
DOI Link: 10.1007/978-3-642-41338-4_17
Dataset: Crowdsourced DBpedia Quality Assessment
Research group: Web Science und Wissensmanagement