 


Ekaterini Ioannou

Software Technology and Network Applications Laboratory

Department of Electronic & Computer Engineering
Technical University of Crete
University Campus
73100, Crete, HELLAS


Emails:
ioannou AT softnet.tuc.gr
EkateriniIoannou AT acm.org
 


Beyond 100 million entities: large-scale blocking-based resolution for heterogeneous data

Georgios Papadakis, Ekaterini Ioannou, Claudia Niederée, Themis Palpanas, and Wolfgang Nejdl.
In Proceedings of the 5th ACM International Conference on Web Search and Data Mining (WSDM), Feb. 2012, Seattle, USA.
pdf

A prerequisite for leveraging the vast amount of data available on the Web is Entity Resolution, i.e., the process of identifying and linking data that describe the same real-world objects. To make this inherently quadratic process applicable to large data sets, blocking is typically employed: entities (records) are grouped into clusters - the blocks - of matching candidates, and only entities of the same block are compared. However, novel blocking techniques are required for dealing with the noisy, heterogeneous, semi-structured, user-generated data on the Web, as traditional blocking techniques are inapplicable due to their reliance on schema information. The introduction of redundancy improves the robustness of blocking methods but comes at the price of additional computational cost.
In this paper, we present methods for enhancing the efficiency of redundancy-bearing blocking methods, such as our attribute-agnostic blocking approach. We introduce novel blocking schemes that build blocks based on a variety of evidence, including entity identifiers and relationships between entities; they significantly reduce the required number of comparisons, while maintaining blocking effectiveness at very high levels. We also introduce two theoretical measures that provide a reliable estimation of the performance of a blocking method, without requiring the analytical processing of its blocks. Based on these measures, we develop two techniques for improving the performance of blocking: combining individual, complementary blocking schemes, and purging blocks until given criteria are satisfied. We test our methods through an extensive experimental evaluation, using a voluminous data set with 182 million heterogeneous entities. The outcomes of our study show the applicability and the high performance of our approach.
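The core idea of attribute-agnostic (token) blocking with block purging can be sketched as follows; this is a minimal illustration of the general technique, not the paper's actual implementation, and the toy records and the size threshold are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

# Toy heterogeneous records: ids mapped to attribute-value dicts
# (hypothetical data; note the differing schemata).
entities = {
    "e1": {"name": "John Smith", "city": "Seattle"},
    "e2": {"fullName": "john smith", "location": "Seattle WA"},
    "e3": {"title": "Data Mining Intro"},
}

def tokens(entity):
    """Attribute-agnostic: tokenize every value, ignoring attribute names."""
    return {tok.lower() for value in entity.values() for tok in value.split()}

# Token blocking: one block per token, holding all entities that share it.
blocks = defaultdict(set)
for eid, attrs in entities.items():
    for tok in tokens(attrs):
        blocks[tok].add(eid)

# Block purging: drop singleton blocks (no comparisons possible) and
# oversized blocks, which add many comparisons but little matching evidence.
MAX_BLOCK_SIZE = 10  # assumed threshold for this toy example
blocks = {t: ids for t, ids in blocks.items() if 2 <= len(ids) <= MAX_BLOCK_SIZE}

# Candidate pairs: compare only entities co-occurring in some block,
# instead of all quadratic pairs.
candidates = {pair for ids in blocks.values()
              for pair in combinations(sorted(ids), 2)}
print(sorted(candidates))  # → [('e1', 'e2')]
```

Here e1 and e2 end up in the shared blocks for "john", "smith", and "seattle", so they become a candidate pair despite their different schemata, while e3 is never compared; this redundancy (one pair appearing in several blocks) is what the paper's efficiency techniques aim to tame.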

BibTeX

@inproceedings{conf/wsdm/PapadakisINPN12,
     author = {George Papadakis and Ekaterini Ioannou and Claudia Nieder{\'e}e and Themis Palpanas and Wolfgang Nejdl},
     title = {Beyond 100 million entities: large-scale blocking-based resolution for heterogeneous data},
     booktitle = {WSDM},
     pages = {53--62},
     year = {2012}
}


 
Last modified: February 2012