Analysis and improvement of map-reduce data distribution in read mapping applications

A. Espinosa, P. Hernandez, J. C. Moure, J. Protasio, A. Ripoll

Research output: Contribution to journal › Article › Research › peer-review

4 Citations (Scopus)

Abstract

The map-reduce paradigm has been shown to be a simple and feasible way of filtering and analyzing large data sets in cloud and cluster systems. Algorithms designed for the paradigm must implement regular data distribution patterns so that appropriate use of resources is ensured. Good scalability and performance of map-reduce applications depend greatly on the design of regular intermediate data generation/consumption patterns at the map and reduce phases. We describe the data distribution patterns found in current map-reduce read mapping bioinformatics applications and show some data decomposition principles to greatly improve their scalability and performance. © Springer Science+Business Media, LLC 2012.
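As a rough illustration of the kind of regular intermediate-data pattern the abstract refers to, the following minimal Python sketch spreads reads evenly across reducers by hashing read identifiers, so that each reduce task consumes a bucket of similar size. The partitioning scheme, the NUM_REDUCERS constant, and the align placeholder are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' code) of a regular map-reduce data
# distribution for read mapping: reads are hashed to reducer partitions
# so that intermediate data is generated and consumed evenly.
from collections import defaultdict

NUM_REDUCERS = 4  # hypothetical number of reduce tasks


def map_phase(reads):
    """Emit (partition, read) pairs; hashing read ids balances the shuffle."""
    for read_id, sequence in reads:
        partition = hash(read_id) % NUM_REDUCERS
        yield partition, (read_id, sequence)


def shuffle(pairs):
    """Group intermediate pairs by partition key (what the framework does)."""
    buckets = defaultdict(list)
    for key, value in pairs:
        buckets[key].append(value)
    return buckets


def reduce_phase(buckets, align):
    """Each reducer aligns its roughly equal-sized share of the reads."""
    return {key: [align(read) for read in reads] for key, reads in buckets.items()}


if __name__ == "__main__":
    reads = [(f"read{i}", "ACGT" * 8) for i in range(16)]
    buckets = shuffle(map_phase(reads))
    # Balanced bucket sizes mean all reducers finish at about the same time.
    print({key: len(values) for key, values in buckets.items()})
    results = reduce_phase(buckets, align=lambda r: (r[0], "mapped"))
```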
Original language: English
Pages (from-to): 1305-1317
Journal: Journal of Supercomputing
Volume: 62
DOI
Status: Published - 1 Dec 2012
