dc.contributor.author | Oo, Zar Zar | |
dc.contributor.author | Phyu, Sabai | |
dc.date.accessioned | 2019-07-03T06:57:24Z | |
dc.date.available | 2019-07-03T06:57:24Z | |
dc.date.issued | 2018-02-22 | |
dc.identifier.uri | http://onlineresource.ucsy.edu.mm/handle/123456789/257 | |
dc.description.abstract | MapReduce is a parallel computing framework for the distributed processing of large-scale, data-intensive applications. The most important performance metric is job execution time, but it can be seriously degraded by straggler machines. Speculative execution is a common approach to this problem: slow tasks are backed up on alternative machines. Several schedulers with speculative execution have been proposed, but they have some weaknesses: (i) they cannot calculate the progress rate accurately because the progress scores of the phases are set to constant values, which may differ widely in a heterogeneous environment; (ii) they identify stragglers using a static threshold value based on the difference between an individual task's progress and the average task progress. To achieve better performance, this paper proposes an algorithm that identifies stragglers using a more accurate progress estimate for each job based on its own historical information, together with a dynamic threshold value that automatically adjusts to the continuously varying environment. | en_US
dc.language.iso | en | en_US |
dc.publisher | Sixteenth International Conference on Computer Applications (ICCA 2018) | en_US
dc.title | Improving Hadoop MapReduce Performance Using Speculative Execution Strategy in a Heterogeneous Environment | en_US |
dc.type | Article | en_US |