Spark: How does reducing executor cores solve memory issues?

When I was searching for a memory-related issue in Spark, I came across an article suggesting that we reduce the number of cores per executor. But the same article says to compute the number of executors with the formula ((number of cores per node * total number of nodes) / number of cores per executor), so if we reduce the number of cores per executor, the number of executors increases. How does reducing the number of cores per executor solve the memory problem?
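To make the trade-off concrete: each executor runs one concurrent task per core, and all tasks on an executor share that executor's heap. So if `--executor-memory` stays fixed while `--executor-cores` goes down, each concurrent task gets a larger share of memory, even though the executor count goes up. A minimal sketch of this arithmetic, assuming a hypothetical fixed executor memory of 8 GB:

```python
# Hypothetical setting: --executor-memory is held fixed at 8 GB
# while we vary --executor-cores.
executor_memory_gb = 8

for cores in (5, 4, 2, 1):
    # Up to `cores` tasks run concurrently in one executor JVM,
    # all sharing the same heap.
    per_task_gb = executor_memory_gb / cores
    print(f"--executor-cores {cores}: ~{per_task_gb:.1f} GB per concurrent task")
```

In other words, more executors are created, but each one hosts fewer concurrent tasks competing for the same JVM heap, which is what relieves the per-task memory pressure.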
