Spark Stable Version 0.9.2 Released

Spark 0.9.2 was released yesterday (July 23, 2014). Yes, you read that right: Spark 0.9.2. It is a maintenance release cut from the 0.9 branch that fixes a number of bugs, and all users on 0.9.x are recommended to upgrade to this stable release. 28 developers contributed to it. Although the Spark 1.0.x line is already out, it still carries quite a few bugs, so 0.9.2 remains the stable version.



The full announcement is as follows:

You can download Spark 0.9.2 as either a source package (6 MB tgz) or a prebuilt package for Hadoop 1 / CDH3 (156 MB tgz), CDH4 (161 MB tgz), or Hadoop 2 / CDH5 / HDP2 (168 MB tgz). Release signatures and checksums are available at the official Apache download site.

Fixes

Spark 0.9.2 contains bug fixes in several components. Some of the more important fixes are highlighted below. You can visit the Spark issue tracker for the full list of fixes.

Spark Core
  1. ExternalAppendOnlyMap doesn’t always find matching keys. (SPARK-2043)
  2. Jobs hang due to akka frame size settings. (SPARK-1112, SPARK-2156)
  3. HDFS FileSystems continually pile up in the FS cache. (SPARK-1676)
  4. Unneeded lock in ShuffleMapTask.deserializeInfo. (SPARK-1775)
  5. Secondary jars are not added to executor classpath for YARN. (SPARK-1870)
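
Two of the Spark Core fixes listed above (SPARK-1112, SPARK-2156) concern jobs hanging when serialized tasks or results exceed the Akka frame size. Until an upgrade to 0.9.2, a commonly cited workaround on 0.9.x was to raise spark.akka.frameSize by hand. The snippet below is a minimal PySpark sketch of that setting only; the application name, master URL, and the 64 MB figure are placeholders, not values from the announcement.

    from pyspark import SparkConf, SparkContext

    # Sketch only: raise the Akka frame size (the value is in MB; the default
    # was 10) so large serialized tasks or results are less likely to hit the
    # frame limit. App name, master URL and 64 MB are arbitrary placeholders.
    conf = (SparkConf()
            .setAppName("frame-size-workaround")
            .setMaster("local[2]")
            .set("spark.akka.frameSize", "64"))

    sc = SparkContext(conf=conf)
    # ... run the job as usual ...
    sc.stop()
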
PySpark
  1. IPython won’t run standalone Python script. (SPARK-1134)
  2. The hash method used by partitionBy doesn’t deal with None correctly. (SPARK-1468)
  3. PySpark crashes if too many tasks complete quickly. (SPARK-2282)
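
The partitionBy fix above (SPARK-1468) concerns the hash used for None keys. A minimal sketch of the affected call is shown below; on an unpatched 0.9.x build, passing an explicit partition function that pins None to a fixed bucket is one way to sidestep the issue. The sample data, partition count, and helper name are made up for illustration.

    from pyspark import SparkContext

    sc = SparkContext("local[2]", "partitionby-none-sketch")

    # Toy pair RDD whose keys include None (data made up for illustration).
    pairs = sc.parallelize([(None, 1), ("a", 2), ("b", 3), (None, 4)])

    # Workaround sketch for unpatched 0.9.x: supply an explicit partition
    # function that sends None to a fixed bucket instead of relying on the
    # default hash of the key.
    def none_safe_hash(key):
        return 0 if key is None else hash(key)

    partitioned = pairs.partitionBy(4, none_safe_hash)
    print(partitioned.glom().collect())

    sc.stop()
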
MLlib
  1. Make MLlib work on Python 2.6. (SPARK-1421)
  2. Fix PySpark’s Naive Bayes implementation. (SPARK-2433)
Streaming
  1. A SparkFlumeEvent with a body bigger than 1020 bytes is not read properly. (SPARK-1916)
GraphX
  1. GraphX triplets not working properly. (SPARK-1188)