An Introduction to the New Hive on Spark Parameters

  The Hive on Spark feature currently adds only the following nine parameters; their meanings are described below.
hive.spark.client.future.timeout
  Timeout for requests from the Hive client to the remote Spark driver. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); if no unit is specified, seconds are assumed.
hive.spark.job.monitor.timeout
  Timeout for the job monitor to get the Spark job state. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); if no unit is specified, seconds are assumed.
hive.spark.client.connect.timeout
  Timeout for the remote Spark driver to connect back to the Hive client. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); if no unit is specified, milliseconds are assumed.
hive.spark.client.server.connect.timeout
  Timeout for the handshake between the Hive client and the remote Spark driver; this is checked by both processes. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); if no unit is specified, milliseconds are assumed.
hive.spark.client.secret.bits
  Number of bits of randomness in the secret generated for communication between the Hive client and the remote Spark driver. The value is rounded down to the nearest multiple of 8.
hive.spark.client.rpc.threads
  Maximum number of threads for the remote Spark driver's RPC event loop. Default is 8.
hive.spark.client.rpc.max.size
  Maximum message size, in bytes, for communication between the Hive client and the remote Spark driver. Default is 50 MB.
hive.spark.client.channel.log.level
  Channel logging level for the remote Spark driver; must be one of DEBUG, ERROR, INFO, TRACE, or WARN.
hive.spark.client.rpc.sasl.mechanisms
  Name of the SASL mechanism used for authentication.
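
As a minimal sketch, these parameters can be set cluster-wide in hive-site.xml; the values below are illustrative only, not tuning recommendations:

```xml
<!-- hive-site.xml (illustrative values, not recommendations) -->
<property>
  <name>hive.spark.client.connect.timeout</name>
  <!-- unit suffix allowed; milliseconds assumed if omitted -->
  <value>5000ms</value>
</property>
<property>
  <name>hive.spark.job.monitor.timeout</name>
  <!-- seconds assumed if no unit is given -->
  <value>60s</value>
</property>
<property>
  <name>hive.spark.client.rpc.max.size</name>
  <!-- plain byte count: 52428800 bytes = 50 MB (the default) -->
  <value>52428800</value>
</property>
```

They can also be overridden per session from the Hive CLI or Beeline, e.g. `SET hive.spark.client.connect.timeout=10000ms;`.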

Unless otherwise stated, all articles on this blog are original!
When reprinting, please note: reprinted from 过往记忆 (https://www.iteblog.com/)
Permalink: [An Introduction to the New Hive on Spark Parameters] (https://www.iteblog.com/archives/1541.html)