Introduction to the New Hive on Spark Parameters

  The Hive on Spark feature currently adds only the following nine parameters; their meanings are described below, and an example of setting them follows the list.
hive.spark.client.future.timeout
  Timeout for requests from the Hive client to the remote Spark driver. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); interpreted as seconds if no unit is specified.
hive.spark.job.monitor.timeout
  Timeout for the job monitor to get the Spark job state. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); interpreted as seconds if no unit is specified.
hive.spark.client.connect.timeout
  Timeout for the remote Spark driver connecting back to the Hive client. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); interpreted as milliseconds if no unit is specified.
hive.spark.client.server.connect.timeout
  Timeout for the handshake between the Hive client and the remote Spark driver; checked by both processes. Expects a time value with a unit (d/day, h/hour, m/min, s/sec, ms/msec, us/usec, ns/nsec); interpreted as milliseconds if no unit is specified.
hive.spark.client.secret.bits
  Number of bits of randomness in the secret generated for communication between the Hive client and the remote Spark driver; the value is rounded down to the nearest multiple of 8.
hive.spark.client.rpc.threads
  Maximum number of threads in the remote Spark driver's RPC event loop; the default is 8.
hive.spark.client.rpc.max.size
  Maximum message size, in bytes, for communication between the Hive client and the remote Spark driver; the default is 50 MB.
hive.spark.client.channel.log.level
  Channel logging level for the remote Spark driver; must be one of DEBUG, ERROR, INFO, TRACE, or WARN.
hive.spark.client.rpc.sasl.mechanisms
  Name of the SASL mechanism to use for authentication.
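
  As a minimal sketch, the parameters above can be overridden per session from the Hive CLI or Beeline with set statements (they can equally be made permanent in hive-site.xml). The values shown here are illustrative examples only, not recommended settings:

    -- Session-level overrides; values are examples, not recommendations.
    set hive.spark.client.future.timeout=60s;              -- accepts d/h/m/s/ms/us/ns suffixes
    set hive.spark.job.monitor.timeout=60s;
    set hive.spark.client.connect.timeout=1000ms;          -- interpreted as ms if no unit is given
    set hive.spark.client.server.connect.timeout=90000ms;
    set hive.spark.client.secret.bits=256;                 -- rounded down to a multiple of 8
    set hive.spark.client.rpc.threads=8;
    set hive.spark.client.rpc.max.size=52428800;           -- 50 MB, expressed in bytes
    set hive.spark.client.channel.log.level=INFO;          -- DEBUG/ERROR/INFO/TRACE/WARN
    set hive.spark.client.rpc.sasl.mechanisms=DIGEST-MD5;  -- example SASL mechanism name

  Session-level set statements only affect the current connection, which makes them convenient for experimenting with the timeouts before committing values to hive-site.xml.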
