After configuring Kerberos, the parameters for connecting to HDFS are set as follows:

private void confLoad() throws IOException {
    conf = new Configuration();
    conf.clear();
    conf.set("hadoop.security.authentication", "kerberos");
    conf.set("fs.defaultFS", "hdfs://IP:8020");
    // conf.set("hadoop.rpc.protection", "authentication");
    conf.set("hadoop.rpc.protection", "privacy");
    conf.set("dfs.data.transfer.protection", "integrity");
}

Note: hadoop.rpc.protection must match the cluster-side setting exactly; otherwise the client fails with:
Exception in thread "main" java.io.IOException:
Failed on local exception: java.io.IOException: Couldn't setup connection for ldapuser@EXAMPLE.COM to /172.16.70.3:8020; Host Details : local host is: "USER-20161130SP/172.16.25.69"; destination host is: "c2bde03":8020;
... ...
Caused by: javax.security.sasl.SaslException: No common protection layer between c
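The root cause is visible in the SASL layer: Hadoop translates each hadoop.rpc.protection value into a SASL QOP token (authentication → auth, integrity → auth-int, privacy → auth-conf), and the connection is rejected when client and server advertise no common QOP. A minimal self-contained sketch of that negotiation check follows; the QOP mapping matches Hadoop's, but the helper class and method names are hypothetical, for illustration only:

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class QopNegotiation {
    // Mapping between hadoop.rpc.protection values and SASL QOP tokens.
    private static final Map<String, String> QOP = Map.of(
            "authentication", "auth",
            "integrity", "auth-int",
            "privacy", "auth-conf");

    // Hypothetical helper: returns the first QOP both sides support,
    // or null, which corresponds to the
    // "No common protection layer" SaslException seen above.
    public static String negotiate(List<String> client, List<String> server) {
        Set<String> serverQops = new LinkedHashSet<>();
        for (String s : server) {
            serverQops.add(QOP.get(s));
        }
        for (String c : client) {
            String qop = QOP.get(c);
            if (serverQops.contains(qop)) {
                return qop;
            }
        }
        return null; // no common protection layer -> connection fails
    }

    public static void main(String[] args) {
        // Client configured with "privacy", cluster with "authentication":
        System.out.println(negotiate(Arrays.asList("privacy"),
                                     Arrays.asList("authentication"))); // null -> failure
        // Matching settings negotiate successfully:
        System.out.println(negotiate(Arrays.asList("privacy"),
                                     Arrays.asList("privacy")));        // auth-conf
    }
}
```

Both sides may also list multiple values (e.g. "authentication,privacy"), in which case any shared QOP is enough for the connection to succeed.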
Source: hadoop.rpc.protection explained: building an efficient, secure HDFS connection with Kerberos, http://www.roclinux.cn/p/1770554163a3534970.html