

Published April 16, 2024 (author: asp在线编程)

Spark Installation Guide

(Installed under the hadoop user on all three virtual machines)

I. Install Scala 2.11.4

1. Upload the Scala 2.11.4 archive to /home/hadoop/local/opt/scala

2. Extract it into the current directory

3. Edit the ~/.bashrc file and add the SCALA_HOME environment variable:

export SCALA_HOME=/home/hadoop/local/opt/scala/scala-2.11.4

export PATH=$PATH:$SCALA_HOME/bin

4. Copy the scala folder and the .bashrc file to slave1 and slave2:

scp -r scala hadoop@192.168.1.132:/home/hadoop/local/opt/

scp -r scala hadoop@192.168.1.133:/home/hadoop/local/opt/

scp ~/.bashrc hadoop@192.168.1.132:/home/hadoop/.bashrc

scp ~/.bashrc hadoop@192.168.1.133:/home/hadoop/.bashrc

5. Apply the environment variables on all three nodes: source ~/.bashrc
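Steps 2 and 3 above can be sketched end to end. The version below is self-contained: it uses a scratch directory and a stand-in for ~/.bashrc so it runs anywhere; on the real master, the base directory is /home/hadoop/local/opt/scala and the file edited is ~/.bashrc. The archive name scala-2.11.4.tgz is an assumption — use whatever file you actually uploaded.

```shell
# Sketch of steps 2-3: extract the archive, then register SCALA_HOME.
# A dummy tarball stands in for the uploaded scala-2.11.4 archive.
BASE=$(mktemp -d)
mkdir -p "$BASE/scala-2.11.4/bin"                       # stand-in distribution
tar -czf "$BASE/scala-2.11.4.tgz" -C "$BASE" scala-2.11.4
rm -r "$BASE/scala-2.11.4"

cd "$BASE"
tar -xzf scala-2.11.4.tgz                               # step 2: extract in place

RC="$BASE/bashrc"                                       # stand-in for ~/.bashrc
cat >> "$RC" <<EOF
export SCALA_HOME=$BASE/scala-2.11.4
export PATH=\$PATH:\$SCALA_HOME/bin
EOF
. "$RC"                                                 # same effect as source ~/.bashrc
```

On the cluster, repeat the source step on each node after the scp copies in step 4.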

II. Install Spark (spark-1.4.0-bin-hadoop2.4)

1. Upload spark-1.4.0-bin-hadoop2.4 to /home/hadoop/local/opt/

2. Extract it into the current directory

3. Edit the ~/.bashrc file and add the SPARK_HOME environment variable:

export SPARK_HOME=/home/hadoop/local/opt/spark-1.4.0-bin-hadoop2.4

export PATH=$PATH:$SCALA_HOME/bin:$SPARK_HOME/bin

4. Go into Spark's conf directory ($SPARK_HOME/conf):

(1) Run vi slaves and add the worker nodes (one per line; these are the two slaves used throughout this guide):

192.168.1.132

192.168.1.133

(2) Edit spark-env.sh and add the following:

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.55.x86_64

export SCALA_HOME=/home/hadoop/local/opt/scala/scala-2.11.4

export SPARK_MASTER_IP=192.168.1.131

export SPARK_WORKER_MEMORY=2g

export HADOOP_CONF_DIR=/home/hadoop/local/opt/hadoop-2.4.1/etc/hadoop

Here, HADOOP_CONF_DIR is the Hadoop configuration directory, SPARK_MASTER_IP is the master host's IP address, and SPARK_WORKER_MEMORY is the maximum memory a worker may use.

5. Copy the spark-1.4.0-bin-hadoop2.4 folder and the .bashrc file to slave1 and slave2:

scp -r spark-1.4.0-bin-hadoop2.4 hadoop@192.168.1.132:/home/hadoop/local/opt/

scp -r spark-1.4.0-bin-hadoop2.4 hadoop@192.168.1.133:/home/hadoop/local/opt/

scp ~/.bashrc hadoop@192.168.1.132:/home/hadoop/.bashrc

scp ~/.bashrc hadoop@192.168.1.133:/home/hadoop/.bashrc

Apply the environment variables on all three nodes: source ~/.bashrc
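Once the variables are in effect everywhere, a typical follow-up (not part of the original steps) is to start the standalone cluster from the master and check that it came up. This assumes passwordless SSH from the master to slave1 and slave2, which the scp steps above already imply:

```shell
# Start the standalone Master here and a Worker on every host listed in
# conf/slaves, then list the running Java processes.
$SPARK_HOME/sbin/start-all.sh
jps    # the master node should show a Master process; each slave a Worker
# The master's web UI is served on http://192.168.1.131:8080 by default.
```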

