Script started on 2021-06-03 17:57:00+0000
EEEEEEEEEEEEEEEEEEEE MMMMMMMM MMMMMMMM RRRRRRRRRRRRRRR
E::::::::::::::::::E M:::::::M M:::::::M R::::::::::::::R
EE:::::EEEEEEEEE:::E M::::::::M M::::::::M R:::::RRRRRR:::::R
E::::E EEEEE M:::::::::M M:::::::::M RR::::R R::::R
E::::E M::::::M:::M M:::M::::::M R:::R R::::R
E:::::EEEEEEEEEE M:::::M M:::M M:::M M:::::M R:::RRRRRR:::::R
E::::::::::::::E M:::::M M:::M:::M M:::::M R:::::::::::RR
E:::::EEEEEEEEEE M:::::M M:::::M M:::::M R:::RRRRRR::::R
E::::E M:::::M M:::M M:::::M R:::R R::::R
E::::E EEEEE M:::::M MMM M:::::M R:::R R::::R
EE:::::EEEEEEEE::::E M:::::M M:::::M R:::R R::::R
E::::::::::::::::::E M:::::M M:::::M RR::::R R::::R
EEEEEEEEEEEEEEEEEEEE MMMMMMM MMMMMMM RRRRRRR RRRRRR
[hadoop@ip-10-15-1-140 scripts]$ ./spark_swc_lc.sh
++ clear
++ POWERSCALE_HDFS=hdfs://10.1.1.15:8020
++ klist
./spark_swc_lc.sh: line 6: klist: command not found
++ sudo -u hdfs hdfs dfs -rm -skipTrash -r hdfs://10.1.1.15:8020/user/hdfs/spark
Deleted hdfs://10.1.1.15:8020/user/hdfs/spark
++ sudo -u hdfs hdfs dfs -mkdir -p hdfs://10.1.1.15:8020/user/hdfs/spark
++ sudo -u hdfs hdfs dfs -put /etc/os-release hdfs://10.1.1.15:8020/user/hdfs/spark/
++ cat
++ sudo -u hdfs spark-shell -i /tmp/spark_line_word_count.scala --conf spark.yarn.access.hadoopFileSystems=hdfs://10.1.1.15:8020 --conf
'spark.driver.args=/user/hdfs/spark/os-release hdfs://10.1.1.15:8020/user/hdfs/spark/output'
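The `/tmp/spark_line_word_count.scala` file is written by the script's `cat` heredoc and its contents are not shown in this capture. A plausible minimal sketch of what it contains, inferred from the `output-wc` and `output-lc` results below (the exact job logic, variable names, and use of `System.exit` are assumptions; it requires a running Spark cluster):

```scala
// Hypothetical reconstruction -- the actual Scala file is not shown in the transcript.
// Input and output paths arrive via the spark.driver.args property set on the
// spark-shell command line.
val args = sc.getConf.get("spark.driver.args").split(" ")
val inputPath = args(0)
val outputPath = args(1)

val lines = sc.textFile(inputPath)

// Word count: (word, 1) pairs reduced by key -- matches the (word,count)
// pairs shown in output-wc.
lines.flatMap(_.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
  .saveAsTextFile(outputPath + "-wc")

// Line count: a single total written as text -- matches the "9" in output-lc.
sc.parallelize(Seq(lines.count))
  .saveAsTextFile(outputPath + "-lc")

// Exit so spark-shell returns control to the calling script.
System.exit(0)
```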
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/06/03 17:57:27 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Spark context Web UI available at http://ip-10-15-1-140.ec2.internal:4040
Spark context available as 'sc' (master = yarn, app id = application_1622731208564_0008).
Spark session available as 'spark'.
[Stage 0:> (0 + 2) / 2]
++ sudo -u hdfs hdfs dfs -ls -R hdfs://10.1.1.15:8020/user/hdfs/spark/
-rw-r--r-- 3 root Administrators 212 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/os-release
drwxr-xr-x - root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc
-rw-r--r-- 3 root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/_SUCCESS
-rw-r--r-- 3 root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/part-00000
-rw-r--r-- 3 root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/part-00001
-rw-r--r-- 3 root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/part-00002
-rw-r--r-- 3 root Administrators 2 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/part-00003
drwxr-xr-x - root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-wc
-rw-r--r-- 3 root Administrators 0 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-wc/_SUCCESS
-rw-r--r-- 3 root Administrators 126 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-wc/part-00000
-rw-r--r-- 3 root Administrators 142 2021-06-03 17:57 hdfs://10.1.1.15:8020/user/hdfs/spark/output-wc/part-00001
++ sudo -u hdfs hdfs dfs -cat 'hdfs://10.1.1.15:8020/user/hdfs/spark/output-wc/*'
(NAME="Amazon,1)
(Linux",1)
(ID_LIKE="centos,1)
(2",1)
(ID="amzn",1)
(Linux,1)
(CPE_NAME="cpe:2.3:o:amazon:amazon_linux:2",1)
(PRETTY_NAME="Amazon,1)
(fedora",1)
(VERSION="2",1)
(rhel,1)
(VERSION_ID="2",1)
(ANSI_COLOR="0;33",1)
(HOME_URL="https://amazonlinux.com/",1)
++ sudo -u hdfs hdfs dfs -cat 'hdfs://10.1.1.15:8020/user/hdfs/spark/output-lc/*'
9
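The line count of 9 matches the number of lines in a typical Amazon Linux 2 `/etc/os-release`. The same two results can be reproduced locally without Spark; a minimal sketch, assuming sample file contents reconstructed from the word-count pairs above:

```shell
# Recreate a sample os-release (contents assumed from the output-wc pairs above)
# and reproduce both Spark jobs' results with standard shell tools.
cat > /tmp/os-release.sample <<'EOF'
NAME="Amazon Linux"
VERSION="2"
ID="amzn"
ID_LIKE="centos rhel fedora"
VERSION_ID="2"
PRETTY_NAME="Amazon Linux 2"
ANSI_COLOR="0;33"
CPE_NAME="cpe:2.3:o:amazon:amazon_linux:2"
HOME_URL="https://amazonlinux.com/"
EOF

# Line count -- mirrors the contents of output-lc
wc -l < /tmp/os-release.sample

# Word count as (word,count) pairs -- mirrors the format of output-wc
tr -s ' ' '\n' < /tmp/os-release.sample | sort | uniq -c |
  awk '{printf "(%s,%s)\n", $2, $1}'
```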
++ true
[hadoop@ip-10-15-1-140 scripts]$ exit
exit
Script done on 2021-06-03 17:58:40+0000