How do I run Spark Job Server as a service connected to a Spark standalone master (spark://master:7077) instead of local[*]?
Spark installation and configuration steps:
1) Java (version 1.8)
2) Spark (version 1.6.2; also tried with the latest version)
3) Scala (version 2.10.5)
Spark started with: ./sbin/start-all.sh
Job server started with: server_start.sh
1. The Spark master UI is now on port 8080 and the worker UI on port 8081.
2. Spark Job Server is running on port 8090.
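For reference, the master URL is set in the job server's deployment configuration file. A minimal sketch of that file, assuming typical job-server .conf conventions (the file name, core counts, and memory value here are placeholders, not our actual settings):

```
# sketch of a job-server deployment .conf (assumed values)
spark {
  # point the job server at the standalone master instead of local[*]
  master = "spark://master:7077"

  jobserver {
    port = 8090
  }

  # resources given to each context the job server creates
  context-settings {
    num-cpu-cores = 2
    memory-per-node = "512m"
  }
}
```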
When we start Spark Job Server via SBT with the master URL set to local[*], everything works fine. But when we replace local[*] with spark://master:7077 and test with our own WordCount example, it does not. The commands used for the WordCount example are:
a) Upload the jar with curl:

curl --data-binary @SparkTest.jar localhost:8090/jars/JarTest
The jar uploads to Spark Job Server successfully.
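For context, classes run through Spark Job Server implement its job API rather than a plain main() method, so the server can capture and return the result. A rough sketch of such a WordCount, assuming the spark-jobserver 0.6.x SparkJob trait (our actual Java class may be structured differently):

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object WordCount extends SparkJob {
  // called before runJob; accept the config if input.filename is usable
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid

  // the return value is what the REST API hands back as the job result
  override def runJob(sc: SparkContext, config: Config): Any = {
    val file = config.getString("input.filename")
    sc.textFile(file)
      .flatMap(_.split("\\s+"))
      .map((_, 1))
      .reduceByKey(_ + _)
      .collect()
      .toMap
  }
}
```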
b) Run the WordCount job with curl:

curl -d "input.filename = data.txt" 'localhost:8090/jobs?appName=JarTest&classPath=com.pro.WordCount'
The job appears on the Spark UI (port 8080), but its status stays in RUNNING forever and no result ever comes back in the response.
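For reference, the job server can be asked to return the result synchronously, or polled by job id; a sketch assuming its standard sync/timeout query parameters and the /jobs/&lt;jobId&gt; route (the job id placeholder comes from the submit response):

```
# run synchronously and wait for the result (timeout in seconds)
curl -d "input.filename = data.txt" \
  'localhost:8090/jobs?appName=JarTest&classPath=com.pro.WordCount&sync=true&timeout=60'

# or poll an asynchronously submitted job by its jobId
curl 'localhost:8090/jobs/<jobId>'
```

In our case even polling never shows a state other than RUNNING.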
c) I also tried installing a Livy server on my local machine, with Spark running on a remote server (UI on port 8080). I submitted a Spark job from a Java program through the Livy gateway, but again got no result back after the job was submitted to the master spark://master:7077.
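The Livy attempt can also be reproduced from the command line; a sketch assuming Livy's default port 8998 and its /batches REST route (the jar path and batch id are placeholders):

```
# submit the jar as a Livy batch job
curl -X POST -H 'Content-Type: application/json' \
  -d '{"file": "/path/to/SparkTest.jar", "className": "com.pro.WordCount"}' \
  http://localhost:8998/batches

# check the batch state (the batch id comes from the submit response)
curl http://localhost:8998/batches/<batchId>
```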
I want to execute our jobs in an interactive session, i.e. within the same shared Spark context, but every one of these approaches has failed against the standalone master. Where might our configuration be wrong?