How do I run Spark Job Server as a service connected to the Spark master rather than as local[*]?

 
Greenhorn
Posts: 1
How do I run Spark Job Server as a service that connects to the Spark master instead of running as local[*]?
Spark configuration steps:

       Installation steps:
   1) Java (version 1.8)
   2) Spark (version 1.6.2; also tried the latest version)
   3) Scala (version 2.10.5)

       Spark is started with ./sbin/start-all.sh
       The job server is started with server_start.sh

       1. The Spark master UI is now running on port 8080 and the worker UI on port 8081.
       2. The Spark Job Server is running on port 8090.
       When we start the Spark Job Server via SBT with the master URL set to local[*], everything works fine. But when we replace local[*] with spark://master:7077 and test it with our own WordCount example, the problems below appear.
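
For reference, in a standard Spark Job Server deployment the master URL is not passed to server_start.sh on the command line; it is read from the environment config file that server_start.sh picks up (for example config/<env>.conf). The sketch below follows the stock template, so the exact file name and settings are assumptions to adapt to your own deployment:

       # <env>.conf read by server_start.sh (layout assumed from the stock template)
       spark {
         # point the job server at the cluster master instead of local[*]
         master = "spark://master:7077"

         jobserver {
           port = 8090
         }

         # resources each job context may claim on the cluster
         context-settings {
           num-cpu-cores = 2
           memory-per-node = 512m
         }
       }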

The commands used for the WordCount example are:

    a) Upload the jar with curl:
       curl --data-binary @SparkTest.jar localhost:8090/jars/JarTest

       The jar is uploaded to the Spark Job Server successfully.
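
As far as I know, the upload can be double-checked with a GET against the same endpoint, which lists the uploaded app names and upload times:

       curl localhost:8090/jars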
   
    b) Try to execute the WordCount program with curl:
       curl -d "input.filename = data.txt" 'localhost:8090/jobs?appName=JarTest&classPath=com.pro.WordCount'

       The job can be seen running in the Spark UI (port 8080), but its status stays in 'RUNNING' and we never get the result back in the response.
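
For what it is worth, the job server can only hand a result back in the response if the submitted class implements its SparkJob trait and the value returned from runJob is serializable. Below is a rough sketch of what com.pro.WordCount might look like under that old (Spark 1.x era) API; only the class name and the input.filename parameter are taken from the commands above, the rest is an assumption:

       package com.pro

       import com.typesafe.config.Config
       import org.apache.spark.SparkContext
       import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

       import scala.util.Try

       object WordCount extends SparkJob {

         // reject the job early if the expected config parameter is missing
         override def validate(sc: SparkContext, config: Config): SparkJobValidation =
           Try(config.getString("input.filename"))
             .map(_ => SparkJobValid)
             .getOrElse(SparkJobInvalid("missing input.filename"))

         // whatever runJob returns is what the REST response (or a later GET /jobs/<id>) carries back
         override def runJob(sc: SparkContext, config: Config): Any = {
           val path = config.getString("input.filename")
           sc.textFile(path)
             .flatMap(_.split("\\s+"))
             .countByValue()   // Map[String, Long], small enough to return to the caller
         }
       }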
   
       c) I have also tried installing a Livy server on my local system, with Spark running on the remote server (UI on port 8080). I submit the Spark job from a Java program on my machine through the Livy gateway, but I never get the result back after the job is submitted to the master at spark://master:7077.
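
For comparison, a plain Livy submission goes against Livy's REST API (default port 8998): POST the jar as a batch and then poll the batch separately. The host, port, and HDFS path below are assumptions about your setup:

       # submit the jar as a batch (the jar must be reachable by the cluster, e.g. on HDFS)
       curl -H "Content-Type: application/json" \
            -d '{"file": "hdfs:///jobs/SparkTest.jar", "className": "com.pro.WordCount"}' \
            livy-host:8998/batches

       # poll the batch state, using the id returned by the POST above;
       # a batch only reports state, the program's own output ends up in the Spark logs
       curl livy-host:8998/batches/0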
   
       I want to execute our jobs in an interactive session (within the same Spark context), but every one of these approaches has failed. Please help us figure out where our configuration might be wrong.
 