Sorry, but US citizens, Green Card holders, and EAD holders only, please.
If you are interested in this job, please submit your MS-Word resume to lana/at/capitalmarketsp.com
Your resume will not be shared with anyone without your consent
This role is responsible for designing, developing, and unit-testing large-scale data pipelines on the Hadoop platform in an Agile, team-based environment. This includes developing large-scale, event-driven Extract, Load, and Transform (ELT) applications. These high-volume ELT applications will consume complex financial data flows from JMS queues, XML, RDBMS result sets, and log files on the input side and load them into HDFS, HBase, or another appropriate Hadoop-based technology. You will also be responsible for writing highly performant analytics against billions of records in petabyte-scale datasets to extract maximum business value.
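For a flavor of the work, here is a minimal, hypothetical sketch of the kind of event-driven ingest described above: a JMS listener that lands each incoming message in HBase, written in Java against the standard javax.jms and HBase client APIs. The table name, column family, and row-key choice are illustrative assumptions, not details of this role.

import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical example only: consume events from a JMS queue and write them to HBase.
// Table name, column family, and row-key scheme are illustrative, not from this posting.
public class TradeEventIngestListener implements MessageListener {

    private final Connection hbase; // shared HBase connection, created and closed by the caller

    public TradeEventIngestListener(Connection hbase) {
        this.hbase = hbase;
    }

    @Override
    public void onMessage(Message message) {
        try (Table table = hbase.getTable(TableName.valueOf("trade_events"))) {
            String payload = ((TextMessage) message).getText();
            // Use the JMS message ID as an illustrative row key; a real pipeline would
            // derive a key from the event itself (e.g. order ID plus timestamp).
            Put put = new Put(Bytes.toBytes(message.getJMSMessageID()));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("raw"), Bytes.toBytes(payload));
            table.put(put);
        } catch (Exception e) {
            // In production, failed events would typically go to a dead-letter queue for replay.
            throw new RuntimeException("Failed to ingest event", e);
        }
    }
}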
– Build complex algorithms that implement state-of-the-art analytics on Hadoop
– Build large distributed systems that scale well
– Clearly and enthusiastically articulate the solutions you implement and why you implemented them
– Not only develop world-class algorithms but also debug and fix platform issues that arise in the open-source software being used
– Dive deep into specific Hadoop and non-Hadoop technologies and present your research findings to the team
– Work in a fast-changing environment and learn new technologies quickly
– Work with QA to develop test plans, test scripts and test environments and to understand and resolve defects
– Participate in code reviews, software design sessions, and architectural reviews
– Excellent knowledge of CS fundamentals, especially algorithms and distributed systems architecture
– Minimum of 6 months (1 year preferred) of development experience in a Hadoop environment
– Well-versed in Hadoop-related technologies such as HDFS, MapReduce, HBase, Flume, and similar tools
– Experience working with, and data modelling, very large datasets (billions of records per table and/or terabytes of overall information)
– Experience debugging and evaluating MapReduce code for efficiency and scalability
– Practical experience writing large, enterprise-class ETL/ELT programs that move massive amounts of information in parallel
– Strong understanding of scalability, performance, availability, caching, and synchronization techniques
– Sound analytical skills
– Experience in a modern programming language that runs on the JVM (Java, Scala, Clojure)
– Experience with multithreaded or event-driven programming in a non-blocking style
– Understanding of Build Tools (preferably Maven), Source Code Control (preferably SVN), Continuous Integration (TeamCity, Hudson or Jenkins)
– Thorough understanding of the SDLC.
– Experience in or familiarity with Agile methodologies and Behaviour-Driven Development
– Experience with financial data (FIX, SWIFT) preferred but not required
– Strong Linux fundamentals
– Bachelor's or Master's degree from an accredited college or university with a concentration in Computer Science (or equivalent work experience).