Glenn Partners Staffing Solution




Job ID: 1405
Job Title: Spark/Hadoop Developer (FTE)
Permanent/Contract: Permanent
Travel (%): 0
Date Posted: 7/27/2018
City: Toronto
State/Province: ON
Country: Canada
Job Description: A bank in downtown Toronto is looking for a full-time Programmer Analyst at the Advisory level. The Programmer Analyst, Advisory is responsible for the effective analysis, design, development, implementation, and support of regulatory/compliance reporting systems supporting the GBM business line. Reporting to the Manager, Development, the candidate will be involved in all phases of the SDLC and will interact with various stakeholders, including Business Analysts, business users, vendors, Infrastructure, and the Production Support Team. The incumbent will influence the technological solution with respect to architecture, design, and development, and will leverage common components and enterprise technologies as part of delivering the solution.

Key Accountabilities
• Design and develop scalable applications to meet business objectives by analyzing user requirements, providing technical specifications and designs, and developing/maintaining programs according to best practices and quality standards. The incumbent must ensure that programs and applications meet high availability, integrity, and reliability requirements, and utilize the CIAD process.
• Provide production and user acceptance testing support for assigned applications by identifying, evaluating, escalating, and resolving problems.
• Implement new systems or enhancements by establishing and executing system test procedures, developing implementation plans, producing the required program and system documentation, and ensuring that all functionality is delivered as required. The incumbent is also required to provide post-implementation support and training.
• Implement ETL processes to ingest large data sets from multiple data sources.
• Keep current on rapidly changing technological trends and new technologies, and maintain an understanding of the Division's business and technology strategies.
• Leverage best practices in continuous integration and delivery.
• Help drive transformation by continuously looking for ways to automate existing processes, test, and optimize data quality.
Job Requirements

Key Experience
• At least 5-7 years of strong experience developing and supporting applications in a Unix/Linux environment, including shell scripting.
• Advanced knowledge and hands-on experience building solutions using the Spark ecosystem (e.g., Spark Core, Spark SQL, Spark Streaming).
• Practical experience building solutions leveraging the Hadoop ecosystem (e.g., HDFS, YARN, Airflow).
• Experience writing clean and concise code in Scala, Python, and Java 8.
• Good knowledge of RDBMSs (Postgres, Oracle, etc.).
• Experience with continuous integration and delivery.
• Experience with risk management and derivatives applications (e.g., interest rate, equity, and credit derivatives) would be an asset.

Education
• A degree in computer science, math, or an engineering-related discipline is desirable, along with 5-7 years of practical experience.
Upload your resume using the link below, and remember to list the Job ID(s).
Copyright © 2008 Glenn Partners Staffing Solution Inc. postmaster@glennpartners.com