The candidate will be responsible for designing and delivering scalable, resilient hybrid and cloud-based data solutions that support critical financial market clearing and risk activities; helping to drive the strategy of transforming the enterprise into a data-driven organization; and leading through innovative strategic thinking in building data solutions.
Primary Duties and Responsibilities:
To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a member of the Data team, you will be responsible for designing and building large cloud-based data systems that will serve as the backbone for the enterprise's data management and analytics capabilities. You will join the core team responsible for their design, development, and implementation, working closely with business and technology partners, both internal and external.

Together we will define the system architecture, the technology stack, and its tactical implementation. You will help us take on the unique technical challenges of handling large datasets and managing streaming data in public cloud and hybrid environments: building large, complex data pipelines; integrating data from diverse sources and formats; implementing continuous integration/continuous delivery pipelines; automating everything we can get our hands on; and taking part in implementing the API-first design principle. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.
- BS degree in Computer Science, a similar technical field, or equivalent experience.
- 5+ years of technical experience building data-centric solutions within regulated industries.
- 2+ years of solution design and architecture experience.
- Experience working with cloud ecosystems (AWS, Azure, GCP).
- Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, Jenkins, Docker, Nexus/Artifactory, and CI/CD pipelines.
- Good understanding of data integration patterns, technologies, and tools.
- Knowledge of SQL, data warehousing design concepts, and various data management systems (structured and semi-structured), and experience integrating with various database technologies (relational, NoSQL).
- Hands-on development experience with multiple programming languages such as Python, Java, and Scala.
- Hands-on experience designing and implementing RESTful APIs.
- Familiarity with Big Data processing technologies and frameworks such as Presto, Hadoop, MapReduce, Spark.
- Familiarity with Stream processing technologies and frameworks such as Kafka, Spark Streaming, Flink.
- Understanding of the software development life cycle (SDLC) in Waterfall, Lean, and Agile work environments.
- Excellent oral and written communication skills.
- Strong analytical and problem-solving skills.
- Desire and ability to learn other programming languages.
- Master’s degree or equivalent experience.
- Hands-on experience with BI tools, such as Tableau, Microsoft PowerBI, or Qlik.
- Good understanding of data integration technologies and tools such as Pentaho, Talend or IBM DataStage.
- Experience working with Apigee Edge or a similar API management solution.
- Financial markets work experience, knowledge of trade and settlement lifecycle.
- Understanding of data governance and master data management processes.
Certificates or Licenses:
- AWS Certified Solutions Architect (Associate) certification is a plus.
When you find a position you're interested in, click the 'Apply' button. Please complete the application and attach your resume.
You will receive an email notification to confirm that we've received your application.
If you are called in for an interview, a representative from OCC will contact you to set up a date, time, and location.
OCC is an Equal Opportunity Employer