Senior DevOps Data Engineer (Kafka)
Description
Job #: 68543
Selected by Newsweek as a 2021 and 2022 Most Loved Workplace, EPAM's global multi-disciplinary teams serve customers in more than 50 countries across six continents. As a recognized leader, EPAM is listed among the top 15 companies in Information Technology Services on the Fortune 1000 and ranked four times as the top IT services company on Fortune's 100 Fastest Growing Companies list. EPAM is also listed among Ad Age's top 25 World's Largest Agency Companies for three consecutive years, and Consulting Magazine named EPAM Continuum a top 20 Fastest Growing Firm.
Learn more at www.epam.com and follow EPAM on Twitter and LinkedIn.
Are you passionate about DevOps and Big Data technologies like Kafka? Are you skilled at building and deploying complex solutions in large-scale BI data processing systems? Do you enjoy developing complex components in Java?
EPAM is looking for a Senior DevOps Data Engineer to support one of our longstanding clients in building the Strategic Data Exchange between lending core systems and surrounding systems. The team focuses primarily on data delivery to internal lending applications, to the wholesale bank data lake, and for regulatory purposes.
If you want to work in a collaborative environment alongside talented colleagues in a global team, check out the opportunity below for this hybrid role within DevOps and Data, and join EPAM’s Cloud & Data team.
Responsibilities
- Develop data pipelines using the Kafka Streams API (see the sketch after this list)
- Design and develop scalable batch and real-time data processing, and contribute to continuous deployment of the platform
- Help to create infrastructure for a Kafka cluster and related components
- Ensure system stability and observability with logging and monitoring
- Design and build a scalable, low-latency, fault-tolerant streaming data platform
- Work closely with business and technology stakeholders to build the next generation of distributed streaming data pipelines and analytics data stores using streaming frameworks
- Build a platform that facilitates gathering and collecting data, storing it, performing real-time processing on it, and serving it to end users and decision-making systems
- Maintain an ongoing understanding of emerging data management technologies, industry trends and best practices
- Peer-review other developers' technical work from time to time
- Identify ways to improve data reliability, efficiency and quality
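As an illustration of the pipeline work described above, here is a minimal Kafka Streams sketch in Java. The topic names (lending-core-events, data-lake-curated) and the filtering logic are hypothetical placeholders, not the client's actual data exchange:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class LendingDataPipeline {
    public static void main(String[] args) {
        // Basic Streams configuration; application id and broker address are placeholders
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "lending-data-exchange");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Read raw events, drop empty records, and route the rest to a curated topic
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("lending-core-events");
        source.filter((key, value) -> value != null && !value.isEmpty())
              .mapValues(String::trim)
              .to("data-lake-curated");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```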
Requirements
- At least 9 years of relevant data engineering experience in a complex, corporate environment
- Strong Java 8 or higher backend development experience
- Track record of building large corporate systems
- Solid experience with the Kafka Streams API; experience running and managing a Kafka cluster and related components (see the sketch after this list)
- Hands-on experience with Kafka, Schema Registry and Kafka Connect, using the Confluent platform
- CI/CD tooling like Azure DevOps, Maven, Checkmarx, Git, Ansible
- Linux (Bash) scripting capabilities
- Agile mindset, self-organization and craftsmanship
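On the cluster-management side, here is a minimal sketch using Kafka's standard AdminClient to inspect broker and topic state; the broker address is a placeholder, not the client's environment:

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;
import org.apache.kafka.common.Node;

import java.util.Collection;
import java.util.Properties;

public class ClusterHealthCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Describe the cluster: id, broker count, and current controller
            DescribeClusterResult cluster = admin.describeCluster();
            Collection<Node> nodes = cluster.nodes().get();
            System.out.printf("Cluster %s: %d broker(s), controller %s%n",
                    cluster.clusterId().get(), nodes.size(), cluster.controller().get());

            // List topic names visible to this client
            admin.listTopics().names().get().forEach(System.out::println);
        }
    }
}
```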
Nice to have
- Oracle Database 12c or higher
- Logging and monitoring with Grafana, Elastic, Kibana, Prometheus or Logstash
- Issue trackers like JIRA, ServiceNow
- Collaboration tooling like Confluence
We offer
- Competitive compensation
- 26 paid holiday days
- Reimbursement of commuting costs
- Option to participate in our Pension Plan scheme or receive Health Insurance compensation
- Discretionary performance-based bonus
- Laptop and corporate SIM card
- Annual Salary Review
- Training, internal education and Dutch language courses
- Unlimited access to LinkedIn Learning solutions
- Relocation Package
- Regular Corporate and Social Events
- The opportunity to be part of a diverse and multicultural company
- EPAM Employee Stock Purchase Plan (ESPP) (subject to certain eligibility requirements)