You are curious, persistent, logical and clever – a true techie at heart. You enjoy living by the code of your craft and developing elegant solutions for complex problems. If this sounds like you, this could be the perfect opportunity to join EPAM as a DevOps/Big Data Solution Architect. Scroll down to learn more about the position’s responsibilities and requirements.
Our engineers work closely not only with operations, but also with development and analytics engineers. We build data pipelines for maximum efficiency, scalability and reliability, allowing domain-specific engineers to focus on their specialties. The ideal candidate will have experience as a Solution Architect in DevOps and/or Big Data.
Responsibilities
- Operate the client’s largest infrastructure, supporting millions of customers at double-digit petabyte scale
- Troubleshoot complex issues across the entire stack
- Design and develop automation frameworks to handle both development and production events at scale
- Advise other teams on technical direction
Requirements
- Knowledge of the Linux operating system (OS internals, networking, process level)
- Understanding of Big Data technologies (Hadoop, HBase, Spark, Kafka, Flume, Hive, etc.)
- Understanding of one or more object-oriented programming languages (Java, C++)
- Fluent in at least one scripting language (Shell, Python, Ruby, etc.)
- Strong verbal and written communication skills
- Passionate about being part of a tight-knit operations team
- Education: B.S. degree in Computer Science, or 3+ years of experience building data pipelines, or equivalent
Nice to have
- Java VM (JVM) performance tuning and optimization
- Mesos compute platforms and job schedulers