Data Integration Engineer Kazakhstan
Job #: 66100
We are looking for a Data Integration Engineer to make our team even stronger! As a member of the team, you will have the chance to learn the most modern technologies, prove your proficiency on challenging project engagements, and be recognized as a world-class specialist.
What You’ll Do
- Design and implement data integration solutions, model databases, and build enterprise data platforms using classic data technologies and tools (databases, ETL/ELT tools, MDM tools, etc.), as well as modern cloud or hybrid data solutions
- Work with the project tech lead/architect to understand data product requirements, estimate new features, and develop the corresponding solution components
- Perform detailed analysis of business problems and technical environments and use this in designing high-quality technical solutions
- Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
- Support a high-performance engineering culture
- Write project documentation, participate in project meetings, meet with customers, and conduct demos to present the results of completed work
What You Have
- At least 1 year of relevant development experience and practice with data integration, data management, data storage, data modeling, and database design
- Some experience working with public cloud providers (AWS, Azure, GCP)
- General knowledge of leading cloud data warehousing solutions (Google BigQuery, Snowflake, Redshift, Azure Synapse Analytics, etc.)
- Production coding experience in one of the data-oriented programming languages
- Outstanding analytical and problem-solving skills
- Ability to play a developer role on a project and ensure that delivered solutions meet product requirements
- Experience working with modern Agile development methodologies and tools
Technologies
- Cloud provider stacks (AWS/Azure/GCP): storage, compute, networking, identity and security, data warehousing and DB solutions (Redshift, Snowflake, BigQuery, Azure Synapse, etc.)
- Standard Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
- Coding in one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash, Scala
- Relational Databases (RDBMS: MS SQL Server, Oracle, MySQL, PostgreSQL)
- Dataflow orchestration, replication and preparation tools
- Version Control Systems (Git, SVN)
- Testing: component/integration testing, reconciliation
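To give candidates a feel for the kind of work listed above, here is a minimal extract-transform-load sketch in Python using only the standard library. The records, table, and column names are purely hypothetical; real projects would use the tools named in this posting (ADF, Glue, NiFi, PySpark, etc.) rather than hand-rolled pipelines:

```python
import sqlite3

# Hypothetical source records, as might arrive from an extract step
raw_orders = [
    {"order_id": "1", "amount": "19.99", "country": "kz"},
    {"order_id": "2", "amount": "5.00", "country": "KZ"},
    {"order_id": "3", "amount": "", "country": "us"},  # missing amount
]

def transform(rows):
    """Clean and normalize records; skip rows with no amount."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # a reconciliation step would log/report dropped rows
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "country": row["country"].upper(),
        })
    return cleaned

def load(rows, conn):
    """Load cleaned records into a target table (idempotent upsert)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount, :country)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

The same extract/clean/load shape recurs at scale in the orchestration and replication tools above; idempotent loads (here, `INSERT OR REPLACE`) are what make re-running a failed pipeline safe.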
We Offer
- Outstanding career development opportunities
- Knowledge-sharing with colleagues all around the world
- Unlimited access to learning courses (LinkedIn Learning, EPAM training courses, regular English classes, internal library)
- Community of 43,500+ of the industry’s top professionals
- Regular assessments and salary reviews
- Competitive compensation
- Friendly team and enjoyable working environment
- Social package – medical & family care
- Flexible working schedule
- Corporate and social events