Senior Data Engineer
We Dream. We Do. We Deliver.
As a full-service, data-driven customer experience transformation partner, we work with Top 500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created out of a merger of Namics and Isobar - two leading full-service digital agencies.
Our 1200+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.
- Use CI/CD tools to facilitate deployment of code to stage and production environments.
- Participate in the architecture of end-to-end solutions for our customers on AWS, Azure and other cloud platforms.
- Maintain Git repositories using the Gitflow workflow.
- Collaborate on feature deliverables to meet milestones and quality expectations.
- Communicate with stakeholders, vendors and technology subject matter experts.
- Document implemented logic in a structured manner using Confluence; plan your activities using Agile methodology in Jira.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, like optimizing existing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience in building and productionizing public cloud infrastructure.
- Experience using GitHub, Bitbucket or another code repository solution.
- Experience in setting up and using CI/CD automation tools like GitHub Actions, Azure DevOps or AWS CodePipeline.
- Experience with infrastructure-as-code frameworks like Terraform, AWS CloudFormation or ARM templates.
- Understanding of containerization concepts (Docker) and container orchestration services like Fargate and Kubernetes.
- Experience with scripting languages like Python, Bash, PowerShell etc.
- Strong analytical skills related to working with structured and unstructured datasets.
- You will do great in this role if you are precise, well organized, have good communication skills, can adapt to changing circumstances and are not afraid to take responsibility for your work.
- Understanding of data concepts and patterns: big data, data lakes, lambda architecture, stream processing, DWH, BI & reporting.
- Experience with data pipeline / workflow management tools like dbt, AWS Step Functions, AWS Glue, Azure Data Factory, Airflow.
- Knowledge of SQL.
With us, you will become part of:
- An international, amazing team where you can gain new and relevant experience
- A dynamic and supportive environment where you will never fall into a routine
- Opportunities to grow in line with your skills and interests
- Start-up agile atmosphere
- Friendly international team of creative minds
We, obviously, offer even more:
- Brand new offices in Prague and Brno centre with great accessibility
- Laptop and an international phone tariff, even for your private use
- Cafeteria of benefits to choose from – life insurance, pension insurance, Edenred Cafeteria and more are coming
- 5 weeks of paid vacation (25 days)
- Medical advisory system – ulekare.cz
- Well-being benefit - Soulmio
- We value self-education and learning new technologies, so we support all our team members in obtaining new certifications and attending tutorials, conferences, etc.
- Flexible working hours
- 3 Wellness days
- We hold regular employee breakfasts, enjoy beer and wine together, and create plenty of opportunities to get together for those who enjoy life, not only work.