
The life of a data engineer at Merkle

Data plays an ever-growing role in all our lives. It's all around us. Take the cloud, for instance: it's becoming vital for nearly every large company, and data experts are needed in significant numbers to ensure it operates efficiently and effectively.

That's why I wanted to work as a data engineer at Merkle. I'm in an environment that's constantly developing and changing, and as the volume of data produced and collected rises, data engineers become increasingly essential to successful businesses.

The background to this is that companies can improve their performance by migrating their infrastructure to the cloud. In my role as a data engineer, I help to create that infrastructure, which allows analysts to do their job. Once the analysts have finished and built their machine learning models, the data engineering team steps in again to deploy and maintain the code in the cloud so the results can be delivered right to where they're needed.

There's also a big problem-solving element to the role of data engineer. I studied mechanical engineering, which gave me a good understanding of the issues that big data is causing within the field. Take the Airbus A350, for example: it has over 50,000 sensors on board, generating over 2.5 terabytes of data every day, most of which is never even used. Big data can have big benefits, but it can also have massive drawbacks. Storing and processing data on this level is expensive. Hence the need for analysts and data engineers - to gain insights from the data, and to operationalise this information so that companies can make decisions which could potentially improve safety, efficiency, and productivity.

Life as a data engineer

In my first week at Merkle, I was given the task of creating a practice analytics environment in Azure, as well as completing all the required learning modules on the Merkle university, which ensure that you know about GDPR and data protection laws and understand the Merkle culture.

The GDPR and data protection rules are detailed and essential to my role, given how important it is to keep people's personal information safe and properly looked after. The need for caution is instilled in everyone on the team. After completing the modules, I moved on to learning more about the Azure platform. At the beginning I was only using the Azure portal, Microsoft's web-based user interface, to set up the required infrastructure; it makes creating different resources quite simple.

Creating the resources is easy; tying them together in a compliant, secure manner is where the role presents an interesting challenge. To do this, I invested time in what any programmer does when they can't find the solution to a problem: I looked for the answers on Google. Then I'd quickly fix my code or infrastructure and put what I'd learned into action. If I couldn't find a specific answer online, I had the support of my team to help me understand the task.
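To make that concrete, here is a minimal sketch of creating resources programmatically rather than through the portal, assuming the Azure SDK for Python (the azure-identity, azure-mgmt-resource and azure-mgmt-storage packages) and a couple of the compliance-minded settings the team cares about. The names, region and settings are illustrative, not taken from any real client environment.

```python
# Minimal sketch: create a resource group and a storage account with
# secure-by-default settings via the Azure SDK for Python.
# Assumes you are already authenticated (e.g. `az login`) and have the
# azure-identity, azure-mgmt-resource and azure-mgmt-storage packages installed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
credential = DefaultAzureCredential()

resource_client = ResourceManagementClient(credential, subscription_id)
storage_client = StorageManagementClient(credential, subscription_id)

# Create (or update) a resource group to hold everything.
resource_client.resource_groups.create_or_update(
    "practice-analytics-rg",  # illustrative name
    {"location": "uksouth", "tags": {"env": "practice"}},
)

# Create a storage account, overriding portal defaults with stricter settings.
poller = storage_client.storage_accounts.begin_create(
    "practice-analytics-rg",
    "practiceanalyticssa01",  # storage account names must be globally unique
    {
        "location": "uksouth",
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
        "enable_https_traffic_only": True,  # no plain-HTTP access
        "minimum_tls_version": "TLS1_2",
    },
)
account = poller.result()
print(f"Created {account.name} in {account.location}")
```

The interesting part is rarely the individual calls; it's choosing settings like these consistently across every resource so the whole environment stays compliant.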

A supportive environment

Everyone is extremely nice, and they love to share their knowledge. On this initial task, people either already knew the answer and were able to help, or found someone else on the team who could. There were even occasions where they would point me to specific documentation to solve a complex problem, because they had done this so many times before.

After a while, I began to understand everything without needing to refer to good old Google. I had solved the task I had been given and was already onto another client project. I started to feel more confident about what I was doing around the Azure platform and started to use IaC (Infrastructure as Code). I have also started learning the PowerShell command-line interface, from which you can create any Azure resource and configure permissions directly. This led me on to investigating Bicep, Microsoft's own domain-specific language for declaratively defining Azure infrastructure, aimed at setting up multiple resources at once.
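As a rough illustration of the IaC idea, and sticking with Python rather than PowerShell or Bicep itself, the sketch below assumes a Bicep file has already been compiled to an ARM JSON template (for example with `az bicep build --file main.bicep`) and then deploys it as a single unit through the Azure SDK. The file names, parameters and resource group here are hypothetical.

```python
# Rough sketch: deploy a compiled Bicep/ARM template as one deployment
# using the Azure SDK for Python. Assumes azure-identity and
# azure-mgmt-resource are installed and `main.json` was produced by
# `az bicep build --file main.bicep`.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# Load the ARM JSON template produced from the Bicep file.
with open("main.json") as f:
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    "practice-analytics-rg",   # target resource group (illustrative)
    "practice-deployment",     # deployment name (illustrative)
    {
        "properties": {
            "mode": "Incremental",  # add/update resources, don't remove others
            "template": template,
            "parameters": {"environment": {"value": "practice"}},
        }
    },
)
result = poller.result()
print(f"Deployment {result.name}: {result.properties.provisioning_state}")
```

The appeal of this approach over clicking through the portal is that the whole environment lives in version control and can be recreated, reviewed and audited like any other code.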

I have learned something new every day, and although some days have been stressful, I love the team, and I am extremely happy to be a part of it. I have gained a thorough understanding of the Azure platform and I am looking to pass the Azure certification exams soon.

Since starting at Merkle, every day has been different. Learning about how different companies implement various solutions is fascinating, since you discover a whole range of ways of solving the same problem using different tools and techniques, which in turn has improved my depth of knowledge. As businesses come to rely increasingly on their data infrastructure, that knowledge is only going to become more valuable over time, and I'd strongly advise anyone who likes the sound of data engineering as a job to contact Merkle's recruitment team to discuss the available roles in analytics.