
Senior Data Engineer/Developer

Location: Remote, AR.
Company Description:

Merkle is a leading data-driven, technology-enabled, global performance marketing agency that specializes in the delivery of unique, personalized customer experiences across platforms and devices. For more than 30 years, Fortune 1000 companies and leading nonprofit organizations have partnered with Merkle to maximize the value of their customer portfolios. The agency’s heritage in data, technology, and analytics forms the foundation for its unmatched skills in understanding consumer insights that drive people-based marketing strategies. Its combined strengths in performance media, customer experience, customer relationship management, loyalty, and enterprise marketing technology drive improved marketing results and competitive advantage. With 9,600+ employees, Merkle is headquartered in Columbia, Maryland, with 50+ additional offices throughout the US, EMEA, and APAC. In 2016, the agency joined the Dentsu Aegis Network. For more information, contact Merkle at 1-877-9-Merkle or visit www.merkleinc.com.

Job Description:

We are looking for a talented Data Engineer with big data processing experience to join our Agile development team in building products that support Merkle's people-based marketing vision using the latest data and cloud technologies. You will work with a team of passionate, experienced, nimble, and goal-oriented engineers who solve complex problems and build critical data analysis and reporting features. You will embrace change, and rapidly build, test, and scale solutions that drive incremental business value for our customers and partners. We're looking for smart, enthusiastic, driven individuals who are eager to contribute to our world-class solutions.

Key Responsibilities

  • Design, implement, and deploy enterprise data solutions using cutting-edge cloud-based technologies
  • Follow Agile methodologies to release iterative feature sets rapidly
  • Work independently; research and introduce new solutions and technologies to the project and stakeholders; provide technical guidance and suggest improvements in development
  • Coordinate with other teams as part of a larger data-sharing system
  • Employ software development best practices such as automated testing, peer code reviews, continuous integration, and continuous delivery
  • Translate business requirements and develop technical specifications
  • Communicate clearly and document processes
  • Perform quality assurance and testing of your work
  • Contribute to a collaborative, positive, stimulating, and enjoyable environment for your development team

Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, Information Systems or equivalent experience
  • 5+ years of work experience with programming languages and object-oriented design (Python preferred)
  • Strong database fundamentals including SQL, relational and non-relational data models and schema designs, and understanding of database performance implications
  • Understanding of cloud-based technologies (AWS, GCP, or Azure; AWS preferred)
  • Experience leveraging automated tests for code validation and test-driven development
  • Experience building and deploying products using continuous integration principles
  • Working knowledge of software engineering and development methodologies, techniques, and tools, including issue tracking (e.g., JIRA), code repositories (e.g., Git, Bitbucket), and the Software Development Lifecycle

Desired Skills:

  • Experience building workflow orchestration, logging, error handling and automated testing utilizing Python and the Pytest framework
  • Understanding of “Big Data” ETL methodologies and managing large scale data sets
  • Experience with Snowflake data warehouse including scheduled tasks, table streams and JavaScript stored procedures
  • Strong understanding of data structures, algorithms, and distributed systems
  • Experience with AWS services (such as S3, EC2, RDS, EMR, Lambda or SNS/SQS)
  • Experience with data processing workflow systems (Apache Nifi, Talend or Airflow)
  • Experience with creating reports, dashboards and visualizations (Tableau preferred)

Additional Information:

At Merkle, we believe that a diverse environment improves us as a community and as a business. We want to foster an environment of growth, where all ideas and contributions are encouraged. We need this culture of courage to continue to thrive in our fast-paced industry. We embrace differences of opinion. We value diversity of experience and thought, which help us to challenge and define industry-leading solutions, and support our goal of being a great place to work.

All your information will be kept confidential according to EEO guidelines.

The anticipated salary range for this position is $100,000 to $135,000. Salary is based on a wide range of factors that include relevant experience, knowledge, skills, other job-related qualifications, and geography. A range of medical, dental, vision, 401(k) matching, paid time off, and/or other benefits is also available. For more information regarding dentsu benefits, please visit https://dentsubenefitsplus.com/


More Information:

Graduate Opportunities: Whether you're still studying, recently graduated or are already working and fancy a career hop, we could have a perfect opportunity for you.
Experienced Hires: Leverage your expertise, challenge the status quo and grow your career at Merkle.
