Senior Data Engineer

  • Hybrid
    • Milano, Lombardia, Italy
  • Engineering

Job description

PLEASE NOTE: this position is based in our Swiss HQ in Mendrisio, Switzerland, which is 7km over the border from Italy and easily commutable from Milan, Como, Varese or Lugano. We are also happy to assist with your relocation to this beautiful part of Europe and we have a hybrid remote working policy.

About QA

At QA, we believe the future belongs to organisations that are able to learn, master and apply new skills at pace and scale. As the largest tech training company in the UK and the fastest-growing in the US, we partner with 96% of the FTSE 100 and most of the Fortune 500. We have served over 4,000 customers and more than 1 million learners since 1985.

We believe skills alone aren’t enough: they need to be applied back to the business in order to effect change. We do this through tailored learning programmes that connect learning across an organisation’s silos, create continuity for learners, and feature collaborative, cohort-based modalities to apply skills at pace and at scale. Our unique end-to-end learning solution draws on deep expertise across apprenticeships, instructor-led training, and self-paced learning.

QA is headquartered in London and New York. Learn more at QA.com.

We are looking for an experienced, hands‑on engineering leader who can design reliable data pipelines and collaborate effectively with stakeholders to align on priorities and objectives.

You will own our data strategy end‑to‑end, translating business questions into scalable designs, reviewing every critical solution, and coaching teammates toward best‑in‑class engineering practices. You will also be the primary voice of the platform: clarifying priorities with leadership, negotiating scope with cross‑functional partners, and championing a roadmap that balances innovation with operational stability. 

You will be at the centre of several data-driven initiatives, guiding design, unlocking blockers, and making sure we invest time and budget where it creates the most business value. 

What You’ll Do 

  • Lead and mentor the team of Data Engineers, providing technical guidance, coaching, and career development while still contributing code. 

  • Design, build, and maintain robust data pipelines using event‑driven and batch paradigms. 

  • Define best practices for data modelling, warehousing, testing, CI/CD, and infrastructure‑as‑code. 

  • Own the roadmap and budget: prioritise initiatives, evaluate new tools/technologies, and ensure projects land on time and within cost targets. 

  • Interface with stakeholders across Product, BI, Engineering, Data Science, Customer Success, and Sales to gather requirements, translate them into solutions, and communicate progress. 

  • Operate what you build: implement monitoring, alerting, and incident response playbooks to keep our data pipelines reliable and secure. 

Job requirements

  • Proven experience (5+ years) in Data Engineering, with at least 1 year leading or mentoring engineers while remaining hands‑on. 

  • Strong programming skills in Python (or similar) and a deep understanding of SQL performance tuning. 

  • Hands‑on expertise with event‑driven architectures (e.g. Kafka, Kinesis, Pub/Sub) and modern orchestration frameworks (Airflow, Prefect, Dagster, etc.). 

  • Solid knowledge of Amazon Web Services (S3, Redshift, Lambda, Glue, Step Functions, IAM) or another major cloud provider. 

  • Production experience with infrastructure‑as‑code (Terraform/CDK/CloudFormation) and automated CI/CD workflows (GitHub Actions, GitLab, etc.). 

  • Demonstrated ability to apply software‑engineering best practices (testing, version control, code review, design patterns) to data problems. 

  • Bachelor’s degree (or equivalent professional experience) in Computer Science, Engineering, or a related field. A Master’s degree or PhD is a plus. 

  • Excellent communication skills and the ability to translate technical trade‑offs for non‑technical audiences. 

  • Strong English skills, both written and spoken. 

Nice to Have 

  • Experience with dbt, Lakehouse architectures, or MPP databases beyond Redshift (Snowflake, BigQuery, Databricks). 

  • Knowledge of Microsoft Azure data services, particularly Azure Synapse Analytics and Azure Data Factory. 

  • Familiarity with modern observability tooling (Datadog, Prometheus, OpenTelemetry). 

  • AWS certification or equivalent. 

  • Budget ownership or vendor‑management experience. 

Benefits

  • Four weeks of paid vacation per year (increasing to five weeks after two years with the company!), plus two days off per year to volunteer at your favourite non-profit

  • Access to our Benefits Hub, with many discounts and savings!

  • Train subscription

  • Relocation bonus

  • Highly-skilled teammates and lots of opportunities for growth and development

  • Highly flexible and open-minded environment
