Job description
The Role
We are a fast-growing organisation in the automotive technology space, dedicated to connecting people with the right vehicles through smarter use of data. Data engineering plays a crucial role in enabling this mission.
You’ll design and deliver impactful data solutions. Your main focus will be on building and optimising data pipelines, handling everything from vehicle stock volumes to customer reviews and offer data, ensuring timely, robust, and efficient data delivery. Great data is the foundation of great analysis, and as a Data Engineer, you’ll be central to the success of the department.
Our modern data stack includes Snowflake as the data warehouse, dbt for data transformation within a medallion architecture, and Apache Airflow for orchestration. The infrastructure is hosted on Microsoft Azure, and you’ll gain hands-on experience with all these technologies. Many datasets you work with will require web scraping, so prior experience in this area is highly valuable.
What You’ll Be Doing
You’ll help shape and optimise our data ecosystem by:
- Designing and refining data ingestion pipelines for vehicle stock, offers and pricing, images, and other assets.
- Developing analytics-friendly data models in dbt, delivering clean, structured datasets for analysts and data scientists.
- Implementing and maintaining CI/CD pipelines to ensure testing, reliability, and smooth deployments.
- Participating in code reviews and establishing best practices for pipeline development.
- Staying on top of emerging datasets in the automotive sector and recommending new sources.
- Supporting and improving Microsoft Azure infrastructure for enhanced pipeline performance.
- Producing clear and comprehensive documentation to accelerate colleagues’ understanding of available data.
What You’ll Need to Succeed
Essential Requirements:
- 2–4 years’ experience building robust data pipelines in a commercial setting (or through advanced personal projects).
- Strong Python skills, including use of web scraping libraries (Scrapy, Requests, Selenium, etc.) and writing production-ready, testable code.
- Advanced SQL knowledge, with experience in query optimisation and data modelling.
- Solid grasp of software engineering principles (SOLID, DRY, design patterns) applied to data workflows.
- Experience with version control (Git) and collaborative development.
- Understanding of CI/CD concepts and experience with automated testing strategies.
- Knowledge of data quality practices, including validation, monitoring, and testing frameworks.
- Experience with a modern ELT framework such as dbt or SQLMesh.
- Exposure to at least one major cloud platform (AWS, Azure, or GCP).
- Must be based within a 1-hour commute of Liverpool city centre (essential, due to regular office collaboration).
Nice-to-Have Skills:
- Experience with batch and near real-time pipelines.
- Familiarity with Infrastructure as Code (e.g., Terraform).
- Practical experience with dbt and medallion architecture patterns.
- Knowledge of Apache Airflow or equivalent orchestration tools.
- Hands-on Azure experience.
Why Join Us?
We’re an innovative digital business with over two decades of experience in the automotive sector, partnering with some of the world’s best-known car brands. Our close-knit team thrives on collaboration, innovation, and a shared passion for technology that drives real-world results.
We offer a supportive, ambitious environment where you’ll have the chance to grow and make a meaningful impact. Our hybrid working model (2 days per week in our modern Liverpool city centre office) strikes the right balance between flexibility and in-person teamwork.
Beyond work, we foster a culture of respect, hard work, and community spirit—whether through team-building events, charity efforts, or even adventurous activities like hiking and skydiving.