Senior Data Engineer: Data Platform DevOps & Architecture
About the Role
We’re seeking a Senior Data Engineer to transform our analytics infrastructure from monolithic SQL Server and PostgreSQL systems into a modern Azure-based data platform with DevOps at its core. This permanent position focuses on building an automated, self-service architecture designed for continuous evolution.
You’ll bring 5+ years of data engineering experience with proven expertise in implementing enterprise-scale data platforms and DevOps methodologies. You’ll architect solutions using modern patterns, establish automated data pipelines with CI/CD practices, implement infrastructure-as-code, and mentor our team while building systems that adapt to emerging technologies.
We need someone who prioritises maintainable, scalable, and adaptable systems. We’re committed to Azure but flexible on specific products – we value pragmatic solutions based on actual needs.
Structure & Expectations
- Reports to: Product organisation – you’ll work closely with product teams to align data capabilities with business objectives
- Location: London office 3 days per week, flexible remote work the rest of the time
- Team: You’ll be the senior technical lead and engineer for data platform initiatives, working with business analysts, engineers, and product stakeholders
- Timeline: Initial focus on migration (6 months), then platform evolution, DevOps maturity, and capability expansion
Core Technical Requirements
Azure Data Platform Architecture
- Microsoft Azure data stack expertise: Azure Data Factory (ADF), Delta Lake, ADLS Gen2, Synapse, Databricks (with flexibility to integrate non-Microsoft solutions)
- Experience with PostgreSQL and SQL Server, including data replication/integration patterns for migration
- Dimensional and relational data modelling expertise
- Medallion architecture implementation (bronze→silver→gold)
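To give a sense of the medallion pattern referenced above, here is a minimal PySpark sketch of a bronze→silver promotion. The storage paths, the orders table, and the order_id column are hypothetical placeholders, not our actual schema:

```python
# Minimal medallion-style promotion: raw bronze records are cleaned and
# deduplicated into a silver Delta table. Paths and column names are
# illustrative placeholders only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.format("delta").load(
    "abfss://lake@account.dfs.core.windows.net/bronze/orders"
)

silver = (
    bronze
    .dropDuplicates(["order_id"])                      # remove replayed events
    .filter(F.col("order_id").isNotNull())             # basic quality gate
    .withColumn("ingested_at", F.current_timestamp())  # audit/lineage column
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://lake@account.dfs.core.windows.net/silver/orders"))
```

A gold layer would typically aggregate silver tables into dimensional models for reporting and downstream products.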
Data Engineering & Pipelines
- Proficiency in Python and SQL for ELT/ETL pipelines
- Batch, streaming, and micro-batching architectures
- CDC patterns and incremental load strategies at scale
- Experience with Delta Change Data Feed or equivalent real-time sync mechanisms (see the sketch after this list)
- Experience building ingestion pipelines from diverse external APIs (CRM, analytics, finance, observability tools) into a unified lakehouse
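To illustrate the Change Data Feed point above, a minimal sketch of an incremental read, assuming CDF is enabled on a hypothetical silver.orders Delta table and that the last processed commit version is tracked in an external checkpoint:

```python
# Incremental sync via Delta Change Data Feed (CDF): read only rows that
# changed since the last processed commit. Table name and version tracking
# are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdf_incremental").getOrCreate()

last_processed_version = 42  # in practice, loaded from a checkpoint/state store

changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", last_processed_version + 1)
    .table("silver.orders")
)

# _change_type distinguishes insert / update_preimage / update_postimage / delete,
# so downstream gold tables can be merged rather than fully rebuilt.
upserts = changes.filter(changes._change_type.isin("insert", "update_postimage"))
```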
DevOps & Infrastructure
- Infrastructure as Code (Terraform preferred)
- CI/CD with Azure DevOps or GitHub Actions
- Test automation for data pipelines (unit, integration, regression) – see the example after this list
- Azure networking and security best practices
- Container orchestration (Kubernetes/AKS) experience beneficial
- Event-driven architectures and API integration patterns
- Monitoring and alerting with Azure Monitor and Log Analytics
- Performance tuning and cost optimisation
- Git version control proficiency
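As an example of the pipeline test automation we expect (runnable locally and inside an Azure DevOps or GitHub Actions job), here is a small pytest sketch; the dedupe_orders transform is a hypothetical stand-in for real pipeline code:

```python
# Illustrative pytest unit test for a pipeline transformation.
import pytest
from pyspark.sql import SparkSession, functions as F


def dedupe_orders(df):
    """Example transform: drop duplicate orders and null keys."""
    return df.dropDuplicates(["order_id"]).filter(F.col("order_id").isNotNull())


@pytest.fixture(scope="module")
def spark():
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_dedupe_orders_removes_duplicates_and_nulls(spark):
    raw = spark.createDataFrame(
        [(1, "a"), (1, "a"), (None, "b")],
        ["order_id", "payload"],
    )
    result = dedupe_orders(raw)
    assert result.count() == 1
    assert result.first().order_id == 1
```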
Governance, Compliance & Security
- Strong understanding of information security requirements
- Experience with Microsoft Purview or similar governance frameworks
- GDPR/data protection implementation, including the right to be forgotten (illustrative erasure sketch after this list)
- RBAC, encryption strategies, and audit logging
- Data lineage and quality enforcement
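For the right-to-be-forgotten point above, a minimal sketch of physical erasure on a Delta lakehouse. The silver.customers table, customer_id column, and zero-hour vacuum retention are illustrative assumptions only; real retention settings must follow your compliance and recovery policy:

```python
# Sketch of a right-to-erasure flow on a Delta lakehouse: delete the data
# subject's rows, then vacuum old files so the data is physically removed,
# not just logically hidden. Names and retention values are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gdpr_erasure").getOrCreate()

subject_id = "user-123"  # identifier taken from the erasure request

customers = DeltaTable.forName(spark, "silver.customers")
customers.delete(F.col("customer_id") == subject_id)

# Time travel would otherwise allow deleted rows to be read back; vacuuming
# past the retention window removes the underlying Parquet files.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
customers.vacuum(retentionHours=0)
```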
Non-Technical Requirements
Communication & Documentation
- Strong technical writing – please provide examples
- Clear presentation skills for technical and non-technical audiences
- Experience creating architecture diagrams, data flows, and runbooks
Leadership
- Track record of training and upskilling teams
- Experience with structured project planning: from high-level initiatives through quarterly planning to executable tasks
- Comfort working with product managers and translating between technical and business contexts
What You’ll Do
- Architect a modern data platform with DevOps principles for long-term business evolution
- Establish automated engineering best practices for data, governance, and operations
- Build robust batch and real-time processing pipelines (integrating web application data with analytics sources and external APIs)
- Create comprehensive documentation and training materials
- Translate business requirements into maintainable technical solutions
- Mentor team members and drive continuous improvement
Why This Matters
You'll establish the foundation for our data platform's next decade. This platform will power our next generation of internal and customer products while empowering our business with real-time insights for faster, smarter decisions.
Application Requirements
Please provide:
- Technical documentation samples
- A description of a challenging data platform project you’ve delivered
- Your approach to implementing GDPR compliance in a lakehouse environment
- Location: London
- Remote status: Hybrid
This is Edgefolio.
It's not the tech. It's our people.
In each of us, you'll witness sharp minds, kind spirits, and the determination to transform our ambitions into reality and make a positive impact on the world.
Joining Edgefolio means becoming the best version of yourself, with the freedom and confidence to make this the best job of your life.
Shaping the future isn’t easy, and it isn’t done overnight. The challenge is a great one and we’re well on our way – join us!