Lead Data Engineer
2/25/2026
The Lead Data Engineer is responsible for designing, building, and stewarding the enterprise data lakehouse platform built on Microsoft Fabric, ensuring reliable, high-quality, and well-governed data assets. This role involves owning ETL/ELT pipelines, implementing Medallion Architecture, and enabling analytics and reporting through robust data products.
Working Hours
40 hours/week
Company Size
11-50 employees
Language
English
Visa Sponsorship
No
Description
General Statement of Responsibility
The Lead Data Engineer plays a critical role in designing, building, and stewarding Petra, Lasting Change’s enterprise data lakehouse platform built on Microsoft Fabric. This position is responsible for delivering reliable, high-quality, and well-governed data assets that enable analytics, reporting, and data-driven decision-making across the organization.
The Lead Data Engineer partners closely with analysts, leadership, and other stakeholders to translate business needs into scalable data solutions. This role combines hands-on engineering, architectural ownership, and leadership of contracted data engineers, while continuously improving development velocity through disciplined architecture and intentional use of AI-assisted development.
Requirements
Major Responsibilities / Activities
Data Analysis & Reporting
- Own and manage Petra, the organization’s Microsoft Fabric–based data lakehouse.
- Design, build, and maintain robust ETL/ELT pipelines that curate high-quality, trusted datasets.
- Implement and enforce Medallion Architecture standards (Bronze, Silver, Gold).
- Ensure data reliability, performance, scalability, and governance across the platform.
- Integrate data from multiple internal and external systems using Azure-native services.
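As a rough illustration of the Medallion flow named above (not part of the posting itself), the Bronze → Silver → Gold layering can be sketched in plain Python. In the actual platform this would be PySpark transformations over Delta Lake tables; all record shapes and function names here are hypothetical.

```python
# Minimal sketch of a Medallion (Bronze -> Silver -> Gold) flow using
# plain Python structures. A production Fabric lakehouse would use
# PySpark + Delta Lake tables instead; all names here are hypothetical.

# Bronze: raw records landed as-is from a source system.
bronze = [
    {"id": "1", "amount": " 100.5 ", "region": "east"},
    {"id": "2", "amount": "not-a-number", "region": "WEST"},
    {"id": "3", "amount": "42", "region": "East"},
]

def to_silver(rows):
    """Silver: validated, typed, and standardized records."""
    silver = []
    for row in rows:
        try:
            amount = float(row["amount"].strip())
        except ValueError:
            continue  # drop (or quarantine) unparseable rows
        silver.append({
            "id": int(row["id"]),
            "amount": amount,
            "region": row["region"].strip().lower(),
        })
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, e.g. total amount per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'east': 142.5}
```

The point of the layering is that each stage has a contract: Bronze preserves the source verbatim, Silver enforces types and standards, and Gold serves business questions directly.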
Analytics & Reporting Enablement
- Partner with analysts and stakeholders to deliver robust data products that serve business needs.
- Develop analytical and dimensional data models optimized for reporting and self-service analytics.
- Build and maintain Power BI semantic models, reports, and dashboards as needed.
- Ensure datasets are well-documented, discoverable, and reliable for downstream use.
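The dimensional models mentioned above typically take the form of a star schema: a fact table joined to conformed dimensions, which a Power BI semantic model can then expose. A hedged sketch using the standard-library `sqlite3` module (table and column names are hypothetical, not from the posting):

```python
import sqlite3

# Illustrative star schema: one fact table joined to a dimension.
# A Fabric Gold layer would expose similar structures to Power BI.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_region (region_key INTEGER PRIMARY KEY, region_name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        region_key INTEGER REFERENCES dim_region(region_key),
        amount REAL
    );
    INSERT INTO dim_region VALUES (1, 'East'), (2, 'West');
    INSERT INTO fact_sales VALUES (10, 1, 100.5), (11, 1, 42.0), (12, 2, 7.25);
""")

# A typical reporting query: aggregate the fact by a dimension attribute.
rows = con.execute("""
    SELECT d.region_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_region d USING (region_key)
    GROUP BY d.region_name
    ORDER BY d.region_name
""").fetchall()
print(rows)  # [('East', 142.5), ('West', 7.25)]
```

Keeping facts narrow (keys and measures) and pushing descriptive attributes into dimensions is what makes the model friendly to self-service slicing and filtering.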
AI-Accelerated Development
- Design and curate an AI-assisted engineering workflow that provides system context to:
  - Generate the majority of transformation and pipeline code upfront
  - Assist with documentation, testing, and pull requests
  - Increase overall development velocity and consistency
- Continuously refine AI usage patterns while maintaining quality, security, and governance standards.
Leadership & System Ownership
- May lead and oversee contracted or W-2 data engineers, providing clear architectural, coding, and documentation standards.
- Review work products for quality, performance, and adherence to established patterns.
- Maintain comprehensive documentation covering data logic, ETL processes, and system architecture.
- Demonstrate strong ownership of the data platform, treating it as a long-term system of record.
Collaboration & Communication
- Gather and refine requirements for assigned data initiatives.
- Communicate clearly with technical and non-technical stakeholders.
- Translate business needs into scalable, maintainable data solutions.
Essential Functions
Reasonable accommodations may be made to enable individuals with disabilities to perform these functions.
- Use of Fingers
- Feeling
- Speaking
- Hearing
- Repetitive Motions
- Capable of making sound decisions using reasonable and logical judgment.
- Demonstrated competence in understanding, interpreting, and communicating procedures, policies, information, ideas, and instructions.
Travel
Occasional travel to subsidiary sites and for training opportunities may be required.
Required Experience
- 7–10+ years of professional experience in data engineering, ETL orchestration, and multi-system data integration.
- Strong experience with Azure data platforms; Microsoft Fabric preferred, with Azure Synapse Analytics and Azure Data Factory experience also acceptable.
- Deep understanding of data lakes, data warehouses, and lakehouse architectures.
- Demonstrated expertise in Medallion Architecture design and implementation.
- High proficiency in Python, PySpark, Spark SQL, Delta Lake, and T-SQL.
- Strong data modeling and dimensional modeling experience to support analytics and reporting.
- Experience collaborating with stakeholders to gather requirements and deliver data solutions.
- Demonstrated use of AI tools to enhance development, code quality, documentation, and delivery velocity.
- Highly organized, detail-oriented, and committed to advancing the organization’s mission.
Preferred Qualifications
- Experience building and maintaining Power BI reports and semantic models.
- Microsoft or other cloud platform certifications.
- Experience leading or mentoring other data engineers, particularly contractors.
- Commitment to continuous learning and professional growth.