Enterprise Data Architect
Posted 2026-05-06
Remote, USA
Full-time
Immediate Start
We are seeking an experienced Enterprise Data Architect to lead the design and implementation of scalable, enterprise-grade data platforms. This role focuses on defining data architecture strategy and governance and on delivering modern data solutions built on Azure Databricks and Lakehouse architecture, with a strong emphasis on AI/ML integration.
The ideal candidate will have a strong background in enterprise data architecture, cloud data platforms, and AI/ML-driven data solutions, and will work closely with business and technical stakeholders to deliver high-quality, scalable data products.
Key Responsibilities:
- Define and implement enterprise data architecture strategy and roadmap
- Design scalable data platforms using Azure Databricks and Lakehouse architecture
- Lead the integration of AI/ML models and data pipelines into enterprise data platforms
- Establish and enforce data governance, data quality, and security frameworks
- Collaborate with data engineers, data scientists, and business stakeholders to translate requirements into data solutions
- Define standards for data modeling, data integration, and data lifecycle management
- Drive cloud data platform modernization initiatives
- Provide architectural guidance on performance, scalability, and cost optimization
- Ensure alignment with enterprise architecture standards and best practices
Required Skills:
- 15+ years of overall IT experience
- 7+ years of experience as a Data Architect / Enterprise Data Architect (MANDATORY)
- Strong expertise in Azure Databricks, Delta Lake, and Lakehouse architecture
- Mandatory experience with AI/ML (model development, integration, or MLOps)
- Strong knowledge of data modeling (Dimensional, Data Vault, etc.)
- Experience with Azure data services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse, etc.)
- Strong understanding of data governance, security, and compliance frameworks
- Experience designing large-scale, distributed data systems
- Proficiency in SQL, Python, and Spark-based technologies
- Experience with CI/CD pipelines and DevOps practices