Technical Architect - Data & Analytics
Full Time / Contract (FTE/W2 / *) | Newport Beach, CA 92660
Local or California-based candidates are preferred for interview.

We are looking for a seasoned Technical Architect to own end-to-end solution architecture for an insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake / AWS S3 / Matillion / dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery workstreams will follow. This is a hands-on architecture role: you will be deeply embedded in the delivery team, not a distant reviewer.

Skills / Experience

- 12 years in data engineering and analytics, including 3 years in a solution/technical architect role on enterprise-scale programs
- Deep, hands-on expertise in Snowflake: query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
- Experience with platform architecture, Snowflake data mesh, AI-augmented delivery, and enterprise-scale modernization
- Key skills: Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (DevOps), Git, ACORD Data Model
- Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
- Strong command of dbt: project structure, macros, testing frameworks, and CI/CD integration
- Experience architecting data governance solutions using Collibra: catalog, lineage, business glossary, and certification workflows
- Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated (insurance, healthcare, or financial services) environment
- Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline
- Strong communication skills, with the ability to explain technical concepts clearly
- Proactive, ownership-driven mindset with high accountability
- Ability to collaborate across engineering, operations, and support teams
- Adaptability to fast-paced, iterative environments

Key Responsibilities: Architecture & Solution Design

- Architect and deliver the enterprise data platform on Snowflake and AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80 source systems and a 7-year historical data migration
- Design the ACORD Life & Annuity-based enterprise data model, customized for the insurance domains: Policy, Claims, Finance, Actuarial, Agent/Distribution, and Customer/Party
- Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
- Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
- Govern architecture decisions across all 5 parallel workstreams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM

Expected Outcomes: the candidate is expected to deliver the following measurable outcomes

- Wave 1 platform foundation (Snowflake environments, AWS S3 data lake, Matillion/dbt/Collibra/Profisee, CI/CD pipelines) delivered by Month 3 with no rework required
- All 9 Wave 1 certified data products (CMD Life, Finance, Actuarial) achieve SLO targets: 99.9% completeness, 99.5% accuracy, and end-to-end lineage in Collibra
- Architecture standards and dbt/Matillion templates adopted consistently across all 5 delivery workstreams, with no divergence in patterns
- WinAIDM accelerator framework implemented and deployed, contributing to a 40–50% reduction in data engineering effort vs. baseline
- Client technical stakeholders describe WinWire as a "trusted architecture guide": proactive, decision-ready, and commercially aware

Secondary Skills

- Familiarity with ACORD Life & Annuity data standards and insurance domain concepts (Policy, Claims, Actuarial, Reinsurance)
- Experience with Profisee MDM or equivalent enterprise MDM platforms
- SnowPro Advanced certification (Data Engineer or Architect)
- Exposure to Snowflake Cortex AI, AI-assisted development tools (GitHub Copilot, Azure OpenAI), or LLM-based data engineering accelerators
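To make the completeness SLO above concrete: a Snowflake Data Metric Function (such as the system-provided SNOWFLAKE.CORE.NULL_COUNT) measures nulls over a column, and completeness is then the non-null share of rows. A minimal Python sketch of that calculation, using a hypothetical policy-number extract (the `completeness` helper and the sample data are illustrative, not part of the client's stack):

```python
def completeness(values):
    """Percentage of non-null values in a column extract.

    Mirrors the intent of a Snowflake Data Metric Function that
    counts nulls: completeness = 100 * (rows - nulls) / rows.
    """
    if not values:
        return 0.0
    non_null = sum(1 for v in values if v is not None)
    return 100.0 * non_null / len(values)

# Hypothetical extract: 1 missing policy number out of 2,000 rows.
column = ["P-%05d" % i for i in range(1999)] + [None]
pct = completeness(column)
print(round(pct, 2))    # 99.95
print(pct >= 99.9)      # True -- meets the 99.9% completeness SLO
```

One null in 2,000 rows still clears the 99.9% target; two nulls (99.90% exactly) would sit right at the threshold, which is why certified data products typically track the metric per column rather than per table.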