GCP Data Engineer
V2Soft is a global leader in IT services and business solutions, delivering innovative and cost-effective technology solutions worldwide since 1998. We are headquartered in Bloomfield Hills, MI, with 16 offices spread across six countries. We partner with Fortune 500 companies to address complex business challenges. Our services span AI, IT staffing, cloud computing, engineering, mobility, testing, and more. Certified with CMMI Level 3 and ISO standards, V2Soft is committed to quality and security. Beyond our work, we actively support local communities and non-profits, reflecting our core values. Join us to be part of a dynamic and impactful global company! Please visit us at www.v2soft.com to learn more.

4 days onsite at Dearborn, MI - Only W2, No C2C.

Skills Required: GCP, Big Data, Data Warehousing, Artificial Intelligence & Expert Systems, API

1. GCP – Experience deploying and managing services on Google Cloud Platform, including Compute Engine, Cloud Storage, IAM, and Cloud Functions. For example, designing and implementing a cloud-native application architecture using GKE (Google Kubernetes Engine) with Cloud SQL and Pub/Sub.
2. Big Data – Experience working with large-scale data processing frameworks such as Apache Spark, Dataflow, or BigQuery. For example, building ETL pipelines that process terabytes of daily event data and transform it for downstream analytics.
3. Data Warehousing – Experience designing and maintaining data warehouse solutions (e.g., BigQuery, Snowflake, Redshift). For example, modeling a star schema for a retail analytics platform that supports reporting on sales, inventory, and customer behavior.
4. Artificial Intelligence & Expert Systems – Experience developing or integrating AI/ML models and rule-based expert systems. For example, building a classification model using Vertex AI to predict customer churn, or implementing a rule engine that automates underwriting decisions.
5. API – Experience designing, building, and consuming RESTful or gRPC APIs. For example, developing a versioned REST API with OAuth 2.0 authentication that serves as the integration layer between a mobile application and backend microservices.

Skills Preferred: Google Cloud Platform

1. Google Cloud Platform – Familiarity with advanced GCP services beyond core compute and storage, such as Vertex AI, Dataflow, Cloud Composer (Airflow), and BigQuery ML. For example, using Cloud Composer to orchestrate scheduled data pipelines that feed into a BigQuery data warehouse, as in the sketch below.
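The following is a minimal, illustrative Airflow DAG of the Cloud Composer pattern described above: load a day's raw event files from Cloud Storage into a BigQuery staging table, then aggregate them into an analytics table. The project, dataset, and bucket names are placeholders, not details of this role, and the exact operator imports depend on the apache-airflow-providers-google version in use.

# Minimal Cloud Composer (Airflow) DAG sketch. Names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # Composer triggers this run once per day
    catchup=False,
) as dag:
    # Stage newline-delimited JSON event files from GCS into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-raw-events",                        # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],          # partitioned by run date
        destination_project_dataset_table="example-project.staging.events_{{ ds_nodash }}",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staged rows into an analytics-ready table.
    transform = BigQueryInsertJobOperator(
        task_id="aggregate_daily_events",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example-project.analytics.daily_event_counts`
                    SELECT DATE('{{ ds }}') AS event_date, event_type, COUNT(*) AS event_count
                    FROM `example-project.staging.events_{{ ds_nodash }}`
                    GROUP BY event_type
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform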
Experience Required:
Senior Engineer - 10 years of Data Engineering work experience

Experience Preferred:
* As a Senior Data Engineer, you will architect and scale end-to-end data pipelines on GCP, transforming complex telemetry and enterprise data into high-quality, analytics-ready assets using Medallion architectures. You will lead the implementation of robust CI/CD workflows, rigorous data governance, and security controls while mentoring junior talent and driving engineering best practices. By collaborating with cross-functional stakeholders and optimizing cloud performance, you will ensure the data platform remains secure, cost-effective, and highly available to power critical business insights.
• Operational Excellence: Using Terraform, Git, and Airflow to ensure reproducible, secure, and cost-optimized cloud infrastructure.
• Governance & Quality: Prioritizing data lineage, PII protection, and observability to maintain high trust in data assets.
• Collaboration: Acting as a bridge between technical teams (Data Science, Security) and business stakeholders to deliver self-service analytics.
* Strong understanding of Generative AI principles and architectures, including Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems.
* Proven experience in building and deploying RAG systems, including the use of Vector Databases (see the sketch at the end of this posting).
* Proficiency in Python programming.
* Solid experience with SQL for data manipulation and querying.
* Hands-on experience with Google Cloud Platform (GCP) services relevant to AI/ML.
* Basic understanding and practical experience with Machine Learning model fine-tuning.
* Familiarity with data engineering concepts and practices.
* Expertise in prompt engineering techniques for interacting with LLMs.
* Experience with the OpenAI SDK.
* Experience developing robust APIs, preferably with FastAPI.
* Proficiency with version control systems (e.g., Git).
* Experience with containerization technologies (e.g., Docker).

Education Required: Bachelor's Degree
Education Preferred: Certification Program

Additional Information:
POSITION IS HYBRID / 3 - 4 DAYS PER WEEK IN THE OFFICE
V2Soft is an Equal Opportunity Employer (EOE). We welcome applicants from all backgrounds, including individuals with disabilities and veterans.
Visit https://www.v2soft.com/careers to view all of our open opportunities and to learn more about our benefits.
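For illustration, a minimal sketch of the RAG-plus-FastAPI pattern referenced in the preferred experience above: retrieve relevant context for a question, then ask an LLM to answer using only that context. The endpoint path, retrieval helper, and model name are assumptions for the example, not requirements of the role, and the vector-database lookup is left abstract since the posting does not name a specific product.

# Minimal RAG-style FastAPI sketch. All names are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


class Question(BaseModel):
    text: str


def retrieve_context(question: str, top_k: int = 3) -> list[str]:
    # Placeholder for a vector-database similarity search over embedded document
    # chunks; a real implementation would embed `question` and query the store.
    return ["<chunk 1>", "<chunk 2>", "<chunk 3>"][:top_k]


@app.post("/answer")
def answer(question: Question) -> dict:
    chunks = retrieve_context(question.text)
    prompt = (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n".join(chunks) + "\n\nQuestion: " + question.text
    )
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # model choice is an assumption for the example
        messages=[{"role": "user", "content": prompt}],
    )
    return {"answer": completion.choices[0].message.content, "sources": chunks}

Locally, a sketch like this could be served with uvicorn (for example, uvicorn main:app --reload, assuming the file is named main.py).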