Role: GCP Data Engineer
Location: Dearborn, MI
Employment Type: Full-Time (W2)
We are seeking a hands-on GCP Data Engineer to design, develop, and maintain cloud-based data pipelines and solutions. This role focuses on building reliable data ingestion and transformation processes, supporting analytics, and optimizing performance and efficiency in GCP.
Key Responsibilities:
• Design, develop, and maintain data pipelines from multiple sources
• Build and support cloud-based solutions using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions)
• Develop transformation logic using Python, SQL, and dbt/Dataform
• Ensure data quality, consistency, and readiness for analytics
• Monitor and optimize pipeline performance, scalability, and cost
• Support data governance, access controls, and security best practices
• Collaborate with cross-functional teams to deliver practical solutions
• Automate workflows and manage infrastructure using Terraform
• Document pipelines, processes, and technical solutions
• Troubleshoot issues related to data pipelines and cloud processing
Required Skills & Experience:
• GCP, Python, SQL
• 5–7 years in Data Engineering or Software Engineering
• At least 2 years hands-on with cloud-based data solutions, preferably GCP
• Experience with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Run
• Data pipeline design and development
Share your resume at: [Upgrade to PRO to see contact]
#Hiring #GCPDataEngineer #PythonDeveloper #BigQuery #Dataflow #PubSub #DBT #DataEngineering #CloudJobs #TechJobs #ITJobs #MichiganJobs #DearbornJobs #USJobs #Recruiting #NowHiring #W2 #FullTime #FullTimeJobs #W2Jobs #DirectHire #CloudEngineer #DataPipeline #DataSolutions