Freelance Requirements:
Data Engineer (GCP | Databricks | BigQuery | Pub/Sub | Airflow)
Type: Part-Time / Freelance
Time Commitment: 10 hours per week (flexible)
Compensation: Fixed hourly rate
Key Responsibilities
Design and develop scalable ETL/ELT pipelines on GCP
Build and manage data workflows using Apache Airflow (Cloud Composer preferred)
Implement real-time data ingestion using Google Pub/Sub
Develop and optimize data models in BigQuery
Work with Databricks for large-scale data processing and transformation
Ensure data quality, performance optimization, and cost efficiency
Collaborate with analytics and business teams to understand data requirements
Maintain documentation and follow best practices in data governance
Required Skills
5+ years of experience in Data Engineering
Strong hands-on experience with:
Google Cloud Platform (GCP)
BigQuery
Pub/Sub
Databricks (Spark, PySpark)
Apache Airflow
Strong SQL and Python skills
Experience building both batch and real-time pipelines
Understanding of data warehousing concepts
Experience with Git and CI/CD pipelines
WhatsApp: +91 9884442684