Data Engineer (GCP) - Lisbon/Poland
Posted by Contracts IT Recruitment Consulting Ltd
Data Engineer (GCP)
Location: Lisbon (preferred) or Poland (Warsaw preferred), Mostly Remote
Employment Type: Permanent or Contract-to-Hire (3-6 months)
Seniority Target: 3-6 years (Mid-level, hands-on individual contributor)
Language Requirement: English
Package: 45k to 50k plus company benefits
*Urgent Permanent Contract*
This is a *Lisbon*-based role with an immediate start within a global technology brand, maintaining cloud-based data pipelines and data infrastructure for an internal AI Chatbot platform.
Role Summary
We are hiring a Data Engineer to develop and maintain cloud-based data pipelines and data infrastructure for an internal AI Chatbot platform. The role covers data modelling, ETL and ELT development, and ensuring data quality, governance, and reliability for analytics, machine learning, and NLP workloads. This is a hands-on engineering role requiring strong experience with GCP, BigQuery, SQL, and Python.
Key Responsibilities
- Design, develop, and maintain ETL and ELT pipelines for analytics and machine learning
- Build and optimize datasets using BigQuery and other Google Cloud Platform services
- Work with Dataflow, Pub/Sub, Cloud Storage, Dataproc, Composer, and related GCP tools
- Implement data validation, data monitoring, data quality checks, and error handling
- Collaborate with Data Scientists to prepare data for NLP, LLM, and AI-based workloads
- Ensure data governance, data security, data compliance, and access control enforcement
- Support performance tuning, pipeline optimization, and data platform enhancements
- Document data flows, pipelines, models, and system architecture
- Participate in Agile processes and follow software engineering best practices
Mandatory Requirements
- 3 to 6 years of hands-on Data Engineering experience
- Experience with Google Cloud Platform, including BigQuery and at least two additional GCP components
- Strong SQL skills, including analytical SQL and performance optimization
- Experience developing data pipelines using ETL or ELT orchestration tools
- Programming experience with Python, Java, or Scala (Python preferred)
- Knowledge of data modelling for data warehousing and analytical environments
- Strong communication skills and the ability to work with cross-functional stakeholders
Preferred Skills (Nice to Have)
- Experience with streaming tools such as Kafka, Pub/Sub, or Dataflow
- Experience preparing datasets for NLP, LLM, or other machine learning workloads
- Experience with dbt or Dataform
- Experience with Terraform, CI/CD pipelines, or Cloud Build
- Experience working in Agile delivery environments
What You Will Love About Working Here
- Multicultural and inclusive team environment.
- Supportive atmosphere with a focus on work-life balance.
- Opportunities to work on national and international projects.
- Hybrid working model.
- Strong career development focus with structured programs and professional growth pathways.
- Access to training and certification programs.
- Health and life insurance benefits.
- Referral program with bonuses for recommended talent.
- Excellent office locations.
Please send your CV or call Yasin to discuss further.
We are an equal opportunities employment agency and welcome applications from all suitably qualified persons regardless of race, sex, disability, religion/belief, sexual orientation, or age.
We champion differences in technology recruitment and work with clients who actively wish to diversify their talent force - ALL applicants are welcome to apply.