Data Engineer - Data & AI

Posted 1 day 2 hours ago by Global Enterprise Partners

Contract
Poland

Job Description
Data Engineer - Data & AI Platforms

Data Engineering | Cloud Data Platforms | Lakehouse | Pharma IT | Agile Delivery

Global Enterprise Partners is supporting a leading global life sciences organisation in expanding its Data & AI team and is seeking an experienced Senior Data Engineer. This role plays a critical part in designing, building, and operating production-grade data pipelines and data products that enable compliant, high-quality, and timely access to data across the pharmaceutical value chain.

This is a hands-on, technical role for a Senior Data Engineer who enjoys working closely with data scientists, product managers, semantic experts, and business stakeholders in an agile, product-oriented environment.

Role Purpose

The organisation is continuing to invest heavily in data products, FAIR data principles, and modern cloud-based analytics platforms. The successful candidate will:

  • Design, build, and operate reliable end-to-end data pipelines from source to data product
  • Enable compliant and contextual access to high-quality data assets
  • Translate business and data requirements into efficient, scalable technical solutions
  • Contribute to a healthy, standards-driven data ecosystem through best practices and shared patterns
  • Mentor junior engineers and promote strong engineering discipline

Requirements

Must-Have

  • 5+ years of hands-on experience in data engineering, data operations, or similar roles
  • Strong experience building production-grade data pipelines using SQL, Python, PySpark, or Spark
  • Hands-on experience with modern cloud data platforms such as Databricks or Snowflake
  • Experience with ETL/ELT tools (e.g. dbt, AWS Glue, or similar)
  • Strong understanding of data product concepts, life cycle management, and data quality principles
  • Practical experience with data quality frameworks (e.g. dbt tests, Great Expectations, Elementary)
  • Solid knowledge of lakehouse table formats (Delta Lake, Apache Iceberg)
  • Experience integrating data via APIs and interfaces (REST, OData, ODBC, GraphQL, streaming, etc.)
  • Strong analytical mindset, ownership mentality, and high standards for code quality, testing, and documentation
  • Experience working in Agile/self-organised teams with excellent collaboration skills

Nice-to-Have

  • Experience with FinOps, cost optimisation, and performance tuning in cloud data platforms
  • Understanding of data architecture patterns (data hub, data mesh, data fabric)
  • Experience defining conceptual, logical, and physical data models
  • Prior exposure to GxP environments and computerised system validation
  • Experience mentoring engineers or acting as a technical lead

Contract Details

  • Employment type: Contract
  • Location: EU (remote)
  • Language: English (mandatory)

Interested?

Apply directly or contact us for further details.

Contact:
Beatrice Vilkaite

Important: job fraud

Unfortunately, job fraud is becoming more common. Beware of such scams:
* We will never ask for personal information (such as a copy of your ID, bank details, or social security number) via WhatsApp or during a video call.
* If you're unsure whether a vacancy or contact person is legitimate, please reach out to us directly using the official contact details on our website.