
Data Migration Engineer

Posted 1 day 6 hours ago by Square One Resources

€270.00 Daily
Contract
Not Specified
Other
Małopolskie, Kraków, Poland
Job Description

Job Title: Data Migration Engineer
Location: Krakow (Remote)
Salary/Rate: €270 Per Day
Start Date: 29/09/2025
Job Type: Initial 3 Month Contract

Company Introduction

We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a skilled Data Migration Engineer with strong experience in SAS and Hadoop to assist in migrating large-scale proprietary datasets (3PB+) to open-source formats such as ORC/Parquet.

Job Responsibilities/Objectives

You will be responsible for migrating large-scale SAS proprietary SPDE datasets stored in Hadoop to open-source formats such as ORC/Parquet: managing their conversion, partitioning, and loading into Hive, while ensuring data consistency and optimising performance throughout the process.

• Migrate large datasets from SAS proprietary SPDE formats to open-source formats (ORC/Parquet).
• Work with large datasets, including chunking and partitioning, to ensure efficient conversion and loading into Hive.
• Utilise Big Data storage systems (HDFS, VAST) for data migration and management.
• Collaborate with engineering teams to ensure smooth data flow and integration into existing systems.
• Implement and maintain tables and partitions in Hive, Spark, and other tools.
• Ensure data consistency and integrity throughout the migration process.
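For illustration only (this sketch is not part of the role description), the chunk-and-partition step outlined above can be approximated in plain Python; the table path, partition keys, and chunk size below are all hypothetical:

```python
from collections import defaultdict

def hive_partition_path(table_root: str, partition_keys: dict) -> str:
    """Build a Hive-style partition directory path,
    e.g. /warehouse/tbl/year=2025/month=9."""
    parts = "/".join(f"{k}={v}" for k, v in partition_keys.items())
    return f"{table_root}/{parts}"

def chunk_records(records, chunk_size):
    """Yield fixed-size chunks so each output file stays a manageable size."""
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]

# Hypothetical rows read out of a SPDE dataset.
rows = [
    {"year": 2025, "month": 9, "value": 1},
    {"year": 2025, "month": 9, "value": 2},
    {"year": 2025, "month": 10, "value": 3},
]

# Group rows by partition key, then chunk each group for writing.
by_partition = defaultdict(list)
for row in rows:
    by_partition[(row["year"], row["month"])].append(row)

for (year, month), group in by_partition.items():
    path = hive_partition_path("/warehouse/migrated_table",
                               {"year": year, "month": month})
    for chunk in chunk_records(group, chunk_size=2):
        # In a real migration each chunk would be written as an ORC/Parquet
        # file under `path`, then the partition registered in the metastore.
        print(path, len(chunk))
```

In practice this grouping and writing would typically be done with Spark (e.g. a partitioned ORC/Parquet write) rather than hand-rolled loops, but the directory layout and chunking logic are the same.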

Required Skills/Experience
The ideal candidate will have the following:

• SAS experience, particularly with proprietary SPDE datasets.
• Strong hands-on experience with Hadoop for managing large datasets.
• Storage Systems: HDFS, VAST.
• File Formats: Avro, ORC, Parquet.
• Table Formats: Delta, Iceberg.
• Engines: Hive (External Tables, Metastore, HiveServer2), Spark, Python.
• Other Skills: DML (Data Manipulation Language) over large datasets, managing partitions.
• Experience with Starburst and Databricks.
• Strong experience in Big Data platforms and tools.
• Ability to handle large-scale data migrations with a focus on optimisation and performance.
• Solid understanding of SAS-based data processing and integration with Hadoop.
• Excellent troubleshooting and problem-solving skills.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given as to the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.
