
Data Engineer (Databricks, Data Factory) - Education - London

Posted 14 days 7 hours ago by Morgan McKinley

£75,000 - £80,000 Annual
Permanent
Not Specified
Academic Jobs
London, United Kingdom
Job Description

The Company

Our client is a global beacon of premium international schooling. Within their vibrant community, each day unfolds with their dedicated educators and support staff empowering thousands of students to surpass their wildest dreams. Their commitment to excellence encompasses stellar academic achievements, fostering creativity, nurturing wellbeing, and building international connections. Leveraging their expansive global reach, they curate an unparalleled educational experience. By attracting and retaining the finest educators worldwide, our client ensures an environment where excellence thrives.

Job Purpose

As the Data Platform Engineer overseeing the Data Warehouse and analytics infrastructure, you will shape the backbone of our client's business analysis and reporting capabilities, both now and into the future. They are seeking a visionary with a proven track record of architecting enterprise-level data ecosystems, someone who thrives on pushing the boundaries of what's possible in data manipulation. Your expertise will ensure data is not just accessible but optimised for safe, secure utilisation across multiple fronts - be it reporting, analytics, or powering applications - ultimately driving substantial productivity gains for all data consumers. As a collaborative force, you will seamlessly integrate across central, regional, and local teams, refining our client's data visualisation strategy and elevating our collective impact. At the heart of it all lies their overarching vision: to cultivate a generation of resilient, creative global citizens poised to enact positive change. Your role will be instrumental in advancing this mission, catalysing increased productivity across teams and unlocking the full potential of data within the organisation.

Responsibilities

  • Maintain and optimise the existing data warehouse infrastructure
  • Design, implement, and document ETL procedures for ingesting new data from relevant sources, following industry standards and best practices, and ensure data is verified and quality checked (an illustrative sketch follows this list)
  • Collaborate with business and technology stakeholders on data warehouse architecture development and utilisation
  • Carry out monitoring, tuning, and database performance analysis
  • Design and extend data marts, metadata, and data models in collaboration with the Power BI Engineer
  • Maintain all data warehouse architecture code in Azure DevOps
  • Automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.)
  • Identify opportunities to optimise the set-up and configuration of systems to improve the level of service provided to the business, whether through increased reliability, performance, or functionality
  • Ensure that a quality solution is delivered in a timely manner, within budget, and to the customer's satisfaction
  • Support stakeholders with troubleshooting and access management
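
As a rough illustration of the ingest, verification, and publishing work described above, the sketch below reads raw files from a data lake path, applies a basic quality check, and writes a curated Delta table. It is a minimal example only, assuming a Databricks environment with PySpark and Delta Lake available; the path, column names, and table name are hypothetical placeholders rather than anything specified by the employer.

# Minimal PySpark sketch of an ingest-verify-publish step.
# All paths, column names, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, a session already exists

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/enrolments/"  # hypothetical source
TARGET_TABLE = "analytics.enrolments_curated"                          # hypothetical target

# Read new files landed by the orchestration layer (e.g. an Azure Data Factory pipeline).
raw_df = spark.read.format("parquet").load(RAW_PATH)

# Basic verification and quality check: drop rows missing the key, then remove duplicates.
clean_df = (raw_df
            .filter(F.col("student_id").isNotNull())
            .dropDuplicates(["student_id"]))

# Publish as a Delta table for downstream reporting and data marts.
(clean_df.write
         .format("delta")
         .mode("overwrite")
         .saveAsTable(TARGET_TABLE))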

Requirements

  • A bachelor's degree in an analytical field such as information technology, science, or engineering.
  • Experience in data warehouse architecture development and management.
  • Extensive experience developing and maintaining data pipelines and ETL processes using Azure Data Factory, Azure Data Lake, and Azure Databricks.
  • Strong experience developing code, testing for quality assurance, and administering the Azure Databricks platform, including exposure to Unity Catalog.
  • Exposure to performance tuning, optimisation, governance, and security of the Databricks platform.
  • Experience designing scalable solutions using Azure Data Factory (ADF) to integrate and orchestrate data workflows, data transformations, and data pipelines.
  • High proficiency in dimensional modelling techniques and their applications.
  • Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgement and work with both technical and business personnel.
  • Proficiency in SQL, Python, and Spark (with either Python or SQL).
  • Good cross-cultural, interpersonal and communication skills to interact with diverse nationalities and cultures.
  • Excellent analytical skills, with the passion and drive to demonstrate and quantify success.
  • Results-oriented, with the ability to consistently map efforts against identified KPIs.
  • Excellent time management skills and flexibility in dealing with multi-functional tasks.
  • A desire to work in a purpose-led sector.
  • Fluent English.

Morgan McKinley is acting as an Employment Agency and references to pay rates are indicative.

BY APPLYING FOR THIS ROLE YOU ARE AGREEING TO OUR TERMS OF SERVICE WHICH TOGETHER WITH OUR PRIVACY STATEMENT GOVERN YOUR USE OF MORGAN MCKINLEY SERVICES.
