I am currently working with a very dynamic and ambitious subsidiary of a global consultancy that is looking to onboard a Data Flow Specialist.
Perhaps the most ambitious of its kind in the world, this programme will allow energy suppliers to roll out millions of smart electricity and gas meters to homes and small non-domestic properties.
Reporting to the Head of Business and System Architecture, the Data Flow Specialist will lead architecture decisions for data architecture, platforms, and patterns that meet the Network Evolution needs and support the company's mission of delivering value for money to its customers, underpinned by a culture of transparency.
The role holder will work within the company's strategic framework to enable the CTO to clearly articulate the strategy and translate it into the practical steps needed to deliver it, and will provide traceability for the change portfolio so that everything delivered works towards a common goal.
This role is open to London- and Manchester-based applicants, and the salary is £60,000–£65,000 per annum.
Key Accountabilities
Working directly for the CTO Network Evolution Design Function as Data Architect, you will provide API and data integration expertise by:
- Leading the analysis of DSP data requirements and data integration, and performing complex data mapping activities
- Defining and architecting the end-to-end DSP data flow architecture and data management systems, including migration
- Delivering impact assessments of master data changes within end-to-end processes and the data life cycle
- Validating the current draft conceptual data models for DSP reprocurement design options and supporting their further development into logical data models
- Solving data engineering problems across the Network Evolution DSP big data ecosystem
- Anticipating the downstream consequences of changes to DSP data for other business areas and systems
- Working with the Lead Solution Architect and System Integration Specialist to ensure interfaces are clearly designed and documented following best practice, safeguarding data integrity to the highest standard
Technologies
- Integration - APIs, microservices, and ETL patterns
- Execution paradigms - low-latency/streaming, batch, and micro-batch processing
- Data platforms - big data (Hadoop, Spark, Hive, Kafka, etc.) and data warehouses (Teradata, Redshift, BigQuery, etc.)
- Languages and scripting - Python, Java, etc.
- Cloud services - AWS, GCP, Azure, and cloud-native
- DevOps - Ansible, Jenkins, ELK
- Containerisation - Docker, Kubernetes, etc.
- Orchestration - Airflow, Step Functions, Control-M, etc.
- Foundation infrastructure - networking, storage, security, and compute optimisation
If you are interested in this position, please apply and we can have a conversation.
Thank you,