Data Support Engineer - Hybrid - London at Hexegic

Job Description:

We provide integrated Cyber, Risk and Intelligence services to governments and high-performing businesses, empowering them to safely achieve their missions. We strategically partner with our customers to share risk, improve operational resilience and achieve organisational success.

We pride ourselves on delivering excellence, differently. We provide a complete range of cyber security services and, unlike many of our competitors, we live our own advice and develop our best practice through running our secure managed services. Our multidisciplinary team of highly qualified industry experts in security, infrastructure and development, coupled with our teams across risk and intelligence, brings unrivalled experience to your service.

What will you be doing?

The Data team is the backbone of the business's data-driven operations. It architects the data infrastructure used to implement and measure the impact of strategy projects, and informs how the next generation of the data platform is developed. You will provide remote technical expertise on client data pipelines feeding into a premium data analytics tool, and you will be the first to respond to data health failures on key pipelines.

You are technical, with a passion for systems and for technical teams. You are comfortable reading and writing code to identify fixes, make changes and develop systems. Your insights into the product and process will form an essential part of development with the product engineering team.

Key Responsibilities:

  • Build and maintain schedules so that pipelines run effectively
  • Set up and maintain health checks on different pipelines (see the sketch after this list)
  • Identify, respond to, triage and debug the pipeline when it is broken
  • Read code, write code changes and/or modify the monitoring set-up where necessary
  • Communicate insights in a way that resonates with partners, the wider team and leadership, turning theory into action
  • Communicate outages to the end users of a pipeline
  • Contribute to improvements to wider systems and monitoring tooling
  • Participate in an on-call pager rota covering weekends, with time off in lieu when a weekend is covered
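
As an illustration of the health-check responsibility above, here is a minimal sketch of a pipeline freshness check in Python. The pipeline name, staleness threshold and notify() hook are assumptions made for the example, not details of the actual platform.

```python
"""Minimal sketch of a pipeline health check (illustrative only).

The pipeline name, freshness threshold and notify() hook are hypothetical;
they are not taken from the role description.
"""
from datetime import datetime, timezone, timedelta


def check_pipeline_freshness(last_run_completed: datetime,
                             max_staleness: timedelta = timedelta(hours=6)) -> bool:
    """Return True if the pipeline's last successful run is within the allowed window."""
    age = datetime.now(timezone.utc) - last_run_completed
    return age <= max_staleness


def notify(message: str) -> None:
    # Placeholder for whatever alerting channel the team uses (pager, chat, email).
    print(f"ALERT: {message}")


if __name__ == "__main__":
    # Example: a run that finished 8 hours ago breaches a 6-hour freshness window.
    last_run = datetime.now(timezone.utc) - timedelta(hours=8)
    if not check_pipeline_freshness(last_run):
        notify("daily_sales pipeline is stale: last run exceeded the 6-hour freshness window")
```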

The ideal candidate will:

  • Be comfortable with code for ETL in Python (a minimal illustrative sketch follows this list)
  • Have a basic understanding of Spark and be familiar with, or interested in learning, the basics of tuning Spark jobs
  • Have experience developing and supporting data integration technologies
  • Have strong written and verbal communication skills with the ability to skillfully engage with customers on complex topics
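
To give a sense of the "ETL in Python" expectation above, here is a minimal sketch of an extract-transform-load step. The file paths and column names are hypothetical examples chosen for the sketch, not details of the actual pipelines.

```python
"""Minimal sketch of an extract-transform-load (ETL) step in Python (illustrative only).

File paths and column names are hypothetical examples, not details of the actual pipelines.
"""
import csv


def extract(path: str) -> list[dict]:
    # Read raw rows from a CSV source.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[dict]:
    # Keep only valid rows and normalise a couple of fields.
    cleaned = []
    for row in rows:
        if not row.get("customer_id"):
            continue
        cleaned.append({
            "customer_id": row["customer_id"].strip(),
            "amount": round(float(row.get("amount", 0) or 0), 2),
        })
    return cleaned


def load(rows: list[dict], path: str) -> None:
    # Write the cleaned rows to a destination file.
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["customer_id", "amount"])
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    load(transform(extract("raw_orders.csv")), "clean_orders.csv")
```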

Essential:

  • Data Analytics knowledge
  • Technical ability
  • Strong communication and problem-solving skills
  • Comfortable with code for ETL in Python

Benefits:

We offer an exceptional training and benefits package, including a £5,000 per year training budget to help you grow and develop professionally.

Company Benefits

  • Private pension
  • Remote working
  • Professional development budget of £5,000 per year
  • Wellness Programme

Interview Process

  • Screening interview
  • Technical assessment
  • Interview
