Job Description:

As a Data Engineer at Capco you will:

  • Work alongside clients to interpret requirements and define industry-leading solutions
  • Design and develop robust, well-tested data pipelines
  • Demonstrate and help clients adhere to best practices in engineering and the SDLC
  • Apply excellent knowledge of building event-driven, loosely coupled distributed applications
  • Develop both on-premise and cloud-based solutions
  • Apply a good understanding of key security technologies and protocols, e.g. TLS, OAuth, encryption
  • Support internal Capco capabilities by sharing insight, experience and credentials

Skills & Expertise:

You will have experience working with some of the following Methodologies/Technologies:

  • Strong experience with at least one cloud provider: AWS, Azure or GCP
  • Hands-on experience using Scala/Java
  • Experience with data and cloud technologies such as Hadoop, Hive, Spark, Pig, Sqoop, Flume, PySpark, Databricks, Cloudera, Airflow, Oozie, S3, Glue, Athena, Terraform, etc.
  • Experience with schema design for structured and semi-structured data
  • Experience with messaging technologies – Kafka, Spark Streaming, Amazon Kinesis
  • Strong experience in SQL
  • Good understanding of the differences and tradeoffs between SQL and NoSQL, and between ETL and ELT
  • Understanding of containerisation, graph databases and ML algorithms
  • Experience with data lake formation and data warehousing principles and technologies – BigQuery, Redshift, Snowflake
  • Experience using a version control tool such as Git
  • Experience building CI/CD pipelines on Jenkins or CircleCI
  • Enthusiasm and ability to pick up new technologies as needed to solve your problems

Why Join Capco as a Data Engineer?

  • You will work on engaging projects with some of the largest banks in the world – projects that will transform the financial services industry.
  • You’ll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
  • You’ll be involved in digital and data transformation processes through a continuous delivery model.
  • You will work on automating and optimising data engineering processes, developing robust and fault-tolerant data solutions for both cloud and on-premise deployments.
  • You’ll be able to work across different data, cloud and messaging technology stacks.
  • You’ll have the opportunity to learn and work with specialised data and cloud technologies to widen your skill set.

Company Benefits

  • Bonus Scheme
  • Private Pension
  • Budget for Certifications
  • Udemy Subscription
  • MacBook
  • Private Medical Insurance
  • Life Insurance
  • Two performance reviews per year
  • 40 hours minimum training allowance
  • Hybrid working model

Interview Process

  • Technical Interview
  • Head of Engineering Interview
  • HR Interview
