Data Engineer at Florence

  • Anywhere (100% Remote) Only
  • Florence
Job Description:

At Florence we solve our data engineering problems with modern data tools, eliminating the ‘boring and mundane’ work of our data enthusiasts so they can focus their efforts on the real technical challenges ahead. We run an ELT workflow, leveraging tools such as Stitch and dbt, with data pushed into BigQuery and visualised via Looker.
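To make the ELT pattern concrete, here is a minimal sketch of the load-then-transform idea, using Python's built-in sqlite3 as a stand-in for BigQuery. In practice Stitch handles the load and dbt manages the transform models; all table and column names below are illustrative, not Florence's actual schema.

```python
import sqlite3

# ELT: land raw data in the warehouse first, then transform it in SQL.
# sqlite3 stands in for BigQuery here; every name is illustrative.
conn = sqlite3.connect(":memory:")

# "Extract + Load": raw events arrive in the warehouse untouched
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [("u1", "10.50", "2024-01-01"),
     ("u1", "4.25", "2024-01-02"),
     ("u2", "7.00", "2024-01-01")],
)

# "Transform": a dbt-style model materialised as a table, with casting
# and aggregation done inside the warehouse rather than pre-load
conn.execute("""
    CREATE TABLE fct_user_spend AS
    SELECT user_id, SUM(CAST(amount AS REAL)) AS total_spend
    FROM raw_events
    GROUP BY user_id
""")

print(conn.execute("SELECT * FROM fct_user_spend ORDER BY user_id").fetchall())
# [('u1', 14.75), ('u2', 7.0)]
```

The key design point, as opposed to classic ETL, is that raw data is loaded untransformed and all cleaning lives in version-controlled SQL models downstream.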

We are looking for a highly motivated individual to help us transform and optimise the way we think about our current infrastructure, data models and our code base.

Let’s revolutionise data!

Responsibilities:

  • Creation and maintenance of ingestion pipelines from various sources such as HubSpot, Postgres and Google Analytics, through tools such as Stitch
  • Building and maintaining bespoke ingestion pipelines not supported by existing tools
  • Designing and maintaining data flows and schemas
  • Designing, building, maintaining and upgrading data pipelines and self-service tooling to provide clean, efficient results
  • Creating architectures for master/reference data, and integration for data processing
  • Creating conceptual, logical, and physical data models for analytics and, where applicable, operational data structures in accordance with industry best practices
  • Creating and owning data architecture at various levels – logical data models, data governance plans, data integration roadmaps, and data storage/management/access tooling options
  • Documenting and communicating standards and the guiding principles necessary to define, assure and govern our data infrastructure
  • Reviewing changes to the data model and scripts
  • Managing and optimizing core data infrastructure
  • Supporting data team resources with design and performance optimization

Requirements:

  • Bachelor's/Master's in STEM (Mathematics, Computer Science, Physics, Engineering)
  • Experienced in and passionate about ELT projects, Data Modelling, Big Data and cloud solutions
  • Experience with Stitch, dbt, BigQuery, Looker or similar
  • Hands-on experience of performance tuning
  • Experience with cloud platforms such as AWS, Microsoft Azure and Google Cloud (BigQuery preferred)
  • Experience of working with structured, semi-structured, and unstructured data
  • A strong knowledge of data flows, data analysis, and data profiling
  • Very strong SQL skills, data management knowledge and expertise in data architecture as well as code optimization skills
  • Deep understanding of how analytical databases work
  • Experience in designing and developing analytical data platforms, data models, data warehouse techniques and operational data sharing strategies
  • Excellent communication skills with management, development teams and vendors

Interview Process:

  • Chat with a member of the Talent Team if interested
  • Interview with Chief Analytics Officer
  • Task
  • Second interview with the rest of the Analytics team and the wider team
