Here at Kano we’re using data to understand how our users engage with our range of hardware and software products. As well as product analytics, we also use data to manage a complex supply chain and track key commercial metrics across multiple sales channels, retailers, warehouses, and inventory management systems.
We’re looking for a Data Engineer to help us develop and manage our data infrastructure as we launch further hardware and software products. We expect our data sources and volumes to increase substantially over the next few years.
This is a great opportunity for a Data Engineer to get more hands-on experience with data infrastructure design and management, take ownership of a business-critical function, become an expert in Apache Airflow, and work with a diverse range of stakeholders across functions.
Highlights of working at Kano
- Build and manage critical data infrastructure for a fast-growing startup (finance, marketing, product, revops, manufacturing, retail)
- Work with Apache Airflow on a greenfield project
- It’s not just product analytics data: you’ll also work with data from our international manufacturing, shipping, sales and finance functions
- Extend the infrastructure to support future Kano products and machine learning projects
What you’ll be doing:
- Manage a new data infrastructure including Apache Airflow, S3, Postgres and MongoDB databases
- Build new data pipelines, consuming data from new sources and extending the data warehouse to support emerging business needs
- Maintain a legacy data platform while migrating pipelines to the new infrastructure
- Manage and automate code releases
Stakeholders and culture:
- Handle changing requirements from product owners
- Manage your own workload in a fast moving environment
- Run ad hoc queries and create reports for product owners as required
- Help stakeholders self-serve by creating automated dashboards and reports
What we’re looking for:
- Three years’ software development experience with Python and SQL
- At least one year’s experience building and maintaining data pipelines
- Good understanding of how to write well structured code
- Good understanding of at least one database, ideally Postgres
- Good understanding of database design, data modelling and data migrations
- Ability to effectively work independently in a fast-moving environment
Bonus points for experience:
- Creating complex workflows in Airflow
- Using AWS services including S3, RDS and EC2
- Building software with a test driven development methodology
- Testing data pipelines
- Using continuous integration and deployment
- Creating automated dashboards and reports (e.g. with Tableau)
What we offer:
- Join one of the world’s fastest-growing full-stack tech startups, with huge opportunities for learning and advancement
- 10% time back to work on your own projects
- Flexible working schedule
- Participation in the company’s stock option plan
- Team nights out, plenty of office snacks and a competitive holiday allowance
- £1600 flexible benefits pot
- £500 annual personal development fund
- Large discount on all Kano products
- Work with an exceptional and diverse group of people
- Informal, friendly and collaborative office environment