For our client – Firebolt – a cloud-native data warehouse, we are looking for Python Data Engineers to help bring speed and efficiency to big-data analytics on the cloud. The company delivers extreme speed and elasticity at any scale, solving seemingly impossible data challenges. Firebolt offers full SQL compliance with an advanced optimization layer under the hood. Firebolt was built with three principles in mind: speed, scale, and elasticity.
It’s up to you whether to work remotely or in the office. Apply below and join the Firebolt family.
About the job to be done:
- Take a key part in the company’s ecosystem team.
- Create connectors and integrations with ELT tools.
- Build public SDKs.
- Build integrations with 3rd-party products.
Job requirements:
- BS/BA in a technical field such as Computer Science or Mathematics.
- At least 5 years of experience developing in Python.
- 3+ years of experience in the data warehouse space.
- 3+ years of hands-on experience writing and optimizing sophisticated, highly efficient SQL queries.
- Experience integrating with 3rd-party APIs.
- Experience in custom ETL design, implementation, and maintenance.
- Experience with ETL/ELT tools: Airflow/dbt/Fivetran, etc.
- Communication skills including the ability to identify and communicate data-driven insights.
What we offer you:
- An opportunity to make an impact on the future of the industry and be part of disruptive, groundbreaking products.
- In-depth exposure to a modern cloud-scale distributed data warehouse.
- Competitive salary and benefits (including pension plans, insurance, and more).
- IT equipment and tools to allow you to be productive.
- Medical insurance.
- Tax compensation.
- Ability to work remotely or in a modern, comfortable office near the Vystavkovyi center.
- Long-term employment with 20 working days of paid vacation.
- Paid sick leave (10 days per year).
- Flexible working schedule.
About company tech stack:
Firebolt is composed of several open-source projects and relies on unique IP that boosts data analytics and enables full scalability by decoupling compute from storage. The SQL core teams work with C++. The backend teams work with Go, Python, and Rust to create microservices exposing REST, gRPC, and GraphQL interfaces. The company uses both CockroachDB and FoundationDB as application data storage. The frontend teams work with TypeScript, React, and Redux + Apollo. CI/CD is handled by a combination of CircleCI and ArgoCD to test and deploy code to production. The infrastructure is managed as code with Terraform and deployed on Kubernetes. Services are monitored with Prometheus and Grafana.