Overview
We are looking for a senior data engineer to join our data and analytics team to help us build the next generation of our data platform and product. Together with the rest of the team, you will be responsible for the design and implementation of our data platform while also building data products for marketplaces, integrations and analytics. Your work will encompass the full spectrum of data engineering while also crossing over into areas of software and analytical engineering.
You will report to Fleetio’s Director of Data and Analytics and partner closely with our product and engineering teams as well as other internal stakeholders. This is an exciting opportunity to join a growing data-first company, push the boundaries of the modern data stack, and deliver data products to our customers in new and exciting ways.
A little about us… Fleetio is a modern software platform that helps thousands of organizations around the world manage their fleet operations. Transportation technology is a hot market and we’re leading the charge, with raving fans and new customers signing up every day. We raised a $144M Series C in June of 2023 and are on an exciting trajectory as a company.
Who you are:
The ideal candidate for this role is a business-savvy, senior-level data engineer with experience owning and building complex data platforms and pipelines. You have a deep understanding of data modeling, data warehousing, and modern data platforms. You have experience not just with data but with back-end or full-stack development, and you have worked on true customer-facing products. You possess strong but loosely held opinions about design patterns, architecture, and modeling. You are passionate about solving business challenges with data and take pride in implementing scalable solutions. You have the skills to translate ambiguous, non-technical asks into specific technical tasks and deliver them incrementally. You are product-minded and team-oriented, willing to hear others’ opinions and to educate on best practices. You have excellent communication skills (particularly written).
Your impact:
- Enable and scale self-serve analytics for all Fleetio team members. You’ll architect clean data sets by modeling data and metrics via tools like dbt to empower employees to make data-driven decisions with accurate information.
- Develop data destinations and custom integrations, and maintain open-source packages that allow our customers to easily integrate Fleetio data with the modern data stack.
- Maintain and develop custom data pipelines from operational source systems to our data platform for both streaming and batch sources.
- Contribute to the development of our internal data infrastructure stack. Improve the hygiene and integrity of our data platform and ancillary tools by maintaining and monitoring the ELT pipeline.
- Architect, design and implement core components of our data platform beyond the traditional warehouse to include data observability, experimentation, data science and other data products.
- Develop and maintain streaming data pipelines from a variety of databases and data sources.
- Collaborate with other Fleetians around the company to understand data needs and ensure required data is collected, modeled, and available to team members.
- Document best practices and coach/advise data analysts, product managers, engineers, and others on data modeling, SQL query optimization, and query reusability. Keep our data platform tidy by managing roles and permissions and deprecating old projects.
Your experience:
- 5+ years of experience working in a data engineering or data-focused software engineering role.
- Experience transforming raw data into clean models using standard tools of the modern data stack, plus a deep understanding of ELT and data modeling concepts.
- Experience with streaming data and pipelines built on technologies such as Kafka or Kinesis.
- Proficiency in Python and a proven track record of delivering production-ready Python applications.
- Experience in designing, building, and administering modern data pipelines and data warehouses.
- Experience with dbt.
- Experience with semantic layers such as Cube or MetricFlow.
- Experience with Snowflake, BigQuery, or Redshift.
- Experience with version control tools such as GitHub or GitLab.
- Experience with ELT tools such as Stitch or Fivetran.
- Experience with orchestration tools such as Prefect or Dagster.
- Experience with CI/CD and IaC tooling such as GitHub Actions and Terraform.
- Experience with business intelligence solutions (Metabase, Looker, Tableau, Periscope, Mode).
- Experience with serverless cloud functions (AWS Lambda, Google Cloud Functions, etc.).
- Excellent communication and project management skills with a customer service focused mindset.
- Be sure to mention “coffee” in your application so we know you read this.
Considered a plus:
- Experience with data marketplaces/private sharing technologies such as Snowflake Marketplace or AWS Data Exchange.
- Experience contributing to open-source projects.
- Experience with full-stack engineering.
Benefits:
- Multiple health/dental coverage options
- Vision insurance
- Incentive stock options
- 401(k) match of 4%
- PTO – 4 weeks
- 12 company holidays + 2 floating holidays
- Parental leave – birthing parent (12 weeks paid), non-birthing parent (4 weeks)
- FSA & HSA options
- Short and long term disability (short term 100% paid)
- Community service funds
- Professional development funds
- Wellbeing fund – $150 quarterly
- Business expense stipend – $125 quarterly
- Mac laptop + new hire equipment stipend
- Monthly catered lunches
- Fully stocked kitchen with tons of drinks & snacks
- Remote working friendly since 2012 #LI-REMOTE
Fleetio provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment. We celebrate diversity and are committed to creating an inclusive environment for all. All employment is decided on the basis of qualifications, merit and business need.