airflow Remote Jobs

138 Results

+30d

Lead Product Architect - Qubole

Sales · Full Time · DevOps · Master’s Degree · Airflow · Design · Ruby · Java · C++ · Python · JavaScript

Idera, Inc. is hiring a Remote Lead Product Architect - Qubole

The Lead Product Architect is a key member of the core team tasked with defining and delivering product releases that achieve defined business goals. This role owns all technical and delivery matters […]

See more jobs at Idera, Inc.

Apply for this job

+30d

Sales & Partnerships - Data Engineer

lastminute.com · Lisbon, Portugal · Remote
Sales · 2 years of experience · Tableau · Scala · Airflow · SQL · Design · Mobile · Python · AWS

lastminute.com is hiring a Remote Sales & Partnerships - Data Engineer

Job Description

lastminute.com is looking for a Data Engineer for its Sales & Partnerships team inside the Data & Analytics department.

The Sales & Partnerships domain team focuses on reports, tables, analyses and, more generally, all deliverables related to the company's sales data, creating significant value in support of business decision-making. Particular emphasis will be placed on partnerships data preparation and analysis: helping the business find the best solutions with partners, monitoring performance, and evaluating the effectiveness of sales campaigns, agreements and initiatives over time.

The candidate will have the opportunity to become a key member of the team leveraging their engineering skills to acquire, manipulate, orchestrate and monitor data.

Data is at our core, and its reliability and effectiveness have a direct impact on producing actionable insights and improving business performance.

* Please note that this is a remote working position; remote arrangements can be evaluated within Portuguese territory only.

Qualifications

Key Responsibilities

  • Understand and analyse functional needs and raw data, and develop dimensional data models
  • Design, build and deploy data pipelines with a focus on automation, performance optimization, scalability, and reliability
  • Help the business understand the data and find insights that enable the company to make data-driven decisions
  • Leverage data and business principles to solve large-scale web, mobile and data infrastructure problems
  • Build data expertise and own data quality for your area

 

Skills and Experience

Essentials

  • At least 2 years of experience in a similar role in a fast-paced environment
  • Advanced SQL knowledge
  • Experience in Data Modelling
  • Experience in ETL design, implementation and maintenance
  • Experience with workflow management engines (e.g. Airflow, Google Cloud Composer, Talend)
  • Experience with data quality and validation
  • Fluent in English both written and spoken
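
Workflow engines like the ones named above (Airflow, Google Cloud Composer) model a pipeline as a directed acyclic graph of tasks, where each task runs only after its upstream dependencies succeed. As a rough sketch of that dependency model, using only the Python standard library rather than Airflow itself, and with hypothetical task names:

```python
from graphlib import TopologicalSorter

# Hypothetical sales-data pipeline: extract -> transform -> load,
# plus a data-quality check that also depends on "transform".
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "quality_check": {"transform"},
}

def run(task: str) -> None:
    # Stand-in for real work (an SQL query, an API call, a file copy).
    print(f"running {task}")

# static_order() yields each task only after all of its dependencies,
# which is the same guarantee an orchestrator like Airflow provides.
order = list(TopologicalSorter(dag).static_order())
for task in order:
    run(task)
```

In a real Airflow deployment each key in `dag` would be an operator and the scheduler, not a loop, would decide when tasks run; the sketch only illustrates the dependency ordering.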


Desirable 

  • Bachelor's or Master's degree in Statistics, Mathematics, Engineering, Physics or a similar field
  • Experience working with cloud or on-prem Big Data/MPP analytics platform (e.g. AWS Redshift, Google BigQuery or similar)
  • Programming languages knowledge (e.g. Python, R, Scala)
  • Experience in analysing data to discover opportunities, address gaps and anomaly/outlier detection
  • Experience with analytics tools (e.g. QlikView, Tableau, Spotfire)
  • Familiarity with digital and e-commerce business

 

Abilities/qualities 

  • Problem-solving and decision-making skills and innovative thinking
  • Proactivity and strategic approach
  • Ability to interface with business stakeholders, presenting and negotiating your solutions
  • Passionate about digital world, ambitious and motivated with a can-do attitude
  • High attention to detail and ability to effectively manage multiple projects at a time, successfully able to meet deadlines
  • Strong team player with a willingness to challenge existing processes and applications

See more jobs at lastminute.com

Apply for this job

+30d

Senior AI Infra Engineer, Caper

Instacart · United States - Remote
S3 · Master’s Degree · Airflow · Design · Docker · Elasticsearch · Kubernetes · AWS

Instacart is hiring a Remote Senior AI Infra Engineer, Caper

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or their favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

 

Overview

About the Role

We are seeking a highly skilled and motivated AI Infra Engineer to design, develop, and maintain our data platform specifically tailored for deep learning and computer vision applications. You will be responsible for building and optimizing our data infrastructure to support large-scale data collection, labeling, management, model training, evaluation, and continuous deployment. Your work will be critical in enabling our AI and computer vision teams to build and deploy state-of-the-art models efficiently and reliably.

About the Team

The AI and CV team at Caper (Instacart) innovates at the industry frontier across cloud and edge computing. The systems and algorithms built enable a magical shopping and checkout process in grocery stores. Our enthusiastic researchers and engineers are spread across different time zones but collaborate effectively on multiple exciting projects.

About the Job 

Your responsibilities will include one or more of the following:

  • Design, build, and maintain scalable and efficient data pipelines for collecting, processing, and storing large volumes of structured and unstructured data, specifically image and video streams and relevant metadata.
  • Develop and integrate tools for data labeling and annotation, ensuring high-quality training datasets for deep learning and computer vision models.
  • Collaborate with data scientists and machine learning engineers to build and optimize the infrastructure required for training and evaluating deep learning models at scale.
  • Build and maintain CI/CD pipelines to seamlessly deploy machine learning models into production environments.
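
As a loose illustration of the first bullet (collecting and storing image and video metadata), a pipeline stage commonly validates incoming records and quarantines malformed ones before they reach training datasets. Everything below is a hypothetical sketch with invented field names, not Instacart's actual stack:

```python
# Hypothetical camera-event records; "frame_id" and "ts" are invented fields.
events = [
    {"frame_id": "f1", "ts": 100, "camera": "aisle-3"},
    {"frame_id": "f2", "ts": None, "camera": "aisle-3"},   # malformed: no timestamp
    {"frame_id": "f3", "ts": 104, "camera": "checkout-1"},
]

def is_valid(event: dict) -> bool:
    # A real validator would also check schema versions, image checksums, etc.
    return event["ts"] is not None and bool(event["frame_id"])

# Route valid records onward and quarantine the rest. Here "dead_letter" is a
# plain list; in production it would be a queue topic or an object-store prefix.
valid = [e for e in events if is_valid(e)]
dead_letter = [e for e in events if not is_valid(e)]
print(len(valid), len(dead_letter))  # 2 1
```

The dead-letter pattern keeps bad records out of labeling and training while preserving them for debugging, which matters when datasets feed model evaluation downstream.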

About You

Minimum Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, full-stack and/or infrastructure development.
  • Proven experience with building and maintaining large-scale data pipelines (batching or streaming) for computer vision and/or machine learning applications.
  • Familiarity with observability and monitoring tools (e.g. Datadog) and best practices. 
  • Familiarity with frameworks for large-scale data processing (e.g., Kafka, Spark, Airflow, Ray), storage (e.g., S3, Delta Lake), indexing and search (e.g. Elasticsearch).
  • Experience with cloud platforms (e.g., AWS, GCP) and containerization technologies (e.g., Docker, Kubernetes).
  • Strong problem-solving skills to work in a fast-paced, dynamic environment.
  • Excellent communication skills to work collaboratively in a cross-functional team

Preferred Qualifications

  • Experience building and/or integrating computer vision data collection, labeling and management systems.
  • Experience in edge inference and optimization on Nvidia chipsets. 
  • Experience with deep learning frameworks (e.g., TensorFlow, PyTorch) and model management platforms (e.g., Kubeflow, MLflow, TensorBoard).
  • Knowledge of computer vision and machine learning algorithms and models.
  • Experience with frameworks and best practices for data security or compliance.

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$198,000 - $220,000 USD
WA
$190,000 - $211,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$182,000 - $202,000 USD
All other states
$165,000 - $183,000 USD

See more jobs at Instacart

Apply for this job

+30d

Data Operations Engineer

Pleo · Remote - India
Sales · Agile · Airflow · Design · Scrum · PostgreSQL · Python

Pleo is hiring a Remote Data Operations Engineer

Let’s face it: out-of-pocket expenses suck. And manual expense spreadsheets are old-school. No one wants to wait until payday to be reimbursed for something they bought for work, and finance teams have better things to do than spend hours tapping away on Excel. At Pleo, we’re on a mission to change this. We’re here to make spend management surprisingly effective and empowering – for finance teams and employees. But.. we need your help!

What do we need?

As part of our growing Revenue Operations domain, we have established a team called Business Architecture & Technology. This team will focus on optimising processes, data, and tools for the commercial part of the business (Marketing, Sales, Partnerships & Customer Experience). For this team, we are looking for a best-in-class Data Operations Engineer to join our new hub in Chennai, India on a hybrid working arrangement.

Your role will be to help us design and implement new data structures and integrate new tools to support this. Think of it this way: we need to make sure the commercial side of Pleo runs as smoothly and efficiently as possible. We achieve this by designing and maintaining the tech stack, extensions, and related processes. We are a leading design customer of many vendors and always aim to push our tools above and beyond.

In Pleo, Revenue Operations is a cross-functional discipline that goes across many areas, such as the demand and opportunity management process, sales planning and forecasting, and monitoring business performance. Hence, this is a unique opportunity to learn more about the many different aspects of running a business, the decision processes and prioritisation needed to execute strategic projects, and insights into where the company is going - almost like a front-row seat!

What to expect in the role

In numbers: 75% of your time will be spent working with our data models, data pipelines and creating easy-to-understand and readable documentation, while the other 25% of the time will be discussing priorities with our stakeholders.

  • You will design, develop and maintain high-quality, scalable data pipelines and datasets for commercial teams.
  • Own the ETL and rETL (reverse ETL) processes and enable data activation in your stakeholders' projects.
  • Optimise performance and scalability of existing data pipelines and datasets through code refactoring and infrastructure improvements.
  • Collaborate with internal teams to understand business requirements and translate them into technical specifications.
  • Troubleshoot and resolve issues related to HubSpot, Vitally and Iterable configuration, resolving bugs and tickets that might arise.

What we need from you:

  • Minimum of 1 year of professional experience as a data engineer, data analyst or a proven track record working as an analyst in a similar operations position.
  • Business understanding of what data activation means and how your work impacts commercial teams in the front line (such as Sales, Marketing and Customer Success).
  • Good understanding and hands-on experience with SQL. Understanding of dbt.
  • Experience writing digestible and accessible documentation for your stakeholders.
  • Experience building or maintaining ETL pipelines with Python; Node.js is a plus.
  • Familiarity with Agile development methodologies and tools (e.g., Scrum, Kanban).
  • Willingness to learn new technologies and languages.
  • Ability to work independently and contribute to multiple projects simultaneously, delivering high-quality results within deadlines.
  • Familiarity with database systems and structures (e.g. PostgreSQL, BigQuery).
  • Excellent problem-solving skills and the ability to debug and resolve complex technical issues efficiently.
  • Strong communication skills, both written and verbal in English (our company language), with the ability to effectively collaborate with cross-functional teams.
  • Familiarity with: Census, Fivetran, dbt, Airflow, Castor, HubSpot, Vitally.
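
Since the role owns ETL and reverse ETL (pushing modelled warehouse data back into frontline tools like HubSpot), a minimal, self-contained sketch of the ETL half may help: extract raw records, cast types, load into a warehouse table, and expose a dataset commercial teams can query. The table and field names below are purely illustrative, and an in-memory SQLite database stands in for the warehouse:

```python
import sqlite3

# Extract: an illustrative raw CRM export with string-typed amounts.
raw_deals = [
    {"id": 1, "amount": "1200.50", "stage": "won"},
    {"id": 2, "amount": "80.00", "stage": "lost"},
    {"id": 3, "amount": "2400.00", "stage": "won"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deals (id INTEGER PRIMARY KEY, amount REAL, stage TEXT)")

# Transform + Load: cast amounts to floats and insert into the warehouse table.
conn.executemany(
    "INSERT INTO deals VALUES (?, ?, ?)",
    [(d["id"], float(d["amount"]), d["stage"]) for d in raw_deals],
)

# A downstream dataset a commercial team might consume; in reverse ETL this
# value would be synced back into a tool such as a CRM rather than just queried.
won_revenue = conn.execute(
    "SELECT SUM(amount) FROM deals WHERE stage = 'won'"
).fetchone()[0]
print(won_revenue)  # 3600.5
```

In practice the extract and load steps would be handled by tools like Fivetran and Census from the familiarity list above, with dbt owning the transform layer.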

Show me the Benefits!

  • Your own Pleo card (no more out-of-pocket spending!)
  • A monthly allowance of €55 (INR equivalent) towards your lunch
  • Hybrid, flexible working arrangement
  • 25 days of PTO + public holidays
  • Option to purchase 5 additional days of holiday through a salary sacrifice
  • Wellbeing days - fully paid days off designed for a slower pace, allowing you to take time to recharge and prioritise self-care
  • We’re trialling MyndUp to give our employees access to free mental health and wellbeing support, with great success so far ❤️
  • Access to LinkedIn Learning - acquire new skills, stay abreast of industry trends and fuel your personal and professional development continuously 
  • Paid parental leave - we want to make sure that we're supportive of families and help you feel that you don't have to compromise your family due to work

Why join us?

Working at Pleo means you're working on something very exciting: the future of work. Our mission is to help every company go beyond the books. Pleo itself means ‘more than you’d expect’, and it’s been the secret to our success over the last 8 years. So it’s only fitting that we’d pass this philosophy onto our customers to help them make the most of their finances.

We think company spending should be delegated to all employees and teams, that it should be as automated as possible, and that it should drive a culture of responsible spending. Finance teams shouldn’t be siloed from the rest of the organisation – they should work in unity with marketing, sales, IT and everyone else.

Speaking of working in unity, our values tell the story of how we work at Pleo. We have four core values, the first of which is ‘champion the customer’, which means we address real pain points that businesses face. Next up is ‘succeed as a team’, which highlights how our strength lies in our diversity and trust in each other. We also ‘make it happen’ by taking bold decisions and following through to deliver results. Last but not least, we ‘build to scale’, creating lasting solutions that address today’s challenges and anticipate tomorrow’s needs.

So, in a nutshell, that's Pleo. Today we are an 850+ person team, from over 100 nations, sitting in our Copenhagen HQ and our London, Stockholm, Berlin, Madrid, Montreal and Lisbon offices—and quite a few full-time remotes in 35 other countries! Being HQ'd out of Copenhagen means we're inspired by things like a good work-life balance. If you don't work in the office with us, we'll help you set up the best remote setup possible and make sure you still have time to connect with your team.

About your application

  • Please submit your application in English; it’s our company language so you’ll be speaking lots of it if you join
  • We treat all candidates equally: if you are interested, please apply through our application system - any correspondence should come from there! Our support team isn't able to pass on calls or emails, and this makes sure that the candidate experience is smooth and fair to everyone
  • We’re on a mission to make everyone feel valued at work. That’s only achievable if our team reflects the diversity of the world around us - and that starts with you, hitting apply, even if you are worried you might not tick all the boxes! We embrace and encourage people from all backgrounds to apply - regardless of race/ethnicity, colour, religion, nationality, gender, sex, sexual orientation, age, marital status, disability, neurodiversity, socio-economic status, culture or beliefs.
  • When you submit an application we process your personal data as a data processor. Find out more about how your data is used in the FAQs section at the bottom of our jobs page.

See more jobs at Pleo

Apply for this job

+30d

Engineering Manager, Data Infrastructure

Grammarly · Germany; Hybrid
remote-first · Airflow · Design · Azure · AWS

Grammarly is hiring a Remote Engineering Manager, Data Infrastructure

Grammarly is excited to offer a remote-first hybrid working model. Grammarly team members in this role must be based in Germany, and, depending on business needs, they must meet in person for collaboration weeks, traveling if necessary to the hub(s) where their team is based.

This flexible approach gives team members the best of both worlds: plenty of focus time along with in-person collaboration that fosters trust and unlocks creativity.

About Grammarly

Grammarly is the world’s leading AI writing assistance company trusted by over 30 million people and 70,000 teams. From instantly creating a first draft to perfecting every message, Grammarly helps people at 96% of the Fortune 500 and teams at companies like Atlassian, Databricks, and Zoom get their point across—and get results—with best-in-class security practices that keep data private and protected. Founded in 2009, Grammarly is No. 14 on the Forbes Cloud 100, one of TIME’s 100 Most Influential Companies, one of Fast Company’s Most Innovative Companies in AI, and one of Inc.’s Best Workplaces.

The Opportunity

To achieve our ambitious objectives, we are seeking a highly skilled and experienced Manager for our Data Infrastructure team. This role is crucial in managing and evolving our data ingestion processes and tooling to support self-serve analytics and policy management across the organization. The ideal candidate will possess strong technical expertise, exceptional leadership abilities, and the capability to mentor and develop a high-performing team.

This person will be an integral part of the larger data organization, reporting directly to the Director of Data Engineering based in the US, and they’ll have the opportunity to influence decisions and the direction of our overall data platform, including infrastructure, data processing and analytics engineering.

Grammarly’s engineers and researchers have the freedom to innovate and uncover breakthroughs—and, in turn, influence our product roadmap. The complexity of our technical challenges is growing rapidly as we scale our interfaces, algorithms, and infrastructure. You can hear more from our team on our technical blog.

As the Manager of the Data Infrastructure team, you will lead and mentor a team of data & software engineers, fostering a collaborative and innovative environment focused on professional growth. You will oversee the design, implementation, and maintenance of secure, scalable, and optimized data infrastructure, ensuring high performance and reliability. Your role includes developing and executing strategic roadmaps aligned with business objectives and collaborating closely with cross-functional teams and the larger data organization to ensure seamless data integration and access. Additionally, you will provide technical leadership and be pivotal in resource management and recruiting efforts, driving the team’s success and aligning with the organization’s long-term data strategy.

In this role, you will:

  • Build a highly specialized engineering team to support the growing needs and complexity of our product and business organizations. 
  • Oversee the design, implementation, and maintenance of a robust data ingestion framework that ensures high availability and reliability.
  • Develop and manage tooling that enables self-serve analytics and policy management across the organization.
  • Ensure data is collected, transformed, and stored efficiently to support real-time and batch processing needs.
  • Act as a liaison between the local team and the broader organization to ensure seamless communication and collaboration.
  • Participate in cross-functional meetings and initiatives to represent the Data Infrastructure team’s interests and contribute to the organization’s overall data strategy.
  • Drive the evaluation, selection, and implementation of new technologies and tools that enhance the team’s capabilities and the organization’s data infrastructure.
  • Implement and enforce data governance policies and practices to ensure data quality, security, and compliance with organizational standards.
  • Collaborate with stakeholders to define and refine data policies that align with business objectives.
  • Monitor and assess the performance of the data infrastructure to identify areas for optimization and improvement.
  • Foster a collaborative and high-performance culture within the team.
  • Cultivate an ownership mindset and culture on your team and across product teams: provide the necessary metrics to help us understand what is working, what is not, and how to fix it.
  • Set high performance and quality standards, coach team members to meet them; mentor and grow junior and senior IC talent.

Qualifications

  • 7+ years of experience in data engineering or data infrastructure, with at least 2-3 years in a leadership or management role.
  • Proven experience in building and managing large-scale data ingestion pipelines and infrastructure.
  • Experience with one or more data platforms (e.g., AWS, GCP, Azure Databricks) 
  • Familiarity with modern data engineering tools and frameworks (e.g., Apache Kafka, Airflow, DBT)
  • Strong understanding of data governance, policy management, and self-serve analytics.
  • Excellent leadership and people management skills, with a track record of mentoring and developing high-performing teams.
  • Experience working with geographically distributed teams and aligning with global data strategies.
  • Strong problem-solving skills, with the ability to navigate and resolve complex technical challenges.
  • Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across different locations and time zones.
  • Has the ability and desire to operate in a fast-paced, dynamic environment where things change quickly.
  • Leads by setting well-understood goals and sharing the appropriate level of context for maximum autonomy but is also deeply technical and can dive in to help when necessary.
  • Embodies our EAGER values—is ethical, adaptable, gritty, empathetic, and remarkable.
  • Is inspired by our MOVE principles: move fast and learn faster; obsess about creating customer value; value impact over activity; and embrace healthy disagreement rooted in trust.
  • Is able to meet in person for their team’s scheduled collaboration weeks, traveling if necessary to the hub where their team is based.

Support for you, professionally and personally

  • Professional growth:We believe that autonomy and trust are key to empowering our team members to do their best, most innovative work in a way that aligns with their interests, talents, and well-being. We also support professional development and advancement with training, coaching, and regular feedback.
  • A connected team: Grammarly builds a product that helps people connect, and we apply this mindset to our own team. Our remote-first hybrid model enables a highly collaborative culture supported by our EAGER (ethical, adaptable, gritty, empathetic, and remarkable) values. We work to foster belonging among team members in a variety of ways. This includes our employee resource groups, Grammarly Circles, which promote connection among those with shared identities including BIPOC and LGBTQIA+ team members, women, and parents. We also celebrate our colleagues and accomplishments with global, local, and team-specific programs. 
  • Comprehensive benefits for candidates based in Germany:Grammarly offers all team members competitive pay along with a benefits package encompassing life care (including mental health care and risk benefits) and ample and defined time off. We also offer support to set up a home office, wellness and pet care stipends, learning and development opportunities, and more.

We encourage you to apply

At Grammarly, we value our differences, and we encourage all to apply. Grammarly is an equal-opportunity company. We do not discriminate on the basis of race or ethnic origin, religion or belief, gender, disability, sexual identity, or age.

For more details about the personal data Grammarly collects during the recruitment process, for what purposes, and how you can address your rights, please see the Grammarly Data Privacy Notice for Candidates here

#LI-Hybrid

 

Apply for this job

+30d

Data Engineer (Intermediate)

SecurityScorecard · Remote (Canada)
Redis · Agile · Bachelor's degree · Scala · NoSQL · Airflow · Postgres · SQL · Design · C++ · Jenkins · AWS

SecurityScorecard is hiring a Remote Data Engineer (Intermediate)

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting; making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint. 

Headquartered in New York City, our culture has been recognized by Inc Magazine as a "Best Workplace,” by Crain’s NY as a "Best Places to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company’s annual list of the World’s Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award recognizing “forward-thinking employers for their unwavering commitment to employee engagement.” SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody’s, Sequoia Capital, GV and Riverwood Capital.

About the Team

The Data Analytics Engineering team is responsible for developing and managing the core data platform for ratings infrastructure, architecting and implementing business-critical data solutions and pipelines, and enabling data-driven decisions within the organization and for our customers.

About the Role

As a Senior Data Engineer, you will work alongside outstanding engineers to implement new products and features focused on meeting the evolving needs of our customers, while refining requirements with product management and collaborating across teams. All our team members actively participate in product definition, technical architecture review, iterative development, code review, and operations. Along with this, you’ll have the opportunity to interact with customers to ensure their needs are met. You will work in a high-performance, fast-paced setting and contribute to an inclusive work environment.

Responsibilities:

  • Lead and collaborate with engineers to deliver projects from inception to successful execution
  • Write well-crafted, well-tested, readable, maintainable code
  • Participate in code reviews to ensure code quality and distribute knowledge
  • Share engineering support, release, and on-call responsibilities for an always-on 24x7 site
  • Participate in Technical Design Review sessions, and have the ability to explain the various trade-offs made in decisions
  • Maintain existing APIs and data pipelines, contribute to increasing code-coverage 
  • Understand requirements, build business logic, and demonstrate the ability to learn and adapt quickly
  • Automate and improve existing processes to sustainably maintain the current features and pipelines
  • Analyze our internal systems and processes, and locate areas for improvement/automation

Requirements

  • BS/MS in computer science or equivalent technical experience, and must have worked in Data engineering space for 3+ years
  • Must have experience in full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations 
  • Technical requirements:
    • Must have 2+ years experience in building and maintaining big data pipelines using Scala with Spark, Airflow, Hive, Presto, Redis
    • Must have 2+ years experience with NoSQL databases, preferably Cassandra / Scylla and Clickhouse; and SQL databases, preferably Postgres
    • Must have 2+ years experience in developing batch/real-time data streams 
    • Worked with CI/CD pipelines using Jenkins
    • Experience with cloud environments, preferably AWS
    • Worked with variety of data (structured/unstructured), data formats (flat files, XML, JSON, relational, parquet)
  • Worked in Agile methodology
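
The "variety of data formats" requirement above usually means normalising heterogeneous inputs (flat files, JSON, XML, etc.) into one schema before they enter a pipeline. A small sketch in Python using only the standard library; the feeds, fields and the (domain, score) schema are invented for illustration:

```python
import csv
import io
import json

# Two hypothetical feeds describing the same entities: a CSV flat file
# and newline-delimited JSON. Both are normalised to (domain, score) tuples.
csv_feed = "domain,score\nexample.com,91\ntest.org,74\n"
jsonl_feed = '{"domain": "demo.net", "score": 88}\n'

def from_csv(text: str) -> list[tuple[str, int]]:
    return [(row["domain"], int(row["score"]))
            for row in csv.DictReader(io.StringIO(text))]

def from_jsonl(text: str) -> list[tuple[str, int]]:
    return [(rec["domain"], int(rec["score"]))
            for rec in (json.loads(line) for line in text.splitlines() if line.strip())]

# Once every source maps to the same tuple shape, downstream code
# (Spark jobs, SQL loads, validation) only has to handle one schema.
records = from_csv(csv_feed) + from_jsonl(jsonl_feed)
print(records)  # [('example.com', 91), ('test.org', 74), ('demo.net', 88)]
```

At the scale this posting describes, the per-format readers would be Spark or Hive readers rather than stdlib parsers, but the normalise-at-the-edge pattern is the same.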

Benefits

We offer a competitive salary, stock options, a comprehensive benefits package, including health and dental insurance, unlimited PTO, parental leave, tuition reimbursements, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy) gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law. 

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position. #LI-DNI

See more jobs at SecurityScorecard

Apply for this job

+30d

Senior Data Engineer

StyleSeat · 100% Remote (U.S. Based Only, Select States)
Scala · NoSQL · Airflow · SQL · Design · C++ · Docker · MySQL · Python

StyleSeat is hiring a Remote Senior Data Engineer

Senior Data Engineer

100% Remote (U.S. Based Only, Select States - See Below)

About the role

StyleSeat is looking to add a Senior Data Engineer to its cross-functional Search product team. This team of data scientists, analysts, data engineers, software engineers and SDETs is focused on improving our search capability and customer search experience. The Senior Data Engineer will use frameworks and tools to perform ETL and will propose abstractions of those methods to help solve data-ingestion problems.

What you’ll do

  • Handle data engineering tasks in a team focused on improving search functionality and customer search experience.
  • Design, develop, and own ETL pipelines that deliver data with measurable quality.
  • Scope, architect, build, release, and maintain data-oriented projects with performance, stability, and error-free operation in mind.
  • Identify and resolve pipeline issues while discovering opportunities for improvement.
  • Architect scalable and reliable solutions to move data across systems from multiple products in nearly real-time.
  • Continuously improve our data platform and keep the technology stack current.
  • Solve critical issues in complex designs or coding schemes.
  • Monitor metrics, analyze data, and partner with other internal teams to solve difficult problems creating a better customer experience.
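
The pipeline work above boils down to extract-transform-load steps gated by quality checks. As a rough illustration only (the team's actual stack and schemas aren't shown here, so every function and field name below is made up), a minimal pure-Python sketch:

```python
# Minimal ETL sketch: extract rows, transform them, and gate the load on a
# data-quality check. All names are hypothetical, for illustration only.

def extract():
    # Stand-in for reading from a source system (e.g. an API or a MySQL table).
    return [
        {"stylist_id": 1, "city": "Austin "},
        {"stylist_id": 2, "city": None},
        {"stylist_id": 3, "city": "denver"},
    ]

def transform(rows):
    # Normalize the city field and drop rows missing required values.
    return [
        {**r, "city": r["city"].strip().title()}
        for r in rows
        if r["city"] is not None
    ]

def quality_check(raw, clean, max_drop_rate=0.5):
    # Fail the pipeline run if too many rows were discarded in transform.
    dropped = len(raw) - len(clean)
    return dropped / len(raw) <= max_drop_rate

raw = extract()
clean = transform(raw)
assert quality_check(raw, clean), "drop rate exceeded threshold"
print(clean)
```

In a production pipeline each step would be a separate task in a workflow tool such as Airflow, with the quality gate failing the run before any load happens.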

Who you are 

Successful candidates can come from a variety of backgrounds, yet here are some of the must-have and nice-to-have experiences we’re looking for:

Must-Have:

  • Expert SQL skills.
  • 4+ years of experience with:
    • Scaling and optimizing schemas.
    • Performance tuning ETL pipelines.
    • Building pipelines for processing large amounts of data.
  • Proficiency with Python, Scala and other scripting languages.
  • Experience with:
    • MySQL and Redshift.
    • NoSQL data stores, methods and approaches.
    • Kinesis or other data streaming services. 
    • Airflow or other pipeline workflow management tools. 
    • EMR, Spark and ElasticSearch.
    • Docker or other container management tools. 
    • Developing infrastructure as code (IAC).
  • Ability to effectively work and communicate with cross-departmental partners and non-technical teams.

Nice to Have:

  • Experience with:
    • Segment customer data platform with integration to Braze.
    • Terraform. 
    • Tableau.
    • Django.
    • Flask.

Salary Range

Our job titles may span more than one career level. The career level we are targeting for this role has a base pay between $136,900 and $184,600. The actual base pay is dependent upon many factors, such as training, transferable skills, work experience, business needs and market demands. Base pay ranges are subject to change and may be modified in the future.

Who we are

StyleSeat is the premier business platform for SMBs in the beauty and wellness industry to run and grow their business, and the destination for consumers to discover, book and pay. To date, StyleSeat has powered more than 200 million appointments totaling over $12 billion in revenue for small businesses. StyleSeat is a platform and marketplace designed to support and promote the beauty and personal care community.

Today, StyleSeat connects consumers with top-rated beauty professionals in their area for a variety of services, including hair styling, barbering, massage, waxing, and nail care, among others. Our platform ensures that Pros maximize their schedules and earnings by minimizing gaps and cancellations, effectively attracting and retaining clientele.

StyleSeat Culture & Values 

At StyleSeat, our team is committed to fostering a positive and inclusive work environment. We respect and value the unique perspectives, experiences, and skills of our team members and work to create opportunities for all to grow and succeed. 

  • Diversity - We celebrate and welcome diversity in backgrounds, experiences, and perspectives. We believe in the importance of creating an inclusive work environment where everyone can thrive. 
  • Curiosity - We are committed to fostering a culture of learning and growth. We ask questions, challenge assumptions, and explore new ideas.
  • Community - We are committed to making a positive impact on each other, even when win-win-win scenarios are not always clear or possible in every decision. We strive to find solutions that benefit the community as a whole and drive our shared success.
  • Transparency - We are committed to open, honest, and clear communication. We hold ourselves accountable for maintaining the trust of our customers and team.
  • Entrepreneurship - We are self-driven big-picture thinkers - we move fast and pivot when necessary to achieve our goals. 

Applicant Note: 

StyleSeat is a fully remote, distributed workforce, however, we only have business entities established in the below list of states and, thus, are unable to consider candidates who live in states not on this list for the time being.
Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

 

* Alabama

* Arizona

* California

* Colorado

* Florida

* Georgia

* Illinois

* Indiana

* Massachusetts

* Maryland

* Michigan

* Nebraska

* New York

* New Jersey 

* Ohio

* Oregon

* Pennsylvania

* Virginia 

* Washington

 

See more jobs at StyleSeat

Apply for this job

+30d

Technical Lead - Data Engineering

carwow, London, England, United Kingdom, Remote
DevOPS, tableau, terraform, airflow, sql, Design, ruby, postgresql, python, AWS

carwow is hiring a Remote Technical Lead - Data Engineering

THE CARWOW GROUP

Carwow Group is driven by a passion for getting people into cars. But not just any car, the right car. That’s why we are building the go-to destination for car-changing, designed to reach drivers everywhere with our trail-blazing portfolio of personality-rich automotive brands: Carwow, Auto Express, evo, Driving Electric and Car Buyer.

What started as a simple reviews site is now one of the largest online car-changing destinations in Europe - over 10m customers have used Carwow to help them buy and sell cars since its inception. Last year we grew over 50%, with nearly £3bn worth of cars bought on site, while £1.8bn of cars were listed for sale through our Sell My Car service.

In 2024 we went big and acquired Autovia, doubling our audience overnight. Together we now have one of the biggest YouTube channels in the world with over 1.1 billion annual views, sell 1.2 million print copies of our magazines and have an annual web content reach of over 350 million.

WHY JOIN US?

We are winners of the prestigious Culture 100 award that recognises the most loved and happiest tech companies to work for! We have just raised $52m in funding led by global venture capital firm Bessemer Venture Partners (an early backer of LinkedIn and Shopify) to accelerate our growth plans!

As pioneers, we’re always driving for new territory and positive change, so our work as a group is never done. Where others see difficulty, it’s our responsibility to see possibility – building new experiences, launching new titles and listening to drivers.

Being a part of Carwow Group means championing drivers and the automotive industry, acting as a disrupter and never being afraid to fail (but learning fast when we do!).

Our team of 500 employees across the UK, Germany, Spain and Portugal are revolutionising car-changing and we are fast expanding our mission across every single brand and country we operate in, so jump in!

ABOUT THE TEAM

As a team our mission is to empower all Carwow teams with access to reliable, actionable data and enable informed decision-making through scalable and secure data platforms.

We’re investing heavily in our Data Engineering capabilities at carwow, ensuring we have the expertise to deliver infrastructure and data-driven best practices that support the growing needs of our Data Analytics team. We build our ETL pipelines in Python and SQL, using Airflow to automate workflows, with data ingested into a Snowflake warehouse from various in-house and 3rd party sources and Terraform to manage infrastructure.
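
Airflow models a pipeline like the one described here as a DAG of tasks, where each task runs only after its upstream dependencies complete. The sketch below illustrates just that ordering idea using Python's standard library rather than Airflow's own API; the task names are hypothetical, not carwow's actual DAGs:

```python
# A DAG of pipeline tasks, expressed as downstream_task -> {upstream tasks}.
# graphlib (stdlib, Python 3.9+) gives a valid execution order; Airflow's
# scheduler does the equivalent, plus retries, backfills and monitoring.
from graphlib import TopologicalSorter

dag = {
    "load_snowflake": {"extract_listings", "extract_sales"},
    "dbt_transform": {"load_snowflake"},
    "refresh_dashboards": {"dbt_transform"},
}

# static_order() yields every task after all of its predecessors.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In Airflow itself the same shape is declared with task dependencies (for example via the `>>` operator between operators), and the scheduler handles the ordering.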

Our tech stack is primarily Ruby on Rails using PostgreSQL, all hosted on Heroku. We use Terraform to manage our infrastructure and encourage teams to be involved in how we deploy and run our code in Production.

You'll be supported by one of our Engineering Managers who are there to help you with your career progression and ensure that the team you are on are working as well as they can - continuously improving!

We have a career progression framework that means you know what is expected of you and how you can progress through our career ladder and your EM will work with you to make sure you are happy, fulfilled, and doing the best work you can be doing.

WHAT YOU’LL DO

  • You will be the Technical Lead within the Data Engineering squad, working simultaneously with the Principal Engineer, Product Manager and Engineering Manager whilst partnering closely with our Analytics & Data Science team to help enhance their productivity
  • You will help to shape the technical strategy for our approach to building data products and the future of Data Engineering at carwow by providing knowledge and expertise in optimum practices and implementation. This includes developing and testing product improvements
  • You will lead the Data Engineering team in scoping how new features can be built and how Stakeholder needs can be met while making pragmatic technical trade-offs
  • Develop and enhance our data architecture and pipelines to enable us to deliver against our business goals
  • Collaborate with and mentor your team by reviewing code and by monitoring and guiding your team's software in production. Collaborate with other Stakeholders, helping them to do their finest work and deliver projects according to our values and brand
  • Define a roadmap of initiatives to both improve our current infrastructure and assist other teams in creating new data products.
  • Contribute to a diverse Engineering culture based on customer-centricity, high-quality code, data-driven outcomes, technical innovation and business impact
  • Identify the important technical differentiating factors of our data product, and leverage those in our architecture

WHAT YOU’LL NEED

  • Previous experience as a Senior Data Engineer or Technical Lead
  • Knowledge of data modelling and schema design with a focus on efficiency, accuracy and scalability
  • Experience with SQL, Python and data infrastructure; ETL/ELT knowledge; and experience using DAGs to oversee script dependencies with tools like dbt, Airflow, Snowflake and Terraform
  • It's preferred but not essential if you have experience in Ruby, data visualisation tools (e.g. Looker, Tableau, Power BI), Amplitude, DevOps, Circle CI/CD, AWS
  • Experience guiding and mentoring distributed teams of Engineers to success, working cross-functionally with Product Managers, Stakeholders, and other Developers
  • A desire to learn continuously, share your knowledge, communicate effectively and build a product in close collaboration with others. You raise the bar for technical quality by sharing your expertise
  • Experience delivering work that has measurable impact and value to stakeholders – early and often. You actively scope work and drive projects forward
  • You are fully engaged in the product development life cycle, helping to shape your team’s roadmap

You’re not expected to be an expert in all of these technologies and tools, we are happy to support your learning journey. If you’re unsure about any of the above, please apply.

WHAT’S IN IT FOR YOU

  • Fully remote working role, with offices in London, Munich, Madrid, and Porto that you can work from
  • 4-5 trips to the London office for social and team bonding events
  • Competitive salary to fund that dream holiday to Bali
  • Matched pension contributions for a peaceful retirement
  • Share options - when we thrive, so do you!
  • Vitality Private Healthcare, for peace of mind, plus eyecare vouchers
  • Life Assurance for (even more) peace of mind
  • Monthly coaching sessions with Spill - our mental wellbeing partner
  • Enhanced holiday package, plus Bank Holidays
    • 28 days annual leave
    • 1 day for your wedding
    • 1 day off when you move house - because moving is hard enough without work!
    • For your third year anniversary, get 30 days of annual leave per year
    • For your tenth year anniversary, get 35 days of annual leave per year
    • Option to buy 3 extra days of holiday per year
  • Work from abroad for a month
  • Inclusive parental, partner and shared parental leave, fertility treatment and pregnancy loss policies
  • Bubble childcare support and discounted nanny fees for little ones
  • The latest tech (Macbook or Surface) to power your gif-sending talents
  • Up to £500/€550 home office allowance for that massage chair you’ve been talking about
  • Generous learning and development budget to help you master your craft
  • Regular social events: tech lunches, coffee with the exec sessions, book clubs, social events/anything else you pester us for
  • Refer a friend, get paid. Repeat for infinite money
  • Lunch & learns and Carwow Classrooms with expert speakers who are here for a free lunch

Diversity and inclusion is an integral part of our culture. We know that diverse teams are strong teams, so we welcome those with alternative identities, backgrounds, and experiences to apply for this position. We make recruiting decisions based on experience, skills and potential, so all our applicants are treated fairly and equally.

See more jobs at carwow

Apply for this job

+30d

Senior Data Engineer

EquipmentShare, Remote; Chicago; Denver; Kansas City; Columbia MO
Lambda, agile, airflow, sql, Design, c++, postgresql, python, AWS

EquipmentShare is hiring a Remote Senior Data Engineer

EquipmentShare is Hiring a Senior Data Engineer.

Your role in our team

At EquipmentShare, we believe it’s more than just a job. We invest in our people and encourage you to choose the best path for your career. It’s truly about you, your future, and where you want to go.

We are looking for a Senior Data Engineer to help us continue to build the next evolution of our data platform in a scalable, performant, and customer-centric architecture.

Our main tech stack includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python and TypeScript.

What you'll be doing

We are typically organized into agile cross-functional teams composed of Engineering, Product, and Design, which allows us to develop deep expertise and rapidly deliver high-value features and functionality to our next-generation T3 Platform.

You’ll be part of a close-knit team of data engineers developing and maintaining a data platform built with automation and self-service in mind to support analytics and machine learning data products for the next generation of our T3 Fleet, enabling end-users to track, monitor and manage the health of their connected vehicles and deployed assets.

We'll be there to support you as you become familiar with our teams, product domains, tech stack and processes — generally how we all work together.

Primary responsibilities for a Senior Data Engineer

  • Collaborate with Product Managers, Designers, Engineers, Data Scientists and Data Analysts to take ideas from concept to production at scale.
  • Design, build and maintain our data platform to enable automation and self-service for data scientists, machine learning engineers and analysts.
  • Design, build and maintain data product framework to support EquipmentShare application data science and analytics features.
  • Design, build and maintain CI/CD pipelines and automated data and machine learning deployment processes.
  • Develop data monitoring and alerting capabilities.
  • Document architecture, processes and procedures for knowledge sharing and cross-team collaboration.
  • Mentor peers to help them build their skills.

Why We’re a Better Place to Work

We can promise that every day will be a little different with new ideas, challenges and rewards.

We’ve been growing as a team and we are not finished just yet — there is plenty of opportunity to shape how we deliver together.

Our mission is to enable the construction industry with tools that unlock substantial increases to productivity. Together with our team and customers, we are building the future of construction.

T3 is the only cloud-based operating system that brings together construction workflows & data from constantly moving elements in one place.

  • Competitive base salary and market leading equity package.
  • Unlimited PTO.
  • Remote first.
  • True work/life balance.
  • Medical, Dental, Vision and Life Insurance coverage.
  • 401(k) + match.
  • Opportunities for career and professional development with conferences, events, seminars and continued education.
  • On-site fitness center at the Home Office in Columbia, Missouri, complete with weightlifting machines, cardio equipment, group fitness space, racquetball courts, a climbing wall, and much more!
  • Volunteering and local charity support that help you nurture and grow the communities you call home through our Giving Back initiative.
  • Stocked breakroom and full kitchen with breakfast and lunch provided daily by our chef and kitchen crew.

About You

You're a hands-on developer who enjoys solving complex problems and building impactful solutions.  Most importantly, you care about making a difference.

  • Take the initiative to own outcomes from start to finish — knowing what needs to be accomplished within your domain and how we work together to deliver the best solution.
  • You are passionate about developing your craft — you understand what it takes to build quality, robust and scalable solutions.
  • You’ll see the learning opportunity when things don’t quite go to plan — not only for you but for how we continuously improve as a team.
  • You take a hypothesis-driven approach — knowing how to source, create and leverage data to inform decision making, using data to drive how we improve, to shape how we evaluate and make platform recommendations.

So, what is important to us?

Above all, you’ll get stuff done. More importantly, you’ll collaborate to do the right things in the right way to achieve the right outcomes.

  • 7+ years of relevant data platform development experience building production-grade solutions.
  • Proficient with SQL and a high-order object-oriented language (e.g., Python).
  • Experience with designing and building distributed data architecture.
  • Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub and MLFlow.
  • Experience building and managing production-grade data platforms using distributed systems such as Kafka, Spark, Flink and/or others.
  • Familiarity with event data streaming at scale.
  • Proven track record learning new technologies and applying that learning quickly.
  • Experience building observability and monitoring into data products. 
  • Motivated to identify opportunities for automation to reduce manual toil.
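
On the event-streaming point above: a common pattern when processing telemetry at scale is aggregating events into fixed "tumbling" time windows. The sketch below shows only the windowing idea with the standard library; it is not a Kafka or Kinesis client, and the asset names are invented:

```python
# Tumbling-window aggregation: each event lands in exactly one fixed-size
# window based on its timestamp, and we count events per (window, asset).
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    # events: iterable of (epoch_seconds, asset_id) pairs.
    counts = defaultdict(int)
    for ts, asset_id in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        counts[(window_start, asset_id)] += 1
    return dict(counts)

events = [(0, "excavator-1"), (30, "excavator-1"), (61, "crane-7"), (75, "excavator-1")]
counts = tumbling_window_counts(events)
print(counts)
# {(0, 'excavator-1'): 2, (60, 'crane-7'): 1, (60, 'excavator-1'): 1}
```

Stream processors such as Spark Structured Streaming or Flink provide the same semantics over unbounded streams, adding watermarking for late-arriving events.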

EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

 

#LI-Remote

 

See more jobs at EquipmentShare

Apply for this job

+30d

HVAC Airflow Loop design release engineer

Segula Technologies, Mexico City, Mexico, Remote
Bachelor's degree, 5 years of experience, airflow, Design

Segula Technologies is hiring a Remote HVAC Airflow Loop design release engineer.

Job Description

Responsibilities

The HVAC Release Engineer is responsible for the development and validation of the automotive HVAC unit and airflow loop components. The candidate will ensure the HVAC unit and airflow loop components meet performance objectives and will coordinate development with other functional areas such as IP, hard trim, BIW, Aero/Thermal, Engine Cooling, PDO and suppliers. Duties will include writing source packages, specifications, requirements and change notices, and overall supplier management to meet program timing and deliverables. The candidate will maintain a component and sub-system DVP&R; support and keep up to date with component bench testing, simulation, drive-cell and static-cell test results; and perform data analysis and establish proper corrective actions as required. The candidate will need to lead a virtual team to design and perform CFD analysis of ducts that will meet airflow performance targets, and will also support the pilot build process at the assembly plant for development and installation of HVAC components.

Qualifications

Education

Bachelor of Science (Engineering) from an ABET-accredited university

Basic qualifications, experience, and knowledge

  • Bachelor's Degree in Mechanical or Electrical Engineering (or equivalent) from an ABET accredited university.
  • 1-5 years of experience in HVAC and duct design/development. Other related design experience (Engine cooling, Aero-Thermal) also a plus.
  • Strong PC + MS Office skills required. Excellent written, verbal and listening skills along with strong interpersonal skills.
  • Self-starter and motivated to produce results.
  • Experience with Design Failure Mode & Effects Analysis (DFMEA).
  • Experience with Design Verification Plan & Report (DVP&R).
  • Basic thermodynamics knowledge.

Preferred qualifications            

  • Experience with HVAC, PTCs, ducts and outlets is a plus.
  • Experience using CAD software (NX, Catia, etc.)
  • Experience with Design for Six Sigma (DFSS).
  • Automotive assembly plant experience.

Duration

  • As specified in contract

See more jobs at Segula Technologies

Apply for this job

+30d

HVAC Airflow Loop design release engineer

Segula Technologies, Mexico City, Mexico, Remote
Bachelor's degree, 5 years of experience, airflow, Design

Segula Technologies is hiring a Remote HVAC Airflow Loop design release engineer

Job Description

Responsibilities

The HVAC Release Engineer is responsible for the development and validation of the automotive HVAC unit and airflow loop components. The candidate will ensure the HVAC unit and airflow loop components meet performance objectives and will coordinate development with other functional areas such as IP, hard trim, BIW, Aero/Thermal, Engine Cooling, PDO and suppliers. Duties will include writing source packages, specifications, requirements and change notices, and overall supplier management to meet program timing and deliverables. The candidate will maintain a component and sub-system DVP&R; support and keep up to date with component bench testing, simulation, drive-cell and static-cell test results; and perform data analysis and establish proper corrective actions as required. The candidate will need to lead a virtual team to design and perform CFD analysis of ducts that will meet airflow performance targets, and will also support the pilot build process at the assembly plant for development and installation of HVAC components.

Qualifications

Education

Bachelor of Science (Engineering) from an ABET-accredited university

Basic qualifications, experience, and knowledge

  • Bachelor's Degree in Mechanical or Electrical Engineering (or equivalent) from an ABET accredited university.
  • 1-5 years of experience in HVAC and duct design/development. Other related design experience (Engine cooling, Aero-Thermal) also a plus.
  • Strong PC + MS Office skills required. Excellent written, verbal and listening skills along with strong interpersonal skills.
  • Self-starter and motivated to produce results.
  • Experience with Design Failure Mode & Effects Analysis (DFMEA).
  • Experience with Design Verification Plan & Report (DVP&R).
  • Basic thermodynamics knowledge.

Preferred qualifications            

  • Experience with HVAC, PTCs, ducts and outlets is a plus.
  • Experience using CAD software (NX, Catia, etc.)
  • Experience with Design for Six Sigma (DFSS).
  • Automotive assembly plant experience.

See more jobs at Segula Technologies

Apply for this job

+30d

Senior Analytics Engineer

Flywire, Spain, Remote
Bachelor degree, airflow, sql, git, docker, python

Flywire is hiring a Remote Senior Analytics Engineer

Job Description

The Opportunity:

We at Flywire are seeking a Senior Analytics Engineer to join our Analytics team and further scale our data-driven company.

As a Senior Analytics Engineer, you will bridge the gap between Data Engineering and Reporting to enable our end users to make the most of our data. You will handle data from all areas of the company and succeed through the following responsibilities:

  • Own, manage, and further our full dbt environment and related tooling
  • Manage efficient materializations of streaming data to balance efficiency and latency
  • Explore and implement additional tools to support the wider Analytics team
  • Productionalize complex logic and queries into “gold standard” datasets
  • Provide dbt support throughout the company
  • Optimize dbt models for efficiency, clarity, and scalability
  • Remove and reduce technical debt and automate manual operations
  • Treat data as a product and enforce testing and CI to catch errors early
  • Work with and support company-wide data science/AI initiatives
  • Become a subject matter expert to support all business segments
  • Work with engineering and product teams to understand and model our event data
  • Drive change by identifying areas for improvement while analyzing data

Qualifications

Here’s What We’re Looking For:

  • Bachelor's degree in Economics, Mathematics, or Computer Science
  • 5+ years of experience in an Analytics role
  • Strong proficiency in SQL
  • Strong proficiency with dbt
  • Experience with a cloud data warehouse (BigQuery, Snowflake, Redshift)
  • Experience with git and CI/CD deployments
  • Experience with Python and Docker
  • Experience with Elementary or Great Expectations is preferred
  • Ability to communicate and follow up with internal stakeholders in a timely manner and excellent attention to detail.
  • Experience with BI Tools (Looker or Preset) is a plus

Technologies We Use :

  • GCP/AWS
  • Fivetran/Apache Beam -> BigQuery -> dbt -> Looker/Preset
  • Airflow (GCP Composer)
  • Flink and Apache Beam

See more jobs at Flywire

Apply for this job

+30d

Senior Software Engineer

LivePerson, Toronto, Canada, Remote
Bachelor's degree, nosql, airflow, sql, Design, git, java, c++, MySQL, kubernetes, jenkins, python, backend

LivePerson is hiring a Remote Senior Software Engineer

 LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world’s leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences.  

At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about. 

 

Overview:

The successful candidate has an opportunity to join our AI & Automation team within a fast-paced and successful organization.

 

You will: 

  • Design and develop high-volume, low-latency applications for mission-critical systems and deliver high-availability and performance
  • Design REST-based backend services
  • Debug production issues and help maintain existing code
  • Develop technical specifications and documentation
  • Participate in on-call rotations
  • Work with Bots & Automation team in building next-generation bot runtime platform

 

 

You have:

  • Bachelor's degree in Computer Science or a related field
  • 7+ years of experience building successful production software systems
  • Solid understanding of Data Structures and Algorithm Design
  • Strong programming skills in Java with good knowledge of multi-threading.
  • Expert-level knowledge of Databases (SQL, NoSQL) like Cassandra, MySQL
  • Experience with Data Processing tools like Kafka, Airflow, Apache Spark, Hadoop
  • Experience building REST APIs & debugging distributed microservice-based applications
  • Experience with Git, Jenkins, and other Development tools
  • Experience integrating with third-party APIs
  • Experience in Kubernetes 
  • Experience with NodeJS & Python is a plus

 

 

Benefits: 

  • Health: medical, dental, vision and wellbeing.
  • Time away: 15 days PTO, public holidays, as well as 5 care days and 10 sick days.
  • Financial: ESPP, Basic life and AD&D insurance, long-term and short-term disability
  • Family: parental leave, maternity support, fertility services.
  • Development: Generous tuition reimbursement and access to internal professional development resources.
  • Additional: Health Service Navigator, Counseling Services & resources to help you and your family maintain overall good health and wellness
  • #LI-Remote

Why you’ll love working here: 

As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace. 

Belonging at LivePerson:

We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.

We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection.



 

Apply for this job

+30d

Staff Data Infrastructure Engineer

Webflow, U.S. Remote
DevOPS, S3, Webflow, remote-first, terraform, airflow, Design, c++, docker, AWS, backend

Webflow is hiring a Remote Staff Data Infrastructure Engineer

At Webflow, our mission is to bring development superpowers to everyone. Webflow is a Website Experience Platform (WXP) that empowers modern marketing teams to visually build, manage, and optimize stunning websites. With AI-driven personalization baked in, Webflow enables teams to significantly boost conversion rates, translating directly into measurable business growth. From independent designers and creative agencies to Fortune 500 companies, millions worldwide use Webflow to be more nimble, creative, and collaborative.

We’re looking for a Staff Data Infrastructure Engineer to join our Data Platform team. In this pivotal role, you'll lead efforts to build and manage robust, secure, and scalable infrastructure that powers our data operations. Your responsibilities will include provisioning, deploying, configuring, managing, and scaling key components such as Kafka, Spark, Airflow, and query engines like Athena. You'll work extensively with AWS, containerization technologies, and infrastructure as code tools to ensure our systems run smoothly and reliably. Your expertise will be crucial in integrating these components, providing a solid foundation for our data-driven products. Additionally, you'll have the opportunity to mentor junior engineers and drive best practices across the team. If you're passionate about leveraging cutting-edge technologies to make a real impact, we’d love to connect with you!

About the role:

  • Location: Remote-first (United States; BC & ON, Canada)
  • Full-time
  • Permanent
  • Exempt
  • The cash compensation for this role is tailored to align with the cost of labor in different geographic markets. We've structured the base pay ranges for this role into zones for our geographic markets, and the specific base pay within the range will be determined by the candidate’s geographic location, job-related experience, knowledge, qualifications, and skills.
    • United States  (all figures cited below in USD and pertain to workers in the United States)
      • Zone A: $187,000 - $263,500
      • Zone B: $175,000 - $247,000
      • Zone C: $164,000 - $231,500
    • Canada  (All figures cited below in CAD and pertain to workers in ON & BC, Canada)
      • CAD $212,000 - CAD $299,000
  • Please visit our Careers page for more information on which locations are included in each of our geographic pay zones. However, please confirm the zone for your specific location with your recruiter.
  • Reporting to the Senior Engineering Manager.

As a Staff Data Infrastructure Engineer, you'll

  • Oversee the provisioning and deployment of infrastructure using Pulumi, ensuring seamless deployment of Kafka, Spark, Airflow, Athena, and other critical systems on AWS.
  • Design and implement strategies for scaling Airflow, Kafka, and Spark clusters to accommodate increasing workloads and user demands.
  • Lead efforts in optimizing performance, capacity planning, ensuring fault tolerance, and implementing failure recovery strategies across all infrastructure components.
  • Configure and manage VPCs, load balancers, and VPC endpoints for secure communication between internal and external services.
  • Manage IAM roles, apply security patches, plan and execute version upgrades, and ensure compliance with regulations such as GDPR.
  • Architect and implement high-availability solutions across multiple zones and regions, including backups, multi-region replication, and disaster recovery plans.
  • Oversee S3 data lake management, including file size management, compaction, encryption, and compression to maximize storage efficiency.
  • Implement caching strategies, indexing, and query optimization to ensure efficient data retrieval and processing.
  • Implement monitoring and logging using tools like Datadog, CloudWatch and OpenSearch.
  • Lead efforts to develop services, tools and automation to simplify infrastructure complexity for other engineering teams, enabling them to focus on building great products.
  • Participate in all engineering activities including incident response, interviewing, designing and reviewing technical specifications, code review, and releasing new functionality.
  • Mentor, coach, and inspire a team of engineers of various levels.
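
The S3 data-lake work above (file size management and compaction) is easiest to picture with a toy sketch. The planner below is hypothetical, not Webflow's tooling: it greedily groups many small files into batches that stay at or under a target output size, the usual first step before a job rewrites each batch as one larger object.

```python
# Hypothetical compaction planner: pack small-file sizes (in bytes) into
# batches whose combined size stays at or under a target output size.
# A real compaction job (e.g. Spark rewriting data files) would then
# merge each batch into a single larger file.

def plan_compaction(file_sizes, target_bytes):
    """Greedily group file sizes into batches of up to ~target_bytes."""
    batches, current, current_size = [], [], 0
    # Largest-first packing keeps each batch close to the target.
    for size in sorted(file_sizes, reverse=True):
        if current and current_size + size > target_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(size)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Fewer, larger files mean fewer S3 GET requests and less per-file overhead in query engines like Athena, which is why compaction sits alongside encryption and compression in the list above.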

In addition to the responsibilities outlined above, at Webflow we will support you in identifying where your interests and development opportunities lie and we'll help you incorporate them into your role.

About you:

You'll thrive as a Staff Data Infrastructure Engineer if you have:

  • 8+ years of experience as a Data Infrastructure Engineer or related roles like Platform Engineer, SRE, DevOps or Backend Engineer.
  • Deep expertise in provisioning and managing data infrastructure components like Kafka, Spark, and Airflow.
  • Extensive experience with cloud services and environments (compute, storage, networking, identity management, infrastructure as code, etc.).
  • Strong experience with containerization technologies like Docker and Kubernetes.
  • Advanced knowledge of infrastructure as code tools like Terraform and Pulumi.
  • Strong understanding of networking concepts and configurations, including VPCs, load balancers, and endpoints.
  • Extensive experience with monitoring and logging tools.
  • Strong problem-solving skills and attention to detail.
  • Excellent leadership, communication, and mentoring skills.

You get extra points if you have:

  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer).
  • Proven track record of designing and implementing multi-zone and multi-region high availability and disaster recovery strategies.
  • Prior experience managing data lake storage using Apache Iceberg.
  • Knowledge of compliance standards (GDPR, CCPA) and security best practices.

Our Core Behaviors:

  • Obsess over customer experience. We deeply understand what we’re building and who we’re building for and serving. We define the leading edge of what’s possible in our industry and deliver the future for our customers
  • Move with heartfelt urgency. We have a healthy relationship with impatience, channeling it thoughtfully to show up better and faster for our customers and for each other. Time is the most limited thing we have, and we make the most of every moment
  • Say the hard thing with care. Our best work often comes from intelligent debate, critique, and even difficult conversations. We speak our minds and don’t sugarcoat things — and we do so with respect, maturity, and care
  • Make your mark. We seek out new and unique ways to create meaningful impact, and we champion the same from our colleagues. We work as a team to get the job done, and we go out of our way to celebrate and reward those going above and beyond for our customers and our teammates

Benefits & wellness

  • Equity ownership (RSUs) in a growing, privately-owned company.
  • 100% employer-paid healthcare, vision, and dental insurance coverage for employees and dependents (full-time employees working 30+ hours per week), as well as Health Savings Account/Health Reimbursement Account, dependent care Flexible Spending Account (US only), dependent on insurance plan selection where applicable in the respective country of employment; Employees may also have voluntary insurance options, such as life, disability, hospital protection, accident, and critical illness where applicable in the respective country of employment
  • 12 weeks of paid parental leave for both birthing and non-birthing caregivers, as well as an additional 6-8 weeks of pregnancy disability for birthing parents to be used before child bonding leave (where local requirements are more generous employees receive the greater benefit); Employees also have access to family planning care and reimbursement
  • Flexible PTO with a mandatory annual minimum of 10 days paid time off for all locations (where local requirements are more generous employees receive the greater benefit), and sabbatical program
  • Access to mental wellness and professional coaching, therapy, and Employee Assistance Program
  • Monthly stipends to support health and wellness, smart work, and professional growth
  • Professional career coaching, internal learning & development programs
  • 401k plan and pension schemes (in countries where statutorily required) financial wellness benefits, like CPA or financial advisor coverage
  • Discounted Pet Insurance offering (US only)
  • Commuter benefits for in-office employees

Temporary employees are not eligible for paid holiday time off, accrued paid time off, paid leaves of absence, or company-sponsored perks unless otherwise required by law.

Remote, together

At Webflow, equality is a core tenet of our culture. We are an Equal Opportunity (EEO)/Veterans/Disabled Employer and are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law. Pursuant to the San Francisco Fair Chance Ordinance, Webflow will consider for employment qualified applicants with arrest and conviction records.

Stay connected

Not ready to apply, but want to be part of the Webflow community? Consider following our story on our Webflow Blog, LinkedIn, X (Twitter), and/or Glassdoor

Please note:

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Upon interview scheduling, instructions for confidential accommodation requests will be provided.

To join Webflow, you'll need a valid right to work authorization depending on the country of employment.

If you are extended an offer, that offer may be contingent upon your successful completion of a background check, which will be conducted in accordance with applicable laws. We may obtain one or more background screening reports about you, solely for employment purposes.

For information about how Webflow processes your personal information, please review Webflow’s Applicant Privacy Notice.

 

See more jobs at Webflow

Apply for this job

+30d

Staff Software Engineer

InstacartCanada - Remote
scalaairflowpostgressqlDesignpython

Instacart is hiring a Remote Staff Software Engineer

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

 

Overview

About the Role 

The Catalog Content Management Engineering team is looking for a Staff Engineer to join our table, where your expertise will be crucial in leading engineering initiatives to improve the content and quality of our Catalog. You will oversee the key systems responsible for generating over 20 million unique products spanning more than 100,000 locations across North America; to keep the Catalog up to date, we update over 4,500,000,000 lines of data every day. Your work will directly impact our capacity to deliver exceptional service and maintain our platform's scalability and efficiency.

About the Team 

Instacart’s Catalog is the backbone of an incredibly complex four-sided marketplace: Customers, Retailers, Brands and Shoppers. The Catalog Content Management Engineering group consists of three teams responsible for understanding retailer intent, processing incoming data files, creating product identity, enriching incoming data, and detecting and fixing any and all issues with products and items on the Storefront. To this end, we not only build superb and robust catalog functions but also provide other teams with intuitive, easy-to-use tools to manage the catalog data.

About the Job 

As a Staff Software Engineer, you will:

  • Lead Initiatives: Spearhead projects across multiple departments to provide the best Catalog for our customers, retailers and shoppers.
  • System Design and Maintenance: Design, build, and maintain scalable and critical systems that support a high volume of transactions.
  • Project Leadership: Guide engineering teams through the execution of crucial projects while promoting and maintaining high-quality standards.
  • Technology Advocacy: Champion the integration of new technologies and methodologies to enhance system flexibility, resilience, and robustness.
  • Mentoring: Provide mentorship to engineers, setting exemplary standards and best practices.
  • Technical Vision: Develop and promote a concise technical strategy that addresses both immediate and long-term goals, aligning stakeholders with the vision.


About You

Minimum Qualifications

  • Bachelor's and/or Master's degree in Computer Science (CS) or a related field or equivalent practical experience.
  • 10+ years of experience in building and managing scalable platform solutions.
  • Proven track record in leading the design, implementation, and deployment of high-scale, cross-functional systems.
  • Excellent technical communication skills with a capacity to work effectively with engineering teams and cross-functional units.
  • In-depth knowledge of systems architecture and service-oriented solutions.
  • Demonstrated leadership in project management and engineer mentorship.


Preferred Qualifications

  • Proficiency in Ruby/Rails and/or Python
  • Advanced understanding of asynchronous processing and working with real-time systems
  • Strong knowledge of common data infra technologies (Python, Scala, Kafka, Flink, Airflow) in a production environment
  • Strong knowledge of common industry data stores or warehouses (Postgres, Snowflake)
  • Strong knowledge of SQL
  • Experience navigating and integrating various codebases and systems.

 

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For Canadian based candidates, the base pay ranges for a successful candidate are listed below.

CAN
$221,000 - $245,000 CAD

See more jobs at Instacart

Apply for this job

+30d

Senior Data Infrastructure Engineer

WebflowU.S. Remote
DevOPSS3Webflowremote-firstterraformairflowDesignc++dockerAWSbackend

Webflow is hiring a Remote Senior Data Infrastructure Engineer

At Webflow, our mission is to bring development superpowers to everyone. Webflow is the leading visual development platform for building powerful websites without writing code. By combining modern web development technologies into one platform, Webflow enables people to build websites visually, saving engineering time, while clean code seamlessly generates in the background. From independent designers and creative agencies to Fortune 500 companies, millions worldwide use Webflow to be more nimble, creative, and collaborative. It’s the web, made better.

We’re excited for a Senior Data Infrastructure Engineer to join our Data Platform team. In this role, you’ll play a key part in building robust, secure, and scalable infrastructure that powers our data operations. You will have the opportunity to optimize the performance of our data services and automate infrastructure management ensuring everything runs smoothly and reliably. Your expertise will be crucial in integrating and managing essential components like Kafka, Spark, and Airflow, providing a solid foundation for our data-driven products. If you are passionate about leveraging cutting-edge technologies to make a real impact, we’d love to connect with you!

About the role 

  • Location: Remote-first (United States; BC & ON, Canada)
  • Full-time
  • Permanent 
  • Exempt
  • The cash compensation for this role is tailored to align with the cost of labor in different geographic markets. We've structured the base pay ranges for this role into zones for our geographic markets, and the specific base pay within the range will be determined by the candidate’s geographic location, job-related experience, knowledge, qualifications, and skills.
    • United States  (all figures cited below in USD and pertain to workers in the United States)
      • Zone A: $158,000 - $218,000
      • Zone B: $149,000 - $205,000
      • Zone C: $139,000 - $192,000
    • Canada  (All figures cited below in CAD and pertain to workers in ON & BC, Canada)
      • CAD 180,000 - CAD 248,000
  • Please visit our Careers page for more information on which locations are included in each of our geographic pay zones. However, please confirm the zone for your specific location with your recruiter.
  • Reporting to the Senior Engineering Manager

As a Senior Data Infrastructure Engineer, you’ll …

  • Provision and deploy infrastructure using Pulumi for Kafka, Spark, Airflow, Athena, and other critical systems on AWS.
  • Manage and maintain clusters, ensuring optimal performance and reliability, including implementing auto-scaling and right-sizing instances.
  • Configure and manage VPCs, load balancers, and VPC endpoints for secure communication between internal and external services.
  • Manage IAM roles, apply security patches, plan and execute version upgrades, and ensure compliance with regulations such as GDPR.
  • Design and implement high-availability solutions across multiple zones and regions, including backups, multi-region replication, and disaster recovery plans.
  • Oversee S3 data lake management, including file size management, compaction, encryption, and compression to maximize storage efficiency.
  • Implement caching strategies, indexing, and query optimization to ensure efficient data retrieval and processing.
  • Spearhead initiatives for optimizing performance, capacity planning, ensuring fault tolerance, and implementing failure recovery across all infrastructure components.
  • Implement monitoring and logging using tools like Datadog, CloudWatch and OpenSearch.
  • Develop services, tools and automation to simplify infrastructure complexity for other engineering teams, enabling them to focus on building great products.
  • Participate in all engineering activities including incident response, interviewing, designing and reviewing technical specifications, code review, and releasing new functionality.
  • Mentor, coach, and inspire a team of engineers of various levels.
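
The failure-recovery work above usually starts from a small number of well-worn patterns; retry with exponential backoff is the most common. A minimal, hypothetical sketch (not tied to any Webflow system):

```python
import time

def retry_with_backoff(operation, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Call `operation`, retrying transient failures with growing delays."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # attempts exhausted: surface the failure
            sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
```

The injectable `sleep` parameter is a deliberate choice: it keeps the helper testable without real delays, the same property you want when exercising recovery paths in infrastructure code.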

In addition to the responsibilities outlined above, at Webflow we will support you in identifying where your interests and development opportunities lie and we'll help you incorporate them into your role.

About you 

You’ll thrive as a Senior Data Infrastructure Engineer if you have: 

  • 5+ years of experience as a Data Infrastructure Engineer or in related roles like Platform Engineer, SRE, DevOps or Backend Engineer.
  • Strong experience with provisioning and managing data infrastructure components like Kafka, Spark, and Airflow.
  • Proficiency with cloud services and environments (compute, storage, networking, identity management, infrastructure as code, etc.).
  • Experience with containerization technologies like Docker and Kubernetes.
  • Expertise in infrastructure as code tools like Terraform and Pulumi.
  • Solid understanding of networking concepts and configurations, including VPCs, load balancers, and endpoints.
  • Experience with monitoring and logging tools.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.

Bonus points if you have:

  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer).
  • Familiarity with multi-zone and multi-region high availability and disaster recovery strategies.
  • Knowledge of compliance standards (GDPR, CCPA) and security best practices.

Our Core Behaviors:

  • Obsess over customer experience. We deeply understand what we’re building and who we’re building for and serving. We define the leading edge of what’s possible in our industry and deliver the future for our customers.
  • Move with heartfelt urgency. We have a healthy relationship with impatience, channeling it thoughtfully to show up better and faster for our customers and for each other. Time is the most limited thing we have, and we make the most of every moment.
  • Say the hard thing with care. Our best work often comes from intelligent debate, critique, and even difficult conversations. We speak our minds and don’t sugarcoat things — and we do so with respect, maturity, and care.
  • Make your mark. We seek out new and unique ways to create meaningful impact, and we champion the same from our colleagues. We work as a team to get the job done, and we go out of our way to celebrate and reward those going above and beyond for our customers and our teammates.

Benefits & wellness

  • Equity ownership (RSUs) in a growing, privately-owned company
  • 100% employer-paid healthcare, vision, and dental insurance coverage for employees and dependents (US; full-time Canadian workers working 30+ hours per week), as well as Health Savings Account/Health Reimbursement Account, dependent on insurance plan selection. Employees also have voluntary insurance options, such as life, disability, hospital protection, accident, and critical illness
  • 12 weeks of paid parental leave for both birthing and non-birthing caregivers, as well as an additional 6-8 weeks of pregnancy disability for birthing parents to be used before child bonding leave. Employees also have access to family planning care and reimbursement.
  • Flexible PTO with a mandatory annual minimum of 10 days paid time off, and sabbatical program
  • Access to mental wellness coaching, therapy, and Employee Assistance Program
  • Monthly stipends to support health and wellness, as well as smart work, and annual stipends to support professional growth
  • Professional career coaching, internal learning & development programs
  • 401k plan and financial wellness benefits, like CPA or financial advisor coverage
  • Commuter benefits for in-office workers

Temporary employees are not eligible for paid holiday time off, accrued paid time off, paid leaves of absence, or company-sponsored perks.

Be you, with us

At Webflow, equality is a core tenet of our culture. We are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law.

Stay connected

Not ready to apply, but want to be part of the Webflow community? Consider following our story on our Webflow Blog, LinkedIn, Twitter, and/or Glassdoor. 

Please note:

To join Webflow, you'll need valid U.S. or Canadian work authorization depending on the country of employment.

If you are extended an offer, that offer may be contingent upon your successful completion of a background check, which will be conducted in accordance with applicable laws. We may obtain one or more background screening reports about you, solely for employment purposes.

Webflow Applicant Privacy Notice

See more jobs at Webflow

Apply for this job

+30d

Senior Analytics Engineer

agileBachelor's degreeremote-firsttableauairflowsqlDesignpython

Parsley Health is hiring a Remote Senior Analytics Engineer

About us:

Parsley Health is a digital health company with a mission to transform the health of everyone, everywhere with the world's best possible medicine. Today, Parsley Health is the nation's largest health care company helping people suffering from chronic conditions find relief with root cause resolution medicine. Our work is inspired by our members’ journeys and our actions are focused on impact and results.

The opportunity:

You will be joining a remote team of passionate engineers reporting into our Data Manager. In this role, you will work closely with Engineering, Product, Design and Customer Reliability teams. Parsley Health is an outcomes-driven organization and your work will directly contribute to the company objectives, including expanding the business nationally; improving activation, conversion and retention; and expansion of our healthcare products.

We work in a blameless environment and we take ownership and pride in our efforts. We like to work in small cross functional product pods where each pod owns the development lifecycle of their products. We follow agile development practices and encourage each pod to tailor the processes to their needs. Our teams are built on pillars of trust, humility and continuous improvement.

About you:

You appreciate the challenge of building reliable and timely business intelligence systems to promote actionable insights. You know that good ETL, preparation, and visualization are key to getting the right answer - and you also know that understanding your stakeholder’s problems is the key to getting the right question. 

You know when to develop an MVP based on a few requirements gathered in a conversation, and when to suggest a dedicated meeting to sort out the 'what-why-how.'

You have a healthy appreciation for the many ways in which distributed systems may fail. You're always thinking beyond the scope of the current project, and about the larger product vision. You are thrilled to deliver the right information, at the right time, in support of our members’ health and our clinicians’ decision process.

What you’ll do:

  • Manage and architect our ETL, warehouse, and data delivery/visibility.
  • Craft critical retrospective reports for use by our Ops, Clinical, Product, Finance, and MX teams.
  • Provide actionable, prospective insights to support business and clinical decisions.
  • Engage functional peers on core business strategies and how data products support those efforts.
  • Work closely with stakeholder groups to define requirements, design appropriate BI solutions, and implement applications against development standards and best practices.
  • Design, develop, and maintain scalable data pipelines that extract, transform, and load data from various sources into our data warehouse (BigQuery).
  • Ensure data accuracy, consistency, and availability for business intelligence and analytics purposes.
  • Optimize and enhance data processing workflows for performance and efficiency.
  • Implement and maintain data quality monitoring and alerting systems.
  • Work closely with cross-functional teams - including Product and Clinical Operations - to understand data needs and provide actionable insights.
  • Stay up-to-date with emerging trends and technologies in data engineering and analytics.
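
The data-quality monitoring bullet above can be made concrete with a small, generic sketch (a hypothetical helper, not Parsley Health's actual tooling) of the kind of check typically run after each warehouse load:

```python
# Hypothetical post-load quality check: flag empty required fields and
# duplicate primary keys so an alert can fire before dashboards break.

def check_quality(rows, required_fields, key_field):
    issues, seen_keys = [], set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
        key = row.get(key_field)
        if key in seen_keys:
            issues.append(f"row {i}: duplicate {key_field} {key!r}")
        seen_keys.add(key)
    return issues
```

In practice, checks like these are often expressed as dbt tests or scheduled Airflow tasks against BigQuery, with any issues routed to an alerting channel.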

What you’ll need:

  • Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field (Master's degree preferred).
  • 3+ years of experience in data and analytics.
  • Strong proficiency in SQL and Python
  • Experience with data modeling techniques.
  • Hands-on experience with data warehousing technologies (e.g., Redshift, Snowflake, BigQuery) and ETL tools (e.g., Airflow, DBT).
  • Experience with data visualization tools (e.g., Tableau, Looker) is a plus.
  • Experience with Google Cloud Platform
  • Containerization experience, knowledge of CI tooling, testing frameworks and other code quality tools
  • Familiarity with healthcare data standards and regulations (e.g., HIPAA) is desirable.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills, with the ability to work effectively in a team-oriented environment.
  • A lean towards self-directed learning in data tooling and solutions.

Benefits and Compensation:

  • Equity Stake
  • 401(k) + Employer Matching program
  • Remote-first with the option to work from one of our centers in NYC or LA
  • Complimentary Parsley Health Complete Care membership
  • Subsidized Medical, Dental, and Vision insurance plan options
  • Generous 4+ weeks of paid time off
  • Annual professional development stipend
  • Annual wellness stipend

Parsley Health is committed to providing an equitable, fair and transparent compensation program for all employees.

The starting salary for this role is between $115,000 - $130,000, depending on skills and experience. We take a geo-neutral approach to compensation within the US, meaning that we pay based on job function and level, not location.

Individual compensation decisions are based on a number of factors, including experience level, skillset, and balancing internal equity relative to peers at the company. We expect the majority of the candidates who are offered roles at our company to fall healthily throughout the range based on these factors. We recognize that the person we hire may be less experienced (or more senior) than this job description as posted. If that ends up being the case, the updated salary range will be communicated with candidates during the process.


At Parsley Health we believe in celebrating everything that makes us human and are proud to be an equal opportunity workplace. We embrace diversity and are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We believe that the more inclusive we are, the better we can serve our members. 


Important note:

In light of the recent increase in hiring scams, if you're selected to move on to the next phase of our hiring process, a member of our Talent Acquisition team will reach out to you directly from an @parsleyhealth.com email address to guide you through our interview process.

Please note:

  • We will never communicate with you via Microsoft Teams
  • We will never ask for your bank account information at any point during the recruitment process, nor will we send you a check (electronic or physical) to purchase home office equipment

We look forward to connecting!

#LI-Remote

See more jobs at Parsley Health

Apply for this job

Instacart is hiring a Remote Senior Product Manager, Ads Data

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

 

Overview

About the Role

We are seeking a senior product manager to build and extend the data platform we leverage to power our ads and insights businesses that service over 5,000 CPG brands a day. 

You will work in collaboration with R&D, data science, data engineering, and commercial leadership to build the foundation that powers how we surface data to internal and external customers.

This is a unique opportunity to have end to end ownership of a technical platform for the company that is core to our growth. Your portfolio will directly impact how some of the world’s largest CPG brands make investment and strategy decisions.

 

About The Team

The Advertiser Experience team owns the set of systems that power our advertiser-facing offerings (inclusive of Instacart Ads Manager, Instacart Ads API, Ads Measurement, and Data Pipelines). Our teams own the E2E systems, from our back-end platforms to the data that powers a complex ecosystem of CPGs, retailers, customers, and operators.

We are a passionate team of 100+ engineers, data scientists, designers, marketers and product managers focused on driving growth for CPGs. We work hard to make sure everyone is brought along for the journey as we ship award-winning products and services to the industry.

 

About The Job

  • Manage the roadmap and execution for our ads data platform and data pipelines for a diverse set of use cases.
  • Drive forward strategy on all aspects of the data platform especially our data sharing practices and ability to derive signal from noise.
  • Lead product planning, product & customer discovery, the product development process, effort estimation, and collaboration with teams across the organization (e.g. Data Engineering and Commercial teams).
  • Build and maintain a variety of integrations with a complex ecosystem of 3rd party partners like identity graphs, verification providers, media partners, and clean room providers.
  • Ensure our ads data platform meets the highest standards for privacy, data protection, and regulatory compliance.
  • Intake & validate new ideas through a set of frameworks and drive them into implementable projects.
  • Advocate for data quality throughout the entire Ads and Eversight R&D organization, and build tools that allow internal customers to be evangelists themselves.
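The privacy and data-sharing responsibilities above often come down to one common pattern: pseudonymizing user identifiers with a keyed hash before any data is exported to a clean room or identity partner. The sketch below is a generic illustration only, not Instacart's actual method; the key handling, field names, and `prepare_for_clean_room` helper are all hypothetical.

```python
import hashlib
import hmac

SHARING_KEY = b"rotate-me-per-partner"  # hypothetical per-partner secret

def pseudonymize(user_id: str, key: bytes = SHARING_KEY) -> str:
    """Replace a raw user id with a keyed hash so partners can join on a
    stable token without ever seeing the underlying identifier."""
    return hmac.new(key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def prepare_for_clean_room(rows):
    """Strip direct identifiers from ad-exposure rows before export,
    keeping only the joinable token and the campaign dimension."""
    return [
        {"user_token": pseudonymize(r["user_id"]), "campaign": r["campaign"]}
        for r in rows
    ]

exported = prepare_for_clean_room(
    [{"user_id": "u-123", "campaign": "spring-cpg", "email": "x@example.com"}]
)
print(exported)
```

Because the hash is keyed and deterministic, two datasets prepared with the same per-partner key can still be joined on `user_token` inside the clean room, while the raw identifier never leaves the platform.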

About You

Minimum Qualifications

  • 5+ years of Product Management experience
  • Experience managing data products and data platforms
  • Experience working in deeply technical domains with the ability to quickly ramp up when onboarding into new areas
  • Experience partnering with technical audiences and “translating” to senior audiences across functions
  • Direct experience partnering with Data Engineering and Product teams to identify new roadmap opportunities and improvements
  • Ability to manage and align multiple stakeholders

Preferred Qualifications

  • Fluent in core data processing technologies like Airflow, dbt, and cloud data warehouses
  • Experience influencing and building out multi-year strategy

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$187,000 – $208,000 USD
WA
$180,000 – $200,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$172,000 – $191,000 USD
All other states
$156,000 – $173,000 USD

See more jobs at Instacart

Apply for this job

+30d

Junior Solutions Engineer

SingleStoreRemote, United States
SalesscalaairflowsqlDesignazurejavapythonAWS

SingleStore is hiring a Remote Junior Solutions Engineer

Junior SE Position Overview

We are looking for a SingleStore Solutions Engineer who is passionate about removing data bottlenecks for their customers and enabling real-time data capabilities for some of the most difficult data challenges in the industry. In this role you will work directly with our sales teams and channel partners to identify prospective and current customer pain points where SingleStore can remove those bottlenecks and deliver real-time capabilities. You will provide value-based demonstrations and presentations, and support proofs of concept to validate proposed solutions.

As a SingleStore solutions engineer, you must share our passion for real-time data, fast analytics, and simplified data architecture. You must be comfortable in both high executive conversations as well as being able to deeply understand the technology and its value-proposition.

About our Team

At SingleStore, the Solutions Engineer team epitomizes a dynamic blend of innovation, expertise, and a fervent commitment to meeting complex data challenges head-on. This team is composed of highly skilled individuals who are not just adept at working with the latest technologies but are also instrumental in ensuring that SingleStore is the perfect fit for our customers.

Our team thrives on collaboration and determination, building some of the most cutting-edge deployments of SingleStore data architectures for our most strategic customers. This involves working directly with product management to ensure that our product is not only addressing current data challenges but is also geared up for future advancements.

Beyond the technical prowess, our team culture is rooted in a shared passion for transforming how businesses leverage data. We are a community of forward-thinkers, where each member's contribution is valued in our collective pursuit of excellence. Our approach combines industry-leading engineering, visionary design, and a dedicated customer success ethos to shape the future of database technology. In our team, every challenge is an opportunity for growth, and we support each other in our continuous learning journey. At SingleStore, we're more than a team; we're innovators shaping the real-time data solutions of tomorrow.

Responsibilities

  • Engage with both current and prospective clients to understand their technical and business challenges
  • Present and demonstrate the SingleStore product offering to Fortune 500 companies.
  • Maintain enthusiasm for the data analytics and data engineering landscape
  • Provide valuable feedback to product teams based on client interactions
  • Stay up to date with database technologies and the SingleStore product offerings

 

Qualifications

  • Excellent presentation and communication skills, with experience presenting to large corporate organizations
  • Ability to communicate complex technical concepts to non-technical audiences.
  • Strong team player with interpersonal skills
  • Broad range of experience with large-scale database and/or data warehousing technologies
  • Experience with data engineering tools such as Apache Spark, Apache Flink, and Apache Airflow
  • Demonstrated proficiency in ANSI SQL query languages
  • Demonstrated proficiency in Python, Scala or Java
  • Understanding of private and public cloud platforms such as AWS, Azure, GCP, VMware
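The ANSI SQL proficiency called for above is exactly what a value-based demo exercises: a transactional write followed by an analytical query over the same table, the unified workload SingleStore targets. The toy sketch below is a generic illustration only; sqlite3 stands in for a real SingleStore connection, and the `orders` schema is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a SingleStore connection
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Transactional side: ingest new orders as they arrive.
with conn:
    conn.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("EMEA", 250.0), ("EMEA", 100.0), ("AMER", 400.0)],
    )

# Analytical side: an ANSI SQL aggregation over the freshly written rows.
rows = conn.execute(
    "SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('AMER', 1, 400.0), ('EMEA', 2, 350.0)]
```

In a real engagement the point of the demo is that both statements hit the same live table with no ETL hop in between, which is what "unifying transactions and analytics" means in practice.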

SingleStore delivers the cloud-native database with the speed and scale to power the world’s data-intensive applications. With a distributed SQL database that introduces simplicity to your data architecture by unifying transactions and analytics, SingleStore empowers digital leaders to deliver exceptional, real-time data experiences to their customers. SingleStore is venture-backed and headquartered in San Francisco with offices in Sunnyvale, Raleigh, Seattle, Boston, London, Lisbon, Bangalore, Dublin and Kyiv. 

Consistent with our commitment to diversity & inclusion, we value individuals with the ability to work on diverse teams and with a diverse range of people.

Please note that SingleStore's COVID-19 vaccination policy requires that team members in the United States be up to date with the current CDC guidelines for their vaccinations with one of the United States FDA-approved vaccine options to meet in person for SingleStore business or to work from one of our U.S. office locations. [It is expected that this will be a requirement for this role]. If an exemption and/or accommodation to our vaccination policy is requested, a member of the Human Resources department will be available to begin the interactive accommodation process.

To all recruitment agencies: SingleStore does not accept agency resumes. Please do not forward resumes to SingleStore employees. SingleStore is not responsible for any fees related to unsolicited resumes and will not pay fees to any third-party agency or company that does not have a signed agreement with the Company.


SingleStore values individuals for their unique skills and experiences, and we’re proud to offer roles in a variety of locations across the United States. Salary is based on permissible, non-discriminatory factors such as skills, experience, and geographic location, and is just one part of our total compensation and benefits package. Certain roles are also eligible for additional rewards, including merit increases and annual bonuses. 

Our benefits package for this role includes: stock options, flexible paid time off, monthly three-day weekends, 14 weeks of fully-paid gender-neutral parental leave, fertility and adoption assistance, mental health counseling, 401(k) retirement plan, and rich health insurance offerings—including medical, dental, vision and life and disability insurance. 

SingleStore’s base salary range for this role, if based in California, Colorado, Washington, or New York City is: $X - $X USD per year

For candidates residing in California, please see our California Recruitment Privacy Notice. For candidates residing in the EEA, UK, and Switzerland, please see our EEA, UK, and Swiss Recruitment Privacy Notice.

 

Apply for this job

+30d

Data Engineer Cloud GCP

DevoteamTunis, Tunisia, Remote
airflowsqlscrum

Devoteam is hiring a Remote Data Engineer Cloud GCP

Job Description

Within the "Data Platform" division, the consultant will join a SCRUM team and focus on a specific functional scope.

Your role will be to contribute to data projects by bringing your expertise to the following tasks:

  • Design, build, and maintain robust, scalable data pipelines on Google Cloud Platform (GCP), using tools such as BigQuery, Airflow, Looker, and DBT.
  • Collaborate with business teams to understand their data requirements and design appropriate solutions.
  • Optimize data-processing and ELT performance using Airflow, DBT, and BigQuery.
  • Implement data-quality processes to guarantee data integrity and consistency.
  • Work closely with engineering teams to integrate the data pipelines into existing applications and services.
  • Stay up to date with new technologies and best practices in data processing and analytics.
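A minimal sketch of the data-quality checks described in the responsibilities above, assuming a hypothetical `bookings` table: sqlite3 stands in for BigQuery here, and in practice each check would run as a task in the Airflow/DBT pipeline and fail the run when a count is non-zero.

```python
import sqlite3

def run_quality_checks(conn):
    """Run basic integrity checks on a (hypothetical) bookings table.

    Returns a dict mapping check name -> number of offending rows;
    all-zero values mean the load passed.
    """
    checks = {
        # no booking should be missing its partner identifier
        "null_partner_id": "SELECT COUNT(*) FROM bookings WHERE partner_id IS NULL",
        # amounts must be non-negative
        "negative_amount": "SELECT COUNT(*) FROM bookings WHERE amount < 0",
        # booking ids must be unique (counts the surplus rows per id)
        "duplicate_id": """
            SELECT COALESCE(SUM(cnt - 1), 0)
            FROM (SELECT COUNT(*) AS cnt FROM bookings GROUP BY id)
        """,
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for the warehouse
    conn.execute("CREATE TABLE bookings (id INTEGER, partner_id TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO bookings VALUES (?, ?, ?)",
        [(1, "p1", 120.0), (2, None, 80.0), (3, "p2", -5.0)],
    )
    print(run_quality_checks(conn))
    # {'null_partner_id': 1, 'negative_amount': 1, 'duplicate_id': 0}
```

The table, column, and check names are illustrative only; the point is the shape of the guardrail, not a specific schema.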

 

Qualifications

Skills

What will help you join the team?

A Bac+5 engineering degree or equivalent university qualification with a specialization in computer science.

  • At least 4 years of experience in data engineering, with significant experience in a GCP cloud environment.
  • Advanced SQL proficiency for data optimization and processing.
  • Google Professional Data Engineer certification is a plus.
  • Excellent written and verbal communication (high-quality deliverables and reporting).

So, if you want to grow, learn, and share, join us!

See more jobs at Devoteam

Apply for this job