airflow Remote Jobs

142 Results

+30d

Data Engineer

AmpleInsightInc - Toronto, Canada, Remote
devops, airflow, sql, python

AmpleInsightInc is hiring a Remote Data Engineer

Job Description

We are looking for a data engineer who is passionate about analytics and helping companies build and scale data. You enjoy working with data and are motivated to produce high quality data tools and pipelines that help empower other data scientists. You are experienced in architecting data ETL workflows and schemas. Critical thinking and problem-solving skills are essential for this role.

Qualifications

  • BS (or higher, e.g., MS or PhD) in Computer Science, Engineering, Math, or Statistics
  • Hands-on experience working with user engagement, social, marketing, and/or finance data
  • Proficient in Python (e.g., pandas, NumPy, scikit-learn), R, and TensorFlow, among other data science tools and libraries
  • Extensive experience working with relational databases, designing complex data schemas, and writing SQL queries
  • Deep knowledge of performance tuning for ETL jobs, SQL, and databases
  • Working knowledge of Snowflake
  • Experience working with Airflow is a strong plus
  • DevOps experience is a plus
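
The posting above asks for experience architecting ETL workflows and schemas. As a minimal, purely illustrative sketch (not anything from this listing), here is a toy extract-transform-load pass using Python's built-in sqlite3; every table and column name is invented:

```python
import sqlite3

# Toy ETL sketch: extract raw events, transform them, load into a reporting
# table. All table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount_cents INTEGER)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 1250), (1, 300), (2, 999), (2, None)],  # includes a NULL to clean out
)

# Extract + transform: drop bad rows, convert cents to dollars.
rows = conn.execute(
    "SELECT user_id, amount_cents FROM raw_events WHERE amount_cents IS NOT NULL"
).fetchall()
clean = [(uid, cents / 100.0) for uid, cents in rows]

# Load into the target schema and aggregate per user.
conn.execute("CREATE TABLE user_spend (user_id INTEGER, amount_usd REAL)")
conn.executemany("INSERT INTO user_spend VALUES (?, ?)", clean)
totals = dict(
    conn.execute(
        "SELECT user_id, SUM(amount_usd) FROM user_spend GROUP BY user_id"
    ).fetchall()
)
print(totals)  # {1: 15.5, 2: 9.99}
```

In a production pipeline the same extract/transform/load stages would typically be separate Airflow tasks wired into a DAG, rather than one script.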

See more jobs at AmpleInsightInc

Apply for this job

+30d

Senior Product Analyst

StyleSeat - 100% Remote (U.S. Based Only - Select States)
tableau, airflow, sql, B2C, design, qa, c++, python

StyleSeat is hiring a Remote Senior Product Analyst

Senior Product Analyst

100% Remote (U.S. Based Only, Select States - See Below)

About the role

As a Senior Product Analyst, you will use proven, hands-on data analysis skills to contribute to StyleSeat’s Analytics function. You'll be deeply involved in driving decision-making across all aspects of the business, from exploratory analysis to better understand our customers, to building data pipelines that democratize a standard level of data across the company.

You will work closely with Product and Engineering teams to define and answer key questions, enable stakeholders, and support a data-driven culture. This role is fully embedded within a product squad (PM, Designer, QA, Engineers, and you, the Analyst), but also offers opportunities for personal growth by supporting other areas of the business as well: Customer Experience, Finance, Product Marketing, etc.

What you’ll do

  • Own and optimize key problem areas aligned with squad goals, currently with a focus on enhancing Pro Tools and Payments usage to drive user engagement and satisfaction
  • Lead the ideation and execution of product changes that drive growth, by partnering with Product, Engineering, Design, and Marketing
  • Design A/B tests and analyze results to inform strategic decision-making & next steps
  • Follow your own data-driven curiosity, going off the beaten path to identify areas for improvement and growth
  • Translate analytical insights into actionable recommendations for business and process improvements, presenting all the way up to senior leadership
  • Design and assist in building analytical infrastructure (Reporting, Dashboards, Pipelines, and Analyses)
  • Work with business stakeholders to recommend data standards and best practices to align the way we measure, think, and talk about our Product + Business
  • Routinely communicate metrics, trends and other key indicators to Leadership
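
One bullet above mentions designing A/B tests and analyzing the results. As a hedged sketch of what such an analysis often looks like (the conversion counts below are entirely made up), here is a two-proportion z-test in plain Python:

```python
import math

# Toy A/B test analysis via a two-proportion z-test.
# conv_* = conversions, n_* = users exposed; all numbers are invented.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(round(z, 2))  # positive z: variant B converts better than A
```

A z statistic above roughly 1.96 corresponds to significance at the 5% level for a two-sided test; in practice one would also plan sample size up front and watch for peeking.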

Who you are 

Successful candidates can come from a variety of backgrounds, yet here are some of the critical experiences we’re looking for:

Must haves:

  • 3+ years of relevant experience in product analytics / data science, or other quantitative disciplines
  • Experience working with large datasets and an ability to write complex SQL queries
  • Experience translating business objectives into actionable analyses, and explaining technical concepts and implications to a broad, non-technical audience
  • Experience with data visualization tools / techniques (Tableau preferred, Looker, Quicksight, Amplitude, etc)
  • Experience working directly embedded within product squads, going deep into the user problems or pain points and solving them with data

Nice to haves:

  • Experience using tools such as R, Python, or similar for causal inference or similar statistical techniques
  • Proficiency in designing/building data pipelines or using ETL tools
  • Knowledgeable in one or more advanced data pipeline tools: Airflow, DBT, Hevo
  • Experience in B2B2C marketplace, eCommerce, or B2C organization
  • Experience at a startup or late-stage growth company
  • Familiarity with statistical modeling and predictive analytics to inform product decisions

Some year 1 deliverables:

  • Develop framework (reporting, metrics, segmentations) for understanding a central & rapidly growing area of our product – the Client experience
  • Utilize said framework to generate insights and create actionable recommendations to inform our product roadmap, ranging from impacts of current sprint to quarters away.
  • Actively present findings & recommendations to not only product stakeholders, but senior stakeholders across the organization, following up and ensuring they are actionable

Desired traits 

  • Strong product mindset and knowledge, with the ability to inform roadmaps and partner with PMs using data
  • Curiosity - a natural drive to find out and explain why things happen.
  • The ability to tell a story with data, and the ability to explain technical matters to less technical co-workers
  • A strong and adaptable communicator who can ably interact with executives
  • Ability to manage projects simultaneously while understanding which to prioritize alongside their stakeholder partners

Salary Range

Our job titles may span more than one career level. The career level we are targeting for this role has a base pay between $135,000 and $160,000. The actual base pay is dependent upon many factors, such as training, transferable skills, work experience, business needs, and market demands. Base pay ranges are subject to change and may be modified in the future.

Who we are 

StyleSeat is the premier business platform for SMBs in the beauty and wellness industry to run and grow their business, and the destination for consumers to discover, book, and pay. To date, StyleSeat has powered more than 200 million appointments totaling over $12 billion in revenue for small businesses. StyleSeat is a platform and marketplace designed to support and promote the beauty and personal care community.

Today, StyleSeat connects consumers with top-rated beauty professionals in their area for a variety of services, including hair styling, barbering, massage, waxing, and nail care, among others. Our platform ensures that Pros maximize their schedules and earnings by minimizing gaps and cancellations, effectively attracting and retaining clientele.

StyleSeat Culture & Values 

At StyleSeat, our team is committed to fostering a positive and inclusive work environment. We respect and value the unique perspectives, experiences, and skills of our team members and work to create opportunities for all to grow and succeed. 

  • Diversity - We celebrate and welcome diversity in backgrounds, experiences, and perspectives. We believe in the importance of creating an inclusive work environment where everyone can thrive. 
  • Curiosity - We are committed to fostering a culture of learning and growth. We ask questions, challenge assumptions, and explore new ideas. 
  • Community - We are committed to making a positive impact on each other, even when win-win-win scenarios are not always clear or possible in every decision. We strive to find solutions that benefit the community as a whole and drive our shared success.
  • Transparency - We are committed to open, honest, and clear communication. We hold ourselves accountable for maintaining the trust of our customers and team.
  • Entrepreneurship - We are self-driven big-picture thinkers - we move fast and pivot when necessary to achieve our goals. 

Applicant Note: 

StyleSeat is a fully remote, distributed workforce; however, we only have business entities established in the states listed below, and thus are unable to consider candidates who live in other states at this time. Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

  • Arizona
  • Alabama
  • California
  • Colorado
  • Florida
  • Georgia
  • Illinois
  • Indiana
  • Massachusetts
  • Maryland
  • Michigan
  • Nebraska
  • New York
  • New Jersey
  • Ohio
  • Oregon
  • Pennsylvania
  • Virginia
  • Washington

See more jobs at StyleSeat

Apply for this job

+30d

Sr. Data Engineer - Data Analytics

R.S.Consultants - Pune, India, Remote
SQS, Lambda, Bachelor's degree, scala, airflow, sql, design, typescript, python, AWS, Node.js

R.S.Consultants is hiring a Remote Sr. Data Engineer - Data Analytics

Job Description

We are looking for a Sr. Data Engineer for an international client. This is a 100% remote job. The person will work from India and collaborate with a global team.

Total Experience: 7+ Years

Your role

  • Have key responsibilities within the requirements analysis, scalable & low latency streaming platform solution design, architecture, and end-to-end delivery of key modules in order to provide real-time data solutions for our product
  • Write clean scalable code using Go, Typescript / Node.js / Scala / python / SQL and test and deploy applications and systems
  • Solve our most challenging data problems, in real-time, utilizing optimal data architectures, frameworks, query techniques, sourcing from structured and unstructured data sources.
  • Be part of an engineering organization delivering high quality, secure, and scalable solutions to clients
  • Involvement in product and platform performance optimization and live site monitoring
  • Mentor team members through giving and receiving actionable feedback.

Our tech stack:

  • AWS (Lambda, SQS, Kinesis, KDA, Redshift, Athena, DMS, Glue, DynamoDB), Go/Typescript, Airflow, Flink, Spark, Looker, EMR
  • A continuous deployment process based on GitLab

A little more about you:

  • A Bachelor's degree in a technical field (e.g., computer science or mathematics)
  • 3+ years of experience with real-time, event-driven architecture
  • 3+ years of experience with a modern programming language such as Scala, Python, Go, or Typescript
  • Experience designing complex data processing pipelines
  • Experience with data modeling (star schema, dimensional modeling, etc.)
  • Experience with query optimisation
  • Experience with Kafka is a plus
  • Shipping and maintaining code in production
  • You like sharing your ideas, and you're open-minded
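
The qualifications above mention query optimisation. A minimal, illustrative way to see an optimiser at work is to compare query plans before and after adding an index, shown here with Python's stdlib sqlite3 (the events table is hypothetical, and the exact plan wording varies by SQLite version):

```python
import sqlite3

# Toy query-optimisation sketch: inspect the plan for a filtered query
# before and after adding an index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts INTEGER, kind TEXT)")

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index, the plan is a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan_before)  # e.g. "SCAN events"

# With an index on the filter column, SQLite can seek instead of scanning.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan_after)  # e.g. "SEARCH events USING COVERING INDEX idx_events_user ..."
```

The same scan-versus-seek reasoning carries over to warehouse engines, where the equivalent levers are partitioning, clustering keys, and statistics rather than B-tree indexes.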

Why join us?

Key moment to join in terms of growth and opportunities

Our people matter; work-life balance is important

Fast-learning environment, entrepreneurial and strong team spirit

45+ nationalities: cosmopolitan & multicultural mindset

Competitive salary package & benefits (health coverage, lunch, commute, sport)

DE&I Statement: 

We believe diversity, equity and inclusion, irrespective of origins, identity, background and orientations, are core to our journey. 

Qualifications

Hands-on experience in Scala / Python with data modeling and real-time / streaming data. Experience with complex data processing pipelines and data modeling.

BE/ BTech in Computer Science

See more jobs at R.S.Consultants

Apply for this job

+30d

Databricks Data Engineer - Data & Analytics team (remote / Costa Rica- or LATAM-based)

Hitachi - San Jose, Costa Rica, Remote
scala, airflow, sql, design, azure, git, python, AWS

Hitachi is hiring a Remote Databricks Data Engineer - Data & Analytics team (remote / Costa Rica- or LATAM-based)

Job Description

 

Please note: Although our position is primarily remote / virtual (could be some occasional onsite in downtown San Jose, should you live close enough) you MUST live, and be authorized to work, in Costa Rica without sponsorship. Candidates in other Latin America (LATAM) countries can be considered as an employee if willing to relocate to Costa Rica or can work via our 3rd party payroll company.

 

DATA ENGINEER (DATABRICKS, PYTHON, SPARK) 

This is a full-time, well-benefited career opportunity in our Data & Analytics organization (Azure Data Warehouse / Data Lakehouse and Business Intelligence) for a highly experienced Data Engineer in Big Data systems design with hands-on knowledge of data architecture, especially Spark and Delta/Data Lake technologies.

Individuals in this role will assist in the design, development, enhancement, and maintenance of complex data pipeline products that manage business-critical operations and large-scale analytics pipelines. Qualified applicants will have a demonstrated capability to learn new concepts quickly, have a data engineering background, and/or have robust software engineering expertise.

Responsibilities

  • Scope and execute together with team leadership. Work with the team to understand platform capabilities and how to best improve and expand those capabilities.
  • Strong independence and autonomy.
  • Design, development, enhancement, and maintenance of complex data pipeline products which manage business-critical operations and large-scale analytics applications.
  • Experience leading mid- and senior-level data engineers. 
  • Support analytics, data science and/or engineering teams and understand their unique needs and challenges. 
  • Instill excellence into the processes, methodologies, standards, and technology choices embraced by the team.
  • Embrace new concepts quickly to keep up with fast-moving data engineering technology.
  • Dedicate time to continuous learning to keep the team apprised of the latest developments in the space.
  • Commitment to developing technical maturity across the company.

Qualifications

  • 5+ years of Data Engineering experience including 2+ years designing and building Databricks data pipelines is REQUIRED; Azure cloud is highly preferred, however will consider AWS, GCP or other cloud platform experience in lieu of Azure
  • Experience with conceptual, logical and/or physical database designs is a plus
  • 2+ years of hands-on Python/Pyspark/SparkSQL and/or Scala experience is REQUIRED
  • 2+ years of experience with Big Data pipelines or DAG Tools (Data Factory, Airflow, dbt, or similar) is REQUIRED
  • 2+ years of Spark experience (especially Databricks Spark and Delta Lake) is REQUIRED
  • 2+ years of hands-on experience implementing Big Data solutions in a cloud ecosystem, including Data/Delta Lakes, is REQUIRED
  • Experience with source control (git) on the command line is REQUIRED
  • 2+ years of SQL experience, specifically to write complex, highly optimized queries across large volumes of data is HIGHLY DESIRED
  • Data modeling / data profiling capabilities with Kimball/star schema methodology is a plus
  • Professional experience with Kafka, or other live data streaming technology, is HIGHLY DESIRED
  • Professional experience with database deployment pipelines (i.e., dacpac’s or similar technology) is HIGHLY DESIRED
  • Professional experience with one or more unit testing or data quality frameworks is HIGHLY DESIRED
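
Since the qualifications above call out Kimball/star-schema data modeling, here is a minimal, entirely invented star-schema example using Python's sqlite3: one fact table joined to two dimension tables, aggregated by dimension attributes:

```python
import sqlite3

# Minimal star-schema sketch (Kimball style): a fact table of sales keyed to
# date and product dimensions. Schema and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER,
                              amount REAL);
    INSERT INTO dim_date    VALUES (1, '2024-01'), (2, '2024-02');
    INSERT INTO dim_product VALUES (10, 'beauty'), (11, 'wellness');
    INSERT INTO fact_sales  VALUES (1, 10, 50.0), (1, 11, 20.0), (2, 10, 30.0);
""")

# Typical analytical query: slice the fact table by dimension attributes.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

The payoff of the star shape is that every analytical question becomes the same pattern: join the fact to whichever dimensions you need, filter on their attributes, and aggregate the measures.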

#LI-CA1

#REMOTE

#databricks

#python

#spark

#dataengineer

#datawrangler

Apply for this job

+30d

Lead Data Architect (AWS, Azure, GCP)

CapTech Consulting - Denver, CO, Remote
nosql, airflow, sql, design, mongodb, azure, python, AWS

CapTech Consulting is hiring a Remote Lead Data Architect (AWS, Azure, GCP)

Job Description

CapTech Data Architects match our clients’ business goals with available technologies when developing a strategy for a successful data delivery implementation. We improve our clients’ business value by enhancing data use, improving effectiveness of information stewardship, and streamlining data flows. After gaining in-depth understanding of our client’s business challenges, our architects apply experience-based insight and use state-of-the-art tools and techniques to identify the best solutions. We view our Data Architects as thought leaders in the data space. We task them with growing CapTech talent and expanding data and analytics delivery capabilities. 

Specific responsibilities for the Data Architect position include:  

  • Assessing and advocating data management technologies and practices eliminating gaps between the current state and a well-targeted future state 
  • Interpreting and delivering impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps 
  • Formulating and articulating architectural trade-offs across solution options before recommending an optimal solution ensuring technical requirements are met 
  • Leading teams of data engineers and other technical team members through design, implementation and best practices.
  • Providing insights to executive stakeholders and development teams to ensure data architecture recommendations maximize the value of client data across the organization
  • Works to ensure the solutions recommended are providing business value and alignment with client’s strategic goals
  • Communication with non-technical executives focusing on value of modern enterprise level data solutions
  • Driving innovative technology solutions through thought leadership on emerging trends 
  • Sharing project solutions and outcomes with colleagues to improve delivery on future projects 
  • Leadership within the Data & Analytics practice area focused on growing and development capabilities and talent
  • Partnering with CapTech business development team to demonstrate CapTech’s technical capabilities, envision a proposed solution CapTech can offer, and estimate proposed work plans. 

Qualifications

Typical experience for successful candidates includes: 

  • 7+ years of experience implementing a variety of on-premises and cloud data management, integration, visualization, and analytical technologies 
  • Advanced proficiency in the design and implementation of modern data architectures and concepts such as cloud services (e.g., AWS, Azure, GCP), real-time data distribution (e.g., Kafka, Kinesis, DataFlow, Airflow), NoSQL (e.g., MongoDB, DynamoDB, HBase, CosmosDB) and modern data warehouse tools including Snowflake and DataBricks
  • Advanced proficiency in end-to-end data architecture solutions including ingestion, storage and relational modeling leveraging industry standard languages including SQL and Python
  • Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture 
  • Ability to assess traditional and modern data architectural components based on business needs
  • Experience in recommending data governance best practices including MDM, security, privacy and policies
  • Experience leading enterprise engineering teams through implementations serving as POC on design decisions and best practices

Preferred Qualifications

  • Previous consulting industry experience
  • Providing thought leadership and internal engagement with leadership and innovation across enterprise
  • Participating in providing mentorship and investing talent growth within Data & Analytics practice area
  • Awareness and continued education around emerging technologies and skills in data landscape

See more jobs at CapTech Consulting

Apply for this job

+30d

Data Architect (AWS, Azure, GCP)

CapTech Consulting - Columbus, OH, Remote
nosql, airflow, sql, design, mongodb, azure, python, AWS

CapTech Consulting is hiring a Remote Data Architect (AWS, Azure, GCP)

Job Description

CapTech Data Architects match our clients’ business goals with available technologies when developing a strategy for a successful data delivery implementation. We improve our clients’ business value by enhancing data use, improving effectiveness of information stewardship, and streamlining data flows. After gaining in-depth understanding of our client’s business challenges, our architects apply experience-based insight and use state-of-the-art tools and techniques to identify the best solutions. We view our Data Architects as thought leaders in the data space. We task them with growing CapTech talent and expanding data and analytics delivery capabilities. 

Specific responsibilities for the Data Architect position include:  

  • Assessing and advocating data management technologies and practices eliminating gaps between the current state and a well-targeted future state 
  • Interpreting and delivering impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps 
  • Formulating and articulating architectural trade-offs across solution options before recommending an optimal solution ensuring technical requirements are met 
  • Motivating and developing staff through teaching, empowering, and influencing technical and consulting “soft” skills 
  • Collaborating with client stakeholders and development staff to ensure data architecture recommendations maximize the value of client data across the organization 
  • Driving innovative technology solutions through thought leadership on emerging trends 
  • Sharing project solutions and outcomes with colleagues to improve delivery on future projects 
  • Partnering with CapTech business development team to demonstrate CapTech’s technical capabilities, envision a proposed solution CapTech can offer, and estimate proposed work plans. 

Qualifications

  • 5+ years of experience implementing a variety of on-premises and cloud data management, integration, visualization, and analytical technologies 
  • Advanced proficiency in end-to-end data architecture solutions including ingestion, storage and relational modeling leveraging industry standard languages including SQL and Python
  • Demonstrated proficiency in the design and implementation of modern data architectures and concepts such as cloud services (e.g., AWS, Azure, GCP), real-time data distribution (e.g., Kafka, Kinesis, DataFlow, Airflow), NoSQL (e.g., MongoDB, DynamoDB, HBase, CosmosDB) and modern data warehouse tools including Snowflake and DataBricks
  • Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture 
  • Ability to assess traditional and modern data architectural components based on business needs 
  • Familiarity with recommending data governance best practices including MDM, security, privacy and policies

See more jobs at CapTech Consulting

Apply for this job

+30d

Senior Software Engineer, Data Platform

Instacart - Canada Remote (BC, AB or ON only)
scala, airflow, postgres, sql, design, python, backend

Instacart is hiring a Remote Senior Software Engineer, Data Platform

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

OVERVIEW

 

About the Role

Our backend systems power the clients used by millions of customers every year to buy their groceries online. These systems must also support tight integration with the largest retailers in the US and Canada. Engineering at Instacart provides the opportunity to work on challenging scaling problems while also designing the features that will define our industry. You will learn how to build in an open collaborative environment serving millions of requests daily.

As a Senior Software Engineer in our Data Platform team, you will be the technical force that shapes our data infrastructure. You will design, develop, and maintain comprehensive access controls and governance frameworks to safeguard the integrity and privacy of our data. Your work will elevate the data utilization across several departments, enabling valuable insights, informed decision-making, and driving business value.

 

About the Team

The Data Platform team provides a robust and cutting edge platform to process petabytes of data daily using industry best practices. We work closely with Product/Data Science/ML teams to understand their needs and provide high quality solutions. The team is still small and many of our bigger initiatives are at an early stage. We expect you to work closely with stakeholders and shape these systems from design, technical decisions, project management to execution. Your input will be critical for driving the Data Strategy and building the platform that Instacart's data lake will be built upon. Instacart's technology is constantly changing and adapting. Some of the technologies you would use at Instacart include: Scala, Python, Postgres, Snowflake, DeltaLake, Iceberg, Clickhouse, Spark, DBT, Flink, Kafka, and Airflow. If you have experience with these technologies, you will have the opportunity to dive deeper. If you haven't used these technologies, you will have the chance to learn from the collective experience of our team.

 

About the Job

Your responsibilities will include:

  • Designing, developing, and sustaining the comprehensive access controls and governance systems that improve the integrity and privacy of our data.
  • Ensuring the reliability, scalability, and security of the data platform.
  • Collaborating with various stakeholders and being actively involved in the data infrastructure.

About you

Minimum Qualifications

We are looking for someone who:

  • Has 5+ years of experience in software engineering.
  • Exhibits an in-depth understanding of distributed systems, with proven experience with data processing technologies such as DBT and Airflow, and common web frameworks such as Rails.
  • Proficiently uses SQL for writing and reviewing complex queries for data analysis and debugging.
  • Can design for scale with the entire system in mind.
  • Capably communicates and is comfortable seeking and receiving feedback.
  • Possesses strong analytical and debugging skills.
  • Takes a strong sense of ownership while working with large codebases and diverse suite of products.
  • Embraces a collaborative mindset to partner with engineers, designers, and PMs from multiple teams to co-create impactful solutions while supporting system contributions.
  • Communicates clearly, presents ideas well, and can influence key stakeholders at manager, director, and VP levels.

Preferred Qualifications

We would love it if you:

  • Hold a Bachelor’s degree in Computer Science, Software Engineering, or a related field, or can demonstrate equivalent industry experience (4+ years).
  • Have prior work experience in Data Platforms.
  • Have experience with big data technologies such as Spark, Hadoop, Flink, Hive, or Kafka, and with both streaming and batch data pipelines.
  • Have proven experience with distributed system designs.
  • Possess strong general programming and algorithm skills.
  • Show strong attention to detail and accuracy in your implementation.
  • Have strong experience writing complex and optimized SQL queries.
  • Appreciate a data-driven mindset.

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For Canadian based candidates, the base pay ranges for a successful candidate are listed below.

CAN
$162,000 - $180,000 CAD

See more jobs at Instacart

Apply for this job

+30d

Staff Software Engineer, Data Platform

Instacart - United States, Remote
scala, airflow, postgres, sql, design, elasticsearch, python, backend

Instacart is hiring a Remote Staff Software Engineer, Data Platform

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

OVERVIEW

 

ABOUT THE ROLE

Our backend systems power the clients used by millions of customers every year to buy their groceries online. These systems must also support tight integration with the largest retailers in the US and Canada. Engineering at Instacart provides the opportunity to work on challenging scaling problems while also designing the features that will define our industry. You will learn how to build in an open collaborative environment serving millions of requests daily.

 

ABOUT THE TEAM

The Data Platform team provides a robust and cutting edge platform to process petabytes of data daily using industry best practices. We work closely with Product/Data Science/ML teams to understand their needs and provide high quality solutions. The team is still small and many of our bigger initiatives are at an early stage. We expect you to work closely with stakeholders and shape these systems from design, technical decisions, project management to execution. Your input will be critical for driving the Data Strategy and building the platform that Instacart's data lake will be built upon. Instacart's technology is constantly changing and adapting. Some of the technologies you would use at Instacart include: Scala, Python, Postgres, Snowflake, DeltaLake, Iceberg, Clickhouse, Spark, DBT, Flink, Kafka, Airflow

If you have experience with these technologies, you will have the opportunity to dive deeper. If you haven't used these technologies you will have the chance to learn from the collective experience of our team.

 

ABOUT THE JOB

  • You will work closely with other teams to understand their main pain points and translate them into self-serve and reliable solutions.
  • You are expected to mentor other team members and be a champion of engineering excellence.
  • You will be part of a small team, with a large amount of ownership and autonomy for managing things directly.
  • You will have the freedom to suggest and drive organization-wide initiatives.

ABOUT YOU

MINIMUM QUALIFICATIONS

  • Self-motivation and enjoyment of a startup environment
  • A strong sense of ownership
  • Strong knowledge of common data infra technologies (Python, Scala, Kafka, Airflow, Spark, Iceberg, Delta Lake) in a production environment
  • Strong knowledge of common industry data stores or warehouses (Postgres, ElasticSearch, Cassandra, Dynamo, Snowflake)
  • An ability to balance a sense of urgency with shipping high quality and pragmatic solutions
  • Experience working with a large codebase on a cross-functional team
  • Strong knowledge of SQL
  • 10+ years of working experience in a similar field/environment

PREFERRED QUALIFICATIONS

Bachelor’s degree in Computer Science, Computer Engineering, or Electrical Engineering, OR equivalent work experience

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$255,000–$283,000 USD
WA
$245,000–$272,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$234,000–$260,000 USD
All other states
$212,000–$235,000 USD

See more jobs at Instacart

Apply for this job

+30d

Senior Data Engineer

Remote, Remote-Southeast Asia
airflow, sql, jenkins, python, AWS

Remote is hiring a Remote Senior Data Engineer

About Remote

Remote is solving global remote organizations’ biggest challenge: employing anyone anywhere compliantly. We make it possible for businesses big and small to employ a global team by handling global payroll, benefits, taxes, and compliance. Check out remote.com/how-it-works to learn more or if you’re interested in adding to the mission, scroll down to apply now.

Please take a look at remote.com/handbook to learn more about our culture and what it is like to work here. Not only do we encourage folks of all ethnic groups, genders, sexualities, ages and abilities to apply, but we prioritize a sense of belonging. You can check out independent reviews by other candidates on Glassdoor or look up the results of our candidate surveys to see how others feel about working and interviewing here.

All of our positions are fully remote. You do not have to relocate to join us!

What this job can offer you

This is an exciting time to join the growing Data Team at Remote, which today consists of over 15 Data Engineers, Analytics Engineers and Data Analysts spread across 10+ countries. Throughout the team we're focused on driving business value through impactful decision making. We're in a transformative period where we're laying the foundations for scalable company growth across our data platform, which truly serves every part of the Remote business. This team would be a great fit for anyone who loves working collaboratively on challenging data problems, and making an impact with their work. We're using a variety of modern data tooling on the AWS platform, such as Snowflake and dbt, with SQL and python being extensively employed.

This is an exciting time to join Remote and make a personal difference in the global employment space as a Senior Data Engineer, joining our Data team, composed of Data Analysts and Data Engineers. We support the decision making and operational reporting needs by being able to translate data into actionable insights to non-data professionals at Remote. We’re mainly using SQL, Python, Meltano, Airflow, Redshift, Metabase and Retool.

What you bring

  • Experience in data engineering; high-growth tech company experience is a plus
  • Strong experience with building data extraction/transformation pipelines (e.g. Meltano, Airbyte) and orchestration platforms (e.g. Airflow)
  • Strong experience in working with SQL, data warehouses (e.g. Redshift) and data transformation workflows (e.g. dbt)
  • Solid experience using CI/CD (e.g. Gitlab, Github, Jenkins)
  • Experience with data visualization tools (e.g. Metabase) is considered a plus
  • A self-starter mentality and the ability to thrive in an unstructured and fast-paced environment
  • You have strong collaboration skills and enjoy mentoring
  • You are a kind, empathetic, and patient person
  • You write and speak fluent English
  • Experience working remotely is not required, but is considered a plus

Key Responsibilities

  • Playing a key role in Data Platform Development & Maintenance:
    • Managing and maintaining the organization's data platform, ensuring its stability, scalability, and performance.
    • Collaboration with cross-functional teams to understand their data requirements and optimize data storage and access, while protecting data integrity and privacy.
    • Development and testing architectures that enable data extraction and transformation to serve business needs.
  • Improving further our Data Pipeline & Monitoring Systems:
    • Designing, developing, and deploying efficient Extract, Load, Transform (ELT) processes to acquire and integrate data from various sources into the data platform.
    • Identifying, evaluating, and implementing tools and technologies to improve ELT pipeline performance and reliability.
    • Ensuring data quality and consistency by implementing data validation and cleansing techniques.
    • Implementing monitoring solutions to track the health and performance of data pipelines and identify and resolve issues proactively.
    • Conducting regular performance tuning and optimization of data pipelines to meet SLAs and scalability requirements.
  • Dig deep into DBT Modelling:
    • Designing, developing, and maintaining DBT (Data Build Tool) models for data transformation and analysis.
    • Collaboration with Data Analysts to understand their reporting and analysis needs and translate them into DBT models, making sure they respect internal conventions and best practices.
  • Driving our Culture of Documentation:
    • Creating and maintaining technical documentation, including data dictionaries, process flows, and architectural diagrams.
    • Collaborating with cross-functional teams, including Data Analysts, SREs (Site Reliability Engineers) and Software Engineers, to understand their data requirements and deliver effective data solutions.
    • Sharing knowledge and offering mentorship, providing guidance and advice to peers and colleagues, and creating an environment that empowers collective growth
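As an illustration only (not part of the job description), the extract, validate, load, transform flow described in the responsibilities above can be sketched in a few lines of Python. All table and field names here are hypothetical, and sqlite3 stands in for a real warehouse such as Redshift:

```python
import sqlite3

# Hypothetical raw records pulled from an upstream source (the "E" and "L" of ELT).
RAW = [
    {"user_id": 1, "country": "VN", "amount": 120.50},
    {"user_id": 2, "country": None, "amount": 80.00},   # fails the quality gate
    {"user_id": 3, "country": "ID", "amount": 99.90},
]

def validate(rows):
    """Data-quality gate: drop rows missing any required field."""
    required = ("user_id", "country", "amount")
    return [r for r in rows if all(r.get(k) is not None for k in required)]

def load(conn, rows):
    """Load validated rows into the warehouse (sqlite3 stands in here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (user_id INT, country TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO payments VALUES (:user_id, :country, :amount)", rows
    )

def transform(conn):
    """Post-load transform (the 'T'): aggregate revenue per country."""
    return conn.execute(
        "SELECT country, SUM(amount) FROM payments GROUP BY country ORDER BY country"
    ).fetchall()

conn = sqlite3.connect(":memory:")
load(conn, validate(RAW))
print(transform(conn))  # the row with a missing country was dropped
```

In production the same shape would typically be expressed as Airflow tasks, with dbt handling the transform step and monitoring wrapped around each stage.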

Practicals

  • You'll report to: Engineering Manager - Data
  • Team: Data 
  • Location: For this position we welcome everyone to apply, but we will prioritise applications from the following locations as we encourage our teams to diversify: Vietnam, Indonesia, Taiwan and South Korea
  • Start date: As soon as possible

Remote Compensation Philosophy

Remote's Total Rewards philosophy is to ensure fair, unbiased compensation and fair equity pay along with competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labor practices, and therefore we ensure we pay above in-location rates. We hope to inspire other companies to support global talent hiring and bring local wealth to developing countries.

At first glance our salary bands seem quite wide - here is some context. At Remote we have international operations and a globally distributed workforce. We use geo ranges to consider geographic pay differentials as part of our global compensation strategy to remain competitive in various markets while hiring globally.

The base salary range for this full-time position is $53,500 USD to $131,300 USD. Our salary ranges are determined by role, level and location, and our job titles may span more than one career level. The actual base pay for the successful candidate in this role is dependent upon many factors such as location, transferable or job-related skills, work experience, relevant training, business needs, and market demands. The base salary range may be subject to change.

Application process

  1. Interview with recruiter
  2. Interview with future manager
  3. Async exercise stage 
  4. Interview with team members

#LI-DP

Benefits

Our full benefits & perks are explained in our handbook at remote.com/r/benefits. As a global company, each country works differently, but some benefits/perks are for all Remoters:
  • work from anywhere
  • unlimited personal time off (minimum 4 weeks)
  • quarterly company-wide day off for self care
  • flexible working hours (we are async)
  • 16 weeks paid parental leave
  • mental health support services
  • stock options
  • learning budget
  • home office budget & IT equipment
  • budget for local in-person social events or co-working spaces

How you’ll plan your day (and life)

We work async at Remote which means you can plan your schedule around your life (and not around meetings). Read more at remote.com/async.

You will be empowered to take ownership and be proactive. When in doubt you will default to action instead of waiting. Your life-work balance is important and you will be encouraged to put yourself and your family first, and fit work around your needs.

If that sounds like something you want, apply now!

How to apply

  1. Please fill out the form below and upload your CV in PDF format.
  2. We kindly ask you to submit your application and CV in English, as this is the standardised language we use here at Remote.
  3. If you don’t have an up-to-date CV but you are still interested in talking to us, please feel free to add a copy of your LinkedIn profile instead.

We will ask you to voluntarily tell us your pronouns at interview stage, and you will have the option to answer our anonymous demographic questionnaire when you apply below. As an equal employment opportunity employer, it’s important to us that our workforce reflects people of all backgrounds, identities, and experiences, and this data will help us to stay accountable. We thank you for providing this data, if you choose to.

See more jobs at Remote

Apply for this job

+30d

Data Architect

ML, DevOps, airflow, sql, B2C, RabbitMQ, Design, java, c++, python, AWS

hims & hers is hiring a Remote Data Architect

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

We're looking for an experienced Data Architect to join our Data Platform Engineering team. Our team is responsible for enabling the Hims & Hers business (Product, Analytics, Operations, Finance, Data Science, Machine Learning, Customer Experience, and Engineering) by providing a platform with a rich set of data and tools to leverage.

As a Data Architect, you will focus on the big picture of an organization’s data strategy. You will work closely with other architects, engineering, product, analytics, data science, and DevOps leaders to align the data architecture with the overall business goals and objectives.

Your primary responsibility is to develop a cohesive data architecture that supports the organization’s long-term vision. You design efficient data storage structures, data models, standards, and optimize data retrieval for reporting, analytics, and AI purposes.

You Will

  • Collaborate with cross-functional stakeholders including product management, engineering, analytics, and key business representatives to align the architecture, vision and roadmap with stakeholder needs
  • Establish guidelines, controls, and processes to make data available for developing scalable data-driven solutions for Analytics and AI
  • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs
  • Design and develop scalable, high-performance data architecture solutions that support both the consumer side of the business as well as analytic use cases
  • Implement security measures to safeguard sensitive data from unauthorized access, ensuring data privacy, and compliance with relevant regulations
  • Plan and oversee large-scale and complex technical migrations to new data systems and platforms
  • Drive continuous data transformation to minimize technical debt
  • Display strong thought leadership in pursuit of modern data architecture principles and technology modernization
  • Define and lead technology proof of concepts to ensure feasibility of new data technology solutions
  • Provide technical leadership and mentorship to the members of the team
  • Create comprehensive documentation for designs and processes to support ongoing maintenance and knowledge sharing
  • Conduct design reviews to ensure that proposed solutions address platform and stakeholder pain points, as well as meet business, and technical requirements, with alignment to standards and best practices
  • Prepare and deliver efficient communications to convey architectural direction and how it aligns with company strategy. Be able to explain the architectural vision and implementation to executives

You Have

  • Bachelor's or Master's degree in Computer Science or equivalent, with over 12 years of Data Architecture and Data Engineering experience, including team leadership
  • Proven expertise in designing data platforms for large-scale data and diverse data architectures, including warehouses, lakehouses, and integrated data stores.
  • Proficiency in a variety of technologies such as: SQL, Bash, Python, Java, Presto, Spark, AWS, data streaming like Kafka, RabbitMQ, data stacks like Airflow, Databricks, dbt, and data stores like Cassandra, Aurora, ZooKeeper
  • Experience with data security (including PHI and PII), as well as data privacy regulations (CCPA and GDPR)
  • Proficient in addressing data-related challenges through analytical problem-solving and aligning data architecture with organizational business goals and objectives
  • Exposure to analytics techniques using ML and AI to assist data scientists and analysts in deriving insights from data
  • Analytical and problem-solving skills to address data-related challenges and find optimal solutions
  • Ability to manage projects effectively, plan tasks, set priorities, and meet deadlines in a fast-paced and ever changing environment

Nice to Have

  • Experience working in healthcare or in a B2C company

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

 

#LI-Remote

 

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range for US-based employees is
$245,000–$275,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

+30d

Senior Software Engineer, Data

CLEAR - Corporate, New York, New York, United States (Hybrid)
tableau, airflow, postgres, Design, jenkins, python, AWS

CLEAR - Corporate is hiring a Remote Senior Software Engineer, Data

Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Software Engineer to help us build the next generation of products which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Software Engineer at CLEAR, you will participate in the design, implementation, testing, and deployment of applications to build and enhance our platform, one that interconnects dozens of attributes and qualifications while keeping member privacy and security at the core.


A brief highlight of our tech stack:

  • Python / Postgres / Snowflake / Airflow / Databricks / Spark / dbt
  • AWS 

What you'll do:

  • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization
  • Develop and implement data analytics models
  • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
  • Have a strong sense of ownership, taking responsibility for architectural decision-making and striving for continuous improvement in technology and processes at CLEAR
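For illustration only: the dependency and workload management mentioned above comes down to resolving a task graph into a valid execution order, which the Python standard library can sketch directly (all task names here are hypothetical):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline dependency graph: each task maps to its upstream tasks.
DEPS = {
    "ingest": set(),
    "clean": {"ingest"},
    "transform": {"clean"},
    "metrics": {"clean"},
    "report": {"transform", "metrics"},
}

def run_order(deps):
    """Return one valid execution order: every task runs after all of its upstreams."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(DEPS))  # "ingest" first, "report" last
```

Orchestrators like Airflow or Dagster perform this same resolution, layering scheduling, retries, and backfills on top of it at production scale.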

 What you're great at:

  • 6+ years of data engineering experience
  • Working with cloud-based application development, and fluency in at least a few of: 
    • Cloud services providers like AWS
    • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc
    • Big data tools like Spark, Kafka, Snowflake, Databricks, etc
    • Collaboration, integration, and deployment tools like Github, Argo, and Jenkins 
    • Data visualization tools like Looker, Tableau, etc
  • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
  • Collaborating and mentoring less experienced members of the team
  • Comfort with ambiguity 
  • Curiosity about technology, a belief in constant learning, and the autonomy to figure out what's important

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000-$215,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 15 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

 

See more jobs at CLEAR - Corporate

Apply for this job

+30d

Senior Software Engineer, Ads Measurement

Instacart, Canada - Remote
Bachelor's degree, scala, airflow, sql, Design, python

Instacart is hiring a Remote Senior Software Engineer, Ads Measurement

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or their favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

ABOUT THE ROLE

Are you ready to take your development skills to the next level? We’re looking for a Senior Software Engineer to join our Ads team. You’ll play a critical role in the evolution of our Ads suite and help build world-class reporting solutions across various platforms, ensuring that advertisers and retailers receive timely, accurate, and actionable data insights. By working closely with Product Designers, Product Managers, Data Scientists, Machine Learning Engineers, and other cross-functional partners, you’ll contribute to the advancement of our Ads suite and guarantee a seamless flow of data to our users.

The Instacart Ads team is at the forefront of refining our Ads products and supporting infrastructure, so your work will directly enhance our capability to process petabyte-scale data and deliver reports essential for billing, strategic decision-making, and partner management.

Our products are used by millions of people every year. To meet–and exceed–expectations we are rapidly improving and modernizing our ads platform, helping raise the quality bar for our products across the entire organization. Sound exciting? Keep reading.

 

ABOUT THE TEAM

The Ads team is a diverse group of spirited and highly-dedicated engineers focused on crafting and delivering comprehensive reporting solutions to our advertisers and retailers.

Our team thrives on dynamic challenges, and we take pride in developing and maintaining scalable and fault-tolerant metrics delivery systems. We've embraced a culture of open and candid collaboration where everyone's views matter, allowing us to continuously innovate and make a substantial impact on the digital advertising industry through our work.

Our tech stack includes but is not limited to Rails, Go, DBT, Airflow, Scala, Apache Spark, Databricks, Delta Lake, Snowflake, Python and Terraform. We believe in constantly learning, growing and adopting the most efficient practices that enable us to deliver quality data services to our stakeholders. If you're a detective at heart, love solving complex problems, and are passionate about the intersection of data and technology, you'll fit right in!

 

Overview of the Ads teams that are currently hiring: 

  • Ads Measurement & Data: The Ads Measurement & Data team is focused on developing scalable and fault-tolerant data processing systems and delivering comprehensive reporting solutions to our advertisers and retailers. 

 

ABOUT THE JOB

We believe that high-quality data is essential for any business organization; as such, we are looking for a strong software engineer excited to raise our efficiency, quality and scalability bar. You will have extensive ownership and the ability to help set best practices and contribute to product and infrastructure features. 

As a craft leader, you'll be responsible for contributing to the vision, strategy and development of our multi-platform reporting system that is efficient, scalable, and meets diverse user needs. You will advocate for data quality, correctness, scalability and latency standards to ensure consistency in how we enable data-driven decisions and features across the organization. 

You will also be proactive in spearheading new initiatives, coding and documenting components, writing and reviewing system design documents and partnering with other teams and functions to gather and understand our customer's requirements. You will think and plan strategically for short and long term initiatives to continue shaping our platform and products.

MINIMUM QUALIFICATIONS

  • Bachelor's degree or higher in Computer Science, Software Engineering, or a related field, or equivalent proven industry experience (4+ years).
  • 5+ years of experience in software engineering.
  • Comprehensive understanding of distributed systems, proven experience with data processing technologies such as DBT and Airflow, and common web frameworks such as Rails.
  • Highly proficient with SQL, capable of writing and reviewing complex queries for data analysis and debugging.
  • You can design for scale with the entire system in mind.
  • Solid communicator, comfortable seeking and receiving feedback.
  • Strong analytical and debugging skills.
  • Strong sense of ownership working with a large codebase and diverse suite of products.
  • A collaborative mindset to be able to partner with engineers, designers and PMs from multiple teams to co-create impactful solutions while supporting system contributions.
  • Strong organizational skills with the ability to communicate and present ideas clearly and influence key stakeholders at the manager, director, and VP level.

 

PREFERRED QUALIFICATIONS

  • Prior work experience in the digital advertising industry.
  • Experience with big data technologies such as Spark, Hadoop, Flink, Hive or Kafka, and with both streaming and batching data pipelines.
  • Proven experience with distributed system design.
  • Strong general programming and algorithm skills.
  • Strong attention to detail and accuracy in the implementation, keen eye for edge cases and code reviews. 
  • Data driven mindset.

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here. Currently, we are only hiring in the following provinces: Ontario, Alberta and British Columbia.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For Canadian based candidates, the base pay ranges for a successful candidate are listed below.

CAN
$162,000–$180,000 CAD

See more jobs at Instacart

Apply for this job

+30d

Senior Machine Learning Engineer

Alt, Remote US
airflow, postgres, Design, python, AWS

Alt is hiring a Remote Senior Machine Learning Engineer

At Alt, we’re on a mission to unlock the value of alternative assets, and looking for talented people who share our vision. Our platform enables users to exchange, invest, value, securely store, and authenticate their collectible cards. And we envision a world where anything is an investable asset. 

To date, we’ve raised over $100 million from thought leaders at the intersection of culture, community, and capital. Some of our investors include Alexis Ohanian’s fund Seven Seven Six, the founders of Stripe, Coinbase co-founder Fred Ehrsam, BlackRock co-founder Sue Wagner, the co-founders of AngelList, First Round Capital, and BoxGroup. We’re also backed by professional athletes including Tom Brady, Candace Parker, Giannis Antetokounmpo, Alex Morgan, Kevin Durant, and Marlon Humphrey.

Alt is a dedicated equal opportunity employer committed to creating a diverse workforce. We celebrate our differences and strive to create an inclusive environment for all. We are focused on fostering a culture of empowerment which starts with providing our employees with the resources needed to reach their full potential.

What we are looking for:

We are seeking a Senior Machine Learning Engineer who is eager to make a significant impact. In this role, you'll get the opportunity to leverage your technical expertise and problem-solving skills to solve some of the hardest data problems in the hobby. Your primary focus will be on enhancing and optimizing our pricing engine to support strategic business goals. Our ideal candidate is passionate about trading cards, has a strong sense of ownership, and enjoys challenges. At Alt, data is core to everything we do and is a differentiator for our customers. The team’s scope covers data pipeline development, search infrastructure, web scraping, detection algorithms, internal tooling, and data quality. We give our engineers a lot of individual responsibility and autonomy, so your ability to make good trade-offs and exercise good judgment is essential.

The impact you will make:

  • Partner with engineers and cross-functional stakeholders to contribute to all phases of algorithm development, including ideation, prototyping, design, and production
  • Build, iterate, productionize, and own Alt's valuation models
  • Leverage background in pricing strategies and models to develop innovative pricing solutions
  • Design and implement scalable, reliable, and maintainable machine learning systems
  • Partner with product to understand customer requirements and prioritize model features

What you bring to the table:

  • Experience: 5+ years of experience in software development, with a proven track record of developing and deploying models in production. Experience with pricing models preferred.
  • Technical Skills: Proficiency in programming languages and tools such as Python, AWS, Postgres, Airflow, Datadog, and JavaScript.
  • Problem-Solving: A knack for solving tough problems and a drive to take ownership of your work.
  • Communication: Effective communication skills with the ability to ship solutions quickly.
  • Product Focus: Excellent product instincts, with a user-first approach when designing technical solutions.
  • Team Player: A collaborative mindset that helps elevate the performance of those around you.
  • Industry Knowledge: Knowledge of the sports/trading card industry is a plus.

What you will get from us:

  • Ground floor opportunity as an early member of the Alt team; you’ll directly shape the direction of our company. The opportunities for growth are truly limitless.
  • An inclusive company culture that is being built intentionally to foster an environment that supports and engages talent in their current and future endeavors.
  • $100/month work-from-home stipend
  • $200/month wellness stipend
  • WeWork office stipend
  • 401(k) retirement benefits
  • Flexible vacation policy
  • Generous paid parental leave
  • Competitive healthcare benefits, including HSA, for you and your dependent(s)

Alt's compensation package includes a competitive base salary benchmarked against real-time market data, as well as equity for all full-time roles. We want all full-time employees to be invested in Alt and to be able to take advantage of that investment, so our equity grants include a 10-year exercise window. The base salary range for this role is: $194,000 - $210,000. Offers may vary from the amount listed based on geography, candidate experience and expertise, and other factors.

See more jobs at Alt

Apply for this job

+30d

Data Engineer (OBRIO)

Genesis, Kyiv, UA - Remote - Hybrid
ML, airflow, sql, Firebase, api, ios, git, android, postgresql, MySQL, kubernetes, python

Genesis is hiring a Remote Data Engineer (OBRIO)

OBRIO is an IT company with Ukrainian roots inside the Genesis business ecosystem. Our team consists of more than 120 talented professionals whose ambitions and striving for success help us build the best products on the market. We have offices in Kyiv and Warsaw.
We are developing Nebula — the biggest brand in the spiritual niche. Nebula has over 45 million users worldwide and has been ranked several times as the #1 lifestyle app of the day in the App Store and Google Play in the USA, Canada, and Australia. Nebula is available on iOS, Android, and Web.

Our mission is to make people happier by improving the quality of their relationships.

Here are some details we would like to share with you:

  • Nebula is #1 in its niche in terms of downloads and revenue targets;
  • 45 million users worldwide;
  • Users from 50+ countries;
  • 4.8 — our average App Store rating (with more than 215 thousand ratings).

As OBRIO scales its team and processes, we need a Data Engineer to strengthen our team and bring their talent and technical skills to ensure smooth data operations by building and maintaining a new data structure. As our first Data Engineer, you will be able to independently shape a complete architecture and influence how our ETL processes are built. In this position, you will mostly interact with our analysts and back-end team. That is why we would like to share the backstage of team life and introduce you to our Analytics Lead, Zhenya. She joined our team almost two years ago and is always open to sharing her expertise.

Your impact:

  • Reviewing existing ETL processes, optimizing them, and continuously upgrading them;
  • Automating health checks and developing alerting;
  • Collecting and updating documentation;
  • Building a communication process with analysts, and a data/dataset transfer process with appropriate review, to optimize work with the database;
  • Developing and supporting the database architecture;
  • Building dashboards to monitor data quality and data structure;
  • Collecting data from various sources via API (marketing sources, Amplitude data, etc.);
  • Preparing datasets for ML models.
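
To make the health-check and alerting item concrete, here is a minimal, hypothetical sketch in Python. The table-metadata shape, thresholds, and function name are illustrative assumptions for this posting, not OBRIO's actual implementation.

```python
from datetime import datetime, timedelta, timezone

def check_table_health(rows, min_rows=1, max_age=timedelta(hours=24), now=None):
    """Return a list of alert messages for a table snapshot; empty means healthy.

    `rows` stands in for the result of a metadata query against the warehouse:
    a list of dicts, each with a timezone-aware `loaded_at` timestamp.
    """
    now = now or datetime.now(timezone.utc)
    alerts = []
    # Volume check: did the pipeline load at least the expected number of rows?
    if len(rows) < min_rows:
        alerts.append(f"volume: expected >= {min_rows} rows, got {len(rows)}")
    # Freshness check: is the newest row recent enough?
    if rows:
        newest = max(r["loaded_at"] for r in rows)
        if now - newest > max_age:
            alerts.append(f"freshness: newest row loaded at {newest.isoformat()}")
    return alerts

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    stale = [{"loaded_at": now - timedelta(days=2)}]
    print(check_table_health(stale, now=now))  # one freshness alert
```

In practice a check like this would run on a schedule (e.g. from an orchestrator) and push any non-empty alert list to a messenger or paging tool.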

Our technical stack:

  • Vertica;
  • PostgreSQL;
  • MySQL;
  • BigQuery;
  • Python;
  • Git;
  • Our services: Firebase, Amplitude, AppsFlyer, Google Analytics.

The green flags:

  • 1+ years of experience working with data;
  • Solid knowledge of Python for building ETL data pipelines (Pandas);
  • Strong skills working autonomously with third-party APIs;
  • Excellent SQL skills (PostgreSQL, MySQL, Vertica);
  • Understanding of database architecture design;
  • Experience with cloud services;
  • Experience with big data storage and/or processing environments (e.g. Apache Spark, Snowflake, BigQuery, and similar);
  • Understanding of orchestration tools (Kubernetes, Apache Airflow).
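
As an illustration of the kind of autonomous third-party API work listed above, here is a small sketch of cursor-based pagination. The response shape (`items`, `next_cursor`) and function names are hypothetical, and the HTTP call is injected as a parameter so the paging logic can run and be tested without network access.

```python
def extract_all(fetch_page, page_size=100):
    """Pull every record from a cursor-paginated API into one list.

    `fetch_page(cursor=..., limit=...)` is assumed to return a dict with an
    `items` list and an optional `next_cursor` (absent/None on the last page).
    """
    records, cursor = [], None
    while True:
        batch = fetch_page(cursor=cursor, limit=page_size)
        records.extend(batch["items"])
        cursor = batch.get("next_cursor")
        if cursor is None:  # API signals the last page
            return records

if __name__ == "__main__":
    # Fake two-page API standing in for a real marketing/analytics source.
    pages = {None: {"items": [1, 2], "next_cursor": "c1"},
             "c1": {"items": [3], "next_cursor": None}}
    print(extract_all(lambda cursor, limit: pages[cursor]))  # [1, 2, 3]
```

A real extractor would wrap an HTTP client (with auth, rate-limit handling, and retries) in `fetch_page`; the pagination loop stays the same.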

Why OBRIO is the best place to work?

  • Unleash Your Ambitions: Our company was built by ambitious people who never settle for less. By joining OBRIO, you'll have the chance to unleash your own ambitions and achieve your career dreams.
  • We don’t just give you opportunities for growth and development, we give you maximum autonomy and believe we can’t do without you and your active thinking.
  • Innovate and Be Creative: We embrace innovation and creativity at OBRIO, and we encourage our team members to bring their unique ideas to the table. You'll have the chance to explore new solutions and make a real impact on our company's success.
  • At OBRIO, we’ve gathered influential experts, all of whom are open to sharing their knowledge and ready to help solve issues based on their experience. This is the company where you can quickly reach your potential and advance your career.

Our benefits:

  • Benefit from the flexibility to work from anywhere in the world;
  • Work from the comfort of your home or from one of our offices in Kyiv or Warsaw. The choice is yours!
  • Enjoy 20 annual vacation days and unlimited sick leave, all covered by the company;
  • Don't worry about getting the right equipment, we've got you covered if necessary;
  • Stay healthy with access to a corporate doctor online, and health insurance options in Ukraine or a fixed amount towards insurance abroad after your probation period;
  • Keep learning with our extensive corporate library, internal online meetings, and lectures;
  • Grow your skills with our training compensation program;
  • Take advantage of our supportive corporate culture, including assistance with relocation, advice on legal stay abroad, housing support, and help for third-country nationals;
  • Have fun with our online events and team-building activities!

Here's what our hiring journey looks like: Initial Screening ➡️ Team Interview ➡️ Optional Skill Assessment ➡️ Final Check ➡️ Job Offer.

    Let's team up and reach for the stars together!

    More about us on social media: Facebook, Instagram, LinkedIn, TikTok.

    Discover our job openings, refer friends, and get an exclusive behind-the-scenes look at OBRIO by joining our Telegram.

    See more jobs at Genesis

    Apply for this job

    +30d

    Data Engineer (m/f/d) - Python / Remote possible

    Ebreuninger GmbH, Stuttgart, Germany, Remote
    DevOPS, terraform, airflow, sql, oracle, azure, docker, postgresql, kubernetes, python, AWS

    Ebreuninger GmbH is hiring a Remote Data Engineer (m/f/d) - Python / Remote possible

    Job Description

    The Data Engineering team supplies Breuninger's Data Platform with all the data needed for data products such as reporting dashboards, marketing analyses, data science, and other use cases. Our job is to make raw data from a wide range of source systems available in the Data Platform (Google Cloud / BigQuery). For this purpose we operate more than 100 data pipelines built on different technologies. As data experts we also drive data-driven work at Breuninger, acting as advisors to other teams on data architecture and data provisioning.

    • As a Data Engineer (m/f/d) at Breuninger, you will continuously evolve our Data Platform (Google Cloud / BigQuery)
    • You are responsible for designing, implementing, and maintaining data pipelines (Python, Airflow, dbt Cloud, Kubernetes)
    • You provision and improve the data infrastructure behind our data pipelines (Terraform, Google Cloud)
    • You advise other teams on building their data processes
    • You take on DevOps tasks, automate recurring processes, and operate CI/CD pipelines (Gitlab, Terraform)
    • You drive knowledge sharing within the team and across the company, helping to promote data-driven ways of working
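
    Reliable pipeline operation usually comes down to handling transient failures gracefully. The sketch below is an illustrative, stdlib-only retry-with-backoff wrapper of the sort an ETL/ELT task might use against a flaky source; the function names and parameters are invented for this example, not Breuninger's code.

```python
import time

def run_with_retries(task, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run `task()`, retrying with exponential backoff on failure.

    `sleep` is injectable so tests can skip real waiting.
    """
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the orchestrator
            sleep(base_delay * 2 ** attempt)  # wait 1s, 2s, 4s, ...

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient")
        return "loaded"
    print(run_with_retries(flaky, sleep=lambda s: None))  # "loaded" on 3rd try
```

    Orchestrators like Airflow provide this behavior natively via task-level retry settings; a wrapper like this is the same idea applied inside a single task.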

    Qualifications

    • You have at least 2 years of relevant data engineering experience
    • You have programming experience with Python
    • You have good SQL skills
    • You have experience with Airflow or comparable orchestration tools (Dagster, Prefect, …)
    • You have a solid basic understanding of database technologies (PostgreSQL, Oracle, …)
    • You know how to operate ELT/ETL pipelines reliably
    • You have experience with at least one cloud provider (AWS, GCP, Azure, …)
    • You are familiar with various big data technologies (Apache Beam, Apache Spark, …) and data architectures (streaming, batch, …)
    • You have an "automate everything" mindset, and DevOps comes naturally to you
    • Ideally, you already have experience with BigQuery, Terraform, Kubernetes, and Docker
    • You have good English skills

    Apply for this job

    +30d

    Data Architect

    Cohere Health, Remote
    agile, tableau, nosql, airflow, sql, Design, c++, python, AWS

    Cohere Health is hiring a Remote Data Architect

    Company Overview: 

    Cohere Health is a fast-growing clinical intelligence company that’s improving lives at scale by promoting the best patient-specific care options, using leading edge AI combined with deep clinical expertise. In only four years our solutions have been adopted by health insurance plans covering over 15 million people, while our revenues and company size have quadrupled.  That growth combined with capital raises totaling $106M positions us extremely well for continued success. Our awards include: 2023 and 2024 BuiltIn Best Place to Work, Top 5 LinkedIn™ Startup, TripleTree iAward, multiple KLAS Research Points of Light, along with recognition on Fierce Healthcare's Fierce 15 and CB Insights' Digital Health 150 lists.

    Opportunity Overview: 

    You will be a key leader in designing and implementing our data architecture, which is central to our value proposition and crucial to our company's success. As a Data Architect at Cohere, you will work with a high degree of autonomy to design and optimize data warehouses, ensure data governance, and enable data-driven decision-making across the business. You will partner closely with data, product, and engineering teams to solve complex problems and deliver scalable solutions.

    Last but not least, people who succeed here are empathetic teammates who are candid, kind, caring, and embody our core values and principles. We believe that diverse, inclusive teams produce the most impactful work. Cohere is deeply invested in ensuring a supportive, growth-oriented environment for everyone.

    What you will do:

    • Lead the design, implementation, and optimization of our data warehouse and governance policies to ensure scalability and compliance with healthcare regulations.
    • Work closely with stakeholders to understand data requirements and deliver actionable insights that can be efficiently productized.
    • Design, develop, operationalize, and maintain data models (conceptual, logical, and physical) to support various business use cases.
    • Collaborate with cross-functional teams to define data architecture standards and best practices.
    • Ensure data quality, integrity, and security across all data sources and systems.
    • Provide technical leadership and mentorship to data engineering and management teams.
    • Create and manage data governance frameworks, including data catalogs, lineage, and metadata management.
    • Stay current with emerging data technologies and evaluate their potential impact on our architecture.
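
    To ground the governance responsibilities above (data catalogs, lineage, metadata), here is a deliberately tiny sketch of what catalog entries with upstream lineage edges might look like. The schema, dataset names, and function are hypothetical illustrations; real catalog tools track far richer metadata.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """Minimal catalog record: a dataset, its owner, and its source datasets."""
    name: str
    owner: str
    upstream: list = field(default_factory=list)  # lineage: names of parents

def lineage_of(catalog, name, seen=None):
    """Walk upstream edges to list every ancestor of a dataset, sorted."""
    seen = seen if seen is not None else set()
    for parent in catalog[name].upstream:
        if parent not in seen:
            seen.add(parent)
            lineage_of(catalog, parent, seen)
    return sorted(seen)

if __name__ == "__main__":
    # Invented example chain: raw ingest -> cleaned table -> reporting mart.
    catalog = {
        "raw_claims":   DatasetEntry("raw_claims", "ingest-team"),
        "clean_claims": DatasetEntry("clean_claims", "dq-team", ["raw_claims"]),
        "claims_mart":  DatasetEntry("claims_mart", "bi-team", ["clean_claims"]),
    }
    print(lineage_of(catalog, "claims_mart"))  # ['clean_claims', 'raw_claims']
```

    A lineage walk like this is what lets a governance framework answer impact questions such as "which marts break if this raw source changes?".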

    Your background & requirements:

    • 4+ years of experience leading data architecture initiatives in a fast-paced, agile environment.
    • Bachelor's or Master's degree in Computer Science, Data Science, or a related field with at least 12 years of relevant experience.
    • Proven track record of designing and implementing scalable data warehouses and governance policies.
    • Expertise in SQL and proficiency in data modeling and design. Experience having integrated NoSQL systems as part of a broader data architecture.
    • Hands-on experience with ETL processes, data integration, and data quality frameworks.
    • Experience with data governance tools and practices.
    • Experience building data platforms using python, AWS, Airflow, dbt, and data warehouses.
    • Familiarity with healthcare data standards such as HL7, FHIR, and CCDA.
    • Strong understanding of data privacy and security regulations, including HIPAA.
    • Experience with business intelligence tools like Tableau or PowerBI.
    • Strong analytical and problem-solving skills.
    • Excellent communication and collaboration skills. 

    Equal Opportunity Statement: 

    Cohere Health is an Equal Opportunity Employer. We are committed to fostering an environment of mutual respect where equal employment opportunities are available to all. To us, it’s personal.

    We can’t wait to learn more about you and meet you at Cohere Health!

    The salary range for this position is $160,000 to $185,000 annually, as part of a total benefits package that includes health insurance, 401k, and bonus. In accordance with applicable state laws, Cohere is required to provide a reasonable estimate of the compensation range for this role. Individual pay decisions are ultimately based on a number of factors, including but not limited to qualifications for the role, experience level, skillset, and internal alignment.

     

    #LI-Remote

    #BI-Remote




    Apply for this job

    +30d

    Sr. Data Engineer, Marketing Tech

    ML, DevOPS, Lambda, agile, airflow, sql, Design, api, c++, docker, jenkins, python, AWS, javascript

    hims & hers is hiring a Remote Sr. Data Engineer, Marketing Tech

    Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

    Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

    We're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims. As a Senior Data Engineer, you will work with the analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving over a million Hims & Hers subscribers.

    You Will:

    • Architect and develop data pipelines to optimize performance, quality, and scalability.
    • Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources.
    • Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources to Data Lake.
    • Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance 
    • Orchestrate sophisticated data flow patterns across a variety of disparate tooling.
    • Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics.
    • Partner with the rest of the Data Platform team to set best practices and ensure the execution of them.
    • Partner with the analytics engineers to ensure the performance and reliability of our data sources.
    • Partner with machine learning engineers to deploy predictive models.
    • Partner with the legal and security teams to build frameworks and implement data compliance and security policies.
    • Partner with DevOps to build IaC and CI/CD pipelines.
    • Support code versioning and code deployments for data Pipelines.
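
    The extract/transform/load responsibilities above typically require load steps to be idempotent, so a re-run or replay of a pipeline does not duplicate rows. Below is a hedged sketch of that pattern using an upsert; sqlite3 stands in for the actual warehouse, and the `orders` schema and function name are invented for illustration.

```python
import sqlite3

def upsert_orders(conn, rows):
    """Idempotent load step: insert new rows, update existing ones by key."""
    conn.executemany(
        """INSERT INTO orders (id, status) VALUES (?, ?)
           ON CONFLICT(id) DO UPDATE SET status = excluded.status""",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
    upsert_orders(conn, [(1, "new"), (2, "new")])
    # Replaying an updated batch is safe: row 1 is updated, nothing duplicates.
    upsert_orders(conn, [(1, "shipped"), (2, "new")])
    print(conn.execute("SELECT id, status FROM orders ORDER BY id").fetchall())
    # [(1, 'shipped'), (2, 'new')]
```

    Warehouses like BigQuery, Snowflake, and Postgres express the same idea with `MERGE` or `INSERT ... ON CONFLICT`; the key property is that running the load twice yields the same table state as running it once.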

    You Have:

    • 8+ years of professional experience designing, creating and maintaining scalable data pipelines using Python, API calls, SQL, and scripting languages.
    • Demonstrated experience writing clean, efficient & well-documented Python code and are willing to become effective in other languages as needed.
    • Demonstrated experience writing complex, highly optimized SQL queries across large data sets.
    • Experience working with customer behavior data. 
    • Experience with Javascript, event tracking tools like GTM, tools like Google Analytics, Amplitude and CRM tools. 
    • Experience with cloud technologies such as AWS and/or Google Cloud Platform.
    • Experience with serverless architecture (Google Cloud Functions, AWS Lambda).
    • Experience with IaC technologies like Terraform.
    • Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres.
    • Experience building event streaming pipelines using Kafka/Confluent Kafka.
    • Experience with modern data stack like Airflow/Astronomer, Fivetran, Tableau/Looker.
    • Experience with containers and container orchestration tools such as Docker or Kubernetes.
    • Experience with Machine Learning & MLOps.
    • Experience with CI/CD (Jenkins, GitHub Actions, Circle CI).
    • Thorough understanding of SDLC and Agile frameworks.
    • Project management skills and a demonstrated ability to work autonomously.

    Nice to Have:

    • Experience building data models using dbt
    • Experience designing and developing systems with desired SLAs and data quality metrics.
    • Experience with microservice architecture.
    • Experience architecting an enterprise-grade data platform.

    Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

    The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

    Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

    An estimate of the current salary range for US-based employees is
    $140,000 to $170,000 USD

    We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

    Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

    Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

    For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

    See more jobs at hims & hers

    Apply for this job

    +30d

    Data Architect (remote in Spain possible)

    LanguageWire, Spain, Remote
    DevOPS, airflow, sql, azure, qa, postgresql

    LanguageWire is hiring a Remote Data Architect (remote in Spain possible)

    Do you just love tweaking that one annoying query to perform just a little bit better?

    Are you the go-to person who knows how to find and use data in a complex distributed ecosystem spanning plenty of services and databases?

    Are you interested in pushing organizations to use their data more effectively and become more data-driven?

    Yes? You should definitely read on!

    The role you’ll play

    As LanguageWire accelerates our AI developments, we are in the process of re-architecting our data infrastructure by revising our existing pipelines and data warehouse and moving towards a data lake architecture.

    As Data Architect, you will be responsible for LanguageWire’s efficient management and use of data.

    As a technical leader you will define LanguageWire’s data vision and strategy including aspects like architecture, governance, compliance, etc.

    Supported by our Senior Director of Technology, you will collaborate closely with our engineering teams to make this vision a reality by driving the data-related aspects of our roadmap, planning and delivering training for our engineering teams, and supporting them in all their needs.

    In parallel, you will support engineers in their day-to-day data work. Technology selection, data modelling, query optimization, and monitoring and troubleshooting issues are continuous needs you will help teams with.

    This means you will need to balance your focus between long-term strategic initiatives, evangelizing data practices across our engineering teams, and more tactical day-to-day support.

    The team you’ll be a part of

    We have 8 software teams working across 5 countries and taking care of the continuous development of our platform. We strongly believe in building our own tech so we can deliver the best solutions for our customers. Our teams cover the full technical scope needed to create advanced language solutions with AI, complex web-based tools, workflow engines, large scale data stores and much more. Our technology and linguistic data assets set us apart from the competition and we’re proud of that.

    You will report directly to our Senior Director of Technology and work as part of our Technical Enablement team, a cross-functional team of specialists working closely with all our other engineering teams on core technical aspects (architecture, data engineering, QA automation, performance, cybersecurity, etc.). Our Technical Enablement team is key to ensuring that the LanguageWire platform is built, run, and maintained in a scalable, reliable, performant, and secure manner.

    If you want to make a difference, make it with us by…

    • Defining LanguageWire’s data architecture framework, standards, and principles, including modeling, metadata, security, reference data, and master data.
    • Driving the strategy execution across the entire tech organization by closely collaborating with other teams.
    • Ensuring the optimal operation of our products and services by being the hands-on expert who supports our teams with their database and data needs.

    In one year, you’ll know you were successful if…

    • All of LanguageWire’s data is well modelled and documented.
    • LanguageWire has a powerful core data engine that allows our ML/AI teams to effectively leverage all of our data.
    • You are regarded as the go-to person for all database and data needs.

     

    Desired experience and competencies

    What does it take to work for LanguageWire?

    What you’ll need to bring

    You are a hands-on technical expert

    • Expert knowledge of SQL (SQL Server, PostgreSQL, etc.)
    • Good knowledge of cloud services (Azure & GCP) and DevOps engineering
    • Solid data modelling skills, including conceptual, logical and physical models.
    • Experience with Data Warehousing (BigQuery, SnowFlake, Databricks, …)
    • Experience with Orchestration technology (Apache Airflow, Azure Data Factory, …)
    • Experience with Data Lakes and Data Warehouses
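
    As a toy illustration of the hands-on query tuning this role calls for, the sketch below adds an index and confirms the planner actually uses it. sqlite3 stands in for a production database here, and the table and index names are invented; the same verify-with-the-plan habit applies to SQL Server's or PostgreSQL's `EXPLAIN` output.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, state TEXT)")
conn.executemany(
    "INSERT INTO jobs (state) VALUES (?)",
    [("done",) if i % 2 else ("queued",) for i in range(1000)],
)

query = "SELECT count(*) FROM jobs WHERE state = ?"

# Inspect the plan before and after adding an index on the filtered column.
before = conn.execute("EXPLAIN QUERY PLAN " + query, ("queued",)).fetchall()
conn.execute("CREATE INDEX idx_jobs_state ON jobs (state)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, ("queued",)).fetchall()

print(before[0][-1])  # plan detail: a full table scan
print(after[0][-1])   # plan detail: a search using idx_jobs_state
```

    Checking the plan, rather than just the timing, is what distinguishes a durable fix from one that the optimizer silently ignores.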

    You are a technical leader

    • You stand out as a trusted leader and respected mentor in your team.
    • Excellent communicator able to create engagement and commitment from teams around you

    You are a team player 

    • You love solving complex puzzles with engineers from different areas and different backgrounds 
    • You’re eager to understand how the different areas of the ecosystem connect to create the complete value chain

    Fluent English (reading, writing, speaking) 

    This will make you stand out

    • Technical Leadership experience (influencing without authority)
    • Experience working within a microservice-based architecture

    Your colleagues say you

    • Are approachable and helpful when needed
    • Know all the latest trends in the industry
    • Never settle for second best

    Our perks

    • Enjoy flat hierarchies, responsibility and freedom, direct feedback, and room to stand up for your own ideas
    • Internal development opportunities, ongoing support from your People Partner, and an inclusive and fun company culture
    • International company with over 400 employees. Offices in Copenhagen, Aarhus, Stockholm, Varberg, London, Leuven, Lille, Paris, Munich, Hamburg, Zurich, Kiev, Gdansk, Atlanta, Finland and Valencia
    • We offer flexible work options tailored to how you work best. Depending on your team, you may have the option to work full-time from the office as an "Office Bee," part-time from the office as a "Nomad," or full-time from home as a "Homey."
    • We take care of our people and initiate many social get-togethers, from Friday bars to Summer or Christmas parties. We have fun!
    • 200 great colleagues in the Valencia office belonging to different business departments
    • Excellent location in cool and modern offices in the city center, with a great rooftop terrace and a view over the Town Hall Square
    • Working in an international environment—more than 20 different nationalities
    • A private health insurance
    • A dog friendly atmosphere
    • Big kitchen with access to organic fruit, nuts, biscuits, and coffee
    • Social area and game room (foosball table, darts, and board games)
    • Bike and car parking

     

    About LanguageWire

    At LanguageWire, we want to wire the world together with language. Why? Because we want to help people & businesses simplify communication. We are fueled by the most advanced technology (AI) and our goal is to make customer's lives easier by simplifying their communication with any audience across the globe.

     

    Our values drive our behavior

    We are curious. We are trustworthy. We are caring. We are ambitious.

    At LanguageWire, we are curious and intrigued by what we don’t understand. We believe relationships are based on honesty and responsibility, and being trustworthy reinforces an open, humble, and honest way of communicating. We are caring and respect each other personally and professionally. We encourage authentic collaboration, invite feedback and a positive social environment. Our desire to learn, build, and share knowledge is a natural part of our corporate culture.

     

    Working at LanguageWire — why we like it: 

    “We believe that we can wire the world together with language. It drives us to think big, follow ambitious goals, and get better every day. By embracing and solving the most exciting and impactful challenges, we help people to understand each other better and to bring the world closer together.”

    (Waldemar, Senior Director of Product Management, Munich)

    Yes, to diversity, equity & inclusion

    In LanguageWire, we believe diversity in gender, age, background, and culture is essential for our growth. Therefore, we are committed to creating a culture that incorporates diverse perspectives and expertise in our everyday work.

    LanguageWire’s recruitment process is designed to be transparent and fair for all candidates. We encourage candidates of all backgrounds to apply, and we ensure that candidates are provided with an equal opportunity to demonstrate their competencies and skills.

    Want to know more?

    We can’t wait to meet you! So, why wait 'til tomorrow? Apply today!

    If you want to know more about LanguageWire, we encourage you to visit our website!

    See more jobs at LanguageWire

    Apply for this job

    +30d

    Sr. Data Engineer, Kafka

    DevOPS, agile, terraform, airflow, postgres, sql, Design, api, c++, docker, kubernetes, jenkins, python, AWS, javascript

    hims & hers is hiring a Remote Sr. Data Engineer, Kafka

    Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

    Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

    We're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims. As a Senior Data Engineer, you will work with the analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving over a million Hims & Hers users.

    You Will:

    • Architect and develop data pipelines to optimize performance, quality, and scalability
    • Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources
    • Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources to Data Lake
    • Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance
    • Orchestrate sophisticated data flow patterns across a variety of disparate tooling
    • Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics
    • Partner with the rest of the Data Platform team to set best practices and ensure the execution of them
    • Partner with the analytics engineers to ensure the performance and reliability of our data sources
    • Partner with machine learning engineers to deploy predictive models
    • Partner with the legal and security teams to build frameworks and implement data compliance and security policies
    • Partner with DevOps to build IaC and CI/CD pipelines
    • Support code versioning and code deployments for data pipelines
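The extract-transform-load responsibilities listed above can be sketched as a minimal pipeline. This is an illustrative stand-in, not Hims & Hers code: the field names (`id`, `amount`) are hypothetical, the "warehouse" is a plain list, and in production each step would be an orchestrated Airflow task with the load targeting a real data lake or warehouse.

```python
import json
from datetime import date

def extract(raw_records):
    """Extract step: in production this would page through a REST API and
    land raw JSON in the data lake; here it just parses records directly."""
    return [json.loads(r) if isinstance(r, str) else r for r in raw_records]

def transform(records):
    """Transform step: normalize types and drop rows failing basic
    data-quality checks (missing id or amount)."""
    clean = []
    for rec in records:
        if rec.get("id") is None or rec.get("amount") is None:
            continue  # a real pipeline would quarantine these for review
        clean.append({
            "id": int(rec["id"]),
            "amount": float(rec["amount"]),
            "ingested_on": date.today().isoformat(),
        })
    return clean

def load(rows, sink):
    """Load step: append validated rows to the target table (a list here)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract([{"id": "1", "amount": "9.99"}, {"id": None}])), warehouse)
```

The testing-framework and observability bullets map naturally onto the quality check in `transform`: rejected rows are where monitoring hooks would attach.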

    You Have:

    • 8+ years of professional experience designing, creating and maintaining scalable data pipelines using Python, API calls, SQL, and scripting languages
    • Demonstrated experience writing clean, efficient, and well-documented Python code, and a willingness to become effective in other languages as needed
    • Demonstrated experience writing complex, highly optimized SQL queries across large data sets
    • Experience with cloud technologies such as AWS and/or Google Cloud Platform
    • Experience building event streaming pipelines using Kafka/Confluent Kafka
    • Experience with IaC technologies like Terraform
    • Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres
    • Experience with Databricks platform
    • Experience with modern data stack like Airflow/Astronomer, Databricks, dbt, Fivetran, Confluent, Tableau/Looker
    • Experience with containers and container orchestration tools such as Docker or Kubernetes
    • Experience with Machine Learning & MLOps
    • Experience with CI/CD (Jenkins, GitHub Actions, Circle CI)
    • Thorough understanding of SDLC and Agile frameworks
    • Project management skills and a demonstrated ability to work autonomously
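The event-streaming requirement above centers on Kafka's produce/poll model. As a self-contained illustration only, the sketch below mimics the shape of the confluent-kafka Producer/Consumer API with an in-memory queue standing in for a real Kafka topic; `Topic` and its methods are hypothetical names, not part of any Kafka client library.

```python
import queue

class Topic:
    """In-memory stand-in for a Kafka topic, to show the produce/poll flow."""

    def __init__(self):
        self._q = queue.Queue()

    def produce(self, value):
        # A real producer would serialize and send to a broker partition.
        self._q.put(value)

    def poll(self, timeout=0.1):
        # A real consumer would fetch from the broker and track offsets.
        try:
            return self._q.get(timeout=timeout)
        except queue.Empty:
            return None

topic = Topic()
for event in ("signup", "purchase", "refund"):
    topic.produce(event)

consumed = []
while (msg := topic.poll(timeout=0.01)) is not None:
    consumed.append(msg)
```

In a real deployment the consumer loop would run continuously, commit offsets, and hand events to a downstream sink rather than a local list.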

    Nice to Have:

    • Experience building data models using dbt
    • Experience with Javascript and event tracking tools like GTM
    • Experience designing and developing systems with desired SLAs and data quality metrics
    • Experience with microservice architecture
    • Experience architecting an enterprise-grade data platform

    Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

    The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

    Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

    An estimate of the current salary range for US-based employees is
    $140,000 to $170,000 USD

    We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

    Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

    Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

    For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

    See more jobs at hims & hers

    Apply for this job

    +30d

    Data Engineer GCP | Summer Job Dating

    Devoteam | Tunis, Tunisia, Remote
    airflow, sql, scrum

    Devoteam is hiring a Remote Data Engineer GCP | Summer Job Dating

    Job Description

    Within the "Data Platform" department, the consultant will join a SCRUM team and focus on a specific functional scope.

    Your role will be to contribute to data projects by bringing your expertise to the following tasks:

    • Design, develop, and maintain robust, scalable data pipelines on Google Cloud Platform (GCP), using tools such as BigQuery, Airflow, Looker, and DBT.
    • Collaborate with business teams to understand data requirements and design appropriate solutions.
    • Optimize data-processing and ELT performance using Airflow, DBT, and BigQuery.
    • Implement data-quality processes to guarantee data integrity and consistency.
    • Work closely with engineering teams to integrate data pipelines into existing applications and services.
    • Stay up to date with new technologies and best practices in data processing and analytics.
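The BigQuery-centered ELT work described here often boils down to incremental loads from a staging table into a target. As a hedged sketch, the helper below only builds a BigQuery-style MERGE statement; the table and column names are hypothetical, and an Airflow task would submit the resulting SQL through the BigQuery client.

```python
def build_merge_sql(target, staging, key, columns):
    """Build an incremental-load MERGE: update matched rows, insert new ones.

    target/staging are fully qualified table names; key is the join column;
    columns are the non-key columns to sync.
    """
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"S.{c}" for c in [key] + columns)
    return (
        f"MERGE `{target}` T USING `{staging}` S ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql("analytics.orders", "staging.orders_delta",
                      key="order_id", columns=["status", "amount"])
```

Generating the statement in code keeps the data-quality contract (which columns sync, on which key) in one reviewable place.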

     

    Qualifications

    Skills

    What will help you join the team?

    A Bac+5 degree from an engineering school, or an equivalent university degree, with a specialization in computer science.

    • At least 4 years of experience in data engineering, including significant experience in a GCP-based cloud environment.
    • Advanced SQL proficiency for data processing and optimization.
    • Google Professional Data Engineer certification is a plus.
    • Excellent written and verbal communication (high-quality deliverables and reporting).

    So, if you want to grow, learn, and share, join us!

      See more jobs at Devoteam

      Apply for this job