airflow Remote Jobs

138 Results

9h

Senior Data Engineer

Gemini · Remote (USA)
remote-first · airflow · sql · Design · css · kubernetes · python · javascript

Gemini is hiring a Remote Senior Data Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Data

The Role: Senior Data Engineer

As a member of our data engineering team, you'll deliver high quality work while solving challenges that impact all or part of the team's data architecture. You'll stay current with recent advances in the big data space and provide solutions for large-scale applications that align with the team's long-term goals. Your work will help resolve complex problems by identifying root causes, documenting solutions, and building with operational excellence (data auditing, validation, automation, maintainability) in mind. Communicating your insights with leaders across the organization is paramount to success.

Responsibilities:

  • Design, architect and implement best-in-class Data Warehousing and reporting solutions
  • Lead and participate in design discussions and meetings
  • Mentor data engineers and analysts
  • Design, automate, build, and launch scalable, efficient and reliable data pipelines into production using Python
  • Build real-time data and reporting solutions
  • Design, build and enhance dimensional models for Data Warehouse and BI solutions
  • Research new tools and technologies to improve existing processes
  • Develop new systems and tools to enable the teams to consume and understand data more intuitively
  • Partner with engineers, project managers, and analysts to deliver insights to the business
  • Perform root cause analysis and resolve production and data issues
  • Create test plans, test scripts and perform data validation
  • Tune SQL queries, reports and ETL pipelines
  • Build and maintain data dictionary and process documentation

Minimum Qualifications:

  • 5+ years experience in data engineering with data warehouse technologies
  • 5+ years experience in custom ETL design, implementation and maintenance
  • 5+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Advanced skills with Python and SQL are a must
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
  • Strong computer science fundamentals including data structures and algorithms
  • Strong software engineering skills in any server-side language, preferably Python
  • Experienced in working collaboratively across different teams and departments
  • Strong technical and business communication

Preferred Qualifications:

  • Kafka, HDFS, Hive, Cloud computing, machine learning, text analysis, NLP & Web development experience is a plus
  • Experience with Continuous integration and deployment
  • Knowledge and experience of financial markets, banking or exchanges
  • Web development skills with HTML, CSS, or JavaScript

It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $136,000 and $170,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AA1

Apply for this job

3d

Machine Learning Operations Engineer - Intermediate

Assent · Ottawa, Canada, Remote
ML · S3 · EC2 · Lambda · 4 years of experience · airflow · sql · docker · python · AWS

Assent is hiring a Remote Machine Learning Operations Engineer - Intermediate

Job Description

The Intermediate Machine Learning Operations Engineer is responsible for developing and maintaining key machine learning infrastructure throughout its lifecycle. This role will ensure that ML and AI models deployed in production at Assent are performant, reliable, maintain (and improve) accuracy over time, and integrate well with the overall Assent software suite. The Intermediate ML-Ops Engineer will also support data pipeline development with respect to ML model inputs and outputs. This person will have an important role in designing cutting-edge infrastructure, supporting key product offerings from Assent. The Intermediate Machine Learning Operations Engineer is a data-oriented, out-of-the-box thinker who is passionate about data, machine learning, understanding the business, and driving business value.

Key Requirements and Responsibilities

  • Develop key data pipelines moving data into and out of ML and AI models in production environments for Assent's digital products;
  • Support the development of a robust ML-Ops framework to support model tracking and continuous improvement;
  • Support ongoing and automated statistical analysis of ML models in production (see the sketch after this list);
  • Work closely with adjacent teams to proactively identify potential issues in performance and availability, with respect to data systems impacting ML models;
  • Be curious, proactive and iterative, prepared to try unconventional ideas to find solutions to difficult problems;
  • Apply engineering principles to proactively identify issues, develop solutions, and recommend improvements to existing ML Operations;
  • Be self-motivated and highly proactive at exploring new technologies;
  • Stay up to date with machine learning and AI principles, models, tools and their applications in data processing and analysis;
  • Find creative solutions to challenges involving data that is difficult to obtain, complex or ambiguous;
  • Manage multiple concurrent projects, priorities and timelines;
  • Support the Machine Learning team in the pursuit of building business critical data products;
  • Configure & deploy relevant implementations to the Amazon Web Services (AWS) cloud.
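
A minimal sketch of the kind of automated statistical check described above: comparing a recent production sample of a model input against its training-time baseline to flag drift. The data, threshold, and library choice are illustrative assumptions, not Assent's actual framework.

```python
# Illustrative drift check, not Assent's actual monitoring code.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline = rng.normal(loc=0.0, scale=1.0, size=5_000)    # training-time sample (assumed)
production = rng.normal(loc=0.3, scale=1.0, size=5_000)  # recent live sample (assumed)

# Kolmogorov-Smirnov test: a small p-value suggests the two distributions
# differ, i.e. the model's inputs have drifted since training.
stat, p_value = ks_2samp(baseline, production)
if p_value < 0.01:
    print(f"Drift detected (KS={stat:.3f}, p={p_value:.2e}); consider retraining")
```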

This is not an exhaustive list of duties. Responsibilities may be altered and/or added from time to time to meet business needs.

Qualifications

We strongly value your talent, energy and passion. It will also be valuable to Assent if you have the following qualifications:

  • 2-4 years of experience in MLOps, machine learning engineering, or related fields, with hands-on experience deploying and maintaining ML models in production environments.
  • A degree in Computer Science, Engineering, or a related field (a Master's-level degree, or higher, is highly preferred)
  • A demonstrable understanding of machine learning and AI principles, models, tools, and their applications in data processing and analysis.
  • Strong knowledge of SQL for data retrieval.
  • Excellent ability to use Python for data extraction and manipulation
  • Solid working knowledge of AWS systems and services; comfort working with SageMaker, EC2, S3, Lambda, Terraform.
  • Solid working knowledge of MLOps, versioning, orchestration, and containerization tools: MLflow, Kubeflow, Airflow, DVC, Weights & Biases, Docker, Kubernetes.
  • Strong understanding of statistical analysis methods and procedures
  • Ability to apply engineering principles to proactively identify issues, develop solutions, and recommend improvements
  • Excellent analytical ability and creative problem-solving skills, including the ability to deal with situations where information is difficult to obtain, complex or ambiguous
  • Excellent organizational skills and ability to manage multiple priorities and timelines
  • You’re a great team player, constantly looking to support your teammates on their mission of building great data products.

Reasonable Accommodations Statement: To perform this job successfully, an individual must be able to perform the aforementioned duties and responsibilities satisfactorily. Reasonable accommodations may be made to enable qualified individuals with disabilities to perform these essential functions. Assent is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.


Apply for this job

6d

Sr. Engineer II, Analytics

ML · agile · tableau · airflow · sql · git · c++ · python · backend

hims & hers is hiring a Remote Sr. Engineer II, Analytics

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

About the Role:

We're looking for a savvy and experienced Senior Analytics Engineer to build seamless data products in collaboration with our data engineering, analytics, engineering, business, and product management teams.

You Will:

  • Take the data products to the next level by developing scalable data models 
  • Manage transformations of data after load of raw data through both technical processes and business logic
  • Create an inventory of the data sources and documents needed to implement self-service analytics
  • Define quality standards of the data and partner with the analytics team to define minimum acceptance criteria for the data sources
  • Data cataloging & documentation of the data sources
  • Regularly meet with business partners and analytics teams to understand and solve data needs, short-term and medium-term
  • Build trust with internal stakeholders to encourage data-driven decision-making
  • Work with all organizations to continually grow the value of our data products by onboarding new data from our backend and 3rd-party systems

You Have:

  • 8+ years of experience with SQL, preferably for data transformation or analytical use cases
  • 4+ years of experience building scalable data models for analytical and BI purposes
  • 3+ years of solid experience with dbt
  • Mastery of data warehouse methodologies and techniques from transactional databases to dimensional data modeling, to wide denormalized data marts
  • Solid experience with BI tools like Tableau and Looker
  • Experience using version control (command-line, Git)
  • Familiarity with one of the data warehouses like Google BigQuery, Snowflake, Redshift, Databricks
  • Domain expertise in one or more of Finance, Product, Marketing, Operations, or Customer Experience
  • Demonstrated experience engaging and influencing senior leaders across functions, including an ability to communicate effectively with both business and technical teams
  • Strong analytical and quantitative skills with the ability to use data and metrics to back up assumptions and recommendations to drive actions
  • Ability to articulate vision, mission, and objectives, and change the narrative appropriate to the audience
  • Experience working with management to define and measure KPIs and other operating metrics
  • Understanding of SDLC and Agile frameworks
  • Project management skills and a demonstrated ability to work autonomously

Nice to Have:

  • Experience working in telehealth or e-commerce
  • Previous working experience at startups
  • Knowledge of Python programming
  • Knowledge of Airflow and the modern data stack (Databricks, dbt, Fivetran, Tableau / Looker)
  • ML training model development

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions, including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range is
$150,000 - $180,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims considers all qualified applicants for employment, including applicants with arrest or conviction records, in accordance with the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance, the California Fair Chance Act, and any similar state or local fair chance laws.

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, please contact us at accommodations@forhims.com and describe the needed accommodation. Your privacy is important to us, and any information you share will only be used for the legitimate purpose of considering your request for accommodation. Hims & Hers gives consideration to all qualified applicants without regard to any protected status, including disability. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 


Apply for this job

6d

Desenvolvedor Backend Pleno

Sossego · Brazil, Remote
DevOps · airflow · sql · scrum · typescript · python

Sossego is hiring a Remote Desenvolvedor Backend Pleno

Job Description

THE CHALLENGE

Work on the development and maintenance of data capture processes, using APIs and scraping automation, with a focus on execution efficiency and the quality of the captured data.

Extract data from varied sources, such as HTML pages, XLS/CSV files, PDF files, and others.

Work together with the Squad to drive continuous improvement in processes and products.

JOB DESCRIPTION

We are looking for a mid-level developer with experience in APIs and automation using TypeScript and Playwright. You will have the opportunity to expand your work with Python and Airflow, which are considered a plus. Experience with RegEx and SQL is essential.

RESPONSIBILITIES

  • Develop and maintain data capture automations using TypeScript and Playwright

  • Develop and maintain automations for extracting data from varied file types (PDF, XLS, etc.)

  • Document and map the automated data capture, extraction, and processing workflows

  • Monitor the performance and quality of the data capture automations

  • Ensure compliance with security and privacy rules in data handling

  • Manipulate data efficiently using SQL

  • Stay up to date with trends and advances related to ETL

  • Collaborate with multidisciplinary teams to identify opportunities for improvement in processes and products

Qualifications

REQUIREMENTS:

  • At least 3 years of professional experience
  • Experience with automation in TypeScript and Playwright or similar technologies (Puppeteer, Selenium, etc.)
  • Experience with TypeScript
  • Experience with regular expressions (RegEx)
  • Experience with SQL
  • Knowledge of Python
  • Degree in Computer Science, Engineering, Mathematics, or a related field
  • Analytical skills for problem analysis and resolution
  • Excellent written and verbal communication

NICE TO HAVE:

  • Familiarity with cloud environments and DevOps processes
  • Experience with Airflow or other ETL orchestrators
  • Knowledge of BPMN
  • Knowledge of the insurance industry
  • Familiarity with agile methodologies (Scrum, Kanban)


Apply for this job

7d

Senior Azure Scala Data Engineer

DevOps · agile · Bachelor's degree · 5 years of experience · scala · airflow · sql · Design · azure · scrum

FuseMachines is hiring a Remote Senior Azure Scala Data Engineer


Apply for this job

8d

Senior Data Engineer

Nile Bits · Cairo, Egypt, Remote
agile · airflow · sql · Design · docker · linux · python · AWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
  • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
  • Confident writing SQL and good understanding of database design.
  • Experience working with web APIs.
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, working with Linux
  • Experience with a cloud data warehouse
  • Experience in managing deployments and implementing observability and fault tolerance in cloud based infrastructure (e.g. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
  • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
  • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
  • Experience working with open source orchestration frameworks like Airflow or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile


Apply for this job

8d

Digital Analytics Manager

Nile Bits · Cairo, Egypt, Remote
tableau · airflow · sql · salesforce · Firebase · mobile · qa · docker · python

Nile Bits is hiring a Remote Digital Analytics Manager

Job Description

We’re looking for a hands-on, highly technical and analytically minded individual who can work cross-functionally with the product team, tech team, marketing team and data team to:

  • Identify what and how we should be collecting data (client-side, server-side) to support deep understanding of customer behavior
  • Devise the technical specifications for data collection, writing and QA-ing code where needed
  • Oversee the tracking implementation and QA-ing process end to end
  • Implement processes to ensure tracking stays robust and up to date
  • Maintain compliance and ethical values with regards to user behavioral tracking
  • Ensure our data collection keeps up with the business!

Key Responsibilities

  • Take ownership of all tag implementations in GTM and Server side GTM to feed data to tools and partners such as Snowplow, Google Analytics, Firebase, Criteo, Epsilon.
  • Working closely with Marketing teams to ensure efficient and well structured tracking code
  • Devising and owning new tracking specifications to be implemented
  • Managing all project correspondence with stakeholders
  • Experience of managing tracking implementation projects
  • Set the direction of our digital analytics strategy and enforce best practices
  • Audit the existing client-side/server-side data collection setup, identifying gaps in tracking and processes, identify inefficiencies and opportunities to improve the richness and quality of data collection at every step of the process
  • Responsible for the end to end delivery of tracking projects, this encapsulates data capture, testing/validating results and surfacing data in the data warehouse
  • Maintaining and creating documentation of tracking and processes
  • Maintaining our tracking architecture to ensure we follow best practices and reduce tech debt
  • Set up tracking monitoring processes to ensure we minimize downtime and preserve high quality data
  • Administration and maintenance of various tracking related tools including, but not limited to, Snowplow, GA4, GTM, OneTrust

 

The Deal Breakers

  • Expert technical knowledge of Google Tag Manager and its ecosystem
  • Proven experience setting up and managing complex, large-scale implementations across web and mobile
  • Experience implementing or working with clickstream data
  • Some experience with SQL
  • Comfortable with exploring large datasets, with an emphasis on event data to ensure our tracking is meeting downstream requirements
  • Good understanding of the flow of data from data collection to reporting and insights and the impacts tracking can have on business processes
  • Highly competent in translating and presenting complex technical information to a less informed audience

 

And you are…

  • A doer! Willing to step outside your comfort zone to broaden your skills and learn new technologies
  • Meticulous when it comes to devising processes, documentation and QA work
  • Proactive and highly organized, with strong time management and planning skills
  • Approachable personality, happy to help resolve ad-hoc unscheduled problems
  • Proactive, self-starter mindset; identifying elegant solutions to difficult problems and being able to suggest new and creative approaches
  • Great time management skills with the ability to identify priorities

Qualifications

Nice to have

  • Experience working with Snowplow or other event-level analytics platform is a big plus
  • Experience setting up Server Side Google Tag Manager to reduce page load times
  • Exposure to cloud based data warehousing and modelling
  • Experience setting up analytics integrations with AB testing platforms (we use Optimizely)
  • Knowledge or experience of server-side tracking implementation
  • An engineering mindset looking to leverage modern tools and technologies to drive efficiencies
  • Exposure to Python/R or similar procedural programming language

 

Our data stack

We collect data from dozens of data sources, ranging from transactional data, availability data, payments data, customer event-level data, voice-of-customer data, third-party data, and much more. Our historical data runs into tens of billions of records and grows at a rate of tens of millions of records every day. Our data is extremely varied: some is very finely grained event-level data, some is already aggregated to various degrees. It also arrives on different schedules!

Our tracking infrastructure contains tools such as GTM, SS GTM, Snowplow, GA4.

Our data stack is Python for the data pipeline, Airflow for orchestration, and Snowflake as our data warehousing technology of choice. On top of our warehouse we have Tableau to assist with standardized reporting and self service, and there is also a Tableau embedding within Salesforce.
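
As a rough illustration of how these pieces fit together, here is a minimal Airflow 2.x DAG sketch in the spirit of that stack; the task logic, table name, and stubbed Snowflake load are assumptions for illustration, not our production pipeline.

```python
# Illustrative Airflow 2.x DAG sketch; names and logic are assumed.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_snowflake_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from an upstream source (stubbed here).
        return [{"id": 1, "amount": 42.0}]

    @task
    def load_to_snowflake(records: list[dict]) -> None:
        # A real task would hand off to the Snowflake provider's hook;
        # kept as a stub so the sketch stays self-contained.
        print(f"Would load {len(records)} records into RAW.EVENTS")

    load_to_snowflake(extract())


example_snowflake_pipeline()
```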

Our wider ecosystem of tools and partners includes Iterable, Docker, Branch, GA4, Salesforce, Tableau. Everything runs in AWS.

Our team culture

The data platform team is an enthusiastic group who are passionate about our profession. We continuously maintain our team culture through things like retrospective meetings, weekly socials, an open-door mentality and cross-profession knowledge sharing. We adopt a fail-fast mentality that promotes a safe environment for our team to upskill comfortably. Our team's makeup reflects the company ethos of inclusion and diversity; we are made up of a collection of different people/genders/backgrounds and celebrate our differences. Ultimately we are a team and we work together as one; no individual is solely responsible for any area of our pipeline, and our successes and failures are shared.


Apply for this job

8d

Senior Engineer - ML Ops

Pindrop · US - Remote
ML · Bachelor's degree · remote-first · terraform · airflow · Design · azure · git · c++ · docker · kubernetes · python · AWS

Pindrop is hiring a Remote Senior Engineer - ML Ops

Who we are

Are you passionate about innovating at the intersection of technology and personal security? At Pindrop, we recognize that the human voice is a unique personal identifier, increasingly susceptible to sophisticated fraud, including the threat of deepfakes. We're leading the way in developing cutting-edge authentication, fraud prevention, and deepfake detection. Our mission is to provide seamless and secure digital experiences, safeguarding the most personal aspect of our identity: our voice. Here, you'll be part of a team driven by values of Innovation, Customer Advocacy, Excellence, and Impact. We're not just creating a safer digital landscape by fortifying trust and integrity with those we serve, we’re also building a dynamic, supportive workplace where your contributions make a real difference.

Headquartered in Atlanta, GA, Pindrop is backed by world-class investors such as Andreessen Horowitz, IVP, and CapitalG.

 

What you’ll do

As a Senior Software Engineer, you will play a critical role in the development and maintenance of software applications and systems. You will be responsible for leading and contributing to complex software projects, providing technical expertise, and mentoring junior engineers. You will expand capabilities and bring new solutions to market. As a member of the MLOps team you will be responsible for systems which train models and produce predictions.

 

More specifically, you will:

  • Software Development: Design, develop, test, and maintain our complex software applications, ensuring high-quality code and adherence to best practices. Play a critical role in the development and maintenance of our software products by designing, building, evolving, and scaling state-of-the-art solutions for our Pindrop platform.
  • Technical Leadership: Provide technical leadership and guidance to junior engineers and the development team, including code reviews, architecture decisions, and mentoring. 
  • Architecture and Design: Contribute to the design and architecture of software systems, ensuring scalability, maintainability, and performance
  • Problem Solving: Analyze and solve complex technical problems, and make recommendations for improvements and optimizations.
  • Quality Assurance: Implement and advocate for best practices in testing and quality assurance, including unit testing, integration testing, and automated testing.
  • Code Review: Participate in code reviews and provide constructive feedback to ensure code quality and consistency.
  • Research and Innovation: Stay current with emerging technologies, tools, and programming languages and apply them where relevant to improve software development processes.
  • Security and Compliance: Ensure software adheres to security standards and compliance requirements, addressing vulnerabilities and potential risks.
  • Design and implement cloud solutions, build MLOps on cloud (AWS, Azure, or GCP)
  • Build CI/CD pipeline orchestration with GitLab CI, GitHub Actions, Circle CI, Airflow, or similar tools
  • Data science model review: run code and refactor, optimize, containerize, deploy, version, and monitor its quality (see the sketch after this list)
  • Validate and add automated tests for Data Science models
  • Work closely with a team of researchers and data scientists to productionize and document research innovations
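
As a rough sketch of the model review and versioning work mentioned above, here is a minimal MLflow example (one of the tools named under "Nice to Have" below). The model, metric, and run name are illustrative assumptions, not Pindrop's actual stack.

```python
# Illustrative MLflow run; model and metric choices are assumed.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=1_000).fit(X_tr, y_tr)
    # Log a quality metric so regressions are visible across versions.
    mlflow.log_metric("test_accuracy", model.score(X_te, y_te))
    # Each run gets an immutable ID and artifact entry, giving the team a
    # versioned, auditable history of models and their quality.
    mlflow.sklearn.log_model(model, "model")
```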

 

Who you are

  • You are resilient in the face of challenges, change, and ambiguity
  • You are optimistic and believe that you can make a problem into a solution
  • You are resourceful, excited to uncover innovative solutions and teach yourself something new when needed
  • You take accountability, do the things you say you’ll do, under-promise and over-deliver
  • You are a strong problem-solver with excellent analytical skills.
  • You are an owner and enjoy taking on project leadership as well as mentorship
  • You are a strong verbal and written communicator 

Your skill-set: 

  • Must Have
    • 5-7 Years of Software engineering experience
    • Experience with cloud computing environments, especially AWS and container-based deployment using Docker and Kubernetes
    • Experience working with Python (2-3 years minimum)
    • Experience operating services in production environments
    • A strong understanding of software design principles, software architecture and design patterns as well as software development best practices, including testing, version control, and continuous integration
    • Experience with infrastructure as code tools like Terraform or AWS CDK
    • Experience in monitoring and performance of Production platforms using tech stacks and tools such as Datadog, ELK, Grafana, Prometheus
    • Participation in on-call rotation required
  • Nice to Have
    • Experience with Machine Learning frameworks and libraries such as XGBoost, SciKit-Learn, H2O, TensorFlow, PyTorch, Keras, Spark MLlib
    • Experience with leading industry Machine Learning tools and operation frameworks such as MLflow, Kubeflow, Airflow, Seldon Core, TFServing
    • Experience building microservices and RESTful APIs
    • CI/CD pipelines using tools such as Git, Jenkins.

 

What’s in it for you:

As a Pindropper, you join a rapidly growing company making technology more human with the power of voice. You will work alongside some of the best and brightest. We’re a passionate group committed to excellence - but that doesn’t stop us from enjoying the journey as a team with chess and poker tournaments, catered lunches and happy hours, wellness programming, and more. Because we take our jobs seriously, we add in time for rest with Unlimited PTO, Focus Thursday, and Company-wide Rest Days. 

0-30 (Acclimating)

    • Complete onboarding and attend New Employee Orientation sessions with other new Pindroppers
    • Onboard in the MLOps Team
    • 1:1s with all the team members
    • Get started with your first project, first PR merged

30-60 (Learning)

    • Be a part of planning and contribute to the smaller tasks to fix existing issues
    • Be a part of triaging only the most important issues for the team to be focusing on
    • Add small features/resolve tech debt for the MLOps team

60-90 (Assimilating)

    • Fully acclimated with the team
    • Be able to pick up any task that comes out of sprint planning
    • Be able to design enhancements to the ML platform
    • Teach us something new

 

What we offer 

As a part of Pindrop, you’ll have a direct impact on our growing list of products and the future of security in the voice-driven economy. We hire great people and take care of them. Here’s a snapshot of the benefits we offer:

  • Competitive compensation, including equity for all employees
  • Unlimited Paid Time Off (PTO)
  • 4 company-wide rest days in 2024 where the entire company rests and recharges!
  • Generous health and welfare plans to choose from - including one employer-paid “employee-only” plan!
  • Best-in-class Health Savings Account (HSA) employer contribution
  • Affordable vision and dental plans for you and your family
  • Employer-provided life and disability coverage with additional supplemental options
  • Paid Parental Leave - Equal for all parents, including birth, adoptive & foster parents
    • One year of diaper delivery for your newest addition to the family! It’s our way of welcoming new Pindroplets to the family!
  • Identity protection through Norton LifeLock
  • Remote-first culture with opportunities for in-person team events
  • Recurring monthly home office allowance
  • When we need a break, we keep it fun with happy hours, ping pong and foosball, drinks and snacks, and monthly massages!
  • Remote and in-person team activities (think cheese tastings, chess tournaments, talent shows, murder mysteries, and more!)
  • Company holidays
  • Annual professional development and learning benefit
  • Pick your own Apple MacBook Pro
  • Retirement plan with competitive 401(k) match
  • Wellness Program including Employee Assistance Program, 24/7 Telemedicine

 

What we live by

At Pindrop, our Core Values are fundamental beliefs at the center of all we do. They are our guiding principles that dictate our actions and behaviors. Our Values are deeply embedded into our culture in big and small ways and even help us decide right from wrong when the path forward is unclear. At Pindrop, we believe in taking accountability to make decisions and act in a way that reflects who we are. We truly believe making decisions and acting with our Core Values in mind will help us to achieve our goals and keep Pindrop a great place to work:    

  • Audaciously Innovate - We continue to change the world, and the way people safely engage and interact with technology. As first principle thinkers, we challenge standards, take risks and learn from our mistakes in order to make positive change and continuous improvement. We believe nothing is impossible.
  • Evangelical Customers for Life - We delight, inspire and empower customers from day one and for life. We create a partnership and experience that results in a shared passion.   We are champions for our customers, and our customers become our champions, creating a universal commitment to one another. 
  • Execution Excellence - We do what we say and say what we do. We are accountable for making the tough decisions and necessary tradeoffs to deliver quality and effective solutions on time.
  • Win as a Company - Every time we win, we win as a company. Every time we lose, we lose as a company. We break down silos, support one another, embrace diversity and celebrate our successes. We are better together. 
  • Make a Difference - Every day we have the opportunity to make a positive impact. We operate with dedication, passion, and uncompromising integrity, creating a safer, more secure world.

Not sure if this is you?

We want a diverse, global team, with a broad range of experience and perspectives. If this job sounds great, but you’re not sure if you qualify, apply anyway! We carefully consider every application and will either move forward with you, find another team that might be a better fit, keep in touch for future opportunities, or thank you for your time.

Pindrop is an Equal Opportunity Employer

Here at Pindrop, it is our mission to create and maintain a diverse and inclusive work environment. As an equal opportunity employer, all qualified applicants receive consideration for employment without regard to race, color, age, religion, sex, gender, gender identity or expression, sexual orientation, national origin, genetic information, disability, marital and/or veteran status.

#LI-REMOTE


Apply for this job

10d

Sr Data Engineer GCP

Ingenia Agency · Mexico - Remote
Bachelor's degree · 5 years of experience · 3 years of experience · airflow · sql · api · python

Ingenia Agency is hiring a Remote Sr Data Engineer GCP


At Ingenia Agency we’re looking for a Sr Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Sound understanding of Google Cloud Platform.
  • Should have worked on BigQuery, Workflow or Composer.
  • Should know how to reduce BigQuery costs by reducing the amount of data processed by queries (see the sketch after this list).
  • Should be able to speed up queries by using denormalized data structures, with or without nested repeated fields.
  • Exploring and preparing data using BigQuery.
  • Experience delivering artifacts: Python scripts, Dataflow components, SQL, Airflow, and Bash/Unix scripting.
  • Building and productionizing data pipelines using Dataflow.
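
To make the two BigQuery cost levers above concrete, here is a minimal sketch: selecting only the needed columns, filtering on an assumed partition column, querying a nested repeated field, and using a dry run to check bytes processed before paying for the query. The project, dataset, and schema are made up for illustration.

```python
# Illustrative BigQuery cost check; project/dataset/schema are assumed.
from google.cloud import bigquery

client = bigquery.Client()

# Selecting only needed columns and filtering on the partition column keeps
# the amount of data processed (and billed) down.
query = """
    SELECT order_id, item.sku, item.qty
    FROM `my_project.shop.orders`,       -- assumed partitioned by order_date
         UNNEST(items) AS item           -- items is a nested repeated field
    WHERE order_date = '2024-01-01'
"""

# A dry run reports bytes processed without running (or billing) the query.
job = client.query(query, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Query would process {job.total_bytes_processed:,} bytes")
```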

What are we looking for?

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Age indifferent.
  • 3 to 5 years of experience in GCP is required.
  • Must have excellent GCP, BigQuery and SQL skills.
  • Should have at least 3 years of experience with BigQuery and Dataflow, and experience with Python and Google Cloud SDK API scripting to create reusable frameworks.
  • Candidate should have strong hands-on experience in PowerCenter.
  • In-depth understanding of architecture, table partitioning, clustering, table types, and best practices.
  • Proven experience as a Data Engineer, Software Developer, or similar.
  • Expert proficiency in Python, R, and SQL.
  • Candidates with Google Cloud certification will be preferred
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Statutory benefits:
    • 10 days of vacation upon completing your first year
    • IMSS
  • Additional benefits:
    • Contigo Membership (Insurance of minor medical expenses)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance with preferential price, among others.
    • 3 special half-day permits a year for personal errands or appointments
    • Half day off for birthdays
    • 5 additional vacation days in case of marriage
    • 50% scholarship for language courses at the Anglo
    • Partial scholarship for graduate or master's studies with Tec. de Mty.
    • Agreement with a ticketing company for preferential rates on entertainment events.


Apply for this job

10d

Data Engineer

Charlotte Tilbury · London, England, United Kingdom, Remote Hybrid
terraform · airflow · sql · Design · git · python · AWS · javascript

Charlotte Tilbury is hiring a Remote Data Engineer

About Charlotte Tilbury Beauty

Founded by British makeup artist and beauty entrepreneur Charlotte Tilbury MBE in 2013, Charlotte Tilbury Beauty has revolutionised the face of the global beauty industry by de-coding makeup applications for everyone, everywhere, with an easy-to-use, easy-to-choose, easy-to-gift range. Today, Charlotte Tilbury Beauty continues to break records across countries, channels, and categories and to scale at pace.

Over the last 10 years, Charlotte Tilbury Beauty has experienced exceptional growth and is one of the most talked about brands in the beauty industry and beyond. It has become a global sensation across 50 markets (and growing), with over 2,300 employees globally who are part of the Dream Team making the magic happen.

Today, Charlotte Tilbury Beauty is a truly global business, delivering market-leading growth, innovative retail and product launches fuelled by industry-leading tech — all with an internal culture of embracing challenges, disruptive thinking, winning together, and sharing the magic. The energy behind the brand is infectious, and as we grow, we are always looking for extraordinary talent who want to be part of our success and help drive our limitless ambitions.

The Role

 

Data is at the heart of our strategy to engage and delight our customers, and we are determined to harness its power to go as far as we can to deliver a euphoric, personalised experience that they'll love. 

 

We're seeking a skilled and experienced Data Engineer to join our Data function's team of data engineers in the design, build & maintenance of the pipelines that support this ambition. The ideal candidate will not only be able to see many different routes to engineering success, but also work collaboratively with Engineers, Analysts, Scientists & stakeholders to design & build robust data products that meet business requirements.

 

Our stack is primarily GCP, with Fivetran handling change data capture, Google Cloud Functions for file ingestion, Dataform & Composer (Airflow) for orchestration, GA & Snowplow for event tracking, and Looker as our BI platform. We use Terraform Cloud to manage our infrastructure programmatically as code.
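
As a rough illustration of the file-ingestion step in that stack, here is a minimal sketch of a GCS-triggered Cloud Function that loads a newly landed file into BigQuery. The bucket, destination table, and CSV format are assumptions for illustration, not our production configuration.

```python
# Illustrative Cloud Function; bucket, table, and format are assumed.
import functions_framework
from google.cloud import bigquery


@functions_framework.cloud_event
def ingest_file(cloud_event):
    """Triggered when a file lands in the ingestion bucket."""
    data = cloud_event.data
    uri = f"gs://{data['bucket']}/{data['name']}"

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,  # infer schema; a real pipeline would pin one
    )
    # Load the file into a raw landing table for downstream Dataform models.
    client.load_table_from_uri(
        uri, "raw.landing_table", job_config=job_config
    ).result()
```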

 

Reporting Relationships

 

This role will report into the Lead Data Engineer

 

About you and attributes we're looking for



  • Extensive experience with cloud data warehouses and analytics query engines such as BigQuery, Redshift or Snowflake, and a good understanding of cloud technologies in general. 
  • Proficient in SQL, Python and Git 
  • Prior experience with HCL (Terraform configuration language), YAML, JavaScript, CLIs and Bash.
  • Prior experience with serverless tooling e.g. Google Cloud Functions, AWS Lambdas, etc.
  • Familiarity with tools such as Fivetran and Dataform/DBT 
  • Bachelor's or Master's degree in Computer Science, Data Science, or related field 
  • Collaborative mindset and a passion for sharing ideas & knowledge
  • Demonstrable experience developing high quality code in the retail sector is a bonus

At Charlotte Tilbury Beauty, our mission is to empower everybody in the world to be the most beautiful version of themselves. We celebrate and support this by encouraging and hiring people with diverse backgrounds, cultures, voices, beliefs, and perspectives into our growing global workforce. By doing so, we better serve our communities, customers, employees - and the candidates that take part in our recruitment process.

If you want to learn more about life at Charlotte Tilbury Beauty please follow our LinkedIn page!


Apply for this job

11d

Staff Ledger Operations Engineer

Gemini · Remote (USA)
remote-first · airflow · sql · Design · java · python

Gemini is hiring a Remote Staff Ledger Operations Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Customer Support (Ledger Operations)

As a team within the Support group, the Ledger Operations team is data driven and customer-centric. Team members work closely with data scientists, engineers, product managers, and corporate operational stakeholders to reconcile data. Projects range from very urgent, short sprints to redesigning long-term solutions to improve scalability and automation. Ledger Operations' primary goal is to ensure high-quality, scalable data reconciliation with proactive monitoring, reducing delays for reactive corrections. This team is responsible for building the “next generation” of reconciliation tools to maintain and expand Gemini’s internal reconciliation processes.

The Role: Staff Ledger Operations Engineer

Responsibilities:

  • Mentor engineers while also self-managing as an individual contributor
  • Design data pipelines and automate ETL processes and SQL optimization, which will influence the “next generation” of Gemini’s ledger reconciliation processes and reporting (including data warehouse changes, if necessary)
  • Partner with third-party vendors, the Data Analytics team, and Crypto Core engineers on data processing
  • Be responsible for maintaining and creating data adaptors to process real-time data, creating data validation processes, reporting, and performing root cause analysis for exception reports

Minimum Qualifications:

  • 7+ years experience with schema design and dimensional data modeling
  • 7+ years designing and implementing reconciliation systems while improving existing data pipelines
  • Must have experience with cryptocurrency data 
  • Must have experience in trade or ledger reconciliation
  • Must have advanced SQL skills and database design experience 
  • Experience building real-time data solutions and processes to automate reconciliation analysis and reporting
  • Experience building and integrating web analytics solutions
  • Experience with FIX, Kafka, REST, and other data messaging types
  • Experience and expertise in Airflow, Databricks, Spark, Hadoop etc.
  • Skilled in programming languages Python and/or Java
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Extensive ETL and database experience with financial transaction systems (banking systems, exchange systems)
  • Extensive experience in financial reporting for “above the line” revenue for building annual, audited financials
  • Strong technical and business communication skills

Preferred Qualifications:

  • Experience with financial reporting requirements for publicly traded companies

It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $152,000 and $190,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

Apply for this job

14d

Senior Data Engineer (Portfolio Companies)

IFS · Colombo, Sri Lanka, Remote
S3 · EC2 · golang · 6 years of experience · agile · nosql · airflow · sql · Design · mongodb · docker · elasticsearch · jenkins · AWS

IFS is hiring a Remote Senior Data Engineer (Portfolio Companies)

Job Description

  • Design, develop, and maintain a generic ingestion framework capable of processing various types of data (structured, semi-structured, unstructured) from customer sources.
  • Implement and optimize ETL (Extract, Transform, Load) pipelines to ensure data integrity, quality, and reliability as data flows into a centralized datastore like Elasticsearch (see the sketch after this list).
  • Ensure the ingestion framework is scalable, secure, efficient and capable of handling large volumes of data in real-time or batch processes.
  • Continuously monitor and enhance the data ingestion process to improve performance, reduce latency, and handle new data sources and formats.
  • Develop automated testing and monitoring tools to ensure the framework operates smoothly and can quickly adapt to changes in data sources or requirements.
  • Provide documentation, support, and training to other team members and stakeholders on using the ingestion framework.
  • Implement large-scale near real-time streaming data processing pipelines.
  • Design, support and continuously enhance the project code base, continuous integration pipeline, etc.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
  • Perform POCs and evaluate different technologies and continue to improve the overall architecture.
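
As a rough sketch of the ingestion step described above, here is a minimal example that normalizes heterogeneous records and bulk-indexes them into Elasticsearch. The index name, host, and normalization logic are illustrative assumptions, not the actual framework.

```python
# Illustrative bulk ingestion; host, index, and normalize logic are assumed.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")  # assumed local cluster


def normalize(record: dict) -> dict:
    # Coerce each source record into a common shape; a real framework would
    # dispatch per source type (structured, semi-structured, unstructured).
    return {"source": record.get("source", "unknown"), "payload": record}


def ingest(records: list[dict], index: str = "customer-data") -> int:
    actions = ({"_index": index, "_source": normalize(r)} for r in records)
    success, _ = bulk(es, actions)  # batches documents into _bulk requests
    return success


print(ingest([{"source": "crm", "id": 1}, {"source": "erp", "id": 2}]))
```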

Qualifications

  • Experience building and optimizing Big Data data pipelines, architectures and data sets.
  • Strong proficiency in Elasticsearch, its architecture and optimal querying of data.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data systems.
  • One or more years of experience contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems.
  • Candidates must have 4 to 6 years of experience in a Data Engineer role with a Bachelor's or Master's (preferred) in Computer Science, Information Systems, or an equivalent field, and should have knowledge of the following technologies/tools:
    • Experience working on Big Data processing systems like Hadoop, Spark, Spark Streaming, or Flink Streaming.
    • Experience with SQL systems like Snowflake or Redshift
    • Direct, hands-on experience in two or more of these integration technologies: Java/Python, React, Golang, SQL, NoSQL (Mongo), RESTful APIs.
    • Versed in Agile, APIs, Microservices, Containerization, etc.
    • Experience with CI/CD pipelines running on GitHub, Jenkins, Docker, EKS.
    • Knowledge of at least one distributed datastore like MongoDB, DynamoDB, HBase.
    • Experience using batch scheduling frameworks like Airflow (preferred), Luigi, Azkaban, etc. is a plus.
    • Experience with AWS cloud services: EC2, S3, DynamoDB, Elasticsearch


Apply for this job

15d

Analytics Engineer

CLEAR - Corporate · New York, New York, United States (Hybrid)
airflow · sql · Design · python

CLEAR - Corporate is hiring a Remote Analytics Engineer

At CLEAR, we are pioneers in digital and biometric identification, known for reducing friction wherever identity verification is needed. Now, we’re evolving further, building the next generation of products to go beyond ID, empowering our members to harness the power of a networked digital identity. As an Analytics Engineer, you will play a pivotal role in designing and enhancing our data platform, ensuring it supports data-driven insights while safeguarding member privacy and security.


A brief highlight of our tech stack:

  • SQL / Python / Looker / Snowflake / Dagster / dbt
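
As a rough illustration of how Dagster and Python fit together in a stack like this, here is a minimal sketch of two dependent Dagster assets. The asset names and transformation are illustrative assumptions, not CLEAR's actual pipeline.

```python
# Illustrative Dagster assets; names and logic are assumed.
from dagster import Definitions, asset


@asset
def raw_members() -> list[dict]:
    # Ingestion step: in practice this would read from a source system.
    return [{"id": 1, "active": True}, {"id": 2, "active": False}]


@asset
def active_members(raw_members: list[dict]) -> list[dict]:
    # Transformation step: this asset depends on raw_members by parameter
    # name, so Dagster builds the dependency graph automatically.
    return [m for m in raw_members if m["active"]]


defs = Definitions(assets=[raw_members, active_members])
```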

What you'll do:

  • Design and maintain scalable, self-service data platforms enabling Analysts and Engineers to drive automation, testing, security, and high-quality analytics.
  • Develop robust processes for data transformation, structuring, metadata management, and workflow optimization.
  • Own and manage end-to-end data pipelines—from ingestion to transformation, modeling, and visualization—ensuring high data quality.
  • Collaborate with stakeholders across product and business teams to understand requirements and deliver actionable insights.
  • Lead the development of data models and analytics workflows that support strategic decision-making and reporting.
  • Maintain a strong focus on privacy, ensuring that member data is used securely and responsibly.
  • Drive architectural improvements in data processes, continuously improving CLEAR’s data infrastructure.

 What you're great at:

  • 6+ years of experience in data engineering, with a focus on data transformation, analytics, and cloud-based solutions.
  • Proficient in building and managing data pipelines using orchestration tools (Airflow, Dagster) and big data tools (Spark, Kafka, Snowflake, Databricks).
  • Expertise in modern data tools like dbt and data visualization platforms like Looker and Tableau.
  • Ability to communicate complex technical concepts clearly to both technical and non-technical stakeholders.
  • Experience mentoring and collaborating with team members to foster a culture of learning and development.
  • Comfortable working in a dynamic, fast-paced environment with a passion for leveraging data to solve complex business challenges.

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $145,000 - $175,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

See more jobs at CLEAR - Corporate

Apply for this job

15d

Cloud Data Engineer

Devoteam, Tunis, Tunisia, Remote
airflow, sql, scrum

Devoteam is hiring a Remote Cloud Data Engineer

Job Description

Within the “Data Platform” department, the consultant will join a SCRUM team and focus on a specific functional scope.

Your role will be to contribute to data projects by bringing your expertise to the following tasks:

  • Design, develop, and maintain robust, scalable data pipelines on Google Cloud Platform (GCP), using tools such as BigQuery, Airflow, Looker, and dbt (see the sketch after this list).
  • Collaborate with business teams to understand data requirements and design appropriate solutions.
  • Optimize data processing and ELT performance using Airflow, dbt, and BigQuery.
  • Implement data quality processes to guarantee data integrity and consistency.
  • Work closely with engineering teams to integrate data pipelines into existing applications and services.
  • Stay up to date with new technologies and best practices in data processing and analytics.
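
For orientation, a scheduled BigQuery transformation of the kind described above might be wired through Airflow's Google provider as sketched below; the DAG id, dataset, and query are illustrative assumptions, not details from the listing.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_elt",               # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        build_orders_mart = BigQueryInsertJobOperator(
            task_id="build_orders_mart",
            configuration={
                "query": {
                    # Dataset and table names are illustrative only.
                    "query": """
                        CREATE OR REPLACE TABLE analytics.orders_mart AS
                        SELECT order_id, SUM(amount) AS total_amount
                        FROM raw.orders
                        GROUP BY order_id
                    """,
                    "useLegacySql": False,
                },
            },
        )

In practice the SQL itself would usually live in a dbt model, with Airflow only triggering the run.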

 

    Qualifications

    Skills

    What strengths will help you join the team?

    A master's-level degree (Bac+5) from an engineering school or university equivalent, with a specialization in computer science.

    • At least 2 years of experience in data engineering, with significant experience in a GCP-based cloud environment.
    • Advanced SQL proficiency for data processing and optimization.
    • Google Professional Data Engineer certification is a plus.
    • Excellent written and oral communication (high-quality deliverables and reporting).

    So, if you want to progress, learn, and share, join us!

    See more jobs at Devoteam

    Apply for this job

    16d

    Data Quality Analyst - Remote

    Paramo Technologies, Buenos Aires, AR - Remote
    airflow, sql, python

    Paramo Technologies is hiring a Remote Data Quality Analyst - Remote

    To apply for this position, you must be based in the Americas, preferably Latin America (the United States of America is not applicable). Applications from other locations will be disqualified from this selection process.

    We are

    a cutting-edge e-commerce company developing products for our own technological platform. Our creative, smart and dedicated teams pool their knowledge and experience to find the best solutions to meet project needs, while maintaining sustainable and long-lasting results. How? By making sure that our teams thrive and develop professionally. Strong advocates of hiring top talent and letting them do what they do best, we strive to create a workplace that allows for an open, collaborative and respectful culture.

    What you will be doing

    As a Data Quality Analyst, your primary responsibility is to ensure that the data used by the company is accurate, complete, and consistent. High-quality data elevates business decision-making, improves operational efficiency, and enhances customer satisfaction. This role also provides continuous automated and manual testing of data sets for use in internal data systems and for delivery from internal systems.

    As part of your essential functions you will have to:

    • Identify data quality issues and work with other teams to resolve them.
    • Establish data quality standards and metrics: The data analyst/engineer would work with stakeholders to define the quality standards that data must meet, and establish metrics to measure data quality.
    • Monitor data quality: The data analyst/engineer would monitor data quality on an ongoing basis, using automated tools and manual checks to identify issues.
    • Investigate data quality issues: When data quality issues are identified, the analyst/engineer would investigate them to determine their root cause and work with other teams to resolve them.
    • Develop data quality processes: The analyst/engineer would develop processes to ensure that data is checked for quality as it is collected and processed, and that data quality issues are addressed promptly. This stage includes using tools and technology to make data checks more efficient.
    • Train stakeholders: The analyst/engineer would train other stakeholders in the company on data quality best practices, to ensure that everyone is working towards the same quality goals.
    • Promote improvements in the development process in order to ensure the integrity and availability of the data.

    Some other responsibilities are:

    • Provide technical directions and mentor other data engineers about data quality.
    • Perform data validity, accuracy, and integrity test across different components of the Data Platform.
    • Build an automated test framework and tools to automate the Data Platform services and applications.
    • Automate regression tests and perform functional, integration, and load testing.
    • Articulate issues to BI developers, data engineers, and analysts during meetings and particularly in the daily standups.
    • Triage production-level issues when data is affected, working with the involved teams until the issues are resolved (see the check sketch below).
    • Proactively solve problems and suggest process improvements.
    • Provide test case coverage and defect metrics to substantiate release decisions.
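
    As a small, runnable illustration of the automated checks described above, the sketch below runs two SQL data quality checks (completeness and uniqueness) against a stand-in SQLite table; in a real pipeline the same queries would target the Data Platform's warehouse, and all names here are invented.

        import sqlite3

        # In-memory stand-in for a warehouse table.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")
        conn.executemany(
            "INSERT INTO orders VALUES (?, ?, ?)",
            [(1, 19.99, "AR"), (2, None, "BR"), (2, 5.00, "AR")],
        )

        checks = {
            # Completeness: amounts must never be NULL.
            "null_amounts": "SELECT COUNT(*) FROM orders WHERE amount IS NULL",
            # Uniqueness: order_id must not repeat.
            "duplicate_ids": """
                SELECT COUNT(*) FROM (
                    SELECT order_id FROM orders
                    GROUP BY order_id HAVING COUNT(*) > 1
                )
            """,
        }

        for name, sql in checks.items():
            failures = conn.execute(sql).fetchone()[0]
            status = "PASS" if failures == 0 else f"FAIL ({failures} rows)"
            print(f"{name}: {status}")

    The same pattern scales into a regression suite by running the checks on a schedule (e.g., from Airflow) and alerting on failures.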

    Knowledge and skills you need to have

    • Bachelor's degree in Computer Science or Information Systems, or 2+ years’ experience with corporate data management systems in high-compliance contexts.
    • 2+ years of experience writing complex SQL on large customer data sets (complex queries).
    • High proficiency in relational or non-relational databases.
    • Knowledgeable about industry data compliance strategies and practices, such as continuous integration, regression testing, and versioning.
    • Familiarity with Big Data environments, dealing with large diverse data sets.
    • Experience with BI projects.
    • Strong scripting experience in at least one scripting language.
    • Accountability and openness to taking on challenges.
    • Excellent communication skills, with the ability to drive and collaborate across teams.

    Bonus points for the following

    Additional requirements, not essential but "nice to have".

    • Python experience (for data analysis and Airflow)

    Why choose us?

    We provide the opportunity to be the best version of yourself, develop professionally, and create strong working relationships, whether working remotely or on-site. While offering a competitive salary, we also invest in our people's professional development and want to see you grow and love what you do. We are dedicated to listening to our team's needs and are constantly working on creating an environment in which you can feel at home.

    We offer a range of benefits to support your personal and professional development:

    Benefits:

    • 22 days of annual leave.
    • 10 days of public/national holidays.
    • Maternity and paternity leave.
    • Health insurance options.
    • Access to online learning platforms.
    • On-site English classes in some countries, and many more.

    Join our team and enjoy an environment that values and supports your well-being. If this sounds like the place for you, contact us now!

    See more jobs at Paramo Technologies

    Apply for this job

    17d

    Senior Software Engineer (Generative AI)

    Experian, Costa Mesa, CA, Remote
    ML, Lambda, jira, terraform, airflow, sql, slack, python, AWS, javascript, Node.js

    Experian is hiring a Remote Senior Software Engineer (Generative AI)

    Job Description

    The Experian Consumer Services Generative AI team is accelerating Experian's impact by bringing together data, technology, and data science to build game-changing products and services for our customers. We are looking for a Senior Software Engineer, reporting to the Head of AI/ML Innovation, to support developing and integrating our Generative AI Models with Consumer Services products. These new capabilities will help provide Financial Power to All our customers. As a growing team, we embrace a startup mentality while operating in a large organization. We value speed and impact – and our results and ways of working are transforming the culture of the larger organizations around us.

    Role accountabilities and essential activities

    • You'll develop generative AI solutions and integrate them with products built by existing software teams
    • Develop a scalable machine learning framework for data science products
    • You will develop scalable pipelines, tools, and services for building production-ready machine-learning models
    • Work with our data scientists to pilot our products with beta customers
    • Maintain our culture of simple, streamlined code and full CI/CD automation
    • Develop simple, streamlined, and well-tested ML pipeline components (see the tracking sketch below)
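
    As one hedged sketch of the experiment tracking implied by this stack (MLflow appears under orchestration frameworks in the qualifications below), a pipeline component might log its parameters and evaluation results like this; the experiment name, parameters, and metric are illustrative.

        import mlflow

        # Assumes a local ./mlruns tracking store; production would point
        # mlflow.set_tracking_uri() at a shared server.
        mlflow.set_experiment("genai-prompt-eval")   # hypothetical name

        with mlflow.start_run():
            mlflow.log_param("model", "llama3-8b")
            mlflow.log_param("temperature", 0.2)
            # In a real pipeline this value would come from an evaluation harness.
            mlflow.log_metric("answer_accuracy", 0.87)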

    Qualifications

    • You have 5+ years of experience
    • Strong coding experience in Python, with some familiarity with PySpark and SQL
    • Familiarity with popular Python libraries such as pandas, numpy, flask, matplotlib
    • Familiarity with the AWS platform and services, including CI/CD automation methods
    • Familiarity with AWS serverless methodology, particularly Fargate, Lambda, ECR
    • Familiarity with CloudFormation, Terraform, or equivalent
    • You have experience working with machine learning or generative AI libraries such as LangChain, LlamaIndex, LangSmith, and Llama Guard
    • Open-source foundation models: Llama 3, Llama 2, Mistral, Falcon, Phi
    • Orchestration frameworks: MLflow, Airflow
    • Some knowledge of JavaScript (Node.js) as a front end
    • You have experience or familiarity with Databricks AI services and Mosaic-branded services
    • Comfortable supporting troubleshooting and assessment of production issues when needed
    • You are experienced with monitoring tools such as Splunk, Datadog, Dynatrace
    • Test-writing discipline in standard development tools and processes, e.g., GitHub, Jira, Slack
    • Record of building and maintaining large-scale software systems in production

    See more jobs at Experian

    Apply for this job

    18d

    Data Engineer

    Clover Health, Remote - Canada
    ML, remote-first, tableau, airflow, postgres, sql, Design, qa, c++, python, AWS

    Clover Health is hiring a Remote Data Engineer

    At Clover, the Business Enablement team spearheads our technological advancement while ensuring robust security and compliance. We deliver user-friendly corporate applications, manage complex data ecosystems, and provide efficient tech solutions across the organization. Our goal is simple: we make it easy for the business to do what’s right for Clover.

    We are looking for a Data Engineer to join our team. You'll work on the development of data pipelines and tools to support our analytics and machine learning development. Applying insights through data is a core part of our thesis as a company — and you will work on a team that is a central part of helping to deliver that promise through making a wide variety of data easily accessible for internal and external consumers. We work primarily in SQL, Python and our data is stored primarily in Snowflake. You will work with data analysts, other engineers, and healthcare professionals in a unique environment building tools to improve the health of real people. You should have extensive experience leading data warehousing projects with advanced knowledge in data cleansing, ingestion, ETL and data governance.

    As a Data Engineer, you will:

    • Collaborate closely with operations, IT and vendor partners to understand the data landscape and contribute to the vision, development and implementation of the Data Warehouse solution.
    • Recommend technologies and tools to support the future state architecture.
    • Develop standards, processes and procedures that align with best practices in data governance and data management.
    • Be responsible for logical and physical data modeling, load and query performance.
    • Develop new secure data feeds with external parties as well as internal applications.
    • Perform regular analysis and QA, diagnose ETL and database related issues, perform root cause analysis, and recommend corrective actions to management.
    • Work with cross-functional teams to support the design, development, implementation, monitoring, and maintenance of new ETL programs.

    Success in this role looks like:

    • First 90 days:
      • Develop a strong understanding of our existing data ecosystem and data pipelines.
      • Build relationships with stakeholder departments to understand their day-to-day operations and their usage of and need for Data Eng products.
      • Contribute to the design and implementation of new ETL programs to support Clover's growth and operational efficiency.
      • Perform root cause analysis after issues are identified and propose both short-term and long-term fixes to increase the stability and accuracy of our pipelines.
    • First 6 months:
      • Provide feedback and propose opportunities for improvement on current data engineering processes and procedures.
      • Work with platform engineers on improving data ecosystem stability, data quality monitoring and data governance.
      • Lead discussions with key stakeholders; propose, design, and implement new data eng projects that solve critical business problems.
    • How will success be measured in the future?
      • Continue the creation and management of ETL programs and data assets.
      • Be the technical Data Eng lead of our data squad’s day to day operation.
      • Guide and mentor other junior members of the team.

    You should get in touch if:

    • You have a Bachelor’s degree in Computer Science or related field along with 5+ years of experience in ETL programming.
    • You have professional experience working in a healthcare setting. Health Plan knowledge highly desired, Medicare preferred.  
    • You have expertise in most of these technologies: 
      • Python 
      • Snowflake 
      • DBT
      • Airflow 
      • GCP
      • AWS
      • BigQuery
      • Postgres 
      • Data Governance 
      • Some experience with analytics, data science, and ML collaboration tools such as Tableau, Mode, Looker (a warehouse-load sketch follows this list)
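
    As a sketch of the kind of idempotent warehouse load this role involves, the snippet below uses the snowflake-connector-python package to run a MERGE so that re-running the load cannot duplicate rows; connection details and table names are placeholders, and in practice the statement would be orchestrated by Airflow or expressed as a dbt model.

        import os

        import snowflake.connector  # snowflake-connector-python package

        # All connection values are placeholders; real ones come from a secret store.
        conn = snowflake.connector.connect(
            account="my_account",
            user="etl_user",
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",
            database="ANALYTICS",
            schema="STAGING",
        )

        # Idempotent upsert from a staging table into the target.
        conn.cursor().execute("""
            MERGE INTO members AS tgt
            USING members_stage AS src
              ON tgt.member_id = src.member_id
            WHEN MATCHED THEN UPDATE SET tgt.status = src.status
            WHEN NOT MATCHED THEN INSERT (member_id, status)
                                  VALUES (src.member_id, src.status)
        """)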

    #LI-Remote

    Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. We are an E-Verify company.


    Benefits Overview:

    • Financial Well-Being: Our commitment to attracting and retaining top talent begins with a competitive base salary and equity opportunities. Additionally, we offer a performance-based bonus program and regular compensation reviews to recognize and reward exceptional contributions.
    • Physical Well-Being: We prioritize the health and well-being of our employees and their families by offering comprehensive group medical coverage that includes hospitalization, outpatient care, optical services, and dental benefits.
    • Mental Well-Being: We understand the importance of mental health in fostering productivity and maintaining work-life balance. To support this, we offer initiatives such as No-Meeting Fridays, company holidays, access to mental health resources, and a generous annual leave policy. Additionally, we embrace a remote-first culture that supports collaboration and flexibility, allowing our team members to thrive from any location. 
    • Professional Development: We are committed to developing our talent professionally. We offer learning programs, mentorship, professional development funding, and regular performance feedback and reviews.

    Additional Perks:

    • Reimbursement for office setup expenses
    • Monthly cell phone & internet stipend
    • Flexibility to work from home, enabling collaboration with global teams
    • Paid parental leave for all new parents
    • And much more!

    About Clover: We are reinventing health insurance by combining the power of data with human empathy to keep our members healthier. We believe the healthcare system is broken, so we've created custom software and analytics to empower our clinical staff to intervene and provide personalized care to the people who need it most.

    We always put our members first, and our success as a team is measured by the quality of life of the people we serve. Those who work at Clover are passionate and mission-driven individuals with diverse areas of expertise, working together to solve the most complicated problem in the world: healthcare.

    From Clover’s inception, Diversity & Inclusion have always been key to our success. We are an Equal Opportunity Employer and our employees are people with different strengths, experiences and backgrounds, who share a passion for improving people's lives. Diversity not only includes race and gender identity, but also age, disability status, veteran status, sexual orientation, religion and many other parts of one’s identity. All of our employee’s points of view are key to our success, and inclusion is everyone's responsibility.


    See more jobs at Clover Health

    Apply for this job

    19d

    Senior Data Engineer (with Spark, Airflow)

    Accesa - Ratiodata, Employee can work remotely, Romania, Remote
    agile, airflow, sql, Design, java, postgresql, kubernetes, python, backend

    Accesa - Ratiodata is hiring a Remote Senior Data Engineer (with Spark, Airflow)

    Job Description

    One of our clients operates prominently in the financial sector, where we enhance operations across their extensive network of 150,000 workstations and support a workforce of 4,500 employees. As part of our commitment to optimizing data management strategies, we are migrating data warehouse (DWH) models into data products within the Data Integration Hub (DIH). 

    Responsibilities:

    • Drive Data Efficiency: Create and maintain optimal data transformation pipelines.  

    • Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.  

    • Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability. 

    • Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies. 

    • Unlock Actionable Insights: Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.  

    • Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. 

    Qualifications

    Must have:

    • 5+ years of experience in a similar role, preferably within Agile teams 

    • Skilled in SQL and relational databases for data manipulation 

    • Experience in building and optimizing Big Data pipelines and architectures (a PySpark sketch follows this list) 

    • Familiarity with innovative technologies in message queuing, stream processing, and scalable big data storage solutions 

    • Knowledge of Apache Spark framework and object-oriented programming in Java; experience with Python is a plus.   

    • Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement.  

    • Experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow (or similar) 

    • Experience automating CI/CD pipelines using ArgoCD, Tekton, and Helm to streamline deployment and improve efficiency across the SDLC 

    • Experience managing Kubernetes deployments (e.g. OpenShift), focusing on scalability, security, and optimized container orchestration 

    • Strong analytical skills in working with both structured and unstructured data 
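
    For illustration, a minimal PySpark transformation in the spirit of the pipeline work described above; the source path, column names, and output layout are assumptions made for the example.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("dwh_to_dih").getOrCreate()

        # Hypothetical source table of account postings.
        postings = spark.read.parquet("s3a://dwh/postings/")

        daily_balances = (
            postings
            .filter(F.col("status") == "booked")
            .groupBy("account_id", "booking_date")
            .agg(F.sum("amount").alias("balance_delta"))
        )

        # Partitioning the output keeps downstream reads selective.
        daily_balances.write.mode("overwrite").partitionBy("booking_date").parquet(
            "s3a://dih/daily_balances/"
        )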

     

    Nice to have:  

    • Expertise in processing large, disconnected datasets to extract actionable insights 

    • Technical skills in the following areas are a plus: relational databases (e.g., PostgreSQL), big data tools (e.g., Databricks), workflow management (e.g., Airflow), and backend development using Spring Boot. 

    Apply for this job

    19d

    Data Engineer (with Spark, Airflow)

    Accesa - Ratiodata, Remote, Romania, Remote
    agile, airflow, sql, Design, java, postgresql, kubernetes, python, backend

    Accesa - Ratiodata is hiring a Remote Data Engineer (with Spark, Airflow)

    Job Description

    One of our clients operates prominently in the financial sector, where we enhance operations across their extensive network of 150,000 workstations and support a workforce of 4,500 employees. As part of our commitment to optimizing data management strategies, we are migrating data warehouse (DWH) models into data products within the Data Integration Hub (DIH). 

    Responsibilities:  

    • Drive Data Efficiency: Create and maintain optimal data transformation pipelines.  

    • Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.  

    • Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability. 

    • Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies. 

    • Unlock Actionable Insights: Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.  

    • Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. 

    Qualifications

    Must have: 

    • 3+ years of experience in a similar role, preferably within Agile teams.  

    • Strong analytical skills in working with both structured and unstructured data 

    • Skilled in SQL and relational databases for data manipulation.  

    • Experience in building and optimizing Big Data pipelines and architectures.  

    • Knowledge of Apache Spark framework and object-oriented programming in Java; experience with Python is a plus.   

    • Experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow (or similar). 

    • Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement.  

    Nice to have:  

    • Expertise in manipulating and processing large, disconnected datasets to extract actionable insights 

    • Experience automating CI/CD pipelines using ArgoCD, Tekton, and Helm to streamline deployment and improve efficiency across the SDLC 

    • Experience managing Kubernetes deployments on OpenShift, focusing on scalability, security, and optimized container orchestration 

    • Technical skills in the following areas are a plus: relational databases (e.g., PostgreSQL), big data tools (e.g., Databricks), workflow management (e.g., Airflow), and backend development using Spring Boot. 

    Apply for this job

    20d

    Staff AV Software Systems Engineer

    Cruise, US Remote
    Bachelor's degree, airflow, sql, Design, c++, python

    Cruise is hiring a Remote Staff AV Software Systems Engineer

    We're Cruise, a self-driving service designed for the cities we love.

    We’re building the world’s most advanced self-driving vehicles to safely connect people to the places, things, and experiences they care about. We believe self-driving vehicles will help save lives, reshape cities, give back time in transit, and restore freedom of movement for many.

    In our cars, you’re free to be yourself. It’s the same here at Cruise. We’re creating a culture that values the experiences and contributions of all of the unique individuals who collectively make up Cruise, so that every employee can do their best work. 

    Cruise is committed to building a diverse, equitable, and inclusive environment, both in our workplace and in our products. If you are looking to play a part in making a positive impact in the world by advancing the revolutionary work of self-driving cars, come join us. Even if you might not meet every requirement, we strongly encourage you to apply. You might just be the right candidate for us.

    This team is responsible for the Autonomous Vehicle Systems Engineering and acts as a technical liaison between Product, Program, Legal, Engineering, and Test. This group includes Systems Engineers that work across Cruise on end-to-end system design. They provide the specific analysis, tools, and translation of requirements into a form that is easily consumed by the engineers who design and build these systems.

    WHAT YOU’LL BE DOING:

    • Define requirements, metrics, and simulation testing to validate safe, legal, and comfortable Autonomous Vehicle operations

    • Perform analysis of Autonomous Vehicle driving behaviors to characterize vehicle systems performance, validate systems engineering requirements, influence development of the autonomy stack, and enable release and launch decisions by developing simulation testing and on-road monitoring

    • Apply risk management tools and methodologies to drive decision making

    • Combine experience in Systems Engineering and data analysis to facilitate data driven design, validation, risk analysis, and prioritization decisions

    • Lead and communicate data analyses, trade studies, and safety analysis that provide teams with all the information needed to develop and continuously improve the systems function and performance

    • Develop methods to analyze and compare the impact of software changes on top level systems engineering requirements

    • Own roadmaps for developing long-term and stable requirements, verification/validation, and/or software solutions to advanced problems

    • Mentor team members in support of successful delivery of roadmaps or technical solutions

    The salary range for this position is $166,600 - $245,000. Compensation will vary depending on location, job-related knowledge, skills, and experience. You may also be offered a bonus, long-term incentives, and benefits. These ranges are subject to change.

    WHAT YOU MUST HAVE:

    • BS, MS, or PhD in Mechanical Engineering, Aerospace Engineering, Physics, Computer Science, Data Science, or another related field

    • 10+ years of experience in systems engineering, robotics, systems analysis, data analysis, statistical analysis, or other related fields

    • Proficient with SQL, Python, and C++ or similar to analyze large data sets coming out of hardware systems

    • Excellent critical and analytical thinking skills: you can interpret what the data is telling us, and which analyses will reveal impactful conclusions

    • Fluent in data-based and analytical engineering problem-solving practices, including Pareto analyses, root cause analysis, and more

    • Proficient in statistics and probability 

    • Ability to understand complex technical systems

    • Ability to adapt and operate under ambiguity

    • Ability to prioritize while staying on top of the tasks at hand

    • Strong technical and nontechnical communication skills

    • Comfortable at context switching between executive view and deep technical details and solving technical challenges that have not been previously encountered in the organization

    BONUS POINTS!

    • Experience with robotics, autonomous vehicles, vehicle development or ADAS development

    • Experience building data pipelines and familiarity with ETL software such as dbt or Airflow (see the analysis sketch after this list)

    • Experience developing dashboards and data visualizations using tools such as Looker or Jupyter notebooks

    • Industry experience in system engineering and requirements management including system analysis, requirements authoring, test generation, and validation activities

    • Experience with requirements management tools (Jama, DNG, DOORS, etc.)

    • Proven track record of successful systems engineering for a safety-critical product
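
    As a toy illustration of the build-over-build behavioral analysis described above, the pandas sketch below compares a hard-brake rate across two hypothetical software builds; all column names and values are invented, and a real analysis would first query millions of simulation runs via SQL.

        import pandas as pd

        # Stand-in for per-scenario simulation results.
        runs = pd.DataFrame({
            "software_build": ["v1", "v1", "v2", "v2"],
            "scenario": ["unprotected_left", "cut_in", "unprotected_left", "cut_in"],
            "hard_brake": [1, 0, 0, 0],
        })

        # A regression shows up as a higher event rate on the newer build.
        rates = runs.groupby("software_build")["hard_brake"].mean()
        print(rates)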

    Why Cruise?

    Our benefits are here to support the whole you:

    • Competitive salary and benefits 
    • Medical / dental / vision, Life and AD&D
    • Subsidized mental health benefits
    • Paid time off and holidays
    • Paid parental, medical, family care, and military leave of absence
    • 401(k) Cruise matching program 
    • Fertility benefits
    • Dependent Care Flexible Spending Account
    • Flexible Spending Account & Health Saving Account
    • Perks Wallet program for benefits/perks
    • Pre-tax Commuter benefit plan for local employees
    • CruiseFlex, our location-flexible work policy.

    We’re Integrated

    • Through our partnerships with General Motors and Honda, we are the only self-driving company with fully integrated manufacturing at scale.

    We’re Funded

    • GM, Honda, Microsoft, T. Rowe Price, and Walmart have invested billions in Cruise. Their backing for our technology demonstrates their confidence in our progress, team, and vision and makes us one of the leading autonomous vehicle organizations in the industry. Our deep resources greatly accelerate our operating speed.

    Cruise LLC is an equal opportunity employer. We strive to create a supportive and inclusive workplace where contributions are valued and celebrated, and our employees thrive by being themselves and are inspired to do the best work of their lives. We seek applicants of all backgrounds and identities, across race, color, caste, ethnicity, national origin or ancestry, age, citizenship, religion, sex, sexual orientation, gender identity or expression, veteran status, marital status, pregnancy or parental status, or disability. Applicants will not be discriminated against based on these or other protected categories or social identities. Cruise will consider for employment qualified applicants with arrest and conviction records, in accordance with applicable laws.

    Cruise is committed to the full inclusion of all applicants. If reasonable accommodation is needed to participate in the job application or interview process please let our recruiting team know or email HR@getcruise.com.

    We proactively work to design hiring processes that promote equity and inclusion while mitigating bias. To help us track the effectiveness and inclusivity of our recruiting efforts, please consider answering the following demographic questions. Answering these questions is entirely voluntary. Your answers to these questions will not be shared with the hiring decision makers and will not impact the hiring decision in any way. Instead, Cruise will use this information not only to comply with any government reporting obligations but also to track our progress toward meeting our diversity, equity, inclusion, and belonging objectives. Know Your Rights: Workplace Discrimination is Illegal

    In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.

    Candidates applying for roles that operate and remotely operate the AV: Licensed to drive a motor vehicle in the U.S. for the three years immediately preceding your application, currently holding an active in-state regular driver’s license or equivalent, and no more than one point on driving record. A successful completion of a background check, drug screen and DMV Motor Vehicle Record check is also required.

    Note to Recruitment Agencies: Cruise does not accept unsolicited agency resumes. Furthermore, Cruise does not pay placement fees for candidates submitted by any agency other than its approved partners. 

    No Application Deadline

    Apply for this job