Data Engineer Remote Jobs

105 Results

+30d

Data Engineer II

Agile Six, United States, Remote
ML, agile, Design, api, git, c++, python, backend

Agile Six is hiring a Remote Data Engineer II

Agile Six is a people-first, remote-work company that serves shoulder-to-shoulder with federal agencies to find innovative, human-centered solutions. We build better by putting people first. We are animated by our core values of Purpose, Wholeness, Trust, Self-Management and Inclusion. We deliver our solutions in autonomous teams of self-managed professionals (no managers here!) who genuinely care about each other and the work. We know that’s our company’s purpose – and that we can only achieve it by supporting a culture where people feel valued, self-managed, and love to come to work.

The role

Agile Six is looking for a Data Engineer for an anticipated role on our cross-functional agile teams. Our partners include the Department of Veterans Affairs (VA), the Centers for Medicare & Medicaid Services (CMS), the Centers for Disease Control and Prevention (CDC), and others.

The successful candidate will bring their experience in data formatting and integration engineering to help us expand a reporting platform. As part of the team, you will primarily be responsible for data cleaning and data management tasks, building data pipelines, and data modeling (designing the schema/structure of datasets and relationships between datasets). We are looking for someone who enjoys working on solutions to highly complex problems and someone who is patient enough to deal with the complexities of navigating the Civic Tech space. The successful candidate for this role is an excellent communicator, as well as someone who is curious about where data analysis, backend development, data engineering, and data science intersect.
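To make the data-modeling side of this concrete, here is a minimal, purely illustrative sketch (the table names and fields are hypothetical and not drawn from any VA/CMS/CDC system): two related datasets with an explicit key relationship, plus a trivial cleaning step before load, using only Python's standard library.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE facility (
    facility_id TEXT PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE report (
    report_id   TEXT PRIMARY KEY,
    facility_id TEXT NOT NULL REFERENCES facility(facility_id),
    received_at TEXT,
    result      TEXT
);
""")

# a semi-structured record, lightly cleaned before it is loaded into the model
raw = {"report_id": "R-001", "facility_id": "F-42", "received_at": " 2024-01-05 ", "result": "POSITIVE "}
clean = {k: v.strip() if isinstance(v, str) else v for k, v in raw.items()}

conn.execute("INSERT INTO facility VALUES (?, ?)", ("F-42", "Example Lab"))
conn.execute("INSERT INTO report VALUES (:report_id, :facility_id, :received_at, :result)", clean)
print(conn.execute(
    "SELECT r.report_id, f.name, r.result FROM report r JOIN facility f USING (facility_id)"
).fetchall())

In practice the schemas, keys, and cleaning rules would come from the program's own data, but the shape of the work (model the relationships, clean the inputs, load them consistently) is the same.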

We embrace open source software and an open ethos regarding software development, and are looking for a candidate who does the same. Most importantly, we are looking for someone with a passion for working on important problems that have a lasting impact on millions of users and make a difference in our government!

Please note, this position is anticipated, pending contract award response.

Responsibilities

  • Contribute as a member of a cross-functional Agile team, using your expertise in data engineering, critical thinking, and collaboration to solve problems related to the project
    • Experience with Java/Kotlin/Python, the command line, and Git is required
    • Experience with transport protocols, including REST, SFTP, and SOAP, is required
    • Experience with HL7 2.5.1 and FHIR is strongly preferred
  • Extract, transform, and load data. Pull together datasets, build data pipelines, and turn semi-structured and unstructured data into datasets that can be used for machine learning models.
  • Evaluate and recommend
  • We expect the responsibilities of this position to shift and grow organically over time, in response to considerations such as the unique strengths and interests of the selected candidate and other team members and an evolving understanding of the delivery environment.

Basic qualifications

  • 2+ years of hands-on data engineering experience in a production environment
  • Experience with Java/Kotlin/Python, command line, and Git
  • Demonstrated experience with extract, transform, load (ETL) and data cleaning, data manipulation, and data management
  • Demonstrated experience building and orchestrating automated data pipelines in Java/Python
  • Experience with data modeling: defining the schema/structure of datasets and the relationships between datasets
  • Ability to create usable datasets from semi-structured and unstructured data
  • Solution-oriented mindset and proactive approach to solving complex problems
  • Ability to be autonomous, take initiative, and effectively communicate status and progress
  • Experience successfully collaborating with cross-functional partners and other designers and researchers, seeking and providing feedback in an Agile environment
  • Adaptive, empathetic, collaborative, and holds a positive mindset
  • Has lived and worked in the United States for 3 out of the last 5 years
  • Some of our clients may request or require travel from time to time. If this is a concern for you, we encourage you to apply and discuss it with us at your initial interview

Additional desired qualifications

  • Familiarity with the Electronic Laboratory Reporting workflows and data flow
  • Knowledge of FHIR data / API standard, HL7 2.5.1
  • Experience building or maintaining web service APIs
  • Familiarity with various machine learning (ML) algorithms and their application to common ML problems (e.g. regression, classification, clustering)
  • Statistical experience or degree
  • Experience developing knowledge of complex domain and systems
  • Experience working with government agencies
  • Ability to work across multiple applications, components, languages, and frameworks
  • Experience working in a cross-functional team, including research, design, engineering, and product
  • You are a U.S. Veteran. As a service-disabled veteran-owned small business, we recognize the transition to civilian life can be tricky, and welcome and encourage Veterans to apply

At Agile Six, we are committed to building teams that represent a variety of backgrounds, perspectives, and skills. Even if you don't meet every requirement, we encourage you to apply. We’re eager to meet people who believe in our mission and who can contribute to our team in a variety of ways.

Salary and Sixer Benefits

To promote equal pay for equal work, we publish salary ranges for each position.

The salary range for this position is $119,931-$126,081

Our benefits are designed to reinforce our core values of Wholeness, Self Management and Inclusion. The following benefits are available to all employees. We respect that only you know what balance means for your life and season. While we offer support from coaches, we expect you to own your wholeness, show up for work whole, and go home to your family the same. You will be seen, heard and valued. We expect you to offer the same for your colleagues, be kind (not controlling), be caring (not directive) and ready to participate in a state of flow. We mean it when we say “We build better by putting people first”.

All Sixers Enjoy:

  • Self-managed work/life balance and flexibility
  • Competitive and equitable salary (equal pay for equal work)
  • Employee Stock Ownership (ESOP) for all employees!
  • 401K matching
  • Medical, dental, and vision insurance
  • Employer paid short and long term disability insurance
  • Employer paid life insurance
  • Self-managed and generous paid time off
  • Paid federal holidays and Election day off
  • Paid parental leave
  • Self-managed professional development spending
  • Self-managed wellness days

Hiring practices

Agile Six Applications, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, national origin, ancestry, sex, sexual orientation, gender identity or expression, religion, age, pregnancy, disability, work-related injury, covered veteran status, political ideology, marital status, or any other factor that the law protects from employment discrimination.

Note: We participate in E-Verify. Upon hire, we will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S. Unfortunately, we are unable to sponsor visas at this time.

If you need assistance or reasonable accommodation in applying for any of these positions, please reach out to careers@agile6.com. We want to ensure you have the ability to apply for any position at Agile Six.

Please read and respond to the application questions carefully. Interviews are conducted on a rolling basis until the position has been filled.

 

Apply for this job

+30d

Data Engineer H/F

Socotec, Palaiseau, France, Remote
S3, Lambda, nosql, airflow, sql, git, kubernetes, AWS

Socotec is hiring a Remote Data Engineer H/F

Job Description

SOCOTEC Monitoring France, a leader in inspection and certification, provides services in the construction, infrastructure, and industrial sectors.

The SOCOTEC Data & AI Hub, made up of Data Engineering and Data Science specialists, is responsible not only for managing and optimizing data, but also for implementing data processing and analysis. We build data-driven applications to support SOCOTEC's business activities.

We are looking for a Data Engineer on a work-study (alternance) contract to join our SOCOTEC Data team.

By joining the team, you will actively contribute to maintaining and optimizing our data lake, as well as to creating and updating data flows. You will be responsible for documenting and validating these flows, and for building and rolling out reporting tools such as Power BI. You will also propose new solutions, take part in technical evaluations, and contribute to the continuous improvement of our data infrastructure.

 

You will work on three main missions:

  • Within the Socotec Monitoring France entity (20%), you will help define the optimal data strategy for Socotec Monitoring (structuring, processes, open data, purchases of external data)
  • For the Socotec group (60%), you will help build the worldwide data lake. Your objective will be to develop data flows for analysis in collaboration with the BI and Data Science teams. You will learn to organise and orchestrate the extract, transform, and load flows while guaranteeing their reliability, availability, and so on.
  • Working with clients (20%), you will take part in running projects end to end: data collection, preprocessing pipeline, modeling, and deployment.

You will demonstrate autonomy, sound judgement, and solid skills in writing and communicating code and technical documentation.

The technical stack we use (a minimal orchestration sketch follows this list):

  • Amazon Web Services (AWS)
  • Apache Airflow as the orchestrator
  • Spark for ETL pipelines
  • GitLab for versioning source code
  • Kubernetes
  • Delta Lake
  • S3
  • Metadata managed with OpenMetadata
  • Power BI as the BI tool, managed with the BI teams
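For orientation only, here is a minimal sketch of how such an extract-transform-load flow might be scheduled with Airflow (assuming a recent Airflow 2.x install; the DAG id, schedule, and task bodies are placeholders, not Socotec's actual pipelines):

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from a source system")

def transform():
    print("clean and reshape the raw data")

def load():
    print("write the result to the data lake")

with DAG(dag_id="example_etl", start_date=datetime(2024, 1, 1),
         schedule="@daily", catchup=False):
    # extract -> transform -> load, run once per day
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3

In the real role, Spark jobs would typically replace the placeholder Python callables, with Airflow handling scheduling, retries, and dependencies.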

Qualifications

  • A Master's degree in Big Data or an engineering degree in computer science, with a strong interest in data
  • Mastery of SQL and NoSQL databases and the associated concepts
  • Knowledge of the Big Data stack (Airflow, Spark, Hadoop)
  • Experience with collaborative development tools (Git, GitLab, Jupyter Notebooks, etc.)
  • Knowledge of AWS services (Lambda, EMR, S3) appreciated
  • A strong interest in innovative technologies
  • Team spirit
  • Fluent English, including a good technical level

See more jobs at Socotec

Apply for this job

+30d

Senior Data Engineer

Postscript, Remote, Anywhere in North America
Lambda, terraform, nosql, RabbitMQ, Design, c++, python, AWS, backend

Postscript is hiring a Remote Senior Data Engineer

Postscript Description

Postscript is redefining marketing for ecommerce companies. By introducing SMS as an entirely new channel for ecommerce stores to engage, retain, and convert their customer base, brands are seeing huge ROI with Postscript. Backed by Greylock, Y Combinator and other top investors, Postscript is growing fast and looking for remarkable people to help build a world class organization. 

 

Job Description

As a Senior Data Engineer on the Data Platform team at Postscript, you will provide the company with best-in-class data foundations to support a broad range of key engineering and product initiatives. The Data Platform team at Postscript focuses on integrating data from sources such as our production application and third-party integrations. You will focus on designing and building end-to-end data pipeline solutions: data ingestion, propagation, persistence, and services to support both our product and our internal BI organization. This role is critical in ensuring data and events are reliable and actionable throughout the Postscript Platform.
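As a rough illustration of one ingestion-to-persistence hop in such a pipeline (the library choice, topic, and table names below are assumptions for the sake of the example, not Postscript's actual stack), a consumer might read JSON events off a stream and write them idempotently to a store:

import json
import sqlite3
from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "subscriber-events",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

db = sqlite3.connect("events.db")  # stand-in for a real warehouse or service store
db.execute("CREATE TABLE IF NOT EXISTS events (id TEXT PRIMARY KEY, type TEXT, payload TEXT)")

for msg in consumer:
    event = msg.value
    # idempotent insert so replayed messages do not create duplicates
    db.execute("INSERT OR IGNORE INTO events VALUES (?, ?, ?)",
               (event["id"], event.get("type"), json.dumps(event)))
    db.commit()

A production version would swap the local store for the actual persistence layer and add batching, schema validation, and monitoring, but the ingest-deserialize-persist shape is the core of the work described above.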

 

Primary duties

  • Design and build performant, scalable data systems that operate at high scale
  • Architect cloud native data solutions in AWS
  • Write high quality code to make your software designs a reality
  • Build services to support our product with cross domain data
  • Advise the team and organization on Data Engineering best practices to level up our competency in the organization
  • Mentor and support your fellow engineers via code reviews, design reviews and peer feedback

What We’ll Love About You

  • You’re a polyglot technologist who is passionate about data problems at scale
  • You have a proven track record designing and implementing complex data systems from scratch
  • You’ve built data engineering solutions in an AWS environment and have working experience with several AWS services (Lambda, Redshift, Glue, RDS, DMS, etc.)
  • You have several years (5+) of experience writing high quality production code, preferably in Python or Go
  • You have a broad range of experience with data persistence technologies (RDBMS, NoSQL, OLAP, etc.) and know how to select the right tool for the job
  • You’ve worked in event driven systems and have experience with technologies like Kafka, Kinesis, RabbitMQ, etc.
  • You’ve gotten your hands dirty with infrastructure and have used infrastructure as code technologies like Terraform
  • You’re comfortable with ambiguity and like to dig into the problems as much as you love creating solutions

What You’ll Love About Us

  • Salary range of USD $170,000-$190,000 base plus significant equity (we do not have geo based salaries) 
  • High growth startup - plenty of room for you to directly impact the company and grow your career!
  • Work from home (or wherever)
  • Fun - We’re passionate and enjoy what we do
  • Competitive compensation and opportunity for equity
  • Flexible paid time off
  • Health, dental, vision insurance

 

What to expect from our hiring process:

  • Intro Call: You’ll hop on a quick call with the Recruiter so we can get to know you better — and you can learn a little more about the role and Postscript.
  • Hiring Manager Intro: You’ll hop on a quick call with the Hiring Manager so your future Manager can get to know you better — this is a great time to learn more about the team and position.
  • Homework Assignment: We will send over an exercise that challenges you to solve a problem and come up with a creative solution, or outline how you've solved a problem in the past. Get a feel for what you’ll be doing on a daily basis!
  • Virtual Onsite Interviews: You’ll be meeting with 2-4 team members on a series of video calls. This is your chance to ask questions and see who this role interacts with on a daily basis.
  • Final FEACH Interview: This is our interview to assess your ability to represent how you work via our FEACH values. As we build the #1 team in Ecommerce, we look for individuals who embody FEACH professionally and personally. We want to hear about this in your final interview!
  • Reference Checks: We ask to speak with at least two references who have previously worked with you; at least one should be someone who has previously managed your work.
  • Offer: We send over an offer and you (hopefully) accept! Welcome to Postscript!

You are welcome here. Postscript is an ever-evolving place of equal employment for talented individuals.

See more jobs at Postscript

Apply for this job

+30d

Junior/Mid Data Analytics Engineer

EXUS, Athens, Attica, Greece, Remote

EXUS is hiring a Remote Junior/Mid Data Analytics Engineer

EXUS is an enterprise software company, founded in 1989 with the vision of simplifying risk management software. EXUS launched its Financial Suite (EFS) in 2003 to help financial entities worldwide improve their results. Today, our EXUS Financial Suite (EFS) is trusted by risk professionals in more than 32 countries worldwide (MENA, EU, SEA). We introduce simplicity and intelligence into their business processes through technology, improving their collections performance.

Our people constitute the source of inspiration that drives us forward and helps us fulfill our purpose of being role models for a better world.
This is your chance to be part of a highly motivated, diverse, and multidisciplinary team, which embraces breakthrough thinking and technology to create software that serves people.

Our shared Values:

  • We are transparent and direct
  • We are positive and fun, never cynical or sarcastic
  • We are eager to learn and explore
  • We put the greater good first
  • We are frugal and we do not waste resources
  • We are fanatically disciplined, we deliver on our promises

We are EXUS! Are you?

Join our dynamic Data Analytics Team as we expand our capabilities into data lakehouse architecture. We are seeking a Junior/Mid Data Analytics Engineer who is enthusiastic about creating compelling data visualizations, effectively communicating them to customers, conducting training sessions, and gaining experience in managing ETL processes for big data.

Key Responsibilities:

  • Develop and maintain reports and dashboards using leading visualization tools, and craft advanced SQL queries for additional report generation.
  • Deliver training sessions on our Analytic Solution and effectively communicate findings and insights to both technical and non-technical customer audiences.
  • Collaborate with business stakeholders to gather and analyze requirements.
  • Debug issues in the front-end analytic tool, investigate underlying causes, and resolve these issues.
  • Monitor and maintain ETL processes as part of our transition to a data lakehouse architecture.
  • Proactively investigate and implement new data analytics technologies and methods.

Required Skills and Qualifications:

  • A BSc or MSc degree in Computer Science, Engineering, or a related field.
  • 1-5 years of experience with data visualization tools and techniques. Knowledge of MicroStrategy and Apache Superset is a plus.
  • 1-5 years of experience with Data Warehouses, Big Data, and/or Cloud technologies. Exposure to these areas in academic projects, internships, or entry-level roles is also acceptable.
  • Familiarity with PL/SQL and practical experience with SQL for data manipulation and analysis. Hands-on experience through academic coursework, personal projects, or job experience is valued.
  • Familiarity with data Lakehouse architecture.
  • Excellent analytical skills to understand business needs and translate them into data models.
  • Organizational skills with the ability to document work clearly and communicate it professionally.
  • Ability to independently investigate new technologies and solutions.
  • Strong communication skills, capable of conducting presentations and engaging effectively with customers in English.
  • Demonstrated ability to work collaboratively in a team environment.

What we offer:

  • Competitive salary
  • Friendly, pleasant, and creative working environment
  • Remote Working
  • Development Opportunities
  • Private Health Insurance

Privacy Notice for Job Applications: https://www.exus.co.uk/en/careers/privacy-notice-f...

See more jobs at EXUS

Apply for this job

+30d

Data Engineer

Maker&Son Ltd, Balcombe, United Kingdom, Remote
golang, tableau, airflow, sql, mongodb, elasticsearch, python, AWS

Maker&Son Ltd is hiring a Remote Data Engineer

Job Description

We are looking for a highly motivated individual to join our team as a Data Engineer.

We are based in Balcombe [40 mins from London by train, 20 minutes from Brighton] and we will need you to be based in our offices at least 3 days a week.

You will report directly to the Head of Data.

Candidate Overview

As a part of the Technology Team your core responsibility will be to help maintain and scale our infrastructure for analytics as our data volume and needs continue to grow at a rapid pace. This is a high impact role, where you will be driving initiatives affecting teams and decisions across the company and setting standards for all our data stakeholders. You’ll be a great fit if you thrive when given ownership, as you would be the key decision maker in the realm of architecture and implementation.

Responsibilities

  • Understand our data sources, ETL logic, and data schemas and help craft tools for managing the full data lifecycle
  • Play a key role in building the next generation of our data ingestion pipeline and data warehouse
  • Run ad hoc analysis of our data to answer questions and help prototype solutions
  • Support and optimise existing ETL pipelines
  • Support technical and business stakeholders by providing key reports and supporting the BI team to become fully self-service
  • Own problems through to completion both individually and as part of a data team
  • Support digital product teams by performing query analysis and optimisation
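For the query analysis and optimisation work in the last responsibility above, one hedged sketch (assuming a Postgres database and the psycopg2 driver; the connection string and query are hypothetical, not Maker&Son's schema) is simply to capture the execution plan of a slow report query and look for sequential scans that an index would remove:

import psycopg2

conn = psycopg2.connect("dbname=analytics user=report_reader")  # hypothetical DSN
query = "SELECT customer_id, SUM(total) FROM orders WHERE created_at >= %s GROUP BY customer_id"

with conn.cursor() as cur:
    # EXPLAIN ANALYZE runs the query and returns the plan with actual timings
    cur.execute("EXPLAIN ANALYZE " + query, ("2024-01-01",))
    for (line,) in cur.fetchall():
        print(line)

Reading the plan output (node types, row estimates vs. actuals, total time) is usually the first step before deciding whether to add an index, rewrite the query, or pre-aggregate.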

 

Qualifications

Key Skills and Requirements

  • 3+ years experience as a data engineer
  • Ability to own data problems and help to shape the solution for business challenges
  • Good communication and collaboration skills; comfortable discussing projects with anyone from end users up to the executive company leadership
  • Fluency with a programming language - we use NodeJS and Python but are looking to adopt Golang
  • Ability to write and optimise complex SQL statements
  • Familiarity with ETL pipeline tools such as Airflow or AWS Glue
  • Familiarity with data visualisation and reporting tools, like Tableau, Google Data Studio, Looker
  • Experience working in a cloud-based software development environment, preferably with AWS or GCP
  • Familiarity with no-SQL databases such as ElasticSearch, DynamoDB, or MongoDB

See more jobs at Maker&Son Ltd

Apply for this job

+30d

Principal Data Engineer

ML, airflow, sql, B2C, RabbitMQ, Design, java, c++, python, AWS

hims & hers is hiring a Remote Principal Data Engineer

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

About the Role:

We're looking for an experienced Principal Data Engineer to join our Data Platform Engineering team. Our team is responsible for enabling the H&H business (Product, Analytics, Operations, Finance, Data Science, Machine Learning, Customer Experience, Engineering) by providing a platform with a rich set of data and tools to leverage.

You Will:

  • Serve as a technical leader within the Data Platform org. Provide expert guidance and hands-on development of complex engineering problems and projects
  • Collaborate with cross-functional stakeholders including product management, engineering, analytics, and key business representatives to align the architecture, vision, and roadmap with stakeholder needs
  • Establish guidelines, controls, and processes to make data available for developing scalable data-driven solutions for Analytics and AI
  • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs
  • Implement and maintain data governance practices to ensure compliance, data security, and privacy (see the sketch after this list)
  • Design and lead development on scalable, high-performance data architecture solutions that support both the consumer side of the business as well as analytic use cases
  • Plan and oversee large-scale and complex technical migrations to new data systems and platforms
  • Drive continuous data transformation to minimize technical debt
  • Display strong thought leadership and execution in pursuit of modern data architecture principles and technology modernization
  • Define and lead technology proof of concepts to ensure feasibility of new data technology solutions
  • Provide technical leadership and mentorship to the members of the team, fostering a culture of technical excellence
  • Create comprehensive documentation for designs and processes to support ongoing maintenance and knowledge sharing
  • Conduct design reviews to ensure that proposed solutions address platform and stakeholder pain points, as well as meet business, and technical requirements, with alignment to standards and best practices
  • Prepare and deliver efficient communications to convey architectural direction and how it aligns with company strategy. Be able to explain the architectural vision and implementation to executives
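As a small, purely illustrative example of the governance work referenced above (the column names and salting scheme are assumptions, not H&H's actual controls), PII/PHI columns are often pseudonymised before events reach broadly accessible analytics tables, so downstream joins still work without exposing raw identifiers:

import hashlib

PII_COLUMNS = {"email", "phone"}
SALT = b"example-salt"  # in practice this would be a managed secret, rotated and access-controlled

def pseudonymise(record: dict) -> dict:
    """Replace PII fields with a salted hash; leave non-PII fields untouched."""
    out = {}
    for key, value in record.items():
        if key in PII_COLUMNS and value is not None:
            out[key] = hashlib.sha256(SALT + str(value).encode()).hexdigest()
        else:
            out[key] = value
    return out

print(pseudonymise({"user_id": 7, "email": "person@example.com", "plan": "example-plan"}))

The same hash for the same input keeps records joinable across tables while keeping the raw identifier out of the analytics layer; actual regulatory handling (CCPA, GDPR, PHI) involves far more than this single transform.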

You Have:

  • Bachelor's or Master's degree in Computer Science or equivalent, with over 12 years of Data Architecture and Data Engineering experience, including team leadership
  • Proven expertise in designing data platforms for large-scale data and diverse data architectures, including warehouses, lakehouses, and integrated data stores.
  • Proficiency and hands-on knowledge of a variety of technologies such as SQL, Bash, Python, Java, Presto, Spark, and AWS, as well as data streaming technologies like Kafka and RabbitMQ
  • Hands-on experience and proficiency with data stacks including Airflow, Databricks, and dbt, as well as data stores such as Cassandra, Aurora, and ZooKeeper
  • Experience with data security (including PHI and PII), as well as data privacy regulations (CCPA and GDPR)
  • Proficient in addressing data-related challenges through analytical problem-solving and aligning data architecture with organizational business goals and objectives
  • Exposure to analytics techniques using ML and AI to assist data scientists and analysts in deriving insights from data
  • Analytical and problem-solving skills to address data-related challenges and find optimal solutions
  • Ability to manage projects effectively, plan tasks, set priorities, and meet deadlines in a fast-paced and ever-changing environment

Nice To Have:

  • Experience working in healthcare or in a B2C company

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

 

#LI-Remote

 

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range for US-based employees is
$210,000 to $250,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

+30d

Data Quality Engineer

ML, Mid Level, Full Time, Bachelor's degree, sql, mobile, ui, qa

Pixalate, Inc. is hiring a Remote Data Quality Engineer

See more jobs at Pixalate, Inc.

Apply for this job

+30d

Data Engineer

Increasingly, Bengaluru, India, Remote
S3, Lambda, Design, git, jenkins, python, AWS

Increasingly is hiring a Remote Data Engineer

Job Description

Working experience in data integration and pipeline development

Qualifications

3+ years of relevant experience with data integration on the AWS Cloud using Databricks, Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.

Strong real-world experience in Python development, especially PySpark, in the AWS Cloud environment.

Design, develop, test, deploy, maintain, and improve data integration pipelines.

Experience with Python and common Python libraries.

Strong analytical experience with databases: writing complex queries, query optimization, debugging, user-defined functions, views, indexes, etc.

Strong experience with source control systems such as Git and Bitbucket, and with build and continuous integration tools such as Jenkins.

 

See more jobs at Increasingly

Apply for this job

+30d

Data Engineer

AmpleInsightInc, Toronto, Canada, Remote
DevOPS, airflow, sql, python

AmpleInsightInc is hiring a Remote Data Engineer

Job Description

We are looking for a data engineer who is passionate about analytics and helping companies build and scale data. You enjoy working with data and are motivated to produce high quality data tools and pipelines that help empower other data scientists. You are experienced in architecting data ETL workflows and schemas. Critical thinking and problem-solving skills are essential for this role.

Qualifications

  • BS (or higher, e.g., MS, or PhD) in Computer Science, Engineering, Math, or Statistics
  • Hands on experience working with user engagement, social, marketing, and/or finance data
  • Proficient in Python (e.g., Pandas, NumPy, scikit-learn), R, and TensorFlow, among other data science tools and libraries
  • Extensive experience working on relational databases, designing complex data schemas, and writing SQL queries
  • Deep knowledge on performance tuning of ETL Jobs, SQL, and databases
  • Working knowledge of Snowflake
  • Experience working with Airflow is a strong plus
  • DevOps experience is a plus

See more jobs at AmpleInsightInc

Apply for this job

+30d

Data Engineer

JLIConsulting, Vaughan, Canada, Remote
oracle, azure, api, git, AWS

JLIConsulting is hiring a Remote Data Engineer

Job Description

Data Engineer Job Responsibilities:

 

  • Work with stakeholders to understand data sources and the Data, Analytics and Reporting team strategy, supporting both our on-premises environment and our enterprise AWS cloud solution
  • Work closely with the Data, Analytics and Reporting Data Management and Data Governance teams to ensure all industry standards and best practices are met
  • Ensure metadata and data lineage are captured and compatible with enterprise metadata and data management tools and processes
  • Run quality assurance and data integrity checks to ensure accurate reporting and data records
  • Ensure ETL pipelines are produced to the highest quality standards, carry metadata, and are validated for completeness and accuracy
  • Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
  • Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it
  • Write unit/integration tests, contribute to the engineering wiki, and document work
  • Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
  • Define company data assets (data models) and the Spark/SparkSQL jobs that populate them
  • Design data integrations and a data quality framework
  • Design and evaluate open source and vendor tools for data lineage
  • Work closely with all business units and engineering teams to develop a strategy for the long-term data platform architecture
  • Focus on structured problem solving
  • Demonstrate phenomenal communication and business awareness
  • Work with ETL tools, querying languages, and data repositories
  • Support technical Data Management solutions
  • Provide support to the development and testing teams to resolve data issues

Qualifications

  • Experience in database, storage, collection and aggregation models, techniques, and technologies, and how to apply them in business
  • Working knowledge of a source code control tool such as Git
  • Knowledge of file formats (e.g. XML, CSV, JSON), databases (e.g. Redshift, Oracle), and different types of connectivity is also very useful
  • Working experience with the following cloud platforms is a plus: Amazon Web Services, Google Cloud Platform, Azure
  • Working experience with data modeling, relational modeling, and dimensional modeling
  • Interpersonal skills: You have a way of speaking that engages your audience and instills confidence and credibility. You know how to leverage communication tools and methodologies. You can build relationships with internal and external team members, positioning yourself as a trusted advisor. You are always looking for ways to improve processes, and you always ensure your communications have been received and are clearly understood. Your commitment and focus influence those around you to do better.

See more jobs at JLIConsulting

Apply for this job

+30d

Data Center Design Engineer

Cloudflare, Hybrid or Remote
jira, Design

Cloudflare is hiring a Remote Data Center Design Engineer

About Us

At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

Available Location: Lisbon, Portugal; London, UK; Singapore or Remote US 

About the Role

We are seeking a Data Center Design Engineer to design Cloudflare’s pending and future infrastructure deployments for generational improvement in cost, quality, and speed of deployment. We are looking for someone who excels at progressing many projects in parallel, managing dynamic day to day priorities with many stakeholders, and has experience implementing and refining data center design best practices in a high growth environment.  Getting stuff done is a must!

The Data Center Strategy team is part of Cloudflare’s global Infrastructure (INF) team. The INF team grows and manages Cloudflare’s global data center/PoP footprint, enabling Cloudflare’s tremendous growth with compute, network, and data center infrastructure, utilizing an extensive range of global partner vendors.  

What you get to do in this role:

  • Translate data center capacity requirements into actionable white space design and/or rack plans within individual data center contract constraints for power, cooling
  • Manage implementation phase of cage projects with data center providers
  • Design low voltage structured cabling, fiber, cross-connect & conveyance infrastructure as well as any supporting infrastructure on the data center ceiling/floor
  • Work with supply chain team on rack integration plans and location deployment qualification
  • Work cross-functionally with Cloudflare data center engineering team and other internal teams (capacity planning, network strategy, security) to verify scope and solution and create repeatable standard installation procedures
  • Take ownership of and lead projects to design and implement data center expansions or new data centers on tight deadlines with minimal oversight 
  • Technical support in negotiations with external data center partners
  • Assist in RFP preparation, review and cost/engineering analysis
  • Review one-line diagrams and cooling equations for new and existing data centers (Data Center M&E)
  • Power component (PDU) review/approval for Hardware sourcing team
  • Implement, document and maintain power consumption tracking tools and fulfil ad-hoc reporting requests
  • Research new and innovative power efficiency technologies and programs
  • Travel up to 25% to perform infrastructure audits, validate data center construction work and buildouts, and participate in commercial processes.
  • Other duties as assigned

Requirements

  • Bachelors or equivalent experience plus 5+ years of experience in data center mechanical and electrical design and operations/deployment/installation, P.E. certification or equivalent a plus
  • Experience in HVAC, Chilled Water Systems, Condenser Water Systems, Pump controls, Glycool/Glycols, AHU units (DX, split, RTU, CRAC, etc.), CRAH, Raised Floor Systems, HOT/COLD aisle containment and Building Management Systems
  • Understanding of basic electrical theory (voltage, current, power), basic circuit design & analysis, and single- and three-phase power systems (see the worked example after this list)
  • Familiarity with Data Center M&E infrastructure design concepts, electrical/UPS topologies, cooling methodologies (central plant, room cooling, high density thermal strategies)
  • Familiarity with industry standards for resilient Data Center design and Uptime Institute Tier Classifications
  • Excellent verbal, written communication and presentation skills
  • Experience working with multiple time zones and multiple cross-functional teams
  • Experience working on time sensitive projects with delivery responsibility under pressure
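As a toy worked example of the electrical theory called out above (the values are illustrative, not any Cloudflare deployment), the real power drawn by a balanced three-phase load is P = sqrt(3) x V_LL x I x PF:

import math

v_line_to_line = 415.0    # volts, line-to-line
current_per_phase = 32.0  # amps
power_factor = 0.95

# sqrt(3) * V_LL * I * PF, converted to kilowatts
power_kw = math.sqrt(3) * v_line_to_line * current_per_phase * power_factor / 1000
print(f"Approximate feed draw: {power_kw:.1f} kW")  # roughly 21.9 kW

The same relationship underpins rack power budgeting and the power consumption tracking mentioned in the role description.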

Bonus Points

  • Degree in electrical/mechanical engineering or IT a plus
  • Experience in large-scale mission critical facility infrastructure design, construction, commissioning, and/or operations a plus
  • Experience with industry standards, building codes and safety standards including UMC, NFPA, ASHRAE, UBC, LEED, and Uptime Institute
  • Knowledge of programming languages a plus
  • JIRA, Confluence admin-level experience a plus
  • AutoCAD experience a plus
  • Experience with FLOTHERM or Tileflow a plus

What Makes Cloudflare Special?

We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. This is available publicly for everyone to use - it is the first consumer-focused service Cloudflare has ever released. Here’s the deal - we never, ever store client IP addresses. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

Sound like something you’d like to be a part of? We’d love to hear from you!

This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St. San Francisco, CA 94107.

See more jobs at Cloudflare

Apply for this job

+30d

Sr. Data Engineer - Data Analytics

R.S.Consultants, Pune, India, Remote
SQS, Lambda, Bachelor's degree, scala, airflow, sql, Design, typescript, python, AWS, Node.js

R.S.Consultants is hiring a Remote Sr. Data Engineer - Data Analytics

Job Description

We are looking for a Sr. Data Engineer for an international client. This is a 100% remote job. The person will be working from India and collaborating with a global team.

Total Experience: 7+ Years

Your role

  • Take key responsibility for requirements analysis, the design and architecture of a scalable, low-latency streaming platform, and the end-to-end delivery of key modules, in order to provide real-time data solutions for our product
  • Write clean, scalable code using Go, TypeScript/Node.js, Scala, Python, and SQL, and test and deploy applications and systems
  • Solve our most challenging data problems, in real-time, utilizing optimal data architectures, frameworks, query techniques, sourcing from structured and unstructured data sources.
  • Be part of an engineering organization delivering high quality, secure, and scalable solutions to clients
  • Involvement in product and platform performance optimization and live site monitoring
  • Mentor team members through giving and receiving actionable feedback.

Our tech stack (an illustrative Lambda sketch follows this list):

  • AWS (Lambda, SQS, Kinesis, KDA, Redshift, Athena, DMS, Glue, DynamoDB), Go/TypeScript, Airflow, Flink, Spark, Looker, EMR
  • A continuous deployment process based on GitLab
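For illustration only (the stack above spans Go/TypeScript/Scala/Python, and the names below are placeholders rather than the client's code), a minimal Python Lambda handler for a Kinesis-triggered step might look like this:

import base64
import json

def handler(event, context):
    """Decode Kinesis records from the Lambda event payload and process each one."""
    processed = 0
    for record in event.get("Records", []):
        # Kinesis payloads arrive base64-encoded inside the event
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # real-time transformation / enrichment would happen here
        print(payload)
        processed += 1
    return {"processed": processed}

In an event-driven design like the one described, each such function stays small and stateless, with Kinesis (or SQS) handling buffering and replay.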

A little more about you:

  • A Bachelor's degree in a technical field (eg. computer science or mathematics). 
  • 3+ years experience with real-time, event driven architecture
  • 3+ years experience with a modern programming language such as Scala, Python, Go, Typescript
  • Experience designing complex data processing pipelines
  • Experience with data modeling (star schema, dimensional modeling, etc.)
  • Experience with query optimisation
  • Experience with Kafka is a plus
  • Shipping and maintaining code in production
  • You like sharing your ideas, and you're open-minded

Why join us?

Key moment to join in terms of growth and opportunities

Our people matter, work-life balance is important

Fast-learning environment, entrepreneurial and strong team spirit

45+ nationalities: cosmopolitan & multi-cultural mindset

Competitive salary package & benefits (health coverage, lunch, commute, sport)

DE&I Statement: 

We believe diversity, equity and inclusion, irrespective of origins, identity, background and orientations, are core to our journey. 

Qualifications

Hands-on experience in Scala / Python with data modeling and real-time / streaming data. Experience with complex data processing pipelines and data modeling.

BE/ BTech in Computer Science

See more jobs at R.S.Consultants

Apply for this job

+30d

Senior Data Engineer

Balsam Brands, Mexico City, Mexico, Remote
postgres, sql, oracle, Design, api, MySQL, python

Balsam Brands is hiring a Remote Senior Data Engineer

Job Description

In this hands-on role as a Senior Data Engineer, your primary responsibility will be to partner with key business partners, data analysts and software engineers to design and build a robust, scalable, company-wide data infrastructure to move and translate data that will be used to inform strategic business decisions. You will ensure performance, stability, cost-efficiency, security, and accuracy of the data on the centralized data platform. The ideal candidate will possess advanced knowledge and hands-on experience in data integration, building data pipelines, batch processing frameworks, and data modeling techniques to facilitate seamless data movement. You will collaborate with various technology and business stakeholders to define requirements and design and deliver data products that meet user needs. The candidate should demonstrate intellectual acumen, excel in engineering best practices, and have a strong interest in developing enterprise-scale solutions using industry-recognized cloud platforms, data warehouses, data integration and orchestration tools.

This full-time position reports to the Senior Manager, Data Engineering and requires in-office presence twice a week (Tuesdays and Wednesdays) to facilitate effective collaboration with both local and remote team members. Some flexibility in the regular work schedule is necessary, as most teams have overlapping hours in the early morning and/or early evening PST. Specific scheduling needs for this role will be discussed in the initial interview.

What you’ll do:

  • Data Infrastructure Design: Develop and maintain robust, scalable, and high-performance data infrastructure to meet the company-wide data and analytics needs
  • Data Lifecycle Management: Manage the entire data lifecycle, including ingestion, modeling, warehousing, transformation, access control, quality, observability, retention, and deletion
  • Strategic Data Movement: Define and implement data integration strategies to collect and ingest various data sources. Design, build and launch efficient and reliable data pipelines to process data of different structures and sizes using Python, APIs, SQL, and platforms like Snowflake
  • Collaboration and Consultation: Serve as a trusted partner to collaborate with technical and cross-functional teams to support their data needs, address data-related technical issues, and provide expert consultation
  • Process Efficiency and Stability: Apply engineering best practices to streamline manual processes, optimize data pipelines, and establish observability capabilities to monitor and alert on data quality and infrastructure health and stability (see the sketch after this list)
  • Innovative Solutions: Stay updated on the latest technologies and lead the evaluation and deployment of cutting-edge tools to enhance data infrastructure and processes
  • Coaching and Mentorship: Foster a culture of knowledge sharing by acting as a subject matter expert, leading by example, and mentoring others
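A small, hypothetical example of the pipeline data-quality checks mentioned above (the rules and column names are assumptions, not Balsam Brands' actual checks): validate a batch before it is loaded so that bad data fails loudly instead of landing silently in the warehouse.

import pandas as pd

def validate(batch: pd.DataFrame) -> pd.DataFrame:
    """Raise if the batch violates basic quality rules; return it unchanged otherwise."""
    assert not batch["order_id"].isna().any(), "order_id must not be null"
    assert batch["order_id"].is_unique, "order_id must be unique"
    assert (batch["amount"] >= 0).all(), "amount must be non-negative"
    return batch

batch = pd.DataFrame({"order_id": [1, 2, 3], "amount": [19.99, 0.0, 42.50]})
validate(batch)  # raises AssertionError if a rule is violated
print("batch passed quality checks")

In a real pipeline these checks would typically be expressed in the orchestration or transformation layer and wired to alerting, which is the observability work the bullet above refers to.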

What you bring to the table:

  • Must be fluent in English, both written and verbal
  • 8+ years of professional experience in data engineering
  • Extensive hands-on experience designing and maintaining scalable, efficient, secure, and fault-tolerant distributed databases on the Snowflake Cloud Data Platform. In-depth knowledge of cloud platforms, particularly GCP and Microsoft
  • Proficient in designing and implementing data movement pipelines for diverse data sources including databases, external data providers, and streaming sources, for both inbound and outbound data workflows
  • Deep understanding of relational databases (SQL Server, Oracle, Postgres, and MySQL) with advanced SQL and Python skills for building API integrations, ETLs, and data models
  • Proven experience in building efficient and reliable data pipelines with comprehensive data quality checks, workflow management, and CI/CD integration
  • Excellent analytical thinking skills for performing root cause analysis on external and internal processes and data, resolving data incidents, and identifying opportunities for improvement
  • Effective communication skills for articulating complex technical details in simple business terms to non-technical audiences from various business functions
  • Strong understanding of coding standards, best practices, and data governance

Location and Travel: At Balsam Brands, we believe that time spent together, in-person, collaborating and building relationships is important. To be considered for this role, it is preferred that candidates live within the Mexico City, Guadalajara, or Monterrey metropolitan areas in order to attend occasional team meetings, offsites, or learning and development opportunities that will be planned in a centralized location. Travel to the U.S. may be required for companywide and broader team retreats.

Notes: This is a full-time (40 hours/week), indefinite position with benefits. Candidates must be Mexican nationals to be eligible for this position; this screening question will be asked during the application process. Velocity Global is the Employer of Record for Balsam Brands' Mexico City location, and you will be employed and provided benefits under their payroll. Balsam Brands has partnered with Velocity Global to act as your Employer of Record to ensure your employment will comply with all local laws and regulations and you will receive an exceptional employment experience.

Benefits Offered:

  • Competitive compensation; salary is reviewed yearly and may be adjusted as part of the normal compensation review process
  • Career development and growth opportunities; access to online learning solutions and annual stipend for continuous learning
  • Fully remote work and flexible schedule
  • Collaborate in a multicultural environment; learn and share best practices around the globe
  • Government mandated benefits (IMSS, INFONAVIT, SAR, 50% vacation premium)
  • Healthcare coverage provided for the employee and dependents
  • Life insurance provided for the employee
  • Monthly grocery coupons
  • Monthly non-taxable amount for the electricity and internet services 
  • 20 days Christmas bonus
  • Paid Time Off: Official Mexican holidays and 12 vacation days (increases with years of service), plus additional wellness days available at start of employment 

 

 


See more jobs at Balsam Brands

Apply for this job

+30d

Data Engineer

Zensark Tecnologies Pvt Ltd, Hyderabad, India, Remote
S3, EC2, nosql, postgres, sql, oracle, Design, java, python, AWS

Zensark Tecnologies Pvt Ltd is hiring a Remote Data Engineer

Job Description

Job Title:              Data Engineer

Department:      Product Development

Reports to:         Director, Software Engineering

 

Summary:

Responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. Support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. Responsible for optimizing or even re-designing Tangoe’s data architecture to support our next generation of products and data initiatives.

 

Responsibilities:

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater performance and scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (see the sketch after this list).
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
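An illustrative (not Tangoe-specific) sketch of one small extract-transform-load step against S3 with boto3; the bucket, keys, and columns below are hypothetical placeholders.

import csv
import io
import boto3

s3 = boto3.client("s3")

def etl_usage_report(bucket: str, raw_key: str, curated_key: str) -> None:
    # extract: read the raw CSV object from S3
    body = s3.get_object(Bucket=bucket, Key=raw_key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))

    # transform: keep only completed records and normalise the charge to cents
    curated = [
        {"account": r["account"], "charge_cents": int(round(float(r["charge"]) * 100))}
        for r in rows if r.get("status") == "completed"
    ]

    # load: write the curated CSV back to S3
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["account", "charge_cents"])
    writer.writeheader()
    writer.writerows(curated)
    s3.put_object(Bucket=bucket, Key=curated_key, Body=out.getvalue().encode("utf-8"))

# example call (requires AWS credentials and real object keys):
# etl_usage_report("example-bucket", "raw/usage.csv", "curated/usage.csv")

At the scale described in this role, the same pattern is usually expressed in Spark/EMR or Glue rather than plain Python, but the extract-transform-load structure is identical.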

 

Skills & Qualifications:

  • 5+ years of experience in a Data Engineer role
  • Experience with relational SQL and NoSQL databases, including Postgres, Oracle and Cassandra.
  • Experience with data pipeline and workflow management tools.
  • Experience with AWS cloud services: S3, EC2, EMR, RDS, Redshift.
  • Experience with stream-processing systems: Storm, Spark-Streaming, Amazon Kinesis, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, NodeJs.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with both structured and unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.

 

 

Education:

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.

 

Working conditions: 

  • Remote

 

Tangoe reaffirms its commitment to providing equal opportunities for employment and advancement to qualified employees and applicants. Individuals will be considered for positions for which they meet the minimum qualifications and are able to perform without regard to race, color, gender, age, religion, disability, national origin, veteran status, sexual orientation, gender identity, current unemployment status, or any other basis protected by federal, state or local laws. Tangoe is an Equal Opportunity Employer - Minority/Female/Disability/Veteran/Current Unemployment Status.

 

Qualifications

  • Bachelor’s degree in Computer Science, Engineering or a related subject

See more jobs at Zensark Tecnologies Pvt Ltd

Apply for this job

+30d

Data Engineer--US Citizens/Green Card

Software Technology IncBrentsville, VA, Remote
Lambdanosqlsqlazureapigit

Software Technology Inc is hiring a Remote Data Engineer--US Citizens/Green Card

Job Description

I am a Lead Talent Acquisition Specialist at STI (Software Technology Inc) and currently looking for a Data Engineer.

Below is a detailed job description. Should you be interested, please feel free to reach me via call or email: amrutha.duddula@stiorg.com / 732-664-8807

Title:  Data Engineer
Location: Manassas, VA (Remote until Covid)
Duration: Long Term Contract

Required Skills:

  • Experience working in Azure Databricks and Apache Spark
  • Proficient programming in Scala, Python or Java
  • Experience developing and deploying data pipelines for streaming and batch data ingested from multiple sources (see the sketch below)
  • Experience creating data models and implementing business logic using the tools and languages listed
  • Working knowledge of Kafka, Structured Streaming, the DataFrame API, SQL and NoSQL databases
  • Comfortable with APIs, Azure Data Lake, Git, notebooks, Spark clusters, Spark jobs and performance tuning
  • Excellent communication skills
  • Familiarity with Power BI, Delta Lake, Lambda Architecture, Azure Data Factory and Azure Synapse is a plus
  • Telecom domain experience is not required but is very helpful
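
For context, here is a minimal sketch of the kind of streaming pipeline these requirements describe: PySpark Structured Streaming reading a Kafka topic and appending to a Delta table on a Databricks-style runtime. The broker, topic and paths are assumptions made for the example, not the client's actual system.

  # Minimal Structured Streaming sketch; broker, topic and paths are hypothetical.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("events_stream").getOrCreate()

  # Read the Kafka topic as a streaming DataFrame.
  events = (
      spark.readStream.format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
  )

  # Append to a Delta table, with a checkpoint for fault-tolerant progress tracking.
  query = (
      events.writeStream.format("delta")
      .option("checkpointLocation", "/mnt/datalake/_checkpoints/events")
      .outputMode("append")
      .start("/mnt/datalake/bronze/events")
  )
  query.awaitTermination()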

Thank you,
Amrutha Duddula
Lead Talent Acquisition Specialist
Software Technology Inc (STI)

Email: amrutha.duddula@stiorg.com
Phone : 732-664-8807
www.stiorg.com
www.linkedin.com/in/amruthad/

See more jobs at Software Technology Inc

Apply for this job

+30d

Senior Data science Engineer - Remote

RapidSoft CorpReston, VA, Remote
agileDesignjavapython

RapidSoft Corp is hiring a Remote Senior Data science Engineer - Remote

Job Description

Duties and Responsibilities:

  • Develop data solutions in collaboration with other team members and software engineering teams that meet and anticipate business goals and strategies
  • Work with senior data science engineers in analyzing and understanding all aspects of data, including source, design, insight, technology and modeling
  • Develop and manage scalable data processing platforms for both exploratory and real-time analytics
  • Oversee and develop algorithms for quick data acquisition, analysis and evolution of the data model to improve search and recommendation engines
  • Document and demonstrate solutions
  • Design system specifications and provide standards and best practices
  • Support and mentor junior data engineers by providing advice and coaching
  • Make informed decisions quickly and take ownership of services and applications at scale
  • Be a persistent, creative problem solver, constantly striving to improve and iterate on both processes and technical solutions
  • Remain cool and effective in a crisis
  • Understand business needs and know how to create the tools to manage them
  • Take initiative, own the problem and own the solution
  • Other duties as assigned

Supervisory Responsibilities:

  • None

Minimum Qualifications:

  • Bachelor's Degree in Data Engineering, Computer Science, Information Technology, or a related discipline (or equivalent experience)
  • 8+ years of experience in data engineering development
  • 5+ years of experience working in object-oriented programming languages such as Python or Java
  • Experience working in an Agile environment

See more jobs at RapidSoft Corp

Apply for this job

+30d

Senior Data Engineer

phDataIndia - Remote
scalasqlazurejavapythonAWS

phData is hiring a Remote Senior Data Engineer

Job Application for Senior Data Engineer at phData

See more jobs at phData

Apply for this job

+30d

Lead Data Engineer

phDataIndia - Remote
scalasqlazurejavapythonAWS

phData is hiring a Remote Lead Data Engineer

Job Application for Lead Data Engineer at phData

See more jobs at phData

Apply for this job

+30d

Senior Data Engineer

RemoteRemote-Southeast Asia
airflowsqljenkinspythonAWS

Remote is hiring a Remote Senior Data Engineer

About Remote

Remote is solving global remote organizations’ biggest challenge: employing anyone anywhere compliantly. We make it possible for businesses big and small to employ a global team by handling global payroll, benefits, taxes, and compliance. Check out remote.com/how-it-works to learn more or if you’re interested in adding to the mission, scroll down to apply now.

Please take a look at remote.com/handbook to learn more about our culture and what it is like to work here. Not only do we encourage folks from all ethnic groups, genders, sexuality, age and abilities to apply, but we prioritize a sense of belonging. You can check out independent reviews by other candidates on Glassdoor or look up the results of our candidate surveys to see how others feel about working and interviewing here.

All of our positions are fully remote. You do not have to relocate to join us!

What this job can offer you

This is an exciting time to join the growing Data Team at Remote, which today consists of over 15 Data Engineers, Analytics Engineers and Data Analysts spread across 10+ countries. Across the team we're focused on driving business value through impactful decision making. We're in a transformative period where we're laying the foundations for scalable company growth across our data platform, which truly serves every part of the Remote business. This team would be a great fit for anyone who loves working collaboratively on challenging data problems, and making an impact with their work. We're using a variety of modern data tooling on the AWS platform, such as Snowflake and dbt, with SQL and Python being extensively employed.

As a Senior Data Engineer you will join our Data team, composed of Data Analysts and Data Engineers, and make a personal difference in the global employment space. We support decision making and operational reporting needs by translating data into actionable insights for non-data professionals at Remote. We're mainly using SQL, Python, Meltano, Airflow, Redshift, Metabase and Retool.

What you bring

  • Experience in data engineering; high-growth tech company experience is a plus
  • Strong experience with building data extraction/transformation pipelines (e.g. Meltano, Airbyte) and orchestration platforms (e.g. Airflow)
  • Strong experience in working with SQL, data warehouses (e.g. Redshift) and data transformation workflows (e.g. dbt)
  • Solid experience using CI/CD (e.g. Gitlab, Github, Jenkins)
  • Experience with data visualization tools (e.g. Metabase) is considered a plus
  • A self-starter mentality and the ability to thrive in an unstructured and fast-paced environment
  • You have strong collaboration skills and enjoy mentoring
  • You are a kind, empathetic, and patient person
  • You write and speak fluent English
  • Experience working remotely is not required, but is considered a plus

Key Responsibilities

  • Playing a key role in Data Platform Development & Maintenance:
    • Managing and maintaining the organization's data platform, ensuring its stability, scalability, and performance.
    • Collaborating with cross-functional teams to understand their data requirements and optimize data storage and access, while protecting data integrity and privacy.
    • Developing and testing architectures that enable data extraction and transformation to serve business needs.
  • Improving further our Data Pipeline & Monitoring Systems:
    • Designing, developing, and deploying efficient Extract, Load, Transform (ELT) processes to acquire and integrate data from various sources into the data platform (see the orchestration sketch after this list).
    • Identifying, evaluating, and implementing tools and technologies to improve ELT pipeline performance and reliability.
    • Ensuring data quality and consistency by implementing data validation and cleansing techniques.
    • Implementing monitoring solutions to track the health and performance of data pipelines and identify and resolve issues proactively.
    • Conducting regular performance tuning and optimization of data pipelines to meet SLAs and scalability requirements.
  • Digging deep into dbt modelling:
    • Designing, developing, and maintaining dbt (data build tool) models for data transformation and analysis.
    • Collaborating with Data Analysts to understand their reporting and analysis needs and translating them into dbt models, making sure they respect internal conventions and best practices.
  • Driving our Culture of Documentation:
    • Creating and maintaining technical documentation, including data dictionaries, process flows, and architectural diagrams.
    • Collaborating with cross-functional teams, including Data Analysts, SREs (Site Reliability Engineers) and Software Engineers, to understand their data requirements and deliver effective data solutions.
    • Sharing knowledge and offering mentorship, providing guidance and advice to peers and colleagues, and creating an environment that empowers collective growth.
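
For illustration, here is a minimal sketch of the kind of ELT orchestration described above: a daily Airflow DAG that loads data from a hypothetical source and then triggers a dbt run. All names, paths and the schedule are assumptions for the example, not Remote's actual pipeline.

  # Minimal ELT orchestration sketch (Airflow 2.x); all identifiers are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator
  from airflow.operators.python import PythonOperator


  def extract_and_load():
      # Placeholder: pull records from a hypothetical source API and load them
      # into a raw schema in the warehouse (e.g. Redshift).
      print("extracting and loading raw data")


  with DAG(
      dag_id="example_elt",               # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",                  # 'schedule' is the Airflow 2.4+ keyword
      catchup=False,
  ) as dag:
      extract_load = PythonOperator(task_id="extract_load", python_callable=extract_and_load)
      dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run")  # assumes a dbt project on the worker
      extract_load >> dbt_run

Monitoring, data-quality checks and alerting (the validation responsibilities above) would typically be added as further tasks or sensors on a DAG like this.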

Practicals

  • You'll report to: Engineering Manager - Data
  • Team: Data 
  • Location: For this position we welcome everyone to apply, but we will prioritise applications from the following locations as we encourage our teams to diversify: Vietnam, Indonesia, Taiwan and South Korea
  • Start date: As soon as possible

Remote Compensation Philosophy

Remote's Total Rewards philosophy is to ensure fair, unbiased compensation and fair equity pay along with competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labor practices and therefore ensure that we pay above in-location rates. We hope to inspire other companies to support global talent-hiring and bring local wealth to developing countries.

At first glance our salary bands seem quite wide - here is some context. At Remote we have international operations and a globally distributed workforce. We use geo ranges to consider geographic pay differentials as part of our global compensation strategy to remain competitive in various markets while hiring globally.

The base salary range for this full-time position is $53,500 USD to $131,300 USD. Our salary ranges are determined by role, level and location, and our job titles may span more than one career level. The actual base pay for the successful candidate in this role is dependent upon many factors such as location, transferable or job-related skills, work experience, relevant training, business needs, and market demands. The base salary range may be subject to change.

Application process

  1. Interview with recruiter
  2. Interview with future manager
  3. Async exercise stage 
  4. Interview with team members


Benefits

Our full benefits & perks are explained in our handbook at remote.com/r/benefits. As a global company, each country works differently, but some benefits/perks are for all Remoters:
  • work from anywhere
  • unlimited personal time off (minimum 4 weeks)
  • quarterly company-wide day off for self care
  • flexible working hours (we are async)
  • 16 weeks paid parental leave
  • mental health support services
  • stock options
  • learning budget
  • home office budget & IT equipment
  • budget for local in-person social events or co-working spaces

How you’ll plan your day (and life)

We work async at Remote which means you can plan your schedule around your life (and not around meetings). Read more at remote.com/async.

You will be empowered to take ownership and be proactive. When in doubt you will default to action instead of waiting. Your life-work balance is important and you will be encouraged to put yourself and your family first, and fit work around your needs.

If that sounds like something you want, apply now!

How to apply

  1. Please fill out the form below and upload your CV in PDF format.
  2. We kindly ask you to submit your application and CV in English, as this is the standardised language we use here at Remote.
  3. If you don’t have an up to date CV but you are still interested in talking to us, please feel free to add a copy of your LinkedIn profile instead.

We will ask you to voluntarily tell us your pronouns at interview stage, and you will have the option to answer our anonymous demographic questionnaire when you apply below. As an equal employment opportunity employer it’s important to us that our workforce reflects people of all backgrounds, identities, and experiences and this data will help us to stay accountable. We thank you for providing this data, if you chose to.

See more jobs at Remote

Apply for this job

+30d

Data Engineer

Out There MediaMarousi, Attica, Greece, Remote Hybrid
MLmobile

Out There Media is hiring a Remote Data Engineer

We are offering an amazing opportunity to a talented and skilled Data Engineer to play a key role in leveraging big data analytics and technology to improve Out There Media's overall business operations.

About OTM

Out There Media (OTM) is a leading international mobile advertising company that uniquely links mobile operators with advertisers, public figures and international organizations via its proprietary, award-winning technology, Mobucks™, while also offering world class creative services.

Out There Media is trusted by some of the world’s most popular brands, such as Unilever, P&G, Disney, Starbucks, Budweiser, Netflix, Coca Cola, L’Oréal and McDonalds, international organizations such as the UN and the WHO, major mobile operators including Verizon, T-Mobile, Vodafone, Starhub, O2 Telefonica, Telcel (America Movil), MTN Group and many more, as well as Public Figures and Political Parties. The Company is headquartered in Vienna, Austria with operations across the globe.

What’s In It for You

As a Data Engineer at Out There Media, you will play a critical role in building and maintaining the data infrastructure that powers our technology platform Mobucks™. You will be responsible for designing, developing, and deploying data pipelines that ingest, transform, and store massive datasets from various sources. Your work will directly impact the success of our advertising campaigns and the overall growth of the company.

Your Role and Responsibilities

  • Analyze and organize raw data
  • Develop and maintain datasets, and evaluate them for accuracy and quality in order to improve data quality and efficiency
  • Build data systems and pipelines
  • Prepare data for prescriptive and predictive modeling
  • Build processes for data mining, data modeling and data streaming, and create efficient ML models to interpret trends and patterns
  • Develop analytical tools and programs
  • Ensure that all data systems meet the high transactional requirements as well as industry best practices
  • Integrate up-and-coming data management and software engineering technologies into existing data structures
  • Create custom software components and analytics applications
  • Research new uses for existing data
  • Employ an array of technological languages and tools to connect systems together
  • Define data retention policies and install/update disaster recovery procedures
  • Stay on the leading edge of technology, and continuously monitor and test the system to ensure optimized performance.
  • Work with internal teams in understanding the business requirements and implementing solutions to achieve business goals

  • Degree in a related field such as software/computer engineering, applied mathematics, physics, statistics or business informatics
  • 5+ years of working experience acquired in companies dealing heavily with big data (e.g. research companies)
  • Proficient in SQL
  • Excellent command of Google BigQuery, Google Data Studio and Dataflow (see the sketch after this list)
  • Experience with technologies and tools such as Spark, Spark SQL, Flink and GeoSpark
  • Experience with Hadoop and MapReduce processes
  • Good knowledge of Big Data querying tools, such as Hive and others
  • Experience building and using clustering and classification algorithms and methods
  • Experience with integration of data from multiple data sources
  • Excellent English skills, written and oral.
  • Intellectual curiosity to find new and unusual ways to solve data management issues.
  • Ability to approach data organization challenges while keeping an eye on what’s important.
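
As a small illustration of the BigQuery work referenced above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset and columns are hypothetical, not OTM's actual schema.

  # Minimal BigQuery sketch; project, dataset and columns are hypothetical.
  from google.cloud import bigquery

  client = bigquery.Client()  # assumes application-default credentials are configured

  sql = """
      SELECT campaign_id, COUNT(*) AS impressions
      FROM `example-project.ads.events`
      WHERE event_date = CURRENT_DATE()
      GROUP BY campaign_id
      ORDER BY impressions DESC
      LIMIT 10
  """

  for row in client.query(sql).result():
      print(row.campaign_id, row.impressions)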

Working At OTM:

Our culture is fast-paced, entrepreneurial, and rewarding. If you are passionate about representing a company that truly believes in delighting its customers using cutting-edge, market leading digital technologies and products, you are at the right place!

  • We offer a hybrid working environment
  • A unique, diverse and multi-national company culture
  • The compensation package includes a competitive remuneration, dependent on experience and skills, and a bonus upon achievement of KPIs, in line with the company’s performance and rewards scheme.
  • Referral bonus scheme
  • Opportunity to work on cutting-edge technology and make a real impact
  • Be part of a team that is revolutionizing the mobile advertising industry

We are an equal-opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at Out There Media

Apply for this job