Remote data platform engineer jobs

At Turing, we are looking for highly skilled remote data platform engineers to help architect and implement cloud-native data pipelines and infrastructure that enable analytics and machine learning on rich datasets. Get an opportunity to work with leading U.S. companies and rise quickly through the ranks.

Find remote software jobs with hundreds of Turing clients

Job description

Job responsibilities

  • Build scalable data architecture, including data extractions and data transformation
  • Build cost-effective and strategic solutions by developing a clear understanding of data platform cost
  • Design and build data products and data flows for the continued expansion of the data platform
  • Write high-performance, well-styled, validated, and documented code
  • Participate in data cleansing and data quality initiatives
  • Build automated data engineering pipelines

Minimum requirements

  • Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
  • At least 3 years of experience in data engineering (rare exceptions for highly skilled developers)
  • Experience developing real-time data streaming pipelines using Change Data Capture (CDC), Kafka, and StreamSets/NiFi/Flume/Flink
  • Proficiency with big data technologies such as Hadoop and Hive
  • Experience with Change Data Capture tooling such as IBM InfoSphere, Oracle GoldenGate, Attunity, or Debezium
  • Experience with ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modeling, and data wrangling
  • Expertise in Unix and DevOps automation tools such as Terraform and Puppet, and experience deploying applications to at least one major public cloud provider such as AWS, GCP, or Azure
  • Extensive experience with RDBMS and at least one NoSQL database such as MongoDB, plus ETL pipelines, Python, Java APIs using Spring Boot, and writing complex SQL queries
  • Strong Python, Java, and other backend development skills
  • Fluency in English for effective communication
  • Ability to work full-time (40 hours/week) with a 4-hour overlap with US time zones

Preferred skills

  • Basic understanding of data systems or data pipelines
  • Knowledge of moving trained ML models into production data pipelines
  • A good understanding of cloud data warehouses such as Snowflake
  • Grasp of modern code development practices
  • Experience with core AWS services and concepts (S3, IAM, autoscaling groups)
  • Basic DevOps knowledge
  • Knowledge of relational database modeling concepts and SQL skills
  • Strong analytical, consultative, and communication skills

Interested in this job?

Apply to Turing today.

Apply now

Why join Turing?

1. Elite US Jobs

Turing’s developers earn better than market pay in most countries, working with top US companies.

2. Career Growth

Grow rapidly by working on challenging technical and business problems on the latest technologies.

3. Developer success support

While matched, enjoy 24/7 developer success support.

Developer reviews

Read Turing.com reviews from developers across the world and learn what it’s like working with top U.S. companies.

4.65 out of 5, based on developer reviews as of June 2024

How to become a Turing developer?

Work with the best software companies in just 4 easy steps
  1. Create your profile

    Fill in your basic details: name, location, skills, salary, and experience.

  2. Take our tests and interviews

    Solve questions and appear for a technical interview.

  3. Receive job offers

    Get matched with the best US and Silicon Valley companies.

  4. Start working on your dream job

    Once you join Turing, you’ll never have to apply for another job.


How to become a Data platform engineer?

Data platform engineering is a broad topic that encompasses a range of titles, with a primary focus on constructing trustworthy infrastructure that allows for continuous data flow in a data-driven environment. These engineers make clean and raw data from a variety of sources available, allowing employees across the firm to use it to make data-driven decisions.

The process of designing and building large-scale data collection, storage, and analysis systems is known as data platform engineering. It's a broad field with applications in nearly every industry. Organizations may collect massive amounts of data, but they need the right people and technology to ensure that the data reaches data scientists and analysts in a usable form.

Data platform engineers create systems that collect, process and transform raw data into information that data scientists and business analysts can understand in a variety of situations. The ultimate goal is to make data more accessible to enterprises so they can evaluate and improve their performance.

What is the scope of Data platform engineering?

One of the most in-demand jobs in the industry is that of a remote Data platform engineer. Businesses regard them highly in all sectors, and they are handsomely paid for their work.

As more companies jump on the Big Data bandwagon and mine data for relevant insights, the demand for data-related jobs rises by the day, and data platform engineers are no exception. Companies are constantly on the lookout for qualified Data platform engineers who can deal with large volumes of complex data to provide relevant business insights. The income potential of Data platform engineers has also improved, since the work requires a high level of Big Data experience and skill.

What are the roles and responsibilities of Data platform engineers?

The major role of a Data platform engineer is to develop and create a reliable infrastructure for converting data into forms that Data Scientists can interpret. In addition to building scalable algorithms to turn semi-structured and unstructured data into useful representations, remote Data platform engineers must be able to recognize trends in large datasets. Raw data is prepared and transformed by Data platform engineers so that it may be used for analytical or operational reasons. Let's look at the duties of remote Data platform engineer jobs now:

  • Create a scalable data architecture that includes data extraction and transformation.
  • Build a thorough grasp of data platform costs to develop cost-effective and strategic solutions.
  • Create data products and data flows to support the data platform's continuing growth.
  • Participate in data cleansing and data quality projects.
  • Build automated data engineering pipelines.
  • Write high-performance, well-styled, validated, and documented code.
  • Translate intricate designs into functional and technical requirements.
  • Store data using Hadoop, NoSQL, and other technologies.
  • Create models and uncover hidden data patterns.
  • Integrate data management techniques into the organization's existing structure.
  • Assist in developing a sound infrastructure with third-party integrations.
  • Create high-performance, scalable web services to track data.

How to become a Data platform engineer?

With the right combination of skills and experience, you may start or advance your career in Data platform engineering. Data platform engineers often have a bachelor's degree in computer science or a related field. A degree can help you build a solid foundation of knowledge in this continuously changing field. A master's degree can also help you advance your career and open doors to higher-paying positions.

Data platform engineers are often educated in computer science, engineering, applied mathematics, or a related IT field. Because the work requires a high degree of technical knowledge, prospective Data platform engineers may find that a boot camp or certification is insufficient.

You'll need knowledge of SQL database design and programming abilities in a range of languages, including Python and Java. If you already have a background in IT or a related field like mathematics or analytics, a boot camp or certification might help you create a CV for remote Data platform engineering jobs.

If you don't have any prior experience with technology or IT, you may need to complete a more intensive program to demonstrate your understanding. If you don't already have an undergraduate degree, you may need to enroll in one. If you have an undergraduate degree in an unrelated field, consider a master's degree in data analytics or data platform engineering.

You'll have a better sense of how your expertise fits into that function if you spend some time going through job advertisements to see what companies are searching for.

Interested in remote Data Platform Engineer jobs?

Become a Turing developer!

Apply now

Skills required to become a Data platform engineer

1. Hadoop and Spark

The Apache Hadoop software library is a framework that enables the distributed processing of enormous data volumes across clusters of machines using simple programming models. It is designed to scale from a single server to thousands of machines, each offering local computation and storage. The ecosystem supports a number of programming languages, including Python, Scala, Java, and R. While Hadoop is a powerful tool for massive datasets, it has drawbacks, such as the latency of batch processing and the amount of hand-coding required. Apache Spark is a data processing engine that supports stream processing, i.e., data input and output in near real-time; it covers much of the same ground as Hadoop while typically running much faster by processing data in memory.
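The MapReduce pattern at the heart of Hadoop can be sketched in plain Python. This is a toy illustration, not Hadoop's actual API: the three functions simulate the map, shuffle, and reduce phases for a word count, and the sample records are invented.

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle step: group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

records = ["Spark and Hadoop", "Hadoop scales out"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
```

In a real cluster, the map and reduce phases run in parallel across many machines and the shuffle moves data over the network; frameworks like Spark expose the same idea through higher-level DataFrame and RDD operations.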

2. C++

C++ is a relatively low-level yet powerful programming language for quickly computing over big data sets when you don't have a prebuilt algorithm. Its speed and fine-grained control over memory make it well suited to latency-sensitive workloads, such as applying real-time predictive analytics to retrain on incoming data while keeping the system of record consistent.

3. Data Warehousing

A data warehouse is a relational database designed to be queried and analyzed to find information. Its purpose is to provide a long-term perspective of data over time, whereas an operational database is continuously refreshed with real-time data. Data platform engineers must be familiar with the most prominent data warehousing solutions, such as Amazon Redshift on AWS. AWS experience is a requirement for practically all remote Data platform engineer jobs.

4. Azure

Azure is a Microsoft cloud platform that enables Data platform engineers to build large-scale data analytics applications. It has an easy-to-deploy integrated analytics solution that makes supporting applications and servers quite systematic. The bundle includes pre-built services for everything from data storage to advanced machine learning. Because Azure is so popular, some Data platform engineers choose to specialize in it.

5. SQL and NoSQL

SQL is the industry-standard language for designing and managing relational database systems (tables consisting of rows and columns). Non-tabular NoSQL databases come in a variety of forms and sizes depending on their data model, such as graph or document stores. Data platform engineers must be familiar with database management systems (DBMS), software that provides an interface to databases for storing and retrieving information.
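The relational side of this can be shown with Python's built-in sqlite3 module. This is a minimal sketch: the `events` table, its columns, and the sample rows are invented for illustration.

```python
import sqlite3

# In-memory relational database: create a table, insert rows, run an aggregate query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user TEXT, action TEXT)")
conn.executemany(
    "INSERT INTO events (user, action) VALUES (?, ?)",
    [("alice", "login"), ("bob", "login"), ("alice", "purchase")],
)
# Count events per user, the kind of aggregation an analyst would ask for.
rows = conn.execute(
    "SELECT user, COUNT(*) FROM events GROUP BY user ORDER BY user"
).fetchall()
conn.close()
```

A NoSQL document store would instead keep each event as a free-form document and leave the grouping to application code or a query engine.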

6. ETL (Extract, Transform, Load)

The process of taking data from a source, transforming it into a format that can be analyzed, and storing it in a data warehouse is known as ETL (Extract, Transform, Load). This approach uses batch processing to help users assess data relevant to a specific business situation. The ETL process gathers data from various sources, applies business rules to it, and then loads the transformed data into a database or business intelligence platform where it can be accessed and used by everyone in the organization.
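The extract, transform, and load stages can be sketched as three small functions. This is a toy pipeline, assuming an in-memory CSV source and an in-memory list standing in for the warehouse; the field names and the "large order" business rule are invented for the example.

```python
import csv
import io

# Invented source data; in practice this would come from a file, API, or database.
RAW_CSV = "order_id,amount\n1,19.99\n2,5.00\n3,250.00\n"

def extract(source):
    """Extract: read raw rows from the CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: convert types and apply a business rule."""
    out = []
    for row in rows:
        amount = float(row["amount"])
        out.append({
            "order_id": int(row["order_id"]),
            "amount": amount,
            "is_large": amount >= 100,  # example business rule
        })
    return out

def load(rows, warehouse):
    """Load: append the transformed rows to the target store."""
    warehouse.extend(rows)
    return warehouse

warehouse = load(transform(extract(RAW_CSV)), [])
```

Production ETL frameworks add scheduling, retries, and data quality checks around exactly this extract → transform → load skeleton.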

Interested in remote Data Platform Engineer jobs?

Become a Turing developer!

Apply now

How to get remote Data platform engineer jobs?

Working as a programmer can be very rewarding, but it requires a solid grasp of programming languages and consistent practice until you reach mastery. Furthermore, a product vision is necessary for staying in sync with the team, and good communication skills aid collaboration with team members and prioritization of work according to the long-term goal.

Turing has made things a bit easier in your hunt for remote Data platform engineering jobs. Turing offers the best remote Data platform engineer jobs, which can help you advance your career as a Data platform engineer. Join a network of the world's best developers to get full-time, long-term remote Data platform engineer jobs with better compensation and career progression.

Why become a Data platform engineer at Turing?

Elite US jobs

Long-term opportunities to work for amazing, mission-driven US companies with great compensation.

Career growth

Work on challenging technical and business problems using cutting-edge technology to accelerate your career growth.

Exclusive developer community

Join a worldwide community of elite software developers.

Once you join Turing, you’ll never have to apply for another job.

Turing's commitments are long-term and full-time. As one project draws to a close, our team gets to work identifying the next one for you in a matter of weeks.

Work from the comfort of your home

Turing allows you to work according to your convenience. We have flexible working hours and you can work for top US firms from the comfort of your home.

Great compensation

Working with top US corporations, Turing developers make more than the standard market pay in most nations.

How much does Turing pay their Data platform engineer?

Turing can help by recommending a salary range that allows you to settle into a lucrative, long-term position. Most of our recommendations are based on market conditions and our clients' needs. Because Turing believes in providing the most suitable opportunities, each Data platform engineer is free to set their own rate based on their skills and experience.

Frequently Asked Questions

Turing is an AGI infrastructure company specializing in post-training large language models (LLMs) to enhance advanced reasoning, problem-solving, and cognitive tasks. Founded in 2018, Turing leverages the expertise of its globally distributed technical, business, and research experts to help Fortune 500 companies deploy customized AI solutions that transform operations and accelerate growth. As a leader in the AGI ecosystem, Turing partners with top AI labs and enterprises to deliver cutting-edge innovations in generative AI, making it a critical player in shaping the future of artificial intelligence.

After uploading your resume, you will have to go through three tests: a seniority assessment, a tech stack test, and a live coding challenge. Once you clear these tests, you are eligible to apply to a wide range of jobs available based on your skills.

No, you don't need to pay any taxes in the U.S. However, you might need to pay taxes according to your country’s tax laws. Also, your bank might charge you a small amount as a transaction fee.

We, at Turing, hire remote developers for over 100 skills like React/Node, Python, Angular, Swift, React Native, Android, Java, Rails, Golang, PHP, Vue, among several others. We also hire engineers based on tech roles and seniority.

Communication is crucial for success while working with American clients. We prefer candidates with a B1 level of English i.e. those who have the necessary fluency to communicate without effort with our clients and native speakers.

Currently, we have openings only for developers because of the volume of job demands from our clients. But in the future, we might expand to other roles too. Do check our careers page periodically to see if we have a position that suits your skills and experience.

Our unique differentiation lies in the combination of our core business model and values. To advance AGI, Turing offers temporary contract opportunities. Most AI Consultant contracts last up to 3 months, with the possibility of monthly extensions—subject to your interest, availability, and client demand—up to a maximum of 10 continuous months. For our Turing Intelligence business, we provide full-time, long-term project engagements.

No, the service is absolutely free for software developers who sign up.

Ideally, a remote developer needs to have at least 3 years of relevant experience to get hired by Turing, but at the same time, we don't say no to exceptional developers. Take our test to find out if we could offer something exciting for you.

View more FAQs


Leadership

In a nutshell, Turing aims to make the world flat for opportunity. Turing is the brainchild of serial AI entrepreneurs Jonathan and Vijay, whose previous, successfully acquired AI firm was powered by exceptional remote talent. Turing's band of innovators also includes high-profile investors such as Facebook's first CTO Adam D'Angelo, executives from Google, Amazon, and Twitter, and Foundation Capital.

Equal Opportunity Policy

Turing is an equal opportunity employer. Turing prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity or expression, age, disability status, protected veteran status, or any other characteristic protected by law.

Explore remote developer jobs

Python Automation and Task Creator

About Turing:

Based in San Francisco, California, Turing is the world’s leading research accelerator for frontier AI labs and a trusted partner for global enterprises deploying advanced AI systems. Turing supports customers in two ways: first, by accelerating frontier research with high-quality data, advanced training pipelines, plus top AI researchers who specialize in coding, reasoning, STEM, multilinguality, multimodality, and agents; and second, by applying that expertise to help enterprises transform AI from proof of concept into proprietary intelligence with systems that perform reliably, deliver measurable impact, and drive lasting results on the P&L.


Role Overview

We are seeking a detail-oriented Computer-Using Agent (CUA) to perform structured automation tasks within Ubuntu-based virtual desktop environments. In this role, you will interact with real desktop applications using Python-based GUI automation tools, execute workflows with high accuracy, and document every step taken.

This is a hands-on execution role ideal for candidates who are comfortable working with Linux systems, virtualization tools, and repeatable task workflows in a controlled environment.


What Does the Day-to-Day Look Like?

  • Set up and operate Ubuntu virtual machines using VMware or VirtualBox
  • Automate mouse and keyboard interactions using Python-based GUI automation (e.g., PyAutoGUI)
  • Execute predefined workflows across various Ubuntu desktop applications
  • Ensure tasks are completed accurately and can be reproduced consistently
  • Capture and document all actions, steps, and outcomes in a structured format
  • Collaborate with the delivery team to refine automation scenarios and workflows
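The workflow steps above can be sketched as a small, reproducible step runner. This is a hedged illustration: the step names, hotkey, and typed command are invented, and the PyAutoGUI calls (`hotkey`, `write`) are only attempted when the library is importable, so the structured logging side of the role can be exercised even on a machine without a display.

```python
import datetime

try:
    import pyautogui  # real GUI-automation library; import can fail on headless systems
except Exception:
    pyautogui = None

def run_step(log, name, action, *args):
    """Execute one workflow step (if a GUI is available) and record it for documentation."""
    if pyautogui is not None:
        getattr(pyautogui, action)(*args)  # e.g. pyautogui.hotkey("ctrl", "alt", "t")
    log.append({
        "step": name,
        "action": action,
        "args": args,
        "at": datetime.datetime.now().isoformat(timespec="seconds"),
    })
    return log

# Invented example workflow: open a terminal, then launch GIMP by typing its name.
log = []
run_step(log, "open terminal", "hotkey", "ctrl", "alt", "t")
run_step(log, "launch gimp", "write", "gimp\n")
```

Keeping every action in a structured log like this is what makes a GUI workflow auditable and reproducible, which the role above explicitly requires.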

Required Skills & Qualifications

  • Hands-on experience with Ubuntu/Linux desktop environments
  • Working knowledge of PyAutoGUI or similar GUI automation frameworks
  • Basic Python scripting and debugging skills
  • Familiarity with VMware or VirtualBox
  • Strong attention to detail and ability to follow step-by-step instructions
  • Clear documentation and reporting skills

Application Domains

You will be expected to perform automation tasks across the following Ubuntu-based environments:

  • os – Core Ubuntu desktop environment
  • chrome – Ubuntu with Google Chrome
  • gimp – Ubuntu with GIMP
  • libreoffice_calc – LibreOffice Calc
  • libreoffice_writer – LibreOffice Writer
  • libreoffice_impress – LibreOffice Impress
  • thunderbird – Thunderbird email client
  • vlc – VLC media player
  • vs_code – Visual Studio Code

Perks of Freelancing With Turing

  • Fully remote work.
  • Opportunity to work on cutting-edge AI projects with leading LLM companies.

Offer Details:

  • Commitment required: 40 hours per week with 4 hours of overlap with PST
  • Engagement type: Contractor assignment (no medical/paid leave)
  • Duration of contract: 2 months
Industry: Holding Companies & Conglomerates · 10K+ employees · Skills: Python
Knowledge Graph Expert (Knowledge Graph / SQL / LLM)
About the Client

Our mission is to bring community and belonging to everyone in the world. We are a community of communities where people can dive into anything through experiences built around their interests, hobbies, and passions. With more than 50 million people visiting 100,000+ communities daily, it is home to the most open and authentic conversations on the internet.

About the Team

The Ads Content Understanding team's mission is to build the foundational engine for interpretable and frictionless understanding of all organic and paid content on our platform. The team leverages state-of-the-art applied ML and a robust Knowledge Graph (KG) to extract high-quality, monetization-focused signals from raw content, powering better ads, marketplace performance, and actionable business insights at scale.

We are seeking a Knowledge Graph Expert to help us grow and curate our KG of entities and relationships, bringing it to the next level.


About the Role


We are looking for a detail-oriented and strategic Knowledge Graph Curator. In this role, you will sit at the intersection of AI automation and human judgment. You will not only manage incoming requests from partner teams but also proactively shape the growth of our Knowledge Graph (KG) to ensure high fidelity, relevance, and connectivity. You will serve as the expert human-in-the-loop, validating LLM-generated entities and ensuring our graph represents the "ground truth" for the business.

 

Key Responsibilities


  • Onboard new entities to the Knowledge Graph maintained by the Ads team
  • Data entry and data labeling for automation of content understanding capabilities
  • LLM prompt tuning for content understanding automation

What You'll Do


1. Pipeline Management & Prioritization

  • Manage Inbound Requests: Act as the primary point of contact for partner teams (Product, Engineering, Analytics) requesting new entities or schema changes.
  • Strategic Prioritization: Triage the backlog of requests by assessing business impact, urgency, and technical feasibility.

2. AI-Assisted Curation & Human-in-the-Loop

  • Oversee Automation: Interact with internal tooling to review entities generated by Large Language Models (LLMs). You will approve high-confidence data, edit near-misses, and reject hallucinations.
  • Quality Validation: Perform rigorous QA on batches of generated entities to ensure they adhere to the strict ontological standards and factual accuracy required by the KG.
  • Model Feedback Loops: Participate in ad-hoc labeling exercises (creation of Golden Sets) to measure current model quality and provide training data to fine-tune classifiers and extraction algorithms.

3. Data Integrity & Stakeholder Management

  • Manual Curation & Debugging: Investigate bug reports from downstream users or automated anomaly detection systems. You will manually fix data errors, merge duplicate entities, and resolve conflicting relationships.
  • Feedback & Reporting: Close the loop with partner teams. You will report on the status of their requests, explain why certain modeling decisions were made, and educate stakeholders on how to best query the new data.


Qualifications for this role:

  • Knowledge Graph Fundamentals: Understanding of graph concepts (Nodes, Edges, Properties)
  • Taxonomy & Ontology: Experience categorizing data, managing hierarchies, and understanding semantic relationships between entities.
  • Data Literacy: Proficiency in navigating complex datasets. Experience with SQL, SPARQL, or Cypher is a strong plus.
  • AI/LLM Familiarity: Understanding of how Generative AI works, common failure modes (hallucinations), and the importance of ground-truth data in training.
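The node/edge/property model named in the qualifications can be illustrated with plain Python structures. This is a minimal sketch, not the team's actual schema: the entity names, types, and relations below are invented for the example.

```python
# Nodes carry properties; edges are (source, relation, target) triples.
nodes = {
    "gaming":     {"type": "interest"},
    "elden_ring": {"type": "video_game"},
    "fromsoft":   {"type": "company"},
}
edges = [
    ("elden_ring", "instance_of", "gaming"),
    ("elden_ring", "developed_by", "fromsoft"),
]

def neighbors(entity, relation=None):
    """Entities reachable from `entity`, optionally filtered by relation type."""
    return [dst for src, rel, dst in edges
            if src == entity and (relation is None or rel == relation)]

def validate(nodes, edges):
    """Curation check: every edge endpoint must be a known entity."""
    return [e for e in edges if e[0] not in nodes or e[2] not in nodes]
```

Curation work like approving an LLM-generated entity amounts to adding a node and its edges, then running integrity checks like `validate` to catch dangling references; graph query languages such as SPARQL or Cypher express the `neighbors` pattern declaratively.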

Operational & Soft Skills

  • Analytical Prioritization: Ability to look at a list of 50 tasks and determine the 5 that will drive the most business value.
  • Attention to Detail: An "eagle eye" for spotting inconsistencies, typos, and logical fallacies in data.
  • Stakeholder Communication: Ability to translate complex data modeling concepts into clear language for non-technical product managers and business stakeholders.
  • Tool Proficiency: Comfort learning proprietary internal tools, ticketing systems (e.g., Jira), and spreadsheet manipulation (Excel/Google Sheets).


Offer Details


  • Full-time contractor or full-time employment, depending on the country
  • Remote only, full-time dedication (40 hours/week)
  • 8 hours of overlap with the Netherlands
  • Competitive compensation package
  • Opportunities for professional growth and career development
  • Dynamic and inclusive work environment focused on innovation and teamwork
Industry: Media & Internet · 251-10K employees · Skills: LLM, SQL

Apply for the best jobs

View more openings
Turing books $87M at a $1.1B valuation to help source, hire and manage engineers remotely
Turing named one of America's Best Startup Employers for 2022 by Forbes
Ranked no. 1 in The Information’s "50 Most Promising Startups of 2021" in the B2B category
Turing named to Fast Company's World's Most Innovative Companies 2021 for placing remote devs at top firms via AI-powered vetting
Turing helps entrepreneurs tap into the global talent pool to hire elite, pre-vetted remote engineers at the push of a button

Work with the world's top companies

Create your profile, pass Turing Tests and get job offers as early as 2 weeks.