🟢 Are you in Brazil, Argentina or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!
We invite a Junior Data Engineer to join our dynamic team supporting a major enterprise client in modernizing their data platform. In this role, you’ll assist in migrating and transforming legacy data pipelines to a modern cloud environment. You’ll work closely with senior engineers, architects, DevOps, QA, and product stakeholders, gaining hands-on data engineering experience and contributing to reliable, scalable data solutions.
🟩 What's in it for you:
- Join a supportive delivery team built on collaboration, transparency, and mutual respect
- Get hands-on exposure to a high-impact, real-world data platform transformation project
- Grow your skills with modern technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
✅ Is that you?
- 1.5+ years of experience in data engineering, data analytics, or software development
- Basic understanding of data warehouse concepts and ETL pipelines
- Good knowledge of SQL and willingness to learn Snowflake or similar data storage technologies
- Basic experience with Python for scripting or simple ETL tasks
- Experience with GCP platforms (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
- Understanding of version control (Git) and eagerness to learn CI/CD and IaC tools
- Degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience
- Strong communication and collaboration skills
- Upper-Intermediate English level

Desirable:
- Basic exposure to streaming data pipelines and event-driven architectures
- Familiarity with basic scripting and containerization tools (Bash, Docker)
- Basic understanding of data lakehouse concepts (Iceberg tables)
- Awareness of data transformation tools like dbt
- Familiarity with AI-assisted tools like GitHub Copilot

🧩 Key responsibilities and your contribution
In this role, you'll assist with key data engineering activities while supporting the team in delivering high-quality data solutions.
- Assist in reviewing and analyzing existing ETL solutions for migration to the new architecture
- Support the migration of batch and streaming data pipelines to the GCP Landing Zone
- Help build and maintain data transformations with dbt, supporting ELT pipelines in Snowflake
- Help refactor and map data jobs
- Assist in setting up and maintaining monitoring and alerting for data pipelines
- Contribute to migrating historical data to Iceberg tables with guidance from senior engineers
- Collaborate with senior engineers and stakeholders to understand requirements and implement solutions
- Participate in code reviews, team discussions, and technical planning to develop your skills

🎾 What's working at Dev.Pro like?
Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone — no matter your background.
🌐 We are 99.9% remote — you can work from anywhere in the world
🌴 Get 30 paid days off per year to use however you like — vacations, holidays, or personal time
🤒 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
⚡️ Partially covered health insurance after probation, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
💵 We pay in U.S. dollars and cover all approved overtime
📓 Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events
Our next steps:
✅ Submit a CV in English — ✅ Intro call with a Recruiter — ✅ Internal interview — ✅ Client interview — ✅ Offer
Interested? Find out more:
📋 How we work
💻 LinkedIn Page
📈 Our website
💻 IG Page