Required Skills and Qualifications:
- Bachelor's degree in Systems Engineering, Computer Science, Data Science, Industrial Engineering, or a related field
- 4+ years of experience designing, developing, and optimizing data pipelines
- Advanced English language skills (B2+) for effective communication in an international environment
- Programming skills: Strong proficiency in Python (Pandas, NumPy, PySpark) and SQL (Snowflake, PostgreSQL, MySQL, SQL Server)
- Data Pipelines & ETL: Hands-on experience designing, developing, and maintaining scalable ETL processes and data ingestion/transformation workflows
- Databases: Experience with relational and NoSQL databases (MongoDB, Cassandra)
- Cloud & Big Data: Experience with AWS (S3) and cloud data warehouses (BigQuery, Snowflake), plus familiarity with big data frameworks (Hadoop, Spark)
- DevOps & Orchestration: Experience with containerization (Docker), version control (Git), and workflow automation tools such as Airflow and cron jobs