**Location**: Colombia, Perú, Argentina, Costa Rica, or Bolivia

**Work mode**: Remote - Full time

**English**: Advanced

**Position Overview**:

We are seeking a **Senior Python Developer** with strong expertise in end-to-end data pipeline development. This role is ideal for professionals who enjoy working closely with business stakeholders and contributing to high-impact, scalable data solutions in cloud-native environments. You'll be part of a cross-functional team, driving ingestion, transformation, modeling, and visualization efforts, while also playing a key role in the integration of modern AI tools and techniques into our data workflows.

**Key Responsibilities**:

- Design, build, and maintain **robust data pipelines** for ingestion, transformation, and modeling.
- Implement **automated testing** and **error handling** to ensure pipeline reliability and data quality.
- Collaborate directly with business teams to develop visualizations using **Plotly** and **Streamlit**.
- Operate within a **cloud-based architecture (AWS)**, working with services such as **S3**, **ECS**, **CloudWatch**, and **Managed Workflows for Apache Airflow (MWAA)**.
- Deploy and manage infrastructure using tools like **Docker**, **Terraform**, and **Azure DevOps**.
- Integrate and monitor modern AI solutions using tools and frameworks such as **RAG**, **vector stores**, **LangChain**, **N8N**, **MCP servers**, and **AWS SageMaker**.

**Required Skills & Experience**:

- **5+ years** of hands-on experience with **Python** for data engineering and automation.
- Strong proficiency in **SQL** and **data modeling** techniques.
- Proven experience designing and implementing **data pipelines** in production environments.
- Solid understanding of **data validation**, **testing frameworks**, and **error handling** strategies.
- Deep familiarity with the **AWS ecosystem**, including services like **S3**, **ECS**, **CloudWatch**, and **Airflow (MWAA)**.
- Experience creating business-facing **data visualizations** using **Plotly** and **Streamlit**.
- Proven track record working with **AI system integrations**, including **LangChain**, **RAG**, **vector stores**, **N8N**, **MCP servers**, **AWS SageMaker**, and other tools for AI model orchestration and monitoring.
- Comfortable working independently and collaboratively in a **remote, distributed team**.

**Nice to Have**:

- Experience with **Airflow**, **Snowflake**, **Terraform**, **Docker**, **MongoDB**, **Elastic**, **PostgreSQL**, and **Azure DevOps**.

**About Encora**

Encora is a global company that offers Software and Digital Engineering solutions. Our practices include Cloud Services, Product Engineering & Application Modernization, Data & Analytics, Digital Experience & Design Services, DevSecOps, Cybersecurity, Quality Engineering, and AI & LLM Engineering, among others.

At Encora, we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.