What do you need?

- Experience building and maintaining data pipelines using tools such as Apache Airflow or Pentaho, Spark, or BigQuery; automation, ETL/ELT, and data workflow orchestrators.
- Strong knowledge of database management systems, both SQL and NoSQL (e.g. PostgreSQL, MySQL, MongoDB).
- Familiarity with automation and monitoring tools to ensure data infrastructure performance (e.g. Cloud Logging, Apache Airflow).
- Knowledge of agile methodologies for project management and delivery.

What do we offer you?

- Growth opportunities
- A diverse, dynamic, and multicultural environment
- Benefits for your financial and emotional well-being
- Competitive wages

If you are seeking a fulfilling opportunity to join our team and contribute to our overall success, we invite you to apply for the DataOps role. Take the next step in your career journey and make a significant impact today!
