(D-496) | DATA INFRASTRUCTURE ENGINEER

AgileEngine


AgileEngine Job Description

We are one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. Our custom software solutions help companies across 15+ industries change the lives of millions. This is an exciting opportunity for you to work with the best, learn every day, and grow professionally.

Key Responsibilities:
- Design, build, and maintain modern, robust real-time and batch data analytics pipelines;
- Develop and maintain declarative data models and transformations;
- Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;
- Deploy and configure BI tooling for data analysis;
- Collaborate with product, finance, legal, and compliance teams to create dashboards and reports for business operations, regulatory obligations, and customer needs;
- Establish, communicate, and enforce data governance policies;
- Document and share best practices for schema management, data integrity, availability, and security;
- Protect sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;
- Identify and communicate data platform needs, including additional tooling and staffing;
- Work with cross-functional teams to define requirements, plan projects, and execute on those plans.

Requirements:
- 5+ years of engineering and data analytics experience;
- Strong SQL and Python/Scala skills for complex data analysis;
- Hands-on experience building automation tooling and pipelines in Python, Scala, Go, or TypeScript;
- Experience with modern data pipeline and warehouse tools (e.g., Snowflake, Databricks, Spark, AWS Glue);
- Proficiency with declarative data modeling and transformation tools (e.g., dbt, SQLMesh);
- Familiarity with real-time data streaming (e.g., Kafka, Spark);
- Experience configuring and maintaining data orchestration platforms (e.g., Airflow);
- Background working with cloud-based data lakes and secure data practices;
- Ability to work autonomously and drive projects end to end;
- Strong bias for simplicity, speed, and avoiding overengineering;
- Upper-intermediate English level.

Nice to Have:
- Experience with infrastructure-as-code tools (e.g., Terraform);
- Familiarity with container orchestration (e.g., Kubernetes);
- Prior experience managing external data vendors;
- Exposure to Web3/crypto data systems;
- Background working cross-functionally with compliance, legal, and finance teams;
- Experience driving company-wide data governance or permissioning frameworks.

Benefits:
- Professional growth: accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps;
- Competitive compensation: we match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities;
- A selection of exciting projects: join projects that deliver modern solutions to top-tier clients, including Fortune 500 enterprises and leading product brands;
- Flextime: tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office, whatever makes you the happiest and most productive.
