DATA INFRASTRUCTURE ENGINEER ID35383

80.000.000 - 120.000.000


AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions. If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place - guaranteed! :)

WHAT YOU WILL DO
- Architect, build, and maintain modern, robust real-time and batch data analytics pipelines;
- Develop and maintain declarative data models and transformations;
- Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;
- Deploy and configure BI tooling for data analysis;
- Work closely with product, finance, legal, and compliance teams to build dashboards and reports that support business operations, regulatory obligations, and customer needs;
- Establish, communicate, and enforce data governance policies;
- Document and share best practices regarding schema management, data integrity, availability, and security;
- Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;
- Identify and communicate data platform needs, including additional tooling and staffing;
- Work with cross-functional teams to define requirements, plan projects, and execute on the plan.
MUST HAVES
- 5+ years of engineering and data analytics experience;
- Strong SQL and Python skills for complex data analysis;
- Hands-on experience building automation tooling and pipelines in Python, Go, or TypeScript;
- Experience with modern data pipeline and warehouse tools such as Databricks, Spark, or AWS Glue;
- Experience with infrastructure-as-code using Terraform;
- Proficiency with declarative data modeling and transformation tools such as dbt;
- Familiarity with real-time data streaming (e.g., Kafka, Spark);
- Experience configuring and maintaining data orchestration platforms such as Airflow;
- Background working with cloud-based data lakes and secure data practices;
- Ability to work autonomously and drive projects end-to-end;
- Upper-intermediate English level.

NICE TO HAVES
- Familiarity with container orchestration (e.g., Kubernetes);
- Prior experience managing external data vendors;
- Exposure to Web3/crypto data systems;
- Background working cross-functionally with compliance, legal, and finance teams;
- Experience driving company-wide data governance or permissioning frameworks;
- Strong bias for simplicity, speed, and avoiding overengineering.

THE BENEFITS OF JOINING US
- Professional growth: accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: we match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
- A selection of exciting projects: join projects with modern solutions development and top-tier clients, including Fortune 500 enterprises and leading product brands.
- Flextime: tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office - whatever makes you the happiest and most productive.

Your application doesn't end here! To unlock the next steps, check your email and complete your registration on our Applicant Site.
Incomplete registration will result in the termination of your application process.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting
