BIG DATA ARCHITECT

80.000.000 - 120.000.000


Data architects lead the design and implementation of data collection, storage, transformation, orchestration (movement), and consumption to achieve optimal value from data. They are the technical leaders within data delivery teams. They play a key role in modeling data for optimal reuse, interoperability, security, and accessibility, as well as in the design of efficient ingestion and transformation pipelines. They ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration, and they instill trust through the use of data quality frameworks and tools.

The data architect at Chevron works predominantly within the Azure Data Analytics Platform but is not limited to it. The data architect is responsible for optimizing the cost of delivering data, for ensuring compliance with enterprise standards, and for contributing to the evolution of those standards as technologies and best practices change.

QUALIFICATIONS

Requirements:
- At least 2 years of proven experience as a Data Architect or in a similar role.
- Strong knowledge of data modeling, data warehousing, and data integration techniques.
- Experience with big data technologies (e.g., Hadoop, Spark) and data lake solutions (e.g., Azure Data Lake, AWS Lake Formation).
- Familiarity with cloud platforms (e.g., Microsoft Azure, AWS, Google Cloud Platform).
- Strong understanding of data governance and security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).

Preferred qualifications:
- Familiarity with Azure AI/ML services and Azure analytics offerings (Event Hub, Azure Stream Analytics); scripting with Ansible.
- Experience with machine learning and advanced analytics.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of CI/CD pipelines and automated testing frameworks.

SELECTION CRITERIA

Technical skills:
- Experience with big data technologies, data lake solutions, DBMSs, and cloud platforms.
- Experience designing data pipelines for optimal performance, resiliency, and cost efficiency.
- Experience in data modeling (ERDs, star and/or snowflake schemas) and physical model design for analytics and application integration.

Influential leadership and change agency:
- Track record of defining and implementing data architecture frameworks and governance around master data, metadata, and modeling.
- Experience defining and applying frameworks, processes, and standards to ensure trust in data solutions and to shepherd data ownership and security.
- Demonstrated application of standards in architecture design.

Business acumen:
- Demonstrated experience in more than one industry vertical.
- Experience translating business objectives and goals into technical architecture for data solutions.
- Ability to drive business results by building cost-optimal data landscapes.

Professional skills:
- Excellent communication and collaboration skills.
- Ability to work across boundaries and functions, with evidence of strong relationship building.
