Want to build a stronger, more sustainable future and cultivate your career? Join Cargill’s global team of 155,000 employees who use new technologies, dynamic insights and over 154 years of experience to connect farmers with markets, customers with ingredients, and people and animals with the food they need to thrive.
Job Purpose and Impact
The Data Engineering Lead will oversee a team of data engineering professionals in the design, build, and operation of high-performance, data-centric products and solutions using modern engineering practices and technologies in the assigned region. In this role, you will collaborate with multiregional teams and the broader engineering management community to build an effective and authentic engineering culture and to maximize the effectiveness of the engineering capabilities. You will also provide leadership to drive local maturity and help evolve data engineering as a capability in the company. This person will collaborate with engineering teams throughout Cargill, establishing standards and best practices that promote consistent use of processes and technology to enable data engineering patterns.
Key Accountabilities
- Develop strategic partnerships with local leaders and key stakeholders to understand business expectations and priorities for data engineering products and solutions.
- Provide broad oversight and advise supervisors on data engineering delivery, testing, and operations to support the strategic direction of the business.
- Lead a local team in delivering advanced and varied data engineering solutions utilizing modern platforms and technologies.
- Provide broad oversight and guidance for activities which include analysis of business requirements and preparation of detailed technical specifications to support the strategic direction of the business.
- Champion high-level direction and support strategy development and technical planning of system or application technology aligned with internal and external software compliance standards.
- Develop plans and deliver results in a fast-changing business and regulatory environment while leading and developing a team of experienced professionals and supervisors, with authority for talent management decisions related to hiring, performance, and disciplinary actions. Coordinate subordinate managers and supervisors in selecting staff to align with the current and future needs of the organization.
- Define engineering standards and promote their awareness and adoption, driving toward consistent, modern architecture approaches.
- Other duties as assigned
Qualifications
- Bachelor’s degree in a related field or equivalent experience
- Minimum of 4 years supervising software/data engineering development teams
- Minimum of 6 years with enterprise-level software delivery and release management practices
- Comfortable working in a global enterprise setting, managing multiple stakeholder requirements and priorities
- Passionate about all things data, with an understanding of the art of the possible in realizing unprecedented business value
- Excited to be a change agent, bringing forward new standards and practices that ensure quality, performance, and scale for the data engineering community at large
- Deep experience with codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases to accelerate engineering of robust data pipelines
- Deep experience with log capture, alerting, and monitoring strategies leveraging technologies such as Datadog, PagerDuty, ServiceNow, and Logstash
- Proficient in configuring and administering CI/CD pipelines leveraging any of the following technologies: GitHub, GitLab, Jenkins, Azure DevOps, AWS CodeDeploy, Nexus
- Proficient with git and enterprise branching strategies
- Proficient with authoring custom pipeline tasks for AWS CodePipeline or GitHub Actions
- Proficient with workload scheduling and orchestration leveraging tools such as Airflow and AWS Step Functions
- Proficient with data engineering approaches leveraging Java, Spark, SQL, and Scala
- Deep understanding of data engineering best practices related to architecture patterns supporting varying data types, volume, and velocity
- Experience with code quality and security tools such as SonarQube, Checkmarx, or Fortify
- Experience with data storage, processing, and warehousing technologies such as Hadoop, S3, and Snowflake
- Ability to engage senior business leaders and translate between the needs of the business and the demands and capacities of technology
- Technical background in computer science, software engineering, database systems, or distributed systems
- Direct experience having built and deployed robust, complex production systems supporting data and analytic solutions at scale
- Confirmed data engineering experience including technologies of big data, machine learning or data streaming
- Confirmed knowledge of best practice patterns and how to apply them to data architectures and solutions
- Confirmed experience with full stack engineering including application development or application programming interface engineering