Databricks AWS Architect - Hybrid | Minnestar

Affinity Plus Credit Union

At Affinity Plus, every employee understands how their work affects our members' experience, and we strive to provide an experience that can't be found anywhere else. Great service starts with great employees, and that is why we focus on providing not only the best place our members will ever bank but also the best place our employees will ever work. Between our one-of-a-kind culture, incredible benefits, and work/life balance, we believe you will feel the Affinity Plus difference.

Position Overview:

A Databricks AWS Architect at Affinity Plus builds secure, highly scalable big data solutions that achieve tangible, data-driven outcomes while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS architecture. This role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.


Duties and Responsibilities:

  • Work closely with team members to lead and drive enterprise solutions, advising on key decision points on trade-offs, best practices, and risk mitigation
  • Promote, emphasize, and leverage big data solutions to deploy performant systems that appropriately auto-scale, are highly available, fault-tolerant, self-monitoring, and serviceable
  • Use a defense-in-depth approach in designing data solutions and AWS infrastructure
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Assist and advise data engineers in the preparation and delivery of raw data for prescriptive and predictive modeling
  • Help developers identify, design, and implement process improvements, using automation tools to optimize data delivery
  • Build the infrastructure required for optimal extraction, loading, and transformation of data from a wide variety of data sources
  • Work with the developers to maintain and monitor scalable data pipelines
  • Perform root cause analysis to answer specific business questions and identify opportunities for process improvement
  • Build out new API integrations to support continuing increases in data volume and complexity
  • Collaborate with Enterprise Digital Intelligence (Edi) team to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making across the organization
  • Implement processes and systems to monitor data quality and security, ensuring production data is accurate and available for key stakeholders and the business processes that depend on it
  • Employ change management best practices to ensure that data remains readily accessible to the business
  • Maintain tools, processes and associated documentation to manage API gateways and underlying infrastructure
  • Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs


Qualifications and Skills:

Required Qualifications and Skills

  • 2+ years’ experience with Databricks
  • 3+ years of related experience in designing secure, scalable, and cost-effective big data architectures
  • 5+ years’ experience in a software development, data engineering, or data analytics field using Python, Scala, Spark, Java, or equivalent technologies
  • Bachelor’s or Master’s degree in Big Data, Computer Science, Engineering, Mathematics, or similar area of study or equivalent work experience
  • Knowledge of different programming and scripting languages
  • Mid-level knowledge of code versioning tools (such as Git, Mercurial, or SVN)
  • Expert proficiency in Python, C++, Java, R, and SQL
  • Proficiency in software engineering best practices employed in the software development lifecycle, including coding standards, code reviews, source control management, build processes, testing and operations
  • Advanced SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of database systems
  • Strong understanding and knowledge of financial industry technology standards, compliance requirements, and experience working with audit and regulatory bodies
  • Mid-level experience with the AWS Cloud platform's core foundational native services and expertise in its first-party Big Data & AI services, such as AWS Glue, Amazon Athena, Amazon Kinesis, Amazon QuickSight, and Glue crawlers
  • Proficiency in working with all types of operating systems, especially Linux and Unix. RHEL RHCSA/RHCE certifications preferred.
  • Proficient experience with architecture design, build, and optimization of big data collection, ingestion, storage, processing, and visualization
  • Proficient in building, automating and deploying data pipelines and workflows into end-user facing applications
  • Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
  • Basic experience with or knowledge of agile methodologies
  • Technical expertise with data models, data mining, and segmentation techniques
  • Working knowledge of RESTful APIs, OAuth2 authorization framework and security best practices for API Gateways
  • Expert at diagnostics and problem resolution, providing third-level support
  • Familiarity with unstructured data sets (e.g., voice, image, log files, social media posts, email)
  • Possess an organized methodical approach and bring a continuous improvement mindset
  • Demonstrated predisposition for action, willingness to partner, and innate drive to provide an exceptional member and employee experience
  • Highly creative and innovative technologist that thrives independently and collaborates well in a team environment
  • Strong analytical and decision-making skills with a high degree of accuracy
  • Strong verbal, written, and interpersonal communication skills
  • Time Management skills and the ability to prioritize workloads


Preferred Qualifications

  • Experience in a financial institution
  • Expert-level knowledge of AWS infrastructure configurations and service offerings
  • Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake
  • Experience with MDM using data governance solutions (Collibra preferred)
  • Advanced technical certifications: AWS Certified Data Analytics, DASCA Big Data Engineering and Analytics
  • AWS Certified Cloud Practitioner, Solutions Architect, Certified Developer, or SysOps Administrator certifications preferred


Workplace Environment:

  • Working in a stationary position for 80% of the work day
  • Utilizing the telephone and video conferencing 10-20% of the day
  • Moving, lifting and/or carrying 30 pounds with or without accommodations
  • Bending, twisting, kneeling, stooping or crouching when appropriate, on occasion
  • Repetitive movements, including but not limited to typing, mousing, phones, etc.
  • Requires onsite presence based on coordination of work with other employees and/or departments.  May require travel to attend on-site meetings/events for collaboration, connection, project work, All-Employee Day, etc.


Required Work Schedule:

Standard Monday through Friday business hours with participation in a 24/7 on-call rotation, as well as a willingness to work after hours as needed for upgrades, equipment replacement, etc. Consistent and reliable attendance is a required essential function of this role to meet the needs of the department/team and organization.


Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities

The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor’s legal duty to furnish information. 41 CFR 60-1.35(c)

Job Type: Full-time
Compensation Type: Salaried
Location: Remote & St. Paul, MN
Posted by Affinity Plus Credit Union on August 29, 2022