Databricks Lead - 25-02773 Job at LeadStack Inc., Cincinnati, OH

  • LeadStack Inc.
  • Cincinnati, OH

Job Description

Job Title: Databricks Technical Lead

Location: Cincinnati, OH (Hybrid)

Duration: 12+ Months – Contract to Hire

Pay Rate: $80/hr - $90/hr on W2

Top 3 Skills: Databricks-certified technical lead, Python, data domain experience

This is a contract-to-hire position, and candidates must be prepared to transition to a full-time employee role.

Job Summary:

We are seeking a seasoned Databricks Technical Lead to join our HR Systems Engineering team. This role is pivotal in enhancing the experience of over 420,000 associates by leading the design, build, and optimization of our data platform, services, APIs, and cloud migrations.

As a product-centric role within an agile delivery framework, you will ensure our data solutions align with business objectives and deliver tangible value. Join us in an organization recognized as one of the best places to work in IT for seven consecutive years.

Required Qualifications:

• 10+ years of experience in data engineering, big data, or API development, including 3+ years in a leadership role.

• Proven experience leading product-centric data engineering initiatives in an agile delivery environment.

• Expertise in Azure Databricks, Apache Spark, Azure SQL, and other Microsoft Azure services.

• Strong programming skills in Python, Scala, and SQL for data processing and API development.

• Experience in building and managing APIs (REST, GraphQL, gRPC) and microservices.

• Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Delta Lake.

• Proficiency in CI/CD pipelines, Terraform/Bicep, and Infrastructure-as-Code.

• Experience with data security and compliance measures (e.g., encryption, access control, auditing) for sensitive HR and employee data.

• Strong problem-solving skills, with a focus on performance tuning, security, and cost optimization.

• Experience with containerization (Docker, Kubernetes) and event-driven architecture is a plus.

• Exposure to Informatica for ETL/ELT and data integration.

• Excellent communication and leadership skills in a fast-paced environment.

Preferred Qualifications:

• Microsoft Certified: Azure Solutions Architect Expert or Databricks Certified Data Engineer/Architect certification.

• Experience with agile development methodologies such as Scrum or SAFe.

• Familiarity with machine learning workflows in Azure Databricks.

• Knowledge of Azure API Management and Event Hub for API integration.

• Experience with Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).

• Hands-on experience with Oracle HCM database models and APIs, including integrating this data into enterprise data solutions.

• Experience with HR Analytics.

Key Responsibilities

Core Responsibilities:

  • Develop and maintain the HR data domain in a secure, compliant, and efficient manner in accordance with best practices.
  • Lead a data engineering team responsible for designing scalable, high-performance data solutions, APIs, and microservices in Azure Databricks, Azure SQL, and Informatica.
  • Ensure the highest levels of security and privacy regarding sensitive data.
  • This is a job for an exceptional professional who deeply understands big data processing, data architecture, cloud migrations, API development, data security, and agile methodologies in the Azure ecosystem.

Key Responsibilities:

Azure Databricks & Big Data Architecture:

• Design and implement scalable data pipelines and architectures on Azure Databricks.

• Optimize ETL/ELT workflows, ensuring efficiency in data processing, storage, and retrieval.

• Leverage Apache Spark, Delta Lake, and Azure-native services to build high-performance data solutions.

• Ensure best practices in data governance, security, and compliance within Azure environments.

• Troubleshoot and fine-tune Spark jobs for optimal performance and cost efficiency.
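
For context on the kind of pipeline work described above, a minimal illustrative PySpark/Delta Lake sketch might look like the following; the storage paths, column names, and table layout are hypothetical placeholders, not details of the actual platform.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

    # Land raw HR events from ADLS (hypothetical container and path).
    raw = (
        spark.read.format("json")
        .load("abfss://raw@examplestorage.dfs.core.windows.net/hr_events/")
    )

    # Basic curation: typed event date, de-duplication, and null filtering.
    curated = (
        raw.withColumn("event_date", F.to_date("event_timestamp"))
        .dropDuplicates(["event_id"])
        .filter(F.col("event_date").isNotNull())
    )

    # Write to a partitioned Delta table; partition pruning helps both performance and cost.
    (
        curated.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("abfss://curated@examplestorage.dfs.core.windows.net/hr_events_delta/")
    )

    # Compaction can then be scheduled separately, e.g. spark.sql("OPTIMIZE delta.`...`").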

Azure SQL & Cloud Migration:

• Lead the migration of data workloads from Azure SQL to Azure Databricks, ensuring a seamless transition.

• Design and implement scalable data pipelines to extract, transform, and load (ETL/ELT) data from Azure SQL into Databricks Delta Lake.

• Optimize Azure SQL queries and indexing strategies before migration to enhance performance in Databricks.

• Implement best practices for data governance, security, and compliance throughout the migration process.

• Work with Azure Data Factory (ADF), Informatica, and Databricks to automate and orchestrate migration workflows.

• Ensure seamless integration of migrated data with APIs, machine learning models, and business intelligence tools.

• Establish performance monitoring and cost-optimization strategies post-migration to ensure efficiency.
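
As an illustration of the migration pattern above, a simplified sketch that copies one Azure SQL table into Delta Lake via JDBC is shown below; the server, secret scope, and table names are assumptions for illustration only.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical Azure SQL source; credentials come from a Databricks secret scope.
    jdbc_url = "jdbc:sqlserver://example-server.database.windows.net:1433;database=hrdb"
    password = dbutils.secrets.get(scope="hr-migration", key="sql-password")  # dbutils exists only inside Databricks

    source = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", "dbo.employee_assignments")
        .option("user", "migration_user")
        .option("password", password)
        .load()
    )

    # Land the snapshot in a bronze Delta table; downstream jobs refine and validate it.
    (
        source.write.format("delta")
        .mode("overwrite")
        .saveAsTable("hr_bronze.employee_assignments")
    )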

API & Services Development:

• Design and develop RESTful APIs and microservices for seamless data access and integrations.

• Implement scalable and secure API frameworks to expose data processing capabilities.

• Work with GraphQL, gRPC, or streaming APIs for real-time data consumption.

• Integrate APIs with Azure-based data lakes, warehouses, Oracle HCM, and other enterprise applications.

• Ensure API performance, monitoring, and security best practices (OAuth, JWT, Azure API Management).
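
To illustrate the API responsibilities above, here is a minimal REST endpoint sketch with JWT validation; the framework choice (FastAPI with PyJWT) and all names are assumptions rather than the team's actual stack.

    import jwt  # PyJWT
    from fastapi import Depends, FastAPI, HTTPException
    from fastapi.security import OAuth2PasswordBearer

    app = FastAPI()
    oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
    SECRET_KEY = "replace-with-a-managed-secret"  # placeholder; real keys come from a vault

    def current_claims(token: str = Depends(oauth2_scheme)) -> dict:
        """Validate the bearer token and return its claims."""
        try:
            return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
        except jwt.PyJWTError:
            raise HTTPException(status_code=401, detail="Invalid or expired token")

    @app.get("/headcount/{department}")
    def headcount(department: str, claims: dict = Depends(current_claims)) -> dict:
        # In practice this would query the curated data layer; hard-coded here for brevity.
        return {"department": department, "headcount": 0, "requested_by": claims.get("sub")}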

HR Data Domain & Security:

• Build and manage the HR data domain, ensuring a scalable, well-governed, and secure data architecture.

• Implement role-based access control (RBAC), encryption, and data masking to protect sensitive employee information.

• Ensure compliance with GDPR, CCPA, HIPAA, and other data privacy regulations.

• Design and implement audit logging and monitoring to track data access and modifications.

• Work closely with HR and security teams to define data retention policies, access permissions, and data anonymization strategies.

• Enable secure API and data sharing mechanisms for HR analytics and reporting while protecting employee privacy.

• Work with Oracle HCM data structures and integrate them within the Azure Databricks ecosystem.
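
As one concrete illustration of the data-protection work above, a masked, analytics-facing copy of a sensitive table might be produced as sketched below; table and column names are hypothetical, and in Databricks this would typically be paired with catalog-level grants.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    employees = spark.table("hr_silver.employees")  # assumed curated table

    # Hash identifiers irreversibly, redact free-text PII, and drop fields not needed downstream.
    masked = (
        employees
        .withColumn("national_id", F.sha2(F.col("national_id").cast("string"), 256))
        .withColumn("home_address", F.lit("REDACTED"))
        .drop("salary")
    )

    masked.write.format("delta").mode("overwrite").saveAsTable("hr_gold.employees_masked")

    # Access is then restricted with grants, e.g.:
    # spark.sql("GRANT SELECT ON TABLE hr_gold.employees_masked TO `hr-analytics`")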

Product-Centric & Agile Delivery:

• Drive a product-centric approach to data engineering, ensuring alignment with business objectives and user needs.

• Work within an agile delivery framework, leveraging Scrum/Kanban methodologies to ensure fast, iterative deployments.

• Partner with product managers and business stakeholders to define data-driven use cases and prioritize backlog items.

• Promote a continuous improvement mindset, leveraging feedback loops and data-driven decision-making.

• Implement DevOps and CI/CD best practices to enable rapid deployment and iteration of data solutions.
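
As a small example of the CI/CD practice referenced above, data jobs are typically gated by unit tests on their transformation logic before deployment; the function and status codes below are hypothetical.

    import pytest

    def normalize_status(code: str) -> str:
        """Map raw HR status codes to a small controlled vocabulary."""
        mapping = {"A": "active", "L": "leave", "T": "terminated"}
        cleaned = (code or "").strip().upper()
        if cleaned not in mapping:
            raise ValueError(f"Unknown status code: {code!r}")
        return mapping[cleaned]

    @pytest.mark.parametrize("raw,expected", [("a", "active"), (" L ", "leave"), ("T", "terminated")])
    def test_normalize_status(raw, expected):
        assert normalize_status(raw) == expected

    def test_normalize_status_rejects_unknown_codes():
        with pytest.raises(ValueError):
            normalize_status("X")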

Leadership & Collaboration:

• Provide technical leadership and mentorship to a team of data engineers and developers.

• Collaborate closely with business stakeholders, product managers, HR teams, and architects to translate requirements into actionable data solutions.

• Advocate for automation, DevOps, and Infrastructure-as-Code (Terraform, Bicep) to improve efficiency.

• Foster a culture of innovation and continuous learning within the data engineering team.

• Stay updated on emerging trends in Azure Databricks, Azure SQL, Informatica, Oracle HCM, and cloud technologies.

Job Tags

Full time, Contract work
