About Meytier Premier Employers
Premier Employers are industry leaders that have forged exclusive partnerships with Meytier to advance our shared mission of offsetting bias in hiring; their opportunities are visible only to members of the Meytier community.
About the Role:
Sr. Data DevOps Engineer at AssetMark (Encino, Los Angeles, CA; Full Time)
This role requires expertise in managing data infrastructure, applying CI/CD best practices, and troubleshooting technical issues related to databases, web services, Kafka, Spark, Fabric, and Databricks. The candidate will play a critical role in streamlining release management, ensuring seamless migration of product releases from development to upper environments, and maintaining robust exception management processes.
Key Responsibilities:
CI/CD Pipeline Automation:
Design, develop, and maintain automated CI/CD pipelines for migrating product releases across multiple environments (Dev, Test, UAT, Prod).
Implement and manage pipelines for Azure Data Factory (ADF), WhereScape, Fabric, Semarchy MDM, and other data platforms.
Automate deployment processes for products that use databases, Kafka, Spark, and other data infrastructure components (a promotion-order sketch follows this list).
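For illustration only, here is a minimal Python sketch of the kind of promotion-order guard such a pipeline might enforce when moving a release from Dev toward Prod; the environment list and function names are hypothetical and are not part of Azure DevOps or any other tool named in this posting.

    # Hypothetical sketch: enforce the Dev -> Test -> UAT -> Prod promotion order
    # before a release is deployed to a target environment.
    from typing import Optional

    ENVIRONMENT_ORDER = ["Dev", "Test", "UAT", "Prod"]  # assumed promotion path


    class ReleasePromotionError(Exception):
        """Raised when a release skips or reverses the promotion path."""


    def next_environment(current: str) -> Optional[str]:
        """Return the environment a release should be promoted to next, or None after Prod."""
        idx = ENVIRONMENT_ORDER.index(current)
        return ENVIRONMENT_ORDER[idx + 1] if idx + 1 < len(ENVIRONMENT_ORDER) else None


    def validate_promotion(source: str, target: str) -> None:
        """Reject promotions that skip an environment (e.g. Dev straight to Prod)."""
        expected = next_environment(source)
        if expected != target:
            raise ReleasePromotionError(
                f"Expected {source} -> {expected}, got {source} -> {target}"
            )


    if __name__ == "__main__":
        validate_promotion("Test", "UAT")      # follows the path, passes silently
        try:
            validate_promotion("Dev", "Prod")  # skips Test and UAT
        except ReleasePromotionError as err:
            print(err)

A real pipeline would also carry artifacts and approvals; the point is only that the promotion path itself can be expressed and checked in a few lines of scripted logic.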
Infrastructure Management:
Manage and optimize data infrastructure to ensure scalability, reliability, and performance.
Configure and manage connections to databases, web services, Kafka, Spark, Fabric, and Databricks.
Abstract and automate infrastructure configurations to enable repeatable and scalable deployments (see the configuration-loading sketch after this list).
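As a purely hypothetical example of abstracting such configuration, the sketch below loads per-environment connection settings (database, Kafka, Databricks, and so on) from a single JSON file so the same deployment code can run unchanged in Dev, Test, UAT, and Prod; the file layout, field names, and the secret_ref convention are assumptions made for illustration.

    # Hypothetical sketch: one JSON file describes every connection per environment;
    # the deployment code only ever asks for (environment, connection name).
    import json
    from dataclasses import dataclass
    from pathlib import Path
    from typing import Dict


    @dataclass(frozen=True)
    class ConnectionConfig:
        """Connection settings for one backing service (database, Kafka, Databricks, ...)."""
        kind: str        # e.g. "database", "kafka", "databricks"
        host: str
        port: int
        secret_ref: str  # name of a secret in the secret store, never the secret value itself


    def load_connections(config_path: Path, environment: str) -> Dict[str, ConnectionConfig]:
        """Read the connections defined for one environment from a file shaped like
        {"UAT": {"orders_db": {"kind": "database", "host": "...", "port": 5432,
                               "secret_ref": "orders-db-uat"}}}."""
        raw = json.loads(config_path.read_text())
        return {name: ConnectionConfig(**settings) for name, settings in raw[environment].items()}


    # Example usage (file name and connection names are placeholders):
    # connections = load_connections(Path("connections.json"), "UAT")
    # print(connections["orders_db"].host)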
Scripting and Development:
Develop and maintain Python scripts for automation, deployment, and configuration management (a wrapper sketch follows this list).
Write and maintain Windows and Unix shell scripts for infrastructure and deployment tasks.
Collaborate with development teams to integrate scripts and automation into the CI/CD process.
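One hypothetical way Python and shell scripting meet in such a setup is a small Python wrapper that runs an existing deployment shell script and fails the CI/CD job loudly on a non-zero exit code, as sketched below; the script name deploy.sh and the use of bash (Unix) are assumptions, not details taken from this posting.

    # Hypothetical sketch: run a deployment shell script for a target environment
    # and surface its stderr if it fails, so the pipeline step stops with a clear message.
    import subprocess
    import sys


    def run_deploy_step(script: str, environment: str) -> None:
        """Invoke an existing shell script with the target environment as its argument."""
        result = subprocess.run(["bash", script, environment], capture_output=True, text=True)
        if result.returncode != 0:
            print(result.stderr, file=sys.stderr)
            raise SystemExit(f"Deployment step '{script}' failed for environment '{environment}'")
        print(result.stdout)


    if __name__ == "__main__":
        # Usage (names are placeholders): python run_deploy.py deploy.sh UAT
        run_deploy_step(sys.argv[1], sys.argv[2])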
Release and Configuration Management:
Implement best practices for release management, configuration management, and exception management.
Ensure version control, change management, and auditability across all deployments.
Troubleshoot and resolve deployment issues across environments.
Technical Support and Troubleshooting:
Provide technical expertise in configuring and managing connections to various data platforms and services.
Diagnose and resolve technical issues related to pipelines, infrastructure, and deployments (a basic connectivity check is sketched after this list).
Collaborate with cross-functional teams to ensure smooth delivery of product releases.
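As a hedged illustration of a first-line diagnostic for such issues, the sketch below checks TCP reachability of configured endpoints (for example a database or a Kafka broker) before any deeper investigation; the hostnames and ports are placeholders, not real endpoints.

    # Hypothetical sketch: quick reachability check for the services a pipeline depends on.
    import socket


    def is_reachable(host: str, port: int, timeout_seconds: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout_seconds):
                return True
        except OSError:
            return False


    if __name__ == "__main__":
        # Placeholder endpoints for illustration only.
        for name, host, port in [("orders-db", "db.example.internal", 5432),
                                 ("kafka-broker", "kafka.example.internal", 9092)]:
            print(f"{name}: {'reachable' if is_reachable(host, port) else 'UNREACHABLE'}")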
Tooling and Best Practices:
Utilize Azure DevOps to build, test, and deploy pipelines.
Stay updated with industry best practices and emerging technologies in DevOps, data engineering, and cloud platforms.
Advocate for and implement DevOps best practices across the organization.
Desired Profile:
Proven experience in DevOps roles, with a focus on Azure Data Factory (ADF), WhereScape, Fabric, Semarchy MDM, and Microsoft Purview.
Strong expertise in Azure DevOps for building and managing CI/CD pipelines.
Proficiency in Python scripting for automation and deployment tasks.
Experience with Windows and Unix shell scripting.
Experience with databases, Kafka, Spark, Fabric, and Databricks.
Knowledge of configuration management, release management, and exception management practices.
Ability to troubleshoot and resolve technical issues related to data infrastructure and deployments.
Strong understanding of cloud platforms (Azure preferred) and data engineering concepts.
Excellent communication and collaboration skills, with the ability to work in a cross-functional team environment.
Preferred Skills:
Experience with Semarchy MDM and Microsoft Purview for data governance and master data management.
Familiarity with Kubernetes and Docker for containerization and orchestration.
Knowledge of IaC (Infrastructure as Code) tools such as Terraform or ARM templates.
Understanding of data security and compliance best practices.
{"group":"Organization","title":"Sr.Data DevOps Engineer","skills":"<ul><li>Proven experience in DevOps roles, with a focus on Azure Data Factory (ADF), WhereScape, Fabric, Semarchy MDM, and Microsoft Purview.</li><li>Strong expertise in Azure DevOps for building and managing CI/CD pipelines.</li><li>Proficiency in Python scripting for automation and deployment tasks.</li><li>Experience with Windows and Unix shell scripting.</li><li>Experience with databases, Kafka, Spark, Fabric, and Databricks.</li><li>Knowledge of configuration management, release management, and exception management practices.</li><li>Ability to troubleshoot and resolve technical issues related to data infrastructure and deployments.</li><li>Strong understanding of cloud platforms (Azure preferred) and data engineering concepts.</li><li>Excellent communication and collaboration skills, with the ability to work in a cross-functional team environment.</li></ul><p><strong>Preferred Skills: </strong></p><ul><li>Experience with Semarchy MDM and Microsoft Purview for data governance and master data management.</li><li>Familiarity with Kubernetes and Docker for containerization and orchestration.</li><li>Knowledge of IaC (Infrastructure as Code) tools such as Terraform or ARM templates. Understanding of data security and compliance best practices.</li></ul>","zohoId":"","endDate":"2025-12-30T18:30:00.000Z","isDraft":false,"jobType":"Full Time","job_url":"2206-assetmark-sr-data-devops-engineer","agencyId":1,"benefits":"<p>AssetMark’s culture is driven by our mission and connected by our values; Heart, Integrity, Excellence and Respect. You will join a team that lives these values every day by doing the best and what is right in all we do and encouraging different ideas for continual success and innovation. 
Additionally, we offer a wide range of benefits to meet the needs of our team members and their families.</p><ul><li>Flex Time Off or Paid Time/Sick Time Off</li><li>401K – 6% Employer Match</li><li>Medical, Dental, Vision – HDHP or PPO</li><li>HSA – Employer contribution (HDHP only)</li><li>Volunteer Time Off</li><li>Career Development / Recognition</li><li>Fitness Reimbursement</li><li>Hybrid Work Schedule.</li></ul>","clientId":"35","location":[{"lat":34.1592881,"lon":-118.5011688,"zip":"","city":"Los Angeles","text":"Encino, Los Angeles, CA, USA","state":"California","country":"United States","is_city":true,"is_state":false,"is_country":false,"state_code":"CA","countryCode":"US","isLocationSet":true,"nearByHexCodes":["8429a19ffffffff","8429a57ffffffff","8429a1dffffffff","8429a11ffffffff","8429a1bffffffff","8429125ffffffff","8429a53ffffffff"],"loc_h3_hex_res4":"8429a19ffffffff","isLocationResolved":true}],"eeocFound":true,"maxSalary":"","minSalary":"","questions":[],"startDate":"2025-06-16T18:30:00.000Z","hiringSPOC":["Sneh Sharma"],"hiringTags":[],"onBehalfOf":"47","companyName":"Meytier","description":" ","isHybridJob":false,"isRemoteJob":false,"salaryRange":"","titleSkills":[{"keyword":"Devops Engineer","node_id":"i10272","removed":false,"node_ptr":[["meytier_root","information technology","systems engineering","software engineering","software development","software development areas","devops"]],"priority":-1,"alignedTF":true,"must_have":false,"node_name":"devops","extractedTF":true,"not_a_skill":false,"nice_to_have":false,"nodeAlignedWt":21,"is_industry_term":false,"gender_threshold_yn":"balanced","final_node_fft_weights":{"software development areas":1},"final_node_skarea_basetype":""}],"otherCohorts":[],"benefitsFound":true,"hiringManager":["Aditya Mishra"],"maxExperience":15,"minExperience":10,"type_of_slate":"job","hiringFunction":["Technology & IT Delivery"],"isOnPremiseJob":true,"onBehalfOfName":"AssetMark","otherlocations":[],"blindHiringMode":false,"experienceLevel":"Mid / Senior","numberOfOpenings":"1","otherCohortsName":"","otherInformation":"","responsibilities":"<p><strong>CI/CD Pipeline Automation:</strong></p><ul><li>Design, develop, and maintain automated CI/CD pipelines for migrating product releases across multiple environments (Dev, Test, UAT, Prod).</li><li>Implement and manage pipelines for Azure Data Factory (ADF), WhereScape, Fabric, Semarchy MDM, and other data platforms.</li><li>Automate deployment processes for products which use databases, Kafka, Spark, and other data <strong>infrastructure components.</strong></li></ul><p><strong>Infrastructure Management:</strong></p><ul><li>Manage and optimize data infrastructure to ensure scalability, reliability, and performance.</li><li>Configure and manage connections to databases, web services, Kafka, Spark, Fabric, and Databricks.</li><li>Abstract and automate infrastructure configurations to enable repeatable and scalable deployments.</li></ul><p><strong>Scripting and Development</strong>:</p><ul><li>Develop and maintain Python scripts for automation, deployment, and configuration management.</li><li>Write and maintain Windows and Unix shell scripts for infrastructure and deployment tasks.</li><li>Collaborate with development teams to integrate scripts and automation into the CI/CD process.</li></ul><p><strong>Release and Configuration Management:</strong></p><ul><li>Implement best practices for release management, configuration management, and exception management.</li><li>Ensure version control, change 
management, and auditability across all deployments.</li><li>Troubleshoot and resolve deployment issues across environments. 5. Technical Support and Troubleshooting:</li><li>Provide technical expertise in configuring and managing connections to various data platforms and services.</li><li>Diagnose and resolve technical issues related to pipelines, infrastructure, and deployments.</li><li>Collaborate with cross-functional teams to ensure smooth delivery of product releases.</li></ul><p><strong>Tooling and Best Practices:</strong></p><ul><li>Utilize Azure DevOps to build, test, and deploy pipelines.</li><li>Stay updated with industry best practices and emerging technologies in DevOps, data engineering, and cloud platforms.</li><li>Advocate for and implement DevOps best practices across the organization.</li></ul>","extractedSkillIds":["i10272"],"maxSeniorityLevel":6,"minSeniorityLevel":3,"otherJobReference":"","sharpenedJobTitle":"Sr.Data DevOps Engineer","job_category_group":"10","growthOppurtunities":[],"educationQualification":"Baccalaureate Degree","skillSenNormalizedTitle":"","extractSkillsFromHereToo":true,"normalizedTitleSkillsObj":{},"companyTeamJobIntroduction":"<p><strong>About Role:</strong></p><p class=\"ql-align-justify\">This role requires expertise in managing data infrastructure, ensuring best practices for CI/CD, and troubleshooting technical issues related to databases, web services, Kafka, Spark, Fabric, and Databricks. The candidate will play a critical role in streamlining release management, ensuring seamless migration of product releases from development to upper environments, and maintaining robust exception management processes.</p>","normalized_title_object_new":{"in_use":true,"skills":["devops"],"industry":["ALL INDUSTRIES"],"root_role":"devops engineer","is_root_role":false,"matched_with":"senior devops engineer","department_team":"software development","normalized_title":"senior devops engineer","reason_for_match":"BTM","frequently_found_skills":[],"inormalized_titles_master_new":"191","match_inside_length_threshold":true,"normalized_title_display_name":"Senior Devops Engineer"},"dNIEEOCTextFocusOtherControl":"<p><span style=\"color: rgb(34, 34, 34);\">As an Equal Opportunity Employer, AssetMark is committed to building a diverse and inclusive workplace where everyone feels valued.</span></p>","expertise_coreskill_or_product":["devops"],"displayJobDescriptionSimpleForm":true,"expertise_coreskill_or_product_id":["i10272"],"job_id":"2206"}