Staff Big Data DevOps & Deployment Engineer at ServiceNow
ServiceNow is making the world of work, work better for people. Our cloud-based platform and solutions deliver digital workflows that create great experiences and unlock productivity for employees and the enterprise. We're growing fast, innovating faster, and making an impact on our customers' and employees' lives in significant and important ways. With over 6,900 customers, we serve approximately 80% of the Fortune 500, and we're on the 2020 list of FORTUNE World's Most Admired Companies®.
We're looking for people who are ready to jump right in and help us build on our incredible momentum, our diverse, engaged workforce, and our purpose to make the world of work, work better.
Learn more on Life at Now blog and hear from our employees about their experiences working at ServiceNow.
Please Note: This position will include supporting our US Federal customers.
This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered.
The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of ServiceNow Platform-powered customer instances deployed across the ServiceNow cloud and Azure cloud. Our mission is to:
Deliver state-of-the-art monitoring, analytics, and actionable business insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and machine learning methodologies that improve efficiencies across a variety of functions in the company (Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities), enabling significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:
- Collecting, storing, and providing real-time access to large amounts of data
- Providing real-time analytics tools and reporting capabilities for various functions, including:
- Monitoring, alerting, and troubleshooting
- Machine learning, anomaly detection, and prediction of P1 incidents
- Capacity planning
- Data analytics and deriving actionable business insights
What you get to do in this role:
- Deploy, monitor, maintain, and support Big Data infrastructure and applications in ServiceNow Cloud and Azure environments.
- Architect and drive end-to-end Big Data deployment automation from vision to delivery, automating the Big Data foundational modules (Cloudera CDP), prerequisite components, and applications across all ServiceNow environments, leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes.
- Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
- Tune and troubleshoot Hadoop components and other data analytics tools in the environment: HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus.
- Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineering (SRE), Customer Support (CS), Development, QA, and Systems Engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
- Enforce data governance policies in commercial and regulated Big Data environments.
To be successful in this role, you have:
- 12+ years of overall experience, with at least 7 years as a Big Data DevOps/Deployment Engineer
- Demonstrated expert-level experience delivering end-to-end deployment automation leveraging Puppet, Ansible, Terraform, Jenkins, Docker, Kubernetes, or similar technologies.
- Deep understanding of the Hadoop/Big Data ecosystem, with good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming, and hands-on experience with systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus.
- Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC
- Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
- Good knowledge of Perl, Python, Bash, Groovy, and Java
- In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
- Ability to learn quickly in a fast-paced, dynamic team environment
ServiceNow is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status or any other category protected by law.
All new employees hired in the United States are required to be fully vaccinated against COVID-19, subject to such exceptions as required by law. If hired, you will be required to submit proof of full vaccination or have an approved accommodation, by your start date. Visit our Candidate FAQ page to learn more.
If you require a reasonable accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us at [email protected] for assistance.
For positions requiring access to technical data subject to export control regulations, including Export Administration Regulations (EAR), ServiceNow may have to obtain export licensing approval from the U.S. Government for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by the U.S. Government.
Please Note: Fraudulent job postings/job scams are increasingly common. Click here to learn what to watch out for and how to protect yourself. All genuine ServiceNow job postings can be found through the ServiceNow Careers site.
Work personas are categories that are assigned to employees depending on the nature of their work. Employees will fall into one of three categories: Remote, Flexible or Required in Office.
Required in Office
A required in office work persona is defined as an employee who is contracted to work from or aligned to a ServiceNow-affiliated office. This persona is required to work from their assigned workplace location 100% of the work week based on the business needs of their role.
A flexible work persona is defined as an employee who is contracted to work from or aligned to a ServiceNow-affiliated office and will work from their assigned workplace location roughly 3 days/week or less (generally around 40-60% of the work week). Flexible employees may choose to work the remaining working time from their workplace location or home. Flexible employees are required to work within their state, province, region, or country of employment.
A remote work persona is defined as an employee who performs their responsibilities exclusively outside of a ServiceNow workplace and is not contracted or aligned to a ServiceNow-affiliated office, including those whose place of work (pursuant to their terms and conditions of employment) is their home. Remote employees are required to work within their state, province, region, or country of employment.