Staff DevOps Engineer - Big Data
Company Description
ServiceNow is making the world of work, work better for people. Our cloud‑based platform and solutions deliver digital workflows that create great experiences and unlock productivity for employees and the enterprise. We're growing fast, innovating faster, and making an impact on our customers' and employees' lives in significant and important ways. With over 6,900 customers, we serve approximately 80% of the Fortune 500, and we're on the 2020 list of FORTUNE World's Most Admired Companies®.
We’re looking for people who are ready to jump right in and help us build on our incredible momentum, our diverse, engaged workforce, and our purpose to make the world of work, work better.
Learn more on the Life at Now blog and hear from our employees about their experiences working at ServiceNow.
Job Description
What you get to do in this role:
- Building, deploying, maintaining, and managing Big Data applications based on established best-practice methods, and ensuring the availability, performance, scalability, and security of the Big Data systems. This is achieved using Software Configuration Management (SCM) and build tools such as Git, GitLab, Nexus, Maven, Grunt and Node.js.
- Establishing Continuous Integration (CI) and deployment pipelines for applications using tools such as Jenkins, Docker and Ansible, following proven business solutions created by ServiceNow's DevOps practice.
- Providing production support to resolve critical build and release issues and avoid or minimize any impact on Big Data applications. Supporting development, testing and system engineering teams with specialized knowledge in areas such as DevOps and Big Data analytics.
- Enhancing build and release tools to meet new requirements from different stakeholders, ensuring that the technical specifications meet business needs. This involves automating the build, release and configuration management systems of non-production and production environments using RPM and scripting languages such as Perl, Python, Bash and Groovy.
- Supporting the production Cloudera-based Big Data analytics platform, including resolving incident tickets created by Site Reliability Engineers (SREs).
- Monitoring and reporting on Big Data systems using Grafana and Prometheus.
- Querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark.
- Troubleshooting issues in the Hadoop ecosystem and its component frameworks, such as HDFS, Kerberos, Kudu, Impala, Tableau, Redis and HBase.
- Attending Scrum meetings to report status on the designed solution. Providing direction and assistance to the team, and proactively sharing knowledge and information to build a skilled, high-performing workforce.
- Conducting technical interviews of internal and external candidates for various positions within the Big Data team. Training new hires on the tools and technologies the team uses so they can start contributing quickly.
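To give candidates a flavor of the Hive/Spark querying work described above, here is a minimal Python sketch that assembles a partition-pruned HiveQL statement of the kind that would be submitted through a Spark session. The table and column names (`event_logs`, `ds`, `status`) are hypothetical examples, not part of any actual ServiceNow schema.

```python
# Sketch: building a partition-pruned HiveQL query for execution via Spark.
# The table/column names (event_logs, ds, status) are invented for illustration.

def build_hive_query(table: str, day: str, status: str) -> str:
    """Return a HiveQL statement filtered on a date partition column."""
    return (
        f"SELECT status, COUNT(*) AS n "
        f"FROM {table} "
        f"WHERE ds = '{day}' AND status = '{status}' "
        f"GROUP BY status"
    )

query = build_hive_query("event_logs", "2020-06-01", "ERROR")
print(query)

# In a real Spark-on-Hadoop deployment, this string would be run against
# Hive tables stored on HDFS with something like spark.sql(query).show().
```

Filtering on the partition column (`ds` here) is what lets Hive and Spark prune partitions and avoid scanning the full dataset on HDFS.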
Qualifications
To be successful in this role you have:
- Expert-level experience in Software Configuration Management (SCM) and build tools such as Git, GitLab, Nexus, Maven, Grunt and Node.js
- Expert-level experience with Jenkins, Docker, Ansible, Terraform, Puppet and similar technologies
- Expert-level experience in DevOps and Big Data analytics, and with monitoring tools like Grafana and Prometheus
- Demonstrated experience automating the build, release and configuration management systems of non-production and production environments using RPM and scripting languages such as Perl, Python, Bash and Groovy
- Deep knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark, and of systems such as HDFS, Kerberos, Kudu, Impala, Tableau, Redis and HBase
- In-depth knowledge of CentOS 7.x and shell scripting
- Ability to learn quickly in a fast-paced, dynamic team environment
- Highly effective communication and collaboration skills
- MS degree in Computer Science, or equivalent experience, required
- 7+ years of overall experience, with at least 2 years in Big Data-related positions
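The monitoring experience asked for above centers on Prometheus-style metrics. As a toy, pure-Python illustration of the idea behind Prometheus's `rate()` function, the sketch below computes the per-second increase of a monotonically increasing counter over a scrape window; the sample values are invented, and real `rate()` also handles counter resets, which this sketch omits.

```python
# Toy illustration of a Prometheus-style counter rate: given (timestamp,
# counter_value) samples, compute the per-second increase over the window.
# Sample data below is invented for illustration; counter resets are ignored.

def counter_rate(samples):
    """Per-second rate of increase for a monotonically increasing counter."""
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    if t1 <= t0:
        raise ValueError("window must span a positive duration")
    return (v1 - v0) / (t1 - t0)

# e.g. an HTTP request counter scraped every 15 seconds:
samples = [(0, 100), (15, 130), (30, 160), (45, 190)]
print(counter_rate(samples))  # 2.0 requests/second
```

In practice this calculation runs inside Prometheus (e.g. `rate(http_requests_total[5m])`) and the result is graphed in Grafana; the snippet only shows the arithmetic involved.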
Additional Information
ServiceNow is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status or any other category protected by law.
If you are an individual with a disability and require a reasonable accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us at +1 (408) 501-8550, or [email protected] for assistance.
For positions requiring access to technical data subject to export control regulations, including Export Administration Regulations (EAR), ServiceNow may have to obtain export licensing approval from the U.S. Government for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by the U.S. Government.