SRE/DevOps Engineer
Location: REMOTE
Description: *W2 Applicants Only - Cannot Support C2C* Our client is currently searching for an SRE/DevOps Engineer to be involved in the design of big data solutions that leverage open-source and cloud-based technologies and to work with multiple teams across the organization (e.g., cloud analytics, data architects, business groups). This person will participate in building large-scale data processing systems, should be an expert in data warehousing solutions, and should be able to work with the latest NoSQL database technologies on AWS, Google Cloud, or similar platforms.
An SRE/DevOps Engineer should embrace the challenge of dealing with petabytes or even exabytes of data daily. This individual understands how to apply technologies to solve big data problems and how to develop innovative big data solutions. The SRE/DevOps Engineer generally works on implementing complex big data projects, focusing on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple platforms. This person should be able to develop prototypes and proofs of concept for the selected solutions.
Responsibilities:
• You will work with other SREs and Software Engineers to "automate all things" for our hybrid cloud platform.
• Build tooling to improve CI/CD patterns for multiple development teams, consistent with security and DevOps best practices
• Coordinate with and assist teams in building infrastructure competencies using object-oriented programming and configuration-management domain-specific languages
• Create Terraform artifacts to automate the deployment and management of cloud infrastructure
• Create Kubernetes-related artifacts using Kustomize
• Continuously improve the observability of the tech stack by integrating logging, testing, metrics, tracing, dashboards, and alerts with CD pipelines (see the metrics sketch after this list)
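To make the observability bullet concrete, here is a minimal sketch of instrumenting a service with the prometheus_client Python library so that metrics can feed Grafana dashboards and alerts. The metric names, labels, port, and the simulated workload are illustrative assumptions, not part of the client's actual stack.

```python
# Minimal observability sketch using the prometheus_client library.
# Metric names, labels, and the port are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Counter: total requests processed, labeled by outcome.
REQUESTS = Counter("app_requests_total", "Total requests processed", ["status"])
# Histogram: request latency, scrapeable by Prometheus and chartable in Grafana.
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

def handle_request() -> None:
    """Simulate a unit of work and record metrics around it."""
    with LATENCY.time():
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    REQUESTS.labels(status="ok").inc()

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for a Prometheus scrape
    while True:
        handle_request()
```

In a CD pipeline, an endpoint like this is what a Prometheus scrape job targets, and the recorded series are what dashboards and alert rules are built on.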
Required Qualifications:
• Experience working with large geospatial data sets using GDAL and similar spatial libraries (see the raster sketch after this list)
• Ability to quickly learn technologies such as Terraform, Packer, FastAPI, Kubernetes, Nginx, Elasticsearch, PostgreSQL, AWS, GCP, Grafana, Kibana and Google Data Studio
• Basic understanding of OAuth 2.0 and OpenID Connect (see the token-validation sketch after this list)
• Software engineering experience with programming languages like Python, Golang, or Java
• Experience with Python 3, Java, PostgreSQL, Erlang, Redshift, JavaScript, BigQuery, and Cloud Spanner
• Experience building CI/CD pipelines
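As a sketch of the geospatial bullet above, the following opens a raster through GDAL's Python bindings, inspects its georeferencing, and reads a band into a NumPy array. The file path is a placeholder assumption.

```python
# Minimal GDAL sketch: open a raster, inspect its georeferencing, and
# read a band into a NumPy array. The file path is a placeholder.
from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of returning None

ds = gdal.Open("example.tif")  # hypothetical GeoTIFF
# The geotransform maps pixel coordinates to georeferenced coordinates:
# (origin_x, pixel_width, row_rotation, origin_y, col_rotation, pixel_height)
origin_x, pixel_w, _, origin_y, _, pixel_h = ds.GetGeoTransform()
print(f"Origin: ({origin_x}, {origin_y}), pixel size: {pixel_w} x {pixel_h}")
print("Projection:", ds.GetProjection())

band = ds.GetRasterBand(1)   # bands are 1-indexed in GDAL
data = band.ReadAsArray()    # NumPy array of pixel values
print("Shape:", data.shape, "min:", data.min(), "max:", data.max())
```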
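For the OAuth 2.0 / OpenID Connect bullet, here is a hedged sketch of validating an OIDC ID token against a provider's published signing keys using the PyJWT library. The issuer URL, JWKS location, audience, and token are assumptions; real providers publish the key-set URL in their discovery document.

```python
# Sketch: validate an OIDC ID token offline using the issuer's JWKS.
# Issuer URL, JWKS location, and audience are illustrative assumptions.
import jwt  # PyJWT, installed with its optional 'cryptography' dependency
from jwt import PyJWKClient

ISSUER = "https://issuer.example.com"          # hypothetical OIDC provider
JWKS_URI = f"{ISSUER}/.well-known/jwks.json"   # key-set location varies by provider
AUDIENCE = "my-client-id"                      # the OAuth 2.0 client ID

def verify_id_token(token: str) -> dict:
    """Verify signature, issuer, audience, and expiry; return the claims."""
    # Fetch the signing key whose 'kid' matches the token header.
    signing_key = PyJWKClient(JWKS_URI).get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )
```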
Preferred Qualifications:
• Experience creating Kubernetes operators using Kubebuilder or the Operator SDK (a Python analogue is sketched after this list)
• Experience with Argo CD / Flux / Argo Workflows
• Experience implementing a GitOps workflow
• Experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open-source) software platforms and large-scale data infrastructures
• Experience creating cloud computing solutions and web applications that leverage public and private APIs
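The operator bullet above names Kubebuilder and the Operator SDK, which are Go toolkits. As a compact Python analogue of the same operator pattern, here is a sketch using the kopf framework; this is a deliberately swapped-in technique, and the resource group, version, plural, and fields are all hypothetical.

```python
# Sketch of the Kubernetes operator pattern using kopf (Python).
# Kubebuilder / Operator SDK (named above) are Go toolkits; this is a
# deliberately swapped-in analogue. Group/version/plural are made up.
import kopf

@kopf.on.create("example.com", "v1", "widgets")
def on_widget_create(spec, name, namespace, logger, **kwargs):
    """Reconcile a newly created Widget custom resource."""
    size = spec.get("size", 1)  # hypothetical field on the custom resource
    logger.info(f"Widget {namespace}/{name} created with size={size}")
    # A real handler would create or patch child resources here; the
    # returned dict is stored under status.on_widget_create by kopf.
    return {"observedSize": size}

# Run with: kopf run operator.py --verbose
```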
Contact: [email protected]
This job and many more are available through The Judge Group. Find us on the web at www.judge.com