Principal Software Development Engineer in Test, Data Engineering (Remote) at Netskope
Today, more data and users live outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one built in the cloud that follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, Bangalore, London, Melbourne, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events (pre- and, hopefully, post-Covid) and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers and follow us on LinkedIn.
Principal Software Development Engineer in Test, Data Engineering
Please note, this team is hiring across Staff Engineer, Sr. Staff Engineer, and Principal Engineer levels; all candidates are individually assessed and appropriately leveled based upon their skills and experience.

What’s in it for you
- You will be part of a growing team of industry experts in the exciting space of Cloud Analytics.
- Your contributions will create a high impact in the industry and our customers through our products.
- You can shape Data Engineering at Netskope!
Required skills and experience
- 12+ years of experience
- Experience working with SQL and NoSQL datastores such as Elasticsearch, MongoDB, Druid, Postgres, Teradata, and BigQuery
- Understanding of database internals
- Solid test automation experience in Python, Go, or another language
- Experience testing data ingestion pipelines and data querying services
- Experience testing REST services
- Experience with test plan and test case documentation
- Understanding of how data organization and query optimization affect query performance
- Good understanding of data structures and algorithms and excellent programming skills
- Experience with Docker and Kubernetes
- Expert-level understanding of big data infrastructure
- Strong verbal and written communication skills
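To give a flavor of the test automation the requirements above describe, here is a minimal, hypothetical Python sketch that validates a tiny ingestion-pipeline step with plain assertions. The `ingest` and `normalize` functions and the record schema are invented for illustration and are not Netskope code.

```python
# Hypothetical sketch: automated validation of a small ingestion step.
# The pipeline functions and record fields below are illustrative only.

def normalize(record):
    """Lowercase keys and strip surrounding whitespace from string values."""
    return {k.lower(): v.strip() if isinstance(v, str) else v
            for k, v in record.items()}

def ingest(records):
    """Normalize records, dropping any that lack an 'id' field."""
    return [normalize(r) for r in records
            if "id" in {k.lower() for k in r}]

# Test data: one well-formed record, one missing its id, one with odd casing.
raw = [
    {"Id": " 1 ", "Source": " firewall "},
    {"Source": "proxy"},               # missing id: should be dropped
    {"ID": "2", "Source": "endpoint"},
]

clean = ingest(raw)

# Assertions of the kind a pipeline test suite would automate.
assert len(clean) == 2, "malformed record should be dropped"
assert all("id" in r for r in clean), "keys should be normalized"
assert clean[0]["source"] == "firewall", "values should be trimmed"
print("ingestion checks passed")
```

In practice such checks would run under a framework like pytest against real ingestion and query services rather than in-process functions, but the shape of the validation is the same.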
You will write tests for our big data workflows. You will help validate the ingestion pipelines and query platforms, and work closely with the Data Engineers to optimize and test data infrastructure and platforms for massive scale.

Education
BS or MS in Computer Science or equivalent technical degree