Remote Associate Director, Data Engineering (Knowledge Graph) at S&P Global (Remote)

Sorry, this job was removed at 4:43 p.m. (CST) on Monday, September 5, 2022
Position Summary

We are looking for an adept, action-oriented Principal/Associate Director of Data Engineering to design and build out a multi-tenant data mesh for our soon-to-be-launched digital transformation product, which uses advanced NLP, knowledge engineering, and ML to accelerate innovation in engineering, manufacturing, and scientific operations. The ideal candidate will have strong data infrastructure and data architecture skills, a proven track record of collaborating on and iteratively implementing data-intensive solutions, the operational skills to drive efficiency and speed, proven project leadership, and a clear vision for how data engineering can proactively create positive impact for companies. You will be part of an early-stage team: you will educate stakeholders, mentor team members, and have a significant stake in defining the future of the Data Engineering function for the product.

Job Responsibilities
  • Design, build, and maintain a multi-tenant Data Mesh within the AWS cloud, comprising Data Lakes, Warehouses, Streaming, Graphs, and analytical NoSQL stores
  • Drive adoption and standardization of data governance, lineage, cataloging, and stewardship practices across teams
  • Work closely with data scientists, micro-service developers, and security experts to build out a big data platform incrementally and securely
  • Work closely with the product management and development teams to rapidly translate the understanding of customer data and requirements to product and solutions
  • Maintain an excellent understanding of the business's long-term goals and strategy, and ensure that the design and architecture are aligned with them
  • Define and manage SLAs for data sets and processes running in production
  • Design for disaster recovery balancing availability and consistency in multi-region scenarios
  • Research and experiment with emerging technologies and tools related to big data
  • Establish and reinforce disciplined software engineering processes and best-practices

Ideal Qualifications
  • Comfort and ideally substantial experience operating big data infrastructure in a cloud-based ecosystem (AWS preferred)
  • Deep understanding of the theoretical and practical tradeoffs of various data formats in object/file stores (Parquet, Avro, JSON, etc.) in combination with a variety of ETL tools (Spark, Presto, etc.)
  • Deep understanding of the theoretical and practical tradeoffs of various NoSQL stores (Cassandra, Elasticsearch, DynamoDB, etc.) with respect to different read/write patterns and availability/consistency requirements
  • Mastery of operating and designing stream-based data systems (Kafka, AWS Kinesis, GCP Pub/Sub, etc.), particularly under varying load
  • Proficiency in modern big data architectural approaches (Kappa/Lambda architectures, Data Lake Zones, etc.)
  • Experience with data governance, lineage, cataloging tooling (Apache Atlas, Apache Ranger, AWS Glue Catalog, etc.)
  • Experience with data pipeline and workflow management tools (AWS Data Pipeline, Apache Airflow, Argo, etc.)
  • Experience with stream-processing systems (ksqlDB, Spark Streaming, Apache Beam/Flink, etc.)
  • Experience with standard software engineering methodologies (unit testing, code reviews, design documents, continuous delivery)
  • Experience developing and deploying production-grade services, SDKs, and data infrastructure with an emphasis on performance, scalability, and self-service
  • Ability to conceptualize and articulate ideas clearly and concisely
  • Entrepreneurial or intrapreneurial experience where you helped lead the creation of a new product & organization

Nice to Haves
  • Strong algorithms, data structures, and coding background, with Java, Python, or Scala programming experience
  • Experience working with knowledge graph stores (Stardog, TigerGraph, Ontotext GraphDB, Neo4j) and surrounding semantic technology (OWL, RDF, SWRL, SPARQL, JSON-LD)
  • Experience working with Snowflake data warehouses and dimensional modeling practices
  • BA/BS or Master's in Computer Science, Math, Physics, or another technical field
  • Experience with 10+ terabyte datasets, ideally up to multiple petabytes

What We Offer
  • Competitive base salary, bonus plans and equity.
  • A comprehensive benefits package that includes medical, dental, vision and life insurance plans, paid time off, a generous 401k match with no vesting period, parental leave, and 3 volunteering days each year. For more information on benefits, please see the benefits page on our careers site.
  • For work locations in the state of Colorado, the anticipated base salary range for this role is $160,000 - $279,000. Compensation will be determined by the education, experience, knowledge, and abilities of the applicant.

We're building a software solution that connects data in revolutionary ways, illuminating answers that were previously impossible to find and empowering our clients to envision the future so they can determine the best course of action in the present. Join us!

Equal Opportunity Employer:

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to: [email protected] and your request will be forwarded to the appropriate person.

US Candidates Only:

The EEO is the Law Poster describes discrimination protections under federal law.