About the Principal Data Architect role:
Scorpion is looking for an experienced Data Architect to improve the accessibility, quality, and structure of our data systems. This is a highly hands-on role that requires working with minimal to no supervision. We want you to hit the ground running, using your expertise to carve out plans and rethink the way we approach data challenges. To do this, you’ll need a rare combination of deep technical expertise and collaborative ability, and you’ll need to be adept at earning the trust of our Data Science and Engineering executives in the projects you lead.
More specifically, we’re looking for a candidate who:
- Takes ownership of projects with minimal assistance.
- Can get up to speed and contribute significantly within the first 90 days.
- Can thoroughly explain and back up plans to earn buy-in.
- Determines database structural requirements by analyzing client or internal operations, applications, and programming.
- Accurately handles the migration of data from legacy systems to new systems.
- Designs new methods to source data effectively and efficiently.
- Visualizes and conceptualizes solutions for storing and retrieving information.
- Consults on optimal database environments, analyzes complex distributed production deployments, and makes recommendations to optimize performance.
- Optimizes the database architecture through regular monitoring and troubleshooting, and assesses and implements ways to improve the system’s performance.
- Improves data governance across the engineering organization, including training in best practices, defining conventions, and improving documentation.
Qualifications:
- 5+ years’ experience as a data architect handling web-scale data (billions of records per week) and its performance demands.
- Prior experience developing resilient data architectures balancing disparate needs of data science, BI, and application engineering.
- Bachelor’s degree in Computer Science or a related field.
- Deep experience in data modeling and ETL/ELT workflow management, and the ability to write and tune complex SQL queries.
- Experience with Python, Spark, SQL, Hadoop, and similar technologies.
- Experience migrating critical database environments from one platform to another in a multi-stakeholder environment.
- Experience with modern cloud datastores such as Azure Synapse, Amazon Redshift, Snowflake, Google BigQuery, and/or Databricks.
- Experience coordinating with C-level leadership to craft database architecture and IT strategies.
- Experience writing procedural language artifacts (PL/SQL, T-SQL, PL/pgSQL) and stored procedures.
- Experience with Azure products and technologies.