Position Overview:
As a Senior ETL Operations and Data Analytics Engineer, you will play a crucial role in our data-driven decision-making process. You will be responsible for designing, implementing, and maintaining ETL processes, ensuring data accuracy, and providing valuable insights to drive business growth.
Key Responsibilities:
- Design, develop, and maintain ETL processes to extract, transform, and load data from various sources.
- Monitor and optimize ETL workflows to ensure data quality and performance.
- Collaborate with cross-functional teams to gather and understand data requirements.
- Optimize and tune ETL processes for performance and scalability.
- Create and maintain documentation for ETL processes and data analytics solutions.
- Create and maintain data models to support reporting and analysis needs.
- Apply expert knowledge of Go, Python, SQL, Git, JSON, YAML, CSV, and MS Excel.
- Apply working knowledge of Ruby, Bash, Argo CD/Workflows, Kubernetes (K8s), containers, GitHub Actions, Linux, and AWS to enhance data operations.
- Collaborate with DevOps teams to deploy ETL solutions efficiently in a Kubernetes environment using CI/CD pipelines.
- Support and troubleshoot ETL processes and resolve any issues in a timely manner.
- Perform data analysis, develop dashboards, and present actionable insights to stakeholders.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 10 years of experience in ETL operations, systems operations, and data analytics.
- Expert knowledge of SQL, Git, various data formats (JSON, YAML, CSV), and MS Excel.
- Expert Python and Bash skills, including object-oriented (OO) techniques.
- Proficiency in Ruby, Go, and other languages is a plus.
- Familiarity with Argo CD/Workflows, Kubernetes (K8s), containers, GitHub Actions, Linux, and AWS is highly desirable.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Strong proficiency in SQL and experience with MySQL or similar relational databases.
- Must be able to interact with databases using raw SQL.
- Solid understanding of data modeling concepts and techniques.
- Experience with Jaspersoft or similar reporting tools is preferred.
Desired:
- Familiarity with ELK (Elasticsearch, Logstash, Kibana) or OpenSearch for advanced log and data analysis
- Familiarity with Jasper Reports and BIRT
- Familiarity with Apache Kafka for real-time data streaming and event-driven architectures
- Experience with relational databases such as PostgreSQL and MySQL for handling structured data
- Knowledge of Druid, an open-source analytics data store, and its integration into data pipelines
- Proficiency in Apache Superset for creating interactive and insightful data visualizations
Benefits:
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Training & Development
- Work From Home
The Company
What We Do
At Sophinea, our focus is to “Bring Clarity to Data” for our customers. We deliver this clarity by applying our data analytics-focused technical and business consulting expertise to design mission-centric solutions that enable our clients to meet their analytics objectives.