Responsibilities:
- Lead the design and implementation of scalable, secure, and high-performing data architectures across diverse industries.
- Develop and execute comprehensive data strategies to support analytics, business intelligence, and operational reporting.
- Collaborate with business stakeholders, analysts, and technical teams to understand data requirements and translate them into architecture solutions.
- Design conceptual, logical, and physical data models across various data domains.
- Build and optimize modern data pipelines and ETL/ELT processes using tools like Apache Airflow, dbt, Fivetran, or Stitch (an illustrative orchestration sketch follows this list).
- Define and enforce data standards, governance policies, and data management practices to ensure consistency, quality, and compliance.
- Evaluate and implement technologies such as data warehouses (Snowflake, BigQuery, Redshift), data lakes, and cloud storage solutions (AWS S3, Azure Storage).
- Implement and maintain BI platforms, including Power BI, Tableau, or Looker, to enable self-service analytics and reporting.
- Define and enforce data security and access controls to protect sensitive information and ensure adherence to data privacy regulations.
- Collaborate with infrastructure and operations teams to ensure the scalability, reliability, and performance of data platforms.
- Provide expert guidance on database design, performance tuning, and architectural best practices.
- Stay current with emerging data technologies, including DataOps, data mesh, ML/AI integration, and modern cloud-native data solutions.
- Document data architecture artifacts such as data flow diagrams, system integrations, and metadata repositories.
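For illustration only (not a requirement of the role): a minimal sketch of the kind of ELT orchestration described above, assuming Airflow 2.4+ and a hypothetical dbt project located at /opt/dbt/project.

```python
# Minimal, illustrative Airflow DAG: stage raw data, then run dbt transformations.
# Assumes Airflow 2.4+; the dbt project path and task commands are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    # Extract/load step: placeholder for an ingestion job (e.g. a Fivetran or Stitch sync).
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="echo 'trigger raw data ingestion here'",
    )

    # Transform step: run dbt models against the warehouse.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",
    )

    load_raw >> dbt_run  # ingestion completes before transformations start
```

In practice the ingestion task would invoke a managed connector or custom loader, and dbt would target the chosen warehouse (Snowflake, BigQuery, or Redshift).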
Required skills and experience:
- 8+ years of overall data development experience, including at least 5 years in a Head of Data / Data Architect or similar technical leadership role.
- Proven experience designing and implementing cloud-based data solutions using AWS, Azure, or GCP.
- Expertise in SQL and at least one programming language such as Python (preferably), Scala, or Java.
- Hands-on experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes.
- Solid background in ETL/ELT pipeline design, orchestration (e.g., Airflow, dbt), and data integration.
- Hands-on experience with BI tools (Power BI, Tableau, Looker).
- Proficiency in data modeling tools and metadata management systems.
- Deep understanding of data governance, data management, and data quality frameworks.
- Experience managing and mentoring data teams (Data Engineers, Analysts, BI Specialists).
- Excellent analytical and communication skills with the ability to engage both technical and business stakeholders.
- English – Fluent/Advanced
- Ukrainian – a big plus
Will be a competitive advantage:
- Experience in Machine Learning or Data Science.
- Familiarity with ELT tools (Fivetran, Stitch, and dbt).
- Knowledge of DataOps practices and data mesh architecture.
What We Do
Viseven is a Global MarTech Services Provider for Pharma and Life Sciences with more than 10 years of expertise in the industry.
The company's solutions, products, and services are actively used by the top 100 Pharma and Life Sciences companies in more than 50 countries around the globe.
With over 150 Veeva-certified specialists and status as a Veeva Silver Certified Technology Partner and Veeva DFAP Content Partner, Viseven offers custom solutions for pharmaceutical communications.
We enable digital marketing transformation for enterprises of different sizes and digital maturity levels.
Our expertise covers the entire range of relevant aspects, from modular content to customer journey management, aligning tech with strategic goals.