Key Responsibilities
- Design, develop, and maintain data pipelines and workflows that ingest, transform, and deliver clean, reliable business data for analysis and reporting.
- Collaborate with business teams to gather requirements, perform data validation, and support user acceptance testing (UAT) and demos.
- Extract, integrate, and transform data from diverse systems, including Workday, Salesforce, and on-premises and SaaS applications, using APIs, JDBC/ODBC, and native/direct connections.
- Write and optimize advanced SQL for data modeling, transformation, and cost-efficient query execution.
- Build and optimize Power BI datasets, models, and dashboards for business insights and performance tracking.
- Use Databricks Notebooks with Python and/or Scala for data preparation, automation, and analysis.
- Monitor and optimize compute resources and job performance for cost control and efficiency.
- Document data pipelines, transformation logic, and architecture for transparency and maintainability.
Education and Experience
- 2–5 years in a Data Engineering or Business Data Analysis role.
- Strong hands-on experience with Databricks (including Delta Lake, Spark SQL, and Notebooks).
- Strong working knowledge of Power BI (data modeling, DAX, dashboard design, publishing).
- Advanced SQL skills for large-scale data transformation and optimization.
- Proficiency in Python and/or Scala for data processing in Databricks.
- Proven experience with Fivetran or similar ETL/ELT tools for automated data ingestion.
- Experience integrating data from business applications such as Workday and Salesforce (via APIs, reports, or connectors).
- Ability to manage and transform data from on-premises and cloud systems.
- Strong communication skills with experience in business requirement gathering and data storytelling.
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, Statistics, or a related field.
- Relevant certifications (e.g., Databricks Certified Data Engineer, Microsoft Power BI Data Analyst, Workday Reporting Specialist) are a plus.
Preferred / Nice-to-Have
- Fundamental knowledge of Apache Spark (architecture, RDDs, DataFrames, optimization).
- Experience in query and compute cost optimization within Databricks or similar platforms.
- Familiarity with data governance, security, and metadata management.
- Exposure to CI/CD for data pipelines using Git and DevOps tools.
- Experience with GenAI agents and/or machine learning (ML).
Decision Making and Supervision
- Work under minimal supervision.
- Make decisions and recommendations requiring analysis and interpretation within established procedures.
Working Conditions
- Generally comfortable working conditions, with some lifting and onsite installation work.
- Moderate visual concentration required in the use of a video display terminal.
What We Do
Backed by a legacy of engineering excellence, reliability, and industry-leading customer service, Telesat is one of the largest and most successful global satellite operators. Telesat works collaboratively with its customers to deliver critical connectivity solutions that tackle the world’s most complex communications challenges, providing powerful advantages that improve their operations and drive profitable growth.
Continuously innovating to meet the connectivity demands of the future, Telesat is developing Telesat Lightspeed, the company’s Low Earth Orbit (LEO) satellite network and the first and only LEO network optimized to meet the rigorous requirements of telecom, government, maritime, and aeronautical customers. Telesat Lightspeed will redefine global satellite connectivity with ubiquitous, affordable, high-capacity links at fiber-like speeds.