Back-End Senior Staff Software Engineer, Channels & Attribution
About Udemy
We believe anyone can build the life they imagine through online learning. Today, millions of students around the world are advancing their careers and passions by exploring and mastering new skills on Udemy, and expert instructors are able to share their knowledge with the world. Through our global marketplace and our solutions for businesses and governments, we connect people everywhere with the skills they need for success in work and life. We're a close-knit bunch that enjoys problem-solving and collaboration, and we share a serious belief in the power of learning and teaching to change lives. Udemy's culture encourages innovation, creativity, passion, and teamwork. We also celebrate our milestones and support each other every day.
Founded in 2010, Udemy is publicly traded and headquartered in San Francisco's SOMA neighborhood with offices in Denver (Colorado), Dublin (Ireland), Ankara (Turkey), Istanbul (Turkey), Gurugram (India), and São Paulo (Brazil).
About the Role
As a Senior Staff Software Engineer on the Channels & Attribution team, you will work with your teammates to build robust, scalable workflows, data and web platforms, and tools that attribute, and provide insight into, complex cross-channel traffic. These platforms and tools support Udemy's growth and business agility by enabling new product and marketing strategies. We are seeking an analytical, dynamic engineer who will collaborate closely with partners across multiple disciplines.
Here is what you will be doing:
- Be the subject matter expert in business-critical attribution pipelines and related web and data applications.
- Support Udemy's growth by building and scaling systems to enable our growing business development and marketing efforts.
- Collaborate closely with engineering teams, product owners, and senior leadership to develop attribution platforms and data workflows.
- Work closely with cross-functional partners to understand, refine and translate requirements to build scalable solutions that drive impact.
- Design and build data pipelines to ingest, clean, query, and share data.
- Monitor workflows, diagnose and resolve issues, and fix business-critical pipelines.
We are excited about you because you have:
- B.S. degree or higher in Computer Science, or a related technical field.
- 4+ years of software engineering experience.
- Experience developing web applications and applying application-development best practices.
- Technical accomplishments working with large datasets using distributed computing frameworks (e.g., Spark) and/or ETL workflows spanning multiple storage technologies (Hive, NoSQL databases, relational databases).
- Hands-on experience designing data systems and complex, scalable ETL pipelines.
- Experience developing integrations with third-party tools and platforms.
- Strong communication skills that enable effective collaboration with asynchronous and remote co-workers and teams in fast-paced, performance-driven environments.
- Technologies: Spark (Scala, Python) for data applications, Python (Django web applications), Kotlin (service applications), Kafka, gRPC.