Open Source LLM Engineering Platform that helps teams build useful AI applications via tracing, evaluation, and prompt management (mission, product)
We have the chance to build the "Datadog" of this category; model capabilities continue to improve, but building useful applications is still really hard, both in startups and in enterprises
Largest open source solution in this category: >2k customers, >14M monthly SDK downloads, >6M docker pulls
We raised money from world-class investors (Y Combinator, Lightspeed, General Catalyst) and are default alive: we make more money than we spend
We're a small, engineering-heavy, and experienced team in Berlin and San Francisco (how we work)
In Short: Langfuse is looking for a hands-on Technical Support Engineer to join our Berlin office, own the front line of our customer and community experience, keep a tight feedback loop with product engineering, and help thousands of AI-product teams succeed with tracing & evals.
What you will be working on
At Langfuse, support is not a cost center but a product discovery engine (see handbook): engineers own their features end-to-end, fixes land fast, and users ideally see "magic" when their feedback ships the same day. You will be running that engine while engaging with our large OSS community.
Community and low-touch Customer Support (70%)
Triage & solve inbound questions from diverse channels (Slack Connect, email, GitHub, Discord)
Improve docs, snippets, and auto-reply prompts so the next 1,000 teams self-serve the answer → ideally users do not have questions because our product and docs are self-explanatory
Create helpful content, e.g. blog posts, sample repos, and tutorials
Escalate bugs to product engineering and close the loop with the customer once shipped
Spin up proof-of-concepts (TypeScript/Python) to reproduce edge cases or demonstrate work-arounds (see the sketch after this list)
Named technical contact for key customers (30%)
Act as named technical contact for a handful of high-volume customers; jump on Slack huddles, unblock integrations, surface best practices
Track their feature requests, quantify impact, and champion them on our roadmap
Work through deep technical integration and implementation questions together with our customers, directly in their codebase
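To give a flavor of that proof-of-concept work, here is a minimal sketch in Python of the kind of reproduction script you might put together, assuming the v2-style Langfuse Python SDK client (method names and the exact SDK surface may differ by version; the trace and span names below are made up):

```python
# Minimal repro sketch for a support case, assuming the v2-style Langfuse
# Python SDK. Credentials are read from the LANGFUSE_PUBLIC_KEY,
# LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables.
from langfuse import Langfuse

langfuse = Langfuse()

# Recreate the customer's reported setup: one trace with a single nested span.
trace = langfuse.trace(name="support-repro-1234", user_id="repro-user")  # hypothetical names
span = trace.span(name="retrieval-step", input={"query": "example query"})
span.end(output={"documents": ["doc-a", "doc-b"]})

# Flush the event queue so everything reaches the Langfuse API before exit.
langfuse.flush()
```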
Must:
You deeply care about customers and want to help other engineers
Solid foundations in software engineering (CS or Data Science degree or comparable projects)
Strong writing: short Slack messages and long‑form guides/documentation
Curiosity for LLM application development
Bonus:
Past B2B solutions engineering, technical support, or DevRel
Open‑source maintainer
Why join us
You are joining really early
We have the greatest OSS community ❤️
Strong product-market fit & revenue growth!
We are building a very small but highly capable team
Only hiring engineers for the foreseeable future
Building for very sophisticated technical users
Competitive salary & equity
Zero‑bureaucracy budget for learning
Team that pushes you
Everyone on the team says: "best place I've ever worked at"
Engineering based in Berlin Mitte (we aim for 3-5 days in the office) + optional SF trips
Only 2 scheduled meetings per week
Extreme ownership & autonomy
Interview process: we can run it in less than 7 days.
Intro call (30 min) – meet a founder, talk motivation & logistics
Functional deep-dive (45 min) – discuss the experience that maps to this role and what makes you an excellent fit
Implement a demo project (take-home challenge, expected time 3 h) – we will share a small implementation challenge that uses Langfuse
Present your demo project (60 min) – walk us through your solution, answer follow-up questions, and share product feedback with us
On-site Super-Day (½ day) – we simulate a couple of core activities of this role in a mini work trial at our office to take both sides from "Yes" to "Hell Yes". You will also meet the whole team during this day
Decision & Offer – usually within 24 h.
We are very public about our strategy and product thinking, so feel free to start some deep research threads on this. Throughout the process there will be ample time for any questions that haven't already been addressed. We are generally super transparent; don't hesitate to ask any of the interviewers.
Links
All repos: github.com/langfuse
Company handbook: langfuse.com/handbook
Support Handbook: https://langfuse.com/handbook/support/support
Product Ops: https://langfuse.com/handbook/product-engineering/product-ops
How we hire: langfuse.com/handbook/how-we-hire
Blog: langfuse.com/blog
What We Do
Langfuse is the most popular open source LLMOps platform. It helps teams collaboratively develop, monitor, evaluate, and debug AI applications.
Langfuse can be self-hosted in minutes and is battle-tested and used in production by thousands of users, from YC startups to large companies like Khan Academy or Twilio. Langfuse builds on a proven track record of reliability and performance.
Developers can trace any Large Language Model or framework using our SDKs for Python and JS/TS, our open API, or our native integrations (OpenAI, LangChain, LlamaIndex, Vercel AI SDK). Beyond tracing, developers use Langfuse Prompt Management, its open APIs, and testing and evaluation pipelines to improve the quality of their applications.
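For illustration, here is a minimal sketch of tracing an OpenAI call via the drop-in integration mentioned above, assuming the Langfuse Python SDK's OpenAI wrapper (the model name and prompt are placeholders):

```python
# Minimal tracing sketch using the Langfuse drop-in wrapper for the OpenAI SDK.
# Langfuse credentials are expected in LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY,
# and LANGFUSE_HOST; the model and message below are placeholders.
from langfuse.openai import openai  # drop-in replacement for `import openai`

completion = openai.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize what Langfuse does in one sentence."}],
)
print(completion.choices[0].message.content)  # the call is traced automatically
```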
Product managers can analyze, evaluate, and debug AI products by accessing detailed metrics on costs, latencies, and user feedback in the Langfuse Dashboard. They can bring humans into the loop by setting up annotation workflows for human labelers to score their application. Langfuse can also be used to monitor security risks through security frameworks and evaluation pipelines.
Langfuse enables non-technical team members to iterate on prompts and model configurations directly within the Langfuse UI or use the Langfuse Playground for fast prompt testing.
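As a sketch of how that hand-off looks on the engineering side, assuming the get_prompt/compile interface of the Langfuse Python SDK (the prompt name and variable below are hypothetical):

```python
# Sketch: fetch a prompt that was edited in the Langfuse UI and fill in its
# variables. "welcome-email" and the `product` variable are hypothetical.
from langfuse import Langfuse

langfuse = Langfuse()
prompt = langfuse.get_prompt("welcome-email")     # fetches the production version
text = prompt.compile(product="Acme Analytics")   # substitutes the {{product}} variable
print(text)
```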
Langfuse is open source and we are proud to have a fantastic community on GitHub and Discord that provides help and feedback. Do get in touch with us!








