In Short: Langfuse is looking for a hands-on Customer Engineer to join our Berlin office. You will own the front line of our customer experience, keep a tight feedback loop with product engineering, and help thousands of AI-product teams succeed with tracing & evals.
About Langfuse
We are building the fastest-growing open-source LLM engineering platform.
We help engineers and enterprises build and run LLM applications at scale.
We've raised $4m from Y Combinator, Lightspeed and General Catalyst.
Thousands of teams and engineers use Langfuse daily
Thousands are joining every month
We're a small, focused team working from Berlin, Germany.
At Langfuse, support is not a cost center but a product discovery engine: engineers own their features end-to-end, fixes land fast, and users ideally see “magic” when their feedback ships the same day. You will be running that engine.
Improve Low Touch (70%)
Triage & solve inbound questions from Plain’s unified inbox (Slack Connect, email, GitHub, Discord)
Improve docs, snippets, and auto-reply prompts so the next 1,000 teams can self-serve the answer
Build AI-powered workflows to help customers faster
Create helpful content, e.g. blog posts, sample repos, and tutorials
Escalate bugs to product engineering and close the loop with the customer once shipped
Ship small improvements yourself to close the loop even faster
Engage High Touch (30%)
Act as named technical contact for a handful of high‑volume customers; jump on Slack huddles, unblock integrations, surface best practices
Track their feature requests, quantify impact, and champion them on our roadmap
Work through deep technical integration and implementation questions together with our customers, directly in their codebases
Improve the Product
Maintain the support → issue tracker flow: label, cluster, and prioritize insights so Product Engineering can act fast
Spin up proofs of concept (TypeScript/Python) to reproduce edge cases or demonstrate work-arounds
Solid foundations in software engineering (CS degree or comparable projects).
New grads and career‑switchers encouraged!
Excited about talking to other engineers, debugging APIs, and turning chaos into crisp docs
Strong writing: short Slack messages and long‑form guides
Curiosity for LLM application development
Bonus:
Experience in B2B solutions engineering, technical support, open-source maintenance, or DevRel
You are joining really early
We have the greatest OSS community ❤️
Strong product-market fit & revenue growth!
We are building a very small but highly capable team
Only hiring engineers for the foreseeable future
Building for very sophisticated technical users
Competitive salary & equity
Zero‑bureaucracy budget for learning
Team that pushes you
Everyone on the team says: "best place I've ever worked at"
Engineering based in Berlin Mitte (we aim for 4–5 days in the office) + optional SF trips
No more than 2 short scheduled meetings per week
Extreme ownership & autonomy over how and what to work on
We can run this process in less than 7 days.
Intro call (30 min) – meet a founder, talk motivation & logistics
Functional deep-dive (45 min) – discuss the experience that maps to this role and what makes you an excellent fit
Coding exercise (90 min) – this role is technical; during this call we go deep on your aptitude for helping customers get technical issues resolved and small fixes shipped
On-site Super-Day (½ day) – we simulate a couple of core activities of this role in a mini work trial to help both sides go from “Yes” to “Hell yes”. You will also get to meet the whole team during this day.
Decision & Offer – usually within 24 h.
We are very public about our strategy and product thinking, so feel free to start some deep research threads on this. Throughout the process, there will be ample time for any questions you may have that haven't already been addressed. We are generally super transparent; don't hesitate to ask any of the interviewers.
Links
Our codebase: GitHub Repository
Some tools we use: Plain.com, Dosu, Inkeep
Ben Kuhn on Impact, Agency and Taste
Who we are & how we work
How we use AI to scale: blog post
What We Do
Langfuse is the most popular open-source LLMOps platform. It helps teams collaboratively develop, monitor, evaluate, and debug AI applications.
Langfuse can be self-hosted in minutes and is battle-tested, used in production by thousands of users from YC startups to large companies like Khan Academy or Twilio. Langfuse builds on a proven track record of reliability and performance.
Developers can trace any large language model or framework using our SDKs for Python and JS/TS, our open API, or our native integrations (OpenAI, LangChain, LlamaIndex, Vercel AI SDK). Beyond tracing, developers use Langfuse Prompt Management, its open APIs, and testing and evaluation pipelines to improve the quality of their applications.
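For illustration only (not part of this posting), a minimal sketch of what such an integration can look like, assuming the Python SDK's drop-in OpenAI wrapper; the model name and prompt are placeholders, and credentials are read from the LANGFUSE_* and OPENAI_API_KEY environment variables:

    # Minimal sketch: the langfuse.openai wrapper records the request, response,
    # token usage, and latency as a trace in Langfuse.
    from langfuse.openai import OpenAI  # drop-in replacement for the OpenAI client

    client = OpenAI()
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": "Summarize Langfuse in one sentence."}],
    )
    print(completion.choices[0].message.content)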
Product managers can analyze, evaluate, and debug AI products by accessing detailed metrics on costs, latencies, and user feedback in the Langfuse Dashboard. They can bring humans into the loop by setting up annotation workflows for human labelers to score their application. Langfuse can also be used to monitor security risks through security frameworks and evaluation pipelines.
Langfuse enables non-technical team members to iterate on prompts and model configurations directly within the Langfuse UI or use the Langfuse Playground for fast prompt testing.
Langfuse is open source and we are proud to have a fantastic community on GitHub and Discord that provides help and feedback. Do get in touch with us!