HumanWox

Help build the governance layer for AI

AI adoption is outpacing governance capacity. We are building the operational infrastructure that organisations need to register, assess, and govern their AI systems. If that problem interests you, we would like to hear from you.

Why HumanWox

Work that matters

AI governance is becoming critical infrastructure. The tools organisations use to manage risk, demonstrate accountability, and satisfy regulators are being defined right now. Your work here shapes how that tooling works.

Small team, high ownership

We are a small, focused team. Everyone has direct influence on the product, the architecture, and the direction. No layers. No approval chains for decisions that should be made by the people doing the work.

Remote-first, UK-based

We work remotely across the UK. We communicate asynchronously by default and meet when it is useful, not because it is scheduled. We value clear writing over frequent meetings.

Technical depth over process

We hire people who care about building things properly. Our stack is modern (Next.js, TypeScript, Supabase, Vercel) and we invest in getting the architecture right rather than shipping theatre.

Open positions

We have no advertised positions at the moment, but we are always interested in hearing from people with strong backgrounds in any of the following:

  • Full-stack engineering (TypeScript, React, Next.js)
  • AI governance, risk, and compliance
  • Product design and UX for enterprise SaaS
  • Technical writing and content strategy

If you are interested, send a CV and a short note on what draws you to this space to careers@humanwox.com. We read everything.

How we work

We believe good software comes from people who understand the problem, not just the technology. Everyone here spends time with users, reads the standards, and understands the regulatory context that shapes our product.

We ship frequently, review each other's work honestly, and treat disagreement as a feature. The best idea wins, regardless of who proposes it. We do not optimise for consensus. We optimise for building the right thing.