Newsletter · Mar 05, 2026

Observations from Loc Tech Live 2026

Libor Safar
Last week, I spent two afternoons glued to my screen attending Loc Tech Live 2026, a virtual conference that brought together 340 localization professionals to discuss the messy, technical "how" of localization.

The event went deep: architecture diagrams, code demos, and the real infrastructure behind localization at scale. Not my natural habitat, to be honest. But given how localization is evolving, it's a world everyone needs to understand, at least to some extent.

The speakers were some of the smartest, most tech-savvy folks in the industry, and their organizations are at the top tier of AI maturity. Yet, or perhaps because of this, they painted a clear, coherent picture of where the industry is heading:

1) "AI localization infrastructure" is becoming an orchestration layer

I know, big words. But across very different organizations, large and small, content producers and service providers alike, the same architecture keeps appearing:

  • A consistent orchestration layer
  • Routing (to the right engine, workflow, or human)
  • Governance + feedback loops (quality, risk, cost)

The shift is away from monolithic point solutions and toward modular, orchestration-based approaches.

The idea is to give teams flexibility while maintaining centralized governance. Product teams care about speed. Marketing teams care about nuance. Legal teams care about compliance. Forcing everyone through the same narrow workflow creates shadow systems and kills adoption.

Your localization "platform" doesn't need to be a single tool. It's the middleware layer that routes content intelligently, applies quality checks consistently, and learns from feedback loops between translation, evaluation, and routing.
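The routing-plus-feedback pattern described above can be sketched minimally. This is a hedged illustration, not any vendor's actual implementation: the content types, workflow names, and the `route`/`record_quality` functions are all hypothetical, standing in for whatever engines and review steps an organization actually uses.

```python
from dataclasses import dataclass

@dataclass
class ContentJob:
    text: str
    content_type: str   # e.g. "ui", "marketing", "legal"
    risk: str = "low"   # governance flag: "low" or "high"

# Hypothetical routing table: content type -> workflow.
# Product strings favor speed, marketing favors nuance,
# legal favors compliance.
ROUTES = {
    "ui": "mt_engine",
    "marketing": "mt_plus_human_review",
    "legal": "human_translator",
}

def route(job: ContentJob) -> str:
    """Pick a workflow; escalate high-risk or unknown content to humans."""
    if job.risk == "high":
        return "human_translator"
    return ROUTES.get(job.content_type, "human_translator")

# Feedback loop: record quality scores per workflow so the
# routing table can be re-tuned as evidence accumulates.
feedback: dict[str, list[float]] = {}

def record_quality(workflow: str, score: float) -> None:
    feedback.setdefault(workflow, []).append(score)
```

The point of the sketch is the shape, not the code: routing logic, governance flags, and quality feedback live in one thin middleware layer, while the engines and human steps behind the workflow names can be swapped freely.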

2) The missing slide: human infrastructure

It wasn't stated explicitly during the conference, but there's a clear "human infrastructure" we don't talk about enough, one that's as important as the technical side.

The best systems are built with, and often by, tech-savvy localization professionals who bring years, if not decades, of domain experience. Not by "Colin the AI dev whiz kid" who assumes localization is a solved engineering problem that can be shipped "in a day or two."

In other words, “AI localization infrastructure” isn’t only technical infrastructure. It’s human infrastructure:

  • people who know what “good” looks like in your context,
  • people who can translate that into practical rules and review processes,
  • people who can influence internal teams so the workflow actually gets used.

It’s tempting, especially in an era of layoffs explained away as “AI efficiency,” to treat institutional knowledge as replaceable. But the conference reinforced the opposite:

You can always swap out tools. You can't easily replace institutional knowledge about what good localization actually means for your organization.

This human infrastructure is at least as important as any technical or AI infrastructure. Chief AI Officers are cool (CAIO!), but heads of HR are equally critical for any organization committed to AI transformation.

3) Build vs. buy is being reframed, not replaced

Several examples showed how small, targeted custom tools (for narrow, deterministic problems) can deliver outsized ROI, especially with AI-assisted coding.

But the underlying question hasn't disappeared. The examples are impressive and inspiring, but they aren't necessarily a blueprint everyone should follow.

Does the world need 10,000+ custom, home-grown AI localization platforms? In the global scheme of things, this is hardly efficient long-term. For most organizations, this isn't how they'll build competitive advantage or use limited resources effectively.

"Custom platform vs. off-the-shelf" was never a binary choice. It still isn't.

The question isn't "can we build it?" but "should we own the whole thing long-term?"

The invisible cost isn't in the initial build. It's in maintenance (as the organization evolves), knowledge transfer (when the builder leaves), integration complexity (as we add more custom pieces), security, scalability, and opportunity cost.

4) A localization-flavored version of the "SaaS-pocalypse" debate

There’s a broader debate about whether AI “kills SaaS.” Whatever the outcome, the pattern from past disruptions is familiar: early fragmentation, then consolidation around platforms that can absorb complexity while still delivering specialized value.

The CRM analogy is imperfect but useful: many organizations built custom CRMs in the early 2000s and later regretted it as the ecosystem matured. Building a custom CRM is much easier in 2026, but “easier” is not the same as “strategic.”

Localization might follow a similar path: solutions that leverage AI and embed decades of localization expertise, rather than thousands of organizations each trying to encode that knowledge from scratch.

That said, many organizations have legitimate reasons for building custom AI solutions. Unique content types, proprietary formats, specific security requirements, or integration needs often can't be addressed by off-the-shelf solutions.

There's also the danger of platform lock-in. Any dependency is a risk, and platform dependency is no exception. Consolidation often leads to reduced innovation, higher prices, and less flexibility.

The best outcome for 2026 isn't everyone moving to a few large platforms. Instead, the industry needs: (1) better education about the total cost of ownership of DIY solutions, (2) more modular, interoperable tools that reduce lock-in, (3) stronger industry standards, and (4) recognition that different organizations need different levels of customization. That's the conversation worth having this year.

5) Localization managers as ultimate risk managers

The conference confirmed this oft-repeated truth and revealed just how many tech risks now need managing on top of everything else.

The risks are multiplying: AI hallucinations, data privacy, prompt injection, model drift, tokenization costs that explode for non-English languages (see my previous newsletter), vendor lock-in, and technical debt from hastily built systems.

Teams manage these risks the way they always have: by identifying, measuring, and mitigating them. Nothing new, perhaps; this has always been the job.

What this means for your 2026 strategy

Three takeaways for teams building or rebuilding their localization infrastructure:

  1. Think orchestration, not consolidation. Your platform can be a collection of specialized tools connected by smart middleware, not a single monolithic system.
  2. Meet people where they work. The best workflow disappears into existing tools and processes. It doesn't force everyone through a brand-new special portal.
  3. Invest in human infrastructure. Tech-savvy localization professionals who understand both the domain and the technology are your most valuable asset. Build that capability.

Kudos to the organizers (Kevin O'Donnell, Oleksandr Pysaryuk, Melissa Sterner and Stefania Russo) and all the speakers who shared technical specifics so generously. Looking forward to Loc Tech Live next year!

© 2026 Libor Safar
