We have touched upon interoperability in our terminology lifecycle management posts, but today we reached a major milestone with our back-end tool, so it’s time for a second helping. It may seem redundant to develop our own software for converting and managing data when solutions available in every LSP’s toolbox, such as qTerm, MultiTerm or Swordfish, also handle conversion to a certain extent. To be fair, their interoperability capabilities satisfy straightforward, elementary processes; as complexity rises, however, their limitations become apparent.
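To give a feel for the kind of conversion work involved, here is a minimal sketch that flattens a simplified TBX-style term entry into CSV rows. The sample entry and column names are illustrative assumptions, not the schema of our tool or of any product mentioned above.

```python
# Illustrative sketch: flattening a (simplified) TBX-style term entry
# into CSV rows. The sample data and field names are hypothetical.
import csv
import io
import xml.etree.ElementTree as ET

SAMPLE_TBX = """
<termEntry id="c42">
  <langSet xml:lang="en">
    <tig><term>track changes</term></tig>
  </langSet>
  <langSet xml:lang="de">
    <tig><term>Aenderungen nachverfolgen</term></tig>
  </langSet>
</termEntry>
"""

def tbx_entry_to_rows(entry_xml):
    """Yield (entry_id, language, term) tuples from one termEntry element."""
    entry = ET.fromstring(entry_xml)
    entry_id = entry.get("id", "")
    for lang_set in entry.findall("langSet"):
        # xml:lang lives in the XML namespace, so the full attribute key is needed
        lang = lang_set.get("{http://www.w3.org/XML/1998/namespace}lang", "")
        for term in lang_set.iter("term"):
            yield (entry_id, lang, term.text or "")

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["entry_id", "language", "term"])
writer.writerows(tbx_entry_to_rows(SAMPLE_TBX))
print(buffer.getvalue())
```

A one-entry round trip like this is exactly the “elementary process” off-the-shelf tools handle well; the pain starts when entries carry nested metadata, cross-references and tool-specific extensions.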
After zooming through terminology lifecycle management in the past few weeks, let’s continue with off-the-wall Beatles references and a case study.
We must cultivate our gardens. Even the most robust structures tend to bend out of shape over time and need maintenance. Managing terminology is not a finite project but a continuous effort, and once the tools and methods have been established, it becomes harder and harder to fix conceptual mistakes. Initial choices can be a pest and cause adverse effects in the long run, and it is a common mistake to go for the options already available instead of customizing the whole process, which resonates with the proverb: if all you have is a hammer, everything looks like a nail. Now that all the vagueness is out of the way, let’s get down to brass tacks.
The purpose of terminology management is to give companies a face, enabling them to speak with one voice and develop their brands. Is there an optimal, all-around solution to achieve this? Probably not. Businesses with different portfolios, market penetration and complexity face different challenges. The Common Sense Advisory model, close to being the gold standard, classifies localization maturity on a scale of 0-5 (let’s graciously skip the negative grades). The top of the food chain, level five, is usually inhabited by mammoth corporations such as Microsoft, IBM or Intel, which typically have an in-house, proprietary workflow model and IT infrastructure serving their specific needs. For smaller organizations, assimilating localization management to such an extent may not only be overkill, but would also make a significant dent in their budget without a proven ROI. Having said all this, there are common considerations for any company, whether it opts for complete integration, outsourcing or Software-as-a-Service (SaaS).
Insanity is hereditary – we get it from our children, or, in this particular case, our products. While added value and perfection come from human input, if the setup is not flexible and well thought out, terminology management can quickly grow unwieldy and turn into a moody struggle with constant retrofitting.
Conceptualization & Scope
Whenever the need arises for a new terminology model, the first phase should always be a sketch of the final product. Jumping into erecting a cathedral just to tear it down is best left to masochists. Because of dependencies, this stage usually cannot run in parallel with others, and even in time-sensitive cases it is not worth trimming the time allocated to it.
Because of the sensitive and complex nature of conceptualization, it is best if an experienced localization lead conducts interviews with all parties involved in order to integrate all ideas into a consolidated concept. Such a “champion” is hard to find: they have to understand every aspect of localization, from translation through IT development, infrastructure and workflow models to softer factors such as empowering participants and giving them a sense of ownership. It is tempting to dismiss the latter as less important, but they are crucial to bringing home quality and productivity at the same time.
The answers given in this phase determine how the final structure will shape up and how it will be developed, governed and maintained. What will be the scope of the terminology database? Is only organization-specific terminology needed, and if so, how should terms be picked and identified? If generic domain terminology is on the table, do we have guidelines as to its scope? Should the terminology database be fragmented, or is an integrated version preferred? What kinds of meta-data are needed? How will consistency be ensured with legacy content or translation memories? If the way data are stored introduces redundancy, and we opt for that, what preventive measures should be employed to avoid additional maintenance time and potential inconsistencies? Is the terminology business-critical, or can some fuzziness be allowed? Should the structure be optimized for conceptual clarity or usability (in other words: form and/or content)?
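To make the meta-data question a little more concrete, here is one hypothetical way a term entry record could make scope, status and provenance explicit. The field names are our own illustrative assumptions, not a standard or the schema of any tool.

```python
# Hypothetical term entry record making scope, status and provenance
# explicit. Field names are illustrative assumptions, not a standard.
from dataclasses import dataclass, field

@dataclass
class TermEntry:
    term: str
    language: str                 # e.g. "en-US"
    domain: str                   # generic vs. organization-specific scope
    status: str = "proposed"      # e.g. proposed / approved / deprecated
    definition: str = ""
    source: str = ""              # provenance: who added it, and from where
    forbidden_variants: list = field(default_factory=list)

entry = TermEntry(
    term="track changes",
    language="en-US",
    domain="software-ui",
    status="approved",
    forbidden_variants=["revision marks"],
)
print(entry.status)
```

Even a toy model like this forces several of the questions above to be answered up front: a `status` field implies a governance workflow, and `forbidden_variants` implies a decision about how consistency with legacy content will be enforced.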
These are only a few select questions that should come to mind before going forward.
Architecture & Infrastructure
Once the concept is finalized and agreed on, the project can proceed to the actualization stage: finding or developing the proper tool. The market for terminology management tools is not saturated, to say the least, and each of them has its advantages and drawbacks. To mention just two, Kilgray’s qTerm and Acrolinx’s IQ serve very different needs, but both need to be customized so that the requirements established in phase one can be met.
And then there’s the desired level of integration: a tough question. Embedded terminology management on the organization’s side has the benefit of better integration with authoring systems or proprietary project and content management systems, but it may not deliver the desired linguistic quality if translation is outsourced. The SaaS model, usually handled by language service providers, offers the added value of experience as well as better integration with the localization ecosystem. Depending on your focus (and your luck), either can be a successful and productive choice.
Deployment and Ownership
Ownership in this sense does not pertain to content property rights but to responsibility for maintenance and deployment. This is another decision that should be made in the initial drafting stage to avoid misunderstandings and miscommunication further down the road. Regardless of who ends up with which responsibility, it is best to assign mutually dependent tasks and decision-making power to the same party; otherwise the constant back-and-forth is likely to poison both the process and the relationship.
Before this post gets on your nerves, let’s halve it, and continue with governance and maintenance next week. As always, stay tuned!
We hope you are already famished for another LABS post after last week’s hiatus. With all the terminology talk in the industry on best practices, database creation and concepts, there is no immediate urge for us to add to the sea of discussions. It’s not that we have a silver bullet for every issue, but there are two topics where the conversation seems to be quieting down. Interoperability and lifecycle management are not hitting the charts, but they are key to integrated, flexible solutions, so this week we try to get a word in edgeways about our experiences.