There is a strong sense that the economic, cultural and political landscape is becoming more deeply integrated and interdependent. Nonetheless, the metrics and data collection frameworks for measuring the patterns and structures taking shape are still being developed and tested. Today there is no consensus on how globalization should be measured, let alone how pervasive the process is. Different schools hold different views on whether globalization has already rendered the world flat in terms of information, people and capital flows. Certain data points suggest that cross-border differences and locality play a more significant role than most of us would guess:

  • International calls, including internet telephony, account for only about 6% of all calls;
  • First-generation immigrants still make up no more than 3% of the world’s population;
  • Direct cross-border investment peaked at 9% in 2012;
  • The export-to-GDP ratio stands at about 30%; however, after correcting for the multiple counting of intermediate trade, the figure is likely closer to 20%.

Even if the hard facts don’t always confirm expectations, it is unmistakable that embracing globalization practices is a prime enabler of economic viability.

Hic et ubique?

Ubiquity, Mobility and Immediacy

In a ubiquitous environment, serving global audiences is the norm, not the exception. The challenge of locale-specificity has been shifting from addressing special, one-off requirements to an integrated view that treats globalization as an inherent property of the business model. This necessity affects small and large businesses alike, as the ubiquity of information brings about exponential growth of content and accelerating complexity.

Every day, we create approximately 2.5 quintillion bytes of data, and more and more people are getting connected. Microsoft currently localizes into 127 languages, reaching about a billion people, but to capture the long tail of 4 billion potential customers living in emerging markets, organizations will need to produce and localize content in more than a thousand languages. Traditional content supply chains cannot cater for such demand, and waterfall processes and top-down problem solving are increasingly ill-suited to managing volumes and complexity at this scale.

Building an ecosystem

Proponents of globalization often see its advantages as diffuse and its drawbacks as overly concentrated. Either way, globalization highlights the importance of scalability, flexibility and information sharing. Organizations that fail to globalize usually make the mistake of compartmentalizing their processes and creating internal fragmentation. Given the rate of information flow and the demand shifts to be expected today, it is a luxury to retain linear, highly co-dependent processes that feel like a hangover from an earlier era. A flexible ecosystem with the capacity to accommodate multiple teams and verticals as the business grows or demand changes is far easier to set up and recalibrate as needed. A fragmented environment, by contrast, is a barrier that costs time, breeds miscommunication, deflates the customer experience and prevents the leveraging of available assets. Integrating the various stakeholders in the organization and involving external partners, such as localization providers, is not only a catalyst for growth but a sheer necessity to remain competitive. The topology of the organization is also an important factor; an implementation that is optimal for one company may be inefficient for another, depending on industry, company size, driving philosophy and people.

Instant content

Availability of content is not enough; it needs to be instant. By some estimates, 99% of the content translated today already passes through machine translation. Automated MT tools have become part of our everyday lives even where we don’t expect them, while business-critical use of MT is still frowned upon in many sectors, and with reason. However, it will be increasingly important to capitalize on the potential of the long tail and reach out to customers in new markets. With the proliferation of content as well as languages, current translation and localization models will break down, making “good enough” quality a justifiable trade-off. A cloud-based framework that provides access to an integrated, comprehensive repository of data can support localization and authoring teams alike in producing the desired level of quality in a consolidated environment.
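The interplay between an approved-translation repository and “good enough” MT output can be sketched as follows. This is a minimal illustration, not a real system: the names `TranslationMemory` and `mt_translate` are hypothetical stand-ins for a cloud-hosted repository and an MT engine.

```python
# Minimal sketch: prefer human-approved translations from a shared repository,
# fall back to machine translation for the long tail.
# All names here are illustrative, not a real API.

class TranslationMemory:
    """In-memory stand-in for a cloud-hosted repository of approved translations."""

    def __init__(self):
        self._store = {}  # (source_text, target_lang) -> approved translation

    def add(self, source, lang, translation):
        self._store[(source, lang)] = translation

    def lookup(self, source, lang):
        return self._store.get((source, lang))


def mt_translate(source, lang):
    # Placeholder for an MT engine call; a real system would invoke one here
    # and flag the output as "good enough" rather than approved.
    return f"[MT:{lang}] {source}"


def translate(tm, source, lang):
    """Return a translation plus a quality flag, so consumers can tell
    approved content apart from raw MT output."""
    hit = tm.lookup(source, lang)
    if hit is not None:
        return hit, "approved"
    return mt_translate(source, lang), "machine"


tm = TranslationMemory()
tm.add("Hello", "de", "Hallo")
print(translate(tm, "Hello", "de"))  # approved repository hit
print(translate(tm, "Hello", "fr"))  # MT fallback
```

The quality flag matters: it lets authoring and localization teams work in the same consolidated environment while still distinguishing which content has met the desired quality bar.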

Empowering the customer

The converging trends are disrupting the one-way, provider-to-customer communication channel. A social media presence is an obvious change, but many organizations still shy away from harnessing the interactivity it affords. MT has already come into play in the translation of support materials and knowledge bases, tapping into the knowledge and expertise of users. The next step will blur the boundary between the walled garden and the community space by monetizing contributions, thus extending the possibilities for active collaboration. IT companies are traditionally quick to adopt new methods and technologies, and some already use this approach. Along the lines of preventing fragmentation and facilitating knowledge transfer, having a community manager on board may be instrumental in bringing in user-generated content and, consequently, in enriching the company’s know-how.

From Big Data to Intelligent Content

According to a 2012 survey of 1,500 CEOs, CFOs and CIOs, almost half of the respondents reported that Big Data is essential to understanding their customers, yet only 13% said that its integration is on their agenda. Harnessing the power of Big Data and deep-web research is an enthralling possibility, but it raises confidentiality questions, not to mention technical obstacles: volume, scattered and disconnected islands of data, dynamic content, association rules, a copious variety of content structures and implementations, and so on. Future content models are unlikely to be standardized, just as today’s are not, but structured, semantically aware, discoverable and adaptable content models can fuel productivity, galvanize visibility and help detect trends.

The challenges are on our doorstep, for language solution providers and buyers alike.

