Throughout history, communication has been a crucial component of every aspect of human endeavor. From business and diplomacy to, more recently, technology, the ability to express and understand ideas remains a major driver of progress across all sectors. Globalization and the advent of instant, API-driven translation technologies have made it more important still, not only for trade but also for news, legal matters and more.
Given that there are over 7,000 languages, content creation and translation have predictably become essential work. However, the exponential data deluge has long outstripped the capacity of human services to meet the growing needs of publishers and users of translation services. Although the field has historically been less affected by technology than other spheres, because automated tools could not pick up language nuances as well as humans, advanced artificial intelligence and machine learning software has pushed the state of the art to near human parity.
Big data processing
One of the key drivers of the advanced AI software being deployed for translation services is access to huge datasets and to computers that can process them far more efficiently than was ever possible before. Using both, Google, for instance, has drastically improved its translation service, translating 300 trillion words compared with an estimated 200 billion words translated by the professional translation industry in 2019. Although the translations can often be off the mark, the improvement has been significant. This has changed the lives of many businesses and users who previously could not afford translation services. But it has also raised concerns about privacy and about customizing those generic engines for specific industrial and commercial needs.
Although one might get the impression that only huge companies can access and mine big data to build or improve machine translation systems, nothing could be further from the truth. The underlying machine learning, although advanced, is often open source, enabling companies of all sizes to access it and customize it to suit their specific needs. The key questions, then, are how these companies build their pre- and post-processing and how much data they amass to build their own systems.
The sheer number of languages used globally has opened an opportunity for businesses to specialize in specific languages as well as specific sectors: while the everyday person might use Google to translate tweets, businesspeople who need highly accurate translations of a legal document will typically continue to hire service providers who specialize in that field. But even in these cases, the amount of data gathered (sometimes many hard disks) makes machine translation an ally of the legal profession, defense and law enforcement as they transfer information from one language to another.
Will professional translators go extinct soon? That question has been raised in many professions affected by AI and near-human-parity automation. Savvy companies are building on the concept of the human in the loop: humans review machine-translated output and make stylistic changes and improvements, and that feedback in turn improves the quality of the translation software. Hardly any language company now proposes to translate a document from scratch. Instead, more efficient processes run files and documents through local or cloud machine translation software and then have people review the output for accuracy. That way, the work can be done faster without compromising the high level of accuracy required.
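The machine-draft-then-human-review loop described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the translation "engine" is a toy word glossary, and the reviewer feedback is stored as a simple correction memory rather than being used to retrain a model, as real systems would.

```python
def machine_translate(text: str) -> str:
    """Stand-in for a real MT engine; returns a literal draft translation."""
    glossary = {"hola": "hi", "mundo": "world"}  # toy data, not a real model
    return " ".join(glossary.get(word, word) for word in text.split())

class PostEditingLoop:
    """Human-in-the-loop workflow: draft by machine, correct by reviewer."""

    def __init__(self):
        # Maps machine-drafted phrases to human-approved replacements.
        self.corrections = {}

    def draft(self, source: str) -> str:
        # Produce a machine draft, then apply any stored human corrections.
        out = machine_translate(source)
        for bad, good in self.corrections.items():
            out = out.replace(bad, good)
        return out

    def review(self, source: str, approved: str) -> None:
        # A human reviewer supplies the approved translation; if it differs
        # from the machine draft, remember the correction for future drafts.
        draft = machine_translate(source)
        if draft != approved:
            self.corrections[draft] = approved

loop = PostEditingLoop()
print(loop.draft("hola mundo"))           # machine draft: "hi world"
loop.review("hola mundo", "hello world")  # reviewer prefers "hello"
print(loop.draft("hola mundo"))           # corrected: "hello world"
```

The point of the sketch is the division of labor: the machine does the bulk translation once, and each human review both fixes the current document and improves future output.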
Although globalization has long driven demand for translation services, recent events like the Covid pandemic have accelerated that demand exponentially. With the current focus on distributed work and on limiting in-person contact to essential occasions, companies have come to rely on AI tools to facilitate communication across language barriers.
That increase in demand has in turn accelerated the development of machine translation technology. In only five years, the field has moved from rule-based and statistical models to neural machine translation (NMT), based on neural networks that seek to mimic the way a human translator would handle and translate documents. As more focus is placed on the sector, the pace of development and the involvement of humans in the loop will continue to increase, and with them the efficiency and accuracy of machine translation software.
The old method of having human translators pore over documents and translate them line by line has become outdated and practically extinct, except in a few specialized cases. Twenty-first-century enterprises handling big data expect high-quality, immediate language deliveries. Advanced artificial intelligence now makes it possible for companies of all sizes to compete on content publishing using specialist machine translation, particularly when they focus on specific languages and use cases. That has also opened up opportunities for innovators and entrepreneurs to provide specialized solutions for the growing, globalization-driven demand.
Read the original article on Entrepreneur