
Introduction to BERT

Published on 11 September 2020 by Alice Chan

BERT as a tool for Google to enhance its search engine

Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing (NLP) technique created by Google to help its search engine better understand the content on a web page. Compared with single-direction language models, this open-source model is trained bidirectionally and therefore captures a deeper sense of language context and flow. It performs better on a wide array of monolingual tasks such as question answering and natural language inference.
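As a rough illustration (not part of the original article), the snippet below uses the open-source Hugging Face transformers library to run a pretrained BERT checkpoint on a masked sentence. Because BERT reads the whole sentence at once, the words on both sides of the mask shape its predictions.

```python
# Minimal sketch of BERT's bidirectional masked-language-model behaviour,
# assuming the Hugging Face "transformers" library and the public
# "bert-base-uncased" checkpoint are available.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Both the left context ("sharp") and the right context ("in quarterly profit")
# influence which tokens BERT proposes for the masked position.
for prediction in fill_mask("The company reported a sharp [MASK] in quarterly profit."):
    print(prediction["token_str"], round(prediction["score"], 3))
```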

BERT is not ideal for translation itself, but it is useful for pretraining

A September 2019 paper by South Korean internet company NAVER concluded that the information encoded by BERT is useful but, on its own, insufficient to perform a translation task. However, it did note that “BERT pretraining allows for a better initialization point for [an] NMT model” if it can be trained for one source language and further reused for several translation pairs.[1]
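As a hedged sketch of that idea (the checkpoint names and settings below are illustrative, not taken from the NAVER paper), a pretrained multilingual BERT model can be used to warm-start the encoder and decoder of a sequence-to-sequence NMT model, which is then fine-tuned on parallel data:

```python
# Sketch: initialize an encoder-decoder NMT model from pretrained BERT weights
# using Hugging Face transformers. Checkpoint names are illustrative.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")

# Encoder and decoder both start from BERT; the decoder's cross-attention
# layers are new and are learned during fine-tuning on parallel text.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased", "bert-base-multilingual-cased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# The same pretrained encoder could then be reused as the starting point
# for several translation pairs that share the source language.
```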

Multiple techniques can improve the performance of NMT

Within the specific domain of finance, DeepTranslate combines multiple techniques to perform neural machine translation (NMT). Its AI system models sentences containing financial jargon on the basis of a massive volume of source texts from the stock market. By applying named entity recognition, DeepTranslate can even generate the lists of directors and subsidiaries that appear in the financial reports of listed companies. The DeepTranslate team not only builds a translation memory (TM) for you, but also provides customized training of your TM to meet your needs.
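For readers unfamiliar with named entity recognition, the toy example below (a generic, publicly available NER pipeline, not DeepTranslate's proprietary system) shows how organisation and person names can be pulled out of report-style text:

```python
# Toy named-entity-recognition example with Hugging Face transformers,
# using the library's default English NER model; this is only an
# illustration, not DeepTranslate's actual pipeline.
from transformers import pipeline

ner = pipeline("ner", aggregation_strategy="simple")

text = ("The board of directors of Example Holdings Limited comprises "
        "Mr. John Lee and Ms. Mary Wong; its principal subsidiary is "
        "Example Finance (HK) Limited.")

for entity in ner(text):
    # entity_group is a coarse label such as ORG or PER; word is the matched span.
    print(f'{entity["entity_group"]:>4}  {entity["word"]}')
```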
