{"id":5267,"date":"2020-09-11T14:18:00","date_gmt":"2020-09-11T06:18:00","guid":{"rendered":"https:\/\/deeptranslate.hk\/?p=5267"},"modified":"2021-06-17T17:20:03","modified_gmt":"2021-06-17T09:20:03","slug":"introduction-to-bert","status":"publish","type":"post","link":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/","title":{"rendered":"Introduction to BERT"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"5267\" class=\"elementor elementor-5267\" data-elementor-settings=\"[]\">\n\t\t\t\t\t\t<div class=\"elementor-inner\">\n\t\t\t\t\t\t\t<div class=\"elementor-section-wrap\">\n\t\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-401564e elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"401564e\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t\t\t<div class=\"elementor-row\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-c1e08f0\" data-id=\"c1e08f0\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-column-wrap elementor-element-populated\">\n\t\t\t\t\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-a3e0692 elementor-widget elementor-widget-text-editor\" data-id=\"a3e0692\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-text-editor elementor-clearfix\">\n\t\t\t\t\t<p>Published on 11 September 2020 by Alice Chan<\/p>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f47d12d elementor-widget elementor-widget-heading\" data-id=\"f47d12d\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">BERT as a tool for Google to enhance its search engine<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9d0618d elementor-widget elementor-widget-text-editor\" data-id=\"9d0618d\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-text-editor elementor-clearfix\">\n\t\t\t\t\t<p>Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing (NLP) technique created by Google to help its search engine better understand the content on a web page. Compared with single-direction language models, this open-source model, which is trained bidirectionally, develops a deeper sense of language context and flow. It demonstrates better performance on a wide array of monolingual tasks such as question answering and language inference.<\/p>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8d6c3ea elementor-widget elementor-widget-heading\" data-id=\"8d6c3ea\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">BERT is not ideal for translation itself, but useful for pretraining<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ff992be elementor-widget elementor-widget-text-editor\" data-id=\"ff992be\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-text-editor elementor-clearfix\">\n\t\t\t\t\t<p>A September 2019 paper by South Korean internet company NAVER concluded that the information encoded by BERT is useful but, on its own, 
insufficient to perform a translation task. However, it did note that \u201cBERT pretraining allows for a better initialization point for [an] NMT model\u201d if it can be trained for one source language and further reused for several translation pairs.<sup>1<\/sup><\/p>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-21a6b04 elementor-widget elementor-widget-heading\" data-id=\"21a6b04\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Multiple techniques can improve NMT performance<\/h2>\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-aaeaee7 elementor-widget elementor-widget-text-editor\" data-id=\"aaeaee7\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-text-editor elementor-clearfix\">\n\t\t\t\t\t<p>Specializing in the financial domain, DeepTranslate uses multiple techniques to perform neural machine translation (NMT). Its AI system can model sentences containing financial jargon, drawing on a massive volume of stock market sources. By applying named entity recognition, DeepTranslate can even generate the specific lists of directors and subsidiaries in the financial reports of listed companies. 
The DeepTranslate team not only builds a translation memory (TM) for you, but also provides customized training for your TM to meet your needs.\u00a0<\/p>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-923859b elementor-widget elementor-widget-text-editor\" data-id=\"923859b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-text-editor elementor-clearfix\">\n\t\t\t\t\t<a style=\"color: #f7811b;\" href=\"https:\/\/www.aclweb.org\/anthology\/D19-5611.pdf\">https:\/\/www.aclweb.org\/anthology\/D19-5611.pdf<\/a>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-173c6e3 elementor-widget elementor-widget-image\" data-id=\"173c6e3\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-image\">\n\t\t\t\t\t\t\t\t\t\t\t\t<img width=\"2560\" height=\"1707\" src=\"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg\" class=\"attachment-full size-full\" alt=\"Introduction to different computer-aided translation\" loading=\"lazy\" srcset=\"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg 2560w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled-900x600.jpg 900w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-300x200.jpg 300w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-1024x683.jpg 1024w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-768x512.jpg 768w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-1536x1024.jpg 
1536w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-2048x1365.jpg 2048w, https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-400x267.jpg 400w\" sizes=\"(max-width: 2560px) 100vw, 2560px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing (NLP) technique created by Google to help its search engine better understand the content on a web page.<\/p>\n","protected":false},"author":3,"featured_media":5222,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[36],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v18.1 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Introduction to BERT - deeptranslate.hk<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Introduction to BERT - deeptranslate.hk\" \/>\n<meta property=\"og:description\" content=\"Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing (NLP) technique created by Google to help its search engine better understand the content on a web page.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/\" \/>\n<meta property=\"og:site_name\" content=\"deeptranslate.hk\" \/>\n<meta 
property=\"article:published_time\" content=\"2020-09-11T06:18:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-06-17T09:20:03+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2560\" \/>\n\t<meta property=\"og:image:height\" content=\"1707\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"support@manfulls.com\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"1 minute\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebSite\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/#website\",\"url\":\"https:\/\/deeptranslate.hk\/en\/\",\"name\":\"deeptranslate.hk\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/deeptranslate.hk\/en\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg\",\"contentUrl\":\"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg\",\"width\":2560,\"height\":1707,\"caption\":\"Introduction to different computer-aided 
translation\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#webpage\",\"url\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/\",\"name\":\"Introduction to BERT - deeptranslate.hk\",\"isPartOf\":{\"@id\":\"https:\/\/deeptranslate.hk\/en\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#primaryimage\"},\"datePublished\":\"2020-09-11T06:18:00+00:00\",\"dateModified\":\"2021-06-17T09:20:03+00:00\",\"author\":{\"@id\":\"https:\/\/deeptranslate.hk\/en\/#\/schema\/person\/c71a8d20ce313776c9825bb6591944f6\"},\"breadcrumb\":{\"@id\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/deeptranslate.hk\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Introduction to BERT\"}]},{\"@type\":\"Person\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/#\/schema\/person\/c71a8d20ce313776c9825bb6591944f6\",\"name\":\"support@manfulls.com\",\"image\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/deeptranslate.hk\/en\/#personlogo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/338dfd2d2b0801ce441cb076a3db7b95?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/338dfd2d2b0801ce441cb076a3db7b95?s=96&d=mm&r=g\",\"caption\":\"support@manfulls.com\"},\"url\":\"https:\/\/deeptranslate.hk\/en\/author\/supportmanfulls-com\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Introduction to BERT - deeptranslate.hk","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/","og_locale":"en_US","og_type":"article","og_title":"Introduction to BERT - deeptranslate.hk","og_description":"Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing (NLP) technique created by Google to help its search engine better understand the content on a web page.","og_url":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/","og_site_name":"deeptranslate.hk","article_published_time":"2020-09-11T06:18:00+00:00","article_modified_time":"2021-06-17T09:20:03+00:00","og_image":[{"width":2560,"height":1707,"url":"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_misc":{"Written by":"support@manfulls.com","Est. 
reading time":"1 minute"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebSite","@id":"https:\/\/deeptranslate.hk\/en\/#website","url":"https:\/\/deeptranslate.hk\/en\/","name":"deeptranslate.hk","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/deeptranslate.hk\/en\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"ImageObject","@id":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#primaryimage","inLanguage":"en-US","url":"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg","contentUrl":"https:\/\/deeptranslate.hk\/wp-content\/uploads\/20200911-WeChat-Article-Introduction-to-BERT-scaled.jpg","width":2560,"height":1707,"caption":"Introduction to different computer-aided translation"},{"@type":"WebPage","@id":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#webpage","url":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/","name":"Introduction to BERT - deeptranslate.hk","isPartOf":{"@id":"https:\/\/deeptranslate.hk\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#primaryimage"},"datePublished":"2020-09-11T06:18:00+00:00","dateModified":"2021-06-17T09:20:03+00:00","author":{"@id":"https:\/\/deeptranslate.hk\/en\/#\/schema\/person\/c71a8d20ce313776c9825bb6591944f6"},"breadcrumb":{"@id":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/deeptranslate.hk\/en\/introduction-to-bert\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/deeptranslate.hk\/en\/"},{"@type":"ListItem","position":2,"name":"Introduction to 
BERT"}]},{"@type":"Person","@id":"https:\/\/deeptranslate.hk\/en\/#\/schema\/person\/c71a8d20ce313776c9825bb6591944f6","name":"support@manfulls.com","image":{"@type":"ImageObject","@id":"https:\/\/deeptranslate.hk\/en\/#personlogo","inLanguage":"en-US","url":"https:\/\/secure.gravatar.com\/avatar\/338dfd2d2b0801ce441cb076a3db7b95?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/338dfd2d2b0801ce441cb076a3db7b95?s=96&d=mm&r=g","caption":"support@manfulls.com"},"url":"https:\/\/deeptranslate.hk\/en\/author\/supportmanfulls-com\/"}]}},"_links":{"self":[{"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/posts\/5267"}],"collection":[{"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/comments?post=5267"}],"version-history":[{"count":18,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/posts\/5267\/revisions"}],"predecessor-version":[{"id":6280,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/posts\/5267\/revisions\/6280"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/media\/5222"}],"wp:attachment":[{"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/media?parent=5267"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/categories?post=5267"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/deeptranslate.hk\/en\/wp-json\/wp\/v2\/tags?post=5267"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}