On April 17, Laboro.AI announced that, as part of its research and development activities, it has built and released an original Japanese version of BERT (Bidirectional Encoder Representations from Transformers), a natural language processing model that has gained attention in recent years in the AI field. The model was pretrained on independently collected web text and has been released as open source.
To build the corpus (language database) for pretraining, the company independently collected text from around 4,300 websites, totaling 2.6 million webpages. Through passage-classification performance tests, Laboro.AI confirmed that the model achieves accuracy on par with models pretrained on standard datasets.
In addition to continuing research and development across all fields of AI, the company aims to bring custom-built AI solutions based on machine learning technology to more businesses in a wider range of industries.