
Google's new AI language model can process entire books



One of the biggest challenges for a language-based AI model is understanding the context of the surrounding content.

To solve this problem, Google has introduced a new model called the Reformer, which can handle context windows of up to 1 million words using only 16 GB of memory. The company built it to overcome the limitations of its older Transformer model – a neural network that compares every pair of words in a passage to understand the relationships between them.

Current models can only take a few lines or paragraphs of surrounding text into account when interpreting the passage in focus.

However, because the Transformer compares every pair of words, its memory requirements grow rapidly with text length; anything beyond a few thousand words demands an enormous amount of space. That makes it impractical to apply the model to a long article or a book.
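To see why, consider a rough sketch (our own illustration, not Google's code) of the pairwise scoring at the heart of standard attention: every token is scored against every other token, so the score matrix has n × n entries for a sequence of n tokens.

```python
# Minimal sketch of why full pairwise attention is quadratic in memory.
# The sizes and random data here are illustrative assumptions.
import numpy as np

n, d = 4096, 64                    # sequence length, vector size (assumed)
queries = np.random.randn(n, d)    # stand-in word representations
keys = np.random.randn(n, d)

scores = queries @ keys.T          # one score per pair of tokens
print(scores.shape)                # (4096, 4096): ~16.8 million entries
```

At book length – say 100,000 words – this single matrix would need 10 billion entries, which is why the pairwise approach breaks down on long texts.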

Google designed the Reformer to fix the old model's short "attention span" and its heavy memory usage. To solve the first problem, the new model uses locality-sensitive hashing (LSH).

What does this mean? Instead of comparing every word with every other word, the model uses a hash function to group similar words into buckets, and then compares only the words within the same or a neighboring bucket, which sharply reduces the processing load. To solve the second problem – memory – the Reformer uses reversible layers, which recompute intermediate values during training rather than storing them all.
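As a simplified illustration (our own sketch, not the Reformer source code), one common LSH scheme hashes each vector by the signs of a few random projections: vectors pointing in similar directions tend to land in the same bucket, so attention only needs to compare tokens that share a bucket.

```python
# Simplified random-projection LSH bucketing; all names and sizes here
# are illustrative assumptions, not taken from the Reformer implementation.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
n, d, n_planes = 1024, 64, 4                 # tokens, vector size, hash bits

vectors = rng.standard_normal((n, d))        # stand-in word representations
planes = rng.standard_normal((d, n_planes))  # random hyperplanes for hashing

# A token's bucket id is the sign pattern of its projections, packed into
# an integer: similar vectors tend to produce the same pattern.
bits = (vectors @ planes) > 0
bucket_ids = bits.astype(int) @ (1 << np.arange(n_planes))

# Group token indices by bucket; attention scores are then computed only
# within each group instead of across all n * n pairs.
groups = defaultdict(list)
for idx, bucket in enumerate(bucket_ids):
    groups[int(bucket)].append(idx)

for bucket, members in sorted(groups.items()):
    print(f"bucket {bucket:2d}: {len(members)} tokens")
```

With 2^4 = 16 buckets of roughly 64 tokens each, every token is scored against about 64 candidates instead of all 1,024, and the saving grows with sequence length.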

Photo credit: Google AI