
AI language models can exceed PNG and FLAC in lossless compression, says study

Ars Technica · 1y

A research paper titled "Language Modeling Is Compression" reveals that DeepMind's large language model, Chinchilla 70B, can achieve better lossless compression on images and audio than the PNG and FLAC algorithms. The study suggests that language models like Chinchilla excel not only at text prediction but also at compressing other types of data. This supports the idea that effective compression is a form of general intelligence, since it involves identifying patterns and making sense of complexity. The relationship between prediction and compression works in both directions: a compression algorithm can also generate new data based on what it has learned. While the paper is not peer-reviewed, it highlights the potential applications of large language models in compression tasks.
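The link between prediction and compression that the summary mentions comes from information theory: with an entropy coder such as arithmetic coding, a symbol predicted with probability p can be stored in roughly -log2(p) bits, so a better predictor directly yields a smaller lossless encoding. The sketch below is purely illustrative and is not taken from the paper or its Chinchilla setup; it uses a simple adaptive byte-frequency model (the function name `compressed_bits` and the sample text are assumptions for the example) to show that predictable data costs fewer bits than the uniform 8-bits-per-byte baseline.

```python
# Minimal sketch of the prediction-compression link, assuming an adaptive
# order-0 byte model with Laplace smoothing (not the paper's method).
import math
from collections import Counter

def compressed_bits(data: bytes) -> float:
    """Ideal code length in bits: sum of -log2 P(next byte | bytes seen so far).
    An arithmetic coder driven by the same predictions would come within a
    couple of bits of this total."""
    counts = Counter()
    seen = 0
    total_bits = 0.0
    for b in data:
        # Predict the next byte from the bytes observed so far.
        p = (counts[b] + 1) / (seen + 256)
        total_bits += -math.log2(p)
        # Update the model after "seeing" the byte.
        counts[b] += 1
        seen += 1
    return total_bits

text = b"abracadabra abracadabra abracadabra"
uniform_bits = 8 * len(text)  # baseline: no prediction, 8 bits per byte
print(f"uniform: {uniform_bits} bits, predictive: {compressed_bits(text):.1f} bits")
```

The same idea scales up in the paper: a stronger predictor (a large language model) assigns higher probabilities to the actual next symbols, which translates into shorter encodings, and running the process in reverse lets the predictor act as a generator.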
