Meta's research introduces a "Large Concept Model" (LCM) that operates on higher-level semantic representations, termed "concepts," rather than the token-level processing of traditional language models. Each concept corresponds to a sentence represented in the SONAR embedding space, and the model is trained to autoregressively predict the next sentence embedding across multiple languages. The LCM shows strong zero-shot cross-lingual generalization, outperforming existing models of comparable size, and the training code is made freely available, contributing to advancements in natural language processing.
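
To make the concept-level objective concrete, below is a minimal sketch of autoregressive next-sentence-embedding prediction: a causal Transformer consumes a sequence of sentence embeddings and regresses the next one with an MSE loss. This is an illustration under stated assumptions, not Meta's implementation; the random tensors stand in for SONAR-encoded sentences (only the 1024-dimensional embedding size is taken from SONAR), and the layer counts, learning rate, and `ConceptLM` name are invented for the example.

```python
import torch
import torch.nn as nn

EMBED_DIM = 1024  # SONAR sentence embeddings are 1024-dimensional


class ConceptLM(nn.Module):
    """Toy concept-level LM: predicts the next sentence embedding."""

    def __init__(self, embed_dim=EMBED_DIM, n_layers=4, n_heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads, batch_first=True
        )
        self.backbone = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(embed_dim, embed_dim)

    def forward(self, concepts):
        # concepts: (batch, seq_len, embed_dim), one row per sentence.
        seq_len = concepts.size(1)
        # Causal mask so each position attends only to earlier sentences.
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len)
        hidden = self.backbone(concepts, mask=mask)
        return self.head(hidden)  # predicted embedding of the next sentence


# One illustrative training step. Random "documents" of 16 sentences each
# stand in for text encoded with SONAR; a real pipeline would encode first.
model = ConceptLM()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
docs = torch.randn(2, 16, EMBED_DIM)

pred = model(docs[:, :-1])                     # predict sentence t+1 from 1..t
loss = nn.functional.mse_loss(pred, docs[:, 1:])
loss.backward()
opt.step()
print(f"MSE loss: {loss.item():.4f}")
```

At inference time the same loop would run autoregressively: encode the prompt's sentences, predict the next embedding, decode it back to text with the SONAR decoder, append, and repeat.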