Recent Microsoft research estimates that OpenAI's o1-mini and GPT-4o mini have roughly 100 billion and 8 billion parameters, respectively. The estimates have drawn attention because OpenAI, like many AI companies, does not publish parameter counts or other detailed specifications for its models. If accurate, the figures fit a broader trend: parameter counts are shrinking as efficiency techniques such as mixture-of-experts (MoE) gain traction, shifting the focus from sheer scale to architectural innovation.
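To illustrate why MoE changes the economics of model size, here is a minimal, hypothetical sketch of top-k expert routing in PyTorch. The class name, layer sizes, and expert counts are illustrative assumptions, not details of any OpenAI model; the point is only that each token activates a small fraction of the layer's total parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy mixture-of-experts layer (hypothetical sizes, for illustration).

    A small router picks the top-k experts for each token, so only a
    fraction of the stored parameters does work on any given token.
    """

    def __init__(self, d_model: int, d_hidden: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model) -- one row per token
        logits = self.router(x)                           # (tokens, num_experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the top-k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# With 8 experts and top-2 routing, each token activates only ~1/4 of the
# expert parameters, even though the layer stores all 8 experts.
layer = MoELayer(d_model=64, d_hidden=256, num_experts=8, top_k=2)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

This is why headline parameter counts can be misleading for MoE-style models: the "active" parameters per token can be far smaller than the total stored parameters, so a nominally smaller or cheaper model can match the quality of a much larger dense one.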