Grok 4 Lands: Tools, Live Search, and a Heavy Mode from xAI

xAI has launched Grok 4, which it calls “the most intelligent model in the world.” It is live for SuperGrok and Premium+ users and also available through the xAI API. xAI is also adding a new SuperGrok Heavy tier with access to Grok 4 Heavy, its most powerful version.

The headline features are built-in tool use and real-time search. Grok 4 can decide when to open a code tool or browse the web, and it can search across X as well. It can even look at media from X to improve answers. All of this runs natively inside Grok.

xAI says it scaled up reinforcement learning using Colossus, its 200,000-GPU cluster. The team reports a 6× gain in training efficiency and a big expansion of verifiable training data beyond math and code, across many domains. The Grok 4 training run used over an order of magnitude more compute than before.

For tough tasks, the company introduces Grok 4 Heavy. xAI says this model pushes parallel thinking further and “sets a new standard for performance and reliability.” It claims Grok 4 Heavy is the first to score about 50% on “Humanity’s Last Exam,” and that Grok 4 leads on ARC-AGI V2.

Developers get a Grok 4 API with text and vision understanding, a 256,000-token context window, and direct access to live search across X, the open web, and news sources. xAI highlights security and compliance, including SOC 2 Type 2, GDPR, and CCPA. It also says Grok 4 is coming soon to major cloud partners.
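As a rough sketch of what calling the API could look like: the example below builds a single-turn chat request payload. The endpoint URL and the model identifier `grok-4` are assumptions based on the OpenAI-compatible chat-completions convention, not details confirmed by the announcement; check xAI's API documentation for the actual values.

```python
import json

# Assumed endpoint, following the common OpenAI-compatible convention.
# Verify the real URL and model id in xAI's API docs.
API_URL = "https://api.x.ai/v1/chat/completions"

def build_request(prompt: str, model: str = "grok-4") -> dict:
    """Build the JSON payload for a single-turn chat request."""
    return {
        "model": model,  # hypothetical model identifier
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize today's top AI news.")
print(json.dumps(payload, indent=2))
```

Sending the payload would be an ordinary authenticated POST (e.g. with an `Authorization: Bearer <key>` header); the snippet stops at payload construction so it runs without credentials.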

There is also a new Voice Mode. xAI says conversations sound more natural, and you can turn on your camera so Grok can “see what you see” and describe the scene in real time.

From: x.ai