Grok 4.1 Parameters, Context Window & API Model List (March 2026)
Grok 4.1 has approximately 3 trillion parameters (MoE) and a 256K context window. Full xAI API model identifier strings: grok-4.1, grok-4.1-mini, grok-3. Exact parameter counts and context sizes for every Grok model. Updated March 2026.
Current xAI Grok Model Identifier Strings (March 2026)
Here is the exact, current model identifier string for every xAI Grok model available via the API as of March 2026:
| Model | Parameters | API Identifier | Context Window | Status |
|---|---|---|---|---|
| Grok-5 | 6 trillion | grok-5 (expected) | 512K+ (expected) | Coming Q1-Q2 2026 |
| Grok-4.1 | ~3 trillion (MoE) | grok-4.1 | 256K | Current flagship |
| Grok-4.1 Mini | ~400B (MoE) | grok-4.1-mini | 128K | Current fast model |
| Grok-3 | ~3 trillion (MoE) | grok-3 | 128K | Legacy |
| Grok-3 Mini | ~400B (MoE) | grok-3-mini | 128K | Legacy |
| Grok-2 | ~300B (estimated) | grok-2 | 128K | Deprecated |
| Grok-1 | 314B (MoE, 86B active) | grok-1 | 8K | Open-source, deprecated |
The current flagship identifier is grok-4.1. Grok-4.1 has approximately 3 trillion parameters in a Mixture-of-Experts architecture. The xAI API base URL is https://api.x.ai/v1.
Current xAI Grok API Available Models List (March 2026)
If you're looking for the exact model identifier string to use in your API calls right now:
- grok-4.1 — Current flagship model. Best quality. 256K context window. ~3T parameters (MoE).
- grok-4.1-mini — Current fast model. Cost-effective. 128K context window. ~400B parameters (MoE).
- grok-3 — Legacy. Still available. 128K context window. ~3T parameters (MoE).
- grok-3-mini — Legacy fast model. 128K context window. ~400B parameters (MoE).

All requests go to the base URL https://api.x.ai/v1.
To list all available models programmatically:
```shell
curl https://api.x.ai/v1/models -H "Authorization: Bearer $XAI_API_KEY"
```
Grok Context Window Sizes (March 2026)
| Model | Context Window Size | Tokens |
|---|---|---|
| Grok-4.1 | 256K | 262,144 |
| Grok-4.1 Mini | 128K | 131,072 |
| Grok-3 | 128K | 131,072 |
| Grok-3 Mini | 128K | 131,072 |
| Grok-5 (expected) | 512K+ | 524,288+ |
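The Tokens column follows the usual convention that 1K = 1,024 tokens, so the "K" sizes map to exact powers of two. A quick sketch of the conversion (model names and sizes taken from the table above):

```python
# Context window sizes in "K" shorthand, where 1K = 1,024 tokens.
windows_k = {"grok-4.1": 256, "grok-4.1-mini": 128, "grok-3": 128, "grok-3-mini": 128}

def tokens(k: int) -> int:
    """Convert a K-denominated context size to an exact token count."""
    return k * 1024

for model, k in windows_k.items():
    print(f"{model}: {k}K = {tokens(k):,} tokens")
# 256K = 262,144 tokens; 128K = 131,072 tokens
```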
How Many Parameters Does Grok Have?
The answer depends on which Grok model you mean. xAI has released multiple generations, each with different parameter counts:
Grok-1 (Open Source, 2023)
Grok-1 was xAI's first public model and remains the only Grok model that is fully open-source. It uses a Mixture-of-Experts (MoE) architecture with 314 billion total parameters, of which 86 billion are active per inference pass. This means only ~27% of the model's weights fire for any given token.
- Total parameters: 314 billion
- Active parameters: 86 billion
- Architecture: MoE with 8 experts, 2 active
- Context window: 8,192 tokens
- License: Apache 2.0 (open-source)
- API string: grok-1 (deprecated)
Grok-2 (2024)
Grok-2 was a significant jump in capability. xAI never disclosed the exact parameter count, but external estimates place it around 300 billion parameters in a dense or sparse architecture. It introduced image understanding and improved reasoning.
- Total parameters: ~300B (estimated, not confirmed)
- Context window: 128K tokens
- API string: grok-2 (deprecated)
Grok-3 (Late 2024 / Early 2025)
Grok-3 marked xAI's leap to trillion-scale models. Trained on the Colossus supercomputer in Memphis with 100,000+ NVIDIA H100 GPUs, Grok-3 uses a Mixture-of-Experts architecture with approximately 3 trillion total parameters.
- Total parameters: ~3 trillion (MoE)
- Active parameters: Not disclosed (estimated 300-600B active)
- Context window: 128K tokens
- API string: grok-3
Grok-4.1 (Current Flagship, 2025-2026)
Grok-4.1 is the current production model available through the xAI API and the Grok chatbot on X. It builds on Grok-3's architecture with additional training and refinements. The parameter count remains in the ~3 trillion range with improved training data and techniques.
- Total parameters: ~3 trillion (MoE)
- Context window: 256K tokens
- API string: grok-4.1
- Mini variant: grok-4.1-mini (~400B MoE, 128K context)
Grok-5 (Upcoming)
Grok-5 is expected to double the parameter count to 6 trillion parameters. It will feature native multimodal capabilities (text, image, video, audio) and real-time data access from Tesla's fleet and X.
- Total parameters: 6 trillion (expected)
- Context window: 512K+ (expected)
- API string: grok-5 (expected)
- Release: Q1-Q2 2026
How to Find the Grok Model Identifier String
If you need the exact model identifier string for API calls, here is what to use:
xAI API (Direct)
The xAI API follows OpenAI-compatible formatting. Your API call looks like this:
```shell
curl https://api.x.ai/v1/chat/completions \
  -H "Authorization: Bearer $XAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "grok-4.1",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
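Because the API is OpenAI-compatible, the same request can be issued from any HTTP client. A minimal Python sketch that builds the request (the endpoint, header scheme, and model string come from the curl example above; the `build_chat_request` helper is illustrative, not part of any SDK):

```python
import json
import os

XAI_BASE_URL = "https://api.x.ai/v1"

def build_chat_request(model: str, user_message: str) -> tuple:
    """Build the URL, headers, and JSON body for an OpenAI-compatible chat call."""
    url = f"{XAI_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('XAI_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("grok-4.1", "Hello")
# POST `body` to `url` with `headers` using any HTTP client (requests, httpx, urllib).
```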
Available model strings as of March 2026:
| Model String | Description |
|---|---|
| grok-4.1 | Current flagship, best quality |
| grok-4.1-mini | Fast, cost-effective |
| grok-3 | Previous generation |
| grok-3-mini | Previous generation fast |
Listing Available Models
You can query the API to get all currently available model identifiers:
```shell
curl https://api.x.ai/v1/models \
  -H "Authorization: Bearer $XAI_API_KEY"
```
This returns a JSON list of all model objects, each with an id field containing the model identifier string.
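Assuming the standard OpenAI-compatible listing shape (a top-level "data" array of model objects), pulling out the identifier strings is straightforward. The sample payload below is illustrative, not captured from the live API:

```python
import json

# Illustrative response shaped like an OpenAI-compatible /models listing.
sample_response = json.loads("""
{"object": "list", "data": [
  {"id": "grok-4.1", "object": "model"},
  {"id": "grok-4.1-mini", "object": "model"},
  {"id": "grok-3", "object": "model"},
  {"id": "grok-3-mini", "object": "model"}
]}
""")

# Each model object carries its identifier string in the "id" field.
model_ids = [m["id"] for m in sample_response["data"]]
print(model_ids)  # ['grok-4.1', 'grok-4.1-mini', 'grok-3', 'grok-3-mini']
```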
Using Grok in Third-Party Tools
Many tools and frameworks support xAI's Grok models. The model identifier string is the same — you just need the xAI API base URL:
- Base URL: https://api.x.ai/v1
- Model ID: grok-4.1 (or whichever variant you need)
- Authentication: Bearer token with your xAI API key
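Most OpenAI-compatible SDKs and frameworks accept exactly these three settings. A sketch of how they might be bundled and handed to the official openai Python package (the `xai_config` dict is illustrative; the SDK wiring is shown commented out since it requires `pip install openai` and a live key):

```python
import os

# Drop-in settings for any OpenAI-compatible client or framework.
xai_config = {
    "base_url": "https://api.x.ai/v1",
    "model": "grok-4.1",  # or grok-4.1-mini, grok-3, grok-3-mini
    "api_key": os.environ.get("XAI_API_KEY", ""),  # sent as a Bearer token
}

# Example wiring with the official openai SDK:
# from openai import OpenAI
# client = OpenAI(base_url=xai_config["base_url"], api_key=xai_config["api_key"])
# resp = client.chat.completions.create(
#     model=xai_config["model"],
#     messages=[{"role": "user", "content": "Hello"}],
# )
```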
Grok vs Other Frontier Models (March 2026)
How does Grok stack up against the competition?
| Feature | Grok 4.1 | GPT-5.2 | Claude Opus 4.6 | Gemini 3.1 Pro |
|---|---|---|---|---|
| Parameters | ~3T (MoE) | ~2T (est.) | Not disclosed | Not disclosed |
| Context Window | 256K | 400K | 1M | 1M |
| SWE-bench | ~78% | 80.0% | 80.8% | 80.6% |
| GPQA Diamond | ~90% | 92.4% | 91.3% | 94.3% |
| ARC-AGI-2 | ~55% | 52.9% | 68.8% | 77.1% |
| API Pricing (in/out) | $5/$15 | $15/$60 | $15/$75 | $2/$12 |
| Open Source | Grok-1 only | No | No | No |
| Real-time Data | Yes (X, Tesla) | Limited (browsing) | No | Yes (Google Search) |
Key Takeaways
- Grok leads on real-time data — access to X and Tesla fleet data is a genuine differentiator no other lab can match.
- Grok trails on reasoning — the ARC-AGI-2 gap (55% vs 77.1% for Gemini) is significant. Grok-5 needs to close this.
- Grok is competitively priced — cheaper than GPT-5.2 and Claude Opus, more expensive than Gemini 3.1 Pro.
- Parameter count doesn't determine quality — Grok-4.1 has the most disclosed parameters but doesn't lead benchmarks. Architecture, training data, and RLHF matter more.
Understanding Mixture-of-Experts (MoE) Parameters
When xAI says Grok has "3 trillion parameters," that number deserves context. In a Mixture-of-Experts architecture, only a fraction of parameters are active during each forward pass.
How MoE works:
- The model has many "expert" sub-networks
- A router network decides which experts to activate for each token
- Typically 2-4 experts out of 8-16+ are active per token
- Total parameters are large, but compute cost is closer to a smaller dense model
Grok-1 is the only model where xAI confirmed the exact numbers: 314B total, 86B active (8 experts, 2 active). The later models have not had their expert configurations publicly confirmed.
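The Grok-1 numbers make the arithmetic concrete: with 2 of 8 experts routed per token, the expert layers run at 25% capacity, while the overall active fraction is 86B / 314B, or about 27%, slightly above 25% because shared weights such as attention and embeddings are always active. A quick sketch of the calculation:

```python
# Grok-1 figures confirmed by xAI (the only Grok model with disclosed MoE config).
total_params = 314e9   # total parameters
active_params = 86e9   # parameters used per forward pass

active_fraction = active_params / total_params
expert_fraction = 2 / 8  # 2 of 8 experts routed per token

print(f"Active fraction: {active_fraction:.1%}")          # ~27.4%
print(f"Expert routing fraction: {expert_fraction:.0%}")  # 25%
```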
Frequently Asked Questions
What is the current Grok xAI model identifier string in 2026?
The current Grok xAI model identifier string is grok-4.1 for the flagship model and grok-4.1-mini for the fast variant. These are the exact strings you use in API calls to https://api.x.ai/v1/chat/completions.
What are the xAI Grok API available models list in March 2026?
The available xAI Grok API models as of March 2026 are: grok-4.1, grok-4.1-mini, grok-3, and grok-3-mini. You can query https://api.x.ai/v1/models to get the full list programmatically.
How many parameters does the current Grok model have?
Grok-4.1, the current flagship model, has approximately 3 trillion parameters in a Mixture-of-Experts (MoE) architecture. The active parameter count per inference is lower, likely 300-600 billion.
What is the Grok xAI context window size in March 2026?
Grok-4.1 has a context window size of 256K tokens (262,144). Grok-4.1-mini has 128K tokens. The upcoming Grok-5 is expected to expand to 512K+ tokens.
What is the Grok 4.1 parameter count?
The Grok 4.1 parameter count is approximately 3 trillion total parameters in a Mixture-of-Experts architecture. The exact number of active parameters per inference has not been disclosed by xAI, but is estimated at 300-600 billion.
Is Grok open source?
Only Grok-1 (314B parameters) is open-source under an Apache 2.0 license. All subsequent models (Grok-2, 3, 4.1, and the upcoming 5) are proprietary.
How does the Grok xAI model size compare to GPT-5?
Grok-4.1 at ~3 trillion parameters has the largest disclosed parameter count among frontier models. GPT-5.2's exact parameter count is not confirmed but is estimated around 2 trillion. However, parameter count alone does not determine model quality.
When will Grok-5 be released?
xAI has indicated a Q1 2026 release for Grok-5 with 6 trillion parameters. As of March 2026, no exact date has been announced. It could arrive any day or slip to Q2.
How many parameters does Grok AI have?
The number of parameters in Grok AI depends on the version: Grok-1 has 314B, Grok-2 has ~300B, Grok-3 has ~3T, Grok-4.1 has ~3T, and the upcoming Grok-5 will have 6T. All models from Grok-1 onward use Mixture-of-Experts architecture.
Build With Any AI Model
The AI model landscape in 2026 is fragmented — Grok, GPT, Claude, Gemini all have different strengths. The smart move is building products that can swap between models as the landscape shifts.
Y Build gives you the full growth stack for AI-powered products: deployment, Demo Cut product videos, AI SEO, and analytics. Works with any model, any framework. Start free →