🧠 Model

Magma-8B

by microsoft

library_name: transformers · license: mit · pipeline_tag: robotics

🕐 Updated 12/18/2025


About

Magma: A Foundation Model for Multimodal AI Agents. Authors: Jianwei Yang, Reuben Tan, Qianhui Wu, Ruijie Zheng, Baolin Peng, Yongyuan Liang, Yu Gu, ...
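The metadata above lists library_name: transformers with a robotics pipeline tag, so the checkpoint is expected to load through the standard Auto* classes. The snippet below is a minimal sketch rather than the official usage recipe: it assumes the repository ships custom modeling code (hence trust_remote_code=True) and that its processor accepts an image plus a text prompt; both assumptions should be checked against the original model card.

```python
# Minimal loading sketch for microsoft/Magma-8B via transformers.
# Assumptions: the repo provides custom code (trust_remote_code=True) and
# an AutoProcessor that packs an image and a text prompt together.
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Magma-8B"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision keeps the 8B weights around 15 GiB
    trust_remote_code=True,
    device_map="auto",            # requires the accelerate package
)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)

image = Image.open("example_frame.png")        # placeholder input image
prompt = "What should the agent do next?"      # placeholder instruction

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

The exact prompt format (for example, special image tokens or a chat template) is model-specific; consult the upstream card before relying on this pattern.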

📝 Limitations & Considerations

  • Benchmark scores may vary based on evaluation methodology and hardware configuration.
  • VRAM requirements are estimates; actual usage depends on quantization and batch size (see the rough estimate sketched after this list).
  • FNI scores are relative rankings and may change as new models are added.
  • Data source: Hugging Face (https://huggingface.co/microsoft/Magma-8B), fetched 2025-12-18T04:21:59Z, adapter version 3.2.0.
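As a rough illustration of the VRAM bullet above, weight memory alone can be approximated as parameter count times bytes per parameter. The figures below are back-of-envelope floors only: the 8B parameter count is inferred from the model name rather than a published spec, and real inference adds activations, the KV cache, and framework overhead on top.

```python
# Back-of-envelope VRAM estimate for the model weights only.
# 8B parameters is inferred from the "Magma-8B" name; actual usage also
# includes activations, the KV cache, and batch-size-dependent overhead.
PARAMS = 8e9

bytes_per_param = {
    "fp16/bf16 (16-bit)": 2.0,
    "int8 (8-bit)": 1.0,
    "int4 (4-bit)": 0.5,
}

for precision, nbytes in bytes_per_param.items():
    gib = PARAMS * nbytes / 1024**3
    print(f"{precision:>20}: ~{gib:.1f} GiB for weights alone")
```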

📚 Related Resources

📄 Related Papers

No related papers linked yet. Check the model's official documentation for research papers.

📊 Training Datasets

Training data information not available. Refer to the original model card for details.

🔗 Related Models

No related models linked yet. Refer to the original model card for related checkpoints.

🚀 What's Next?