When Visualizing is the First Step to Reasoning: MIRA, a Benchmark for Visual Chain-of-Thought

Dataset Description

MIRA (Multimodal Imagination for Reasoning Assessment) evaluates whether MLLMs can think while drawing—i.e., generate and use intermediate visual representations (sketches, diagrams, trajectories) as part of reasoning.
MIRA includes 546 carefully curated problems spanning 20 task types across four domains:

  • Euclidean Geometry (EG)
  • Physics-Based Reasoning (PBR)
  • Abstract Spatial & Logical Puzzles (ASLP)
  • Causal Transformations (CT)

Each instance comes with gold visual chain-of-thought (Visual-CoT) images and final answers. We provide three evaluation settings: Direct (image + question), Text-CoT, and Visual-CoT.
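For illustration, the three settings can be thought of as three prompt templates around the same question. The wording below is only a sketch and an assumption; the exact instructions used in the paper and evaluation scripts may differ.

# Illustrative prompt templates for the three evaluation settings (assumed wording).
DIRECT_PROMPT = "{question}\nAnswer with the final result only."
TEXT_COT_PROMPT = "{question}\nThink step by step, then give the final answer."
VISUAL_COT_PROMPT = (
    "{question}\nFirst sketch the intermediate visual state (e.g., a diagram or "
    "trajectory), then reason over your sketch and give the final answer."
)

def build_prompt(question: str, setting: str) -> str:
    """Return the prompt text for one of the three evaluation settings."""
    templates = {
        "direct": DIRECT_PROMPT,
        "text-cot": TEXT_COT_PROMPT,
        "visual-cot": VISUAL_COT_PROMPT,
    }
    return templates[setting].format(question=question)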



Paper / Code / Project


Dataset Usage

Install

You can download the dataset with the following command (using the billiards subset as an example):

from datasets import load_dataset

# Load the billiards subset of MIRA from the Hugging Face Hub
dataset = load_dataset("YiyangAiLab/MIRA", "billiards")
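Once loaded, you can inspect a sample as sketched below. The split name ("train") and the image column are assumptions here; check the loaded dataset object and the sample's keys to confirm the actual layout.

from datasets import load_dataset

dataset = load_dataset("YiyangAiLab/MIRA", "billiards")
sample = dataset["train"][0]              # split name assumed to be "train"
print(sample.keys())                      # list the available fields
print(sample["question"], sample["answer"])
sample["image"].show()                    # displays the PIL image, if an "image" column exists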

Data Format

The dataset is provided in JSON Lines (jsonl) format. Each line is a standalone JSON object with the following fields:

uid (int): Unique identifier for the sample.
image_path (string): Relative or absolute path to the input image file.
question (string): The natural-language prompt associated with the image.
answer (int | string): The gold final answer; a number for numeric answers, a string for textual answers.

Automatic Evaluation

To automatically evaluate a model on the dataset, please refer to our GitHub repository.
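The official scoring scripts live in the repository. As a rough stand-in, a minimal exact-match scorer over predicted and gold answers might look like the sketch below; the normalization step is an assumption, not the official protocol.

def normalize(ans):
    """Lowercase and strip string answers; leave numeric answers untouched."""
    return ans.strip().lower() if isinstance(ans, str) else ans

def exact_match_accuracy(predictions, references):
    """predictions/references: equal-length lists of int or str answers."""
    correct = sum(normalize(p) == normalize(r) for p, r in zip(predictions, references))
    return correct / len(references)

# Example usage (records parsed from the jsonl file, model_answers from your model):
# accuracy = exact_match_accuracy(model_answers, [r["answer"] for r in records])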

Citation

@misc{zhou2025visualizingstepreasoningmira,
      title={When Visualizing is the First Step to Reasoning: MIRA, a Benchmark for Visual Chain-of-Thought}, 
      author={Yiyang Zhou and Haoqin Tu and Zijun Wang and Zeyu Wang and Niklas Muennighoff and Fan Nie and Yejin Choi and James Zou and Chaorui Deng and Shen Yan and Haoqi Fan and Cihang Xie and Huaxiu Yao and Qinghao Ye},
      year={2025},
      eprint={2511.02779},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2511.02779}, 
}