Domain: opencompass-openvlm-video-leaderboard.hf.space
More information on this domain is available in AlienVault OTX.
DNS Resolutions

| Date | IP Address |
|---|---|
| 2025-05-31 | 3.219.73.27 (ClassC) |
| 2025-09-30 | 54.145.224.52 (ClassC) |
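To re-check the current A record, a minimal Python sketch (live DNS reflects the present resolution, not the history above):

```python
import socket

# Re-resolve the domain; expect the newest record above unless DNS has moved on.
print(socket.gethostbyname("opencompass-openvlm-video-leaderboard.hf.space"))
```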
Port 80

```
HTTP/1.1 301 Moved Permanently
Server: awselb/2.0
Date: Tue, 30 Sep 2025 07:47:37 GMT
Content-Type: text/html
Content-Length: 134
Connection: keep-alive
Location: https://opencompass-openvlm-video-leaderboard.hf.space:443/
```

```
<html><head><title>301 Moved Permanently</title></head><body><center><h1>301 Moved Permanently</h1></center></body></html>
```
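The port-80 listener only redirects to HTTPS. A minimal sketch to reproduce the probe (assumes the `requests` package; a live response may differ from this capture):

```python
import requests

# Plain-HTTP probe: the AWS ELB answers with a 301 to the HTTPS origin.
resp = requests.get(
    "http://opencompass-openvlm-video-leaderboard.hf.space/",
    allow_redirects=False,
    timeout=10,
)
print(resp.status_code, resp.headers.get("Location"))
# per the capture above: 301 https://opencompass-openvlm-video-leaderboard.hf.space:443/
```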
Port 443

```
HTTP/1.1 200 OK
Date: Tue, 30 Sep 2025 07:47:37 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 145819
Connection: keep-alive
server: uvicorn
x-proxied-host: http://10.27.172.45
x-proxied-replica: d4dqgq8w-2063l
x-proxied-path: /
link: <https://huggingface.co/spaces/opencompass/openvlm_video_leaderboard>; rel=canonical
x-request-id: pEcrhN
vary: origin, access-control-request-method, access-control-request-headers
access-control-allow-credentials: true
```

The 145,819-byte body is the HTML shell of a Gradio 4.44.0 app; the page content embedded in its `window.gradio_config` payload is reconstructed below.

# OpenVLM Video Leaderboard

### Welcome to the OpenVLM Video Leaderboard! On this leaderboard we share the evaluation results of VLMs on the video understanding benchmarks, obtained with the open-source framework [VLMEvalKit](https://github.com/open-compass/VLMEvalKit) 🏆

### Currently, OpenVLM Video Leaderboard covers 49 different VLMs (including GPT-4o, Gemini-1.5, LLaVA-OneVision, etc.) and 5 different video understanding benchmarks.

This leaderboard was last updated: 25.06.25 13:00:06.

Tabs: 🏅 OpenVLM Video Leaderboard · 🔍 About · 📊 Video-MME (w/o subs) Leaderboard · 📊 MVBench Leaderboard · 📊 MMBench-Video Leaderboard · 📊 TempCompass Leaderboard

## Main Evaluation Results

- Avg Score: the average score on all video understanding benchmarks (normalized to 0-100; higher is better).
- Avg Rank: the average rank on all video understanding benchmarks (lower is better).
- The overall evaluation results on the 5 video understanding benchmarks, sorted in ascending order of Avg Rank.
- Tip: the total score of MLVU is calculated as a weighted sum of M-Avg and G-Avg, with weights based on each category's share of the total number of questions; the maximum possible score is 100. A worked example follows this list.
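The page states the weighting rule but not the actual question counts, so the numbers below are made up purely to illustrate the arithmetic:

```python
def mlvu_total(m_avg: float, g_avg: float, n_m: int, n_g: int) -> float:
    # Each sub-score is weighted by its category's share of all questions.
    return (n_m * m_avg + n_g * g_avg) / (n_m + n_g)

# Hypothetical counts: 2100 M-Avg questions, 420 G-Avg questions.
print(round(mlvu_total(m_avg=70.0, g_avg=50.0, n_m=2100, n_g=420), 2))  # 66.67
```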
Filters: Evaluation Dimension (Avg Score, Avg Rank, OpenSource, Verified, MVBench, Video-MME (w/o subs), MMBench-Video, TempCompass, MLVU; default: Avg Score, Avg Rank), plus Model Size (<10B, 10B-20B, 20B-40B, >40B, Unknown) and Model Type (API, OpenSource); the size and type filters repeat on every benchmark tab.

| Method | Parameters (B) | Language Model | Vision Model | Frames | Avg Score | Avg Rank |
|---|---|---|---|---|---|---|
| [InternVL3-78B](https://huggingface.co/OpenGVLab/InternVL3-78B) | 78.4 | Qwen2.5-72B | InternViT-6B-v2.5 | 64 | 72.7 | 3.0 |
| [InternVL3-38B](https://huggingface.co/OpenGVLab/InternVL3-38B) | 38.4 | Qwen2.5-32B | InternViT-6B-v2.5 | 64 | 71.9 | 3.6 |
| [Qwen2.5-VL-72B](https://huggingface.co/Qwen/Qwen2.5-VL-72B-Instruct) | 73.4 | Qwen2.5-72B | QwenViT | 64 | 68.6 | 6.4 |
| [InternVL2_5-78B](https://huggingface.co/OpenGVLab/InternVL2_5-78B) | 78.4 | Qwen2.5-72B-Instruct | InternViT-6B-448px-V2_5 | 64 | 68.1 | 7.2 |
| [Qwen2-VL-72B](https://huggingface.co/Qwen/Qwen2-VL-72B-Instruct) | 73.4 | Qwen2-72B | QwenViT | 64 | 67.7 | 7.8 |
| [InternVL3-8B](https://huggingface.co/OpenGVLab/InternVL3-8B) | 7.94 | Qwen2.5-7B | InternViT-300M-v2.5 | 64 | 66.8 | 8.8 |
| [LLaVA-Video-72B-Qwen2](https://huggingface.co/lmms-lab/LLaVA-Video-72B-Qwen2) | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 64 | 66.2 | 9.0 |
| [Aria](https://huggingface.co/rhymes-ai/Aria) | 25.3 | N/A | N/A | 64 | 66.2 | 9.6 |
| [Gemini-2.0-Flash](https://deepmind.google/technologies/gemini/) | N/A | | | 64 | 64.6 | 11.0 |
| [GPT-4o-2024-08-06](https://openai.com/index/introducing-structured-outputs-in-the-api/) | N/A | N/A | N/A | 16 | 63.1 | 12.0 |
| [Qwen2.5-Omni-7B](https://huggingface.co/Qwen/Qwen2.5-Omni-7B) | 10.7 | Qwen-2.5-7B | QwenViT | 64 | 64.5 | 12.2 |
| [InternVL2_5-8B](https://huggingface.co/OpenGVLab/InternVL2_5-8B) | 8.08 | internlm2_5-7b-chat | InternViT-300M-448px-V2_5 | 64 | 64.6 | 12.4 |
| [Qwen2.5-VL-7B](https://huggingface.co/Qwen/Qwen2.5-VL-7B-Instruct) | 8.29 | Qwen2.5-7B | QwenViT | 64 | 62.8 | 14.8 |
| [LLaVA-Video-7B-Qwen2](https://huggingface.co/lmms-lab/LLaVA-Video-7B-Qwen2) | 8.03 | Qwen-2-7B | siglip-so400m-patch14-384 | 64 | 62.4 | 15.6 |
| [InternVL2-76B](https://huggingface.co/OpenGVLab/InternVL2-Llama3-76B) | 76.3 | Hermes-Llama3-70B | InternViT-6B-448px-V1-5 | 16 | 62.6 | 16.4 |
| [Qwen2-VL-7B](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct) | 8 | Qwen2-7B | QwenViT | 64 | 61.2 | 17.2 |
| [LLaVA-OneVision-72B-sft](https://huggingface.co/lmms-lab/llava-onevision-qwen2-72b-ov-sft) | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 32 | 59.3 | 17.4 |
| [MiMo-VL-7B-RL](https://huggingface.co/XiaomiMiMo/MiMo-VL-7B-RL) | 8.31 | MiMo-7B | Qwen2.5-ViT | 64 | 51.4 | 19.6 |
| [MiniCPM-o-2_6](https://huggingface.co/openbmb/MiniCPM-o-2_6) | 8.67 | Qwen2.5-7B-Instruct | siglip-so400m | 64 | 55.8 | 20.4 |
| [Gemini-1.5-Flash](https://deepmind.google/technologies/gemini/flash/) | N/A | N/A | N/A | 16 | 57.0 | 21.0 |
| [InternVL2-8B](https://huggingface.co/OpenGVLab/InternVL2-8B) | 8.08 | InternLM2.5-7B-chat | InternViT-300M-448px | 16 | 56.4 | 21.4 |
| [MiniCPM-V-2.6](https://huggingface.co/openbmb/MiniCPM-V-2_6) | 8.1 | Qwen-2-7B | siglip-so400m | 64 | 54.7 | 21.4 |
| [PLLaVA-34B](https://huggingface.co/ermu2001/pllava-34b) | 34.9 | Nous-Hermes-2-Yi-34B | clip-vit-large-patch14-336 | 16 | 53.9 | 24.0 |
| [VideoChat2-HD](https://huggingface.co/OpenGVLab/VideoChat2_HD_stage4_Mistral_7B) | 7 (LLM) | Mistral-7B-Instruct-v0.2 | umt_l16_qformer | 16 | 51.9 | 25.8 |
| [Phi-3.5-Vision](https://huggingface.co/microsoft/Phi-3.5-vision-instruct) | 4.15 | Phi-3 Mini | N/A | 16 | 50.3 | 26.4 |
| [Long-VITA-16K](https://huggingface.co/VITA-MLLM/Long-VITA-16K) | 14 | Qwen2.5-14B | InternViT-300M | 64 | 41.4 | 27.0 |
| [VILA1.5-40B](https://huggingface.co/Efficient-Large-Model/VILA1.5-40b) | 40 | Yi-34B | InternViT-6B | 14 | 50.8 | 27.0 |
| [SmolVLM2-2.2B-Instruct](https://huggingface.co/HuggingFaceTB/SmolVLM2-2.2B-Instruct) | 2.25 | SmolLM2-1.7B | | 64 | 49.8 | 27.2 |
| [mPLUG-Owl3](https://huggingface.co/mPLUG/mPLUG-Owl3-7B-240728) | 8.06 | Qwen-2-7B | siglip-so400m | 16 | 49.6 | 28.0 |
| [Claude-3.5-Sonnet](https://www.anthropic.com/news/claude-3-5-sonnet) | N/A | N/A | N/A | 8 | 45.9 | 28.8 |
| [Idefics3-8B-Llama3](https://huggingface.co/HuggingFaceM4/Idefics3-8B-Llama3) | 8.46 | Llama-3.1-8B-Instruct | siglip-so400m-patch14-384 | 16 | 45.9 | 31.8 |
| [Video-LLaVA](https://github.com/PKU-YuanGroup/Video-LLaVA) | 7.5 | Vicuna-v1.5-7B | LanguageBind Encoder | 8 | 41.8 | 33.0 |
| [Chat-UniVi-7B-v1.5](https://huggingface.co/Chat-UniVi/Chat-UniVi-7B-v1.5) | 7 (LLM) | Vicuna-v1.5-7B | clip-vit-large-patch14-336 | 1.0 fps | 41.9 | 33.2 |
| [LLaMA-VID-7B](https://huggingface.co/YanweiLi/llama-vid-7b-full-224-video-fps-1) | 7 (LLM) | Vicuna-v1.5-7B | LAVIS-EVA-G | 1 fps | 39.5 | 34.2 |
| [Phi-4-multimodal-instruct](https://huggingface.co/microsoft/Phi-4-multimodal-instruct) | 5.57 | Phi-4-Mini-Instruct | | 64 | 42.3 | 34.8 |
| [SmolVLM2-500M-Video-Instruct](https://huggingface.co/HuggingFaceTB/SmolVLM2-500M-Video-Instruct) | 0.507 | SmolLM2-360M-Instruct | Siglip-base-patch16-512 | 64 | 41.8 | 35.4 |
| [Video-ChatGPT](https://github.com/mbzuai-oryx/Video-ChatGPT) | 7 (LLM) | Vicuna-v1.1-7B | CLIP ViT-L/14 | 100 | 34.2 | 37.8 |

## 🔍 About

![LOGO](http://opencompass.openxlab.space/utils/MMLB.jpg)

**A Toolkit for Evaluating Large Vision-Language Models.**
[![github-contributors-shield]][github-contributors-link] • [![github-forks-shield]][github-forks-link] • [![github-stars-shield]][github-stars-link] • [![github-issues-shield]][github-issues-link] • [![github-license-shield]][github-license-link]

English | [简体中文](/docs/zh-CN/README_zh-CN.md) | [日本語](/docs/ja/README_ja.md)

<a href="https://rank.opencompass.org.cn/leaderboard-multimodal">🏆 OC Leaderboard</a> •
<a href="#%EF%B8%8F-quickstart">🏗️ Quickstart</a> •
<a href="#-datasets-models-and-evaluation-results">📊 Datasets & Models</a> •
<a href="#%EF%B8%8F-development-guide">🛠️ Development</a>

<a href="https://huggingface.co/spaces/opencompass/open_vlm_leaderboard">🤗 HF Leaderboard</a> •
<a href="https://huggingface.co/datasets/VLMEval/OpenVLMRecords">🤗 Evaluation Records</a> •
<a href="https://huggingface.co/spaces/opencompass/openvlm_video_leaderboard">🤗 HF Video Leaderboard</a>

<a href="https://discord.gg/evDT4GZmxN">🔊 Discord</a> •
<a href="https://www.arxiv.org/abs/2407.11691">📝 Report</a> •
<a href="#-the-goal-of-vlmevalkit">🎯 Goal</a> •
<a href="#%EF%B8%8F-citation">🖊️ Citation</a>

**VLMEvalKit** (the python package name is **vlmeval**) is an **open-source evaluation toolkit** of **large vision-language models (LVLMs)**. It enables **one-command evaluation** of LVLMs on various benchmarks, without the heavy workload of data preparation under multiple repositories. In VLMEvalKit, we adopt **generation-based evaluation** for all LVLMs, and provide the evaluation results obtained with both **exact matching** and **LLM-based answer extraction**.

## Recent Codebase Changes

- **2025-09-12** **Major Update: Improved Handling for Models with Thinking Mode**

  A new feature in [PR 1229](https://github.com/open-compass/VLMEvalKit/pull/1175) improves support for models with thinking mode. VLMEvalKit now allows the use of a custom `split_thinking` function. **We strongly recommend this for models with thinking mode to ensure the accuracy of evaluation.** To use this new functionality, enable the following setting: `SPLIT_THINK=True`. By default, the function parses content within `<think>...</think>` tags and stores it in the `thinking` key of the output. For more advanced customization, you can also create a `split_think` function for the model; see the InternVL implementation for an example, and the sketch after this list.
- **2025-09-12** **Major Update: Improved Handling for Long Responses (More than 16k/32k)**

  A new feature in [PR 1229](https://github.com/open-compass/VLMEvalKit/pull/1175) improves support for models with long response outputs. VLMEvalKit can now save prediction files in TSV format. **Since individual cells in an `.xlsx` file are limited to 32,767 characters, we strongly recommend using this feature for models that generate long responses (e.g., exceeding 16k or 32k tokens) to prevent data truncation.** To use this new functionality, enable the following setting: `PRED_FORMAT=tsv`.
- **2025-08-04** In [PR 1175](https://github.com/open-compass/VLMEvalKit/pull/1175), we refined `can_infer_option` and `can_infer_text`, which routes more evaluations to LLM choice extractors and empirically leads to a slight performance improvement on MCQ benchmarks.
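A minimal sketch of the default `<think>` parsing described above; the exact hook signature in VLMEvalKit is not shown on this page, so treat the names and return shape as assumptions:

```python
import re

def split_thinking(response: str) -> dict:
    """Separate <think>...</think> reasoning from the final answer.

    Sketch only: mirrors the default behaviour described above; the real
    VLMEvalKit hook may use a different signature or return shape.
    """
    match = re.search(r"<think>(.*?)</think>", response, flags=re.DOTALL)
    if match is None:
        return {"thinking": "", "prediction": response.strip()}
    return {
        "thinking": match.group(1).strip(),
        "prediction": (response[: match.start()] + response[match.end():]).strip(),
    }

print(split_thinking("<think>Compare the frames.</think>The action is waving."))
# {'thinking': 'Compare the frames.', 'prediction': 'The action is waving.'}
```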
## 🆕 News

- **2025-07-07** Supported [SeePhys](https://seephys.github.io/), a full-spectrum multimodal benchmark for evaluating physics reasoning across different knowledge levels, thanks to [Quinn777](https://github.com/Quinn777) 🔥🔥🔥
- **2025-07-02** Supported [OvisU1](https://huggingface.co/AIDC-AI/Ovis-U1-3B), thanks to [liyang-7](https://github.com/liyang-7) 🔥🔥🔥
- **2025-06-16** Supported [PhyX](https://phyx-bench.github.io/), a benchmark aiming to assess capacity for physics-grounded reasoning in visual scenarios. 🔥🔥🔥
- **2025-05-24** To facilitate faster evaluations for large-scale or thinking models, **VLMEvalKit supports multi-node distributed inference** using **LMDeploy** (supports *InternVL Series, QwenVL Series, LLaMa4*) or **VLLM** (supports *QwenVL Series, LLaMa4*). You can activate this feature by adding the `use_lmdeploy` or `use_vllm` flag to your custom model configuration in [config.py](vlmeval/config.py). Leverage these tools to significantly speed up your evaluation workflows 🔥🔥🔥
- **2025-05-24** Supported Models: **InternVL3 Series, Gemini-2.5-Pro, Kimi-VL, LLaMA4, NVILA, Qwen2.5-Omni, Phi4, SmolVLM2, Grok, SAIL-VL-1.5, WeThink-Qwen2.5VL-7B, Bailingmm, VLM-R1, Taichu-VLR**. Supported Benchmarks: **HLE-Bench, MMVP, MM-AlignBench, Creation-MMBench, MM-IFEval, OmniDocBench, OCR-Reasoning, EMMA, ChaXiv, MedXpertQA, Physics, MSEarthMCQ, MicroBench, MMSci, VGRP-Bench, wildDoc, TDBench, VisuLogic, CVBench, LEGO-Puzzles, Video-MMLU, QBench-Video, MME-CoT, VLM2Bench, VMCBench, MOAT, Spatial457 Benchmark**. Please refer to [VLMEvalKit Features](https://aicarrier.feishu.cn/wiki/Qp7wwSzQ9iK1Y6kNUJVcr6zTnPe?table=tblsdEpLieDoCxtb) for more details. Thanks to all contributors 🔥🔥🔥
- **2025-02-20** Supported Models: **InternVL2.5 Series, Qwen2.5VL Series, QVQ-72B, Doubao-VL, Janus-Pro-7B, MiniCPM-o-2.6, InternVL2-MPO, LLaVA-CoT, Hunyuan-Standard-Vision, Ovis2, Valley, SAIL-VL, Ross, Long-VITA, EMU3, SmolVLM**. Supported Benchmarks: **MMMU-Pro, WeMath, 3DSRBench, LogicVista, VL-RewardBench, CC-OCR, CG-Bench, CMMMU, WorldSense**. Thanks to all contributors 🔥🔥🔥
- **2024-12-11** Supported [NaturalBench](https://huggingface.co/datasets/BaiqiL/NaturalBench), a vision-centric VQA benchmark (NeurIPS'24) that challenges vision-language models with simple questions about natural imagery.
- **2024-12-02** Supported [VisOnlyQA](https://github.com/psunlpgroup/VisOnlyQA/), a benchmark for evaluating visual perception capabilities 🔥🔥🔥
- **2024-11-26** Supported [Ovis1.6-Gemma2-27B](https://huggingface.co/AIDC-AI/Ovis1.6-Gemma2-27B), thanks to [runninglsy](https://github.com/runninglsy) 🔥🔥🔥
- **2024-11-25** Created a new flag, `VLMEVALKIT_USE_MODELSCOPE`. By setting this environment variable, you can download the supported video benchmarks from [modelscope](https://www.modelscope.cn) 🔥🔥🔥
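The three flags mentioned above are simple key/value settings. A minimal setup sketch, assuming all three are read from the process environment (the page confirms this only for `VLMEVALKIT_USE_MODELSCOPE`; the other two may instead be config options):

```python
import os

# Assumed environment-based setup; the values come from the notes above.
os.environ["SPLIT_THINK"] = "True"             # split <think>...</think> from predictions
os.environ["PRED_FORMAT"] = "tsv"              # save predictions as TSV, not .xlsx
os.environ["VLMEVALKIT_USE_MODELSCOPE"] = "1"  # fetch video benchmarks via modelscope
```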
## 🏗️ QuickStart

See [QuickStart](/docs/en/Quickstart.md) | [快速开始](/docs/zh-CN/Quickstart.md) for a quick start guide.

## 📊 Datasets, Models, and Evaluation Results

### Evaluation Results

**The performance numbers on our official multi-modal leaderboards can be downloaded from here!**

[OpenVLM Leaderboard](https://huggingface.co/spaces/opencompass/open_vlm_leaderboard): [Download All DETAILED Results](http://opencompass.openxlab.space/assets/OpenVLM.json).

Check the **Supported Benchmarks** tab in [VLMEvalKit Features](https://aicarrier.feishu.cn/wiki/Qp7wwSzQ9iK1Y6kNUJVcr6zTnPe?table=tblsdEpLieDoCxtb) to view all supported image & video benchmarks (70+).

Check the **Supported LMMs** tab in [VLMEvalKit Features](https://aicarrier.feishu.cn/wiki/Qp7wwSzQ9iK1Y6kNUJVcr6zTnPe?table=tblsdEpLieDoCxtb) to view all supported LMMs, including commercial APIs, open-source models, and more (200+).

**Transformers Version Recommendation:**

Note that some VLMs may not run under certain transformers versions; we recommend the following settings to evaluate each VLM:

- **Please use** `transformers==4.33.0` **for**: `Qwen series`, `Monkey series`, `InternLM-XComposer Series`, `mPLUG-Owl2`, `OpenFlamingo v2`, `IDEFICS series`, `VisualGLM`, `MMAlaya`, `ShareCaptioner`, `MiniGPT-4 series`, `InstructBLIP series`, `PandaGPT`, `VXVERSE`.
- **Please use** `transformers==4.36.2` **for**: `Moondream1`.
- **Please use** `transformers==4.37.0` **for**: `LLaVA series`, `ShareGPT4V series`, `TransCore-M`, `LLaVA (XTuner)`, `CogVLM Series`, `EMU2 Series`, `Yi-VL Series`, `MiniCPM-V1/V2`, `OmniLMM-12B`, `DeepSeek-VL series`, `InternVL series`, `Cambrian Series`, `VILA Series`, `Llama-3-MixSenseV1_1`, `Parrot-7B`, `PLLaVA Series`.
- **Please use** `transformers==4.40.0` **for**: `IDEFICS2`, `Bunny-Llama3`, `MiniCPM-Llama3-V2.5`, `360VL-70B`, `Phi-3-Vision`, `WeMM`.
- **Please use** `transformers==4.42.0` **for**: `AKI`.
- **Please use** `transformers==4.44.0` **for**: `Moondream2`, `H2OVL series`.
- **Please use** `transformers==4.45.0` **for**: `Aria`.
- **Please use** `transformers==latest` **for**: `LLaVA-Next series`, `PaliGemma-3B`, `Chameleon series`, `Video-LLaVA-7B-HF`, `Ovis series`, `Mantis series`, `MiniCPM-V2.6`, `OmChat-v2.0-13B-sinlge-beta`, `Idefics-3`, `GLM-4v-9B`, `VideoChat2-HD`, `RBDash_72b`, `Llama-3.2 series`, `Kosmos series`.

**Torchvision Version Recommendation:**

Note that some VLMs may not run under certain torchvision versions; we recommend the following setting to evaluate each VLM:

- **Please use** `torchvision>=0.16` **for**: `Moondream series` and `Aria`

**Flash-attn Version Recommendation:**

Note that some VLMs may not run under certain flash-attention versions; we recommend the following setting to evaluate each VLM:

- **Please use** `pip install flash-attn --no-build-isolation` **for**: `Aria`

```python
# Demo
from vlmeval.config import supported_VLM
model = supported_VLM['idefics_9b_instruct']()
# Forward Single Image
ret = model.generate(['assets/apple.jpg', 'What is in this image?'])
print(ret)  # The image features a red apple with a leaf on it.
# Forward Multiple Images
ret = model.generate(['assets/apple.jpg', 'assets/apple.jpg', 'How many apples are there in the provided images?'])
print(ret)  # There are two apples in the provided images.
```
## 🛠️ Development Guide

To develop custom benchmarks, VLMs, or simply contribute other code to **VLMEvalKit**, please refer to the [Development Guide](/docs/en/Development.md) | [开发指南](/docs/zh-CN/Development.md).

**Call for contributions**

To promote contributions from the community and share the corresponding credit (in the next report update):

- All contributions will be acknowledged in the report.
- Contributors with 3 or more major contributions (implementing an MLLM, benchmark, or major feature) can join the author list of the [VLMEvalKit Technical Report](https://www.arxiv.org/abs/2407.11691) on arXiv. Eligible contributors can create an issue or DM kennyutc in the [VLMEvalKit Discord Channel](https://discord.com/invite/evDT4GZmxN).

Here is a [contributor list](/docs/en/Contributors.md) we curated based on the records.

## 🎯 The Goal of VLMEvalKit

**The codebase is designed to:**

1. Provide an **easy-to-use**, **opensource evaluation toolkit** to make it convenient for researchers & developers to evaluate existing LVLMs and make evaluation results **easy to reproduce**.
2. Make it easy for VLM developers to evaluate their own models. To evaluate a VLM on multiple supported benchmarks, one just needs to **implement a single `generate_inner()` function** (a sketch follows this section); all other workloads (data downloading, data preprocessing, prediction inference, metric calculation) are handled by the codebase.

**The codebase is not designed to:**

1. Reproduce the exact accuracy numbers reported in the original papers of all **3rd party benchmarks**. The reason can be two-fold:
   1. VLMEvalKit uses **generation-based evaluation** for all VLMs (optionally with **LLM-based answer extraction**). Meanwhile, some benchmarks may use different approaches (e.g., SEEDBench uses PPL-based evaluation). For those benchmarks, we compare both scores in the corresponding result. We encourage developers to support other evaluation paradigms in the codebase.
   2. By default, we use the same prompt template for all VLMs to evaluate on a benchmark. Meanwhile, **some VLMs may have their specific prompt templates** (some may not be covered by the codebase at this time). We encourage VLM developers to implement their own prompt template in VLMEvalKit if it is not currently covered. That will help improve reproducibility.
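A minimal sketch of what that single function can look like. Only the `generate_inner()` entry point is described on this page; the class name and the interleaved message format below are assumptions for illustration:

```python
class MyVLM:
    """Hypothetical custom model: only generate_inner() is implemented here;
    the surrounding harness (data download, inference loop, metrics) is
    VLMEvalKit's responsibility per the description above."""

    def generate_inner(self, message, dataset=None):
        # `message` is assumed to be an interleaved list of segments, e.g.
        # [{"type": "image", "value": "frame_0.jpg"},
        #  {"type": "text",  "value": "What action is performed?"}]
        images = [seg["value"] for seg in message if seg["type"] == "image"]
        prompt = "\n".join(seg["value"] for seg in message if seg["type"] == "text")
        # Stand-in for the real model call:
        return f"stub answer for {len(images)} image(s): {prompt!r}"

print(MyVLM().generate_inner([
    {"type": "image", "value": "frame_0.jpg"},
    {"type": "text", "value": "What action is performed?"},
]))
```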
## 🖊️ Citation

If you find this work helpful, please consider **starring🌟** this repo. Thanks for your support!

[![Stargazers repo roster for @open-compass/VLMEvalKit](https://reporoster.com/stars/open-compass/VLMEvalKit)](https://github.com/open-compass/VLMEvalKit/stargazers)

If you use VLMEvalKit in your research or wish to refer to published open-source evaluation results, please use the following BibTeX entry and the BibTeX entry corresponding to the specific VLM / benchmark you used.

```bib
@inproceedings{duan2024vlmevalkit,
  title={Vlmevalkit: An open-source toolkit for evaluating large multi-modality models},
  author={Duan, Haodong and Yang, Junming and Qiao, Yuxuan and Fang, Xinyu and Chen, Lin and Liu, Yuan and Dong, Xiaoyi and Zang, Yuhang and Zhang, Pan and Wang, Jiaqi and others},
  booktitle={Proceedings of the 32nd ACM International Conference on Multimedia},
  pages={11198--11201},
  year={2024}
}
```

<p align="right"><a href="#top">🔝Back to top</a></p>

[github-contributors-link]: https://github.com/open-compass/VLMEvalKit/graphs/contributors
[github-contributors-shield]: https://img.shields.io/github/contributors/open-compass/VLMEvalKit?color=c4f042&labelColor=black&style=flat-square
[github-forks-link]: https://github.com/open-compass/VLMEvalKit/network/members
[github-forks-shield]: https://img.shields.io/github/forks/open-compass/VLMEvalKit?color=8ae8ff&labelColor=black&style=flat-square
[github-issues-link]: https://github.com/open-compass/VLMEvalKit/issues
[github-issues-shield]: https://img.shields.io/github/issues/open-compass/VLMEvalKit?color=ff80eb&labelColor=black&style=flat-square
[github-license-link]: https://github.com/open-compass/VLMEvalKit/blob/main/LICENSE
[github-license-shield]: https://img.shields.io/github/license/open-compass/VLMEvalKit?color=white&labelColor=black&style=flat-square
[github-stars-link]: https://github.com/open-compass/VLMEvalKit/stargazers
[github-stars-shield]: https://img.shields.io/github/stars/open-compass/VLMEvalKit?color=ffcb47&labelColor=black&style=flat-square

## Video-MME (w/o subs) Evaluation Results

- We give the total scores for the three video lengths (short, medium and long), as well as the total scores for each task type.

Task dimensions: short, medium, long, Temporal Perception, Spatial Perception, Attribute Perception, Action Recognition, Object Recognition, OCR Problems, Counting Problem, Temporal Reasoning, Spatial Reasoning, Action Reasoning, Object Reasoning, Information Synopsis, Overall (default view: Overall, short, medium, long).
Parameters (B), Language Model, and Vision Model for each method are listed in the main table above; the per-benchmark tables below repeat only Method and Frames alongside the scores, and methods that appear only in a per-benchmark table are annotated where they occur.

| Method | Frames | Overall | short | medium | long |
|---|---|---|---|---|---|
| InternVL3-78B | 64 | 73.1 | 83.8 | 71.1 | 64.4 |
| InternVL3-38B | 64 | 72.1 | 82.1 | 72.1 | 62.1 |
| InternVL2_5-78B | 64 | 72.1 | 83.0 | 71.1 | 62.2 |
| Gemini-2.0-Flash | 64 | 71.0 | 79.7 | 71.8 | 61.4 |
| LLaVA-Video-72B-Qwen2 | 64 | 70.5 | 80.8 | 68.6 | 62.1 |
| Qwen2.5-VL-72B | 64 | 68.6 | 78.2 | 68.6 | 58.9 |
| GPT-4o-2024-08-06 | 16 | 67.9 | 77.7 | 66.2 | 59.8 |
| Qwen2-VL-72B | 64 | 67.3 | 78.9 | 64.0 | 59.1 |
| LLaVA-OneVision-72B-sft | 32 | 66.2 | 77.2 | 63.2 | 58.2 |
| InternVL3-8B | 64 | 66.0 | 76.2 | 66.4 | 55.4 |
| Aria | 64 | 66.0 | 77.1 | 64.9 | 56.0 |
| MiMo-VL-7B-RL | 64 | 65.0 | 75.1 | 65.8 | 54.1 |
| Qwen2.5-Omni-7B | 64 | 64.1 | 75.1 | 63.0 | 54.3 |
| LLaVA-Video-7B-Qwen2 | 64 | 63.7 | 76.7 | 62.2 | 52.2 |
| InternVL2_5-8B | 64 | 63.7 | 75.9 | 63.2 | 52.1 |
| Long-VITA-16K | 64 | 63.6 | 74.6 | 61.2 | 55.0 |
| Qwen2.5-VL-7B | 64 | 62.8 | 73.9 | 62.3 | 52.2 |
| MiniCPM-o-2_6 | 64 | 62.7 | 75.2 | 63.1 | 49.8 |
| VILA1.5-40B | 14 | 62.3 | 72.7 | 61.0 | 53.1 |
| InternVL2-76B | 16 | 61.0 | 72.4 | 58.6 | 52.1 |
| Gemini-1.5-Flash | 16 | 60.7 | 71.3 | 58.6 | 52.1 |
| Qwen2-VL-7B | 64 | 59.7 | 71.1 | 57.9 | 50.0 |
| MiniCPM-V-2.6 | 64 | 59.7 | 70.4 | 58.1 | 50.4 |
| mPLUG-Owl3 | 16 | 54.0 | 63.3 | 51.8 | 46.8 |
| InternVL2-8B | 16 | 53.7 | 65.9 | 49.8 | 45.3 |
| PLLaVA-34B | 16 | 53.4 | 62.0 | 52.9 | 45.4 |
| Claude-3.5-Sonnet | 8 | 51.4 | 63.6 | 48.3 | 42.2 |
| SmolVLM2-2.2B-Instruct | 64 | 51.2 | 79.0 | 50.8 | 39.2 |
| Phi-3.5-Vision | 16 | 51.1 | 61.7 | 49.0 | 42.7 |
| Idefics3-8B-Llama3 | 16 | 48.1 | 56.1 | 45.1 | 43.0 |
| VideoChat2-HD | 16 | 45.6 | 54.2 | 42.8 | 39.9 |
| SmolVLM2-500M-Video-Instruct | 64 | 42.7 | 50.6 | 42.7 | 34.8 |
| Phi-4-multimodal-instruct | 64 | 42.4 | 48.0 | 39.4 | 39.7 |
| Chat-UniVi-7B-v1.5 | 1.0 fps | 31.8 | 36.2 | 32.7 | 26.6 |
| Video-LLaVA | 8 | 30.3 | 31.6 | 31.4 | 28.0 |
| Video-ChatGPT | 100 | 28.0 | 28.0 | 27.7 | 28.2 |
| LLaMA-VID-7B | 1 fps | 22.0 | 28.0 | 30.1 | 7.9 |

## 📊 MVBench Leaderboard

Task dimensions: action_sequence, action_prediction, action_antonym, fine_grained_action, unexpected_action, object_existence, object_interaction, object_shuffle, moving_direction, action_localization, scene_transition, action_count, moving_count, moving_attribute, state_change, fine_grained_pose, character_order, egocentric_navigation, episodic_reasoning, counterfactual_inference, Overall (default view: Overall).
| Method | Frames | Overall |
|---|---|---|
| InternVL3-78B | 64 | 79.2 |
| InternVL3-38B | 64 | 76.0 |
| InternVL2_5-78B | 64 | 75.6 |
| InternVL3-8B | 64 | 73.2 |
| Qwen2.5-VL-72B | 64 | 71.3 |
| InternVL2_5-8B | 64 | 70.5 |
| Qwen2-VL-72B | 64 | 70.2 |
| InternVL2-76B | 16 | 69.3 |
| Qwen2.5-Omni-7B | 64 | 69.0 |
| Aria | 64 | 67.9 |
| Qwen2.5-VL-7B | 64 | 67.5 |
| Qwen2-VL-7B | 64 | 66.0 |
| InternVL2-8B | 16 | 64.5 |
| MiMo-VL-7B-RL | 64 | 63.2 |
| Long-VITA-16K | 64 | 63.1 |
| LLaVA-Video-72B-Qwen2 | 64 | 63.1 |
| LLaVA-Video-7B-Qwen2 | 64 | 62.1 |
| VideoChat2-HD | 16 | 61.6 |
| LLaVA-OneVision-72B-sft | 32 | 60.8 |
| Gemini-2.0-Flash | 64 | 59.4 |
| PLLaVA-34B | 16 | 57.8 |
| GPT-4o-2024-08-06 | 16 | 57.5 |
| Gemini-1.5-Flash | 16 | 54.1 |
| Phi-3.5-Vision | 16 | 48.2 |
| MiniCPM-o-2_6 | 64 | 47.9 |
| SmolVLM2-2.2B-Instruct | 64 | 46.7 |
| Idefics3-8B-Llama3 | 16 | 46.1 |
| mPLUG-Owl3 | 16 | 45.1 |
| MiniCPM-V-2.6 | 64 | 44.7 |
| Phi-4-multimodal-instruct | 64 | 44.1 |
| Video-LLaVA | 8 | 43.5 |
| VILA1.5-40B | 14 | 43.2 |
| Chat-UniVi-7B-v1.5 | 1.0 fps | 42.9 |
| LLaMA-VID-7B | 1 fps | 41.5 |
| SmolVLM2-500M-Video-Instruct | 64 | 40.9 |
| Video-ChatGPT | 100 | 33.0 |
| Claude-3.5-Sonnet | 8 | 25.5 |

## 📊 MMBench-Video Leaderboard

Task dimensions: CP, FP-S, FP-C, HL, LR, AR, RR, CSR, TR, Perception, Reasoning, Overall (default view: Overall, Perception, Reasoning).
| Method | Frames | Overall | Perception | Reasoning |
|---|---|---|---|---|
| GPT-4o-2024-08-06 | 1 fps | 2.15 | 2.19 | 2.08 |
| Gemini-2.0-Flash | 64 | 2.01 | 2.07 | 1.86 |
| InternVL2_5-78B | 64 | 1.98 | 2.0 | 1.94 |
| [Gemini-1.5-Pro](https://ai.google.dev/gemini-api/docs/models/gemini#gemini-1.5-pro) | 1 fps | 1.94 | 1.98 | 1.86 |
| GPT-4o-2024-08-06 | 16 | 1.87 | 1.89 | 1.81 |
| Aria | 1 fps | 1.85 | 1.9 | 1.71 |
| Qwen2.5-VL-72B | 64 | 1.84 | 1.83 | 1.85 |
| InternVL3-78B | 64 | 1.81 | 1.84 | 1.72 |
| Aria | 64 | 1.81 | 1.83 | 1.73 |
| InternVL3-38B | 64 | 1.8 | 1.82 | 1.75 |
| MiniCPM-o-2_6 | 64 | 1.76 | 1.83 | 1.64 |
| LLaVA-Video-72B-Qwen2 | 64 | 1.71 | 1.74 | 1.63 |
| MiniCPM-V-2.6 | 64 | 1.7 | 1.75 | 1.6 |
| Qwen2-VL-72B | 64 | 1.7 | 1.71 | 1.65 |
| InternVL3-8B | 64 | 1.69 | 1.73 | 1.58 |
| InternVL2_5-8B | 64 | 1.68 | 1.73 | 1.54 |
| Gemini-1.5-Pro | 16 | 1.66 | 1.61 | 1.55 |
| Gemini-1.5-Flash | 16 | 1.66 | 1.66 | 1.63 |
| Qwen2.5-Omni-7B | 64 | 1.65 | 1.67 | 1.58 |
| MiMo-VL-7B-RL | 64 | 1.62 | 1.64 | 1.55 |
| VILA1.5-40B | 14 | 1.61 | 1.63 | 1.52 |
| Aria | 16 | 1.61 | 1.61 | 1.58 |
| LLaVA-Video-7B-Qwen2 | 64 | 1.6 | 1.65 | 1.47 |
| InternVL2-76B | 16 | 1.59 | 1.59 | 1.59 |
| Qwen2.5-VL-7B | 64 | 1.49 | 1.49 | 1.45 |
| Qwen2-VL-7B | 64 | 1.45 | 1.48 | 1.34 |
| [InternVL2-26B](https://huggingface.co/OpenGVLab/InternVL2-26B) | 16 | 1.41 | 1.42 | 1.35 |
| SmolVLM2-2.2B-Instruct | 64 | 1.4 | 1.45 | 1.29 |
| Claude-3.5-Sonnet | 8 | 1.38 | 1.38 | 1.35 |
| [VILA1.5-13B](https://huggingface.co/Efficient-Large-Model/VILA1.5-13b) | 14 | 1.36 | 1.39 | 1.28 |
| mPLUG-Owl3 | 16 | 1.35 | 1.37 | 1.28 |
| InternVL2-8B | 16 | 1.26 | 1.3 | 1.16 |
| VideoChat2-HD | 16 | 1.22 | 1.2 | 1.23 |
| Phi-3.5-Vision | 16 | 1.2 | 1.22 | 1.11 |
| PLLaVA-34B | 16 | 1.19 | 1.19 | 1.16 |
| [LLaVA-NeXT-Video-34B](https://huggingface.co/llava-hf/LLaVA-NeXT-Video-34B-hf) | 32 | 1.14 | 1.14 | 1.13 |
| [VideoStreaming](https://huggingface.co/papers/2405.16009) | 64 | 1.12 | 1.13 | 1.09 |
| LLaMA-VID-7B | 1 fps | 1.08 | 1.09 | 1.02 |
| Chat-UniVi-7B-v1.5 | 1.0 fps | 1.06 | 1.08 | 0.99 |
| [ShareGPT4Video-8B](https://huggingface.co/Lin-Chen/sharegpt4video-8b) | 16 | 1.05 | 1.04 | 1.03 |
| Video-LLaVA | 8 | 1.03 | 1.04 | 0.95 |
| SmolVLM2-500M-Video-Instruct | 64 | 1.01 | 1.04 | 0.92 |
| Idefics3-8B-Llama3 | 16 | 1.0 | 1.0 | 0.97 |
| LLaVA-OneVision-72B-sft | 32 | 0.94 | 1.03 | 0.7 |
| Video-ChatGPT | 100 | 0.9 | 0.88 | 0.93 |
| [MovieLLM](https://huggingface.co/sfsdfsafsddsfsdafsa/llama-vid-7b-full-224-long-video-MovieLLM) | 1 fps | 0.87 | 0.81 | 0.97 |
| Phi-4-multimodal-instruct | 64 | 0.87 | 0.85 | 0.86 |
| [MiniGPT4-Video](https://huggingface.co/Vision-CAIR/MiniGPT4-video-mistral-hf) | 90 | 0.7 | 0.62 | 0.85 |
| Long-VITA-16K | 64 | 0.65 | 0.71 | 0.51 |

Methods above that are not in the main table, with (Parameters (B), Language Model, Vision Model): Gemini-1.5-Pro (N/A, N/A, N/A); InternVL2-26B (25.5, InternLM2-20B-chat, InternViT-6B-448px-V1-5); VILA1.5-13B (13, Llama-2-13B, siglip); LLaVA-NeXT-Video-34B (34.8, Nous-Hermes-2-Yi-34B, CLIP ViT-L/14); VideoStreaming (8.3, Vicuna-v1.5-7B, Phi-2-2.7B (half)); ShareGPT4Video-8B (8.35, Llama-3-8B-Instruct, CLIP ViT-L/14); MovieLLM (7, Vicuna-v1.5-7B, CLIP ViT-L/14); MiniGPT4-Video (8.28, Mistral-7B-v0.1, CLIP ViT-L/14).

## 📊 TempCompass Leaderboard

Task dimensions: action, multi-choice, direction, speed, order, attribute_change, caption_matching, captioning, yes_no, overall (default view: overall).
📊 TempCompass Leaderboard

[Filter controls for this tab: TempCompass CheckBoxes (action, multi-choice, direction, speed, order, attribute_change, caption_matching, captioning, yes_no, overall; default: overall), Model Size (<10B, 10B-20B, 20B-40B, >40B, Unknown), Model Type (API, OpenSource). The dataframe below is reconstructed from the flattened dump; headers are as given in the config.]

| Method | Parameters (B) | Language Model | Vision Model | Frames | overall |
|---|---|---|---|---|---|
| Qwen2-VL-72B | 73.4 | Qwen2-72B | QwenViT | 64 | 79.36 |
| InternVL3-38B | 38.4 | Qwen2.5-32B | InternViT-6B-v2.5 | 64 | 78.49 |
| Qwen2.5-VL-72B | 73.4 | Qwen2.5-72B | QwenViT | 64 | 77.69 |
| InternVL3-78B | 78.4 | Qwen2.5-72B | InternViT-6B-v2.5 | 64 | 76.98 |
| GPT-4o-2024-08-06 | N/A | N/A | N/A | 16 | 72.67 |
| Qwen2.5-VL-7B | 8.29 | Qwen2.5-7B | QwenViT | 64 | 72.04 |
| Qwen2.5-Omni-7B | 10.7 | Qwen-2.5-7B | QwenViT | 64 | 70.72 |
| InternVL3-8B | 7.94 | Qwen2.5-7B | InternViT-300M-v2.5 | 64 | 70.42 |
| LLaVA-Video-72B-Qwen2 | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 64 | 70.38 |
| LLaVA-OneVision-72B-sft | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 32 | 70.19 |
| Aria | 25.3 | N/A | N/A | 64 | 69.62 |
| Qwen2-VL-7B | 8 | Qwen2-7B | QwenViT | 64 | 69.56 |
| InternVL2-76B | 76.3 | Hermes-Llama3-70B | InternViT-6B-448px-V1-5 | 16 | 69.52 |
| InternVL2_5-8B | 8.08 | internlm2_5-7b-chat | InternViT-300M-448px-V2_5 | 64 | 68.66 |
| Claude-3.5-Sonnet | N/A | N/A | N/A | 8 | 68.29 |
| LLaVA-Video-7B-Qwen2 | 8.03 | Qwen-2-7B | siglip-so400m-patch14-384 | 64 | 65.46 |
| InternVL2-8B | 8.08 | InternLM2.5-7B-chat | InternViT-300M-448px | 16 | 65.42 |
| Gemini-1.5-Flash | N/A | N/A | N/A | 16 | 63.63 |
| PLLaVA-34B | 34.9 | Nous-Hermes-2-Yi-34B | clip-vit-large-patch14-336 | 16 | 61.56 |
| VideoChat2-HD | 7 (LLM) | Mistral-7B-Instruct-v0.2 | umt_l16_qformer | 16 | 60.72 |
| MiniCPM-V-2.6 | 8.1 | Qwen-2-7B | siglip-so400m | 64 | 59.64 |
| Phi-3.5-Vision | 4.15 | Phi-3 Mini | N/A | 16 | 59.56 |
| Gemini-2.0-Flash | null | | | 64 | 57.85 |
| MiniCPM-o-2_6 | 8.67 | Qwen2.5-7B-Instruct | siglip-so400m | 64 | 57.08 |
| InternVL2_5-78B | 78.4 | Qwen2.5-72B-Instruct | InternViT-6B-448px-V2_5 | 64 | 55.95 |
| mPLUG-Owl3 | 8.06 | Qwen-2-7B | siglip-so400m | 16 | 55.76 |
| Idefics3-8B-Llama3 | 8.46 | Llama-3.1-8B-Instruct | siglip-so400m-patch14-384 | 16 | 55.52 |
| VILA1.5-40B | 40 | Yi-34B | InternViT-6B | 14 | 54.1 |
| SmolVLM2-2.2B-Instruct | 2.25 | SmolLM2-1.7B | | 64 | 52.72 |
| Chat-UniVi-7B-v1.5 | 7 (LLM) | Vicuna-v1.5-7B | clip-vit-large-patch14-336 | 1.0 fps | 51.1 |
| Video-LLaVA | 7.5 | Vicuna-v1.5-7B | LanguageBind Encoder | 8 | 51.1 |
| Phi-4-multimodal-instruct | 5.57 | Phi-4-Mini-Instruct | | 64 | 50.4 |
| LLaMA-VID-7B | 7 (LLM) | Vicuna-v1.5-7B | LAVIS-EVA-G | 1 fps | 48.77 |
| SmolVLM2-500M-Video-Instruct | 0.507 | SmolLM2-360M-Instruct | Siglip-base-patch16-512 | 64 | 47.51 |
| Video-ChatGPT | 7 (LLM) | Vicuna-v1.1-7B | CLIP ViT-L/14 | 100 | 44.12 |
| MiMo-VL-7B-RL | 8.31 | MiMo-7B | Qwen2.5-ViT | 64 | 13.54 |
| Long-VITA-16K | 14 | Qwen2.5-14B | InternViT-300M | 64 | 0.29 |
📊 MLVU Leaderboard

## MLVU Evaluation Results

- The ranking here is determined by sorting the M-Avg scores in descending order (a short sketch reproducing this ordering follows the table).
- The number of evaluation questions used here is consistent with the official Hugging Face benchmark.

[Filter controls for this tab: MLVU CheckBoxes (plotQA, needle, ego, count, order, anomaly_reco, topic_reasoning, sub_scene, summary, M-Avg, G-Avg; default: M-Avg, G-Avg), plus the same Model Size and Model Type groups as the other tabs. The dataframe below is reconstructed from the flattened dump; headers are as given in the config.]

| Method | Parameters (B) | Language Model | Vision Model | Frames | M-Avg | G-Avg |
|---|---|---|---|---|---|---|
| InternVL3-78B | 78.4 | Qwen2.5-72B | InternViT-6B-v2.5 | 64 | 79.5 | 4.5 |
| InternVL3-38B | 38.4 | Qwen2.5-32B | InternViT-6B-v2.5 | 64 | 77.4 | 4.8 |
| InternVL2_5-78B | 78.4 | Qwen2.5-72B-Instruct | InternViT-6B-448px-V2_5 | 64 | 75.8 | 4.5 |
| LLaVA-Video-72B-Qwen2 | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 64 | 75.2 | 4.26 |
| InternVL3-8B | 7.94 | Qwen2.5-7B | InternViT-300M-v2.5 | 64 | 73.1 | 4.1 |
| LLaVA-Video-7B-Qwen2 | 8.03 | Qwen-2-7B | siglip-so400m-patch14-384 | 64 | 73.0 | 3.96 |
| LLaVA-OneVision-72B-sft | 73.2 | Qwen-2-72B | siglip-so400m-patch14-384 | 32 | 72.9 | 4.24 |
| Gemini-2.0-Flash | null | | | 64 | 71.6 | 4.9 |
| Aria | 25.3 | N/A | N/A | 64 | 70.3 | 5.0 |
| Qwen2-VL-72B | 73.4 | Qwen2-72B | QwenViT | 64 | 69.6 | 4.2 |
| Long-VITA-16K | 14 | Qwen2.5-14B | InternViT-300M | 64 | 69.6 | 0.0 |
| InternVL2_5-8B | 8.08 | internlm2_5-7b-chat | InternViT-300M-448px-V2_5 | 64 | 68.5 | 4.0 |
| Qwen2.5-VL-72B | 73.4 | Qwen2.5-72B | QwenViT | 64 | 67.6 | 4.6 |
| Qwen2.5-Omni-7B | 10.7 | Qwen-2.5-7B | QwenViT | 64 | 67.5 | 4.3 |
| Qwen2-VL-7B | 8 | Qwen2-7B | QwenViT | 64 | 66.4 | 4.1 |
| MiMo-VL-7B-RL | 8.31 | MiMo-7B | Qwen2.5-ViT | 64 | 66.2 | 3.4 |
| Qwen2.5-VL-7B | 8.29 | Qwen2.5-7B | QwenViT | 64 | 65.3 | 4.5 |
| InternVL2-76B | 76.3 | Hermes-Llama3-70B | InternViT-6B-448px-V1-5 | 16 | 64.1 | 3.88 |
| InternVL2-8B | 8.08 | InternLM2.5-7B-chat | InternViT-300M-448px | 16 | 59.9 | 3.84 |
| PLLaVA-34B | 34.9 | Nous-Hermes-2-Yi-34B | clip-vit-large-patch14-336 | 16 | 59.8 | 4.26 |
| GPT-4o-2024-08-06 | N/A | N/A | N/A | 16 | 57.0 | 4.37 |
| MiniCPM-o-2_6 | 8.67 | Qwen2.5-7B-Instruct | siglip-so400m | 64 | 56.0 | 3.4 |
| Phi-3.5-Vision | 4.15 | Phi-3 Mini | N/A | 16 | 55.1 | 3.91 |
| SmolVLM2-2.2B-Instruct | 2.25 | SmolLM2-1.7B | | 64 | 55.0 | 3.5 |
| MiniCPM-V-2.6 | 8.1 | Qwen-2-7B | siglip-so400m | 64 | 54.4 | 4.45 |
| Gemini-1.5-Flash | N/A | N/A | N/A | 16 | 54.0 | 3.71 |
| VideoChat2-HD | 7 (LLM) | Mistral-7B-Instruct-v0.2 | umt_l16_qformer | 16 | 52.7 | 4.08 |
| Video-LLaVA | 7.5 | Vicuna-v1.5-7B | LanguageBind Encoder | 8 | 52.5 | 3.67 |
| LLaMA-VID-7B | 7 (LLM) | Vicuna-v1.5-7B | LAVIS-EVA-G | 1 fps | 51.2 | 3.95 |
| Chat-UniVi-7B-v1.5 | 7 (LLM) | Vicuna-v1.5-7B | clip-vit-large-patch14-336 | 1.0 fps | 50.7 | 3.67 |
| Idefics3-8B-Llama3 | 8.46 | Llama-3.1-8B-Instruct | siglip-so400m-patch14-384 | 16 | 50.2 | 2.78 |
| mPLUG-Owl3 | 8.06 | Qwen-2-7B | siglip-so400m | 16 | 49.9 | 3.82 |
| Phi-4-multimodal-instruct | 5.57 | Phi-4-Mini-Instruct | | 64 | 48.0 | 3.3 |
| SmolVLM2-500M-Video-Instruct | 0.507 | SmolLM2-360M-Instruct | Siglip-base-patch16-512 | 64 | 47.0 | 3.1 |
| VILA1.5-40B | 40 | Yi-34B | InternViT-6B | 14 | 42.7 | 3.18 |
| Claude-3.5-Sonnet | N/A | N/A | N/A | 8 | 38.1 | 3.89 |
| Video-ChatGPT | 7 (LLM) | Vicuna-v1.1-7B | CLIP ViT-L/14 | 100 | 35.6 | 3.58 |
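The ordering stated in the MLVU notes (M-Avg descending) is easy to sanity-check. A minimal pandas sketch, using the first three rows of the reconstructed table above; this is an illustration, not the Space's own code:

```python
import pandas as pd

# Three (Method, M-Avg, G-Avg) rows copied from the MLVU table above.
rows = [
    ("InternVL3-78B",   79.5, 4.5),
    ("InternVL3-38B",   77.4, 4.8),
    ("InternVL2_5-78B", 75.8, 4.5),
]
df = pd.DataFrame(rows, columns=["Method", "M-Avg", "G-Avg"])

# Sorting by M-Avg in descending order reproduces the leaderboard ranking;
# G-Avg is displayed alongside but does not affect the order.
ranked = df.sort_values("M-Avg", ascending=False).reset_index(drop=True)
print(ranked)
```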
[Citation accordion, collapsed by default; textbox label: "Copy the following snippet to cite these results". The snippet below is the dump's BibTeX with the scraper-stripped "=" signs and braces restored:]

@misc{duan2024vlmevalkitopensourcetoolkitevaluating,
      title={VLMEvalKit: An Open-Source Toolkit for Evaluating Large Multi-Modality Models},
      author={Haodong Duan and Junming Yang and Yuxuan Qiao and Xinyu Fang and Lin Chen and Yuan Liu and Amit Agarwal and Zhe Chen and Mo Li and Yubo Ma and Hailong Sun and Xiangyu Zhao and Junbo Cui and Xiaoyi Dong and Yuhang Zang and Pan Zhang and Jiaqi Wang and Dahua Lin and Kai Chen},
      year={2024},
      eprint={2407.11691},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2407.11691},
}

[Remainder of window.gradio_config: app metadata (title: Gradio; space_id: opencompass/openvlm_video_leaderboard; theme: default; protocol: sse_v3; queue enabled; Google Fonts stylesheets for Source Sans Pro and IBM Plex Mono), the component layout tree, and 18 event dependencies. Dependencies 0-2 (api_name filter_df, filter_df_1, filter_df_2) wire change events on the main tab's three checkbox groups to a backend function that re-renders the main dataframe; dependencies 3-17 (api_name filter_df_l2 through filter_df_l2_14) do the same for each benchmark tab, taking the tab's hidden dataset-name textbox plus its three checkbox groups as inputs and returning that tab's dataframe.]
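This dependency pattern is simple to reproduce. The following is an illustrative reconstruction only: the Space's actual source is not part of this capture, and the toy table and column names are hypothetical. It shows the same wiring, with one change-event dependency per checkbox group feeding a shared filter function:

```python
import gradio as gr
import pandas as pd

# Hypothetical stand-in for a leaderboard table.
TABLE = pd.DataFrame({
    "Method":     ["Model-A", "Model-B"],
    "Model Size": ["<10B", ">40B"],
    "Model Type": ["OpenSource", "API"],
})

def filter_df(model_size, model_type):
    """Keep only rows whose size/type buckets are still checked."""
    keep = TABLE["Model Size"].isin(model_size) & TABLE["Model Type"].isin(model_type)
    return TABLE[keep]

with gr.Blocks() as demo:
    size = gr.CheckboxGroup(
        ["<10B", "10B-20B", "20B-40B", ">40B", "Unknown"],
        value=["<10B", "10B-20B", "20B-40B", ">40B", "Unknown"],
        label="Model Size",
    )
    mtype = gr.CheckboxGroup(
        ["API", "OpenSource"], value=["API", "OpenSource"], label="Model Type"
    )
    table = gr.Dataframe(value=TABLE, interactive=False)
    # One dependency per checkbox group, as in the dump's dependency list.
    size.change(filter_df, inputs=[size, mtype], outputs=table, api_name="filter_df")
    mtype.change(filter_df, inputs=[size, mtype], outputs=table, api_name="filter_df_1")

if __name__ == "__main__":
    demo.launch()
```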
[root: https://opencompass-openvlm-video-leaderboard.hf.space. The rest of the capture is window.gradio_api_info: one named endpoint spec per dependency, all with show_api true and all the same shape. Each endpoint takes a list of checkbox values (fields), a model_size list (<10B, 10B-20B, 20B-40B, >40B, Unknown) and a model_type list (API, OpenSource), and returns a DataframeData object: {headers: list of strings, data: list of rows, metadata: dict or null}. The variants differ only in the fields enum and, for the per-benchmark endpoints, an extra dataset_name textbox parameter:

- /filter_df, /filter_df_1, /filter_df_2: main table; fields drawn from Avg Score, Avg Rank, OpenSource, Verified, MVBench, Video-MME (w/o subs), MMBench-Video, TempCompass, MLVU (default: Avg Score, Avg Rank).
- /filter_df_l2 to /filter_df_l2_2: Video-MME (w/o subs); fields are short, medium, long, the 12 question categories (Temporal/Spatial/Attribute Perception; Action/Object Recognition; OCR Problems; Counting Problem; Temporal/Spatial/Action/Object Reasoning; Information Synopsis) and Overall (default: Overall, short, medium, long).
- /filter_df_l2_3 to /filter_df_l2_5: MVBench; fields are the 20 sub-task names (action_sequence through counterfactual_inference) plus Overall (default: Overall).
- /filter_df_l2_6 to /filter_df_l2_8: MMBench-Video; fields are CP, FP-S, FP-C, HL, LR, AR, RR, CSR, TR, Perception, Reasoning, Overall (default: Overall, Perception, Reasoning).
- /filter_df_l2_9 to /filter_df_l2_11: TempCompass; fields as in the TempCompass checkboxes (default: overall).
- /filter_df_l2_12 onward: MLVU; fields as in the MLVU checkboxes (default: M-Avg, G-Avg). The capture cuts off midway through the /filter_df_l2_13 spec.]
Size,parameter_name:model_size,parameter_has_default:true,parameter_default:\u003c10B,10B-20B,20B-40B,\u003e40B,Unknown,type:{items:{enum:\u003c10B,10B-20B,20B-40B,\u003e40B,Unknown,type:string},title:Checkbox Group,type:array},python_type:{type:ListLiteral\u0027\u003c10B\u0027, \u002710B-20B\u0027, \u002720B-40B\u0027, \u0027\u003e40B\u0027, \u0027Unknown\u0027,description:},component:Checkboxgroup,example_input:\u003c10B},{label:Model Type,parameter_name:model_type,parameter_has_default:true,parameter_default:API,OpenSource,type:{items:{enum:API,OpenSource,type:string},title:Checkbox Group,type:array},python_type:{type:ListLiteral\u0027API\u0027, \u0027OpenSource\u0027,description:},component:Checkboxgroup,example_input:API},returns:{label:value_62,type:{properties:{headers:{items:{type:string},title:Headers,type:array},data:{items:{items:{},type:array},title:Data,type:array},metadata:{anyOf:{additionalProperties:{anyOf:{items:{},type:array},{type:null}},type:object},{type:null},default:null,title:Metadata}},required:headers,data,title:DataframeData,type:object},python_type:{type:Dict(headers: Liststr, data: ListListAny, metadata: Dict(str, ListAny | None) | None),description:},component:Dataframe},show_api:true},/filter_df_l2_14:{parameters:{label:MLVU,parameter_name:dataset_name,parameter_has_default:true,parameter_default:MLVU,type:{type:string},python_type:{type:str,description:},component:Textbox,example_input:Hello!!},{label:MLVU CheckBoxes,parameter_name:fields,parameter_has_default:true,parameter_default:M-Avg,G-Avg,type:{items:{enum:plotQA,needle,ego,count,order,anomaly_reco,topic_reasoning,sub_scene,summary,M-Avg,G-Avg,type:string},title:Checkbox Group,type:array},python_type:{type:ListLiteral\u0027plotQA\u0027, \u0027needle\u0027, \u0027ego\u0027, \u0027count\u0027, \u0027order\u0027, \u0027anomaly_reco\u0027, \u0027topic_reasoning\u0027, \u0027sub_scene\u0027, \u0027summary\u0027, \u0027M-Avg\u0027, \u0027G-Avg\u0027,description:},component:Checkboxgroup,example_input:plotQA},{label:Model Size,parameter_name:model_size,parameter_has_default:true,parameter_default:\u003c10B,10B-20B,20B-40B,\u003e40B,Unknown,type:{items:{enum:\u003c10B,10B-20B,20B-40B,\u003e40B,Unknown,type:string},title:Checkbox Group,type:array},python_type:{type:ListLiteral\u0027\u003c10B\u0027, \u002710B-20B\u0027, \u002720B-40B\u0027, \u0027\u003e40B\u0027, \u0027Unknown\u0027,description:},component:Checkboxgroup,example_input:\u003c10B},{label:Model Type,parameter_name:model_type,parameter_has_default:true,parameter_default:API,OpenSource,type:{items:{enum:API,OpenSource,type:string},title:Checkbox Group,type:array},python_type:{type:ListLiteral\u0027API\u0027, \u0027OpenSource\u0027,description:},component:Checkboxgroup,example_input:API},returns:{label:value_62,type:{properties:{headers:{items:{type:string},title:Headers,type:array},data:{items:{items:{},type:array},title:Data,type:array},metadata:{anyOf:{additionalProperties:{anyOf:{items:{},type:array},{type:null}},type:object},{type:null},default:null,title:Metadata}},required:headers,data,title:DataframeData,type:object},python_type:{type:Dict(headers: Liststr, data: ListListAny, metadata: Dict(str, ListAny | None) | None),description:},component:Dataframe},show_api:true}},unnamed_endpoints:{}};/script> link relpreconnect hrefhttps://fonts.googleapis.com /> link relpreconnect hrefhttps://fonts.gstatic.com crossoriginanonymous /> script srchttps://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.1/iframeResizer.contentWindow.min.js async >/script> 
script typemodule crossorigin src./assets/index-D5ROCp7B.js>/script> link relstylesheet crossorigin href./assets/index-DuXXhepF.css> /head> body style width: 100%; margin: 0; padding: 0; display: flex; flex-direction: column; flex-grow: 1; > gradio-app control_page_titletrue embedfalse eagertrue styledisplay: flex; flex-direction: column; flex-grow: 1 > /gradio-app> script> const ce document.getElementsByTagName(gradio-app); if (ce0) { ce0.addEventListener(domchange, () > { document.body.style.padding 0; }); document.body.style.padding 0; } /script> /body>/html>
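Because these endpoints are published with show_api: true, they can in principle be called programmatically. A minimal sketch using the gradio_client package (the Space ID comes from the canonical link header in the port-443 response; the endpoint name, parameter names, and defaults are taken from the /filter_df_l2_7 schema above — exact keyword-argument support depends on the client version installed):

```python
# Minimal sketch, assuming the Space is reachable and a gradio_client
# version compatible with the server's Gradio 4.x API is installed
# (pip install gradio_client).
from gradio_client import Client

# Space ID taken from the canonical `link` header in the response.
client = Client("opencompass/openvlm_video_leaderboard")

# Parameters and defaults mirror the /filter_df_l2_7 schema.
result = client.predict(
    dataset_name="MMBench-Video",
    fields=["Overall", "Perception", "Reasoning"],
    model_size=["<10B", "10B-20B", "20B-40B", ">40B", "Unknown"],
    model_type=["API", "OpenSource"],
    api_name="/filter_df_l2_7",
)

# The return value matches the DataframeData schema: headers + row data.
print(result["headers"])
for row in result["data"][:5]:
    print(row)
```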
Subdomains
Domain | Date | IP
memex-in-wainsfwillustrious-v110.hf.space | 2025-09-30 | 44.219.145.144
allknowingroger-image-models-test190.hf.space | 2025-02-18 | 52.55.55.177
hysts-controlnet-v1-1.hf.space | 2025-02-18 | 3.215.212.165
stabilityai-stable-diffusion-1.hf.space | 2025-02-16 | 35.174.94.150
kazuk-youtube-whisper-01.hf.space | 2025-03-02 | 18.209.154.186
minimaxai-minimax-m1.hf.space | 2025-09-17 | 52.201.90.248
mistral-7b-instruct-v0-2.hf.space | 2025-09-21 | 18.233.176.84
fffiloni-clip-interrogator-2.hf.space | 2025-09-26 | 54.145.224.52
videocrafter-videocrafter2.hf.space | 2025-09-28 | 3.221.99.236
gradio-space-api-fetcher-v2.hf.space | 2025-02-12 | 34.237.237.210
vision-cair-minigpt-v2.hf.space | 2025-09-26 | 54.145.224.52
dromerosm-groq-llama3.hf.space | 2025-09-23 | 3.221.99.236
lixin4ever-videorefer-videollama3.hf.space | 2025-09-14 | 34.206.241.91
gradio-hello-world-4.hf.space | 2025-09-25 | 3.221.99.236
abidlabs-trackio-1234.hf.space | 2025-09-28 | 3.221.99.236
allknowingroger-image-models-test94.hf.space | 2025-02-17 | 98.85.138.39
vision-cair-minigpt4.hf.space | 2025-09-26 | 3.221.99.236
ysharma-chatgpt4.hf.space | 2025-02-13 | 52.3.84.195
playgroundai-playground-v2-5.hf.space | 2025-08-29 | 3.214.53.138
mrfakename-sync-f5.hf.space | 2025-09-22 | 18.233.176.84
jpgallegoar-spanish-f5.hf.space | 2025-09-26 | 3.221.99.236
flagrantia-character-select-saa.hf.space | 2025-09-30 | 3.221.99.236
fabiogra-moseca.hf.space | 2025-09-21 | 34.225.178.167
starcoder-15b.hf.space | 2025-08-20 | 54.208.116.228
nari-labs-dia-1-6b.hf.space | 2025-09-19 | 34.197.61.212
aidc-ai-ovis2-16b.hf.space | 2025-09-10 | 3.214.53.138
nous-hermes-2-mistral-7b.hf.space | 2025-08-30 | 3.214.53.138
gradio-map-airbnb.hf.space | 2025-09-22 | 34.196.172.139
algoworks-image-face-upscale-restoration-gfpgan-pub.hf.space | 2025-02-09 | 44.215.187.175
bga-spacio-v1pub.hf.space | 2025-09-26 | 3.221.99.236
jordielebowen-train-flux-lora-ease-public.hf.space | 2025-09-30 | 54.145.224.52
kkazahra888-unfilteredai-nsfw-gen-v2.static.hf.space | 2025-09-14 | 3.175.34.121
lalashechka-img2img-6.static.hf.space | 2025-09-14 | 3.175.34.4
lalashechka-img2img-8.static.hf.space | 2025-09-14 | 3.175.34.121
adobfile-mailakdenizedutrowaauthlogonspxreplacec-6524a2d.static.hf.space | 2025-09-04 | 3.175.34.16
hesamation-primer-llm-embedding.static.hf.space | 2025-07-25 | 3.175.34.16
habolt-gabriel-habolt-immobilier.static.hf.space | 2025-09-14 | 3.175.34.16
niansuh-ps.static.hf.space | 2025-09-16 | 3.175.34.121
ateeqq-mistral-7b-instruct-v0-2-chatbot.static.hf.space | 2025-09-04 | 3.175.34.117
ilcve21-sparc3d.hf.space | 2025-09-29 | 44.219.145.144
cavargas10-trellis-imagen3d.hf.space | 2025-09-30 | 54.145.224.52
lightricks-ltx-video-distilled.hf.space | 2025-08-17 | 34.225.229.167
nihalgazi-flux-pro-unlimited.hf.space | 2025-09-06 | 3.214.53.138
multimodalart-ip-adapter-faceid.hf.space | 2025-09-26 | 3.221.99.236
instantx-instantid.hf.space | 2025-09-28 | 54.145.224.52
gradio-hello-world.hf.space | 2025-09-26 | 54.145.224.52
cohereforai-c4ai-command.hf.space | 2025-09-28 | 3.221.99.236
codellama-codellama-playground.hf.space | 2025-02-09 | 34.237.237.210
huggingface-inference-playground.hf.space | 2025-09-21 | 18.233.176.84
openevals-find-a-leaderboard.hf.space | 2025-08-26 | 3.232.180.39
lmarena-ai-chatbot-arena-leaderboard.hf.space | 2025-09-23 | 3.221.99.236
lmsys-chatbot-arena-leaderboard.hf.space | 2025-02-12 | 3.222.39.245
mteb-leaderboard.hf.space | 2025-02-10 | 44.215.187.175
optimum-llm-perf-leaderboard.hf.space | 2025-09-27 | 54.145.224.52
huggingfaceh4-open-llm-leaderboard.hf.space | 2025-02-12 | 3.222.39.245
open-llm-leaderboard-open-llm-leaderboard.hf.space | 2025-09-26 | 44.219.145.144
opencompass-open-vlm-leaderboard.hf.space | 2025-09-26 | 44.219.145.144
opencompass-openvlm-video-leaderboard.hf.space | 2025-09-30 | 54.145.224.52
hf-audio-open-asr-leaderboard.hf.space | 2025-09-30 | 54.145.224.52
bigcode-bigcode-models-leaderboard.hf.space | 2025-09-26 | 3.221.99.236
galileo-ai-agent-leaderboard.hf.space | 2025-09-22 | 18.233.176.84
ai-secure-llm-trustworthy-leaderboard.hf.space | 2025-09-26 | 3.221.99.236
markovidrih-image-manipulation-face.hf.space | 2025-02-11 | 34.237.237.210
hilley-chattts-openvoice.hf.space | 2025-08-24 | 54.208.116.228
oup-ai-lab-qa-with-image.hf.space | 2025-02-17 | 3.220.39.102
adrek-text-to-image.hf.space | 2025-09-04 | 3.214.53.138
hf-accelerate-model-memory-usage.hf.space | 2025-09-16 | 44.219.132.240
pseudolab-huggingface-korea-theme.hf.space | 2025-02-17 | 98.85.138.39
docs4you-tre.hf.space | 2025-09-21 | 18.233.176.84
circl-vulnerability-severity-classification-roberta-base.hf.space | 2025-08-31 | 3.214.53.138
littletest-sorryplease.hf.space | 2025-09-25 | 54.145.224.52
breezedeus-cnocr-demo-private.hf.space | 2025-09-26 | 54.145.224.52
cvlab-zero123-live.hf.space | 2025-09-22 | 18.233.176.84
dylanebert-igf.hf.space | 2025-02-10 | 44.215.187.175
limcheekin-yarn-mistral-7b-128k-gguf.hf.space | 2025-02-09 | 34.237.237.210
course-demos-remove-bg.hf.space | 2025-09-26 | 54.145.224.52
multimodalart-self-forcing.hf.space | 2025-09-26 | 54.145.224.52
innoai-edge-tts-text-to-speech.hf.space | 2025-08-21 | 52.202.40.84
lmsys-mt-bench.hf.space | 2025-02-09 | 44.215.187.175
felladrin-minisearch.hf.space | 2025-09-30 | 54.145.224.52
tencentarc-instantmesh.hf.space | 2025-08-26 | 54.208.116.228
shariqfarooq-zoedepth.hf.space | 2025-09-07 | 52.86.163.242
devquasar-trashcanai.hf.space | 2025-09-30 | 54.145.224.52
api.hf.space | 2025-02-20 | 3.215.212.165
thestinger-uvr5-ui.hf.space | 2025-09-02 | 3.214.53.138
falcon-completions-r44ui.hf.space | 2025-08-23 | 52.202.40.84
oitsminez-rvc-v2-webui.hf.space | 2025-09-29 | 54.145.224.52
faceonlive-face-recognition-sdk.hf.space | 2025-02-17 | 52.0.30.169
suno-bark.hf.space | 2025-02-12 | 3.222.39.245
microsoft-phi-4-multimodal.hf.space | 2025-09-29 | 54.145.224.52
makiai-gradio-mcp-minimal.hf.space | 2025-09-30 | 44.219.145.144
not-lain-background-removal.hf.space | 2025-09-04 | 3.214.53.138
radames-real-time-latent-consistency-model.hf.space | 2025-02-13 | 3.222.39.245
black-forest-labs-flux-1-schnell.hf.space | 2025-09-26 | 54.145.224.52
huggingfacem4-screenshot2html.hf.space | 2025-09-27 | 54.145.224.52
prodia-sdxl-stable-diffusion-xl.hf.space | 2025-02-14 | 52.55.55.177
openskyml-fast-sdxl-stable-diffusion-xl.hf.space | 2025-03-08 | 54.209.103.164
google-sdxl.hf.space | 2025-02-12 | 3.222.39.245
3daigc-lhm.hf.space | 2025-09-13 | 44.219.132.240
vokturz-can-it-run-llm.hf.space | 2025-02-08 | 3.222.39.245
rphrp1985-stable-diffusion-3-medium.hf.space | 2025-09-25 | 44.219.145.144
gradio-fake-gan.hf.space | 2025-09-04 | 44.219.8.222
clem-image-face-upscale-restoration-gfpgan.hf.space | 2025-09-26 | 54.145.224.52
pseudolab-schoolrecord-gen.hf.space | 2025-02-12 | 3.222.39.245
andrewstalin-youtube-comments-gen.hf.space | 2025-09-20 | 18.233.176.84
r3gm-aicovergen.hf.space | 2025-02-17 | 3.215.212.165
ngoctuanai-copiloten.hf.space | 2025-09-28 | 3.221.99.236
fastrtc-turn-server-login.hf.space | 2025-09-21 | 34.225.178.167
ivanmeyer-icon.hf.space | 2025-09-27 | 44.219.145.144
clebersla-rvc-v2-huggingface-version.hf.space | 2025-02-16 | 3.221.245.176
segmind-segmind-stable-diffusion.hf.space | 2025-09-26 | 44.219.145.144
stabilityai-stable-diffusion.hf.space | 2025-02-15 | 52.3.84.195
gustavosta-magicprompt-stable-diffusion.hf.space | 2025-02-16 | 35.174.94.150
multimodalart-stable-video-diffusion.hf.space | 2025-09-27 | 54.145.224.52
ap123-illusiondiffusion.hf.space | 2025-02-10 | 34.237.237.210
levihsu-ootdiffusion.hf.space | 2025-09-26 | 54.145.224.52
course-demos-speech-to-speech-translation.hf.space | 2025-02-13 | 52.3.84.195
modelscope-old-photo-restoration.hf.space | 2025-09-21 | 18.233.176.84
tonyassi-image-segmentation.hf.space | 2025-02-13 | 34.237.237.210
veb-101-uwmgi-medical-image-segmentation.hf.space | 2025-02-13 | 44.220.97.6
kahraman42-animal-species-detection.hf.space | 2025-02-10 | 34.237.237.210
fffiloni-image2sfx-comparison.hf.space | 2025-09-27 | 54.145.224.52
bhaskartripathi-pdfgpt-turbo.hf.space | 2025-02-16 | 35.174.94.150
yuntian-deng-chatgpt4turbo.hf.space | 2025-02-24 | 52.54.116.144
orderlymirror-text-to-video.hf.space | 2025-09-30 | 3.221.99.236
wrdias-comfyui-advanced-video2video.hf.space | 2025-09-19 | 34.225.178.167
andzhk-pnginfo.hf.space | 2025-02-14 | 52.3.84.195
gradio-chatinterface-streaming-echo.hf.space | 2025-09-26 | 3.221.99.236
wushuang98-direct3d-s2-v1-0-demo.hf.space | 2025-09-04 | 44.219.8.222
qwen-qwen3-demo.hf.space | 2025-09-30 | 44.219.145.144
resembleai-chatterbox-tts-demo.hf.space | 2025-09-30 | 44.219.145.144
qwen-qwen2-5-max-demo.hf.space | 2025-08-20 | 54.208.116.228
multichem-pd-mlb-dfs-roo.hf.space | 2025-09-22 | 34.225.178.167
kerimmkirac-vavoo.hf.space | 2025-09-21 | 34.225.178.167
alibaba-pai-easyphoto.hf.space | 2025-02-12 | 34.237.237.210
felixrosberg-face-swap.hf.space | 2025-09-30 | 3.221.99.236
tonyassi-face-swap.hf.space | 2025-02-14 | 52.3.84.195
coxapi-faceswap.hf.space | 2025-09-21 | 18.233.176.84
pminervini-tmp.hf.space | 2025-02-12 | 34.237.237.210
sumit7864-image-enhancer.hf.space | 2025-02-17 | 3.215.212.165
multichem-pd-dfs-portfolio-manager.hf.space | 2025-09-06 | 3.214.53.138
yuqi-gender-classifier.hf.space | 2025-02-14 | 52.3.84.195
vchitect-vlogger-showmaker.hf.space | 2024-01-30 | 3.211.30.211
pinyuchen-attention-tracker.hf.space | 2025-08-26 | 3.232.180.39
u5ername-markdown2ppt-docker.hf.space | 2025-08-21 | 54.211.9.90
vinthony-sadtalker.hf.space | 2025-02-11 | 52.3.84.195
mapitanywhere-mapper.hf.space | 2025-09-30 | 44.219.145.144
openai-whisper.hf.space | 2025-02-14 | 52.55.55.177
distil-whisper-whisper-vs-distil-whisper.hf.space | 2025-02-12 | 3.222.39.245
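The table is easiest to read when grouped by resolved IP, since many Spaces sit behind the same shared AWS load-balancer addresses (e.g. 54.145.224.52 and 3.221.99.236 recur throughout). A short illustrative sketch, not from the source page; `rows` stands in for the full set of (domain, date, ip) triples above:

```python
# Group the subdomain observations by resolved IP to surface shared
# frontends. The three sample rows are taken from the table above;
# in practice `rows` would hold all 149 triples.
from collections import defaultdict

rows = [
    ("opencompass-openvlm-video-leaderboard.hf.space", "2025-09-30", "54.145.224.52"),
    ("fffiloni-clip-interrogator-2.hf.space", "2025-09-26", "54.145.224.52"),
    ("videocrafter-videocrafter2.hf.space", "2025-09-28", "3.221.99.236"),
    # ... remaining rows from the table ...
]

by_ip = defaultdict(list)
for domain, date, ip in rows:
    by_ip[ip].append((date, domain))

# Print the busiest IPs first, with their observed subdomains.
for ip, hits in sorted(by_ip.items(), key=lambda kv: -len(kv[1])):
    print(f"{ip}: {len(hits)} subdomain(s)")
    for date, domain in sorted(hits):
        print(f"  {date}  {domain}")
```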
View on OTX | View on ThreatMiner
Data with thanks to AlienVault OTX, VirusTotal, Malwr and others.