| timestamp | video_id | question_id | bbox | action_id | action_name | action_type | question | options | answer | combined_action_ids |
|---|---|---|---|---|---|---|---|---|---|---|
| 0902 | 1j20qq1JyX4 | 1j20qq1JyX4_0902_1 | [0.002, 0.118, 0.714, 0.977] | 12 | stand | PERSON_MOVEMENT | What position has the person on the left taken in this frame? | ["A) sit", "B) walk", "C) bend/bow (at the waist)", "D) stand"] | D | null |
| 0902 | 1j20qq1JyX4 | 1j20qq1JyX4_0902_2 | [0.002, 0.118, 0.714, 0.977] | 79 | talk to (e.g., self, a person, a group) | PERSON_INTERACTION | How would you characterize the social dynamic between the person on the left and others? | ["A) sing to (e.g., self, a person, a group)", "B) watch (a person)", "C) talk to (e.g., self, a person, a group)", "D) listen to (a person)"] | C | null |
| 0902 | 1j20qq1JyX4 | 1j20qq1JyX4_0902_3 | [0.444, 0.054, 0.992, 0.99] | 12 | stand | PERSON_MOVEMENT | How would you categorize the person on the right's physical activity? | ["A) bend/bow (at the waist)", "B) walk", "C) sit", "D) stand"] | D | null |
| 0902 | 1j20qq1JyX4 | 1j20qq1JyX4_0902_4 | [0.444, 0.054, 0.992, 0.99] | 17 | carry/hold (an object) | OBJECT_MANIPULATION | How is the person on the right interacting with objects in this frame? | ["A) lift/pick up", "B) put down", "C) throw", "D) carry/hold (an object)"] | D | null |
| 0902 | 1j20qq1JyX4 | 1j20qq1JyX4_0902_5 | [0.444, 0.054, 0.992, 0.99] | 79 | talk to (e.g., self, a person, a group) | PERSON_INTERACTION | How would you characterize the social dynamic between the person on the right and others? | ["A) listen to (a person)", "B) sing to (e.g., self, a person, a group)", "C) watch (a person)", "D) talk to (e.g., self, a person, a group)"] | D | null |
Dataset Card for FineBench
FineBench is a large-scale, multiple-choice Video Question Answering (VQA) dataset designed specifically to evaluate the fine-grained understanding of human actions in videos. It leverages the dense spatial (bounding boxes) and temporal (timestamps) annotations from the AVA v2.2 dataset, providing ~200k questions focused on nuanced person movements, interactions, and object manipulations within long video contexts.
Dataset Details
Dataset Description
FineBench addresses a key gap in existing VQA benchmarks by focusing on fine-grained human action understanding coupled with dense spatio-temporal grounding. Based on the AVA v2.2 dataset, which annotates atomic visual actions in movie clips, FineBench automatically generates multiple-choice questions (MCQs) using a template-based approach. Each question probes specific aspects of person movement, person interaction, or object manipulation, referencing individuals using spatial descriptors derived from their bounding boxes. The dataset includes ~200k QA pairs across 64 unique source videos (derived from AVA sources, primarily movies), with an average video duration of 900 seconds and high QA density. Its primary goal is to provide a challenging benchmark for evaluating the ability of Vision-Language Models (VLMs) to precisely localize and comprehend subtle human behaviors in complex scenes over time.
- Curated by: N/A
- Language(s) (NLP): English
- License: MIT
Dataset Sources
- Repository: https://huggingface.co/datasets/FINEBENCH/FineBench
- Paper: Coming Soon
- Demo: Coming Soon
Uses
Direct Use
FineBench is primarily intended for evaluating and benchmarking Vision-Language Models (VLMs) on tasks requiring fine-grained understanding of human actions in videos; a minimal scoring sketch follows the list below. Specific use cases include:
- Assessing model capabilities in spatio-temporal reasoning regarding human actions.
- Evaluating understanding of nuanced person movement, person interaction, and object manipulation categories.
- Probing model robustness in handling multiple actors and spatial references within complex scenes.
- Analyzing model failure modes related to fine-grained comprehension (as demonstrated in the associated paper).
- Training or fine-tuning VLMs to improve fine-grained action understanding (a secondary use; FineBench is primarily designed as a benchmark).
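As a concrete example of the benchmark-evaluation use case, here is a minimal scoring sketch. It assumes predictions are supplied as a mapping from `question_id` to an option letter ("A"-"D") and that `ds` is a loaded split (see the loading sketch under Dataset Structure); both are illustrative assumptions, not a prescribed protocol.

```python
# Illustrative scoring sketch: overall and per-action-type accuracy.
# `predictions` maps question_id -> predicted option letter ("A"-"D"); this
# input format is an assumption made for the example, not a dataset requirement.
from collections import defaultdict

def score(dataset, predictions):
    correct, total = defaultdict(int), defaultdict(int)
    for ex in dataset:
        total[ex["action_type"]] += 1
        pred = predictions.get(ex["question_id"])
        if pred is not None and pred.strip().upper() == ex["answer"]:
            correct[ex["action_type"]] += 1
    overall = sum(correct.values()) / max(sum(total.values()), 1)
    per_type = {t: correct[t] / total[t] for t in total}
    return overall, per_type

# Usage with a loaded split `ds` and dummy predictions (always "A"):
# overall, per_type = score(ds, {qid: "A" for qid in ds["question_id"]})
```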
Out-of-Scope Use
FineBench is not suitable for:
- Directly inferring real-world statistics about human behavior (due to the source videos being primarily movies).
- Training models for surveillance or sensitive identity recognition: the dataset lacks the necessary labels and focuses on atomic actions in fictional content. Using it to analyze depicted sensitive actions, even fictional ones, should likewise be avoided.
Dataset Structure
FineBench is structured as a multiple-choice question-answering dataset. Each instance corresponds to a question about a specific person at a specific timestamp in a video. The key fields include:
- video_id: Identifier for the source video (e.g., 1j20qq1JyX4).
- timestamp: Timestamp indicating the relevant moment or segment in the video (e.g., 0902).
- question_id: Unique question identifier of the form {video_id}_{timestamp}_{index}.
- bbox: Normalized bounding-box coordinates [x1, y1, x2, y2] for the person the question refers to.
- action_id: Integer id (1-80) of the underlying AVA action label.
- action_name: The ground-truth action label the question is based on (null for compound questions).
- action_type: The high-level category: PERSON_MOVEMENT, PERSON_INTERACTION, or OBJECT_MANIPULATION.
- question: The generated multiple-choice question.
- options: A list of four answer options, including the correct answer and generated distractors.
- answer: The letter ("A"-"D") of the correct option.
- combined_action_ids: For compound questions, the ids of the actions combined; null otherwise.
The dataset structure ensures that each question is grounded in specific spatial regions (bounding boxes) and temporal moments (timestamps).
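A minimal sketch of loading and inspecting one example with the Hugging Face `datasets` library follows; the split name ("train"), the bounding-box convention, and the frame size in the comments are assumptions, not part of the dataset specification.

```python
# Minimal loading sketch; assumes the standard `datasets` loader and a "train" split.
from datasets import load_dataset

ds = load_dataset("FINEBENCH/FineBench", split="train")

example = ds[0]
print(example["question_id"], example["action_type"])
print(example["question"])
for option in example["options"]:      # e.g. "A) sit", "B) walk", ...
    print("  ", option)
print("answer:", example["answer"])    # the option letter, e.g. "D"

# bbox is assumed to follow AVA's normalized [x1, y1, x2, y2] convention;
# scale by a (hypothetical) frame size to get pixel coordinates.
x1, y1, x2, y2 = example["bbox"]
frame_w, frame_h = 1280, 720
print("pixel box:", (int(x1 * frame_w), int(y1 * frame_h),
                     int(x2 * frame_w), int(y2 * frame_h)))
```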
Dataset Creation
Curation Rationale
Existing VQA datasets often lack the necessary dense spatial and temporal grounding, or the specific focus on fine-grained human actions required to rigorously evaluate modern VLMs' capabilities in nuanced video understanding. As shown in analyses accompanying this dataset, even state-of-the-art VLMs struggle with precisely localizing actions and distinguishing between subtle variations in human movement and interaction. FineBench was created to directly address this gap, providing a large-scale, challenging benchmark specifically designed to probe these fine-grained understanding abilities.
Source Data
The primary source data for FineBench is the AVA (Atomic Visual Actions) v2.2 dataset (Gu et al., 2018). AVA provides dense annotations of atomic visual actions performed by humans within movie clips, including:
- Action labels (80 atomic actions).
- Bounding boxes localizing the person performing the action.
- Timestamps indicating when the action occurs.
FineBench utilizes these annotations and the corresponding video segments from AVA's source movies.
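For orientation, the sketch below parses one AVA-style annotation row. The column layout is the commonly documented AVA v2.2 CSV format and should be read as an assumption here, not a FineBench file specification; the sample row echoes values from the preview above, with an invented person_id.

```python
# Illustrative sketch of parsing AVA-style annotation rows. The column layout
# below (video_id, timestamp, x1, y1, x2, y2, action_id, person_id) is the
# commonly documented AVA v2.2 CSV format and is stated here as an assumption.
import csv
from io import StringIO

AVA_COLUMNS = ["video_id", "timestamp", "x1", "y1", "x2", "y2",
               "action_id", "person_id"]

# One hypothetical row echoing values visible in the preview above
# (the trailing person_id of 0 is invented for illustration).
sample = StringIO("1j20qq1JyX4,0902,0.002,0.118,0.714,0.977,12,0\n")

for row in csv.DictReader(sample, fieldnames=AVA_COLUMNS):
    bbox = [float(row[k]) for k in ("x1", "y1", "x2", "y2")]
    print(row["video_id"], row["timestamp"], int(row["action_id"]), bbox)
```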
Data Collection and Processing
The FineBench QA pairs were not manually collected but algorithmically generated based on the AVA v2.2 annotations. The process involved:
- Template-Based Question Generation: A comprehensive set of question templates (~70) was designed, categorized by action type (Person Movement, Object Manipulation, Person Interaction).
- Spatial Referencing: Placeholders in templates (e.g., `{person}`) were instantiated using dynamic spatial descriptors (e.g., "the leftmost person", "the person in the center", "the second person from the left") derived from AVA bounding box locations to ensure unambiguous subject reference (a small illustrative sketch of this step follows the list).
- Distractor Selection: For each question based on a ground truth AVA action, plausible incorrect answer options (distractors) were selected using a two-tiered strategy: first prioritizing semantically similar actions based on a predefined mapping, and falling back to random selection within the same action category if necessary. Compound questions were generated for simultaneous actions.
- Data Structuring: The generated questions, options, correct answer labels, and relevant metadata (video ID, timestamp, bounding box, action category) were compiled into the final dataset splits, preserving the original AVA annotations.
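To make the spatial-referencing step concrete, here is a small illustrative sketch of deriving descriptors such as "the person on the left" from normalized bounding boxes. It is not the actual generation code; the descriptor vocabulary and tie-breaking rules are simplifying assumptions.

```python
# Illustrative only: derive coarse left/center/right descriptors from the
# horizontal centers of normalized [x1, y1, x2, y2] person boxes. FineBench's
# real descriptor vocabulary and tie-breaking rules may differ.
def spatial_descriptors(bboxes):
    """Return one descriptor string per box, in the input order."""
    order = sorted(range(len(bboxes)),
                   key=lambda i: (bboxes[i][0] + bboxes[i][2]) / 2.0)
    names = {}
    n = len(order)
    for rank, idx in enumerate(order):
        if n == 1:
            names[idx] = "the person"
        elif rank == 0:
            names[idx] = "the person on the left"
        elif rank == n - 1:
            names[idx] = "the person on the right"
        else:
            names[idx] = "the person in the center"
    return [names[i] for i in range(n)]

# The two people from the 0902 preview rows above:
print(spatial_descriptors([[0.002, 0.118, 0.714, 0.977],
                           [0.444, 0.054, 0.992, 0.990]]))
# -> ['the person on the left', 'the person on the right']
```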
Who are the source data producers?
The original annotations (action labels, bounding boxes, timestamps) were created by human annotators as part of the AVA v2.2 dataset curation process (Gu et al., 2018). Details on the annotators (demographics, compensation) are available in the original AVA publications. The underlying visual data comes from movies, produced by various film studios, directors, actors, etc.
Annotation process
Described in the accompanying paper.
Who are the annotators?
- Base Annotations (Actions, Boxes): Human annotators for AVA v2.2.
- QA Pairs (Questions, Distractors): Algorithmically generated by the creators of FineBench ([N/A]).
Personal and Sensitive Information
The source videos are from commercially distributed movies, not private recordings. Therefore, the risk of exposing PII of individuals in the traditional sense is low. The dataset itself does not contain explicit PII beyond potentially identifiable actors (who are public figures). No anonymization was applied as the source material is public-domain or commercially distributed film content. However, the actions depicted (even if fictional) could potentially be sensitive depending on the context (e.g., depictions of violence, specific interactions).
Bias, Risks, and Limitations
- Bias: FineBench inherits potential biases from its source, AVA v2.2, which is based on movies.
- Limitations:
- Focuses exclusively on human actions; does not cover general scene understanding or object-centric VQA beyond human manipulation.
Citation
BibTeX:
Coming Soon