Kilich committed on
Commit
dbbf2c4
·
1 Parent(s): ebcfe72

Update index.html

Files changed (1)
  1. index.html +227 -18
index.html CHANGED
@@ -1,19 +1,228 @@
- <!DOCTYPE html>
  <html>
- <head>
- <meta charset="utf-8" />
- <meta name="viewport" content="width=device-width" />
- <title>My static Space</title>
- <link rel="stylesheet" href="style.css" />
- </head>
- <body>
- <div class="card">
- <h1>Welcome to your static Space!</h1>
- <p>You can modify this app directly by editing <i>index.html</i> in the Files and versions tab.</p>
- <p>
- Also don't forget to check the
- <a href="https://huggingface.co/docs/hub/spaces" target="_blank">Spaces documentation</a>.
- </p>
- </div>
- </body>
- </html>
  <html>
+
+ <head lang="en">
+ <meta charset="UTF-8">
+ <meta http-equiv="x-ua-compatible" content="ie=edge">
+
+ <title>Affective VisDial</title>
+
+ <meta name="description" content="">
+
+ <meta name="viewport" content="width=device-width, initial-scale=1">
+
+ <link rel="apple-touch-icon" href="apple-touch-icon.png">
+
+ <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css">
+ <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/font-awesome/4.4.0/css/font-awesome.min.css">
+ <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.8.0/codemirror.min.css">
+ <link rel="stylesheet" href="assets/css/app.css">
+
+ <link rel="stylesheet" href="assets/css/bootstrap.min.css">
+
+ <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
+ <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/js/bootstrap.min.js"></script>
+ <script src="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.8.0/codemirror.min.js"></script>
+ <script src="https://cdnjs.cloudflare.com/ajax/libs/clipboard.js/1.5.3/clipboard.min.js"></script>
+ <script src="js/app.js"></script>
+ </head>
+
+ <body>
+ <div class="container">
+ <div class="row">
+ <h2 class="col-md-12 text-center">
+ Affective Visual Dialog: A Large-Scale Benchmark for Emotional Reasoning
+ Based on Visually Grounded Conversations<br>
+ <small></small>
+ </h2>
+ </div>
+
+ <!-- Authors List -->
+ <div class="row">
+ <div class="col-md-12 text-center">
+ <ul class="list-inline">
+ <li>
+ <a href="https://kilichbek.github.io/webpage/">
+ Kilichbek Haydarov
+ </a>
+ <br>KAUST
+ </li>
+ <li>
+ <a href="https://xiaoqian-shen.github.io/">
+ Xiaoqian Shen
+ </a>
+ <br>KAUST
+ </li>
+ <li>
+ <a href="https://avinashsai.github.io/">
+ Avinash Madasu
+ </a>
+ <br>KAUST
+ </li>
+ <li>
+ <a href="#">
+ Mahmoud Salem
+ </a>
+ <br>KAUST
+ </li>
+ <br>
+ <li>
+ <a href="https://healthunity.org/team/jia-li/">
+ Jia Li
+ </a>
+ <br>Stanford University, HealthUnity
+ </li>
+ <li>
+ <a href="https://research.google/people/GamaleldinFathyElsayed/">
+ Gamaleldin Elsayed
+ </a>
+ <br>Google DeepMind
+ </li>
+ <li>
+ <a href="https://www.mohamed-elhoseiny.com/">
+ Mohamed Elhoseiny
+ </a>
+ <br>KAUST
+ </li>
+ </ul>
+ </div>
+ </div>
+
+ <!-- Teaser -->
+ <div class="row" id="header_img">
+ <figure class="col-md-4 col-md-offset-4">
+ <img src="assets/img/web_teaser.png" class="img-responsive" alt="overview">
+ <figcaption>
+ </figcaption>
+ </figure>
+ </div>
+ <!-- Links -->
+ <div class="row">
+ <div class="col-md-6 col-md-offset-3">
+ <h3>
+ <!-- <h3 class="text-center"> -->
+ Links
+ </h3>
+ <div class="col-md-6 col-md-offset-3 text-center">
+ <ul class="nav nav-pills nav-justified">
+ <li>
+ <a href="https://arxiv.org/abs/2308.16349">
+ Paper
+ </a>
+ </li>
+ <li>
+ <a href="#">
+ Dataset (coming soon)
+ </a>
+ </li>
+ <li>
+ <a href="https://github.com/Vision-CAIR/affectiveVisDial">
+ Code
+ </a>
+ </li>
+ <!--
+ <li>
+ <a href="img/modsine.txt">
+ BibTeX
+ </a>
+ </li>
+ -->
+ <li>
+ <a href="mailto:[email protected]">
+ Contact
+ </a>
+ </li>
+ </ul>
+ </div>
+ </div>
+ </div>
+
+ <!-- End of Links -->
+
+ <!-- Abstract -->
+ <div class="row">
+ <div class="col-md-6 col-md-offset-3">
+ <h3>
+ Overview
+ </h3>
+ <p class="text-justify">
+ We introduce Affective Visual Dialog, an emotion explanation
+ and reasoning task that serves as a testbed for research on understanding
+ how emotions form in visually grounded
+ conversations. The task involves three skills:
+ (1) Dialog-based Question Answering, (2) Dialog-based Emotion Prediction,
+ and (3) Affective Emotion Explanation Generation
+ based on the dialog. Our key contribution is the collection of a
+ large-scale dataset, dubbed AffectVisDial, consisting of 50K
+ 10-turn visually grounded dialogs together with
+ concluding emotion attributions and dialog-informed textual emotion
+ explanations, amounting to a total of 27,180
+ working hours. We explain our design decisions in collecting the
+ dataset and introduce the questioner and answerer tasks
+ associated with the participants in the
+ conversation. We train and evaluate solid Affective Visual Dialog
+ baselines adapted from state-of-the-art models. Remarkably,
+ the responses generated by our models show promising emotional
+ reasoning abilities in response to visually grounded conversations.
+ </p>
+ </div>
+ </div>
+
+ <!-- Data Collection Process -->
+ <div class="row">
+ <div class="col-md-6 col-md-offset-3">
+ <h3>
+ Data Collection Process
+ </h3>
+ <!-- 16:9 aspect ratio -->
+ <div class="embed-responsive embed-responsive-16by9">
+ <iframe class="embed-responsive-item" src="https://drive.google.com/file/d/10BGIvpQH_4tkXl_QVZJf5bNQtKXhakmo/preview" allow="autoplay"></iframe>
+ </div>
+ </div>
+ </div>
+
+ <div class="row">
+ <div class="col-md-6 col-md-offset-3">
+ <h3>
+ Qualitative Results
+ </h3>
+
+ <div id="header_img">
+ <figure class="figure">
+ <img src="assets/img/dialog_based_qa.png" class="img-responsive" alt="dialog_task">
+ <figcaption class="figure-caption text-center">
+ Qualitative examples of the Dialog-Based Question Answering task. Open the image in a new tab for a better view.
+ </figcaption>
+ </figure>
+ </div>
+
+ <figure class="figure">
+ <img src="assets/img/qual_examples.png" class="img-responsive" alt="explanation_task">
+ <figcaption class="figure-caption text-center">
+ Qualitative examples of the Emotion Explanation Generation task. Open the image in a new tab for a better view.
+ </figcaption>
+ </figure>
+
+ </div>
+ </div>
+ <div class="row">
+ <div class="col-md-6 col-md-offset-3">
+ <h3>
+ Acknowledgements
+ </h3>
+ <p class="text-justify">
+ This project is funded by KAUST
+ BAS/1/1685-01-01 and the SDAIA-KAUST Center of Excellence
+ in Data Science and Artificial Intelligence. The authors express
+ their appreciation to Jack Urbanek, Sirojiddin Karimov, and Umid Nejmatullayev
+ for their valuable assistance in setting up the data collection. Lastly, the authors extend their
+ gratitude to the Amazon Mechanical Turk workers and the DeepenAI and SmartOne teams
+ for their diligent efforts; their contributions were indispensable to the successful completion of
+ this work.
+ </p>
+ </div>
+ </div>
+ </div>
+ </body>
+ </html>