Social media provide a window onto our lives, making it possible to learn how people from different places, with different backgrounds, ages, and genders use language. In this work we exploit a newly-created Arabic dataset with ground-truth age and gender labels to learn these attributes both individually and in a multi-task setting at the sentence level. Our models are based on variations of deep bidirectional neural networks. More specifically, we build models with gated recurrent units (GRUs) and bidirectional encoder representations from transformers (BERT). We show the utility of multi-task learning (MTL) on the two tasks and identify task-specific attention as a superior choice in this context. We also find that a single-task BERT model outperforms our best MTL models on the two tasks. We report tweet-level accuracy of 51.43% on the age task (three-way) and 65.30% on the gender task (binary), both of which outperform our baselines by a large margin. Our models are language-agnostic, and so can be applied to other languages.
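To make the multi-task setup with task-specific attention concrete, the following is a minimal sketch (not the authors' released code; all layer sizes, names, and the equal loss weighting are illustrative assumptions) of a shared BiGRU encoder with a separate attention head and classifier for each of the age (three-way) and gender (binary) tasks:

```python
# Hypothetical sketch of the MTL architecture described in the abstract:
# a shared embedding + BiGRU encoder, one additive-attention pooling head
# per task, and per-task softmax classifiers trained with a summed loss.
import torch
import torch.nn as nn


class TaskAttention(nn.Module):
    """Additive attention pooling over BiGRU hidden states for one task."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(states), dim=1)  # (batch, seq_len, 1)
        return (weights * states).sum(dim=1)                 # (batch, hidden_dim)


class MultiTaskBiGRU(nn.Module):
    """Shared encoder with task-specific attention and output heads."""
    def __init__(self, vocab_size: int, emb_dim: int = 300, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden_dim
        self.age_attn = TaskAttention(enc_dim)
        self.gender_attn = TaskAttention(enc_dim)
        self.age_head = nn.Linear(enc_dim, 3)     # three age brackets
        self.gender_head = nn.Linear(enc_dim, 2)  # binary gender

    def forward(self, token_ids: torch.Tensor):
        states, _ = self.encoder(self.embed(token_ids))
        age_logits = self.age_head(self.age_attn(states))
        gender_logits = self.gender_head(self.gender_attn(states))
        return age_logits, gender_logits


# Joint training step on a dummy batch: the two cross-entropy losses are
# simply summed (equal task weighting is an assumption of this sketch).
model = MultiTaskBiGRU(vocab_size=50_000)
tokens = torch.randint(1, 50_000, (8, 30))  # 8 tweets, 30 tokens each
age_logits, gender_logits = model(tokens)
age_y = torch.randint(0, 3, (8,))
gender_y = torch.randint(0, 2, (8,))
loss = (nn.functional.cross_entropy(age_logits, age_y)
        + nn.functional.cross_entropy(gender_logits, gender_y))
loss.backward()
```

In the single-task BERT variant reported above, the shared BiGRU encoder would instead be replaced by a pretrained BERT encoder fine-tuned separately per task, with one classification head on the pooled representation.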