Alexis Conneau
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
A Conneau, D Kiela, H Schwenk, L Barrault, A Bordes
EMNLP 2017 (Outstanding Paper), 2017
Cross-lingual Language Model Pretraining
A Conneau, G Lample
NeurIPS 2019 (Spotlight), 2019
Word Translation Without Parallel Data
A Conneau, G Lample, MA Ranzato, L Denoyer, H Jégou
ICLR 2018, 2017
Very Deep Convolutional Networks for Text Classification
A Conneau, H Schwenk, L Barrault, Y LeCun
EACL 2017, 2016
Unsupervised Cross-lingual Representation Learning at Scale
A Conneau, K Khandelwal, N Goyal, V Chaudhary, G Wenzek, F Guzmán, ...
ACL 2020, 2019
Unsupervised Machine Translation Using Monolingual Corpora Only
G Lample, A Conneau, L Denoyer, MA Ranzato
ICLR 2018, 2017
Phrase-Based & Neural Unsupervised Machine Translation
G Lample, M Ott, A Conneau, L Denoyer, MA Ranzato
EMNLP 2018 (Best Paper), 2018
What you can cram into a single vector: Probing sentence embeddings for linguistic properties
A Conneau, G Kruszewski, G Lample, L Barrault, M Baroni
ACL 2018, 2018
XNLI: Evaluating Cross-lingual Sentence Representations
A Conneau, G Lample, R Rinott, A Williams, SR Bowman, H Schwenk, ...
EMNLP 2018, 2018
SentEval: An Evaluation Toolkit for Universal Sentence Representations
A Conneau, D Kiela
LREC 2018, 2018
Product Embeddings Using Side-Information for Recommendation
F Vasile, E Smirnova, A Conneau
WWW 2016, 2016
CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data
G Wenzek, MA Lachaux, A Conneau, V Chaudhary, F Guzmán, A Joulin, ...
LREC 2020, 2019
Emerging Cross-lingual Structure in Pretrained Language Models
A Conneau, S Wu, H Li, L Zettlemoyer, V Stoyanov
ACL 2020, 2019
Unsupervised Cross-lingual Representation Learning for Speech Recognition
A Conneau, A Baevski, R Collobert, A Mohamed, M Auli
Interspeech 2021, 2020
Learning Visually Grounded Sentence Representations
D Kiela, A Conneau, A Jabri, M Nickel
NAACL 2018, 2017
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning
B Gunel, J Du, A Conneau, V Stoyanov
ICLR 2021, 2020
Self-training Improves Pre-training for Natural Language Understanding
J Du, E Grave, B Gunel, V Chaudhary, O Celebi, M Auli, V Stoyanov, ...
NAACL 2021, 2020
Self-training and Pre-training Are Complementary for Speech Recognition
Q Xu, A Baevski, T Likhomanenko, P Tomasello, A Conneau, R Collobert, ...
ICASSP 2021, 2021
Unsupervised Speech Recognition
A Baevski, WN Hsu, A Conneau, M Auli
NeurIPS 2021 (Oral), 2021
Multilingual Speech Translation with Efficient Finetuning of Pretrained Models
X Li, C Wang, Y Tang, C Tran, Y Tang, J Pino, A Baevski, A Conneau, ...
ACL 2021, 2020