
Research Repository


Outputs (5)

Language as a latent sequence: Deep latent variable models for semi-supervised paraphrase generation (2023)
Journal Article
Yu, J., Cristea, A. I., Harit, A., Sun, Z., Aduragba, O. T., Shi, L., & Al Moubayed, N. (2023). Language as a latent sequence: Deep latent variable models for semi-supervised paraphrase generation. AI Open, 4, 19-32. https://doi.org/10.1016/j.aiopen.2023.05.001

This paper explores deep latent variable models for semi-supervised paraphrase generation, where the missing target pair for unlabelled data is modelled as a latent paraphrase sequence. We present a novel unsupervised model named variational sequence...

Efficient Uncertainty Quantification for Multilabel Text Classification (2022)
Presentation / Conference Contribution
Yu, J., Cristea, A. I., Harit, A., Sun, Z., Aduragba, O. T., Shi, L., & Al Moubayed, N. (2022, July). Efficient Uncertainty Quantification for Multilabel Text Classification. Presented at 2022 International Joint Conference on Neural Networks (IJCNN), Padova, Italy

Despite rapid advances in modern artificial intelligence (AI), there is a growing concern regarding its capacity to be explainable, transparent, and accountable. One crucial step towards such AI systems involves reliable and efficient uncertainty quantification...

Contrastive Learning with Heterogeneous Graph Attention Networks on Short Text Classification (2022)
Presentation / Conference Contribution
Sun, Z., Harit, A., Cristea, A. I., Yu, J., Shi, L., & Al Moubayed, N. (2022, July). Contrastive Learning with Heterogeneous Graph Attention Networks on Short Text Classification. Presented at 2022 International Joint Conference on Neural Networks (IJCNN), Padova, Italy

Graph neural networks (GNNs) have attracted extensive interest in text classification tasks due to their expected superior performance in representation learning. However, most existing studies adopted the same semi-supervised learning setting as the...

INTERACTION: A Generative XAI Framework for Natural Language Inference Explanations (2022)
Presentation / Conference Contribution
Yu, J., Cristea, A. I., Harit, A., Sun, Z., Aduragba, O. T., Shi, L., & Al Moubayed, N. (2022, July). INTERACTION: A Generative XAI Framework for Natural Language Inference Explanations. Presented at 2022 International Joint Conference on Neural Networks (IJCNN), Padova, Italy

XAI with natural language processing aims to produce human-readable explanations as evidence for AI decision-making, which addresses explainability and transparency. However, from an HCI perspective, the current approaches only focus on delivering a s...

A Generative Bayesian Graph Attention Network for Semi-supervised Classification on Scarce Data (2021)
Presentation / Conference Contribution
Sun, Z., Harit, A., Yu, J., Cristea, A., & Al Moubayed, N. (2021, July). A Generative Bayesian Graph Attention Network for Semi-supervised Classification on Scarce Data. Presented at IEEE International Joint Conference on Neural Networks (IJCNN 2021), Virtual

This research focuses on semi-supervised classification tasks, specifically for graph-structured data under data-scarce situations. It is known that the performance of conventional supervised graph convolutional models is mediocre at classification tasks...