DSTI at LLMs4OL 2024 Task A: Intrinsic Versus Extrinsic Knowledge for Type Classification

Applications on WordNet and GeoNames Datasets

Authors

H. Abi Akl

DOI:

https://doi.org/10.52825/ocp.v4i.2492

Keywords:

Large Language Models, Ontology Learning, Semantic Web, Knowledge Representation, Semantic Primes

Abstract

We introduce semantic towers, an extrinsic knowledge representation method, and compare it to the intrinsic knowledge of large language models for ontology learning. Our experiments reveal a trade-off between performance and semantic grounding when extrinsic knowledge is used in place of a fine-tuned model's intrinsic knowledge. We report our findings on the Large Language Models for Ontology Learning (LLMs4OL) 2024 challenge.
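The type-classification setting the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: here a hypothetical "semantic tower" for a type is simply the mean embedding of a few seed terms for that type, and a term is assigned the type whose tower is nearest by cosine similarity. The embeddings below are toy vectors; a real system would obtain them from a sentence-embedding model.

```python
import math

def mean_vector(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def build_towers(seed_embeddings):
    """One 'tower' per type: the mean embedding of its seed terms.

    seed_embeddings: {type_label: [embedding, ...]}
    """
    return {label: mean_vector(vecs) for label, vecs in seed_embeddings.items()}

def classify(term_embedding, towers):
    """Assign the type whose tower is most cosine-similar to the term."""
    return max(towers, key=lambda label: cosine(term_embedding, towers[label]))

# Toy example: two WordNet-style types in a 3-d embedding space.
towers = build_towers({
    "noun": [[1.0, 0.1, 0.0], [0.9, 0.0, 0.1]],
    "verb": [[0.0, 1.0, 0.1], [0.1, 0.9, 0.0]],
})
print(classify([0.8, 0.2, 0.0], towers))  # → noun
```

The sketch only conveys the extrinsic idea, namely classifying against fixed external representations rather than relying on knowledge stored in a fine-tuned model's weights; the names `build_towers` and `classify` and the seed-term averaging are assumptions for illustration.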

Published

2024-10-02

How to Cite

Abi Akl, H. (2024). DSTI at LLMs4OL 2024 Task A: Intrinsic Versus Extrinsic Knowledge for Type Classification: Applications on WordNet and GeoNames Datasets. Open Conference Proceedings, 4, 93–101. https://doi.org/10.52825/ocp.v4i.2492

Section

LLMs4OL 2024 Task Participant Papers