Dr. Fatemeh H. Fard

Dr. Fatemeh Hendijani Fard is an Assistant Professor at The University of British Columbia, Okanagan, Canada, where she leads the Foundational AIware Research and Development (FARD) group. Her research interests lie at the intersection of Natural Language Processing and Software Engineering, focusing on code representation learning, transfer learning for low-resource languages, and mining software repositories. Few-shot learning, retrieval-augmented generation, and large language model (LLM)-based agents are at the heart of her research, with a focus on developing more precise and smaller models.

She collaborates closely with industry, advises AI startups in the NLP area, and has served as a program committee member and reviewer for several journals and conferences, including TSE, TOSEM, EMSE, FSE, and ASE. She is co-chairing the ASE 2024 artifact evaluation track.


Research

Parameter-Efficient Fine-Tuning (PEFT) of (L)LMs

We have several projects in this area, and arXiv versions of all papers are available. See my Google Scholar.

Language models have millions or billions of parameters. PEFT techniques were introduced as an alternative to fully fine-tuning a model. We pioneered the use of PEFT techniques in software engineering, showing their advantages in bi-modal knowledge transfer (i.e., from natural language to programming languages) [MOD-X].
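To make the idea concrete, here is a minimal, hypothetical sketch of PEFT with LoRA using the Hugging Face peft library. The model name and hyperparameters are illustrative assumptions, not the exact setup from our papers.

    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
    from peft import LoraConfig, TaskType, get_peft_model

    # Assumption: any seq2seq code model works here; CodeT5 is one example.
    base = "Salesforce/codet5-base"
    tokenizer = AutoTokenizer.from_pretrained(base)
    model = AutoModelForSeq2SeqLM.from_pretrained(base)

    # Freeze the backbone and train only small low-rank update matrices.
    config = LoraConfig(
        task_type=TaskType.SEQ_2_SEQ_LM,
        r=8,                        # rank of the low-rank decomposition
        lora_alpha=16,              # scaling factor for the LoRA updates
        lora_dropout=0.1,
        target_modules=["q", "v"],  # T5-style attention projection names
    )
    model = get_peft_model(model, config)
    model.print_trainable_parameters()  # typically well under 1% of all parameters

The resulting model can then be trained with a standard fine-tuning loop; only the injected low-rank matrices receive gradients.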

Our major findings show that adapters improve model performance for low-resource programming languages; for other languages, models fine-tuned with PEFT achieve results on par with fully fine-tuned models [EMSE].

However, this improvement depends on the task (e.g., code summarization) and on the model architecture. Additionally, we can introduce SE-specific adapters.
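For illustration, a self-contained sketch of a Houlsby-style bottleneck adapter in plain PyTorch follows; the hidden size and reduction factor are assumptions chosen for readability. An SE-specific adapter would be such a module trained on software engineering data while the backbone stays frozen.

    import torch
    import torch.nn as nn

    class BottleneckAdapter(nn.Module):
        """Down-project, apply a nonlinearity, up-project, and add a residual."""
        def __init__(self, hidden_size: int = 768, reduction: int = 16):
            super().__init__()
            bottleneck = hidden_size // reduction
            self.down = nn.Linear(hidden_size, bottleneck)
            self.act = nn.GELU()
            self.up = nn.Linear(bottleneck, hidden_size)

        def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
            # The residual connection keeps the frozen backbone's representation intact.
            return hidden_states + self.up(self.act(self.down(hidden_states)))

    # Inserted after each (frozen) transformer sublayer; only the adapter's
    # parameters receive gradients, a small fraction of the full model.
    adapter = BottleneckAdapter()
    x = torch.randn(2, 32, 768)   # (batch, sequence length, hidden size)
    print(adapter(x).shape)       # torch.Size([2, 32, 768])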


Publications

Google Scholar

2024

Davit Abrahamyan and Fatemeh H. Fard. STACKRAG Agent: Improving developer answers with Retrieval-Augmented Generation. The International Conference on Software Maintenance and Evolution (ICSME) Tool Demo Track, 2024.

Iman Saberi, Fatemeh Fard, and Fuxiang Chen. Utilization of pre-trained language models for adapter-based knowledge transfer in software engineering. Empirical Software Engineering, 29(4):94, 2024.

Shawn Zhao and Fatemeh H. Fard. Empirical studies on comment generation for R. ACM Transactions on Software Engineering and Methodology, 2024 (Registered Report).


Teaching

Master of Data Science Program
  • DATA 542 (Data Wrangling), 2018, 2019, 2020, 2022, 2023, 2024
  • DATA 551 (Data Viz II), 2018, 2019, 2020, 2022, 2023, 2024
  • DATA 553 (Data Security, Privacy, Ethics), 2018, 2019
Computer Science
  • COSC 520 (Advanced Analysis of Algorithms), 2019, 2020, 2024
  • COSC 320 (Analysis of Algorithms), 2019, 2020, 2022, 2023, 2024
  • COSC 328 (Computer Networks), 2019, 2020, 2022, 2023
  • COSC 247 (Network Analysis), 2018

Team
