About

Ana Paiva (publishes as Ana C. R. Paiva) is an Assistant Professor at the Informatics Engineering Department of the Faculty of Engineering of the University of Porto (FEUP), where she has worked since 1999. She is a researcher at INESC TEC in the Software Engineering area and a member of the Software Engineering research group, which gathers researchers and postgraduate students with common interests in software engineering. She teaches subjects such as Software Testing, Formal Methods and Software Engineering, among others. She has a PhD in Electrical and Computer Engineering from FEUP with a thesis titled "Automated Specification Based Testing of Graphical User Interfaces". Her expertise is in the implementation and automation of the model-based testing process. She has been developing research work in collaboration with the Foundations of Software Engineering research group within Microsoft Research, where she had the opportunity to extend Microsoft's model-based testing tool, Spec Explorer, for GUI testing. She is PI of a National Science Foundation-funded project on Pattern-Based GUI Testing (PBGT). She is a member of the general assembly of the PSTQB (Portuguese Software Testing Qualifications Board) board, a member of the TBoK, Glossary, and MBT Examination Working Groups of the ISTQB (International Software Testing Qualifications Board), a member of the Council of the Department of Informatics Engineering, and a member of the Executive Committee of the Department of Informatics Engineering.
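
As a brief, hypothetical illustration of what model-based GUI testing involves (a minimal sketch, not code from Spec Explorer or the PBGT project; the model, states and actions are assumed for the example): test sequences are derived automatically from a behavioural model of the interface, here a tiny finite-state machine of a login dialog, and each abstract sequence would then be mapped to concrete GUI events and executed against the application.

```java
import java.util.*;

// Hypothetical sketch of model-based test generation for a GUI.
public class LoginModelSketch {
    // Transitions of the model: (state, action) -> next state (illustrative).
    static Map<String, Map<String, String>> fsm = Map.of(
            "LoggedOut", Map.of("enterValidCredentials", "LoggedIn",
                                "enterInvalidCredentials", "Error"),
            "Error",     Map.of("dismissError", "LoggedOut"),
            "LoggedIn",  Map.of("logout", "LoggedOut"));

    /** Enumerate all action sequences of a given length starting in a state. */
    static List<List<String>> sequences(String state, int depth) {
        List<List<String>> result = new ArrayList<>();
        if (depth == 0) { result.add(new ArrayList<>()); return result; }
        for (var t : fsm.getOrDefault(state, Map.of()).entrySet()) {
            for (List<String> rest : sequences(t.getValue(), depth - 1)) {
                List<String> seq = new ArrayList<>();
                seq.add(t.getKey());
                seq.addAll(rest);
                result.add(seq);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Each generated abstract sequence would be mapped to concrete GUI
        // events (clicks, key presses) and executed against the application.
        sequences("LoggedOut", 2).forEach(System.out::println);
    }
}
```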


Details

  • Name

    Ana Cristina Paiva
  • Role

    Senior Researcher
  • Since

    1st February 2014
Publications

2024

Exploring students' opinion on software testing courses

Authors
Cammaerts, F; Tramontana, P; Paiva, ACR; Flores, N; Ricós, FP; Snoeck, M;

Publication
PROCEEDINGS OF 2024 28TH INTERNATIONAL CONFERENCE ON EVALUATION AND ASSESSMENT IN SOFTWARE ENGINEERING, EASE 2024

Abstract
Software testing is an important part of the software development lifecycle. As it is a highly sought-after skill in the industry, it is not surprising that there has been a great deal of research into the teaching of software testing in higher education. Most of this research proposes or evaluates pedagogical approaches or software testing tools to assist teachers in educating the next generation of software engineers. These evaluations are often limited to measuring teachers' opinions about the use of a novel pedagogical approach or an educational tool and students' acceptance and performance in terms of desired software testing skills. While tools and pedagogical approaches address specific aspects of a course, to date, little attention has been paid to the opinions of the students about all the individual aspects of a software testing course. This paper aims to address this missing student perspective by taking a holistic view of software testing course designs. To address this gap, an exploratory study was performed by distributing a questionnaire to 103 students from ten different courses to gauge their opinions on a software testing course they are enrolled in. The results show that students generally have a positive perception of the different aspects of their software testing course. However, several areas for improvement were suggested based on the gathered data.

2023

Collecting cognitive strategies applied by students during test case design

Authors
Cammaerts, F; Snoeck, M; Paiva, ACR;

Publication
27TH INTERNATIONAL CONFERENCE ON EVALUATION AND ASSESSMENT IN SOFTWARE ENGINEERING, EASE 2023

Abstract
It is important to properly test developed software because this may contribute to fewer bugs going unreported in deployed software. Often, little attention is spent on the topic of software testing in curricula, yielding graduate students without adequate preparation to deal with the quality standards required by the industry. This problem could be tackled by introducing bite-sized software testing education capsules that allow teachers to introduce software testing to their students in a less time-consuming manner and with a hands-on component that will facilitate learning. In order to design appropriate software testing educational tools, it is necessary to consider both the software testing needs of the industry and the cognitive models of students. This work-in-progress paper proposes an experimental design to gain an understanding of the cognitive strategies used by students during test case design based on real-life cases. Ultimately, the results of the experiment will be used to develop educational support for teaching software testing.

2023

ENACTEST project - European Innovation Alliance for Testing Education

Authors
Marín, B; Vos, TEJ; Snoeck, M; Paiva, ACR; Fasolino, AR;

Publication
Proceedings of the Research Projects Exhibition Papers Presented at the 35th International Conference on Advanced Information Systems Engineering (CAiSE 2023), Zaragoza, Spain, June 12-16, 2023.

Abstract

2023

An Approach to Regression Testing Selection based on Code Changes and Smells

Authors
Mori, A; Paiva, ACR; Souza, SRS;

Publication
PROCEEDINGS OF THE 8TH BRAZILIAN SYMPOSIUM ON SYSTEMATIC AND AUTOMATED SOFTWARE TESTING, SAST 2023

Abstract
Regression testing is a software engineering maintenance activity that involves re-executing test cases on a modified software system to check whether code changes introduce new faults. However, it can be time-consuming and resource-intensive, especially for large systems. Regression testing selection techniques can help address this issue by selecting a subset of test cases to run. The change-based technique selects a subset of test cases based on the modified software classes, reducing the test suite size. Thereby, it will cover a smaller number of classes, decreasing the efficiency of the test suite to reveal design flaws. From this perspective, code smells are known to identify poor design and threaten the quality of software systems. In this study, we propose an approach to combine code change and smell to select regression tests and present two new techniques: code smell based and code change and smell. Additionally, we developed the Regression Testing Selection Tool (RTST) to automate the selection process. We empirically evaluated the approach in Defects4J projects by comparing the new techniques' effectiveness with the change-based as a baseline. The results show that the change-based technique achieves the highest reduction rate in the test suite size but with less class coverage. On the other hand, test cases selected using code smells and changed classes combined can potentially find more bugs. The code smell-based technique provides a comparable class coverage to the code change and smell approach. Our findings highlight the benefits of incorporating code smells in regression testing selection and suggest opportunities for improving the efficiency and effectiveness of regression testing.
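
As a rough illustration of the selection idea described in the abstract (a minimal sketch under assumed inputs, not the authors' RTST tool; the class names, coverage map and selection heuristic are illustrative assumptions): a test case is kept when it covers a class that either changed or exhibits a code smell.

```java
import java.util.*;

// Hypothetical sketch of combining change-based and smell-based
// regression test selection.
public class RegressionSelectionSketch {

    /** Map from each test case to the production classes it covers (assumed). */
    static Map<String, Set<String>> coverage = Map.of(
            "LoginTest",    Set.of("LoginService", "SessionManager"),
            "CheckoutTest", Set.of("CheckoutService", "PriceCalculator"),
            "ReportTest",   Set.of("ReportGenerator"));

    /** Select tests that cover either a changed class or a smelly class. */
    static Set<String> select(Set<String> changedClasses, Set<String> smellyClasses) {
        Set<String> targets = new HashSet<>(changedClasses);
        targets.addAll(smellyClasses);               // "code change and smell" criterion
        Set<String> selected = new TreeSet<>();
        for (var entry : coverage.entrySet()) {
            if (!Collections.disjoint(entry.getValue(), targets)) {
                selected.add(entry.getKey());
            }
        }
        return selected;
    }

    public static void main(String[] args) {
        // Changed classes would come from a VCS diff; smelly classes from a
        // smell detector (both assumed here for the example).
        Set<String> changed = Set.of("CheckoutService");
        Set<String> smelly  = Set.of("SessionManager");
        System.out.println(select(changed, smelly)); // [CheckoutTest, LoginTest]
    }
}
```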

2022

ENACTEST - European Innovation Alliance for Testing Education

Authors
Marín, B; Vos, TEJ; Paiva, ACR; Fasolino, AR; Snoeck, M;

Publication
Joint Proceedings of RCIS 2022 Workshops and Research Projects Track co-located with the 16th International Conference on Research Challenges in Information Science (RCIS 2022), Barcelona, Spain, May 17-20, 2022.

Abstract
Testing software is very important, but not done well, resulting in problematic and erroneous software applications. The cause stems from a skills mismatch between what is needed in industry, the learning needs of students, and the way testing is currently being taught at higher and vocational education institutes. The goal of this project is to identify and design seamless teaching materials for testing that are aligned with industry and learning needs. To represent the entire socio-economic environment that will benefit from the results, this project consortium is composed of a diverse set of partners ranging from universities to small enterprises. The project starts with research in sensemaking and cognitive models when doing and learning testing. Moreover, a study will be done to identify the needs of industry for training and knowledge transfer processes for testing. Based on the outcomes of this research and the study, we will design and develop capsules on teaching software testing including the instructional materials that take into account the cognitive models of students and the industry needs. Finally, we will validate these teaching testing capsules developed during the project.

Supervised Theses

2023

Data-Driven Hint Generation for Alloy using Historical Student Submissions

Author
Ana Inês Oliveira de Barros

Institution
UP-FEUP

2023

Mutation Operators for Android Apps

Author
Ana Rita Cheio da Veiga

Institution
UP-FEUP

2023

Mutation Testing Cost Reduction Techniques for Java Applications

Author
David Roberto Cravo da Mata

Institution
UP-FEUP

2022

Structural Clustering of Web Pages for Usage Pattern Mining

Author
Pedro Nuno de Oliveira Duarte Carvalho

Institution
UP-FEUP