Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
Permanent URI for this collection: https://hdl.handle.net/11147/7148
7 results
Conference Object
MicroArc: Event-Driven Analysis and Design Method for Microservices (Elsevier B.V., 2025)
Yıldız, Ali; Demirors, Onur
The rapid development of the Internet infrastructure has enabled software applications to leverage almost unlimited, scalable resources. Microservice-based architecture has emerged as a way to harness the benefits of a distributed, cloud-based infrastructure, and event-driven architecture is a powerful approach for addressing challenges in distributed systems such as scalability, distributed data, and sharing of data at scale. In an event-driven microservice architecture, decoupled services interact by responding to events, and event streams facilitate data sharing between them. Despite these advantages, there is no de facto method for the analysis and design of systems within microservice architecture, and organizations often face difficulties in developing microservice-based systems owing to the lack of well-defined methodologies. In this study, we present MicroArc, an analysis and design method for microservice-based systems. MicroArc comprises modeling notations, guiding processes that articulate how the method is applied, and a supporting modeling tool. The approach enables the identification of events and microservice candidates by modeling the flow of processes in the early phase of development. © 2025 Elsevier B.V. All rights reserved.

Conference Object | Citation - WoS: 3 | Citation - Scopus: 5
Predicting Software Functional Size Using Natural Language Processing: An Exploratory Case Study (IEEE, 2024)
Unlu, Huseyin; Tenekeci, Samet; Ciftci, Can; Oral, Ibrahim Baran; Atalay, Tunahan; Hacaloglu, Tuna; Demirors, Onur
Software Size Measurement (SSM) plays an essential role in software project management, as it enables the acquisition of software size, the primary input for development effort and schedule estimation.
However, many small and medium-sized companies cannot perform objective SSM and Software Effort Estimation (SEE) due to a lack of resources and expert workforce. This results in inadequate estimates and projects that exceed the planned time and budget. Organizations therefore need to perform objective SSM and SEE with minimal resources and without an expert workforce. In this research, we conducted an exploratory case study to predict the functional size of software project requirements using state-of-the-art large language models (LLMs). To this end, we fine-tuned BERT and BERT_SE with a set of user stories and their respective functional sizes in COSMIC Function Points (CFP), gathered from different project requirement documents. In total size prediction, we achieved 72.8% accuracy with BERT and 74.4% with BERT_SE; in data-movement-based size prediction, we achieved 87.5% average accuracy with BERT and 88.1% with BERT_SE. Although we used relatively small datasets for model training, these results are promising and demonstrate the practical utility of language models in SSM.

Conference Object | Citation - WoS: 1 | Citation - Scopus: 1
Towards the Construction of a Software Benchmarking Dataset via Systematic Literature Review (IEEE, 2024)
Yurum, Ozan Rasit; Unlu, Huseyin; Demirors, Onur
Effort estimation is a fundamental task in the planning of software projects. Prediction models usually rely on two essential factors: software size and effort data. The size of the software can be measured at various stages of the project with the desired accuracy; nevertheless, the industry faces challenges in collecting reliable actual effort data, so organizations encounter difficulties in establishing effort prediction models. Benchmarking datasets are available, but in most cases they have huge variances that make them less useful for effort prediction.
In this study, we aimed to answer whether a software benchmarking dataset can be created by gathering data from the literature. To the best of our knowledge, no comprehensive dataset exists that gathers the functional size and effort data of studies from the literature. For this purpose, we performed a systematic literature review to find studies that include projects measured with the COSMIC Functional Size Measurement (FSM) method together with the related effort. As a result, we formed a dataset of 337 records from 18 studies that shared the corresponding size and effort data. Although we performed a limited search, we created a larger dataset than many in the literature. Our review revealed that most studies did not share their dataset, and many lacked case details such as the implementation environment and the scope of the software development life cycle activities included in the effort data. We also compared the dataset with the ISBSG repository and found that ours has less variation in productivity. Our review showed that creating a software benchmarking dataset by gathering data from the literature is feasible. In conclusion, this study addresses gaps in the literature through a cost-free and easily extendable dataset.

Conference Object
Analysis, Design, Test, and DevOps in Microservice-Based Software Architectures: Results From Pakistan (Springer International Publishing AG, 2024)
Unlu, Huseyin; Soylu, Gorkem Kilinc; Ahmad, Isra Shafique; Demirors, Onur
In today's software industry, Microservice-based Software Architecture (MSSA) has become common practice and has been adopted by many companies. MSSA differs from traditional object-oriented architecture in several ways: the architecture has moved away from being data-driven and evolved into a behavior-oriented structure.
The use of a single database is replaced by structures in which each microservice is developed independently and has its own database. Adopting MSSA therefore demands that software organizations transform their culture. However, there is no de facto method for analyzing, designing, and testing systems built on these architectures, comparable to object-oriented analysis and design practices. This study aimed to understand how Pakistani software organizations undertake analysis, design, test, and DevOps processes in software projects adopting the MSSA paradigm. To achieve this goal, we surveyed 49 participants from various agile organizations in Pakistan, encompassing different roles and domains. The results reveal that Pakistani software organizations continue to use familiar object-oriented analysis and design approaches, but have already started exploring event-oriented analysis and design methods for MSSA projects.

Conference Object | Citation - WoS: 2 | Citation - Scopus: 2
Effort Prediction With Limited Data: A Case Study for Data Warehouse Projects (IEEE, 2022)
Unlu, Huseyin; Yildiz, Ali; Demirors, Onur
Organizations may create a sustainable competitive advantage over competitors by using data warehouse (DWH) systems, with which they can assess the current status of their operations at any moment and analyze trends and connections using up-to-date data. However, data warehouse projects tend to fail more often than other projects, as it can be tough to estimate the effort required to build a data warehouse system. Functional size measurement is one of the methods used as an input for estimating the amount of work in a software project. In this study, we formed a measurement basis for DWH projects in an organization based on the COSMIC Functional Size Measurement method. We mapped COSMIC rules onto two different architectures used for DWH projects in the organization and measured the size of the projects.
We calculated the productivity of the projects and compared them with the organization's previous projects and with DWH projects in the ISBSG repository. We could not create an organization-wide effort estimation model because we had a limited number of projects; as an alternative, we evaluated the success of effort estimation using DWH projects in the ISBSG repository. We also report the challenges we faced during the size measurement process.

Conference Object | Citation - WoS: 7 | Citation - Scopus: 12
Utilization of Three Software Size Measures for Effort Estimation in Agile World: A Case Study (IEEE, 2022)
Unlu, Huseyin; Hacaloglu, Tuna; Buber, Fatma; Berrak, Kivilcim; Leblebici, Onur; Demirors, Onur
Functional size measurement (FSM) methods, being systematic and repeatable, are beneficial in the early phases of the software life cycle for core project management activities such as effort, cost, and schedule estimation. However, in agile projects, requirements are kept minimal in the early phases and are detailed over time as the project progresses. This makes it challenging to identify the measurement components of FSM methods from requirements in the early phases, and hence complicates applying FSM in agile projects. In addition, existing FSM methods are not fully compatible with today's architectural styles, which are evolving into event-driven, decentralized structures. In this study, we present the results of a case study comparing the effectiveness of three size measures, functional (COSMIC Function Points, CFP), event-based (Event Points), and code-length-based (Lines of Code, LOC), on projects that were developed with agile methods and utilized a microservice-based architecture. For this purpose, we measured the size of the projects and created effort estimation models based on the three measures.
We found that the event-based method estimated effort more accurately than the CFP- and LOC-based methods.

Conference Object | Citation - Scopus: 7
From Requirements to Data Analytics Process: An Ontology-Based Approach (Springer International Publishing AG, 2019)
Bandara, Madhushi; Behnaz, Ali; Rabhi, Fethi A.; Demirors, Onur
Comprehensively describing data analytics requirements is becoming an integral part of developing enterprise information systems, and it is a challenging task for analysts to completely elicit all requirements shared by an organization's decision makers. With a multitude of data available from e-commerce sites, social media, and data warehouses, selecting the correct set of data and suitable techniques for an analysis is itself difficult and time-consuming, because analysts have to comprehend multiple dimensions such as existing analytics techniques, background knowledge in the domain of interest, and the quality of the available data. In this paper, we propose using semantic models to represent the different spheres of knowledge related to the data analytics space and using them to assist in analytics requirements definition. By following this approach, users can create a sound analytics requirements specification linked with concepts from the operational domain, available data, analytics techniques, and their implementations. Such requirements specifications can be used to drive the creation and management of analytics solutions that are well aligned with organizational objectives. We demonstrate the capabilities of the proposed method by applying it to a data analytics project for house price prediction.
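Several of the abstracts in this collection revolve around the same core calculation: deriving an effort estimate for a new project from its measured functional size via a productivity (delivery) rate learned from past projects. The following is a minimal, self-contained sketch of that idea; the function names and all size/effort figures are invented for illustration and are not taken from any of the listed studies.

```python
# Hypothetical sketch of productivity-based effort estimation from
# COSMIC Function Point (CFP) sizes. All data below is invented.

def fit_productivity(sizes_cfp, efforts_hours):
    """Median productivity delivery rate (hours per CFP) over past projects.

    The median is used instead of the mean so that one unusually
    slow or fast project does not dominate the rate.
    """
    rates = sorted(e / s for s, e in zip(sizes_cfp, efforts_hours))
    mid = len(rates) // 2
    if len(rates) % 2:
        return rates[mid]
    return (rates[mid - 1] + rates[mid]) / 2

def estimate_effort(size_cfp, pdr_hours_per_cfp):
    """Estimate effort for a new project from its measured functional size."""
    return size_cfp * pdr_hours_per_cfp

# Invented historical data: functional sizes (CFP) and actual efforts (hours).
sizes = [120, 85, 200, 150]
efforts = [960, 700, 1500, 1200]

pdr = fit_productivity(sizes, efforts)
print(estimate_effort(100, pdr))  # estimate for a new 100-CFP project
```

In practice the studies above go further (regression models, benchmarking against the ISBSG repository, accounting for productivity variance), but this delivery-rate calculation is the common baseline they build on.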
