Computer Engineering / Bilgisayar Mühendisliği

Permanent URI for this collection: https://hdl.handle.net/11147/10

Search Results

Now showing 1 - 10 of 14
  • Article
    Citation - Scopus: 3
    Development of Chrono-Spectral Gold Nanoparticle Growth Based Plasmonic Biosensor Platform
    (Elsevier, 2024) Sözmen, Alper Baran; Elveren, Beste; Erdoğan, Duygu; Mezgil, Bahadır; Baştanlar, Yalın; Yıldız, Ümit Hakan; Arslan Yıldız, Ahu
    Plasmonic sensor platforms are designed for rapid, label-free, and real-time detection, and they excel as next-generation biosensors. However, current methods such as Surface Plasmon Resonance require expertise and well-equipped laboratory facilities. Simpler methods such as Localized Surface Plasmon Resonance (LSPR) overcome those limitations, though they lack sensitivity. Hence, sensitivity enhancement plays a crucial role in the future of plasmonic sensor platforms. Herein, a refractive index (RI) sensitivity enhancement methodology is reported that utilizes the growth of gold nanoparticles (GNPs) on a solid support, backed up with artificial neural network (ANN) analysis. Sensor platform fabrication was initiated with GNP immobilization onto the solid support; the immobilized GNPs were then used as seeds for chrono-spectral growth, carried out using NH2OH at varied incubation times. The platform's response to RI change was investigated with varied concentrations of sucrose and ethanol. Detection of the bacterium E. coli BL21 was carried out for validation as a model microorganism, and the results showed that detection was possible at 10² CFU/ml. The data acquired by spectrophotometric measurements were analyzed by ANN, and bacteria classification with percentage error rates near 0% was achieved. The proposed LSPR-based, label-free sensor application demonstrated that the developed methodology offers substantial sensitivity enhancement potential for similar sensor platforms.
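Bulk RI sensitivity of an LSPR platform is conventionally reported as peak shift per refractive-index unit (nm/RIU), obtained from a linear fit of resonance wavelength against refractive index. A minimal sketch of that calibration step; the data points are invented for illustration and are not from the paper:

```python
# Least-squares linear fit of LSPR peak wavelength vs. refractive index.
# Sensitivity S = slope in nm/RIU. All numbers below are illustrative.

def ri_sensitivity(ri, peak_nm):
    """Return (slope in nm/RIU, intercept in nm) of an ordinary least-squares fit."""
    n = len(ri)
    mx = sum(ri) / n
    my = sum(peak_nm) / n
    sxx = sum((x - mx) ** 2 for x in ri)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ri, peak_nm))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical sucrose dilution series: refractive index vs. observed peak position.
ri = [1.333, 1.343, 1.353, 1.363]
peak = [520.0, 521.0, 522.0, 523.0]   # nm, perfectly linear for clarity
s, b = ri_sensitivity(ri, peak)
print(f"sensitivity = {s:.1f} nm/RIU")
```

A real calibration would use many replicate spectra per concentration; the fit itself is the same.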
  • Article
    Citation - WoS: 9
    Citation - Scopus: 13
    Microservice-Based Projects in Agile World: a Structured Interview
    (Elsevier, 2024) Unlu, Huseyin; Kennouche, Dhia Eddine; Soylu, Gorkem Kiling; Demirors, Onur
    Context: During the last decade, Microservice-based software architecture (MSSA) has been a preferred design paradigm for a growing number of companies. MSSA, specifically in the form of reactive systems, differs substantially from more conventional design paradigms, such as object-oriented analysis and design. Therefore, adopting it demands that software organizations transform their culture. However, there is a lack of research studies that explore the common practices of software companies that implement MSSAs. Objective: In this study, our goal is to gain insight into how practices such as agile methodology, software analysis, design, testing, size measurement, and effort estimation are performed in software projects that embrace the Microservice-based software architecture paradigm. Together with the identification of practices utilized for the MSSA paradigm, we aim to determine the challenges organizations face in adopting microservice-based software architectures. Method: We performed a structured interview with participants from 20 different organizations across different roles, domains, and countries to collect information on their views, experience, and the challenges faced. Results: Our results reveal that organizations find agile development compatible with microservices. In general, they continue to use traditional object-oriented modeling notations for analysis and design in an abstract way. They continue to use the same subjective size measurement and effort estimation approaches that they were using previously in traditional architectures. However, they face unique challenges in developing microservices. Conclusion: Although organizations face challenges, practitioners continue to use familiar techniques that they have been using for traditional architectures. The results provide a snapshot of the software industry that utilizes microservices.
  • Article
    Citation - Scopus: 1
    An Interestingness Measure for Knowledge Bases
    (Elsevier, 2023) Oğuz, Damla; Soygazi, Fatih
    Association rule mining and logical rule mining both aim to discover interesting relationships in data or knowledge. In association rule mining, relationships are identified based on the occurrence of items in a dataset, while in logical rule mining, relationships are determined based on logical relationships between atoms in a knowledge base. Association rule mining has been widely studied in transactional databases, mainly for market basket analysis. Confidence has become the most widely used interestingness measure to assess the strength of a rule. Many other interestingness measures have been proposed, since confidence can be insufficient to filter out negatively associated relationships. Recently, logical rule mining has become an important area of research, as new facts can be inferred by applying discovered logical rules. These rules can be used for reasoning, for identifying potential errors in knowledge bases, and for better understanding the data. However, there are currently only a few measures for logical rule mining. Furthermore, current measures do not consider relations that can have several objects, called quasi-functions, which can dramatically alter the interestingness of a rule. In this paper, we focus on effectively assessing the strength of logical rules. We propose a new interestingness measure that takes into account two categories of relations, functions and quasi-functions, to assess the degree of certainty of logical rules. We compare our proposed measure with a widely used measure on both synthetic test data and real knowledge bases. We show that it is more effective in indicating rule quality, making it an appropriate interestingness measure for logical rule evaluation.
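For context, the classic confidence measure that the abstract contrasts with is simply the conditional frequency supp(X ∪ Y) / supp(X). A minimal illustration on a toy transaction set; the baskets are invented for the example and are not from the paper:

```python
# Confidence of an association rule X -> Y over a toy transaction set.
# conf(X -> Y) = supp(X and Y) / supp(X); all data below is invented.

def support(transactions, items):
    """Fraction of transactions containing every item in `items`."""
    items = set(items)
    return sum(1 for t in transactions if items <= t) / len(transactions)

def confidence(transactions, lhs, rhs):
    return support(transactions, set(lhs) | set(rhs)) / support(transactions, lhs)

baskets = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
# 2 of the 3 bread-containing baskets also contain milk -> confidence 2/3.
print(confidence(baskets, {"bread"}, {"milk"}))
```

This is the association-rule setting; the paper's contribution is an analogous measure for logical rules over knowledge-base relations, which this toy example does not cover.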
  • Article
    Citation - WoS: 3
    Citation - Scopus: 4
    Scalable RFID Authentication Protocol Based on Physically Unclonable Functions
    (Elsevier, 2023) Kurt, Işıl; Alagöz, Fatih; Akgün, Mete
    Radio Frequency Identification (RFID) technology is commonly used for tracking and identifying objects. However, this technology poses serious security and privacy concerns for individuals carrying the tags. To address these issues, various security protocols have been proposed. Unfortunately, many of these solutions suffer from scalability problems, requiring the back-end server to work linearly in the number of tags for a single tag identification. Some protocols offer O(1) or O(log n) identification complexity but are still susceptible to serious attacks. Few protocols consider attacks on the reader side. Our proposed RFID authentication protocol eliminates the need for a search in the back-end and leverages Physically Unclonable Functions (PUFs) to securely store tag secrets, making it resistant to tag corruption attacks. It provides constant-time identification without sacrificing privacy and offers log₂ n times better identification performance than the state-of-the-art protocol. It unconditionally ensures destructive privacy for tag holders in the event of reader corruption. Furthermore, it enables offline readers to maintain destructive privacy in case of corruption.
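The scalability point above is essentially the gap between scanning all n tag records per identification and a keyed lookup. A schematic contrast; the pseudonym-table layout is my own illustration, not the protocol's actual mechanism:

```python
# O(n) linear search vs. O(1) expected-time dict lookup for tag identification.
# "pseudonym" stands in for whatever per-session identifier a protocol derives;
# the table layout here is purely illustrative.

tags = [{"pseudonym": f"p{i}", "id": i} for i in range(10_000)]

def identify_linear(pseudonym):
    for rec in tags:                      # back-end does O(n) work per query
        if rec["pseudonym"] == pseudonym:
            return rec["id"]
    return None

index = {rec["pseudonym"]: rec["id"] for rec in tags}

def identify_constant(pseudonym):
    return index.get(pseudonym)           # single hash lookup, O(1) expected

assert identify_linear("p9999") == identify_constant("p9999") == 9999
```

The hard part the paper addresses, which this sketch omits, is keeping such a lookup key unlinkable across sessions so that constant-time identification does not sacrifice privacy.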
  • Article
    Citation - WoS: 41
    Citation - Scopus: 50
    BIM-CAREM: Assessing the BIM Capabilities of Design, Construction and Facilities Management Processes in the Construction Industry
    (Elsevier, 2023) Gökçen, Yılmaz; Akçamete, Aslı; Demirörs, Onur
    BIM adoption has accelerated worldwide since it is an important enabling technology for digitalisation in the construction industry. Adopting BIM requires transforming the traditional building life cycle stages (planning, design, construction and facilities management) into BIM-integrated project deliveries. Assessing the BIM capabilities of these stages helps organisations to identify gaps in their BIM uses and improve them. There is a lack of a comprehensive model in the literature for assessing the BIM capabilities of individual building life cycle stages and their processes. Existing assessment models focus on assessing the BIM maturity of construction projects and organisations, which does not inform the required BIM improvements for individual stages and their processes. Hence, we iteratively developed the Building Information Modelling (BIM) Capability Assessment REference Model (BIM-CAREM) and demonstrated its usability through multiple explanatory case studies performed with two international design and engineering companies and two general contractors in Turkey. We assessed the BIM capabilities of the design, construction and facility management processes of various building types, i.e. residential buildings, stadiums, hospitals and airports. The results showed that the BIM capability levels of design, construction and facility management processes vary within and across the companies.
  • Conference Object
    Citation - Scopus: 2
    Repository Landscape in Turkiye and GCRIS: The First National Research Information System
    (Elsevier, 2022) Tuğlular, Tuğkan; Gürdal, Gültekin; Kafalı Can, Gönül; Özdemirden, Ahmet Şemsettin
    This paper describes the history and development of research infrastructures and open science policies in Turkiye. Moreover, it focuses on GCRIS (Grand Current Research Information Systems), Turkiye's first Research Information System built to international standards, emphasizing the need for internationally interoperable research infrastructures in Turkiye. The GCRIS Research Information System, implemented on the open-source software DSpace-CRIS 6.3, was developed with data analytics in mind and continues to be improved by Research Ecosystems Inc. As a strategic partner, Izmir Institute of Technology (IZTECH) was the first university to use GCRIS, and other universities have adopted it since then. As the number of universities using GCRIS grows, Turkiye's research ecosystem will become much more trackable and measurable thanks to the GCRIS intelligent reporting system. Most importantly, not only will Turkiye's research outputs become more visible, but the integration of research infrastructures with the European Open Science Cloud (EOSC) and other initiatives worldwide will also be facilitated.
  • Article
    Performance and Accuracy Predictions of Approximation Methods for Shortest-Path Algorithms on GPUs
    (Elsevier, 2022) Aktılav, Busenur; Öz, Işıl
    Approximate computing techniques, where less-than-perfect solutions are acceptable, present performance-accuracy trade-offs by performing inexact computations. Moreover, heterogeneous architectures, a combination of miscellaneous compute units, offer high performance as well as energy efficiency. Graph algorithms utilize the parallel computation units of heterogeneous GPU architectures as well as the performance improvements offered by approximation methods. Since different approximations yield different speedups and accuracy losses for the target execution, it becomes impractical to test all methods with various parameters. In this work, we perform approximate computations for three shortest-path graph algorithms and propose a machine learning framework to predict the impact of the approximations on program performance and output accuracy. We evaluate predictions for both synthetic random graphs and real road-network graphs, as well as predictions for large graph cases made from small graph instances. We achieve less than 5% prediction error rates for speedup and inaccuracy values.
  • Article
    Citation - WoS: 1
    Citation - Scopus: 1
    Hybrid Probabilistic Timing Analysis With Extreme Value Theory and Copulas
    (Elsevier, 2022) Bekdemir, Levent; Bazlamaçcı, Cüneyt F.
    The primary challenge of time-critical systems is to guarantee that a task completes its execution before its deadline. In order to ensure compliance with timing requirements, it is necessary to analyze the timing behavior of the overall software. Worst-Case Execution Time (WCET) represents the maximum amount of time an individual software unit takes to execute and is used for scheduling analysis in safety-critical systems. Recent studies focus on statistical approaches, which augment measurement-based timing analysis with a probabilistic confidence level by applying stochastic methods. Common approaches either utilize Extreme Value Theory (EVT) for end-to-end measurements or convolution techniques for a group of program units to derive probabilistic upper bounds for the program. The former method does not ensure path coverage, while the latter suffers from ignoring possible extreme cases. Furthermore, the current state-of-the-art convolution methods employed in a commercial WCET analysis tool overestimate the results because they assume worst-case dependence between basic blocks. In this paper, we propose a hybrid probabilistic timing analysis framework that models program units with EVT to capture extreme cases and uses Copulas to model the dependency between the units, deriving tighter distributional bounds that mitigate the effects of co-monotonic assumptions.
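The convolution technique mentioned above combines per-unit execution-time distributions; under an independence assumption this is a discrete convolution of their probability mass functions. A minimal sketch with toy PMFs (the cycle counts and probabilities are invented, not measured data):

```python
# Convolve two basic-block execution-time PMFs assuming independence.
# Each pmf maps cycles -> probability; all values below are toy numbers.

from collections import defaultdict

def convolve(pmf_a, pmf_b):
    """PMF of the sum of two independent discrete execution times."""
    out = defaultdict(float)
    for ta, pa in pmf_a.items():
        for tb, pb in pmf_b.items():
            out[ta + tb] += pa * pb
    return dict(out)

block1 = {10: 0.9, 20: 0.1}   # mostly fast, rare slow path
block2 = {5: 0.5, 15: 0.5}

total = convolve(block1, block2)
# Tail probability P(total time > 30) -- the kind of exceedance bound
# a probabilistic WCET analysis reports.
tail = sum(p for t, p in total.items() if t > 30)
print(total, tail)
```

The paper's point is that real basic blocks are not independent; replacing the product pa * pb with a dependence model (a Copula) is what tightens the bound while avoiding the co-monotonic worst case.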
  • Article
    Citation - WoS: 7
    Citation - Scopus: 8
    Long-Term Image-Based Vehicle Localization Improved With Learnt Semantic Descriptors
    (Elsevier, 2022) Çınaroğlu, İbrahim; Baştanlar, Yalın
    Vision-based solutions for the localization of vehicles have become popular recently. In this study, we employ an image retrieval based visual localization approach, in which database images are kept with GPS coordinates and the location of the retrieved database image serves as the position estimate of the query image in a city-scale driving scenario. Regarding this approach, most existing studies only use descriptors extracted from RGB images and do not exploit semantic content. We show that localization can be improved via descriptors extracted from semantically segmented images, especially when the environment is subjected to severe illumination, seasonal or other long-term changes. We worked on two separate visual localization datasets, one of which (Malaga Streetview Challenge) has been generated by us and made publicly available. Following the extraction of semantic labels in images, we trained a CNN model for localization in a weakly-supervised fashion with a triplet ranking loss. The optimized semantic descriptor can be used on its own for localization, or preferably together with a state-of-the-art RGB image based descriptor in a hybrid fashion to improve accuracy. Our experiments reveal that the proposed hybrid method increases the localization performance of the standard (RGB image based) approach by up to 7.7% in terms of Top-1 Recall.
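A triplet ranking loss of the kind mentioned above penalizes a descriptor when the anchor (query) sits closer to a negative (far-away place) than to a positive (same place) by less than a margin. A minimal numeric sketch; the 3-D descriptors and margin are toy values, not the paper's setup:

```python
# Triplet ranking loss: max(0, d(a,p)^2 - d(a,n)^2 + margin).
# Descriptors and margin below are invented for illustration.

def sq_dist(u, v):
    """Squared Euclidean distance between two descriptor vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

def triplet_loss(anchor, positive, negative, margin=0.5):
    return max(0.0, sq_dist(anchor, positive) - sq_dist(anchor, negative) + margin)

a = [1.0, 0.0, 0.0]          # query descriptor
p = [0.9, 0.1, 0.0]          # same place, e.g. a different season
n = [0.0, 1.0, 0.0]          # geographically distant place

# Positive is far closer than negative here, so the loss is zero;
# swapping p and n would produce a large positive loss to train against.
print(triplet_loss(a, p, n))
```

During weakly-supervised training, such triplets are mined using GPS tags alone (positives nearby, negatives far away), which is what lets the network learn without pixel-level labels.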
  • Article
    Citation - WoS: 12
    Citation - Scopus: 21
    A Change Management Model and Its Application in Software Development Projects
    (Elsevier, 2019) Efe, Pınar; Demirörs, Onur
    Change is inevitable in software projects and software engineers strive to find ways to manage changes. A complete task could be easily in a team's agenda sometime later due to change demands. Change demands are caused by failures and/or improvements and require additional effort which in most cases have not been planned upfront and affect project progress significantly. Earned Value Management (EVM) is a powerful performance management and feedback tool for project management. EVM depicts the project progress in terms of scope, cost, and schedule and provides future predictions based on trends and patterns of the past. Even though EVM works quite well and widely used in disciplines like construction and mining, it is not the case for software discipline. Software projects require special attention and adoption for change. In this study, we present a model to measure change and subsequent rework and evolution costs to monitor software projects accurately. We have performed five case studies in five different companies to explore the usability of the proposed model. This paper depicts the proposed model and discusses the results of the case studies.