2018
Authors
Roxo, MT; Brito, PQ;
Publication
Asian Journal of Business Research
Abstract
Augmented Reality (AR) is emerging as a technology that is reshaping society, particularly the fields of Business and Economics (B&E). The scientific studies produced on AR therefore call for an interdisciplinary systematic review that structures the knowledge generated into an organized framework. Three main questions are addressed: How has the production of AR scientific knowledge evolved? Which user-related aspects does AR affect? And which set of subtopics is associated with each motivation to develop an AR solution? The content of 328 papers produced between 1997 and 2016 in the field of AR is analyzed, unveiling 58 coding categories. Thirteen digital media characteristics assume instrumental roles in addressing four major motivations to develop AR solutions. Technological topics dominate the research focus over behavioral ones, and investigations of AR on mobile displays show the highest increase. This research identifies the main scientific topics that have driven researchers' agendas; these topics have in turn contributed to the development and adoption of AR solutions and help forecast their future application in organizational strategies.
2018
Authors
Rocha, A; Adeli, H; Reis, LP; Costanzo, S;
Publication
WorldCIST (2)
Abstract
2018
Authors
Costa, J; Botelho, A; Matias, J;
Publication
Entrepreneurship and the Industry Life Cycle - Studies on Entrepreneurship, Structural Change and Industrial Dynamics
Abstract
2018
Authors
Lago, AS; Ferreira, HS;
Publication
CoRR
Abstract
2018
Authors
Martins, I; Carvalho, P; Corte-Real, L; Alba-Castro, JL;
Publication
PATTERN ANALYSIS AND APPLICATIONS
Abstract
Developing robust and universal methods for unsupervised segmentation of moving objects in video sequences has proved to be a hard and challenging task that has attracted the attention of many researchers over the last decades. State-of-the-art methods are, in general, computationally heavy, preventing their use in real-time applications. This research addresses this problem by proposing a robust and computationally efficient method, coined BMOG, that significantly boosts the performance of a widely used method based on a Mixture of Gaussians. The proposed solution explores a novel classification mechanism that combines the discrimination capabilities of a color space with hysteresis, and a dynamic learning rate for background model update. The complexity of BMOG is kept low, proving its suitability for real-time applications. BMOG was objectively evaluated on the ChangeDetection.net 2014 benchmark. An exhaustive set of experiments was conducted, and a detailed analysis of the results, using two complementary types of metrics, revealed that BMOG achieves an excellent compromise between performance and complexity.
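The mechanisms named in the abstract can be sketched in a few lines. Note the hedging: BMOG itself uses a Mixture of Gaussians in a carefully chosen color space; the single-Gaussian-per-pixel model, the threshold values, and the learning rate below are illustrative assumptions, not the paper's actual design or parameters.

```python
import numpy as np

def update_background(mean, var, frame, alpha=0.05):
    """Running per-pixel Gaussian update; alpha is the learning rate."""
    diff = frame - mean
    mean = mean + alpha * diff
    var = var + alpha * (diff ** 2 - var)
    return mean, var

def classify(frame, mean, var, prev_mask, t_low=2.0, t_high=4.0):
    """Hysteresis double threshold on the per-pixel normalized distance.

    Pixels farther than t_high standard deviations from the model become
    foreground; pixels between t_low and t_high keep their previous
    foreground label (the hysteresis), which stabilizes the mask."""
    dist = np.abs(frame - mean) / np.sqrt(var)
    return (dist > t_high) | ((dist > t_low) & prev_mask)

# Illustrative "frame" of four pixels against a zero-mean, unit-variance model.
mean = np.zeros(4)
var = np.ones(4)
prev_mask = np.array([False, True, False, False])
frame = np.array([0.0, 3.0, 5.0, 3.0])

mask = classify(frame, mean, var, prev_mask)     # [False, True, True, False]
mean, var = update_background(mean, var, frame)  # model absorbs the new frame
```

The second pixel stays foreground only because it was foreground in the previous mask; the fourth, at the same distance, does not, illustrating how hysteresis suppresses flicker without raising the detection threshold.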
2018
Authors
Costa, J; Silva, C; Antunes, M; Ribeiro, B;
Publication
Proceedings of the International Joint Conference on Neural Networks
Abstract
Current challenges in machine learning include dealing with temporal data streams, drift, and non-stationary scenarios, often with text data, whether in social networks or in business systems. This dynamic nature tends to limit the performance of traditional static learning models, so dynamic learning strategies must be put forward. However, assessing the performance of those strategies is not straightforward, as sample dependency undermines the use of validation techniques such as cross-validation. In this paper we propose using McNemar's test to compare two distinct approaches that tackle adaptive learning in dynamic environments, namely DARK (Drift Adaptive Retain Knowledge) and Learn++.NSE (Learn++ for Non-Stationary Environments). The validation is based on a Twitter case study benchmark constructed with the DOTS (Drift Oriented Tool System) dataset generator. The results obtained demonstrate the usefulness and adequacy of McNemar's statistical test in dynamic environments where time is crucial for the learning algorithm. © 2018 IEEE.
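McNemar's test, as invoked in the abstract above, compares two classifiers evaluated on the same samples using only the discordant pairs, which is why it stays valid where cross-validation's independence assumption fails. A minimal sketch follows; the contingency counts are illustrative, not results from the paper.

```python
def mcnemar_statistic(b, c):
    """McNemar chi-square statistic with continuity correction.

    b: samples the first classifier got right and the second got wrong
    c: samples the first classifier got wrong and the second got right
    Concordant pairs (both right or both wrong) do not enter the
    statistic, so dependence between the two result sequences is not
    a problem."""
    return (abs(b - c) - 1) ** 2 / (b + c)

# Illustrative counts: classifier A wins on 40 samples where B fails,
# B wins on 20 samples where A fails.
stat = mcnemar_statistic(40, 20)

# Under the null hypothesis of equal performance the statistic follows a
# chi-square distribution with 1 degree of freedom; reject at alpha = 0.05
# if it exceeds the critical value 3.841.
significant = stat > 3.841  # True for these counts
```

For small discordant counts (b + c below roughly 25) the exact binomial form of the test is usually preferred over this chi-square approximation.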