Publications

2015

Detecting Motion Patterns in Dense Flow Fields: Euclidean Versus Polar Space

Authors
Pinto, A; Costa, P; Moreira, AP;

Publication
PROGRESS IN ARTIFICIAL INTELLIGENCE-BK

Abstract
This research studies motion segmentation based on dense optical flow fields for mobile robotic applications. Optical flow is usually represented in Euclidean space; however, finding the most suitable motion space is a relevant problem, because techniques for motion analysis perform differently in different spaces. Factors such as processing time and segmentation quality provide a quantitative evaluation of the clustering process. Therefore, this paper defines a methodology that evaluates and compares the advantages of clustering dense flow fields in different feature spaces, namely Euclidean and polar space. The methodology resorts to conventional clustering techniques, Expectation-Maximization and K-means, as baseline methods. The experiments conducted in this work show that K-means clustering is suitable for analyzing dense flow fields.
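The comparison the abstract describes can be illustrated with a minimal sketch: convert each (u, v) flow vector into a polar (magnitude, angle) feature, then cluster both representations with plain K-means. All names and the synthetic flow field below are hypothetical, not the authors' code or data:

```python
import numpy as np

def to_polar(flow):
    """Convert an (N, 2) array of (u, v) flow vectors to (magnitude, angle)."""
    u, v = flow[:, 0], flow[:, 1]
    return np.stack([np.hypot(u, v), np.arctan2(v, u)], axis=1)

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means (Lloyd's algorithm): returns a cluster label per row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic motions: a slow rightward flow and a fast upward flow.
rng = np.random.default_rng(1)
a = rng.normal([1.0, 0.0], 0.05, size=(100, 2))
b = rng.normal([0.0, 3.0], 0.05, size=(100, 2))
flow = np.vstack([a, b])

labels_euclid = kmeans(flow, 2)            # cluster raw (u, v) vectors
labels_polar = kmeans(to_polar(flow), 2)   # cluster (magnitude, angle) features
```

Running both clusterings on the same synthetic field is one way to make the paper's Euclidean-versus-polar comparison concrete.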

2015

TwitterJam: Identification of Mobility Patterns in Urban Centers Based on Tweets

Authors
Rebelo, F; Soares, C; Rossetti, RJF;

Publication
2015 IEEE FIRST INTERNATIONAL SMART CITIES CONFERENCE (ISC2)

Abstract
In the early twenty-first century, social networks served only to let the world know our tastes, share our photos, and share some thoughts. A decade later, these services are filled with an enormous amount of information, which industry and academia are now exploring in order to extract implicit patterns. TwitterJam is a tool that analyses the contents of the social network Twitter to extract events related to road traffic. To reach this goal, we started by analysing tweets to identify those that really contain road traffic information. The second step was to gather official information to confirm the extracted information. We then correlated these two types of information (official and general) in order to verify the credibility of public tweets. The correlation was carried out separately in two ways: the first concerns the number of tweets at a certain time of day, and the second the localization of those tweets. Two hypotheses were also devised concerning these correlations. The results were not perfect, but were reasonable. We also analysed tools suitable for data visualization to decide on the best strategy to follow. Finally, we developed a web application that presents the results to support their analysis.
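The first kind of correlation the abstract mentions, between tweet volume and official reports over the day, can be sketched as a Pearson correlation between two hourly count series. The numbers below are purely illustrative, not the paper's data:

```python
import numpy as np

# Hypothetical hourly counts over one day: tweets mentioning traffic,
# and officially reported incidents (illustrative values only).
tweet_counts = np.array([2, 1, 0, 0, 1, 3, 9, 15, 12, 7, 5, 4,
                         6, 5, 4, 5, 8, 14, 16, 10, 6, 4, 3, 2])
official_counts = np.array([1, 0, 0, 0, 0, 2, 7, 12, 10, 5, 4, 3,
                            5, 4, 3, 4, 7, 11, 13, 8, 5, 3, 2, 1])

# Pearson correlation between the two hourly series: a high value would
# support the hypothesis that tweet volume tracks official traffic reports.
r = np.corrcoef(tweet_counts, official_counts)[0, 1]
```

A value of `r` close to 1 for real data would lend credibility to the public tweets, which is the kind of check the paper performs.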

2015

A Framework for Simulator Development for Fixed Horizon, Rolling Horizon and Real Time Management Modelling and Evaluation

Authors
Putnik, Goran; Alves, Cátia; Ávila, Paulo; Ferreira, Luís; Castro, Helio; Shah, Vaibhav;

Publication
PROCEEDINGS of 2100 Projects Association Joint Conferences

Abstract
This paper presents a framework for developing simulators of fixed horizon, rolling horizon, and real-time management models, supporting their modelling and evaluation in ubiquitous production networks under dynamic environments, with a view to economic and environmental sustainability.

2015

A Formal Perspective on IEC 61499 Execution Control Chart Semantics

Authors
Lindgren, P; Lindner, M; Pereira, D; Pinho, LM;

Publication
2015 IEEE TRUSTCOM/BIGDATASE/ISPA, VOL 3

Abstract
The IEC 61499 standard proposes an event-driven execution model for distributed control applications, for which only an informal execution semantics is provided. Consequently, run-time implementations are not rigorously described, and their behavior therefore relies on the interpretation made by the tool provider. In this paper, as a step towards a formal semantics, we focus on the Execution Control Chart semantics, which is fundamental to the dynamic behavior of Basic Function Block elements. In particular, we develop a well-formedness criterion that ensures a finite number of Execution Control Chart transitions for each triggering event. We also describe the first step towards the mechanization of the well-formedness checking algorithm in the Coq proof assistant so that, ultimately, we are able to show, once and for all, that this algorithm is correct with respect to our proposed execution semantics. The algorithm is extractable from the mechanization in a correct-by-construction way and can be directly incorporated into a certified toolchain for the analysis, compilation, and execution of IEC 61499 models. As a proof of concept, a prototype tool, RTFM-4FUN, has been developed; it performs well-formedness checks on Basic Function Blocks using the extracted algorithm's code.
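One way to read the well-formedness criterion (a finite number of Execution Control Chart transitions per triggering event) is that transitions which can fire without consuming a new event must not form a cycle. The sketch below illustrates that reading as a cycle check over such transitions; it is a hypothetical Python approximation, not the paper's Coq mechanization, and the state names and guard encoding are invented:

```python
# Each transition is (source_state, target_state, needs_event).
# Transitions guarded only by a boolean condition (needs_event=False) can
# fire repeatedly within one event delivery, so a cycle of such transitions
# could run forever on a single triggering event.

def well_formed(states, transitions):
    """Return True if no cycle exists among event-free transitions."""
    graph = {s: [] for s in states}
    for src, dst, needs_event in transitions:
        if not needs_event:
            graph[src].append(dst)

    WHITE, GRAY, BLACK = 0, 1, 2
    color = {s: WHITE for s in states}

    def dfs(s):
        color[s] = GRAY
        for t in graph[s]:
            if color[t] == GRAY:  # back edge: a cycle of event-free transitions
                return False
            if color[t] == WHITE and not dfs(t):
                return False
        color[s] = BLACK
        return True

    return all(dfs(s) for s in states if color[s] == WHITE)

# An event-triggered loop is fine: each pass consumes a new event.
ok = well_formed(["START", "RUN"],
                 [("START", "RUN", True), ("RUN", "START", True)])

# A condition-only loop could diverge on one event, so it is rejected.
bad = well_formed(["START", "RUN"],
                  [("START", "RUN", False), ("RUN", "START", False)])
```

The mechanized Coq version would additionally come with a machine-checked proof that this kind of check implies termination under the formal semantics.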

2015

A Review of Cloud Computing and Its Opportunities in the Development of Information Systems for SMEs

Authors
Cunha, CR; Morais, EP; Sousa, JP; Gomes, JP;

Publication
INNOVATION MANAGEMENT AND SUSTAINABLE ECONOMIC COMPETITIVE ADVANTAGE: FROM REGIONAL DEVELOPMENT TO GLOBAL GROWTH, VOLS I - VI, 2015

Abstract
This paper reviews the main characteristics of cloud computing, presenting its main components and modes of use. In addition to this technological review, a critical analysis of its potential and challenges in the context of SMEs is carried out. Understanding how cloud computing can become a powerful ally of SMEs in terms of organizational competitiveness, in a world where information systems have long proved decisive, is a reflection that SMEs whose core business is not technology need to undertake.

2015

Adam Hilger revisited: a museum instrument as a modern teaching tool

Authors
Carvalhal, MJ; Marques, MB;

Publication
EDUCATION AND TRAINING IN OPTICS AND PHOTONICS: ETOP 2015

Abstract
Spectroscopy can be historically traced back to the study of the dispersion of light by a glass prism. In the early 19th century, inspired by Newton's experiment, Fraunhofer created a device where an illuminated slit and a lens are placed before the prism; such a device was later transformed, by Kirchhoff and Bunsen, into a much handier and more precise observation and measurement instrument, the spectroscope. In the 1930s, the Physics Laboratory of the Faculty of Science of the University of Porto bought, from Adam Hilger, Ltd., London, a constant deviation spectrometer. The ultimate purpose was to set up a spectroscopy laboratory for teaching and research. This model's robust construction (the telescope and the collimator are rigidly fixed) makes it adequate for students' practice. To sweep across the spectrum, all it takes is to rotate the high-quality, constant deviation prism, known as a Pellin-Broca prism. Spectra in the 390-900 nm interval are observed either directly, through photographic recording, or, when working in the infrared range, by using a thermopile and associated galvanometer. The wavelength of the line under observation is read directly on a drum fixed to the prism's rotation mechanism. Details of the construction and operation of this spectrometer are explored against the background of present-day spectrometers, automatic and computerized, thereby offering a deeper understanding of spectroscopic analysis: for instance, the use of the raies ultimes powder, a mixture of 50 chemical elements whose emission spectra provide a way of calibrating the instrument.
