Virtual sensors: an industrial application for illumination attributes based on machine learning techniques.

Citation metadata

From: Annales des Telecommunications (Vol. 76, Issue 7-8)
Publisher: Springer
Document Type: Report; Brief article
Length: 255 words


Abstract:

Keywords: Virtual sensor; Light sensor; Machine learning

Nowadays, the Internet of Things is a technology used in a wide range of applications, empowering fields such as the smart city, smart transportation, and the manufacturing industry. The growing number of such deployments increases the demand for robust and intelligent solutions that provide relevant data while relying on as few resources as possible. To this end, it is essential to search for techniques and solutions that achieve a high level of quality of service with minimal hardware and cost. Machine learning can tackle this challenge by generating virtual data: the aim is to replicate a sensor's activity by utilizing real data from a subset of sensors. Such a task could be difficult with existing means, while the proposed approach might reduce it to a trivial calculation. In a similar fashion, it is possible to use simulation models for data analysis and model validation, by feeding the existing simulation models with varying conditions and comparing the results with the real ones. The current work aims to utilize the virtual IoT paradigm to immerse and test everyday applications under realistic conditions and constraints. Finally, the implementation of a prototype in a real-life use case is discussed, namely the illumination of an industrial environment.

Author Affiliation:
(1) Department of Computer Engineering and Informatics, University of Patras, Patras, Greece
(2) Computer Technology Institute and Press "Diophantus", Athens, Greece
(3) Athenian Brewery S.A. Patras Plant, Patras, Greece
(a) drakouleli@ceid.upatras.gr

Article History:
Received Date: 09/09/2019
Accepted Date: 05/20/2021
Registration Date: 05/20/2021
Online Date: 06/21/2021
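The core idea the abstract describes — replicating one sensor's readings from the real data of a subset of other sensors — can be sketched with a simple regression. The following is a minimal, hypothetical illustration, not the authors' actual method: the sensor names, the linear model, and the simulated lux data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: lux readings from three real light sensors
# (rows = time steps, columns = sensors).
real_sensors = rng.uniform(100.0, 900.0, size=(200, 3))

# Stand-in "historical" readings of the sensor to be virtualized:
# here, a noisy linear mix of its neighbors (an assumption for
# illustration, not a claim about the paper's dataset).
true_weights = np.array([0.5, 0.3, 0.2])
target = real_sensors @ true_weights + rng.normal(0.0, 5.0, size=200)

# Fit the virtual sensor with ordinary least squares,
# including an intercept column.
X = np.hstack([real_sensors, np.ones((200, 1))])
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

def virtual_sensor(readings):
    """Estimate the missing sensor's value from its neighbors' readings."""
    return float(np.dot(readings, coef[:3]) + coef[3])

estimate = virtual_sensor(real_sensors[0])
```

Once trained on historical data, `virtual_sensor` can replace the physical device at inference time, which matches the abstract's point that the replication becomes a trivial calculation.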

Source Citation

Gale Document Number: GALE|A667937602