Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/26323
Title: A Neural network approach to visibility range estimation under foggy weather conditions
Authors: Chaabani, Hazar
Kamoun, Faouzi
Bargaoui, Hichem
Outay, Fatma
YASAR, Ansar 
Issue Date: 2017
Publisher: Elsevier BV
Source: Shakshuki, Elhadi (Ed.). The 8th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN 2017) / The 7th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH-2017) / Affiliated Workshops, Elsevier BV, p. 466-471
Series/Report: Procedia Computer Science
Series/Report no.: 113
Abstract: The degradation of visibility due to foggy weather conditions is a common trigger for road accidents and, as a result, there has been a growing interest in developing intelligent fog detection and visibility range estimation systems. In this contribution, we provide a brief overview of the state-of-the-art contributions in relation to estimating visibility distance under foggy weather conditions. We then present a neural network approach for estimating visibility distances using a camera that can be fixed to a roadside unit (RSU) or mounted onboard a moving vehicle. We evaluate the proposed solution using a diverse set of images under various fog density scenarios. Our approach shows very promising results that outperform the classical method of estimating the maximum distance at which a selected target can be seen. The originality of the approach stems from the usage of a single camera and a neural network learning phase based on a hybrid global feature descriptor. The proposed method can be applied to support next-generation cooperative hazard & incident warning systems based on I2V, I2I and V2V communications.
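To make the abstract's pipeline concrete, the sketch below shows one way a single-camera visibility estimator of this kind can be wired together: a global frequency-domain descriptor (a simplified stand-in for the paper's hybrid global feature descriptor) feeding a small neural-network regressor trained against known visibility distances. The function names (global_fft_descriptor, synthetic_foggy_image), the network size, and the synthetic training data are illustrative assumptions, not the authors' implementation. The classical baseline the abstract refers to rests on Koschmieder's law, under which the meteorological visibility is roughly 3/k for atmospheric extinction coefficient k and a 5% contrast threshold.

```python
# Minimal, illustrative sketch (not the authors' code): estimate visibility
# distance from a single foggy image using a global frequency-domain
# descriptor and a small neural-network regressor.
import numpy as np
from sklearn.neural_network import MLPRegressor

def global_fft_descriptor(image, bins=16):
    """Radially-binned power spectrum of a grayscale image.
    Fog attenuates high spatial frequencies, so the energy distribution
    across frequency bands is a coarse cue for fog density."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    yy, xx = np.indices((h, w))
    radius = np.hypot(yy - h / 2, xx - w / 2)
    edges = np.linspace(0, radius.max(), bins + 1)
    feats = [spectrum[(radius >= lo) & (radius < hi)].mean()
             for lo, hi in zip(edges[:-1], edges[1:])]
    return np.log1p(feats)  # log scale stabilizes the dynamic range

# Synthetic stand-in for a labeled image set: random "scenes" blurred more
# strongly as the (hypothetical) visibility distance decreases.
rng = np.random.default_rng(0)

def synthetic_foggy_image(visibility_m, size=64):
    scene = rng.random((size, size))
    sigma = 200.0 / max(visibility_m, 1.0)          # heavier fog -> more blur
    k = np.exp(-0.5 * (np.arange(size) - size / 2) ** 2 / sigma ** 2)
    kernel = np.outer(k, k)
    kernel /= kernel.sum()
    blurred = np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(np.fft.ifftshift(kernel)))
    return np.real(blurred)

visibilities = rng.uniform(50, 400, 300)            # ground-truth distances in metres
X = np.array([global_fft_descriptor(synthetic_foggy_image(v)) for v in visibilities])

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:250], visibilities[:250])
print("mean abs. error (m):", np.abs(model.predict(X[250:]) - visibilities[250:]).mean())
```

In the paper's setting, the training labels would come from reference visibility measurements rather than synthetic blur, and the descriptor would combine several global image cues rather than a single Fourier-based one.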
Notes: Kamoun, F (reprint author), ESPRIT Sch Engn, ZI Chotrana 2, POB 160, Tunis, Tunisia. faouzi.kammoun@esprit.tn
Keywords: visibility distance; fog detection; intelligent transportation systems; meteorological visibility; driving assistance; neural networks; machine learning; Koschmieder Law; computer vision; Fourier Transform
Document URI: http://hdl.handle.net/1942/26323
DOI: 10.1016/j.procs.2017.08.304
ISI #: 000419236500061
Rights: © 2017 The Authors. Published by Elsevier B.V.
Category: C1
Type: Proceedings Paper
Validations: ecoom 2019
Appears in Collections:Research publications

Files in This Item:
File: chaabani 1.pdf
Description: Published version
Size: 747.79 kB
Format: Adobe PDF

SCOPUS™ Citations: 12 (checked on Sep 2, 2020)
Web of Science™ Citations: 34 (checked on Apr 30, 2024)
Page view(s): 90 (checked on Sep 7, 2022)
Download(s): 154 (checked on Sep 7, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.