Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/26323
Full metadata record
dc.contributor.author: Chaabani, Hazar
dc.contributor.author: Kamoun, Faouzi
dc.contributor.author: Bargaoui, Hichem
dc.contributor.author: Outay, Fatma
dc.contributor.author: YASAR, Ansar
dc.date.accessioned: 2018-07-12T14:05:50Z
dc.date.available: 2018-07-12T14:05:50Z
dc.date.issued: 2017
dc.identifier.citation: Shakshuki, Elhadi (Ed.). The 8th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN 2017) / The 7th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH-2017) / Affiliated Workshops, Elsevier BV, p. 466-471
dc.identifier.issn: 1877-0509
dc.identifier.uri: http://hdl.handle.net/1942/26323
dc.description.abstract: The degradation of visibility due to foggy weather conditions is a common trigger for road accidents and, as a result, there has been a growing interest to develop intelligent fog detection and visibility range estimation systems. In this contribution, we provide a brief overview of the state-of-the-art contributions in relation to estimating visibility distance under foggy weather conditions. We then present a neural network approach for estimating visibility distances using a camera that can be fixed to a roadside unit (RSU) or mounted onboard a moving vehicle. We evaluate the proposed solution using a diverse set of images under various fog density scenarios. Our approach shows very promising results that outperform the classical method of estimating the maximum distance at which a selected target can be seen. The originality of the approach stems from the usage of a single camera and a neural network learning phase based on a hybrid global feature descriptor. The proposed method can be applied to support next-generation cooperative hazard & incident warning systems based on I2V, I2I and V2V communications. (c) 2017 The Authors. Published by Elsevier B.V.
dc.description.sponsorship: This research was supported by Zayed University Research Incentive Fund (RIF) grant #R16075.
dc.language.iso: en
dc.publisher: Elsevier BV
dc.relation.ispartofseries: Procedia Computer Science
dc.rights: 2017 The Authors. Published by Elsevier B.V.
dc.subject.other: visibility distance
dc.subject.other: fog detection
dc.subject.other: intelligent transportation systems
dc.subject.other: meteorological visibility
dc.subject.other: driving assistance
dc.subject.other: neural networks
dc.subject.other: machine learning
dc.subject.other: Koschmieder Law
dc.subject.other: computer vision
dc.subject.other: Fourier Transform
dc.title: A Neural network approach to visibility range estimation under foggy weather conditions
dc.type: Proceedings Paper
local.bibliographicCitation.authors: Shakshuki, Elhadi
local.bibliographicCitation.conferencedate: 2017, September 18-20
local.bibliographicCitation.conferencename: 8th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN) / 7th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH)
local.bibliographicCitation.conferenceplace: Lund, Sweden
dc.identifier.epage: 471
dc.identifier.spage: 466
dc.identifier.volume: 113
local.format.pages: 6
local.bibliographicCitation.jcat: C1
dc.description.notes: Kamoun, F (reprint author), ESPRIT Sch Engn, ZI Chotrana 2, POB 160, Tunis, Tunisia. faouzi.kammoun@esprit.tn
local.publisher.place: SARA BURGERHARTSTRAAT 25, PO BOX 211, 1000 AE AMSTERDAM, NETHERLANDS
local.type.refereed: Refereed
local.type.specified: Proceedings Paper
local.relation.ispartofseriesnr: 113
local.classdsPublValOverrule/author_version_not_expected-
dc.identifier.doi: 10.1016/j.procs.2017.08.304
dc.identifier.isi: 000419236500061
local.bibliographicCitation.btitle: The 8th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN 2017) / The 7th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH-2017) / Affiliated Workshops
local.uhasselt.international: yes
item.fullcitation: Chaabani, Hazar; Kamoun, Faouzi; Bargaoui, Hichem; Outay, Fatma & YASAR, Ansar (2017) A Neural network approach to visibility range estimation under foggy weather conditions. In: Shakshuki, Elhadi (Ed.). The 8th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN 2017) / The 7th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH-2017) / Affiliated Workshops, Elsevier BV, p. 466-471.
item.contributor: Chaabani, Hazar
item.contributor: Kamoun, Faouzi
item.contributor: Bargaoui, Hichem
item.contributor: Outay, Fatma
item.contributor: YASAR, Ansar
item.fulltext: With Fulltext
item.accessRights: Open Access
item.validation: ecoom 2019
crisitem.journal.issn: 1877-0509
Appears in Collections: Research publications
Files in This Item:
File: chaabani 1.pdf | Description: Published version | Size: 747.79 kB | Format: Adobe PDF

SCOPUS citations: 54 (checked on Sep 26, 2025)
Web of Science citations: 39 (checked on Sep 27, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.