Design and Development of Automated Lane Detection Using Improved Canny Edge Detection Method
Abstract
Autonomous ground vehicles and self-driving cars require high-level situational awareness to operate safely and effectively in the real world. Road safety is a universal concern: improving safety and reducing road accidents, and thereby saving lives, is a central goal of Advanced Driver Assistance Systems (ADAS). Among the complex and challenging tasks facing future road vehicles is road lane detection, i.e., the identification of road boundaries. In driver-assistance systems, obstacle detection, and in particular moving-object detection, is a critical component of collision avoidance. Many sensors can be used for obstacle and lane detection, such as laser, radar, and vision sensors; the most frequently used approach detects road boundaries and lanes with a vision system on the vehicle. Detection of the various kinds of obstacles on the road relies mainly on the Inverse Perspective Mapping (IPM) technique. The system acquires the front view through a camera mounted on the vehicle and then applies a small number of processing stages, organized in a flexible framework, to detect lanes and objects. To overcome the challenging scenarios in this project, we use established methods, namely Canny edge detection and the Hough transform. The proposed method performs object and lane detection with a feature-based sliding-window technique: a rectangular window of fixed width and height "slides" over the image. An improved version with better features and higher accuracy is also designed and implemented. The resulting hybrid lane-detection technique achieves improved accuracy and a better recognition rate.
Article Details
This work is licensed under a Creative Commons Attribution 4.0 International License.