Published March 30, 2021 | Version v1
Journal article | Open Access

Development of Scale Invariant Lens Opacity Estimation System using Hough Circle Detection Transform, Normalization and Entropy

  • 1. Department of Electronics, RSCOE, Tathawade, Savitribai Phule Pune University, Pune; Assistant Professor, SKNSCOE, Korti, Pandharpur, Maharashtra, India.
  • 2. Director, Symbiosis Skills & Professional University (SSPU) Pune, Maharashtra, India.
  • 3. DTE, Pune, Maharashtra, India.

Description

A clear eye lens is responsible for correct vision. With ageing, the lens structure acquires opacity, causing foggy or blurred vision; this condition is termed cataract. If it remains unidentified and untreated, it may lead to permanent blindness. Due to hazardous changes in the environment and the adoption of sedentary lifestyles, diseases such as cataract are becoming a universal challenge for health organizations around the world, and the lack of medication and diagnostic facilities in developing countries makes cataract a severe vision problem. The proposed methodology offers an image-processing-based, low-cost solution for lens opacity (cataract) detection. In this system, the eye lens region is extracted from the input image using an iterative Hough circle detection transform. It is then normalized using Daugman's rubber sheet normalization algorithm, which makes the system scale invariant. Structural variation in the normalized lens image is estimated in terms of entropy or mean value, and comparison of the entropies of the right and left halves of the normalized image forms the basis for estimating lens opacity. This comparison is used to detect and categorize lens opacity or cataract: based on the structural features of the opacity, the system assigns each lens to one of three grades, "No cataract", "Cortical cataract", or "Nuclear cataract".
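
The pipeline in the description can be sketched in a few steps: detect the lens boundary with a Hough circle transform, unwrap the circular region into a fixed-size rubber sheet strip, and compare the entropies of the two halves of the strip. The Python/OpenCV outline below is a minimal illustration under stated assumptions, not the authors' implementation: the normalization size (64 x 360), the Hough parameters, the grading thresholds, and the input file name are hypothetical placeholders, and the Hough step is a single pass rather than the iterative detection described above.

    # Minimal sketch of the described pipeline (grayscale input assumed).
    # Parameter values and thresholds are illustrative placeholders.
    import cv2
    import numpy as np

    def detect_lens_circle(gray):
        """Locate the lens boundary with the Hough circle transform."""
        blurred = cv2.medianBlur(gray, 5)
        circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                                   minDist=gray.shape[0] // 2,
                                   param1=100, param2=30,
                                   minRadius=20, maxRadius=gray.shape[0] // 2)
        if circles is None:
            return None
        x, y, r = np.round(circles[0, 0]).astype(int)
        return x, y, r

    def rubber_sheet_normalize(gray, x, y, r, radial_res=64, angular_res=360):
        """Daugman-style unwrapping of the circular lens region into a fixed-size
        (radial_res x angular_res) strip; rows are radii, columns are angles."""
        thetas = np.linspace(0, 2 * np.pi, angular_res, endpoint=False)
        radii = np.linspace(0, 1, radial_res)
        strip = np.zeros((radial_res, angular_res), dtype=np.float32)
        for i, rho in enumerate(radii):
            xs = np.clip((x + rho * r * np.cos(thetas)).astype(int), 0, gray.shape[1] - 1)
            ys = np.clip((y + rho * r * np.sin(thetas)).astype(int), 0, gray.shape[0] - 1)
            strip[i, :] = gray[ys, xs]
        return strip

    def shannon_entropy(patch):
        """Shannon entropy of an 8-bit intensity patch."""
        hist, _ = np.histogram(patch, bins=256, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def grade_opacity(strip, diff_thresh=0.5, level_thresh=4.0):
        """Compare entropies of the two angular halves of the normalized strip.
        The grading rule and thresholds here are placeholder assumptions."""
        half = strip.shape[1] // 2
        e_left = shannon_entropy(strip[:, :half])    # angles 0..pi
        e_right = shannon_entropy(strip[:, half:])   # angles pi..2*pi
        if max(e_left, e_right) < level_thresh:
            return "No cataract"
        if abs(e_left - e_right) > diff_thresh:
            return "Cortical cataract"   # asymmetric, localized opacity
        return "Nuclear cataract"        # symmetric opacity around the nucleus

    if __name__ == "__main__":
        gray = cv2.imread("eye.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input path
        found = detect_lens_circle(gray)
        if found is not None:
            strip = rubber_sheet_normalize(gray, *found)
            print(grade_opacity(strip))

In this sketch, scale invariance comes from the fixed-size rubber sheet strip: whatever the detected lens radius, the unwrapped image always has the same dimensions, so the half-entropy comparison does not depend on how large the lens appears in the input image.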

Files

E86140310521.pdf (373.8 kB, md5:4962a758f928a449fbcfb4908dc337d4)

Additional details

Related works

Is cited by
Journal article (ISSN: 2278-3075)

Subjects

Retrieval Number: 100.1/ijitee.E86140310521
ISSN: 2278-3075