The research, conducted at Moorfields Eye Hospital in London in partnership with Google's DeepMind, found that a cutting-edge artificial intelligence system is capable of detecting more than 50 different eye conditions.
The study is potentially significant because it could dramatically reduce patients' waiting times for scans and relieve pressure on the healthcare system.
Dr Dominic King, the medical director of DeepMind Health, explained how the system was trained to read eye scans.
He said: "We used two neural networks, which are complex mathematical systems which mimic the way the brain operates, and inputted thousands of eye scans.
"They divided the eye into anatomical areas and were able to classify whether disease was present."
As well as trying to speed up diagnoses for patients with eye disease, DeepMind is working with Imperial College London to see whether artificial intelligence can be taught to interpret mammograms.
The overarching ambition behind the project is to improve the accuracy of breast cancer screenings.