The researchers are hopeful the new algorithm could be used to aid self-driving vehicles as well as search-and-rescue operations.
It works by videoing the "penumbra", the fringe of partial shadow at the edge of an obstruction, where objects hidden around the corner reflect a small amount of light onto the ground. From subtle changes in the penumbra over time, the software is able to assemble a series of one-dimensional images of the hidden scene.
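Conceptually, the recovery step might be sketched like this (a minimal illustration under assumed geometry, not the authors' actual pipeline; the corner position, radii, and synthetic frames below are invented for the example): subtract a background frame to isolate the faint penumbra signal, then average the residual along arcs of constant angle around the corner to produce a one-dimensional angular image.

```python
import numpy as np

def angular_profile(frame, background, corner, n_bins=32, radius=40):
    """Recover a 1-D angular image from the penumbra around a corner.

    frame, background: 2-D grayscale arrays (float).
    corner: (row, col) of the wall corner in the image.
    Returns one mean residual intensity per angular bin.
    """
    residual = frame - background          # isolate the faint penumbra signal
    rows, cols = np.indices(frame.shape)
    dy, dx = rows - corner[0], cols - corner[1]
    r = np.hypot(dy, dx)
    theta = np.arctan2(dy, dx)             # angle of each pixel about the corner
    mask = (r > 2) & (r < radius) & (theta >= 0) & (theta <= np.pi / 2)
    bins = np.linspace(0, np.pi / 2, n_bins + 1)
    idx = np.digitize(theta[mask], bins) - 1
    profile = np.zeros(n_bins)
    for b in range(n_bins):                # average residual over each arc
        sel = idx == b
        if sel.any():
            profile[b] = residual[mask][sel].mean()
    return profile

# Synthetic demo: a hidden "object" brightens one angular wedge slightly.
rng = np.random.default_rng(0)
background = np.full((100, 100), 100.0)
frame = background + rng.normal(0, 0.5, background.shape)
yy, xx = np.indices(frame.shape)
wedge = np.arctan2(yy - 50, xx - 50)
frame[(wedge > 0.5) & (wedge < 0.8) & (np.hypot(yy - 50, xx - 50) < 40)] += 2.0
p = angular_profile(frame, background, corner=(50, 50))
print(np.argmax(p))   # angular bin where the hidden object shows up
```

The one-dimensional profile is coarse, but tracked over successive video frames it reveals how many hidden objects there are and which way they are moving.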
Katherine Bouman, lead author on a new paper about the system, said: "Even though those objects aren't actually visible to the camera, we can look at how their movements affect the penumbra to determine where they are and where they're going. In this way, we show that walls and other obstructions with edges can be exploited as naturally-occurring 'cameras' that reveal the hidden scenes beyond them ...
"Given that the rain was literally changing the color of the ground, I figured that there was no way we'd be able to see subtle differences in light on the order of a tenth of a percent. But because the system integrates so much information across dozens of images, the effect of the raindrops averages out, and so you can see the movement of the objects even in the middle of all that activity."
Outside experts also see the work as a "significant step" in developing the field.
Professor Marc Christensen, dean of the Lyle School of Engineering at Southern Methodist University, who was not involved in the research, said: "The notion to even try to achieve this is innovative in and of itself, but getting it to work in practice shows both creativity and adeptness. This work is a significant step in the broader attempt to develop revolutionary imaging capabilities that are not limited to line-of-sight observation."