The problem I see is detecting the eye or lens at any kind of range. That's where I would start, since it seems like the tough part.
So, an MCU hooked up to a camera device of some sort. With no zoom, my 10MP camera takes 3880-pixel-wide images over about 60 degrees of view. (let's pretend this is evenly distributed for simpler math, and that there is no interpolation) That works out to almost an arc minute per pixel, so let's just call it that. The eye is about an inch across, so Wolfram Alpha tells me it subtends one arc minute, i.e. one pixel, at a max distance of about 87 meters. For an eyeball to be easy to see, though, let's say it needs to be at least 15 pixels wide, which cuts the usable range to roughly 6 meters. Using a cheaper camera, you could aim for face detection, then pick the eyes out and swing the laser across a larger area to dazzle the target. Or you could use a zoom lens and cover a certain narrower area.
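To make the arithmetic above checkable, here's a quick sketch using the same assumed numbers (3880 px across 60°, a 1-inch eye, 15 px to be "easy to see") and the small-angle approximation:

```python
import math

# Assumed numbers from the post: 3880 px across a 60° field of view.
SENSOR_WIDTH_PX = 3880
FOV_DEG = 60.0
EYE_WIDTH_M = 0.0254   # "about an inch"
MIN_PIXELS = 15        # pixels needed for the eye to be "easy to see"

# Angular resolution: ~0.93 arc minutes per pixel, i.e. almost 1 arcmin/px.
ARCMIN_PER_PX = FOV_DEG * 60.0 / SENSOR_WIDTH_PX

def max_distance_m(target_width_m: float, min_pixels: int) -> float:
    """Distance at which the target still spans min_pixels on the sensor.

    Small-angle approximation: subtended angle (rad) ~= size / distance.
    """
    angle_rad = math.radians(min_pixels * ARCMIN_PER_PX / 60.0)
    return target_width_m / angle_rad

print(f"{ARCMIN_PER_PX:.2f} arcmin per pixel")
print(f"eye at 1 px:  {max_distance_m(EYE_WIDTH_M, 1):.0f} m")
print(f"eye at 15 px: {max_distance_m(EYE_WIDTH_M, MIN_PIXELS):.1f} m")
```

The 1-pixel case lands in the ~90 m neighborhood (87 m if you round the pixel pitch to exactly 1 arcmin, as Wolfram Alpha would), and the 15-pixel case comes out to about 6 m.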
But detecting a good anti-glare anti-reflective lens would be a whole other problem.
I think a billion in grants is too small. To cover any area, you'd need a good high-res, larger-format sensor; a long-focus lens; lots of torque to move the sensor-and-lens combo; and a beefy CPU behind it if you want a chance of detecting the lens of a sniper's weapon. And you'll need multiple devices, because the snipers would just shoot the camera first.
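To put a rough number on "high res": here's a sketch with my own assumed values (a ~50 mm scope objective at 1000 m, neither figure from the post), asking how wide the sensor would need to be to keep 15 pixels on the lens while still covering a 60° field in one frame:

```python
import math

# Hypothetical scenario (my assumptions, not from the post above):
SCOPE_DIAMETER_M = 0.05   # ~50 mm objective lens on the sniper scope
RANGE_M = 1000.0          # a plausible long-range engagement distance
MIN_PIXELS = 15           # same "easy to see" threshold as the eyeball case
FOV_DEG = 60.0            # cover the same 60° field in a single frame

# Angular size of the objective at range (small-angle approximation).
angle_arcmin = math.degrees(SCOPE_DIAMETER_M / RANGE_M) * 60.0

# Resolution needed on the target, then scaled up to the full field of view.
px_per_arcmin = MIN_PIXELS / angle_arcmin
sensor_width_px = px_per_arcmin * FOV_DEG * 60.0

print(f"scope subtends {angle_arcmin:.2f} arcmin at {RANGE_M:.0f} m")
print(f"sensor width needed: {sensor_width_px:,.0f} px")
```

Under those assumptions the scope subtends under a fifth of an arc minute, and you'd need a sensor on the order of 300,000 pixels wide, i.e. tens of gigapixels for a full frame. That's why in practice you'd trade field of view for reach with a zoom lens and a gimbal, which is exactly the torque-and-optics problem above.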