Multi-spectral detection on satellites is to be expected, so tracking in real time is feasible with enough coverage density. The US is using commercial Trojan-horse projects to achieve this; Starlink is clearly a dual-use system. Even though the Keyhole satellites are the size of a bus, they are old designs. I suspect that modern CCDs have evolved enough that a full-sized telescope optical layout is no longer necessary to get high-resolution images. Unlike observations of remote stars, the photon flux from the surface of the Earth is high enough that ultra-focusing is not required.
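A rough back-of-the-envelope calculation illustrates the flux point. All figures here are my own assumptions (solar irradiance, albedo, a KH-11-class mirror diameter, orbit, and pixel footprint), not numbers from anyone in this thread:

```python
import math

# Rough sketch: photons per second landing on one detector pixel
# from a sunlit ground scene. Every number below is an assumption.
H = 6.626e-34        # Planck constant, J*s
C = 3.0e8            # speed of light, m/s
WAVELENGTH = 550e-9  # green light, m

# Sunlit ground: ~1000 W/m^2 irradiance, albedo ~0.3, Lambertian scatter
radiance = 0.3 * 1000 / math.pi   # W / m^2 / sr, roughly 95

aperture_d = 2.4   # m, assumed KH-11-class primary mirror
altitude = 300e3   # m, assumed low Earth orbit
gsd = 0.1          # m, assumed ground footprint of one pixel

aperture_area = math.pi * (aperture_d / 2) ** 2
solid_angle = aperture_area / altitude ** 2  # sr the aperture subtends

power_per_pixel = radiance * gsd ** 2 * solid_angle  # W
photon_energy = H * C / WAVELENGTH                   # J
photons_per_s = power_per_pixel / photon_energy

print(f"~{photons_per_s:.1e} photons/s per pixel")
```

Under these assumptions the count comes out around 10^8 photons per second per pixel, so even a millisecond exposure collects on the order of 10^5 photons. A faint star delivers a tiny fraction of that, which is the asymmetry being pointed at here.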
I think you may be confusing light detection with light collection in photography. CCDs are light-detection / image-sensing components that replaced the film of an older era. They have the advantage of being compact and of digitizing the image for signal processing, but it actually took them at least a decade to reach the color sensitivity of film.
Light collection is a totally different task, done by the optical aperture, which is an assembly of lenses. The bigger the aperture, the greater the amount of collected light and the clearer the image; there is no getting around that. The more megapixels on the CCD, the more you can do with the collected light, but that is about it. What has been improving recently is software-based (intelligent) image processing, which produces cleaner images by compensating for missing photons using prior geometric models and heuristic estimation. You can see what a smartphone camera can do with such a small aperture; couple that same system to a larger aperture and the result would be proportionally superior. So there is really no replacement for a good old large lens or reflector, whether in a satellite or a smartphone.
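To put numbers on the "no replacement for aperture" point: collected light scales with aperture area (diameter squared), while the diffraction-limited angular resolution (Rayleigh criterion) improves linearly with diameter. The two example apertures below, a ~5 mm phone lens and a ~100 mm front element, are my own rough guesses for illustration:

```python
WAVELENGTH = 550e-9  # m, green light

def light_grasp_ratio(d1: float, d2: float) -> float:
    """How much more light aperture d2 collects than d1 (area scales as diameter squared)."""
    return (d2 / d1) ** 2

def rayleigh_limit(d: float) -> float:
    """Diffraction-limited angular resolution in radians (Rayleigh criterion)."""
    return 1.22 * WAVELENGTH / d

phone = 0.005  # m, assumed smartphone aperture
lens = 0.100   # m, assumed large telephoto front element

print(light_grasp_ratio(phone, lens))                 # 400x the light
print(rayleigh_limit(phone) / rayleigh_limit(lens))   # 20x finer detail
```

A 20x wider aperture gathers 400x the photons and resolves 20x finer angular detail, which is why the software tricks, however clever, only go so far.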
As for geostationary satellites, unless someone figures out how to launch a telescope with a 25-meter aperture (just a guess) to that orbit, we can safely forget about getting any useful detail from an image taken there.
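The 25-meter guess can actually be sanity-checked with the Rayleigh criterion; the wavelength choice is my assumption, and the altitude is the standard geostationary figure:

```python
WAVELENGTH = 550e-9      # m, visible light (assumed)
GEO_ALTITUDE = 35_786e3  # m, geostationary altitude

def ground_resolution(aperture_d: float, altitude: float = GEO_ALTITUDE) -> float:
    """Best-case resolvable ground detail (m) for a diffraction-limited aperture."""
    return 1.22 * WAVELENGTH * altitude / aperture_d

print(f"{ground_resolution(25.0):.2f} m")  # ~1 m for a 25 m aperture
print(f"{ground_resolution(2.4):.1f} m")   # ~10 m for a KH-11-class mirror
```

So 25 meters lands almost exactly at 1-meter ground detail from GEO, while a bus-sized 2.4 m mirror at that distance could not resolve anything smaller than about 10 meters. The guess holds up.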