nhyone, I do hope you know the size of a photon... I don't believe the engineers will have to worry much about the size of the sensor being smaller than one for a loooong time.
TME: analog audio recordings pick up the intangibles, namely content above 18,000 Hz or so that affects the listener but cannot be consciously heard. Film doesn't have that -- what you see is what's there.
The grain structure is indeed random, but on a quality scan you can see the grains easily enough (on good 400-speed slide film, that is; 100 is a good bit tougher).
You are correct in stating that some devices show grain differently -- monitors and inkjet printers (and even digital files) represent things as a grid. Grain, which does not fall into said grid, also doesn't look as good when displayed on one. But if you sample finely enough, film becomes more and more like digital.
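Just to put a toy sketch behind "sample finely enough" (Python, with made-up grain counts and scan resolutions -- not any real film or scanner spec): scatter some random "grains" in a patch and count how they land on grids of increasing fineness. Once the grid is fine enough that each grain gets its own cell, the gridded scan carries essentially the same picture as the random original.

import numpy as np

rng = np.random.default_rng(0)

# "Continuous" grain: 5000 grains at random positions in a unit patch
# (purely illustrative numbers, not a real emulsion).
grains = rng.random((5000, 2))  # x, y in [0, 1)

def rasterize(points, n):
    """Count how many grains fall into each cell of an n x n grid (a crude scan)."""
    hist, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                bins=n, range=[[0, 1], [0, 1]])
    return hist

for n in (32, 128, 512, 2048):
    h = rasterize(grains, n)
    occupied = h[h > 0]
    print(f"{n:>4} x {n:<4} grid: {occupied.mean():.2f} grains per occupied cell")

# On the coarse grids many grains get mushed into one cell, so the random
# structure is lost; on the fine grids it's about one grain per cell, and the
# grid version is, for practical purposes, the same pattern you started with.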
What some people have argued about digital cameras is that they don't reproduce color as well. I don't know about you, but in my experience all film is biased one way or another; my Elite Chrome turns blue if there's too little light, some films are ultra saturated, etc. And when I scan them they look different again, blah blah blah.
Remember, this is photography. It's not how you get there, it's what you finish with. Affordable digital cameras -will- overtake film. It's merely a matter of time. Still like film? It definitely has its advantages (namely, you don't need to worry about memory cards -- just carry around a big bag of film when out in the field). Do what works for you. I don't see film disappearing anytime soon.
But from a computer engineering standpoint, it's only a matter of time. There will always be people who prefer a certain film for a certain application due to the results -- the same way some people still like vinyl LP records: they say they have a warmer, richer sound. Whatever works.