This turned into a rant that might or might not help you! Sorry about that, I'll go into this a bit deeper on Monday since I'm leaving now for the weekend.
Ah, the CEC URL is http://cade.scope.edu/ and they have some brief discussions on this (pretty good). And yes, you wouldn't find this on the net, because they have a flair for using no text in their navigation, so search engines get very hung up on it.
But to be honest, from your first post I thought you had problems with lens distortion (since it's a very common issue during tracking), and it could well have been this you were talking about.
Well, I'm not the definitive pro on camera filmback and such (I'm more of a procedural shaders guy and a particle dynamics worker), and a proper explanation would require a ton of pictures. You don't always have huge amounts of time to dedicate to the exact workings of things: you can push people along the way, but you can't do their jobs for them. By the way, I switched tactics about a year ago, from trying to explain very thoroughly to giving brief answers, since nobody understood anyway (except for the few who get the drift from brief posts regardless). The CEC site explains the camera values pretty well, and the rest you get from the Maya manual, especially the Maya Live documentation, so I won't go into that much here.
But about units, here goes some rant: to be honest, the camera values in Maya are real-world units (as long as your preference units are at the standard scale, though it doesn't really matter since cameras work the same way regardless of scale), which is why they are so awkward to increment. The focal length is in millimetres, the filmback in real-world film-gate dimensions, and the angle of view in degrees, and that is also why your other values get so high. This is very common in real-world physics: the numbers all make sense once you know what units they are talking about, and overly large values simply wouldn't. Photon mapping, on the other hand, is the number of photons to be shot, so 0.1 would be quite out of the question (in the real world you get bombarded by millions of photons all the time, not to mention the ones that never hit your retina).
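If it helps to see how those units hang together, here is a minimal Python sketch (plain math, nothing Maya-specific; the 36 mm filmback width and 35 mm lens are just example numbers I picked) showing how filmback and focal length determine the angle of view:

```python
import math

def horizontal_angle_of_view(filmback_mm, focal_length_mm):
    """Horizontal angle of view in degrees for a simple pinhole camera model."""
    return math.degrees(2.0 * math.atan(filmback_mm / (2.0 * focal_length_mm)))

# Example: a 36 mm wide filmback with a 35 mm lens
print(horizontal_angle_of_view(36.0, 35.0))  # roughly 54.4 degrees
```

Make the lens longer or the filmback smaller and the angle drops, which is exactly why wildly large values never come out of sensible real-world numbers.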
The only thing here that is really (kind of) unitless is the light intensity, which is reported in screen-space maxima (white with an intensity of 1 would be 255, 255, 255 in RGB), since we are not using floating-point output files, which would be more intelligent. Although many things affect this, such as decay, surface colour, surface falloff and so on. So for a truly white Lambert surface, lit by white 1, 1, 1 RGB light with an intensity of 1 and no falloff, the points on the surface that face the light would come out 255, 255, 255 white in an 8-bit image.
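To put numbers on that, here is a small Python sketch of the simplified Lambert maths I'm describing (my own toy model: it ignores decay, specularity and gamma, and just clamps the result to 8-bit):

```python
def lambert_pixel_8bit(surface_rgb, light_rgb, intensity, cos_angle):
    """Shade one surface point with a simple Lambert model and quantise to 8-bit."""
    shaded = [s * l * intensity * max(0.0, cos_angle)
              for s, l in zip(surface_rgb, light_rgb)]
    return [min(255, round(c * 255)) for c in shaded]

# A pure white Lambert surface, white light, intensity 1, facing the light (cos = 1)
print(lambert_pixel_8bit([1, 1, 1], [1, 1, 1], 1.0, 1.0))  # [255, 255, 255]
```

Drop the intensity to 0.5 or tilt the surface away from the light and the same function gives the darker pixel values you'd expect.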
By the way, what does light intensity have to do with camera settings?
Hope this gets you on your way; no offense taken, by the way.