This isn't exactly an answer to your question, but it is perhaps a clue to figuring it out.
First, take a look at the section on "Distortion in Camera Calibration" in this link.
The statement of interest is:
> x, y — Undistorted pixel locations. x and y are in normalized image coordinates. Normalized image coordinates are calculated from pixel coordinates by translating to the optical center and dividing by the focal length in pixels. Thus, x and y are dimensionless.
From the above, it would seem that you can convert the dimensionless coordinates back to physical units with simple scale-and-shift operations: multiply by the focal length in pixels, then add the optical center.
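For instance, the scale-and-shift that the documentation describes can be sketched like this (the focal lengths and optical center below are made-up values, purely for illustration):

```python
import numpy as np

# Hypothetical intrinsics: fx, fy are the focal lengths in pixels and
# (cx, cy) is the optical center (principal point) in pixels.
fx, fy = 800.0, 800.0
cx, cy = 320.0, 240.0

def pixel_to_normalized(u, v):
    """Translate to the optical center, divide by the focal length in pixels."""
    return (u - cx) / fx, (v - cy) / fy

def normalized_to_pixel(x, y):
    """Invert the normalization: scale by the focal length, shift by the center."""
    return x * fx + cx, y * fy + cy

u, v = normalized_to_pixel(0.1, -0.05)   # (400.0, 200.0)
x, y = pixel_to_normalized(u, v)         # round-trips to (0.1, -0.05)
```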
As far as the coefficients go ... I'm not even sure what it means to convert their "units." The job of the normalizing scale-and-shift operations above is to remap your physical coordinate frame to (roughly) the closed interval [-1, 1] in both the x and y directions, which makes the numerical solution for the polynomial coefficients much more stable. If you tried to solve for the polynomial coefficients in the original coordinate frame (and units), you would get back a completely different set of coefficients, if the solution returned anything of value at all, given the loss of numerical stability.
However, in general there is no simple linear scale factor relating the dimensionless polynomial coefficients obtained via normalization to those obtained when the normalization step is skipped ... because by definition, the polynomials do not implement linear transformations. Under a coordinate rescaling, each coefficient picks up a different power of the scale factor.
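To see this concretely, take a standard radial distortion polynomial. Rescaling the coordinates by a focal length f multiplies each coefficient by a different power of f, so no single scale factor maps one coefficient set onto the other. (The model and all numbers below are illustrative assumptions, not values from your calibration.)

```python
import numpy as np

# Assumed radial distortion model in normalized coordinates:
#   r_d = r * (1 + k1*r**2 + k2*r**4)
k1, k2 = -0.2, 0.05   # hypothetical coefficients
f = 800.0             # hypothetical focal length in pixels

# The same model expressed in pixel units (rp = f * r) becomes:
#   rp_d = rp * (1 + (k1/f**2)*rp**2 + (k2/f**4)*rp**4)
k1_pix = k1 / f**2
k2_pix = k2 / f**4

# Check that the two parameterizations agree on sample radii.
r = np.linspace(0.0, 0.5, 11)         # normalized radii
rp = f * r                            # the same radii in pixels
rd_norm = r * (1 + k1 * r**2 + k2 * r**4)
rd_pix = rp * (1 + k1_pix * rp**2 + k2_pix * rp**4)
assert np.allclose(f * rd_norm, rd_pix)

# Each coefficient is scaled by a *different* power of f, so there is
# no single factor relating the two coefficient sets:
print(k1_pix / k1)   # 1/f**2
print(k2_pix / k2)   # 1/f**4
```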
Having said all this, I suppose you could obtain something useful by comparing polynomial derivatives at points of interest (for example, at the optical boresight), but I don't think that's what you're really looking for.
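As a sketch of that last idea: the slope of the radial distortion curve is dimensionless, so by the chain rule the derivative computed in normalized coordinates agrees with the derivative computed in pixel coordinates at corresponding points, and at boresight (r = 0) both are exactly 1. (Same made-up model and coefficients as above, assumed purely for illustration.)

```python
import numpy as np

k1, k2 = -0.2, 0.05   # hypothetical distortion coefficients
f = 800.0             # hypothetical focal length in pixels

def d_norm(r):
    # d/dr of r*(1 + k1*r**2 + k2*r**4), in normalized coordinates
    return 1 + 3 * k1 * r**2 + 5 * k2 * r**4

def d_pix(rp):
    # The same derivative, in pixel coordinates (rp = f * r)
    k1p, k2p = k1 / f**2, k2 / f**4
    return 1 + 3 * k1p * rp**2 + 5 * k2p * rp**4

# The dimensionless slopes agree at corresponding points ...
r0 = 0.25
assert np.isclose(d_norm(r0), d_pix(f * r0))

# ... and at boresight both are exactly 1 (no local distortion).
print(d_norm(0.0), d_pix(0.0))
```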