The normal method is to start by exposing a piece of film through a commercial step wedge. The wedge is typically a piece of B&W film with each step successively darker by a specified amount. So you have essentially made a single exposure, but it contains a very wide range of light intensities. A common style of wedge covers a 10 f-stop range in 21 steps.
After developing, you measure each step on your film with a transmission densitometer, which is essentially a light meter for the darkness of film. It reports this as "optical density," thus the name densitometer.
Finally, you plot each step on a graph. For the scales, exposure goes on the horizontal axis in log units; since the amateur typically doesn't know the absolute intensity of the light source, relative log exposure values are used instead. The vertical axis is the measured film density. The result is basically the same thing as the published "characteristic curves" for film.
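To make the plotting step concrete, here's a small sketch of how you'd lay out the relative log exposure axis for a 21-step, 10-stop wedge. The density readings below are hypothetical placeholders, not measurements from any real film:

```python
# Sketch: build the relative log-exposure axis for a 21-step wedge
# covering 10 f-stops. One stop = 0.30 in log exposure units.
steps = 21
log_e_range = 10 * 0.30                  # 3.0 log units total
step_size = log_e_range / (steps - 1)    # 20 intervals -> 0.15 per step

# Relative log E, least-exposed step first.
rel_log_e = [round(i * step_size, 2) for i in range(steps)]

# Hypothetical measured densities (base+fog through the shoulder).
densities = [0.10, 0.10, 0.11, 0.12, 0.15, 0.20, 0.28, 0.38, 0.50,
             0.63, 0.77, 0.91, 1.05, 1.19, 1.33, 1.47, 1.60, 1.72,
             1.83, 1.92, 2.00]

# These (x, y) pairs are exactly what you'd plot for the curve.
for le, d in zip(rel_log_e, densities):
    print(f"rel log E {le:4.2f}  density {d:4.2f}")
```

Paste the same two columns into a spreadsheet and an XY scatter chart gives you the familiar S-shaped curve.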
Since about the mid-1980s, nearly all densitometers have a computer interface, at least as an option, so if you have a proper program, you can collect the numbers automatically. It's easy enough to make graphs using a computer spreadsheet. Or just bypass the graph and calculate what you want directly.
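As an example of bypassing the graph: if what you're after is the slope of the straight-line portion (gamma), you can fit it directly from the numbers. This is a hedged sketch using hypothetical density readings for the middle steps of a 21-step wedge; a plain least-squares fit stands in for reading the slope off a graph:

```python
# Sketch: compute gamma (slope of the straight-line portion)
# directly from step-wedge data, skipping the graph entirely.

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical straight-line steps only (0.15 log E apart).
log_e   = [1.20, 1.35, 1.50, 1.65, 1.80, 1.95, 2.10, 2.25, 2.40]
density = [0.63, 0.77, 0.91, 1.05, 1.19, 1.33, 1.47, 1.60, 1.72]

gamma = slope(log_e, density)
print(f"gamma ~ {gamma:.2f}")
```

Picking which steps count as "straight-line" is the judgment call; once you've chosen them, the arithmetic is the same thing the spreadsheet's SLOPE() function would give you.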