Densitometer readings are in log10: D = -log10(fraction of light passing through the film). For example, if the transmission is 10%, D = 1; for 1% transmission, D = 2, and so on.

Stops are log2, so you can convert D to stops by multiplying by log(10)/log(2) = 3.3219. So yes, an error of 0.1 is basically 1/3 of a stop... in DENSITY. Not in exposure, and not in processing time. They're related, but a 1/3 stop change in one will not cause that same change in the other two; it's all a bit non-linear, depending on the film type you're using, the contrast you're developing to and the starting density you're adjusting from.
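The two conversions above (density to transmission, and density to stops) can be sketched in a few lines of Python:

```python
import math

def density_to_transmission(d):
    """Transmission fraction from optical density: T = 10**(-D)."""
    return 10 ** -d

def density_to_stops(d):
    """Convert a density difference (log10) to stops (log2)."""
    return d * math.log(10) / math.log(2)  # factor ~= 3.3219

print(density_to_transmission(1.0))  # D=1 -> 0.1, i.e. 10% transmission
print(density_to_stops(0.1))         # ~0.332, i.e. about 1/3 stop
```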

Chromes are high contrast (gamma > 1), so achieving a 0.1D change in density generally requires less than a 1/3 stop change in exposure. Near the toe (the speed point) it differs, though, because of the toe shape. Negatives are usually low contrast (gamma 0.5 to 0.7), which means a 0.1D change in density usually requires about a 2/3 stop exposure change.
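A quick sketch of that relationship, assuming you're on the straight-line portion of the curve where delta_D = gamma * delta_log10(E) (it does not hold in the toe, as noted above); the gamma values are just illustrative:

```python
import math

STOPS_PER_DENSITY = math.log(10) / math.log(2)  # ~3.3219

def exposure_change_stops(delta_d, gamma):
    """Stops of exposure change needed to shift density by delta_d,
    assuming the straight-line region where delta_D = gamma * delta_log10(E)."""
    return (delta_d / gamma) * STOPS_PER_DENSITY

# High-contrast chrome (gamma ~1.8): 0.1D needs well under 1/3 stop
print(round(exposure_change_stops(0.1, 1.8), 2))  # ~0.18 stops
# Low-contrast negative (gamma ~0.5): 0.1D needs about 2/3 stop
print(round(exposure_change_stops(0.1, 0.5), 2))  # ~0.66 stops
```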

So that alone doesn't get you anywhere. Since you're trying to correct a chemical process, you're adding a certain quantity of chemicals: e.g. a pH change to the CD (colour developer) to cause a colour shift, or a pH change (or processing time change) to the FD (first developer) to change contrast and/or speed. How much you need to add, and how much you need to change your processing times, is specified in the process documentation, which you absolutely must read if you're doing process control. Make sure you get the documentation from the appropriate manufacturer, i.e. Fuji or Kodak or Tetenal or whoever, as the suggested corrections differ slightly.