That is an interesting question that Jacob Cohen addressed in

Things I Have Learned (So Far)

He writes:

“A less profound application of the less-is-more principle is to our habits of reporting numerical results. There are computer programs that report by default four, five, or even more decimal places for all numerical results. Their authors might well be excused because, for all the programmer knows, they may be used by atomic scientists. But we social scientists should know better than to report our results to so many places. What, pray, does an r = .12345 mean? or, for an IQ distribution, a mean of 105.6345? For N = 100, the standard error of the r is about .1 and the standard error of the IQ mean about 1.5. Thus, the 345 part of r = .12345 is only 3% of its standard error, and the 345 part of the IQ mean of 105.6345 is only 2% of its standard error. These superfluous decimal places are no better than random numbers. They are actually worse than useless because the clutter they create, particularly in tables, serves to distract the eye and mind from the necessary comparisons among the meaningful leading digits. Less is indeed more here.”

End of quote.
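
To see where Cohen's numbers come from, here is a minimal Python sketch. It assumes the conventional IQ population standard deviation of 15 and the large-sample approximation se(r) ≈ (1 − r²)/√(N − 1); Cohen does not spell out his formulas, so these are reasonable stand-ins that reproduce his figures:

```python
import math

N = 100

# Standard error of the mean for an IQ distribution
# (assuming the conventional population SD of 15)
se_mean = 15 / math.sqrt(N)
print(se_mean)  # 1.5 -- Cohen's "about 1.5"

# Large-sample approximation to the standard error of r;
# for r near 0 this is roughly 1 / sqrt(N - 1)
r = 0.12345
se_r = (1 - r**2) / math.sqrt(N - 1)
print(round(se_r, 3))  # ~0.099 -- Cohen's "about .1"

# The trailing "345" digits expressed as fractions of each standard error
print(round(0.00345 / se_r, 3))    # ~0.035 -> roughly 3% of se(r)
print(round(0.0345 / se_mean, 3))  # 0.023  -> roughly 2% of se(mean)
```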

So, r = 0.12 is good enough.

Some statistical tables use more than two decimal places, but it is then up to the writer to round to two decimal places when reporting the actual experimental results.
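
In practice this is just a formatting choice at reporting time; for instance, a small Python illustration:

```python
# Keep full precision in intermediate computations;
# round only when reporting the final result
r = 0.12345
print(f"r = {r:.2f}")  # prints: r = 0.12
```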