Document Type

Article

Publication Date

1996

Abstract

The classical algorithm for multiple-precision division normalizes digits during each step and sometimes makes correction steps when the initial guess for the quotient digit turns out to be wrong. A method is presented that runs faster by skipping most of the intermediate normalization and recovers from wrong guesses without separate correction steps.
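
To make the comparison concrete, here is a minimal sketch (in Python, chosen for readability; the function names such as classical_divide, the digit-list representation, and base 10 are assumptions made here for illustration, not details from the paper) of the classical schoolbook division the abstract describes: each quotient digit is guessed from the leading digits of the partial remainder and the divisor, then corrected when the guess turns out to be too large. It shows only the baseline algorithm, not the paper's faster method.

```python
# Sketch of the classical schoolbook algorithm: digit lists are stored
# most-significant-digit first, and each quotient digit is first guessed
# from the leading digits, then corrected downward if the guess is too
# large.  (Illustrative only; names and representation are assumptions.)

def strip(a):
    """Drop leading zeros, keeping at least one digit."""
    i = 0
    while i < len(a) - 1 and a[i] == 0:
        i += 1
    return a[i:]

def cmp_digits(a, b):
    """Return -1, 0, or 1 as a < b, a == b, a > b (numeric comparison)."""
    a, b = strip(a), strip(b)
    if len(a) != len(b):
        return -1 if len(a) < len(b) else 1
    return (a > b) - (a < b)

def mul_digit(a, q, base):
    """Multiply the digit list a by the single digit q."""
    out = [0] * (len(a) + 1)
    carry = 0
    for i in range(len(a) - 1, -1, -1):
        t = a[i] * q + carry
        out[i + 1] = t % base
        carry = t // base
    out[0] = carry
    return strip(out)

def sub_digits(a, b, base):
    """Compute a - b, assuming a >= b."""
    b = [0] * (len(a) - len(b)) + b
    out, borrow = [0] * len(a), 0
    for i in range(len(a) - 1, -1, -1):
        d = a[i] - b[i] - borrow
        borrow = 1 if d < 0 else 0
        out[i] = d + base if d < 0 else d
    return strip(out)

def classical_divide(u, v, base=10):
    """Return (quotient, remainder) of u / v as digit lists, msd first.
    v must be nonzero."""
    v = strip(v)
    quotient, rem = [], [0]
    for digit in u:
        rem = strip(rem + [digit])          # bring down the next digit
        if cmp_digits(rem, v) < 0:
            q = 0
        else:
            # Guess the quotient digit from the leading one or two digits
            # of the partial remainder over the leading digit of v.
            top = rem[0] if len(rem) == len(v) else rem[0] * base + rem[1]
            q = min(top // v[0], base - 1)
            # The guess is never too small, but it can be too large when
            # v's leading digit is small; correct it downward.
            while cmp_digits(mul_digit(v, q, base), rem) > 0:
                q -= 1
        quotient.append(q)
        rem = sub_digits(rem, mul_digit(v, q, base), base)
    return strip(quotient), rem

# Example: 1234 / 12 in base 10 gives quotient 102, remainder 10.
print(classical_divide([1, 2, 3, 4], [1, 2]))   # ([1, 0, 2], [1, 0])
```

Because this sketch never normalizes the divisor, the guess can overshoot by several units when the divisor's leading digit is small relative to the base (dividing 100 by 19, for instance, the first guess is 9 and four corrections bring it down to 5). The classical algorithm controls this by normalizing digits during each step; the method presented in the paper skips most of that intermediate normalization and recovers from wrong guesses without separate correction steps.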

Original Publication Citation

Smith, D. M., "A Multiple-Precision Division Algorithm," Mathematics of Computation, vol. 65 (1996), pp. 157-163.

Publisher Statement

First published in Mathematics of Computation in 1996 by the American Mathematical Society.
