Document Type

Article

Publication Date

1996

Abstract

The classical algorithm for multiple-precision division normalizes digits during each step and sometimes makes correction steps when the initial guess for the quotient digit turns out to be wrong. A method is presented that runs faster by skipping most of the intermediate normalization and recovers from wrong guesses without separate correction steps.
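As context for what the abstract means by "correction steps," here is a minimal sketch of the classical schoolbook division it contrasts against. This is not the paper's algorithm: the function name, the small base `B`, and the guess-from-one-leading-digit heuristic are illustrative assumptions. Each quotient digit is guessed from the divisor's leading digit and then corrected downward when the guess overshoots; when the divisor is not normalized, the guess can be far off, which is why classical implementations normalize at every step.

```python
def divmod_schoolbook(u, v, B=10):
    """Classical long division on nonnegative integers.

    Each quotient digit is first *guessed* using only the divisor's
    leading digit (what a real digit-array implementation can afford),
    then corrected downward when the guess is too large -- these are
    the "correction steps" the abstract refers to.
    """
    assert u >= 0 and v > 0
    # Leading digit of v and the number of digits below it.
    t, nv = v, 0
    while t >= B:
        t //= B
        nv += 1
    v_top = t  # most significant digit of v
    # Largest shift with v * B**shift <= u.
    k = 0
    while v * B ** (k + 1) <= u:
        k += 1
    q, r = 0, u
    for shift in range(k, -1, -1):
        vs = v * B ** shift
        # Guess the digit from the leading digit of v alone; since
        # v_top * B**nv <= v, the guess never undershoots.
        qd = min(B - 1, r // (v_top * B ** (nv + shift)))
        # Correction steps: decrement until the digit fits.
        while qd * vs > r:
            qd -= 1
        r -= qd * vs
        q += qd * B ** shift
    return q, r
```

For example, `divmod_schoolbook(100, 19)` guesses the digit 9 from the leading digit 1 and needs four correction steps to reach the true digit 5 — an unnormalized divisor makes the guess poor, which motivates both the classical per-step normalization and the paper's scheme for skipping most of it.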

Publisher Statement

First published in Mathematics of Computation (1996) by the American Mathematical Society.

Recommended Citation

Smith, D. M. "A Multiple-Precision Division Algorithm," Mathematics of Computation, vol. 65 (1996), pp. 157-163.

Included in

Mathematics Commons
