A variable-order, variable-step Taylor-series method in Cartesian space is presented that solves simultaneous first-order differential equations expressed in GMA-system canonical form with an accuracy comparable to the machine precision of the computer. In this method, a stepsize for each differential equation is calculated from a formula derived by setting the ratio of two appropriate terms among the first few Taylor-series terms to unity; the minimum of these values is used as the initial stepsize at each integration step and is halved whenever certain criteria are satisfied. In addition, when a dependent variable takes a value near zero, the integration is carried out with a very small stepsize. The Taylor-series method constructed here is first applied to the Lotka-Volterra equation to confirm the validity of the calculation algorithm, and the calculated values are found to agree completely with those of the conventional method, in which the Taylor-series solution is constructed in logarithmic space. Second, a differential equation whose solution is cos(t) is solved; the relative error is found to increase momentarily when a dependent variable takes a value near zero and to be zero or nearly zero elsewhere. Finally, the differential equation for a two-body problem is solved, and the accuracy of the calculated values is shown to remain high even after integration over 625 cycles. In conclusion, the Taylor-series method constructed in Cartesian space is considered to be highly applicable to simultaneous differential equations and to provide highly accurate results.
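The stepsize-selection and series-summation procedure described above can be sketched as follows for the cos(t) test system y1' = y2, y2' = −y1. This is a minimal illustration, not the paper's exact algorithm: the specific term ratio |c[1]|/|c[2]|, the bounds `h_min`/`h_max`, and the fixed order are assumptions made for the sketch.

```python
import math


def taylor_coeffs(y1, y2, order):
    """Normalized Taylor coefficients c_k = y^(k)(t)/k! for the test
    system y1' = y2, y2' = -y1 (solution y1 = cos t, y2 = -sin t).
    The recurrences follow directly from the differential equations."""
    c1, c2 = [y1], [y2]
    for k in range(order):
        c1.append(c2[k] / (k + 1))   # (k+1) c1[k+1] = c2[k]
        c2.append(-c1[k] / (k + 1))  # (k+1) c2[k+1] = -c1[k]
    return c1, c2


def step_size(c, h_min=1e-3, h_max=1.0):
    """Illustrative stepsize rule (an assumption, not the paper's exact
    formula): set the ratio of two successive low-order terms,
    |c[2]| h / |c[1]|, to unity, then clip to [h_min, h_max].  When
    |c[1]| is small, the stepsize shrinks toward h_min, mirroring the
    very small steps taken when a variable is near zero."""
    if abs(c[2]) == 0.0:
        return h_max
    return min(max(abs(c[1]) / abs(c[2]), h_min), h_max)


def integrate(t_end, order=20):
    """Advance from t = 0 to t_end, taking at each step the minimum of
    the per-equation stepsizes and summing the truncated series."""
    t, y1, y2 = 0.0, 1.0, 0.0
    while t < t_end:
        c1, c2 = taylor_coeffs(y1, y2, order)
        h = min(step_size(c1), step_size(c2), t_end - t)
        s1 = s2 = 0.0
        for k in range(order, -1, -1):  # Horner evaluation in h
            s1 = s1 * h + c1[k]
            s2 = s2 * h + c2[k]
        y1, y2, t = s1, s2, t + h
    return y1, y2
```

For example, `integrate(math.pi)` returns values close to (cos π, −sin π) = (−1, 0), with small steps clustered where a dependent variable crosses zero.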