James D. Turner

Texas A&M University, USA

Title: Revisiting Taylor series models for astrodynamics applications

Biography

Abstract

Taylor series methods for generating solutions to differential equations have existed since the earliest developments of calculus. For many years these methods fell out of favor because of the complexity and sheer volume of work required to derive and code the time-derivative models for vector differential equations. Experience with these solution strategies indicates that more than 10 derivative terms are often required. The advantage of Taylor series models is that larger step sizes can be used for propagating the solutions, and many computer-aided algebra tools exist for generating the symbolic Taylor series models. This work develops closed-form, arbitrary-order analytic time-derivative models for celestial mechanics applications that allow nonlinear Taylor series models to outperform state-of-the-art numerical integration methods. Three computational advantages are realized: (1) a self-adapting step-size algorithm (no tuning or analyst intervention required), (2) double-precision accuracy over the entire LEO-to-GEO range of applications, and (3) very high-speed computation. Though initially derived for particle models, the same computational benefits are expected for Taylor series models that extend to rigid-body attitude/trajectory coupling behaviors. The improved integration performance is attributed to these models retaining 10+ derivative terms, whereas existing numerical methods sample an equation multiple times to generate an averaged estimate of the behavior, approximating only 4-8 derivative orders.
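The core idea, closed-form higher derivatives combined with a step size chosen from the size of the last retained Taylor term, can be sketched on a toy problem. The following minimal Python illustration uses the harmonic oscillator x'' = -x rather than the two-body models of the abstract; the derivative recurrence and the error-based step heuristic are illustrative assumptions, not the author's implementation.

```python
import math

def taylor_step(x, v, h, order=12):
    # Harmonic oscillator x'' = -x: the derivatives satisfy the
    # closed-form recurrence d^(n+2)x/dt^(n+2) = -d^(n)x/dt^(n),
    # so every Taylor coefficient follows directly from (x, v).
    d = [0.0] * (order + 1)
    d[0], d[1] = x, v
    for n in range(2, order + 1):
        d[n] = -d[n - 2]
    xn = sum(d[n] * h**n / math.factorial(n) for n in range(order + 1))
    vn = sum(d[n] * h**(n - 1) / math.factorial(n - 1)
             for n in range(1, order + 1))
    return xn, vn

def adapt_h(x, v, order=12, tol=1e-15):
    # Self-adapting step: pick h so the magnitude of the last retained
    # Taylor term stays below tol (a common heuristic, assumed here).
    c = max(abs(x), abs(v), 1e-300)
    return (tol * math.factorial(order) / c) ** (1.0 / order)

t_end = 10.0
t, x, v = 0.0, 1.0, 0.0
while t < t_end:
    h = min(adapt_h(x, v), t_end - t)
    x, v = taylor_step(x, v, h)
    t = t_end if h == t_end - t else t + h
```

With order 12 and a tolerance near machine precision, the adaptive rule yields steps of roughly 0.3, far larger than a comparable fixed-step Runge-Kutta scheme would tolerate, which is the step-size advantage the abstract attributes to high-order Taylor models.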