TECHNICAL PAPERS

Overcoming Limitations of the Conventional Strain-Life Fatigue Damage Model

Author and Article Information
T. E. Langlais, J. H. Vogel

Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN 55455

J. Eng. Mater. Technol. 118(1), 103-108 (Jan 01, 1996) (6 pages) doi:10.1115/1.2805921 History: Received April 13, 1994; Revised December 10, 1994; Online November 27, 2007

Abstract

The strain-based approach to fatigue life prediction usually relies on the conventional strain-life equation, which correlates the elastic and plastic strains with life. The correlation is based on separate log-linear curve fits of the elastic and plastic components of the strain data versus life. It is well known, however, that these linear relationships may be valid only within a specific interval of stress or strain. When material behavior approaches elastic-perfectly plastic, for instance, it is not uncommon for the test data to deviate from linearity at both very high and very low strains. For such materials, a separate fit of each curve is likely to give material constants significantly inconsistent with the fit of the cyclic stress-strain curve, especially if a good local fit over a restricted interval is obtained. In this work, some of the errors that arise as a result of this inconsistency are described, and recommended methods are developed for treating these errors. Numerical concerns are also addressed, and sample results are included.
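The conventional strain-life equation the abstract refers to is the Coffin-Manson-Basquin relation, whose elastic and plastic terms are each log-linear in life. A minimal sketch of that relation, and of the compatibility conditions that tie its constants to the cyclic stress-strain curve, is given below; the material constants used here are illustrative assumptions only, not values from the paper.

```python
def strain_amplitude(reversals, E, sigma_f, b, eps_f, c):
    """Total strain amplitude as the sum of the elastic (Basquin) and
    plastic (Coffin-Manson) log-linear terms:
        eps_a = (sigma_f'/E) * (2N)^b + eps_f' * (2N)^c
    where `reversals` is 2N, the number of load reversals to failure."""
    elastic = (sigma_f / E) * reversals ** b
    plastic = eps_f * reversals ** c
    return elastic + plastic

# Consistency with the cyclic stress-strain curve, sigma_a = K' * eps_p^n',
# requires n' = b/c and K' = sigma_f' / (eps_f')^(b/c). Fitting the elastic
# and plastic curves separately, as the abstract notes, can violate these
# relations. Constants below are assumed, generic steel-like values.
E = 200_000.0    # elastic modulus, MPa (assumed)
sigma_f = 900.0  # fatigue strength coefficient sigma_f', MPa (assumed)
b = -0.095       # fatigue strength exponent (assumed)
eps_f = 0.35     # fatigue ductility coefficient eps_f' (assumed)
c = -0.54        # fatigue ductility exponent (assumed)

n_prime = b / c                       # cyclic strain-hardening exponent implied by the fits
K_prime = sigma_f / eps_f ** n_prime  # cyclic strength coefficient implied by the fits

total = strain_amplitude(1e4, E, sigma_f, b, eps_f, c)
```

Comparing `n_prime` and `K_prime` computed this way against an independent fit of the cyclic stress-strain curve is one quick check for the kind of inconsistency between the two fits that the paper examines.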

Copyright © 1996 by The American Society of Mechanical Engineers