TECHNICAL PAPERS

Fatigue-Life Prediction Methodology Using a Crack-Closure Model

Author and Article Information
J. C. Newman

Mechanics of Materials Branch, NASA Langley Research Center, Hampton, VA 23681-0001

J. Eng. Mater. Technol. 117(4), 433-439 (Oct 01, 1995) (7 pages) doi:10.1115/1.2804736 History: Received July 15, 1995; Online November 27, 2007

Abstract

This paper reviews the capabilities of a plasticity-induced crack-closure model and life-prediction code, FASTRAN, to predict fatigue lives of metallic materials using small-crack theory. Crack-tip constraint factors, to account for three-dimensional state-of-stress effects, were selected to correlate large-crack growth rate data as a function of the effective-stress-intensity factor range (ΔKeff) under constant-amplitude loading. Some modifications to the ΔKeff-rate relations were needed in the near-threshold regime to fit small-crack growth rate behavior and endurance limits. The model was then used to calculate small- and large-crack growth rates, and to predict total fatigue lives, for notched specimens made of several aluminum alloys and a titanium alloy under constant-amplitude and spectrum loading. Fatigue lives were calculated using the crack-growth relations and microstructural features like those that initiated cracks for the aluminum alloys. An equivalent-initial-flaw-size concept was used to bound the fatigue lives for the titanium alloy. Results from the tests and analyses agreed well.
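To illustrate the general flow of a life prediction based on an effective-stress-intensity-factor growth law, the sketch below numerically integrates a simple Paris-type relation, da/dN = C(ΔKeff)^m, from an assumed equivalent initial flaw size to an assumed final crack length. This is not the FASTRAN model itself (which computes crack-opening stresses from a plasticity-induced closure analysis with constraint factors); the closure ratio U, geometry factor F, material constants, and flaw sizes are all illustrative assumptions.

```python
# Minimal sketch (not FASTRAN): estimate fatigue life by integrating
# dN/da = 1 / [C * (ΔKeff)^m] from an equivalent initial flaw size a_i
# to an assumed final crack length a_f. All constants are hypothetical.

import math

# Assumed material and loading parameters (illustrative only)
C, m = 1.0e-11, 3.0      # Paris-law constants: da/dN in m/cycle, ΔK in MPa*sqrt(m)
U = 0.7                  # assumed closure ratio: ΔKeff = U * ΔK
delta_sigma = 200.0      # applied stress range, MPa
F = 1.12                 # assumed constant geometry factor for a small surface crack

a_i = 20e-6              # equivalent initial flaw size, m (inclusion-sized defect)
a_f = 5e-3               # assumed final crack length, m


def delta_K_eff(a: float) -> float:
    """Effective stress-intensity factor range at crack length a (m)."""
    return U * delta_sigma * F * math.sqrt(math.pi * a)


def fatigue_life(a_start: float, a_end: float, steps: int = 10_000) -> float:
    """Cycles to grow the crack from a_start to a_end (trapezoidal integration of dN/da)."""
    a_values = [a_start + (a_end - a_start) * k / steps for k in range(steps + 1)]
    cycles = 0.0
    for a0, a1 in zip(a_values[:-1], a_values[1:]):
        dN_da0 = 1.0 / (C * delta_K_eff(a0) ** m)
        dN_da1 = 1.0 / (C * delta_K_eff(a1) ** m)
        cycles += 0.5 * (dN_da0 + dN_da1) * (a1 - a0)
    return cycles


if __name__ == "__main__":
    print(f"Estimated fatigue life: {fatigue_life(a_i, a_f):.3e} cycles")
```

Because most of the life is spent while the crack is small, the predicted cycle count is dominated by the assumed initial flaw size and by the near-threshold growth-rate behavior, which is why the paper emphasizes small-crack ΔKeff-rate data and the equivalent-initial-flaw-size bounds.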

Copyright © 1995 by The American Society of Mechanical Engineers