PREDICTIVE SCIENCE AND TECHNOLOGY IN MECHANICS AND MATERIALS

Microstructure Design to Improve Wear Resistance in Bioimplant UHMWPE Materials

Author and Article Information
D. S. Li1

School of Materials Science and Engineering, Georgia Institute of Technology, 771 Ferst Drive, Atlanta, GA 30332-0245
dli@gatech.edu

H. Garmestani

School of Materials Science and Engineering, Georgia Institute of Technology, 771 Ferst Drive, Atlanta, GA 30332-0245
hamid.garmestani@mse.gatech.edu

S. Ahzi

IMFS, University of Strasbourg, 2 Rue Boussingault, 67000 Strasbourg, France
ahzi@imfs.u-strasbg.fr

M. Khaleel

Computational Science and Mathematics Division, Pacific Northwest National Laboratory, Richland, WA 99352
moe.khaleel@pnl.gov

D. Ruch

Public Research Centre Henri Tudor, AMS, 66 rue de Luxembourg, B.P. 144, 4002 Esch/Alzette, Luxembourg
david.ruch@tudor.lu

1 Corresponding author.

J. Eng. Mater. Technol. 131(4), 041211 (Sep 03, 2009) (7 pages) doi:10.1115/1.3183786
History: Received June 13, 2009; Revised June 29, 2009; Published September 03, 2009

A microstructure design framework for multiscale modeling of wear resistance in bioimplant materials is presented here. Increasing the service lifetime of arthroplasty implants depends on the ability to predict the wear resistance and microstructure evolution of bioimplant materials made from ultra-high molecular weight polyethylene (UHMWPE) during processing. Experimental results show that the anisotropy introduced during deformation increases wear resistance in desired directions: after uniaxial compression, the wear resistance along the direction perpendicular to the compression direction increased by a factor of 3.3. Micromechanical models are used to predict the microstructure evolution and the improvement in wear resistance during processing, and the predicted results agree well with the experimental data. These models may guide the materials designer in optimizing processing to achieve better wear behavior along desired directions.
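The link between deformation-induced anisotropy and directional wear resistance can be made concrete with a short sketch. The Python snippet below is a minimal illustration, not the micromechanical model used in the paper: it assumes an Archard-type relation in which wear resistance scales with hardness, takes hardness to be proportional to a direction-dependent elastic modulus, and interpolates that modulus between the directions parallel and perpendicular to the compression axis with a simple Reuss-style mixing rule. The function names and moduli values are hypothetical placeholders, not data from the study.

    import numpy as np

    def directional_modulus(e_parallel, e_perpendicular, theta_deg):
        # Hypothetical Reuss-style (inverse) mixing rule for a transversely
        # isotropic material; theta_deg is the angle from the compression axis.
        theta = np.radians(theta_deg)
        c2, s2 = np.cos(theta) ** 2, np.sin(theta) ** 2
        return 1.0 / (c2 / e_parallel + s2 / e_perpendicular)

    def relative_wear_resistance(e_theta, e_reference):
        # Archard-type assumption: wear volume ~ load / hardness, with hardness
        # taken proportional to the directional modulus, so wear resistance
        # relative to the undeformed reference reduces to a modulus ratio.
        return e_theta / e_reference

    # Hypothetical moduli (GPa): along and across the compression axis after
    # deformation, plus an isotropic as-received reference. Not paper data.
    e_par, e_perp, e_iso = 0.9, 1.4, 1.0
    for angle_deg in (0, 45, 90):
        e = directional_modulus(e_par, e_perp, angle_deg)
        r = relative_wear_resistance(e, e_iso)
        print(f"theta = {angle_deg:2d} deg: E = {e:.2f} GPa, "
              f"relative wear resistance = {r:.2f}")

Under these assumptions, a higher modulus perpendicular to the compression axis maps directly to a higher relative wear resistance in that direction, mirroring the qualitative trend reported in the abstract.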

Copyright © 2009 by American Society of Mechanical Engineers

Figures

Figure 1

Number of procedures in the United States in 1990, 2005, and projected for 2030 for (a) primary and revision TKA and (b) primary and revision THA; (c) expenditures on primary and revision THA in 2005 and projected for 2030

Figure 2

Structure of UHMWPE used in bioimplant liners at multiple length scales

Figure 3

Schematic geometry of the sample during uniaxial compression, illustrating the sample coordinate definition used in the texture measurements and the wear resistance study. The loading direction is the compression direction.

Figure 4

Recalculated (001), (010), and (100) pole figures of UHMWPE (a) as received, (b) uniaxially compressed to a strain of 20%, (c) uniaxially compressed to a strain of 50%, and (d) uniaxially compressed to a strain of 100%. The crosshair marks the compression loading direction.

Figure 5

Evolution of the texture component intensities measured in UHMWPE samples during uniaxial compression

Figure 6

Predicted texture evolution of UHMWPE during uniaxial compression using the micromechanical composite model, showing the (001) pole figure (a) as received, (b) at a strain of 20%, and (c) at a strain of 100%

Figure 7

Measured and predicted wear behavior and predicted elastic modulus of UHMWPE on the plane parallel to the compression direction: (a) along the direction parallel to the compression direction and (b) along the direction perpendicular to the compression direction
