20. March 2015   9:48 am
Dr. K. L. Mittal, Dr. Robert H. Lacombe


The previous issue of the “SURFACES: THE INVISIBLE UNIVERSE” blog focused on the topic of polymer surface modification. In this issue we continue on this topic and would again like to remind the reader of the upcoming Tenth International Symposium on Polymer Surface Modification: Relevance to Adhesion, to be held at the University of Maine, Orono, June 22-24 (2015). All readers are cordially invited to join the symposium either to present a paper on their current work in this field or to simply attend and greatly expand their awareness of current developments. Further details are available on the conference web site at: www.mstconf.com/surfmod10.htm

 

PLASMA CHEMISTRY OF POLYMER SURFACES

We continue our discussion on the topic of polymer surface modification via a book on this most important subject:

The Plasma Chemistry of Polymer Surfaces: Advanced Techniques for Surface Design, by Jörg Friedrich (Wiley-VCH Verlag GmbH & Co. KGaA, 2012)

The author began his studies in macromolecular chemistry at the German Academy of Sciences in Berlin, has been active in this field ever since, and is now a professor at the Technical University of Berlin. He is best known to us through his participation in previous gatherings of the above-mentioned Polymer Surface Modification symposium series, going back to 1993.

In his introduction Prof. Friedrich points out the apparently incredible fact that more than 99% of all visible matter is in the plasma state. A moment's reflection, however, easily confirms this statement, since the Sun, which by itself accounts for more than 99% of all matter in the solar system, is in fact an exceedingly dense and hot ball of plasma. Here on earth the plasma state is rarely observed outside of special devices, which produce mainly low-pressure and atmospheric-pressure plasmas delivering moderate quantities of energy, transferred chiefly through the kinetic energy of free electrons. Such plasmas have sufficient energy to produce reactive species and photons which can initiate all types of polymerizations or activate the surfaces of normally inactive polymers. Thus plasmas offer the opportunity to promote chemical reactions at surfaces which would otherwise be difficult to achieve. However, the very active nature of plasma systems also presents a problem: the broadly distributed energies in the plasma can initiate a wide range of unwanted reactions, including polymer chain scission and crosslinking. The problem then becomes how to tame the plasma into performing only the desired chemical reactions while eliminating unwanted and destructive processes. This is the topic to which we will give more attention shortly, but first a quick look at the contents of the volume.

The volume is divided into 12 separate chapters as follows:

  1. Introduction
  2. Interaction Between Plasma and Polymers
  3. Plasma
  4. Chemistry and Energetics in Classic and Plasma Processes
  5. Kinetics of Polymer Surface Modification
  6. Bulk, Ablative and Side Reactions
  7. Metallization of Plasma-Modified Polymers
  8. Accelerated Plasma-Aging of Polymers
  9. Polymer Surface Modifications with Monosort Functional Groups
  10. Atmospheric-Pressure Plasmas
  11. Plasma Polymerization
  12. Pulsed-Plasma Polymerization

Given the above list I think it can fairly be said that the volume covers the entire range of surface chemistries associated with plasma processes, and far more topics than can be adequately addressed in this review. Thus the remainder of this column will focus on the problem outlined above of controlling the surface chemistry by taming normally indiscriminate plasma reactions. This problem is discussed in chapters 9 and 12 of Prof. Friedrich's book.

Chapter 9 attacks the problem of controlling an otherwise unruly surface chemistry initiated by aggressive plasma reactions through the use of “monosort functional” groups. For the benefit of the uninitiated we give a short tutorial on the concept of the functional group in organic chemistry. The term arises from classic organic chemistry and typically refers to chemical species which engage in well known chemical reactions; the classic example is chemical species attached to hydrocarbon chains. As is well known, the hydrocarbons form a series of molecules composed solely of carbon and hydrogen. The simplest is methane, or natural gas, which is one carbon atom with 4 hydrogens attached in a tetrahedral geometry, commonly symbolized as CH4. The chemistry of carbon allows it to form chains of indefinite length, and in the hydrocarbon series each carbon is attached to two other carbons and two hydrogens, except for the terminal carbons, which attach to one other carbon and 3 hydrogens. Moving up the series we get to the chain with 8 carbons, called octane, the basic component of the gasoline which powers nearly all motor vehicles. Octane is a string of 8 carbons with 6 in the interior and two on the ends of the chain. The interior carbons carry two hydrogens each and the two end carbons carry 3 hydrogens each, giving a total of 18 hydrogens; octane is thus designated C8H18. Moving on to indefinitely large chain lengths we arrive at polyethylene, a common thermoplastic used in fabricating all varieties of plastic containers such as Tupperware®, plastic sheeting and wire insulation.

Outside of being quite flammable, the low molecular weight hydrocarbons have a rather boring chemistry in that they react only sluggishly with other molecules. However, if so-called functional groups are introduced the chemistry becomes much more interesting. Take the case of ethane, C2H6, the second molecule in the series, a gas similar to methane but roughly twice as heavy. If we replace one of the hydrogens with the hydroxyl functional group, designated -O-H, which is essentially a fragment of a water molecule (i.e., H-O-H with one H lopped off), we get the molecule C2H6O (more familiarly written C2H5OH), which has dramatically different properties. Ethane, the water-insoluble gas, becomes ethanol, a highly water-soluble liquid also known as grain alcohol and much better known as the active ingredient in all intoxicating beverages. Thus through the use of functional groups chemists can work nearly miraculous changes in the properties of common materials, and Prof. Friedrich's monosort functionalization is a process for using plasmas to perform this bit of magic on polymer surfaces by attaching the appropriate functional groups. The process can be rather tricky, however, and requires understanding of the physical processes involved at the atomic and molecular level.
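
For readers who like to see the bookkeeping, the hydrogen counts quoted above follow directly from the chain picture just described. The short sketch below (Python, purely for illustration; it is not from Prof. Friedrich's book) reproduces the hydrogen counts for methane, ethane and octane from the rule that interior carbons carry two hydrogens and terminal carbons carry three.

```python
# Minimal sketch: hydrogen count for a straight-chain alkane (the CnH2n+2 series).
# Each interior carbon carries 2 hydrogens and each of the 2 terminal carbons carries 3.

def alkane_formula(n_carbons: int) -> str:
    if n_carbons == 1:
        n_hydrogens = 4                              # methane is the lone single-carbon case
    else:
        n_hydrogens = 2 * (n_carbons - 2) + 2 * 3    # interior carbons + two chain ends
    return f"C{n_carbons}H{n_hydrogens}"

for n, name in [(1, "methane"), (2, "ethane"), (8, "octane")]:
    print(name, alkane_formula(n))                   # -> C1H4, C2H6, C8H18
```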

The following example illustrates the nature of the problem and how successful functionalization can be carried out using plasma technology.

Figure (1a) illustrates the basic problem with most common plasma surface treatments. The exceedingly high energy associated with the ionization of oxygen, coupled with the equally high energies in the tail of the electron energy distribution, produces a panoply of functional groups plus free radicals that can give rise to degradation and crosslinking in the underlying polymer substrate. Thus it would be difficult to control the chemical behavior of the nonspecifically functionalized surface shown in Fig.(1a) with regard to further chemical treatment such as the grafting on of a desired molecule. In essence, the presence of so many reactive species with widely different chemical behaviors makes any further chemical treatment of the surface very difficult to control.

Prof. Friedrich points out that unfortunately most plasma gases behave as shown in Fig.(1a), but somewhat surprisingly the use of bromine (symbol Br) is different due to a special set of circumstances related to the thermodynamic behavior of this species, which are too technical to go into in this discussion. It turns out that bromine plasmas can be controlled to give a uniform functionalization of the polymer surface, as shown in Fig.(1b). The now uniformly functionalized surface can be subjected to further chemical treatment, such as grafting of specific molecules, to give a desired, well controlled surface chemistry.

In a similar vein, in chapter 12 Prof. Friedrich approaches the problem of plasma polymerization through the use of pulsed as opposed to continuous plasma methods. The problem is much the same as with the surface functionalization problem discussed above. Continuous plasmas involve a steady flux of energy which gives rise to unwanted reactions, whereas turning the energy field on and off in a carefully controlled manner limits the amount of excess energy dumped into the system and thus also the unwanted side reactions.

As this blog is already getting too long we leave it to the interested reader to explore the details by consulting Prof. Friedrich’s volume. As usual the author welcomes any further comments or inquiries concerning this topic and may be readily contacted at the coordinates below.

 

Dr. Robert H. Lacombe, Chairman

Materials Science and Technology CONFERENCES

3 Hammer Drive, Hopewell Junction, NY 12533-6124

Tel. 845-897-1654, FAX  212-656-1016; E-mail: rhlacombe@compuserve.com

1. February 2015   2:11 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe


The last issue of the “SURFACES: THE INVISIBLE UNIVERSE” blog focused on viewing the chemistry of surfaces through the lens of X-ray Photoelectron Spectroscopy (XPS, aka ESCA). In this issue we shift focus to the problem of surface modification, in particular polymer surface modification. This topic is highly apropos in view of the upcoming Tenth International Symposium on Polymer Surface Modification: Relevance to Adhesion, to be held at the University of Maine, Orono, June 22-24 (2015). This symposium will cover all aspects of polymer surface modification from mechanical roughening to laser modification. The meeting will be concerned with those areas where surface modification is a key technology which allows for the processing and manufacture of products that would otherwise be unobtainable. This meeting will also delve into the realm of biopolymer materials with applications to forest products, medical implants and food processing. All readers are cordially invited to join the symposium either to present a paper on their current work in this field or to simply attend and greatly expand their awareness of current developments. Further details are available on the conference web site at: www.mstconf.com/surfmod10.htm

 SURFACES: LOW VOLUME BUT HIGH IMPACT ENTITIES

An elementary calculation for just about any cube of a simple atomic or molecular material 1 centimeter on a side indicates that roughly one atom/molecule in ten million resides within ten angstroms of the outer surface.  Though relatively minuscule compared to the roughly 10²³ atoms present in the sample, this surface layer determines many of the most important physical and technological properties of the material.  Common examples include optical properties such as reflection of light, all contact properties including friction, surface wetting and adhesion and many chemical phenomena including resistance to corrosion, biofouling and staining.  As important as these properties are, however, surface related phenomena received relatively scant attention during the early decades of the 20th century mainly due to the difficulty in performing careful experiments on such small amounts of material.  Well into mid century the solid state physics community was more focused on bulk properties of solids such as crystallographic structure, electronic band structure and bulk thermodynamic behavior.  What made all this work possible was the relative ease in preparing well characterized samples of uniform quality.  The influence of small amounts of contamination and crystallographic defects is small for bulk samples and more or less in proportion to the amount present.
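
For readers who want to see where the "one in ten million" figure comes from, the sketch below (Python, purely illustrative) reduces the estimate to a simple thickness ratio for a 1 cm cube; whether one counts a single face or all six changes the answer only by a factor of six.

```python
# Rough estimate behind the "one in ten million" figure: for a 1 cm cube, the fraction
# of material lying within 10 angstroms of a face is just a thickness ratio.

cube_side_cm = 1.0
shell_angstrom = 10.0
shell_cm = shell_angstrom * 1e-8              # 1 angstrom = 1e-8 cm

one_face_fraction = shell_cm / cube_side_cm   # ~1e-7, i.e. about 1 in 10 million
all_faces_fraction = 6 * one_face_fraction    # ~6e-7 if every face of the cube counts

print(f"one face:  1 in {1 / one_face_fraction:,.0f}")
print(f"all faces: 1 in {1 / all_faces_fraction:,.0f}")
```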

 In dealing with surfaces, however, we are faced with an entirely different situation.  The effect of small amounts of contamination and surface defects, rather than being more or less suppressed, can actually dominate the surface properties of the material under investigation.  Thus the basic problem of preparing well characterized surfaces for investigation is far more difficult than in the case of examining bulk properties.  Nonetheless, even though the sensitive properties of surfaces present daunting difficulties for those who want to perform careful investigations, these same properties present opportunities for the technologist who wants to alter surface behavior in order to create useful products and devices.  Nowhere is this opportunity more prevalent than for polymer materials which offer surfaces that can be modified by a number of physico-chemical methods to give specifically tailored results.  What makes this technology even more attractive is the fact that surface modification leaves all the bulk properties of the material intact so that one can, so to speak, have one’s cake and eat it too.  A typical example might be the case of a high modulus fiber being considered as a reinforcing agent in a particular matrix material.  One often finds that the fiber and matrix material are incompatible as obtained off the shelf.  However, appropriate surface treatment of the fiber can make it compatible with the matrix without sacrificing its reinforcing properties making possible the creation of a new composite material.

 MICRO-ORGANISMS AND THE FIBER-MATRIX INTERFACE

One of the biologically oriented applications of surface modification technology is the removal of pathogens from food products such as berries, fruits and vegetables using atmospheric plasma technology. Work in this area is currently underway and is planned to be presented at the symposium mentioned above. However, on the flip side of this development, microorganisms, instead of being removed from a surface, can also be added to improve adhesion, as was discussed in a most interesting paper given at the first symposium in the Polymer Surface Modification series.

 Since ancient times mankind has taken advantage of the bio-chemical synthesis skills of micro-organisms to create a number of desirable products.  Wine, beer and cheese derived from fermentation are the most recognized achievements of microbial industry.  However, to the best of my knowledge the work of Pisanova and Zhandarov is the first to put the microbes to work in modifying the surface properties of fibers for use in reinforced composites.[1]  These authors worked with the following fiber materials:

  1.    poly(caproamide) (Abbr. PCA)
  2.    poly(p-amidobenzimidazole) (Abbr. PABI)
  3.    poly(p-phenyleneterephthalamide) (Abbr. PPTA)
  4.    poly(m-phenyleneisophthalamide) (Abbr. PPIA)
  5.    polyimide (Abbr. PI)

 The above fiber materials were considered as reinforcements for the following matrix materials:

  1.    polycarbonate   Abbr. PC
  2.    polysulfone   Abbr. PSF
  3.    polyethylene   Abbr. PE

 The fibers were exposed to the following micro-organisms:

  1.    Bacillus vulgaris (bacterium)
  2.    Bacillus cereus (bacterium)
  3.    Pseudomonas (bacterium)
  4.    Aspergillus (fungus)

The authors knew that the above listed micro-organisms could attack and degrade polymer materials.  However, they reasoned that in the case of densely packed fibers the bacteria would be limited to performing their biochemical antics at the fiber surface, leaving the bulk properties intact.  Thus each of the fibers was immersed in a nutrient medium containing one of the micro-organisms for up to two weeks to see what changes in the surface properties could be obtained.  The timing of exposure was important so as to limit the action of the organisms to just the fiber surface.  The mechanical properties of the fibers were measured beforehand by standard methods and the adhesion of the fiber to the matrix material was ascertained using the fiber pullout method.  The results of their experiments were quite surprising in that in many cases they saw both improved mechanical properties of the fiber and improved adhesion of the fiber to the binding matrix material.  A sample of their results for the PCA/high density polyethylene (HDPE) system is given in Table I.  A cursory look at the data in Table I shows that the fiber mechanical properties are moderately improved by the micro-organism action, with an increase in fiber tensile strength by as much as 18% and in elongation at break by roughly 25%.  However, the bond strength of the fiber to the matrix in some cases could increase by nearly a factor of 2, as seen for the Bacillus vulgaris data.

Also notable from the data in the table is the deleterious effect of overexposure to the microbial treatment, as evidenced by the drop-off in properties when the exposure time for Bacillus cereus was increased by a factor of 4.

  

TABLE I:  Effect of micro-organism treatment on fiber mechanical properties for PCA and fiber/matrix bond strength for the PCA/HDPE system

TREATMENT (micro-organism)            TENSILE STRENGTH (GPa)   ELONGATION AT BREAK (%)   BOND STRENGTH, PCA/HDPE COMPOSITE (MPa)
No treatment                          1.52                     16                        13.1
Bacillus vulgaris                     1.74                     17.8                      21
Bacillus megaterium                   1.74                     18.2                      17.6
Bacillus cereus                       1.79                     20.2                      20
Bacillus cereus (8-week treatment)    1.23                     15.5                      19.9
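
As a quick sanity check on the percentages quoted in the discussion above, the short sketch below (Python, using only the numbers in Table I) recomputes the relative changes for two of the treatments.

```python
# Quick check of the improvements quoted in the discussion, computed from Table I.
# Values: tensile strength (GPa), elongation at break (%), bond strength (MPa).

untreated = {"tensile": 1.52, "elongation": 16.0, "bond": 13.1}
treated = {
    "Bacillus vulgaris": {"tensile": 1.74, "elongation": 17.8, "bond": 21.0},
    "Bacillus cereus":   {"tensile": 1.79, "elongation": 20.2, "bond": 20.0},
}

for organism, vals in treated.items():
    changes = {k: 100.0 * (vals[k] - untreated[k]) / untreated[k] for k in vals}
    print(organism, {k: f"{v:+.0f}%" for k, v in changes.items()})
# -> Bacillus cereus gives roughly +18% tensile strength and +26% elongation at break,
#    while Bacillus vulgaris raises the bond strength from 13.1 to 21 MPa.
```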

 Though the precise details of the microbial action were not uncovered in these experiments a few things were made clear by microscopic examination.  In particular the first stage of the treatment  involved the filling of cavities, pores and microcracks on the fiber surface with the byproducts of the micro-organism’s metabolism which resulted in the healing of the defects and the subsequent improvement in mechanical properties.

In addition the surface chemistry of the fiber is significantly altered making it more compatible with the binding matrix and thus also improving the performance of the composite material.

 As a final word it is clear that this type of work is in its very early stages and that there is tremendous scope for expansion of the types of systems which can be looked at and the extent of improvement that can be achieved.  In addition, since the types of micro-organisms being used are commonly present in the environment this type of surface treatment could be considered more environmentally friendly than other methods which rely on harsh chemicals.

Dr. Robert H. Lacombe, Chairman

Materials Science and Technology CONFERENCES

3 Hammer Drive, Hopewell Junction, NY 12533-6124

Tel. 845-897-1654, FAX  212-656-1016 E-mail: rhlacombe@compuserve.com

[1] “Modification of Polyamide Fiber Surfaces by Micro-organisms”, Elena V. Pisanova and Serge F. Zhandarov, in Polymer Surface Modification: Relevance to Adhesion, Ed. K. L. Mittal (VSP, The Netherlands, 1995) p. 417

7. July 2014   2:26 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe


We continue the INVISIBLE UNIVERSE blog after a delay due to our heavy involvement in the recently completed 9th INTERNATIONAL SYMPOSIUM ON CONTACT ANGLE, WETTABILITY AND ADHESION held at Lehigh University June 16-18, 2014.

Speaking of CONTACT ANGLE phenomena, this would be a good time to review the topic, as the contact angle method is far and away the most popular surface analysis method in use today. This technique is of special interest to anyone doing surface modification, whether by plasma or any other method, as it is the simplest and least expensive way of assessing the effect of any surface treatment whatsoever. Fortunately, our office has received a recently published volume on this topic which we will review shortly, but first a rudimentary introduction to the contact angle behavior of droplets for the sake of those who may be new to the subject.


Fig.(1). Classic definition of the equilibrium contact angle of a drop of liquid on a surface as the balance of three surface tensions.

When a drop of some liquid is placed on a surface it will typically bead up and form a sessile drop as shown in Fig.(1). The angle that the edge of the drop makes with the underlying solid is determined by the balance of three surface tensions, or surface energies if you prefer. The concept of surface energy was covered in some detail in an earlier issue of this blog, so it will not be reviewed here except to point out that surface tensions, as measured in N/m, are dimensionally the same as surface energies (J/m²): multiplying the numerator and denominator of the units by m (meter) gives N·m/m², or J (joules)/m².

And now to our book review:

Wetting of Real Surfaces, By Edward Yu. Bormashenko, (Walter de Gruyter GmbH, Berlin/Boston, 2013)

In this compact monograph Prof. Bormashenko provides a rather comprehensive account of the theoretical developments in the field of contact angle and wettability of surfaces.

Interestingly, Prof. Bormashenko points out that the field of contact angle and wettability remained rather a backwater endeavor in the field of modern physics from the time of Thomas Young’s pioneering work up to roughly the 1990’s despite the fact that scientific heavyweights such as Einstein, Schrödinger and Bohr devoted a significant portion of their research activity to this topic. Much of this stems from the fact that surfaces presented a rather messy and intractable research topic due to the difficulty in obtaining well defined surfaces free of contamination and other defects. Indeed the eminent theoretical physicist Wolfgang Pauli remarked that “God created matter but the Devil created surfaces”. Thus the solid state physics literature up to about the early 1980’s tended to be dominated by topics such as superconductivity, electronic band structure, phase transitions, semiconductors and similar topics dealing primarily with the bulk behavior of solids.

This all started to change significantly by about the 1980's, led in large part by the microelectronics industry, which was fabricating multilevel thin film structures increasingly dominated by the interfaces between metals, insulators and semiconductors. Even by the early 1970's it was becoming apparent that in order to fabricate devices with higher and higher circuit densities it was critical to understand the nature of the interactions between the various material components at their contact surfaces. This need was supported by advances in microscopy, starting with electron microscopy, evolving further to scanning tunneling microscopy and finally to the now ubiquitous atomic force microscopy. On top of this a number of surface analysis techniques emerged, almost too numerous to mention, the most popular being X-ray Photoelectron Spectroscopy (XPS, also called ESCA, Electron Spectroscopy for Chemical Analysis).

The need for understanding surface properties was of course not limited to the microelectronics industry. The entire coatings industry needed to understand the wetting properties of various paints and inks and the biotechnology industry dealing with medical implants needed to understand how the surfaces of their devices would interact in the in vivo environment. The contact angle technique thus started to emerge as a low cost and highly sensitive method for exploring the wetting behavior of surfaces.

A critical juncture of sorts was reached with the work of Barthlott and Neinhuis in 1997[1], who first studied the extreme hydrophobicity of the lotus leaf and its effect in removing all manner of detritus from the leaf's surface. This work led to an explosion of research on the superhydrophobic effect and a variety of applications to self-cleaning surfaces and other highly innovative technologies.

Getting back to Prof. Bormashenko’s volume a brief look at the table of contents reveals a rather wide range of topics:

  1. What is surface tension
  2. Wetting of ideal surfaces
  3. Contact angle hysteresis
  4. Dynamics of wetting
  5. Wetting of rough and chemically heterogeneous surfaces: the Wenzel and Cassie models
  6. Superhydrophobicity, superhydrophilicity and the rose petal effect
  7. Wetting transitions on rough surfaces
  8. Electrowetting and wetting in the presence of external fields
  9. Nonstick droplets

There is clearly not enough space here to cover all of the above topics in any detail, so we will focus on chapter 6, dealing with superhydrophobic and superhydrophilic phenomena which, as Prof. Bormashenko points out, are among the most actively researched topics in the contact angle field.


Fig.(2). Schematic illustrating the difference between a truly superhydrophobic surface and one exhibiting only a high contact angle

Interestingly, the author points out that exhibiting a high contact angle is not sufficient to define a state of superhydrophobicity, as might casually be assumed. In addition, the contacting water drop must also exhibit low contact angle hysteresis; that is, the advancing and receding contact angles must be approximately the same. This property is required for the so-called “lotus leaf” effect, where water drops not only form with a high contact angle but also roll very easily off the leaf, carrying any collected debris with them, as shown in Fig.(2).

The counter example is the “rose petal” effect reported by Jiang and co-workers.[2] These investigators looked at water droplets on rose petals, which also form very high contact angles, but unlike the lotus leaf case these drops also exhibit a very strong hysteresis. An immediate consequence is that the drops do not roll off even when the petal is held at a steep angle, as also shown in Fig.(2).
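
To make the distinction concrete, the little sketch below (Python, purely illustrative; the 150° and 10° thresholds are round numbers we have assumed, not values taken from the book) classifies a surface from its measured advancing and receding contact angles.

```python
# Illustrative classification of wetting behavior from advancing/receding contact angles.
# Thresholds are assumed round numbers, not values from Prof. Bormashenko's text.

def classify(advancing_deg: float, receding_deg: float) -> str:
    hysteresis = advancing_deg - receding_deg
    if advancing_deg >= 150 and hysteresis <= 10:
        return "lotus-like: high angle, low hysteresis -> drops roll off"
    if advancing_deg >= 150:
        return "rose-petal-like: high angle, high hysteresis -> drops stay pinned"
    return "ordinary wetting"

print(classify(160, 158))   # lotus-like
print(classify(155, 100))   # rose-petal-like
```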


Fig.(3). Example of a hierarchical relief morphology.

 A further subtle point brought out in the chapter is the fact that the substrate material does not have to be highly hydrophobic in order to exhibit the superhydrophobic effect. The lotus leaf material is in fact hydrophilic. What gives rise to the superhydrophobic behavior is the hierarchical relief morphology of the surface. An example of such a structure would be the fractal Koch curve shown schematically in Fig.(3). The author goes on to analyze the wetting of these highly variegated surfaces in terms of the Wenzel and Cassie models covered in chapter 5.

All in all, this volume can be highly recommended to anyone interested in coming up to date on the latest theoretical developments in the rapidly expanding field of contact angle phenomena.

The author invites any inquiries or comments on this article.


[1] “Purity of the Sacred Lotus or Escape from Contamination in Biological Surfaces”, W. Barthlott and C. Neinhuis, Planta, 202, 1 (1997).
[2] “Petal effect: A superhydrophobic state with high adhesive force”, L. Feng, Y. Zhang, J. Xi, Y. Zhu, N. Wang, F. Xia and L. Jiang, Langmuir, 24, 4114 (2008).
3. April 2014   12:39 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe


Is Nylon Hydrophobic, Hydrophilic or Maybe Both?

In a recent posting on LINKEDIN, Scott Sabreen, Owner-President of The Sabreen Group Inc., initiated the following discussion:

 

Nylons are inherently difficult to bond because they are hydrophobic, chemically inert and possess poor surface wetting. …

Nylons are hygroscopic and will absorb moisture in excess of 3 percent of its mass of water from the atmosphere. Moisture, in and of itself, creates adhesion problems.  …

 

Hold on: on the face of it the above remarks would seem to be mutually contradictory. Is nylon hydrophobic or hydrophilic?

The resolution of this apparent paradox comes in recognizing that the hydrophobic behavior of nylon is a surface property and the hydrophilic behavior is a bulk property.

Since nylon is an organic polymer it has a relatively low surface energy as do most polymers.  This is a consequence of the surface chemistry and surface physics of polymers and other organics as discussed in the previous edition of this blog.

However, the amide groups in the nylon chain attract water, and they give rise to the hydrophilic behavior of this material with regard to BULK ABSORPTION of water. A number of other polymers, such as the polyimides, behave in a similar manner.

So in the bulk nylon can behave as a hydrophilic material but on the surface it can exhibit hydrophobic behavior. Just another hidden property of surfaces that make them both tricky and fascinating to study.

The author invites any inquiries or comments.

3. March 2014   12:35 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe


Origin of Surface Energy

In the December 2013 issue of this blog we noted that nearly all the information we commonly acquire about any given surface comes from the radiation reflected from it. We also noted that, of the entire range of the electromagnetic spectrum a surface could possibly radiate, we detect only the so-called visible portion, which amounts to barely 2% of what could be emitted. Thus what our eyes alone tell us about what is happening on a given surface is very limited indeed. Not only that, but the type of information is limited to basically the bulk geometry, gross surface morphology and color.

The fact of the matter is that an entire universe of important physical properties remain invisible to our eyes.  The invisible  property we want to explore here in fact cannot be seen even in principle.  This property is what is called the surface energy and to get a picture of it we need the apparatus of thermodynamics.

The whole concept of energy is rather subtle and intricate in general.  In particular it can take many forms including:

  • Electromagnetic energy stored in electric fields
  • Magnetic energy stored in magnetic fields
  • Thermal energy stored in any material at a finite temperature
  • Potential energy of any mass in a gravitational field
  • Kinetic energy of any moving object
  • Relativistic energy of any massive object as given by Einstein’s famous formula E = mc²
  • etc.

However, for our purposes we only need to understand the elastic energy stored in common solids and we can approach this by considering the behavior of a common spring.  Stretch or compress a spring and it will store a certain amount of elastic energy which can be perceived by allowing the spring to return to its equilibrium length. Much the same type of behavior goes on in common solids.  Figure (1) gives a highly idealized but reasonably realistic picture of a solid material viewed at the atomic level.


Fig. 1: Schematic diagram of a solid viewed at the atomic level. To a first approximation the atoms can be thought of as being held together by microscopic springs which account for the elastic properties of the material.

The atoms/molecules in a given solid are held together by atomic and intermolecular forces which arise from the rather complex electromagnetic interactions among the electrons and nuclei which make up the bulk of any material. Fortunately, near equilibrium and for small deformations these interactions behave in a linear fashion, very much like the behavior of simple springs.  Thus, as Robert Hooke pointed out more than three centuries ago, the restoring force tending to bind the atoms together increases in a linear fashion as they tend to separate from one another. Things get quite a bit more complicated at large deformations but that need not concern us here.

Referring to the upper diagram in figure (1) we see that a typical atom in the bulk of our hypothetical solid feels either tensile or compressive loads from all directions, and much the same is experienced by all the rest of the atoms in the deep interior of the solid.  However, the situation is quite a bit different for those atoms at or near the surface, as shown in the bottom diagram of figure (1).  Looking down into the bulk of the material they see the same forces as the bulk atoms do, but now there is no material on top, which creates a highly asymmetrical situation.  It is precisely this asymmetry that gives rise to the unique surface tension or surface energy of the solid.

 

A Word on Units

Perhaps one of the most confusing things about surface energies is the units they are expressed in, so we take a quick break here to clear up this issue.  Going back to our spring: if we stretch it there arises an immediate force tending to return it to its unstretched length.  Current international convention expresses this force in the standard SI unit[1] of newtons.  All systems of units are essentially arbitrary, but it is nonetheless important to settle on a common standard.  Thus the standard SI unit of force is the newton, with the dyne and the pound also in use but not considered standard by the international community.  The newton is the canonical unit of the international scientific community and the dyne is a scaled-down derivative.  The pound is an archaic holdover from the past but is still much in use in commercial transactions, especially in the USA.

Focusing on the newton, it is formally defined as the force required to accelerate a one kilogram mass resting on a friction-free surface at one meter per second per second.  That means that the mass increases its speed by one meter per second every second under the action of the applied 1 newton load.  We can also think of the newton in more intuitive, if less rigorous, terms as the weight of a common apple at sea level.  Thus if you hold a standard sized apple it imparts a force of close to one newton on your hand.  In terms of pounds the apple weighs slightly more than 1/5th of a pound, and in dynes it weighs about 100,000 dynes.

With the concept of force now rigorously defined we move on to the concept of energy, again using our standard apple as a prop.  Force times distance is energy in the form of work.  Let's assume that the apple weighs exactly 1 newton.  If we raise the apple from the ground to a height of 1 meter we will have done 1 joule's worth of work, or putting it a little differently, we will have increased the apple's gravitational energy by one joule.

Getting back to our spring, it stores what is called elastic energy.  As stated above, energy is force times distance, and since the restoring force of a spring is proportional to the extension, the energy stored in an extended spring is proportional to the extension squared.  These ideas can all be compactly summarized in the following formulas:

F = -k x (Restoring force exerted by a stretched spring) (1)

Where:
F = Force in newtons
x = Displacement in meters
k = Spring constant in newtons/meter

The minus sign in Eq(1) indicates the force is always a restoring force tending to oppose any extension or compression.

The energy stored in the spring is the work done against the restoring force of Eq(1), integrated from 0 to some extension d:

W = -∫₀ᵈ F dx = (½)kd²  (Energy stored in stretched spring) (2)

Thus if our spring has a spring constant of 1 newton/meter and we extend it to a distance of 1 meter it will pull back with a force of 1 newton.  Also it will store an energy of ½ joule.
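
As a quick numerical check (a minimal Python sketch of the two formulas above, nothing more), plugging k = 1 newton/meter and d = 1 meter into Eqs (1) and (2) reproduces the 1 newton and ½ joule figures:

```python
# Numerical check of Eqs (1) and (2) for the example above: a spring with
# k = 1 newton/meter stretched to d = 1 meter.

k = 1.0   # spring constant, N/m
d = 1.0   # extension, m

restoring_force = -k * d          # Eq (1): -1 N, pulling back toward equilibrium
stored_energy = 0.5 * k * d**2    # Eq (2): 0.5 J stored in the stretched spring

print(restoring_force, stored_energy)   # -1.0  0.5
```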

We can think of the spring as having a tension of 1 newton/meter, which is just another name for the spring constant.  Turning now to the physical surface of a polymer such as nylon, the springs binding the atomic units together would have a tension of about 8×10⁻²⁵ newtons/meter, and this is the basis of the so-called surface tension or surface energy of this material.  Now in practice we do not deal with such impossibly small numbers, so we need to scale things up a bit.  In the case of our nylon, one square meter of the surface will contain something like 5×10²² molecular bonds among the surface moieties, or in terms of our simple spring model, 5×10²² springs.  So we take the surface tension of our polymer to be the tension in a single bond times the total number of bonds in a square meter, which for our nylon material comes to 40×10⁻³ newtons/meter.  This is still too awkward, so we introduce the millinewton (abbr. mN), which is 10⁻³ newtons, so nylon now has a surface tension of 40 mN/m (abbreviating the meter as m).  Why stop here?  We can do a little algebra on the units and say that 40 mN/m is the same as 40 mN·m/m², obtained by multiplying numerator and denominator by m.  Now the mN·m we recognize as a mJ, or millijoule, and so our nylon can be thought of as having a surface tension (aka surface energy) of 40 mJ/m².  And it does not stop here.  Many folks dealing with surface tension measurements would rather not deal with the milli prefix and such huge surface areas as a square meter.  It is more convenient to scale down the force unit to dynes (10⁻⁵ newtons) and use square centimeters (abbr. cm) instead of square meters.  Thus 40 mN/m scales down to 40 dynes/cm.
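
For those who prefer to see the bookkeeping in one place, here is a minimal Python sketch of the arithmetic above; the per-bond tension and bond density are simply the figures quoted in the text, and everything else is pure unit conversion.

```python
# Unit bookkeeping for the nylon example above.

tension_per_bond_N_per_m = 8e-25      # "spring" tension of a single surface bond (from the text)
bonds_per_m2 = 5e22                   # molecular bonds per square meter of surface (from the text)

surface_tension_N_per_m = tension_per_bond_N_per_m * bonds_per_m2    # 0.040 N/m
mN_per_m   = surface_tension_N_per_m * 1e3        # 40 mN/m
mJ_per_m2  = mN_per_m                             # mN/m and mJ/m^2 are the same unit
dyn_per_cm = surface_tension_N_per_m * 1e5 / 1e2  # 1 N = 1e5 dyn, 1 m = 100 cm -> 40 dyn/cm

print(mN_per_m, mJ_per_m2, dyn_per_cm)            # 40.0  40.0  40.0
```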

 

Real Solid Surfaces

Now that we have pinned down suitable units for measuring surface energies we can have a closer look at real material surfaces.  The diagrams in figure 1 give a much better depiction of a liquid surface than they do for a solid.  Liquids have a high mobility and can always adjust their configuration to give a uniform surface of minimum surface tension.  Not so with solids.  The surface configuration of a solid depends sensitively on the thermal-mechanical loading conditions under which it was created.  Was the material cooled rapidly or slowly?  What type of loads if any were active during the cooling process?


Fig. 2: More realistic depiction of an actual solid surface

Figure 2 gives a more realistic depiction of what a typical solid surface looks like.  This figure depicts 4 typical surface flaws that can significantly alter the surface energy of any real solid:

  1. VOIDS: Materials that have been rapidly quenched may not have time to completely condense giving rise to voids which are a source of tensile stress that will alter the surface energy in their vicinity.
  2. INCLUSIONS: No material is 100% pure and contamination species have a strong tendency to migrate toward surfaces where they upset the normal packing and in many cases give rise to a local compressive stress.
  3. GRAIN BOUNDARIES/DISLOCATIONS: Nearly all crystalline and semi-crystalline materials are polycrystalline in nature.  That is, they are made up of an aggregate of a large number of small crystals all packed together in no particular order.  The boundary where two crystallites meet forms what is called a grain boundary.  Further, the misalignment of planes within the crystallites gives rise to what are called dislocations.  These and other imperfections can give rise to local stress fields where they intersect a surface, which again alter the local surface energy.
  4. CONTAMINATION: Of all the surface imperfections, contamination layers have the most profound effect on surface energies.  Real material objects sit around on benches in the lab or other platforms where they are subject to constant bombardment from all the contaminants and gases in a typical atmosphere, not to mention the greasy fingers of human handlers.

Needless to say all of these considerations make the accurate measurement of the surface energies of solids a rather tricky business. But enough for now.  We take up this question in the next chapter.

The author invites any inquiries or comments.

 

 


[1] The International System of Units (abbreviated SI, from the French Le Système international d’unités). A bunch of folks got together and formed the General Conference on Weights and Measures, an organization set up by the Convention of the Metre in 1875, which succeeded in bringing together many international organizations to agree not only on the definitions of the SI but also on rules for writing and presenting measurements in a standardized manner around the globe.

 

 

17. February 2014   8:30 am
Dr. K. L. Mittal, Dr. Robert H. Lacombe


We interrupt the normal flow of this blog on “SURFACES: THE INVISIBLE UNIVERSE”  to present a book review on a volume which should be of keen interest to all working in the field of atmospheric  plasma technology.  The volume in question is:

ATMOSPHERIC PRESSURE PLASMA TREATMENT OF POLYMERS: RELEVANCE TO ADHESION;  Eds. Michael Thomas and K. L. Mittal (WILEY-Scrivener Publishing, 2013)

The volume contains 15 review articles ranging from surface modification with plasma printing  to the deposition of nanosilica coatings on plasma activated polyethylene.  Each article has been produced by leading experts in the respective topics and gives a truly authoritative examination of the subject matter under review.  A quick look at the table of contents reveals the remarkable scope of this volume.

PART 1: FUNDAMENTAL ASPECTS

  1. Combinatorial Plasma-based Surface Modification by Means of Plasma Printing with Gas-carrying Plasma Stamps at Ambient Pressure;  Alena Hinze, Andrew Marchesseault, Stephanus Büttgenbach, Michael Thomas and Claus-Peter Klages
  2. Treatment of Polymer Surfaces with Surface Dielectric Barrier Discharge Plasmas; Marcel Šimor and Yves Creyghton
  3. Selective Surface Modification of Polymeric Materials by Atmospheric-Pressure Plasmas: Selective Substitution Reactions on Polymer Surfaces by Different Plasmas; Norihiro Inagaki
  4. Permanence of Functional Groups at Polyolefin Surfaces Introduced by Dielectric Barrier Discharge Pretreatment in Presence of Aerosols;  R. Mix, J. F. Friedrich and N. Inagaki
  5. Achieving Nano-scale Surface Structure on Wool Fabric by Atmospheric Pressure Plasma Treatment;  C. W. Kan, W. Y. I Tsoi, C. W. M. Yuen, T. M. Choi and T. B. Tang
  6. Deposition of Nanosilica Coatings on Plasma Activated Polyethylene Films;  D. D.  Pappas, A. A. Bujanda, J. A. Orlicki, J. D. Demaree, J. K. Hirvonen, R. E. Jensen and S. H. McKnight
  7. Atmospheric Plasma Treatment of Polymers for Biomedical Applications;  N. Gomathi, A. K. Chanda and S. Neogi

PART 2: ADHESION ENHANCEMENT

  1. Atmospheric Pressure Plasma Polymerization Surface Treatments by Dielectric Barrier Discharge for Enhanced Polymer-polymer and Metal-polymer Adhesion;  Maryline Moreno-Couranjou, Nicolas D. Boscher, David Duday, Rémy Maurau, Elodie Lecoq and Patrick Choquet
  2. Adhesion Improvement by Nitrogen Functionalization of Polymers Using DBD-based Plasma Sources at Ambient Pressure;  Michael Thomas, Marko Eichler, Kristina Lachmann, Jochen Borris, Alena Hinze and Claus-Peter Klages
  3. Adhesion Improvement of Polypropylene through Aerosol Assisted Plasma Deposition at Atmospheric Pressure;  Marjorie Dubreuil, Erik Bongaers and Dirk Vangeneugden
  4. The Effect of Helium-Air, Helium-Water, Helium-Oxygen and Helium-Nitrogen Atmospheric Pressure Plasmas on the Adhesion Strength of Polyethylene;  Victor Rodriguez-Santiago, Andres A. Bujanda, Kenneth E. Strawhecker and Daphne D. Pappas
  5. Atmospheric Plasma Surface Treatment of Styrene-Butadiene Rubber: Study of Adhesion Ageing Effects;  Cátia A. Carreira, Ricardo M. Silva, Vera V. Pinto, Maria José Ferreira, Fernando Sousa, Fernando Silva and Carlos M. Pereira
  6. Atmospheric Plasma Treatment in Extrusion Coating:  Part 1 Surface Wetting and LDPE Adhesion to Paper;  Mikko Tuominen, J. Lavonen, H. Teisala, M. Stepien and J. Kuusipalo
  7. Atmospheric Plasma Treatment in Extrusion Coating:  Part 2 Surface Modification of LDPE and PP Coated Papers;  Mikko Tuominen, J. Lavonen, J. Lahti and J. Kuusipalo
  8. Achieving Enhanced Fracture Toughness of Adhesively Bonded Cured Composite Joint Systems Using Atmospheric Pressure Plasma Treatments;  Amsarani Ramamoorthy, Joseph Mohan, Greg Byrne, Neal Murphy, Alojz Ivankovic and Denis P. Dowling

Fig.(1). Schematic two-dimensional slice through a typical plasma stamp

A cursory glance at the above list readily gives one the impression that the applications of the atmospheric plasma technique are limited solely by one's imagination.  It is also clear that this short review will be able to cover only a small fraction of the material in this volume.  Quite likely the most innovative paper in the collection is the one on “Combinatorial Plasma-based Surface Modification …” listed as number 1 above.  This work attempts to take the process of plasma surface modification to a higher level through the use of “plasma stamps” which can be used to pattern a substrate with varying levels of plasma treatment in a single run.  A schematic diagram of a plasma stamp is shown in figure (1).  The substrate to be treated is patterned with an array of chambers using poly(dimethylsiloxane) (PDMS) as an insulator layer.  The resulting array is sandwiched between a porous metal mesh and an electrode.  The metal mesh in this case serves a dual purpose as a gas carrier and as an electrode.

The authors cite a number of advantages of the plasma stamp configuration, including:

  • Due to the small size of the plasma chambers it is easy to supply nearly unlimited volumes of gas to the active micro-plasmas which is very useful when performing film depositions as opposed to simply performing a surface modification.
  • Again due to the small cavity size the stamp can be rapidly filled using  a small amount of gas.  Thus the process is not only economical in the use of gas but the small chambers can be rapidly purged of unwanted oxygen which is a critical requirement when performing plasma nitrogenation treatments.
  • The small cavity size also allows reaction products created in the cavities that are not deposited to be swept away efficiently in the gas stream.  This is very useful in preventing fouling due to the redeposition of plasma  polymers.
  • Quite likely the most significant advantage of the plasma stamp technology is the fact that quite large arrays of the plasma micro-cavities can be created, allowing for very efficient combinatorial studies of different plasma treatments on a single substrate in a single run.  Thus one can easily imagine a two-dimensional array where two different gas streams are independently introduced to the array from opposite sides of the inlet edge.  The streams will combine continuously across the entire array of cavities, giving a well defined gradient of gas composition over the entire array, so that different cavities receive different treatments depending on their location in the overall array (a simple illustration of such a gradient is sketched after this list).  The results can then be inspected by any of a number of surface analysis methods such as Fourier Transform Infrared spectroscopy (FTIR) or X-ray Photoelectron Spectroscopy (XPS).  Thus a large number of different surface treatments can be screened in a highly efficient manner.
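
The composition-gradient idea is easy to picture with a toy calculation. The sketch below (Python, purely illustrative and not taken from the chapter) assumes a simple linear blend of two feed gases across a one-dimensional row of plasma-stamp cavities:

```python
# Illustrative only: a linear blend of two feed gases across a row of cavities,
# gas A entering from the left edge and gas B from the right edge.

n_cavities = 8
for i in range(n_cavities):
    frac_a = 1 - i / (n_cavities - 1)     # fraction of gas A at cavity i
    frac_b = 1 - frac_a                   # remainder is gas B
    print(f"cavity {i}: {frac_a:.2f} A / {frac_b:.2f} B")
```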

Again the large scope of the volume does not allow us to comment on the other equally interesting articles. It should be clear, however, as mentioned above that the possibilities are limited only by the imagination.

27. January 2014   9:33 am
Dr. K. L. Mittal, Dr. Robert H. Lacombe


This issue of the SURFACE SCIENCE CORNER blog inaugurates a series of essays on the above mentioned topic concerning how we recognize the ubiquitous surfaces we look at all of our waking moments.  The answer of course is through the subtle apparatus of our visual neurology which is activated by the light coming at us from all directions.  The question then becomes what information is all this light carrying to our visual cortex.  However, before we try to unravel this question we need to examine in a little more detail the nature of the light that is being reflected into our eyes from all directions.


Fig.(1). Electromagnetic spectrum on a logarithmic scale

The light coming at us is part of what is called the electromagnetic radiation spectrum, the entire extent of which is displayed in figure (1).  The striking thing about this figure is the enormous range of the spectrum, covering some 20 orders of magnitude on the logarithmic scale shown in the figure.  To print the entire figure on a linear scale would require a sheet of paper extending out to the edge of the solar system, assuming 0.1 mm of sheet per 10 Hz of frequency.  The second remarkable feature of this diagram for our purposes is the fact that the band of visible frequencies, which is what our eyes detect, spans less than 2% of this range.  Though we detect a limited number of mechanical and thermal properties of surfaces through direct touch, the preponderance of our awareness of surfaces comes from reflected radiation.  Going by the figure we see that our eyes are missing some 98% of what is potentially being reflected at us.
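
Both figures can be checked with a couple of lines of arithmetic. The sketch below (Python; the 10²⁰ Hz span and the 4-8×10¹⁴ Hz visible band are assumed round numbers, not values read off the figure) works out the length of the hypothetical linear-scale chart and the visible band's share of a 20-decade logarithmic axis:

```python
import math

# Assumed round numbers: full spectrum ~1e20 Hz wide, visible band 4e14-8e14 Hz.
AU_m = 1.496e11                                    # one astronomical unit in meters

full_range_Hz = 1e20
paper_m = (full_range_Hz / 10.0) * 0.1e-3          # 0.1 mm of sheet per 10 Hz
print(f"linear-scale chart: ~{paper_m:.0e} m, or ~{paper_m / AU_m:,.0f} AU")

visible_decades = math.log10(8e14) - math.log10(4e14)        # ~0.3 of a decade
print(f"visible band: ~{100 * visible_decades / 20:.1f}% of a 20-decade log axis")
```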

So what are we missing in particular?  Let's consider first the infrared region, from roughly 10¹¹ to 10¹⁴ Hz.  This is radiation generated by the incessant motions of the atoms and molecules which make up all matter.  To our eyes all surfaces lying at rest are perfectly still, certainly at the macroscopic level.  However, consider for example a carbon atom associated with a particular bit of organic contamination residing on some apparently undisturbed surface at room temperature.  An elementary thermodynamic calculation indicates that, far from sitting still, such an atom is vibrating in place at an average velocity near 300 m/sec.  This motion, combined with the motion of all the other atoms and molecules on the surface, gives rise to infrared radiation which is beamed in all directions and is entirely invisible to our eyes.  We can, however, detect the molecular vibrations of organic molecules on surfaces with the aid of specialized infrared spectrometers, so we know they are there even though we cannot see them.

What else are we missing?  Going to higher frequencies, in the range of 10¹⁶ to 10¹⁹ Hz, we find ourselves in the land of the X-rays, which reflect information on atomic and molecular structure on the scale of a few angstroms, or roughly 10⁻¹⁰ m.  If we could detect this radiation we would be able to see the atomic and molecular packing of all the species at the surface: things like crystal structure, grain boundaries, dislocations and assorted other types of contamination and defects lying on the surface.  In addition we would be able to detect the inherent roughness of the surface at the atomic and molecular scale.  What to the eye appears to be a perfectly smooth surface would, when observed with X-rays, appear quite rough and rugose.  Such variable surface topography can have quite significant effects on common properties such as surface wettability.

Going to yet higher frequencies we find ourselves in the range of the gamma rays, which live in the range from roughly 10¹⁹ to 10²⁰ Hz.  The gamma rays allow us to peer into the goings-on in the atomic nucleus, some 10,000 to 100,000 times smaller than the typical atom.  In particular, some nuclei are unstable and can disintegrate into smaller nuclei, giving off gamma rays and other particles in the process.  Most common materials are not radioactive, but some do have contamination-level concentrations of radioactive species which can give off barely detectable amounts of radiation.  Now one might not expect that sub-detectable levels of radiation would be of much concern to the practical product engineer manufacturing some wholly macroscopic device for industry.  However, the world of surfaces can be quite subtle, and engineers in the microelectronics industry got an elementary lesson in nuclear physics from that most common of common materials, lead.  It turns out that lead can harbor contamination levels of radioactive species whose activity is quite invisible to our eyes, as explained above.  Lead has a long history of being used to make electrical connections in the electronics industry, going back to at least the mid 19th century.

It so happened that in the early 1980's lead solder was being used to connect sensitive memory chips to ceramic substrates.  The memory chips were of a special kind which utilized the very high resistivity of single crystal silicon to trap a small amount of charge in a small cell which formed the basis of an elementary unit of memory.  A cell containing charge served as a boolean "1" and an empty cell represented a boolean "0".  All well and good, but all the cells had to be connected to the remaining computer circuitry using metal lines and interconnects, and of course that old work horse lead served as one of the interconnect materials.  The reader can now well guess what happened.  The radioactive contaminant species in the lead would decay from time to time, giving off not only a gamma ray but also a doubly charged alpha particle.  The alpha particle was the main mischief maker.  Carrying a charge of +2, it does not travel very far in ordinary matter, but where it does go it leaves behind a trail of ionization which can momentarily turn a highly resistive material like single crystal silicon into a good conductor along the path of the alpha particle.  One can easily imagine a wayward alpha particle crashing into one of the silicon memory cells, causing a charged cell to discharge along the ionization path left by the alpha.  A memory register of the computer has now been randomly and irreversibly changed, which is not good from the point of view of programming logic.  If the affected register happened to contain an important logic instruction the result would easily be a serious programming error or simply machine lockup.

The field engineers came to know this type of problem as a "soft" error, since the ionization trail left by the alpha particle would quickly dissipate, leaving the affected memory cell quite unharmed.  Thus any attempt to locate the source of the error would be futile, since no permanent hardware malfunction was involved.  Such so-called "soft" errors are the worst kind from the point of view of the field engineer.  They come at random from seemingly nowhere and the culprit escapes without a trace.
How this problem was eventually solved is a story for another time but for now it simply illustrates that what we cannot see coming off a surface can indeed bite us in wholly unsuspected ways.

If we now go to the lower frequencies from 10⁴ to 10¹¹ Hz we come across the radio and microwave bands.  If we could detect these frequencies we would be able to see the myriad electronic and polarization currents which endlessly flow in all materials and give rise to a number of phenomena which affect us in various ways even though we cannot visually see them.

All this will be covered in future issues of the SURFACE SCIENCE CORNER blog.

The author invites any inquiries or comments.

28. October 2013   10:36 am
Dr. K. L. Mittal, Dr. Robert H. Lacombe


We interrupt the normal flow of essays on the invisible universe of surfaces to present a short review of a newly available volume which should be of strong interest to all who are interested in the modification of polymer surfaces using plasma technology. The volume in question is:

THE PLASMA CHEMISTRY OF POLYMER SURFACES: Advanced Techniques for Surface Design

By: Jörg Friedrich (Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim, Germany, 2012)

By way of disclosure we should point out that Prof. Friedrich is a good friend and colleague of ours and a long time participant in the MST CONFERENCES symposia series on Polymer Surface Modification and Metallized Plastics.  We can therefore personally vouch for his outstanding expertise in the topic under discussion.

The volume is indeed a major undertaking to categorize and review the truly vast topic of plasma processing as applied to polymer surfaces.  The table of contents gives a glimpse of the wide scope of topics covered:

  • Interaction between Plasmas and Polymers
  • Plasma
  • Chemistry and Energetics in Classic and Plasma Processes
  • Kinetics of Polymer Surface Modification
  • Bulk, Ablative and Side Reactions
  • Metallization of Plasma-Modified Polymers
  • Accelerated Plasma-Aging of Polymers
  • Polymer Surface Modification with Monosort Functional Groups
  • Atmospheric Pressure Plasmas
  • Plasma Polymerization
  • Pulsed-Plasma Polymerization

 

The volume is clearly directed toward the polymer chemist with emphasis on the creation and behavior of “functional groups” which tend to dominate the chemical behavior of polymer surfaces. The ubiquitous hydroxyl (-OH) group is a prime example. The introduction of such groups to a highly hydrophobic polypropylene surface, which repels water and most common inks, can almost magically turn it into a hydrophilic surface which can be easily written on with a ball point pen.  There are of course many more such “functional groups” and the volume goes into great detail as to how they can be created, how they affect the surface chemistry and how they behave over time.

The wide scope of the volume makes it impossible to review all of the topics listed above in any detail so we will focus instead on the chapter on Atmospheric Pressure Plasmas which is likely to be of most interest to readers of this column.

The section on atmospheric plasmas begins with the initially astonishing assertion that more than 99% of the universe is in the plasma state.  To us earthlings bound to a highly condensed rock of a planet this seems hardly credible as our local experience of the plasma state is confined to relatively rare phenomena such as lightning, auroras and flames.

However, simply looking up at the sky during daylight reveals the sun, which is a seething cauldron of plasma gases weighing more than 300,000 times the mass of the earth.  The sun of course is just one of a near infinitude of stars that make up the universe, so the 99% estimate is actually highly conservative.

Having established the ubiquity of the plasma state in the universe, the author then goes on to review a wide range of recent work on the use of atmospheric plasmas for the surface modification of polymers.  Of particular interest is the comparison of atmospheric plasma jets, low-pressure oxygen plasma and dielectric barrier discharge plasma.  Figure 10.1 from the text is particularly interesting.  It compares the relative effectiveness of these techniques in introducing oxygen into the surface of polypropylene as a function of application time.  Interestingly, the data show that after about 6 seconds the atmospheric plasma jet technique delivers nearly twice the oxygen concentration to the surface as either the low-pressure oxygen or dielectric barrier method.  This easily explains why the plasma jet approach is highly favored in manufacturing applications where a rapid process is necessary to achieve required manufacturing volumes.

A second most interesting application is the use of atmospheric plasma for polymer deposition onto surfaces.  This is in essence an attempt to apply a polymer coating to a surface, a task which in conventional low-pressure plasma processes is carried out by the plasma polymerization method.  In the conventional approach, plasma-activated monomer species are allowed to condense on the target surface where they can then react to form polymer networks.  This approach to applying a polymer coating to a surface faces a number of problems due to the high reactivity of the plasma-activated monomers.  Once on the surface they can react in a wide variety of ways, giving rise to a complicated morphology which is difficult to understand.  In addition, the activated monomers tend to condense not only on the target substrate but on all the vacuum chamber surfaces.  Over time this gives rise to coated chamber walls which can flake off and cause a contamination problem.

In order to circumvent the problems encountered with the conventional plasma polymerization method, the atmospheric pressure approach injects a fully formed polymer species into the plasma stream as an aerosol.  The plasma gas activates both the polymer and the target surface, allowing the polymer to form a strongly adhering coating.  Thus the prefabricated polymer goes down as a complete entity, and there is no vacuum chamber to become coated and build up contamination over time.

These and other innovative plasma techniques are reviewed in some detail in this quite comprehensive volume.  We can highly recommend this book as a reference work for all those engaged in the research and development of plasma surface treatment technologies.

7. August 2013   1:11 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe

CHAPTER 1: DO WE REALLY KNOW WHAT A SOLID IS?

As outlined in last month’s issue of the SURFACE SCIENCE CORNER blog the next several issues will be devoted to exploring the invisible properties of surfaces that we are rarely able to detect by simple inspection but nonetheless have rather profound consequences for the way surfaces behave. However, if we are going to uncover the hidden aspects of surfaces we are going to have to look at the atomic and molecular level which is where all the secretive phenomena reside. Before doing this it would be very helpful to look at the behavior of bulk solids at the atomic and molecular level since what goes on at surfaces is basically a subtle variation of what goes on in the bulk.

So what do we really know about the nature of solids? I wrote an encyclopedia article a while back dealing with Basic Concepts in Adhesion Science and as part of that article I tried to introduce some fundamental concepts on the nature of solids at the atomic and molecular level. The editor wrote back that this section could be eliminated since everyone knows what atoms are all about. Really! I wrote back that yes most people have the 19th century view that atoms are some kind of tiny ball bearings and that solid matter is just a colossal agglomeration of these tiny beads held together by some kind of atomic forces. However, how many people are aware that atomic matter is mostly an exceedingly intense electromagnetic field?

Before entering into the realm of quantum electrodynamics it would be a good idea to cycle back to the late 19th century and investigate the far more intuitive concepts of what defines a piece of solid matter. At that point in time the prevailing scientific notion of a solid was anything that had a significant modulus. So what is a modulus? This concept goes back to the 17th century British physicist Robert Hooke, who first stated his law of elasticity in 1660 as a Latin anagram, whose solution he published in 1678 as Ut tensio, sic vis; literally translated as “As the extension, so the force”, or more commonly “The extension is proportional to the force”. Basically, if you pull on a strip of some material the amount of extension achieved will be directly proportional to the level of applied force, and the constant of proportionality is what we call the modulus of the material. For our purposes we need to make this concept more quantitative, and a simple example will help in this regard.

Say we have a thin strip of a typical engineering thermoplastic such as polystyrene and apply our stretch test to it. For quantitative purposes we let our strip be 10 centimeters (10 cm or 0.1 meter (0.1 m)) long by 0.5 cm (0.005 m) wide by 0.1 cm (0.001 m) thick. Common thermoplastic materials are fairly stiff at room temperature, so we will need to put the strip in a tensile test machine which will apply a tensile load to it measured in Newtons. For consistency we need to keep our units within the commonly accepted SI system (the Système international d’unités (SI), or International System of Units, defines seven units of measure as a basic set from which all other SI units are derived), so we use the Newton as the measure of force, abbreviated as Nt, which in magnitude amounts roughly to the weight of an apple, or close to 100 grams. Now our tensile tester applies a force over the entire cross section of our sample, which has some finite area, and thus applies a stress or force per unit area to the test specimen. Again for consistency with the SI system of units we measure the stress in Newtons per square meter (abbr. Nt/m²) or Pascals, the SI derived unit of pressure, stress, modulus and tensile strength, named after the French mathematician, physicist, inventor, writer, and philosopher Blaise Pascal (1623-1662).

Upon applying a load to the sample strip we note that it extends by some amount Δ. To be in conformity with common practice we divide the amount of extension Δ by the original length of the sample “l” to get a dimensionless quantity ε = Δ/l called the strain. We can now recast Hooke’s law as: the stress is proportional to the strain, where the constant of proportionality is called the modulus E, also called the Young’s modulus in honor of Thomas Young, an English polymath notable for scientific contributions to the fields of vision, light, solid mechanics, energy, physiology, language, musical harmony and Egyptology (a polymath indeed!). Our formula can now be compactly expressed by the following simple expression:

σ = E × ε, or: stress is the modulus times the strain.

We use the traditional Greek symbols sigma for the stress and epsilon for the strain. The stress has units Nt/m², and since the strain is dimensionless the modulus must also have units Nt/m².

Now back to our test strip. Say we apply a load of 150 Nt with the tensile tester and note that the sample stretches by 1 millimeter or 0.001 m. Since our sample is 0.1 m long we have induced a strain of 0.001/0.1 = 0.01, or 1 percent of its original length. The cross section of our strip was 0.005 m × 0.001 m = 0.000005 m², and the applied load was 150 Nt, so the applied stress was 150/0.000005 Nt/m² = 30,000,000 pascals (abbr. Pa). We note that the pascal is a rather small unit since it amounts to roughly the weight of 100 grams spread over one square meter. The pascal also amounts to roughly one hundred-thousandth of the atmospheric pressure which is squeezing on every one of us living reasonably close to sea level. Rather than deal with such an inconvenient unit and have to carry all those cumbersome zeros around we define a new unit called the megapascal, or 1 million pascals (abbr. MPa). The applied stress is thus 30 MPa. Going back to Hooke’s law as formulated above we see that the modulus of our sample has to be the stress divided by the strain, or 30 MPa/0.01 = 3000 MPa. Since we abhor carrying around a lot of zeros we define yet another unit called the gigapascal (abbr. GPa), which is 1000 MPa or 1 billion pascals. The modulus of our sample thus comes to a nice round 3 GPa. This is quite close to the modulus of all common polymers, from plastic cups and dinner plates to plastic eyeglass lenses.
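
For readers who like to check the arithmetic, the worked example above can be reproduced in a few lines of Python. This is a minimal sketch only; the numbers are the round figures of our hypothetical polystyrene strip, not measured data.

    # Worked example: a 0.1 m x 0.005 m x 0.001 m polystyrene strip,
    # pulled with 150 Nt and observed to stretch by 1 mm.

    force = 150.0        # applied load, Nt
    length = 0.10        # original sample length, m
    width = 0.005        # sample width, m
    thickness = 0.001    # sample thickness, m
    extension = 0.001    # measured stretch, m

    area = width * thickness       # cross section, m^2   -> 0.000005 m^2
    stress = force / area          # Nt/m^2, i.e. pascals -> 30,000,000 Pa
    strain = extension / length    # dimensionless        -> 0.01

    modulus = stress / strain      # Hooke's law: sigma = E * epsilon

    print(f"stress  = {stress / 1e6:.0f} MPa")   # 30 MPa
    print(f"strain  = {strain:.0%}")             # 1%
    print(f"modulus = {modulus / 1e9:.0f} GPa")  # 3 GPa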

At this point our understanding of the elastic properties of solids is roughly where things stood at the end of the 19th century. All solids can be characterized by their elastic modulus, which is determined experimentally by stretching a test piece and dividing the applied stress by the induced strain. All was not well, however, since the theory was based on the hypothesis that all solid materials were part of a continuum which looked the same at all scales of length. This notion was already under attack by the late 19th century, since many of the ideas associated with chemical reactions were best explained by assuming that elements were discrete entities. By the mid 20th century the atomic theory of matter was established beyond any reasonable doubt, and not only that but it was further determined that atoms themselves were made up of still smaller units called electrons, protons and neutrons.

So to explore the hidden nature of surfaces or bulk matter we have to look at the atomic level, or at spatial separations of 10⁻¹⁰ m or 0.1 nanometers (abbr. nm). This means we must leave the classical mechanics of 19th century continuum theory behind and delve into the nether world of quantum mechanics. Now we certainly do not have the space here to get into the details of quantum theory, but we can present the relevant results fairly succinctly. Those interested in a more detailed discussion are invited to go to the MST CONFERENCES web site and look up Vol. 2 No. 3 of the conference newsletter under the heading Hamaker Theory or at Atomic Distances the World Follows the Rules of Quantum Mechanics (www.mstconf.com/Vol2No3-2005.pdf).

By the time the 20th century was half over it was well established that the properties of all common materials were fundamentally governed by electromagnetic interactions. Classical electromagnetism was well explained by Maxwell’s equations, and quantum theory explained how the electromagnetic interactions between the fundamental particles such as the proton and the electron served to bind atomic matter together. What was happening within the atomic nucleus was still problematic since an entirely different set of forces was operating, but these forces held the typical nucleus together so strongly and within such a small volume that, for the purposes of understanding the chemical nature of matter, the nucleus could be assumed to be a point mass and a source of positive charge which attracted and bound the electrons.

To get some idea of the sizes and distances involved, consider that a typical atomic nucleus is on the order of 3.4×10⁻¹⁵ m, the electromagnetic radius of the electron comes to about 2.8×10⁻¹⁵ m, and within a typical atom electrons and protons are separated on average by roughly a Bohr radius, which is close to 5.3×10⁻¹¹ m. From these numbers one readily perceives that atomic matter amounts to relatively small point masses separated by huge distances. To put the matter in better perspective, simply scale the numbers up by a factor of 10¹³. The nucleus then would have a radius of 3.4 cm, roughly that of a small orange, the electron would come to 2.8 cm, or a crab apple, and they would be separated on average by a distance of 530 meters, or roughly half a kilometer! Thus the world at the atomic level seems to be mostly empty space.

Or is it? What is it that fills all that space and gives the impression that atomic matter is something sturdy and substantive? The answer, as stated above, is the electromagnetic field. In essence it is the electromagnetic field between the charged proton and electron that provides the substance of the hydrogen atom, or any other atom for that matter. It turns out that the square of the electric field is proportional to an energy density, typically measured in joules per cubic meter (abbr. J/m³), where the joule is the SI unit of energy named in honor of James Prescott Joule (1818-1889, an English physicist and brewer). Now we know from our elementary physics courses that the joule is formally defined as the work done by a force of 1 Nt acting through a distance of 1 m. Thus if you lift an apple off the floor and raise it to a height of 1 m you have expended 1 joule of energy in the process. We see then that the joule has the fundamental units of Newton-meter (abbr. Nt-m). Thus the units of energy density can be written J/m³ = Nt-m/m³ = Nt/m², or, putting it succinctly, the units of energy density are the same as the units of stress or modulus. So the energy density associated with the electric field generated by the electron and proton in the hydrogen atom gives rise to a modulus of some sort.
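
The scale-up argument above is easy to check for oneself. Here is a minimal sketch in Python using only the numbers quoted in the text; it does nothing more than multiply by 10¹³ and convert units.

    # Scale atomic dimensions up by 1e13 and express them in everyday units.

    scale = 1e13

    nucleus_radius  = 3.4e-15   # m, a typical nucleus
    electron_radius = 2.8e-15   # m, electromagnetic (classical) electron radius
    bohr_radius     = 5.3e-11   # m, mean electron-proton separation

    print(f"nucleus    -> {nucleus_radius * scale * 100:.1f} cm")   # ~3.4 cm, a small orange
    print(f"electron   -> {electron_radius * scale * 100:.1f} cm")  # ~2.8 cm, a crab apple
    print(f"separation -> {bohr_radius * scale:.0f} m")             # ~530 m, half a kilometer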

It turns out that the electric field close to an electron or proton is exceedingly intense. The newsletter article mentioned above derives a simple formula for estimating the energy density associated with the electric field close to an electron and the results are summarized in the following table:

TABLE 1: Electromagnetic energy density in a neighborhood of an electron at representative atomic distances estimated from Eq.(10) (Newsletter article www.mstconf.com/Vol2No3-2005.pdf)

DISTANCE FROM ELECTRON (angstrom = 10⁻¹⁰ m) | ENERGY DENSITY / ELECTROMAGNETIC MODULUS (GPa) | COMMENTS
4.0  |   2.8 | Roughly the modulus of a thermoplastic, e.g. polystyrene
1.37 | 208.0 | Modulus of a metal such as steel
1.0  | 732.0 | Modulus of a refractory material like silicon carbide or diamond

Table 1 clearly shows that the electromagnetic energy density associated with the electron’s electric field can account, at least in a heuristic fashion, for the elastic properties of all forms of common matter.
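
As a quick sanity check on Table 1, note that the field of a point charge falls off as 1/r², so the field energy density, which goes as the square of the field, should fall off as 1/r⁴. The short Python sketch below assumes only that scaling, not the detailed prefactor of Eq.(10) in the newsletter, and reproduces the tabulated values to within a few percent.

    # Check that the Table 1 entries follow the 1/r^4 scaling expected for the
    # energy density of a point-charge field (energy density ~ field squared ~ 1/r^4).

    table = [      # (distance in angstroms, energy density in GPa) from Table 1
        (4.0,    2.8),
        (1.37, 208.0),
        (1.0,  732.0),
    ]

    r_ref, u_ref = table[0]   # use the 4.0 angstrom entry as the reference point
    for r, u in table:
        predicted = u_ref * (r_ref / r) ** 4
        print(f"r = {r:4.2f} A   table: {u:6.1f} GPa   1/r^4 scaling: {predicted:6.1f} GPa")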

This seems like a rather remarkable conclusion, since it is also the electromagnetic field associated with sunlight that warms us on a sunny day. Even though solar radiation can warm us it certainly does not appear to have any appreciable substance. Nevertheless sunlight also presses down on us with a very weak but nonzero pressure. In particular, if you step out on a sunny day and hold your hand facing the sun it receives a thermal input of about 7 watts, which is just enough to detect. What is not detectable but still present is the fact that the solar radiation exerts a pressure on your hand of roughly 4.5×10⁻⁶ Pa, which is so small that you would need a balance with nanonewton resolution to detect it. So it is essentially a matter of intensity that determines the apparent behavior of electromagnetic fields. The field close to an electron exerts a reactive force so intense that it closely emulates the behavior of solid matter, whereas the field associated with common light is so weak it seems to have no substance at all.
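
The two numbers in the preceding paragraph follow from one simple rule: for light absorbed by a surface, the radiation pressure is the irradiance divided by the speed of light. Here is a minimal sketch, with round-number assumptions for the solar irradiance at the ground and the area of a hand.

    # Warmth versus push from sunlight on an outstretched hand.
    # The irradiance and hand area below are rough assumptions, not measurements.

    c = 3.0e8              # speed of light, m/s
    irradiance = 1.4e3     # solar irradiance on a clear day, W/m^2 (approximate)
    hand_area = 0.005      # area of a hand facing the sun, m^2 (approximate)

    thermal_input = irradiance * hand_area   # ~7 W, just enough warmth to feel
    pressure = irradiance / c                # ~4.7e-6 Pa for fully absorbed light
    force = pressure * hand_area             # ~2e-8 Nt, far below what we can sense

    print(f"thermal input      = {thermal_input:.1f} W")
    print(f"radiation pressure = {pressure:.1e} Pa")
    print(f"force on the hand  = {force:.1e} Nt")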

This basic fact was brought home to me many years ago as a graduate student doing a project on particle physics.  At that time, long ago, we were analyzing bubble chamber tracks from proton collisions generated by the Zero Gradient Synchrotron at Argonne National Laboratory.  We were essentially using highly specialized camera equipment to analyze the tracks of high energy protons as they traversed the chamber.  Mostly what you saw on a given exposure were the slightly curved tracks of the protons as they whizzed through the chamber.  The tracks had curvature since the chamber was immersed in a strong magnetic field which exerted a force perpendicular to the travel direction of the proton.  Every dozen or so frames, however, there would be an event whereby one of the incident protons would smash into one of the protons of the hydrogen atoms filling the chamber, since the entire contents of the chamber was liquid hydrogen.  A collision event would appear as a proton track suddenly terminating at a point in the chamber, with a number of separate tracks going off in all directions.  Two protons essentially collided and exploded into a number of other particles, indicating that the proton is not an elementary particle but is made up of still smaller entities.  What caught my eye on occasion, however, was the fact that some of the collision events gave off a phantom particle that would leave no track near the collision point.  The only way one would know that such a particle had been emitted is that at some distance from the primary collision point a pair of tracks would appear emanating from a single point and veering off in separate directions, leaving a V-shaped imprint behind.  That phantom particle was in fact a bit of exceedingly high energy electromagnetic radiation called a gamma ray.  These gamma rays were so energetic that they could spontaneously convert into an electron and a positron.  This clearly illustrated the substantive nature of the electromagnetic field by directly turning radiation into two elementary particles.

This basically ends our introductory tutorial on the electromagnetic nature of solid matter. The stage is now set to explore further the consequences of what we have uncovered in relation to the behavior of solids at their surfaces and how they interact with other solids. We commonly deal with plasmas to modify surfaces and indeed the energy density of plasmas is also locked up in intense electromagnetic fields. So we will see that the electromagnetic field has a few more tricks up its sleeve as will be uncovered in future issues of THE SURFACE SCIENCE CORNER.

2. July 2013   3:18 pm
Dr. K. L. Mittal, Dr. Robert H. Lacombe

This issue of the SURFACE SCIENCE CORNER blog begins a series of essays dealing with the general topic of the hidden aspects of surfaces that, though generally invisible to us, have an important influence not only on our day to day activities but also on our ability to manufacture common objects of everyday use.

A common example is the polymer materials we use to encapsulate all manner of foodstuffs to protect them from moisture, oxidation and other unwanted atmospheric influences. A typical packaging resin may be quite effective at protecting against unwanted atmospheric invaders but at the same time impossible to label or decorate with commonly available inks. The ability of a given surface to be written on by a given ink is controlled by the surface energy of the polymer, and the surface energy of any given material is just one among a host of invisible surface properties.

Thus the underlying thread in all of the discussion will be the fact that although most properties of surfaces may be invisible to us they nonetheless have a rather profound influence on not only our ability to manufacture items of common commerce but our everyday experience as well.

The following is an outline of the topics to be explored in more detail in subsequent issues of the blog:

1. OPTICAL PROPERTIES: What we see is only an infinitesimal fraction of the radiation emitted by surfaces.

2. SURFACE ENERGY: A world that can only be explored through the lens of thermodynamics.

3. SURFACE ANALYSIS METHODS: ESCA, AUGER, TOF-SIMS, EELS, … the alphabet soup of surface analysis techniques that allow us to examine surfaces at the atomic level.

4. CONTACT ANGLE: A poor man’s surface analysis tool you can implement in your kitchen, which curiously enough turns out to be superior to the high powered methods in an interesting way.

5. SURFACE FORCES: Ever wonder why insects and the gecko lizard can walk upside down on your ceiling or nearly any other surface? The surface van der Waals forces are of exceedingly short range but their influence extends to surprisingly large distances.

6. ADHESION: Mostly you find that when two objects come into contact they do not adhere very well. Some materials, however, seem to stick to nearly everything. These behaviors can only be sorted out through an understanding of surface forces and interactions.

7. CONTAMINATION: Surfaces are an invisible refuge for all manner of foreign species, and these invaders can alter the surface’s properties in ways both benign and malign. The oil layer on the cylinder walls of your car’s engine is absolutely critical to its life and function. That same oil can also prevent you from painting over a bare spot on the fender.

8. TRIBOLOGY AND FRICTION: An invisible property that determines whether things stick or slip or whether your tires will keep you on the road or send you to the gutter.

9. SURFACE MODIFICATION: Getting surfaces to behave the way we want.

10. LIFE ON SURFACES: Microbes are another invisible inhabitant in the world of surfaces and have many important consequences including biofouling of pipes and marine surfaces and also the common infections that can lay us low.
