An Overview of Human Development Issues
David S. Walonick, Ph.D.
Each of us invents informal ways of looking at our own and other people’s growth. These paradigms of human development, while obviously lacking in scholastic rigor, provide us with a conceptual framework for understanding ourselves and others. This paper will discuss several aspects of human development, including consciousness, intelligence, learning, memory, motivation, personality and aging.
Psychological models of development
A psychological model is an attempt to map some of the dimensions of behavior. Martone and Fredenburgh (1973) summarize four general models of human development.
1) The Freudian psychoanalytic model views man as an animal caught in a state of conflict between primal urges and civilized forms of behavior. Motivation is a manifestation of subconscious sexual and aggressive urges.
2) The behaviorist model views man as a machine. Learning involves the association of stimulus and response, or secondary reinforcers. Classical conditioning theory best exemplifies the behaviorist model.
3) The humanistic model describes behavior from the perspective of self-perception. The model draws heavily on existential themes. People are engaged in the acts of becoming and transcending.
4) The consistency model stresses the idea that people retain the same behavior patterns throughout their lives. Individuals tend to follow established principles and courses of action. Cognitive-dissonance theory best illustrates this model.
There is a wide diversity of characteristics involved in human development. The most common paradigm is to view human development through the physiological-psychological dichotomy. Some traits seem to be genetically determined, while others appear to be of a cognitive origin. It is only when we integrate the two perspectives that a coherent picture of development emerges. For example, as the adolescent reaches puberty (a physiological change), there are also accompanying psychological changes, and the emotional needs and drives of the individual change.
British neurologist John Lorber has examined several children who were victims of hydrocephalus, or water on the brain. These individuals often have less than five percent of the normal brain cortex, yet they sometimes have higher than average IQ scores, and show little or no thinking impairment. This finding casts serious doubt on the notion that the cerebral cortex is the seat of consciousness. (Talbot, 1988)
Descartes spoke of the dualistic nature of the physical and mental universes. Many modern thinkers have abandoned this way of thinking and have instead adopted a materialistic approach. They believe that the only things that should be studied are the physical components of the brain, and that consciousness can be understood from a biological perspective. The latest approach, called functionalism, is to view consciousness as a structure, rather than a substance.
Physicist Fred Alan Wolf predicts that “the mind will not be found in any physical pattern of our brain material”. (Talbot, 1988, p. 100) He believes that consciousness is a nonlocal entity, and like the quantum itself, cannot be found in any single location. This holistic approach encompasses the idea that all things are somehow connected.
British neurophysiologist Sir John Eccles believes that consciousness exists apart from the brain itself, and he has identified the portion of the brain that might be responsible for the interaction between matter and spirit. (Talbot, 1988) This region, known as the supplementary motor area (SMA), was first explored in the 1920s. Several neurophysiologists found synaptic brain activity in the SMA that preceded muscular movement by nearly a full second. This increase in electrical activity became known as readiness potential.
In 1980, Swedish neurophysiologists Nils Lassen and Per Roland used a new radioactive technique to map brain activity before and during complex motor tasks. They found that intention (the thought of making a movement) began as brain activity in the SMA, and it preceded all other brain activity. Eccles points out that the bursts of activity in the SMA nerve cells were not triggered by other nerve cells in the brain–they originated there. Eccles takes this as irrefutable evidence that “a mental act of intention initiates the bursts of discharges in the nerve cells”. (Talbot, 1988, p. 107)
Michael Talbot describes the activity of the SMA following mental intention as the software of consciousness, while the physical material of the brain is the hardware. The link between intention and SMA electrical activity is at the “edge of physical reality as we know it, and still something seems to lie beyond”. (Talbot, 1988, p. 110) Talbot believes that something is pure information. This seems similar to Jung’s theory of a universal consciousness. Physicist David Bohm has hypothesized a holographic picture of consciousness, where our brains are somehow interconnected with the rest of the universe. (Wilber, 1982, p. 44-104)
For many years, we believed that IQ tests provided a reliable indicator of human intelligence. We held that they provided an accurate picture of how well a person would do in school. We even accepted their limitation that they were insensitive to racial and cultural differences. However, in 1987 Yale psychologist Robert Sternberg announced that IQ tests only account for between 5 and 25 percent of the variance in scholastic achievement. Sternberg asserts that current tests lack emphasis on practicality, novelty, and creative thinking. “In requiring only the answering of questions, IQ tests are missing a vital half of intelligence–the asking of questions”. (Ferguson, 1990, p. 111) In his book, Beyond I.Q. (Cambridge University Press, 1985), Sternberg states that intelligence involves three major components: Context is the ability to adapt to or shape the environment. Experience is the set of learning situations that a person has been exposed to. Components are the structures and mechanisms that underlie intelligence.
New Zealand psychologist J.R. Flynn agrees. He believes that IQ tests measure an abstract problem-solving ability that is sometimes correlated with intelligence, but that true measures of intelligence must involve issues of motivation, cultural context, values, incentives and environmental characteristics (Ferguson, 1990). Flynn studied the results from 14 nations and concluded that IQ scores had increased significantly in the last generation, but he points out several contradictions. For example, American and Norwegian IQ scores went up, while academic achievement in these countries declined. In addition, American students of Japanese and Chinese descent usually had lower IQ scores, but often attained higher levels of academic achievement than their white counterparts.
Benjamin Bloom and colleagues at the University of Chicago studied top performing Americans in six fields. They found that a child’s determination (not innate gifts) was the most important predictor of success. Persistence and eagerness were found to be strongly related to achievement. Bloom believes that these qualities are learned from parents, teachers, and peers. (Ferguson, 1990)
Drake University researchers Margaret Lloyd and Theresa Zylla presented convincing evidence in 1989 that offering students rewards could increase their IQ scores by up to 14 points. Students were given an IQ test to establish a baseline. A second, equally difficult, IQ test was administered, but half the students were promised prizes for correct answers. Fifty percent of these students showed significant improvement in test scores, while only one-eighth of the students in the other group showed improvement. (Ferguson, 1990)
Piaget’s theory of intellectual development was the first to challenge the behaviorist model. While working on the standardization of a children’s intelligence test, Piaget began focusing on children’s wrong answers. He became convinced that older children were not simply “smarter” than younger children, but that qualitative differences existed in their thinking. He adopted an interview style of investigation, which was substantially different from traditional quantitative testing procedures. Piaget, an epistemologist, believed that we could not understand intelligence unless we studied its formation and evolution in childhood. (Ginsburg and Opper, 1969)
Piaget’s discoveries included the concepts of object-constancy and conservation, or the ability of a child to understand that some physical attributes of objects (e.g., weight, substance, volume) are not altered if the object changes shape. These concepts did not appear to be something that could be taught to a child, but rather, they appeared at a specific age during the child’s development.
Piaget put forth many definitions of intelligence, but they all seem to be general…”a particular instance of biological adaptation”, “the form of equilibrium toward which all the cognitive structures tend”, and “a system of living and acting operations”. (Ginsburg and Opper, 1969, p. 14). Intelligence, according to Piaget, involves adaptation, equilibrium and evolution. Operational thinking (or the ability to manipulate abstract concepts in creative ways) is the outward manifestation of intellectual growth. Piaget recognized that emotions somehow influenced thought, but did not explore the concept in his research.
Piaget postulated that all human development involves the biological features of organization and adaptation. Organization is the tendency to systematize our world into coherent structures. Adaptation is the tendency to adapt to the environment through either assimilation or accommodation. We assimilate features of the “external reality” into our own psychological structures, and we accommodate by adapting our psychological structures to meet the demands of our environment.
Piaget’s understanding of intelligence changed over time. Originally, he was concerned with the development of verbal communication and moral issues. This gave way to a study of the child’s assimilation of various scientific and mathematical ideas such as space, time, velocity, and causality. Piaget’s later thinking is similar to that of author Hans Furth. In his book, Thinking Without Language (Free Press, 1966), Furth states that human intellect grows through contact with the environment… and that this growth occurs even when there is no linguistic system available.
How we learn has been an area of intense study in the field of psychology. All theories of learning embody the idea of establishing relationships between pieces of information.
The connectionist theory of learning holds that a bond is established between a given stimulus and response. This is known as the S-R law. Instrumental conditioning, in which we repeat a behavior that has been linked with a reward, was first proposed by Thorndike in 1898. In 1913, Watson expanded the theory to include human learning. Classical conditioning theory was developed by Pavlov in 1927 while studying the salivation of dogs in anticipation of food. Skinner (1951; 1953) developed operant conditioning further, exploring continuous, interval and ratio reinforcement schedules, as well as positive and negative reinforcement. Shaping behavior involves reinforcing a series of successively closer approximations of the desired behavior. Skinner (1954) is best known for his support of the behaviorist school and his work with educational teaching machines.
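The reinforcement mechanism described here can be illustrated with a toy simulation. This is a minimal sketch, not a model from the literature: the function name, the learning rate, and the starting response probability are all arbitrary choices made for illustration. Each rewarded response strengthens the stimulus-response bond, so the response rate climbs over repeated trials.

```python
import random

def simulate_reinforcement(trials=200, learning_rate=0.05, seed=42):
    """Toy model of continuous reinforcement: every rewarded response
    nudges the probability of repeating that response toward 1."""
    random.seed(seed)
    press_prob = 0.1  # initial, unconditioned response rate (arbitrary)
    history = []
    for _ in range(trials):
        pressed = random.random() < press_prob
        if pressed:  # continuous schedule: every response is rewarded
            # reinforcement strengthens the stimulus-response bond
            press_prob = min(1.0, press_prob + learning_rate * (1 - press_prob))
        history.append(press_prob)
    return history

history = simulate_reinforcement()
print(f"response probability: {history[0]:.2f} -> {history[-1]:.2f}")
```

An interval or ratio schedule could be modeled by rewarding only some of the responses, which is why those schedules produce slower but more extinction-resistant learning.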
The cognitive theory emphasizes the role of perception, attitudes and beliefs in the learning process. Gestalt theory (Köhler, 1929; Koffka, 1935; Perls, Hefferline, & Goodman, 1951) embodies the two notions that “perception is organized, and that the organization tends to be as good as the stimulus conditions permit.” (Deutsch and Krauss, 1965, p. 16) In Gestalt theory, the whole system contains more than the sum of the parts. The greatest learning takes place as a sudden insight when an individual perceives the gestalt (i.e., organized whole). Field theory (Lewin, 1935; Festinger, 1954) proposed the idea that tension within a person drives the individual to act in ways that will release the tension. According to this theory, learning occurs because a person is driven to reach a goal.
While studying outstanding achievers, Bloom (1985) discovered three phases to the learning process. Phase one occurred before age ten, when children were exposed to a field more often by circumstance than personal choice. During phase two (10-14 years), children became “possessed” with the field, devoting themselves to prolonged study and training. Finally, in phase three (16-20+ years), they shifted from technical precision to personal expression.
One recent idea of learning was proposed by Marc Bornstein and Mariam Sigman (1987). Observing four-month-old babies, they theorized that visual and auditory attention are good indicators of later scores on intelligence tests. Their reasoning is that an alert, attentive child is driven to explore the environment, and thus has greater learning experiences. (Ferguson, 1990, p. 111)
One novel theory of learning is proposed by George Leonard, former senior editor of Look. Leonard believes that the mind-set of a “fool” is most conducive to learning. The fool sees things as if they were being seen for the first time. The baby is allowed to babble (like a fool) and thus learns at very rapid rates. A “beginner’s mind-set” is associated with accelerated learning. (Ferguson, 1990, p. 119)
Human development is dynamic: it describes the changes that everyone passes through during life. The epigenetic principle states that genetic/hereditary physiological changes lead to developmental milestones which we normally achieve and pass. The critical period hypothesis states that if a change doesn’t happen when it is supposed to, it doesn’t happen at all. (Martone and Fredenburgh, 1973)
Another way of learning
Cambridge biochemist Rupert Sheldrake (1981) has pointed out many mysteries in the field of morphogenesis, the study of how living forms come into being. While it is clear that DNA is the blueprint, we currently have little idea how individual cells differentiate and become various organs of the body. Before a cell differentiates, it has the capability to take on any form or function. How does an individual cell know what to become? Our understanding of regulation, the ability of an organism to adjust itself to an unexpected change, is still in its infancy, and we have almost no knowledge of how regeneration works. (Talbot, 1987)
The problems in understanding morphogenesis encouraged Sheldrake to postulate the existence of a new “morphogenetic field” (M-field). Sheldrake believes that M-fields surround all living organisms and govern their growth and structure. Even more important, though, is the proposal that habits and behaviors of a species build up, and through a process called morphic resonance, the information is transferred to the new members of the species. Morphic resonance involves the transfer of information within a species without a direct genetic connection.
Several astonishing experiments have been done that support this theory. Harvard psychologist William McDougall (1927) ran a series of experiments beginning in 1920 to study how rats performed in a T-maze. Each generation of rats seemed to be able to learn the maze faster than the previous generation. In fact, after twenty-two generations, rats figured out the maze ten times faster than the first generation had. Even more astonishing, rats that were not offspring of the trained rats also acquired the enhanced learning ability. In other words, it appeared that information was somehow being transferred, although there was no contact between the rats.
Scottish researcher F.A.E. Crew (1930) set out to disprove McDougall’s results. Crew’s rats picked up where McDougall’s had left off. Even though a completely different set of rats was involved, they seemed to have acquired the knowledge of McDougall’s rats. Australian researcher W. E. Agar (1938) ran a similar set of experiments for twenty-five years with identical results. In another set of experiments with fruit flies, English biologist Mae Wan Ho (1983) reported that genetic mutations in one population were acquired by a completely unrelated population. (Talbot, 1987, p. 68-70)
If the M-field theory is correct, then human development may involve more than we usually believe. If new thoughts and behaviors were to somehow become habitual in a sufficient number of people, it would become increasingly easier for other members of humanity to “tap into” this information. Biologist Lyall Watson (1979) coined the term “hundredth monkey effect” to describe the critical mass required before information transfer could occur. While studying Japanese monkeys in the 1950s, he observed that when a sufficient number of monkeys learned a new skill, it quickly became part of the repertoire of all monkeys in the colony. Even more remarkably, the new skill simultaneously became part of the repertoire of other monkeys that lived on different islands.
Watson believes this evidence points to the prospect of a “group mind”, similar to Carl Jung’s theory of the collective unconscious. Sheldrake’s M-field theory has produced a heated response from the scientific community. Nature, one of the most prestigious British scientific journals, condemned Sheldrake’s work, calling it “the best candidate for burning”. (Talbot, 1987, p. 77) Sheldrake’s theory, right or wrong, is testable, and we can expect to see future studies in this area.
Aristotle believed that the formation of memory was like “tracing a signet ring on wax”; it depends to a great extent on the state of the brain. He believed that the young and the old had poor memories because they are in a state of flux, either growth or decay.
It is currently estimated that we use only .01 percent of our brain’s capacity. The brain weighs a mere three pounds, yet it possesses nearly unlimited potential. It contains between 10 and 15 billion nerve cells. Richard Restak (1984) claims that the brain can store more information than all the libraries in the world.
Johns Hopkins neurophysiologist Neal Cohen believes that memory defines our concept of self. It “pervades all that we do, what we are, our personalities, how we interact with other people…” (Restak, 1984, p. 219) Our behavior is built from our memories, even though we don’t perceive it as such.
Russian psychologist Aleksandr Luria had the opportunity to test a Moscow reporter known to have a remarkable memory. Writing about those sessions in The Mind of a Mnemonist, Luria reports that the reporter (referred to as S) had a near perfect memory and could recall long strings of random digits thirty years later. S described his technique as mostly visual, but also one of sensory cross-over (synesthesia) where sights, sounds, taste, smell, and feelings would become intermingled. For example, he would speak of the color of a person’s voice. (Restak, 1984)
Luria’s study suggests that memory depends on: 1) the construction of stark and original images, 2) focused concentration at the moment of memorization, 3) practice, and 4) an intense desire to improve one’s memory.
In spite of all the technological progress in computers, it is clear that the mind is not just an advanced computer. The mind has capabilities that the computer cannot begin to imitate. One of the most significant differences is that the mind can easily recognize, complete and correct patterns, as well as accommodate ambiguity. Computers perform these functions quite poorly. Some recent computer programs incorporate a technique known as fuzzy logic in an attempt to model these aspects of the human brain. For example, all people are different, yet we can easily classify someone as a person. Our definition of a “person” is necessarily vague in order to encompass all people. Fuzzy logic is an attempt to model that intentional ambiguity.
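The idea behind fuzzy logic can be made concrete with a single membership function. The sketch below is purely illustrative (the category “tall” and the 160-190 cm ramp are arbitrary choices, not drawn from the text): instead of a crisp yes/no answer, an item belongs to a category to a degree between 0 and 1, which is how fuzzy systems model intentional vagueness.

```python
def tall_membership(height_cm):
    """Degree of membership in the fuzzy set 'tall': 0 below 160 cm,
    1 above 190 cm, and a linear ramp in between."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

for h in (150, 170, 185, 195):
    print(f"{h} cm -> membership {tall_membership(h):.2f}")
```

A classical (crisp) predicate would force a single cutoff, calling a 169 cm person “not tall” and a 170 cm person “tall”; the graded membership avoids that artificial boundary.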
Restak (1984, p. 203) describes the experiments of psychologist M. Cole that showed vast cultural differences in memory. When twenty unrelated words were presented to Liberian rice farmers, they could recall only half of the words. If the words were incorporated into a folk story, their memories improved dramatically. Memorizing unrelated information is not compatible with a Liberian rice farmer’s learning paradigm. It is, however, very much a part of Western education. Additional studies have shown that the ability to memorize seemingly meaningless information is positively correlated with education.
Experiments with different cultures have shown that cultures that rely heavily on verbal communication have better memories than those that depend on written communication. The use of memory systems is strongly correlated with the unavailability of books. Cultures that depend on written records experience a kind of disuse atrophy. (Restak, 1984).
Stanford researcher Gordon Bower and his coworkers reported (1982) that our mood can affect our recall ability. They found that we best remember happy events when we are in a happy mood, and vice versa. Additionally, they found that our mood during the learning process was also the mood that would most likely enable recall. “A given memory record can be retrieved only by returning to that library, or physiological state, in which the event was first stored”. (Ferguson, 1990, p. 122)
In a comprehensive report, Remembering and Forgetting: An Inquiry into the Nature of Memory, Edmund Blair Bolles (1988) points out that ancient and medieval cultures frequently used a memory system to help them recall extensive information. The system involved associating visual imagery with particular spaces. By the 1300s, paper had become readily available, and as we became increasingly reliant on paper, our memories deteriorated. Bolles believes that learning occurs when we pay attention to those things which deny our assumptions. He maintains that “honest and responsible remembering is a product of a lifetime of practice. It doesn’t just happen; it demands commitment”. (Ferguson, 1990, p. 122)
Researcher David Meir (1984) conducted a year-long study of students from the University of Wisconsin. On a simple recall test, students who used mental imagery performed 12% better on immediate recall, and 26% better on long-term retention tests. Furthermore, the more senses that are involved in the learning process, the greater the retention. At the Center for Accelerated Learning, Meir uses a variety of techniques to construct a total learning environment, including a relaxing environment, high-energy involvement, music, games, songs, positive suggestions, and humor. (Ferguson, 1990)
There are two parts to the process of memory–storage and recall. Many researchers believe that we store all our experiences. Considering the vast amount of information that we are exposed to during our lifetimes, this is truly an enormous task. Karl Pribram (1969) of Stanford Medical School developed a holographic theory of the brain to explain how this is possible.
Working with Karl Lashley (1950) on a series of experiments with rats, Pribram hypothesized that memory is not stored in a specific location in the brain, but rather, it is distributed in some manner throughout the brain, much like a hologram. Physicist David Bohm supports Pribram’s holographic theory, and believes that the entire universe operates in a holographic fashion. (Wycoff, 1991)
If we are storing all experience in our brains, then why can we not recall everything? The main reason that we forget information is interference. Experiences interfere with each other, and recall diminishes because the retrieval patterns are lost.
In her book Mindmapping, Wycoff (1991, p. 17) states that there are four primary ways that we can improve memory:
- Repetition is the customary method of learning. There is no doubt that it works, but it may not be the most effective method.
- Association involves linking a piece of information with some other information already in our memory.
- Intensity refers to the emotional content of information. Information that has strong intensity or emotional content will be more easily remembered.
- Involvement of more than one of our senses will help memory. The more senses that are involved, the easier it will be to remember. Working with information increases the chances of remembering it.
The homeostatic model of human motivation states that biological “need” is the driving force behind behavior. (Thorndike, 1898; Watson, 1913) The theory views a “need” as physiological deprivation. The psychological consequence of a need is “drive” (motivation). The need-drive theory specifies the sequence of events as: 1) need, 2) drive (tension), 3) goal seeking behavior, and 4) drive reduction.
The meta motivational model expands on the homeostatic model. (Fromm, 1941; Maslow, 1954; Allport, 1955) It agrees that physiological deficiencies (D-motives) are primary in nature. It states that once physical needs have been met, there are other being-needs (B-motives) that add to our sense of being. The hierarchy of needs from low to high is: physiological, safety, belongingness, esteem, self-actualization, knowledge, understanding, and aesthetic. The theory acknowledges that the ranking of the needs might not be the same for all individuals.
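The hierarchy in this model can be sketched as an ordered list in which the lowest unmet need dominates motivation. This is a minimal illustration of the model's logic only, not an implementation from the literature; the function name and the set-based bookkeeping are my own.

```python
# The hierarchy from the meta-motivational model, ordered low to high.
NEED_HIERARCHY = [
    "physiological", "safety", "belongingness", "esteem",
    "self-actualization", "knowledge", "understanding", "aesthetic",
]

def dominant_need(satisfied):
    """Return the lowest unmet need: lower (deficiency) needs must be
    met before higher (being) needs motivate behavior."""
    for need in NEED_HIERARCHY:
        if need not in satisfied:
            return need
    return None  # all needs met

print(dominant_need({"physiological", "safety"}))  # -> belongingness
```

As the model itself acknowledges, the ordering might differ between individuals, which in this sketch would simply mean a different `NEED_HIERARCHY` list per person.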
An activation model of motivation is based on the idea that the brain produces “motivation neurochemicals” in response to environmental stimulation. (Luria, 1973; Brown, 1976; Blackmore, 1977) This theory stresses that perception (through receptor organs such as the eyes, ears, and nose) causes neurochemical changes that create an optimal level of excitation in the brain. It embodies the idea that internal stimulation (thoughts) can also provide neural excitation.
The doctrine of instincts states that all humans are born with species-specific behavior (instinct). (Martone and Fredenburgh, 1973, p. 105) The theory encompasses the idea that there is a critical window where specific behaviors develop. If some negative influence prevents it, then the behavior will never occur or it will be delayed. Once the behavior occurs, it will be very resistant to extinction. The theory states that individuals become ready for certain information for brief stages during their development. Failure to grow during these critical periods is, for the most part, irreversible. (Piaget, 1952)
Drives that exert a positive influence on behavior are called appetitive drives. Examples are eating to satisfy hunger, or drinking to satisfy thirst. Food and water would be called primary reinforcers because they directly address the deprivation. Aversive drives exert a negative influence on behavior. An example would be avoidance behavior. (Martone and Fredenburgh, 1973, p. 110)
There are many alternatives to drive theory. Incentive theory promotes the idea that incentives pull the organism, whereas drives push it. (Janis and Gilmore, 1965) Reinforcement theory states that whatever motivates behavior also reinforces (rewards) it. (Miller and Dollard, 1941; Hull, 1943) Cognitive dissonance theory (Festinger, 1957) embodies the idea that we attempt to maintain consistency in our attitudes, beliefs and behaviors. When two items are in conflict, the dissonance that exists in our minds provides the motivation for behavior.
In the 5th century B.C., Hippocrates proposed the idea that personality and physique are related. In 1942, Sheldon and Stevens developed a numerical method of classifying body types based on bone and muscle structure. (Fryer et al., 1954, p. 207; Engle, 1964, p. 157) Three temperaments (personality types) were associated with physical characteristics. This led to the idea that personality is constitutionally (genetically) determined. In spite of the unpopularity of the theory, some researchers have found evidence confirming that personality is correlated with physique. (Martone and Fredenburgh, 1973)
Most researchers, however, believe that personality is a function of both genetics and environment. Monozygotic twins have identical genetic structures because the egg splits into two parts during the first cell division. (Martone and Fredenburgh, 1973) The twins’ physical characteristics and intelligence are very similar, yet personality differences emerge due to social forces. Personality traits express the uniqueness of our individuality.
There are two opposing theories to describe the development of personality traits. One explanation, known as the gene block theory (Cooley, 1902; Tarde, 1903; McDougall, 1908; Ross, 1908), maintains that genetics is responsible for personality differences. The opposite view is described by sociocultural theory (Watson, 1930; Lewin, 1935), where cultural experience creates a mold for the personality. Both theories are partially correct. (Hebb, D., 1966, p. 192-196)
Personality traits are often classified by category. Examples of categories are: sociability, emotional stability, objectivity, friendliness, and thoughtfulness. Others have described personality traits as consisting of bipolar dimensions. People are viewed as falling somewhere on a continuum between two extremes. Examples are: bright vs. dull, calm vs. unstable, radical vs. conservative, insecure vs. confident, and suspecting vs. accepting.
There do not seem to be any universal rules about aging. Cognitive aging, however, does involve a reduction in the number of active brain cells. This somehow interferes with short-term memory, while leaving long-term memory relatively intact. CAT scans of very old people confirm a measurable loss of brain cells, resulting in increased difficulty with new learning. (Restak, 1984) It is important to note, however, that intelligence grows through middle age, and there is no marked decline in I.Q. until after seventy years of age. (Martone and Fredenburgh, 1973) Less than 5% of those over age sixty-five have any significant memory impairment. (Ferguson, 1990)
University of Michigan gerontologist Marion Perlmutter believes that growth continues throughout life, and that “mental decline is not the result of aging.” (Ferguson, 1990, p. 198) During a study of people over eighty, Perlmutter discovered societal biases against the elderly. For example, when young people spend more time on a problem, we call them thoughtful, whereas an older person might be called senile. As Perlmutter points out, spending more time in thought may be a demonstration of greater wisdom.
Modern research indicates that some forms of intelligence show steady improvement throughout a person’s life. (Hutchison, 1986, p. 44) The term crystallized intelligence was described by psychologist Daniel Goleman (1984) as “a person’s ability to use an accumulated body of general information to make judgments and solve problems”. Goleman believes that being mentally active, educated, and flexible are the factors that enable continued intellectual growth. Psychologist John Horn’s findings confirm that older people are better able to incorporate a wider variety of information in problem solving.
Alex Comfort, author of A Good Age (Crown, 1972), believes that 75% of the “aging” in this country is a cultural self-fulfilling prophecy made up of folklore, prejudices, and misconceptions about age. Comfort states that many physical illnesses are misdiagnosed as “old age”, and his reasoning was embraced by the National Institute of Mental Health. A Presidential Commission on Mental Health concurred, and stated that “sick old people are sick because of illness, not because of old age”. (Ferguson, 1990, p. 200)
University of Wisconsin researcher Dean Rodenheaver (1983) describes two seasons of love in an adult’s life. Generative love centers on family and home and often involves sacrifices for the next generation. Existential love comes only with greater maturity, and it involves a deep awareness of the fleeting nature of life. (Ferguson, 1990)
Harvard psychologist Ellen Langer points out that shared cultural beliefs about aging are so strong that they become shared realities. She believes that our beliefs directly affect our aging process. In her book Mindfulness (Addison-Wesley, 1989), Langer cites a 1970s study where researchers created a 1959 environment for the research participants. Men between the ages of 75 and 80 were placed in the environment and shown pictures of themselves as they had been twenty years before. Within a few days, participants showed a marked “decrease in apparent age”, both psychologically and physiologically. Biological changes included increased finger lengths, sitting height, improved manual dexterity, and improved vision. Langer concludes that mindfulness (remembering to pretend) is sufficient to produce youthfulness. (Ferguson, 1990)
Researchers Margaret Linn and Kathleen Hunter (1979) of the Veterans Administration Medical Center in Miami, Florida interviewed 150 people who were 65 or older. They found that self-perceptions of one’s relative age are an accurate indicator of psychological functioning. Participants were asked the question “Compared with others your age, how do you feel?”. Those who viewed themselves as younger had greater self-esteem, more life satisfaction, were of a higher social class, and had better health. The most important finding of this study was that people who feel that they have control over their own lives also feel younger. (Ferguson, 1990, p. 199)
Neurophysiologist John Lilly in The Center of the Cyclone (1972, p. 9) echoes Linn and Hunter’s findings. “What one believes to be true, either is true or becomes true in one’s mind, within limits to be determined experimentally and experientially. These limits are beliefs to be transcended.”
If we continue throughout our lives to expose ourselves to new experiences, to take on new challenges, and to be flexible to change, then we set the stage for our continued growth. Furthermore, even if we let certain intellectual skills go dormant, they respond quickly to an enriched environment.
© Dr. David Walonick – used with permission