Friday, May 24, 2024


Communalism is the belief in the primacy of one's own religion over others, often leading to conflict and even violence among religious groups. The political identity of communalism emphasizes the superiority of one religious group over others.

The renowned historian Romila Thapar, in her book The Past as Present: Forging Contemporary Identities Through History, published in 2014, suggests that the ideology of communalism defines groups in terms of religion and that the identity that is formed becomes more significant than any other identity. Thapar further states that communalism is the political exploitation of religion - religion is used as a mechanism to control society. It takes the form of deliberately opposing secularism and rationality, wherein political parties draw on religious identities, using religion as a basis to spread hate and violence.

In their book, India’s Struggle for Independence, published in 2016, historians Bipan Chandra, Mridula Mukherjee, Aditya Mukherjee, Sucheta Mahajan, and K. N. Panikkar describe three basic elements of the ideology of communalism. First, people who follow the same religion have common political, economic, social, and cultural interests.

Second, in a multi-religious society, the political, economic, social, and cultural interests of people from one religion are different from those of people from another religion. Third, the interests of people from different religions are mutually incompatible, antagonistic, and hostile. This is where communalism takes the form of fascism, as it is based on fear and hatred, and has the possibility of leading to violence.

Romila Thapar suggests that the ideology of communalism is a relatively recent phenomenon, specifically arising in 19th century India. Bipan Chandra also emphasizes that communalism is a political trend of modern times, with roots in the social, economic, and political conditions of modern Indian history. In the book India’s Struggle for Independence, Bipan Chandra and his co-authors suggest that communalism emerged as a result of colonialism, marking a major shift in politics after the revolt of 1857, famously known as the first war of Indian independence.

Historians argue that the Britishers' divide-and-rule policy is responsible for the emergence of the political ideology of communalism. Communalism, propagated by the Britishers, was accepted by many Indian leaders as well as commoners, leading it to develop into a political ideology.

The Britishers propagated communalism strategically by associating it with the history of India, indicating that it has always been a part of the past. The beginning of this can be traced back to the book History of British India, published in 1817. The book was written by the philosopher, historian, economist, and political theorist, James Mill. He made significant contributions to empiricism and utilitarianism. Mill’s book laid the foundation for what Romila Thapar calls the communal interpretation of Indian history and provided a justification for the two-nation theory.

James Mill

Romila Thapar describes the impact that James Mill’s book had on Indian politics and society, in her paper Communalism and the Writing of Ancient Indian History, which was published in 1969 as a chapter in the book, Communalism and the Writing of Indian History, which has two other chapters - one by the historian Harbans Mukhia and the other by the historian Bipan Chandra.

In her paper, Thapar writes that James Mill gave a factually incorrect and very arbitrary periodization of Indian history. He was the first person to divide the history of India into three periods - Hindu civilization, Muslim civilization, and British civilization. This was the first recognized work on the history of India, and it had such a huge impact that later historians used similar divisions of Indian history. This periodization is now commonly rendered as ancient history, medieval history, and modern history - with ancient history referring to the Hindu civilization, medieval history to the Muslim civilization, and modern history to the coming of the Britishers to India.

The periodization of Indian history by Mill has been found to be factually incorrect. The Hindu civilization is usually dated from 1000 BCE to 1200 CE, on the grounds that the ruling dynasties were Hindu. This ignores the fact that during this period there were ruling powers like the Indo-Greeks, Shakas, Kushanas, and Mauryans, which did not follow the Hindu religion.

More importantly, Thapar suggests that the pre-Islamic sources related to India have no mention of the term Hindu. It was first used by the Arabs, and in a geographical rather than a religious sense. In the period referred to as the Hindu civilization, the concept of Hindu did not even exist. There was no unified idea of the Hindu religion in those times. The unified idea of Hinduism known today emerged much later, in the post-Gupta period, after the fifth century CE.

Further, Mill’s description of a Muslim civilization, defined by the dominance of Muslim rulers, is also factually incorrect. The rulers from this period were never bracketed as Muslims. They were referred to as Arabs, Turks, or Persians, depending on their place of origin. The single term Muslim was never used for them. In addition, these rulers arrived in India at different times and ruled in different periods and regions. Their kingdoms were largely in the northern parts of India; they ruled in the southern parts much later.

It is also incorrect to say that this period was dominated by Muslim rulers. During this time, there were also strong Hindu kingdoms like the Vijayanagar Empire and many Rajput kingdoms. The date of the arrival of Muslim rulers has also been considered very arbitrary, with some dating it to 1000 CE and others to 1200 CE.

Mill’s periodization, therefore, served communal intentions. It is factually incorrect and was meant only to create a religious divide among Indians. Unfortunately, Mill’s periodization was carried forward by communal historians and was used by communal politicians in their agenda to spread hatred. It was challenged by historians much later.

Along with the inaccurate periodization, James Mill also heavily criticized the Hindu civilization, which is now referred to as a part of ancient history. He described the Hindu civilization as backward and irrational, among many other unjustifiable derogatory remarks. This led many politicians to over-glorify the Hindu civilization, portraying it as a glorious civilization that had somehow declined over the years.

The decline was very conveniently blamed on the arrival of the Muslim rulers. The claim that Muslim rulers were responsible for the decline of the glorious ancient culture became a justification for the two-nation theory. This idea has been continuously spread by communal politicians and historians and has become a common narrative among propagandists wanting to create division and hate.

The political ideology of communalism emerged as a result of the divide-and-rule policy of the Britishers. The origins of this ideology can be traced back to the publication of James Mill’s History of British India, a communal interpretation of Indian history that became a justification for the two-nation theory. The after-effects of this communal interpretation can be seen even today, with the political ideology of communalism being used rampantly in contemporary politics.

Saturday, December 30, 2023


Critical thinking has been described in many ways. The philosophers Michael Scriven and Richard Paul, in 1987, emphasized the evaluation, synthesis, and analysis of information in describing critical thinking. The philosopher Peter Facione, in 2005, suggested that critical thinking is purposeful and self-regulatory judgment. It involves interpretation, analysis, evaluation, explanation, and self-regulation. More recently, in 2018, the Foundation for Critical Thinking described critical thinking as self-directed, self-disciplined, self-monitored, and self-corrective thinking, which involves effective communication and problem-solving abilities, as well as overcoming egocentrism and sociocentrism.

Over the years, scholars have argued for blending critical thinking into formal education. The significance of this idea was reflected when, in 1983, the California State University system introduced the requirement that undergraduate students complete a course on critical thinking. This came to be known as critical thinking pedagogy, which involves “an understanding of the relationship of language to logic, leading to the ability to analyze, criticize and advocate ideas, reason inductively and deductively, and reach factual or judgmental conclusions based on sound inferences drawn from unambiguous statements of knowledge or belief.”

The introduction of critical thinking pedagogy is referred to as the Big Bang moment of teaching critical thinking in higher education. It is often credited with sparking what is called the critical thinking movement. However, both before and after this historical moment, philosophers, educationists, psychologists, and scientists have proposed significant perspectives on critical thinking and education.

One of the earliest figures to propose critical thinking in education was the psychologist, philosopher, and educationist John Dewey. Dewey developed what came to be known as the progressive education model, which involves learning through discovery-based activities. He opposed rote memorization and suggested the idea of learning by doing.

John Dewey

Dewey believed that education should facilitate creative intelligence and prepare students to live effectively in society. In his book How We Think, published in 1910, Dewey argued that thinking should be a means to clear doubt. He suggested that students should be given effective learning activities meant for problem-solving.

For this, Dewey suggested a science-inspired method of reasoning, in which a tentative solution is proposed and held until more evidence is gathered to either confirm or disprove it. Dewey termed this type of reasoning reflective thinking, which he defined as “active, persistent, and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends.” Scholars have suggested that reflective thinking is the same as critical thinking, making Dewey one of the first individuals to incorporate critical thinking within education.

Like Dewey, the educationist and philosopher Paulo Freire, often considered the father of critical pedagogy, opposed the traditional methods of education. In his book Pedagogy of the Oppressed, published in 1970, Freire criticized the educational system, calling it banking education, in which students are passive beings and teachers simply deposit information in their minds. Freire suggested that banking education does not encourage dialogue with students and completely inhibits critical thinking.

Paulo Freire

In contrast, Freire proposed the problem-posing method, in which the teacher and student become co-investigators of knowledge. The problem-posing method encourages a dialogue and invites the oppressed to explore their reality as a problem that can be transformed, and not something that is fixed.

According to Freire, the aim of education is to make students develop a critical consciousness that will help them understand the roots of social, political, and economic oppression - a form of critical thinking that will create awareness of society. Freire’s ideas came to be known as critical pedagogy, which encourages learners to confront their knowledge, ideas, and biases to question the power dynamics in society and develop new ways of thinking. 

In similar ways to Dewey and Freire, the philosopher Robert Ennis believed that critical thinking should be developed in aspects of everyday life that formal education does not cover. Ennis, in this way, viewed critical thinking as a lifelong perspective. For this, Ennis suggested the idea of transfer - being able to apply learnings in everyday aspects of life, beyond academics.

Robert Ennis

In this regard, Ennis proposed a framework of four approaches - general, infusion, immersion, and mixed. In the general approach, critical thinking is taught independently of the regular course. In the infusion approach, critical thinking is blended within the regular course, and critical thinking skills are made explicit. In the immersion approach, critical thinking is blended into the regular course, but critical thinking skills are not made explicit. Finally, the mixed approach combines the general approach with either the infusion or the immersion approach. According to research, the mixed approach has been found to be the most effective for learning critical thinking as well as for transfer.

Apart from Dewey, Freire, and Ennis, the teaching style of the theoretical physicist Richard Feynman is considered to promote critical thinking. Along with his contributions to physics, Feynman has been widely known for his teaching style, which came to be called the Feynman technique after his death in 1988. The Feynman technique involves understanding substantive and complex concepts, simplifying them, and explaining them in their simplest forms.

Richard Feynman

The Feynman technique involves four steps - (1) identifying a concept to learn, (2) understanding and explaining the concept in simple terms, (3) reviewing the concept and identifying gaps, and (4) simplifying the concept further and creating analogies for a better understanding. The Feynman technique enables active learning and not just passive re-reading and memorization. It also enables a deep understanding of an idea or concept. Both of these aspects promote critical thinking.

Further, the Feynman technique is associated with autodidactism and heutagogy. Autodidactism is self-directed learning, which involves taking initiative in acquiring knowledge. Heutagogy is self-determined learning, which involves reflecting upon learning experiences. Both autodidactism and heutagogy are features of critical thinking. This indicates that the Feynman technique is not just an effective learning method, but it is also useful in developing critical thinking.

Since the early 20th century, philosophers, educationists, psychologists, and scientists have suggested ways in which critical thinking can be blended into formal education. According to them, traditional methods of learning should be shunned; instead, education should incorporate critical thinking to make learners engage in active learning and deep understanding, develop problem-solving skills, confront biases, become aware of societal issues, and develop new ways of thinking.

To know more about critical thinking, refer to my podcast - Psychology, Critical Thinking, and Society

Sunday, August 27, 2023


The American Psychological Association (APA) characterizes Industrial-Organizational (I-O) Psychology as the scientific study of human behavior in organizations and the workplace. It focuses on deriving principles of individual, group, and organizational behavior and applying this knowledge to the solution of problems at work.

The field of industrial-organizational psychology emerged during the applied psychology movement, which was an aspect of the school of functionalism. Functionalism opposed the idea of elementism propagated by the school of structuralism. The functionalists argued that breaking down consciousness into smaller elements leads to losing the essence of experience. 

Instead of understanding the content of consciousness, the functionalists emphasized the functions of consciousness. They were interested in answering the question of how consciousness helps in adapting to the environment. In this way, functionalism emphasized the utilitarian aspect of consciousness.

The philosopher and psychologist William James is regarded as the major precursor to the beginning of functionalism. James propagated the philosophy of pragmatism. Pragmatism is the doctrine that the validity of an idea is measured by its practical consequences. According to pragmatism, the worth of an idea is determined by its practical applications.

William James

The idea of pragmatism became the cornerstone of functionalism. All the later functionalists stressed the philosophy of pragmatism, which made functionalism concerned with the utilities of consciousness and behavior.

Due to the emphasis on the utility of consciousness and behavior, the school of functionalism became interested in understanding the applications of psychology to everyday issues. This gave rise to the applied psychology movement.

The applied psychologists took psychology from an academic and laboratory setup to real-life settings and practical issues. They took psychology towards understanding behaviors in schools, factories, advertising agencies, courthouses, and mental health centers. It is this shift in psychology that led to the emergence of industrial-organizational psychology.

The pioneering works of two psychologists, namely, Walter Dill Scott and Hugo Munsterberg, are considered to have led to the beginning of industrial-organizational psychology. 

Walter Dill Scott completed his Ph.D. under Wilhelm Wundt. His interests shifted towards applied psychology after he returned to America. Scott was intrigued by the idea of using psychology to make advertisements more effective. He became the first person to apply psychology to advertising.

Walter Dill Scott

In 1903, Scott published his book - The Theory and Practice of Advertising. This is considered to be the first book on this topic. Scott argued that advertisements should involve factors like emotions, sympathy, and sentimentality. According to him, these factors heighten suggestibility, which makes advertisements more effective. These factors began to be used widely in advertisements, and are used even today.

Scott, later, shifted his attention towards the application of psychology in personnel selection and management and became the first person to do so. In order to select the best employees, especially salespersons, business executives, and military personnel, Scott developed rating scales and group intelligence tests. He used these scales and tests to measure the characteristics of people who were already successful in these occupations. Scott believed that intelligence should be defined in practical terms like judgment, quickness, and accuracy. According to him, these are characteristics that are needed to perform well on the job.

After the First World War, Scott formed his own consulting company - named, imaginatively, the Scott Company - to provide consulting services related to personnel selection and work efficiency to corporations. In doing so, Scott became the founder of the first psychological consulting company.

Around the same time as Walter Dill Scott, the psychologist Hugo Munsterberg made pioneering efforts in applied psychology, including industrial-organizational psychology. Like Scott, Munsterberg completed his Ph.D. under Wilhelm Wundt. Later, William James recruited Munsterberg to be the director of the Harvard Psychology Laboratory.

After spending some time there, his interests began to shift towards the practical applications of psychological principles. He strongly propagated the idea of psychology being applied to understanding real-life issues. In this regard, Munsterberg worked extensively in the areas of mental illness, legal matters, and the workplace.

Hugo Munsterberg

Munsterberg began his work on industrial psychology with an article published in 1909, called Psychology and the Market. The article dealt with the application of psychology to vocational guidance, advertising, personnel management, employee motivation, and job performance.

In 1912, Munsterberg published the book Vocation and Learning, and in 1913, he published Psychology and Industrial Efficiency. These two books are often considered the beginning of what later became known as industrial psychology. In these books, Munsterberg wrote about personnel selection, work efficiency, marketing, and advertising.

Munsterberg suggested that to improve personnel selection, the skills required for performing a task should first be defined, and then the person’s ability to perform that task determined. Munsterberg also suggested that the mental and emotional abilities of workers should be matched with their positions, to increase job efficiency, productivity, and satisfaction. For this, he suggested the use of proper psychological techniques, including mental tests and job simulations. Apart from this, Munsterberg also emphasized the role of individual differences in personnel selection and job assignments.

The contributions of Walter Dill Scott and Hugo Munsterberg emphasized the applications of psychology to organizations and the workplace. Their work drew the attention of psychologists to how psychology can be used to study behavior in organizations, broadening the scope of the discipline.

Along with personnel selection, work efficiency, and advertising, which were the major contributions of Scott and Munsterberg, psychologists began to study more complex aspects of organizations such as the social-psychological work climate, employee attitudes, communication patterns, organizational structure, power and politics in organizations, etc.

Realizing the significance of psychology in organizations, the APA, in 1945, founded its Division 14, called the Industrial and Business Psychology Division. The division was renamed the Industrial Psychology Division in 1962, the Division of Industrial and Organizational Psychology in 1973, and, in 1982, the Society for Industrial and Organizational Psychology, which promotes the science, practice, and teaching of industrial-organizational psychology.

Tuesday, May 23, 2023


The concept of self-actualization, over the years, has gained a lot of popularity. It has drawn keen interest from experts as well as laypersons from varying backgrounds. The popularity of the concept is reflected in its usage in a wide range of areas such as teaching, counseling, healthcare, leadership, and management.

The pioneer of humanistic psychology, Abraham Maslow, in the mid-20th century (1950s to 1960s), popularized the concept of self-actualization in the context of his theory of personality and motivation. According to Maslow, self-actualization is an innate tendency. It is the tendency of individuals to realize and fulfill their true potential and abilities. It is the desire to become more and more what one idiosyncratically is and to become what one is fully capable of becoming.

Abraham Maslow

This suggests that the concept of self-actualization is about individuals being unique in their own way. The state of self-actualization, according to Maslow, is not about being a better person; it is about becoming the person that one is meant to be. Thus, the idea of self-actualization differs from one individual to another.

Self-actualization is about individuals reaching their full existential capacity. It is not about achievement or becoming an extraordinary individual. It is actually about personal growth and fulfilling one’s potential to the highest level possible, whatever the endeavor may be. It is an intrinsic unfolding process, which does not rely on any rewards system.

Further, self-actualization is not about striving for specific goals or reducing a deficiency. It is about striving for stimulating and challenging tasks and events, and by doing so, enriching one’s life. Instead of accepting life as it is, self-actualization involves constantly seeking new challenges, and avoiding secure and routine behaviors and attitudes.

Maslow’s description of the concept of self-actualization marked a move away from the causal tradition in psychology. Maslow opposed the existing deterministic perspectives of psychoanalysis and behaviorism. He suggested that behavior is not driven by a cause - something that already exists - but by a future state that the individual is striving for. This is called teleology, or the teleological perspective.

For Maslow, self-actualization is a future state that individuals strive for and it is not a cause that has already existed. It is not something that is pushing the individual, but it is actually pulling the individual. This makes self-actualization describe behavior from the teleological perspective, which is a shift from the traditional physical sciences approach (determinism) that psychology had been following.

Like Maslow, the humanistic psychologist Carl Rogers also emphasized self-actualization. Around the same time as Maslow, Rogers suggested that self-actualization is innate and is the greatest motivating factor of individuals. Rogers believed that people have an innate tendency to enhance themselves. According to Rogers, self-actualization, or the actualizing tendency (as he referred to it), is an active, controlling drive towards the fulfillment of potential, which helps in maintaining and enhancing the self.

Carl Rogers

Rogers, further, suggested that human beings have a tendency to always seek new experiences and avoid environments that lack stimulation. From his clinical experiences, Rogers suggested that people have a directional tendency to grow and have new and varied experiences.

Additionally, Rogers suggested that human beings are basically good, and that they develop this innate goodness when society is supportive. According to Rogers, people fail to develop their innate goodness due to faulty socialization patterns. In this regard, Rogers considered unconditional positive regard to be very important: if parents or caregivers show unconditional positive regard, the child grows into a healthy individual. Therefore, according to Rogers, society and socialization patterns play a role in self-actualization.

Rogers suggested that self-actualization is the highest level of psychological health. He referred to self-actualized people as psychologically healthy individuals or fully functioning persons. Rogers described the fully functioning person as actualizing and not actualized because he believed that growth never ends - it is always a work in progress. Rogers believed that his concept of self-actualization is similar to that of Maslow.

In describing the concept of self-actualization, Maslow borrowed the psychoanalyst Carl Jung’s idea of the self archetype and the transcendent function. Archetypes are archaic, generalized, emotionally toned collections of associated images derived from the collective unconscious (aspects of the unconscious that have their roots in the ancient past of the entire species). In the early to mid-20th century, Jung suggested that the self archetype is innate and has the potential of being realized in everyone. It involves a process called the way of individuation and leads towards self-realization.


Carl Jung

According to Jung, the way of individuation is a process by which individuals become the definite, unique being that they are. It is about fulfilling the peculiarity of the individual. Jung, therefore, suggests that the self is the final goal of striving. This self-realization does not come easily. The person has to go through a wide range of experiences and make many efforts to resolve conflicts within the psyche.

The self, according to Jung, becomes a unifying force by the transcendent function. The transcendent function works towards the ideal goal of perfect wholeness. It reveals the essential person by producing and unfolding the original, potential wholeness.

Apart from Jung’s self archetype, the concept of self-actualization used by Maslow has also been found to be similar to the psychoanalyst Alfred Adler’s idea of striving for superiority. According to Adler, striving for superiority is the innate, ultimate drive of human beings to realize their full potential. In the early 20th century, Adler suggested that striving for superiority is a fundamental human need - people strive to feel superior to overcome their feelings of inferiority and inferiority complex. This striving for superiority is not in the sense of social status and dominance. It is rather an urge for completion and perfection.

Alfred Adler

Adler suggested that striving for superiority is the final goal of all humankind. It unifies personality and makes all behaviors comprehensible. Adler also suggested that striving for superiority is a way to compensate for feelings of inferiority and weakness. People are always pushed by the need to overcome inferiority and pulled by the desire for completion and wholeness.

Even though the concept of self-actualization was popularized by Maslow, the term was coined by the neuropsychiatrist Kurt Goldstein. Goldstein took a holistic approach. He was one of the major proponents of the holistic movement at the beginning of the 20th century. He criticized the reductionist approach and atomization in the neurology of that time, opposing the view of experiences in terms of smaller components. He rejected the localization theory (the idea that each brain area has specific functions), suggesting that the brain functions as a whole, and that if damage occurs in one brain area, other areas take over the functioning of the damaged area.

Kurt Goldstein

The holistic approach of Goldstein led him to introduce the concept of self-actualization. Self-actualization, according to Goldstein, is a striving for completeness. It is the organic principle by which individuals become more fully developed and complete. Goldstein, in the 1930s, suggested that self-actualization is the main motive of human nature. It is the creative trend of human nature. Human beings are governed by the strong tendency to actualize their potential. 

According to Goldstein, each individual has certain potentialities, which are expressed through interests, preferences, and aptitudes. The fulfillment of these potentialities is finding a way towards completeness and represents self-actualization.

Goldstein, further, suggested that even though self-actualization is a universal phenomenon, the process differs from person to person. This is because people differ with respect to their innate potentialities. These differences direct them in their own ways towards growth, development, and self-actualization. It also differs because of the different environments and cultures that they may belong to.

The concept of self-actualization is a part of Goldstein’s organismic theory. The organismic theory views the individual in totality and emphasizes the integration of personality. It is about viewing individuals in terms of a holistic and unified experience and viewing any event in the context of the organism.

Goldstein applied the Gestalt approach to his organismic theory. Gestalt psychology suggests that the mind has a tendency to organize experience into configurations and wholes. It emphasizes that the whole of anything is greater than the sum of its parts, indicating that consciousness or experience as a whole cannot be reduced to smaller components. 

The organismic theory formed the basis of Gestalt therapy. Gestalt therapy is a form of therapy that focuses on the individual’s present, in-the-moment experiences rather than examining the past. It also involves taking responsibility and understanding the context of the person’s life.

Self-actualization is a widely known concept in the discipline of psychology. The concept was popularized by the humanistic psychologist Abraham Maslow, but it was the neuropsychiatrist Kurt Goldstein who originated the idea. Goldstein applied the Gestalt theory and used the findings of his studies of patients with brain damage in introducing the term. 

Additionally, Maslow’s description of self-actualization has been found similar to Carl Rogers’s idea of actualizing tendency, Carl Jung’s concept of the self archetype, and Alfred Adler’s idea of striving for superiority. The concept of self-actualization, therefore, provides a link between the fields of neuroscience, Gestalt psychology, humanistic psychology, and psychoanalysis.

This article can also be found on the blog Life and Psychology