{"title":"情报","authors":"Holly Yanacek","doi":"10.1111/criq.12756","DOIUrl":null,"url":null,"abstract":"<p><i>Intellectual</i>, not <i>intelligence</i>, was the headword in Raymond Williams's <i>Keywords</i> (1976, 1983).<sup>1</sup> In the early twenty-first century, despite discussions of the rise of anti-intellectualism in the United States and in other places around the world, <i>intelligence</i> is the more complex and contested term. The meaning of the word <i>intelligence</i> has changed over the past six centuries in response to social, political, scientific and technological shifts. Since its introduction into English, <i>intelligence</i> has undergone a process of semantic broadening, becoming at once more generalised while also gaining specialised meanings in a variety of contexts from psychology and education to international security and computer science.</p><p>The origins of the noun <i>intelligence</i> can be traced to French <i>intelligence</i> and Latin <i>intellegentia</i>. Its earliest sense in English, ‘Faculty of understanding; intellect’, dates to the late fourteenth century and remains active today. The synonym <i>intellect</i>, a borrowing from Latin <i>intellectus</i>, was also first attested in the late fourteenth century. According to the <i>OED</i>, <i>intellect</i> is defined as ‘That faculty, or sum of faculties, of the mind or soul by which a person knows and reasons; power of thought; understanding’, and it is often distinguished from sensation, imagination and will.</p><p>In addition to denoting the mental faculty of an individual, <i>intelligence</i> also encompassed ‘The action or fact of mentally apprehending something; understanding, knowledge, comprehension (<i>of</i> something)’ from the mid-fifteenth century onwards. 
This expanded meaning enabled the word to describe the acquisition of knowledge across various fields of study, as well as the capacities of non-human entities.</p><p>Already around the time of the word's borrowing into English in the late fourteenth century, <i>intelligence</i> referred to an intelligent or rational spiritual being outside the human realm, such as an angel, a spirit or extraterrestrial life. This sense of <i>intelligence</i> recalls the 1816 coinage <i>intelligent design</i>, which describes deliberate design in the natural world attributed to an intelligent entity often identified as God. More recently, confusion has arisen over the distinction between the theory of Intelligent Design (ID), which its proponents claim is based on empirical evidence, and creationism, which is based on religious texts and teachings, particularly in the context of discussions about whether evolution and intelligent design should be taught in state-funded schools.</p><p><i>Intelligence</i> took on a measurable aspect from the mid-fifteenth century, denoting ‘Understanding as a quality admitting of degree; <i>spec</i>. quickness or superiority of understanding, sagacity’. This idea that <i>intelligence</i> can be quantified and compared in terms of scope, depth or speed of understanding is most evident in the compound noun <i>intelligence quotient</i>, which was modelled after the German term <i>Intelligenzquotient</i> proposed by German psychologist William Stern and first attested in English in 1913. The <i>OED</i> defines <i>intelligence quotient (IQ)</i> as ‘A number intended to represent a person's intelligence, measured using problem-solving tests and compared to the statistical norm or average for their age (which is taken as 100)’, and the Stanford-Binet test and Wechsler Intelligence Scale are two of the most widely used intelligence tests today.
Although intelligence tests have long been controversial, they have come under scrutiny recently due to greater public awareness of their racist, classist and ableist origins, particularly their history of use in eugenics. Metaphorical uses of the term <i>intelligence quotient</i> are also possible, as in the following example cited in the <i>OED</i>: ‘They dumb-down the city's intelligence quotient’ (<i>Chicago Sun-Times</i>, 15 June 2008). However, over the past decade, researchers have developed analytical frameworks to measure a city's intelligence level or <i>smartness</i>. The IMD Smart City Index produced by the International Institute for Management Development World Competitiveness Center, for example, uses survey data to compare cities around the world according to different social, cultural, environmental, economic and technological dimensions.</p><p><i>Intelligence</i> acquired specialised meanings in political and military contexts in the late fifteenth century, becoming synonymous with words like <i>information</i>, <i>knowledge</i> and <i>news</i>, particularly if that information had military value. This sense of <i>intelligence</i> expanded again in the early seventeenth century to describe not only information but also the covert processes involved in collecting such information, making <i>intelligence</i> synonymous with <i>espionage</i> and <i>surveillance</i>. Some of the most frequently used collocations with <i>intelligence</i> in English relate to this sense of the word: <i>intelligence agency</i> (1878), <i>intelligence officer</i> (1779), <i>national intelligence</i> and <i>military intelligence</i>. Technological advancements after the First World War supported the formalisation and expansion of <i>intelligence agencies</i> during the inter-war period (1918–39).
<i>Intelligence</i> and <i>counter-intelligence</i> (1940), which the <i>OED</i> defines as ‘The activity of preventing the enemy from obtaining secret information’, became even more expansive industries around the world during the Cold War (1947–91), and the United States established the Central Intelligence Agency (CIA) in 1947. <i>Intel</i>, the clipped form of <i>intelligence</i>, has been used colloquially since 1961 to describe information of military or political value. While the Google Books Ngram Viewer shows a steady increase in the relative frequency of <i>intel</i> beginning around 2001, corpus data include references to the multinational technology company <i>Intel</i>, which seems to have profited from the positive associations of <i>intelligence</i> in its branding since the corporation's founding in 1968.</p><p>The meaning of <i>intelligence</i> has also continued to evolve alongside advancements in computer technologies. In 1950, English computer scientist and mathematician Alan Turing developed an imitation game to determine whether a computer can exhibit human intelligence: a computer would be considered <i>intelligent</i> if it could carry on a conversation and ‘pass’ as human without being identified as a machine. This method, now known as the Turing Test, has been important in the development of the field of artificial intelligence and machine learning. First attested in 1955, <i>artificial intelligence</i>, abbreviated as <i>AI</i>, refers to ‘The capacity of computers or other machines to exhibit or simulate intelligent behavior’. However, the term <i>artificial intelligence</i> has been contested, and some experts on the social implications of AI have claimed that the term is a misnomer. Microsoft researcher Kate Crawford, for example, has argued that artificial intelligence ‘is neither <i>artificial</i> nor <i>intelligent</i>. 
Rather, artificial intelligence is both embodied and material, made from natural resources, fuel, human labor, infrastructures, logistics, histories, and classifications. AI systems are not autonomous, rational, or able to discern anything without extensive, computationally intensive training with large datasets or predefined rules and rewards.’<sup>2</sup></p><p><i>Smartness</i>, a synonym for <i>intelligence</i> first attested in 1340, has taken on new meaning in the age of AI. The adjective ‘smart’ in <i>smart technologies</i> originally comes from the acronym SMART (Self-Monitoring, Analysis, and Reporting Technology). Smart technologies are devices designed to connect with internet networks and other devices, and they ‘allow inanimate objects such as tools and machines to communicate with humans, and vice versa: they integrate digital and non-digital functionalities’.<sup>3</sup> The <i>smartphone</i> (1980), which is now typically understood as a mobile phone equipped with computer applications and internet access, is just one example, but smart technologies include smartwatches, smart wearables, smart doorbells, smart gardens, smart thermostats, smart homes and many more. Headlines of recent online news articles and videos, such as ‘Is smart technology making us dumb?’ (<i>Deutsche Welle</i>, 30 October 2021) and ‘Are smartphones making us stupid?’ (<i>Psychology Today</i>, 25 June 2017), communicate a general sense of anxiety about the potential negative impact of new technologies on human intelligence. According to recent cognitive science research, however, there is no evidence that smart technologies diminish human cognitive capacities, and some researchers have even emphasised that smart technologies enhance our abilities to perform more creative and intellectually demanding tasks.</p><p>Speculation about whether non-human animals and plants have a soul or are <i>intelligent</i> has existed for centuries. 
In the early twenty-first century there is more widespread acceptance of the idea that other lifeforms possess intelligence, due to both scientific discoveries and the semantic broadening of the word <i>intelligence</i>. The brain, specifically the cerebrum, is the centre of intelligence in humans, and most animals, except for some ocean floor creatures, have a brain or other nervous tissue. Plants lack a brain, but already in the late nineteenth century Charles Darwin recognised that plants are far more complex than they appear. Arguing that intelligence unites rather than divides humans, other animals and plants, Italian researcher and founder of plant neurobiology Stefano Mancuso has written extensively about <i>plant intelligence</i>, defining <i>intelligence</i> as ‘the ability to solve problems’.<sup>4</sup></p><p>Contemporary definitions of <i>intelligence</i>, whether rooted in adaptability or problem-solving abilities, demonstrate an expanded, more inclusive understanding of the term beyond its original, limited sense of the (human) faculty of mind or <i>intellect</i>. Answers to the questions of whether plants or computers are intelligent hinge on our understanding of <i>intelligence</i>, and ongoing heated debates about the validity and ethics of IQ tests reflect not only disagreements over how to define <i>intelligence</i>, but also the social importance attached to the concept. <i>Intelligence</i> remains a complex and contested term, as the concept it represents holds great value in many societies, and discussions about who or what qualifies as intelligent, how to define it, and how to measure it are shaped by power dynamics.
Awareness of these power dynamics is important because the act of attributing or denying intelligence to humans and other entities often implies attributing or denying respect.</p><p>See <span>artificial, communication, education, excellence, experience, information, life, network, security, soul, technology.</span></p>","PeriodicalId":44341,"journal":{"name":"CRITICAL QUARTERLY","volume":"66 3","pages":"101-105"},"PeriodicalIF":0.2000,"publicationDate":"2023-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/criq.12756","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"CRITICAL QUARTERLY","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/criq.12756","RegionNum":4,"RegionCategory":"Literature","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"LITERARY REVIEWS","Score":null,"Total":0}
Journal introduction:
Critical Quarterly is internationally renowned for its unique blend of literary criticism, cultural studies, poetry and fiction. The journal addresses the whole range of cultural forms, so that discussions of, for example, cinema and television can appear alongside analyses of the accepted literary canon. It is a necessary condition of debate in these areas that it should involve as many and as varied voices as possible, and Critical Quarterly welcomes submissions from new researchers and writers as well as more established contributors.