How has technology changed - and changed us - in the past 20 years?

Remember this? An internet surfer views the Google home page at a cafe in London, August 13, 2004. Image: REUTERS/Stephen Hird

Madeleine Hillyer


  • Since the dotcom bubble burst back in 2000, technology has radically transformed our societies and our daily lives.
  • From smartphones to social media and healthcare, here's a brief history of the 21st century's technological revolution.

Just over 20 years ago, the dotcom bubble burst, causing the stocks of many tech firms to tumble. Some companies, like Amazon, quickly recovered their value – but many others were left in ruins. In the two decades since this crash, technology has advanced in many ways.

Many more people are online today than at the start of the millennium. In 2000, just half of Americans had broadband access at home. Today, that number sits at more than 90%.

More than half the world's population has internet access today

This broadband expansion was certainly not just an American phenomenon. Similar growth can be seen on a global scale; while less than 7% of the world was online in 2000, today over half the global population has access to the internet.

Similar trends can be seen in cellphone use. At the start of the 2000s, there were 740 million cell phone subscriptions worldwide. Two decades later, that number has surpassed 8 billion, meaning there are now more cellphones in the world than people.


At the same time, technology was also becoming more personal and portable. Apple sold its first iPod in 2001, and six years later it introduced the iPhone, which ushered in a new era of personal technology. These changes led to a world in which technology touches nearly everything we do.

Technology has changed major sectors over the past 20 years, including media, climate action and healthcare. The World Economic Forum's Technology Pioneers community, which just celebrated its 20th anniversary, gives us insight into how emerging tech leaders have influenced and responded to these changes.

Media and media consumption

The past 20 years have greatly shaped how and where we consume media. In the early 2000s, many tech firms were still focused on expanding communication for work, building out the bandwidth for the video streaming and other media consumption that is common today.

Others followed the path of expanding media options beyond traditional outlets. Early Tech Pioneers such as PlanetOut did this by providing an outlet and alternative media source for LGBTQIA communities as more people got online.

These first new media options, new communities and alternative media sources were followed by the massive growth of social media. In 2004, fewer than 1 million people were on Myspace; Facebook had not even launched. By 2018, Facebook had more than 2.26 billion users, with other sites also growing to hundreds of millions of users.

The precipitous rise of social media over the past 15 years

While these new online communities and communication channels have offered great spaces for alternative voices, their increased use has also brought issues of increased disinformation and polarization.

Today, many tech start-ups are focused on preserving these online media spaces while also mitigating the disinformation which can come with them. Recently, some Tech Pioneers have also approached this issue, including TruePic – which focuses on photo identification – and Two Hat, which is developing AI-powered content moderation for social media.

Climate change and green tech

Many scientists today are looking to technology to lead us towards a carbon-neutral world. Though renewed attention is being given to climate change today, these efforts to find a solution through technology are not new. In 2001, green tech offered a new investment opportunity for tech investors after the crash, leading to a boom of investing in renewable energy start-ups including Bloom Energy, a Technology Pioneer in 2010.

In the past two decades, tech start-ups have only expanded their climate focus. Many today are focused on initiatives far beyond clean energy to slow the impact of climate change.

Different start-ups, including Carbon Engineering and Climeworks from this year’s Technology Pioneers, have started to roll out carbon capture technology. These technologies remove CO2 from the air directly, enabling scientists to alleviate some of the damage from fossil fuels which have already been burned.

Another expanding area for young tech firms today is food systems innovation. Many firms, like Aleph Farms and Air Protein, are creating innovative meat and dairy alternatives that are much greener than their traditional counterparts.

Biotech and healthcare

The early 2000s also saw the culmination of a biotech boom that had started in the mid-1990s. Many firms focused on advancing biotechnologies through enhanced tech research.

An early Technology Pioneer, Actelion Pharmaceuticals, was one of these companies. Actelion researched the single layer of cells separating every blood vessel from the bloodstream. Like many other biotech firms at the time, its focus was on precise disease and treatment research.

While many tech firms today still focus on disease and treatment research, many others have been focusing on healthcare delivery. Telehealth has been on the rise in recent years, with many young tech firms expanding virtual healthcare options. New technologies such as virtual visits and chatbots are being used to deliver healthcare to individuals, especially during Covid-19.

Many companies are also focusing their healthcare tech on patients, rather than doctors. For example, Ada, a symptom checker app, used to be designed for doctors' use but has now shifted its language and interface to prioritize giving patients information on their symptoms. Other companies, like 7 Cups, are focused on offering mental healthcare support directly to users through their app instead of through existing offices.

The past two decades have seen healthcare tech get much more personal and use tech for care delivery, not just advancing medical research.

The World Economic Forum was the first to draw the world’s attention to the Fourth Industrial Revolution, the current period of unprecedented change driven by rapid technological advances. Policies, norms and regulations have not been able to keep up with the pace of innovation, creating a growing need to fill this gap.

The Forum established the Centre for the Fourth Industrial Revolution Network in 2017 to ensure that new and emerging technologies will help—not harm—humanity in the future. Headquartered in San Francisco, the network launched centres in China, India and Japan in 2018 and is rapidly establishing locally-run Affiliate Centres in many countries around the world.

The global network is working closely with partners from government, business, academia and civil society to co-design and pilot agile frameworks for governing new and emerging technologies, including artificial intelligence (AI) , autonomous vehicles , blockchain , data policy , digital trade , drones , internet of things (IoT) , precision medicine and environmental innovations .


In the early 2000s, many companies were at the start of their recovery from the bursting dotcom bubble. Since then, we’ve seen a large expansion in the way tech innovators approach areas such as new media, climate change, healthcare delivery and more.

At the same time, we have also seen tech companies rise to the occasion to combat issues that arose from this first wave of expansion, such as moderating internet content and expanding climate change solutions.

The Technology Pioneers' 2020 cohort marks the 20th anniversary of this community - and looking at the latest awardees can give us a snapshot of where the next two decades of tech may be heading.


Technology over the long run: zoom out to see how dramatically the world can change within a lifetime

It is easy to underestimate how much the world can change within a lifetime. Considering how dramatically the world has changed can help us see how different the world could be in a few years or decades.

Technology can change the world in ways that are unimaginable until they happen. Switching on an electric light would have been unimaginable for our medieval ancestors. In their childhood, our grandparents would have struggled to imagine a world connected by smartphones and the Internet.

Similarly, it is hard for us to imagine the arrival of all those technologies that will fundamentally change the world we are used to.

We can remind ourselves that our own future might look very different from the world today by looking back at how rapidly technology has changed our world in the past. That’s what this article is about.

One insight I take away from this long-term perspective is how unusual our time is. Technological change was extremely slow in the past – the technologies that our ancestors got used to in their childhood were still central to their lives in their old age. In stark contrast to those days, we live in a time of extraordinarily fast technological change. For recent generations, it was common for technologies that were unimaginable in their youth to become common later in life.

The long-run perspective on technological change

The big visualization offers a long-term perspective on the history of technology. 1

The timeline begins at the center of the spiral. The first use of stone tools, 3.4 million years ago, marks the beginning of this history of technology. 2 Each turn of the spiral represents 200,000 years of history. It took 2.4 million years – 12 turns of the spiral – for our ancestors to control fire and use it for cooking. 3

To be able to visualize the inventions in the more recent past – the last 12,000 years – I had to unroll the spiral. I needed more space to be able to show when agriculture, writing, and the wheel were invented. During this period, technological change was faster, but it was still relatively slow: several thousand years passed between each of these three inventions.

From 1800 onwards, I stretched out the timeline even further to show the many major inventions that rapidly followed one after the other.

The long-term perspective that this chart provides makes it clear just how unusually fast technological change is in our time.

You can use this visualization to see how technology developed in particular domains. Follow, for example, the history of communication: from writing to paper, to the printing press, to the telegraph, the telephone, the radio, all the way to the Internet and smartphones.

Or follow the rapid development of human flight. In 1903, the Wright brothers took the first flight in human history (they were in the air for less than a minute), and just 66 years later, we landed on the moon. Many people saw both within their lifetimes: the first plane and the moon landing.

This large visualization also highlights the wide range of technology’s impact on our lives. It includes extraordinarily beneficial innovations, such as the vaccine that allowed humanity to eradicate smallpox , and it includes terrible innovations, like the nuclear bombs that endanger the lives of all of us .

What will the next decades bring?

The red timeline reaches up to the present and then continues in green into the future. Many children born today, even without further increases in life expectancy, will live well into the 22nd century.

New vaccines, progress in clean, low-carbon energy, better cancer treatments – a range of future innovations could very much improve our living conditions and the environment around us. But, as I argue in a series of articles , there is one technology that could even more profoundly change our world: artificial intelligence (AI).

One reason why artificial intelligence is such an important innovation is that intelligence is the main driver of innovation itself. This fast-paced technological change could speed up even more if it’s driven not only by humanity’s intelligence but also by artificial intelligence. If this happens, the change currently stretched out over decades might happen within a very brief time span of just a year. Possibly even faster. 4

I think AI technology could have a fundamentally transformative impact on our world. In many ways, it is already changing our world, as I documented in this companion article . As this technology becomes more capable in the years and decades to come, it can give immense power to those who control it (and it poses the risk that it could escape our control entirely).

Such systems might seem hard to imagine today, but AI technology is advancing quickly. Many AI experts believe there is a real chance that human-level artificial intelligence will be developed within the next decades, as I documented in this article .


Technology will continue to change the world – we should all make sure that it changes it for the better

What is familiar to us today – photography, the radio, antibiotics, the Internet, or the International Space Station circling our planet – was unimaginable to our ancestors just a few generations ago. If your great-great-great grandparents could spend a week with you, they would be blown away by your everyday life.

What I take away from this history is that I will likely see technologies in my lifetime that appear unimaginable to me today.

In addition to this trend towards increasingly rapid innovation, there is a second long-run trend. Technology has become increasingly powerful. While our ancestors wielded stone tools, we are building globe-spanning AI systems and technologies that can edit our genes.

Because of the immense power that technology gives those who control it, there is little that is as important as the question of which technologies get developed during our lifetimes. Therefore, I think it is a mistake to leave the question about the future of technology to the technologists. Which technologies are controlled by whom is one of the most important political questions of our time because of the enormous power these technologies convey to those who control them.

We all should strive to gain the knowledge we need to contribute to an intelligent debate about the world we want to live in. To a large part, this means gaining knowledge and wisdom on the question of which technologies we want.

Acknowledgments: I would like to thank my colleagues Hannah Ritchie, Bastian Herre, Natasha Ahuja, Edouard Mathieu, Daniel Bachler, Charlie Giattino, and Pablo Rosado for their helpful comments on drafts of this essay and the visualization. Thanks also to Lizka Vaintrob and Ben Clifford for the conversation that initiated this visualization.

Appendix: About the choice of visualization in this article

The recent speed of technological change makes it difficult to picture the history of technology in one visualization. When you visualize this development on a linear timeline, then most of the timeline is almost empty, while all the action is crammed into the right corner:

Linear version of the spiral chart

In my large visualization here, I tried to avoid this problem and instead show the long history of technology in a way that lets you see when each technological breakthrough happened and how, within the last millennia, there was a continuous acceleration of technological change.

The recent speed of technological change makes it difficult to picture the history of technology in one visualization. In the appendix, I show how this would look if it were linear.

It is, of course, difficult to assess when exactly the first stone tools were used.

The research by McPherron et al. (2010) suggested that it was at least 3.39 million years ago. This is based on two fossilized bones found in Dikika in Ethiopia, which showed “stone-tool cut marks for flesh removal and percussion marks for marrow access”. These marks were interpreted as being caused by meat consumption and provide the first evidence that one of our ancestors, Australopithecus afarensis, used stone tools.

The research by Harmand et al. (2015) provided evidence for stone tool use in today’s Kenya 3.3 million years ago.

References:

McPherron et al. (2010) – Evidence for stone-tool-assisted consumption of animal tissues before 3.39 million years ago at Dikika, Ethiopia . Published in Nature.

Harmand et al. (2015) – 3.3-million-year-old stone tools from Lomekwi 3, West Turkana, Kenya . Published in Nature.

Evidence for controlled fire use approximately 1 million years ago is provided by Berna et al. (2012) Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape province, South Africa , published in PNAS.

The authors write: “The ability to control fire was a crucial turning point in human evolution, but the question of when hominins first developed this ability still remains. Here we show that micromorphological and Fourier transform infrared microspectroscopy (mFTIR) analyses of intact sediments at the site of Wonderwerk Cave, Northern Cape province, South Africa, provide unambiguous evidence—in the form of burned bone and ashed plant remains—that burning took place in the cave during the early Acheulean occupation, approximately 1.0 Ma. To the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context.”

This is what authors like Holden Karnofsky called ‘Process for Automating Scientific and Technological Advancement’ or PASTA. Some recent developments go in this direction: DeepMind’s AlphaFold helped to make progress on one of the large problems in biology, and they have also developed an AI system that finds new algorithms that are relevant to building a more powerful AI.


Advances in science often drive technological innovations, which may, in turn, contribute to new scientific discoveries.

Science and technology on fast forward

Science and technology feed off of one another, propelling both forward. Scientific knowledge allows us to build new technologies, which often allow us to make new observations about the world, which, in turn, allow us to build even more scientific knowledge, which then inspires another technology … and so on. As an example, we'll start with a single scientific idea and trace its applications and impact through several different fields of science and technology, from the discovery of electrons in the 1800s to modern forensics and DNA fingerprinting…

From cathodes to crystallography

We pick up our story in the late 1800s with a bit of technology that no one much understood at the time, but which was poised to change the face of science: the cathode ray tube (node A in the diagram below and pictured above). This was a sealed glass tube emptied of almost all air — but when an electric current was passed through the tube, it no longer seemed empty. Rays of eerie light shot across the tube. In 1897, physicists would discover that these cathode rays were actually streams of electrons (B). The discovery of the electron would, in turn, lead to the discovery of the atomic nucleus in 1910 (C). On the technological front, the cathode ray tube would slowly evolve into the television (which is constructed from a cathode ray tube with the electron beam deflected in ways that produce an image on a screen) and, eventually, into many sorts of image monitors (D and E). But that’s not all…

The discovery of X-rays also pointed William Henry Bragg and William Lawrence Bragg (a father-and-son team) in 1913 and 1914 to the idea that X-rays could be used to figure out the arrangements of atoms in a crystal (L). This works a bit like trying to figure out the size and shape of a building based on the shadow it casts: you can work backwards from the shape of the shadow to make a guess at the building's dimensions. When X-rays are passed through a crystal, some of the X-rays are bent or spread out (i.e., diffracted) by the atoms in the crystal. You can then extrapolate backwards from the locations of the deflected X-rays to figure out the relative locations of the crystal atoms. This technique is known as X-ray crystallography, and it has profoundly influenced the course of science by providing snapshots of molecular structures.
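To make the "working backwards" step concrete: the relationship the Braggs formulated, now known as Bragg's law, says that X-rays of wavelength λ reflect strongly off crystal planes spaced a distance d apart only at angles θ satisfying nλ = 2d·sin θ. Here is a minimal sketch of solving for that angle (my own illustration, with example values chosen only for demonstration):

```python
import math

def bragg_angle_deg(wavelength_nm: float, spacing_nm: float, order: int = 1) -> float:
    """Diffraction angle in degrees from Bragg's law: n * lambda = 2 * d * sin(theta)."""
    sin_theta = order * wavelength_nm / (2 * spacing_nm)
    if sin_theta > 1:
        raise ValueError("no diffraction: n * lambda exceeds 2 * d")
    return math.degrees(math.asin(sin_theta))

# Illustrative values: ~0.154 nm X-rays on crystal planes ~0.2 nm apart.
print(round(bragg_angle_deg(0.154, 0.2), 1))  # ~22.6 degrees
```

Measuring many such angles in a diffraction pattern is what lets crystallographers infer the atomic spacings, and ultimately the full arrangement of atoms.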

Perhaps most notably, Rosalind Franklin used X-ray crystallography to help uncover the structure of the key molecule of life: DNA. In 1952, Franklin, like James Watson and Francis Crick, was working on the structure of DNA — but from a different angle. Franklin was painstakingly producing diffracted images of DNA, while Watson and Crick were trying out different structures using tinker-toy models of the component molecules. In fact, Franklin had already proposed a double helical form for the molecule when, in 1953, a colleague showed Franklin’s most telling image to Watson. That picture convinced Watson and Crick that the molecule was a double helix and pointed to the arrangement of atoms within that helix. Over the next few weeks, the famous pair would use their models to correctly work out the chemical details of DNA (M).

The impact of the discovery of DNA’s structure on scientific research, medicine, agriculture, conservation, and other social issues has been wide-ranging — so much so, that it is difficult to pick out which threads of influence to follow. To choose just one, understanding the structure of DNA (along with many other inputs) eventually allowed biologists to develop a quick and easy method for copying very small amounts of DNA, known as PCR — the polymerase chain reaction (N). This technique (developed in the 1980s), in turn, allowed the development of DNA fingerprinting technologies, which have become an important part of modern criminal investigations (O).
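The reason PCR can start from "very small amounts" of DNA is exponential doubling: each heating-and-cooling cycle roughly doubles the DNA present. A toy calculation (illustrative only; real reactions run below 100% efficiency):

```python
def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    """Idealized PCR yield: each cycle multiplies the copy count by (1 + efficiency)."""
    return initial_copies * (1 + efficiency) ** cycles

# A single DNA molecule after 30 perfect doubling cycles:
print(f"{pcr_copies(1, 30):.2e}")  # ~1.07e+09 copies
```

Thirty cycles turn one molecule into roughly a billion, which is why a trace sample from a crime scene can yield enough material for DNA fingerprinting.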

As shown by the flowchart above, scientific knowledge (like the discovery of X-rays) and technologies (like the invention of PCR) are deeply interwoven and feed off one another. In this case, tracing the influence of a single technology, the cathode ray tube, over the course of a century has taken us on a journey spanning ancient fossils, supernovas, the invention of television, the atomic nucleus, and DNA fingerprinting. And even this complex network is incomplete. Understanding DNA’s structure, for example, led to many more advances besides just the development of PCR. And similarly, the invention of the CT scanner relied on much more scientific knowledge than just an understanding of how X-ray machines work. Scientific knowledge and technology form a maze of connections in which every idea is connected to every other idea through a winding path.


Science in the 21st Century


From the invention of new life forms to the discovery of life beyond Earth, science is reshaping our understanding of the universe in the twenty-first century. In the Summer 2012 issue of  Dædalus , leading scientists describe emerging advances in nanoscience, neuroscience, genetics, paleontology, microbiology, mathematics, planetary science, and plant biology, among other areas. The authors examine how their disciplines might address some of this century’s most critical challenges, such as treating an explosion of degenerative neurological disease and providing food, fuel and a habitable environment for a global population predicted to reach nine to ten billion by 2050.


The Search for Habitable Worlds: Planetary Exploration in the 21st Century

The search for and detailed characterization of habitable environments on other worlds – places where liquid water, heat/energy sources, and biologically important organic molecules exist or could have once existed – is a major twenty-first-century goal for space exploration by NASA and other space agencies, motivated by intense public interest and highly ranked science objectives identified in recent National Academy decadal surveys.

E pluribus unum: From Complexity, Universality

In a brief survey, Terence Tao discusses some examples of the fascinating phenomenon of universality in complex systems, in which universal macroscopic laws of nature emerge from a variety of different microscopic dynamics.

Small Machines

Over the last fifty years, small has emerged as the new big thing. The reduction of information and electronics to nanometer dimensions has revolutionized science, technology, and society. Now scientists and engineers are creating physical machines that operate at the nanoscale. Using approaches ranging from lithographic patterning to the co-opting of biological machinery, new devices are being built that can navigate, sense, and alter the nanoscale world. In the coming decades, these machines will have enormous impact in fields ranging from biotechnology to quantum physics, blurring the boundary between technology and life.

Can We Progress from Solipsistic Science to Frugal Innovation?

Energy demand in the twenty-first century will be driven by the needs of three billion people in the emerging world and three billion new inhabitants to our planet. To provide them with a renewable and sustainable energy supply is perhaps the greatest challenge for science in the twenty-first century. The science practiced to meet the energy needs of the twentieth century responded to a society of wealth, and energy systems were designed to be large and centralized.

The Future of Fundamental Physics

Fundamental physics began the twentieth century with the twin revolutions of relativity and quantum mechanics, and much of the second half of the century was devoted to the construction of a theoretical structure unifying these radical ideas. But this foundation has also led us to a number of paradoxes in our understanding of nature.

Microbes as Menaces, Mates & Marvels

The conventional understanding of microbes as causative agents of disease has led us to fear them and to consider them our deadly enemies. Much less appreciated are the central roles microbes play in shaping the environment and in maintaining plant, animal, and human health. All metazoan organisms – organisms that we can see with the naked eye – exist in lifelong partnerships with vast microbial communities.

Fossils Everywhere

History is omnipresent in the natural world, from inside rocks on the continents to the genes, cells, and organs of each creature on the planet. Linking the historical records of rocks, fossils, and genes has been a boon to understanding the major events in evolution. We use these seemingly different lines of evidence as tools for discovery.

Deciphering the Parts List for the Mechanical Plant

The development of inexpensive DNA sequencing technologies has revolutionized all aspects of biological research. The proliferation of plant genome sequences, in conjunction with the parallel development of robust tools for directed genetic manipulation, has given momentum and credibility to the goal of understanding several model plants as the sum of their parts.

The Coming Epidemic of Neurologic Disorders: What Science Is - and Should Be - Doing About It

The Earth’s population is aging fast, and the coming sharp increase in the number of people over age sixty-five will bring with it an epidemic of age-related neurodegenerative diseases, such as Alzheimer’s and Parkinson’s diseases. Currently, no cures exist for the major neurologic disorders. Unless cures can be found, by 2050 the cost of these diseases will exceed $1 trillion annually in the United States.

Biodiversity & Environmental Sustainability amid Human Domination of Global Ecosystems

Concern about the loss of Earth’s biological diversity sparked two decades of research of unprecedented intensity, intellectual excitement, and societal relevance. This research shows that biodiversity is among the most important factors determining how ecosystems function. In particular, the loss of biodiversity decreases the productivity, stability, and efficiency of terrestrial, freshwater, and marine ecosystems.

3D printing, E-cigarettes among the most important inventions of the 21st century


The human race has always innovated, and in a relatively short time went from building fires and making stone-tipped arrows to creating smartphone apps and autonomous robots. Today, technological progress will undoubtedly continue to change the way we work, live, and survive in the coming decades.

Since the beginning of the new millennium, the world has witnessed the emergence of social media, smartphones, self-driving cars, and autonomous flying vehicles. There have also been huge leaps in energy storage, artificial intelligence, and medical science. Men and women have mapped the human genome and are grappling with the ramifications of biotechnology and gene editing. 

We are facing immense challenges in global warming and food security, among many other issues. While human innovation has contributed to many of the problems we are facing, it is also human innovation and ingenuity that can help humanity deal with these issues.


24/7 Wall St. examined media reports and other sources on the latest far-reaching innovations to find some of the most important 21st-century inventions. In some cases, though there was precursor research and ancillary technology before 2001, the innovation did not become available to the public until this century. This list focuses on innovations (such as touchscreen glass) that support products rather than the specific products themselves (like the iPhone).

It remains to be seen if all the technology on this list will continue to have an impact throughout the century. Legislation in the United States may limit the longevity of e-cigarettes, for example. But some of the inventions of the last 20 years will likely have staying power for the foreseeable future.

1. 3D printing

Most inventions come as a result of previous ideas and concepts, and 3D printing is no different. The earliest application of the layering method used by today's 3D printers took place in the manufacture of topographical maps in the late 19th century, and 3D printing as we know it began in 1980.

The convergence of cheaper manufacturing methods and open-source software, however, has led to a revolution in 3D printing in recent years. Today, the technology is being used in the production of everything from lower-cost car parts to bridges to less painful ballet slippers, and it is even being considered for artificial organs.

2. E-cigarettes

While components of the technology have existed for decades, the first modern e-cigarette was introduced in 2006. Since then, the devices have become wildly popular as an alternative to traditional cigarettes, and new trends, such as the use of flavored juice, have contributed to the success of companies like Juul.

Recent studies have shown that there remains a great deal of uncertainty and risk surrounding the devices, with an increasing number of deaths and injuries linked to vaping. In early 2020, the FDA issued a widespread ban on many flavors of cartridge-based e-cigarettes, in part because those flavors are especially popular with children and younger adults.

3. Augmented reality

Augmented reality, in which digital graphics are overlaid onto live footage to convey information in real time, has been around for a while. Only recently, however, following the arrival of more powerful computing hardware and the creation of an open-source video tracking software library known as ARToolKit, has the technology really taken off.

Smartphone apps like the Pokémon Go game and Snapchat filters are just two small popular examples of modern augmented reality applications. The technology is being adopted as a tool in manufacturing, health care, travel, fashion, and education.

4. Birth control patch

The early years of the millennium brought an innovation in family planning, albeit one that is still focused only on women and does nothing to protect against sexually transmitted infections. The birth control patch was first released in the United States in 2002 and has made it much easier for women to prevent unintended pregnancies. The plastic patch contains the same estrogen and progestin hormones found in birth control pills and delivers them through the skin, much as nicotine patches deliver nicotine to help people quit tobacco products.

5. Blockchain

You've likely heard about it even if you don't fully understand it. The simplest explanation of blockchain is that it is an incorruptible way to record transactions between parties – a shared digital ledger that parties can only add to and that is transparent to all members of a peer-to-peer network where the blockchain is logged and stored.

The technology was first deployed in 2008 to create Bitcoin, the first decentralized cryptocurrency, but it has since been adopted by the financial sector and other industries for myriad uses, including money transfers, supply chain monitoring, and food safety.
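To see why such a ledger is described as incorruptible, consider a minimal sketch (a deliberate simplification of the idea, not Bitcoin's actual implementation): each block stores a hash of the previous block, so altering any past entry breaks every later link in the chain.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def is_valid(chain: list) -> bool:
    """Check every link; tampering anywhere invalidates all later blocks."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, ["Alice pays Bob 5"])
append_block(chain, ["Bob pays Carol 2"])
print(is_valid(chain))                             # True
chain[0]["transactions"] = ["Alice pays Bob 500"]  # attempted tampering
print(is_valid(chain))                             # False
```

Real blockchains add consensus mechanisms such as proof-of-work on top, but the hash-linking shown here is what makes the shared ledger effectively append-only.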

6. Capsule endoscopy

Advancements in light-emitting diodes, image sensors, and optical design in the '90s led to the emergence of capsule endoscopy, first used in patients in 2001. The technology uses a tiny wireless camera the size of a vitamin pill that the patient swallows.

As the capsule traverses the digestive system, doctors can examine the gastrointestinal tract in a far less intrusive manner. Capsule endoscopy can be used to identify sources of internal bleeding, bowel inflammation, ulcers, and cancerous tumors.

7. Modern artificial pancreas

More formally known as a closed-loop insulin delivery system, the artificial pancreas has been around since the late '70s, but the first versions were the size of a filing cabinet. In recent years, the artificial pancreas, used primarily to treat type 1 diabetes, became portable. The first modern, portable artificial pancreas was approved for use in the United States in 2016.

The system continuously monitors blood glucose levels, calculates the amount of insulin required, and automatically delivers it through a small pump. British studies have shown that patients using these devices spent more time in their ideal glucose-level range. In December 2019, the FDA approved an even more advanced version of the artificial pancreas, called Control-IQ, developed at the University of Virginia.
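As a rough illustration of the closed loop described above – monitor, compute, deliver – here is a toy sketch (purely illustrative; the dosing rule and parameters are invented and bear no relation to any approved device's algorithm):

```python
def insulin_dose(glucose_mg_dl: float, target_mg_dl: float = 120.0,
                 sensitivity: float = 0.02) -> float:
    """Toy proportional controller: dose only when glucose is above target."""
    error = glucose_mg_dl - target_mg_dl
    return max(0.0, sensitivity * error)  # units of insulin; never negative

# One pass of the monitor -> compute -> deliver loop on sample readings:
for reading in [180.0, 150.0, 121.0, 95.0]:
    print(f"glucose {reading:.0f} mg/dL -> dose {insulin_dose(reading):.2f} U")
```

The point of closing the loop is that dosing tracks measurements continuously instead of relying on a patient's periodic manual corrections.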


8. E-readers

Sony was the first company to release an e-reader using a so-called microencapsulated electrophoretic display, commonly referred to as e-ink. E-ink technology, which mimics ink on paper, is easy on the eyes and consumes less power; it had been around since the '70s (and improved in the '90s), but the innovation of e-readers had to wait until broader demand for e-books emerged. Sony was quickly overtaken by Amazon's Kindle after its 2007 debut. The popularity of e-readers has declined with the emergence of tablets and smartphones, but they still command loyalty from bookworms worldwide.

9. Gene editing

Researchers from the University of California, Berkeley and a separate team from Harvard and the Broad Institute independently discovered in 2012 that a bacterial immune system known as CRISPR (an acronym for clustered regularly interspaced short palindromic repeats) could be used as a powerful gene-editing tool to make detailed changes to any organism's DNA. This discovery heralded a new era in biotechnology.

The discovery has the potential to eradicate diseases – for example, by altering the genes of mice and mosquitoes to combat the spread of Lyme disease and malaria – but it is also raising ethical questions, especially with regard to human gene editing for reproductive purposes.

10. High-density battery packs

Tesla electric cars have received so much attention largely because of their batteries. The batteries, located underneath the passenger cabin, consist of thousands of high-density lithium ion cells, each barely larger than a standard AA battery, nestled into a large, heavy battery pack that also offers Tesla electric cars a road-gripping low center of gravity and structural support.

The brainchild of Tesla co-founder J.B. Straubel, these battery modules pack more of a punch than standard (and cheaper) electric car batteries. These packs are also being used in residential, commercial, and grid-scale energy storage devices.

11. Digital assistants

One of the biggest technology trends in recent years has been smart home technology, which can now be found in everyday consumer devices like door locks, light bulbs, and kitchen appliances. The key piece of technology that has helped make all this possible is the digital assistant. Apple was the first major tech company to introduce a virtual assistant called Siri, in 2011, for iOS.

Other digital assistants, such as Microsoft's Cortana and Amazon's Alexa, have since entered the market. The assistants gained another level of popularity when tech companies introduced smart speakers. Notably, Google Home and Amazon's Echo can now be found in millions of homes, with an ever-growing range of applications.

12. Robot heart

Artificial hearts have been around for some time. They are mechanical devices connected to the actual heart or implanted in the chest to assist or replace a failing heart. Abiomed, a Danvers, Massachusetts-based company, developed a robot heart called AbioCor, a self-contained apparatus made of plastic and titanium.

AbioCor is a self-contained unit with the exception of a wireless battery pack that is attached to the wrist. Robert Tools, a technical librarian with congestive heart failure, received the first one on July 2, 2001.

13. Retinal implant

When he was a medical student, Dr. Mark Humayun watched his grandmother gradually lose her vision. The ophthalmologist and bioengineer focused on finding a solution to what causes blindness. He collaborated with Dr. James Weiland, a colleague at the USC Gayle and Edward Roski Eye Institute, and other experts to create the Argus II.

The Argus II is a retinal prosthesis device that is considered to be a breakthrough for those suffering from retinitis pigmentosa, an inherited retinal degenerative condition that can lead to blindness. The condition afflicts 1.5 million people worldwide. The device was approved by the U.S. Food and Drug Administration in 2013.

14. Mobile operating systems

Mobile operating systems have enabled the proliferation of smartphones and other portable gadgets thanks to their intuitive user interfaces and seemingly endless app options, becoming the most consumer-facing of computer operating systems. When Google first purchased Android Inc. in 2005, the operating system was just two years old, and the first iPhone (with its iOS) was still two years from its commercial debut.

15. Multi-use rockets

Billionaire entrepreneur Elon Musk may ultimately be remembered less for his contributions to electric cars than for his contributions to space exploration. Musk's private space exploration company, SpaceX, has developed rockets that can be recovered and reused in other launches – a more efficient and cheaper alternative to using the rockets only once and letting them fall into the ocean.

On March 30, 2017, SpaceX became the first to deploy one of these used rockets, the Falcon 9. Blue Origin, a space-transport company founded by Amazon.com's Jeff Bezos, has launched its own reusable rocket.

16. Online streaming

Online streaming would not be possible without the convergence of widespread broadband internet access and cloud computing data centers used to store content and direct web traffic. While internet-based live streaming has been around almost since the internet was broadly adopted in the '90s, it was not until the mid-2000s that the internet could handle the delivery of streaming media to large audiences. Online streaming is posing an existential threat to existing models of delivering media entertainment, such as cable television and movie theaters.

17. Robotic exoskeletons

Ever since researchers at the University of California, Berkeley, created a robotic device in 2003 that attaches to the lower back to augment strength in humans, the demand for robotic exoskeletons for physical rehabilitation has increased, and manufacturing has taken off.

Wearable exoskeletons are increasingly helping people with mobility issues (particularly lower body paralysis), and are being used in factories. Ford Motor Company, for example, has used an exoskeleton vest that helps auto assemblers with repetitive tasks in order to lessen the wear and tear on shoulders and arms.

18. Small satellites

As modern electronics devices have gotten smaller, so, too, have orbital satellites, which companies, governments, and organizations use to gather scientific data, collect images of Earth, and for telecommunications and intelligence purposes. These tiny, low-cost orbital devices fall into different categories by weight, but one of the most common is the shoebox-sized CubeSat. As of October 2019, over 2,400 satellites weighing between 1 kg (2.2 lbs) and 40 kg (88 lbs) had been launched, according to the Nanosats Database.

19. Solid-state lidar

Lidar is an acronym that stands for light detection and ranging, and is also a portmanteau of the words "light" and "radar." The technology today is most often used in self-driving cars. Like radars, which use radio waves to bounce off objects and determine their distance, lidar uses a laser pulse to do the same.

By sending out enough laser pulses in rotation, it can create a constantly updated, high-resolution map of the surrounding environment. The next steps in the technology include smaller and cheaper lidar sensors, especially solid-state ones – with no spinning assemblies on top of the cars.
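The ranging principle itself is simple time-of-flight arithmetic: a pulse travels out and back at the speed of light, so the range is half of the round-trip distance. A minimal sketch (illustrative numbers):

```python
C = 299_792_458.0  # speed of light in m/s

def lidar_distance_m(round_trip_s: float) -> float:
    """Range from pulse round-trip time: the pulse covers the distance twice."""
    return C * round_trip_s / 2

# A return detected 200 nanoseconds after emission is ~30 m away.
print(round(lidar_distance_m(200e-9), 1))  # 30.0
```

Repeating this measurement millions of times per second across many beam directions is what builds the point-cloud map a self-driving car navigates by.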

20. Tokenization

If you have ever used the chip embedded in a credit or debit card to make a payment by tapping rather than swiping, then you have benefited from the heightened security of tokenization. This data security technology replaces sensitive data with an equivalent randomized number – known as a token – that is used only once per transaction and has no value to would-be hackers and identity thieves attempting to intercept transaction data as it travels from sender to recipient. Social media site classmates.com was reportedly the first to use tokenization, in 2001, to protect its subscribers' sensitive data. Tokenization is also being touted as a way to prevent hackers from interfering with driverless cars.
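A minimal sketch of the substitution idea follows (a toy model of the concept; real payment tokenization follows industry standards and involves far more machinery). A guarded "vault" swaps the sensitive value for a random token, and the mapping back exists only inside the vault:

```python
import secrets

class TokenVault:
    """Toy token vault: swaps sensitive values for random, single-use tokens."""

    def __init__(self):
        self._vault = {}  # token -> original value, never exposed outside

    def tokenize(self, card_number: str) -> str:
        token = secrets.token_hex(8)  # random; carries no card information
        self._vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # Single use: the mapping is destroyed once the token is redeemed.
        return self._vault.pop(token)

vault = TokenVault()
t = vault.tokenize("4111 1111 1111 1111")
print(t)                    # e.g. '9f2c4e1ab37d05c8' - worthless if intercepted
print(vault.detokenize(t))  # the real number, recoverable only via the vault
```

An intercepted token reveals nothing about the card number, and because it is single-use it cannot be replayed for a second transaction.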

21. Touchscreen glass

Super-thin, chemically strengthened glass is a key component of the touchscreen world. This sturdy, transparent material is what helps keep your iPad or Samsung smartphone from shattering into pieces at the slightest drop. Even if these screens crack, in most cases the damage is cosmetic and the gadget still works.

Corning Inc., already a leader in the production of treated glass used in automobiles, was asked by Apple to develop 1.3-mm treated glass for its iPhone, which debuted in 2007. Corning's Gorilla Glass is still the most well known, though other brands exist in the marketplace.

24/7 Wall Street is a USA TODAY content partner offering financial news and commentary. Its content is produced independently of USA TODAY.

Article Contents

  • Reconceptualizing war: the rise of post-modern war, 1945–1989
  • The persistence of post-modern war after the Cold War
  • Post-modern war and the future of the state


Technology, war and the state: past, present and future

This article is part of a special issue of International Affairs (July 2019) on ‘Re-visioning war and the state in the twenty-first century’, guest-edited by Tracey German.


Warren Chin, Technology, war and the state: past, present and future, International Affairs , Volume 95, Issue 4, July 2019, Pages 765–783, https://doi.org/10.1093/ia/iiz106


War made the state, and the state made war, but does this statement hold true today? Will it apply in the future? The consensus is that the absence of major war within the western world, post 1945, did cause the war–state relationship to change: each became significantly less important to the other. This article argues that the relationship was closer and deeper than has been assumed. It proposes that the peculiar strategic conditions created by the nuclear age caused states to wage a ritualistic style of war, in which demonstration rather than the physical application of violence became increasingly important. Within this setting, the state drove the process of technological innovation in defence to its limits in an effort to demonstrate its military superiority. This massive peacetime investment in defence technology exerted a huge impact on the character of war, which led to new strategic forms. However, most importantly, the diffusion of military technology also affected the wider economy and society, leading to a form of internal power transition within states. The author speculates on how these elemental forces will play out in the future, what will happen to war and the state, and whether we will reach a point where war leads to the unmaking of the state.

This article explores the changing relationship between war and the state in the western world since the end of the Second World War. Specifically, it analyses how that relationship evolved during and after the Cold War, and extrapolates from current trends to speculate what impact war will have on the future evolution of the state. Our understanding of the connection between war and the state assumes that war played an instrumental role in the formation of the state in the early modern period. The synergistic relationship established at that time then blossomed over the next four centuries, during which both state and war grew exponentially. However, this expansion was checked by the declining incidence and scale of interstate war after 1945, which eventually allowed new political and economic priorities to emerge that resulted in the reshaping of, and a changed role for, the state. 1

The article presents an alternative view of the war–state relationship in the post-Second World War era. It does not challenge the logic that the decline in war affected the war–state connection. 2 However, it does not see this change as evidence of atrophy. Instead, it demonstrates how the complexity of war after 1945 led to a deep but more subtle interaction, which had a profound effect on war, the state and society in the western world. While I do not challenge the premise that a range of factors played a role in shaping the connection between war and the state, the precise interaction and relative importance of these forces have altered over time, and this has caused the demands of war on the state to shift in significant ways. In the period under scrutiny in this article, I argue that the role of technology in war increased dramatically because of the nuclear revolution. In this setting, technological development reduced the opportunities for war, but the arms race it generated also brought into being new technologies, and these facilitated new forms of conflict. These developments affected our understanding of war's character and its interaction with the state.

Military history provides a rich literature on war and technology, but its focus has tended to be on the importance of technology in helping militaries win wars. 3 In rarer cases, writers have sought to situate war within a broader technological, economic, social and cultural framework. 4 This is where the principal focus of the present article lies. However, my aim is to turn this domain upside down and explore not just how the world has changed (and continues to change) war, but how the war–technology dynamic has changed the world, in what might be described as a form of positive feedback. To this end, I expand and build on the historical overview presented by William McNeill and Maurice Pearton of the financial and technical linkages forged between war and the state starting in the late nineteenth century. 5 This provides a conceptual framework within which to explore how that relationship evolved and how it might change in the future. Most importantly, this construct allows the contemporary war–state relationship to be viewed through a different lens, one that sees a stronger, darker and more damaging connection than is generally recognized.

In addressing this issue, I have relied on the experiences of the United States and United Kingdom, as representative examples of western states, to support the propositions set out here. First, in both cases the state played a leading role in promoting defence research after 1945; technology was of central importance in their strategic frameworks, and continues to be so today. Second, both states consciously exploited defence technology to promote wider economic prosperity. I recognize that attempts to look into the future carry a great deal of risk, and I explain below how I have taken this risk into account. The only general point I would make here is that history shows that military forecasting is sometimes successful; I have looked at such examples and drawn on their methodologies.

In sum, the central argument of this article is that, after 1945, technology acted as a vital agent of change in the war–state relationship, and eventually the ripples of this change spread throughout society. To illustrate this point, one need only look at the ubiquitous smartphone, which was made possible by technologies that emerged from defence research. This capability has in turn affected the conduct of war, and through it the state. Thus the smartphone provides just one significant example of how technology and war are shaping the state and the world we live in. 6

The article is divided into three parts. The first explores the war–state relationship and the factors that shaped it during the Cold War. It explains why technological innovation became so important in war, and how this imperative influenced both our understanding of war and the interaction between war and the state. The second section examines why the imperative for technological innovation persisted, and why the war–state infrastructure survived in the post-Cold War era. Finally, the third section explores how current trends might influence the war–state relationship in the future.

Clausewitz missed the importance of technology as a variable in his analysis of war. 7 Tilly, one of the most critical commentators on the war–state relationship, was also sceptical about the importance of technology in this process, and focused instead on the economics of waging war. 8 The omission is understandable, because the history of war is characterized by long phases of technological stagnation punctuated by occasional spasms of revolutionary change caused by a variety of forces. 9 This point is illustrated by a cursory glance at naval technology, which shows that ship design and armaments in Europe remained largely unchanged from 1560 to 1850. 10 However, I contend that the importance of technology increased dramatically in the conduct of war from the nineteenth century onwards, for three reasons. The first was the impact of the Industrial Revolution. This period of sustained and rapid technological innovation eventually affected all areas of human activity, including war. Evidence of the increased pace of technological change can be seen in Schumpeter's economic analysis of capitalism and its relationship to technology. In his view, four long economic cycles in the Industrial Revolution led to ground-breaking changes in the mode of production in little more than a hundred years. 11 At the microeconomic level, Schumpeter also challenged economic orthodoxy by arguing that capitalism was based not on price competitiveness but on innovation, via the creation of ‘the new commodity, the new technology, the new source of supply, the new type of organisation’. Schumpeter called this the process of ‘creative destruction’ as firms seek to innovate to achieve a position of monopoly and thereby maximize profits until that advantage is cancelled out by the next innovation. 12

During this time, the technological needs of the armed forces ‘were met out of the same scientific and technical knowledge that manufacturing industry had put to use in satisfying its commercial needs’. 13 As such, wider forces fed into the realm of war. However, this situation slowly changed such that the demands for military technology eventually shaped the wider context in which it existed—which brings us to the second reason why the importance of technology increased. McNeill demonstrates how the state began to assume a role as a sponsor of technological innovation in defence in the late nineteenth century as the military became increasingly interested in the exploitation of technology. Such state sponsorship of innovation was termed ‘command technology’. 14 However, as Hartcup observed, this process of innovation operated within military, fiscal and time constraints that imposed a limit on the ambition of defence research. 15 In general, mass industrialized war in the twentieth century emphasized quantity more than quality, and required the mobilization of society and the economy via the state. The demands of war also resulted in the state expanding into the provision of education and health care to ensure the population was fit to wage war. Even liberal Britain succumbed to this view of the state. 16 These features eventually became the defining characteristics of what Hables Gray called ‘modern war’. 17

The advent of the nuclear age, the third reason, precipitated a profound change in the organization and conduct of war. Hables Gray asserts that 1945 marks the dividing line between modern war and the birth of what he terms post-modern war. 18 This philosophical construct is used here as post-modernism intended: not as a label, but as a way of indicating that war, like many forms of human activity, is a discourse. 19 That discourse changed profoundly after 1945 because at that point scientific advance, in the form of nuclear weapons, made modern war impossible. This new strategic setting precipitated what Holsti described as the diversification of warfare; and this in turn resulted in a blurring of the line between peace and war as governments employed a range of means to achieve their policy goals below the threshold of general war. Most importantly, the forms of war proliferated as new ways were devised to employ war as a political tool in a nuclear world. 20 This change did not render Clausewitz's concept of war obsolete, but it did require it to be adapted. 21

Clausewitz explained that ‘war is an act of violence to compel our opponent to fulfil our will’. 22 War is also the continuation of policy by other means. 23 War, then, is defined as a discourse of physical violence to achieve a political goal. However, in examining the post-1945 war–state relationship in the West, we need to revise our understanding of war so that it extends beyond physical violence and bloodshed. Russian military reflections on the Cold War reveal an interesting narrative that reinforces this expansion of war beyond its traditional domain. According to this analysis, the Soviet Union lost the Cold War because it was defeated by non-military means employed by its enemy that focused on psychological, political, information, social and economic attacks against the Soviet state. 24 Although this interpretation can be contested, it is important to acknowledge that states used both military and non-military levers to confront their enemies in this conflict. Technology played a vital role in facilitating this process, for example via the communications revolution, which enabled activities such as political warfare. However, the most salient aspect of the Cold War was the discourse of deterrence. Within this context, the rituals of war (organizing, preparing and demonstrating an ability to fight nuclear war in the hope of deterring potential opponents and thereby preventing war) became substitutes for organized violence. Small wars happened on the periphery of the US and Soviet geopolitical space, but in the core region, a different kind of cognitive and cultural violence emerged, which can be seen as a form of war. 25

How, then, did technology fit into this new discourse of war? According to Buzan, because nuclear deterrence relied on anticipated weapons performance, it became sensitive to technical innovation, which meant the state had to respond to technological change by investing in defence research to maintain the credibility of its deterrent. 26 As a result, a premium came to be placed on technological innovation in defence, and this caused the role of the state in military research to expand. 27 Consequently, states came to play an essential part in a military version of Schumpeter's process of creative destruction. The role of the state was vital because it was the state that provided the critical financial resources required to take embryonic technologies and develop them at a speed unlikely to be matched by the civilian market. This facilitated a profound change in the relationship between the state and private industry and undermined the operation of the free market as governments opted to support defence contractors capable of conducting large and complex forms of research and development (R&D). 28 This trend did not go unnoticed; in 1961, President Dwight Eisenhower warned against the pernicious influence exerted by the creation of a military–industrial complex (MIC), a construct which referred to the incestuous relationship between the military, defence industries and politicians acting in concert as an interest group to persuade the state to spend more on defence. 29 Harold Lasswell also noted the rising prominence of the military in peacetime in his thesis of the ‘garrison state’, which described the potential militarization of the American polity. 30 Samuel Huntington echoed this concern in his book The soldier and the state , which considered how the United States could manage an immense military establishment in a time of peace without jeopardizing the sanctity of its democracy. 31 These debates and themes waxed and waned as the Cold War progressed, but they persisted, and even in the 1980s the notion of the MIC was still being discussed. 32 The strategic logic of nuclear deterrence created a climate which justified high defence spending and significant investment in defence research—but why did this infrastructure persist in the more benign environment of the post-Cold War world?

The end of the Cold War resulted in a significant fall in defence expenditure. Equally importantly, the state reduced its participation in sustaining defence research and allowed the private sector to play a more prominent role in defence production. In the UK, where the nationalized defence industries had already been privatized in the 1980s, this process was extended to include the sale of the state's defence research and development arm. This change in industrial and technological policy reflected a broader adjustment as the state lost its position in the vanguard of the technological revolution. Since the start of the Cold War, US government-funded defence research had given rise to technologies such as the internet, virtual reality, jet travel, data joining, closed-circuit TV, global positioning, rocketry, remote control, microwaves, radar, networked computers, wireless communications and satellite surveillance. 33 The subsequent exploitation of these technologies by the private sector reflected a conscious policy choice by most western governments, which was to promote technology spinoffs from defence research into the wider economy as a way of generating wealth. 34 Once the technology had been created, the civil, commercial sector proved adept at adapting and changing the new capabilities. The critical difference between innovation in the defence market and its civilian counterpart was that, in the latter, high rates of consumption led to product and process innovation by companies. As a result, civil technology providers increasingly took the lead in the information revolution. Given this new dynamism, military power relied increasingly on the existing pool of technological knowledge within the broader economy. The increasing emphasis on quality in war also generated greater complexity during operations. This trend facilitated the rise of private military companies in the post-Cold War era and resulted in western states increasingly subcontracting the provision of internal and external security to the private sector. 35

However, in spite of the end of the Cold War, western governments continued to have an appetite for technological innovation and its integration into ever more complex weapons. Indeed, an important feature of post-modern war was that machines assumed an unprecedented importance in the post-Cold War era. As Hables Gray explained: ‘War is a discourse system, but each type of war has different rules of discourse. In postmodern war, the central role of human bodies in war is being eclipsed rhetorically by the growing importance of machines.’ 36

The First Gulf War was an important marker because it revealed to western society the power of technology, at least in a conventional war. As Freedman observed, this conflict resolved the high-tech versus low-tech debate which had persisted throughout the Cold War. 37 Observers now spoke of a paradigm shift in the conduct of war and a revolution in military affairs (RMA) caused by technological advance in computers and communications. 38 Paradoxically, cuts in defence spending and provision compounded the drive to rely on technology in war as smaller militaries sought to pack a bigger punch to compensate for their lack of mass. 39 In the 1990s, the RMA served another purpose in that it allowed for the creation of what Shaw described as ‘risk-free’ war. Technology allowed western states to engage targets at long range with high accuracy, but at no risk to those firing the weapons—something that became very useful in an era of wars of choice. 40 Perhaps the best example of the strengths and weaknesses of this approach was NATO's 78-day bombing campaign against Serbia in 1999. 41

Technological innovation in the techniques of war allowed the state to continue using force as an instrument of policy, especially in those instances where there was no clear political consensus on taking military action. In sum, the state continued to see its security through the prism of technological advance; and this, in turn, helped to sustain the MIC in that brief period between the end of the Cold War and the start of the ‘war on terror’. The idea of an MIC persists today. For example, David Keen points to the powerful economic functions fulfilled by the war on terror, which he believed explained the persistence of a war based on counterproductive strategy and tactics. 42 More recently, Paul Rogers has referred to the creation of a military–industrial–academic–bureaucratic complex, which is exploiting the latest iteration of the war on terror: the war against the so-called ‘Islamic State in Iraq and Syria’ (ISIS). 43 While the technology paradigm was briefly challenged in Iraq in 2006 and replaced by a more labour-intensive approach to war, as articulated in the principles of counter-insurgency, this, in turn, was quickly replaced by less risky, more capital-intensive techniques of war waged with satellites, robots, drones, precision weaponry and special forces. 44 In summary, the elaborate infrastructure of war created during the Cold War endured in the post-Cold War era before being reinvigorated by the fiscal stimulus generated by the war on terror. During this period technology was viewed almost as a silver bullet. As such, it provided a neat answer to complex questions posed by the human and physical terrain of war. Most importantly, for a brief moment at least, it allowed western states to reimagine decisive victories and tidy peace settlements. 45 Such was the allure of technology that Coker speculated on the possibility of a future ‘post-human warfare’ in which machines replaced humanity on the battlefield. 46

How, then, will predicted developments in technology shape the future of war and the state? This is a question that is causing much anxiety in both academic and policy-making circles. As Freedman points out, the future is based on decisions that have yet to be made in circumstances that remain unclear to those looking into a crystal ball. 47 Just as important as this uncertainty are those biases that shape our preferences regarding how we see the future. Cohen has pointed out that debates on the future of war often suffer from being technologically sanitized, ignoring politics and therefore lacking a meaningful context. 48 As a result, the ‘future war’ literature often suffers from an overreliance on a simplistic overview of decisive military technologies. I address these problems in two ways.

The first is to follow the advice offered by the sociologist Michael Mann, who observed that no one could accurately predict the future of large-scale power structures like the state; the most one can do is provide alternative scenarios of what might happen given different conditions, and in some cases arrange them in order of probability. 49 The UK's Development, Concepts and Doctrine Centre adopted this approach and set out multiple scenarios to support its analysis of future strategic trends. 50 Second, it is essential to widen the lens through which the future is projected and to understand the political context within which technology, war and the state will all be situated. To this end, I adopt here the Clausewitzian framework of analysis which Colin Gray employed in considering future war. As he explains:

Future warfare can be approached in the light of the vital distinction drawn by Clausewitz, between war's ‘grammar’ and its policy ‘logic’. Both avenues must be travelled here. Future warfare viewed as grammar requires us to probe probable and possible developments in military science, with reference to how war actually could be waged. From the perspective of policy logic we need to explore official motivations to fight. 51

In exploring the future relationship between war and the state, and the role played by technology, two possible visions are presented here. The first explores the continuation of the status quo and represents the default setting of both the UK and US governments with regard to the future. The second follows the recommendation offered by Paul Davis, who advised that, when selecting a scenario, one should choose a vision that challenges, provokes controversy and breaks out of orthodox thinking. 52

Both models have one thing in common: they will be influenced by what might be seen as the next wave of technological change. This latest technical convulsion is illustrated by Schwab's idea of the fourth Industrial Revolution, which is a crude facsimile of Schumpeter's theory of long economic cycles. The fourth Industrial Revolution builds on the digital revolution, which began in the 1960s, but differs from it in that it entails ‘a much more ubiquitous and mobile internet, … smaller and more powerful sensors that have become cheaper, and … powerful artificial intelligence (AI) and machine learning’. 53 The term ‘artificial intelligence’ was first used by the American scientist John McCarthy in 1956. According to his definition, AI is merely the development of computer systems to perform tasks that generally need human intelligence, such as speech recognition, visual perception and decision-making. More recently, Max Tegmark has defined AI as a non-biological intelligence possessing the capability to accomplish any complex task at least as well as humans. 54 Currently, the exponential rise of AI is being driven by three developments in the world of computing: smarter algorithms, a vast increase in computing power and the ability to process huge quantities of data. 55 What this means is that humans are now being challenged by machines in the cognitive as well as the physical domains of work. Digital technologies that have computer hardware, software and networks at their core are not new, but represent a break with the third Industrial Revolution because of the level of sophistication and integration within and between them. These technologies are transforming societies and the global economy.

The fourth Industrial Revolution is not only about smart and connected machines and systems. It is linked with other areas of scientific innovation ranging from gene sequencing to nanotech and from renewables to computing. It is the fusion of these technologies and their interaction across the physical, digital and biological domains that make the fourth Industrial Revolution fundamentally different from previous epochs. Emerging technologies and broad-based innovations are diffusing much more quickly and more widely than their predecessors, which continue to unfold in some parts of the world. It took the spindle, the hallmark of the first Industrial Revolution, 120 years to spread outside Europe; by contrast, the internet permeated the globe in less than a decade. 56 In sum, it is not one specific technology but the sheer number of technologies and the interaction between them that is creating change on such an unprecedented scale that Schwab believes it can be described as a revolution. What, then, does this mean for the relationship between war and the state?

The first model of the future adopts a ‘business as normal’ scenario. In this version of the future, the policy logic of war remains focused on the security of the state and concentrates on state-based threats. The principal causes of war can be identified in the anarchy of the international system. 57 The state preserves its monopoly on the use of force because the barriers to entry into the weapons market remain high. In addition, the state continues to function effectively and to be able to extract the resources needed to maintain its legitimacy and territorial integrity. Within this context, the state still pursues the development of advanced technologies to defend against mostly state-based threats. In this scenario, future war is imagined as a symmetrical contest between conventional forces on an increasingly automated battlefield. Within this space, humans will be augmented and in some instances replaced by AI and robots contending with increasingly lethal forms of weaponry. 58

In this vision of the future, the military's pursuit of the next technology follows a familiar pattern, and the risk and uncertainty involved continue to make state finance and policy support indispensable to defence research. The most recent example of this activity is the UK government's promise to share with BAE Systems the cost of funding the development of a technology demonstrator for the next generation of fighter aircraft. Named Tempest, this fighter can operate either as a manned or as an unmanned aircraft; it will rely on AI and employ directed energy weapons. 59 A grander example of the status quo scenario is the American-led ‘Third Offset’ strategy, a programme designed to preserve America's military-technological superiority. At the core of the Third Offset is the intention to exploit advances in machine autonomy, AI, quantum computing and enhanced digital communications to improve the man–machine interface in the future battlespace. 60 The United States is investing US$18 billion in the creation of these capabilities, even though it is not clear how feasible the development of technologies such as AI will be. 61

It is important to note that non-western states are also pursuing these policies. The outstanding example here is China. Its economic model, which is based on state-sponsored capitalism, is enabling it to work in a close partnership with privately owned Chinese tech firms to achieve a broad-based technological self-sufficiency in both commerce and defence. 62 Investment in research and development has grown by 20 per cent per year since 1999 to the point where China now spends US$233 billion per annum, a sum that accounts for 20 per cent of the world's research and development spending. 63 Three technologies, it is claimed, matter most to China, and all three relate to its ability to control the internet. These are semiconductors, quantum computing and AI. 64 In 2017, China accounted for 48 per cent of all AI venture funding, and the Beijing government aims to be the centre of global innovation in AI by 2030. 65

In this scenario, then, the state can harvest and refine a range of new technologies generated by the private rather than the public sector in a manner that preserves its monopoly on the use of force. At the same time, that monopoly is reinforced because of the complexity of these capabilities and the challenges posed in their use on operations, which require well-trained and professional forces. Private military companies will persist, but their existence will rely on their ability to draw on this pool of trained personnel created by the state to populate their organizations, which means they will support, not challenge, the state's role as a provider of security.

In the second scenario of the future, the policy logic of war reflects a darker, dystopian image of the relationship between war and the state. In this setting, conflict is a product of desperation caused by scarcity, which is occurring on a global scale. Most importantly, the causes of war lie within states as well as between them. In this multifaceted crisis, technological change is weakening rather than strengthening the state and undermining its ability to cope with the tsunami of problems sweeping over it. The debate over this view of the future policy logic of war began in 1972 with the publication of a hugely controversial book called The limits to growth . 66 This study explored the impact of population growth, industrialization, pollution, and resource and agricultural shortages on the global economic system. Its principal conclusion was that population growth would create an insatiable demand for goods, outstripping the finite resource base of the planet. Humanity's efforts to address this imbalance in demand and supply by increasing productivity would be self-defeating and cause a host of environmental problems. In spite of the passage of time since its first appearance, this book set out themes that are explicitly linked to the spectrum of security issues we face today. 67 Moreover, a 2014 study conducted at the University of Melbourne claimed that the world might still be moving along the trajectory mapped out in 1972, and that economic and environmental collapse could happen before 2070. 68

There is a general assumption that the worst effects of these environmental trends will be for the most part experienced outside the western world. Even when western states are affected, it is assumed, rich countries will possess the financial means to weather this future storm. However, a recent report by Laybourn-Langton and colleagues challenges this simplistic assumption and points to the social and economic harm being caused globally by current forms of human-induced environmental change. These authors also demonstrate that no region of the world will be untouched by this phenomenon, and use the UK as a case-study to illustrate the point. In their view, the degradation of the environment will interact with existing political and economic trends to undermine the cohesion and internal stability of states across the globe. 69 Interestingly, the report's analysis of the challenges facing governments has not been contested, although its proposed solutions, in terms of radical economic reform, have been strongly challenged by economists. 70

Current trends suggest that a potential environmental crisis might run in parallel with an economic crisis. Ironically, the source of this predicament lies in potential problems generated by the fourth Industrial Revolution. Like the military, business is also fast approaching a time when machine intelligence can perform many of the functions hitherto carried out by humans in a range of occupations. As McAfee and Brynjolfsson explain, innovation was hugely advantageous in those occupations which relied on physical labour, allowing new forms of economic activity and employment based on human cognitive abilities to develop. 71 However, this cognitive comparative advantage is now under threat, as computer algorithms have reached a point where they can outperform humans in many jobs. 72

As in the military domain, so in our economic and political affairs it is predicted that AI will precipitate a revolution. A PricewaterhouseCoopers (PwC) report predicted that 38 per cent of all jobs in the United States are at high risk of automation by the early 2030s. 73 Most of these are routine occupations such as those of forklift drivers, factory workers and cashiers in retail and other service industries. This depressing analysis is supported by the Bank of England's estimate that up to 15 million jobs are at risk in the UK from increasingly sophisticated robots, and that their loss will serve to widen the gap between rich and poor. 74 Most worrying is the fact that, in the short term, the jobs most at risk are low-paid and low-skilled occupations, which are precisely the jobs the UK and US economies have been so successful in generating to create record levels of employment since the financial crash in 2008.

As in the past, those most affected by this change will be the economically least powerful sectors of society—the old, and unskilled and unorganized labour. Until now, the managerial and professional classes have been able to use their economic and political positions to protect themselves from the worst effects of such crises. 75 The big difference about this revolution is that AI is threatening traditional professional middle-class occupations. Any job that can be done via the application of pattern-searching algorithms will be vulnerable. This includes banking and finance, the law and even education. Daniela Rus has argued that humans need the personal touch in their day-to-day lives and that humans are therefore guaranteed to have a place in the job market. 76 Sadly, Harari challenges even this view, and claims machines can mimic empathy by monitoring blood pressure and other physical indicators in interactions between AI and humans. 77 A recent Wall Street Journal report supports this view. Investigating the use of AI in the provision of psychological therapy, it found that people preferred the treatment offered by the AI precisely because it was a machine, and so they did not feel judged. The system can also be configured to fit people's preferences, creating a 3D computer-generated image that is comforting and reassuring. 78

A significant limitation of AI and machine technology is that currently they cannot replicate the dexterity of humans in handling delicate objects, and this does leave a role for humans in the workplace. However, scientists in California are looking at the use of AI and machine technology as a way of addressing the acute labour shortages experienced in the fruit-picking industry; this includes the development of machines capable of deciding which fruit is ripe for picking, and doing so in a way that does not damage the produce during picking, processing or distribution. Given these developments, Harari's prediction for humans in the workplace is bleak. ‘In the twenty-first century we might witness the creation of a massive new unworking class: people devoid of any economic, political or even artistic value, who contribute nothing to the prosperity, power and glory of society.’ 79 The mass unemployment generated would be on an unprecedented scale and likely to precipitate instability and violence. 80

Further evidence to support the depressing scenario depicted here is provided by the former head of Google China, Dr Kai-Fu Lee, a man with decades of experience in the world of AI. In his view, AI ‘will wipe out billions of jobs up and down the economic ladder’. 81 A typical counter to this view is that AI will lead to the creation of new jobs and new careers; but, as Tegmark explains, the evidence does not support this claim. If we look back over the last century, what is clear is that ‘the vast majority of today's occupations predate the computer revolution’; moreover, the new jobs that computers did create have not been numerous. 82

What then are the political and security implications of this profound economic change in terms of war and the state? Although depressing, the scenario depicted above does not mean we are condemned to what Martin Wolf describes as a kind of ‘technological feudalism’. 83 As Gurr points out, past economic crises have provided political incentives for social reforms: for example, the New Deal in the United States, which represented a revolutionary change in how central government sought to manage the economy. 84

According to Wolf, three factors might determine how well the state deals with these challenges: first, the speed and severity of the transformation we are about to experience; second, whether the problem is temporary or likely to endure; and third, whether the resources are available to the state to mitigate the worst effects of these changes. In the past, western governments have deployed a range of policies to deal with recessions or, as in the 1970s, scarcity of resources such as oil. However, these macroeconomic policy responses operated on the assumption that such crises were temporary, and that economic growth would resume and normality be restored quickly if the right measures were in place. In contrast, the environmental crisis and the AI revolution are happening rapidly and both will be enduring features of economic and political life. In Wolf's view, this latest revolution will require a radical change in our attitude towards work and leisure, with the emphasis on the latter. He also believes we will need to redistribute wealth on a large scale. In the absence of work, the government might resort to providing a basic income for every adult, together with funds for education and training. The revenue to fund such a scheme could come from tax increases on pollution and other socially negative behaviours. In addition, intellectual property, which will become an important source of wealth, could also be taxed. 85

However, the introduction of these measures will not necessarily prevent a rise in politically motivated violence. As Gurr explains, recourse to political violence is caused primarily not by poverty but by relative deprivation. This is defined as ‘actors' perception of discrepancy between their value expectations and their environment's apparent value capabilities’. 86 As such, it reflects the difference between what people believe they are legitimately entitled to and what they achieve, perceptions of which have become acute in the age of the smartphone. Relative deprivation applies to both the individual and the group. Seen in this light, the bright, shiny new world created by AI provides a potentially rich environment for relative deprivation—particularly if large swathes of the middle classes are frustrated in their ambitions and suffer a loss of status as a socio-economic group. 87 More worrying is that this technological and economic revolution will coincide with the global deterioration of the environment set out above, which also challenges the state.

Within this scenario, states in the western world will struggle just as much as states in the developing world. If the legitimacy of the state is measured in terms of its capacity to effectively administer a territory under its control, then the political context set out here poses a significant threat to this institution. The extraction of resources through taxation will prove extremely difficult as the tax base shrinks. This will affect the ability of the state to provide the public goods the population expects and requires. A weaker state, which lacks the resources and capacity to sustain the population, will also lack legitimacy; this could cause the social contract to break down and result in widespread violence. What, then, will the future grammar of war look like in this political and social context?

In this version of the future, the most fundamental aspect of the technology–war interaction will be the challenge to the state's retention of the monopoly of violence. Projections about the end of the state's monopoly on the use of force have been made before, but the current trajectory of technological change is making this threat more plausible, and bringing it closer. 88 This speculative line of enquiry was given substance in 1999 by two colonels in the Chinese People's Liberation Army, Qiao Liang and Wang Xiangsui. Their study was conceived mainly within the context of a future war between the United States and China, and so their thinking was developed within the setting of a state-based conflict. However, their central thesis is relevant here because they believed the world was living in an unprecedented age in terms of the speed and breadth of technological innovation. There are, they argued, so many essential technologies emerging that it is difficult to predict how these will combine, or what the effect of these combinations might be in military and political terms. Developments in biotechnology, materials technology, nanotechnology and, of course, the information revolution are creating new opportunities and ways of attacking other states. 89 An important observation made in Unrestricted warfare is that new technologies, which could be used as weapons, are increasingly part of our normal day-to-day lives. 90 In sum, the colonels identified a range of non-military means that are technically outside the state's control and that might allow a weaker actor to fight and defeat a more powerful adversary. The 20 years since the first publication of Unrestricted warfare have demonstrated the authors' prescience about what are today deemed to be new types of conflict. For example, what they called ‘superterrorism war’ seemed to come to fruition on 9/11. We can see how state and non-state actors have exploited emerging everyday technologies that challenge powerful nation-states. Of great importance is the way in which groups such as ISIS and revisionist powers such as Russia have weaponized social media in their efforts to weaken those who oppose them. ISIS, indeed, claimed that media weapons could be more potent than atomic bombs. 91

It is believed that Russia is increasingly relying on non-military means to challenge the West. Not surprisingly, evidence is mounting that it influenced the outcome of the 2016 US presidential election. 92 This form of activity is now a persistent feature of the conflict spectrum and is practised by a variety of states. 93 In August 2018, Facebook closed 652 fake accounts and pages with ties to Russian and Iranian state-based organizations. In both cases, the objective appears to have been to influence domestic politics in the UK, the US, the Middle East and Latin America. Four campaigns were identified, three of which originated in Iran. 94 Given that Facebook has more than 2 billion accounts to police, it is feared this practice will persist.

The blurring of the distinction between military and civilian technology is not the only reason why such technology is becoming more accessible. Moises Naim points to the falling cost of many technologies used in both defence and the civilian sector, which is making them accessible to weak states and violent non-state actors. 95 An excellent example of this trend can be seen in the domain of synthetic biology, a new field that combines the power of computing and biology to ‘design and engineer new biological parts, devices and systems and redesign existing ones for other purposes’. 96 In 2003, the Human Genome Project completed the first full sequencing of human DNA. The successful completion of this project took ten years and was the result of work done in over 160 laboratories, involving several thousand scientists and costing several billion dollars. It is now possible to buy a DNA sequencing device for several thousand dollars and sequence a person's genome in less than twenty-four hours. So steeply, in fact, have sequencing costs fallen that the industry is no longer profitable in the developed world and is now primarily conducted within China. By way of example of the potential threat posed by this new science, in 2005 scientists, worried about the possibility of another flu pandemic, recreated the Spanish flu virus which during and after 1918 killed 50 million people in two years. In 2011, scientists employed these techniques to manipulate the H5N1 bird flu virus and create a variation which could be spread from the avian to the human species. It is feared the technical bar to entry into this domain is now sufficiently low that it can be exploited for nefarious purposes by individuals or groups. 97 Precisely the same fears have been expressed about the cyber domain. According to one Israeli general, ‘cyber power gives the little guys the kind of ability that used to be confined to superpowers’. 98 In the future, we might even be able to make weapons via 3D printers. In theory, it is possible to build a handgun or even an assault rifle with this technology.

However, before concluding that the state is about to wither away, we need to remember that these technologies are still maturing. Therefore, whether advances in the cyber domain will undermine or reinforce the power of the state remains a contested point. As Betz points out, launching a successful attack against another state via this medium can be very costly. The Stuxnet computer worm, which was used to attack Iran's nuclear programme, was a very sophisticated piece of software developed by a dedicated team of specialists over a long period. The successful insertion of this malware also required high-grade intelligence on the Iranian nuclear programme. Consequently, the success of a cyber attack depends on a combination of capabilities, not just the development of malicious code, and at the moment this puts the state at a considerable advantage. 99 A similar point can be made in the case of 3D printing: printing a weapon requires more than simply downloading the code. It also requires access to complicated and expensive computer-aided design software and a high-quality metal 3D printer capable of using steel, aluminium or nickel. Such a machine costs over US$100,000, which is nearly 60 times the price of a standard 3D printer which uses plastic. The latter has been used to print plastic guns, but these proved unreliable and likely to explode in the user's hand. 100

Finally, technology will also allow the state to attempt to counter internal threats to its authority. Stephen Graham notes that a significant trend in the war on terror has been the blurring between civilian and military applications of technologies dealing with control, surveillance, communications, simulation and targeting. The capability to exercise control via technologies which are intended to provide a service, such as parking and congestion charging, has dramatically increased the opportunities to conduct electronic surveillance for a host of other purposes. 101

‘War made the state, and the state made war’ is a maxim that has shaped our historical understanding of this relationship. In the West, the general absence of major war since 1945 changed the war–state relationship, and there is now a consensus that each is significantly less important to the other. My aim in this article has been to provide a more nuanced understanding of the war–state relationship that emerged after 1945.

The existence of nuclear arsenals made total or modern war obsolete. Within this strategic setting a new form of war emerged. Post-modern war did not require the state to mobilize its entire population and economy to fight a life-or-death struggle against other states, largely because its principal focus was on devising ways to use military power to deter war or devising new means to attack the enemy's moral rather than its physical power. As a result, the logic of war transcended simple notions of battle and victory. War between the Great Powers and their allies tended to be confined to the grey zone between peace and open violence. However, the drive for technological innovation, caused by the peculiarities of the Cold War, ensured that war and the state remained strongly connected, as only the state had the capacity to stimulate research and development on the scale required to ensure the efficacy of strategic deterrence.

The drift towards more capital-intensive modes of warfare continued in the post-Cold War era. Technology gave western governments the internal independence to prosecute wars because such wars demanded little sacrifice from society. In a period characterized by a plethora of politically unpopular ‘wars of choice’, this allowed states to employ force in pursuit of even vague, value-based objectives. Most importantly, these new means of war enabled nuclear-armed states to continue fighting each other in the space between war and peace using both military and non-military means. We have seen evidence of this in Ukraine and in the South China Sea.

This corporatist alliance between the state and private industry had impacts on politics, the economy and society, but in ways that did not conform with recognized patterns of behaviour associated with modern war. This is possibly why the war–state relationship since 1945 is viewed in terms of decline. However, the persistent debate about the existence of the MIC, admittedly a crude construct, is evidence of the survival of the war–state relationship and of its wider impact. The clearest evidence of this can be seen in the role played by military research in causing and accelerating scientific invention, which has been instrumental in bringing about dramatic economic, political and social change in contemporary western society. Most important of all are the non-military means created by military research which are now being exploited by both state and non-state actors. As Graham explains, western scientific research has gone through a cycle from defence to the commercial world and back again:

Hence, technologies with military origins—refracted through the vast worlds of civilian research, development and application that help constitute high tech economies, societies and cultures—are now being reappropriated as the bases for new architectures of militarized control, tracking, surveillance, targeting and killing. 102

Looking to the future, the likelihood is that war will continue to have a significant impact on the state. Commentators today note with concern the ways in which technology is undermining the state's monopoly on the use of force as the technical and fiscal barriers to weapons production fall. However, capability should not be equated with intent, and people rarely decide to initiate violence without cause. For this reason, it is important to reflect on the political context, which will provide the policy logic for war in the future. The most important potential effect of projected technological change is transformation of the means of production, which could trigger huge economic and political turmoil in the West. If the fourth Industrial Revolution proves to be as disruptive as is predicted, this will lead to increased instability and possibly violence. These developments will weaken the state and damage its legitimacy as it struggles to fulfil the needs of its population. Western states may be able to deal with this transformation; but if it coincides with the predicted deterioration in the global environment, the institution of the state will struggle to bear the combined weight of the demands imposed on it. Under these circumstances, civil conflict might result. The irony here is that the technological preparation for war after 1945 sowed the seeds of the state's demise, playing an important role in creating the conditions that might cause a future existential crisis of the western state. Not only has that technological advance created the conditions for war, especially civil war, it has compounded this threat by democratizing the means of violence and empowering non-state actors. In the future, then, the war–state relationship could take an unexpected turn; and war might actually precipitate the unmaking of the state.

See Martin van Creveld, The rise and decline of the state (Cambridge: Cambridge University Press, 1999), pp. 336–414; Michael Mann, The sources of social power , vol. 4: Globalizations, 1945–2011 (Cambridge: Cambridge University Press, 2013), p. 432; Philip Bobbitt, The shield of Achilles (London: Penguin, 2002), pp. 214–19; Charles Tilly, ‘War making and state making as organized crime’, in Peter Evans, Dietrich Rueschemeyer and Theda Skocpol, eds, Bringing the state back in: strategies of analysis in current research (Cambridge: Cambridge University Press, 1985), pp. 169–86.

Lawrence Freedman, ‘The rise and fall of Great Power wars’, International Affairs 95: 1, Jan. 2019, pp. 101–18.

See Martin van Creveld, Technology and war from 2000 BC to the present (New York: Free Press, 1989); Andrew F. Krepinevich, ‘Cavalry to the computer: the pattern of military revolutions’, The National Interest , no. 37, Fall 1994, pp. 31–43.

See Michael Howard, War in European history (Oxford: Oxford University Press, 1977); Hans Delbrück, The history of the art of war , vols 1–4 (Lincoln, NE: University of Nebraska Press, 1990).

William McNeill, The pursuit of power: technology, armed force, and society (Chicago: University of Chicago Press, 1982); Maurice Pearton, The knowledgeable state: diplomacy, war and technology since 1830 (London: Burnett, 1982).

Mariana Mazzucato, The entrepreneurial state: debunking public vs private sector myths (London: Penguin Random House, 2013), pp. 92–119.

See Christopher Coker, Rebooting Clausewitz: ‘On war’ in the 21st century (London: Hurst, 2017); Martin van Creveld, More on war (Oxford: Oxford University Press, 2017).

See Tilly, ‘War making and state making as organized crime’, pp. 170–86.

See Macgregor Knox and Williamson Murray, eds, The dynamics of military revolution 1300–2050 (Cambridge: Cambridge University Press, 2001).

Samuel P. Huntington, ‘Arms races: prerequisites and results’, in Richard K. Betts, ed., Conflict after the Cold War: arguments on causes of war and peace (London: Pearson Longman, 2005), p. 361.

See Mumtaz Keklik, Schumpeter, innovation and growth: long cycle dynamics in the post World War Two American manufacturing industries (Aldershot: Ashgate, 2003); Paul Mason, Postcapitalism: a guide to our future (London: Allen Lane, 2015), p. 33.

Joseph Schumpeter, Capitalism, socialism and democracy (London: Allen & Unwin, 1943), p. 84.

Solly Zuckerman, Scientists and war (London: Hamish Hamilton, 1966), pp. 28–9.

William McNeill, The pursuit of power (Oxford: Blackwell, 1983), pp. 280–87.

Guy Hartcup, The challenge of war: scientific contributions to World War Two (Newton Abbot: David & Charles, 1970), p. 21.

See David Wrigley, ‘The Fabian Society and the South African War, 1899–1902’, South African Historical Journal 10: 1, 1978, pp. 65–78.

Chris Hables Gray, Postmodern war: the new politics of conflict (London: Routledge, 1997), pp. 128–49.

Hables Gray, Postmodern war , p. 22.

For studies that use the term differently, see Mark Duffield, ‘Post modern conflict: warlords, post adjustment states and private protection’, Civil Wars 1: 1, Spring 1998, pp. 65–102; Mary Kaldor, New and old wars (Cambridge: Polity, 1999).

Kalevi Holsti, Peace and war: armed conflicts and international order 1648–1989 (Cambridge: Cambridge University Press, 1991), pp. 270–71.

See Stephen Cimbala, Clausewitz and escalation: classical perspectives on nuclear strategy (Abingdon: Routledge, 2012).

Carl von Clausewitz, On war (Princeton: Princeton University Press, 1976), p. 77.

Clausewitz, On war , p. 87.

Ofer Fridman, Russian ‘hybrid warfare’: resurgence and politicisation (London: Hurst, 2018), p. 91.

For more on the rituals of violence in war, see Christopher Cramer, Civil war is not a stupid thing: accounting for violence in developing countries (London: Hurst, 2006), pp. 1–20.

Barry Buzan, Military technology and international relations (London: Macmillan, 1987), p. 216.

See J. Lyall and I. Wilson, ‘Rage of the machines: explaining outcomes in counterinsurgency wars’, International Organization 63: 1, Winter 2010/11, pp. 67–106.

Warren A. Chin, British weapons acquisition policy and the futility of reform (Aldershot: Ashgate, 2004), pp. 43–69.

Dwight D. Eisenhower, ‘Farewell radio and television address to the American people’, 17 Jan. 1961, https://eisenhower.archives.gov/all_about_ike/speeches/farewell_address.pdf .

Harold Lasswell, Essays on the garrison state , ed. Jay Stanley (New Brunswick, NJ: Transaction, 1997), pp. 77–116.

See Samuel Huntington, The soldier and the state (Cambridge, MA: Harvard University Press, 1985).

See Mary Kaldor, The baroque arsenal (London: Deutsch, 1982).

Stephen Graham, Cities under siege: the new military urbanism (London: Verso, 2010), Kindle edn, loc. 2069, chapter 3: ‘The new military urbanism’, section: ‘Tracking citizen–consumer–soldier’.

Vincent P. Luchsinger and John Van Blois, ‘Spin-offs from military technology: past and future’, Journal of Technology Management 4: 1, 1989, pp. 21–9.

See P. W. Singer, Corporate warriors: the rise of the privatised military industry (Ithaca, NY: Cornell University Press, 2003), p. 38.

Lawrence Freedman, ‘The changing forms of military conflict’, Survival 40: 4, Winter 1998–9, pp. 39–56.

See Alvin Toffler and Heidi Toffler, War and anti-war: survival at the dawn of the 21st century (London: Little, Brown, 1993).

D. L. I. Kirkpatrick, ‘The rising unit cost of defence equipment: the reasons and the results’, Defence and Peace Economics 6: 4, 1995, pp. 263–88.

Martin Shaw, The new western way of war (Cambridge: Polity, 2004), pp. 29–41.

Bobbitt, The shield of Achilles , pp. 301–303.

David Keen, Endless war: hidden functions of the war on terror (London: Pluto, 2006), pp. 51–83.

Paul Rogers, Irregular warfare: ISIS and the new threat from the margins (London: Tauris, 2016), Kindle edn, loc. 2391–6, chapter 6: ‘Irregular war’.

See Grégoire Chamayou, Drone theory (London: Penguin, 2015).

See Robert Kaplan, The revenge of geography: what the map tells us about coming conflicts (New York: Random House, 2012).

Christopher Coker, The future of war (Oxford: Blackwell, 2004).

Lawrence Freedman, The future of war: a history (London: Allen Lane, 2017), p. xviii; Damien van Puyvelde, Stephen Coulthart and M. Shahmir Hossain, ‘Beyond the buzzword: big data and national security decision-making’, International Affairs 93: 6, Nov. 2017, pp. 1397–416.

Elliot Cohen, ‘Change and transformation in military affairs’, Journal of Strategic Studies 27: 3, 2004, p. 396.

Mann, Globalizations, 1945–2011 , p. 432.

UK Ministry of Defence, Global strategic trends—out to 2045 (London: The Stationery Office, 2014).

Colin Gray, Another bloody century (London: Weidenfeld & Nicolson, 2005), p. 39.

Paul K. Davis, Lessons from RAND's work on planning under uncertainty for national security (Santa Monica, CA: RAND, 2012), p. 5.

Klaus Schwab, The fourth Industrial Revolution (London: Penguin Random House, 2017), p. 7.

Max Tegmark, Life 3.0: being human in the age of artificial intelligence (London: Penguin Random House, 2017), Kindle edn, p. 37.

John Thornhill, ‘AI: the new frontier’, Big Picture Podcast, Financial Times , 4 July 2018.

Schwab, Fourth Industrial Revolution , p. 8.

See John Mearsheimer, The tragedy of Great Power politics (London: Norton, 2001).

Robert Latiff, Future war: preparing for the new global battlefield (New York: Knopf, 2017).

Rob Davies, ‘UK unveils new Tempest fighter to replace Typhoon’, Guardian , 16 July 2018.

Bob Work, Deputy Secretary of Defense, ‘Third Offset strategy bolsters America's military deterrence’, US Dept of Defense, 31 Oct. 2016, https://www.defense.gov/News/Article/Article/991434/deputy-secretary-third-offset-strategy-bolsters-americas-military-deterrence/ . (Unless otherwise noted at point of citation, all URLs cited in this article were accessible on 20 May 2019.)

Franz-Stefan Gady, ‘New US defense budget: $18 billion for Third Offset strategy’, The Diplomat , 10 Feb. 2016, https://thediplomat.com/2016/02/new-us-defense-budget-18-billion-for-third-offset-strategy/ .

Kai-Fu Lee, AI super-powers: China, Silicon Valley, and the new world order (New York: Houghton Mifflin Harcourt, 2018), p. 19. See also Evan Feigenbaum, China's techno warriors (Stanford: Stanford University Press, 2003).

Adam Segal, ‘When China rules the Web’, Foreign Affairs 97: 5, Sept.–Oct. 2018, p. 12.

Segal, ‘When China rules the Web’.

Kai-Fu Lee, AI super-powers , p. 4.

Donella Meadows, Dennis L. Meadows, Jørgen Randers and William W. Behrens III, The limits to growth: a report for the Club of Rome's project on the predicament of mankind (New York: Potomac Associates–Universe Books, 1972).

See David Kilcullen, Out of the mountains: the coming age of the urban guerrilla (Oxford: Oxford University Press, 2013).

Graham Turner, Is global collapse imminent? , research paper no. 4 (Melbourne: University of Melbourne, Sustainable Society Institute, Aug. 2014).

Laurie Laybourn-Langton, Lesley Rankin and Darren Baxter, This is a crisis: facing up to the age of environmental breakdown (London: Institute for Public Policy Research, Feb. 2019), p. 5.

Matthew Green, ‘New economics—the way to save the planet?’, Reuters, 8 May 2019, https://uk.reuters.com/article/uk-climatechange-extinction/new-economics-the-way-to-save-the-planet-idUKKCN1SE2CU .

See Andrew McAfee and Erik Brynjolfson, The second machine age: work, progress in times of brilliant technologies (New York: Norton, 2014).

Yuval Noah Harari, Homo deus: a brief history of tomorrow (London: Vintage, 2017), p. 363.

PWC report, Will robots really steal our jobs? How will automation impact on jobs , https://www.pwc.co.uk/economic-services/assets/international-impact-of-automation-feb-2018.pdf .

Larry Elliot, ‘Robots threaten 15m jobs, says Bank of England chief economist’, Guardian , 12 Nov. 2015.

Ted Robert Gurr, Political rebellion: causes, outcomes and alternatives (Abingdon: Routledge, 2015), p. 58.

Daniela Russ, ‘The robots are coming’, Foreign Affairs 94: 3, June–July 2015, pp. 2–6.

Harari, Homo deus , p. 370.

‘The future of everything: how AI is augmenting therapy’, podcast, Wall Street Journal , https://www.wsj.com/podcasts/wsj-the-future-of-everything/how-ai-is-augmenting-therapy/810a7099-0cc3-4e03-8148-dd87c3673152 .

Harari, Homo deus , p. 379.

Kevin Drum, ‘Tech world welcome to the digital revolution’, Foreign Affairs 97: 4, July–Aug. 2018, p. 47.

Kai-Fu Lee, AI super-powers , p. 19.

Tegmark, Life 3.0 , p. 103.

Martin Wolf, ‘Same as it ever was’, Foreign Affairs 94: 4, 2015, p. 18.

Gurr, Political rebellion , p. 59.

Wolf, ‘Same as it ever was’, p. 22.

Gurr, Political rebellion , p. 15.

Gurr, Political rebellion , p. 16.

See Martin van Creveld, The transformation of war (New York: Free Press, 1991).

Qiao Lang and Wang Xiangsui, Unrestricted warfare (Marina Del Rey, CA: Shadow Lawn Press, 2017; first publ. 1999), Kindle edn, p. 5

Qiao Lang and Wang Xiangsui, Unrestricted warfare , p. 48.

P. W. Singer and Emerson T. Brooking, Like war: the weaponization of social media (Boston: Houghton Mifflin Harcourt, 2018), pp. 151–4.

Karen Kornbluh, ‘The internet's lost promise and how America can restore it’, Foreign Affairs 97: 5, Sept.–Oct. 2018, p. 33; Mikael Wigell, ‘Hybrid interference as a wedge strategy: a theory of external interference’, International Affairs 95: 2, March 2019, pp. 255–76; Yevgeniy Golovchenko, Mareike Martmann and Rebecca Adler-Nissen, ‘State, media and civil society in the information warfare over Ukraine’, International Affairs 94: 5, Sept. 2018, pp. 975–94.

Rory Cormac and Richard J. Aldrich, ‘Grey is the new black: covert action and implausible deniability’, International Affairs 94: 3, May 2018, pp. 477–94.

Oliver Solon, ‘Facebook removes 652 fake accounts and pages meant to influence world politics’, Guardian , 22 Aug. 2018.

Moises Naim, The end of power (New York: Basic Books, 2013), Kindle edn, loc. 2579.

Ronald K. Noble, ‘Keeping science in the right hands’, Foreign Affairs 92: 6, Nov.–Dec. 2013, p. 47.

Laurie Garrett, ‘Biology's brave new world: the promise and perils of the syn bio revolution’, Foreign Affairs 92: 6, Nov.–Dec. 2013, pp. 28–46.

Cited in Naim, The end of power , loc. 2571.

David Betz, ‘Cyberpower in strategic affairs’, Journal of Strategic Studies 35: 5, 2012, p. 695.

Dan Tynan, ‘“I wouldn't waste my time”: firearms experts dismiss flimsy 3D-printed guns’, Guardian , 1 Aug. 2018.

Graham, Cities under siege , loc. 2011, chapter 3: ‘The military urbanism’, section: ‘Tracking: citizen–consumer–soldier’.

Graham, Cities under siege , loc. 2099, chapter 3: ‘The military urbanism’, section: ‘Tracking: citizen–consumer–soldier’.

Author notes

Email alerts, citing articles via.

  • Recommend to Your Librarian
  • Advertising and Corporate Services

Affiliations

  • Online ISSN 1468-2346
  • Print ISSN 0020-5850
  • Copyright © 2024 The Royal Institute of International Affairs
  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Institutional account management
  • Rights and permissions
  • Get help with access
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.

Information technologies of 21st century and their impact on the society

Mohammad Yamin (ORCID: orcid.org/0000-0002-3778-3366)

Original research, published 16 August 2019; International Journal of Information Technology, volume 11, pages 759–766 (2019)


The twenty-first century has witnessed the emergence of some ground-breaking information technologies that have revolutionised our way of life. The revolution began late in the 20th century with the arrival of the internet in 1995, which has given rise to methods, tools and gadgets with astonishing applications in all academic disciplines and business sectors. In this article we provide a design for a 'spider robot' which may be used for efficient cleaning of deadly viruses. In addition, we examine some of the emerging technologies that are producing remarkable breakthroughs and improvements which were inconceivable earlier. In particular, we look at the technologies and tools associated with the Internet of Things (IoT), Blockchain, Artificial Intelligence, sensor networks and social media, and we analyse the capabilities and business value of these technologies and tools. As we recognise, most technologies, after completing their commercial journey, are utilised by the business world in physical as well as virtual marketing environments. We also look at the social impact of some of these technologies and tools.


1 Introduction

The internet, which started in 1989 [ 1 ], now holds 1.2 million terabytes of data from Google, Amazon, Microsoft and Facebook alone [ 2 ]. It is estimated that the internet contains over four and a half billion websites on the surface web; the deep web, about which we know very little, is at least four hundred times bigger than the surface web [ 3 ]. Email platforms emerged soon afterwards, in 1990, followed by many other applications. From 1995 to the early 21st century we then saw a chain of web 2.0 technologies: e-commerce, social media platforms, e-business, e-learning, e-government, cloud computing and more [ 4 ]. Now we have a large number of internet-based technologies with countless applications in many domains, including business, science and engineering, and healthcare [ 5 ]. The impact of these technologies on our personal lives is such that we are compelled to adopt many of them whether we like it or not.

In this article we shall study the nature, usage and capabilities of emerging and future technologies. These include Big Data analytics, the Internet of Things (IoT), sensor networks (RFID and location-based services), Artificial Intelligence (AI), robotics, Blockchain, mobile digital platforms (digital streets, towns and villages), Cloud computing (including Fog and Dew computing), social networks and business, and virtual reality.

2 Big data

With ever-increasing computing power and declining costs of data storage, many government and private organizations are gathering enormous amounts of data. The data accumulated over years of acquisition and processing in many organizations has become so large that it can no longer be analyzed by traditional tools within a reasonable time. Familiar disciplines that create Big data include astronomy, atmospheric science, biology, genomics, nuclear physics, biochemical experiments, medical records, and scientific research. Organizations responsible for creating enormous data include Google, Facebook, YouTube, hospitals, proceedings of parliaments, courts, newspapers and magazines, and government departments. Because of its size, analysis of Big data is not a straightforward task and often requires advanced methods and techniques. Lack of timely analysis of Big data in certain domains may have devastating results and pose threats to societies, nature and the ecosystem.

2.1 Big medic data

The healthcare field is generating Big data and has the potential to surpass other fields when it comes to the growth of data. Big medic data usually refers to a considerably bigger pool of health, hospital and treatment records, administrative medical claims, and data from clinical trials, smartphone applications, wearable devices such as RFID and heart-rate readers, different kinds of social media, and omics research. In particular, omics research (genomics, proteomics, metabolomics, etc.) is leading the charge in the growth of Big data [ 6 , 7 ]. The data analytics requirements in omics research include data cleaning, normalization, biomolecule identification, data dimensionality reduction, biological contextualization, statistical validation, data storage and handling, sharing, and data archiving. These tasks are required for the Big data in omics datasets such as genomics, transcriptomics, proteomics, metabolomics, metagenomics and phenomics [ 6 ].
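Several of these preprocessing steps can be made concrete with standard tooling. The following is a minimal sketch (our illustration, not from the article), assuming scikit-learn and NumPy are installed; the synthetic 'expression' matrix stands in for an omics dataset with many features and few samples, and all sizes are invented.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Illustrative stand-in for an omics matrix: 50 samples x 2,000 biomolecule features.
rng = np.random.default_rng(42)
expression = rng.lognormal(mean=2.0, sigma=1.0, size=(50, 2000))

# Normalization: log-transform, then zero mean and unit variance per feature.
scaled = StandardScaler().fit_transform(np.log1p(expression))

# Dimensionality reduction: keep components explaining 90% of the variance.
pca = PCA(n_components=0.9, svd_solver="full")
reduced = pca.fit_transform(scaled)

print(f"reduced from {expression.shape[1]} features to {reduced.shape[1]} components")
```

In real omics pipelines this normalize-then-reduce pattern is only the front end; biological contextualization and statistical validation of the reduced data follow.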

According to [ 8 ], in 2011 alone the data in the United States healthcare system amounted to one hundred and fifty exabytes (one exabyte = one billion gigabytes, or 10^18 bytes), and it is expected to reach 10^21 bytes and, later, 10^24 bytes. Some scientists have classified medical data into three categories: (a) a large number of samples but a small number of parameters; (b) a small number of samples and a small number of parameters; and (c) a large number of samples and a large number of parameters [ 9 ]. Although the data in the first category may be analyzed by classical methods, it may be incomplete, noisy and inconsistent, requiring data cleaning. The data in the third category can be big and may require advanced analytics.

2.2 Big data analytics

Big data cannot be analyzed in real time by traditional analytical methods. The analysis of Big data, popularly known as Big Data analytics, often involves a number of technologies, sophisticated processes and tools, as depicted in Fig. 1. Big data can provide smart decision making and business intelligence to businesses and corporations; unless analyzed, it is impractical and a burden to the organization. Big Data analytics involves mining and extracting useful associations (knowledge discovery) for intelligent decision making and forecasts. The challenges of Big Data analytics are computational complexity, scalability and visualization of data. Moreover, information security risk increases with the surge in the amount of data, which is precisely the case with Big Data.

Fig. 1: Big Data Analytics (figure omitted).

The aim of data analytics has always been knowledge discovery to support smart and timely decision making. With Big data, the knowledge base becomes wider and sharper, providing greater business intelligence and helping businesses lead their markets. Conventional processing paradigms and architectures are inefficient for the large datasets of Big data: their size demands parallel processing. Recent technologies such as Spark, Hadoop, MapReduce, R, data lakes and NoSQL have emerged to provide Big Data analytics. Alongside these analytics technologies, it is advantageous to invest in designing superior storage systems.
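To make the MapReduce pattern named above concrete, here is a minimal sketch using only Python's standard library; it illustrates the programming model rather than Hadoop or Spark themselves, and the sample chunks are invented stand-ins for blocks of a large file.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_chunk(chunk: str) -> Counter:
    # Map phase: emit a count of each word in one chunk of the dataset.
    return Counter(chunk.lower().split())

def reduce_counts(a: Counter, b: Counter) -> Counter:
    # Reduce phase: merge partial counts produced by the mappers.
    return a + b

if __name__ == "__main__":
    chunks = [
        "big data needs parallel processing",
        "fog computing complements big data",
        "data data everywhere",
    ]
    with Pool() as pool:                      # mappers run in parallel
        partials = pool.map(map_chunk, chunks)
    totals = reduce(reduce_counts, partials, Counter())
    print(totals.most_common(3))
```

Frameworks such as Hadoop and Spark apply the same map/shuffle/reduce idea across clusters of machines, with fault tolerance and distributed storage added on top.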

Health data predominantly consists of images, graphs, audio and video. Analysing such data to gain meaningful insights and diagnoses may depend on the choice of tools. Medical data has traditionally been scattered across organizations and often not organized properly; what we usually find are record-keeping systems of heterogeneous data, requiring extra effort to reorganize the data onto a common platform. As discussed above, the health profession produces enormous data, so analysing it in an efficient and timely manner can potentially save many lives.

3 Cloud computing

Commercial cloud platforms began operating in 1999 [ 10 ]. Initially, clouds complemented and empowered outsourcing. At earlier stages there were privacy concerns associated with Cloud computing, as the owners of data had to hand custody of their data to the cloud providers. However, as time passed and cloud providers took confidence-building measures, the technology became so prevalent that most of the world's SMEs now use it in one form or another. More information on Cloud computing can be found in [ 11 , 12 ].

3.1 Fog computing

As faster processing became necessary for some critical applications, clouds gave rise to Fog, or Edge, computing. As can be seen in the Gartner hype cycles in Figs. 2 and 3, Edge computing, as an emerging technology, peaked in 2017–18. As shown in the Cloud computing architecture of Fig. 4, the middle (second) layer of the cloud configuration is represented by Fog computing. For some applications, the communication delay between computing devices in the field and data in a cloud (often physically thousands of miles apart) violates the time requirements, introducing considerable latency into time-sensitive applications. For example, processing and storage for early warning of disasters (stampedes, tsunamis, etc.) must happen in real time. For such applications, computing and storage resources should be placed closer to where the computing is needed (application areas like the digital street); in these scenarios Fog computing is considered suitable [ 13 ]. Clouds are an integral part of many IoT applications and play a central role in ubiquitous computing systems in health-related cases like the one depicted in Fig. 5. Some applications of Fog computing can be found in [ 14 , 15 , 16 ]; more results on Fog computing are available in [ 17 , 18 , 19 ].

Fig. 2: Emerging Technologies 2018 (figure omitted).

Fig. 3: Emerging Technologies 2017 (figure omitted).

Fig. 4: Relationship of Cloud, Fog and Dew computing (figure omitted).

Fig. 5: Snapshot of a ubiquitous system (figure omitted).

3.2 Dew computing

When Fog is overloaded and cannot cater for peaks of high-demand applications, it offloads some of its data and/or processing to the associated cloud. In such situations, Fog exposes its dependency on a complementary bottom layer of the cloud architecture, shown in Fig. 4 and known as the Dew layer. The purpose of the Dew layer is to serve tasks by exploiting resources near the end user, with minimal internet access [ 17 , 20 ]. Dew computing determines when its services should be used and how they link with the different layers of the cloud architecture. It is also worth noting that Dew computing [ 20 ] belongs to the distributed computing hierarchy and is integrated with Fog computing services, as is evident in Fig. 4. In summary, the Cloud architecture has three layers: Cloud, Fog and Dew.
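To make the three-layer division concrete, here is a minimal sketch (our illustration, not from the article) of how a time-sensitive application might pick a tier: requests with the tightest deadlines go to the local Dew layer, moderate deadlines to a nearby Fog node, and the rest to the distant Cloud. The latency figures and names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: float  # assumed typical latency to reach this tier

# Illustrative latencies: Dew sits on/near the device, Fog in the
# neighbourhood, Cloud in a remote data centre.
TIERS = [Tier("dew", 2.0), Tier("fog", 20.0), Tier("cloud", 150.0)]

def place(task_deadline_ms: float) -> str:
    """Pick the farthest (most capable) tier that still meets the deadline."""
    feasible = [t for t in TIERS if t.round_trip_ms <= task_deadline_ms]
    if not feasible:
        raise RuntimeError("no tier can meet this deadline")
    return max(feasible, key=lambda t: t.round_trip_ms).name

print(place(5))     # -> dew   (e.g. an early-warning trigger)
print(place(50))    # -> fog   (e.g. local sensor aggregation)
print(place(1000))  # -> cloud (e.g. batch analytics)
```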

4 Internet of things

The definition of the Internet of Things (IoT), depicted in Fig. 6, has been changing with the passage of time. With the growing number of internet-based applications using many technologies, devices and tools, the meaning of IoT has evolved accordingly: things (technologies, devices and tools) are used together in internet-based applications to generate data and to provide assistance and services to users from anywhere, at any time. The internet can be considered a uniform technology from any location, as it provides the same service of 'connectivity'; its speed and security, however, are not uniform. As an emerging technology, the IoT peaked during 2017–18, as is evident from Figs. 2 and 3, and it is expanding at a very fast rate. According to [ 21 , 22 , 23 , 24 ], the number of IoT devices could be in the millions by the year 2021.

Fig. 6: Internet of Things (figure omitted).

IoT is enabling some amazing applications in tandem with wearable devices, sensor networks, Fog computing and other technologies, improving critical facets of our lives such as healthcare management, service delivery and business processes. Applications of IoT in crowd management are discussed in [ 14 ]; applications of IoT in the context of privacy and security are discussed in [ 15 , 16 ]. Key devices and technologies associated with the IoT include RFID tags [ 25 ], the internet, computers, cameras, mobile devices, coloured lights, sensors, sensor networks, drones, and Cloud, Fog and Dew computing.
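As a toy illustration of the sensing-and-aggregating loop such deployments share, the sketch below (ours, not the article's) simulates temperature sensors reporting to an edge aggregator that flags readings above a threshold; all names, values and thresholds are invented.

```python
import random
import statistics

class Sensor:
    """A simulated IoT temperature sensor."""
    def __init__(self, sensor_id: str):
        self.sensor_id = sensor_id

    def read(self) -> float:
        return random.gauss(37.0, 2.5)  # degrees Celsius, synthetic

def edge_aggregate(readings: dict, alert_above: float = 40.0) -> dict:
    # A Fog/edge node summarizes raw readings and flags anomalies locally,
    # so only a compact summary needs to travel on to the distant cloud.
    mean = statistics.mean(readings.values())
    alerts = [sid for sid, v in readings.items() if v > alert_above]
    return {"mean": round(mean, 2), "alerts": alerts}

sensors = [Sensor(f"ward-{i}") for i in range(5)]
readings = {s.sensor_id: s.read() for s in sensors}
print(edge_aggregate(readings))
```

Summarizing at the edge in this way is exactly why Fog computing pairs naturally with IoT: raw sensor streams stay local, and only decisions or digests cross the wide-area network.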

5 Applications of blockchain

Blockchain is usually associated with cryptocurrencies such as Bitcoin (currently there are over one and a half thousand cryptocurrencies, and the number is still rising), but the technology can also be used for many more critical applications in our daily lives. A Blockchain is a distributed ledger: a distributed transactional database secured by cryptography and governed by a consensus mechanism. It is essentially a record of digital events [ 26 ]. A block represents a completed transaction, or ledger entry; subsequent and prior blocks are chained together, displaying the status of the most recent transaction. The chain links records in chronological order and continues to grow as further transactions take place, each recorded by adding a new block. User security and ledger consistency are provided by asymmetric cryptography and distributed consensus algorithms. Once a block is created, it cannot be altered or removed. The technology eliminates the need for a bank statement to verify the availability of funds, or for a lawyer to certify the occurrence of an event. The benefits of Blockchain technology are inherent in its characteristics of decentralization, persistency, anonymity and auditability [ 27 , 28 ].
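The chaining and tamper-evidence just described can be shown in miniature. Below is a minimal sketch of a hash-linked chain (our illustration, deliberately omitting consensus, signatures and distribution): each block stores the hash of its predecessor, so altering any earlier block invalidates every later one.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's canonical JSON form (sorted keys for determinism).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def verify(chain: list) -> bool:
    # Valid iff every block records its predecessor's true hash.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
add_block(chain, "genesis")
add_block(chain, "alice pays bob 5 coins")
add_block(chain, "bob pays carol 2 coins")
print(verify(chain))                            # True
chain[1]["data"] = "alice pays bob 500 coins"   # tamper with history
print(verify(chain))                            # False: the next block's link breaks
```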

5.1 Blockchain for business use

Blockchain, as the technology behind cryptocurrencies, started as an open-source Bitcoin community effort to allow reliable peer-to-peer financial transactions. Blockchain technology has made it possible to build a globally functional currency relying on code, without any bank or third-party platform [ 28 ]. These features make Blockchain secure and transparent for business transactions of any kind, in any currency. The literature records many applications of Blockchain; nowadays they involve various kinds of transactions requiring verification, and automated payment systems using smart contracts. The concept of smart contracts [ 28 ] has virtually eliminated the role of intermediaries. The technology is most suitable for businesses requiring high reliability and honesty, and because of its security and transparency it would benefit businesses trying to attract customers. Blockchain can also be used to eliminate fake permits, as can be seen in [ 29 ].

5.2 Blockchain for healthcare management

As discussed above, Blockchain offers an efficient and transparent way of digital record keeping, a feature highly desirable in healthcare management. The medical field is still struggling to manage its data efficiently in digital form. The familiar issues of disparate and non-uniform record storage are hampering digitization, data warehousing and Big data analytics, which would otherwise allow efficient management and sharing of data. The magnitude of the problem can be seen in examples such as the UK National Health Service (NHS) target of digitizing UK healthcare by 2023 [ 30 ]. These problems lead to data inaccuracies, which can cause many issues in healthcare management, including clinical and administrative errors.

Use of Blockchain in healthcare can bring revolutionary improvements. For example, smart contracts can make it easier for doctors to access patients' data held by other organisations. The current consent process is often bureaucratic and far from simplified or standardised, creating problems for patients and the specialists treating them. The cost of transferring medical records between locations can be significant; with Blockchain it can be reduced to virtually zero. More information on the use of Blockchain for healthcare data can be found in [ 30 , 31 ].
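To illustrate the consent idea, here is a sketch of our own, simulated in ordinary Python rather than on a real smart-contract platform: a consent record expressed as code that grants, checks and revokes access, with every action appended to a hash-linked audit log like the chain sketched above. All identifiers are hypothetical.

```python
import hashlib
import time

class ConsentContract:
    """Toy stand-in for a smart contract managing record-access consent."""
    def __init__(self, patient_id: str):
        self.patient_id = patient_id
        self.granted: set = set()
        self.log: list = []

    def _record(self, event: str) -> None:
        # Append a log entry that chains to the previous entry's digest.
        prev = self.log[-1].split("|")[0] if self.log else "0" * 64
        entry = f"{event} at {time.time():.0f} prev={prev}"
        digest = hashlib.sha256(entry.encode()).hexdigest()
        self.log.append(f"{digest}|{entry}")

    def grant(self, doctor_id: str) -> None:
        self.granted.add(doctor_id)
        self._record(f"grant {doctor_id}")

    def revoke(self, doctor_id: str) -> None:
        self.granted.discard(doctor_id)
        self._record(f"revoke {doctor_id}")

    def may_access(self, doctor_id: str) -> bool:
        self._record(f"access check {doctor_id}")
        return doctor_id in self.granted

consent = ConsentContract("patient-001")
consent.grant("dr-lee")
print(consent.may_access("dr-lee"))   # True
consent.revoke("dr-lee")
print(consent.may_access("dr-lee"))   # False, and every step is logged
```

On an actual Blockchain platform the contract state and log would live on the distributed ledger, so no single hospital could silently rewrite who consented to what.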

6 Environment cleaning robot

One ongoing healthcare issue is the eradication of deadly viruses and bacteria from hospitals and healthcare units. Nosocomial infections are a common problem for hospitals and are currently treated using various techniques [ 32 , 33 ]. Historically, cleaning hospital wards and operating rooms with chlorine has been effective, but in the face of deadly viruses like Ebola, HIV/AIDS, swine influenza (H1N1, H1N2), various strands of flu, Severe Acute Respiratory Syndrome (SARS) and Middle East Respiratory Syndrome (MERS), this method has dangerous implications [ 14 ]. A more advanced approach used in US hospitals employs "robots" to purify the space, as can be seen in [ 32 , 33 ]. However, current "robots" have limitations. Most require a human to place them in the infected area; they cannot move effectively (they just revolve around themselves), so the UV light reaches only a very limited area within the range of the emitter; and the robot itself may become infected, as the light does not reach most of the robot's own surfaces. There is therefore an emerging need for a robot that does not require the physical presence of a human handler and that can purify an entire room, covering all surfaces with UV light while not becoming infected itself.

Figure 7 gives an overview of the design of a fully motorized spider robot with six legs. The robot supports Wi-Fi connectivity for control and is able to move around the room and clean the entire area. The spider design allows it to move on any surface, including climbing steps; most importantly, the robot uses its legs to move the UV light emitter and to clean its own body before leaving the room. This substantially reduces the risk of the robot transmitting any infection.

Fig. 7: Spider robot for virus cleaning (figure omitted).

Additionally, the robot will be equipped with a motorized camera allowing the operator to monitor the space and stop UV emission in unpredicted situations. The operator can control the robot via a networked graphical user interface and/or from an augmented reality environment utilizing technologies such as Oculus Touch. In more detail, the user will wear the Oculus Rift virtual reality helmet and use the Oculus Touch hand controllers to remote-control the robot. This gives the user the robot's view in a natural manner, and allows the user to control the robot's two front arms via the Touch controllers, making advanced movements as easy as moving one's hands. The physical movements of the human hand are captured by the Oculus Touch sensors and transmitted to the robot, which then uses inverse kinematics to translate the position and actions of the human hand into movements of the robotic arm. The same technique will be used during the robot's training phase, where the human user teaches the robot how to clean various surfaces and then purify itself, simply by moving their hands accordingly. The design of the spider robot was proposed in a project proposal submitted to the King Abdulaziz City of Science and Technology ( https://www.kacst.edu.sa/eng/Pages/default.aspx ) by the author and George Tsaramirsis ( https://www.researchgate.net/profile/George_Tsaramirsis ).
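Inverse kinematics maps a desired end-effector position back to joint angles. As a minimal sketch of the idea (our illustration, using a planar two-link arm rather than the actual six-legged design), the closed-form solution below computes the two joint angles that place the arm's tip at a target point; the link lengths are arbitrary.

```python
import math

def two_link_ik(x: float, y: float, l1: float, l2: float):
    """Return joint angles (radians) placing a planar 2-link arm's tip at (x, y)."""
    d2 = x * x + y * y
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.atan2(math.sqrt(1 - cos_t2 ** 2), cos_t2)   # elbow-down solution
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

# Check: forward kinematics should reproduce the target point.
t1, t2 = two_link_ik(1.0, 1.0, l1=1.0, l2=1.0)
fx = math.cos(t1) + math.cos(t1 + t2)
fy = math.sin(t1) + math.sin(t1 + t2)
print(round(fx, 6), round(fy, 6))  # ~ (1.0, 1.0)
```

A real controller would solve this for each leg and arm joint in three dimensions, at every control tick, as the tracked hand position changes.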

7 Conclusions

We have presented details of some emerging technologies and real-life applications that are providing businesses with remarkable opportunities, previously unthinkable. Businesses are continuously trying to increase their use of new technologies and tools to improve processes and benefit their clients. The IoT and associated technologies can now provide real-time, ubiquitous processing that eliminates the need for human surveillance. Similarly, Virtual Reality, Artificial Intelligence and robotics are finding remarkable applications in medical surgery. As discussed, with the help of technology we can now predict and mitigate some disasters, such as stampedes, using sensor networks and other associated technologies. Finally, the rise of Big Data analytics is providing businesses and government agencies with smarter decision making to achieve their targets and expectations.

Naughton John (2016) The evolution of the internet: from military experiment to general purpose technology. J Cyber Policy 1(1):5–28. https://doi.org/10.1080/23738871.2016.1157619


Gareth Mitchell (2019). How much data is on the internet? Science Focus (The Home of BBC Science Focus Magazine). [Online]. https://www.sciencefocus.com/future-technology/how-much-data-is-on-the-internet/ . Accessed 20 April 2019

Mae Rice (2018). The deep web is the 99% of the internet you can’t google. Curiosity. [Online]. https://curiosity.com/topics/the-deep-web-is-the-99-of-the-internet-you-cant-google-curiosity/ . Accessed 20 April 2019

Slumkoski C (2012) History on the internet 2.0: the rise of social media. Acadiensis 41(2):153–162 (Summer/Autumn-ÉTÉ/Automne)


Ibarra-Esquer JE, González-Navarro FF, Flores-Rios BL, Burtseva L, María A, Astorga-Vargas M (2017) Tracking the evolution of the internet of things concept across different application domains. Sensors (Basel) 17(6):1379. https://doi.org/10.3390/s17061379

Misra Biswapriya B, Langefeld Carl, Olivier Michael, Cox Laura A (2018) Integrated omics: tools, advances and future approaches. J Mol Endocrinol 62(1):R21–R45. https://doi.org/10.1530/JME-18-0055

Lee Choong Ho, Yoon Hyung-Jin (2017) Medical big data: promise and challenges. Kidney Res Clin Practice 36(1):3–13. https://doi.org/10.23876/j.krcp.2017.36.1.3

Faggella D (2019) Where healthcare's big data actually comes from. Emerj. [Online]. https://emerj.com/ai-sector-overviews/where-healthcares-big-data-actually-comes-from/

Sinha A, Hripcsak G, Markatou M (2009) Large datasets in biomedicine: a discussion of salient analytic issues. J Am Med Inform Assoc 16:759–767. https://doi.org/10.1197/jamia.M2780

Foote KD (2017) A brief history of cloud computing. DATAVERSITY. [Online]. Accessed 5 May 2019

Vassakis K, Petrakis E, Kopanakis I (2018) Big data analytics: applications, prospects and challenges. In: Skourletopoulos G, Mastorakis G, Mavromoustakis C, Dobre C, Pallis E (eds) Mobile big data. Lecture notes on data engineering and communications technologies, vol 10. Springer, Cham

Yamin M, Al Makrami AA (2015) Cloud computing in SMEs: case of Saudi Arabia. BIJIT—BVICAM’s Int J Inform Technol 7(1):853–860

Ahmed E, Ahmed A, Yaqoob I, Shuja J, Gani A, Imran M, Shoaib M (2017) Bringing computation closer toward the user network: is edge computing the solution? IEEE Commun Mag 55:138–144

Yamin M, Basahel AM, Abi Sen AA (2018) Managing crowds with wireless and mobile technologies. Hindawi. Wireless Commun Mobile Comput. Volume 2018, Article ID 7361597, pp 15. https://doi.org/10.1155/2018/7361597

Yamin M, Abi Sen AA (2018) Improving privacy and security of user data in location based services. Int J Ambient Comput Intell 9(1):19–42. https://doi.org/10.4018/IJACI.2018010102

Sen AAA, Eassa FA, Jambi K, Yamin M (2018) Preserving privacy in internet of things—a survey. Int J Inform Technol 10(2):189–200. https://doi.org/10.1007/s41870-018-0113-4

Longo Mathias, Hirsch Matías, Mateos Cristian, Zunino Alejandro (2019) Towards integrating mobile devices into dew computing: a model for hour-wise prediction of energy availability. Information 10(3):86. https://doi.org/10.3390/info10030086

Nunna S, Kousaridas A, Ibrahim M, Dillinger M, Thuemmler C, Feussner H, Schneider A (2015) Enabling real-time context-aware collaboration through 5G and mobile edge computing. In: Proceedings of the 12th international conference on information technology—new generations, Las Vegas, NV, USA, 13–15 April 2015, pp 601–605

Vaquero LM, Rodero-Merino L (2014) Finding your way in the fog: towards a comprehensive definition of fog computing. SIGCOMM Comput Commun Rev 44:27–32

Ray PP (2019) Minimizing dependency on internetwork: Is dew computing a solution? Trans Emerg Telecommun Technol 30:e3496

Bonomi F, Milito R, Zhu J, Addepalli S (2012) Fog computing and its role in the internet of things. In: Proceedings of the first edition of the workshop on mobile cloud computing, Helsinki, Finland, 17 August 2012, pp 13–16

Jia X, Feng Q, Fan T, Lei Q (2012) RFID technology and its applications in internet of things (IoT). In: Proceedings of the 2nd international conference on consumer electronics, communications and networks (CECNet), pp 1282–1285. IEEE. https://doi.org/10.1109/cecnet.2012.6201508

Said O, Masud M (2013) Towards internet of things: survey and future vision. Int J Comput Netw 5(1):1–17

Gubbi J, Buyya R, Marusic S, Palaniswami M (2013) Internet of things (IoT): a vision, architectural elements, and future directions. Future Gener Comput Syst 29(7):1645–1660. https://doi.org/10.1016/j.future.2013.01.010

Beck R, Avital M, Rossi M et al (2017) Blockchain technology in business and information systems research. Bus Inf Syst Eng 59:381. https://doi.org/10.1007/s12599-017-0505-1

Yamin M (2018) Managing crowds with technology: cases of Hajj and Kumbh Mela. Int J Inform Technol. https://doi.org/10.1007/s41870-018-0266-1

Zheng Z, Xie S, Dai H, Chen X, Wang H. An overview of blockchain technology: architecture, consensus, and future trends. https://www.researchgate.net/publication/318131748_An_Overview_of_Blockchain_Technology_Architecture_Consensus_and_Future_Trends . Accessed 1 May 2019

Al-Saqafa W, Seidler N (2017) Blockchain technology for social impact: opportunities and challenges ahead. J Cyber Secur Policy. https://doi.org/10.1080/23738871.2017.1400084

Alotaibi M, Alsaigh M, Yamin M (2019) Blockchain for controlling Hajj and Umrah permits. Int J Comput Sci Netw Secur 19(4):69–77

Vazirani AA, O’Donoghue O, Brindley D (2019) Implementing blockchains for efficient health care: systematic review. J Med Internet Res 21(2):12439. https://doi.org/10.2196/12439

Yamin M (2018) IT applications in healthcare management: a survey. Int J Inform Technol 10(4):503–509. https://doi.org/10.1007/s41870-018-0203-3

Begić A (2018) Application of service robots for disinfection in medical institutions. In: Hadžikadić M, Avdaković S (eds) Advanced technologies, systems, and applications II (IAT 2017). Lecture notes in networks and systems, vol 28. Springer, Cham

Mettler T, Sprenger M, Winter R (2017) Service robots in hospitals: new perspectives on niche evolution and technology affordances. Euro J Inform Syst. 10:11. https://doi.org/10.1057/s41303-017-0046-1


Author information

Mohammad Yamin, Department of MIS, Faculty of Economics and Administration, King Abdulaziz University, Jeddah, Saudi Arabia. Correspondence to Mohammad Yamin.


About this article

Yamin, M. Information technologies of 21st century and their impact on the society. Int J Inf Technol 11, 759–766 (2019). https://doi.org/10.1007/s41870-019-00355-1

Received: 05 May 2019; Accepted: 09 August 2019; Published: 16 August 2019; Issue date: December 2019.

Keywords: Emerging and future technologies; Internet of things; Sensor networks; Location based services; Mobile digital platforms.

Essay on Science and Technology for Students and Children

500+ Words Essay on Science and Technology

Essay on Science and Technology: Science and technology are important parts of our day-to-day life. We get up in the morning to the ringing of our alarm clocks and go to bed at night after switching our lights off. All the luxuries we are able to afford are a result of science and technology. Most importantly, we can do all this in so little time only because of the advancement of science and technology. It is hard to imagine our life now without science and technology; indeed, our existence itself depends on it. Every day new technologies come up that make human life easier and more comfortable. Thus, we live in an era of science and technology.

Essentially, science and technology have introduced us to the establishment of modern civilization. This development contributes greatly to almost every aspect of our daily life; hence, people get the chance to enjoy results that make our lives more relaxed and pleasurable.


Benefits of Science and Technology

If we think about it, there are numerous benefits of science and technology, ranging from the little things to the big ones. For instance, the morning paper that delivers us reliable news is a result of scientific progress. In addition, the electrical devices without which life is hard to imagine, such as the refrigerator, air conditioner and microwave, are the result of technological advancement.

Furthermore, if we look at transport, we notice how science and technology play a major role here as well: we can quickly reach the other side of the earth within hours, all thanks to advancing technology.

In addition, science and technology have enabled man to look beyond our planet. The discovery of new planets and the placing of satellites in space are the work of that very same science and technology. Similarly, science and technology have made an impact on medicine and agriculture: the cures discovered for diseases have saved millions of lives, and technology has enhanced the production of crops, greatly benefiting farmers.


India and Science and Technology

Ever since British rule, India has been in the world's eye. After gaining independence, it is science and technology that helped India advance through the times. Now, India has become an essential source of creative and foundational scientific developments all over the world. In other words, the incredible scientific and technological advancements of the country have enhanced the Indian economy.


Looking at the most recent achievements, India successfully launched Chandrayaan 2. This lunar exploration mission has earned critical acclaim from all over the world. Once again, the achievement was made possible by science and technology.

In conclusion, we must admit that science and technology have led human civilization to achieve perfection in living. However, we must utilize everything wisely and within limits: misuse of science and technology can produce harmful consequences. Therefore, we must monitor our use of technology and be wise in our actions.



Internet & Technology

6 facts about Americans and TikTok

62% of U.S. adults under 30 say they use TikTok, compared with 39% of those ages 30 to 49, 24% of those 50 to 64, and 10% of those 65 and older.

Many Americans think generative AI programs should credit the sources they rely on

Americans' use of ChatGPT is ticking up, but few trust its election information.

WhatsApp and Facebook dominate the social media landscape in middle-income nations.

Electric Vehicle Charging Infrastructure in the U.S.

64% of Americans live within 2 miles of a public electric vehicle charging station, and those who live closest to chargers view EVs more positively.

When Online Content Disappears

A quarter of all webpages that existed at one point between 2013 and 2023 are no longer accessible.

A quarter of U.S. teachers say AI tools do more harm than good in K-12 education

High school teachers are more likely than elementary and middle school teachers to hold negative views about AI tools in education.

Teens and Video Games Today

85% of U.S. teens say they play video games. They see both positive and negative sides, from making friends to harassment and sleep loss.

Americans’ Views of Technology Companies

Most Americans are wary of social media’s role in politics and its overall impact on the country, and these concerns are ticking up among Democrats. Still, Republicans stand out on several measures, with a majority believing major technology companies are biased toward liberals.

22% of Americans say they interact with artificial intelligence almost constantly or several times a day. 27% say they do this about once a day or several times a week.

About one-in-five U.S. adults have used ChatGPT to learn something new (17%) or for entertainment (17%).

Across eight countries surveyed in Latin America, Africa and South Asia, a median of 73% of adults say they use WhatsApp and 62% say they use Facebook.

5 facts about Americans and sports

About half of Americans (48%) say they took part in organized, competitive sports in high school or college.


The State of Online Harassment

Roughly four-in-ten Americans have experienced online harassment, with half of this group citing politics as the reason they think they were targeted. Growing shares face more severe online abuse such as sexual harassment or stalking

Parenting Children in the Age of Screens

Two-thirds of parents in the U.S. say parenting is harder today than it was 20 years ago, with many citing technologies – like social media or smartphones – as a reason.

Dating and Relationships in the Digital Age

From distractions to jealousy, how Americans navigate cellphones and social media in their romantic relationships.

Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information

Majorities of U.S. adults believe their personal data is less secure now, that data collection poses more risks than benefits, and that it is not possible to go through daily life without being tracked.

Americans and 'Cancel Culture': Where Some See Calls for Accountability, Others See Censorship, Punishment

Social Media Fact Sheet

Digital Knowledge Quiz

Video: How Do Americans Define Online Harassment?



COMMENTS

  1. How Is Technology Changing the World, and How Should the World Change

    A crucial part of understanding how technology has created global change and, in turn, how global changes have influenced the development of new technologies is understanding the technologies themselves in all their richness and complexity—how they work, the limits of what they can do, what they were designed to do, how they are actually used.

  2. Here's how technology has changed the world since 2000

    Since the dotcom bubble burst back in 2000, technology has radically transformed our societies and our daily lives. From smartphones to social media and healthcare, here's a brief history of the 21st century's technological revolution. Just over 20 years ago, the dotcom bubble burst, causing the stocks of many tech firms to tumble.

  3. PDF Science, technology and innovation in a 21st century context

Science, technology and innovation in a 21st century context. John H. Marburger III, Springer Science+Business Media, LLC, 2011. This editorial essay was prepared by John H. 'Jack' Marburger for a workshop on the 'science of science and innovation policy' held in 2009 that was the basis for this special issue.

  4. Role of Science and Technology in 21st Century

Science and technology are not the problems; rather, how we decide to use technology determines the consequences. For instance, the invention and development of vehicles were not the problems ...

  5. Science, technology and innovation in a 21st century context

    Science, technology and innovation in a 21st century context. This editorial essay was prepared by John H. "Jack" Marburger for a workshop on the "science of science and innovation policy" held in 2009 that was the basis for this special issue. It is published posthumously. Linking the words "science," "technology," and ...

  6. History of technology

The 20th and 21st centuries: technology from 1900 to 1945. Recent history is notoriously difficult to write, because of the mass of material and the problem of distinguishing the significant from the insignificant among events that have virtually the power of contemporary experience. In respect to the recent history of technology, however, one fact stands out clearly: despite the immense ...

  7. Role of Science and Technology in the 21st Century

Role of Science and Technology in the 21st Century — Essay Competition Winner, Shreya Shrestha ... we can say that science and technology are key drivers of development, because technological and ...

  8. Technology over the long run: zoom out to see how dramatically the

    You can use this visualization to see how technology developed in particular domains. Follow, for example, the history of communication: from writing to paper, to the printing press, to the telegraph, the telephone, the radio, all the way to the Internet and smartphones. Or follow the rapid development of human flight.

  9. Science and technology on fast forward

    Science and technology feed off of one another, propelling both forward. Scientific knowledge allows us to build new technologies, which often allow us to make new observations about the world, which, in turn, allow us to build even more scientific knowledge, which then inspires another technology … and so on. As an example, we'll start with a single scientific idea and trace its ...

  10. The Role of Science and Technology in the Developing World in the 21st

The Role of Science and Technology in the Developing World in the 21st Century. Lee-Roy Chetty, 2012-10-03. Science and technology are key drivers of development, because technological and scientific revolutions underpin economic advances, improvements in health systems, education and infrastructure. The technological revolutions of the ...

  11. (PDF) Educational Theory in the 21st Century: Science, Technology

    Abstract and Figures. This open access book reviews the effects of the twenty-first century scientific-technological and social developments on the educational theory. The first part handles the ...

  12. Science in the 21st Century

    From the invention of new life forms to the discovery of life beyond Earth, science is reshaping our understanding of the universe in the twenty-first century. In the Summer 2012 issue of Dædalus, leading scientists describe emerging advances in nanoscience, neuroscience, genetics, paleontology, microbiology, mathematics, planetary science, and plant biology, among other areas. The authors ...

  13. Science in the 21st Century

    Science in the 21st Century. Imagine a new century, full of promise, molded by science, shaped by technology, powered by knowledge. We are now embarking on our most daring explorations, unraveling the mysteries of our inner world and charting new routes to the conquest of disease. We have not and we must not shrink from exploring the frontiers ...

  14. Technology: 21 of the most important inventions of the 21st century

3D printing and e-cigarettes are among the most important inventions of the 21st century. Angelo Young and Michael B. Sauter, 24/7 Wall Street. The human race has always innovated, and in a ...

  15. The Role of Technology Integration in the Development of 21 st Century

The rapid growth of science and technology in the 21st century has significantly changed the global landscape (Marburger, 2011), including education (Ramaila & Molwele, 2022). The current ...

  16. Role of Science and Technology in the 21st Century

Technology has increased the productivity of almost every industry in the world. Currently residing in the 21st century, the era of science and technology, humans tend to approach the highest level of ...

  17. Technology in the 21st century: New challenges and opportunities

    Road map to review studies on big data, BDA and business intelligence. 4. Technology in big data analytics workflow. Due to the challenges of big data in terms of huge volume, high variety, high velocity, high variability, low veracity and high value, greater efficiency is a primary goal in handling such data sets.

  18. (PDF) DATA SCIENCE IN THE 21ST CENTURY: EVOLUTION ...

Data Science has undergone a remarkable evolution in the 21st century, transforming from a niche field into an integral component of various industries. This article explores the dynamic ...

  19. Technology, war and the state: past, present and future

    The article is divided into three parts. The first explores the war-state relationship and the factors that shaped it during the Cold War. It explains why technological innovation became so important in war, and how this imperative influenced both our understanding of war and the interaction between war and the state.

  20. Information technologies of 21st century and their impact on the

    Twenty first century has witnessed emergence of some ground breaking information technologies that have revolutionised our way of life. The revolution began late in 20th century with the arrival of internet in 1995, which has given rise to methods, tools and gadgets having astonishing applications in all academic disciplines and business sectors. In this article we shall provide a design of a ...

  21. New technologies and 21st century children

    Building digital resilience is an important skill for 21st century children. Effective strategies to accomplish this include encouragement of active rather than passive Internet use, e-safety in the school curriculum, and teacher and parental Information and Communication Technology (ICT) support.

  22. Essay on Science and Technology for Students and Children

    500+ Words Essay on Science and Technology. Essay on Science and Technology: Science and technology are important parts of our day to day life. We get up in the morning from the ringing of our alarm clocks and go to bed at night after switching our lights off. All these luxuries that we are able to afford are a resultant of science and ...

  23. Internet & Technology

    Americans' Views of Technology Companies. Most Americans are wary of social media's role in politics and its overall impact on the country, and these concerns are ticking up among Democrats. Still, Republicans stand out on several measures, with a majority believing major technology companies are biased toward liberals. short readsApr 3, 2024.

  24. 21st Century Skills and Teachers' Performance ...

    The study examines the correlation between 21st-century teaching skills and teachers' performance using the Classroom Observation Tool (COT) among public Junior High School teachers in Region X during the 2023-2024 school year. It aims to assess teachers' profiles, 21st-century skills, performance levels, and the relationship between these factors. Statistical analysis revealed that ...

  25. PDF Science, technology and innovation in a 21st century context

basic science of the twenty-first century is neither biology nor physics, but an interdisciplinary mix of these and other traditional fields. Continued development of this domain contributes to information technology and much else. I mentioned two frontiers. The other physical science frontier borders the nearly unexploited domain of quantum ...