Information Age
The Information Age[a] is a historical period that began in the middle of the 20th century. It is defined by a quick change from older industries, like those created during the Industrial Revolution, to an economy focused on information technology. The start of the Information Age is often connected to the invention of the transistor in 1947.[2] This technological step forward greatly changed how information is handled and sent.
According to the United Nations Public Administration Network, the Information Age was made possible by taking advantage of smaller computers and computer parts.[3] This led to modern information systems and internet communication becoming the main force behind social evolution.[4]
There is an ongoing discussion about whether the Third Industrial Revolution has ended and if the Fourth Industrial Revolution has already begun because of recent progress in areas like artificial intelligence and biotechnology.[5] It is suggested that this next change might bring about the Imagination Age, the Internet of things (IoT), and fast advances in machine learning.
History
The digital revolution changed technology from a continuous analog format to a discrete digital format. This change made it possible to create copies that were identical to the original. For example, in digital communication, special hardware could boost the digital signal and send it along without losing any information. Just as important to this revolution was the ability to easily move digital information between different media and to access or share it from far away. One major moment in the revolution was the change from analog to digitally recorded music.[6] During the 1980s, the digital format of optical compact discs (CDs) slowly replaced analog formats, such as vinyl records and cassette tapes, as the most popular choice.[7]
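The lossless copying described above can be sketched with a short example (illustrative code, not part of the original article): a digital repeater only has to decide whether each received level is a 0 or a 1, so any small analog noise is thrown away at every hop.

```python
# Illustrative sketch: why a digital signal can be boosted and passed
# along without losing information. The repeater decides whether each
# noisy received level represents a 0 or a 1, discarding the noise.

def regenerate(levels, threshold=0.5):
    """Recover the intended bits from noisy received levels."""
    return [1 if level >= threshold else 0 for level in levels]

original_bits = [1, 0, 1, 1, 0]
# After transmission, each level has picked up some analog noise.
received = [0.93, 0.11, 0.78, 1.04, -0.06]

print(regenerate(received))  # [1, 0, 1, 1, 0] - an exact copy
```

An analog repeater, by contrast, amplifies the noise along with the signal, so each copy degrades a little more than the last.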
Previous inventions
Humans have been making tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical clocks. More complex tools started to appear in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the Industrial Revolution had led to mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical, general-purpose computer called the Analytical Engine, but it was never built successfully and was largely forgotten by the 20th century. Most inventors of modern computers were unaware of it.
The Second Industrial Revolution in the last 25 years of the 19th century led to useful electrical circuits and the telegraph. In the 1880s, Herman Hollerith created electromechanical counting and calculating devices using punch cards and unit record equipment, which became widely used in business and government.
At the same time, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and find answers. These included a tide-predicting machine (1872), differential analyzers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The building of analog computers for specific problems continued into the late 1940s and beyond with FERMIAC for neutron transport, Project Cyclone for military uses, and the Phillips Machine for economic modeling.
Building on the complexity of the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete the Z3 in 1941. It was the world's first working, programmable, fully automatic digital computer. Also during World War II, Allied engineers built electromechanical bombes to break German Enigma machine codes. The base-10 electromechanical Harvard Mark I was finished in 1944, and was somewhat improved with ideas from Charles Babbage's designs.
1947–1969: Origins
In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs.[8] This paved the way for more advanced digital computers. From the late 1940s, universities, the military, and businesses developed computer systems to digitally replicate and automate mathematical calculations that were previously done by hand, with the LEO being the first commercially available general-purpose computer.
Digital communication became affordable enough for widespread use after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is recognized for setting the foundation for digitalization in his important 1948 article, A Mathematical Theory of Communication.[9]
In 1948, Bardeen and Brattain patented an insulated-gate transistor (IGFET) with an inversion layer. Their idea forms the basis of today's CMOS and DRAM technology.[10] In 1957 at Bell Labs, Frosch and Derick were able to make planar silicon dioxide transistors.[11] Later, a team at Bell Labs showed a working MOSFET (Metal-Oxide-Semiconductor Field-Effect Transistor).[12] The first key step for the integrated circuit was achieved by Jack Kilby in 1958.[13]
Other important technological steps included the invention of the single-piece integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959,[14] which was made possible by the planar process developed by Jean Hoerni.[15] In 1963, complementary MOS (CMOS) was developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor.[16] The self-aligned gate transistor, which made mass production even easier, was invented in 1966 by Robert Bower at Hughes Aircraft[17][18] and separately by Robert Kerwin, Donald Klein, and John Sarace at Bell Labs.[19]
In 1962, AT&T introduced the T-carrier for long-distance digital voice transmission using pulse-code modulation (PCM). The T1 format carried 24 PCM, time-division multiplexed speech signals, each encoded in 64 kbit/s streams, plus 8 kbit/s of framing information for synchronization. Over the next few decades, digitizing voice became the standard for almost all transmissions except the very last part of the connection (where analog stayed the norm until the late 1990s).
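The numbers in the T1 format above can be checked with simple arithmetic (a quick illustrative calculation, not from the original article): 24 channels at 64 kbit/s each, plus 8 kbit/s of framing, give the well-known 1.544 Mbit/s T1 line rate.

```python
# Arithmetic behind the T1 format described above: 24 time-division
# multiplexed PCM speech channels at 64 kbit/s each, plus 8 kbit/s
# of framing information for synchronization.

channels = 24
rate_per_channel = 64  # kbit/s for one PCM speech signal
framing = 8            # kbit/s of framing overhead

t1_rate = channels * rate_per_channel + framing
print(t1_rate)  # 1544 kbit/s, i.e. the 1.544 Mbit/s T1 line rate
```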
Following the development of MOS integrated circuit chips in the early 1960s, MOS chips achieved a higher density of transistors and lower manufacturing costs than older bipolar integrated circuits by 1964. MOS chips continued to grow in complexity at the rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The use of MOS LSI chips in computing formed the basis for the first microprocessors, as engineers started to realize that an entire computer processor could fit on a single MOS LSI chip.[20] In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor.[21] It was released by Intel in 1971 and set the stage for the microcomputer revolution that began in the 1970s.
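The growth described by Moore's law can be illustrated with a toy calculation (the starting transistor count and the two-year doubling period are assumptions for illustration, not figures from the article):

```python
# Toy illustration of Moore's law: if the transistor count doubles
# about every two years (an assumed doubling period), a chip with a
# handful of transistors reaches the hundreds typical of large-scale
# integration (LSI) within about a decade.

def transistor_count(start, years, doubling_period=2):
    """Project the transistor count after the given number of years."""
    return start * 2 ** (years // doubling_period)

# Assumed starting point: 8 transistors on an early-1960s MOS chip.
print(transistor_count(8, 10))  # 256 - hundreds of transistors
```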
MOS technology also led to the development of semiconductor image sensors suitable for digital cameras.[22] The first such image sensor was the charge-coupled device (CCD), developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969,[23] based on MOS capacitor technology.[22]
1969–1989: Invention of the internet, rise of home computers
The public first learned about the ideas that led to the Internet when a message was sent over the ARPANET in 1969. Packet-switched networks like ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using various protocols. ARPANET, in particular, led to the creation of protocols for internetworking, which allowed many separate networks to be connected into one network of networks.
The Whole Earth movement of the 1960s promoted the use of new technology.[24]
In the 1970s, the home computer was introduced,[25] along with time-sharing computers,[26] the video game console, and the first coin-operated video games.[27][28] The golden age of arcade video games began with Space Invaders (1978). As digital technology spread and the change from analog to digital record keeping became the new standard in business, a relatively new job became popular: the data entry clerk. Drawn from the ranks of secretaries and typists, this role involved converting analog data (like customer records and invoices) into digital data.
In developed nations, computers became quite common during the 1980s as they found their way into schools, homes, businesses, and industry. Automated teller machines (ATMs), industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all contributed to the spirit of the 1980s. Millions of people bought home computers, making early personal computer makers like Apple, Commodore, and Tandy well-known names. The Commodore 64 is often cited as the best-selling computer of all time, reportedly selling 17 million units[29] between 1982 and 1994.
In 1984, the U.S. Census Bureau began gathering data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and households with children under 18 were nearly twice as likely to own one at 15.3% (middle and upper-middle-class households were the most likely to own one, at 22.9%).[30] By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under 18 owned one.[31] By the late 1980s, many businesses relied on computers and digital technology.
Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication. Digital cell phones were not sold commercially until 1991, when the first 2G network opened in Finland to meet the unexpected demand for cell phones that became clear in the late 1980s.
Compute! magazine predicted that the CD-ROM would be the central part of the revolution, with many household devices reading the discs.[32]
The first true digital camera was created in 1988, and the first models went on sale in December 1989 in Japan and in 1990 in the United States.[33] By the early 2000s, digital cameras had become more popular than traditional film cameras.
Digital ink and paint was also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in The Little Mermaid (1989) and for all of its animated films between The Rescuers Down Under (1990) and Home on the Range (2004).
1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0
Tim Berners-Lee invented the World Wide Web in 1989.[34] The "Web 1.0 era" ended in 2005, around the time more advanced technologies began to develop at the start of the 21st century.[35]
The first public digital HDTV broadcast was of the 1990 World Cup that June, shown in 10 theaters in Spain and Italy. However, HDTV did not become a standard until the mid-2000s outside of Japan.
The World Wide Web became publicly available in 1991, having previously been available only to government and universities.[36] In 1993, Marc Andreessen and Eric Bina introduced Mosaic, the first web browser that could show images directly on the page,[37] and the foundation for later browsers such as Netscape Navigator and Internet Explorer. Stanford Federal Credit Union was the first financial institution to offer online internet banking services to all its members, in October 1994.[38] In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe.[39] The Internet grew quickly, and by 1996 it was part of popular culture, with many businesses including websites in their advertisements.[source?] By 1999, almost every country had an internet connection, and nearly half of Americans and people in several other countries used the Internet regularly.[source?] However, throughout the 1990s, "getting online" required complex setup, and dial-up was the only affordable connection type for individuals; the current mass Internet culture was not yet possible.
In 1989, about 15% of all households in the United States owned a personal computer.[40] For households with children, nearly 30% owned a computer in 1989, and this number reached 65% in 2000.
Cell phones became as common as computers by the early 2000s, with movie theaters starting to show ads asking people to silence their phones. They also became much more advanced than the phones of the 1990s, most of which only made calls or at most allowed for simple games.
Text messaging became widely used globally in the late 1990s, except in the United States of America, where it did not become common until the early 2000s.[source?]
The digital revolution also became truly global during this time. After changing society in the developed world in the 1990s, the digital revolution spread to the general population in the developing world in the 2000s.
By 2000, a majority of U.S. households had at least one personal computer, and a majority had internet access the following year.[41] In 2002, a majority of U.S. survey respondents reported having a mobile phone.[42]
2005–present: Web 2.0, social media, smartphones, digital TV
In late 2005, the number of Internet users reached 1 billion,[43] and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcast format in many countries by the end of the decade. In September and December 2006, respectively, Luxembourg and the Netherlands were the first countries to completely switch from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home.[44] According to estimates from Nielsen Media Research, around 45.7 million U.S. households in 2006 (about 40 percent of 114.4 million) owned a dedicated home video game console,[45][46] and by 2015, 51 percent of U.S. households owned one according to an Entertainment Software Association report.[47][48] By 2012, over 2 billion people used the Internet, which was twice the number from 2007. Cloud computing became widely used by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone.[49] By 2016, half of the world's population was connected,[50] and as of 2020, that number had risen to 67%.[51]
Social and economic impact
The Information Age has led to a knowledge-based society, driven by computerization and the ability to process, share, and access information instantly.[52] The growth of the Internet has played a major role in this change.[53]
In 1996, the Canadian government recognized the profound shift in the economy by publishing a report titled The Information Highway, which summarized the effects of a rapidly changing economic structure:
"The economic shift from the industrial age to the information age is rooted in two intersecting factors: information technology (especially the computer), and the network, or the ability of information technology to connect computers and people in a global web. The ability of the Information Highway to transmit and process vast amounts of data at ever-increasing speeds is a driving force in the restructuring of the global economy. It has been argued that the importance of the Information Highway is that it provides greater opportunities for human resources to be used to create greater wealth. Information Highway technologies allow firms to reduce inventory and organizational costs by managing a variety of business operations in a more efficient manner. Furthermore, these technologies provide opportunities for Canadian firms to gain new clients through an expanded market base, and can result in significant productivity gains, particularly for small and medium-sized firms. Ultimately, the growth of the Information Highway will contribute to economic growth in Canada in the 21st century by providing the necessary infrastructure for the development of the high-value-added knowledge-based economy."[54]
Economic effects
The development of the transistor and the integrated circuit (IC) was essential for the creation of new technologies, including the personal computer and the mobile phone.[55] These inventions created new jobs and changed existing industries, such as the banking sector, which introduced technologies like ATMs and electronic funds transfer, allowing services to be available around the clock.[56] The Information Age has impacted other sectors, including defense, by providing command, control, communications, and intelligence support; manufacturing, through process automation and robotics; and medicine, with improved medical imaging and diagnostic technology.[57]
The Internet has also transformed many industries, leading to new ways of shopping and media consumption.[58]
Telecommuting
The ability to work from home, or telecommuting, has become common. By 2003, 24% of Americans reported they were telecommuting, and over the next three years, the number of telecommuters grew by about 1.5 million people per year.[59] A 2009 survey by the U.S. Bureau of Labor Statistics found that 21% of employed people did some or all of their work from home.[60]
Developing world
The spread of information technology is also seen in the developing world. The number of mobile phones in Africa increased from 15 million in 2002 to 650 million in 2012.[61] This growth has led to increased market efficiency, for example, by giving farmers access to up-to-date pricing information for their goods.[62]
Social and ethical challenges
The Information Age has brought with it several concerns, including the digital divide, which is the gap between those who have access to modern information technology and those who do not.[63] Other issues include challenges to intellectual property and data privacy, as information can be easily copied and shared globally.[64]
Privacy
The easy collection and sharing of personal data, especially on the Internet, has raised major concerns about privacy.[65] The rise of digital technology means that governments and companies can gather massive amounts of information about individuals, leading to worries about surveillance and the misuse of this data.[66]
Intellectual property
The Information Age has made it easier and cheaper to copy digital media, creating conflicts over copyright and other forms of intellectual property.[67] The music, film, and software industries have faced challenges from illegal file-sharing and online piracy. Organizations have responded by introducing Digital Rights Management (DRM) technologies to control how digital content is used, but these measures are often criticized for limiting fair use.
Freedom of speech and censorship
The Internet has provided new opportunities for freedom of speech and sharing information across borders, but it has also enabled new forms of censorship and information control by governments and powerful organizations.[68] Debates continue over who should regulate online content and balance free expression with preventing harm or illegal activity.
Workforce and employment
As technology automates more tasks, there are concerns about its effect on employment and the need for new skills. While some jobs, such as data entry clerk, declined, new ones in areas like software development, data analysis, and network administration emerged.[69] The shift to a knowledge-based economy requires workers to continuously learn new skills to stay relevant.[70]
Further developments and future outlook
The Information Age continues to evolve rapidly. Developments such as the growth of the Internet of Things (IoT), which connects everyday objects to the Internet, and advancements in artificial intelligence (AI) are expected to drive the next phase of this era, potentially leading to the Fourth Industrial Revolution.[5]
The increasing power of technology and the vast amounts of data being generated suggest that the changes seen so far are only the beginning of a larger transformation in society and the global economy.
Notes
- ↑ Also known as the Third Industrial Revolution, Computer Age, Digital Age, Silicon Age, New Media Age, Internet Age, or the Digital Revolution[1]
References
- ↑ Hoover, Stewart M. (2006-04-26). Religion in the Media Age. Media, Religion and Culture (1st ed.). New York: Routledge. ISBN 978-0-415-31423-7.
- ↑ Manuel, Castells (1996). The information age : economy, society and culture. Oxford: Blackwell. ISBN 978-0631215943. OCLC 43092627.
- ↑ Kluver, Randy. "Globalization, Informatization, and Intercultural Communication". un.org. Archived from the original on 19 July 2013. Retrieved 18 April 2013.
- ↑ "The History of Computers". thought.co. Archived from the original on 2020-08-01. Retrieved 2019-10-17.
- 1 2 "Regulation for the Fourth Industrial Revolution". gov.uk. Retrieved 2024-09-16.
- ↑ "Museum Of Applied Arts And Sciences – About". Museum of Applied Arts and Sciences. Retrieved 22 August 2017.
- ↑ "The Digital Revolution Ahead for the Audio Industry," Business Week. New York, 16 March 1981, p. 40D.
- ↑ Phil Ament (17 April 2015). "Transistor History – Invention of the Transistor". Archived from the original on 13 August 2011. Retrieved 17 April 2015.
- ↑ Shannon, Claude E.; Weaver, Warren (1963). The mathematical theory of communication (4. print. ed.). Urbana: University of Illinois Press. p. 144. ISBN 0252725484.
- ↑ Howard R. Duff (2001). "John Bardeen and transistor physics". AIP Conference Proceedings. Vol. 550. pp. 3–32. doi:10.1063/1.1354371.
- ↑ Frosch, C. J.; Derick, L (1957). "Surface Protection and Selective Masking during Diffusion in Silicon". Journal of the Electrochemical Society. 104 (9): 547. doi:10.1149/1.2428650.
- ↑ Lojek, Bo (2007). History of Semiconductor Engineering. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg. p. 321. ISBN 978-3-540-34258-8.
- ↑ "Milestones:First Semiconductor Integrated Circuit (IC), 1958". IEEE Global History Network. IEEE. Retrieved 3 August 2011.
- ↑ Saxena, Arjun (2009). Invention of Integrated Circuits: Untold Important Facts. pp. x–xi.
- ↑ Saxena, Arjun (2009). Invention of Integrated Circuits: Untold Important Facts. pp. 102–103.
- ↑ "1963: Complementary MOS Circuit Configuration is Invented". Computer History Museum. Retrieved 6 July 2019.
- ↑ US3472712A, Bower, Robert W., "Field-effect device with insulated gate", issued 1969-10-14
- ↑ US3615934A, Bower, Robert W., "Insulated-gate field-effect device having source and drain regions formed in part by ion implantation and method of making same", issued 1971-10-26
- ↑ US3475234A, Kerwin, Robert E.; Klein, Donald L. & Sarace, John C., "Method for making mis structures", issued 1969-10-28
- ↑ Shirriff, Ken (30 August 2016). "The Surprising Story of the First Microprocessors". IEEE Spectrum. 53 (9). Institute of Electrical and Electronics Engineers: 48–54. doi:10.1109/MSPEC.2016.7551353. S2CID 32003640. Retrieved 13 October 2019.
- ↑ "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum.
- 1 2 Williams, J. B. (2017). The Electronics Revolution: Inventing the Future. Springer. pp. 245–8. ISBN 9783319490885.
- ↑ James R. Janesick (2001). Scientific charge-coupled devices. SPIE Press. pp. 3–4. ISBN 978-0-8194-3698-6.
- ↑ "History of Whole Earth Catalog". Archived from the original on 13 February 2021. Retrieved 17 April 2015.
- ↑ "Personal Computer Milestones". Retrieved 17 April 2015.
- ↑ Criss, Fillur (14 August 2014). "2,076 IT jobs from 492 companies". ICTerGezocht.nl (in Dutch). Retrieved 19 August 2017.
- ↑ "Atari – Arcade/Coin-op". Archived from the original on 2 November 2014. Retrieved 17 April 2015.
- ↑ Vincze Miklós (15 June 2013). "Forgotten arcade games let you shoot space men and catch live lobsters". io9. Archived from the original on 14 February 2015. Retrieved 17 April 2015.
- ↑ "How many Commodore 64 computers were really sold?". pagetable.com. Archived from the original on 6 March 2016. Retrieved 17 April 2015.
- ↑ "Archived copy" (PDF). Archived from the original (PDF) on 2 April 2013. Retrieved 20 December 2017.
- ↑ Kominski, Robert (Feb 1991). "Computer Use in the United States: 1989. Current Population Reports, Special Studies". Bureau of the Census (DOC), Suitland, Md. Population Div. – via ERIC (Education Resources Information Center).
- ↑ "COMPUTE! magazine issue 93 Feb 1988". February 1988.
If the wheels behind the CD-ROM industry have their way, this product will help open the door to a brave, new multimedia world for microcomputers, where the computer is intimately linked with the other household electronics, and every gadget in the house reads tons of video, audio, and text data from CD-ROM disks.
- ↑ "1988". Retrieved 17 April 2015.
- ↑ "A short history of the Web". CERN. 2024-01-25. Retrieved 2024-02-16.
- ↑ "World Wide Web and Its Journey from Web 1.0 to Web 4.0" (PDF). ijcsit. Retrieved January 20, 2025.
- ↑ Martin Bryant (6 August 2011). "20 years ago today, the World Wide Web was born – TNW Insider". The Next Web. Retrieved 17 April 2015.
- ↑ "The World Wide Web". PBS. Retrieved 17 April 2015.
- ↑ "Stanford Federal Credit Union Pioneers Online Financial Services" (Press release). 1995-06-21. Archived from the original on 21 December 2018. Retrieved 21 December 2018.
- ↑ "History – About us – OP Group".
- ↑ Cheeseman Day, Jennifer; Janus, Alex; Davis, Jessica (October 2005). "Computer and Internet Use in the United States: 2003" (PDF). Census Bureau. Archived from the original (PDF) on 6 March 2009. Retrieved 10 March 2009.
- ↑ File, Thom (May 2013). Computer and Internet Use in the United States (PDF) (Report). Current Population Survey Reports. Washington, D.C.: U.S. Census Bureau. Retrieved 11 February 2020.
- ↑ Tuckel, Peter; O'Neill, Harry (2005). Ownership and Usage Patterns of Cell Phones: 2000–2005 (PDF) (Report). JSM Proceedings, Survey Research Methods Section. Alexandria, VA: American Statistical Association. p. 4002. Retrieved 25 September 2020.
- ↑ "One Billion People Online!". Archived from the original on 22 October 2008. Retrieved 17 April 2015.
- ↑ "Demographics of Internet and Home Broadband Usage in the United States". Pew Research Center. 7 April 2021. Retrieved 19 May 2021.
- ↑ Arendt, Susan (5 March 2007). "Game Consoles in 41% of Homes". WIRED. Condé Nast. Retrieved 29 June 2021.
- ↑ Statistical Abstract of the United States: 2008 (PDF) (Report). Statistical Abstract of the United States (127 ed.). U.S. Census Bureau. 30 December 2007. p. 52. Retrieved 29 June 2021.
- ↑ North, Dale (14 April 2015). "155M Americans play video games, and 80% of households own a gaming device". VentureBeat. Archived from the original on 9 July 2021. Retrieved 29 June 2021.
- ↑ 2015 Essential Facts About the Computer and Video Game Industry (Report). Essential Facts About the Computer and Video Game Industry. Vol. 2015. Entertainment Software Association. Retrieved 29 June 2021.
- ↑ "Demographics of Mobile Device Ownership and Adoption in the United States". Pew Research Center. 7 April 2021. Retrieved 19 May 2021.
- ↑ "World Internet Users Statistics and 2014 World Population Stats". Archived from the original on 23 June 2011. Retrieved 17 April 2015.
- ↑ Clement. "Worldwide digital population as of April 2020". Statista. Retrieved 21 May 2020.
- ↑ Drucker, Peter Ferdinand (1992). "The post-capitalist world". Managing for the future: the 1990s and beyond. New York: Truman Talley Books/Dutton. p. 428. ISBN 9780525934141.
- ↑ "History of the Internet" (PDF). Archived from the original (PDF) on 24 May 2012. Retrieved 17 April 2015.
- ↑ "THE INFORMATION HIGHWAY" (PDF). Archived from the original (PDF) on 17 April 2016. Retrieved 17 April 2015.
- ↑ Saxena, Arjun (2009). Invention of Integrated Circuits: Untold Important Facts. World Scientific. pp. 1–2, 71, 140, 158–161. ISBN 978-981-283-818-4.
- ↑ "Information Age". Archived from the original on 2 April 2015. Retrieved 17 April 2015.
- ↑ Hajduk, Katarzyna (2021). The Impact of the Information Age on Banking and Finance. Emerald Publishing Limited. pp. 1–20. doi:10.1108/978-1-83982-125-120211001. ISBN 978-1-83982-125-1.
- ↑ "Information Age". Britannica. Retrieved 17 April 2015.
- ↑ "The Rise Of Telecommuting". Archived from the original on 10 February 2012. Retrieved 17 April 2015.
- ↑ "Time Spent in Selected Leisure Activities, Averages per Day by Day of Week and Sex, 2009 annual averages". U.S. Bureau of Labor Statistics. Archived from the original on 4 January 2012. Retrieved 17 April 2015.
- ↑ "Wireless technologies help Africa leapfrog technology". ComputerWorld. 15 August 2012. Archived from the original on 21 October 2012. Retrieved 17 April 2015.
- ↑ Mckenzie, David (17 June 2011). "Mobile phones and economic development in Africa". World Bank Blogs. Archived from the original on 11 September 2012. Retrieved 17 April 2015.
- ↑ Wirsig, Jared (2015). The Digital Divide. Rosen Publishing Group, Inc. pp. 3–16. ISBN 978-1-4994-6332-6.
- ↑ Webster, Frank (2014-01-21). Theories of the Information Society. Routledge. pp. 9–24. ISBN 978-1-317-86475-4.
- ↑ Schneier, Bruce (2015-03-03). Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. W. W. Norton & Company. pp. 1–30. ISBN 978-0-393-24481-6.
- ↑ Gil Press (28 May 2013). "A Very Short History of Big Data". Forbes. Archived from the original on 17 April 2014. Retrieved 17 April 2015.
- ↑ Tapscott, Don; Williams, Anthony D. (2008-01-15). Wikinomics: How Mass Collaboration Changes Everything. Penguin. pp. 1–20. ISBN 978-1-59184-193-7.
- ↑ Deibert, Ronald J. (2013-05-14). Black Code: Inside the Battle for Cyberspace. McClelland & Stewart. pp. 1–25. ISBN 978-0-7710-2565-5.
- ↑ "Computer Use and the Digital Divide: 2017". Census Bureau. Retrieved 19 February 2024.
- ↑ Drucker, Peter F. (2017). Post-Capitalist Society. Routledge. pp. 17–38. ISBN 978-1-351-47774-8.