
Timeline of Modern Communications Technology

2022-10-26 19:27 | Author: Anonymous | Source: compiled by this site | Views: 385

  Timeline of communication technology

  3500s BC - The Sumerians develop cuneiform writing and the Egyptians develop hieroglyphic writing

  1500s BC - The Phoenicians develop an alphabet

  170 BC - Parchment is discovered in Pergamum

  105 - Tsai Lun invents paper

  350 - The Chinese develop a method for printing pages using symbols carved on a wooden block

  c. 1045 - Bi Sheng in China develops movable type printing using baked clay; wooden movable type follows in the 13th century

  1454 - Johannes Gutenberg finishes a printing press with metal movable type

  1793 - Claude Chappe establishes the first long-distance semaphore telegraph line

  1831 - Joseph Henry proposes and builds an electric telegraph

  1835 - Samuel Morse develops the Morse code

  1843 - Samuel Morse builds the first long distance electric telegraph line

  1876 - Alexander Graham Bell and Thomas Watson exhibit an electric telephone

  1877 - Thomas Edison patents the phonograph

  1889 - Almon Strowger patents the direct dial telephone

  1901 - Guglielmo Marconi transmits radio signals from Cornwall to Newfoundland

  1925 - John Logie Baird transmits the first television signal

  1942 - Hedy Lamarr and George Antheil patent a frequency hopping spread spectrum communication technique

  1948 - Claude Shannon writes a paper that establishes the mathematical basis of information theory

  1958 - Chester Carlson presents the first photocopier suitable for office use

  1966 - Charles Kao realizes that silica-based optical waveguides offer a practical way to transmit light via total internal reflection

  1969 - The first hosts of ARPANET, the Internet’s ancestor, are connected.

  1973 - Akira Hasegawa and Fred Tappert propose the use of solitary waves to carry information in optical fibers

  1977 - Donald Knuth begins work on TeX

  1980 - Linn Mollenauer, Roger Stolen, and James Gordon demonstrate that solitary waves can be propagated through optical fibers

  1989 - Tim Berners-Lee and Robert Cailliau build the prototype system which became the World Wide Web at CERN

  1991 - Anders Olsson transmits solitary waves through an optical fiber with a data rate of 32 billion bits per second

  Modern Communications Technology Timeline

  1840’s - Wired Telegraph

  1870’s - Telephone

  1900’s - Wireless Telegraph

  1920’s - AM Radio (essay by Glenn Stordeur)

  1930’s - FM Radio (essay by Susan Bachety)

  1930’s - NTSC Television

  1930’s - Pulsed Radar

  1940’s - Doppler Radar (essay by John Antonelli)

  1940’s - Loran (essay by Michael Fiore)

  1960’s - CAT Scan

  1960’s - Ultrasound Medical Imaging (essay by Jennifer Fanno)

  1960’s - MRI (essay by Kerry Munao)

  1970’s - Compact Disc (essay by Renee Gonsalves)

  1980’s - UPC Barcode (essay by Jill Fanuzzi)

  1980’s - US Postal Code

  1980’s - Cellular Telephone (essay by Kevin Geary)

  1980’s - HDTV (essay by John Alesse)

  1980’s - Digital Audio Tape (essay by Nikki Jackson)

  1990’s - GPS (essay by Mark Sardzinski)

  1990’s - Digital Video Disc (DVD) (essay by Stacey Wrigley)

  c. 3000 B.C. - Sumerian writing system uses pictographs to represent words

  This writing system also contained early phonetic indicators, used to show the pronunciation of a word and to allow a single pictograph to represent more than one word when combined with different indicators.

  c. 2900 B.C. - Beginnings of Egyptian hieroglyphic writing

  Each hieroglyph was a pictogram, but certain combinations of hieroglyphs appearing together created an entirely new word with a different meaning. The phonetic elements of hieroglyphics are much more highly developed than those of previous writing systems. The earlier Sumerian pictographic system may have influenced the development of the Egyptian hieroglyphic system. In 1799 the Rosetta Stone, containing a copy of the same passage in hieroglyphic writing, an Egyptian priestly shorthand, and Greek, allowed the hieroglyphs to be translated.

  1300 B.C. - Tortoise shell and oracle bone writing

  Chinese tortoise shell and oracle bone writing. This type of primitive writing was the ancestor to the beautiful Chinese calligraphic writing of later centuries and the simplified characters of today.

  c. 500 B.C. - Papyrus roll

  Up until the papyrus roll was introduced, long manuscripts were written on cumbersome and bulky clay tablets. The Egyptians discovered the secret of making papyrus from reeds that grew along the Nile. This material was tough and flexible. A coarse and a smooth variety were made, and several grades in between. The Egyptians tried to keep the process of manufacturing papyrus a secret, but it eventually spread throughout the Mediterranean world. Books on papyrus were rolled on a hardwood stick with an identifying leather tag on one end. As the book was read, the scroll was rolled and unrolled onto another such stick.

  c. 220 B.C. - Chinese Small Seal writing developed

  Qin Shih Huang Ti, China’s first emperor, outlaws local dialects and gives the newly unified China a standard language. Some fine examples of early Chinese characters can be seen on ancient Chinese coins.

  c. A.D. 100 - Book (parchment codex)

  From about A.D. 100 onwards, the bound book with separate pages started to be used by the Romans. Known by its Latin name of CODEX, this kind of book provided a major breakthrough in quick and efficient access to information: no longer must a reader roll through an entire document to find a passage. The individual pages later came to be numbered and indexed for even more efficient access. This type of manuscript undoubtedly made it easier for the Roman bureaucracy to grow and flourish as it did in later years.

  A.D. 105 - Wood block printing and paper invented by the Chinese

  Paper, made from plant cellulose fibers that have been pounded, separated, washed, and finally felted together and dried, was a much less expensive material than vellum and tougher than papyrus. Its invention allowed printing to be done with inked wooden blocks, a process also developed by the Chinese at about the same time as paper was invented. Paper was supposedly invented by Ts'ai Lun, a court official of the emperor Ho Ti.

  A.D. 641 - Final destruction of the library at Alexandria

  The library at Alexandria, one of the largest and oldest collections of manuscripts in the ancient world, was lost.

  1455 - Johann Gutenberg invents printing press using movable type cast from metal

  After a way had been found to cast precisely sized and shaped type, it became possible to arrange the individual letters in a wooden frame bound together with clamps. The first book published using this method was an edition of the Bible, in 1455.

  1755 - Samuel Johnson’s dictionary brings standardized spelling to the English language

  1802 - Library of Congress established

  A national book collection is started for the new United States.

  1802 - Discovery of the carbon arc lamp

  This was the world’s first controlled source of electric light.

  1824 - Research on persistence of vision published

  In 1824 an Englishman, Peter Mark Roget, published a paper detailing his discoveries in the area of persistence of vision among humans. The fact that the eye holds on to an image for approximately one sixteenth of a second after it is no longer visible led to further experiments in animated drawings, culminating in modern motion pictures and video, which are shown at 24 and about 30 frames per second, respectively.

  1830’s - First viable design for a digital computer

  1830’s - Augusta, Lady Byron writes the world’s first computer program

  Augusta, Lady Byron, Countess of Lovelace, was the first person to write a program designed to run on a computer. Lady Byron collaborated with Charles Babbage on his Analytical Engine, whose design included data to be input on punched cards, a primitive central processing unit (CPU), or perhaps more correctly an arithmetic logic unit (ALU), a form of memory, and a printer to output data. The entire design was the first digital computer, though it was mechanical rather than electronic.

  Unfortunately, the Analytical Engine never came to be, despite many years of hard, frustrating work and sound design principles by this early engineering and programming team. The technology of the Nineteenth Century could not support such a development. The world would not have a working digital computer until 1946, when ENIAC I was built. Lady Byron would undoubtedly have experienced much satisfaction had she been able to see her pioneering ideas come to pass but this was not to be, for she died in 1852, 94 years before ENIAC executed its first machine language instruction.

  1837 - Invention of the telegraph in Great Britain and the United States

  Working independently from one another, Samuel F. B. Morse and the two British engineers Sir William Cooke and Sir Charles Wheatstone developed a method of sending an electric message over a distance of several miles almost instantaneously. The implications of this development were enormous. For the first time in history, human beings had the means to overcome the obstacles imposed upon communications by vast distances. A "virtual world" of human communications had just been born that was much smaller and could potentially be much more closely knit together than the "real" world. Though the concept of virtual place and time would not become part of the lexicon of human thinking for almost another 150 years, the foundations were laid with this invention. It is significant that during this same period we were making rapid progress in conquering distance and time in the real world with advances in the steam engines which powered ships and railroad locomotives. America’s telegraph network grew up with, and was often found alongside, the rapidly developing network of railroad lines that began to tie the various corners of our nation together.
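Morse’s code, developed in 1835, mapped each letter to a pattern of short and long pulses ("dots" and "dashes") that an operator could key over such a line. A minimal sketch in Python, using the modern international table rather than Morse’s original scheme, and covering only the handful of letters needed for the demo:

```python
# Partial International Morse table (illustrative subset, not the full alphabet).
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-", "M": "--"}

def encode(text):
    """Encode a word as Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper())

print(encode("sos"))  # -> ... --- ...
```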

  1837 - Daguerreotype photography invented

  During the 1820’s, Louis Jacques Daguerre and a colleague began working on a process whereby the action of light on silver halide caused a (more or less) permanent image to be formed. Suspended in an emulsion spread over a flat surface, the silver halide formed microscopic grains of pure silver when exposed to light. Like most metals, silver loses its shiny reflective qualities and becomes pure black when it is divided into very fine particles. Daguerre discovered that the amount of pure silver formed depended on the amount of light which fell on the plate. When a lens or pinhole was used to admit the light, an image could be projected on the plate and recorded. This discovery led to the first camera. Of course, the lightest parts of the image produced the deepest blacks, so the image formed on the sensitive emulsion was a negative image. If the plate was made of a clear material like glass, light could be shined through the negative to expose another plate, producing the reverse of the negative, or a positive image. This is the principle of black and white photography, which relies on the same basic process today, over 150 years after Daguerre’s discovery.
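The negative-to-positive principle is simple arithmetic on brightness: every value is replaced by its complement, and inverting the negative restores the original. A sketch with hypothetical 8-bit grayscale values (Python):

```python
def invert(pixels):
    """Photographic negative of 8-bit grayscale values (0 = black, 255 = white)."""
    return [255 - p for p in pixels]

scene = [0, 64, 200, 255]     # dark to light
negative = invert(scene)      # lights become darks
positive = invert(negative)   # "contact print" through the negative
print(positive == scene)      # -> True
```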

  1861 - Motion pictures projected onto a screen

  In 1861 Coleman Sellers patented the Kinematoscope, a machine that took a series of posed still photographs and flashed them onto a screen. Though it was a crude device, the pictures appeared to move and America had its first movie theatre.

  1876 - Dewey Decimal System introduced

  In 1876 Melvil Dewey published a 32 page booklet introducing a new library classification scheme based on the decimal numbering system. The resulting Dewey Decimal Classification system is used today by most elementary and high school libraries, as well as municipal libraries. (University, most community college, and special libraries use the Library of Congress classification system, however.)

  The Dewey system was designed to address the problem of organizing books by subject content while leaving adequate space into which entries for new publications can be inserted. Since the set of positive rational numbers has the delightful property of being dense, Dewey found that classification numbers for new material could be created simply by adding digits to the right of an existing Dewey number or assigning an unused numeral at an existing decimal place. Oddly enough, though this system was developed over 120 years ago, it still meets the needs of small libraries, having undergone nineteen revisions since its introduction.
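Dewey’s trick can be demonstrated in a few lines: because decimal fractions can always be refined, appending a digit to an existing class number produces a new number that files between it and its neighbor. The class numbers below are illustrative, not actual Dewey assignments:

```python
from decimal import Decimal

def refine(class_number, digit="5"):
    """Make room for a new subject by appending a digit to the right
    of an existing class number, as Dewey's scheme allows."""
    return class_number + digit

new = refine("004.6")  # hypothetical new class between 004.6 and 004.7
print(Decimal("004.6") < Decimal(new) < Decimal("004.7"))  # -> True
```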

  1877 - Eadweard Muybridge demonstrates high-speed photography

  Prior to 1877, no one really knew how a horse’s legs moved while the animal was running at full speed. The motion of a horse’s legs was a blur, even to the best eyes. Because of a phenomenon known as persistence of vision, the human eye holds onto an image for a short period of time, causing blurring when rapid motion is observed. After two wealthy horse racers placed bets as to whether a horse ever had all four legs off the ground at the same time, American photographer Eadweard Muybridge set up an experiment to test this. He placed twenty-four cameras with shutters hooked to trip wires set a fixed, uniform distance apart. As a horse raced past, it would break the wires, thus snapping 24 pictures of itself. Muybridge’s photos showed conclusively that a horse does in fact have all four legs in the air for a fleeting moment during each repetition of the motions in its gait.

  1877 - Wax cylinder phonograph invented by Thomas A. Edison

  The first sounds recorded onto a medium and played back were reproduced with Thomas Alva Edison’s phonograph. It consisted of a horizontal cylinder coated with hard wax into which a groove had been cut by a vibrating needle while the cylinder rotated beneath it. The groove had bumps and valleys along its bottom which corresponded to the vibrations of the recorded sounds. When playing back the sound, another needle rode along in the groove and converted the moving bumps and valleys back to sound. The first phonographs had no electronic amplifiers. Rather, the vibrating needle was coupled to a diaphragm set in the small end of a horn. The horn efficiently converted the energy produced by the needle and diaphragm into audible sound.

  1876 - Alexander Graham Bell invents first practical telephone

  While working on a device to enable hearing impaired individuals to hear sound, Alexander Bell developed a communications device that was to eventually be introduced into almost every American home and become available to nearly everyone worldwide. The voice transducer (transmitter and receiver) technology developed by Bell paved the way for electronic recording of sounds, music, and voice.

  1899 - First magnetic recordings

  Valdemar Poulsen was the first to develop a method of recording sounds using a magnetic medium. With the development of his method of recording to a magnetized steel tape, the necessity of relying on a vibrating needle or other mechanical transducers was eliminated. This development was the birth of magnetic recording technology, the basis for mass data storage on disk and tape today, as well as the music recording industry.

  1902 - Motion picture special effects

  Georges Méliès, a French magician, experimented with stop-action photography, fades, and transitions. By stopping the film, he could cause things to appear and disappear, and by backing up the film, he could create a double exposure or a fade from one scene to the next. He was the father of all these things we consider cute when used once or twice but which become so tedious when they are overused.

  1906 - Lee De Forest invents electronic amplifying tube (triode)

  This invention revolutionized the world of electronic communication. For the first time, a weak electronic signal such as that produced by a microphone could be amplified as much as needed, up to the limit of the tube’s ability to withstand the excess heat generated by the process. Heretofore, a telephone signal became so weak as to be unusable when two people were separated by more than a few miles. Now, telephone conversations could be amplified and sent down even longer lines, and the process could even be repeated several times. The electronic amplifier paved the way for the transmission of the human voice by radio and for high quality sound recordings. Until the invention of the triode vacuum tube amplifier, long distance communication was possible only by telegraph, with its clunky electro-mechanical repeaters, and by inefficient spark-gap radio transmitters and crystal receivers.
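The repeater idea can be put in numbers. Amplifier gain is conventionally expressed in decibels, dB = 10·log10(P_out/P_in), and the gains of cascaded stages simply add. The figures below are illustrative, not measurements of early triode equipment:

```python
import math

def gain_db(power_ratio):
    """Power gain expressed in decibels."""
    return 10 * math.log10(power_ratio)

stage = gain_db(100)   # one stage boosting power 100x -> 20 dB
total = 3 * stage      # three repeaters along the line -> 60 dB
print(total, 10 ** (total / 10))  # 60 dB is a power ratio of 1,000,000
```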

  1923 - Television camera tube invented by Zworykin

  Vladimir Zworykin, a Russian-born American inventor, developed the Iconoscope, the first TV camera tube, in 1923. Rapid advances in radio tube technology were being made at the time, and the technology of broadcast television was developed a short time later. The first TV broadcasts were made in England in 1927 and in the U.S. in 1930.

  1926 - First practical sound movie

  Although Thomas Edison had experimented with a crude method of recording sound on motion pictures right from their inception in 1889, the first practical "talkies" didn’t come out until 1926, when Warner Brothers Studios developed a technology that recorded sound separately from the film on large disks and synchronized the sound and motion picture tracks upon playback. In 1927 the first real talking picture to be played in theatres was released, the Warner Brothers production The Jazz Singer, starring Al Jolson. Four years later, the Movietone system of recording sound on an audio track right on the film was developed. This used a narrow track along the edge of the film, exposed to light whose intensity and variation corresponded to the loudness and frequency of the sound. Light was projected through the audio track on playback and converted back into sound with a light sensitive photomultiplier tube and an audio amplifier. This system is not unlike that used by modern CDs to play sound, with the exception that CDs use reflected laser light and digitally encoded audio.

  1939 - Regularly scheduled television broadcasting begins in the U.S.

  On April 30, 1939, the first regularly scheduled TV broadcast was made in conjunction with the opening of the New York World’s Fair. Broadcasting continued until interrupted by World War II. Broadcasting began again in 1946 after the war.

  1940’s - Beginnings of information science

  Information science as we know it had its beginnings in the 1940’s. The information needs of science, engineering, the military, and logistical management during World War II led to the development of early automated search methods. When the digital computer made its appearance after the war, information could be stored on tape and other digital media. The development of controlled vocabularies and Boolean search techniques made it possible to write search engine programs that could efficiently search through vast collections of on-line documents to find the little piece of information needed for a particular task.
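The Boolean search techniques mentioned above rest on an inverted index: for each term, the set of documents containing it, so AND, OR, and NOT become set intersection, union, and difference. A toy sketch with made-up documents (Python):

```python
from collections import defaultdict

docs = {
    1: "radar antenna design",
    2: "antenna theory",
    3: "radar signal processing",
}

# Inverted index: term -> set of ids of documents containing that term.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

print(index["radar"] & index["antenna"])  # AND -> {1}
print(index["radar"] | index["antenna"])  # OR  -> {1, 2, 3}
print(index["radar"] - index["antenna"])  # NOT -> {3}
```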

  1945 - Vannevar Bush foresees the invention of hypertext

  Vannevar Bush, American engineer and futurist, recognized the information explosion that few realized was in progress even then. The technological and scientific advances made during the recent world war did much to hasten the coming of the information revolution. Bush realized that if a way were not soon found to search for information by following the thread of an idea across many publications, books, and documents, the vast store of knowledge amassed by humanity would become all but inaccessible, and therefore of little use to anybody except narrowly defined specialists in a particular field. The idea of hypermedia was born.

  1946 - ENIAC computer developed

  ENIAC I was the world’s first practical all electronic digital computer. It had input/output, memory, a CPU, and electronic logic switching in the form of 18,000 vacuum tubes. Programming was done by teams of programmers with patch cables who hard-wired each program into the computer and had to repeat the process each time a program was changed. A problem that occurred during the hard wiring of an early computer program led to one of today’s most popular though dreaded computer terms. When one of the early systems went down and nobody could figure out what had gone wrong, an intensive search for the source of the difficulty led to the discovery of a moth that had gotten into the wiring and had shorted out a circuit. When programmers (or users) say that a program has bugs, this bug is the original great-granddaddy to which they are referring.

  Far from being solely a tool to serve scientists and mathematicians, computers have today become tools for communication and art. Graphics artists, teachers, and instructional designers often find them an essential tool for expressing their art and extending their talents into new areas.

  1948 - Birth of the field of information theory, proposed by Claude E. Shannon
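Shannon’s theory quantified information: a source emitting symbols with probabilities p_i carries H = -Σ p_i log2 p_i bits per symbol. A quick illustration in Python:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit per toss
print(entropy([0.9, 0.1]))  # biased coin -> about 0.47 bits
```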

  1948 - Scientists at Bell Telephone Labs invent the transistor (solid state amplifier)

  William Shockley, Walter Brattain, and John Bardeen, working on a new type of amplifier for long distance telephone relay service, developed the first transistor. Working on the principle of controlling the current flow through a piece of very pure germanium to which very slight amounts of other elements have been added, the new amplifier was much smaller and more rugged than the vacuum tube amplifier it replaced. Additionally, the transistor required much less power and lower voltages than tubes. There was almost no limit to how small a transistor could be made. The invention of the transistor paved the way for the development of the integrated circuit chip and microelectronics in general. Personal computers as we know them would be impossible without transistors. The latest microprocessor chips contain over three million individual transistor switches, and the number of transistors that can be built into one chip continues to increase at a rapid pace.
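That closing observation about transistor counts was later formalized as Moore’s law: the number of transistors on a chip roughly doubles every couple of years. A back-of-the-envelope projection, taking the Intel 4004’s 2,300 transistors (1971) as a starting point and treating the two-year doubling period as a rule of thumb rather than a measured constant:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Rule-of-thumb transistor count assuming a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(transistors(1971))  # -> 2300.0
print(transistors(1991))  # -> 2355200.0, the "millions of transistors" era
```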

  1957 - Planar transistor developed by Jean Hoerni

  Planar technology is a process that forces certain types of atoms to infuse an otherwise almost pure piece of silicon. This technology permits thousands or even millions of transistors to be "grown" on a single wafer shaped crystal of silicon. These impurities, or dopants, create the conducting and control structures of the transistors. With the perfection of other technologies that allow microscopic metal conducting circuits to be deposited on the same crystal, integrated circuits became a reality.

  1958 - First integrated circuit

  In 1957, a group of eight electronics engineers and physicists formed Fairchild Semiconductor. The next year, Jack Kilby of Texas Instruments produced the first integrated circuit; Fairchild’s Robert Noyce independently developed a practical planar integrated circuit shortly afterward.

  1960’s - Library of Congress LC MARC

  In the 1960’s, the U.S. Library of Congress developed the LC MARC (MAchine-Readable Cataloging) system of putting bibliographic records in a form readable by mainframe computers. See the separate article on LC MARC records for more about the major breakthrough in information technology they represented.

  1960’s - ARPANET developed by the U.S. Department of Defense

  Originally intended to be a network of government, university, research, and scientific computers designed to enable researchers to share information, this government project eventually grew into the Internet. The networking technology and topology were originally designed to enable the network to survive nuclear attack: if part of the network were destroyed, traffic and data flow would be routed around the damage. Today, the Internet has replaced ARPANET and is no longer the exclusive domain of the researchers, scientists, and techno-whizzes it once was; rather, it has become accessible to an ever growing segment of mainstream humanity. The same ARPANET design philosophy, with improvements, enables the Internet to move data around overly busy portions of the network and provide more efficient service to consumers.
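The route-around-damage idea amounts to graph search: if a node drops out, a breadth-first search finds another path through the surviving links. A hypothetical four-site network (Python):

```python
from collections import deque

def find_path(links, start, goal, dead=()):
    """Breadth-first search for a path, ignoring failed ('dead') nodes."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen and nxt not in dead:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

# A ring of four sites, each linked to its two neighbours.
links = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["A", "C"]}
print(find_path(links, "A", "C"))               # -> ['A', 'B', 'C']
print(find_path(links, "A", "C", dead={"B"}))   # B knocked out -> ['A', 'D', 'C']
```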

  1969 - UNIX operating system developed

  Developed by AT&T engineers Ken Thompson and Dennis Ritchie, the UNIX operating system could handle multitasking and was ideally suited for networking minicomputers and large microcomputers. UNIX was developed in conjunction with the C programming language and is the natural environment for C. Today, both C and UNIX are available for a wider variety of computer hardware platforms than any other programming language or operating system. This quality of running on a wide variety of machines is called portability in computer programming jargon.

  1971 - Intel introduces the first microprocessor chip

  Three inventors, Andrew Grove, Robert Noyce, and Gordon Moore, founded Intel in 1968 to produce computer memory chips. In 1971, the 4004 microprocessor chip, designed by a team under the leadership of Federico Faggin, was introduced to replace the central processing units that heretofore had been constructed from discrete components. The microprocessor chip was born. Intel’s later products, from the 8080 through the 8088 to the current Pentium Pro, all descended from the 4004.

  1972 - Optical laserdisc developed by both Philips and MCA

  These early laserdiscs both stored 30 minutes of video and audio on a reflective plastic platter. The information was recorded as a series of pits which varied in reflectivity, thus causing the information to be converted to reflected laser light of varying intensity. This light was converted into electrical signals by a photo diode, and back into sound and pictures by standard analog television and audio technology.

  1974 - MCA and Philips agree on standard videodisc encoding format

  The cooperation by these two early manufacturers made it possible to develop standards which would make commercial distribution possible. Later, MCA and Philips developed the CAV (Constant Angular Velocity) system, which was suitable for storage of both still frames and video sequences on the same disc. The CAV standard also made it easy to interface a computer with an optical laserdisc, allowing precise control of playback. With the development of simple control codes that could be sent via a serial connection between computer and player, no programming was necessary to find and display individual frames and sequences to support or augment text displayed on the computer screen. This early form of multimedia was most often written in and controlled by a HyperCard stack.

  1974 - Motorola produces its first microprocessor chip

  Motorola’s 6800 was the forerunner of the 68000, used in the original Macintosh. Currently, the Macintosh uses another Motorola product, the Power PC chip.

  1975 - Altair microcomputer kit is the first personal computer available to the general public

  The Altair, displayed on the cover of Popular Electronics in 1975, was the first computer marketed to the home enthusiast. The front panel consisted of a series of small light emitting diodes, and the user could list and run programs written in machine language. The program listing and the results of a run were read off this display as binary numbers, lit LEDs representing ones and unlit LEDs representing zeros. Stepping through memory one byte at a time by pressing a switch, the programmer relied entirely on skill at reading machine language expressed in base 2.
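Reading the Altair’s front panel was binary-to-decimal conversion done in one’s head: each lit LED is a 1 bit, each dark LED a 0. The mental arithmetic, sketched in Python:

```python
def read_panel(leds):
    """Interpret a row of eight front-panel LEDs (most significant bit first,
    1 = lit, 0 = dark) as one byte."""
    value = 0
    for lit in leds:
        value = (value << 1) | lit
    return value

print(read_panel([1, 0, 0, 0, 1, 0, 1, 0]))  # -> 138 (0x8A)
```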

  1977 - Radio Shack introduces first personal computer with keyboard and CRT display

  This was the first complete personal computer to be marketed to the general public. Unlike others before it, the Tandy/Radio Shack computer came fully assembled with a built-in keyboard and monitor. This paved the way for convenient word processing and brought about the beginning of a great revolution in thinking which gradually took hold and gained momentum during the next decade. No longer would the computer be seen as an expensive mathematical tool of large scientific, military, and business institutions, but as a communication and information management tool accessible to everyone.

  1977 - Apple Computer begins delivery of the Apple II computer

  The Apple II came fully assembled with a built-in keyboard, monitor, and operating system software. The first Apple IIs used a cassette tape to store programs, but a floppy disk drive was soon available. With its ease in storing and running programs, the floppy disk made the Apple II the first computer suitable for use in elementary school classrooms.

  1984 - Apple Macintosh computer introduced

  The Macintosh was the first commercially successful personal computer to come with a graphical user interface and a mouse pointing device as standard equipment. With the coming of the Mac, the personal microcomputer began to undergo a major revolution in its purpose in serving humankind. No longer merely a mathematical tool of scientists, banks, and engineers, the micro was becoming the tool of choice for many graphics artists, teachers, instructional designers, librarians, and information managers. Its graphical representation of a desktop, with its little folders and paper documents, brought the idea of a metaphorical user interface to life. This new picture language introduced with the Macintosh would eventually develop standardized symbols for humans’ use in communicating with the machine and ultimately contribute to the World Wide Web’s metaphor of a virtual sense of place. The Macintosh GUI also paved the way for the development of multimedia; the hardware obstacles that had prevented hypermedia from becoming a reality were no more.

  Mid-1980’s - Artificial intelligence develops as a separate discipline from information science

  Artificial intelligence (AI) is a somewhat broad field that covers many areas. With the development of computer programming involving ever increasing levels of complexity, inheritance, and code re-use, culminating in object oriented programming, the software foundations for AI were laid. Other developments in cybernetics, neural networks, and human psychology added their contributions. Some practical but as yet imperfect implementations of AI include expert systems, management information systems (MIS), information searching using fuzzy logic, and human speech recognition. Artificial intelligence today is best described as a collection of electronic information processing tools that can be applied in a myriad of innovative ways to existing information technologies. Many scientists believe that a machine will never replicate the human mind and emotions, but that AI will be used to do more and more of the tedious labor of finding and presenting needed information from humanity’s vast collection of data.
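"Information searching using fuzzy logic" loosely resembles today’s approximate string matching, where a query can match a term it does not spell exactly. Python’s standard library offers a simple similarity ratio; this illustrates the general idea, not any particular AI system:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Similarity ratio in [0, 1]; 1.0 means identical strings."""
    return SequenceMatcher(None, a, b).ratio()

print(similarity("telegraph", "telegraph"))       # -> 1.0
print(similarity("telegraph", "telegraf") > 0.8)  # near miss still scores high
```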

  1987 - HyperCard developed by Bill Atkinson

  In August of this year, Apple Computer introduced HyperCard to the public by bundling it with all new Macintosh computers. Hypermedia was a reality at last, with the hardware and software now in place to bring it into being. HyperCard made hypertext document linking possible for the average person, who could build an information network linking all of his or her electronic documents entered or pasted into a HyperCard stack. Based on the metaphor of index cards in a recipe box, it was easy enough for even young students to use, yet powerful enough to become the software tool used to create the Voyager educational multimedia titles. It was HyperCard that made real Vannevar Bush’s vision of a personal library in which books or documents could be tied together with links based on the context of the information being linked. HyperCard also had provision for displaying graphics and controlling an external device to display video, which would ideally be a laserdisc player.

  1991 - 450 complete works of literature on one CD-ROM

  In 1991, two major commercial events took place which put the power of CD-ROM storage technology and computer based search engines in the hands of ordinary people. World Library Incorporated produced a fully searchable CD-ROM containing 450 (later expanded to 953) classical works of literature and historic documents. This demonstrated the power of the CD-ROM to take the text content of several bookshelves and concentrate it on one small piece of plastic. The other product was the electronic version of Grolier’s Encyclopedia, which actually contained a few pictures in addition to text. Both products were originally marketed through the Bureau of Electronic Publishing, a distributor of CD-ROM products. Many saw this as the ultimate in personal data storage and retrieval, but they didn’t have to wait long for much greater things in the world of multimedia. Though both titles originally sold for several hundred dollars, by 1994 they could be found at electronic flea markets selling for a dollar or two each. Technological advances had occurred so rapidly in this area that both the Multimedia PC standard and the Macintosh multimedia system extensions had made these two products obsolete within a couple of years.

  1991 - PowerPC chip introduced

  Working together, Motorola, Apple, and IBM developed the PowerPC RISC processor to be used in Apple Computer’s new Power Macintosh. The product line currently includes the 601, 603, and 604 microprocessors. These chips are designed around a reduced instruction set machine language, intended to produce more compact, faster executing code. Devotees of the Intel CISC chip architecture heartily disagree with this assertion. The result is that the consumer benefits from the intense competition to develop a better computer chip.

  January 1997 - RSA Internet security code cracked for a 48-bit key

