28 November 2007
Addendum to Hw-7...
As I asked at the end of class: at the end of / on the back of Homework 7, list the words / concepts (in English) that you have learned in this course since the beginning of the term... I want to see them...
Computer Software & Programming Languages
The term "software" was first used in this sense by
John W. Tukey in 1958. In computer science and software
engineering, computer software is all computer programs.
The concept of reading different sequences of instructions into
the memory of a device to control computations was invented
by Charles Babbage as part of his difference engine. The theory
that is the basis for most modern software was first proposed by
Alan Turing in his 1935 essay “Computable numbers with an
application to the Entscheidungsproblem”.
Practical computer systems divide software systems into three major classes: system software, programming software and application software, although the distinction is arbitrary and often blurred.
System software helps run the computer hardware and computer system. It includes operating systems, device drivers, diagnostic tools, servers, windowing systems, utilities and more. The purpose of systems software is to insulate the applications programmer as much as possible from the details of the particular computer complex being used, especially memory and other hardware features, and such accessory devices as communications, printers, readers, displays, keyboards, etc.
Programming software usually provides tools to assist a programmer in writing computer programs and software using different programming languages in a more convenient way. The tools include text editors, compilers, interpreters, linkers, debuggers, and so on. An integrated development environment (IDE) merges those tools into a single software bundle, and a programmer may not need to type multiple commands for compiling, interpreting, debugging, tracing, and so on, because the IDE usually has an advanced graphical user interface, or GUI.
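To make the tool chain above concrete, here is a tiny C program together with the command-line steps an IDE typically hides behind its menus. The GNU tools (gcc, gdb) are assumed only as one example; the text names no particular compiler or debugger.

    /* hello.c - a tiny program used only to illustrate the tool chain.
       Typical command-line steps an IDE would hide behind its menus:
         gcc -c hello.c            compile: source code -> object file hello.o
         gcc -o hello hello.o      link: object file(s) -> executable
         gcc -g -o hello hello.c   compile with debugging information included
         gdb ./hello               step through the program in a debugger  */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello from the tool chain!\n");
        return 0;
    }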
Application software allows end users to accomplish one or more specific (non-computer related) tasks. Typical applications include industrial automation, business software, educational software, medical software, databases, and computer games. Businesses are probably the biggest users of application software, but almost every field of human activity now uses some form of application software.
The History of Programming Languages
Ada Lovelace translated Italian mathematician Luigi Menabrea's memoir on Charles Babbage's newest proposed machine, the Analytical Engine, during a nine-month period in 1842-1843. She has been recognized by some historians as the world's first computer programmer.
In the 1940s the first recognizably modern, electrically powered computers were created. The limited speed and memory capacity forced programmers to write hand tuned assembly language programs. It was soon discovered that programming in assembly language required a great deal of intellectual effort and was error-prone.
Some important languages that were developed in this time period include:
1943 - Plankalkül (Konrad Zuse)
1943 - ENIAC coding system
1949 - C-10
In the 1950s the first three modern programming languages whose descendants are still in widespread use today were designed:
FORTRAN, the "FORmula TRANslator"
LISP, the "LISt Processor"
COBOL, the "COmmon Business Oriented Language"
Another milestone in the late 1950s was the publication, by a committee of American and European computer scientists, of "a new language for algorithms": the Algol 60 Report.
Algol 60 was particularly influential in the design of later languages, some of which soon became more popular. The Burroughs large systems were designed to be programmed in an extended subset of Algol.
Some important languages that were developed in this time period include:
1951 - Regional Assembly Language
1952 - Autocode
1954 - FORTRAN
1958 - LISP
1958 - ALGOL 58
1959 - COBOL
1962 - APL
1962 - Simula
1964 - BASIC
1964 - PL/I
The period from the late 1960s to the late 1970s brought a major flowering of programming languages. Most of the major language paradigms now in use were invented in this period:
Simula, invented in the late 1960s by Nygaard and Dahl as a superset of Algol 60, was the first language designed to support object-oriented programming.
Smalltalk (mid 1970s) provided a complete ground-up design of an object-oriented language.
C, an early systems programming language, was developed by Dennis Ritchie and Ken Thompson at Bell Labs between 1969 and 1973.
Prolog, designed in 1972 by Colmerauer, Roussel, and Kowalski, was the first logic programming language.
ML, developed by Robin Milner in 1973, introduced polymorphic type inference and helped pioneer statically typed functional programming.
The 1960s and 1970s also saw considerable debate over the merits of "structured programming", which essentially meant programming without the use of GOTO. This debate was closely related to language design: some languages did not include GOTO, which forced structured programming on the programmer. Although the debate raged hotly at the time, nearly all programmers now agree that, even in languages that provide GOTO, it is bad style to use it except in rare circumstances. As a result, later generations of language designers have found the structured programming debate tedious and even bewildering.
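As a small illustration of what was at stake in that debate, here are two versions of the same linear search in C: one written with goto, one with structured control flow. The array and values are made up purely for the example.

    #include <stdio.h>

    int main(void)
    {
        int data[] = {4, 8, 15, 16, 23, 42};
        int n = 6, target = 16, i;

        /* Unstructured version: the jumps make the control flow hard to follow. */
        i = 0;
    loop:
        if (i >= n) goto not_found;
        if (data[i] == target) goto found;
        i++;
        goto loop;
    found:
        printf("goto version: found %d at index %d\n", target, i);
        goto done;
    not_found:
        printf("goto version: %d not found\n", target);
    done: ;

        /* Structured version: the same logic with one loop and no jumps. */
        int pos = -1;
        for (i = 0; i < n; i++) {
            if (data[i] == target) { pos = i; break; }
        }
        if (pos >= 0)
            printf("structured version: found %d at index %d\n", target, pos);
        else
            printf("structured version: %d not found\n", target);
        return 0;
    }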
Some important languages that were developed in this time period include:
1970 - Pascal
1970 - Forth
1972 - C
1972 - Smalltalk
1972 - Prolog
1973 - ML
1978 - SQL
The 1980s were years of relative consolidation. C++ combined object-oriented and systems programming.
Some important languages that were developed in this time period include:
1983 - Ada
1983 - C++
1985 - Eiffel
1987 - Perl
1989 - FL (Backus)
The 1990s: the Internet age
The rapid growth of the Internet in the mid-1990s was the next major historic event in programming languages. By opening up a radically new platform for computer systems, the Internet created an opportunity for new languages to be adopted. In particular, the Java programming language rose to popularity because of its early integration with the Netscape Navigator web browser, and various scripting languages achieved widespread use in developing customized applications for web servers.
Some important languages that were developed in this time period include:
1990 - Haskell
1991 - Python
1993 - Ruby
1995 - Java
1995 - PHP
2000 - C#
21 November 2007
Hw6 & Hw7; due dates: Nov 28th and Dec 5th, respectively
Hw6:
The operating system is the most important program that runs on a computer. Every general-purpose computer must have an operating system to run other programs. Operating systems perform basic tasks, such as recognizing input from the keyboard, sending output to the display screen, keeping track of files and directories on the disk, and controlling peripheral devices such as disk drives and printers. For large systems, the operating system has even greater responsibilities and powers. It is like a traffic cop: it makes sure that different programs and users running at the same time do not interfere with each other. The operating system is also responsible for security, ensuring that unauthorized users do not access the system.
Operating systems can be classified as follows:
•multi-user: Allows two or more users to run programs at the same time. Some operating systems permit hundreds or even thousands of concurrent users.
•multiprocessing: Supports running programs on more than one CPU.
•multitasking: Allows more than one program to run concurrently.
•multithreading: Allows different parts of a single program to run concurrently (see the sketch after this list).
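For the multithreading item above, here is a minimal sketch using POSIX threads. The homework text names no particular API, so pthreads is only an assumed example; build with gcc threads.c -o threads -lpthread.

    #include <pthread.h>
    #include <stdio.h>

    /* Each thread runs this function: two parts of one program running
       concurrently, scheduled by the operating system. */
    static void *worker(void *arg)
    {
        const char *name = (const char *)arg;
        for (int i = 0; i < 3; i++)
            printf("%s: step %d\n", name, i);
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, "thread A");
        pthread_create(&t2, NULL, worker, "thread B");
        /* The output of the two threads may interleave in any order. */
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        return 0;
    }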
Hw7:
Operating systems provide a software platform on top of which other programs, called application programs, can run. The application programs must be written to run on top of a particular operating system. Your choice of operating system, therefore, determines to a great extent the applications you can run. For PCs, the most popular operating systems are DOS, OS/2 and Windows, but others are available, such as Linux.
As a user, you normally interact with the operating system through a set of commands. For example, the DOS operating system contains commands such as COPY and RENAME for copying files and changing the names of files, respectively. The commands are accepted and executed by a part of the operating system called the command processor or command line interpreter. Graphical user interfaces allow you to enter commands by pointing and clicking at objects that appear on the screen.
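To illustrate what a command processor does, here is a toy command-line interpreter in C. It only recognizes and echoes COPY, RENAME, and EXIT; it is a sketch of the idea, not real DOS code, and the command names are simply taken from the examples above.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char line[256], cmd[32], arg1[100], arg2[100];

        printf("> ");
        while (fgets(line, sizeof line, stdin)) {
            /* Split the line into a command word and up to two arguments. */
            int n = sscanf(line, "%31s %99s %99s", cmd, arg1, arg2);
            if (n >= 1 && strcmp(cmd, "EXIT") == 0)
                break;
            else if (n == 3 && strcmp(cmd, "COPY") == 0)
                printf("(would copy %s to %s)\n", arg1, arg2);
            else if (n == 3 && strcmp(cmd, "RENAME") == 0)
                printf("(would rename %s to %s)\n", arg1, arg2);
            else
                printf("Bad command or file name\n");
            printf("> ");
        }
        return 0;
    }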
18 November 2007
Feedback on Midterm-Exam1
Grades on this exam are quite low. The questions were very simple; 90 percent of them were things we did in class... short, simple sentences... but unfortunately, on the whole, it just didn't happen. The question "who invented the computer?" will be asked in every exam until you are able to answer it correctly... or it may be given as homework. :(
It is also clear that some of you do not read the blog. Look at my post right below this one. It is very simple, clear and plain: it starts with "who invented the computer" and explains it. In other words, the teacher is openly giving you both the question and the answer. And asks exactly the same thing, yet only one or two people could answer. ...The reason I take the time to write feedback on this blog is to contribute to your learning. If you are not going to read it, I will stop writing... Please write your opinions on this. By the way, I would like you to know that people who somehow reach this blog from the web e-mail me saying "how nice that you give your students feedback like this; how many instructors would do that"... I thank them... but I do not know whether you yourselves benefit from any of it.
The Turkish of the sentence "the integrated circuit is nothing more than a very advanced electric circuit" is ==> "Tümleşik (bütünleşik) devre, çok ileri düzey bir elektrik devresinden başka bir şey değildir".
Low-volume: Düşük hacimli
PLEASE search for and understand what "turnaround time" is.
And PLEASE do not always render "storage" simply as "depolama". In some places (in my opinion, in most places) "saklama" is more appropriate... e.g., "storing instructions" ==> "komutları saklama"... in my view, a command is not something you "deposit".
Is secondary storage only hard disks??????? This is not acceptable for a 3rd-year student in a CEIT department!
07 November 2007
Texts of Nov. 7
Computers’ History
"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention.
This series covers many of the major milestones in computer history (but not all of them) with a concentration on the history of personal home computers.
Each entry below gives the year, the inventors / invention, and a short description of the event.
1936
Konrad Zuse - Z1 Computer
First freely programmable computer.
1942
John Atanasoff & Clifford Berry ABC Computer
Who was first in the computing biz is not always as easy as ABC.
1944
Howard Aiken & Grace Hopper Harvard Mark I Computer
The Harvard Mark 1 computer.
1946
John Presper Eckert & John W. Mauchly ENIAC 1 Computer
20,000 vacuum tubes later...
1948
Frederic Williams & Tom Kilburn Manchester Baby Computer & The Williams Tube
Baby and the Williams Tube turn on the memories.
1947/48
John Bardeen, Walter Brattain & William Shockley The Transistor
No, a transistor is not a computer, but this invention greatly affected the history of computers.
1951
John Presper Eckert & John W. Mauchly UNIVAC Computer
First commercial computer & able to pick presidential winners.
1953
International Business Machines IBM 701 EDPM Computer
IBM enters into 'The History of Computers'.
1954
John Backus & IBM FORTRAN Computer Programming Language
The first successful high level programming language.
1955 (in use 1959)
Stanford Research Institute, Bank of America, and General Electric - ERMA and MICR
The first bank industry computer - also MICR (magnetic ink character recognition) for reading checks.
1958
Jack Kilby & Robert Noyce The Integrated Circuit
Otherwise known as 'The Chip'.
1962
Steve Russell & MIT Spacewar Computer Game
The first computer game invented.
1964
Douglas Engelbart Computer Mouse & Windows
Nicknamed the mouse because the tail came out the end.
1969
ARPAnet
The original Internet.
1970
Intel 1103 Computer Memory
The world's first available dynamic RAM chip.
1971
Faggin, Hoff & Mazor Intel 4004 Computer Microprocessor
The first microprocessor.
1971
Alan Shugart & IBM The "Floppy" Disk
Nicknamed the "Floppy" for its flexibility.
1973
Robert Metcalfe & Xerox The Ethernet Computer Networking
Networking.
1974/75
Scelbi & Mark-8 Altair & IBM 5100 Computers
The first consumer computers.
1976/77
Apple I, II & TRS-80 & Commodore Pet Computers
More first consumer computers.
1978
Dan Bricklin & Bob Frankston VisiCalc Spreadsheet Software
Any product that pays for itself in two weeks is a surefire winner.
1979
Seymour Rubenstein & Rob Barnaby WordStar Software
Word Processors.
1981
IBM The IBM PC - Home Computer
From an "Acorn" grows a personal computer revolution
1981
Microsoft MS-DOS Computer Operating System
From "Quick And Dirty" comes the operating system of the century.
1983
Apple Lisa Computer
The first home computer with a GUI, graphical user interface.
1984
Apple Macintosh Computer
The more affordable home computer with a GUI.
1985
Microsoft Windows
Microsoft begins the friendly war with Apple.
The von Neumann architecture
The von Neumann architecture is a computer design model that uses a processing unit and a single separate storage structure to hold both instructions and data. It is named after mathematician and early computer scientist John von Neumann.
The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculator (in principle) is a fixed program computer. It can do basic mathematics, but it cannot be used as a word processor or to run video games. To change the program of such a machine, you have to re-wire, re-structure, or even re-design the machine. Indeed, the earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a very manual process, starting with flow charts and paper notes, followed by detailed engineering designs, and then the often-arduous process of implementing the physical changes.
The idea of the stored-program computer changed all that. By creating an instruction set architecture and detailing the computation as a series of instructions (the program), the machine becomes much more flexible. By treating those instructions in the same way as data, a stored-program machine can easily change the program, and can do so under program control.
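Here is a minimal sketch of the stored-program idea in C: instructions and data share one memory array, and a fetch-decode-execute loop works through them. The three-cell instruction format is invented purely for this illustration.

    #include <stdio.h>

    enum { HALT = 0, ADD = 1, PRINT = 2 };   /* a made-up instruction set */

    int main(void)
    {
        /* One memory array holds both the program (cells 0-8)
           and the data it operates on (cells 20-21). */
        int memory[32] = {
            ADD,   20, 21,   /* memory[20] = memory[20] + memory[21] */
            PRINT, 20, 0,    /* print memory[20]                     */
            HALT,  0,  0,
        };
        memory[20] = 2;
        memory[21] = 3;

        for (int pc = 0; ; pc += 3) {            /* fetch-decode-execute loop */
            int op = memory[pc], a = memory[pc + 1], b = memory[pc + 2];
            if (op == HALT)
                break;
            else if (op == ADD)
                memory[a] = memory[a] + memory[b];
            else if (op == PRINT)
                printf("%d\n", memory[a]);
        }
        /* Because the program is ordinary data in memory, the machine (or the
           program itself) could change those cells and so change the program. */
        return 0;
    }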
The terms "von Neumann architecture" and "stored-program computer" are generally used interchangeably, and that usage is followed in this article. However, the Harvard architecture concept should be mentioned as a design which stores the program in an easily modifiable form, but not using the same storage as for general data.
I/O Channels
The earliest computers were very pure von Neumann machines: all I/O had to go through the CPU, with no notion of DMA or the like.
IBM introduced the idea of hardware "channels" to manage I/O: a switch between the CPU, devices, and memory. This was probably the earliest example of parallel processing.
For a long time, the most reasonable way to distinguish between a "minicomputer" and a "mainframe" was whether there were dedicated I/O and memory buses, or whether everything plugged into a single bus. The advantage of the former design is speed: the memory bus does not have to worry about arbitration, so memory accesses can be faster. The advantages of the latter are cost and uniformity.
I/O Channel
In computer science, channel I/O is a generic term that refers to an advanced, high-performance input/output architecture that is implemented in various forms on a number of computer architectures, especially on mainframe computers. In the past they were generally implemented with a custom processor, known alternately as peripheral processors, I/O processors, I/O controllers or DMA controllers.
Many input/output tasks can be fairly complex and require logic to be applied to the data to convert formats and perform other similar duties. In these situations the computer's CPU would normally be asked to handle the logic, but because the I/O devices are very slow, the CPU would end up spending a huge amount of time (in computer terms) sitting idle, waiting for the data from the device. A channel I/O architecture avoids this problem by using a low-cost processor with enough logic and memory onboard to handle these sorts of tasks. Such processors are typically not powerful or flexible enough to be used as computers on their own, and are actually a form of co-processor. The CPU sends small programs to the controller to handle an I/O job, which the channel controller can then complete without any help from the CPU. When the job is complete, or there is an error, the channel controller communicates with the CPU using interrupts. Since the channel controller has direct access to the main memory of the computer, such controllers are also often referred to as DMA controllers (where DMA means direct memory access).
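As a rough software analogy of this arrangement (an illustration only; this is not how real channel programs are written), the sketch below lets a helper thread play the channel controller: the main "CPU" thread hands off an I/O job, keeps computing, and later notices a completion flag, much as a real CPU would take an interrupt. Build with gcc channel.c -o channel -lpthread.

    #include <pthread.h>
    #include <stdio.h>
    #include <unistd.h>

    static int io_done = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    /* The "channel controller": completes a slow I/O job on its own,
       then raises a completion flag (standing in for an interrupt). */
    static void *channel_controller(void *arg)
    {
        (void)arg;
        sleep(1);                    /* pretend this is a slow device transfer */
        pthread_mutex_lock(&lock);
        io_done = 1;
        pthread_mutex_unlock(&lock);
        return NULL;
    }

    int main(void)
    {
        pthread_t channel;
        pthread_create(&channel, NULL, channel_controller, NULL);

        /* The "CPU" keeps doing useful work instead of idling during the I/O. */
        long sum = 0;
        int done = 0;
        while (!done) {
            for (int i = 0; i < 1000000; i++)
                sum += i;
            pthread_mutex_lock(&lock);
            done = io_done;
            pthread_mutex_unlock(&lock);
        }
        pthread_join(channel, NULL);
        printf("I/O finished while the CPU computed (sum so far: %ld)\n", sum);
        return 0;
    }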
The first use of channel I/O was with the IBM 709 vacuum tube mainframe, whose Model 766 Data Synchronizer was the first channel, in 1957. Its transistorized successor, the IBM 7090, had channels (the 7607) and a channel multiplexor (the 7606) which could control up to eight channels. For System/360 computers, and even for early System/370 models, the selector channels, and the multiplexor channels in the larger System/360 computers, were still bulky and expensive separate processors.
One of the earlier non-IBM channel systems was provided in 1964 with the CDC 6600 supercomputer, which used 10 logically independent computers called peripheral processors.
However, with the rapid speed increases in computers today, combined with operating systems that don't "block" when waiting for data, the channel controller has become somewhat redundant and is not commonly found on smaller machines.
Chess-playing Computers
The idea of creating a chess-playing machine dates back to the eighteenth century. Around 1769, the chess playing automaton called The Turk became famous before being exposed as a hoax. Before the development of digital computing, serious trials based on automatons such as El Ajedrecista of 1912, were too complex and limited to be useful for playing full games of chess. The field of mechanical chess research languished until the advent of the digital computer in the 1950s. Since then, chess enthusiasts and computer engineers have built, with increasing degrees of seriousness and success, chess-playing machines and computer programs.
Chess-playing computers are now available at a very low cost. There are many programs such as Crafty, Fruit and GNU Chess that can be downloaded from the Internet for free, and yet, running on virtually any modern personal computer, can defeat most master players under tournament conditions. Top commercial programs like Shredder or Fritz have surpassed even world-champion-caliber players at blitz and short time controls. As of February 2007, Rybka is top-rated in many rating lists, such as the CCRL, CEGT, SSDF, SCCT, and CSS lists, and has won many recent official computer chess tournaments.
Artificial Intelligence
The modern definition of artificial intelligence (or AI) is "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions which maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the science and engineering of making intelligent machines."
AI research uses tools and insights from many fields, including computer science, psychology, philosophy, neuroscience, cognitive science, linguistics, operations research, economics, control theory, probability, optimization and logic. AI research also overlaps with tasks such as robotics, control systems, scheduling, data mining, logistics, speech recognition, facial recognition and many others.
Typical problems to which AI methods are applied
Pattern recognition
Optical character recognition
Handwriting recognition
Speech recognition
Face recognition
Artificial Creativity
Computer vision, Virtual reality and Image processing
Diagnosis (artificial intelligence)
Game theory and Strategic planning
Game artificial intelligence and Computer game bot
Natural language processing, Translation and Chatterbots
Non-linear control and Robotics
Other fields in which AI methods are implemented
Artificial life
Automated reasoning
Automation
Biologically-inspired computing
Concept mining
Data mining
Knowledge representation
Semantic Web
E-mail spam filtering
Behavior-based robotics
Cognitive robotics
Cybernetics
Epigenetic robotics
Evolutionary robotics
Hybrid intelligent system
Intelligent agent
Robots
Robots have frequently appeared as characters in works of literature; the word robot comes from Karel Čapek's play R.U.R. (Rossum's Universal Robots), premiered in 1920. Isaac Asimov wrote many volumes of science fiction focusing on robots in numerous forms and guises, contributing greatly to reducing the Frankenstein complex, which dominated early works of fiction involving robots. His three laws of robotics have become particularly well known for codifying a simple set of behaviors for robots to remain at the service of their human creators.
Numerous words for different types of robots are now used in literature. Robot has come to mean mechanical humans, while android is a generic term for artificial humans. Cyborg or "bionic man" is used for a human form that is a mixture of organic and mechanical parts. Organic artificial humans have also been referred to as "constructs" (or "biological constructs").
In science fiction, the Three Laws of Robotics are a set of three rules written by Isaac Asimov, which almost all positronic robots appearing in his fiction must obey. Introduced in his 1942 short story "Runaround", although foreshadowed in a few earlier stories, the Laws state the following:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Later, Asimov added the Zeroth Law: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm"; the rest of the laws are modified sequentially to acknowledge this.
According to the Oxford English Dictionary, the first passage in Asimov's short story "Liar!" (1941) that mentions the First Law is the earliest recorded use of the word robotics. Asimov was not initially aware of this; he assumed the word already existed by analogy with mechanics, hydraulics, and other similar terms denoting branches of applied knowledge.
Printers
A computer printer, or more commonly just a printer, produces a hard copy (permanent human-readable text and/or graphics) of documents stored in electronic form, usually on physical print media such as paper or transparencies. Many printers are primarily used as computer peripherals, and are attached by a printer cable to a computer which serves as a document source. Other printers, commonly known as network printers, have built-in network interfaces (typically wireless or Ethernet), and can serve as a hardcopy device for any user on the network.
Printers are designed for low-volume, short-turnaround print jobs, requiring virtually no setup time to achieve a hard copy of a given document. However, printers are generally slow devices (30 pages per minute is considered fast, and many consumer printers are far slower than that), and the cost per page is relatively high.
In contrast, the printing press (which serves much the same function) is designed and optimized for high-volume print jobs such as newspaper print runs; printing presses are capable of hundreds of pages per minute or more, and have an incremental cost per page which is a fraction of that of printers.
The printing press remains the machine of choice for high-volume, professional publishing. However, as printers have improved in quality and performance, many jobs which used to be done by professional print shops are now done by users on local printers; see desktop publishing.
The world's first computer printer was a 19th century mechanically driven apparatus invented by Charles Babbage for his Difference Engine. In 2007, a study revealed that toner-based printers produced pollution as harmful as that from cigarettes.
Blu-ray
The name Blu-ray Disc is derived from the blue-violet laser used to read and write this type of disc. Because of its shorter wavelength (405 nm), substantially more data can be stored on a Blu-ray Disc than on the DVD format, which uses a red (650 nm) laser. A single layer Blu-ray Disc can store 25 gigabytes (GB), over five times the size of a single layer DVD at 4.7 GB. A dual layer Blu-ray Disc can store 50 GB, almost 6 times the size of a dual layer DVD at 8.5 GB.
Blu-ray was developed by the Blu-ray Disc Association, a group of leading companies representing consumer electronics, computer hardware, and motion picture production. The standard is covered by several patents belonging to different companies. As of March 2007, a joint licensing agreement for all the relevant patents has not yet been finalized.
As of October 23, 2007, 351 titles have been released on Blu-ray Disc in the United States (32 of those titles have since been discontinued). As of October 9, 2007, 179 titles have been released in Japan, with 55 titles planned for release.
The Blu-ray standard is currently in a format war with its rival HD DVD, to determine which (if either) of the two formats will become the leading carrier for high-definition content to consumers.
Database management systems
A database management system (DBMS), sometimes just called a database manager, is a program that lets one or more computer users create and access data in a database. The DBMS manages user requests (and requests from other programs) so that users and other programs are free from having to understand where the data is physically located on storage media and, in a multi-user system, who else may also be accessing the data. In handling user requests, the DBMS ensures the integrity of the data (that is, making sure it continues to be accessible and is consistently organized as intended) and security (making sure only those with access privileges can access the data). The most typical DBMS is a relational database management system (RDBMS). A standard user and program interface is the Structured Query Language (SQL). A newer kind of DBMS is the object-oriented database management system (ODBMS).
A DBMS can be thought of as a file manager that manages data in databases rather than files in file systems.
A DBMS is usually an inherent part of a database product. On PCs, Microsoft Access is a popular example of a single- or small-group user DBMS. Microsoft's SQL Server is an example of a DBMS that serves database requests from multiple (client) users. Other popular DBMSs (these are all RDBMSs, by the way) are IBM's DB2, Oracle's line of database management products, and Sybase's products.
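As a concrete sketch of talking to a relational DBMS through SQL, here is a minimal C program using the SQLite library; SQLite is not mentioned in the text above and is only a convenient embeddable stand-in for the larger products named there. Build with gcc dbms_demo.c -o dbms_demo -lsqlite3.

    #include <sqlite3.h>
    #include <stdio.h>

    /* Called once per row returned by the SELECT below. */
    static int print_row(void *unused, int ncols, char **values, char **names)
    {
        (void)unused;
        for (int i = 0; i < ncols; i++)
            printf("%s = %s\n", names[i], values[i] ? values[i] : "NULL");
        return 0;   /* returning 0 lets SQLite continue with the next row */
    }

    int main(void)
    {
        sqlite3 *db;
        char *err = NULL;

        /* The DBMS decides how and where the data is physically stored;
           the program only issues SQL. */
        if (sqlite3_open("students.db", &db) != SQLITE_OK) {
            fprintf(stderr, "cannot open database: %s\n", sqlite3_errmsg(db));
            return 1;
        }

        const char *sql =
            "CREATE TABLE IF NOT EXISTS students (id INTEGER PRIMARY KEY, name TEXT);"
            "INSERT INTO students (name) VALUES ('Ada');"
            "SELECT id, name FROM students;";

        if (sqlite3_exec(db, sql, print_row, NULL, &err) != SQLITE_OK) {
            fprintf(stderr, "SQL error: %s\n", err);
            sqlite3_free(err);
        }
        sqlite3_close(db);
        return 0;
    }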
Distance Learning
The terms "distance education" or "distance learning" have been applied interchangeably by many different researchers to a great variety of programs, providers, audiences, and media. Its hallmarks are the separation of teacher and learner in space and/or time (Perraton, 1988), the volitional control of learning by the student rather than the distant instructor (Jonassen, 1992), and noncontiguous communication between student and teacher, mediated by print or some form of technology (Keegan, 1986; Garrison and Shale, 1987).
Traditionally, we think of distance learners as adults.
At the elementary and middle school levels, distance learning usually takes the form of curriculum enrichment modules and ongoing telecommunications projects.
At the secondary level, locally or federally funded distance education addresses the needs of small rural school districts or underserved urban school districts. Some secondary school students may enroll in courses to meet graduation requirements which their own districts are unable to offer; some take advanced placement, foreign language, or vocational classes; others may be homebound or disabled. In many instances, talented or gifted high school students have been selected to attend distance classes because of their high academic ability and capacity for handling independent work.
Although technology is an integral part of distance education, any successful program must focus on the instructional needs of the students, rather than on the technology itself. It is essential to consider their ages, cultural and socioeconomic backgrounds, interests and experiences, educational levels, and familiarity with distance education methods and delivery systems (Schamber, 1988). Students usually adapt more quickly than their teachers to new technology. On the other hand, teachers who have begun to feel comfortable with the equipment don't mind having their students teach them new tips and tricks.
Supercomputers
The TOP500 project ranks and details the 500 most powerful publicly-known computer systems in the world. The project was started in 1993 and publishes an updated list of the supercomputers twice a year. The project aims to provide a reliable basis for tracking and detecting trends in high-performance computing.
In the early nineties, a new definition of supercomputer was needed to produce meaningful statistics. After experimenting with metrics based on processor count in 1992, the idea was born at the University of Mannheim to use a detailed listing of installed systems as the basis.
The systems ranked #1 since 1993
IBM Blue Gene/L (since 2004.11)
NEC Earth Simulator (2002.06 - 2004.11)
IBM ASCI White (2000.11 - 2002.06)
Intel ASCI Red (1997.06 - 2000.11)
Hitachi CP-PACS (1996.11 - 1997.06)
Hitachi SR2201 (1996.06 - 1996.11)
Fujitsu Numerical Wind Tunnel (1994.11 - 1996.06)
Intel Paragon XP/S140 (1994.06 - 1994.11)
Fujitsu Numerical Wind Tunnel (1993.11 - 1994.06)
TMC CM-5 (1993.06 - 1993.11)
Silicon Valley
Silicon Valley is the southern part of the San Francisco Bay Area in Northern California in the United States. The term originally referred to the region's large number of silicon chip innovators and manufacturers, but eventually came to refer to all the high-tech businesses in the area; it is now generally used as a metonym for the high-tech sector. Despite the development of other high-tech economic centers throughout the United States, Silicon Valley continues to be the leading high-tech hub because of its large number of engineers and venture capitalists.
The term Silicon Valley was coined by Ralph Vaerst, a Northern California entrepreneur. His journalist friend, Don Hoefler, first published the term in 1971. He used it as the title of a series of articles "Silicon Valley USA" in a weekly trade newspaper Electronic News which started with the January 11, 1971 issue. Valley refers to the Santa Clara Valley, located at the southern end of San Francisco Bay, while Silicon refers to the high concentration of semiconductor and computer-related industries in the area. These and similar technology and electricity firms slowly replaced the orchards which gave the area its initial nickname, the Valley of Heart's Delight.
Electric Circuits
The integrated circuit is nothing more than a very advanced electric circuit. An electric circuit is made from different electrical components such as transistors, resistors, capacitors and diodes, that are connected to each other in different ways. These components have different behaviors.
The transistor acts like a switch. It can turn electricity on or off, or it can amplify current. It is used for example in computers to store information, or in stereo amplifiers to make the sound signal stronger.
The resistor limits the flow of electricity and gives us the possibility to control the amount of current that is allowed to pass. Resistors are used, among other things, to control the volume in television sets or radios.
The capacitor collects electricity and releases it all in one quick burst, as for instance in cameras, where a tiny battery can provide enough energy to fire the flashbulb.
The diode stops electricity under some conditions and allows it to pass only when these conditions change. This is used in, for example, photocells where a light beam that is broken triggers the diode to stop electricity from flowing through it.
These components are like the building blocks in an electrical construction kit. Depending on how the components are put together when building the circuit, everything from a burglar alarm to a computer microprocessor can be constructed.
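As a toy model of the "switch" idea in C, the sketch below shows how simple on/off components combine into something that computes; it illustrates the building-block principle only and is not a circuit simulation.

    #include <stdio.h>

    /* Two switches in series conduct only when both are on: an AND gate. */
    static int and_gate(int a, int b) { return a && b; }
    /* Two switches in parallel conduct when either is on: an OR gate.    */
    static int or_gate(int a, int b)  { return a || b; }
    static int not_gate(int a)        { return !a; }

    int main(void)
    {
        /* A half adder built from these gates: adds two one-bit numbers. */
        for (int a = 0; a <= 1; a++) {
            for (int b = 0; b <= 1; b++) {
                int sum   = or_gate(and_gate(a, not_gate(b)),
                                    and_gate(not_gate(a), b));   /* XOR */
                int carry = and_gate(a, b);
                printf("%d + %d  ->  sum %d, carry %d\n", a, b, sum, carry);
            }
        }
        return 0;
    }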
Card Punch
The first practical use of punched cards for data processing is credited to the American inventor Herman Hollerith, who decided to use Jacquard's punched cards to represent the data gathered for the American census of 1890, and to read and collate this data using an automatic machine.
Herman Hollerith's pantographic card punch was developed for the 1890 US census. Prior to 1890, cards were punched using a train conductor's ticket punch, which allowed holes to be placed only around the edge of the card, was not terribly accurate, and tended to induce strain injuries. The pantographic punch allowed accurate placement of holes with minimum physical strain, one hole at a time, and also provided access to the interior of the card, allowing more information per card.
A skilled operator could punch 700 cards per day. This photo was taken during the 1920 census.
Hollerith's systems opened up an important new field of employment to women, starting with the 1890 Census.
Prior to 1928, IBM cards were punched with round holes, originally in 20 columns but by the late 1920s in 45 columns, as shown in this New York University registration card.
Beginning in 1928, IBM cards had rectangular holes 80 columns across, as shown in this Iowa State University registration card.
IBM Type 032 Printing Punch
This is the IBM 032 Printing Punch of 1933. It was the first keypunch machine capable of punching letters as well as digits (previously possible only by multipunching) and of interpreting the punches by printing across the top of the card. It was used by the US Army in World War II for Morning Reports and the like.
“Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention.
This series covers many of the major milestones in computer history (but not all of them) with a concentration on the history of personal home computers.
Computer History Year/Enter
Computer History Inventors/Inventions
Computer History Description of Event
1936
Konrad Zuse - Z1 Computer
First freely programmable computer.
1942
John Atanasoff & Clifford Berry ABC Computer
Who was first in the computing biz is not always as easy as ABC.
1944
Howard Aiken & Grace Hopper Harvard Mark I Computer
The Harvard Mark 1 computer.
1946
John Presper Eckert & John W. Mauchly ENIAC 1 Computer
20,000 vacuum tubes later...
1948
Frederic Williams & Tom Kilburn Manchester Baby Computer & The Williams Tube
Baby and the Williams Tube turn on the memories.
1947/48
John Bardeen, Walter Brattain & Wiliam Shockley The Transistor
No, a transistor is not a computer, but this invention greatly affected the history of computers.
1951
John Presper Eckert & John W. Mauchly UNIVAC Computer
First commercial computer & able to pick presidential winners.
1953
International Business Machines IBM 701 EDPM Computer
IBM enters into 'The History of Computers'.
1954
John Backus & IBM FORTRAN Computer Programming Language
The first successful high level programming language.
1955(In Use 1959)
Stanford Research Institute, Bank of America, and General ElectricERMA and MICR
The first bank industry computer - also MICR (magnetic ink character recognition) for reading checks.
1958
Jack Kilby & Robert Noyce The Integrated Circuit
Otherwise known as 'The Chip'
1962
Steve Russell & MIT Spacewar Computer Game
The first computer game invented.
1964
Douglas Engelbart Computer Mouse & Windows
Nicknamed the mouse because the tail came out the end.
1969
ARPAnet
The original Internet.
1970
Intel 1103 Computer Memory
The world's first available dynamic RAM chip.
1971
Faggin, Hoff & Mazor Intel 4004 Computer Microprocessor
The first microprocessor.
1971
Alan Shugart &IBM The "Floppy" Disk
Nicknamed the "Floppy" for its flexibility.
1973
Robert Metcalfe & Xerox The Ethernet Computer Networking
Networking.
1974/75
Scelbi & Mark-8 Altair & IBM 5100 Computers
The first consumer computers.
1976/77
Apple I, II & TRS-80 & Commodore Pet Computers
More first consumer computers.
1978
Dan Bricklin & Bob Frankston VisiCalc Spreadsheet Software
Any product that pays for itself in two weeks is a surefire winner.
1979
Seymour Rubenstein & Rob Barnaby WordStar Software
Word Processors.
1981
IBM The IBM PC - Home Computer
From an "Acorn" grows a personal computer revolution
1981
Microsoft MS-DOS Computer Operating System
From "Quick And Dirty" comes the operating system of the century.
1983
Apple Lisa Computer
The first home computer with a GUI, graphical user interface.
1984
Apple Macintosh Computer
The more affordable home computer with a GUI.
1985
Microsoft Windows
Microsoft begins the friendly war with Apple.
The von Neumann architecture
The von Neumann architecture is a computer design model that uses a processing unit and a single separate storage structure to hold both instructions and data. It is named after mathematician and early computer scientist John von Neumann.
The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculator (in principle) is a fixed program computer. It can do basic mathematics, but it cannot be used as a word processor or to run video games. To change the program of such a machine, you have to re-wire, re-structure, or even re-design the machine. Indeed, the earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a very manual process, starting with flow charts and paper notes, followed by detailed engineering designs, and then the often-arduous process of implementing the physical changes.
The idea of the stored-program computer changed all that. By creating an instruction set architecture and detailing the computation as a series of instructions (the program), the machine becomes much more flexible. By treating those instructions in the same way as data, a stored-program machine can easily change the program, and can do so under program control.
The terms "von Neumann architecture" and "stored-program computer" are generally used interchangeably, and that usage is followed in this article. However, the Harvard architecture concept should be mentioned as a design which stores the program in an easily modifiable form, but not using the same storage as for general data.
I/O Channels
Earliest computers were very pure von Neumann machines: all IO had to go through CPU. No notion of DMA, etc.
IBM introduced idea of hardware ``channels'' to manage IO. Switch between CPU, devices, memory. Probably earliest example of parallel processing.
For a long time, the most reasonable way to distinguish between a ``minicomputer'' and a ``mainframe'' was by whether or not there were dedicated IO and memory busses, or if everything plugged into a single bus. Advantage of former system is speed; memory bus doesn't have to worry about arbitration, so memory accesses can be faster. Advantages of latter system are cost and uniformity.
I/O Channel
In computer science, channel I/O is a generic term that refers to an advanced, high-performance input/output architecture that is implemented in various forms on a number of computer architectures, especially on mainframe computers. In the past they were generally implemented with a custom processor, known alternately as peripheral processors, I/O processors, I/O controllers or DMA controllers.
Many input/output tasks can be fairly complex and require logic to be applied to the data to convert formats and other similar duties. In these situations the computer's CPU would normally be asked to handle the logic, but due to the fact that the I/O devices are very slow, the CPU would end up spending a huge amount of time (in computer terms) sitting idle waiting for the data from the device. A channel I/O architecture avoids this problem by using a low-cost processor with enough logic and memory onboard to handle these sorts of tasks. They are typically not powerful or flexible enough to be used as a computer on their own, and are actually a form of co-processor. The CPU sends small programs to the controller to handle an I/O job, which the channel controller can then complete without any help from the CPU. When it is complete, or there is an error, the channel controller communicates with the CPU using a selection of interrupts. Since the channel controller has direct access to the main memory of the computer, they are also often referred to as DMA Controllers (where DMA means direct memory access),
The first use of channel I/O was with the IBM 709[1] vacuum tube mainframe, whose Model 766 Data Synchronizer was the first channel in 1957. Its transistorized successor the IBM 7090[2] had channels (the 7607) and a channel multiplexor (the 7606) which could control up to eight channels. For System/360 computers, and even for early System/370 models, the selector channels, and the multiplexor channels in the larger System/360 computers, still were bulky and expensive separate processors,
One of the earlier non-IBM channel systems was provided in 1964 with the CDC 6600 supercomputer, which used 10 logically independent computers called peripheral processors,
However with the rapid speed increases in computers today, combined with operating systems that don't "block" when waiting for data, the channel controller has become somewhat redundant and are not commonly found on smaller machines.
Chess-playing Computers
The idea of creating a chess-playing machine dates back to the eighteenth century. Around 1769, the chess playing automaton called The Turk became famous before being exposed as a hoax. Before the development of digital computing, serious trials based on automatons such as El Ajedrecista of 1912, were too complex and limited to be useful for playing full games of chess. The field of mechanical chess research languished until the advent of the digital computer in the 1950s. Since then, chess enthusiasts and computer engineers have built, with increasing degrees of seriousness and success, chess-playing machines and computer programs.
Chess-playing computers are now available at a very low cost. There are many programs such as Crafty, Fruit and GNU Chess that can be downloaded from the Internet for free, and yet play a game that with the aid of virtually any modern personal computer, can defeat most master players under tournament conditions. Top commercial programs like Shredder or Fritz have surpassed even world champion caliber players at blitz and short time controls. As of February 2007, Rybka is top-rated in many rating lists such CCRL, CEGT, SSDF, SCCT, and CSS rating lists and has won many recent official computer chess tournaments
Artificial Intelligence
The modern definition of artificial intelligence (or AI) is "the study and design of intelligent agents" where an intelligent agent is a system that perceives its environment and takes actions which maximizes its chances of success.[1] John McCarthy, who coined the term in 1956,[2] defines it as "the science and engineering of making intelligent machines.”
AI research uses tools and insights from many fields, including computer science, psychology, philosophy, neuroscience, cognitive science, linguistics, operations research, economics, control theory, probability, optimization and logic. AI research also overlaps with tasks such as robotics, control systems, scheduling, data mining, logistics, speech recognition, facial recognition and many others.
Typical problems to which AI methods are applied
Pattern recognition
Optical character recognition
Handwriting recognition
Speech recognition
Face recognition
Artificial Creativity
Computer vision, Virtual reality and Image processing
Diagnosis (artificial intelligence)
Game theory and Strategic planning
Game artificial intelligence and Computer game bot
Natural language processing, Translation and Chatterbots
Non-linear control and Robotics
Other fields in which AI methods are implemented
Artificial life
Automated reasoning
Automation
Biologically-inspired computing
Concept mining
Data mining
Knowledge representation
Semantic Web
E-mail spam filtering
Behavior-based robotics
Cognitive robotics
Cybernetics
Epigenetic robotics
Evolutionary robotics
Hybrid intelligent system
Intelligent agent
Robots
Robots have frequently appeared as characters in works of literature; the word robot comes from Karel Čapek's play R.U.R. (Rossum's Universal Robots), premiered in 1920. Isaac Asimov wrote many volumes of science fiction focusing on robots in numerous forms and guises, contributing greatly to reducing the Frankenstein complex, which dominated early works of fiction involving robots. His three laws of robotics have become particularly well known for codifying a simple set of behaviors for robots to remain at the service of their human creators.
Numerous words for different types of robots are now used in literature. Robot has come to mean mechanical humans, while android is a generic term for artificial humans. Cyborg or "bionic man" is used for a human form that is a mixture of organic and mechanical parts. Organic artificial humans have also been referred to as "constructs" (or "biological constructs").
In science fiction, the Three Laws of Robotics are a set of three rules written by Isaac Asimov, which almost all positronic robots appearing in his fiction must obey. Introduced in his 1942 short story "Runaround", although foreshadowed in a few earlier stories, the Laws state the following:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Later, Asimov added the Zeroth Law: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm"; the rest of the laws are modified sequentially to acknowledge this.
According to the Oxford English Dictionary, the first passage in Asimov's short story "Liar!" (1941) that mentions the First Law is the earliest recorded use of the word robotics. Asimov was not initially aware of this; he assumed the word already existed by analogy with mechanics, hydraulics, and other similar terms denoting branches of applied knowledge.
Printers
A computer printer, or more commonly a printer, produces a hard copy (permanent human-readable text and/or graphics) of documents stored in electronic form, usually on physical print media such as paper or transparencies. Many printers are primarily used as computer peripherals and are attached by a printer cable to a computer that serves as the document source. Other printers, commonly known as network printers, have built-in network interfaces (typically wireless or Ethernet) and can serve as a hardcopy device for any user on the network.
Printers are designed for low-volume, short-turnaround print jobs, requiring virtually no setup time to produce a hard copy of a given document. However, printers are generally slow devices (30 pages per minute is considered fast, and many consumer printers are far slower than that), and the cost per page is relatively high.
In contrast, the printing press (which serves much the same function) is designed and optimized for high-volume print jobs such as newspaper print runs; printing presses are capable of hundreds of pages per minute or more, and their incremental cost per page is a fraction of that of printers.
The printing press remains the machine of choice for high-volume, professional publishing. However, as printers have improved in quality and performance, many jobs which used to be done by professional print shops are now done by users on local printers; see desktop publishing.
The world's first computer printer was a 19th century mechanically driven apparatus invented by Charles Babbage for his Difference Engine. In 2007, a study revealed that toner-based printers produced pollution as harmful as that from cigarettes.
Blu-ray
The name Blu-ray Disc is derived from the blue-violet laser used to read and write this type of disc. Because of its shorter wavelength (405 nm), substantially more data can be stored on a Blu-ray Disc than on the DVD format, which uses a red (650 nm) laser. A single layer Blu-ray Disc can store 25 gigabytes (GB), over five times the size of a single layer DVD at 4.7 GB. A dual layer Blu-ray Disc can store 50 GB, almost 6 times the size of a dual layer DVD at 8.5 GB.
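The capacity comparisons above can be checked with a short calculation; the sketch below (Python, using only the figures quoted in this paragraph) confirms the "over five times" and "almost 6 times" ratios:

# Ratios of Blu-ray capacities to their DVD counterparts, using the figures above.
single_layer_bd, single_layer_dvd = 25.0, 4.7   # gigabytes
dual_layer_bd, dual_layer_dvd = 50.0, 8.5       # gigabytes

print(f"Single layer: {single_layer_bd / single_layer_dvd:.1f}x")  # about 5.3x
print(f"Dual layer:   {dual_layer_bd / dual_layer_dvd:.1f}x")      # about 5.9x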
Blu-ray was developed by the Blu-ray Disc Association, a group of leading companies representing consumer electronics, computer hardware, and motion picture production. The standard is covered by several patents belonging to different companies. As of March 2007, a joint licensing agreement for all the relevant patents has not yet been finalized.
As of October 23, 2007, 351 titles have been released on Blu-ray Disc in the United States (32 of those titles have since been discontinued). As of October 9, 2007, 179 titles have been released in Japan, with 55 titles planned for release.
The Blu-ray standard is currently in a format war with its rival HD DVD, to determine which (if either) of the two formats will become the leading carrier for high-definition content to consumers.
Database management systems
A database management system (DBMS), sometimes just called a database manager, is a program that lets one or more computer users create and access data in a database. The DBMS manages user requests (and requests from other programs) so that users and other programs are free from having to understand where the data is physically located on storage media and, in a multi-user system, who else may also be accessing the data. In handling user requests, the DBMS ensures the integrity of the data (that is, making sure it continues to be accessible and is consistently organized as intended) and security (making sure only those with access privileges can access the data). The most typical DBMS is a relational database management system (RDBMS). A standard user and program interface is the Structured Query Language (SQL). A newer kind of DBMS is the object-oriented database management system (ODBMS).
A DBMS can be thought of as a file manager that manages data in databases rather than files in file systems.
A DBMS is usually an inherent part of a database product. On PCs, Microsoft Access is a popular example of a single- or small-group user DBMS. Microsoft's SQL Server is an example of a DBMS that serves database requests from multiple (client) users. Other popular DBMSs (these are all RDBMSs, by the way) are IBM's DB2, Oracle's line of database management products, and Sybase's products.
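As a small illustration of how a program talks to an RDBMS through SQL, the sketch below uses Python's built-in sqlite3 module rather than any of the products named above; the students table and its rows are invented for the example:

# Minimal sketch: SQL as the standard program interface to a relational DBMS.
import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = conn.cursor()

# The DBMS decides how and where the rows are physically stored.
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, grade REAL)")
cur.executemany("INSERT INTO students (name, grade) VALUES (?, ?)",
                [("Ada", 95.0), ("Alan", 88.5)])
conn.commit()

# A declarative query: the program says what it wants, not where the data lives.
for row in cur.execute("SELECT name, grade FROM students WHERE grade > 90"):
    print(row)                       # ('Ada', 95.0)

conn.close()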
Distance Learning
The terms "distance education" or "distance learning" have been applied interchangeably by many different researchers to a great variety of programs, providers, audiences, and media. Its hallmarks are the separation of teacher and learner in space and/or time (Perraton, 1988), the volitional control of learning by the student rather than the distant instructor (Jonassen, 1992), and noncontiguous communication between student and teacher, mediated by print or some form of technology (Keegan, 1986; Garrison and Shale, 1987).
Traditionally, we think of distance learners as adults.
At the elementary and middle school levels, distance learning usually takes the form of curriculum enrichment modules and ongoing telecommunications projects.
At the secondary level, locally or federally funded distance education addresses the needs of small rural school districts or underserved urban school districts. Some secondary school students may enroll in courses to meet graduation requirements which their own districts are unable to offer; some take advanced placement, foreign language, or vocational classes; others may be homebound or disabled. In many instances, talented or gifted high school students have been selected to attend distance classes because of their high academic ability and capacity for handling independent work.
Although technology is an integral part of distance education, any successful program must focus on the instructional needs of the students, rather than on the technology itself. It is essential to consider their ages, cultural and socioeconomic backgrounds, interests and experiences, educational levels, and familiarity with distance education methods and delivery systems (Schamber, 1988). Students usually adapt more quickly than their teachers to new technology. On the other hand, teachers who have begun to feel comfortable with the equipment don't mind having their students teach them new tips and tricks.
Supercomputers
The TOP500 project ranks and details the 500 most powerful publicly-known computer systems in the world. The project was started in 1993 and publishes an updated list of the supercomputers twice a year. The project aims to provide a reliable basis for tracking and detecting trends in high-performance computing.
In the early nineties, a new definition of supercomputer was needed to produce meaningful statistics. After experimenting with metrics based on processor count in 1992, researchers at the University of Mannheim settled on the idea of using a detailed listing of installed systems as the basis.
The systems ranked #1 since 1993
IBM Blue Gene/L (since 2004.11)
NEC Earth Simulator (2002.06 - 2004.11)
IBM ASCI White (2000.11 - 2002.06)
Intel ASCI Red (1997.06 - 2000.11)
Hitachi CP-PACS (1996.11 - 1997.06)
Hitachi SR2201 (1996.06 - 1996.11)
Fujitsu Numerical Wind Tunnel (1994.11 - 1996.06)
Intel Paragon XP/S140 (1994.06 - 1994.11)
Fujitsu Numerical Wind Tunnel (1993.11 - 1994.06)
TMC CM-5 (1993.06 - 1993.11)
Silicon Valley
Silicon Valley is the southern part of the San Francisco Bay Area in Northern California in the United States. The term originally referred to the region's large number of silicon chip innovators and manufacturers, but eventually came to refer to all the high-tech businesses in the area; it is now generally used as a metonym for the high-tech sector. Despite the development of other high-tech economic centers throughout the United States, Silicon Valley continues to be the leading high-tech hub because of its large number of engineers and venture capitalists.
The term Silicon Valley was coined by Ralph Vaerst, a Northern California entrepreneur. His journalist friend, Don Hoefler, first published the term in 1971, using it as the title of a series of articles, "Silicon Valley USA", in the weekly trade newspaper Electronic News, starting with the January 11, 1971 issue. Valley refers to the Santa Clara Valley, located at the southern end of San Francisco Bay, while Silicon refers to the high concentration of semiconductor and computer-related industries in the area. These and similar technology firms slowly replaced the orchards that gave the area its initial nickname, the Valley of Heart's Delight.
Electric Circuits
The integrated circuit is nothing more than a very advanced electric circuit. An electric circuit is made from different electrical components, such as transistors, resistors, capacitors and diodes, which are connected to each other in different ways. These components have different behaviors.
The transistor acts like a switch. It can turn electricity on or off, or it can amplify current. It is used, for example, in computers to store information, or in stereo amplifiers to make the sound signal stronger.
The resistor limits the flow of electricity and gives us the possibility to control the amount of current that is allowed to pass. Resistors are used, among other things, to control the volume in television sets or radios.
The capacitor collects electricity and releases it all in one quick burst, as in cameras, where a tiny battery can provide enough energy to fire the flashbulb.
The diode stops electricity under some conditions and allows it to pass only when these conditions change. This is used, for example, in photocells, where a broken light beam triggers the diode to stop electricity from flowing through it.
These components are like the building blocks in an electrical construction kit. Depending on how the components are put together when building the circuit, everything from a burglar alarm to a computer microprocessor can be constructed.
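As a rough numerical illustration of two of these behaviors (the formulas are standard, but the component values below are invented for the example), a short Python sketch applies Ohm's law to a resistor and the stored-energy formula to a capacitor:

# Illustrative calculations for two of the components described above.
# Component values are invented for the example.

# Resistor: Ohm's law, I = V / R, shows how resistance limits current.
voltage = 9.0        # volts
resistance = 450.0   # ohms
current = voltage / resistance
print(f"Current through the resistor: {current * 1000:.0f} mA")   # 20 mA

# Capacitor: stored energy, E = 0.5 * C * V^2, released in one quick burst,
# as in a camera flash.
capacitance = 100e-6   # farads (100 microfarads)
flash_voltage = 300.0  # volts
energy = 0.5 * capacitance * flash_voltage ** 2
print(f"Energy available for the flash: {energy:.1f} joules")      # 4.5 J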
Card Punch
The first practical use of punched cards for data processing is credited to the American inventor Herman Hollerith, who decided to use Jacquard's punched cards to represent the data gathered for the American census of 1890, and to read and collate this data using an automatic machine.
Herman Hollerith's Pantographic Card Punch was developed for the 1890 US census. Prior to 1890, cards were punched using a train conductor's ticket punch, which allowed holes to be placed only around the edge of the card, was not terribly accurate, and tended to induce strain injuries. The Pantographic punch allowed accurate placement of holes with minimum physical strain, one hole at a time, and also provided access to the interior of the card, allowing more information per card.
A skilled operator could punch 700 cards per day.
Hollerith's systems opened up an important new field of employment to women, starting with the 1890 Census.
Prior to 1928, IBM cards were punched with round holes, originally in 20 columns and, by the late 1920s, in 45 columns, as on a surviving New York University registration card.
Beginning in 1928, IBM cards had rectangular holes 80 columns across, as on an Iowa State University registration card.
IBM Type 032 Printing Punch
The IBM 032 Printing Punch of 1933 was the first key punch machine capable of punching letters as well as digits (other than by multipunching) and of interpreting the punches by printing across the top of the card. It was used by the US Army in World War II for Morning Reports and similar documents.
04 November 2007
Hw-5a, Due Date: Nov 7th, 2007
Computer Screen Technology
A computer display screen that can be used for graphics is divided into dots that are called addressable because they are addressed individually by the graphics software. Each dot can be illuminated individually on the screen. Each dot is potentially a picture element, or pixel. The resolution of the screen (its clarity) is directly related to the number of pixels on the screen: the more pixels, the higher the resolution. Another factor of importance is dot pitch, the amount of space between the dots. The smaller the dot pitch, the better the quality of the screen image.
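As a quick worked example (the 1024 by 768 resolution and the 38 cm screen width are assumed here, not taken from the text), the sketch below relates pixel count and dot pitch to image quality:

# Hypothetical example relating resolution and dot pitch to screen quality.
horizontal_pixels, vertical_pixels = 1024, 768
total_pixels = horizontal_pixels * vertical_pixels
print(f"Total addressable pixels: {total_pixels}")        # 786432: more pixels, higher resolution

screen_width_mm = 380.0
dot_pitch_mm = screen_width_mm / horizontal_pixels
print(f"Approximate dot pitch: {dot_pitch_mm:.2f} mm")    # about 0.37 mm: smaller is sharper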