Computer algorithm (nonfiction)

A '''computer algorithm''' is an [[Algorithm (nonfiction)|algorithm]] implemented in software by software developers, designed to be effective on the intended "target" [[Computer (nonfiction)|computer]](s) and to produce output from given (perhaps null) input.

An optimal algorithm, even running on old hardware, would produce results faster than a non-optimal (higher time complexity) algorithm for the same purpose running on more efficient hardware; that is why [[Algorithm (nonfiction)|algorithms]], like computer hardware, are considered technology.
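
As a rough illustration of this point, the sketch below (invented function names, and an assumed 100x per-operation hardware advantage) counts worst-case basic operations for linear search versus binary search over a sorted list; beyond a certain input size the lower-complexity algorithm wins even though the other machine is far faster per operation.

<syntaxhighlight lang="python">
# Sketch: worst-case operation counts for O(n) linear search vs O(log n) binary search.
# "hardware_speedup" is an illustrative stand-in for running on more efficient hardware.

def linear_search_ops(n):
    """Worst-case basic operations for linear search over n items."""
    return n

def binary_search_ops(n):
    """Worst-case basic operations for binary search over n sorted items."""
    ops = 0
    while n > 1:
        n //= 2
        ops += 1
    return ops + 1

hardware_speedup = 100  # assume the linear-search machine is 100x faster per operation

for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, linear_search_ops(n) / hardware_speedup, binary_search_ops(n))

# At n = 1_000_000_000 the "faster" machine still spends 10,000,000 time units on
# linear search, while binary search needs only about 30 on the slower machine.
</syntaxhighlight>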
"Elegant" (compact) programs, "good" (fast) programs : The notion of "simplicity and elegance" appears informally in Knuth and precisely in Chaitin:


''"Elegant" (compact) programs, "good" (fast) programs'': The notion of "simplicity and elegance" appears informally in Knuth and precisely in Chaitin:

* Knuth: ". . .we want good [[Algorithm (nonfiction)|algorithms]] in some loosely defined aesthetic sense. One criterion . . . is the length of time taken to perform the [[Algorithm (nonfiction)|algorithm]] .... Other criteria are adaptability of the [[Algorithm (nonfiction)|algorithm]] to [[Computer (nonfiction)|computers]], its simplicity and elegance, etc"
* Chaitin: " . . . a program is 'elegant,' by which I mean that it's the smallest possible program for producing the output that it does". Chaitin prefaces his definition with: "I'll show you can't prove that a program is 'elegant'"—such a proof would solve the [[Halting problem (nonfiction)|Halting problem]] (ibid).
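
For a purely illustrative sense of Chaitin's usage (a sketch, not one of Chaitin's examples), the two fragments below produce identical output, and the one-liner is the more "elegant" candidate; nothing here proves it is the smallest possible program for that output, which is exactly Chaitin's point.

<syntaxhighlight lang="python">
# Two programs with identical output: the first ten squares.

# Candidate "elegant" (compact) program.
print([i * i for i in range(10)])

# A longer program producing exactly the same output.
squares = []
i = 0
while i < 10:
    squares.append(i * i)
    i += 1
print(squares)

# Exhibiting a shorter equivalent program is easy; proving that no shorter
# one exists is impossible in general, since a general proof method would
# decide the Halting problem.
</syntaxhighlight>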


''Algorithm versus function computable by an algorithm'': For a given function, multiple [[Algorithm (nonfiction)|algorithms]] may exist. This is true even without expanding the [[Instruction set (nonfiction)|instruction set]] available to the programmer. Rogers observes that "It is . . . important to distinguish between the notion of [[Algorithm (nonfiction)|algorithm]], i.e. procedure and the notion of function [[Computation (nonfiction)|computable]] by [[Algorithm (nonfiction)|algorithm]], i.e. mapping yielded by procedure. The same function may have several different [[Algorithm (nonfiction)|algorithms]]".

Unfortunately there may be a tradeoff between goodness (speed) and elegance (compactness)—an elegant program may take more steps to complete a computation than one less elegant. An example that uses Euclid's algorithm appears below.
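
To make the tradeoff concrete, here is a minimal sketch (function names invented for illustration): a compact "elegant" Euclid that obtains each remainder by repeated subtraction, beside a "good" (fast) version that uses the modulus operation and so needs far fewer loop iterations.

<syntaxhighlight lang="python">
# "Elegant" (compact) Euclid: remainders computed by repeated subtraction only.
def gcd_elegant(a, b):
    while b != 0:
        while a >= b:   # remainder of a divided by b, by repeated subtraction
            a -= b
        a, b = b, a     # swap so the next pass reduces the larger value
    return a

# "Good" (fast) Euclid: uses the modulus operation directly.
def gcd_fast(a, b):
    while b != 0:
        a, b = b, a % b
    return a

# Both return the same result; the compact version pays for its brevity in extra steps.
assert gcd_elegant(3009, 884) == gcd_fast(3009, 884) == 17
</syntaxhighlight>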


''Computers (and computors), models of computation'': A [[Computer (nonfiction)|computer]] (or human "computor") is a restricted type of machine, a "discrete deterministic mechanical device" that blindly follows its instructions.


Melzak's and Lambek's primitive models reduced this notion to four elements:  


* Discrete, distinguishable locations
* Discrete, indistinguishable counters
* An agent
* A list of instructions that are effective relative to the capability of the agent


Minsky describes a more congenial variation of Lambek's "abacus" model in his "Very Simple Bases for Computability". Minsky's machine proceeds sequentially through its five (or six, depending on how one counts) instructions, unless either a conditional IF–THEN GOTO or an unconditional GOTO changes program flow out of sequence. Besides HALT, Minsky's machine includes three assignment (replacement, substitution) operations: ZERO (e.g. the contents of location replaced by 0: L ← 0), SUCCESSOR (e.g. L ← L+1), and DECREMENT (e.g. L ← L − 1). Rarely must a programmer write "code" with such a limited [[Instruction set (nonfiction)|instruction set]].


Minsky shows (as do Melzak and Lambek) that his machine is [[Turing completeness (nonfiction)|Turing complete]] with only four general types of instructions:


* Conditional GOTO
* Unconditional GOTO
* Assignment/replacement/substitution
* HALT
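
A minimal sketch of such a machine follows (the tuple encoding, instruction names, and the sample addition program are illustrative assumptions, not Minsky's own notation); the registers play the role of the discrete locations holding counters, and the interpreter loop is the blindly obedient agent.

<syntaxhighlight lang="python">
# Sketch of a Minsky-style counter machine. Registers hold non-negative integers;
# instructions are numbered from 0 and executed sequentially unless a GOTO intervenes.
#   ("ZERO", r)        register r <- 0
#   ("INC", r)         register r <- r + 1                (SUCCESSOR)
#   ("DEC", r)         register r <- r - 1 when r > 0     (DECREMENT)
#   ("JZ", r, target)  conditional GOTO: jump to target if register r == 0
#   ("GOTO", target)   unconditional GOTO
#   ("HALT",)

def run(program, registers):
    pc = 0  # program counter
    while True:
        op = program[pc]
        kind = op[0]
        if kind == "HALT":
            return registers
        elif kind == "ZERO":
            registers[op[1]] = 0
            pc += 1
        elif kind == "INC":
            registers[op[1]] += 1
            pc += 1
        elif kind == "DEC":
            if registers[op[1]] > 0:
                registers[op[1]] -= 1
            pc += 1
        elif kind == "JZ":
            pc = op[2] if registers[op[1]] == 0 else pc + 1
        elif kind == "GOTO":
            pc = op[1]

# Example program: add register 1 into register 0 (emptying register 1).
add = [
    ("JZ", 1, 4),   # 0: if r1 == 0, done
    ("DEC", 1),     # 1: r1 <- r1 - 1
    ("INC", 0),     # 2: r0 <- r0 + 1
    ("GOTO", 0),    # 3: repeat
    ("HALT",),      # 4
]

print(run(add, {0: 3, 1: 4}))  # {0: 7, 1: 0}
</syntaxhighlight>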
 
''Simulation of an algorithm: computer (computor) language'': Knuth advises the reader that "the best way to learn an [[Algorithm (nonfiction)|algorithm]] is to try it ... immediately take pen and paper and work through an example". But what about a simulation or execution of the real thing? The programmer must translate the [[Algorithm (nonfiction)|algorithm]] into a language that the simulator/computer/computor can effectively execute.
 
Stone gives an example of this: when computing the roots of a quadratic equation the computor must know how to take a square root. If they don't, then the [[Algorithm (nonfiction)|algorithm]], to be effective, must provide a set of rules for extracting a square root. This means that the programmer must know a "language" that is effective relative to the target computing agent (computer/computor).
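
As a sketch of what providing such rules can look like (assuming only elementary arithmetic is available; the function names and tolerance are invented for illustration), the snippet below extracts a square root by Newton's method, so that a quadratic-roots algorithm stays effective for a computor with no built-in square-root operation.

<syntaxhighlight lang="python">
# Square root from elementary arithmetic only (Newton's method), so that an
# algorithm for the quadratic formula can spell out its own square-root rule.

def square_root(x, tolerance=1e-12):
    """Approximate the square root of x >= 0 using only +, -, *, /."""
    if x == 0:
        return 0.0
    guess = x if x >= 1 else 1.0
    while abs(guess * guess - x) > tolerance * x:
        guess = (guess + x / guess) / 2  # Newton's update step
    return guess

def quadratic_roots(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0 (assumes a != 0 and a non-negative discriminant)."""
    d = square_root(b * b - 4 * a * c)
    return ((-b + d) / (2 * a), (-b - d) / (2 * a))

print(quadratic_roots(1, -3, 2))  # approximately (2.0, 1.0)
</syntaxhighlight>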
 
What model should be used for the simulation? Van Emde Boas observes "even if we base [[Computational complexity theory (nonfiction)|computational complexity theory]] on abstract instead of concrete machines, arbitrariness of the choice of a model remains. It is at this point that the notion of simulation enters". When speed is being measured, the instruction set matters. For example, the subprogram in Euclid's algorithm to compute the remainder would execute much faster if the programmer had a "modulus" instruction available rather than just subtraction (or worse: just Minsky's "decrement").
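
A small sketch of that point (illustrative operation counts, not a benchmark; the function names are invented): computing 3009 mod 884 costs a single modulus operation but three subtractions when only subtraction is available, and each of those subtractions would itself cost hundreds of steps with only Minsky's decrement.

<syntaxhighlight lang="python">
# Remainder of a divided by b, two ways, with a rough count of operations used.

def remainder_by_modulus(a, b):
    return a % b, 1          # one modulus instruction

def remainder_by_subtraction(a, b):
    steps = 0
    while a >= b:
        a -= b               # one subtraction per pass
        steps += 1
    return a, steps

print(remainder_by_modulus(3009, 884))      # (357, 1)
print(remainder_by_subtraction(3009, 884))  # (357, 3)
# With only DECREMENT, each subtraction of 884 would itself take 884 single steps.
</syntaxhighlight>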
 
''Structured programming, canonical structures'': Per the [[Church–Turing thesis (nonfiction)|Church–Turing thesis]], any [[Algorithm (nonfiction)|algorithm]] can be computed by a model known to be [[Turing completeness (nonfiction)|Turing complete]], and per Minsky's demonstrations, [[Turing completeness (nonfiction)|Turing completeness]] requires only four instruction types:
 
* Conditional GOTO
* Unconditional GOTO
* Assignment
* HALT
 
Kemeny and Kurtz observe that, while "undisciplined" use of unconditional GOTOs and conditional IF-THEN GOTOs can result in "spaghetti code", a programmer can write structured programs using only these instructions; on the other hand "it is also possible, and not too hard, to write badly structured programs in a structured language". Tausworthe augments the three Böhm–Jacopini canonical structures (SEQUENCE, IF-THEN-ELSE, and WHILE-DO) with two more: DO-WHILE and CASE.
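
As an illustration (a sketch with invented names, not an example from Kemeny and Kurtz or Tausworthe), the two functions below compute the same sum: the first mimics unstructured GOTO flow with an explicit line-number variable, the second uses only the canonical SEQUENCE and WHILE-DO structures.

<syntaxhighlight lang="python">
# "Spaghetti" style: control flow driven by explicit GOTO-like jumps.
def sum_to_n_goto(n):
    total, i, line = 0, 1, 10
    while True:
        if line == 10:        # 10: IF i > n THEN GOTO 40
            line = 40 if i > n else 20
        elif line == 20:      # 20: total <- total + i
            total += i
            line = 30
        elif line == 30:      # 30: i <- i + 1; GOTO 10
            i += 1
            line = 10
        elif line == 40:      # 40: HALT
            return total

# Structured style: SEQUENCE and WHILE-DO only.
def sum_to_n_structured(n):
    total = 0
    i = 1
    while i <= n:    # WHILE-DO
        total += i   # SEQUENCE
        i += 1
    return total

assert sum_to_n_goto(100) == sum_to_n_structured(100) == 5050
</syntaxhighlight>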
 
An additional benefit of a structured program is that it lends itself to proofs of correctness using mathematical induction.
 
''Canonical flowchart symbols'': The graphical aid called a flowchart offers a way to describe and document an [[Algorithm (nonfiction)|algorithm]] (and a computer program implementing one). Like the program flow of a Minsky machine, a flowchart always starts at the top of a page and proceeds down.
 
It has four primary symbols:
 
* The directed arrow showing program flow
* The rectangle (SEQUENCE, GOTO)
* The diamond (IF-THEN-ELSE)
* The dot (OR-tie)
 
The Böhm–Jacopini canonical structures are made of these primitive shapes. Sub-structures can "nest" in rectangles, but only if a single exit occurs from the superstructure. The symbols, and their use to build the canonical structures, are shown in the diagram.
 
== In the News ==
 
<gallery>
</gallery>
 
== Fiction cross-reference ==
 
* [[Crimes against mathematical constants]]
* [[Gnomon algorithm]]
* [[Mathematics]]
 
== Nonfiction cross-reference ==
 
* [[Church–Turing thesis (nonfiction)]]
* [[Computation (nonfiction)]]
* [[Computational complexity theory (nonfiction)]]
* [[Computer (nonfiction)]]
* [[Computer science (nonfiction)]]
* [[Halting problem (nonfiction)]]
* [[Program optimization (nonfiction)]]
 
== External links ==
 
* [https://en.wikipedia.org/wiki/Computer_program Computer program] @ Wikipedia
 
[[Category:Nonfiction (nonfiction)]]
[[Category:Algorithms (nonfiction)]]
[[Category:Computer science (nonfiction)]]
[[Category:Information systems (nonfiction)]]
[[Category:Mathematics (nonfiction)]]
