Since my last edition in March, I've been refining the design of Videlicet: a
script I hope will aid in the maintenance of digital art collections. Although
I haven't yet finished the work, it is near completion, as you can see in this
abridged snapshot:
http://www.mediafire.com/file/l0mh2dac75t63wl/TK-GreatestHits-2017-06-15.zip

Presently, the only available functionality is a label maker. Soon, it will be
a label maker with tentacles. (Full disclosure: the tentacles are metaphorical
in nature, and the program is so described for the sole purpose of setting up
this joke about octo-pythons.) Hopefully the finished work will be available to
you by this July, at which time I will issue the complete edition.

In the meantime, here is some computer-related trivia: factual computer
tidbits, each in 80 columns -- the canonical console width. (These were written
with considerable reference to FOLDOC and the Jargon File.)

We say "computer" about machines these days, but it once meant one who computes.
Computing machines are descendants of the difference engine: rapid calculators.
The first such difference engine is attributed to Charles Babbage.
The first reprogrammable machine, however, is reputed to be the Jacquard loom.
Source code is a series of instructions telling a computer what to compute.
Source code is translated by a compiler into assembler code (or interpreted).
An electronic calculator is, in abstract, an infix notation algebra compiler.
Assemblers translate human-legible mnemonics into computers' "machine language."
Machine language, an abstraction of electrical potential, is binary code.
The volt is the unit of electromotive force (EMF) and of electric potential.
Metal oxide semiconductor field-effect transistors (MOSFETs) form logic gates.
Logic circuits implement Boolean algebra with gates such as NAND, NOR, and NOT.
Computer programming languages evolved from machine code to assembler mnemonics.
From assembler, programs further evolved to high-level languages (such as C).
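A bonus tidbit on those logic gates: NAND is "universal," meaning every other
Boolean gate can be composed from it alone. A minimal sketch in Python (the
function names are mine, not from any library):

```python
def nand(a: bool, b: bool) -> bool:
    """The universal gate: true unless both inputs are true."""
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)                 # x NAND x == NOT x

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))           # AND is the NOT of NAND

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))     # OR, by De Morgan's law

# Exhaustive check against Python's own Boolean operators:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
    assert not_(a) == (not a)
```

Real chips do the same trick in silicon: a whole processor can be fabricated
from one gate type, repeated.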
Very high level language (whatever that means - perhaps interpreters?) was next.
Modern computer programming reads much like calculus. (Cf. ASM = arithmetic.)
Object-oriented programming arranges data in nested structures, then computes.
Functional programming computes with nested functions, then arranges the data.
File systems are tree-like data structures encoding allocation of disk memory.
Compressed archives use algorithms like LZMA to squeeze redundant bytes in files.
Non-volatile memory (disk) keeps data without power; volatile (RAM) does not.
Hard-disk capacity is measured in gigabytes; they're magnetic platters or flash.
Magnetic storage functions by "reading" and "writing" magnetic fields.
One-time PROMs blow fuses and antifuses; electrically alterable ROMs trap charge.
Operating systems handle tasks, like process scheduling, incidental to human use.
Mainframes ("clouds") have one central OS; the clients are dumb terminals.
Dumb terminals have no independent processing or storage capability.
Personal computers, by contrast, have processing, storage, and an OS.
Graphical user interfaces (GUIs) are the modern point-and-click metaphor.
Command line interfaces (CLIs) are the "antiquated" terminal console metaphor.
Computer networks are any set of computing machines "speaking" to one another.
Sessions are conducted by protocol. The World Wide Web uses HTTP over TCP/IP.
The Internet and Web evolved via "bake-offs": see the Requests for Comments.
Bandwidth on the Internet has increased from baud rates to megabytes per second.
All computing resources can be served on a network, bandwidth permitting.
^- Senator Ted Stevens' famous "series of tubes" quote was accurate, BTW.
Human interface devices are the material tools we use to interact w/ computers.
Graphical computer displays evolved from oscilloscopes, to CRTs, to LCDs.
Cathode ray tubes work by shooting electrons at a phosphorescent matrix.
Modern television-size liquid crystal displays contain millions of circuits.
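To see the point about squeezing redundant bytes, here is a quick sketch using
Python's standard lzma module (the sample data is invented): highly repetitive
input shrinks dramatically, and decompression restores it exactly.

```python
import lzma

# Invented sample: 16 KiB of highly redundant bytes.
redundant = b"abcd" * 4096

packed = lzma.compress(redundant)
assert lzma.decompress(packed) == redundant   # compression is lossless
assert len(packed) < len(redundant)           # redundancy squeezed out

print(f"{len(redundant)} bytes -> {len(packed)} bytes")
```

Random or already-compressed data, having little redundancy, would hardly
shrink at all -- sometimes it even grows by the size of the archive headers.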
CPUs handle arithmetic and logic. These days, one chip packs several CPU cores.
The frequency of a CPU's oscillation is measured in gigahertz: e.g., 2.5 GHz.
The chip's clock speed (oscillation) determines how fast it computes.
All arithmetic can be built from addition -- indeed, from adding 1 alone.
One's complement and two's complement are binary encodings for negative numbers.
Microchips are integrated circuits (ICs), mounted on printed circuit boards.
The chips on a mainboard are connected to one another by a bus ("omnibus bar").
Computer hardware is the aforementioned assemblage of chips and circuit boards.
"Firmware" (between hardware & software) is on-board, on-chip control logic.
Computer software is instructions compiled & ready to run: a core image.
"Bootstrapping" recalls a tall tale of a man hoisting himself by his bootstraps.
The Basic I/O System of yore was supplanted by the Extensible Firmware Interface.
Personal computer workstations are sometimes called "boxes," due to their shape.
Alan Turing's abstract machine (the Turing machine) is a model of any computer.
Emulators, or virtual machines, are also logical computers.
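The claim that all arithmetic can be built from adding 1 holds at least for the
natural numbers. A Peano-style sketch (hedged: Python ints stand in for the
naturals, and plain loops stand in for formal recursion):

```python
def succ(n: int) -> int:
    """The only primitive we allow ourselves: increment by 1."""
    return n + 1

def add(a: int, b: int) -> int:
    for _ in range(b):        # a + b is succ applied b times to a
        a = succ(a)
    return a

def mul(a: int, b: int) -> int:
    total = 0
    for _ in range(b):        # a * b is a added to itself b times
        total = add(total, a)
    return total

assert add(3, 4) == 7
assert mul(6, 7) == 42
```

Subtraction and division follow by the same bootstrapping, as inverses of the
operations above.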
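And the two's complement encoding can be demonstrated in a few lines: masking a
Python int to a fixed width yields the same bit pattern a CPU register would
hold (the helper name is mine, purely for illustration):

```python
def twos_complement(value: int, bits: int = 8) -> str:
    """Show a signed integer's two's-complement bit pattern at a register width."""
    mask = (1 << bits) - 1            # e.g. 0xFF for an 8-bit register
    return format(value & mask, f"0{bits}b")

print(twos_complement(5))      # 00000101
print(twos_complement(-1))     # 11111111  (all ones)
print(twos_complement(-128))   # 10000000  (most negative 8-bit value)
```

The elegance of the scheme is that ordinary binary addition then works on
negative numbers for free, with no special-case circuitry.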